Sample records for creating science-driven computer

  1. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, however, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  2. Digital Youth Divas: Exploring Narrative-Driven Curriculum to Spark Middle School Girls' Interest in Computational Activities

    ERIC Educational Resources Information Center

    Pinkard, Nichole; Erete, Sheena; Martin, Caitlin K.; McKinney de Royston, Maxine

    2017-01-01

    Women use technology to mediate numerous aspects of their professional and personal lives. Yet, few design and create these technologies given that women, especially women of color, are grossly underrepresented in computer science and engineering courses. Decisions about participation in STEM are frequently made prior to high school, and these…

  3. Science Pipes: A World of Data at Your Fingertips--Exploring Biodiversity with Online Visualization and Analysis Tools

    ERIC Educational Resources Information Center

    Wilson, Courtney R.; Trautmann, Nancy M.; MaKinster, James G.; Barker, Barbara J.

    2010-01-01

    A new online tool called "Science Pipes" allows students to conduct biodiversity investigations. With this free tool, students create and run analyses that would otherwise require access to unwieldy data sets and the ability to write computer code. Using these data, students can conduct guided inquiries or hypothesis-driven research to…

  4. A Community Publication and Dissemination System for Hydrology Education Materials

    NASA Astrophysics Data System (ADS)

    Ruddell, B. L.

    2015-12-01

    Hosted by CUAHSI and the Science Education Resource Center (SERC), federated by the National Science Digital Library (NSDL), and allied with the Water Data Center (WDC), Hydrologic Information System (HIS), and HydroShare projects, a simple cyberinfrastructure has been launched for the publication and dissemination of data- and model-driven university hydrology education materials. This lightweight system's metadata describes learning content as a data-driven module with defined data inputs and outputs. This structure allows a user to mix and match modules to create sequences of content that teach both hydrology and computing learning outcomes. Importantly, this modular infrastructure allows an instructor to substitute a module based on updated computational methods for one based on outdated methods, hopefully solving the problem of rapid obsolescence that has hampered previous community efforts. The prototype system is now available from CUAHSI and SERC, with some example content. The system is designed to catalog, link to, make visible, and make accessible the existing and future contributions of the community; this system does not create content. Submissions from hydrology educators are eagerly solicited, especially for existing content.
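
    As a toy illustration of the mix-and-match idea described above (a sketch only; the module names and metadata fields are invented, not taken from the CUAHSI/SERC system), each module can declare its data inputs and outputs, and a sequence is valid when every module's inputs are produced by an earlier module:

      # Hypothetical module metadata: declared data inputs and outputs.
      modules = {
          "streamflow_basics": {"inputs": set(), "outputs": {"discharge_series"}},
          "flood_frequency": {"inputs": {"discharge_series"},
                              "outputs": {"return_periods"}},
      }

      def valid_sequence(names, modules):
          """True if every module's inputs are produced earlier in the sequence."""
          available = set()
          for name in names:
              module = modules[name]
              if not module["inputs"] <= available:
                  return False
              available |= module["outputs"]
          return True

      print(valid_sequence(["streamflow_basics", "flood_frequency"], modules))  # True
      print(valid_sequence(["flood_frequency", "streamflow_basics"], modules))  # False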

  5. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  6. eButterfly: Leveraging Massive Online Citizen Science for Butterfly Conservation

    PubMed Central

    Prudic, Kathleen L.; McFarland, Kent P.; Oliver, Jeffrey C.; Hutchinson, Rebecca A.; Long, Elizabeth C.; Kerr, Jeremy T.; Larrivée, Maxim

    2017-01-01

    Data collection, storage, analysis, visualization, and dissemination are changing rapidly due to advances in new technologies driven by computer science and universal access to the internet. These technologies and web connections place human observers front and center in citizen science-driven research and are critical in generating new discoveries and innovation in such fields as astronomy, biodiversity, and meteorology. Research projects utilizing a citizen science approach address scientific problems at regional, continental, and even global scales otherwise impossible for a single lab or even a small collection of academic researchers. Here we describe eButterfly, an integrative checklist-based butterfly monitoring and database web platform that leverages the skills and knowledge of recreational butterfly enthusiasts to create a globally accessible unified database of butterfly observations across North America. Citizen scientists, conservationists, policy makers, and scientists are using eButterfly data to better understand the biological patterns of butterfly species diversity and how environmental conditions shape these patterns in space and time. eButterfly, in collaboration with thousands of butterfly enthusiasts, has created a near real-time butterfly data resource producing tens of thousands of observations per year, open to all to share and explore. PMID:28524117

  7. eButterfly: Leveraging Massive Online Citizen Science for Butterfly Conservation.

    PubMed

    Prudic, Kathleen L; McFarland, Kent P; Oliver, Jeffrey C; Hutchinson, Rebecca A; Long, Elizabeth C; Kerr, Jeremy T; Larrivée, Maxim

    2017-05-18

    Data collection, storage, analysis, visualization, and dissemination are changing rapidly due to advances in new technologies driven by computer science and universal access to the internet. These technologies and web connections place human observers front and center in citizen science-driven research and are critical in generating new discoveries and innovation in such fields as astronomy, biodiversity, and meteorology. Research projects utilizing a citizen science approach address scientific problems at regional, continental, and even global scales otherwise impossible for a single lab or even a small collection of academic researchers. Here we describe eButterfly, an integrative checklist-based butterfly monitoring and database web platform that leverages the skills and knowledge of recreational butterfly enthusiasts to create a globally accessible unified database of butterfly observations across North America. Citizen scientists, conservationists, policy makers, and scientists are using eButterfly data to better understand the biological patterns of butterfly species diversity and how environmental conditions shape these patterns in space and time. eButterfly, in collaboration with thousands of butterfly enthusiasts, has created a near real-time butterfly data resource producing tens of thousands of observations per year, open to all to share and explore.

  8. Social Computing as Next-Gen Learning Paradigm: A Platform and Applications

    NASA Astrophysics Data System (ADS)

    Margherita, Alessandro; Taurino, Cesare; Del Vecchio, Pasquale

    As a field at the intersection of computer science and human behavior, social computing can contribute significantly to innovating how individuals and groups interact for learning and working purposes. In particular, the generation of Internet applications tagged as web 2.0 provides an opportunity to create new “environments” where people can exchange knowledge and experience, create new knowledge, and learn together. This chapter illustrates the design and application of a prototypal platform which embeds tools such as blogs, wikis, folksonomies, and RSS in a unique web-based system. This platform has been developed to support a case-based and project-driven learning strategy for the development of business and technology management competencies in undergraduate and graduate education programs. A set of illustrative scenarios is described to show how a learning community can be promoted, created, and sustained through the technological platform.

  9. Massive Data, the Digitization of Science, and Reproducibility of Results

    ScienceCinema

    Stodden, Victoria

    2018-04-27

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  10. Single Cell Genomics: Approaches and Utility in Immunology

    PubMed Central

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-01-01

    Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102
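
    The "avalanche of quantitative data" the review refers to is typically a cells-by-genes count matrix. Below is a minimal sketch of the standard analysis steps (normalization, dimensionality reduction, clustering), using random data in place of a real experiment and generic scikit-learn tools rather than any method endorsed by the authors:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      counts = rng.poisson(2.0, size=(300, 2000))  # cells x genes, stand-in data

      # Library-size normalization and log transform, a common first step.
      normalized = counts / counts.sum(axis=1, keepdims=True) * 1e4
      logged = np.log1p(normalized)

      # Reduce dimensionality, then cluster cells into putative states.
      embedding = PCA(n_components=20).fit_transform(logged)
      labels = KMeans(n_clusters=5, n_init=10).fit_predict(embedding)
      print(np.bincount(labels))  # number of cells per cluster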

  11. Single-Cell Genomics: Approaches and Utility in Immunology.

    PubMed

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-02-01

    Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. InfoSymbiotics/DDDAS - The power of Dynamic Data Driven Applications Systems for New Capabilities in Environmental -, Geo-, and Space- Sciences

    NASA Astrophysics Data System (ADS)

    Darema, F.

    2016-12-01

    InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feedback loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Such new computational approaches to modeling, simulation, and instrumentation create advanced capabilities, including enhancing the accuracy of the application model and speeding up the computation to allow faster and more comprehensive models of a system, creating decision-support systems with the accuracy of full-scale simulations. In addition, controlling instrumentation processes from the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers; compared with the static and ad hoc approaches of today, DDDAS allows these resources to be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems; the opportunities and challenges at these "large scales" relate not only to data size but also to heterogeneity in data, data-collection modalities, fidelities, and timescales, ranging from real-time to archival data. In tandem with this dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing by networked assemblies of multitudes of sensors and controllers, ranging from the high end to the real time, seamlessly integrated and unified into Large-Scale-Big-Computing. InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.
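
    A minimal sketch of the feedback loop the abstract describes, with all names and numbers invented for illustration: a running model assimilates streaming measurements, and the size of the model-data disagreement steers how often the (simulated) instrument is sampled:

      import random

      def read_sensor(true_value, noise=0.5):
          """Hypothetical instrument: the true signal plus measurement noise."""
          return true_value + random.gauss(0.0, noise)

      def dddas_loop(steps=20):
          true_state = 10.0          # quantity the model tracks (toy example)
          estimate, gain = 0.0, 0.3
          sample_period = 5          # steps between sensor readings
          for t in range(steps):
              if t % sample_period == 0:
                  measurement = read_sensor(true_state)
                  innovation = measurement - estimate
                  # Data acquisition feeds back into the executing model.
                  estimate += gain * innovation
                  # The model controls the instrumentation: sample faster
                  # while model and data still disagree strongly.
                  sample_period = 1 if abs(innovation) > 1.0 else 5
          return estimate

      print(dddas_loop())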

  13. The Role of Physicality in Rich Programming Environments

    ERIC Educational Resources Information Center

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-01-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot…

  14. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaboration, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  15. The 1987 RIACS annual report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.

  16. High-order hydrodynamic algorithms for exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Nathaniel Ray

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.
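
    The FLOP/memory argument can be made concrete with a back-of-the-envelope arithmetic-intensity comparison; the numbers below are illustrative, not taken from the proposal:

      def arithmetic_intensity(flops_per_cell, bytes_moved_per_cell):
          """FLOPs executed per byte of memory traffic (the roofline metric)."""
          return flops_per_cell / bytes_moved_per_cell

      # A low-order scheme does little arithmetic per byte it touches; a
      # high-order scheme reuses each loaded value in many more operations.
      low_order = arithmetic_intensity(flops_per_cell=50, bytes_moved_per_cell=200)
      high_order = arithmetic_intensity(flops_per_cell=2000, bytes_moved_per_cell=400)
      print(low_order, high_order)  # 0.25 vs 5.0 FLOPs per byte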

  17. Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements

    ERIC Educational Resources Information Center

    Mostafavi, Behrooz; Barnes, Tiffany

    2017-01-01

    Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent,…

  18. A National Virtual Specimen Database for Early Cancer Detection

    NASA Technical Reports Server (NTRS)

    Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy

    2003-01-01

    Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) comprises and integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need for establishing cross-disciplinary teams focused on integrating expertise in biomedical research, computational science and biostatistics, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic distribution and morphological differences of the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate databases needed to be done in a manner that did not require redesign or re-implementation of existing systems.

  19. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    PubMed

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  20. EarthCube Activities: Community Engagement Advancing Geoscience Research

    NASA Astrophysics Data System (ADS)

    Kinkade, D.

    2015-12-01

    Our ability to advance scientific research in order to better understand complex Earth systems, address emerging geoscience problems, and meet societal challenges is increasingly dependent upon the concept of Open Science and Data. Although these terms are relatively new to the world of research, Open Science and Data in this context may be described as transparency in the scientific process. This includes the discoverability, public accessibility and reusability of scientific data, as well as accessibility and transparency of scientific communication (www.openscience.org). Scientists and the US government alike are realizing the critical need for easy discovery and access to multidisciplinary data to advance research in the geosciences. The NSF-supported EarthCube project was created to meet this need. EarthCube is developing a community-driven common cyberinfrastructure for the purpose of accessing, integrating, analyzing, sharing and visualizing all forms of data and related resources through advanced technological and computational capabilities. Engaging the geoscience community in EarthCube's development is crucial to its success, and EarthCube is providing several opportunities for geoscience involvement. This presentation will provide an overview of the activities EarthCube is employing to engage the community in the development process, from governance development and strategic planning to technical needs gathering. Particular focus will be given to the collection of science-driven use cases as a means of capturing scientific and technical requirements. Such activities inform the development of key technical and computational components that collectively will form a cyberinfrastructure to meet the research needs of the geoscience community.

  1. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  2. Reconsidering Simulations in Science Education at a Distance: Features of Effective Use

    ERIC Educational Resources Information Center

    Blake, C.; Scanlon, E.

    2007-01-01

    This paper proposes a reconsideration of use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first one, "The Driven Pendulum" simulation is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve" is…

  3. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
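
    A sketch of the kind of booked-versus-used efficiency metric the authors describe (the field names and numbers below are invented; the paper's own databases and definitions differ):

      def job_efficiency(job):
          """Fraction of the booked core-hours a job actually used."""
          booked = job["cores_booked"] * job["wall_hours"]
          return job["cpu_hours_used"] / booked if booked else 0.0

      jobs = [
          {"cores_booked": 16, "wall_hours": 10.0, "cpu_hours_used": 40.0},  # 25%
          {"cores_booked": 4, "wall_hours": 2.0, "cpu_hours_used": 7.6},     # 95%
      ]
      mean_efficiency = sum(job_efficiency(j) for j in jobs) / len(jobs)
      print(f"mean booked-resource efficiency: {mean_efficiency:.0%}")  # 60%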

  4. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Background: Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results: The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions: Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  5. The role of physicality in rich programming environments

    NASA Astrophysics Data System (ADS)

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-12-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.

  6. [Insert Your Science Here] Week: Creating science-driven public awareness campaigns

    NASA Astrophysics Data System (ADS)

    Mattson, Barbara; Mitchell, Sara; McElvery, Raleigh; Reddy, Francis; Wiessinger, Scott; Skelly, Clare; Saravia, Claire; Straughn, Amber N.; Washington, Dewayne

    2018-01-01

    NASA Goddard’s in-house Astrophysics Communications Team is responsible for facilitating the production of traditional and social media products to provide understanding and inspiration about NASA’s astrophysics missions and discoveries. Our team is largely driven by the scientific news cycle of launches, mission milestones, anniversaries, and discoveries, which can leave a number of topics behind, waiting for a discovery to be highlighted. These overlooked topics include compelling stories about ongoing research, underlying science, and science not tied to a specific mission. In looking for a way to boost coverage of these unsung topics, we struck upon the idea of creating “theme weeks,” which bring together the broader scientific community around a topic, object, or scientific concept. This poster will present the first two of our Goddard-led theme weeks: Pulsar Week and Dark Energy Week. We will describe the efforts involved, our metrics, and the benefits and challenges we encountered. We will also suggest a template for doing this for your own science based on our successes.

  7. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new computing technologies and methodologies. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
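
    An illustrative cloud-bursting policy in the spirit of the paper (this is not GlideinWMS code; the real system provisions pilot jobs that join an overlay HTCondor pool, and all names below are invented):

      def pilots_to_provision(idle_jobs, grid_slots_free, cloud_quota,
                              cloud_pilots_running):
          """Burst to the cloud only for demand the grid cannot absorb."""
          unmet_demand = max(0, idle_jobs - grid_slots_free)
          headroom = max(0, cloud_quota - cloud_pilots_running)
          return min(unmet_demand, headroom)

      print(pilots_to_provision(idle_jobs=500, grid_slots_free=120,
                                cloud_quota=300, cloud_pilots_running=50))  # 250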

  8. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new computing technologies and methodologies. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  9. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  10. An Overview of NASA's Intelligent Systems Program

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA and the Computer Science Research community are poised to enter a critical era, an era in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place Computer Science in a relatively novel position. These forces impact how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that result in greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.

  11. Teaching Computer Science Courses in Distance Learning

    ERIC Educational Resources Information Center

    Huan, Xiaoli; Shehane, Ronald; Ali, Adel

    2011-01-01

    As the success of distance learning (DL) has driven universities to increase the courses offered online, certain challenges arise when teaching computer science (CS) courses to students who are not physically co-located and have individual learning schedules. Teaching CS courses involves high level demonstrations and interactivity between the…

  12. The Leadership Lab for Women: Advancing and Retaining Women in STEM through Professional Development.

    PubMed

    Van Oosten, Ellen B; Buse, Kathleen; Bilimoria, Diana

    2017-01-01

    Innovative professional development approaches are needed to address the ongoing lack of women leaders in science, technology, engineering, and math (STEM) careers. Developed from the research on women who persist in engineering and computing professions and essential elements of women's leadership development, the Leadership Lab for Women in STEM Program was launched in 2014. The Leadership Lab was created as a research-based leadership development program, offering 360-degree feedback, coaching, and practical strategies aimed at increasing the advancement and retention of women in the STEM professions. The goal is to provide women with knowledge, tools and a supportive learning environment to help them navigate, achieve, flourish, and catalyze organizational change in male-dominated and technology-driven organizations. This article describes the importance of creating unique development experiences for women in STEM fields, the genesis of the Leadership Lab, the design and content of the program, and the outcomes for the participants.

  13. Science Fun: Hands-On Science with Dr. Zed.

    ERIC Educational Resources Information Center

    Penrose, Gordon

    This book presents 65 simple, safe, and intriguing hands-on science activities. In doing these simple experiments, children can make a variety of discoveries that will surprise them. It includes many activities from discovering how people see color and what makes people's hair stand on end, to creating a tornado in a jar or a propeller-driven boat…

  14. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  15. Teaching of Computer Science Topics Using Meta-Programming-Based GLOs and LEGO Robots

    ERIC Educational Resources Information Center

    Štuikys, Vytautas; Burbaite, Renata; Damaševicius, Robertas

    2013-01-01

    The paper's contribution is a methodology that integrates two educational technologies (GLO and LEGO robot) to teach Computer Science (CS) topics at the school level. We present the methodology as a framework of 5 components (pedagogical activities, technology-driven processes, tools, knowledge transfer actors, and pedagogical outcomes) and…

  16. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    ERIC Educational Resources Information Center

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  17. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  18. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  19. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
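
    As a flavor of what such static analysis looks like (a toy check only; this is not Hairball's actual plugin API, and the project representation below is invented), one can walk a Scratch-like block structure and flag scripts that no event block can ever start:

      project = {
          "sprites": [
              {"name": "Cat", "scripts": [
                  ["whenGreenFlag", "forever", "moveSteps"],
                  ["say"],  # script with no triggering event block
              ]},
          ],
      }

      def dead_scripts(project, event_blocks=("whenGreenFlag", "whenKeyPressed")):
          """Report scripts that can never run because no event starts them."""
          findings = []
          for sprite in project["sprites"]:
              for script in sprite["scripts"]:
                  if script and script[0] not in event_blocks:
                      findings.append((sprite["name"], script[0]))
          return findings

      print(dead_scripts(project))  # [('Cat', 'say')]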

  20. Multimedia Madness: Creating with a Purpose

    ERIC Educational Resources Information Center

    Bodley, Barb; Bremer, Janet

    2004-01-01

    High school students working in a project-driven environment create "projects with a purpose" that give younger students technology-based activities to help them practice skills in reading, math, spelling and science. An elective semester-long course using the Macromedia suite of programs with the objective of learning the software skills of…

  1. Campus Cyberinfrastructure: A Crucial Enabler for Science

    ERIC Educational Resources Information Center

    Freeman, Peter A.; Almes, Guy T.

    2005-01-01

    Driven by the needs of college/university researchers and guided by a blue-ribbon advisory panel chaired by Daniel E. Atkins, the National Science Foundation (NSF) has initiated a broad, multi-directorate activity to create modern cyberinfrastructure and to apply it to transforming the effectiveness of the scientific research enterprise in higher…

  2. Fermilab Friends for Science Education | Support Us

    Science.gov Websites

    Our society and economy are driven by scientific and technological innovations. We want a strong future and must support our future scientists and their teachers now. We need a scientifically literate and aware society. Fermilab Friends for Science Education helps create new, innovative science education programs and make the best use of unique Fermilab resources.

  3. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
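
    A minimal sketch of the customer-driven, SLA-oriented allocation idea in item (3) above (the offers and policy are invented for illustration; this is not Cloudbus or Aneka code):

      offers = [
          {"provider": "A", "price_per_hour": 0.12, "hours_to_finish": 10},
          {"provider": "B", "price_per_hour": 0.30, "hours_to_finish": 3},
      ]

      def cheapest_within_deadline(offers, deadline_hours):
          """Pick the lowest-cost offer that still meets the SLA deadline."""
          feasible = [o for o in offers if o["hours_to_finish"] <= deadline_hours]
          return min(feasible,
                     key=lambda o: o["price_per_hour"] * o["hours_to_finish"],
                     default=None)

      print(cheapest_within_deadline(offers, deadline_hours=5))  # provider B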

  4. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  5. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  6. Creating a Pipeline for African American Computing Science Faculty: An Innovative Faculty/Research Mentoring Program Model

    ERIC Educational Resources Information Center

    Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.

    2014-01-01

    African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…

  7. Cognitive Correlates of Performance in Algorithms in a Computer Science Course for High School

    ERIC Educational Resources Information Center

    Avancena, Aimee Theresa; Nishihara, Akinori

    2014-01-01

    Computer science for high school faces many challenging issues. One of these is whether the students possess the appropriate cognitive ability for learning the fundamentals of computer science. Online tests were created based on known cognitive factors and fundamental algorithms and were implemented among the second grade students in the…

  8. Correlation Educational Model in Primary Education Curriculum of Mathematics and Computer Science

    ERIC Educational Resources Information Center

    Macinko Kovac, Maja; Eret, Lidija

    2012-01-01

This article gives insight into a methodical correlation model of teaching mathematics and computer science. The model shows the way in which the related areas of computer science and mathematics can be supplemented if it transforms the way of teaching and creates "joint" lessons. Various didactic materials are designed, in which all…

  9. Ushering in a New Frontier in Geospace Through Data Science

    NASA Astrophysics Data System (ADS)

    McGranaghan, Ryan M.; Bhatt, Asti; Matsuo, Tomoko; Mannucci, Anthony J.; Semeter, Joshua L.; Datta-Barua, Seebany

    2017-12-01

    Our understanding and specification of solar-terrestrial interactions benefit from taking advantage of comprehensive data-intensive approaches. These data-driven methods are taking on new importance in light of the shifting data landscape of the geospace system, which extends from the near Earth space environment, through the magnetosphere and interplanetary space, to the Sun. The space physics community faces both an exciting opportunity and an important imperative to create a new frontier built at the intersection of traditional approaches and state-of-the-art data-driven sciences and technologies. This brief commentary addresses the current paradigm of geospace science and the emerging need for data science innovation, discusses the meaning of data science in the context of geospace, and highlights community efforts to respond to the changing landscape.

  10. Metadata-driven Clinical Data Loading into i2b2 for Clinical and Translational Science Institutes.

    PubMed

    Post, Andrew R; Pai, Akshatha K; Willard, Richard; May, Bradley J; West, Andrew C; Agravat, Sanjay; Granite, Stephen J; Winslow, Raimond L; Stephens, David S

    2016-01-01

Clinical and Translational Science Award (CTSA) recipients have a need to create research data marts from their clinical data warehouses, through research data networks and the use of i2b2 and SHRINE technologies. These data marts may have different data requirements and representations, thus necessitating separate extract, transform and load (ETL) processes for populating each mart. Maintaining duplicative procedural logic for each ETL process is onerous. We have created an entirely metadata-driven ETL process that can be customized for different data marts through separate configurations, each stored in an extension of i2b2's ontology database schema. We extended our previously reported and open source Eureka! Clinical Analytics software with this capability. The same software has created i2b2 data marts for several projects, the largest being the nascent Accrual for Clinical Trials (ACT) network, for which it has loaded over 147 million facts about 1.2 million patients.
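
    The record describes the approach only at a high level. As a hedged illustration of what "metadata-driven" means in practice, the sketch below moves the field mapping into configuration data, so a new data mart needs a new configuration rather than new procedural code. All field and transform names here are hypothetical; they are not Eureka! Clinical Analytics' or i2b2's actual schema.

    ```python
    # Minimal sketch of a metadata-driven ETL step (hypothetical field names).
    # The mapping that drives the transform lives in configuration data,
    # not in procedural code.

    SOURCE_ROWS = [
        {"mrn": "12345", "dx_code": "I10", "dx_date": "2015-03-02"},
    ]

    # Per-mart configuration: source field -> (target column, transform name)
    MART_CONFIG = {
        "mrn": ("patient_num", "hash_id"),
        "dx_code": ("concept_cd", "prefix_icd10"),
        "dx_date": ("start_date", "identity"),
    }

    TRANSFORMS = {
        "identity": lambda v: v,
        "hash_id": lambda v: str(abs(hash(v)) % 10**8),  # stand-in de-identification
        "prefix_icd10": lambda v: f"ICD10:{v}",
    }

    def load_row(row, config):
        """Apply the configured transform to each mapped source field."""
        out = {}
        for src_field, (target_col, transform_name) in config.items():
            out[target_col] = TRANSFORMS[transform_name](row[src_field])
        return out

    for row in SOURCE_ROWS:
        print(load_row(row, MART_CONFIG))
    ```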

  11. Metadata-driven Clinical Data Loading into i2b2 for Clinical and Translational Science Institutes

    PubMed Central

    Post, Andrew R.; Pai, Akshatha K.; Willard, Richard; May, Bradley J.; West, Andrew C.; Agravat, Sanjay; Granite, Stephen J.; Winslow, Raimond L.; Stephens, David S.

    2016-01-01

Clinical and Translational Science Award (CTSA) recipients have a need to create research data marts from their clinical data warehouses, through research data networks and the use of i2b2 and SHRINE technologies. These data marts may have different data requirements and representations, thus necessitating separate extract, transform and load (ETL) processes for populating each mart. Maintaining duplicative procedural logic for each ETL process is onerous. We have created an entirely metadata-driven ETL process that can be customized for different data marts through separate configurations, each stored in an extension of i2b2's ontology database schema. We extended our previously reported and open source Eureka! Clinical Analytics software with this capability. The same software has created i2b2 data marts for several projects, the largest being the nascent Accrual for Clinical Trials (ACT) network, for which it has loaded over 147 million facts about 1.2 million patients. PMID:27570667

  12. A pedagogical example of second-order arithmetic sequences applied to the construction of computer passwords by upper elementary grade students

    NASA Astrophysics Data System (ADS)

    Coggins, Porter E.

    2015-04-01

The purpose of this paper is (1) to present how general education elementary school age students constructed computer passwords using digital root sums and second-order arithmetic sequences, (2) to argue that computer password construction can be used as an engaging introduction to generate interest in elementary school students to study mathematics related to computer science, and (3) to share additional mathematical ideas accessible to elementary school students that can be used to create computer passwords. This paper serves to fill a current gap in the literature regarding the integration of mathematical content accessible to upper elementary school students and aspects of computer science in general, and computer password construction in particular. In addition, the protocols presented here can serve as a hook to generate further interest in mathematics and computer science. Students learned to create a random-looking computer password by using biometric measurements of their shoe size, height, and age in months to create a second-order arithmetic sequence, then converting the resulting numbers into characters that became their computer passwords. This password protocol can be used to introduce students to good computer password habits that can serve as a foundation for a life-long awareness of data security. A refinement of the password protocol is also presented.
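
    The abstract outlines the protocol but not the arithmetic. The following is a minimal sketch under one plausible reading of it; the seed values, sequence length, and character alphabet are our choices, not the paper's. Biometric numbers seed a second-order arithmetic sequence (a sequence whose second differences are constant), and each term is mapped to a printable character.

    ```python
    # Hedged sketch of the kind of protocol described (parameter choices are
    # ours): seed a second-order arithmetic sequence with biometric numbers,
    # then map each term to a printable character.

    import string

    ALPHABET = string.ascii_letters + string.digits  # 62 characters

    def second_order_sequence(first_term, first_diff, second_diff, length):
        """Terms whose successive differences themselves grow arithmetically."""
        terms, term, diff = [], first_term, first_diff
        for _ in range(length):
            terms.append(term)
            term += diff
            diff += second_diff  # constant second difference
        return terms

    def password_from_biometrics(shoe_size, height_cm, age_months, length=10):
        terms = second_order_sequence(shoe_size, height_cm, age_months, length)
        return "".join(ALPHABET[t % len(ALPHABET)] for t in terms)

    print(password_from_biometrics(shoe_size=9, height_cm=142, age_months=132))
    ```

    As the paper itself frames it, the value is pedagogical: a password derived from guessable biometrics is not cryptographically strong, but the construction motivates both the number theory and the good-habits discussion.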

  13. Molecular mechanics and dynamics characterization of an in silico mutated protein: a stand-alone lab module or support activity for in vivo and in vitro analyses of targeted proteins.

    PubMed

    Chiang, Harry; Robinson, Lucy C; Brame, Cynthia J; Messina, Troy C

    2013-01-01

Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems. Computer simulations of molecular events can now be accomplished quickly and with standard computer technology. Also, simulation software is freely available for most computing platforms, and online support for the novice user is ample. We have therefore created a molecular dynamics laboratory module to enhance undergraduate student understanding of molecular events underlying organismal phenotype. This module builds on a previously described project in which students use site-directed mutagenesis to investigate functions of conserved sequence features in members of a eukaryotic protein kinase family. In this report, we detail the laboratory activities of an MD module that complements phenotypic outcomes by providing a hypothesis-driven and quantifiable measure of predicted structural changes caused by targeted mutations. We also present examples of analyses students may perform. These laboratory activities can be integrated with genetics or biochemistry experiments as described, but could also be used independently in any course that would benefit from a quantitative approach to protein structure-function relationships. Copyright © 2013 Wiley Periodicals, Inc.

  14. Accelerator on a Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    England, Joel

    2014-06-30

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  15. Accelerator on a Chip

    ScienceCinema

    England, Joel

    2018-01-16

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  16. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  17. Software for computing plant biomass—BIOPAK users guide.

    Treesearch

    Joseph E. Means; Heather A. Hansen; Greg J. Koerper; Paul B Alaback; Mark W. Klopsch

    1994-01-01

BIOPAK is a menu-driven package of computer programs for IBM-compatible personal computers that calculates the biomass, area, height, length, or volume of plant components (leaves, branches, stem, crown, and roots). The routines were written in FoxPro, Fortran, and C. BIOPAK was created to facilitate linking of a diverse array of vegetation datasets with the...
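
    BIOPAK's own equation library is not reproduced in the record, so the sketch below only illustrates the general kind of calculation such software automates: evaluating published allometric equations, commonly of the form ln(biomass) = a + b*ln(DBH), for each plant component. The coefficients here are invented for illustration and are not from BIOPAK.

    ```python
    # Illustrative sketch of component biomass from allometric equations of
    # the common form ln(biomass) = a + b*ln(DBH). Coefficients are made up.

    import math

    EQUATIONS = {
        "foliage": {"a": -3.5, "b": 1.9},
        "branch":  {"a": -4.1, "b": 2.3},
        "stem":    {"a": -2.2, "b": 2.5},
    }

    def component_biomass(dbh_cm, coeff):
        """Back-transform ln(biomass) = a + b*ln(DBH) to kilograms."""
        return math.exp(coeff["a"] + coeff["b"] * math.log(dbh_cm))

    dbh = 25.0  # diameter at breast height, cm
    for component, coeff in EQUATIONS.items():
        print(f"{component}: {component_biomass(dbh, coeff):.1f} kg")
    ```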

  18. Pedagogy for the Connected Science Classroom: Computer Supported Collaborative Science and the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Foley, Brian J.; Reveles, John M.

    2014-01-01

    The prevalence of computers in the classroom is compelling teachers to develop new instructional skills. This paper provides a theoretical perspective on an innovative pedagogical approach to science teaching that takes advantage of technology to create a connected classroom. In the connected classroom, students collaborate and share ideas in…

  19. Synthetic Biology: Knowledge Accessed by Everyone (Open Sources)

    ERIC Educational Resources Information Center

    Sánchez Reyes, Patricia Margarita

    2016-01-01

Using the principles of biology, along with engineering and the help of computers, scientists manage to copy DNA sequences from nature and use them to create new organisms. DNA is created through engineering and computer science, managing to create life inside a laboratory. We cannot dismiss the role that synthetic biology could lead in…

  20. Creating and Using a Computer Networking and Systems Administration Laboratory Built under Relaxed Financial Constraints

    ERIC Educational Resources Information Center

    Conlon, Michael P.; Mullins, Paul

    2011-01-01

    The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…

  1. The Effects of Gender on the Attitudes towards the Computer Assisted Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cam, Sefika Sumeyye; Yarar, Gokhan; Toraman, Cetin; Erdamar, Gurcu Koc

    2016-01-01

The idea that gender creates a difference in computer usage and computer-assisted instruction dates back many years. At that time, it was thought that some areas like engineering, science and mathematics were for males, so this created a difference in computer usage. Nevertheless, developing technology and females becoming more…

  2. Preservice Science Teachers' Perceptions of Their TPACK Development after Creating Digital Stories

    ERIC Educational Resources Information Center

    Sancar-Tokmak, Hatice; Surmeli, Hikmet; Ozgelen, Sinan

    2014-01-01

    The aim of this case study was to examine pre-service science teachers' (PSTs) perceptions of their Technological Pedagogical Content Knowledge (TPACK) development after creating digital stories based on science topics drawn from the national curriculum. A total of 21 PSTs enrolled in Introduction to Computers II participated in the study. Data…

  3. The University of Washington Mobile Planetarium: A Do-it-yourself Guide

    NASA Astrophysics Data System (ADS)

    Rosenfield, P.; Gaily, J.; Fraser, O.; Wisniewski, J.

    2014-07-01

The University of Washington mobile planetarium project is a student-driven effort to bring astronomy to secondary schools, and the community, in Seattle, USA. This paper presents the solution that was designed and built in order to use the WorldWide Telescope — a computer program created by Microsoft that displays the astronomical sky as maps, the 3D Universe, and earth science data — from a laptop and an off-the-shelf high-definition (HD) projector located in an inflatable planetarium. In the first six months of operation, undergraduates at the University of Washington presented planetarium shows to over 1500 people, and 150 secondary school students created and presented their own astronomy projects in our dome, at their school. This paper aims to share the technical aspects of the project so that others can replicate the model or adapt it to their needs. This project was made possible thanks to a NASA/ESA Hubble Space Telescope education/public outreach grant.

  4. The Leadership Lab for Women: Advancing and Retaining Women in STEM through Professional Development

    PubMed Central

    Van Oosten, Ellen B.; Buse, Kathleen; Bilimoria, Diana

    2017-01-01

    Innovative professional development approaches are needed to address the ongoing lack of women leaders in science, technology, engineering, and math (STEM) careers. Developed from the research on women who persist in engineering and computing professions and essential elements of women’s leadership development, the Leadership Lab for Women in STEM Program was launched in 2014. The Leadership Lab was created as a research-based leadership development program, offering 360-degree feedback, coaching, and practical strategies aimed at increasing the advancement and retention of women in the STEM professions. The goal is to provide women with knowledge, tools and a supportive learning environment to help them navigate, achieve, flourish, and catalyze organizational change in male-dominated and technology-driven organizations. This article describes the importance of creating unique development experiences for women in STEM fields, the genesis of the Leadership Lab, the design and content of the program, and the outcomes for the participants. PMID:29326618

  5. Indirection and computer security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Michael J.

    2011-09-01

The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
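
    To make the report's thesis concrete, here is a toy illustration of our own (not taken from the report): a dispatch table is a classic layer of indirection, and the same table that makes behavior pluggable is also a single point an attacker can subvert.

    ```python
    # Our own toy illustration of the indirection/vulnerability relationship:
    # a writable dispatch table solves one problem (pluggable behavior) and
    # creates another (the table itself becomes an attack target).

    HANDLERS = {"greet": lambda name: f"Hello, {name}"}

    def handle(action, payload):
        return HANDLERS[action](payload)  # indirection: behavior chosen at runtime

    print(handle("greet", "world"))

    # Whoever can write to the table subverts every caller at once, without
    # touching any calling code; the vulnerability lives in the indirection.
    HANDLERS["greet"] = lambda name: "pwned"
    print(handle("greet", "world"))
    ```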

  6. Master Plan: The Introduction of Computer Science and Computer Related Instructional Programs, 1982-1985. Office of Instruction Publication Report No. 82-07.

    ERIC Educational Resources Information Center

    Veley, Victor F.; And Others

    This report presents a master plan for the development of computer science and computer-related programs at Los Angeles Trade-Technical College for 1982 through 1985. Introductory material outlines the main elements of the plan: to analyze existing computer courses, to create new courses in Laser Technology, Genetic Engineering, and Robotics; and…

  7. Architecture of a Message-Driven Processor,

    DTIC Science & Technology

    1987-11-01

Jon Kaplan, Paul Song, Brian Totty, and Scott Wills; Artificial Intelligence Laboratory and Laboratory for Computer Science, Massachusetts Institute of... Dally, Chao, Chien, Hassoun, Horwat, Kaplan, Song, Totty & Wills: Artificial Intelligence Laboratory and Laboratory for Computer Science, MIT... applied to a problem if we could efficiently run programs with a granularity of 5s... are 36 bits long (32 data bits + 4 tag bits) and are used to hold...

  8. Demystifying computer science for molecular ecologists.

    PubMed

    Belcaid, Mahdi; Toonen, Robert J

    2015-06-01

    In this age of data-driven science and high-throughput biology, computational thinking is becoming an increasingly important skill for tackling both new and long-standing biological questions. However, despite its obvious importance and conspicuous integration into many areas of biology, computer science is still viewed as an obscure field that has, thus far, permeated into only a few of the biology curricula across the nation. A national survey has shown that lack of computational literacy in environmental sciences is the norm rather than the exception [Valle & Berdanier (2012) Bulletin of the Ecological Society of America, 93, 373-389]. In this article, we seek to introduce a few important concepts in computer science with the aim of providing a context-specific introduction aimed at research biologists. Our goal was to help biologists understand some of the most important mainstream computational concepts to better appreciate bioinformatics methods and trade-offs that are not obvious to the uninitiated. © 2015 John Wiley & Sons Ltd.
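
    As one concrete instance of the kind of mainstream concept such an article introduces (the example is ours, not the authors'), the time/memory trade-off between scanning a list and probing a hash set is the reason sequence-search tools index their data before querying it.

    ```python
    # Our example of a core computational trade-off: O(n) list membership
    # versus O(1) average hash-set membership, paid for with extra memory.

    import timeit

    sequences = [f"seq{i}" for i in range(100_000)]
    as_list = list(sequences)   # linear scan on every lookup
    as_set = set(sequences)     # hashed index, faster lookups, more memory

    print(timeit.timeit(lambda: "seq99999" in as_list, number=100))
    print(timeit.timeit(lambda: "seq99999" in as_set, number=100))
    ```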

  9. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous.
These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  10. Ignoring a Revolution in Military Affairs: The Need to Create a Separate Branch of the Armed Forces for Cyber Warfare

    DTIC Science & Technology

    2017-06-09

those with talent in the computer sciences. Upon graduation from high school, computer-proficient teenagers are selected for an elite cyber force and... Arguably, the Massachusetts Institute of Technology (M.I.T.) is the premier institution for computer science. M.I.T. graduates make, on average, $83,455... study specific to computer science and provide certification in programs like ethical hacking, cyber security, and programming. As with the other

  11. Recommendations for open data science.

    PubMed

    Gymrek, Melissa; Farjoun, Yossi

    2016-01-01

    Life science research increasingly relies on large-scale computational analyses. However, the code and data used for these analyses are often lacking in publications. To maximize scientific impact, reproducibility, and reuse, it is crucial that these resources are made publicly available and are fully transparent. We provide recommendations for improving the openness of data-driven studies in life sciences.

  12. Designing for Deeper Learning in a Blended Computer Science Course for Middle School Students

    ERIC Educational Resources Information Center

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-01-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course…

  13. A Data-Driven Framework for Rapid Modeling of Wireless Communication Channels

    DTIC Science & Technology

    2013-12-01

Mathias Kolsch, Committee Chair, Associate Professor of Computer Science; Joel Young, Assistant Professor of Computer Science; Timothy Chung; John J. Leonard... Figure 7.8: RSS measurements (relative to S2 buoy) partitioned into 4 groupings annotated by the red, green, blue and magenta... distribution of this random variable. Suppose it was possible to take additional measurements at other locations (x_j | x_j ≠ x_i). In order to do

  14. Toward using games to teach fundamental computer science concepts

    NASA Astrophysics Data System (ADS)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.
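
    The games teach the concepts rather than their implementation, but a minimal code rendering of the two target concepts may help the reader; this sketch is ours and is not drawn from the games themselves.

    ```python
    # Our own minimal rendering of the two concepts the games target:
    # evaluating a boolean expression and tracing a recursion.

    def is_even(n):
        """Recursion illustrated: n is even iff n-1 is not."""
        if n == 0:
            return True            # base case
        return not is_even(n - 1)  # recursive case

    raining, have_umbrella = True, False
    stay_dry = (not raining) or have_umbrella  # boolean expression with or/not
    print(stay_dry)    # False
    print(is_even(6))  # True
    ```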

  15. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  16. Machine learning: Trends, perspectives, and prospects.

    PubMed

    Jordan, M I; Mitchell, T M

    2015-07-17

    Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today's most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing. Copyright © 2015, American Association for the Advancement of Science.
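
    A minimal sketch of the opening definition, "improving automatically through experience", using scikit-learn (assumed installed); the toy data and feature meanings are invented for illustration.

    ```python
    # Fit a classifier on labeled examples, then predict an unseen case.

    from sklearn.linear_model import LogisticRegression

    # Toy experience: [hours studied, hours slept] -> passed exam (1) or not (0)
    X = [[2, 4], [3, 5], [8, 7], [9, 8], [1, 3], [7, 6]]
    y = [0, 0, 1, 1, 0, 1]

    model = LogisticRegression()
    model.fit(X, y)                 # learning = estimating parameters from data
    print(model.predict([[6, 7]]))  # generalize to a new example, likely [1]
    ```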

  17. Galaxy Makers Exhibition: Re-engagement, Evaluation and Content Legacy through an Online Component

    NASA Astrophysics Data System (ADS)

    Borrow, J.; Harrison, C.

    2017-09-01

    For the Royal Society Summer Science Exhibition 2016, Durham University's Institute of Computational Cosmology created the Galaxy Makers exhibit to communicate our computational cosmology and astronomy research. In addition to the physical exhibit we created an online component to foster re-engagement, create a permanent home for our content and allow us to collect important information about participation and impact. Here we summarise the details of the exhibit and the degree of success attached to the online component. We also share suggestions for further uses and improvements that could be implemented for the online components of other science exhibitions.

  18. A Year in the Life: Two Seventh Grade Teachers Implement One-to-One Computing

    ERIC Educational Resources Information Center

    Garthwait, Abigail; Weller, Herman G.

    2005-01-01

    Maine was the first state to put laptops in the hands of an entire grade of students. This interpretive case study of two middle school science-math teachers was driven by the general question: Given ubiquitous computing, how do teachers use computers in constructing curriculum and delivering instruction? Specifically, the researchers sought to…

  19. A Cyber Enabled Collaborative Environment for Creating, Sharing and Using Data and Modeling Driven Curriculum Modules for Hydrology Education

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Fox, S.; Iverson, E. A. R.

    2014-12-01

With the access to emerging datasets and computational tools, there is a need to bring these capabilities into hydrology classrooms. However, developing curriculum modules using data and models to augment classroom teaching is hindered by a steep technology learning curve, rapid technology turnover, and lack of an organized community cyberinfrastructure (CI) for the dissemination, publication, and sharing of the latest tools and curriculum material for hydrology and geoscience education. The objective of this project is to overcome some of these limitations by developing a cyber enabled collaborative environment for publishing, sharing and adoption of data and modeling driven curriculum modules in hydrology and geosciences classrooms. The CI is based on Carleton College's Science Education Resource Center (SERC) Content Management System. Building on its existing community authoring capabilities, the system is being extended to allow assembly of new teaching activities by drawing on a collection of interchangeable building blocks, each of which represents a step in the modeling process. Currently the system hosts more than 30 modules or steps, which can be combined to create multiple learning units. Two specific units, Unit Hydrograph and Rational Method, have been used in undergraduate hydrology classrooms at Purdue University and Arizona State University. The structure of the CI and the lessons learned from its implementation, including preliminary results from student assessments of learning, will be presented.

  20. A History of the Liberal Arts Computer Science Consortium and Its Model Curricula

    ERIC Educational Resources Information Center

    Bruce, Kim B.; Cupper, Robert D.; Scot Drysdale, Robert L.

    2010-01-01

    With the support of a grant from the Sloan Foundation, nine computer scientists from liberal arts colleges came together in October, 1984 to form the Liberal Arts Computer Science Consortium (LACS) and to create a model curriculum appropriate for liberal arts colleges. Over the years the membership has grown and changed, but the focus has remained…

  1. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  2. Materials Knowledge Systems in Python - A Data Science Framework for Accelerated Development of Hierarchical Materials.

    PubMed

    Brough, David B; Wheeler, Daniel; Kalidindi, Surya R

    2017-03-01

There is a critical need for customized analytics that take into account the stochastic nature of the internal structure of materials at multiple length scales in order to extract relevant and transferable knowledge. Data-driven Process-Structure-Property (PSP) linkages provide a systemic, modular and hierarchical framework for community-driven curation of materials knowledge, and its transference to design and manufacturing experts. The Materials Knowledge Systems in Python project (PyMKS) is the first open source materials data science framework that can be used to create high value PSP linkages for hierarchical materials that can be leveraged by experts in materials science and engineering, manufacturing, machine learning and data science communities. This paper describes the main functions available from this repository, along with illustrations of how these can be accessed, utilized, and potentially further refined by the broader community of researchers.
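
    The PyMKS API itself is not shown in the record, so rather than guess at its function names, the sketch below illustrates one generic building block that structure-quantification frameworks of this kind rest on: two-point statistics of a two-phase microstructure computed with FFTs (numpy assumed installed).

    ```python
    # Not the PyMKS API: a generic sketch of periodic two-point statistics,
    # the autocorrelation of a two-phase microstructure, via FFTs.

    import numpy as np

    rng = np.random.default_rng(0)
    microstructure = (rng.random((64, 64)) > 0.5).astype(float)  # phase indicator

    f = np.fft.fft2(microstructure)
    autocorr = np.fft.ifft2(f * np.conj(f)).real / microstructure.size

    # The zero-offset statistic recovers the volume fraction of the phase (~0.5).
    print(autocorr[0, 0])
    ```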

  3. Materials Knowledge Systems in Python - A Data Science Framework for Accelerated Development of Hierarchical Materials

    PubMed Central

    Brough, David B; Wheeler, Daniel; Kalidindi, Surya R.

    2017-01-01

There is a critical need for customized analytics that take into account the stochastic nature of the internal structure of materials at multiple length scales in order to extract relevant and transferable knowledge. Data-driven Process-Structure-Property (PSP) linkages provide a systemic, modular and hierarchical framework for community-driven curation of materials knowledge, and its transference to design and manufacturing experts. The Materials Knowledge Systems in Python project (PyMKS) is the first open source materials data science framework that can be used to create high value PSP linkages for hierarchical materials that can be leveraged by experts in materials science and engineering, manufacturing, machine learning and data science communities. This paper describes the main functions available from this repository, along with illustrations of how these can be accessed, utilized, and potentially further refined by the broader community of researchers. PMID:28690971

  4. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  5. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
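
    pFUnit is a Fortran framework; to keep this document's examples in one language, here is the same fine-grained testing style sketched in Python with pytest (assumed installed): a small, single-purpose numerical routine checked against an analytic oracle, with a tolerance derived from the method's known error behavior rather than guessed.

    ```python
    # Fine-grained TDD style for numerical code: one routine, one oracle,
    # one explicit floating-point tolerance.

    import math
    import pytest

    def trapezoid(f, a, b, n):
        """Composite trapezoid rule on n subintervals."""
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return h * total

    def test_trapezoid_against_analytic_oracle():
        # Oracle: the integral of sin on [0, pi] is exactly 2.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        # Trapezoid error is O(h^2), roughly 3e-6 here, so 1e-5 is a safe bound.
        assert approx == pytest.approx(2.0, abs=1e-5)
    ```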

  6. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    ERIC Educational Resources Information Center

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  7. Bio and health informatics meets cloud: BioVLab as an example.

    PubMed

    Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun

    2013-01-01

The exponential increase of genomic data brought by the advent of the next or the third generation sequencing (NGS) technologies and the dramatic drop in sequencing cost have driven biological and medical sciences to data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of data transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources, which is a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing in handling the ever increasing biological data. As data increases in size, many research organizations start to experience the lack of computing power, which becomes a major hurdle in achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security and extensibility of features. We then discuss issues and limitations of current cloud systems and conclude with a suggestion of a biological cloud environment concept, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.

  8. NASA’s Universe of Learning: Providing a Direct Connection to NASA Science for Learners of all Ages with ViewSpace

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Rhue, Timothy; Smith, Denise A.; Squires, Gordon K.; Biferno, Anya A.; Lestition, Kathleen; Cominsky, Lynn R.; Godfrey, John; Lee, Janice C.; Manning, Colleen

    2018-06-01

    NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is the result of a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University, and is one of 27 competitively-selected cooperative agreements within the NASA Science Mission Directorate STEM Activation program. The NASA's Universe of Learning team draws upon cutting-edge science and works closely with Subject Matter Experts (scientists and engineers) from across the NASA Astrophysics Physics of the Cosmos, Cosmic Origins, and Exoplanet Exploration themes. As one example, NASA’s Universe of Learning program is uniquely able to provide informal learning venues with a direct connection to the science of NASA astrophysics via the ViewSpace platform. ViewSpace is a modular multimedia exhibit where people explore the latest discoveries in our quest to understand the universe. Hours of awe-inspiring video content connect users’ lives with an understanding of our planet and the wonders of the universe. This experience is rooted in informal learning, astronomy, and earth science. Scientists and educators are intimately involved in the production of ViewSpace material. ViewSpace engages visitors of varying backgrounds and experience at museums, science centers, planetariums, and libraries across the United States. In addition to creating content, the Universe of Learning team is updating the ViewSpace platform to provide for additional functionality, including the introduction of digital interactives to make ViewSpace a multi-modal learning experience. During this presentation we will share the ViewSpace platform, explain how Subject Matter Experts are critical in creating content for ViewSpace, and how we are addressing audience needs and using evaluation to support a dedicated user base across the country.

  9. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    EPA Science Inventory

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deploy...

  10. Data-driven discovery of new Dirac semimetal materials

    NASA Astrophysics Data System (ADS)

    Yan, Qimin; Chen, Ru; Neaton, Jeffrey

In recent years, a significant amount of materials property data from high-throughput computations based on density functional theory (DFT) and the application of database technologies have enabled the rise of data-driven materials discovery. In this work, we extend the data-driven materials discovery framework to the realm of topological semimetal materials to accelerate the discovery of novel Dirac semimetals. We implement currently available workflows and develop new ones to data-mine the Materials Project database for novel Dirac semimetals with desirable band structures and symmetry-protected topological properties. This data-driven effort relies on the successful development of several automatic data generation and analysis tools, including a workflow for the automatic identification of topological invariants and pattern recognition techniques to find specific features in a massive number of computed band structures. Utilizing this approach, we successfully identified more than 15 novel Dirac point and Dirac nodal line systems that have not been theoretically predicted or experimentally identified. This work is supported by the Materials Project Predictive Modeling Center through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231.
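
    The authors' actual workflows query the Materials Project database and analyze computed band structures; as a hedged stand-in, this sketch shows only the shape of the screening step, run over hypothetical local records (all field names and tolerances are ours, not the study's criteria).

    ```python
    # Hedged sketch of a band-structure screening step over invented records.

    candidates = [
        {"id": "mat-001", "band_gap_eV": 0.00, "band_crossing": True},
        {"id": "mat-002", "band_gap_eV": 1.10, "band_crossing": False},
        {"id": "mat-003", "band_gap_eV": 0.02, "band_crossing": True},
    ]

    def is_dirac_candidate(rec, gap_tol=0.05):
        """Keep near-zero-gap materials whose bands actually cross."""
        return rec["band_gap_eV"] <= gap_tol and rec["band_crossing"]

    hits = [rec["id"] for rec in candidates if is_dirac_candidate(rec)]
    print(hits)  # ['mat-001', 'mat-003']
    ```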

  11. Supercomputing meets seismology in earthquake exhibit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  12. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2018-02-14

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  13. System and method for embedding emotion in logic systems

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2012-01-01

A system, method, and computer-readable media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self-assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.
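
    The abstract gives no algorithmic detail, so the following is a loose, entirely invented illustration of the stated balancing idea: blend a choice-driven and a rule-driven subsystem and shift the balance until a toy stability measure clears a threshold. Nothing here reflects the patented SANE algorithm itself.

    ```python
    # Entirely our construction: balance two stand-in subsystems against a
    # stability threshold, echoing (not implementing) the described method.

    def intellectual(x):  # choice-driven stand-in
        return 0.9 * x

    def emotional(x):     # rule-driven stand-in
        return 1.5 if x > 0 else -1.5

    def stability(weight, probes):
        """Toy metric: smaller output swings across probe inputs = more stable."""
        outs = [weight * intellectual(x) + (1 - weight) * emotional(x)
                for x in probes]
        return max(outs) - min(outs)

    THRESHOLD, weight, probes = 2.0, 0.0, [-1.0, -0.5, 0.5, 1.0]
    while stability(weight, probes) > THRESHOLD and weight < 1.0:
        weight += 0.1  # shift balance toward the intellectual subsystem
    print(round(weight, 1), round(stability(weight, probes), 2))
    ```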

  14. Building a co-created citizen science program with gardeners neighboring a superfund site: The Gardenroots case study

    PubMed Central

    Ramirez-Andreotta, Monica D; Brusseau, Mark L; Artiola, Janick; Maier, Raina M; Gandolfi, A Jay

    2014-01-01

A research project that is only expert-driven may ignore the role of local knowledge in research, give low priority to the development of a comprehensive communication strategy to engage the community, and may not deliver the results of the study to the community in an effective way. Objective: To demonstrate how a research program can respond to a community research need, establish a community-academic partnership, and build a co-created citizen science program. Methods: A place-based, community-driven project was designed where academics and community members maintained a reciprocal dialogue, and together, we: 1) defined the question for study, 2) gathered information, 3) developed hypotheses, 4) designed data collection methodologies, 5) collected environmental samples (soil, irrigation water, and vegetables), 6) interpreted data, 7) disseminated results and translated results into action, and 8) discussed results and asked new questions. Results: The co-created environmental research project produced new data and addressed an additional exposure route (consumption of vegetables grown in soils with elevated arsenic levels). Public participation in scientific research improved environmental health assessment, information transfer, and risk communication efforts. Furthermore, incorporating the community in the scientific process produced both individual learning outcomes and community-level outcomes. Conclusions: This approach illustrates the benefits of a community-academic co-created citizen-science program in addressing the complex problems that arise in communities neighboring a contaminated site. Such a project can increase the community's involvement in risk communication and decision-making, which ultimately has the potential to help mitigate exposure and thereby reduce associated risk. PMID:25954473

  15. Building a co-created citizen science program with gardeners neighboring a superfund site: The Gardenroots case study.

    PubMed

    Ramirez-Andreotta, Monica D; Brusseau, Mark L; Artiola, Janick; Maier, Raina M; Gandolfi, A Jay

    2015-01-01

A research project that is only expert-driven may ignore the role of local knowledge in research, give low priority to the development of a comprehensive communication strategy to engage the community, and may not deliver the results of the study to the community in an effective way. Objective: To demonstrate how a research program can respond to a community research need, establish a community-academic partnership, and build a co-created citizen science program. Methods: A place-based, community-driven project was designed where academics and community members maintained a reciprocal dialogue, and together, we: 1) defined the question for study, 2) gathered information, 3) developed hypotheses, 4) designed data collection methodologies, 5) collected environmental samples (soil, irrigation water, and vegetables), 6) interpreted data, 7) disseminated results and translated results into action, and 8) discussed results and asked new questions. Results: The co-created environmental research project produced new data and addressed an additional exposure route (consumption of vegetables grown in soils with elevated arsenic levels). Public participation in scientific research improved environmental health assessment, information transfer, and risk communication efforts. Furthermore, incorporating the community in the scientific process produced both individual learning outcomes and community-level outcomes. Conclusions: This approach illustrates the benefits of a community-academic co-created citizen-science program in addressing the complex problems that arise in communities neighboring a contaminated site. Such a project can increase the community's involvement in risk communication and decision-making, which ultimately has the potential to help mitigate exposure and thereby reduce associated risk.

  16. Where Computer Science and Cultural Studies Collide

    ERIC Educational Resources Information Center

    Kirschenbaum, Matthew

    2009-01-01

    Most users have no more knowledge of what their computer or code is actually doing than most automobile owners have of their carburetor or catalytic converter. Nor is any such knowledge necessarily needed. But for academics, driven by an increasing emphasis on the materiality of new media--that is, the social, cultural, and economic factors…

  17. 'Tagger' - a Mac OS X Interactive Graphical Application for Data Inference and Analysis of N-Dimensional Datasets in the Natural Physical Sciences.

    NASA Astrophysics Data System (ADS)

    Morse, P. E.; Reading, A. M.; Lueg, C.

    2014-12-01

Pattern-recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of - and interaction with - data visualization software can augment, select, interrupt and modify computational routines and facilitate processes of pattern and significant feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates Human-Computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the Model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HID). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: 1] How to visually animate data over time, 2] How to rapidly deploy unconventional parametrically driven data visualisations, 3] How to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and of capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural, physical sciences. Fig. 1: Wave height data from an oceanographic Wave Rider Buoy; colors/radii are driven by wave height data.

  18. Building a Semantic Framework for eScience

    NASA Astrophysics Data System (ADS)

    Movva, S.; Ramachandran, R.; Maskey, M.; Li, X.

    2009-12-01

    The e-Science vision focuses on the use of advanced computing technologies to support scientists. Recent research efforts in this area have focused primarily on “enabling” use of infrastructure resources for both data and computational access, especially in the geosciences. One gap in existing e-Science efforts has been the failure to incorporate stable semantic technologies within the design process itself. In this presentation, we describe our effort in designing a framework for e-Science built using Service Oriented Architecture. Our framework provides users capabilities to create science workflows and mine distributed data. Our e-Science framework is being designed around a mass-market tool to promote reusability across many projects. Semantics is an integral part of this framework, and our design goal is to leverage the latest stable semantic technologies. These technologies provide users of our framework several useful features: allowing search engines to find their content via RDFa tags; creating an RDF triple data store for their content; creating RDF endpoints to share with others; and semantically mashing their content with other online content available as RDF endpoints.
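
    As a concrete picture of the RDF features listed above, the sketch below builds and serializes a tiny triple store with the Python rdflib library; the library choice and the example namespace are ours, not the framework's:

      # Build a small RDF graph and emit Turtle (rdflib 6+, where
      # serialize() returns a string when no destination is given).
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DC, RDF

      ESCI = Namespace("http://example.org/escience/")  # hypothetical vocabulary
      g = Graph()
      workflow = URIRef("http://example.org/escience/workflow/42")
      g.add((workflow, RDF.type, ESCI.ScienceWorkflow))
      g.add((workflow, DC.title, Literal("Distributed data-mining workflow")))
      print(g.serialize(format="turtle"))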

  19. Golfing with protons: using research grade simulation algorithms for online games

    NASA Astrophysics Data System (ADS)

    Harold, J.

    2004-12-01

    Scientists have long known the power of simulations. By modeling a system in a computer, researchers can experiment at will, developing an intuitive sense of how a system behaves. The rapid increase in the power of personal computers, combined with technologies such as Flash, Shockwave and Java, allows us to bring research simulations into the education world by creating exploratory environments for the public. This approach is illustrated by a project funded by a small grant from NSF's Informal Science Education program, through an opportunity that provides education supplements to existing research awards. Using techniques adapted from a magnetospheric research program, several Flash-based interactives have been developed that allow web site visitors to explore the motion of particles in the Earth's magnetosphere. These pieces were folded into a larger Space Weather Center web project at the Space Science Institute (www.spaceweathercenter.org). Rather than presenting these interactives as plasma simulations per se, the research algorithms were used to create games such as "Magneto Mini Golf", where the balls are protons moving in combined electric and magnetic fields. The "holes" increase in complexity, beginning with no fields and progressing towards a simple model of Earth's magnetosphere. The emphasis of the activity is gameplay, but because it is at its core a plasma simulation, users develop an intuitive sense of charged particle motion as they progress. Meanwhile, the pieces contain embedded assessments that are measurable through a database-driven tracking system. Mining that database not only provides helpful usability information, but allows us to examine whether users are meeting the learning goals of the activities. We will discuss the development and evaluation results of the project, as well as the potential for these types of activities to shift the expectations of what a web site can and should provide educationally.
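
    The physics core of such a game is compact. The sketch below is our Python reconstruction, not the project's Flash code: it advances a proton under the Lorentz force F = q(E + v x B) with a semi-implicit Euler step, the kind of update a "proton golf" hole would run each frame.

      # Minimal charged-particle stepper; field values are illustrative only.
      import numpy as np

      Q_OVER_M = 9.58e7                  # proton charge-to-mass ratio (C/kg)
      DT = 1e-9                          # time step (s)

      def step(pos, vel, E, B):
          """Advance position/velocity one step under the Lorentz force."""
          acc = Q_OVER_M * (E + np.cross(vel, B))
          vel = vel + acc * DT           # update velocity first (semi-implicit)
          pos = pos + vel * DT
          return pos, vel

      # A "hole" with a uniform magnetic field: the putted proton curves.
      pos = np.zeros(3)
      vel = np.array([1.0e5, 0.0, 0.0])  # the player's putt
      E = np.zeros(3)
      B = np.array([0.0, 0.0, 1.0e-2])
      for _ in range(1000):
          pos, vel = step(pos, vel, E, B)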

  20. Creating a Leadership Style

    ERIC Educational Resources Information Center

    Bonnici, Charles A.

    2011-01-01

    Many articles about school improvement talk about data-driven instruction and statistics. In the barrage of evaluative numbers, school leaders can forget that teaching and leading are arts, not sciences. Positive outcomes depend on the ambience of the school, which is a direct result of the leadership style of its principal and assistant…

  1. Intel ISEF 2008 Student Handbook

    ERIC Educational Resources Information Center

    Science Service, 2008

    2008-01-01

    Research is a process by which people discover or create new knowledge about the world in which they live. The International Science and Engineering Fair (ISEF) and Affiliated Fairs are research (data) driven. Students design research projects that provide quantitative data through experimentation followed by analysis and application of that data.…

  2. Do You Think You Can? The Influence of Student Self-Efficacy on the Effectiveness of Tutorial Dialogue for Computer Science

    ERIC Educational Resources Information Center

    Wiggins, Joseph B.; Grafsgaard, Joseph F.; Boyer, Kristy Elizabeth; Wiebe, Eric N.; Lester, James C.

    2017-01-01

    In recent years, significant advances have been made in intelligent tutoring systems, and these advances hold great promise for adaptively supporting computer science (CS) learning. In particular, tutorial dialogue systems that engage students in natural language dialogue can create rich, adaptive interactions. A promising approach to increasing…

  3. What Do Secondary Computer Science Teachers Need? Examining Curriculum, Pedagogy, and Contextual Support

    ERIC Educational Resources Information Center

    Sadik, Olgun

    2017-01-01

    The primary purpose of this study was to identify secondary computer science (CS) teachers' needs, related to knowledge, skills, and school setting, to create more effective CS education in the United States. In addition, this study examined how these needs change based on the participants' years of teaching experience as well as their background…

  4. Microscopic origin and macroscopic implications of lane formation in mixtures of oppositely-driven particles

    NASA Astrophysics Data System (ADS)

    Whitelam, Stephen

    Colloidal particles of two types, driven in opposite directions, can segregate into lanes. I will describe some results on this phenomenon obtained by simple physical arguments and computer simulations. Laning results from rectification of diffusion on the scale of a particle diameter: oppositely-driven particles must, in the time taken to encounter each other in the direction of the drive, diffuse in the perpendicular direction by about one particle diameter. This geometric constraint implies that the diffusion constant of a particle, in the presence of those of the opposite type, grows approximately linearly with Peclet number, a prediction confirmed by our numerics. Such environment-dependent diffusion is statistically similar to an effective interparticle attraction; consistent with this observation, we find that oppositely-driven colloids display features characteristic of the simplest model system possessing both interparticle attractions and persistent motion, the driven Ising lattice gas. Supported by the Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
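
    A toy Brownian-dynamics version of this setup already develops lanes. The sketch below is our own minimal stand-in (soft Gaussian repulsion, arbitrary parameters), not the simulation used in this work:

      # Two species driven along +x and -x in a periodic box; over time,
      # like-moving particles collect into rows of similar y (lanes).
      import numpy as np

      rng = np.random.default_rng(1)
      N, L, dt, steps = 100, 10.0, 0.01, 2000
      drive, kT = 5.0, 0.2
      pos = rng.uniform(0.0, L, size=(N, 2))
      vdrive = np.zeros((N, 2))
      vdrive[: N // 2, 0] = drive        # species A driven +x
      vdrive[N // 2 :, 0] = -drive       # species B driven -x

      for _ in range(steps):
          d = pos[:, None, :] - pos[None, :, :]
          d -= L * np.round(d / L)                   # minimum-image convention
          r2 = (d * d).sum(axis=-1)
          np.fill_diagonal(r2, np.inf)               # no self-interaction
          f = (d * np.exp(-2.0 * r2)[:, :, None]).sum(axis=1)  # soft repulsion
          noise = np.sqrt(2.0 * kT * dt) * rng.standard_normal((N, 2))
          pos = (pos + (vdrive + 20.0 * f) * dt + noise) % L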

  5. Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games

    ERIC Educational Resources Information Center

    Nilsson, Elisabet M.; Jakobsson, Anders

    2011-01-01

    The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…

  6. Computationally driven drug discovery meeting-3 - Verona (Italy): 4 - 6th of March 2014.

    PubMed

    Costantino, Gabriele

    2014-12-01

    The following article reports on the results and outcome of a meeting organised at the Aptuit Auditorium in Verona (Italy), which highlighted the current applications of state-of-the-art computational science to drug design in Italy. The meeting, which had more than 100 people in attendance, consisted of over 40 presentations and included keynote lectures given by world-renowned speakers. Topics spanned ligand- and structure-based design, library design and screening, and chemometrics. The meeting also stressed the importance of public-private collaboration and reviewed the different approaches to computationally driven drug discovery taken within academia and industry. The meeting helped define the current position of state-of-the-art computational drug discovery in Italy, pointing out criticalities and assets. Focused meetings of this kind are valuable because they give a restricted yet representative community of fellow professionals the opportunity to discuss current methodological approaches in depth and to lay out future perspectives for computationally driven drug discovery.

  7. Computers, Networks, and Desegregation at San Jose High Academy.

    ERIC Educational Resources Information Center

    Solomon, Gwen

    1987-01-01

    Describes magnet high school which was created in California to meet desegregation requirements and emphasizes computer technology. Highlights include local computer networks that connect science and music labs, the library/media center, business computer lab, writing lab, language arts skills lab, and social studies classrooms; software; teacher…

  8. EarthCube: Advancing Partnerships, Collaborative Platforms and Knowledge Networks in the Ocean Sciences

    NASA Astrophysics Data System (ADS)

    Diggs, Stephen; Lee, Allison

    2014-05-01

    The National Science Foundation's EarthCube initiative aims to create a community-driven data and knowledge management system that will allow for unprecedented data sharing across the geosciences. More than 2,500 individuals have participated through forums, work groups, EarthCube events, and virtual and in-person meetings, representing the core earth-system sciences of the solid Earth, atmosphere, oceans, and polar regions. EarthCube is a cornerstone of NSF's Cyberinfrastructure for the 21st Century (CIF21) initiative, whose chief objective is to develop a U.S. nationwide, sustainable, and community-based cyberinfrastructure for researchers and educators. Increasingly effective community-driven cyberinfrastructure allows global data discovery and knowledge management and achieves interoperability and data integration across scientific disciplines. There is growing convergence across scientific and technical communities on creating a networked knowledge management system and scientific data cyberinfrastructure that integrates Earth system and human dimensions data in an open, transparent, and inclusive manner. EarthCube does not intend to replicate these efforts, but to build upon them. An agile process is underway for the development and governance of EarthCube. The agile approach was deliberately selected for its iterative and incremental nature, which promotes adaptive planning and rapid, flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness.

  9. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models; iii) event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iv) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  10. NASA’s Universe of Learning: Engaging Subject Matter Experts to Support Museum Alliance Science Briefings

    NASA Astrophysics Data System (ADS)

    Marcucci, Emma; Slivinski, Carolyn; Lawton, Brandon L.; Smith, Denise A.; Squires, Gordon K.; Biferno, Anya A.; Lestition, Kathleen; Cominsky, Lynn R.; Lee, Janice C.; Rivera, Thalia; Walker, Allyson; Spisak, Marilyn

    2018-06-01

    NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University and is part of the NASA SMD Science Activation Collective. NASA's Universe of Learning projects draw on the expertise of subject matter experts (scientists and engineers) from across the broad range of NASA Astrophysics themes and missions. One such project, which draws strongly on the expertise of the community, is the NASA's Universe of Learning Science Briefings, run in collaboration with the NASA Museum Alliance. This collaboration presents a monthly hour-long discussion on relevant NASA astrophysics topics or events to an audience composed largely of educators from informal learning environments. These professional learning opportunities use experts and resources within the astronomical community to support increased interest and engagement of the informal learning community in NASA Astrophysics-related concepts and events. Briefings are designed to create a foundation for this audience using (1) broad science themes, (2) special events, or (3) breaking science news. The NASA's Universe of Learning team engages subject matter experts to speak and present their science at these briefings, providing a direct connection to NASA Astrophysics science and an opportunity for the audience to interact directly with scientists and engineers involved in NASA missions. To maximize the usefulness of the Museum Alliance Science Briefings, each briefing highlights resources related to the science theme to support informal educators in incorporating science content into their venues and/or interactions with the public. During this presentation, learn how you can help contribute to the NASA's Universe of Learning and take part in Science Briefings.

  11. Modular, Semantics-Based Composition of Biosimulation Models

    ERIC Educational Resources Information Center

    Neal, Maxwell Lewis

    2010-01-01

    Biosimulation models are valuable, versatile tools used for hypothesis generation and testing, codification of biological theory, education, and patient-specific modeling. Driven by recent advances in computational power and the accumulation of systems-level experimental data, modelers today are creating models with an unprecedented level of…

  12. From Desktop to Teraflop: Exploiting the U.S. Lead in High Performance Computing. NSF Blue Ribbon Panel on High Performance Computing.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…

  13. Computer Science (CS) Education in Indian Schools: Situation Analysis Using Darmstadt Model

    ERIC Educational Resources Information Center

    Raman, Raghu; Venkatasubramanian, Smrithi; Achuthan, Krishnashree; Nedungadi, Prema

    2015-01-01

    Computer science (CS) and its enabling technologies are at the heart of this information age, yet its adoption as a core subject by senior secondary students in Indian schools is low and has not reached critical mass. Though there have been efforts to create core curriculum standards for subjects like Physics, Chemistry, Biology, and Math, CS…

  14. Understanding and Improving Blind Students' Access to Visual Information in Computer Science Education

    ERIC Educational Resources Information Center

    Baker, Catherine M.

    2017-01-01

    Teaching people with disabilities tech skills empowers them to create solutions to problems they encounter and prepares them for careers. However, computer science is typically taught in a highly visual manner which can present barriers for people who are blind. The goal of this dissertation is to understand and decrease those barriers. The first…

  15. The Effects of Computer-Aided Concept Cartoons and Outdoor Science Activities on Light Pollution

    ERIC Educational Resources Information Center

    Aydin, Güliz

    2015-01-01

    The purpose of this study is to create awareness of light pollution among seventh-grade students via computer-aided concept cartoon applications and outdoor science activities, to help them develop solutions, and to determine student opinions on the practices carried out. The study was carried out at a middle school in Mugla province of Aegean…

  16. Data-driven Science in Geochemistry & Petrology: Vision & Reality

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Ghiorso, M. S.; Spear, F. S.

    2013-12-01

    Science in many fields is increasingly 'data-driven'. Though referred to as a 'new' Fourth Paradigm (Hey, 2009), data-driven science is not new, and examples are cited in the Geochemical Society's data policy, including the compilation of Dziewonski & Anderson (1981) that led to PREM, and Zindler & Hart (1986), who compiled mantle isotope data to present for the first time a comprehensive view of the Earth's mantle. Today, rapidly growing data volumes, ubiquity of data access, and new computational and information management technologies enable data-driven science at a radically advanced scale of speed, extent, flexibility, and inclusiveness, with the ability to seamlessly synthesize observations, experiments, theory, and computation, and to statistically mine data across disciplines, leading to more comprehensive, well informed, and high impact scientific advances. Are geochemists, petrologists, and volcanologists ready to participate in this revolution of the scientific process? In the past year, researchers from the VGP community and related disciplines have come together at several cyberinfrastructure-related workshops, in part prompted by the EarthCube initiative of the US NSF, to evaluate the status of cyberinfrastructure in their field, to put forth key scientific challenges, and to identify the primary data and software needs for addressing these. Science scenarios developed by workshop participants, ranging from non-equilibrium experiments focusing on mass transport, chemical reactions, and phase transformations (J. Hammer) to defining the abundance of elements and isotopes in every voxel in the Earth (W. McDonough), demonstrate the potential of cyberinfrastructure-enabled science and define the vision of how data access, visualization, analysis, computation, and cross-domain interoperability can and should support future research in VGP. The primary obstacle for data-driven science in VGP remains the dearth of accessible, integrated data from lab and sensor measurements, experiments, and models, both from past and present studies, and their poor discoverability, interoperability, and standardization. Other deficiencies include the lack of widespread sample curation and online sample catalogs, of broad community support and enforcement of open data sharing policies, and of a strategy for sustained funding and operation of the cyberinfrastructure. In order to achieve true data-driven science in geochemistry and petrology, one of the primary requirements is to change the way data and models are managed and shared, to dramatically improve their access and re-usability. Adoption of new data publication practices, new ways of citing data that ensure attribution and credit to authors, tools that help investigators seamlessly manage their data throughout the data life cycle, from the point of acquisition to upload to repositories, and population of databases with historical data are among the most urgent needs. The community, especially early career scientists, must work together to produce the cultural shift within the discipline toward sharing of data and knowledge, virtual collaboration, and social networking. References: Dziewonski, A. M., & Anderson, D. L.: Phys. Earth Planet. Inter. 25(4), 297 (1981). Hey, T., Tansley, S., & Tolle, K. (Eds.): The Fourth Paradigm. Redmond, WA: Microsoft Research (2009). Zindler, A., & Hart, S. R.: Ann. Rev. Earth Planet. Sci. 14, 493 (1986).

  17. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  18. Networking Cyberinfrastructure Resources to Support Global, Cross-disciplinary Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K.; Ramamurthy, M. K.

    2016-12-01

    Geosciences are globally connected by nature, and grand challenge problems like climate change, ocean circulation, seasonal prediction, and the impact of volcanic eruptions all transcend both disciplinary and geographic boundaries, requiring cross-disciplinary and international partnerships. Cross-disciplinary and international collaborations are also needed to unleash the power of cyber- (or e-) infrastructure (CI) by networking globally distributed, multi-disciplinary data, software, and computing resources to accelerate new scientific insights and discoveries. While the promises of a global and cross-disciplinary CI are exhilarating and real, a range of technical, organizational, and social challenges needs to be overcome in order to achieve alignment and linking of operational data systems, software tools, and computing facilities. New modes of collaboration require agreement on and governance of technical standards and best practices, and funding for necessary modifications. This presentation will contribute the perspective of domain-specific data facilities to the discussion of cross-disciplinary and international collaboration in CI development and deployment, in particular those of IEDA (Interdisciplinary Earth Data Alliance), serving the solid Earth sciences, and Unidata, serving the atmospheric sciences. Both facilities are closely involved with the US NSF EarthCube program that aims to network and augment existing Geoscience CI capabilities "to make disciplinary boundaries permeable, nurture and facilitate knowledge sharing, …, and enhance collaborative pursuit of cross-disciplinary research" (EarthCube Strategic Vision), while also collaborating internationally to network domain-specific and cross-disciplinary CI resources. These collaborations are driven by the substantial benefits to the science community, but they create challenges when operational and funding constraints need to be balanced with adjustments to new joint data curation practices and interoperability standards.

  19. Building a Co-Created Citizen Science Program with Community Members Neighboring a Hazardous Waste Site

    NASA Astrophysics Data System (ADS)

    Ramirez-Andreotta, M.; Brusseau, M. L. L.; Artiola, J. F.; Maier, R. M.; Gandolfi, A. J.

    2015-12-01

    A research project that is only expert-driven may ignore the role of local knowledge in research, often gives low priority to the development of a comprehensive strategy to engage the community, and may not deliver the results of the study to the community in an effective way. To date, only a limited number of co-created citizen science projects, in which community members are involved in most or all steps of the scientific process, have been initiated at contaminated sites, and even fewer in conjunction with risk communication. Gardenroots: The Dewey-Humboldt, AZ Garden Project was a place-based, co-created citizen science project where community members and researchers together: defined the question for study, developed hypotheses, collected environmental samples, disseminated results broadly, translated the results into action, and posed new research questions. This co-created environmental research project produced new data and addressed an additional exposure route (consumption of vegetables grown in soils with elevated arsenic levels) that was not being evaluated in the current site assessment. Furthermore, co-producing science led to both individual learning and social-ecological outcomes. This approach illustrates the benefits of a co-created citizen-science program in addressing the complex problems that arise in communities neighboring a hazardous waste site. Such a project increased the community's involvement in regional environmental assessment and decision-making, which has the potential to help mitigate environmental exposures and thereby reduce associated risks.

  20. Robots Bring Math-Powered Ideas to Life

    ERIC Educational Resources Information Center

    Allen, Kasi C.

    2013-01-01

    What if every middle school student learned to create a robot in math class? What if every middle school had a robotics team? Would students view mathematics differently? Would they have a different relationship with technology? Might they see science and engineering as fields driven by innovation rather than memorization? As students find…

  1. Understanding the spin-driven polarizations in BiMO3 (M = 3d transition metals) multiferroics

    NASA Astrophysics Data System (ADS)

    Kc, Santosh; Lee, Jun Hee; Cooper, Valentino R.

    Bismuth ferrite (BiFeO3), a promising multiferroic, stabilizes in a perovskite-type rhombohedral crystal structure (space group R3c) at room temperature. Recently, it has been reported that in its ground state it possesses a huge spin-driven polarization. To probe the underlying mechanism of this large spin-phonon response, we examine these couplings within other Bi-based 3d transition metal oxides BiMO3 (M = Ti, V, Cr, Mn, Fe, Co, Ni) using density functional theory. Our results demonstrate that this large spin-driven polarization is a consequence of symmetry breaking due to competition between ferroelectric distortions and anti-ferrodistortive octahedral rotations. Furthermore, we find a strong dependence of these enhanced spin-driven polarizations on the crystal structure, with the rhombohedral phase having the largest spin-induced atomic distortions along [111]. These results give significant insights into the magneto-electric coupling in these materials, which is essential to the magnetic- and electric-field control of electric polarization and magnetization in multiferroic-based devices. Research is supported by the US Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division and the Office of Science Early Career Research Program (V.R.C.), and used computational resources at NERSC.

  2. High Tech Programmers in Low-Income Communities: Creating a Computer Culture in a Community Technology Center

    NASA Astrophysics Data System (ADS)

    Kafai, Yasmin B.; Peppler, Kylie A.; Chiu, Grace M.

    For the last twenty years, issues of the digital divide have driven efforts around the world to address the lack of access to computers and the Internet, pertinent and language-appropriate content, and technical skills in low-income communities (Schuler & Day, 2004a, 2004b). The title of our paper makes reference to a milestone publication (Schon, Sanyal, & Mitchell, 1998) that showcased some of the early work and thinking in this area. Schon, Sanyal and Mitchell's edited volume included an article outlining the Computer Clubhouse, a community technology center model developed to create opportunities for youth in low-income communities to become creators and designers of technologies. The model has scaled up very successfully, with over 110 Computer Clubhouses now in existence worldwide.

  3. A 21st Century Science, Technology, and Innovation Strategy for Americas National Security

    DTIC Science & Technology

    2016-05-01

    ...areas. Advanced Computing and Communications: The exponential growth of the digital economy, driven by ubiquitous computing and communication... weapons-focused R&D, many of the capabilities being developed have significant dual-use potential. Digital connectivity, for instance, brings... scale than traditional recombinant DNA techniques, and to share these designs digitally. Nanotechnology promises the ability to engineer entirely...

  4. Toward Using Games to Teach Fundamental Computer Science Concepts

    ERIC Educational Resources Information Center

    Edgington, Jeffrey Michael

    2010-01-01

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. …

  5. High performance cellular level agent-based simulation with FLAME for the GPU.

    PubMed

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and the ability to simulate a biological scale of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular-level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template-driven framework for agent-based modelling (ABM) on parallel architectures, ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has shown massive improvements in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
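
    FLAME models are specified in XML templates rather than written directly against an API, but the communication pattern they compile down to is easy to show in plain Python: agents write to a message board in one phase and react to it in the next, which is what makes the update embarrassingly parallel. The sketch below illustrates that pattern only; it is not FLAME's actual interface.

      # Two-phase message-board update: write, then read. Each phase touches
      # agents independently, so both map cleanly onto parallel hardware.
      import random

      class Cell:
          def __init__(self, x, y):
              self.x, self.y = x, y

          def output_location(self):
              return (self.x, self.y)             # phase 1: post a message

          def move(self, board):
              # phase 2: drift away from nearby neighbours (soft repulsion)
              near = [(mx, my) for mx, my in board
                      if 0.0 < abs(mx - self.x) + abs(my - self.y) < 2.0]
              if near:
                  ax = sum(m[0] for m in near) / len(near)
                  ay = sum(m[1] for m in near) / len(near)
                  self.x += 0.1 * (self.x - ax)
                  self.y += 0.1 * (self.y - ay)
              self.x += random.uniform(-0.05, 0.05)
              self.y += random.uniform(-0.05, 0.05)

      cells = [Cell(random.random() * 10, random.random() * 10) for _ in range(50)]
      for _ in range(100):
          board = [c.output_location() for c in cells]   # every agent writes
          for c in cells:                                # every agent reads
              c.move(board)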

  6. Understanding and Improving Blind Students' Access to Visual Information in Computer Science Education

    NASA Astrophysics Data System (ADS)

    Baker, Catherine M.

    Teaching people with disabilities tech skills empowers them to create solutions to problems they encounter and prepares them for careers. However, computer science is typically taught in a highly visual manner, which can present barriers for people who are blind. The goal of this dissertation is to understand and decrease those barriers. The first projects I present looked at the barriers that blind students face. I first present the results of my survey and interviews with blind students with degrees in computer science or related fields. This work highlighted the many barriers that these blind students faced. I then followed up on one of the barriers mentioned, access to technology, by doing a preliminary accessibility evaluation of six popular integrated development environments (IDEs) and code editors. I found that half were unusable and all had some inaccessible portions. As access to visual information is a barrier in computer science education, I present three projects I have done to decrease this barrier. The first project is Tactile Graphics with a Voice (TGV). This project investigated an alternative to Braille labels for those who do not know Braille and showed that TGV was a potential alternative. The next project was StructJumper, which created a modified abstract syntax tree that blind programmers could use to navigate through code with their screen reader. The evaluation showed that users could navigate more quickly and more easily determine the relationships between lines of code when they were using StructJumper than when they were not. Finally, I present a tool for dynamic graphs (the kind with nodes and edges) which had two different modes for handling focus changes when moving between graphs. I found that the modes support different approaches for exploring the graphs, and therefore preferences were mixed based on the user's preferred approach. However, both modes had similar accuracy in completing the tasks. These projects are a first step towards the goal of making computer science education more accessible to blind students. By identifying the barriers that exist and creating solutions to overcome them, we can support increasing the number of blind students in computer science.
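
    The idea behind StructJumper, a navigable tree over a program's nesting structure, can be illustrated in a few lines. StructJumper itself targets Java in Eclipse; the sketch below uses Python's ast module instead, printing one line per structural node so a screen-reader user could jump between levels rather than read line by line:

      # Print an indented outline of a program's control structure.
      import ast
      import textwrap

      SOURCE = textwrap.dedent("""
          def mean(xs):
              total = 0
              for x in xs:
                  total += x
              return total / len(xs)
      """)

      def outline(node, depth=0):
          """One line per structural node, indented by nesting depth."""
          for child in ast.iter_child_nodes(node):
              if isinstance(child, (ast.FunctionDef, ast.For, ast.While, ast.If)):
                  print("  " * depth + f"{type(child).__name__} (line {child.lineno})")
                  outline(child, depth + 1)
              else:
                  outline(child, depth)   # skip non-structural nodes, keep depth

      outline(ast.parse(SOURCE))          # -> FunctionDef (line 2), For (line 4)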

  7. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculation of advanced physical models based on these datasets. But considerable thought is also needed about which computations to perform. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms fully recognize that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after they are telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers, and science analysis performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA efforts and present example applications using modern methodologies.

  8. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create a new Biological Sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support the quantitative and computational biology program at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have maintained multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as with individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, a summer undergraduate internship program was established under the CCB by summer 2006 to train biological science researchers in highly mathematical and computational methods.
    By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  9. Integrating Information & Communications Technologies into the Classroom

    ERIC Educational Resources Information Center

    Tomei, Lawrence, Ed.

    2007-01-01

    "Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…

  10. Creating a Podcast/Vodcast: A How-To Approach

    NASA Astrophysics Data System (ADS)

    Petersen, C. C.

    2011-09-01

    Creating podcasts and vodcasts is a wonderful way to share news of science research. Public affairs officers use them to reveal the latest discoveries done by scientists in their institutions. Educators can offer podcast/vodcast creation for students who want a unique way to demonstrate their mastery of science topics. Anyone with a computer and a USB microphone can create a podcast. To do a vodcast, you also need a digital video camera and video editing software. This session focused mainly on creating a podcast - writing the script and recording the soundtrack. Attendees also did a short activity to learn to write effective narrative copy for a podcast/vodcast.

  11. Increasing Participation in the Earth Sciences A 35 year Journey

    NASA Astrophysics Data System (ADS)

    Blueford, J. R.

    2006-12-01

    In the 1970s, the fact that women and ethnic minority men made up approximately 10% of the workforce in the geosciences created concern. Determining ways to increase participation became a topic of discussion among many of the geoscience agencies in the United States. Many created scholarships and work opportunities for students. One of the most successful projects was the MPES (Minority Participation in the Earth Sciences) Program implemented by the U.S. Geological Survey. A key factor in its success was its outreach programs, which sent employees to work in elementary schools to get children excited about the earth sciences. Successive years added teacher workshops and career-day presentations to help school districts increase awareness of the earth sciences. Cutbacks prevented the continuation of these programs, but from the ashes a new non-profit organization of scientists, the Math Science Nucleus, developed curriculum and implementation strategies that use the Earth sciences as a core content area. Using the power of the internet, it provides teachers and parents around the world with content-driven curricula. The Integrating Science, Math, and Technology reference curriculum is used around the world to help teachers understand how children learn science content.

  12. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but diverges from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management. The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network.
    Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  13. Ten recommendations for software engineering in research.

    PubMed

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  14. Introducing Computational Thinking through Hands-on Projects Using R with Applications to Calculus, Probability and Data Analysis

    ERIC Educational Resources Information Center

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-01-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students, through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data…

  15. User-driven sampling strategies in image exploitation

    NASA Astrophysics Data System (ADS)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
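
    The contrast at the heart of this idea is easy to demonstrate. In the sketch below (synthetic data and logistic regression as stand-ins; nothing here comes from the paper's image-analysis setting), the "user" labels a small set of points judged informative, simulated as the points nearest a pilot model's decision boundary, and the result is compared with training on everything:

      # Compare a model trained on all labels with one trained on a small,
      # deliberately chosen subset (a crude proxy for user-driven sampling).
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression

      X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
      X_tr, y_tr, X_te, y_te = X[:800], y[:800], X[800:], y[800:]

      def train_on(idx):
          return LogisticRegression(max_iter=1000).fit(X_tr[idx], y_tr[idx])

      full = train_on(np.arange(800))
      pilot = train_on(np.arange(40))              # small initial labeling pass
      margin = np.abs(pilot.decision_function(X_tr))
      picks = np.argsort(margin)[:40]              # "user" labels 40 boundary points
      chosen = train_on(picks)
      print("all 800 labels:", full.score(X_te, y_te))
      print("40 chosen     :", chosen.score(X_te, y_te))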

  16. Using XML Configuration-Driven Development to Create a Customizable Ground Data System

    NASA Technical Reports Server (NTRS)

    Nash, Brent; DeMore, Martha

    2009-01-01

    The Mission Data Processing and Control Subsystem (MPCS) is being developed as a multi-mission Ground Data System, with the Mars Science Laboratory (MSL) as the first fully supported mission. MPCS is a fully featured, Java-based Ground Data System (GDS) for telecommand and telemetry processing based on Configuration-Driven Development (CDD). The eXtensible Markup Language (XML) is the ideal language for CDD because it is easily readable and editable by all levels of users and is also backed by a World Wide Web Consortium (W3C) standard and numerous powerful processing tools that make it uniquely flexible. The CDD approach adopted by MPCS minimizes changes to compiled code by using XML to create a series of configuration files that provide both coarse- and fine-grained control over all aspects of GDS operation.
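
    The mechanics of configuration-driven development are simple to show: behavior comes from an editable XML file rather than recompiled code. The element and attribute names below are invented for the sketch (MPCS's real configuration schemas are not given in this record), and the parsing is done in Python for brevity rather than Java:

      # Behavior (port, alarm thresholds) is read from config, not hard-coded.
      import xml.etree.ElementTree as ET

      CONFIG = """
      <gds mission="demo">
        <telemetry port="5555" format="ccsds"/>
        <channel id="TEMP_01" units="degC" alarmHigh="80"/>
        <channel id="VOLT_03" units="V" alarmHigh="32"/>
      </gds>
      """

      root = ET.fromstring(CONFIG)
      port = int(root.find("telemetry").get("port"))
      alarms = {c.get("id"): float(c.get("alarmHigh"))
                for c in root.findall("channel")}

      def out_of_limit(channel_id, value):
          """Alarm decision driven purely by the configured threshold."""
          return value > alarms[channel_id]

      print(port, out_of_limit("TEMP_01", 85.0))   # -> 5555 True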

  17. High Throughput Screening of Toxicity Pathways Perturbed by Environmental Chemicals

    EPA Science Inventory

    Toxicology, a field largely unchanged over the past several decades, is undergoing a significant transformation driven by a number of forces – the increasing number of chemicals needing assessment, changing legal requirements, advances in biology and computer science, and concern...

  18. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  19. Connecting Teachers and Students with Science Experts: NASA's Expedition Earth and Beyond Program

    NASA Astrophysics Data System (ADS)

    Graff, P. V.; Stefanov, W. L.; Willis, K. J.; Runco, S.; McCollum, T.; Baker, M.; Mailhot, M.; Lindgren, C. F.

    2010-12-01

    Classroom teachers are challenged with engaging and preparing today's students for the future. Activities are driven by state-required skills, education standards, and high-stakes testing. How can educators teach required standards and motivate students not only to learn essential skills but also to acquire the intrigue to want to learn more? One way is to allow students to take charge of their learning and conduct student-driven research. NASA's Expedition Earth and Beyond program, based at the NASA Johnson Space Center, is designed to do just that. The program, developed by both educators and scientists, promotes inquiry-based investigations in classrooms (grades 5-14) using current NASA data. By combining the expertise of teachers, who understand the everyday challenges of working with students, and scientists, who work with the process of science as they conduct their own research, the result is a realistic and usable means of promoting authentic research in classrooms. NASA's Expedition Earth and Beyond program was created with the understanding that three important aspects enable teachers to implement authentic research experiences in the classroom. These aspects are: 1) standards-aligned, inquiry-based curricular resources and an implementation structure to support student-driven research; 2) professional development opportunities to learn techniques and strategies that ensure seamless implementation of resources; and 3) ongoing support. Expedition Earth and Beyond provides all three of these aspects and adds two additional and inspiring motivators. One is the opportunity for student research teams to request new data. Data requested and approved would be acquired by astronauts orbiting Earth on the International Space Station. This aspect is part of the program's process-of-science structure and provides a powerful way to excite students. The second, and perhaps more significant, motivator is the creation of connections between science experts and classrooms. Scientists are able to connect with participating classrooms on a variety of different levels, including as mentors. These powerful connections provide extraordinary opportunities for students to develop the rigor and relevance of their research, along with encouraging them to take pride in the work they are doing in school. Providing teachers with the skills and confidence to promote authentic research investigations in the classroom will equip them to create science-literate students and, by extension, improve the public understanding of science. The opportunity to connect classrooms with science experts creates personal experiences that are engaging, motivating, and impactful. These impactful experiences will help prepare today's students to become the next generation of scientists, or perhaps science educators who can help continue these powerful connections for generations to come.

  20. Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Hertz, J.; Huffer, E.; Kusterer, J.

    2012-12-01

    Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, which makes scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many who are eager to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to demystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure that the ontology reflects the core purpose of the domain's mission and that it integrates and describes the supporting data in the language of the domain: how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and the Earth's Radiant Energy System (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open-source General Architecture for Text Engineering (GATE), a mature framework for computational tasks involving human language.
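
    A toy version of the comparison described above can be written in a few lines: tokenize a domain document and check which of its terms the ontology's labels cover. A real pipeline (GATE adds tokenizers, gazetteers, and named-entity transducers) is far richer; this only shows the shape of the coverage analysis, and the labels and sentence here are invented:

      # Which document terms does the ontology cover, and which are missing?
      import re
      from collections import Counter

      ontology_labels = {"radiance", "scan", "detector", "calibration"}
      doc = ("The BDS dataset reports filtered radiance per detector scan; "
             "calibration coefficients are applied on the ground.")

      tokens = Counter(re.findall(r"[a-z]+", doc.lower()))
      covered = {t: n for t, n in tokens.items() if t in ontology_labels}
      missing = [t for t, _ in tokens.most_common() if t not in ontology_labels]
      print("covered:", covered)
      print("top uncovered terms:", missing[:5])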

  1. Science Education and Technology: Opportunities to Enhance Student Learning.

    ERIC Educational Resources Information Center

    Woolsey, Kristina; Bellamy, Rachel

    1997-01-01

    Describes how technological capabilities such as calculation, imaging, networking, and portability support a range of pedagogical approaches, such as inquiry-based science and dynamic modeling. Includes as examples software products created at Apple Computer and others available in the marketplace. (KDFB)

  2. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    PubMed

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  3. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  4. Promoting elementary students' epistemology of science through computer-supported knowledge-building discourse and epistemic reflection

    NASA Astrophysics Data System (ADS)

    Lin, Feng; Chan, Carol K. K.

    2018-04-01

    This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advancement using Knowledge Forum (KF) while studying a unit on electricity; they also reflected on the epistemic nature of their discourse. A comparison class of 22 students, taught by the same teacher, studied the same unit using the school's established scientific investigation method. We hypothesised that engaging students in idea-driven and theory-building discourse, as well as scaffolding them to reflect on the epistemic nature of their discourse, would help them understand their own scientific collaborative discourse as a theory-building process, and therefore understand scientific inquiry as an idea-driven and theory-building process. As hypothesised, we found that students engaged in knowledge-building discourse and reflection outperformed comparison students in scientific epistemology and science learning, and that students' understanding of collaborative discourse predicted their post-test scientific epistemology and science learning. To further understand the epistemic change process among knowledge-building students, we analysed their KF discourse to examine whether and how their epistemic practice had changed after epistemic reflection. The implications for ways of promoting epistemic change are discussed.

  5. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomical Concepts: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas

    2004-01-01

    The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…

  6. Using Computer Technology to Create a Revolutionary New Style of Biology.

    ERIC Educational Resources Information Center

    Monaghan, Peter

    1993-01-01

    A $13-million gift of William Gates III to the University of Washington has enabled establishment of the country's first department in molecular biotechnology, a combination of medicine and molecular biology to be practiced by researchers versed in a variety of fields, including computer science, computation, applied physics, and engineering. (MSE)

  7. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files (granules) from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm that filters out non-relevant information and data.
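
    The record does not spell out the ranking algorithm, so the following is only a schematic of the idea: score each candidate resource by its overlap with event-related ontology terms and filter on that score. The term list and document names are hypothetical.

```python
# Minimal sketch of ontology-driven relevancy ranking; a flat set of
# event-related concept terms stands in for the full ontology.
EVENT_CONCEPTS = {"hurricane", "precipitation", "wind", "landfall", "flooding"}  # hypothetical

def relevancy(doc_terms, concepts=EVENT_CONCEPTS):
    """Score a candidate resource by its overlap with event concepts."""
    doc = {t.lower() for t in doc_terms}
    return len(doc & concepts) / len(concepts)

candidates = {
    "news_0417": ["Hurricane", "landfall", "wind", "damage"],
    "blog_0042": ["recipe", "travel", "wind"],
}
ranked = sorted(candidates, key=lambda k: relevancy(candidates[k]), reverse=True)
print(ranked)  # resources below a score threshold would be filtered out
```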

  8. [Rationalities of knowledge production: on transformations of objects, technologies and information in biomedicine and the life sciences].

    PubMed

    Paul, Norbert W

    2009-09-01

    For decades, scientific change has been interpreted in the light of paradigm shifts and scientific revolutions. The Kuhnian interpretation of scientific change, however, is now increasingly confronted with non-disciplinary thinking in both science and science studies. This paper explores how research in biomedicine and the life sciences can be characterized by different rationalities, sometimes converging, sometimes contradictory, all present at the same time with varying degrees of influence, impact, and visibility. In general, the rationality of objects is generated by fitting new objects and findings into a new experimental context. The rationality of hypotheses is a move towards the construction of novel explanatory tools and models. This often meshes inseparably with the third, technological rationality, in which a technology-driven, self-supporting and sometimes self-referential refinement of methods and technologies comes along with an extension into other fields. During the second and third phases, the new and emerging fields tend to expand their explanatory reach not only across disciplinary boundaries but also into the social sphere, creating what has been characterized as "exceptionalism" (e.g., genetic exceptionalism or neuro-exceptionalism). Finally, recent biomedicine and life sciences reach a level at which experimental work becomes more and more data-driven, because the technologically constructed experimental systems generate a plethora of findings (data) which at some point start to blur the original hypotheses. Under the rationality of information, the materiality of research practices becomes secondary and research objects increasingly recede from view. Meanwhile, the credibility of science as a practice becomes more and more dependent on consensus about the applicability and relevance of its results. The rationality of interest (and accountability) has become increasingly characteristic of a research process that is no longer primarily determined by the desire for knowledge but by the desire for relevance. This paper explores the ways in which object-driven and hypothesis-driven experimental life sciences have transformed into domains of experimental research evolving in a technologically constructed, data-driven environment in which they are subjected to constant morphing by the forces of these different rationalities.

  9. Paper-and-Pencil Programming Strategy toward Computational Thinking for Non-Majors: Design Your Solution

    ERIC Educational Resources Information Center

    Kim, Byeongsu; Kim, Taehun; Kim, Jonghoon

    2013-01-01

    The paper-and-pencil programming strategy (PPS) is a way of representing an idea logically by any representation that can be created using paper and pencil. It was developed for non-computer majors to improve their understanding and use of computational thinking and increase interest in learning computer science. A total of 110 non-majors in their…

  10. From Both Sides, Now: Librarians Team up with Computer Scientist to Deliver Virtual Computer-Information Literacy Instruction

    ERIC Educational Resources Information Center

    Loesch, Martha Fallahay

    2011-01-01

    Two members of the library faculty at Seton Hall University teamed up with a respected professor of mathematics and computer science, in order to create an online course that introduces information literacy both from the perspectives of the computer scientist and from the instruction librarian. This collaboration is unique in that it addresses the…

  11. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programming software, to create their own inventions. One…

  12. Modeling hazardous mass flows Geoflows09: Mathematical and computational aspects of modeling hazardous geophysical mass flows; Seattle, Washington, 9–11 March 2009

    USGS Publications Warehouse

    Iverson, Richard M.; LeVeque, Randall J.

    2009-01-01

    A recent workshop at the University of Washington focused on mathematical and computational aspects of modeling the dynamics of dense, gravity-driven mass movements such as rock avalanches and debris flows. About 30 participants came from seven countries and brought diverse backgrounds in geophysics; geology; physics; applied and computational mathematics; and civil, mechanical, and geotechnical engineering. The workshop was cosponsored by the U.S. Geological Survey Volcano Hazards Program, by the U.S. National Science Foundation through a Vertical Integration of Research and Education (VIGRE) in the Mathematical Sciences grant to the University of Washington, and by the Pacific Institute for the Mathematical Sciences. It began with a day of lectures open to the academic community at large and concluded with 2 days of focused discussions and collaborative work among the participants.

  13. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game-changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: developing use case studies for science workflows; creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  14. Cognitive Computational Neuroscience: A New Conference for an Emerging Discipline.

    PubMed

    Naselaris, Thomas; Bassett, Danielle S; Fletcher, Alyson K; Kording, Konrad; Kriegeskorte, Nikolaus; Nienborg, Hendrikje; Poldrack, Russell A; Shohamy, Daphna; Kay, Kendrick

    2018-05-01

    Understanding the computational principles that underlie complex behavior is a central goal in cognitive science, artificial intelligence, and neuroscience. In an attempt to unify these disconnected communities, we created a new conference called Cognitive Computational Neuroscience (CCN). The inaugural meeting revealed considerable enthusiasm, but significant obstacles remain. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. First-principles data-driven discovery of transition metal oxides for artificial photosynthesis

    NASA Astrophysics Data System (ADS)

    Yan, Qimin

    We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr, V, and Mn-based ternary TMOs in the database, we design a broadly applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low-band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources were also provided by the Department of Energy through the National Energy Research Scientific Computing Center.
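
    To make the screening-workflow idea concrete, here is a toy sketch of successive property filters over candidate records. The compounds, property values, and thresholds are invented for illustration and are not Materials Project data.

```python
# Minimal sketch of a multi-layer screening workflow over candidate oxides.
# All records and thresholds below are hypothetical illustrations.
CANDIDATES = [
    {"formula": "CuV2O6", "band_gap_eV": 1.9, "stable_pH_window": (6, 9), "edges_ok": True},
    {"formula": "MnWO4",  "band_gap_eV": 2.6, "stable_pH_window": (4, 8), "edges_ok": False},
    {"formula": "Cr2O3",  "band_gap_eV": 3.4, "stable_pH_window": (2, 7), "edges_ok": True},
]

def passes(c):
    """Apply successive screens, each mirroring one workflow layer:
    a visible-light band gap, aqueous stability near neutral pH,
    and acceptable band-edge alignment."""
    gap_ok = 1.2 <= c["band_gap_eV"] <= 2.8
    lo, hi = c["stable_pH_window"]
    stable_ok = lo <= 7 <= hi            # Pourbaix-style stability check at pH 7
    return gap_ok and stable_ok and c["edges_ok"]

print([c["formula"] for c in CANDIDATES if passes(c)])
```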

  16. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  17. OlyMPUS - The Ontology-based Metadata Portal for Unified Semantics

    NASA Astrophysics Data System (ADS)

    Huffer, E.; Gleason, J. L.

    2015-12-01

    The Ontology-based Metadata Portal for Unified Semantics (OlyMPUS), funded by the NASA Earth Science Technology Office Advanced Information Systems Technology program, is an end-to-end system designed to support data consumers and data providers, enabling the latter to register their data sets and provision them with the semantically rich metadata that drives the Ontology-Driven Interactive Search Environment for Earth Sciences (ODISEES). OlyMPUS leverages the semantics and reasoning capabilities of ODISEES to provide data producers with a semi-automated interface for producing the semantically rich metadata needed to support ODISEES' data discovery and access services. It integrates the ODISEES metadata search system with multiple NASA data delivery tools to enable data consumers to create customized data sets for download to their computers, or for NASA Advanced Supercomputing (NAS) facility registered users, directly to NAS storage resources for access by applications running on NAS supercomputers. A core function of NASA's Earth Science Division is research and analysis that uses the full spectrum of data products available in NASA archives. Scientists need to perform complex analyses that identify correlations and non-obvious relationships across all types of Earth System phenomena. Comprehensive analytics are hindered, however, by the fact that many Earth science data products are disparate and hard to synthesize. Variations in how data are collected, processed, gridded, and stored, create challenges for data interoperability and synthesis, which are exacerbated by the sheer volume of available data. Robust, semantically rich metadata can support tools for data discovery and facilitate machine-to-machine transactions with services such as data subsetting, regridding, and reformatting. Such capabilities are critical to enabling the research activities integral to NASA's strategic plans. However, as metadata requirements increase and competing standards emerge, metadata provisioning becomes increasingly burdensome to data producers. The OlyMPUS system helps data providers produce semantically rich metadata, making their data more accessible to data consumers, and helps data consumers quickly discover and download the right data for their research.

  18. Bigdata Driven Cloud Security: A Survey

    NASA Astrophysics Data System (ADS)

    Raja, K.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front end, which includes the users' computers and the software required to access the cloud network, and a back end, which consists of the computers, servers and database systems that create the cloud. The cloud ecosystem delivers its services through a powerful and popular architecture of service models: SaaS (Software as a Service, in which end users utilize outsourced software), PaaS (Platform as a Service, in which a platform is provided), IaaS (Infrastructure as a Service, in which the physical environment is outsourced), and DaaS (Database as a Service, in which data can be housed within a cloud). Security threats and issues remain the most vital barrier to the cloud computing environment. The main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data over public networks, cyber attacks in any form are anticipated. Hence, cloud service users need to understand the risk of data breaches and the choice of service delivery model during deployment. This survey covers CC security issues in depth, with attention to data security in health care, so that researchers can develop robust security application models using Big Data (BD) on CC, which can be created and deployed easily. BD evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies. In this purview, MapReduce [12] is a good example of big data processing in a cloud environment, and a model for cloud providers.
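
    Since the record singles out MapReduce as the canonical model for big data processing in the cloud, a minimal in-process word-count sketch of the map, shuffle, and reduce phases (pure Python, no cluster framework) may help fix the idea:

```python
# Minimal map/reduce sketch: word counting without a cluster framework.
from collections import defaultdict
from itertools import chain

records = ["cloud security survey", "big data cloud", "data security"]

# Map phase: emit (key, 1) pairs from each input record.
mapped = chain.from_iterable(((word, 1) for word in r.split()) for r in records)

# Shuffle phase: group intermediate pairs by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: aggregate each key's values.
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts)  # e.g. {'cloud': 2, 'security': 2, ...}
```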

  19. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the boom in internet and information technology services. The bioinformatics sector of India has been transforming rapidly, creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospecting and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, conferences and training undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  20. Pattern recognition with "materials that compute".

    PubMed

    Fang, Yan; Yashin, Victor V; Levitan, Steven P; Balazs, Anna C

    2016-09-01

    Driven by advances in materials and computer science, researchers are attempting to design systems where the computer and material are one and the same entity. Using theoretical and computational modeling, we design a hybrid material system that can autonomously transduce chemical, mechanical, and electrical energy to perform a computational task in a self-organized manner, without the need for external electrical power sources. Each unit in this system integrates a self-oscillating gel, which undergoes the Belousov-Zhabotinsky (BZ) reaction, with an overlaying piezoelectric (PZ) cantilever. The chemomechanical oscillations of the BZ gels deflect the PZ layer, which consequently generates a voltage across the material. When these BZ-PZ units are connected in series by electrical wires, the oscillations of these units become synchronized across the network, where the mode of synchronization depends on the polarity of the PZ. We show that the network of coupled, synchronizing BZ-PZ oscillators can perform pattern recognition. The "stored" patterns are sets of polarities of the individual BZ-PZ units, and the "input" patterns are encoded in the initial phases of the oscillations imposed on these units. The results of the modeling show that the input pattern closest to the stored pattern exhibits the fastest convergence time to stable synchronization behavior. In this way, networks of coupled BZ-PZ oscillators achieve pattern recognition. Further, we show that the convergence time to stable synchronization provides a robust measure of the degree of match between the input and stored patterns. Through these studies, we establish experimentally realizable design rules for creating "materials that compute."
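
    The BZ-PZ gel dynamics are beyond a short snippet, but the scheme the record describes -- stored pattern as coupling polarities, input pattern as initial phases, degree of match read off the convergence time -- can be mimicked with a generic phase-oscillator network. The sketch below is that analog, not the authors' model; all parameters are arbitrary.

```python
# Generic phase-oscillator analog of the BZ-PZ scheme (not the authors' gel
# equations): the stored pattern sets coupling polarities, the input pattern
# sets initial phases, and match quality is read off the convergence time.
import numpy as np

def converge_time(stored, inp, k=1.0, dt=0.05, steps=20000, tol=0.99, seed=1):
    rng = np.random.default_rng(seed)
    s = np.array(stored, dtype=float)                   # polarities, +1 or -1
    theta = np.pi * (1.0 - np.array(inp, dtype=float))  # bit 1 -> phase 0, bit 0 -> pi
    theta += rng.normal(0.0, 0.05, theta.size)          # jitter off unstable fixed points
    for step in range(steps):
        mean_field = np.mean(s * np.exp(1j * theta))    # polarity-weighted mean field
        theta += dt * k * s * np.imag(np.exp(-1j * theta) * mean_field)
        if abs(np.mean(s * np.exp(1j * theta))) > tol:  # polarity-weighted order parameter
            return step * dt
    return float("inf")

stored = [1, 1, -1, 1, -1]
close  = [1, 1, 0, 1, 0]   # initial phases consistent with the stored polarities
far    = [0, 1, 1, 0, 1]
print(converge_time(stored, close), converge_time(stored, far))  # close converges faster
```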

  1. Creating Mobile and Web Application Programming Interfaces (APIs) for NASA Science Data

    NASA Astrophysics Data System (ADS)

    Oostra, D.; Chambers, L. H.; Lewis, P. M.; Moore, S. W.

    2011-12-01

    The Atmospheric Science Data Center (ASDC) at the NASA Langley Research Center in Virginia houses almost three petabytes of data, a collection that increases every day. To put it into perspective, it is estimated that three petabytes of storage could hold a digitized copy of all printed material in U.S. research libraries. There are more than ten other NASA data centers like the ASDC. Scientists and the public use these data for research, science education, and to understand our environment. Most importantly, these data provide the potential for all of us to make new discoveries. NASA is about making discoveries. Galileo was quoted as saying, "All discoveries are easy to understand once they are discovered. The point is to discover them." To that end, NASA stores vast amounts of publicly available data. This paper examines an approach to creating web applications that serve NASA data in ways that specifically address the mobile web application technologies that are quickly emerging. Mobile data is not a new concept. What is new is that user-driven tools have recently become available that allow users to create their own mobile applications. Through the use of these cloud-based tools, users can produce complete native mobile applications. Thus, mobile apps can now be created by everyone, regardless of their programming experience or expertise. This work will explore standards and methods for creating dynamic and malleable application programming interfaces (APIs) that allow users to access and use NASA science data for their own needs. The focus will be on experiences that broaden and increase the scope and usage of NASA science data sets.
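
    As one concrete possibility (not the architecture the authors built), a web API over a science data set can be sketched in a few lines of Flask; the endpoint shape, parameter names, and in-memory 'archive' below are hypothetical:

```python
# Minimal sketch of a web API over a science data set, using Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a real data-center query; maps parameter name -> gridded values.
ARCHIVE = {"surface_temp": [287.1, 288.4, 286.9], "aerosol_depth": [0.12, 0.08, 0.15]}

@app.route("/api/v1/<parameter>")
def get_parameter(parameter):
    values = ARCHIVE.get(parameter)
    if values is None:
        return jsonify({"error": "unknown parameter"}), 404
    limit = request.args.get("limit", default=len(values), type=int)
    return jsonify({"parameter": parameter, "values": values[:limit]})

if __name__ == "__main__":
    app.run(port=5000)  # e.g. GET /api/v1/surface_temp?limit=2
```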

  2. GRAPHICS MANAGER (GFXMGR): An interactive graphics software program for the Advanced Electronics Design (AED) graphics controller, Model 767

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faculjak, D.A.

    1988-03-01

    Graphics Manager (GFXMGR) is menu-driven, user-friendly software designed to interactively create, edit, and delete graphics displays on the Advanced Electronics Design (AED) graphics controller, Model 767. The software runs on the VAX family of computers and has been used successfully in security applications to create and change site layouts (maps) of specific facilities. GFXMGR greatly benefits graphics development by minimizing display-development time, reducing tedium on the part of the user, and improving system performance. It is anticipated that GFXMGR can be used to create graphics displays for many types of applications. 8 figs., 2 tabs.

  3. Incorporating iPad Technology: Creating More Effective Language Classrooms

    ERIC Educational Resources Information Center

    Ahmed, Khawlah; Nasser, Omaima

    2015-01-01

    Technology today plays a significant role in the lives of many students who are part of a technology-driven culture that they have grown up with. It would seem unimaginable for young adults today to communicate or exchange ideas without using technology. The plethora of devices competing with the computer, from smartphones to tablets, just to name…

  4. Understanding Resonance Graphs Using Easy Java Simulations (EJS) and Why We Use EJS

    ERIC Educational Resources Information Center

    Wee, Loo Kang; Lee, Tat Leong; Chew, Charles; Wong, Darren; Tan, Samuel

    2015-01-01

    This paper reports a computer model simulation created using Easy Java Simulation (EJS) for learners to visualize how the steady-state amplitude of a driven oscillating system varies with the frequency of the periodic driving force. The simulation shows (N = 100) identical spring-mass systems being subjected to (1) a periodic driving force of…
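
    As a companion to this record, the steady-state response it visualizes follows the standard driven, damped oscillator result, which can be computed directly (this is textbook theory, not the EJS model itself, and the parameter values are arbitrary):

```python
# Steady-state amplitude of a damped, driven spring-mass system:
#   A(w) = (F0/m) / sqrt((w0^2 - w^2)^2 + (b*w/m)^2)
# Standard textbook result; parameter values are arbitrary illustrations.
import math

m, k, b, F0 = 1.0, 4.0, 0.2, 1.0   # mass, spring constant, damping, force amplitude
w0 = math.sqrt(k / m)               # natural frequency

def amplitude(w):
    return (F0 / m) / math.sqrt((w0**2 - w**2) ** 2 + (b * w / m) ** 2)

for w in [0.5 * w0, 0.9 * w0, 1.0 * w0, 1.1 * w0, 2.0 * w0]:
    print(f"w/w0 = {w / w0:.1f}  A = {amplitude(w):.3f}")  # amplitude peaks near w0
```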

  5. State-of-the-Art Model Driven Game Development: A Survey of Technological Solutions for Game-Based Learning

    ERIC Educational Resources Information Center

    Tang, Stephen; Hanneghan, Martin

    2011-01-01

    Game-based learning harnesses the advantages of computer games technology to create a fun, motivating and interactive virtual learning environment that promotes problem-based experiential learning. Such an approach is advocated by many commentators to provide an enhanced learning experience than those based on traditional didactic methods.…

  6. Development of Intelligent Computer-Assisted Instruction Systems to Facilitate Reading Skills of Learning-Disabled Children

    DTIC Science & Technology

    1993-12-01

    The purpose of this thesis is to develop a high-level model to create self-adapting software which teaches learning-disabled children... stimulating and demanding. The power of the system model described herein is that it can vary as needed by the individual student. The system will...

  7. A conceptual framework to support exposure science research and complete the source-to-outcome continuum for risk assessment

    EPA Science Inventory

    While knowledge of exposure is fundamental to assessing and mitigating risks, exposure information has been costly and difficult to generate. Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a great...

  8. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    PubMed Central

    2011-01-01

    Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created a RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
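
    The RDF multi-relational directed graph at the heart of SDDM can be sketched with rdflib; the namespace, resource names, and predicates below are hypothetical stand-ins for the NDM-1 scenario model:

```python
# Minimal sketch of an SDDM-style multi-relational directed graph in RDF,
# using rdflib; all names below are hypothetical placeholders.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/ndm1/")
g = Graph()
g.bind("ex", EX)

# Link heterogeneous resources: a gene, a news report, and a clinical isolate.
g.add((EX.NDM1, RDF.type, EX.Gene))
g.add((EX.NDM1, RDFS.label, Literal("New Delhi metallo-beta-lactamase 1")))
g.add((EX.report42, RDF.type, EX.NewsReport))
g.add((EX.report42, EX.mentions, EX.NDM1))
g.add((EX.isolateK7, EX.carriesGene, EX.NDM1))
g.add((EX.isolateK7, EX.collectedIn, Literal("New Delhi")))

# A graph traversal driven by the scenario: which reports mention the gene?
for report in g.subjects(EX.mentions, EX.NDM1):
    print(report)

print(g.serialize(format="turtle"))
```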

  9. Highlighting entanglement of cultures via ranking of multilingual Wikipedia articles.

    PubMed

    Eom, Young-Ho; Shepelyansky, Dima L

    2013-01-01

    How do different cultures evaluate a person? Is a person who is important in one culture also important in another culture? We address these questions via the ranking of multilingual Wikipedia articles. With three ranking algorithms based on the network structure of Wikipedia, we assign rankings to all articles in 9 multilingual editions of Wikipedia and investigate the general ranking structure of PageRank, CheiRank and 2DRank. In particular, we focus on articles related to persons, identify the top 30 persons for each rank among the different editions and analyze the distinctions of their distributions over activity fields such as politics, art, science, religion and sport for each edition. We find that local heroes are dominant but that global heroes also exist and create an effective network representing the entanglement of cultures. The Google matrix analysis of the network of cultures shows signs of the Zipf law distribution. This approach allows us to examine the diversity and shared characteristics of knowledge organization between cultures. The developed computational, data-driven approach highlights cultural interconnections in a new perspective. Dated: June 26, 2013.

  10. Highlighting Entanglement of Cultures via Ranking of Multilingual Wikipedia Articles

    PubMed Central

    Eom, Young-Ho; Shepelyansky, Dima L.

    2013-01-01

    How do different cultures evaluate a person? Is a person who is important in one culture also important in another culture? We address these questions via the ranking of multilingual Wikipedia articles. With three ranking algorithms based on the network structure of Wikipedia, we assign rankings to all articles in 9 multilingual editions of Wikipedia and investigate the general ranking structure of PageRank, CheiRank and 2DRank. In particular, we focus on articles related to persons, identify the top 30 persons for each rank among the different editions and analyze the distinctions of their distributions over activity fields such as politics, art, science, religion and sport for each edition. We find that local heroes are dominant but that global heroes also exist and create an effective network representing the entanglement of cultures. The Google matrix analysis of the network of cultures shows signs of the Zipf law distribution. This approach allows us to examine the diversity and shared characteristics of knowledge organization between cultures. The developed computational, data-driven approach highlights cultural interconnections in a new perspective. Dated: June 26, 2013 PMID:24098338
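
    PageRank and its reversed-network counterpart CheiRank are easy to experiment with on a toy graph using networkx; the article network below is invented, not Wikipedia data:

```python
# Minimal sketch of PageRank and CheiRank on a toy directed network.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Einstein", "Newton"), ("Einstein", "Maxwell"),
    ("Maxwell", "Newton"), ("Newton", "Maxwell"),
    ("Napoleon", "France"), ("France", "Napoleon"),
])

pagerank = nx.pagerank(G, alpha=0.85)            # importance via incoming links
cheirank = nx.pagerank(G.reverse(), alpha=0.85)  # "communicativity" via outgoing links

top = lambda r: sorted(r, key=r.get, reverse=True)[:3]
print("PageRank top:", top(pagerank))
print("CheiRank top:", top(cheirank))
```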

  11. Computational Design of Animated Mechanical Characters

    NASA Astrophysics Data System (ADS)

    Coros, Stelian; Thomaszewski, Bernhard; DRZ Team Team

    2014-03-01

    A key factor in the appeal of modern CG movies and video games is that the virtual worlds they portray place no bounds on what can be imagined. Rapid manufacturing devices hold the promise of bringing this type of freedom to our own world, by enabling the fabrication of physical objects whose appearance, deformation behaviors and motions can be precisely specified. In order to unleash the full potential of this technology, however, computational design methods that create digital content suitable for fabrication need to be developed. In recent work, we presented a computational design system that allows casual users to create animated mechanical characters. Given an articulated character as input, the user designs the animated character by sketching motion curves indicating how it should move. For each motion curve, our framework creates an optimized mechanism that reproduces it as closely as possible. The resulting mechanisms are attached to the character and then connected to each other using gear trains, which are created in a semi-automated fashion. The mechanical assemblies generated with our system can be driven with a single input driver, such as a hand-operated crank or an electric motor, and they can be fabricated using rapid prototyping devices.

  12. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  13. Using Parent and Teacher Voices in the Creation of a Western-Based Early Childhood English-Language Program in China

    ERIC Educational Resources Information Center

    Shimpi, Priya M.; Paik, Jae H.; Wanerman, Todd; Johnson, Rebecca; Li, Hui; Duh, Shinchieh

    2015-01-01

    The current English-language research and educational program was driven by an initiative to create a more interactive, theme-based bilingual language education model for preschools in Chengdu, China. During a 2-week teacher education program centered at the Experimental Kindergarten of the Chinese Academy of Sciences in Chengdu, China, a team of…

  14. Full-Body Musculoskeletal Model for Muscle-Driven Simulation of Human Gait.

    PubMed

    Rajagopal, Apoorva; Dembia, Christopher L; DeMers, Matthew S; Delp, Denny D; Hicks, Jennifer L; Delp, Scott L

    2016-10-01

    Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source 3-D musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model's musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower extremity. The model is implemented in the open-source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations.

  15. Full body musculoskeletal model for muscle-driven simulation of human gait

    PubMed Central

    Rajagopal, Apoorva; Dembia, Christopher L.; DeMers, Matthew S.; Delp, Denny D.; Hicks, Jennifer L.; Delp, Scott L.

    2017-01-01

    Objective Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source, three-dimensional musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Methods Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model’s musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Results Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. Conclusion These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower-extremity. Significance The model is implemented in the open source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations. PMID:27392337
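
    For readers who want to load the published model, a minimal sketch with the OpenSim 4.x Python bindings might look like the following; the .osim file name is a hypothetical placeholder for the download from simtk.org:

```python
# Minimal sketch: load a gait model with the OpenSim Python bindings and list
# its muscle actuators. The .osim file name is a placeholder for the model
# downloaded from https://simtk.org/home/full_body/.
import opensim as osim

model = osim.Model("Rajagopal2015.osim")  # hypothetical local path
state = model.initSystem()                # build the underlying computational system

muscles = model.getMuscles()
print(f"{muscles.getSize()} muscle-tendon units")
for i in range(muscles.getSize()):
    print(muscles.get(i).getName())
```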

  16. Educational Hypermedia Resources Facilitator

    ERIC Educational Resources Information Center

    Garcia, Francisco Jose; Garcia, Joaquin

    2005-01-01

    Within the university, the introduction of computers is creating a new criterion of differentiation between those who as a matter of course become integrated in the technocratic trend deriving from the daily use of these machines and those who become isolated by not using them. This difference increases when computer science and communications…

  17. Learner-Interface Interaction for Technology-Enhanced Active Learning

    ERIC Educational Resources Information Center

    Sinha, Neelu; Khreisat, Laila; Sharma, Kiron

    2009-01-01

    Neelu Sinha, Laila Khreisat, and Kiron Sharma describe how learner-interface interaction promotes active learning in computer science education. In a pilot study using technology that combines DyKnow software with a hardware platform of pen-enabled HP Tablet notebook computers, Sinha, Khreisat, and Sharma created dynamic learning environments by…

  18. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  19. Talking Back to Teacher

    ERIC Educational Resources Information Center

    Fischman, Josh

    2007-01-01

    In this article, the author talks about Classroom Presenter, a computer program that aids in student participation during class discussions and makes boring lectures more interactive. The program was created by Richard J. Anderson, a professor of computer science at the University of Washington, in Seattle. Classroom Presenter is now in use in…

  20. GES DISC Data Recipes in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs) which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals. One, transition data recipes created by the Goddard Earth Science Data and Information Service Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks. Two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprised of Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy to deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
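
    A typical notebook 'recipe' step of the kind described -- open a gridded product, subset a region, reduce it -- can be sketched with xarray; the file name and variable names below are hypothetical placeholders for a GES DISC granule:

```python
# Minimal sketch of a notebook-style data recipe step: open a gridded
# product, subset a region, and reduce it. Names are hypothetical.
import xarray as xr

ds = xr.open_dataset("precip_monthly.nc4")             # hypothetical local granule
regional = ds["precipitation"].sel(lat=slice(30, 45),  # subset a lat/lon box
                                   lon=slice(-110, -90))
monthly_mean = regional.mean(dim=("lat", "lon"))       # area-averaged time series
print(monthly_mean.values)
```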

  1. User-Driven Sampling Strategies in Image Exploitation

    DOE PAGES

    Harvey, Neal R.; Porter, Reid B.

    2013-12-23

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data are to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. We discovered that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
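
    A real user cannot be scripted, but the shape of such a labeling loop can be sketched by letting an uncertainty heuristic stand in for the analyst's picks; everything below (data, model choice, number of rounds) is an arbitrary illustration rather than the paper's experiment:

```python
# Sketch of an iterative labeling loop; a scripted uncertainty heuristic
# stands in for the human analyst's selections. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])  # seed set

model = LogisticRegression()
for _ in range(20):                                  # 20 labeling rounds
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X)[:, 1]
    unlabeled = [i for i in range(len(X)) if i not in labeled]
    # The "user's" pick: the item they find most ambiguous, approximated
    # here by the prediction closest to 0.5.
    pick = min(unlabeled, key=lambda i: abs(probs[i] - 0.5))
    labeled.append(pick)

print("accuracy on all data:", model.score(X, y))
```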

  2. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
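
    The five digital objects the abstract enumerates can be itemized in a simple machine-readable manifest attached to a shared resource; the paths and version strings below are hypothetical:

```python
# Minimal sketch: itemize the five digital objects that must be tracked for a
# reproducible model run as a JSON manifest. All names are hypothetical.
import json

manifest = {
    "raw_data":        ["forcing/daymet_2016.nc"],
    "processing_code": ["scripts/clean_forcing.py"],
    "model_inputs":    ["settings/summa_attributes.nc"],
    "model_outputs":   ["output/run001_discharge.nc"],
    "model_code":      {"name": "SUMMA", "version": "2.0",
                        "dependencies": ["netCDF4", "gfortran"]},
}

with open("run_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)  # ready to attach to a shared resource
```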

  3. Let's Use Cognitive Science to Create Collaborative Workstations.

    PubMed

    Reicher, Murray A; Wolfe, Jeremy M

    2016-05-01

    When informed by an understanding of cognitive science, radiologists' workstations could become collaborative to improve radiologists' performance and job satisfaction. The authors review relevant literature and present several promising areas of research, including image toggling, eye tracking, cognitive computing, intelligently restricted messaging, work habit tracking, and innovative input devices. The authors call for more research in "perceptual design," a promising field that can complement advances in computer-aided detection. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  4. The Experiences of Female High School Students and Interest in STEM: Factors Leading to the Selection of an Engineering or Computer Science Major

    ERIC Educational Resources Information Center

    Genoways, Sharon K.

    2017-01-01

    STEM (Science, Technology, Engineering and Math) education creates critical thinkers, increases science literacy, and enables the next generation of innovators, which leads to new products and processes that sustain our economy (Hossain & Robinson, 2012). We have been hearing the warnings for several years, that there simply are not enough…

  5. Bringing Computational Thinking into the High School Science and Math Classroom

    NASA Astrophysics Data System (ADS)

    Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; University CT-STEM Project, Northwestern; University CenterTalent Development, Northwestern

    2013-01-01

    Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.

  6. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span the experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  7. Materials inspired by mathematics.

    PubMed

    Kotani, Motoko; Ikeda, Susumu

    2016-01-01

    Our world is transforming into an interacting system of the physical world and the digital world. What will materials science be in this new era? With the rising expectations of the rapid development of computers, information science and mathematical science including statistics and probability theory, 'data-driven materials design' has become a common term. There is knowledge and experience gained in the physical world in the form of know-how and recipes for the creation of materials. An important key is how we establish a vocabulary and grammar to translate them into the language of the digital world. In this article, we outline how materials science develops when it encounters mathematics, showing some emerging directions.

  8. Mars for Earthlings: an analog approach to Mars in undergraduate education.

    PubMed

    Chan, Marjorie; Kahmann-Robinson, Julia

    2014-01-01

    Mars for Earthlings (MFE) is a terrestrial Earth analog pedagogical approach to teaching undergraduate geology, planetary science, and astrobiology. MFE utilizes Earth analogs to teach Mars planetary concepts, with a foundational backbone in Earth science principles. The field of planetary science is rapidly changing with new technologies and higher-resolution data sets. Thus, it is increasingly important to understand geological concepts and processes for interpreting Mars data. The MFE curriculum is topically driven to facilitate easy integration of content into new or existing courses. The Earth-Mars systems approach explores planetary origins, Mars missions, rocks and minerals, active driving forces/tectonics, surface sculpting processes, astrobiology, future explorations, and hot topics in an inquiry-driven environment. The curriculum draws heavily on multimedia resources, on software programs such as Google Mars and JMARS, and on NASA mission data such as THEMIS, HiRISE, CRISM, and rover images. Two years of MFE class evaluation data suggest that science literacy and general interest in Mars geology and astrobiology topics increased after participation in the MFE curriculum. Students also used newly developed skills to create a Mars mission team presentation. The MFE curriculum, learning modules, and resources are available online at http://serc.carleton.edu/marsforearthlings/index.html.

  9. Using Scenarios to Design Complex Technology-Enhanced Learning Environments

    ERIC Educational Resources Information Center

    de Jong, Ton; Weinberger, Armin; Girault, Isabelle; Kluge, Anders; Lazonder, Ard W.; Pedaste, Margus; Ludvigsen, Sten; Ney, Muriel; Wasson, Barbara; Wichmann, Astrid; Geraedts, Caspar; Giemza, Adam; Hovardas, Tasos; Julien, Rachel; van Joolingen, Wouter R.; Lejeune, Anne; Manoli, Constantinos C.; Matteman, Yuri; Sarapuu, Tago; Verkade, Alex; Vold, Vibeke; Zacharia, Zacharias C.

    2012-01-01

    Science Created by You (SCY) learning environments are computer-based environments in which students learn about science topics in the context of addressing a socio-scientific problem. Along their way to a solution for this problem students produce many types of intermediate products or learning objects. SCY learning environments center the entire…

  10. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning

    NASA Astrophysics Data System (ADS)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight seventh-grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes over five days, except that students in the control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh-grade middle school science concept learning, but neither strategy was more effective than the other. However, the students who collaboratively generated concept maps created significantly higher-quality concept maps than those who individually generated concept maps. The researcher concluded that the concept mapping software, Inspiration(TM), fostered construction of students' concept maps individually or collaboratively for science learning and helped students capture their evolving creative ideas and organize them for meaningful learning. Students in both the individual and the collaborative concept mapping groups had positive attitudes toward concept mapping using Inspiration(TM) software.

  11. A Projection Quality-Driven Tube Current Modulation Method in Cone-Beam CT for IGRT: Proof of Concept.

    PubMed

    Men, Kuo; Dai, Jianrong

    2017-12-01

    To develop a projection quality-driven tube current modulation method in cone-beam computed tomography for image-guided radiotherapy based on the prior attenuation information obtained by the planning computed tomography and then evaluate its effect on a reduction in the imaging dose. The QCKV-1 phantom with different thicknesses (0-400 mm) of solid water upon it was used to simulate different attenuation (μ). Projections were acquired with a series of tube current-exposure time product (mAs) settings, and a 2-dimensional contrast to noise ratio was analyzed for each projection to create a lookup table of mAs versus 2-dimensional contrast to noise ratio, μ. Before a patient underwent computed tomography, the maximum attenuation [Formula: see text] within the 95% range of each projection angle (θ) was estimated according to the planning computed tomography images. Then, a desired 2-dimensional contrast to noise ratio value was selected, and the mAs setting at θ was calculated with the lookup table of mAs versus 2-dimensional contrast to noise ratio, [Formula: see text]. Three-dimensional cone-beam computed tomography images were reconstructed using the projections acquired with the selected mAs. The imaging dose was evaluated with a polymethyl methacrylate dosimetry phantom in terms of volume computed tomography dose index. Image quality was analyzed using a Catphan 503 phantom with an oval body annulus and a pelvis phantom. For the Catphan 503 phantom, the cone-beam computed tomography image obtained by the projection quality-driven tube current modulation method had a similar quality to that of conventional cone-beam computed tomography. However, the proposed method could reduce the imaging dose by 16% to 33% to achieve an equivalent contrast to noise ratio value. For the pelvis phantom, the structural similarity index was 0.992 with a dose reduction of 39.7% for the projection quality-driven tube current modulation method. The proposed method could reduce the additional dose to the patient while not degrading the image quality for cone-beam computed tomography. The projection quality-driven tube current modulation method could be especially beneficial to patients who undergo cone-beam computed tomography frequently during a treatment course.
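    In essence, the method reduces per-angle tube-current selection to a calibrated lookup: for each projection angle, pick the smallest mAs whose measured 2-dimensional contrast to noise ratio meets the target at the estimated attenuation. The Python sketch below illustrates only that selection step; the array layout and names are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def select_mas_per_angle(lut_mas, lut_cnr, mu_index_per_angle, target_cnr):
            # lut_mas: candidate mAs settings, sorted ascending
            # lut_cnr[i, j]: calibrated 2-D CNR at attenuation level j for mAs lut_mas[i]
            # mu_index_per_angle: projection angle -> index of the closest calibrated
            #                     attenuation level (from the planning-CT estimate)
            selected = {}
            for theta, j in mu_index_per_angle.items():
                feasible = np.where(lut_cnr[:, j] >= target_cnr)[0]
                # smallest mAs meeting the target; fall back to the maximum setting
                selected[theta] = lut_mas[feasible[0]] if feasible.size else lut_mas[-1]
            return selected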

  12. Pattern recognition with “materials that compute”

    PubMed Central

    Fang, Yan; Yashin, Victor V.; Levitan, Steven P.; Balazs, Anna C.

    2016-01-01

    Driven by advances in materials and computer science, researchers are attempting to design systems where the computer and material are one and the same entity. Using theoretical and computational modeling, we design a hybrid material system that can autonomously transduce chemical, mechanical, and electrical energy to perform a computational task in a self-organized manner, without the need for external electrical power sources. Each unit in this system integrates a self-oscillating gel, which undergoes the Belousov-Zhabotinsky (BZ) reaction, with an overlaying piezoelectric (PZ) cantilever. The chemomechanical oscillations of the BZ gels deflect the PZ layer, which consequently generates a voltage across the material. When these BZ-PZ units are connected in series by electrical wires, the oscillations of these units become synchronized across the network, where the mode of synchronization depends on the polarity of the PZ. We show that the network of coupled, synchronizing BZ-PZ oscillators can perform pattern recognition. The “stored” patterns are sets of polarities of the individual BZ-PZ units, and the “input” patterns are coded through the initial phase of the oscillations imposed on these units. The results of the modeling show that the input pattern closest to the stored pattern exhibits the fastest convergence time to stable synchronization behavior. In this way, networks of coupled BZ-PZ oscillators achieve pattern recognition. Further, we show that the convergence time to stable synchronization provides a robust measure of the degree of match between the input and stored patterns. Through these studies, we establish experimentally realizable design rules for creating “materials that compute.” PMID:27617290
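    As a loose toy analogy (a mean-field phase-oscillator model, not the authors' electromechanical BZ-PZ equations), the match-by-convergence-time idea can be sketched as follows: each stored polarity shifts a unit's phase by π, so inputs closer to the stored pattern start more aligned and synchronize faster.

        import numpy as np

        def convergence_time(stored, input_phases, K=0.5, dt=0.01, tol=0.99, max_steps=100000):
            # stored: +/-1 polarities; input_phases: initial phases encoding the input
            offset = np.where(np.asarray(stored) > 0, 0.0, np.pi)
            theta = np.asarray(input_phases, dtype=float) + offset
            for step in range(max_steps):
                order = np.mean(np.exp(1j * theta))        # Kuramoto order parameter
                if abs(order) >= tol:                      # network has synchronized
                    return step * dt
                theta += dt * K * np.sin(np.angle(order) - theta)  # mean-field pull
            return float('inf')   # perfectly antiphase ties may never synchronize

    In this toy, a perfectly matching input starts fully aligned (convergence time zero), and each mismatched unit starts π out of phase, lengthening the time to synchronization.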

  13. A Software Hub for High Assurance Model-Driven Development and Analysis

    DTIC Science & Technology

    2007-01-23

    verification of UML models in TLPVS. In Thomas Baar, Alfred Strohmeier, Ana Moreira, and Stephen J. Mellor, editors, UML 2004 - The Unified Modeling...volume 3785 of Lecture Notes in Computer Science, pages 52–65, Manchester, UK, Nov 2005. Springer. [GH04] Günter Graw and Peter Herrmann. Transformation

  14. From Intuition to Evidence: A Data-Driven Approach to Transforming CS Education

    ERIC Educational Resources Information Center

    Allevato, Anthony J.

    2012-01-01

    Educators in many disciplines are too often forced to rely on intuition about how students learn and the effectiveness of teaching to guide changes and improvements to their curricula. In computer science, systems that perform automated collection and assessment of programming assignments are seeing increased adoption, and these systems generate a…

  15. Effects of Response-Driven Feedback in Computer Science Learning

    ERIC Educational Resources Information Center

    Fernandez Aleman, J. L.; Palmer-Brown, D.; Jayne, C.

    2011-01-01

    This paper presents the results of a project on generating diagnostic feedback for guided learning in a first-year course on programming and a Master's course on software quality. An online multiple-choice questions (MCQs) system is integrated with neural network-based data analysis. Findings about how students use the system suggest that the…

  16. Stretching the Traditional Notion of Experiment in Computing: Explorative Experiments.

    PubMed

    Schiaffonati, Viola

    2016-06-01

    Experimentation represents today a 'hot' topic in computing. If experiments made with the support of computers, such as computer simulations, have received increasing attention from philosophers of science and technology, questions such as "what does it mean to do experiments in computer science and engineering and what are their benefits?" emerged only recently as central in the debate over the disciplinary status of the discipline. In this work we aim to show, also by means of paradigmatic examples, how the traditional notion of controlled experiment should be revised to take into account a part of the experimental practice in computing, along the lines of experimentation as exploration. Taking inspiration from the discussion on exploratory experimentation in the philosophy of science (experimentation that is not theory-driven), we advance the idea of explorative experiments that, although not new, can contribute to enlarging the debate about the nature and role of experimental methods in computing. In order to further refine this concept we recast explorative experiments as socio-technical experiments that test new technologies in their socio-technical contexts. We suggest that, when experiments are explorative, control should be understood in an a posteriori form, in opposition to the a priori form that usually takes place in traditional experimental contexts.

  17. Computational intelligence in earth sciences and environmental applications: issues and challenges.

    PubMed

    Cherkassky, V; Krasnopolsky, V; Solomatine, D P; Valdes, J

    2006-03-01

    This paper introduces a generic theoretical framework for predictive learning, and relates it to data-driven and learning applications in earth and environmental sciences. The issues of data quality, selection of the error function, incorporation of the predictive learning methods into the existing modeling frameworks, expert knowledge, model uncertainty, and other application-domain specific problems are discussed. A brief overview of the papers in the Special Issue is provided, followed by discussion of open issues and directions for future research.

  18. Trends in life science grid: from computing grid to knowledge grid.

    PubMed

    Konagaya, Akihiko

    2006-12-18

    Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities in which tacit knowledge can be shared. Extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  19. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

    Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities in which tacit knowledge can be shared. Conclusion Extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  20. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    ERIC Educational Resources Information Center

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Criteria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  1. The Application of Embodied Conversational Agents for Mentoring African American STEM Doctoral Students

    ERIC Educational Resources Information Center

    Gosha, Kinnis

    2013-01-01

    This dissertation presents the design, development and short-term evaluation of an embodied conversational agent designed to mentor human users. An embodied conversational agent (ECA) was created and programmed to mentor African American computer science majors on their decision to pursue graduate study in computing. Before constructing the ECA,…

  2. Computer Simulation of Compression and Energy Release upon Laser Irradiation of Cylindrically Symmetric Target

    NASA Astrophysics Data System (ADS)

    Kuzenov, V. V.

    2017-12-01

    The paper is devoted to the theoretical and computational study of compression and energy release for magneto-inertial plasma confinement. This approach makes it possible to create new high-density plasma sources, apply them in materials science experiments, and use them in promising areas of power engineering.

  3. Case Study on the Use of Microcomputers in Primary Schools in Bar-le-Duc (France).

    ERIC Educational Resources Information Center

    Dieschbourg, Robert

    1988-01-01

    Examines a project which involves the introduction of computer science into elementary schools to create an awareness of data processing as an intellectual, technological, and socio-cultural phenomenon. Concludes that the early computer experience and group work involved in the project enhances student social and psychological development. (GEA)

  4. Using Pedagogical Tools to Help Hispanics be Successful in Computer Science

    NASA Astrophysics Data System (ADS)

    Irish, Rodger

    Irish, Rodger, Using Pedagogical Tools to Help Hispanics Be Successful in Computer Science. Master of Science (MS), July 2017, 68 pp., 4 tables, 2 figures, 48 references. Computer science (CS) jobs are a growing field and pay a living wage, but Hispanics are underrepresented in the field. This project seeks to give an overview of several factors contributing to this problem. It will then explore some possible solutions and how a combination of tools (teaching methods) can create the best possible outcome. It is my belief that this approach can help Hispanic students succeed and fill needed jobs in the CS field. The project will then test this hypothesis. I will discuss the tools used to measure progress in both the affective and the cognitive domains. I will show how the decision to run a Computer Club was reached and present the results of the research. The conclusion will summarize the results and describe future research that still needs to be done.

  5. Stochastic Process Creation

    NASA Astrophysics Data System (ADS)

    Esparza, Javier

    In many areas of computer science entities can “reproduce”, “replicate”, or “create new instances”. Paramount examples are threads in multithreaded programs, processes in operating systems, and computer viruses, but many others exist: procedure calls create new incarnations of the callees, web crawlers discover new pages to be explored (and so “create” new tasks), divide-and-conquer procedures split a problem into subproblems, and leaves of tree-based data structures become internal nodes with children. For lack of a better name, I use the generic term systems with process creation to refer to all these entities.
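    A divide-and-conquer procedure makes the idea concrete: each recursive call creates two new instances of itself, so the call tree is a record of process creation. A toy Python illustration (the names and counter are invented here):

        def solve(xs, stats):
            # each invocation is one newly created 'process' working on a subproblem
            stats['created'] += 1
            if len(xs) <= 1:
                return list(xs)
            mid = len(xs) // 2
            left = solve(xs[:mid], stats)    # create child instance 1
            right = solve(xs[mid:], stats)   # create child instance 2
            merged, i, j = [], 0, 0          # combine the children's results
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i]); i += 1
                else:
                    merged.append(right[j]); j += 1
            return merged + left[i:] + right[j:]

        stats = {'created': 0}
        print(solve([5, 2, 8, 1, 9, 3], stats), stats)   # sorted list; 11 'processes' created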

  6. Scientific Visualization and Computational Science: Natural Partners

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third alternative to theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: Given an image, extract information about the scene. Visualization has developed from computer graphics, and the inverse task: Given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.

  7. Computational Experiments for Science and Engineering Education

    NASA Technical Reports Server (NTRS)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations-that they help people understand natural phenomena and solve engineering problems-must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  8. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  9. Agile science: creating useful products for behavior change in the real world.

    PubMed

    Hekler, Eric B; Klasnja, Predrag; Riley, William T; Buman, Matthew P; Huberty, Jennifer; Rivera, Daniel E; Martin, Cesar A

    2016-06-01

    Evidence-based practice is important for behavioral interventions, but there is debate on how best to support real-world behavior change. The purpose of this paper is to define products and a preliminary process for efficiently and adaptively creating and curating a knowledge base for behavior change for real-world implementation. We look to evidence-based practice suggestions and draw parallels to software development. We argue to target three products: (1) the smallest, meaningful, self-contained, and repurposable behavior change modules of an intervention; (2) "computational models" that define the interaction between modules, individuals, and context; and (3) "personalization" algorithms, which are decision rules for intervention adaptation. The "agile science" process includes a generation phase, whereby contender operational definitions and constructs of the three products are created and assessed for feasibility, and an evaluation phase, whereby effect size estimates/causal inferences are created. The process emphasizes early-and-often sharing. If correct, agile science could enable a more robust knowledge base for behavior change.
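    Of the three products, the "personalization" algorithm is the most directly computational: it is simply a decision rule mapping recent observations to the next behavior change module. A hypothetical sketch follows; the thresholds and module names are invented for illustration, not drawn from the paper.

        def choose_module(avg_daily_steps, step_goal, engaged_past_3_days):
            # illustrative decision rule for intervention adaptation
            if not engaged_past_3_days:
                return 're-engagement-message'       # self-contained, repurposable module
            if avg_daily_steps < 0.8 * step_goal:
                return 'increase-prompt-frequency'
            return 'maintenance-feedback'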

  10. Target volume and artifact evaluation of a new data-driven 4D CT.

    PubMed

    Martin, Rachael; Pan, Tinsu

    Four-dimensional computed tomography (4D CT) is often used to define the internal gross target volume (IGTV) for radiation therapy of lung cancer. Traditionally, this technique requires the use of an external motion surrogate; however, a new, image-data-driven 4D CT has become available. This study aims to describe this data-driven 4D CT and compare target contours created with it to those created using standard 4D CT. Cine CT data of 35 patients undergoing stereotactic body radiation therapy were collected and sorted into phases using standard and data-driven 4D CT. IGTV contours were drawn using a semiautomated method on maximum intensity projection images of both 4D CT methods. Errors resulting from reproducibility of the method were characterized. A comparison of phase image artifacts was made using a normalized cross-correlation method that assigned a score from +1 (data-driven "better") to -1 (standard "better"). The volume difference between the data-driven and standard IGTVs was not significant (the data-driven volume was 2.1 ± 1.0% smaller, P = .08). The Dice similarity coefficient showed good similarity between the contours (0.949 ± 0.006). The mean surface separation was 0.4 ± 0.1 mm and the Hausdorff distance was 3.1 ± 0.4 mm. An average artifact score of +0.37 indicated that the data-driven method had significantly fewer and/or less severe artifacts than the standard method (P = 1.5 × 10⁻⁵ for difference from 0). On average, the difference between IGTVs derived from data-driven and standard 4D CT was not clinically relevant or statistically significant, suggesting data-driven 4D CT can be used in place of standard 4D CT without adjustments to IGTVs. The relatively large differences in some patients were usually attributed to limitations in automatic contouring or differences in artifacts. Artifact reduction and setup simplicity suggest a clinical advantage to data-driven 4D CT. Published by Elsevier Inc.
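    The Dice similarity coefficient used above is the standard overlap measure 2|A∩B| / (|A| + |B|); for two binary IGTV masks it can be computed in a few lines (a generic sketch, not the study's analysis code):

        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            # mask_a, mask_b: boolean voxel arrays for the two contours
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            total = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / total if total else 1.0

    A value of 0.949, as reported, indicates near-complete overlap between the two contour sets.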

  11. The ethics of smart cities and urban science.

    PubMed

    Kitchin, Rob

    2016-12-28

    Software-enabled technologies and urban big data have become essential to the functioning of cities. Consequently, urban operational governance and city services are becoming highly responsive to a form of data-driven urbanism that is the key mode of production for smart cities. At the heart of data-driven urbanism is a computational understanding of city systems that reduces urban life to logic and calculative rules and procedures, which is underpinned by an instrumental rationality and realist epistemology. This rationality and epistemology are informed by and sustains urban science and urban informatics, which seek to make cities more knowable and controllable. This paper examines the forms, practices and ethics of smart cities and urban science, paying particular attention to: instrumental rationality and realist epistemology; privacy, datafication, dataveillance and geosurveillance; and data uses, such as social sorting and anticipatory governance. It argues that smart city initiatives and urban science need to be re-cast in three ways: a re-orientation in how cities are conceived; a reconfiguring of the underlying epistemology to openly recognize the contingent and relational nature of urban systems, processes and science; and the adoption of ethical principles designed to realize benefits of smart cities and urban science while reducing pernicious effects.This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  12. Arctic research in the classroom: A teacher's experiences translated into data driven lesson plans

    NASA Astrophysics Data System (ADS)

    Kendrick, E. O.; Deegan, L.

    2011-12-01

    Incorporating research into high school science classrooms can promote critical thinking skills and provide a link between students and the scientific community. Basic science concepts become more relevant to students when taught in the context of research. A vital component of incorporating current research into classroom lessons is involving high school teachers in authentic research. The National Science Foundation-sponsored Research Experience for Teachers (RET) program has inspired me to bring research to my classroom, communicate the importance of research in the classroom to other teachers, and create lasting connections between students and the research community. Through my experiences as an RET at Toolik Field Station in Alaska, I have created several hands-on lessons and laboratory activities that are based on current arctic research and climate change. Each lesson uses arctic research as a theme for exemplifying basic biology concepts as well as increasing awareness of current topics such as climate change. For instance, data collected on the Kuparuk River will be incorporated into classroom activities that teach concepts such as primary production, trophic levels in a food chain, and nutrient cycling within an ecosystem. Students will not only understand the biological concepts but also recognize the ecological implications of the research being conducted in the arctic. By using my experience in arctic research as a template, my students will gain a deeper understanding of the scientific process. I hope to create a crucial link of information between the science community and science education in public schools.

  13. A Living Library: New Model for Global Electronic Interactivity and Networking in the Garden.

    ERIC Educational Resources Information Center

    Sherk, Bonnie

    1995-01-01

    Describes the Living Library, an idea to create a network of international cultural parks in different cities of the world using new communications technologies on-line in a garden setting, bringing the humanities, sciences, and social sciences to life through plants, visual and performed artworks, lectures, and computer and on-line satellite…

  14. Time-resolved Sensing of Meso-scale Shock Compression with Multilayer Photonic Crystal Structures

    NASA Astrophysics Data System (ADS)

    Scripka, David; Lee, Gyuhyon; Summers, Christopher J.; Thadhani, Naresh

    2017-06-01

    Multilayer Photonic Crystal structures can provide spatially and temporally resolved data needed to validate theoretical and computational models relevant for understanding shock compression in heterogeneous materials. Two classes of 1-D photonic crystal multilayer structures were studied: optical microcavities (OMC) and distributed Bragg reflectors (DBR). These 0.5- to 5-micron-thick structures were composed of SiO2, Al2O3, Ag, and PMMA layers fabricated primarily via e-beam evaporation. The multilayers have unique spectral signatures inherently linked to their time-resolved physical states. By observing shock-induced changes in these signatures, an optically based pressure sensor was developed. Results to date indicate that both OMCs and DBRs exhibit nanosecond-resolved spectral shifts of several to tens of nanometers under laser-driven shock compression loads of 0-10 GPa, with the magnitude of the shift strongly correlating to the shock load magnitude. Additionally, spatially and temporally resolved spectral shifts under heterogeneous laser-driven shock compression created by partial beam blocking have been successfully demonstrated. These results illustrate the potential for multilayer structures to serve as meso-scale sensors, capturing temporal and spatial pressure profile evolutions in shock-compressed heterogeneous materials, and revealing meso-scale pressure distributions across a shocked surface. Supported by DTRA Grant HDTRA1-12-1-005 and DoD, AFOSR, National Defense Science and Eng. Graduate Fellowship, 32 CFR 168a.

  15. Information visualisation for science and policy: engaging users and avoiding bias.

    PubMed

    McInerny, Greg J; Chen, Min; Freeman, Robin; Gavaghan, David; Meyer, Miriah; Rowland, Francis; Spiegelhalter, David J; Stefaner, Moritz; Tessarolo, Geizi; Hortal, Joaquin

    2014-03-01

    Visualisations and graphics are fundamental to studying complex subject matter. However, beyond acknowledging this value, scientists and science-policy programmes rarely consider how visualisations can enable discovery, create engaging and robust reporting, or support online resources. Producing accessible and unbiased visualisations from complicated, uncertain data requires expertise and knowledge from science, policy, computing, and design. However, visualisation is rarely found in our scientific training, organisations, or collaborations. As new policy programmes develop [e.g., the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES)], we need information visualisation to permeate increasingly both the work of scientists and science policy. The alternative is increased potential for missed discoveries, miscommunications, and, at worst, creating a bias towards the research that is easiest to display. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. We're Having a Seed Sale.

    ERIC Educational Resources Information Center

    Riss, Pam Helfers

    1994-01-01

    Botany meets computer science in this activity, which challenges students to create a computerized seed catalog. Class members work together to develop a database of plants, much like the major seed companies do. (PR)

  17. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  18. Harnessing Big Data for Systems Pharmacology

    PubMed Central

    Xie, Lei; Draizen, Eli J.; Bourne, Philip E.

    2017-01-01

    Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable. PMID:27814027

  19. Harnessing Big Data for Systems Pharmacology.

    PubMed

    Xie, Lei; Draizen, Eli J; Bourne, Philip E

    2017-01-06

    Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable.

  20. Convergence of service, policy, and science toward consumer-driven mental health care.

    PubMed

    Carroll, Christopher D; Manderscheid, Ronald W; Daniels, Allen S; Compagni, Amelia

    2006-12-01

    A common theme is emerging in sentinel reports on the United States health care system. Consumer relevance and demands on service systems and practices are influencing how mental health care is delivered and how systems will be shaped in the future. The present report seeks to assemble a confluence of consumer-driven themes from noteworthy reports on the state of the mental health system in the U.S. It also explores innovative efforts, promising practices, and collaborative efforts, as well as identification of barriers to consumer-directed care, with possible solutions. The report reviews the relevant public mental health policy and data used in published work. The findings indicate an increasing public and private interest in promoting consumer-driven care, even though historical systems of care predominate and often create barriers to widespread redesign of a consumer-centered mental health care system. Innovative consumer-driven practices are increasing as quality, choice, and self-determination become integral parts of a redesigned U.S. mental health care system. The use of consumer-driven approaches in mental health is limited at best. These programs challenge industry norms and traditional practices. Limitations include the need for additional and thorough evaluations of effectiveness (cost and clinical) and replicability of consumer-directed programs. Consumer-driven services indicate that mental health consumers are expecting to be more participative in their mental health care. This expectation will influence how traditional mental health services and providers become more consumer-centric and meet the demand. Public and private interest in consumer-driven health care ranges from creating cost-conscious consumers to individualized control of recovery. The health care sector should seek to invest more resources in the provision of consumer-driven health care programs. The results of this study have implications and are informative for other countries where consumer-directed care is delivered in either the private or public health care systems. More research is needed to obtain further evidence on the use of consumer-driven services and their overall effectiveness.

  1. Computer Games Created by Middle School Girls: Can They Be Used to Measure Understanding of Computer Science Concepts?

    ERIC Educational Resources Information Center

    Denner, Jill; Werner, Linda; Ortiz, Eloy

    2012-01-01

    Computer game programming has been touted as a promising strategy for engaging children in the kinds of thinking that will prepare them to be producers, not just users of technology. But little is known about what they learn when programming a game. In this article, we present a strategy for coding student games, and summarize the results of an…

  2. CosmoQuest: A Cyber-Infrastructure for Crowdsourcing Planetary Surface Mapping and More

    NASA Astrophysics Data System (ADS)

    Gay, P.; Lehan, C.; Moore, J.; Bracey, G.; Gugliucci, N.

    2014-04-01

    The design and implementation of programs to crowdsource science presents a unique set of challenges to system architects, programmers, and designers. The CosmoQuest Citizen Science Builder (CSB) is an open source platform designed to take advantage of crowd computing and open source platforms to solve crowdsourcing problems in Planetary Science. CSB combines a clean user interface with a powerful back end to allow the quick design and deployment of citizen science sites that meet the needs of both the random Joe Public and the detail-driven Albert Professional. In this talk, the software will be overviewed, and the results of usability testing and accuracy testing with both citizen and professional scientists will be discussed.

  3. Materials inspired by mathematics

    PubMed Central

    Kotani, Motoko; Ikeda, Susumu

    2016-01-01

    Our world is transforming into an interacting system of the physical world and the digital world. What will materials science be in this new era? With the rising expectations of the rapid development of computers, information science and mathematical science including statistics and probability theory, ‘data-driven materials design’ has become a common term. There is knowledge and experience gained in the physical world in the form of know-how and recipes for the creation of materials. An important key is how we establish a vocabulary and grammar to translate them into the language of the digital world. In this article, we outline how materials science develops when it encounters mathematics, showing some emerging directions. PMID:27877877

  4. Arctic Boreal Vulnerability Experiment (ABoVE) Science Cloud

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Schnase, J. L.; McInerney, M.; Webster, W. P.; Sinno, S.; Thompson, J. H.; Griffith, P. C.; Hoy, E.; Carroll, M.

    2014-12-01

    The effects of climate change are being revealed at alarming rates in the Arctic and Boreal regions of the planet. NASA's Terrestrial Ecology Program has launched a major field campaign to study these effects over the next 5 to 8 years. The Arctic Boreal Vulnerability Experiment (ABoVE) will challenge scientists to take measurements in the field, study remote observations, and even run models to better understand the impacts of a rapidly changing climate for areas of Alaska and western Canada. The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center (GSFC) has partnered with the Terrestrial Ecology Program to create a science cloud designed for this field campaign - the ABoVE Science Cloud. The cloud combines traditional high performance computing with emerging technologies to create an environment specifically designed for large-scale climate analytics. The ABoVE Science Cloud utilizes (1) virtualized high-speed InfiniBand networks, (2) a combination of high-performance file systems and object storage, and (3) virtual system environments tailored for data intensive, science applications. At the center of the architecture is a large object storage environment, much like a traditional high-performance file system, that supports data proximal processing using technologies like MapReduce on a Hadoop Distributed File System (HDFS). Surrounding the storage is a cloud of high performance compute resources with many processing cores and large memory coupled to the storage through an InfiniBand network. Virtual systems can be tailored to a specific scientist and provisioned on the compute resources with extremely high-speed network connectivity to the storage and to other virtual systems. In this talk, we will present the architectural components of the science cloud and examples of how it is being used to meet the needs of the ABoVE campaign. In our experience, the science cloud approach significantly lowers the barriers and risks to organizations that require high performance computing solutions and provides the NCCS with the agility required to meet our customers' rapidly increasing and evolving requirements.
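    The data-proximal pattern at the heart of this architecture is easiest to see as a map/reduce skeleton: the map step runs where the data live and emits key-value pairs, and the reduce step aggregates them. The framework-free Python illustration below is conceptual only; the ABoVE Science Cloud itself runs MapReduce over HDFS, and the record format here is invented.

        from collections import defaultdict

        def map_phase(records):
            # runs next to the data: emit (region, temperature) pairs per record
            for region, temp_c in records:
                yield region, temp_c

        def reduce_phase(pairs):
            # aggregate values per key, here a mean temperature per region
            sums, counts = defaultdict(float), defaultdict(int)
            for key, value in pairs:
                sums[key] += value
                counts[key] += 1
            return {key: sums[key] / counts[key] for key in sums}

        obs = [('alaska', -3.0), ('yukon', -7.5), ('alaska', -1.0)]
        print(reduce_phase(map_phase(obs)))   # {'alaska': -2.0, 'yukon': -7.5}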

  5. Graduate Training at the Interface of Computational and Experimental Biology: An Outcome Report from a Partnership of Volunteers between a University and a National Laboratory

    PubMed Central

    von Arnim, Albrecht G.; Missra, Anamika

    2017-01-01

    Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of Genome Science and Technology in partnership with the nearby Oak Ridge National Laboratory. Here, we report outcome data that attest to the program’s effectiveness in graduating computationally enabled biologists for diverse careers. Among 77 PhD graduates since 2003, the majority came with traditional degrees in the biological sciences, yet two-thirds moved into computational or hybrid (computational–experimental) positions. We describe the curriculum of the program and how it has changed. We also summarize how the program seeks to establish cohesion between computational and experimental biologists. This type of program can respond flexibly and dynamically to unmet training needs. In conclusion, this study from a flagship, state-supported university may serve as a reference point for creating a stable, degree-granting, interdepartmental graduate program in computational biology and allied areas. PMID:29167223

  6. Exploring Do-It-Yourself Approaches in Computational Quantum Chemistry: The Pedagogical Benefits of the Classical Boys Algorithm

    ERIC Educational Resources Information Center

    Orsini, Gabriele

    2015-01-01

    The ever-increasing impact of molecular quantum calculations over chemical sciences implies a strong and urgent need for the elaboration of proper teaching strategies in university curricula. In such perspective, this paper proposes an extensive project for a student-driven, cooperative, from-scratch implementation of a general Hartree-Fock…

  7. Creating an Online Laboratory

    DTIC Science & Technology

    2015-03-18

    Problem (TSP) to solve, a canonical computer science problem that involves identifying the shortest itinerary for a hypothetical salesman traveling among a...also created working versions of the travelling salesperson problem , prisoners’ dilemma, public goods game, ultimatum game, word ladders, and...the task within networks of others performing the task. Thus, we built five problems which could be embedded in networks: the traveling salesperson

  8. It's All About the Data: Workflow Systems and Weather

    NASA Astrophysics Data System (ADS)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, e.g., captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real time information. 
If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
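    A provenance record of the kind described, capturing the model version, configuration parameters, input products, execution environment, and owner, might be structured as in the sketch below; the field names and values are hypothetical, not LEAD's actual schema.

        from dataclasses import dataclass
        from typing import Dict, List

        @dataclass
        class ProvenanceRecord:
            # lineage metadata for one workflow-produced data product
            product_id: str
            model_version: str            # executable version of the model
            parameters: Dict[str, str]    # configuration parameters of the run
            input_products: List[str]     # upstream product identifiers
            execution_host: str           # execution environment
            owner: str

        record = ProvenanceRecord(
            product_id='forecast-20090501-00z',
            model_version='wrf-3.0.1',
            parameters={'grid_km': '4', 'forecast_hours': '24'},
            input_products=['radar-mosaic-0455z', 'metar-obs-0450z'],
            execution_host='compute-node-17',
            owner='jdoe',
        )

    Records like this make a product attributable and reproducible: re-running the same model version with the same parameters and inputs should regenerate it.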

  9. The visual attention saliency map for movie retrospection

    NASA Astrophysics Data System (ADS)

    Rogalska, Anna; Napieralski, Piotr

    2018-04-01

    The visual saliency map is becoming an important and challenging topic for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience, and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential cues in motion pictures. By combining these cues, we can obtain a credible saliency map at a low computational cost.
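    A minimal sketch of this kind of cue fusion, assuming per-pixel cue maps are already available. The weights and the normalization are illustrative placeholders, not the model of Rogalska and Napieralski; the point is that a weighted sum of normalized cue maps is cheap to compute.

      import numpy as np

      def combine_saliency(static_map, face_map, motion_map, weights=(0.4, 0.3, 0.3)):
          # Normalize each cue to [0, 1] so no single cue dominates by scale alone,
          # then fuse the cues as a weighted sum.
          cues = [static_map, face_map, motion_map]
          normed = [(m - m.min()) / (np.ptp(m) + 1e-9) for m in cues]
          fused = sum(w * m for w, m in zip(weights, normed))
          return fused / fused.max()

      # Dummy cue maps on a coarse grid (coarse resolution keeps the cost low).
      rng = np.random.default_rng(0)
      saliency = combine_saliency(rng.random((72, 128)),
                                  rng.random((72, 128)),
                                  rng.random((72, 128)))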

  10. Natural Language Processing in Game Studies Research: An Overview

    ERIC Educational Resources Information Center

    Zagal, Jose P.; Tomuro, Noriko; Shepitsen, Andriy

    2012-01-01

    Natural language processing (NLP) is a field of computer science and linguistics devoted to creating computer systems that use human (natural) language as input and/or output. The authors propose that NLP can also be used for game studies research. In this article, the authors provide an overview of NLP and describe some research possibilities…

  11. The Issue of Gender Equity in Computer Science--What Students Say

    ERIC Educational Resources Information Center

    Miliszewska, Iwona; Barker, Gayle; Henderson, Fiona; Sztendur, Ewa

    2006-01-01

    The under-representation and poor retention of women in computing courses at Victoria University is a concern that has continued to defy all attempts to resolve it. Despite a range of initiatives created to encourage participation and improve retention of females in the courses, the percentage of female enrolments has declined significantly in…

  12. Developing Deep Learning Applications for Life Science and Pharma Industry.

    PubMed

    Siegismund, Daniel; Tolkachev, Vasily; Heyse, Stephan; Sick, Beate; Duerr, Oliver; Steigele, Stephan

    2018-06-01

    Deep Learning has boosted artificial intelligence over the past 5 years and is now seen as one of the major areas of technological innovation, predicted to replace many repetitive but complex tasks of human labor within the next decade. It is also expected to be 'game changing' for research activities in pharma and life sciences, where large sets of similar yet complex data samples are systematically analyzed. Deep learning is currently conquering formerly expert domains, especially in areas requiring perception that were previously not amenable to standard machine learning. A typical example is the automated analysis of images, which are produced en masse in many domains, e.g., in high-content screening or digital pathology. Deep learning makes it possible to create competitive applications in so-far defined core domains of 'human intelligence'. Applications of artificial intelligence have been enabled in recent years by (i) the massive availability of data samples, collected in pharma-driven drug programs (= 'big data'), (ii) algorithmic advancements in deep learning, and (iii) the increase in compute power. Such applications are based on software frameworks with specific strengths and weaknesses. Here, we introduce typical applications and underlying frameworks for deep learning, with a set of practical criteria for developing production-ready solutions in life science and pharma research. Based on our own experience in successfully developing deep learning applications, we provide suggestions and a baseline for selecting the most suitable frameworks for future-proof and cost-effective development.
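    To ground the framework discussion, here is a deliberately small image-classification sketch in PyTorch, one of the frameworks typically compared in such evaluations. The layer sizes, the 64x64 input, and the two-class output are illustrative placeholders, not a production high-content-screening model.

      import torch
      import torch.nn as nn

      # Tiny CNN: two conv blocks, global pooling, linear classifier.
      model = nn.Sequential(
          nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
          nn.MaxPool2d(2),
          nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
          nn.AdaptiveAvgPool2d(1), nn.Flatten(),
          nn.Linear(32, 2),
      )

      images = torch.randn(8, 3, 64, 64)                 # a dummy mini-batch
      labels = torch.randint(0, 2, (8,))
      loss = nn.CrossEntropyLoss()(model(images), labels)
      loss.backward()                                    # gradients for one training step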

  13. Hydrocomplexity: Addressing water security and emergent environmental risks

    NASA Astrophysics Data System (ADS)

    Kumar, Praveen

    2015-07-01

    Water security and emergent environmental risks are among the most significant societal concerns. They are highly interlinked with other global risks such as those related to climate, human health, food, human migration, biodiversity loss, and urban sustainability. Emergent risks result from the confluence of unanticipated interactions from evolving interdependencies between complex systems, such as those embedded in the water cycle. They are associated with the novelty of dynamical possibilities that have significant potential consequences for human and ecological systems, and not with probabilities based on historical precedent. To ensure water security we need to be able to anticipate the likelihood of risk possibilities, as they present the prospect of the greatest impact through cascades of vulnerabilities. They arise due to a confluence of nonstationary drivers, including growing population, climate change, demographic shifts, urban growth, and economic expansion, which create novel interdependencies leading to the potential for cascading network effects. Hydrocomplexity aims to address water security and emergent risks through the development of science, methods, and practices with the potential to foster a "Blue Revolution," akin to the Green Revolution for food security. It blends hard, infrastructure-based solutions with soft, knowledge-driven solutions to increase the range of planning, design, management, mitigation, and adaptation strategies. It provides a conceptual and synthetic framework enabling us to integrate discovery science and engineering, observational and information science, computational and communication systems, and social and institutional approaches to address consequential water and environmental challenges.

  14. The application of cloud computing to scientific workflows: a study of cost and performance.

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
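    A back-of-envelope version of the kind of cost analysis such studies perform: total cost decomposes into compute, data transfer, and storage. The prices below are illustrative placeholders, not the figures from this paper or from any specific provider.

      def cloud_cost_usd(cpu_hours, gb_in, gb_out, gb_stored_month,
                         usd_cpu_hour=0.05, usd_gb_in=0.0,
                         usd_gb_out=0.09, usd_gb_month=0.023):
          # Cost = processing + ingress + egress + storage, the main billable items.
          return (cpu_hours * usd_cpu_hour + gb_in * usd_gb_in
                  + gb_out * usd_gb_out + gb_stored_month * usd_gb_month)

      # A compute-heavy run is cheap on the cloud; a transfer-heavy run is not.
      print(f"CPU-bound run: ${cloud_cost_usd(1000, 10, 5, 50):,.2f}")
      print(f"I/O-bound run: ${cloud_cost_usd(100, 2000, 500, 500):,.2f}")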

  15. Science and Software

    NASA Astrophysics Data System (ADS)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting, or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel, and it allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name recognition when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say that software is a tool, not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting, or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals; the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releases, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far-reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, a well-documented open-source release should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researchers who use software, as opposed to developing it, are more attractive university hires because they are more likely to be "big picture" scientists who publish in the highest-profile journals, although sometimes the two go together.

  16. eHealth research from the user's perspective.

    PubMed

    Hesse, Bradford W; Shneiderman, Ben

    2007-05-01

    The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.

  17. Datalist: A Value Added Service to Enable Easy Data Selection

    NASA Technical Reports Server (NTRS)

    Li, Angela; Hegde, Mahabaleshwa; Bryant, Keith; Seiler, Edward; Shie, Chung-Lin; Teng, William; Liu, Zhong; Hearty, Thomas; Shen, Suhung; Kempler, Steven

    2016-01-01

    Imagine a user wanting to study hurricane events. This could involve searching and downloading multiple data variables from multiple data sets. The currently available services from the Goddard Earth Sciences Data and Information Services Center (GES DISC) only allow the user to select one data set at a time. The GES DISC started a Data List initiative, in order to enable users to easily select multiple data variables. A Data List is a collection of predefined or user-defined data variables from one or more archived data sets. Target users of Data Lists include science teams, individual science researchers, application users, and educational users. Data Lists are more than just data. Data Lists effectively provide users with a sophisticated integrated data and services package, including metadata, citation, documentation, visualization, and data-specific services, all available from one-stop shopping. Data Lists are created based on the software architecture of the GES DISC Unified User Interface (UUI). The Data List service is completely data-driven, and a Data List is treated just as any other data set. The predefined Data Lists, created by the experienced GES DISC science support team, should save a significant amount of time that users would otherwise have to spend.
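    A hypothetical sketch of what a user-defined Data List might look like as a structured record. The field names are assumptions for illustration, not the GES DISC UUI schema.

      hurricane_list = {
          "name": "Hurricane case study",
          "creator": "user",               # predefined lists come from the science support team
          "variables": [
              {"dataset": "TRMM_3B42", "variable": "precipitation"},
              {"dataset": "MERRA-2", "variable": "surface_wind_speed"},
          ],
          # The value-added package bundled with the data variables:
          "services": ["metadata", "citation", "documentation", "visualization", "subsetting"],
      }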

  18. Integrating Intelligent Systems Domain Knowledge Into the Earth Science Curricula

    NASA Astrophysics Data System (ADS)

    Güereque, M.; Pennington, D. D.; Pierce, S. A.

    2017-12-01

    High-volume heterogeneous datasets are becoming ubiquitous, migrating to center stage over the last ten years and transcending the boundaries of computationally intensive disciplines into the mainstream, becoming a fundamental part of every science discipline. Despite the fact that large datasets are now pervasive across industries and academic disciplines, the skills needed to work with them are generally absent from earth science programs. This has left the bulk of the student population without access to curricula that systematically teach appropriate intelligent-systems skills, creating a void for skill sets that should be universal given their need and marketability. While some guidance regarding appropriate computational thinking and pedagogy is appearing, there exist few examples where these have been specifically designed and tested within the earth science domain. Furthermore, best practices from learning science have not yet been widely tested for developing intelligent systems-thinking skills. This research developed and tested evidence-based computational skill modules that target this deficit, with the intention of informing the earth science community as it continues to incorporate intelligent systems techniques and reasoning into its research and classrooms.

  19. Art, science, and immersion: data-driven experiences

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta

    2013-03-01

    This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the-art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.

  20. Manifesto of computational social science

    NASA Astrophysics Data System (ADS)

    Conte, R.; Gilbert, N.; Bonelli, G.; Cioffi-Revilla, C.; Deffuant, G.; Kertesz, J.; Loreto, V.; Moat, S.; Nadal, J.-P.; Sanchez, A.; Nowak, A.; Flache, A.; San Miguel, M.; Helbing, D.

    2012-11-01

    The increasing integration of technology into our lives has created unprecedented volumes of data on society's everyday behaviour. Such data opens up exciting new opportunities to work towards a quantitative understanding of our complex social systems, within the realms of a new discipline known as Computational Social Science. Against a background of financial crises, riots and international epidemics, the urgent need for a greater comprehension of the complexity of our interconnected global society and an ability to apply such insights in policy decisions is clear. This manifesto outlines the objectives of this new scientific direction, considering the challenges involved in it, and the extensive impact on science, technology and society that the success of this endeavour is likely to bring about.

  1. Challenges in integrating multidisciplinary data into a single e-infrastructure

    NASA Astrophysics Data System (ADS)

    Atakan, Kuvvet; Jeffery, Keith G.; Bailo, Daniele; Harrison, Matthew

    2015-04-01

    The European Plate Observing System (EPOS) aims to create a pan-European infrastructure for solid Earth science to support a safe and sustainable society. The mission of EPOS is to monitor and understand the dynamic and complex Earth system by relying on new e-science opportunities and integrating diverse and advanced Research Infrastructures in Europe for solid Earth science. EPOS will enable innovative multidisciplinary research for a better understanding of the Earth's physical and chemical processes that control earthquakes, volcanic eruptions, ground instability, and tsunamis, as well as the processes driving tectonics and Earth's surface dynamics. EPOS will improve our ability to better manage the use of the subsurface of the Earth. Through integration of data, models, and facilities, EPOS will allow the Earth science community to make a step change in developing new concepts and tools for key answers to scientific and socio-economic questions concerning geo-hazards and geo-resources, as well as Earth science applications to the environment and to human welfare. EPOS is now entering its Implementation Phase (EPOS-IP). One of the main challenges during the implementation phase is the integration of multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS) and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products, and services will be integrated into a platform, the ICS system, that will ensure their interoperability and access to these services by the scientific community as well as other users within society. This requires dedicated tasks for interactions with the various TCS-WPs, as well as the various distributed ICS (ICS-Ds), such as High Performance Computing (HPC) facilities, large-scale data storage facilities, and complex processing and visualization tools. Computational Earth Science (CES) services are identified as a transversal activity and as such need to be harmonized and provided within the ICS. In order to develop a metadata catalogue and the ICS system, the content from the entire spectrum of services included in the TCS, ICS-Ds, and CES activities needs to be organized in a systematic manner, taking into account global and European IT standards while complying with user needs and data-provider requirements.

  2. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    NASA Astrophysics Data System (ADS)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

    By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons learned for continuing development. Applicability of Cloud Computing: Commercial cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault-tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel, distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment, where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services.
In most of our work, we provisioned compute resources using a custom application, called Wrangler. Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
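    A crude runtime model that captures the CPU-bound versus I/O-bound distinction drawn above: total time is roughly the compute work divided among cores, plus the data moved over the network. The numbers are illustrative, not measurements from the Montage or periodogram runs.

      def runtime_hours(core_hours, cores, data_tb, gbit_per_s):
          compute_h = core_hours / cores
          io_h = data_tb * 8000.0 / gbit_per_s / 3600.0   # TB -> gigabits -> hours
          return compute_h + io_h

      # Periodogram-like (CPU-bound): the I/O term barely matters.
      print(runtime_hours(core_hours=5000, cores=512, data_tb=0.1, gbit_per_s=1.0))
      # Montage-like (I/O-bound): fast networks and parallel file systems dominate.
      print(runtime_hours(core_hours=200, cores=512, data_tb=20.0, gbit_per_s=1.0))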

  3. The ethics of smart cities and urban science

    PubMed Central

    2016-01-01

    Software-enabled technologies and urban big data have become essential to the functioning of cities. Consequently, urban operational governance and city services are becoming highly responsive to a form of data-driven urbanism that is the key mode of production for smart cities. At the heart of data-driven urbanism is a computational understanding of city systems that reduces urban life to logic and calculative rules and procedures, which is underpinned by an instrumental rationality and realist epistemology. This rationality and epistemology are informed by, and sustain, urban science and urban informatics, which seek to make cities more knowable and controllable. This paper examines the forms, practices and ethics of smart cities and urban science, paying particular attention to: instrumental rationality and realist epistemology; privacy, datafication, dataveillance and geosurveillance; and data uses, such as social sorting and anticipatory governance. It argues that smart city initiatives and urban science need to be re-cast in three ways: a re-orientation in how cities are conceived; a reconfiguring of the underlying epistemology to openly recognize the contingent and relational nature of urban systems, processes and science; and the adoption of ethical principles designed to realize the benefits of smart cities and urban science while reducing pernicious effects. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336794

  4. Computational Typologies of Multidimensional End-of-Primary-School Performance Profiles from an Educational Perspective of Large-Scale TIMSS and PIRLS Surveys

    ERIC Educational Resources Information Center

    Unlu, Ali; Schurig, Michael

    2015-01-01

    Recently, performance profiles in reading, mathematics and science were created using the data collectively available in the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) 2011. In addition, a classification of children to the end of their primary school years was…

  5. Mars for Earthlings: An Analog Approach to Mars in Undergraduate Education

    PubMed Central

    Kahmann-Robinson, Julia

    2014-01-01

    Abstract Mars for Earthlings (MFE) is a terrestrial Earth analog pedagogical approach to teaching undergraduate geology, planetary science, and astrobiology. MFE utilizes Earth analogs to teach Mars planetary concepts, with a foundational backbone in Earth science principles. The field of planetary science is rapidly changing with new technologies and higher-resolution data sets. Thus, it is increasingly important to understand geological concepts and processes for interpreting Mars data. The MFE curriculum is topically driven to facilitate easy integration of content into new or existing courses. The Earth-Mars systems approach explores planetary origins, Mars missions, rocks and minerals, active driving forces/tectonics, surface sculpting processes, astrobiology, future explorations, and hot topics in an inquiry-driven environment. The curriculum relies heavily upon multimedia resources, software programs such as Google Mars and JMARS, as well as NASA mission data such as THEMIS, HiRISE, CRISM, and rover images. Two years of MFE class evaluation data suggest that science literacy and general interest in Mars geology and astrobiology topics increased after participation in the MFE curriculum. Students also used newly developed skills to create a Mars mission team presentation. The MFE curriculum, learning modules, and resources are available online at http://serc.carleton.edu/marsforearthlings/index.html. Key Words: Mars—Geology—Planetary science—Astrobiology—NASA education. Astrobiology 14, 42–49. PMID:24359289

  6. Artificial-life researchers try to create social reality.

    PubMed

    Flam, F

    1994-08-12

    Some scientists, among them cosmologist Stephen Hawking, argue that computer viruses are alive. A better case might be made for many of the self-replicating silicon-based creatures featured at the fourth Conference on Artificial Life, held on 5 to 8 July in Boston. Researchers from computer science, biology, and other disciplines presented computer programs that, among other things, evolved cooperative strategies in a selfish world and recreated themselves in ever more complex forms.

  7. Overview of NASA's Universe of Learning: An Integrated Astrophysics STEM Learning and Literacy Program

    NASA Astrophysics Data System (ADS)

    Smith, Denise; Lestition, Kathleen; Squires, Gordon; Biferno, Anya A.; Cominsky, Lynn; Manning, Colleen; NASA's Universe of Learning Team

    2018-01-01

    NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is the result of a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University, and is one of 27 competitively-selected cooperative agreements within the NASA Science Mission Directorate STEM Activation program. The NASA's Universe of Learning team draws upon cutting-edge science and works closely with Subject Matter Experts (scientists and engineers) from across the NASA Astrophysics Physics of the Cosmos, Cosmic Origins, and Exoplanet Exploration themes. Together we develop and disseminate data tools and participatory experiences, multimedia and immersive experiences, exhibits and community programs, and professional learning experiences that meet the needs of our audiences, with attention to underserved and underrepresented populations. In doing so, scientists and educators from the partner institutions work together as a collaborative, integrated Astrophysics team to support NASA objectives to enable STEM education, increase scientific literacy, advance national education goals, and leverage efforts through partnerships. Robust program evaluation is central to our efforts, and utilizes portfolio analysis, process studies, and studies of reach and impact. This presentation will provide an overview of NASA's Universe of Learning, our direct connection to NASA Astrophysics, and our collaborative work with the NASA Astrophysics science community.

  8. Towards efficient data exchange and sharing for big-data driven materials science: metadata and data formats

    NASA Astrophysics Data System (ADS)

    Ghiringhelli, Luca M.; Carbogno, Christian; Levchenko, Sergey; Mohamed, Fawzi; Huhs, Georg; Lüders, Martin; Oliveira, Micael; Scheffler, Matthias

    2017-11-01

    With big-data driven materials research, the new paradigm of materials science, sharing and wide accessibility of data are becoming crucial aspects. Obviously, a prerequisite for data exchange and big-data analytics is standardization, which means using consistent and unique conventions for, e.g., units, zero base lines, and file formats. There are two main strategies to achieve this goal. One accepts the heterogeneous nature of the community, which comprises scientists from physics, chemistry, bio-physics, and materials science, by complying with the diverse ecosystem of computer codes, and thus develops "converters" for the input and output files of all important codes. These converters then translate the data of each code into a standardized, code-independent format. The other strategy is to provide standardized open libraries that code developers can adopt for shaping their inputs, outputs, and restart files directly into the same code-independent format. In this perspective paper, we present both strategies and argue that they can and should be regarded as complementary, if not even synergetic. The format and conventions presented here were agreed upon by two teams, the Electronic Structure Library (ESL) of the European Center for Atomic and Molecular Computations (CECAM) and the NOvel MAterials Discovery (NOMAD) Laboratory, a European Centre of Excellence (CoE). A key element of this work is the definition of hierarchical metadata describing state-of-the-art electronic-structure calculations.
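    A minimal sketch of the first ("converter") strategy, assuming a hypothetical code whose native output uses Hartree and Bohr units. The target keys are illustrative, not the NOMAD metadata definitions; the point is the translation into consistent units and standardized names.

      HARTREE_TO_EV = 27.211386245988
      BOHR_TO_ANGSTROM = 0.529177210903

      def convert_code_a(raw):
          # Translate one code's native output into a code-independent form:
          # consistent units (eV, Angstrom) and standardized key names.
          return {
              "program": raw["code_name"],
              "total_energy_eV": raw["E_tot_Ha"] * HARTREE_TO_EV,
              "lattice_A": [a * BOHR_TO_ANGSTROM for a in raw["cell_bohr"]],
          }

      normalized = convert_code_a(
          {"code_name": "codeA", "E_tot_Ha": -76.4, "cell_bohr": [10.0, 10.0, 10.0]})
      print(normalized)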

  9. Computer Science Career Network

    DTIC Science & Technology

    2013-03-01

    development model. TopCoder’s development model is competition-based, meaning that TopCoder conducts competitions to develop digital assets. TopCoder...success in running a competition that had as an objective creating digital assets, and we intend to run more of them, to create assets for...cash prizes and merchandise. This includes social media contests, contests with all our games, special referral contests, and a couple NASA

  10. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves digital capture of data direct from instruments, processing it by computers, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and to greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software, and compute resources to be linked by seamless workflows, is creating new opportunities in the processing of high volumes of data for an increasingly larger number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These new technology developments require that scientists become more skilled in data management and/or have a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals and also knowledge of some ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those that are tackling current grand challenges at an international level on issues such as climate change, hazard prediction, and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains, they are likely to migrate to either a science role or an ICT role as their careers advance. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists that persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open-source software movement. However, the fact remains that survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  11. Online citizen science games: Opportunities for the biological sciences.

    PubMed

    Curtis, Vickie

    2014-12-01

    Recent developments in digital technologies and the rise of the Internet have created new opportunities for citizen science. One of these has been the development of online citizen science games where complex research problems have been re-imagined as online multiplayer computer games. Some of the most successful examples of these can be found within the biological sciences, for example, Foldit, Phylo and EteRNA. These games offer scientists the opportunity to crowdsource research problems, and to engage with those outside the research community. Games also enable those without a background in science to make a valid contribution to research, and may also offer opportunities for informal science learning.

  12. R&D100 Finalist: Neuromorphic Cyber Microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follett, David; Naegle, John; Suppona, Roger

    The Neuromorphic Cyber Microscope provides security analysts with unprecedented visibility of their network, computer, and storage assets. This processor is the world's first practical application of neuromorphic technology to a major computer science mission. Working with Lewis Rhodes Labs, engineers at Sandia National Laboratories have created a device that is orders of magnitude faster at analyzing data to identify cyber-attacks.

  13. Computer-based, Jeopardy™-like game in general chemistry for engineering majors

    NASA Astrophysics Data System (ADS)

    Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.

    2013-03-01

    We report on the design of a Jeopardy™-like computer game for enhancement of learning of general chemistry for engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is addressing the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, with a computer-based game quiz, constituting the treatment group. Computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. Overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this "gamification" of the course delivery and course evaluation processes may be beneficial to the undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) level of general science and chemistry preparation. We outline a plan to extend this approach to general physics courses and to modern science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.

  14. Utilization of computer technology by science teachers in public high schools and the impact of standardized testing

    NASA Astrophysics Data System (ADS)

    Priest, Richard Harding

    A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what the effects of standardized testing are on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted in order to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. Contributing factors to why science teachers do not use computers included too few up-to-date computers in their classrooms and other educational commitments and duties that do not leave enough time to prepare lessons that include technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed more in-service technology training and follow-up activities to support that training are needed, and that more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.

  15. Automating CapCom: Pragmatic Operations and Technology Research for Human Exploration of Mars

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, NASA and the scientific community used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. More recently, computer scientists and human factors specialists have followed geologists and biologists into the field, learning how science is actually done on expeditions in extreme environments. Research stations have been constructed by the Mars Society in the Arctic and the American southwest, providing facilities for hundreds of researchers to investigate how small crews might live and work on Mars. Combining these interests (science, operations, and technology) in Mars analog field expeditions provides tremendous synergy and authenticity to speculations about Mars missions. By relating historical analyses of Apollo and field science, engineers are creating experimental prototypes that provide significant new capabilities, such as a computer system that automates some of the functions of Apollo's CapCom. Thus, analog studies have created a community of practice, a new collaboration between scientists and engineers, so that technology begins with real human needs and works incrementally towards the challenges of the human exploration of Mars.

  16. Leveraging Citizen Science and Information Technology for Population Physical Activity Promotion.

    PubMed

    King, Abby C; Winter, Sandra J; Sheats, Jylana L; Rosas, Lisa G; Buman, Matthew P; Salvo, Deborah; Rodriguez, Nicole M; Seguin, Rebecca A; Moran, Mika; Garber, Randi; Broderick, Bonnie; Zieff, Susan G; Sarmiento, Olga Lucia; Gonzalez, Silvia A; Banchoff, Ann; Dommarco, Juan Rivera

    2016-05-15

    While technology is a major driver of many of society's comforts, conveniences, and advances, it has been responsible, in a significant way, for engineering regular physical activity and a number of other positive health behaviors out of people's daily lives. A key question concerns how to harness information and communication technologies (ICT) to bring about positive changes in the health promotion field. One such approach involves community-engaged "citizen science," in which local residents leverage the potential of ICT to foster data-driven consensus-building and mobilization efforts that advance physical activity at the individual, social, built environment, and policy levels. The history of citizen science in the research arena is briefly described and an evidence-based method that embeds citizen science in a multi-level, multi-sectoral community-based participatory research framework for physical activity promotion is presented. Several examples of this citizen science-driven community engagement framework for promoting active lifestyles, called "Our Voice", are discussed, including pilot projects from diverse communities in the U.S. as well as internationally. The opportunities and challenges involved in leveraging citizen science activities as part of a broader population approach to promoting regular physical activity are explored. The strategic engagement of citizen scientists from socio-demographically diverse communities across the globe as both assessment as well as change agents provides a promising, potentially low-cost and scalable strategy for creating more active, healthful, and equitable neighborhoods and communities worldwide.

  17. Leveraging Citizen Science and Information Technology for Population Physical Activity Promotion

    PubMed Central

    King, Abby C.; Winter, Sandra J.; Sheats, Jylana L.; Rosas, Lisa G.; Buman, Matthew P.; Salvo, Deborah; Rodriguez, Nicole M.; Seguin, Rebecca A.; Moran, Mika; Garber, Randi; Broderick, Bonnie; Zieff, Susan G.; Sarmiento, Olga Lucia; Gonzalez, Silvia A.; Banchoff, Ann; Dommarco, Juan Rivera

    2016-01-01

    PURPOSE While technology is a major driver of many of society’s comforts, conveniences, and advances, it has been responsible, in a significant way, for engineering regular physical activity and a number of other positive health behaviors out of people’s daily lives. A key question concerns how to harness information and communication technologies (ICT) to bring about positive changes in the health promotion field. One such approach involves community-engaged “citizen science,” in which local residents leverage the potential of ICT to foster data-driven consensus-building and mobilization efforts that advance physical activity at the individual, social, built environment, and policy levels. METHOD The history of citizen science in the research arena is briefly described and an evidence-based method that embeds citizen science in a multi-level, multi-sectoral community-based participatory research framework for physical activity promotion is presented. RESULTS Several examples of this citizen science-driven community engagement framework for promoting active lifestyles, called “Our Voice”, are discussed, including pilot projects from diverse communities in the U.S. as well as internationally. CONCLUSIONS The opportunities and challenges involved in leveraging citizen science activities as part of a broader population approach to promoting regular physical activity are explored. The strategic engagement of citizen scientists from socio-demographically diverse communities across the globe as both assessment as well as change agents provides a promising, potentially low-cost and scalable strategy for creating more active, healthful, and equitable neighborhoods and communities worldwide. PMID:27525309

  18. Spectral and spatial characterisation of laser-driven positron beams

    DOE PAGES

    Sarri, G.; Warwick, J.; Schumaker, W.; ...

    2016-10-18

    The generation of high-quality relativistic positron beams is a central area of research in experimental physics, due to their potential relevance in a wide range of scientific and engineering areas, ranging from fundamental science to practical applications. There is now growing interest in developing hybrid machines that will combine plasma-based acceleration techniques with more conventional radio-frequency accelerators, in order to minimise the size and cost of these machines. Here we report on recent experiments on laser-driven generation of high-quality positron beams using a relatively low energy and potentially table-top laser system. The results obtained indicate that current technology allows the creation, in a compact setup, of positron beams suitable for injection into radio-frequency accelerators.

  19. An intelligent value-driven scheduling system for Space Station Freedom with special emphasis on the electric power system

    NASA Technical Reports Server (NTRS)

    Krupp, Joseph C.

    1991-01-01

    The Electric Power Control System (EPCS) created by Decision-Science Applications, Inc. (DSA) for the Lewis Research Center is discussed. This system makes decisions on what to schedule and when to schedule it, including making choices among various options or ways of performing a task. The system is goal-directed and seeks to shape resource usage in an optimal manner using a value-driven approach. Discussed here are considerations governing what makes a good schedule, how to design a value function to find the best schedule, and how to design the algorithm that finds the schedule that maximizes this value function. Results are shown which demonstrate the usefulness of the techniques employed.
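    A toy illustration of value-driven scheduling under a power budget: at each step, admit the pending task with the highest value per kilowatt. This greedy stand-in only gestures at the EPCS approach; the task names, values, and single-resource budget are invented for the example.

      def schedule(tasks, power_budget_kw):
          # Greedy: order tasks by value density (value per kW) and admit while
          # the power budget allows; the value function is what gets maximized.
          chosen, load_kw = [], 0.0
          for t in sorted(tasks, key=lambda t: t["value"] / t["kw"], reverse=True):
              if load_kw + t["kw"] <= power_budget_kw:
                  chosen.append(t["name"])
                  load_kw += t["kw"]
          return chosen, load_kw

      tasks = [{"name": "experiment", "value": 9.0, "kw": 3.0},
               {"name": "downlink", "value": 4.0, "kw": 1.0},
               {"name": "heater", "value": 6.0, "kw": 4.0}]
      print(schedule(tasks, power_budget_kw=5.0))   # -> (['downlink', 'experiment'], 4.0)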

  20. Simulation and Visualization of Chaos in a Driven Nonlinear Pendulum -- An Aid to Introducing Chaotic Systems in Physics

    NASA Astrophysics Data System (ADS)

    Akpojotor, Godfrey; Ehwerhemuepha, Louis; Amromanoh, Ogheneriobororue

    2013-03-01

    The presence of physical systems whose characteristics change in a seemingly erratic manner gives rise to the study of chaotic systems. The characteristics of these systems are due to their hypersensitivity to changes in initial conditions. In order to understand chaotic systems, some sort of simulation and visualization is pertinent. Consequently, in this work, we have simulated and graphically visualized chaos in a driven nonlinear pendulum as a means of introducing chaotic systems. The results obtained, which highlight the hypersensitivity of the pendulum, are used to discuss the effectiveness of teaching and learning the physics of chaotic systems using Python. This study is one of the many studies under the African Computational Science and Engineering Tour Project (PASET), which is using Python to model, simulate, and visualize concepts, laws, and phenomena in Science and Engineering to complement the teaching/learning of theory and experiment.
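    A minimal Python sketch of the standard driven, damped pendulum used in such demonstrations, theta'' = -q*theta' - sin(theta) + F_d*cos(omega_d*t), integrated with fourth-order Runge-Kutta. The parameter values are common textbook choices for chaotic behavior, not necessarily those used in this study.

      import numpy as np

      def derivs(state, t, q=0.5, F_d=1.2, omega_d=2.0 / 3.0):
          theta, omega = state
          return np.array([omega, -q * omega - np.sin(theta) + F_d * np.cos(omega_d * t)])

      def rk4_step(state, t, dt):
          k1 = derivs(state, t)
          k2 = derivs(state + 0.5 * dt * k1, t + 0.5 * dt)
          k3 = derivs(state + 0.5 * dt * k2, t + 0.5 * dt)
          k4 = derivs(state + dt * k3, t + dt)
          return state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

      # Two nearby initial conditions diverge: the hypersensitivity the abstract describes.
      dt = 0.01
      a, b = np.array([0.2, 0.0]), np.array([0.2001, 0.0])
      for i in range(50000):
          a, b = rk4_step(a, i * dt, dt), rk4_step(b, i * dt, dt)
      print(abs(a[0] - b[0]))   # no longer small after a long integration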

  1. The quantum computer game: citizen science

    NASA Astrophysics Data System (ADS)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these challenges by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models, such as Gross-Pitaevskii and Bose-Hubbard, are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.
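    For context, a bare-bones split-step Fourier evolution of the 1D time-dependent Schrödinger equation with a movable optical-tweezer potential, the kind of dynamics a player's input would steer. The grid size, the units (hbar = m = 1), and the Gaussian tweezer are illustrative assumptions, not the game's actual model.

      import numpy as np

      N, L, dt = 512, 20.0, 0.001                 # grid points, box size, time step
      x = np.linspace(-L / 2, L / 2, N, endpoint=False)
      k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
      psi = np.exp(-x**2) / (np.pi / 2) ** 0.25   # normalized Gaussian start

      def step(psi, center):
          # Strang splitting: half potential kick, full kinetic step, half kick.
          V = -5.0 * np.exp(-(x - center) ** 2)   # attractive tweezer at `center`
          psi = np.exp(-0.5j * V * dt) * psi
          psi = np.fft.ifft(np.exp(-0.5j * k**2 * dt) * np.fft.fft(psi))
          return np.exp(-0.5j * V * dt) * psi

      for n in range(2000):                        # drag the tweezer slowly rightward
          psi = step(psi, center=0.001 * n)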

  2. Using Alice 2.0 to Design Games for People with Stroke.

    PubMed

    Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack

    2012-08-01

    Computer and videogames are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, these applications' suitability for therapeutic game development has not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.

  3. Idea Bank.

    ERIC Educational Resources Information Center

    Buck, Cheryl A.; And Others

    1988-01-01

    Introduces 12 activities for teaching science. Includes one way to begin the school year, peristalsis demonstration, candy-coated metrics, 3-D constellations, 35-mm astrophotography, create an alien organism, jet propulsion, computer programs for pendulum calculations, plant versus animal, chocolate chip petroleum, paper rockets, and…

  4. Dynamics of list-server discussion on genetically modified foods.

    PubMed

    Triunfol, Marcia L; Hines, Pamela J

    2004-04-01

    Computer-mediated discussion lists, or list-servers, are popular tools in settings ranging from professional to personal to educational. A discussion list on genetically modified food (GMF) was created in September 2000 as part of the Forum on Genetically Modified Food developed by Science Controversies: Online Partnerships in Education (SCOPE), an educational project that uses computer resources to aid research and learning around unresolved scientific questions. The discussion list "GMF-Science" was actively supported from January 2001 to May 2002. The GMF-Science list welcomed anyone interested in discussing the controversies surrounding GMF. Here, we analyze the dynamics of the discussions and how the GMF-Science list may contribute to learning. Activity on the GMF-Science discussion list reflected some but not all the controversies that were appearing in more traditional publication formats, broached other topics not well represented in the published literature, and tended to leave undiscussed the more technical research developments.

  5. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid automatic robot programming and off-line programming of sensor-driven robots. The software system will help users visualize the motion and reaction of a sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.
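    The core loop of such a simulator can be tiny: model the sensor (including its noise), feed the synthetic reading to the control program, and update the world. Everything below, the range sensor, the proportional controller, and the tick length, is an illustrative assumption, not the system described in the paper.

      import random

      def simulated_range_sensor(true_distance_m, noise_sd=0.02):
          # Simulate the sensor itself, not just the robot: add Gaussian noise.
          return true_distance_m + random.gauss(0.0, noise_sd)

      distance_m, dt_s = 2.0, 0.1
      for tick in range(200):
          reading = simulated_range_sensor(distance_m)
          # Sensor-driven control: slow down as the perceived obstacle nears.
          speed = max(0.0, 0.5 * (reading - 0.5))
          distance_m -= speed * dt_s
      print(f"settled at about {distance_m:.2f} m from the obstacle")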

  6. Cosmology Without Finality

    NASA Astrophysics Data System (ADS)

    Mahootian, F.

    2009-12-01

    The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive" framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries, can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness" of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything." But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final" theory.

  7. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    Driven by major scientific advances in analytical methods, biomonitoring, and computational exposure assessment, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the computationally enabled “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. The AEP framework offers an intuitive approach to successful organization of exposure science data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum and setting the stage for more efficient integration of exposure science and toxicity testing information. Together these frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decisions.
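
    As a loose illustration of how AEP data might be organized computationally, the sketch below represents a pathway as a directed graph from a source, through environmental media, to target-site exposures. All node names and transfer fractions are invented, and this is not a published framework API:

    ```python
    # Hedged illustration: an aggregate exposure pathway as a directed graph,
    # pushing mass from a source toward target-site exposures that an AOP
    # could pick up as its molecular initiating event.
    aep = {
        "source:industrial emission": {"air": 0.8, "soil": 0.2},
        "air":                        {"indoor air": 0.5},
        "soil":                       {"house dust": 0.3},
        "indoor air":                 {"target-site exposure (inhaled dose)": 0.1},
        "house dust":                 {"target-site exposure (ingested dose)": 0.05},
    }

    def aggregate_exposure(graph, node, mass=1.0, totals=None):
        """Propagate mass down the pathway; accumulate what reaches target sites."""
        totals = {} if totals is None else totals
        if node.startswith("target-site"):
            totals[node] = totals.get(node, 0.0) + mass
            return totals
        for child, fraction in graph.get(node, {}).items():
            aggregate_exposure(graph, child, mass * fraction, totals)
        return totals

    print(aggregate_exposure(aep, "source:industrial emission"))
    # {'target-site exposure (inhaled dose)': 0.04,
    #  'target-site exposure (ingested dose)': 0.003}
    ```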

  8. Tidal flushing and wind driven circulation of Ahe atoll lagoon (Tuamotu Archipelago, French Polynesia) from in situ observations and numerical modelling.

    PubMed

    Dumas, F; Le Gendre, R; Thomas, Y; Andréfouët, S

    2012-01-01

    Hydrodynamic functioning and water circulation of the semi-closed deep lagoon of Ahe atoll (Tuamotu Archipelago, French Polynesia) were investigated using 1 year of field data and a 3D hydrodynamical model. Tidal amplitude averaged less than 30 cm, but the tide generated very strong currents (2 m s⁻¹) in the pass, creating a jet-like circulation that partitioned the lagoon into three residual circulation cells. The pass entirely flushed the excess water brought in by wave-induced radiation stress. Circulation patterns were computed for climatological meteorological conditions and summarized with stream function and flushing time. Lagoon hydrodynamics and the general overturning circulation were driven by wind. Renewal time was 250 days, whereas the e-flushing time yielded a lagoon-wide average of 80 days. Tide-driven flushing through the pass and wind-driven overturning circulation designate Ahe as a wind-driven, tidally and weakly wave-flushed deep lagoon. The 3D model allows studying pearl oyster larvae dispersal in both realistic and climatological conditions for aquaculture applications. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Smart Health - Potential and Pathways: A Survey

    NASA Astrophysics Data System (ADS)

    Arulananthan, C.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Healthcare is an important field of research in which individuals or groups engage in self-tracking of biological, physical, behavioral, or environmental information. Valuable information lies hidden within massive healthcare data, and the quantity of available unstructured data has been expanding exponentially. Newly emerging disruptive technologies can address many of the challenges facing data analysis and can extract valuable information through data analytics. Connected wellness in healthcare retrieves a patient's physiological, pathological, and behavioral parameters through sensors in order to analyze the inner workings of the human body. Disruptive technologies can take healthcare from a reactive, illness-driven system to a proactive, wellness-driven one; striving to create a wellness-driven smart health system, rather than an illness-driven one, addresses today's biggest problem in health care. Wellness-driven analytics applications help promote a healthy living environment, called "Smart Health," and deliver an empowered quality of living. This survey touches on uncovered areas and opens possible directions in the line of research on smart health and its computing technologies.

  10. Data-Driven Software Framework for Web-Based ISS Telescience

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.

    2005-01-01

    Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
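
    The "portable data" idea can be sketched briefly: the data description, not the client code, defines what is displayed, so a new experiment needs new data files rather than new programs. The payload name, field names, and telemetry values below are invented for illustration and do not reflect the actual ISS framework's schema:

    ```python
    # Minimal sketch of a data-driven client: a generic renderer walks an
    # XML view description and formats whatever telemetry it finds.
    import xml.etree.ElementTree as ET

    VIEW_XML = """
    <view payload="demo-experiment">
      <field name="chamber_temp" label="Chamber temperature" unit="K"/>
      <field name="pressure"     label="Pressure"            unit="kPa"/>
    </view>
    """

    def render(view_xml: str, telemetry: dict) -> None:
        """Generic client: no payload-specific code, only payload-specific data."""
        view = ET.fromstring(view_xml)
        print(f"Payload: {view.get('payload')}")
        for field in view.findall("field"):
            value = telemetry.get(field.get("name"), "n/a")
            print(f"  {field.get('label')}: {value} {field.get('unit')}")

    render(VIEW_XML, {"chamber_temp": 293.4, "pressure": 101.2})
    ```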

  11. A direct-to-drive neural data acquisition system.

    PubMed

    Kinney, Justin P; Bernstein, Jacob G; Meyer, Andrew J; Barber, Jessica B; Bolivar, Marti; Newbold, Bryan; Scholvin, Jorg; Moore-Kochlacs, Caroline; Wentz, Christian T; Kopell, Nancy J; Boyden, Edward S

    2015-01-01

    Driven by the increasing channel count of neural probes, there is much effort being directed to creating increasingly scalable electrophysiology data acquisition (DAQ) systems. However, all such systems still rely on personal computers for data storage, and thus are limited by the bandwidth and cost of the computers, especially as the scale of recording increases. Here we present a novel architecture in which a digital processor receives data from an analog-to-digital converter, and writes that data directly to hard drives, without the need for a personal computer to serve as an intermediary in the DAQ process. This minimalist architecture may support exceptionally high data throughput, without incurring costs to support unnecessary hardware and overhead associated with personal computers, thus facilitating scaling of electrophysiological recording in the future.
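
    A rough software analogue of this architecture is sketched below; the block size, the stand-in ADC, and the file format are all invented, and a real direct-to-drive system does this in dedicated hardware rather than in Python:

    ```python
    # Sketch of the direct-to-drive idea: samples go from the acquisition
    # source to disk in fixed-size blocks, with no per-sample processing
    # (and no general-purpose computer) in between.
    import os, struct

    BLOCK_SAMPLES = 4096          # write granularity, chosen for illustration

    def fake_adc(n):
        """Stand-in for an analog-to-digital converter: n signed 16-bit samples."""
        return [(i * 37) % 32768 - 16384 for i in range(n)]

    def acquire_to_disk(path: str, blocks: int) -> None:
        with open(path, "wb", buffering=0) as f:   # unbuffered: straight to disk
            for _ in range(blocks):
                samples = fake_adc(BLOCK_SAMPLES)
                f.write(struct.pack(f"<{BLOCK_SAMPLES}h", *samples))

    acquire_to_disk("recording.raw", blocks=8)
    print(os.path.getsize("recording.raw"), "bytes written")   # 8 * 4096 * 2
    ```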

  13. Visions 2025 and Linkage to NEXT

    NASA Technical Reports Server (NTRS)

    Wiscombe, W.; Lau, William K. M. (Technical Monitor)

    2002-01-01

    This talk will describe the progress to date on creating a science-driven vision for the NASA Earth Science Enterprise (ESE) in the post-2010 period. This effort began in the Fall of 2001 by organizing five science workgroups with representatives from NASA, academia and other agencies: Long-Term Climate, Medium-Term Climate, Extreme Weather, Biosphere & Ecosystems, and Solid Earth, Ice Sheets, & Sea Level. Each workgroup was directed to scope out one Big Question, including not just the science but the observational and modeling requirements, the information system requirements, and the applications and benefits to society. This first set of five Big Questions is now in hand and has been presented to the ESE Director. It includes: water resources, intraseasonal predictability, tropical cyclogenesis, invasive species, and sea level. Each of these topics will be discussed briefly. How this effort fits into the NEXT vision exercise and into Administrator O'Keefe's new vision for NASA will also be discussed.

  14. [Nanotechnology: a big revolution from the small world].

    PubMed

    Bassi, Matteo; Santinello, Irene; Bevilacqua, Andrea; Bassi, Pierfrancesco

    2013-01-01

    Nanotechnology is a multidisciplinary field originating from the interaction of several different disciplines, such as engineering, physics, biology, and chemistry. New materials and devices effectively interact with the body at the molecular level, yielding a brand new range of highly selective and targeted applications designed to maximize therapeutic efficiency while reducing side effects. Liposomes, quantum dots, carbon nanotubes, and superparamagnetic nanoparticles are among the most thoroughly assessed nanotechnologies. Meanwhile, other futuristic platforms are paving the way toward a new scientific paradigm, able to deeply change the research path in medical science. The growth of nanotechnology, driven by dramatic advances in science and technology, clearly creates new opportunities for the development of medical science and disease treatment in human health care. Despite the concerns and the ongoing studies about their safety, nanotechnology clearly emerges as holding the promise of delivering one of the greatest breakthroughs in the history of medical science.

  15. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    PubMed Central

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool is demonstrated. PMID:25419503
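
    The modular pattern described, self-describing analysis units that a point-and-click front end can discover and chain without programming, can be sketched minimally. The module names, scenario fields, and registry below are invented and are not RE-PLAN's actual interfaces:

    ```python
    # Minimal sketch of a pluggable-module framework: each analysis registers
    # itself under a menu name; a GUI could invoke modules by name alone.
    MODULES = {}

    def module(name):
        """Register an analysis module under a human-readable menu name."""
        def wrap(fn):
            MODULES[name] = fn
            return fn
        return wrap

    @module("population-within-radius")
    def population(scenario):
        return sum(p for _, p in scenario["census_blocks"])

    @module("clinics-needed")
    def clinics(scenario):
        return -(-population(scenario) // scenario["clinic_capacity"])  # ceil div

    scenario = {"census_blocks": [("A", 12000), ("B", 8500)],
                "clinic_capacity": 5000}
    for name in ("population-within-radius", "clinics-needed"):  # "point and click"
        print(name, "->", MODULES[name](scenario))
    ```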

  16. Data mining with iPlant: a meeting report from the 2013 GARNet workshop, Data mining with iPlant.

    PubMed

    Martin, Lisa; Cook, Charis; Matasci, Naim; Williams, Jason; Bastow, Ruth

    2015-01-01

    High-throughput sequencing technologies have rapidly moved from large international sequencing centres to individual laboratory benchtops. These changes have driven the 'data deluge' of modern biology. Submissions of nucleotide sequences to GenBank, for example, have doubled in size every year since 1982, and individual data sets now frequently reach terabytes in size. While 'big data' present exciting opportunities for scientific discovery, data analysis skills are not part of the typical wet bench biologist's experience. Knowing what to do with data, how to visualize and analyse them, make predictions, and test hypotheses are important barriers to success. Many researchers also lack adequate capacity to store and share these data, creating further bottlenecks to effective collaboration between groups and institutes. The US National Science Foundation-funded iPlant Collaborative was established in 2008 to form part of the data collection and analysis pipeline and help alleviate the bottlenecks associated with the big data challenge in plant science. Leveraging the power of high-performance computing facilities, iPlant provides free-to-use cyberinfrastructure to enable terabytes of data storage, improve analysis, and facilitate collaborations. To help train UK plant science researchers to use the iPlant platform and understand how it can be exploited to further research, GARNet organized a four-day Data mining with iPlant workshop at Warwick University in September 2013. This report provides an overview of the workshop, and highlights the power of the iPlant environment for lowering barriers to using complex bioinformatics resources, furthering discoveries in plant science research and providing a platform for education and outreach programmes. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. Using Science Driven Technologies for the Defense and Security Applications

    NASA Technical Reports Server (NTRS)

    Habib, Shahid; Zukor, Dorthy; Ambrose, Stephen D.

    2004-01-01

    For the past three decades, Earth science remote sensing technologies have been providing enormous amounts of useful data and information, broadening our understanding of our home planet as a system. This research, as it has expanded our learning process, has also generated additional questions, which in turn has established new science requirements that have culminated in defining and pushing state-of-the-art technology needs. NASA's Earth science program has deployed 18 highly complex satellites, with a total of 80 sensors, so far, and is in the process of defining and launching multiple observing systems in the next decade. Due to the heightened security alert of the nation, researchers and technologists are paying serious attention to the dual use of these science-driven technologies; in other words, how such sophisticated observing and measuring systems can be used to detect multiple types of security concerns with a substantial lead time, so that the appropriate law enforcement agencies can take adequate steps to defuse any potentially risky scenario. This paper examines numerous NASA technologies, such as laser/lidar systems, microwave and millimeter wave technologies, optical observing systems, high performance computational techniques for rapid analyses, and imaging products, that can have a tremendous payoff for security applications.

  18. Herding Cats: Geocuration Practices Employed for Field Research Data Collection Activities and Visualization by Blueprint Earth

    NASA Astrophysics Data System (ADS)

    Hoover, R.; Harrison, M.; Sonnenthal, N.; Hernandez, A.; Pelaez, J.

    2015-12-01

    Researchers investigating interdisciplinary topics must work to understand the barriers created by information silos in order to collaborate productively on complex Earth science questions. These barriers create acute challenges when research is driven by observations rather than hypotheses, as communication between collaborators hinges on data synthesis techniques that often vary greatly between disciplines. Field data collection across disciplines creates even more challenges, and employing student researchers of varying abilities demands an approach that is structured, yet still flexible enough to accommodate inherent differences in the subjective portions of student data collection. Blueprint Earth is performing system-level environmental observations in the broad areas of geology, biology, hydrology, and atmospheric science. Traditional field data collection methodologies are employed for ease of reproducibility, but must translate across disciplinary information silos. Information collected must be readily usable in the formulation of hypotheses based on field observations, which necessitates an understanding of key metrics by all investigators involved in data analysis. Blueprint Earth demonstrates the ability to create clear data standards across several disciplines while incorporating a quality control process, allowing data to be converted into functional visualizations. Additionally, geocuration is organized such that data will be ready for public dissemination upon completion of field research.

  19. eHealth Research from the User’s Perspective

    PubMed Central

    Hesse, Bradford W.; Shneiderman, Ben

    2007-01-01

    The application of Information Technology (IT) to issues of healthcare delivery has had a long and tortuous history in the U.S. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask “what can the computer do?” New advances in eHealth are prompting developers to ask “what can people do?” How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a health care system that is (a) safe, (b) effective (evidence-based), (c) patient-centered, and (d) timely. Relying on the eHealth researcher’s intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient/physician), group (family/staff), community, and broad environmental levels. PMID:17466825

  20. ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing

    PubMed Central

    Rusakov, Dmitri A.; Savtchenko, Leonid P.

    2017-01-01

    Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at the GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877

  1. University of Washington's eScience Institute Promotes New Training and Career Pathways in Data Science

    NASA Astrophysics Data System (ADS)

    Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.

    2015-12-01

    Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.

  2. Alliance for Computational Science Collaboration: HBCU Partnership at Alabama A&M University Continuing High Performance Computing Research and Education at AAMU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Xiaoqing; Deng, Z. T.

    2009-11-10

    This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational-science-related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Laboratory (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for an electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students. As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009: "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, respectively, and presentations of these papers were made at those conferences. It is critical to continue these research and education activities.

  3. The application of computer image analysis in life sciences and environmental engineering

    NASA Astrophysics Data System (ADS)

    Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.

    2014-04-01

    The main aim of the article was to present research on the application of computer image analysis in Life Science and Environmental Engineering. The authors used different methods of computer image analysis in developing an innovative biotest for modern biomonitoring of water quality. The tools created were based on living organisms, the bioindicators Lemna minor L. and Hydra vulgaris Pallas, combined with computer image analysis methods to assess negative reactions during the organisms' exposure to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture, in the study of adverse effects on water quality of various compounds used in agriculture and industry.

  4. How can we improve Science, Technology, Engineering, and Math education to encourage careers in Biomedical and Pathology Informatics?

    PubMed

    Uppal, Rahul; Mandava, Gunasheil; Romagnoli, Katrina M; King, Andrew J; Draper, Amie J; Handen, Adam L; Fisher, Arielle M; Becich, Michael J; Dutta-Moscato, Joyeeta

    2016-01-01

    The Computer Science, Biology, and Biomedical Informatics (CoSBBI) program was initiated in 2011 to expose talented high school students to the critical role of informatics in biomedicine.[1] By involving them in Science, Technology, Engineering, and Math (STEM) training at the high school level and providing mentorship and research opportunities throughout the formative years of their education, CoSBBI creates a research infrastructure designed to develop young informaticians. Our central premise is that the trajectory necessary to become an expert in the emerging fields of biomedical informatics and pathology informatics requires accelerated learning at an early age. In our 4th year of CoSBBI as a part of the University of Pittsburgh Cancer Institute (UPCI) Academy (http://www.upci.upmc.edu/summeracademy/), and our 2nd year of CoSBBI as an independent informatics-based academy, we enhanced our classroom curriculum, added hands-on computer science instruction, and expanded research projects to include clinical informatics. We also conducted a qualitative evaluation of the program to identify areas that need improvement in order to achieve our goal of creating a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics in the era of big data and personalized medicine.

  5. The study of early human embryos using interactive 3-dimensional computer reconstructions.

    PubMed

    Scarborough, J; Aiton, J F; McLachlan, J C; Smart, S D; Whiten, S C

    1997-07-01

    Tracings of serial histological sections from 4 human embryos at different Carnegie stages were used to create 3-dimensional (3D) computer models of the developing heart. The models were constructed using commercially available software developed for graphic design and the production of computer generated virtual reality environments. They are available as interactive objects which can be downloaded via the World Wide Web. This simple method of 3D reconstruction offers significant advantages for understanding important events in morphological sciences.

  6. Tidal truncation and barotropic convergence in a channel network tidally driven from opposing entrances

    USGS Publications Warehouse

    Warner, J.C.; Schoellhamer, D.; Schladow, G.

    2003-01-01

    Residual circulation patterns in a channel network that is tidally driven from entrances on opposite sides are controlled by the temporal phasing and spatial asymmetry of the two forcing tides. The Napa/Sonoma Marsh Complex in San Francisco Bay, CA, is such a system. A sill on the west entrance to the system prevents a complete tidal range at spring tides that results in tidal truncation of water levels. Tidal truncation does not occur on the east side but asymmetries develop due to friction and off-channel wetland storage. The east and west asymmetric tides meet in the middle to produce a barotropic convergence zone that controls the transport of water and sediment. During spring tides, tidally averaged water-surface elevations are higher on the truncated west side. This creates tidally averaged fluxes of water and sediment to the east. During neap tides, the water levels are not truncated and the propagation speed of the tides controls residual circulation, creating a tidally averaged flux in the opposite direction. © 2003 Elsevier Science B.V. All rights reserved.

  7. Imagining tomorrow's university in an era of open science.

    PubMed

    Howe, Adina; Howe, Michael; Kaleita, Amy L; Raman, D Raj

    2017-01-01

    As part of a recent workshop entitled "Imagining Tomorrow's University", we were asked to visualize the future of universities as research becomes increasingly data- and computation-driven, and to identify a set of principles characterizing pertinent opportunities and obstacles presented by this shift. In order to establish a holistic view, we take a multilevel approach and examine the impact of open science both on individual scholars and on the university as a whole. At the university level, open science presents a double-edged sword: when well executed, open science can accelerate the rate of scientific inquiry across the institution and beyond; however, haphazard or half-hearted efforts are likely to squander valuable resources, diminish university productivity and prestige, and potentially do more harm than good. We present our perspective on the role of open science at the university.

  8. Of the Helmholtz Club, South-Californian seedbed for visual and cognitive neuroscience, and its patron Francis Crick.

    PubMed

    Aicardi, Christine

    2014-03-01

    Taking up the view that semi-institutional gatherings such as clubs, societies, and research schools have been instrumental in creating sheltered spaces from which many a 20th-century project-driven interdisciplinary research programme could develop and become established within the institutions of science, the paper explores the history of one such gathering from its inception in the early 1980s into the 2000s: the Helmholtz Club, which brought together scientists from such various research fields as neuroanatomy, neurophysiology, psychophysics, computer science, and engineering, who all had an interest in the study of the visual system and of higher cognitive functions relying on visual perception, such as visual consciousness. It argues that British molecular biologist turned Southern Californian neuroscientist Francis Crick had an early and lasting influence over the Helmholtz Club, of which he was a founding pillar, and that from its inception the club served as a constitutive element in his long-term plans for a neuroscience of vision and of cognition. Further, it argues that in this role the Helmholtz Club served many purposes, the primary one being a social forum for interdisciplinary discussion, where 'discussion' was not mere talk but was imbued with an epistemic value and, as such, carefully cultivated. Finally, it questions what counts as 'doing science' and, in turn, definitions of success and failure, and provides some material evidence toward re-appraising the successfulness of Crick's contribution to the neurosciences. Copyright © 2013 The Author. Published by Elsevier Ltd. All rights reserved.

  9. Publisher Correction: Western US volcanism due to intruding oceanic mantle driven by ancient Farallon slabs

    NASA Astrophysics Data System (ADS)

    Zhou, Quan; Liu, Lijun; Hu, Jiashun

    2018-05-01

    In the version of this Article originally published, data points representing mafic eruptions were missing from Fig. 4b; the corrected version is shown below. Furthermore, the authors omitted to include the following acknowledgements to the provider of the computational resources: "This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (awards OCI-0725070 and ACI-1238993) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign and its National Center for Supercomputing Applications. This work is also part of the 'PRAC Title 4-D Geodynamic Modeling With Data Assimilation: Origin Of Intra-Plate Volcanism In The Pacific Northwest' PRAC allocation support by the National Science Foundation (award number ACI 1516586). This work also used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1548562." Figure 4 and the Acknowledgements section have been updated in the online version of the Article.

  10. High-efficiency, high-energy Kα source for the critically required maximum illumination of x-ray optics on Z using Z-petawatt-driven laser-breakout-afterburner accelerated ultrarelativistic electrons (LDRD).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sefkow, Adam B.; Bennett, Guy R.

    2010-09-01

    Under the auspices of the Science of Extreme Environments LDRD program, a <2 year theoretical- and computational-physics study was performed (LDRD Project 130805) by Guy R. Bennett (formerly in Center-01600) and Adam B. Sefkow (Center-01600) to investigate novel target designs by which a short-pulse, PW-class beam could create a brighter Kα x-ray source than simple direct laser irradiation of a flat foil (direct-foil irradiation, DFI). The computational studies, which are still ongoing at this writing, were performed primarily on the RedStorm supercomputer at Sandia National Laboratories' Albuquerque site. The motivation for a higher-efficiency Kα emitter was very clear: as the backlighter flux for any x-ray imaging technique on the Z accelerator increases, the signal-to-noise and signal-to-background ratios improve. This ultimately allows the imaging system to reach its full quantitative potential as a diagnostic. Depending on the particular application/experiment, this would imply, for example, that the system would have reached its full design spatial resolution and thus the capability to see features that might otherwise be indiscernible with a traditional DFI-like x-ray source. This LDRD began in FY09 and ended in FY10.

  11. University of Maryland MRSEC - Research: Seed 1

    Science.gov Websites

    University of Maryland Materials Research Science and Engineering Center seed project: creating specific functional patterns. Investigators: Wolfgang Losert (Physics, IPST, IREAP), Ben Shapiro (Bio-Engineering, Aerospace Engineering), and Edo Waks (Electrical & Computer Engineering, IREAP, JQI).

  12. An ontology-driven, diagnostic modeling system.

    PubMed

    Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason

    2013-06-01

    To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems, and to illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.
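
    The pattern, an ontology selecting which warehouse columns feed an otherwise automated ML step, can be sketched under stated assumptions: the ontology content, warehouse fields, and synthetic labels below are all invented, and this is not Intermountain's code (requires scikit-learn):

    ```python
    # Hedged sketch of ontology-driven model building: the ontology, not the
    # programmer, selects the features; the rest of the pipeline is generic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    ONTOLOGY = {"pneumonia": ["temperature", "wbc_count", "o2_saturation"]}

    rng = np.random.default_rng(0)
    warehouse = {                              # stand-in for the data warehouse
        "temperature":   rng.normal(37.5, 1.0, 500),
        "wbc_count":     rng.normal(9.0, 3.0, 500),
        "o2_saturation": rng.normal(95.0, 4.0, 500),
        "shoe_size":     rng.normal(42.0, 3.0, 500),  # present, but not relevant
    }
    y = (warehouse["temperature"] + 0.3 * warehouse["wbc_count"]
         - 0.2 * warehouse["o2_saturation"] > 20.0).astype(int)

    # Ontology-directed data acquisition: pull only the named columns.
    X = np.column_stack([warehouse[f] for f in ONTOLOGY["pneumonia"]])
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    print("AUC:", roc_auc_score(yte, model.predict_proba(Xte)[:, 1]))
    ```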

  13. Driving Ms. Data: Creating Data-Driven Possibilities

    ERIC Educational Resources Information Center

    Hoffman, Richard

    2005-01-01

    This article describes how data-driven Web sites help schools and districts maximize their IT resources by making online content more "self-service" for users. It shows how to set up the capacity to create data-driven sites. By definition, a data-driven Web site is one in which the content comes from some back-end data source, such as a…
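
    In that spirit, a minimal sketch of a data-driven page is shown below, using Python's standard library with SQLite as a stand-in back-end data source; the article itself names no particular technology, and the table and content here are invented:

    ```python
    # Minimal data-driven page: the HTML is generated from a database, so
    # updating the table updates the site with no HTML editing.
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE announcements (body TEXT)")
    db.execute("INSERT INTO announcements VALUES ('School closes early Friday')")

    def render_page() -> str:
        """Content comes from the data source, not a hand-edited HTML file."""
        rows = db.execute("SELECT body FROM announcements").fetchall()
        return "<ul>" + "".join(f"<li>{b}</li>" for (b,) in rows) + "</ul>"

    class Page(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(render_page().encode())

    print(render_page())          # "self-service": edit data, not pages
    # HTTPServer(("localhost", 8000), Page).serve_forever()
    ```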

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kornreich, Drew E; Vaidya, Rajendra U; Ammerman, Curtt N

    Integrated Computational Materials Engineering (ICME) is a novel overarching approach to bridge length and time scales in computational materials science and engineering. This approach integrates all elements of multi-scale modeling (including various empirical and science-based models) with materials informatics to provide users the opportunity to tailor material selections based on stringent application needs. Typically, materials engineering has focused on structural requirements (stress, strain, modulus, fracture toughness, etc.), while multi-scale modeling has been science focused (mechanical threshold strength models, grain-size models, solid-solution strengthening models, etc.). Materials informatics (mechanical property inventories), on the other hand, is extensively data focused. All of these elements are combined within the framework of ICME to create an architecture for the development, selection, and design of new composite materials for challenging environments. We propose development of the foundations for applying ICME to composite materials development for nuclear and high-radiation environments (including nuclear-fusion energy reactors, nuclear-fission reactors, and accelerators). We expect to combine all elements of current material models (including thermo-mechanical and finite-element models) into the ICME framework. This will be accomplished through the use of various mathematical modeling constructs. These constructs will allow the integration of constituent models, which in turn lets us use the adaptive strengths of a combinatorial scheme (fabrication and computational) for creating new composite materials. A sample problem in which these concepts are used is provided in this summary.

  15. Gesture Analysis for Astronomy Presentation Software

    NASA Astrophysics Data System (ADS)

    Robinson, Marc A.

    Astronomy presentation software in a planetarium setting provides a visually stimulating way to introduce varied scientific concepts, including computer science concepts, to a wide audience. However, the underlying computational complexity and opportunities for discussion are often overshadowed by the brilliance of the presentation itself. To bring this discussion back out into the open, a method needs to be developed to make the computer science applications more visible. This thesis introduces the GAAPS system, which endeavors to implement free-hand gesture-based control of astronomy presentation software, with the goal of providing that talking point to begin the discussion of computer science concepts in a planetarium setting. The GAAPS system incorporates gesture capture and analysis in a unique environment presenting unique challenges, and introduces a novel algorithm called a Bounding Box Tree to create and select features for this particular gesture data. This thesis also analyzes several different machine learning techniques to determine a well-suited technique for the classification of this particular data set, with an artificial neural network being chosen as the implemented algorithm. The results of this work will allow for the desired introduction of computer science discussion into the specific setting used, as well as provide for future work pertaining to gesture recognition with astronomy presentation software.
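
    The final classification stage the thesis describes can be sketched generically; the Bounding Box Tree feature construction is not reproduced here, so the three summary features and the two gesture classes below are invented stand-ins (requires scikit-learn):

    ```python
    # Hedged sketch of the classification stage only: summary features per
    # gesture sample are fed to an artificial neural network classifier.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 600
    # Invented per-sample features: bounding-box aspect ratio, path length,
    # mean speed -- the kind of summary a tree of boxes might yield.
    swipes = np.column_stack([rng.normal(3.0, .5, n),
                              rng.normal(2.0, .4, n),
                              rng.normal(1.5, .3, n)])
    circles = np.column_stack([rng.normal(1.0, .2, n),
                               rng.normal(6.0, .8, n),
                               rng.normal(0.8, .2, n)])
    X = np.vstack([swipes, circles])
    y = np.array([0] * n + [1] * n)          # 0 = swipe, 1 = circle

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                        random_state=0).fit(Xtr, ytr)
    print("held-out accuracy:", clf.score(Xte, yte))
    ```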

  16. Quality user support: Supporting quality users

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolley, T.C.

    1994-12-31

    During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity to global, integrated quality. The compute power available to the average geoscientist at the desktop has grown exponentially, and technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring and the increased focus on international opportunities. The profile of the user of technical computing resources has changed: users are generally more mature, knowledgeable, and team oriented than their predecessors, and in the 1990s computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy is described. Characteristics of the program include: (1) client-driven support; (2) empowerment of highly skilled professionals to fill the support role; (3) routine and ongoing modification of the support plan; (4) utilization of the support assignment to create highly trained advocates on the line; (5) integration of the support role into the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.

  17. ASC ATDM Level 2 Milestone #5325: Asynchronous Many-Task Runtime System Analysis and Assessment for Next Generation Platforms.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Gavin Matthew; Bettencourt, Matthew Tyler; Bova, Steven W.

    2015-09-01

    This report provides in-depth information and analysis to help create a technical road map for developing next-generation programming models and runtime systems that support Advanced Simulation and Computing (ASC) workload requirements. The focus herein is on asynchronous many-task (AMT) models and runtime systems, which are of great interest in the context of "exascale" computing, as they hold the promise to address key issues associated with future extreme-scale computer architectures. This report includes a thorough qualitative and quantitative examination of three best-of-class AMT runtime systems: Charm++, Legion, and Uintah, all of which are in use as part of the Predictive Science Academic Alliance Program II (PSAAP-II) Centers. The studies focus on each of the runtimes' programmability, performance, and mutability. Through the experiments and analysis presented, several overarching findings emerge. From a performance perspective, AMT runtimes show tremendous potential for addressing extreme-scale challenges. Empirical studies show an AMT runtime can mitigate performance heterogeneity inherent to the machine itself, and that Message Passing Interface (MPI) and AMT runtimes perform comparably under balanced conditions. From a programmability and mutability perspective, however, none of the runtimes in this study are currently ready for use in developing production-ready Sandia ASC applications. The report concludes by recommending a co-design path forward, wherein application, programming model, and runtime system developers work together to define requirements and solutions. Such a requirements-driven co-design approach benefits the community as a whole, with widespread community engagement mitigating risk for both application developers and high-performance computing runtime system developers.

  18. The HEP Software and Computing Knowledge Base

    NASA Astrophysics Data System (ADS)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  19. Graduate Training at the Interface of Computational and Experimental Biology: An Outcome Report from a Partnership of Volunteers between a University and a National Laboratory.

    PubMed

    von Arnim, Albrecht G; Missra, Anamika

    2017-01-01

    Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of Genome Science and Technology in partnership with the nearby Oak Ridge National Laboratory. Here, we report outcome data that attest to the program's effectiveness in graduating computationally enabled biologists for diverse careers. Among 77 PhD graduates since 2003, the majority came with traditional degrees in the biological sciences, yet two-thirds moved into computational or hybrid (computational-experimental) positions. We describe the curriculum of the program and how it has changed. We also summarize how the program seeks to establish cohesion between computational and experimental biologists. This type of program can respond flexibly and dynamically to unmet training needs. In conclusion, this study from a flagship, state-supported university may serve as a reference point for creating a stable, degree-granting, interdepartmental graduate program in computational biology and allied areas. © 2017 A. G. von Arnim and A. Missra. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  20. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.
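
    To make the flavor of these assimilative tools concrete, here is a minimal scalar Kalman filter in the standard textbook form; it is illustrative only, with invented noise parameters, and bears no relation to the radiation-belt implementation the abstract mentions:

    ```python
    # One predict/update cycle of a scalar Kalman filter: model forecast,
    # then correction by data, weighted by their relative uncertainties.
    import numpy as np

    def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=0.25):
        x_pred = F * x                 # forecast with the (here trivial) dynamics
        P_pred = F * P * F + Q         # forecast uncertainty grows by model error Q
        K = P_pred * H / (H * P_pred * H + R)   # Kalman gain: trust in the data
        x_new = x_pred + K * (z - H * x_pred)   # blend forecast and observation
        P_new = (1 - K * H) * P_pred
        return x_new, P_new

    x, P = 0.0, 1.0                    # poor initial guess, large uncertainty
    truth = 2.0
    rng = np.random.default_rng(0)
    for _ in range(20):
        z = truth + rng.normal(0, 0.5)         # noisy observation of the state
        x, P = kalman_step(x, P, z)
    print(f"estimate {x:.2f} (truth {truth}), variance {P:.3f}")
    ```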

  1. Technical Challenges and Opportunities of Centralizing Space Science Mission Operations (SSMO) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ido, Haisam; Burns, Rich

    2015-01-01

    The NASA Goddard Space Science Mission Operations project (SSMO) is performing a technical cost-benefit analysis for centralizing and consolidating operations of a diverse set of missions into a unified and integrated technical infrastructure. The presentation will focus on the notion of normalizing spacecraft operations processes, workflows, and tools. It will also show the processes of creating a standardized open architecture; creating common security models and implementations; and providing interfaces, services, automations, notifications, alerts, logging, publish/subscribe, and middleware capabilities. The presentation will also discuss how to leverage traditional capabilities along with virtualization, cloud computing services, control groups and containers, and possibly Big Data concepts.

  2. Toward Computational Cumulative Biology by Combining Models of Biological Datasets

    PubMed Central

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations—for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database. PMID:25427176
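
    A stripped-down sketch of the decomposition step is given below, with invented dataset names and a plain least-squares fit standing in for the paper's combination model:

    ```python
    # Hedged sketch of model-based retrieval: express a new dataset's
    # signature as a weighted mix of previously modeled datasets' signatures,
    # then rank the earlier datasets by their weights.
    import numpy as np

    rng = np.random.default_rng(0)
    # Rows: summary "signatures" of three earlier, already-modeled datasets.
    earlier = {"thymocyte atlas":   rng.normal(size=50),
               "t-cell activation": rng.normal(size=50),
               "liver biopsy":      rng.normal(size=50)}
    names = list(earlier)
    basis = np.array([earlier[n] for n in names]).T        # shape (50, 3)

    # A new dataset that is mostly thymocytes plus some T-cell signal:
    new = 0.7 * earlier["thymocyte atlas"] + 0.3 * earlier["t-cell activation"]

    weights, *_ = np.linalg.lstsq(basis, new, rcond=None)
    for name, w in sorted(zip(names, weights), key=lambda p: -p[1]):
        print(f"{name}: {w:.2f}")    # retrieval = the highest-weight datasets
    ```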

  4. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  5. Technology in Science and Mathematics Education.

    ERIC Educational Resources Information Center

    Buccino, Alphonse

    Provided are several perspectives on technology, addressing changes in learners related to technology, changes in contemporary life related to technology, and changes in subject areas related to technology (indicating that technology has created such new tools for inquiry as computer programming, word processing, online database searches, and…

  6. The Educational Uses of Intermedia.

    ERIC Educational Resources Information Center

    Launhardt, Julie; Kahn, Paul

    1992-01-01

    Uses of Intermedia, computer software designed to help instructors express relationships between concepts in the sciences and humanities, are discussed. The kinds of educational problems Intermedia was intended to address are described, some materials created using it are surveyed, and experiences with Intermedia in various educational contexts…

  7. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback was very positive on the project's toolkit and the value of such automatic workflow documentation to the scientific endeavor.
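
    The report's central capability, automatic documentation of workflow steps, can be illustrated generically. The sketch below is not the MPO toolkit's API; it is a hypothetical Python decorator that appends each step's name, arguments, and timestamp to a provenance log.

        # Hypothetical provenance recorder (illustrative only, not the MPO API).
        import functools, json, time

        PROVENANCE = []

        def record_step(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                result = func(*args, **kwargs)
                PROVENANCE.append({"step": func.__name__,
                                   "args": repr(args),
                                   "time": time.time()})
                return result
            return wrapper

        @record_step
        def preprocess(path):                  # stand-in for a real workflow step
            return path + ".cleaned"

        preprocess("shot_1234.dat")
        print(json.dumps(PROVENANCE, indent=2))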

  8. Nanotechnology and dentistry

    PubMed Central

    Ozak, Sule Tugba; Ozkan, Pelin

    2013-01-01

    Nanotechnology deals with the physical, chemical, and biological properties of structures and their components at nanoscale dimensions. Nanotechnology is based on the concept of creating functional structures by controlling atoms and molecules on a one-by-one basis. The use of this technology will allow many developments in the health sciences as well as in materials science, bio-technology, electronic and computer technology, aviation, and space exploration. With developments in materials science and biotechnology, nanotechnology is especially anticipated to provide advances in dentistry and innovations in oral health-related diagnostic and therapeutic methods. PMID:23408486

  9. Significance of informal (on-the-job) learning and leadership development in health systems: lessons from a district finance team in South Africa

    PubMed Central

    Choonara, S; Goudge, J; Nxumalo, N; Eyles, J

    2017-01-01

    Background The district health system (DHS) has a critical role to play in the delivery of primary healthcare (PHC). Effective district management, particularly leadership, is considered to be a crucial element of the DHS. Internationally, leadership competencies such as motivating or empowering staff, managing relationships, being solution driven, and fostering teamwork are argued to be developable through approaches such as formal and informal training. Despite growing multidisciplinary evidence in fields such as engineering, computer science, and health sciences, there remains little empirical evidence of these approaches, especially the informal approach. Findings are based on a broader doctoral thesis which explored district financial management; the core focus of this paper, however, is the significance of informal learning and its practical value in developing leadership competencies. Methods A qualitative case study was conducted in one district of the Gauteng province, South Africa. Purposive and snowballing techniques yielded a sample of 18 participants, primarily based at the district level. Primary data collected through in-depth interviews and observations (participant and non-participant) were analysed using thematic analysis. Findings Results indicate the sorts of complexities, particularly financial management challenges, which staff face, and draw attention to the use of two informal learning strategies—learning from others (how to communicate, delegate) and fostering team-based learning. Such strategies played a role in developing a cadre of leaders at a district level who displayed essential competencies such as motivating staff and problem solving. Conclusions It is crucial for health systems, especially those in financially constrained settings, to find cost-effective ways to develop leadership competencies such as being solution driven or motivating and empowering staff. This study illustrates that it is possible to develop such competencies through creating and nurturing a learning environment (on-the-job training) which could be incorporated into everyday practice. PMID:28588998

  10. Significance of informal (on-the-job) learning and leadership development in health systems: lessons from a district finance team in South Africa.

    PubMed

    Choonara, S; Goudge, J; Nxumalo, N; Eyles, J

    2017-01-01

    The district health system (DHS) has a critical role to play in the delivery of primary healthcare (PHC). Effective district management, particularly leadership, is considered to be a crucial element of the DHS. Internationally, leadership competencies such as motivating or empowering staff, managing relationships, being solution driven, and fostering teamwork are argued to be developable through approaches such as formal and informal training. Despite growing multidisciplinary evidence in fields such as engineering, computer science, and health sciences, there remains little empirical evidence of these approaches, especially the informal approach. Findings are based on a broader doctoral thesis which explored district financial management; the core focus of this paper, however, is the significance of informal learning and its practical value in developing leadership competencies. A qualitative case study was conducted in one district of the Gauteng province, South Africa. Purposive and snowballing techniques yielded a sample of 18 participants, primarily based at the district level. Primary data collected through in-depth interviews and observations (participant and non-participant) were analysed using thematic analysis. Results indicate the sorts of complexities, particularly financial management challenges, which staff face, and draw attention to the use of two informal learning strategies: learning from others (how to communicate, delegate) and fostering team-based learning. Such strategies played a role in developing a cadre of leaders at a district level who displayed essential competencies such as motivating staff and problem solving. It is crucial for health systems, especially those in financially constrained settings, to find cost-effective ways to develop leadership competencies such as being solution driven or motivating and empowering staff. This study illustrates that it is possible to develop such competencies through creating and nurturing a learning environment (on-the-job training) which could be incorporated into everyday practice.

  11. Data base development and research and editorial support

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Life Sciences Bibliographic Data Base was created in 1981 and subsequently expanded. A systematic, professional system was developed to collect, organize, and disseminate information about scientific publications resulting from research. The data base consists of bibliographic information and hard copies of all research papers published by Life Sciences-supported investigators. Technical improvements were instituted in the database. To minimize costs, take advantage of advances in personal computer technology, and achieve maximum flexibility and control, the data base was transferred from the JSC computer to personal computers at George Washington University (GWU). GWU also performed a range of related activities such as conducting in-depth searches on a variety of subjects, retrieving scientific literature, preparing presentations, summarizing research progress, answering correspondence requiring reference support, and providing writing and editorial support.

  12. Collaborative Visualization Project: shared-technology learning environments for science learning

    NASA Astrophysics Data System (ADS)

    Pea, Roy D.; Gomez, Louis M.

    1993-01-01

    Project-enhanced science learning (PESL) provides students with opportunities for 'cognitive apprenticeships' in authentic scientific inquiry using computers for data collection and analysis. Student teams work on projects with teacher guidance to develop and apply their understanding of science concepts and skills. We are applying advanced computing and communications technologies to augment and transform PESL at-a-distance (beyond the boundaries of the individual school), which is limited today to asynchronous, text-only networking and unsuitable for collaborative science learning involving shared access to multimedia resources such as data, graphs, tables, pictures, and audio-video communication. Our work creates user technology (a Collaborative Science Workbench providing PESL design support and shared synchronous document views, program, and data access; a Science Learning Resource Directory for easy access to resources including two-way video links to collaborators, mentors, museum exhibits, and media-rich resources such as scientific visualization graphics) and refines enabling technologies (audiovisual and shared-data telephony, networking) for this PESL niche. We characterize participation scenarios for using these resources and discuss national networked access to science education expertise.

  13. Lightweight Data Systems in the Cloud: Costs, Benefits and Best Practices

    NASA Astrophysics Data System (ADS)

    Fatland, R.; Arendt, A. A.; Howe, B.; Hess, N. J.; Futrelle, J.

    2015-12-01

    We present here a simple analysis of both the cost and the benefit of using the cloud in environmental science circa 2016. We present this set of ideas to enable the potential 'cloud adopter' research scientist to explore and understand the tradeoffs in moving some aspect of their compute work to the cloud. We present examples, design patterns, and best practices as an evolving body of knowledge that helps optimize benefit to the research team. Thematically this generally means not starting from a blank page but rather learning how to find 90% of the solution to a problem pre-built. We will touch on four topics of interest. (1) Existing cloud data resources (NASA, WHOI BCO DMO, etc.) and how they can be discovered, used, and improved. (2) How to explore, compare, and evaluate cost and compute power from many cloud options, particularly in relation to data scale (size/complexity). (3) Simple, fast 'Lightweight Data System' procedures that take from 20 minutes to one day to implement and that have a clear immediate payoff in environmental data-driven research. Examples include publishing a SQLShare URL at (EarthCube's) CINERGI as a registered data resource and creating executable papers on a cloud-hosted Jupyter instance, particularly IPython notebooks. (4) Translating the computational terminology landscape ('cloud', 'HPC cluster', 'hadoop', 'spark', 'machine learning') into examples from the community of practice to help the geoscientist build or expand their mental map. In the course of this discussion -- which is about resource discovery, adoption and mastery -- we provide direction to online resources in support of these themes.

  14. Application of Ontologies for Big Earth Data

    NASA Astrophysics Data System (ADS)

    Huang, T.; Chang, G.; Armstrong, E. M.; Boening, C.

    2014-12-01

    Connected data is smarter data! Earth Science research infrastructure must do more than just support temporal and geospatial discovery of satellite data. As the Earth Science data archives continue to expand across NASA data centers, the research communities are demanding smarter data services. A successful research infrastructure must be able to present researchers with the complete picture, that is, datasets with linked citations, related interdisciplinary data, imagery, current events, social media discussions, and scientific data tools that are relevant to the particular dataset. The popular Semantic Web for Earth and Environmental Terminology (SWEET) is a collection of ontologies and concepts designed to improve discovery and application of Earth Science data. The SWEET ontologies collection was initially developed to capture the relationships between keywords in the NASA Global Change Master Directory (GCMD). Over the years this popular collection has expanded to cover over 200 ontologies and 6000 concepts to enable scalable classification of Earth system science and space science concepts. This presentation discusses semantic web technologies as the enabling technology for data-intensive science. We will discuss the application of the SWEET ontologies as a critical component in knowledge-driven research infrastructure for some recent projects, which include the DARPA Ontological System for Context Artifact and Resources (OSCAR), the 2013 NASA ACCESS Virtual Quality Screening Service (VQSS), and the 2013 NASA Sea Level Change Portal (SLCP) projects. The presentation will also discuss the benefits of using semantic web technologies in developing research infrastructure for Big Earth Science Data in an attempt to "accommodate all domains and provide the necessary glue for information to be cross-linked, correlated, and discovered in a semantically rich manner." [1] [1] Savas Parastatidis: A platform for all that we know: creating a knowledge-driven research infrastructure. The Fourth Paradigm 2009: 165-172

  15. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    PubMed

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer process of this technology has hitherto been an intuitive process without in-depth analysis. Some difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies which have achieved a certain stability. Grid and Cloud solutions are technologies which are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions by the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects are facing in the late funding periods, and show leveraging steps that can help in overcoming the "vale of tears".

  16. Data-driven system to predict academic grades and dropout.

    PubMed

    Rovira, Sergi; Puertas, Eloi; Igual, Laura

    2017-01-01

    Nowadays, the role of a tutor is more important than ever to prevent student dropout and improve academic performance. This work proposes a data-driven system to extract relevant information hidden in student academic data and thus help tutors offer their pupils more proactive personal guidance. In particular, our system, based on machine learning techniques, makes predictions of students' dropout intention and course grades, as well as personalized course recommendations. Moreover, we present different visualizations which help in the interpretation of the results. In the experimental validation, we show that the system obtains promising results with data from the degree studies in Law, Computer Science, and Mathematics at the Universitat de Barcelona.
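
    The prediction task in this abstract maps onto a standard supervised-learning setup. A hedged sketch follows, with synthetic data and generic feature names; the paper's actual features, models, and preprocessing are not reproduced here.

        # Sketch: flag at-risk students with a classifier trained on synthetic data.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.random((1000, 4))          # e.g., grades, credits, attendance
        y = (X[:, 0] + 0.5 * rng.random(1000) < 0.6).astype(int)   # 1 = dropout

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))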

  17. Formation of droplet interface bilayers in a Teflon tube

    NASA Astrophysics Data System (ADS)

    Walsh, Edmond; Feuerborn, Alexander; Cook, Peter R.

    2016-09-01

    Droplet-interface bilayers (DIBs) have applications in disciplines ranging from biology to computing. We present a method for forming them manually using a Teflon tube attached to a syringe pump; this method is simple enough it should be accessible to those without expertise in microfluidics. It exploits the properties of interfaces between three immiscible liquids, and uses fluid flow through the tube to pack together drops coated with lipid monolayers to create bilayers at points of contact. It is used to create functional nanopores in DIBs composed of phosphocholine using the protein α-hemolysin (αHL), to demonstrate osmotically-driven mass transfer of fluid across surfactant-based DIBs, and to create arrays of DIBs. The approach is scalable, and thousands of DIBs can be prepared using a robot in one hour; therefore, it is feasible to use it for high throughput applications.

  18. Astronomy and Computing: A new journal for the astronomical computing community

    NASA Astrophysics Data System (ADS)

    Accomazzi, Alberto; Budavári, Tamás; Fluke, Christopher; Gray, Norman; Mann, Robert G.; O'Mullane, William; Wicenec, Andreas; Wise, Michael

    2013-02-01

    We introduce Astronomy and Computing, a new journal for the growing population of people working in the domain where astronomy overlaps with computer science and information technology. The journal aims to provide a new communication channel within that community, which is not well served by current journals, and to help secure recognition of its true importance within modern astronomy. In this inaugural editorial, we describe the rationale for creating the journal, outline its scope and ambitions, and seek input from the community in defining in detail how the journal should work towards its high-level goals.

  19. ODISEES Data Portal Announcement

    Atmospheric Science Data Center

    2015-11-13

    The Ontology-Driven Interactive Search Environment for Earth Science (ODISEES), developed at the Atmospheric Science Data Center ...

  20. Dynamically adaptive data-driven simulation of extreme hydrological flows

    NASA Astrophysics Data System (ADS)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
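
    The ensemble-filtering half of the coupling can be made concrete with a textbook stochastic ensemble Kalman filter analysis step. The sketch below is a generic formulation on synthetic arrays, not the authors' AMR-coupled implementation.

        # Sketch: stochastic EnKF update of an ensemble with one observation.
        import numpy as np

        rng = np.random.default_rng(2)
        n_state, n_ens = 50, 30
        ens = rng.normal(1.0, 0.5, (n_state, n_ens))   # forecast ensemble
        H = np.zeros((1, n_state)); H[0, 10] = 1.0     # observe one state component
        obs, obs_var = 1.4, 0.1 ** 2

        A = ens - ens.mean(axis=1, keepdims=True)      # ensemble anomalies
        HA = H @ A
        K = (A @ HA.T) / ((HA @ HA.T) + (n_ens - 1) * obs_var)   # Kalman gain
        perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n_ens)
        ens += K @ (perturbed - (H @ ens)).reshape(1, n_ens)     # analysis ensemble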

  1. Astro Data Science: The Next Generation

    NASA Astrophysics Data System (ADS)

    Mentzel, Chris

    2018-01-01

    Astronomers have been at the forefront of data-driven discovery since before the days of Kepler. Using data in the scientific inquiry into the workings of the universe is the lifeblood of the field. This said, data science is considered a new thing, and researchers from every discipline are rushing to learn data science techniques, train themselves on data science tools, and even leave academia to become data scientists. It is undeniable that our ability to harness new computational and statistical methods to make sense of data of unprecedented size, complexity, and streaming rate is helping scientists make new discoveries. The question now is how to ensure that researchers can employ these tools and use them appropriately. This talk will cover the state of data science as it relates to scientific research and the role astronomers play in its development, use, and training the next generation of astro-data scientists.

  2. What We've Learned about Assessing Hands-On Science.

    ERIC Educational Resources Information Center

    Shavelson, Richard J.; Baxter, Gail P.

    1992-01-01

    A recent study compared hands-on scientific inquiry assessment to assessments involving lab notebooks, computer simulations, short-answer paper-and-pencil problems, and multiple-choice questions. Creating high quality performance assessments is a costly, time-consuming process requiring considerable scientific and technological know-how. Improved…

  3. Smart Data Infrastructure: The Sixth Generation of Mediation for Data Science

    NASA Astrophysics Data System (ADS)

    Fox, P. A.

    2014-12-01

    In the emergent "fourth paradigm" (data-driven) science, the scientific method is enhanced by the integration of significant data sources into the practice of scientific research. To address Big Science, there are challenges in understanding the role of data in enabling researchers to attack not just disciplinary issues, but also the system-level, large-scale, and transdisciplinary global scientific challenges facing society. Recognizing that the volume of data is only one of many dimensions to be considered, there is a clear need for improved data infrastructures to mediate data and information exchange, which we contend will need to be powered by semantic technologies. One clear need is to provide computational approaches for researchers to discover appropriate data resources, rapidly integrate data collections from heterogeneous resources or multiple data sets, and inter-compare results to allow generation and validation of hypotheses. Another trend is toward automated tools that allow researchers to better find and reuse data that they currently don't know they need, let alone know how to find. Again semantic technologies will be required. Finally, to turn data analytics from "art to science", technical solutions are needed for cross-dataset validation, reproducibility studies on data-driven results, and the concomitant citation of data products allowing recognition for those who curate and share important data resources.

  4. A conceptual framework to support exposure science research ...

    EPA Pesticide Factsheets

    While knowledge of exposure is fundamental to assessing and mitigating risks, exposure information has been costly and difficult to generate. Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition that allows it to be more agile, predictive, and data- and knowledge-driven. A necessary element of this evolved paradigm is an organizational and predictive framework for exposure science that furthers the application of systems-based approaches. To enable such systems-based approaches, we proposed the Aggregate Exposure Pathway (AEP) concept to organize data and information emerging from an invigorated and expanding field of exposure science. The AEP framework is a layered structure that describes the elements of an exposure pathway, as well as the relationships between those elements. The basic building blocks of an AEP adopt the naming conventions used for Adverse Outcome Pathways (AOPs): Key Events (KEs) to describe the measurable, obligate steps through the AEP, and Key Event Relationships (KERs) to describe the linkages between KEs. Importantly, the AEP offers an intuitive approach to organize exposure information from sources to internal site of action, setting the stage for predicting stressor concentrations at an internal target site. These predicted concentrations can help inform the r
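
    The layered KE/KER structure lends itself to a simple data model. The sketch below uses illustrative Python dataclasses and invented event names; it is not a published AEP schema.

        # Sketch: Key Events (KEs) chained by Key Event Relationships (KERs).
        from dataclasses import dataclass, field

        @dataclass
        class KeyEvent:                      # one measurable step on the pathway
            name: str

        @dataclass
        class KER:                           # directed link between two KEs
            upstream: KeyEvent
            downstream: KeyEvent

        @dataclass
        class AggregateExposurePathway:
            key_events: list = field(default_factory=list)
            relationships: list = field(default_factory=list)

        source = KeyEvent("stressor released at source")
        medium = KeyEvent("concentration in an exposure medium")
        target = KeyEvent("concentration at internal target site")
        aep = AggregateExposurePathway(
            [source, medium, target],
            [KER(source, medium), KER(medium, target)])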

  5. Advancing research collaborations among agencies through the Interagency Arctic Research Policy Committee: A necessary step for linking science to policy.

    NASA Astrophysics Data System (ADS)

    LaValley, M.; Starkweather, S.; Bowden, S.

    2017-12-01

    The Arctic is changing rapidly as average temperatures rise. As an Arctic nation, the United States is directly affected by these changes. It is imperative that these changes be understood in order to make effective policy decisions. Since the research needs of the Arctic are large and wide-ranging, most Federal agencies fund some aspect of Arctic research. As a result, the U.S. government regularly works to coordinate Federal Arctic research in order to reduce duplication of effort and costs, and to enhance the research's system perspective. The government's Interagency Arctic Research Policy Committee (IARPC) accomplishes this coordination through its policy-driven five-year Arctic Research Plans and collaboration teams (CTs), which are research topic-oriented teams tasked with implementing the plans. The policies put forth by IARPC thus inform science; however, IARPC has been less successful at making these science outcomes part of an iterative decision-making process. IARPC's mandate to facilitate coordinated research through information-sharing communities can be viewed as a prerequisite step in the science-to-decision-making process. Research collaborations and the communities of practice facilitated by IARPC allow scientists to connect with a wider community of scientists and stakeholders and, in turn, the larger issues in need of policy solutions. These connections help to create a pathway through which research may increasingly reflect policy goals and inform decisions. IARPC has been growing into a more useful model for the science-to-decision-making interface since the publication of its Arctic Research Plan FY2017-2021, and it is useful to evaluate how and why IARPC is progressing in this realm. To understand the challenges facing interagency research collaboration and the progress IARPC has made, the Chukchi Beaufort and Communities CTs were evaluated as case studies. From the case studies, several recommendations for enhancing collaborations across Federal agencies emerge, including establishing appropriate agency leadership; determining a focused and achievable scope for team goals; providing room for bottom-up, community-driven determination of goals; and finally, building relationships and creating an inclusive team environment.

  6. The GI Project: a prototype electronic textbook for high school biology.

    PubMed

    Calhoun, P S; Fishman, E K

    1997-01-01

    A prototype electronic science textbook for secondary education was developed to help bridge the gap between state-of-the-art medical technology and the basic science classroom. The prototype combines the latest in radiologic imaging techniques with a user-friendly multimedia computer program to teach the anatomy, physiology, and diseases of the gastrointestinal (GI) tract. The program includes original text, illustrations, photographs, animations, images from upper GI studies, plain radiographs, computed tomographic images, and three-dimensional reconstructions. These features are intended to create a stimulus-rich environment in which the high school science student can enjoy a variety of interactive experiences that will facilitate the learning process. The computer-based book is a new educational tool that promises to play a prominent role in the coming years. Although it is not yet clear what form textbooks will take in the future, current research suggests that computer-based books are already proving valuable as an alternative educational medium. For beginning students, they reinforce the material found in traditional textbooks and class presentations; for advanced students, they provide motivation to learn outside the traditional classroom.

  7. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  8. Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry

    NASA Technical Reports Server (NTRS)

    Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.

    2004-01-01

    Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial-and-error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database-driven and direct-evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then gives either a new local optimum and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increasing dimensionality. Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.
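
    The surrogate loop described above is easy to demonstrate in one dimension. The sketch below fits a Gaussian-process surrogate to a handful of "database" evaluations of a stand-in objective and then searches the surrogate; it is a generic illustration, not the paper's Cartesian/CAD toolchain.

        # Sketch: fit a surrogate to database points, then optimize the surrogate.
        import numpy as np
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor

        def expensive_objective(x):            # stand-in for a high-fidelity solver
            return (x - 0.3) ** 2 + 0.1 * np.sin(5.0 * x)

        X_db = np.linspace(0.0, 1.0, 8).reshape(-1, 1)   # database design points
        y_db = expensive_objective(X_db).ravel()

        gp = GaussianProcessRegressor().fit(X_db, y_db)
        res = minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                       x0=np.array([0.5]), bounds=[(0.0, 1.0)])
        print("surrogate optimum near x =", res.x[0])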

  9. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. (2016) [1], we argue that a cultural change is required in the computational hydrology community in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-usable; (2) create well-documented workflows that combine re-usable code with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
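
    One of the practices the paper argues for, tying published results to exact code and data versions, can be automated cheaply. The sketch below stamps a result with the current git commit and a hash of the input file; the file names are hypothetical.

        # Sketch: record code version and input-data hash next to each result.
        import hashlib, json, subprocess

        def provenance_stamp(data_path):
            commit = subprocess.run(["git", "rev-parse", "HEAD"],
                                    capture_output=True, text=True).stdout.strip()
            digest = hashlib.sha256(open(data_path, "rb").read()).hexdigest()
            return {"code_version": commit, "input_sha256": digest}

        # json.dump(provenance_stamp("discharge_series.csv"), open("result.prov", "w"))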

  10. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  11. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  12. Visual Activities for Assessing Non-science Majors’ Understanding in Introductory Astronomy

    NASA Astrophysics Data System (ADS)

    Loranz, Daniel; Prather, E. E.; Slater, T. F.

    2006-12-01

    One of the most ardent challenges for astronomy teachers is to deeply and meaningfully assess students’ conceptual and quantitative understanding of astronomy topics. In an effort to uncover students’ actual understanding, members and affiliates of the Conceptual Astronomy and Physics Education Research (CAPER) Team at the University of Arizona and Truckee Meadows Community College are creating and field-testing innovative approaches to assessment. Leveraging from the highly successful work on interactive lecture demonstrations from astronomy and physics education research, we are creating a series of conceptually rich questions that are matched to visually captivating and purposefully interactive astronomical animations. These conceptually challenging tasks are being created to span the entire domain of topics in introductory astronomy for non-science majoring undergraduates. When completed, these sorting tasks and vocabulary-in-context activities will be able to be delivered via a drag-and-drop computer interface.

  13. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today's world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high-performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  14. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.; Clark, S.

    2015-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  15. Data-driven non-linear elasticity: constitutive manifold construction and problem discretization

    NASA Astrophysics Data System (ADS)

    Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco

    2017-11-01

    The use of constitutive equations calibrated from data has been implemented into standard numerical solvers to successfully address a variety of problems encountered in simulation-based engineering sciences (SBES). However, complexity keeps increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first one, of axiomatic character, is related to balance laws (momentum, mass, energy, ...), whereas the second one consists of models that scientists have extracted from collected data, either natural or synthetic. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of them inaccessible to today's testing facilities. Such difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach in order to generate the whole constitutive manifold from few complex experimental tests, as discussed in the present work.
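
    The core move of the data-driven paradigm, replacing a fitted constitutive law with direct reference to the dataset, can be sketched in one dimension. Below, a trial state is projected onto the nearest measured (strain, stress) pair in a weighted norm; the data and weighting are synthetic, and this is a caricature of the approach, not the authors' formulation.

        # Sketch: nearest-point projection onto a 1D constitutive data manifold.
        import numpy as np

        rng = np.random.default_rng(3)
        strain = np.linspace(0.0, 1.0, 200)
        stress = np.tanh(3.0 * strain) + rng.normal(0.0, 0.01, 200)  # noisy data
        C = 1.0                                # weighting 'modulus' for the norm

        def closest_state(trial_strain, trial_stress):
            dist = C * (strain - trial_strain) ** 2 + (stress - trial_stress) ** 2 / C
            k = np.argmin(dist)
            return strain[k], stress[k]

        print(closest_state(0.4, 0.8))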

  16. 2016 New Horizons Lecture: Beyond Imaging-Radiology of Tomorrow.

    PubMed

    Hricak, Hedvig

    2018-03-01

    This article is based on the New Horizons lecture delivered at the 2016 Radiological Society of North America Annual Meeting. It addresses looming changes for radiology, many of which stem from the disruptive effects of the Fourth Industrial Revolution. This is an emerging era of unprecedented rapid innovation marked by the integration of diverse disciplines and technologies, including data science, machine learning, and artificial intelligence: technologies that narrow the gap between man and machine. Technologic advances and the convergence of life sciences, physical sciences, and bioengineering are creating extraordinary opportunities in diagnostic radiology, image-guided therapy, targeted radionuclide therapy, and radiology informatics, including radiologic image analysis. This article uses the example of oncology to make the case that, if members of the field of radiology continue to be innovative and continuously reinvent themselves, radiology can play an ever-increasing role in both precision medicine and value-driven health care. © RSNA, 2018.

  17. An ontology-driven semantic mash-up of gene and biological pathway information: Application to the domain of nicotine dependence

    PubMed Central

    Sahoo, Satya S.; Bodenreider, Olivier; Rutter, Joni L.; Skinner, Karen J.; Sheth, Amit P.

    2008-01-01

    Objectives This paper illustrates how Semantic Web technologies (especially RDF, OWL, and SPARQL) can support information integration and make it easy to create semantic mashups (semantically integrated resources). In the context of understanding the genetic basis of nicotine dependence, we integrate gene and pathway information and show how three complex biological queries can be answered by the integrated knowledge base. Methods We use an ontology-driven approach to integrate two gene resources (Entrez Gene and HomoloGene) and three pathway resources (KEGG, Reactome and BioCyc), for five organisms, including humans. We created the Entrez Knowledge Model (EKoM), an information model in OWL for the gene resources, and integrated it with the extant BioPAX ontology designed for pathway resources. The integrated schema is populated with data from the pathway resources, publicly available in BioPAX-compatible format, and gene resources for which a population procedure was created. The SPARQL query language is used to formulate queries over the integrated knowledge base to answer the three biological queries. Results Simple SPARQL queries could easily identify hub genes, i.e., those genes whose gene products participate in many pathways or interact with many other gene products. The identification of the genes expressed in the brain turned out to be more difficult, due to the lack of a common identification scheme for proteins. Conclusion Semantic Web technologies provide a valid framework for information integration in the life sciences. Ontology-driven integration represents a flexible, sustainable and extensible solution to the integration of large volumes of information. Additional resources, which enable the creation of mappings between information sources, are required to compensate for heterogeneity across namespaces. Resource page http://knoesis.wright.edu/research/lifesci/integration/structured_data/JBI-2008/ PMID:18395495

  18. An ontology-driven semantic mashup of gene and biological pathway information: application to the domain of nicotine dependence.

    PubMed

    Sahoo, Satya S; Bodenreider, Olivier; Rutter, Joni L; Skinner, Karen J; Sheth, Amit P

    2008-10-01

    This paper illustrates how Semantic Web technologies (especially RDF, OWL, and SPARQL) can support information integration and make it easy to create semantic mashups (semantically integrated resources). In the context of understanding the genetic basis of nicotine dependence, we integrate gene and pathway information and show how three complex biological queries can be answered by the integrated knowledge base. We use an ontology-driven approach to integrate two gene resources (Entrez Gene and HomoloGene) and three pathway resources (KEGG, Reactome and BioCyc), for five organisms, including humans. We created the Entrez Knowledge Model (EKoM), an information model in OWL for the gene resources, and integrated it with the extant BioPAX ontology designed for pathway resources. The integrated schema is populated with data from the pathway resources, publicly available in BioPAX-compatible format, and gene resources for which a population procedure was created. The SPARQL query language is used to formulate queries over the integrated knowledge base to answer the three biological queries. Simple SPARQL queries could easily identify hub genes, i.e., those genes whose gene products participate in many pathways or interact with many other gene products. The identification of the genes expressed in the brain turned out to be more difficult, due to the lack of a common identification scheme for proteins. Semantic Web technologies provide a valid framework for information integration in the life sciences. Ontology-driven integration represents a flexible, sustainable and extensible solution to the integration of large volumes of information. Additional resources, which enable the creation of mappings between information sources, are required to compensate for heterogeneity across namespaces. RESOURCE PAGE: http://knoesis.wright.edu/research/lifesci/integration/structured_data/JBI-2008/
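
    The hub-gene query pattern from this paper is straightforward to express in SPARQL. The sketch below uses rdflib with an invented namespace and predicate, not the paper's actual EKoM/BioPAX schema; the knowledge-base file name is also hypothetical.

        # Sketch: rank genes by pathway participation with a SPARQL aggregate.
        from rdflib import Graph

        g = Graph()
        # g.parse("integrated_kb.owl")   # load the integrated knowledge base

        HUB_GENE_QUERY = """
        PREFIX ex: <http://example.org/bio#>
        SELECT ?gene (COUNT(?pathway) AS ?n)
        WHERE { ?gene ex:participatesIn ?pathway . }
        GROUP BY ?gene
        ORDER BY DESC(?n)
        LIMIT 10
        """

        for gene, n in g.query(HUB_GENE_QUERY):
            print(gene, n)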

  19. MIT Laboratory for Computer Science Progress Report 20 - July 1982 - Jun 1983,

    DTIC Science & Technology

    1984-07-01

    system by the Programming Technology Group. Research in the second and largest area, entitled Machines, Languages, and Systems, strives to discover and ... utilization and cost effectiveness. For example, the Programming Methodology Group and the Real Time Systems Group are developing languages and ... 100 Megabits per second when implemented with the 1.2 µm n-well cMOS process. 3. LANGUAGES 3.1. Demand Driven Evaluation. In his engineer's thesis ...

  20. Scaffolding Learning by Modelling: The Effects of Partially Worked-out Models

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Bollen, Lars; de Jong, Ton; Lazonder, Ard W.

    2016-01-01

    Creating executable computer models is a potentially powerful approach to science learning. Learning by modelling is also challenging because students can easily get overwhelmed by the inherent complexities of the task. This study investigated whether offering partially worked-out models can facilitate students' modelling practices and promote…

  1. Computer Techniques for Studying Coverage, Overlaps, and Gaps in Collections.

    ERIC Educational Resources Information Center

    White, Howard D.

    1987-01-01

    Describes techniques for using the Statistical Package for the Social Sciences (SPSS) to create tables for cooperative collection development across a number of libraries. Specific commands are given to generate holdings profiles focusing on collection coverage, overlaps, gaps, or other areas of interest, from a master bibliographic list. (CLB)
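
    The same profiling can be done today with a dataframe library. The sketch below cross-tabulates made-up holdings in pandas (library names and titles are invented) to show coverage, overlap, and gap counts.

        # Sketch: coverage/overlap/gap profile from a master holdings list.
        import pandas as pd

        holdings = pd.DataFrame({
            "title":   ["A", "A", "B", "C", "C", "C"],
            "library": ["Lib1", "Lib2", "Lib1", "Lib1", "Lib2", "Lib3"],
        })
        profile = pd.crosstab(holdings["title"], holdings["library"])
        print(profile)                               # 1 = held by that library
        print("uniquely held titles:", int((profile.sum(axis=1) == 1).sum()))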

  2. Putting Science Literacy on Display

    ERIC Educational Resources Information Center

    Hayman, Arlene; Hoppe, Carole; Deniz, Hasan

    2012-01-01

    Imagine a classroom where students are actively engaged in seeking scientific knowledge from books and computers. Think of a classroom in which students fervently write to create PowerPoint presentations about their scientific topic and then enthusiastically practice their speaking roles to serve as docents in a classroom museum setting. Visualize…

  3. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and ... High-Frequency Acoustic System. Payoff: Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control ...

  4. Social Science Instructional Modules Workshop.

    ERIC Educational Resources Information Center

    Nelson, Elizabeth; Nelson, Edward

    The five instructional packages in this collection were created by faculty members in the California State Universities to introduce students--and even faculty--to the easy steps involved in working with computers in instructional settings. Designed for students and faculty in entry-level courses who have little or no background in quantitative…

  5. To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery

    PubMed Central

    Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.

    2018-01-01

    The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005

  6. Telescience workstation

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael

    1989-01-01

    As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows: remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment, designed to bring practicing scientists in domains other than computer science into a modern style of computing, was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.

  7. Creating Next Generation Teacher Preparation Programs to Support Implementation of the Next Generation Science Standards and Common Core State Standards in K-12 Schools: An Opportunity for the Earth and Space Sciences

    NASA Astrophysics Data System (ADS)

    Geary, E. E.; Egger, A. E.; Julin, S.; Ronca, R.; Vokos, S.; Ebert, E.; Clark-Blickenstaff, J.; Nollmeyer, G.

    2015-12-01

    A consortium of two- and four-year Washington State colleges and universities, in partnership with Washington's Office of the Superintendent of Public Instruction (OSPI), the Teachers of Teachers of Science, the Teachers of Teachers of Mathematics, and other key stakeholders, is currently working to improve science and mathematics learning for all Washington State students by creating a new vision for STEM teacher preparation in Washington State aligned with the Next Generation Science Standards (NGSS) and the Common Core State Standards (CCSS) in Mathematics and Language Arts. Specific objectives include: (1) strengthening elementary and secondary STEM teacher preparation courses and curricula, (2) alignment of STEM teacher preparation programs across Washington State with the NGSS and CCSS, (3) development of action plans to support implementation of STEM teacher preparation program improvement at Higher Education Institutions (HEIs) across the state, (4) stronger collaborations between HEIs, K-12 schools, government agencies, non-governmental organizations, and STEM businesses involved in the preparation of preservice STEM teachers, (5) new teacher endorsements in Computer Science and Engineering, and (6) development of a prototype model for rapid, adaptable, and continuous improvement of STEM teacher preparation programs. A 2015 NGSS gap analysis of teacher preparation programs across Washington State indicates relatively good alignment of courses and curricula with NGSS Disciplinary Core Ideas and scientific practices, but minimal alignment with NGSS engineering practices and Crosscutting Concepts. Likewise, Computer Science and Sustainability ideas and practices are not well represented in current courses and curricula. During the coming year, teams of STEM faculty, education faculty, and administrators will work collaboratively to develop unique action plans for aligning and improving STEM teacher preparation courses and curricula at their institutions.

  8. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  9. The effect of an STC orientation to teaching on student academic performance and motivation in secondary earth science

    NASA Astrophysics Data System (ADS)

    Corbin, Robert Arthur

    Student achievement gaps among subgroups remain a prevalent and critical issue in urban education systems. In many classes these students remain the target---and often the victims---of test-driven curriculum. Missing from their urban education is one of the most important aspects of a true education: a sense of place within that education. Science educators and educational researchers might consider the benefits of Sociotransformative Constructivism (STC) as a means of creating a more meaningful education for urban youth. This study examined the impact of an STC teaching orientation on student motivation and academic performance in secondary earth science students. The study employed a mixed methodology using both qualitative and quantitative data. Data collection consisted of STC activities, survey data, classroom observations, student-generated work, and threaded discussions. Statistical analysis included independent t-tests of pre- and post-instruction concept maps. The results showed that the adoption of an STC teaching orientation has a positive impact on student motivation and performance in secondary earth science.
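
    The quantitative step named here, independent t-tests on pre- and post-instruction concept-map scores, can be illustrated in a few lines; the scores below are invented for the sketch and are not the study's data.

    ```python
    # Sketch: the statistical step described -- an independent two-sample t-test
    # on pre- vs. post-instruction concept-map scores. Scores are invented for
    # illustration; they are not the study's data.
    from scipy import stats

    pre_scores  = [12, 15,  9, 14, 11, 13, 10, 16]
    post_scores = [18, 21, 15, 20, 17, 19, 16, 22]

    t_stat, p_value = stats.ttest_ind(post_scores, pre_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```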

  10. Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment

    NASA Astrophysics Data System (ADS)

    Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.

    2015-12-01

    The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.

  11. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether in situ, remotely sensed, or recorded by automated data-logging equipment. Spreadsheets, personal databases, text files, and binary files are used in the initial storage and processing of the raw data. To be useful to scientists, engineers, and modelers, the data need to be stored in a format that is easily identifiable, accessible, and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry-standard relational database comprises spatial and temporal data tables, shape files, and supporting metadata accessible over the network, through a menu-driven, web-based portal, or spatially through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  12. Looking back: forward looking

    PubMed Central

    2017-01-01

    Abstract GigaScience is now 5 years old, having been launched at the 2012 Intelligent Systems for Molecular Biology conference. Anyone who has attended what is the largest computational biology conference since then has had the opportunity to join us for each birthday celebration—and receive 1 of our fun T-shirts as a party prize. Since launching, we have pushed our agenda of openness, transparency, reproducibility, and reusability. Here, we look back at our first 5 years and what we have done to forward our open science goals in scientific publishing. Our mainstay has been to create a process that allows the availability and publication of as many “research objects” as possible to create a more complete way of communicating how the research process is done. PMID:28938718

  13. Schematic driven layout of Reed Solomon encoders

    NASA Technical Reports Server (NTRS)

    Arave, Kari; Canaris, John; Miles, Lowell; Whitaker, Sterling

    1992-01-01

    Two Reed Solomon error correcting encoders are presented. Schematic driven layout tools were used to create the encoder layouts. Special consideration had to be given to the architecture and logic to provide scalability of the encoder designs. Knowledge gained from these projects was used to create a more flexible schematic driven layout system.
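
    For readers outside coding theory, the arithmetic such an encoder implements in silicon can be sketched in software. The following is a minimal systematic Reed-Solomon encoder over GF(2^8); the primitive polynomial (0x11D) and parity count (4) are illustrative choices, not parameters taken from the 1992 designs.

    ```python
    # Sketch: what a systematic Reed-Solomon encoder computes -- parity symbols
    # from polynomial long division over GF(2^8). Parameters are illustrative.

    EXP, LOG = [0] * 512, [0] * 256
    x = 1
    for i in range(255):                       # build antilog/log tables
        EXP[i], LOG[x] = x, i
        x <<= 1
        if x & 0x100:
            x ^= 0x11D                         # reduce by the primitive polynomial
    for i in range(255, 512):
        EXP[i] = EXP[i - 255]                  # duplicate so gf_mul needs no modulo

    def gf_mul(a, b):
        return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

    def poly_mul(p, q):                        # polynomial product over GF(2^8)
        res = [0] * (len(p) + len(q) - 1)
        for i, pc in enumerate(p):
            for j, qc in enumerate(q):
                res[i + j] ^= gf_mul(pc, qc)
        return res

    def generator_poly(nsym):                  # g(x) = (x - a^0)...(x - a^(nsym-1))
        g = [1]
        for i in range(nsym):
            g = poly_mul(g, [1, EXP[i]])
        return g

    def rs_encode(msg, nsym):
        gen = generator_poly(nsym)
        rem = list(msg) + [0] * nsym           # long division; remainder = parity
        for i in range(len(msg)):
            coef = rem[i]
            if coef:
                for j in range(1, len(gen)):
                    rem[i + j] ^= gf_mul(gen[j], coef)
        return list(msg) + rem[len(msg):]      # systematic: message then parity

    print(rs_encode([0x12, 0x34, 0x56, 0x78], nsym=4))
    ```

    The scalability consideration mentioned in the abstract corresponds to letting the parity count and message length vary without changing the division logic.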

  14. Like a bridge over troubled water--Opening pathways for integrating social sciences and humanities into nuclear research.

    PubMed

    Turcanu, Catrinel; Schröder, Jantine; Meskens, Gaston; Perko, Tanja; Rossignol, Nicolas; Carlé, Benny; Hardeman, Frank

    2016-03-01

    Research on nuclear technologies has been largely driven by a detachment of the 'technical content' from the 'social context'. However, social studies of science and technology--also for the nuclear domain--emphasize that 'the social' and 'the technical' dimensions of technology development are inter-related and co-produced. In an effort to link nuclear research and innovation with society in mutually beneficial ways, the Belgian Nuclear Research Centre started a 'Programme of Integration of Social Aspects into nuclear research' (PISA) fifteen years ago. In line with broader science-policy agendas (responsible research and innovation and technology assessment), this paper argues that the importance of such programmes is threefold. First, their multi-disciplinary basis and participatory character contribute to a better understanding of the interactions between science, technology, and society in general, and of the complexity of nuclear technology assessment in particular. Second, their functioning as (self-)critical, policy-supportive research with outreach to society is an essential prerequisite for policies aiming to generate societal trust in the context of controversial issues related to nuclear technologies and exposure to ionising radiation. Third, such programmes create an enriching dynamic in the organisation itself, stimulating collective learning and transdisciplinarity. The paper illustrates these claims with concrete examples and concludes by discussing some key challenges that researchers face while engaging in work of this kind. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Interactive investigations into planetary interiors

    NASA Astrophysics Data System (ADS)

    Rose, I.

    2015-12-01

    Many processes in Earth science are difficult to observe or visualize due to the large timescales and length scales over which they operate. The dynamics of planetary mantles are particularly challenging, as we cannot even look at the rocks involved. As a result, much teaching material on mantle dynamics relies on static images and cartoons, many of which are decades old. Recent improvements in computing power and technology (largely driven by game and web development) have allowed for advances in real-time physics simulations and visualizations, but these have been slow to affect Earth science education. Here I demonstrate a teaching tool for mantle convection and seismology which solves the equations for conservation of mass, momentum, and energy in real time, allowing users to make changes to the simulation and immediately see the effects. The user can ask and answer questions about what happens when they add heat in one place, take it away from another place, or increase the temperature at the base of the mantle. They can also pause the simulation and, while it is paused, create and visualize seismic waves traveling through the mantle. These allow for investigations into and discussions about plate tectonics, earthquakes, hot spot volcanism, and planetary cooling. The simulation is rendered to the screen using OpenGL and is cross-platform. It can be run as a native application for maximum performance, but it can also be embedded in a web browser for easy deployment and portability.
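
    The core of such a real-time demonstration is a cheap per-frame update of the governing equations. As a rough illustration of the energy equation only (the tool's momentum and mass conservation solves are omitted), a minimal explicit finite-difference diffusion step might look like the following; the grid size, time step, and diffusivity are illustrative.

    ```python
    # Sketch: one explicit finite-difference step of the energy equation (heat
    # diffusion) on a 2D grid -- the kind of update a real-time convection demo
    # iterates every frame. Constants are illustrative; momentum/mass omitted.
    import numpy as np

    nx, ny, dx, dt, kappa = 128, 128, 1.0, 0.2, 1.0  # dt*kappa/dx**2 <= 0.25: stable
    T = np.zeros((ny, nx))

    def step(T):
        lap = np.zeros_like(T)
        lap[1:-1, 1:-1] = (T[2:, 1:-1] + T[:-2, 1:-1] +
                           T[1:-1, 2:] + T[1:-1, :-2] -
                           4.0 * T[1:-1, 1:-1]) / dx**2
        return T + dt * kappa * lap       # interior update; edges handled below

    for _ in range(500):                  # "adding heat at the base" each frame
        T = step(T)
        T[-1, :], T[0, :] = 1.0, 0.0      # hot bottom boundary, cold surface
    ```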

  16. Of the Helmholtz Club, South-Californian seedbed for visual and cognitive neuroscience, and its patron Francis Crick

    PubMed Central

    Aicardi, Christine

    2014-01-01

    Taking up the view that semi-institutional gatherings such as clubs, societies, research schools, have been instrumental in creating sheltered spaces from which many a 20th-century project-driven interdisciplinary research programme could develop and become established within the institutions of science, the paper explores the history of one such gathering from its inception in the early 1980s into the 2000s, the Helmholtz Club, which brought together scientists from such various research fields as neuroanatomy, neurophysiology, psychophysics, computer science and engineering, who all had an interest in the study of the visual system and of higher cognitive functions relying on visual perception such as visual consciousness. It argues that British molecular biologist turned South Californian neuroscientist Francis Crick had an early and lasting influence over the Helmholtz Club of which he was a founding pillar, and that from its inception, the club served as a constitutive element in his long-term plans for a neuroscience of vision and of cognition. Further, it argues that in this role, the Helmholtz Club served many purposes, the primary of which was to be a social forum for interdisciplinary discussion, where ‘discussion’ was not mere talk but was imbued with an epistemic value and as such, carefully cultivated. Finally, it questions what counts as ‘doing science’ and in turn, definitions of success and failure—and provides some material evidence towards re-appraising the successfulness of Crick’s contribution to the neurosciences. PMID:24384229

  17. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from (1) digitized suspended-sediment-concentration traces, (2) linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and (3) nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
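
    Option (2) above is plain to express numerically: interpolate the log-transformed concentrations linearly in time, then exponentiate. A minimal sketch with invented sample values, followed by the standard USGS discharge conversion Qs (tons/day) = 0.0027 × Q (ft³/s) × C (mg/L); this is an illustration, not SEDCALC's code.

    ```python
    # Sketch: SEDCALC option (2) -- linear interpolation between log-transformed
    # instantaneous concentrations at unequal times, resampled to a fixed
    # interval, then the standard USGS suspended-sediment discharge formula.
    # All sample values are invented.
    import numpy as np

    t_obs = np.array([0.0, 3.5, 9.0, 14.0])       # observation times, hours
    conc  = np.array([120.0, 80.0, 310.0, 95.0])  # concentrations, mg/L

    t_eq = np.arange(0.0, 14.0 + 1e-9, 1.0)       # equal 1-hour interval
    conc_eq = np.exp(np.interp(t_eq, t_obs, np.log(conc)))

    q = np.full_like(t_eq, 250.0)                 # illustrative streamflow, ft^3/s
    qs = 0.0027 * q * conc_eq                     # discharge, tons/day
    print(qs.round(1))
    ```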

  18. Designing Infographics to support teaching complex science subject: A comparison between static and animated Infographics

    NASA Astrophysics Data System (ADS)

    Hassan, Hesham Galal

    This thesis explores the principles and rules for creating excellent infographics that communicate information successfully and effectively. Not only does this thesis examine the creation of infographics, it also tries to answer which format, static or animated, is the most effective when used as a teaching aid for complex science subjects, and whether compelling infographics in the preferred format facilitate the learning experience. The methodology includes the creation of an infographic in two formats (static and animated) on a fairly complex science subject (phases of the Moon), which were then tested for their efficacy as a whole, and the two formats were compared in terms of information comprehension and retention. My hypothesis predicts that an infographic in the animated format would be more effective in communicating a complex science subject (phases of the Moon), specifically when using 3D computer animation to visualize the topic. This would also help different types of learners to comprehend science subjects more easily. Most animated infographics produced nowadays are created for marketing and business purposes and do not implement the analytical design principles required for excellent information design. I believe that science learners are still in need of more variety in their methods of learning information, and that infographics can be of great assistance. The results of this thesis study suggest that properly designed infographics can be of great help in teaching complex science subjects that involve spatial and temporal data. This could facilitate learning science subjects and consequently increase the interest of young learners in STEM.

  19. Computer version of astronomical ephemerides.

    NASA Astrophysics Data System (ADS)

    Choliy, V. Ya.

    A computer version of astronomical ephemerides for bodies of the Solar System, stars, and astronomical phenomena was created at the Main Astronomical Observatory of the National Academy of Sciences of Ukraine and the Astronomy and Cosmic Physics Department of the Taras Shevchenko National University. The ephemerides will be distributed via the Internet or as files. This information is accessible via the web servers space.ups.kiev.ua and alfven.ups.kiev.ua or at the address choliy@astrophys.ups.kiev.ua.

  20. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
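
    Cell Collective models are logical (Boolean) networks, so the simulation loop students run is conceptually tiny. A toy three-regulator circuit, invented here rather than taken from Cell Collective, shows the synchronous update at the heart of such tools:

    ```python
    # Sketch: synchronous update of a toy Boolean network -- the style of logical
    # model Cell Collective lets students build and simulate. This toy circuit
    # is invented for illustration, not taken from Cell Collective.
    rules = {
        "antigen":   lambda s: s["antigen"],                 # external input
        "receptor":  lambda s: s["antigen"],                 # activated by antigen
        "response":  lambda s: s["receptor"] and not s["inhibitor"],
        "inhibitor": lambda s: s["response"],                # negative feedback
    }

    state = {"antigen": True, "receptor": False,
             "response": False, "inhibitor": False}

    for t in range(8):                    # all nodes update from the old state
        print(t, state)
        state = {node: bool(rule(state)) for node, rule in rules.items()}
    ```

    Running it shows the negative-feedback loop pushing the response into a sustained oscillation, the kind of behavior students can then "break" by editing a rule.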

  1. High-performance scientific computing in the cloud

    NASA Astrophysics Data System (ADS)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  2. Opportunities and challenges for the life sciences community.

    PubMed

    Kolker, Eugene; Stewart, Elizabeth; Ozdemir, Vural

    2012-03-01

    Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19-20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16-17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org ) was formed to become a Digital Commons for the life sciences community.

  3. Opportunities and Challenges for the Life Sciences Community

    PubMed Central

    Stewart, Elizabeth; Ozdemir, Vural

    2012-01-01

    Abstract Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19–20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16–17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org) was formed to become a Digital Commons for the life sciences community. PMID:22401659

  4. Life science research and drug discovery at the turn of the 21st century: the experience of SwissBioGrid.

    PubMed

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-04-22

    It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.
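
    The virtual-screening half of SwissBioGrid is the classic embarrassingly parallel workload: score every ligand against every target and keep the best poses. A minimal sketch of that shape, with a deterministic stand-in scoring function rather than a real docking engine; all names and the scoring are invented for illustration.

    ```python
    # Sketch: the embarrassingly parallel shape of a virtual-screening campaign.
    # score() is a deterministic stand-in for a real docking engine.
    from itertools import product
    from multiprocessing import Pool

    targets = ["dengue-E-site-1", "dengue-E-site-2"]        # hypothetical
    ligands = [f"ligand_{i:06d}" for i in range(10_000)]    # hypothetical IDs

    def score(pair):
        target, ligand = pair
        fake = sum(ord(c) for c in target + ligand) % 100   # stand-in "energy"
        return (target, ligand, fake)

    if __name__ == "__main__":
        with Pool() as pool:
            results = pool.map(score, product(targets, ligands), chunksize=500)
        best = sorted(results, key=lambda r: r[2])[:10]     # top-scoring pairs
        print(best)
    ```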

  5. A High School-Collegiate Outreach Program in Chemistry and Biology Delivering Modern Technology in a Mobile Van

    NASA Astrophysics Data System (ADS)

    Craney, Chris; Mazzeo, April; Lord, Kaye

    1996-07-01

    During the past five years the nation's concern for science education has expanded from a discussion about the future supply of Ph.D. scientists and its impact on the nation's scientific competitiveness to the broader consideration of the science education available to all students. Efforts to improve science education have led many authors to suggest greater collaboration between high school science teachers and their college/university colleagues. This article reviews the experience and outcomes of the Teachers + Occidental = Partnership in Science (TOPS) van program operating in the Los Angeles Metropolitan area. The program emphasizes an extensive ongoing staff development, responsiveness to teachers' concerns, technical and on-site support, and sustained interaction between participants and program staff. Access to modern technology, including computer-driven instruments and commercial data analysis software, coupled with increased teacher content knowledge has led to empowerment of teachers and changes in student interest in science. Results of student and teacher questionnaires are reviewed.

  6. Science alliance: A vital ORNL-UT partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richmond, C.R.; Riedinger, L.; Garritano, T.

    1991-01-01

    Partnerships between Department of Energy national laboratories and universities have long been keys to advancing scientific research and education in the United States. Perhaps the most enduring and closely knit of these relationships is the one between Oak Ridge National Laboratory and the University of Tennessee at Knoxville. Since its birth in the 1940's, ORNL has had a very special relationship with UT, and today the two institutions have closer ties than virtually any other university and national laboratory. Seven years ago, ORNL and UT began a new era of cooperation by creating the Science Alliance, a Center of Excellence at UT sponsored by the Tennessee Higher Education Commission. As the oldest and largest of these centers, the Science Alliance is the primary vehicle through which Tennessee promotes research and educational collaboration between UT and ORNL. By letting the two institutions pool their intellectual and financial resources, the alliance creates a more fertile scientific environment than either could achieve on its own. Part of the UT College of Liberal Arts, the Science Alliance is composed of four divisions (Biological Sciences, Chemical Sciences, Physical Sciences, and Mathematics and Computer Science) that team 100 of the university's top faculty with their outstanding colleagues from ORNL.

  7. Big Data Ecosystems Enable Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, Terence J.; Kleese van Dam, Kerstin

    Over the past 5 years, advances in experimental, sensor, and computational technologies have driven exponential growth in the volumes, acquisition rates, variety, and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology: data-intensive science. Data-intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement, and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized, and visualized results - allowing them to carry out 'experiments' in data.

  8. Service-Oriented Architectures and Project Optimization for a Special Cost Management Problem Creating Synergies for Informed Change between Qualitative and Quantitative Strategic Management Processes

    DTIC Science & Technology

    2010-05-01

    University of the Federal Armed Forces of Germany, Institute for Theoretic Computer Science, Mathematics and Operations Research, Werner-Heisenberg-Weg 39, 85577 Neubiberg, Germany. Phone +49 89 6004 2400. Marco Schuler is an active officer of the Federal…

  9. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "Can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.
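
    The capture side of this idea can be sketched without any workflow engine: wrap each analysis function so calls are logged, then replay the log. The functions below are illustrative stand-ins, not the prototype's actual provenance model.

    ```python
    # Sketch: capture provenance by logging every tracked call, then replay the
    # log as a workflow. load() and regrid() are illustrative stand-ins.
    import functools

    provenance = []                       # ordered log of (function, args, kwargs)

    def tracked(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            provenance.append((fn, args, kwargs))   # fn is the undecorated function
            return fn(*args, **kwargs)
        return wrapper

    @tracked
    def load(dataset):
        return f"data({dataset})"

    @tracked
    def regrid(data, resolution):
        return f"regrid({data}, res={resolution})"

    # Interactive exploration happens as usual...
    result = regrid(load("ARGO-deep-ocean-temperature"), resolution=1.0)

    # ...and the logged calls replay as a reproducible workflow (no re-logging,
    # because the log stores the undecorated functions).
    def replay():
        return [fn(*args, **kwargs) for fn, args, kwargs in list(provenance)]
    ```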

  10. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big-data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.

  11. Records for conversion of laser energy to nuclear energy in exploding nanostructures

    NASA Astrophysics Data System (ADS)

    Jortner, Joshua; Last, Isidore

    2017-09-01

    Table-top nuclear fusion reactions in the chemical physics laboratory can be driven by the high-energy dynamics of Coulomb-exploding, multicharged, deuterium-containing nanostructures generated by ultraintense, femtosecond, near-infrared laser pulses. Theoretical-computational studies of table-top laser-driven nuclear fusion of high-energy (up to 15 MeV) deuterons with 7Li, 6Li, and D nuclei demonstrate the attainment of high fusion yields within a source-target reaction design, which constitutes the highest table-top fusion efficiencies obtained to date. The conversion efficiency of laser energy to nuclear energy (0.1-1.0%) for table-top fusion is comparable to that of DT fusion currently accomplished in 'big science' inertial fusion setups.

  12. Data-driven system to predict academic grades and dropout

    PubMed Central

    Rovira, Sergi; Puertas, Eloi

    2017-01-01

    Nowadays, the role of a tutor is more important than ever to prevent student dropout and improve academic performance. This work proposes a data-driven system to extract relevant information hidden in student academic data and thus help tutors offer their pupils more proactive personal guidance. In particular, our system, based on machine learning techniques, makes predictions of dropout intention and course grades of students, as well as personalized course recommendations. Moreover, we present different visualizations which help in the interpretation of the results. In the experimental validation, we show that the system obtains promising results with data from the degree studies in Law, Computer Science, and Mathematics of the Universitat de Barcelona. PMID:28196078
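
    Although the paper's exact features and models are not given here, the prediction task has a familiar scikit-learn shape. A minimal sketch on synthetic student features (all values and the model choice invented for illustration):

    ```python
    # Sketch: a dropout classifier in the spirit of the system described, built
    # with scikit-learn on synthetic per-student features.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))     # e.g. GPA, credits passed, age, pace
    y = (X[:, 0] + 0.5 * X[:, 1]      # synthetic "dropout" signal plus noise
         + rng.normal(scale=0.8, size=500) < -0.5).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```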

  13. Development strategies for science learning management to transition in the 21st century of Thailand 4.0

    NASA Astrophysics Data System (ADS)

    Jedaman, Pornchai; Buraphan, Khajornsak; Yuenyong, Chokchai; Suksup, Charoen; Kraisriwattana, Benchalax

    2018-01-01

    This study aimed to analyze development strategies for science learning management in the transition to the 21st century of Thailand 4.0. This qualitative study employed a documentary review and questionnaires, combined with participatory action learning with teachers in twenty-five secondary education service area offices of Thai basic education; participants were selected by cluster random sampling, 150 persons per group. Data analysis included data reduction, data organization, and data interpretation leading to conclusions. The main findings concerned creating innovation, building links and access to technology, and responding to change; effective learning management in school science is therefore very important. For science learning management to succeed in the 21st century, the driving plan spans strategies converted into practical steps through institutional research and development, addressing identity change, reorientation, paradigm shift, and cultural transformation to propel the country toward first-world-nation status, built on the elements "6R12C3E".

  14. Quantitative and Qualitative Evaluation of The Structural Designing of Medical Informatics Dynamic Encyclopedia.

    PubMed

    Safdari, Reza; Shahmoradi, Leila; Hosseini-Beheshti, Molouk-Sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad

    2015-10-01

    Encyclopedias and their compilation have become prevalent as a valid cultural medium throughout the world. The daily development of the computer industry and the expansion of the various sciences have made the compilation of specialized electronic encyclopedias, especially web-based ones, indispensable. This applied-developmental study was conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined; the tree diagram of the terms was then drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the gathered terms were placed in related domains. In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of informatics' sub-branches is biomedical and health informatics, which itself consists of three subdivisions: bioinformatics, clinical informatics, and health informatics. Medical informatics, a subdivision of clinical informatics, has developed from the confluence, fusion, and applications of three major scientific branches: health and biological sciences, social and management sciences, and computing and mathematical sciences. Accordingly, the current structure of MeSH is weak for the future development of an Encyclopedia of Medical Informatics.

  15. Perceptions of teaching and learning automata theory in a college-level computer science course

    NASA Astrophysics Data System (ADS)

    Weidmann, Phoebe Kay

    This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum in the sense that many curricula and textbooks agree on what Automata Theory contains, differences being depth and amount of material to cover in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, material and course grade analysis we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. Pedagogical positivism states that through examining instructor and student perceptions of teaching and learning, improvements to a course are possible. These improvements can eventually develop a "best practice" instructional environment. This view is not possible under a strictly constructivist learning theory as there is no way to teach a group of individuals in a "best" way. Using this theoretical basis, we examined the gathered data from CS 341. (Abstract shortened by UMI.)
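
    The abstract notes that students prefer concrete activities such as programming, and a few lines of code are often enough to make an automaton concrete. A minimal DFA sketch, invented here and not drawn from the CS 341 course materials:

    ```python
    # Sketch: a DFA made concrete in a few lines -- this machine accepts binary
    # strings containing an even number of 1s. Invented for illustration.
    transitions = {("even", "0"): "even", ("even", "1"): "odd",
                   ("odd",  "0"): "odd",  ("odd",  "1"): "even"}

    def accepts(string, start="even", accepting=("even",)):
        state = start
        for symbol in string:
            state = transitions[(state, symbol)]   # one table lookup per symbol
        return state in accepting

    assert accepts("1100") and not accepts("100")
    ```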

  16. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    NASA Astrophysics Data System (ADS)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data, and geoscientific assets. This paper examines (1) the computational environments that support the modelling and data processing pipelines, (2) the analysis environments and methods to support data analysis, and (3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: (1) weather, climate, and earth system science model simulations, (2) marine and earth observations, (3) geosciences, (4) terrestrial ecosystems, (5) water and hydrology, and (6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system, and several highly connected, large-scale, high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services, and supporting software systems on these new petascale infrastructures. But to enable research at this scale, the data, metadata, and software now need to evolve together, creating a new integrated high-performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, and satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections, providing a new infrastructure for future interdisciplinary research.

  17. A policy for science.

    PubMed

    Lauer, Michael S

    2012-06-12

    Policy and science often interact. Typically, we think of policymakers looking to scientists for advice on issues informed by science. We may appreciate less the opposite look: where people outside science inform policies that affect the conduct of science. In clinical medicine, we are forced to make decisions about practices for which there is insufficient, inadequate evidence to know whether they improve clinical outcomes, yet the health care system may not be structured to rapidly generate needed evidence. For example, when the Centers for Medicare and Medicaid Services noted insufficient evidence to support routine use of computed tomography angiography and they called for a national commitment to completion of randomized trials, their call ran into substantial opposition. I use the computed tomography angiography story to illustrate how we might consider a "policy for science" in which stakeholders would band together to identify evidence gaps and to use their influence to promote the efficient design, implementation, and completion of high-quality randomized trials. Such a policy for science could create a culture that incentivizes and invigorates the rapid generation of evidence, ultimately engaging all clinicians, all patients, and indeed all stakeholders into the scientific enterprise. Copyright © 2012 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  18. Science Planning and Orbit Classification for Solar Probe Plus

    NASA Astrophysics Data System (ADS)

    Kusterer, M. B.; Fox, N. J.; Rodgers, D. J.; Turner, F. S.

    2016-12-01

    There are a number of challenges for the Science Planning Team (SPT) of the Solar Probe Plus (SPP) mission. Since SPP is using a decoupled payload operations approach, tight coordination between the mission operations and payload teams will be required. The payload teams must manage the volume of data that they write to the spacecraft solid-state recorders (SSR) for their individual instruments for downlink to the ground. Making this process more difficult, the geometry of the celestial bodies and the spacecraft during some of the SPP mission orbits limits uplink and downlink opportunities. The payload teams will also be required to coordinate power-on opportunities, command uplink opportunities, and data transfers from instrument memory to the spacecraft SSR with the operations team. The SPT also intends to coordinate observations with other spacecraft and ground-based systems. To address these challenges, detailed orbit activity planning is required in advance for each orbit. An orbit planning process is being created to facilitate the coordination of spacecraft and payload activities for each orbit. An interactive Science Planning Tool is being designed to integrate the payload data volume and priority allocations, spacecraft ephemeris and attitude, downlink and uplink schedules, and spacecraft and payload activities. It will be used during science planning to select the instrument data priorities and data volumes that satisfy the orbit data volume constraints and the power-on, command uplink, and data transfer time periods. To aid the initial stages of science planning we have created an orbit classification scheme based on downlink availability and significant science events. Different types of challenges arise in the management of science data driven by orbital geometry and operational constraints, and this scheme attempts to identify the patterns that emerge.
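
    The SSR management problem described reduces, in its simplest form, to filling a bounded downlink by instrument priority. A toy sketch of that allocation step, using SPP's instrument names but entirely invented priorities, volumes, and capacity:

    ```python
    # Sketch: the simplest form of the SSR/downlink allocation problem -- grant
    # instrument data volumes in priority order until the pass capacity is used.
    # Instrument names are SPP's payloads; all numbers are invented.
    downlink_capacity_gbit = 40.0

    requests = [("FIELDS", 1, 22.0),   # (instrument, priority, requested Gbit)
                ("SWEAP",  2, 18.0),
                ("WISPR",  3, 12.0),
                ("ISOIS",  4,  6.0)]

    remaining = downlink_capacity_gbit
    plan = {}
    for name, _, volume in sorted(requests, key=lambda r: r[1]):
        plan[name] = min(volume, remaining)   # grant what fits, by priority
        remaining -= plan[name]

    print(plan, f"unused: {remaining:.1f} Gbit")
    ```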

  19. The OSG open facility: A sharing ecosystem

    DOE PAGES

    Jayatilaka, B.; Levshina, T.; Rynge, M.; ...

    2015-12-23

    The Open Science Grid (OSG) ties together individual experiments' computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and increased delivery of Distributed High-Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites; this is primarily accomplished by harvesting and organizing the temporarily unused capacity (i.e., opportunistic cycles) from the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to make sure that this computing power is used with maximal efficiency. Furthermore, we believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues in expanding this service.

  20. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2014-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory. More info available at: scied.ucar.edu/events/agu-2014-games-simulations-sessions

  1. Computational Study of Breathing-type Processes in Driven, Confined, Granular Alignments

    DTIC Science & Technology

    2012-04-17

    Government of India, Title: "Newton's cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier," June 29, 2011. 2. Physics Seminar, Indian Institute of Science, Bangalore, India, Title: "Newton's cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier," June 30, 2011. 3. Physics Department Colloquium, SUNY Buffalo, Title: "Newton's cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier," January 20, 2011. 4

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.

  3. Students' Opinions on the Light Pollution Application

    ERIC Educational Resources Information Center

    Özyürek, Cengiz; Aydin, Güliz

    2015-01-01

    The purpose of this study is to determine the impact of computer-animated concept cartoons and outdoor science activities on creating awareness among seventh graders about light pollution. It also aims to identify the views of the students on the activities that were carried out. This study used a one-group pre-test/post-test experimental design…

  4. Programs as Causal Models: Speculations on Mental Programs and Mental Representation

    ERIC Educational Resources Information Center

    Chater, Nick; Oaksford, Mike

    2013-01-01

    Judea Pearl has argued that counterfactuals and causality are central to intelligence, whether natural or artificial, and has helped create a rich mathematical and computational framework for formally analyzing causality. Here, we draw out connections between these notions and various current issues in cognitive science, including the nature of…

  5. Knowledge Structures of Entering Computer Networking Students and Their Instructors

    ERIC Educational Resources Information Center

    DiCerbo, Kristen E.

    2007-01-01

    Students bring prior knowledge to their learning experiences. This prior knowledge is known to affect how students encode and later retrieve new information learned. Teachers and content developers can use information about students' prior knowledge to create more effective lessons and materials. In many content areas, particularly the sciences,…

  6. Creating a STEM-Literate Society

    ERIC Educational Resources Information Center

    Dow, Mirah J.

    2014-01-01

    Picture a professor of library and information science working in her office on the fourth floor of the university library. She is away from student traffic and relatively isolated from other faculty on campus. Nevertheless, the professor has much to be proud of as she uses computer technology to reach, teach, and interact with enrolled students…

  7. How to build better memory training games

    PubMed Central

    Deveau, Jenni; Jaeggi, Susanne M.; Zordan, Victor; Phung, Calvin; Seitz, Aaron R.

    2015-01-01

    Can we create engaging training programs that improve working memory (WM) skills? While there are numerous procedures that attempt to do so, there is a great deal of controversy regarding their efficacy. Nonetheless, recent meta-analytic evidence shows consistent improvements across studies on lab-based tasks, generalizing beyond the specific training effects (Au et al., 2014; Karbach and Verhaeghen, 2014); however, there is little research into how WM training aids participants in their daily lives. Here we propose that incorporating design principles from the fields of Perceptual Learning (PL) and Computer Science might augment the efficacy of WM training, and ultimately lead to greater learning and transfer. In particular, the field of PL has identified numerous mechanisms (including attention, reinforcement, multisensory facilitation and multi-stimulus training) that promote brain plasticity. Also, computer science has made great progress in the scientific approach to game design that can be used to create engaging environments for learning. We suggest that approaches integrating knowledge across these fields may lead to more effective WM interventions and better reflect real-world conditions. PMID:25620916
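
    One concrete design principle from this literature is adaptive difficulty, keeping the task near the edge of the trainee's ability, as in the adaptive n-back procedure common in WM training. A minimal staircase sketch, with accuracy thresholds that are illustrative rather than taken from the paper:

        def adapt_n_back(n, accuracy, step_up=0.90, step_down=0.70):
            """Simple staircase: raise the n-back level after a high-accuracy
            block, lower it after a low-accuracy block, otherwise hold steady."""
            if accuracy >= step_up:
                return n + 1
            if accuracy <= step_down and n > 1:
                return n - 1
            return n

        level = 2
        for block_accuracy in [0.95, 0.92, 0.60, 0.75, 0.91]:
            level = adapt_n_back(level, block_accuracy)
            print(level)   # 3, 4, 3, 3, 4 -- difficulty tracks performance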

  8. On transferring the grid technology to the biomedical community.

    PubMed

    Mohammed, Yassene; Sax, Ulrich; Dickmann, Frank; Lippert, Joerg; Solodenko, Juri; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which resulted in the Grid. The inter-domain transfer process of this technology has been an intuitive process. Some difficulties facing the life science community can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies that are still in flux. We illustrate how Grid computing creates new difficulties for the technology transfer process that are not considered in Bozeman's model. We show why the success of health Grids should be measured by the qualified scientific human capital and opportunities created, and not primarily by the market impact. With two examples we show how the Grid technology transfer theory corresponds to reality. We conclude with recommendations that can help improve the adoption of Grid solutions into the biomedical community. These results give a more concise explanation of the difficulties most life science IT projects are facing in the late funding periods, and show some leveraging steps which can help to overcome the "vale of tears".

  9. Creation of a 3-dimensional virtual dental patient for computer-guided surgery and CAD-CAM interim complete removable and fixed dental prostheses: A clinical report.

    PubMed

    Harris, Bryan T; Montero, Daniel; Grant, Gerald T; Morton, Dean; Llop, Daniel R; Lin, Wei-Shao

    2017-02-01

    This clinical report proposes a digital workflow using 2-dimensional (2D) digital photographs, a 3D extraoral facial scan, and cone beam computed tomography (CBCT) volumetric data to create a 3D virtual patient with craniofacial hard tissue, remaining dentition (including surrounding intraoral soft tissue), and the realistic appearance of facial soft tissue at an exaggerated smile under static conditions. The 3D virtual patient was used to assist the virtual diagnostic tooth arrangement process, providing the patient with a pleasing preoperative virtual smile design that harmonized with facial features. The 3D virtual patient was also used to gain the patient's pretreatment approval (as a communication tool), design a prosthetically driven surgical plan for computer-guided implant surgery, and fabricate the computer-aided design and computer-aided manufacturing (CAD-CAM) interim prostheses. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  10. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2013-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by education groups at NCAR/UCAR in Boulder, primarily Spark and the COMET Program. These materials have been disseminated via Spark's web site (spark.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility. Spark has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  11. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPS). It also facilitates algorithm investigation, integration, checkouts and tuning; instrument and product calibration; data quality support and monitoring; and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up-Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and science and sensor quality analysis tools. In this presentation we will describe the GRAVITE systems and subsystems, architecture, technical specifications, capabilities and resources, distributed data and products, and the latest advances to support the JPSS science algorithm implementation, validation and testing.

  12. Environmental Data-Driven Inquiry and Exploration (EDDIE)- Water Focused Modules for interacting with Big Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Meixner, T.; Gougis, R.; O'Reilly, C.; Klug, J.; Richardson, D.; Castendyk, D.; Carey, C.; Bader, N.; Stomberg, J.; Soule, D. C.

    2016-12-01

    High-frequency sensor data are driving a shift in the Earth and environmental sciences. The availability of high-frequency data creates an engagement opportunity for undergraduate students in primary research by using large, long-term, sensor-based data directly in the scientific curriculum. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) has developed flexible classroom activity modules designed to meet a series of pedagogical goals that include (1) developing skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. In this presentation we will focus on a sequence of modules of particular interest to hydrologists: stream discharge, water quality, and nutrient loading. Assessment results show that our modules are effective at making students more comfortable analyzing data, improving their understanding of statistical concepts, and strengthening their data analysis capability. This project is funded by an NSF TUES grant (NSF DEB 1245707).
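
    To give a flavor of the data manipulation such modules ask of students, the sketch below resamples a synthetic high-frequency stream discharge record to compare event-scale and seasonal variability. The variable names and the synthetic record are placeholders, not part of the published modules:

        import numpy as np
        import pandas as pd

        # Synthetic 15-minute discharge record standing in for a sensor download:
        # a seasonal sine plus gamma-distributed storm-event noise.
        idx = pd.date_range("2015-01-01", "2015-12-31 23:45", freq="15min")
        rng = np.random.default_rng(0)
        discharge = (5 + 2 * np.sin(2 * np.pi * idx.dayofyear / 365)
                     + rng.gamma(1.0, 1.0, len(idx)))
        df = pd.DataFrame({"discharge_cms": discharge}, index=idx)

        daily = df["discharge_cms"].resample("D").mean()     # smooths storm-event noise
        monthly = df["discharge_cms"].resample("MS").mean()  # isolates the seasonal signal

        print("daily std:  ", round(daily.std(), 3))
        print("monthly std:", round(monthly.std(), 3))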

  13. Science Opportunity Analyzer (SOA) Version 8

    NASA Technical Reports Server (NTRS)

    Witoff, Robert J.; Polanskey, Carol A.; Aguinaldo, Anna Marie A.; Liu, Ning; Hofstadter, Mark D.

    2013-01-01

    SOA allows scientists to plan spacecraft observations. It facilitates the identification of geometrically interesting times in a spacecraft's orbit that a user can use to plan observations or instrument-driven spacecraft maneuvers. These observations can then be visualized multiple ways in both two- and three-dimensional views. When observations have been optimized within a spacecraft's flight rules, the resulting plans can be output for use by other JPL uplink tools. Now in its eighth major version, SOA improves on these capabilities in a modern and integrated fashion. SOA consists of five major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, and Data Output. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules. This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output allows the user to compute ancillary data related to an observation or to a given position of the spacecraft along its trajectory. The data can be saved as a tab-delimited text file or viewed as a graph. SOA combines science planning functionality unique to both JPL and the sponsoring spacecraft. SOA is able to ingest JPL SPICE Kernels that are used to drive the tool and its computations. A Percy search engine is then included that identifies interesting time periods for the user to build observations. When observations are built, flight-like orientation algorithms closely simulate the flight spacecraft's dynamics. SOA v8 represents large steps forward from SOA v7 in terms of quality, reliability, maintainability, efficiency, and user experience. A tailored agile development environment has been built around SOA that provides automated unit testing, continuous build and integration, a consolidated Web-based code and documentation storage environment, modern Java enhancements, and a focus on usability.
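
    As an illustration of the kind of geometric flight rule a Constraint Checking function must evaluate (a generic sketch, not SOA's implementation), a Sun-avoidance rule can be checked from the angle between an instrument boresight and the spacecraft-to-Sun vector:

        import numpy as np

        def sun_avoidance_ok(boresight, sc_to_sun, min_angle_deg=30.0):
            """Flight-rule check: the boresight must stay at least
            min_angle_deg away from the Sun direction. The 30-degree
            threshold is an illustrative placeholder."""
            b = boresight / np.linalg.norm(boresight)
            s = sc_to_sun / np.linalg.norm(sc_to_sun)
            angle = np.degrees(np.arccos(np.clip(np.dot(b, s), -1.0, 1.0)))
            return angle >= min_angle_deg

        print(sun_avoidance_ok(np.array([1.0, 0.0, 0.0]),
                               np.array([0.9, 0.1, 0.0])))  # False: ~6 deg from Sun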

  14. The need for scientific software engineering in the pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  15. The need for scientific software engineering in the pharmaceutical industry.

    PubMed

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  16. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    NASA's IDEAS computer program is a tool for interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dress, W.B.

    Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudoscience and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by an analogy with entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.

  18. Meta!Blast computer game: a pipeline from science to 3D art to education

    NASA Astrophysics Data System (ADS)

    Schneller, William; Campbell, P. J.; Bassham, Diane; Wurtele, Eve Syrkin

    2012-03-01

    Meta!Blast (http://www.metablast.org) is designed to address the challenges students often encounter in understanding cell and metabolic biology. Developed by faculty and students in biology, biochemistry, computer science, game design, pedagogy, art and story, Meta!Blast is being created using Maya (http://usa.autodesk.com/maya/) and the Unity 3D (http://unity3d.com/) game engine, for Macs and PCs in classrooms; it has also been exhibited in an immersive environment. Here, we describe the pipeline from protein structural data and holographic information to art to the three-dimensional (3D) environment to the game engine, by which we provide a publicly-available interactive 3D cellular world that mimics a photosynthetic plant cell.

  19. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
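
    For readers unfamiliar with the medium-resolution approach, the topic modeling step can be reproduced in a few lines with standard tools. The corpus below is a tiny placeholder for the roughly 8000 abstracts of the case study, and the topic count is arbitrary; this is a sketch of the technique, not the authors' pipeline:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Placeholder corpus; a real review would load thousands of abstracts here.
        abstracts = [
            "groundwater depletion and irrigation policy",
            "life cycle assessment of energy systems",
            "irrigation efficiency and water reuse",
            "carbon footprint of electricity generation",
        ]

        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(abstracts)                     # document-term counts

        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        terms = vec.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = [terms[i] for i in topic.argsort()[-4:]]   # highest-weight terms
            print(f"topic {k}:", ", ".join(top))

    Fitting the same model to yearly slices of a corpus is one simple way to surface the thematic changes over time that the abstract describes.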

  20. The Science DMZ: A Network Design Pattern for Data-Intensive Science

    DOE PAGES

    Dart, Eli; Rotman, Lauren; Tierney, Brian; ...

    2014-01-01

    The ever-increasing scale of scientific data has become a significant challenge for researchers that rely on networks to interact with remote computing systems and transfer results to collaborators worldwide. Despite the availability of high-capacity connections, scientists struggle with inadequate cyberinfrastructure that cripples data transfer performance and impedes scientific progress. The Science DMZ paradigm comprises a proven set of network design patterns that collectively address these problems for scientists. We explain the Science DMZ model, including network architecture, system configuration, cybersecurity, and performance tools, which together create an optimized network environment for science. We describe use cases from universities, supercomputing centers and research laboratories, highlighting the effectiveness of the Science DMZ model in diverse operational settings. In all, the Science DMZ model is a solid platform that supports any science workflow, and flexibly accommodates emerging network technologies. As a result, the Science DMZ vastly improves collaboration, accelerating scientific discovery.

  1. Explorative search of distributed bio-data to answer complex biomedical questions

    PubMed Central

    2014-01-01

    Background The huge amount of biomedical-molecular data increasingly produced is providing scientists with potentially valuable information. Yet, such data quantity makes it difficult to find and extract those data that are most reliable and most related to the biomedical questions to be answered, which are increasingly complex and often involve many different biomedical-molecular aspects. Such questions can be addressed only by comprehensively searching and exploring different types of data, which frequently are ordered and provided by different data sources. Search Computing has been proposed for the management and integration of ranked results from heterogeneous search services. Here, we present its novel application to the explorative search of distributed biomedical-molecular data and the integration of the search results to answer complex biomedical questions. Results A set of available bioinformatics search services has been modelled and registered in the Search Computing framework, and a Bioinformatics Search Computing application (Bio-SeCo) using such services has been created and made publicly available at http://www.bioinformatics.deib.polimi.it/bio-seco/seco/. It offers an integrated environment which eases search, exploration and ranking-aware combination of heterogeneous data provided by the available registered services, and supplies global results that can support answering complex multi-topic biomedical questions. Conclusions By using Bio-SeCo, scientists can explore the very large and very heterogeneous biomedical-molecular data available. They can easily make different explorative search attempts, inspect obtained results, select the most appropriate, expand or refine them, and move forward and backward in the construction of a global complex biomedical query on multiple distributed sources that could eventually find the most relevant results. Thus, it provides an extremely useful automated support for exploratory integrated bio-search, which is fundamental for Life Science data-driven knowledge discovery. PMID:24564278
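
    The core operation here, ranking-aware combination of results from heterogeneous services, can be illustrated with simple weighted score fusion. This is a generic sketch; Bio-SeCo's actual join and ranking machinery over registered services is considerably richer:

        def fuse(results_a, results_b, w_a=0.6, w_b=0.4):
            """Combine two ranked result lists ({item: score}, higher is better)
            by normalizing each service's scores and taking a weighted sum."""
            def norm(scores):
                hi = max(scores.values())
                return {k: v / hi for k, v in scores.items()}
            a, b = norm(results_a), norm(results_b)
            fused = {i: w_a * a.get(i, 0.0) + w_b * b.get(i, 0.0)
                     for i in set(a) | set(b)}
            return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

        # Hypothetical gene hits from two services with incomparable score scales.
        print(fuse({"TP53": 9.1, "BRCA1": 7.4}, {"BRCA1": 0.92, "EGFR": 0.80}))
        # BRCA1 ranks first: it is supported by both services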

  2. How Evolution May Work Through Curiosity-Driven Developmental Process.

    PubMed

    Oudeyer, Pierre-Yves; Smith, Linda B

    2016-04-01

    Infants' own activities create and actively select their learning experiences. Here we review recent models of embodied information seeking and curiosity-driven learning and show that these mechanisms have deep implications for development and evolution. We discuss how these mechanisms yield self-organized epigenesis with emergent ordered behavioral and cognitive developmental stages. We describe a robotic experiment that explored the hypothesis that progress in learning, in and for itself, generates intrinsic rewards: The robot learners probabilistically selected experiences according to their potential for reducing uncertainty. In these experiments, curiosity-driven learning led the robot learner to successively discover object affordances and vocal interaction with its peers. We explain how a learning curriculum adapted to the current constraints of the learning system automatically formed, constraining learning and shaping the developmental trajectory. The observed trajectories in the robot experiment share many properties with those in infant development, including a mixture of regularities and diversities in the developmental patterns. Finally, we argue that such emergent developmental structures can guide and constrain evolution, in particular with regard to the origins of language. Copyright © 2016 Cognitive Science Society, Inc.
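
    The selection mechanism described, choosing experiences probabilistically by their potential to reduce uncertainty, can be sketched as a learning-progress bandit. The activities, window size, and exploration rate below are illustrative assumptions, far simpler than the sensorimotor spaces used in the actual robot experiments:

        import random

        def learning_progress(errors, window=2):
            """Progress = recent drop in prediction error on an activity."""
            if len(errors) < 2 * window:
                return 0.0
            older = sum(errors[-2 * window:-window]) / window
            recent = sum(errors[-window:]) / window
            return max(0.0, older - recent)

        def choose_activity(error_history, eps=0.1):
            """Sample activities proportionally to learning progress, with
            an eps-greedy floor so nothing is abandoned forever."""
            progress = {a: learning_progress(h) for a, h in error_history.items()}
            total = sum(progress.values())
            if random.random() < eps or total == 0.0:
                return random.choice(list(error_history))
            r, acc = random.uniform(0.0, total), 0.0
            for activity, p in progress.items():
                acc += p
                if r <= acc:
                    return activity
            return activity

        history = {"reach": [0.9, 0.8, 0.5, 0.3], "vocalize": [0.7, 0.7, 0.7, 0.7]}
        print(choose_activity(history))  # usually "reach": its error is still dropping

    Activities whose errors have plateaued stop being selected, which is how the learning curriculum described above forms automatically.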

  3. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  4. Looking back: forward looking.

    PubMed

    Edmunds, Scott C; Nogoy, Nicole A; Zauner, Hans; Li, Peter; Hunter, Christopher I; Zhe, Xiao Si; Goodman, Laurie

    2017-09-01

    GigaScience is now 5 years old, having been launched at the 2012 Intelligent Systems for Molecular Biology conference. Anyone who has attended what is the largest computational biology conference since then has had the opportunity to join us for each birthday celebration, and to receive one of our fun T-shirts as a party prize. Since launching, we have pushed our agenda of openness, transparency, reproducibility, and reusability. Here, we look back at our first 5 years and what we have done to forward our open science goals in scientific publishing. Our mainstay has been to create a process that allows the availability and publication of as many "research objects" as possible to create a more complete way of communicating how the research process is done. © The Authors 2017. Published by Oxford University Press.

  5. The assessment of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Benn, Karen P.

    1994-01-01

    This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.

  6. Astrophysical Connections to Collapsing Radiative Shock Experiments

    NASA Astrophysics Data System (ADS)

    Reighard, A. B.; Hansen, J. F.; Bouquet, S.; Koenig, M.

    2005-10-01

    Radiative shocks occur in many high-energy density explosions, but prove difficult to create in laboratory experiments or to fully model with astrophysical codes. Low astrophysical densities combined with powerful explosions provide ideal conditions for producing radiative shocks. Here we describe an experiment relevant to astrophysical shocks, which produces a driven, planar radiative shock in low-density Xe gas. Including radiation effects precludes scaling experiments directly to astrophysical conditions via the Euler equations, as can be done in purely hydrodynamic experiments. We use optical depth considerations to make comparisons between the driven shock in xenon and specific astrophysical phenomena. This planar shock may be subject to thin-shell instabilities similar to those affecting the evolution of astrophysical shocks. This research was sponsored by the National Nuclear Security Administration under the Stewardship Science Academic Alliances program through DOE Research Grants DE-FG52-03NA00064, DE-FG53-2005-NA26014, and other grants and contracts.

  7. Advancing Capabilities for Understanding the Earth System Through Intelligent Systems, the NSF Perspective

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Zanzerkia, E. E.; Munoz-Avila, H.

    2015-12-01

    The National Science Foundation (NSF) Directorate for Geosciences (GEO) and Directorate for Computer and Information Science and Engineering (CISE) acknowledge the significant scientific challenges required to understand the fundamental processes of the Earth system, within the atmospheric and geospace, Earth, ocean and polar sciences, and across those boundaries. A broad view of the opportunities and directions for GEO is described in the report "Dynamic Earth: GEO Imperatives and Frontiers 2015-2020." Many of the aspects of geosciences research, highlighted both in this document and in other community grand challenges, pose novel problems for researchers in intelligent systems. Geosciences research will require solutions for data-intensive science, advanced computational capabilities, and transformative concepts for visualizing, using, analyzing and understanding geo-phenomena and data. Opportunities for the scientific community to engage in addressing these challenges are available and being developed through NSF's portfolio of investments and activities. The NSF-wide initiative, Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21), looks to accelerate research and education through new capabilities in data, computation, software and other aspects of cyberinfrastructure. EarthCube, a joint program between GEO and the Advanced Cyberinfrastructure Division, aims to create a well-connected and facile environment to share data and knowledge in an open, transparent, and inclusive manner, thus accelerating our ability to understand and predict the Earth system. EarthCube's mission opens an opportunity for collaborative research on novel information systems enhancing and supporting geosciences research efforts. NSF encourages true, collaborative partnerships between scientists in computer sciences and the geosciences to meet these challenges.

  8. Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.

    2014-12-01

    The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.

  9. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy's Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy's supercomputing resources and the Internet. Our project's success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.

  10. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  11. Renaissance of the Web

    NASA Astrophysics Data System (ADS)

    McCarty, M.

    2009-09-01

    The renaissance of the web has driven development of many new technologies that have forever changed the way we write software. The resulting tools have been applied to both solve problems and create new ones in a wide range of domains, ranging from monitor-and-control user interfaces to information distribution. This discussion covers which of these technologies are being used in the astronomical computing community, and how they are applied. Topics include JavaScript, Cascading Style Sheets, HTML, XML, JSON, RSS, iCalendar, Java, PHP, Python, Ruby on Rails, database technologies, and web frameworks/design patterns.

  12. Resonantly driven CNOT gate for electron spins

    NASA Astrophysics Data System (ADS)

    Zajac, D. M.; Sigillito, A. J.; Russ, M.; Borjans, F.; Taylor, J. M.; Burkard, G.; Petta, J. R.

    2018-01-01

    To build a universal quantum computer—the kind that can handle any computational task you throw at it—an essential early step is to demonstrate the so-called CNOT gate, which acts on two qubits. Zajac et al. built an efficient CNOT gate by using electron spin qubits in silicon quantum dots, an implementation that is especially appealing because of its compatibility with existing semiconductor-based electronics (see the Perspective by Schreiber and Bluhm). To showcase the potential, the authors used the gate to create an entangled quantum state called the Bell state.
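
    The circuit summarized here is the textbook Bell-state preparation: a Hadamard on the control qubit followed by CNOT. Using the standard matrix definitions (stated from the textbook convention, not taken from the paper itself):

        \mathrm{CNOT} =
        \begin{pmatrix}
        1 & 0 & 0 & 0 \\
        0 & 1 & 0 & 0 \\
        0 & 0 & 0 & 1 \\
        0 & 0 & 1 & 0
        \end{pmatrix},
        \qquad
        \mathrm{CNOT}\,(H \otimes I)\,\lvert 00 \rangle
          = \mathrm{CNOT}\,\tfrac{1}{\sqrt{2}}\bigl(\lvert 00 \rangle + \lvert 10 \rangle\bigr)
          = \tfrac{1}{\sqrt{2}}\bigl(\lvert 00 \rangle + \lvert 11 \rangle\bigr)

    The resulting state cannot be written as a product of single-qubit states, which is why preparing it is a standard benchmark for two-qubit entangling gates.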

  13. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dennis L.

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to facilitate reducing machine configuration time and improving reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.
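
    The standard computation behind such emittance measurements is the quadrupole scan: for a thin-lens quad of strength K followed by a drift of length L to a profile monitor, the squared beam size at the screen is quadratic in K, and the fitted coefficients recover the beam matrix and hence the emittance. A self-contained sketch of that textbook procedure with synthetic numbers (this is not qsUtility's code):

        import numpy as np

        L = 1.0                          # drift length, quad to screen (m), assumed
        K = np.linspace(-2.0, 2.0, 9)    # integrated quad strengths (1/m)

        # Synthetic "measured" rms beam sizes squared at the screen (m^2);
        # in practice these come from profile-monitor fits.
        sig11, sig12, sig22 = 2e-6, -1e-6, 1.5e-6   # beam matrix at the quad
        sigma2 = ((1 + L * K) ** 2 * sig11
                  + 2 * L * (1 + L * K) * sig12
                  + L ** 2 * sig22)

        # Fit sigma^2 = a K^2 + b K + c, then invert for the beam matrix.
        a, b, c = np.polyfit(K, sigma2, 2)
        s11 = a / L ** 2
        s12 = (b - 2 * L * s11) / (2 * L ** 2)
        s22 = (c - s11 - 2 * L * s12) / L ** 2
        emittance = np.sqrt(s11 * s22 - s12 ** 2)
        print(f"emittance = {emittance:.3e} m*rad")  # recovers the input beam matrix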

  14. Nanocrystalline copper films are never flat.

    PubMed

    Zhang, Xiaopu; Han, Jian; Plombon, John J; Sutton, Adrian P; Srolovitz, David J; Boland, John J

    2017-07-28

    We used scanning tunneling microscopy to study low-angle grain boundaries at the surface of nearly planar copper nanocrystalline (111) films. The presence of grain boundaries and their emergence at the film surface create valleys composed of dissociated edge dislocations and ridges where partial dislocations have recombined. Geometric analysis and simulations indicated that valleys and ridges were created by an out-of-plane grain rotation driven by reduction of grain boundary energy. These results suggest that in general, it is impossible to form flat two-dimensional nanocrystalline films of copper and other metals exhibiting small stacking fault energies and/or large elastic anisotropy, which induce a large anisotropy in the dislocation-line energy. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  15. Power to the People: Addressing Big Data Challenges in Neuroscience by Creating a New Cadre of Citizen Neuroscientists.

    PubMed

    Roskams, Jane; Popović, Zoran

    2016-11-02

    Global neuroscience projects are producing big data at an unprecedented rate that informatics and artificial intelligence (AI) analytics simply cannot handle. Online games like Foldit, Eterna, and Eyewire, and now a new neuroscience game, Mozak, are fueling a people-powered research science (PPRS) revolution, creating a global community of "new experts" that over time synergize with computational efforts to accelerate scientific progress, empowering us to use our collective cerebral talents to drive our understanding of our brain. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Brokering Capabilities for EarthCube - supporting Multi-disciplinary Earth Science Research

    NASA Astrophysics Data System (ADS)

    Jodha Khalsa, Siri; Pearlman, Jay; Nativi, Stefano; Browdy, Steve; Parsons, Mark; Duerr, Ruth; Pearlman, Francoise

    2013-04-01

    The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Brokering of data and improvements in discovery and access are key to data exchange and the promotion of collaboration across the geosciences. In this presentation we describe an evolutionary process of infrastructure and interoperability development focused on participation of existing science research infrastructures and augmenting them for improved access. All geosciences communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for leveraging these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. Brokers connect disparate systems with only minimal burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is a governance issue, but is facilitated by infrastructure capabilities that can impact the uptake of new interdisciplinary collaborations and exchange. Thus brokering must address both the cyberinfrastructure and computer technology requirements and also the social issues to allow improved cross-domain collaborations. This is best done through use-case-driven requirements and agile, iterative development methods. It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation of how well science-driven use cases are being met. There is a near-term, and possibly unique, opportunity through EarthCube and European e-Infrastructure projects to increase the impact and interconnectivity of projects. In the developments described in this presentation, brokering has been demonstrated to be an essential part of a robust, adaptive technical infrastructure, and demonstration and user scenarios can address both the governance and detailed implementation paths forward. The EarthCube Brokering roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.

  17. Approximate Computing Techniques for Iterative Graph Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also due to the availability of large-scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale the performance with minimal loss of accuracy, including loop perforation, data caching, incomplete graph coloring and synchronization, and we evaluate their efficiency. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques to enable scalable graph analytics on data of importance to several applications in science, and to be adopted for scaling similar graph algorithms.
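
    Loop perforation, the first heuristic named here, trades accuracy for speed by executing only a sampled subset of loop iterations. A toy serial version applied to the PageRank update loop (illustrative only; the paper targets parallel graph frameworks, and the renormalization step is our simplification):

        import random

        def pagerank_perforated(adj, skip=0.2, d=0.85, iters=30, seed=0):
            """Toy PageRank with loop perforation: each sweep drops a random
            `skip` fraction of source nodes from the update loop, then
            renormalizes the rank vector to compensate for the lost mass."""
            random.seed(seed)
            n = len(adj)
            rank = [1.0 / n] * n
            for _ in range(iters):
                new = [(1.0 - d) / n] * n
                for u, out in enumerate(adj):
                    if not out or random.random() < skip:
                        continue            # dangling node, or perforated iteration
                    share = d * rank[u] / len(out)
                    for v in out:
                        new[v] += share
                total = sum(new)
                rank = [x / total for x in new]
            return rank

        # 4-node example: 0 -> 1, 1 -> 2, 2 -> 0, 3 -> 0 and 2
        print(pagerank_perforated([[1], [2], [0], [0, 2]], skip=0.2))
        # skip=0.0 gives the exact ranking; small skip values approximate it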

  18. The science of visual analysis at extreme scale

    NASA Astrophysics Data System (ADS)

    Nowell, Lucy T.

    2011-01-01

    Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high-performance computing systems will have as many as a million cores by 2020 and support 10 billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially, and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.

  19. Singularity: Scientific containers for mobility of compute.

    PubMed

    Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.

  20. Singularity: Scientific containers for mobility of compute

    PubMed Central

    Kurtzer, Gregory M.; Bauer, Michael W.

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science. PMID:28494014

  1. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns broke themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played on a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part of a multi-media, general education curriculum at the University of Southern California. Throughout this meeting, at the SCEC booth, UseIT interns will be demonstrating both the serious games and SCEC-VDO. SCEC/UseIT is a National Science Foundation Research Experience for Undergraduates site.

  2. FAQ's | College of Engineering & Applied Science

    Science.gov Websites

    zipped (compressed) format. This will help when the file is very large or created by one of the high-end … The labs are generally open 24/7; how will I know when a lab/system …

  3. Incorporating Prototyping and Iteration into Intervention Development: A Case Study of a Dining Hall-Based Intervention

    ERIC Educational Resources Information Center

    McClain, Arianna D.; Hekler, Eric B.; Gardner, Christopher D.

    2013-01-01

    Background: Previous research from the fields of computer science and engineering highlight the importance of an iterative design process (IDP) to create more creative and effective solutions. Objective: This study describes IDP as a new method for developing health behavior interventions and evaluates the effectiveness of a dining hall--based…

  4. Rocket Science for the Internet

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Rainfinity, a company resulting from the commercialization of Reliable Array of Independent Nodes (RAIN), produces the product Rainwall. Rainwall runs on a cluster of computer workstations, creating a distributed Internet gateway. When Rainwall detects a failure in software or hardware, traffic is shifted to a healthy gateway without interruption to Internet service. It also distributes workload more evenly across servers, providing less downtime.

  5. Promoting Continuing Computer Science Education through a Massively Open Online Course

    ERIC Educational Resources Information Center

    Oliver, Kevin

    2016-01-01

    This paper presents the results of a comparison study between graduate students taking a software security course at an American university and international working professionals taking a version of the same course online through a free massive open online course (MOOC) created in the Google CourseBuilder learning environment. A goal of the study…

  6. Student and Staff Perceptions of Key Aspects of Computer Science Engineering Capstone Projects

    ERIC Educational Resources Information Center

    Olarte, Juan José; Dominguez, César; Jaime, Arturo; Garcia-Izquierdo, Francisco José

    2016-01-01

    In carrying out their capstone projects, students use knowledge and skills acquired throughout their degree program to create a product or provide a technical service. An assigned advisor guides the students and supervises the work, and a committee assesses the projects. This study compares student and staff perceptions of key aspects of…

  7. The Role of Technology in Advancing Performance Standards in Science and Mathematics Learning.

    ERIC Educational Resources Information Center

    Quellmalz, Edys

    Technology permeates the lives of most Americans: voice mail, personal computers, and the ever-blinking VCR clock have become commonplace. In schools, it is creating educational opportunities at a dizzying pace and, within and beyond the classroom, it is providing unprecedented access to a universe of ideas and resources. As a next step, the…

  8. Creating Engaging Online Learning Material with the JSAV JavaScript Algorithm Visualization Library

    ERIC Educational Resources Information Center

    Karavirta, Ville; Shaffer, Clifford A.

    2016-01-01

    Data Structures and Algorithms are a central part of Computer Science. Due to their abstract and dynamic nature, they are a difficult topic to learn for many students. To alleviate these learning difficulties, instructors have turned to algorithm visualizations (AV) and AV systems. Research has shown that especially engaging AVs can have an impact…

  9. Realizing Scientific Methods for Cyber Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, Thomas E.; Manz, David O.; Edgar, Thomas W.

    There is little doubt among cyber security researchers about the lack of scientific rigor that underlies much of the literature. The issues are manifold and are well documented. Further complicating the problem is the insufficiency of scientific methods to address these issues. Cyber security melds man and machine: we inherit the challenges of computer science, sociology, psychology, and many other fields, and create new ones where these fields interface. In this paper we detail a partial list of challenges imposed by rigorous science and survey how other sciences have tackled them, in the hope of applying a similar approach to cyber security science. This paper is by no means comprehensive: its purpose is to foster discussion in the community on how we can improve rigor in cyber security science.

  10. Data science ethics in government.

    PubMed

    Drew, Cat

    2016-12-28

    Data science can offer huge opportunities for government. With the ability to process larger and more complex datasets than ever before, it can provide better insights for policymakers and make services more tailored and efficient. As with all new technologies, there is a risk that we do not take up its opportunities and miss out on its enormous potential. We want people to feel confident to innovate with data. So, over the past 18 months, the Government Data Science Partnership has taken an open, evidence-based and user-centred approach to creating an ethical framework. It is a practical document that brings all the legal guidance together in one place, and is written in the context of new data science capabilities. As part of its development, we ran a public dialogue on data science ethics, including deliberative workshops, an experimental conjoint survey and an online engagement tool. The research supported the principles set out in the framework as well as providing useful insight into how we need to communicate about data science. It found that people had a low awareness of the term 'data science', but that showing data science examples can increase broad support for government exploring innovative uses of data. But people's support is highly context-driven: people consider acceptability on a case-by-case basis, first thinking about the overall policy goals and likely intended outcome, and then weighing up privacy and unintended consequences. The ethical framework is a crucial start, but it does not solve all the challenges it highlights, particularly as technology is creating new challenges and opportunities every day. Continued research is needed into data minimization and anonymization, robust data models, algorithmic accountability, and transparency and data security. It has also revealed the need to set out a renewed deal between the citizen and state on data, to maintain and solidify trust in how we use people's data for social good. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  11. Quantitative and Qualitative Evaluation of The Structural Designing of Medical Informatics Dynamic Encyclopedia

    PubMed Central

    Safdari, Reza; Shahmoradi, Leila; Hosseini-beheshti, Molouk-sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad

    2015-01-01

    Introduction: Encyclopedias have become prevalent as a valid cultural medium throughout the world. The rapid development of the computer industry and the expansion of the various sciences have made the compilation of electronic, specialized encyclopedias, especially web-based ones, indispensable. Materials and Methods: This is an applied-developmental study conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined; then the tree diagram of the terms was drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the gathered terms were placed in their related domains. Findings: In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of the sub-branches of informatics is biomedical and health informatics, which itself consists of three subdivisions: bioinformatics, clinical informatics, and health informatics. Medical informatics, a subdivision of clinical informatics, has developed from the three fields of medical sciences, management and social sciences, and computational sciences and mathematics. Results and Discussion: Medical informatics is created from the confluence, fusion, and application of three major scientific branches: health and biological sciences, social and management sciences, and computing and mathematical sciences. Accordingly, the structure of MeSH is weak as a basis for future development of an Encyclopedia of Medical Informatics. PMID:26635440
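
    A note on the method: the term hierarchy described above is essentially a tree assembled from parent-child relations. A minimal Python sketch of how such a tree could be built and printed from MeSH-style term pairs; the pairs below are illustrative, not the study's actual data:

      from collections import defaultdict

      # Hypothetical parent -> child term pairs extracted from MeSH.
      pairs = [("Medical Informatics", "Medical Informatics Applications"),
               ("Medical Informatics", "Medical Informatics Computing"),
               ("Medical Informatics Applications", "Decision Support Systems")]

      tree = defaultdict(list)
      for parent, child in pairs:
          tree[parent].append(child)

      def print_tree(node, depth=0):
          """Depth-first print of the term hierarchy."""
          print("  " * depth + node)
          for child in tree[node]:
              print_tree(child, depth + 1)

      print_tree("Medical Informatics")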

  12. Efficacy of ACA Strategies in Biography-Driven Science Teaching: An Investigation

    ERIC Educational Resources Information Center

    MacDonald, Grizelda L.; Miller, Stuart S.; Murry, Kevin; Herrera, Socorro; Spears, Jacqueline D.

    2013-01-01

    This study explored the biography-driven approach to teaching culturally and linguistically diverse students in science education. Biography-driven instruction (BDI) embraces student diversity by incorporating students' sociocultural, linguistic, cognitive, and academic dimensions of their biographies into the learning process (Herrera in…

  13. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samulyak, Roman V.; Brookhaven National Lab.; Parks, Paul

    The feasibility of the plasma liner driven Magnetized Target Fusion (MTF) via terascale numerical simulations will be assessed. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes on the target, which is in the form of two compact plasma toroids, and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both the traditional laser-driven inertial confinement fusion and solid liner driven MTF, the plasma liner driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High-fidelity numerical simulations of full nonlinear models associated with the plasma liner MTF using state-of-the-art numerical algorithms and terascale computing are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities that ideally suit the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding including SciDAC for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique shock problem. We have studied the implosion of the plasma liner on the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D and other relevant physics, and estimate the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  14. Zooniverse - A Platform for Data-Driven Citizen Science

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lintott, C.; Bamford, S.; Fortson, L.

    2011-12-01

    In July 2007 a team of astrophysicists created a web-based astronomy project called Galaxy Zoo in which members of the public were asked to classify galaxies from the Sloan Digital Sky Survey by their shape. Over the following year a community of more than 150,000 people classified each of the 1 million galaxies more than 50 times. Four years later this community of 'citizen scientists' is more than 450,000 strong and is contributing its time and efforts to more than 10 Zooniverse projects, each with its own science team and research case. With projects ranging from transcribing ancient Greek texts (ancientlives.org) to lunar science (moonzoo.org), the challenges for the Zooniverse community have gone well beyond the relatively simple original Galaxy Zoo interface. Delivering a range of citizen science projects to a large web-based audience presents challenges on a number of fronts, including interface design, data architecture/modelling and reduction techniques, web infrastructure, and software design. In this paper we describe how the Zooniverse team (a collaboration of scientists, software developers, and educators) has developed tools and techniques to solve some of these issues.
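
    The redundancy described above, with each galaxy classified more than 50 times, is what makes volunteer data usable: many independent votes are reduced to one consensus label. A minimal sketch of such a reduction; the vote counts and thresholds are hypothetical, and this is not Zooniverse's actual aggregation code:

      from collections import Counter

      def consensus(classifications, min_votes=10, threshold=0.8):
          """Reduce redundant volunteer classifications of one subject
          (e.g. a galaxy) to a consensus label plus an agreement score."""
          votes = Counter(classifications)
          label, count = votes.most_common(1)[0]
          agreement = count / len(classifications)
          if len(classifications) < min_votes or agreement < threshold:
              return None, agreement      # needs more votes or expert review
          return label, agreement

      # Example: 12 volunteers classify one galaxy's shape.
      labels = ["spiral"] * 10 + ["elliptical"] * 2
      print(consensus(labels))            # ('spiral', 0.833...) -> retired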

  15. ALPS: A Linear Program Solver

    NASA Technical Reports Server (NTRS)

    Ferencz, Donald C.; Viterna, Larry A.

    1991-01-01

    ALPS is a computer program which can be used to solve general linear program (optimization) problems. ALPS was designed for those who have minimal linear programming (LP) knowledge and features a menu-driven scheme to guide the user through the process of creating and solving LP formulations. Once created, the problems can be edited and stored in standard DOS ASCII files to provide portability to various word processors or even other linear programming packages. Unlike many math-oriented LP solvers, ALPS contains an LP parser that reads through the LP formulation and reports several types of errors to the user. ALPS provides a large amount of solution data which is often useful in problem solving. In addition to pure linear programs, ALPS can solve integer, mixed integer, and binary problems. Pure linear programs are solved with the revised simplex method. Integer or mixed integer programs are solved initially with the revised simplex method, and then completed using the branch-and-bound technique. Binary programs are solved with the method of implicit enumeration. This manual describes how to use ALPS to create, edit, and solve linear programming problems. Instructions for installing ALPS on a PC-compatible computer are included in the appendices along with a general introduction to linear programming. A programmer's guide is also included for assistance in modifying and maintaining the program.
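
    To give a feel for the class of problem ALPS addresses, the sketch below formulates and solves a small LP with SciPy's linprog, a modern simplex-family solver. This is not ALPS itself, and the objective and constraints are illustrative:

      from scipy.optimize import linprog

      # Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
      # linprog minimizes, so the objective coefficients are negated.
      c = [-3.0, -2.0]
      A_ub = [[1.0, 1.0],
              [1.0, 3.0]]
      b_ub = [4.0, 6.0]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
      print(res.x, -res.fun)   # optimal point (4, 0), maximized objective 12

    Integer and mixed-integer variants, which ALPS completes with branch-and-bound, would today be handed to a MILP solver such as scipy.optimize.milp.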

  16. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    PubMed

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is through the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages: design, manufacturing, distribution & supply chain, use-phase, and end-of-life, as well as life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject matter. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  17. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed: the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) at Oak Ridge National Laboratory (ORNL).

  18. Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.

    PubMed

    Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K

    2016-03-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in (distinctiveness/sharedness) and likelihood of co-occurrence (correlational strength)--determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.
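
    To make the two feature statistics concrete, here is a deliberately loose toy computation of sharedness and a co-occurrence proxy for correlational strength over a small binary concept-by-feature matrix. The matrix and both definitions are simplified stand-ins, not the authors' measures or their simulation:

      import numpy as np

      # Toy concept-by-feature matrix (rows: concepts, columns: features).
      # Feature 0 is shared by all three concepts; the rest are more distinctive.
      F = np.array([[1, 1, 1, 0],
                    [1, 1, 0, 0],
                    [1, 0, 0, 1]])

      # Sharedness: the number of concepts each feature occurs in.
      sharedness = F.sum(axis=0)               # -> [3, 2, 1, 1]
      distinctiveness = 1.0 / sharedness

      # Rough correlational-strength proxy: average co-occurrence of a
      # feature with the other features of the concepts it appears in.
      co = (F.T @ F).astype(float)
      np.fill_diagonal(co, 0.0)
      corr_strength = co.sum(axis=1) / sharedness

      print(distinctiveness, corr_strength)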

  19. A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics

    NASA Astrophysics Data System (ADS)

    Halpern, Federico

    2017-10-01

    The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating back several decades, before computer simulations were indispensable, and can therefore be suboptimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
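
    The conservation property alluded to above can be stated compactly. In generic notation (ours, not the paper's), a skew-symmetric system conserves quadratic invariants because the quadratic form of a skew-symmetric operator vanishes:

      \frac{d\mathbf{u}}{dt} = A(\mathbf{u})\,\mathbf{u}, \qquad A^{\mathsf{T}} = -A
      \quad\Longrightarrow\quad
      \frac{d}{dt}\Big(\tfrac{1}{2}\,\lVert\mathbf{u}\rVert^{2}\Big)
      = \mathbf{u}^{\mathsf{T}} A(\mathbf{u})\,\mathbf{u} = 0.

    A spatial discretization that preserves the skew-symmetry of A inherits the same identity, which is why the quadratic invariants survive in discrete form.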

  20. Multiplex Recurrence Networks

    NASA Astrophysics Data System (ADS)

    Eroglu, Deniz; Marwan, Norbert

    2017-04-01

    The complex nature of a variety of phenomena in physical, biological, or earth sciences is driven by a large number of degrees of freedom which are strongly interconnected. Although the evolution of such systems is described by multivariate time series (MTS), so far research has mostly focused on analyzing these components one by one. Recurrence-based analyses are powerful methods for understanding the underlying dynamics of a dynamical system and have been used for many successful applications, including examples from earth science, economics, or chemical reactions. The backbone of these techniques is creating the phase space of the system. However, increasing the dimension of a system requires increasing the length of the time series in order to get significant and reliable results. This requirement is one of the challenges in many disciplines, in particular in palaeoclimate; thus, it is not easy to create a phase space from measured MTS due to the limited number of available observations (samples). To overcome this problem, we suggest creating recurrence networks from each component of the system and combining them into a multiplex network structure, the multiplex recurrence network (MRN). We test the MRN by using prototypical mathematical models and demonstrate its use by studying high-dimensional palaeoclimate dynamics derived from pollen data from Bear Lake (Utah, US). By using the MRN, we can distinguish typical climate transition events, e.g., those between Marine Isotope Stages.
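
    A minimal sketch of the building block, one recurrence-network layer per MTS component; the threshold and data are illustrative, and a real analysis would embed each series and apply the recurrence criterion in phase space:

      import numpy as np

      def recurrence_matrix(x, eps):
          """Binary recurrence matrix of a scalar time series:
          R[i, j] = 1 if |x[i] - x[j]| < eps, else 0."""
          d = np.abs(x[:, None] - x[None, :])
          R = (d < eps).astype(int)
          np.fill_diagonal(R, 0)          # drop trivial self-recurrences
          return R

      # One recurrence-network layer per component; the layers share the
      # same node set (time points), which is what makes them a multiplex.
      rng = np.random.default_rng(0)
      mts = rng.standard_normal((3, 200))       # 3 components, 200 samples
      layers = [recurrence_matrix(comp, eps=0.3) for comp in mts]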

  1. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  2. Mission Driven Science at Argonne

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thackery, Michael; Wang, Michael; Young, Linda

    2012-07-05

    Mission driven science at Argonne means applying science and scientific knowledge to a physical and "real world" environment. Examples include testing a theoretical model through the use of formal science or solving a practical problem through the use of natural science. At the laboratory, our materials scientists are leading the way in producing energy solutions today that could help reduce, and eventually eliminate, the energy crisis of tomorrow.

  3. Medical Impairment and Computational Reduction to a Single Whole Person Impairment (WPI) Rating Value

    NASA Astrophysics Data System (ADS)

    Artz, Jerry; Alchemy, John; Weilepp, Anne; Bongiovanni, Michael; Siddhartha, Kumar

    2014-03-01

    A medical, biophysics, and engineering collaboration has produced a standardized cloud-based application for creating automated WPI ratings. The project assigns numerical values to injuries/illnesses in accordance with the American Medical Association Guides to the Evaluation of Permanent Impairment, Fifth Edition, AMA Press (with 63 medical contributors and 89 medical reviewers). The AMA Guides serve as the industry standard for assigning impairment values in 32 US states and 190 other countries. Clinical medical data are collected using a menu-driven user interface and computationally combined into a single numeric value. A medical doctor performs a biometric analysis and enters the quantitative data into a mobile device. The data are analyzed using proprietary validation algorithms, and a WPI impairment rating is created. The findings are embedded into a formalized medicolegal report in a matter of minutes. This particular presentation will concentrate upon the WPI rating of the spine: cervical, thoracic, and lumbar. Both common rating techniques will be presented, i.e., Diagnosis Related Estimates (DRE) and Range of Motion (ROM).
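
    Regional ratings are not simply summed: the Guides combine them so that each additional impairment applies only to the remaining unimpaired fraction, via the combined-values rule C = A + B(1 - A). A minimal sketch; real ratings are taken from the Guides' combined values chart with its rounding conventions:

      def combine_impairments(ratings):
          """Combine regional impairment ratings (fractions in [0, 1]) into
          a single whole-person value, largest first, using A + B*(1 - A)."""
          total = 0.0
          for r in sorted(ratings, reverse=True):
              total += r * (1.0 - total)
          return total

      # e.g. 20% lumbar combined with 10% cervical: 0.2 + 0.1*0.8 = 28% WPI
      print(round(combine_impairments([0.20, 0.10]) * 100))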

  4. An introduction to interactive hypermedia.

    PubMed

    Lynch, P J; Jaffe, C C

    1990-01-01

    Current computers can create and display documents that incorporate a variety of audiovisual media, and can be organized to allow the user, guided by curiosity and not by a fixed path through the material, to move through the information in non-linear pathways. These hypermedia documents and the concept of hypertext offer significant new possibilities for the creation of educational materials for the biomedical sciences. If the full capabilities of the computer are to be used to enhance the educational experience for learners, computer professionals need to collaborate with publishing and teaching professionals. Biomedical communications professionals can and should play a role in establishing and evaluating hypermedia documents for medical education.

  5. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. Copyright © 2014, American Association for the Advancement of Science.

  6. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework.

    PubMed

    Teeguarden, Justin G; Tan, Yu-Mei; Edwards, Stephen W; Leonard, Jeremy A; Anderson, Kim A; Corley, Richard A; Kile, Molly L; Simonich, Staci M; Stone, David; Tanguay, Robert L; Waters, Katrina M; Harper, Stacey L; Williams, David E

    2016-05-03

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the "systems approaches" used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  7. Data-Driven Approaches to Empirical Discovery

    DTIC Science & Technology

    1988-10-31

    Report documentation keywords: empirical discovery; history of science; data-driven heuristics; numeric laws; theoretical terms; scope of laws. Abstract fragment: Machine Discovery and the History of Science. The history of science studies the actual path followed by scientists over the…

  8. Modeling the dynamics of multipartite quantum systems created departing from two-level systems using general local and non-local interactions

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco

    2017-12-01

    Quantum information is an emergent area merging physics, mathematics, computer science, and engineering. To reach its technological goals, it requires adequate approaches for understanding how to combine physical restrictions, computational approaches, and technological requirements to obtain functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, and establishes a dynamics reduction in a natural grammar for bipartite processing based on entangled states.
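
    A generic qubit Hamiltonian consistent with the abstract's "local and non-local interactions" can be written, in notation of our choosing rather than the paper's, as

      H \;=\; \sum_{j}\sum_{\alpha\in\{x,y,z\}} b_{\alpha}^{(j)}\,\sigma_{\alpha}^{(j)}
      \;+\; \sum_{j<k}\,\sum_{\alpha\in\{x,y,z\}} J_{\alpha}^{(jk)}\,
      \sigma_{\alpha}^{(j)}\,\sigma_{\alpha}^{(k)},

    where the first sum is local driving on each two-level system and the second collects non-local two-body couplings; Ising, XX, and Heisenberg interactions are special cases of the coefficients J.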

  9. The EPA Comptox Chemistry Dashboard: A Web-Based Data ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data, but recent developments have focused on a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard, provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms and CAS Registry Numbers. Flexible search capabilities allow for chemical identification…

  10. The EPA CompTox Chemistry Dashboard - an online resource ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data. Recent work has focused on the development of a new architecture that assembles the resources into a single platform. With a focus on delivering access to Open Data streams, web-service integration, and a user-friendly web application, the CompTox Dashboard provides access to data associated with ~720,000 chemical substances. These data include research data in the form of bioassay screening data associated with the ToxCast program, experimental and predicted physicochemical properties, product and functional use information and related data of value to environmental scientists. This presentation will provide an overview of the CompTox Dashboard…

  11. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    PubMed

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development, and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize hits, mine privileged fragments, and design focused libraries. In addition, computational approaches have been applied to study protein-ligand interaction mechanisms and in natural-product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
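
    One standard ingredient of the compound-database screening mentioned above is ligand-based similarity search. A minimal RDKit sketch with made-up molecules; this is generic cheminformatics, not any of the surveyed mTOR pipelines:

      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      # Hypothetical mini-library; a real screen reads thousands of SMILES.
      library = {
          "cmpd_1": "CCOc1ccc2nc(S(N)(=O)=O)sc2c1",
          "cmpd_2": "CC(=O)Oc1ccccc1C(=O)O",
      }
      query = Chem.MolFromSmiles("c1ccc2c(c1)ncn2C")   # stand-in query scaffold

      fp_q = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
      for name, smi in library.items():
          mol = Chem.MolFromSmiles(smi)
          fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
          sim = DataStructs.TanimotoSimilarity(fp_q, fp)
          print(name, round(sim, 3))     # rank hits by similarity to the query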

  12. Toward more transparent and reproducible omics studies through a common metadata checklist and data publications.

    PubMed

    Kolker, Eugene; Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R; Chen, Rui; Choiniere, John; Dearth, Stephen P; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath, Allison; Higdon, Roger; Hutz, Mara H; Janko, Imre; Jiang, Lihua; Joshi, Sanjay; Kel, Alexander; Kemnitz, Joseph W; Kohane, Isaac S; Kolker, Natali; Lancet, Doron; Lee, Elaine; Li, Weizhong; Lisitsa, Andrey; Llerena, Adrian; Macnealy-Koch, Courtney; Marshall, Jean-Claude; Masuzzo, Paola; May, Amanda; Mias, George; Monroe, Matthew; Montague, Elizabeth; Mooney, Sean; Nesvizhskii, Alexey; Noronha, Santosh; Omenn, Gilbert; Rajasimha, Harsha; Ramamoorthy, Preveen; Sheehan, Jerry; Smarr, Larry; Smith, Charles V; Smith, Todd; Snyder, Michael; Rapole, Srikanth; Srivastava, Sanjeeva; Stanberry, Larissa; Stewart, Elizabeth; Toppo, Stefano; Uetz, Peter; Verheggen, Kenneth; Voy, Brynn H; Warnich, Louise; Wilhelm, Steven W; Yandl, Gregory

    2014-01-01

    Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. The omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement.
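
    A checklist of this kind lends itself to mechanical validation before a stand-alone data publication is accepted. A minimal sketch; the field names below are hypothetical placeholders, not the checklist actually proposed in the paper:

      # Hypothetical required fields; the real checklist is defined in the paper.
      REQUIRED = ["study_id", "organism", "omics_type", "instrument",
                  "protocol_ref", "data_license", "contact"]

      def check_metadata(record):
          """Return the checklist fields that are missing or empty."""
          return [f for f in REQUIRED if not record.get(f)]

      record = {"study_id": "OMX-0001", "organism": "E. coli",
                "omics_type": "proteomics", "instrument": "LTQ-Orbitrap"}
      print("missing fields:", check_metadata(record))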

  13. Toward More Transparent and Reproducible Omics Studies Through a Common Metadata Checklist and Data Publications.

    PubMed

    Kolker, Eugene; Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R; Chen, Rui; Choiniere, John; Dearth, Stephen P; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath, Allison; Higdon, Roger; Hutz, Mara H; Janko, Imre; Jiang, Lihua; Joshi, Sanjay; Kel, Alexander; Kemnitz, Joseph W; Kohane, Isaac S; Kolker, Natali; Lancet, Doron; Lee, Elaine; Li, Weizhong; Lisitsa, Andrey; Llerena, Adrian; MacNealy-Koch, Courtney; Marshall, Jean-Claude; Masuzzo, Paola; May, Amanda; Mias, George; Monroe, Matthew; Montague, Elizabeth; Mooney, Sean; Nesvizhskii, Alexey; Noronha, Santosh; Omenn, Gilbert; Rajasimha, Harsha; Ramamoorthy, Preveen; Sheehan, Jerry; Smarr, Larry; Smith, Charles V; Smith, Todd; Snyder, Michael; Rapole, Srikanth; Srivastava, Sanjeeva; Stanberry, Larissa; Stewart, Elizabeth; Toppo, Stefano; Uetz, Peter; Verheggen, Kenneth; Voy, Brynn H; Warnich, Louise; Wilhelm, Steven W; Yandl, Gregory

    2013-12-01

    Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. The omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement.

  14. 50 Years of the Astro-Science Workshop at the Adler Planetarium

    NASA Astrophysics Data System (ADS)

    Hammergren, Mark; Martynowycz, M. W.; Ratliff, G.

    2014-01-01

    Since 1964, the Adler Planetarium has hosted a program for highly motivated and interested high-school students known as the Astro-Science Workshop (ASW). Created in response to the national “call to arms” for improved science education following the stunning launch of Sputnik, ASW was originally conducted as an extracurricular astronomy class on Saturday mornings throughout the school year, for many years under the leadership of Northwestern University professor J. Allen Hynek. A gradual decline in student interest in the 1990s led to a redesign of ASW as a summer program featuring hands-on, student-driven investigation and experimentation. Since 2002, ASW has been organized and taught by graduate student “scientist-educators” and funded through a series of grants from the NSF. For the past seven years, students have designed, built, and flown experiments on helium balloons to altitudes of around 30 km (100,000 feet). Here, as the program reaches its 50th anniversary, we present the history of the Astro-Science Workshop, its context among the small but still vibrant community of post-Sputnik science enrichment programs, and its rich legacy of inspiring generations of astronomers and other explorers.

  15. Using science and psychology to improve the dissemination and evaluation of scientific work.

    PubMed

    Buttliere, Brett T

    2014-01-01

    Here I outline some of what science can tell us about the problems in psychological publishing and how best to address those problems. First, the motivation behind questionable research practices is examined (the desire to get ahead or, at least, not fall behind). Next, behavior modification strategies are discussed, pointing out that reward works better than punishment. Humans are utility seekers, and the implementation of current change initiatives is hindered by high initial buy-in costs and insufficient expected utility. Open science tools aimed at improving science should team up to increase utility while lowering the cost and risk associated with engagement. The best way to realign individual and group motives will probably be to create one centralized, easy-to-use platform, with a profile, a feed of targeted science stories based upon previous system interaction, a sophisticated (public) discussion section, and impact metrics which use the associated data. These measures encourage high-quality review and other prosocial activities while inhibiting self-serving behavior. Some advantages of centrally digitizing communications are outlined, including ways the data could be used to improve the peer review process. Most generally, it seems that decisions about change design and implementation should be theory- and data-driven.

  16. Big Computing in Astronomy: Perspectives and Challenges

    NASA Astrophysics Data System (ADS)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy for instance, the current generation of antenna arrays produces data at Tbits per second, and forthcoming instruments will expand these rates much further. As instruments are increasingly becoming software-based, astronomers will get more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome these challenges. Major problems are emerging due to increases in data rates that are much larger than in storage and transmission capacity, as well as humans being cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing. Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing to enhance the ALMA Observatory with Very-Long Baseline Interferometry capabilities, the Event Horizon Telescope, as well as in the Radio Array of Portable Interferometric Detectors (RAPID) to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle. He holds a distinguished doctorate and a Habilitation degree in Computer Science from the University of Karlsruhe. Contact him at pankrat@mit.edu, victorpankratius.com, or Twitter @vpankratius.

  17. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Astrophysics Data System (ADS)

    Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.

    2011-12-01

    Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also some research and application programs being launched in academia and governments to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, which is an Infrastructure as a Service (IaaS) to deliver on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. NASA Goddard Earth Science Data and Information Service Center (GES DISC) migrated several GES DISC applications to the Nebula as a proof of concept, including: a) the Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data process workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data process workflow to evaluate the Nebula. The AIRS data process workflow consists of a series of algorithms being used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The result shows that Nebula has significantly better performance than the local machine. Much of the difference was due to newer equipment in the Nebula than the legacy computer, which is suggestive of a potential economic advantage beyond elastic power, i.e., access to up-to-date hardware vs. legacy hardware that must be maintained past its prime to amortize the cost. In addition to a trade study of advantages and challenges of porting complex processing to the cloud, a tutorial was developed to enable further progress in utilizing the Nebula for Earth Science applications and understanding better the potential for Cloud Computing in further data- and computing-intensive Earth Science research. In particular, highly bursty computing such as that experienced in the user-demand-driven Giovanni system may become more tractable in a Cloud environment. Our future work will continue to focus on migrating more GES DISC applications and instances, e.g. Giovanni instances, to the Nebula platform and moving mature migrated applications into operation on the Nebula.

  18. Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code

    ERIC Educational Resources Information Center

    Donaldson, Stewart I.

    2005-01-01

    Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…

  19. Current status and future perspectives of electron interactions with molecules, clusters, surfaces, and interfaces [Workshop on Fundamental challenges in electron-driven chemistry; Workshop on Electron-driven processes: Scientific challenges and technological opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, Kurt H.; McCurdy, C. William; Orlando, Thomas M.

    2000-09-01

    This report is based largely on presentations and discussions at two workshops and contributions from workshop participants. The workshop on Fundamental Challenges in Electron-Driven Chemistry was held in Berkeley, October 9-10, 1998, and addressed questions regarding theory, computation, and simulation. The workshop on Electron-Driven Processes: Scientific Challenges and Technological Opportunities was held at Stevens Institute of Technology, March 16-17, 2000, and focused largely on experiments. Electron-molecule and electron-atom collisions initiate and drive almost all the relevant chemical processes associated with radiation chemistry, environmental chemistry, stability of waste repositories, plasma-enhanced chemical vapor deposition, plasma processing of materials for microelectronic devices and other applications, and novel light sources for research purposes (e.g. excimer lamps in the extreme ultraviolet) and in everyday lighting applications. The life sciences are a rapidly advancing field where the important role of electron-driven processes is only now beginning to be recognized. Many of the applications of electron-initiated chemical processes require results in the near term. A large-scale, multidisciplinary and collaborative effort should be mounted to solve these problems in a timely way so that their solution will have the needed impact on the urgent questions of understanding the physico-chemical processes initiated and driven by electron interactions.

  20. Moving code - Sharing geoprocessing logic on the Web

    NASA Astrophysics Data System (ADS)

    Müller, Matthias; Bernard, Lars; Kadner, Daniel

    2013-09-01

    Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility of coordinating development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.
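
    Concretely, a moving-code package descriptor might look like the following structure, whose keys mirror the elements named above (functionality, platform, infrastructure, exploitation rights). The schema is invented for illustration; the paper does not prescribe this exact format.

      # Hypothetical self-describing package descriptor (illustrative schema).
      package = {
          "id": "org.example.ndvi-calculator",
          "version": "1.2.0",
          "functionality": {                 # machine-readable interface
              "operation": "compute_ndvi",
              "inputs": [{"name": "red", "type": "GeoTIFF"},
                         {"name": "nir", "type": "GeoTIFF"}],
              "outputs": [{"name": "ndvi", "type": "GeoTIFF"}],
          },
          "platform": {"runtime": "python>=3.9", "requires": ["rasterio"]},
          "infrastructure": {"min_memory_mb": 512, "parallel": False},
          "license": "Apache-2.0",           # basic exploitation rights
          "payload": "ndvi_calculator-1.2.0.zip",
      }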

  1. Computer-aided discovery of a metal-organic framework with superior oxygen uptake.

    PubMed

    Moghadam, Peyman Z; Islamoglu, Timur; Goswami, Subhadip; Exley, Jason; Fantham, Marcus; Kaminski, Clemens F; Snurr, Randall Q; Farha, Omar K; Fairen-Jimenez, David

    2018-04-11

    Current advances in materials science have resulted in the rapid emergence of thousands of functional adsorbent materials in recent years. This clearly creates multiple opportunities for their potential application, but it also creates the following challenge: how does one identify the most promising structures, among the thousands of possibilities, for a particular application? Here, we present a case of computer-aided material discovery, in which we complete the full cycle from computational screening of metal-organic framework materials for oxygen storage, to identification, synthesis and measurement of oxygen adsorption in the top-ranked structure. We introduce an interactive visualization concept to analyze over 1000 unique structure-property plots in five dimensions and delimit the relationships between structural properties and oxygen adsorption performance at different pressures for 2932 already-synthesized structures. We also report a world-record-holding material for oxygen storage, UMCM-152, which delivers 22.5% more oxygen than the best previously known material, to the best of our knowledge.
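
    The screening step described above amounts to filtering a large structure-property table and ranking the survivors by a predicted performance metric. A minimal pandas sketch; the column names and numbers are invented, not the study's data:

      import pandas as pd

      # Invented property table for a handful of hypothetical structures.
      mofs = pd.DataFrame({
          "name":        ["MOF-A", "MOF-B", "MOF-C"],
          "pore_d_A":    [6.2, 11.4, 9.8],        # pore diameter (angstrom)
          "surface_m2g": [1850, 3200, 3500],      # gravimetric surface area
          "o2_del_gL":   [180.0, 205.0, 240.0],   # simulated O2 deliverable
      })

      # Screen on structural properties, then rank by predicted performance.
      hits = mofs[(mofs.pore_d_A > 8) & (mofs.surface_m2g > 3000)]
      print(hits.sort_values("o2_del_gL", ascending=False))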

  2. A Public + Private Mashup for Computer Science Education

    ERIC Educational Resources Information Center

    Wang, Kevin

    2013-01-01

    Getting called into the boss's office isn't always fun. Memories of trips to the school principal's office flash through one's mind. But the day last year that the author was called in to meet with their division vice president turned out to be a very good day. Executives at his company, Microsoft, had noticed the program he created in his spare…

  3. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

    Grid computing has a great potential to become a standard cyberinfrastructure for life sciences that often require high-performance computing and large data handling, which exceed the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations, for example grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid, and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation for sharing tacit knowledge among a community. By extending the concept of grid from computing grid to knowledge grid, it is possible to make use of a grid not only as sharable computing resources, but also as the time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  4. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.
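
    Legion's actual interface is built around tasks and logical regions, but the in situ pattern itself can be sketched independently of any framework: analysis runs alongside the solver and reduces each timestep in place, so the full field never has to be written out for post-processing. A toy Python version, with all names and the stand-in solver loop invented for illustration:

      import numpy as np

      def in_situ_stats(step, field):
          """Analysis co-located with the simulation: reduce each timestep
          to a few summary statistics instead of writing the field to disk."""
          return {"step": step, "max": float(field.max()),
                  "mean": float(field.mean())}

      summaries = []
      field = np.zeros((128, 128))
      for step in range(100):             # stand-in for a real solver loop
          field += np.random.default_rng(step).standard_normal(field.shape)
          summaries.append(in_situ_stats(step, field))   # no post-hoc I/O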

  5. Development of Multi-Physics Dynamics Models for High-Frequency Large-Amplitude Structural Response Simulation

    NASA Technical Reports Server (NTRS)

    Derkevorkian, Armen; Peterson, Lee; Kolaini, Ali R.; Hendricks, Terry J.; Nesmith, Bill J.

    2016-01-01

    An analytic approach is demonstrated to reveal potential pyroshock-driven dynamic effects causing power losses in the Thermo-Electric (TE) module bars of the Mars Science Laboratory (MSL) Multi-Mission Radioisotope Thermoelectric Generator (MMRTG). This study utilizes high-fidelity finite element analysis with the SIERRA/PRESTO codes to estimate wave propagation effects due to large-amplitude, suddenly applied pyroshock loads in the MMRTG. A high-fidelity model of the TE module bar was created with approximately 30 million degrees of freedom (DOF). First, a quasi-static preload was applied on top of the TE module bar; then transient tri-axial acceleration inputs were simultaneously applied to the preloaded module. The applied input acceleration signals were measured during MMRTG shock qualification tests performed at the Jet Propulsion Laboratory. An explicit finite element solver in the SIERRA/PRESTO computational environment, along with a 3000-processor parallel supercomputing framework at NASA Ames, was used for the simulation. The simulation results were investigated both qualitatively and quantitatively. The predicted shock wave propagation results provide detailed structural responses throughout the TE module bar, and key insights into the dynamic response (i.e., loads, displacements, accelerations) of critical internal spring/piston compression systems, TE materials, and internal component interfaces in the MMRTG TE module bar. They also provide confidence in the viability of this high-fidelity modeling scheme to accurately predict shock wave propagation patterns within complex structures. This analytic approach is envisioned for modeling shock-sensitive hardware susceptible to intense shock environments positioned near shock separation devices in modern space vehicles and systems.
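
    The core of such an explicit solver is a conditionally stable time-stepping scheme. As a concrete illustration (not the SIERRA/PRESTO formulation, which handles millions of coupled DOF), the sketch below advances a single-degree-of-freedom oscillator under a suddenly applied pulse with the classic central-difference method; all parameter values are illustrative.

        # Explicit central-difference integration of m*u'' + c*u' + k*u = f(t).
        m, c, k = 1.0, 0.02, 4.0e4           # mass, damping, stiffness (illustrative)
        dt = 1.0e-4                          # must stay below 2/omega_n for stability

        def f(t):
            # Suddenly applied, short pyroshock-like pulse.
            return 100.0 if t < 5.0e-3 else 0.0

        u_prev, u = 0.0, 0.0                 # displacements at steps n-1 and n
        peak = 0.0
        for n in range(20000):
            t = n * dt
            a = (f(t) - c * (u - u_prev) / dt - k * u) / m
            u_next = 2.0 * u - u_prev + a * dt * dt
            peak = max(peak, abs(u_next))
            u_prev, u = u, u_next
        print(f"peak displacement: {peak:.6f}")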

  6. Re-Awakening the Learner: Creating Learner-Centric, Standards-Driven Schools

    ERIC Educational Resources Information Center

    Stoll, Copper; Giddings, Gene

    2012-01-01

    Transformation of public education requires the reawakening of the sleeping giant in the room: the learners. Students, teachers, and principals must develop a learner-centric, standards-driven school. "Reawakening the Learner" is a guide to creating just such an environment. Continua describe the journey of teachers, teacher leaders, and…

  7. Preface to "Should animal welfare be law or market driven?"

    USDA-ARS?s Scientific Manuscript database

    The Bioethics Symposium, entitled “Should animal welfare be law or market driven?” was held at the joint annual meeting of the American Dairy Science Association, American Society of Animal Science, Poultry Science Association, Asociación Mexicana de Producción Animal, and Canadian Society of Animal...

  8. Proactive monitoring of an onshore wind farm through lidar measurements, SCADA data and a data-driven RANS solver

    NASA Astrophysics Data System (ADS)

    Iungo, Giacomo Valerio; Camarri, Simone; Ciri, Umberto; El-Asha, Said; Leonardi, Stefano; Rotea, Mario A.; Santhanagopalan, Vignesh; Viola, Francesco; Zhan, Lu

    2016-11-01

    Site conditions, such as topography and local climate, as well as wind farm layout, strongly affect the performance of a wind power plant. Predictions of wake interactions and their effects on power production therefore remain a great challenge in wind energy. For this study, an onshore wind turbine array was monitored through lidar measurements, SCADA and met-tower data. Power losses due to wake interactions were estimated to be approximately 4% and 2% of the total power production under stable and convective conditions, respectively. This dataset was then leveraged for the calibration of a data-driven RANS (DDRANS) solver, which is a compelling tool for prediction of wind turbine wakes and power production. DDRANS is characterized by a computational cost as low as that of engineering wake models, with adequate accuracy achieved through data-driven tuning of the turbulence closure model. DDRANS is based on a parabolic formulation and on axisymmetry and boundary layer approximations, which allow low computational costs to be achieved. The turbulence closure consists of a mixing-length model, which is optimally calibrated against the experimental dataset. Assessment of DDRANS is then performed through lidar and SCADA data for different atmospheric conditions. This material is based upon work supported by the National Science Foundation under the I/UCRC WindSTAR, NSF Award IIP 1362033.
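
    To make the closure concrete: in a mixing-length model the eddy viscosity is computed from a length scale and the local shear, and it is that length scale which data-driven calibration tunes. A minimal sketch follows; the wind profile, constants, and one-parameter mixing length are illustrative assumptions, not the paper's actual formulation.

        import numpy as np

        def eddy_viscosity(z, dU_dz, l_m):
            # Mixing-length closure: nu_t(z) = l_m(z)^2 * |dU/dz|.
            return l_m(z) ** 2 * np.abs(dU_dz)

        z = np.linspace(10.0, 150.0, 15)       # heights above ground [m]
        U = 8.0 * (z / 100.0) ** 0.14          # power-law wind speed profile [m/s]
        dU_dz = np.gradient(U, z)

        kappa = 0.40                           # the kind of constant a DDRANS-style fit would tune
        nu_t = eddy_viscosity(z, dU_dz, lambda h: kappa * h)
        print(np.round(nu_t, 3))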

  9. Creating technical heritage object replicas in a virtual environment

    NASA Astrophysics Data System (ADS)

    Egorova, Olga; Shcherbinin, Dmitry

    2016-03-01

    The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. By performing 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects, learning is supported while the replicas are preserved for future generations. Modern approaches based on the wide use of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.

  10. AMP: a science-driven web-based application for the TeraGrid

    NASA Astrophysics Data System (ADS)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional, fully custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
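
    As an illustration of the database-backed pattern described (the web tier records simulation requests for a separate grid back end to pick up), here is a minimal Django-style sketch; all model, field, and view names are hypothetical and not AMP's actual code.

        from django.db import models
        from django.http import JsonResponse

        class SimulationJob(models.Model):
            # The web tier only records the request; a separate back-end
            # process submits queued jobs to the computational resources.
            star_name = models.CharField(max_length=100)
            frequencies = models.TextField()        # observed pulsation frequencies
            status = models.CharField(max_length=20, default="queued")

        def submit_job(request):
            # View bound to a URL in urls.py; returns the queued job's id.
            job = SimulationJob.objects.create(
                star_name=request.POST["star_name"],
                frequencies=request.POST["frequencies"],
            )
            return JsonResponse({"job_id": job.pk, "status": job.status})

    Keeping grid submission out of the request/response cycle is what lets the user interface and the grid interface evolve independently, as the paper emphasizes.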

  11. Computational Toxicology at the US EPA

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information in the Toxicity Reference Database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available…

  12. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly related to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists research are often limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  13. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics

    PubMed Central

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T.; Becich, Michael J.

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology and biomedical informatics are new disciplines that would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (currently Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students into an 8-week summer academy focused on cancer biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics will be critical to assuring their success as leaders in the era of big data and personalized medicine. PMID:24860688

  14. A Distributed Laboratory for Event-Driven Coastal Prediction and Hazard Planning

    NASA Astrophysics Data System (ADS)

    Bogden, P.; Allen, G.; MacLaren, J.; Creager, G. J.; Flournoy, L.; Sheng, Y. P.; Graber, H.; Graves, S.; Conover, H.; Luettich, R.; Perrie, W.; Ramakrishnan, L.; Reed, D. A.; Wang, H. V.

    2006-12-01

    The 2005 Atlantic hurricane season was the most active in recorded history. Collectively, 2005 hurricanes caused more than 2,280 deaths and record damages of over 100 billion dollars. Of the storms that made landfall, Dennis, Emily, Katrina, Rita, and Wilma caused most of the destruction. Accurate predictions of storm-driven surge, wave height, and inundation can save lives and help keep recovery costs down, provided the information gets to emergency response managers in time. The information must be available well in advance of landfall so that responders can weigh the costs of unnecessary evacuation against the costs of inadequate preparation. The SURA Coastal Ocean Observing and Prediction (SCOOP) Program is a multi-institution collaboration implementing a modular, distributed service-oriented architecture for real time prediction and visualization of the impacts of extreme atmospheric events. The modular infrastructure enables real-time prediction of multi-scale, multi-model, dynamic, data-driven applications. SURA institutions are working together to create a virtual and distributed laboratory integrating coastal models, simulation data, and observations with computational resources and high speed networks. The loosely coupled architecture allows teams of computer and coastal scientists at multiple institutions to innovate complex system components that are interconnected with relatively stable interfaces. The operational system standardizes at the interface level to enable substantial innovation by complementary communities of coastal and computer scientists. This architectural philosophy solves a long-standing problem associated with the transition from research to operations. The SCOOP Program thereby implements a prototype laboratory consistent with the vision of a national, multi-agency initiative called the Integrated Ocean Observing System (IOOS). Several service-oriented components of the SCOOP enterprise architecture have already been designed and implemented, including data archive and transport services, metadata registry and retrieval (catalog), resource management, and portal interfaces. SCOOP partners are integrating these at the service level and implementing reconfigurable workflows for several kinds of user scenarios, and are working with resource providers to prototype new policies and technologies for on-demand computing.

  15. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    NASA Astrophysics Data System (ADS)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
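
    At the heart of any non-dominated sorting algorithm is the dominance test over multiple objectives, here a model's fit to each dataset. A minimal sketch of that step follows; the parameter sets and error values are invented for illustration, and the paper's speciated algorithm adds selection and speciation machinery on top of this.

        def dominates(a, b):
            # a, b: tuples of per-dataset errors (lower is better). a dominates b
            # if it is at least as good everywhere and strictly better somewhere.
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def non_dominated_front(population):
            # The first Pareto front: parameter sets no other set dominates.
            return [p for p in population
                    if not any(dominates(q["errors"], p["errors"])
                               for q in population if q is not p)]

        population = [
            {"params": (0.1, 3), "errors": (0.20, 0.35)},
            {"params": (0.4, 2), "errors": (0.25, 0.30)},
            {"params": (0.9, 5), "errors": (0.30, 0.40)},  # dominated by the first
        ]
        print(non_dominated_front(population))   # keeps the first two sets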

  16. Understanding Life: The Evolutionary Dynamics of Complexity and Semiosis

    NASA Astrophysics Data System (ADS)

    Loeckenhoff, Helmut K.

    2010-11-01

    Post-Renaissance sciences created different cultures. To establish an epistemological base, physics was separated from the mental domain, and consciousness was excluded from science. The life sciences were left in between, e.g. between La Mettrie's 'man-machine' (1748) and 'vitalism' [e.g. Bergson 4]. Causative thinking versus intuitive arguing strictly limited comprehensive concepts. Ethology first established a potential shared base for science, proclaiming the 'biology paradigm' in the middle of the 20th century. Initially fostered by cybernetics and the systems sciences, 'constructivist' models prepared a new view of human perception, and thus of scientific 'objectivity', by introducing the 'observer'. Subsequently, computer science triggered the ICT revolution. In turn, ICT helped to develop the chaos and complexity sciences, non-linear mathematics, and their spin-offs in the formal sciences [Spencer-Brown 49], e.g. (proto-)logics. Models of life systems, e.g. anticipatory systems, integrated epistemology with mathematics and anticipatory computing [Dubois 11, 12, 13, 14], connecting them with semiotics. Seminal ideas laid at the turn of the 19th to the 20th century [J. v. Uexküll 53] detected the co-action and co-evolution of environments and life systems. Bio-semiotics ascribed purpose, intent and meaning as essential qualities of life. The concepts of systems biology and qualitative research enriched and developed the anthropologies and humanities. Brain research added models of (higher) consciousness, and an avant-garde is contemplating a science that includes consciousness as an additional base. New insights from the extended qualitative approach led to a reconciliation of the basic assumptions of scientific inquiry, creating the 'epistemological turn'. Paradigmatically, resting on macro-, micro- and recently nano-biology, evolutionary biology produced fresh scripts of evolution [W. Wieser 60, 61]. Its results tie to hypotheses describing the emergence of language, of the human mind and of culture [e.g. R. Logan 34]. These different but related approaches are as yet only loosely connected; recent efforts search for a shared foundation, e.g. in a set of transdisciplinary base models [Loeckenhoff 30, 31]. The domain of pure mental constructions, such as ideologies/religions and spiritual phenomena, will be implied.

  17. The experiences of female high school students and interest in STEM: Factors leading to the selection of an engineering or computer science major

    NASA Astrophysics Data System (ADS)

    Genoways, Sharon K.

    STEM (Science, Technology, Engineering and Math) education creates critical thinkers, increases science literacy, and enables the next generation of innovators, which leads to new products and processes that sustain our economy (Hossain & Robinson, 2012). We have been hearing the warnings for several years that there simply are not enough young scientists entering the STEM professional pathways to replace all of the retiring professionals (Brown, Brown, Reardon, & Merrill, 2011; Harsh, Maltese, & Tai, 2012; Heilbronner, 2011; Scott, 2012). The problem is not necessarily due to a lack of STEM skills and concept proficiency; there also appears to be a lack of interest in these fields. Recent evidence suggests that many of the most proficient students, especially minority students and women, have been gravitating away from science and engineering toward other professions (President's Council of Advisors on Science and Technology, 2010). The purpose of this qualitative research study was to determine how high schools can best prepare and encourage young women for a career in engineering or computer science. This was accomplished by interviewing a pool of 21 women: 5 recent high school graduates planning to major in STEM, 5 college students who had completed at least one full year of coursework in an engineering or computer science major, and 11 professional women who had been employed as an engineer or computer scientist for at least one full year. These women were asked to share the high school courses, activities, and experiences that best prepared them to pursue an engineering or computer science major. Five central themes emerged from this study: coursework in physics and calculus; promotion of STEM camps and clubs; teacher encouragement of STEM capabilities and careers; problem solving, critical thinking and confidence-building activities in the classroom; and allowing students the opportunity to fail and ask questions in a safe environment. These themes may be implemented by any instructor, in any course, who wishes to provide students with the means to succeed in their quest for a STEM career.

  18. Medicinal chemistry in drug discovery in big pharma: past, present and future.

    PubMed

    Campbell, Ian B; Macdonald, Simon J F; Procopiou, Panayiotis A

    2018-02-01

    The changes in synthetic and medicinal chemistry and related drug discovery science as practiced in big pharma over the past few decades are described. These have been driven predominantly by wider changes in society, namely the computer, the internet and globalisation. Thoughts about the future of medicinal chemistry are also discussed, including sharing the risks and costs of drug discovery and the future of outsourcing. The continuing impact of access to substantial computing power and big data, and the use of algorithms in data analysis and drug design, are also presented. The next generation of medicinal chemists will communicate in ways that reflect social media and the results of being constantly connected to each other and to data.

  19. Identifying the relationship between feedback provided in computer-assisted instructional modules, science self-efficacy, and academic achievement

    NASA Astrophysics Data System (ADS)

    Mazingo, Diann Etsuko

    Feedback has been identified as a key variable in developing academic self-efficacy. The type of feedback can vary from a traditional, objectivist approach that focuses on minimizing learner errors to a more constructivist approach focusing on facilitating understanding. The influx of computer-based courses, whether online or through a series of computer-assisted instruction (CAI) modules, requires that current research on effective feedback techniques in the classroom be extended to computer environments in order to inform their instructional design. In this study, exposure to different types of feedback during a chemistry CAI module was studied in relation to science self-efficacy (SSE) and performance on an objective-driven assessment (ODA) of the chemistry concepts covered in the unit. The quantitative analysis consisted of two separate ANCOVAs on the dependent variables, using the pretest as the covariate and group as the fixed factor. No significant differences were found between the three groups in adjusted posttest means for either the ODA or the SSE measure (F(2, 106) = 1.311, p = 0.274 and F(2, 106) = 1.080, p = 0.344, respectively). However, a mixed-methods approach yielded valuable qualitative insights into why only one overall quantitative effect was observed. These findings are discussed in relation to the need to further refine the instruments and methods used in order to more fully explore the possibility that the type of feedback might play a role in developing SSE and, consequently, improve academic performance in science. Future research building on this study may reveal significance that could impact instructional design practices for developing online and computer-based instruction.
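
    For readers wanting to reproduce this style of analysis, the following is a minimal sketch of an ANCOVA with a pretest covariate and a group factor using statsmodels' formula interface; the file and column names are hypothetical placeholders, not the study's actual data.

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Expected columns: posttest (ODA or SSE score), pretest, group (3 feedback types).
        df = pd.read_csv("chemistry_cai_scores.csv")   # hypothetical data file

        model = ols("posttest ~ pretest + C(group)", data=df).fit()
        table = sm.stats.anova_lm(model, typ=2)        # Type II sums of squares
        print(table)                                   # F and p for the group factor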

  1. Investigations of Physical Processes in Microgravity Relevant to Space Electrochemical Power Systems

    NASA Technical Reports Server (NTRS)

    Lvovich, Vadim F.; Green, Robert; Jakupca, Ian

    2015-01-01

    NASA has performed physical science microgravity flight experiments in the areas of combustion science, fluid physics, materials science and fundamental physics research on the International Space Station (ISS) since 2001. The orbital conditions on the ISS provide an environment where gravity-driven phenomena, such as buoyant convection, are nearly negligible. Gravity strongly affects fluid behavior by creating forces that drive motion, shape phase boundaries and compress gases. The need for a better understanding of fluid physics has created a vigorous, multidisciplinary research community whose ongoing vitality is marked by the continuous emergence of new fields in both basic and applied science. In particular, the low-gravity environment offers a unique opportunity for the study of fluid physics and transport phenomena that are very relevant to the management of fluid-gas separations in fuel cell and electrolysis systems. Experiments conducted in space have yielded rich results, providing valuable insights into fundamental fluid and gas phase behavior that apply to space environments and could not be observed in Earth-based labs. As an example, recent capillary flow results have revealed an unexpected sensitivity to symmetric geometries associated with fluid container shape, and identified key regime maps for the design of corner- or wedge-shaped passive gas-liquid phase separators. In this presentation we also briefly review some of the physical science flight experiments, such as boiling studies, that have applicability to electrochemical systems, along with ground-based (drop tower, low-gravity aircraft) microgravity electrochemical research. The same buoyancy and interfacial effects apply to electrochemical power and energy storage systems that perform two-phase separation, such as water-oxygen separation in life support electrolysis and primary space power generation devices such as passive primary fuel cells.

  2. Emerging technologies and perspectives for nutrition research in European Union 7th Framework Programme.

    PubMed

    de Froidmont-Görtz, Isabelle B M

    2009-12-01

    Nutrition trends in Europe are driven by taste, health and convenience. The possibilities of research using new technologies and tools, such as nutrigenomics, imaging techniques, nanotechnology, bioinformatics, cognitive sciences and innovative processes, are very promising for supporting these nutrition trends, and in particular their health aspects. This is supported by European Union research. The opportunities offered in the 7th Framework Programme (FP7), among other innovations, will contribute to the general aim of improving nutrition policy as well as improving products from the food industry, in accordance with the Lisbon strategy to create employment and improve the quality of life of European citizens.

  3. Dropping Out of Computer Science: A Phenomenological Study of Student Lived Experiences in Community College Computer Science

    NASA Astrophysics Data System (ADS)

    Gilbert-Valencia, Daniel H.

    California community colleges produce alarmingly few computer science degree or certificate earners. While the literature shows clear K-12 impediments to CS matriculation in higher education, very little is known about the experiences of those who overcome initial impediments to CS yet do not persist through to program completion. This phenomenological study explores that specific experience by interviewing underrepresented, low-income, first-generation college students who began community college intending to transfer to 4-year institutions majoring in CS but switched to another field and remain enrolled or graduated. This study explores the lived experiences of students facing barriers, their avenues for developing interest in CS, and the persistence support systems they encountered, specifically looking at how students constructed their academic choices from these experiences. The growing diversity within California's population necessitates that experiences specific to underrepresented students be considered as part of this exploration. Ten semi-structured interviews and observations were conducted, transcribed and coded. Artifacts supporting student experiences were also collected. Data were analyzed through a social-constructivist lens to provide insight into these experiences and how they can be navigated to create actionable strategies for community college computer science departments wishing to increase student success. Three major themes emerged from this research: (1) students shared pre-college characteristics; (2) students faced similar challenges in college CS courses; and (3) students shared similar reactions to the "work" of computer science. Results of the study included: (1) CS interest development hinged on computer ownership in the home; (2) participants shared characteristics that were ideal for college success but not CS success; and (3) encounters in CS departments produced unique challenges for participants. Though CS interest was and remains abundant, opportunities for learning programming skills before college were non-existent, and there were few opportunities in college to build skills or establish peer support networks. Recommendations for institutional leaders and further research are also provided.

  4. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Liao, Wei-keng

    Computational science applications have been described as having one of seven motifs (the “seven dwarfs”), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.
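
    To make the "data model" idea concrete, the sketch below shows the kind of self-describing, multi-dimensional storage that HDF5 already provides (here via the h5py Python package) and that libraries like Damsel aim to generalize and scale; the file name, dataset name, and shapes are illustrative.

        import numpy as np
        import h5py

        temps = np.random.rand(4, 64, 64)               # e.g. a time x lat x lon field
        with h5py.File("checkpoint.h5", "w") as f:
            dset = f.create_dataset("temperature", data=temps, chunks=(1, 64, 64))
            dset.attrs["units"] = "K"                   # metadata travels with the data
            dset.attrs["dims"] = "time,lat,lon"

        with h5py.File("checkpoint.h5", "r") as f:
            slab = f["temperature"][2, :, :]            # read a single time slice
            print(slab.shape, f["temperature"].attrs["units"])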

  5. Re-designing an Earth Sciences outreach program for Rhode Island public elementary schools to address new curricular standards and logistical realities in the community

    NASA Astrophysics Data System (ADS)

    Richter, N.; Vachula, R. S.; Pascuzzo, A.; Prilipko Huber, O.

    2017-12-01

    In contrast to middle and high school students, elementary school students in Rhode Island (RI) have no access to dedicated science teachers, resulting in uneven quality and scope of science teaching across the state. In an attempt to improve science education in local public elementary schools, the Department of Earth, Environmental, and Planetary Sciences (DEEPS) at Brown University initiated a student-driven science-teaching program that was supported by an NSF K-12 grant from 2007 to 2014. The program led to the development of an extensive in-house lesson plan database and supported student-led outreach and teaching in several elementary and middle school classrooms. After funding was terminated, the program continued on a volunteer basis, providing year-round science teaching for several second-grade classrooms. During the 2016-2017 academic year, the Next Generation Science Standards (NGSS) were introduced in RI public schools, and it became apparent that our outreach efforts required adaptation to be more efficient and relevant for both elementary school students and teachers. To meet these new needs, DEEPS, in collaboration with the Providence Public School District, created an intensive summer re-design program involving both graduate and undergraduate students. Three multi-lesson units were developed in collaboration with volunteer public school teachers to specifically address NGSS goals for earth science teaching in 2nd, 3rd and 4th grades. In the 2017-2018 academic year, DEEPS students will co-teach the science lessons with the public school teachers in two local elementary schools. At the end of the next academic year, all lesson plans and activities will be made publicly available through a newly designed DEEPS outreach website. We herein detail our efforts to create and implement new educational modules with the goals of: (1) empowering teachers to instruct science, (2) engaging students and fostering lasting STEM interest and competency, (3) optimizing volunteer resources, (4) meeting new state curricular standards, (5) developing publicly available lesson plans for other teachers and outreach programs, (6) institutionalizing the outreach program within the DEEPS community, and (7) cultivating STEM retention at the grassroots level.

  6. Seeing Science through Symmetry

    NASA Astrophysics Data System (ADS)

    Gould, L. I.

    Seeing Through Symmetry is a course that introduces non-science majors to the pervasive influence of symmetry in science. The concept of symmetry is used both as a link between subjects (such as physics, biology, mathematics, music, poetry, and art) and as a method within a subject. This is done through the development and use of interactive multimedia learning environments to stimulate learning. Computer-based labs enable the student to further explore the concept by being gently led from the arts to science. This talk is an update that includes some of the latest changes to the course. Explanations are given of the methodology and of how a variety of interactive multimedia tools contribute to both the lecture and lab portions of the course (created in 1991 and taught almost every semester since then, including one semester in Sweden).

  7. Teaching Astronomy and Computation with Gaia: A New Curriculum for an Extra-curricular High School Program

    NASA Astrophysics Data System (ADS)

    Schwab, Ellianna; Faherty, Jacqueline K.; Barua, Prachurjya; Cooper, Ellie; Das, Debjani; Simone-Gonzalez, Luna; Sowah, Maxine; Valdez, Laura; BridgeUP: STEM

    2018-01-01

    BridgeUP: STEM (BridgeUP) is a program at the American Museum of Natural History (AMNH) that seeks to empower women by providing early-career scientists with research fellowships and high-school aged women with instruction in computer science and algorithmic methods. BridgeUP achieves this goal by employing post-baccalaureate women as Helen Fellows, who, in addition to conducting their own scientific research, mentor and teach high school students from the New York City area. The courses, targeted at early high-school students, are designed to teach algorithmic thinking and scientific methodology through the lens of computational science. In this poster we present the new BridgeUP astronomy curriculum created for 9th and 10th grade girls. The astronomy course we present is designed to introduce basic concepts as well as big data manipulation through a guided exploration of Gaia (DR1). Students learn about measuring astronomical distances through hands-on lab experiments illustrating the brightness/distance relationship, angular size calculations of the height of AMNH buildings, and in-depth Hertzsprung-Russell Diagram activities. Throughout these labs, students increase their proficiency in collecting and analyzing data, while learning to build and share code in teams. The students use their new skills to create color-color diagrams of known co-moving clusters (Oh et al. 2017) in the DR1 dataset using Python, Pandas and Matplotlib. We discuss the successes and lessons learned in the first implementation of this curriculum and show the preliminary work of six of the students, who are continuing with computational astronomy research over the current school year.
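
    As a flavor of the final exercise, the sketch below loads a photometry table for a co-moving group and plots a color-color diagram with pandas and Matplotlib, the same tools named in the abstract; the file and column names are hypothetical placeholders for whatever extract the students actually use.

        import pandas as pd
        import matplotlib.pyplot as plt

        stars = pd.read_csv("comoving_group_photometry.csv")   # hypothetical extract
        # Assume per-star magnitudes in three bands; a color is a band difference.
        stars["g_j"] = stars["g_mag"] - stars["j_mag"]
        stars["j_k"] = stars["j_mag"] - stars["k_mag"]

        plt.scatter(stars["j_k"], stars["g_j"], s=8)
        plt.xlabel("J - K color")
        plt.ylabel("G - J color")
        plt.title("Color-color diagram of a co-moving group")
        plt.savefig("color_color.png", dpi=150)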

  8. Bridging the Sciences of Mindfulness and Romantic Relationships.

    PubMed

    Karremans, Johan C; Schellekens, Melanie P J; Kappen, Gesa

    2017-02-01

    Research on mindfulness, defined as paying conscious and non-judgmental attention to present-moment experiences, has increased rapidly in the past decade but has focused almost entirely on the benefits of mindfulness for individual well-being. This article considers the role of mindfulness in romantic relationships. Although strong claims have been made about the potentially powerful role of mindfulness in creating better relationships, it is less clear whether, when, and how this may occur. This article integrates the literatures on mindfulness and romantic relationship science, and sketches a theory-driven model and future research agenda to test possible pathways of when and how mindfulness may affect romantic relationship functioning. We review some initial direct and indirect evidence relevant to the proposed model. Finally, we discuss the implications of how studying mindfulness may further our understanding of romantic relationship (dys)functioning, and how mindfulness may be a promising and effective tool in couple interventions.

  9. An integrative conceptual framework of disability. New directions for research.

    PubMed

    Tate, Denise G; Pledger, Constance

    2003-04-01

    Advances in research on disability and rehabilitation are essential to creating equal opportunity, economic self-sufficiency, and full participation for persons with disabilities. Historically, such initiatives have focused on separate and specific areas, including neuroscience, molecular biology and genetics, gerontology, engineering and physical sciences, and social and behavioral sciences. Research on persons with disabilities should examine the broader context and trends of society that affect the total environment of persons with disabilities. This article examines the various disability paradigms across time, assessing the relative contribution of the socioecological perspective in guiding research designed to improve the lives of persons with disabilities. The authors recommend new research directions that include a focus on life span issues, biomedicine, biotechnology, the efficacy and effectiveness of current interventions, an emphasis on consumer-driven investigations within a socioecological perspective of disability, and the implications for research and practice.

  10. Conscientious Classification: A Data Scientist's Guide to Discrimination-Aware Classification.

    PubMed

    d'Alessandro, Brian; O'Neil, Cathy; LaGatta, Tom

    2017-06-01

    Recent research has helped to cultivate growing awareness that machine-learning systems fueled by big data can create or exacerbate troubling disparities in society. Much of this research comes from outside of the practicing data science community, leaving its members with little concrete guidance to proactively address these concerns. This article introduces issues of discrimination to the data science community on its own terms. In it, we tour the familiar data-mining process while providing a taxonomy of common practices that have the potential to produce unintended discrimination. We also survey how discrimination is commonly measured, and suggest how familiar development processes can be augmented to mitigate systems' discriminatory potential. We advocate that data scientists should be intentional about modeling and reducing discriminatory outcomes. Without doing so, their efforts will result in perpetuating any systemic discrimination that may exist, but under a misleading veil of data-driven objectivity.
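
    As one concrete example of how discrimination is commonly measured (per the survey the article provides), the sketch below computes a demographic-parity-style disparate impact ratio, i.e. the ratio of positive-prediction rates between two groups; the decisions and group labels are invented for illustration.

        def positive_rate(predictions, groups, group):
            # Share of a group's members receiving the positive decision.
            picked = [p for p, g in zip(predictions, groups) if g == group]
            return sum(picked) / len(picked)

        preds  = [1, 0, 1, 1, 0, 0, 1, 0]        # binary model decisions
        groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

        ratio = positive_rate(preds, groups, "b") / positive_rate(preds, groups, "a")
        # The four-fifths rule used in US employment contexts flags ratios below 0.8.
        print(f"disparate impact ratio: {ratio:.2f}")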

  11. Evolution of a Collaborative Model between Nursing and Computer Science Faculty and a Community Service Organization to Develop an Information System

    PubMed Central

    Carson, Anne; Troy, Douglas

    2007-01-01

    Nursing and computer science students and faculty worked with the American Red Cross to investigate the potential for information technology to provide Red Cross disaster services nurses with improved access to accurate community resources in times of disaster. Funded by a national three-year grant, this interdisciplinary partnership led to field testing of an information system to support local community disaster preparedness at seven Red Cross chapters across the United States. The field test results demonstrate the benefits of the technology and the value of interdisciplinary research. The work also created a sustainable learning and research model for the future. This paper describes the collaborative model employed in this interdisciplinary research and exemplifies the benefits to faculty and students of well-timed interdisciplinary and community collaboration. PMID:18600129

  12. TerraFERMA: The Transparent Finite Element Rapid Model Assembler for multiphysics problems in Earth sciences

    NASA Astrophysics Data System (ADS)

    Wilson, Cian R.; Spiegelman, Marc; van Keken, Peter E.

    2017-02-01

    We introduce and describe a new software infrastructure TerraFERMA, the Transparent Finite Element Rapid Model Assembler, for the rapid and reproducible description and solution of coupled multiphysics problems. The design of TerraFERMA is driven by two computational needs in Earth sciences. The first is the need for increased flexibility in both problem description and solution strategies for coupled problems where small changes in model assumptions can lead to dramatic changes in physical behavior. The second is the need for software and models that are more transparent so that results can be verified, reproduced, and modified in a manner such that the best ideas in computation and Earth science can be more easily shared and reused. TerraFERMA leverages three advanced open-source libraries for scientific computation that provide high-level problem description (FEniCS), composable solvers for coupled multiphysics problems (PETSc), and an options handling system (SPuD) that allows the hierarchical management of all model options. TerraFERMA integrates these libraries into an interface that organizes the scientific and computational choices required in a model into a single options file from which a custom compiled application is generated and run. Because all models share the same infrastructure, models become more reusable and reproducible, while still permitting the individual researcher considerable latitude in model construction. TerraFERMA solves partial differential equations using the finite element method. It is particularly well suited for nonlinear problems with complex coupling between components. TerraFERMA is open-source and available at http://terraferma.github.io, which includes links to documentation and example input files.
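
    To give a sense of the high-level problem description that TerraFERMA inherits from FEniCS, here is a standalone legacy-FEniCS (dolfin) sketch of a Poisson problem written in near-mathematical notation. This is not TerraFERMA input, which is organized through an options file; it only illustrates the underlying style.

        from dolfin import *

        mesh = UnitSquareMesh(32, 32)                 # unit-square domain
        V = FunctionSpace(mesh, "Lagrange", 1)        # piecewise-linear elements

        u, v = TrialFunction(V), TestFunction(V)
        f = Constant(1.0)                             # source term
        a = dot(grad(u), grad(v)) * dx                # weak form of -laplace(u) = f
        L = f * v * dx
        bc = DirichletBC(V, Constant(0.0), "on_boundary")

        uh = Function(V)
        solve(a == L, uh, bc)                         # assemble and solve (PETSc backend by default)
        print(uh.vector().norm("l2"))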

  13. Investigating the Effect of Argument-Driven Inquiry in Laboratory Instruction

    ERIC Educational Resources Information Center

    Demircioglu, Tuba; Ucar, Sedat

    2015-01-01

    The aim of this study is to investigate the effect of argument-driven inquiry (ADI) based laboratory instruction on the academic achievement, argumentativeness, science process skills, and argumentation levels of pre-service science teachers in the General Physics Laboratory III class. The study was conducted with 79 pre-service science teachers.…

  14. XMM-Newton Remote Interface to Science Analysis Software: First Public Version

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Gabriel, C.

    2011-07-01

    We present the first public beta release of the XMM-Newton Remote Interface to Science Analysis (RISA) software, available through the official XMM-Newton web pages. In a nutshell, RISA is a web-based application that encapsulates the XMM-Newton data analysis software. The client identifies observations and creates XMM-Newton workflows; the server processes the client request, creates job templates and sends the jobs to a computer. RISA has been designed to help both non-expert and professional XMM-Newton users. Thanks to the predefined threads, non-expert users can easily produce light curves and spectra; expert users, on the other hand, can use the full parameter interface to tune their own analysis. In both cases, the VO-compliant client/server design frees users from having to install any specific software to analyze XMM-Newton data.

  15. Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugent, Peter E.; Simonson, J. Michael

    2011-10-24

    This report is based on the Department of Energy (DOE) Workshop on “Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery,” held at the Bethesda Marriott in Maryland on October 24-25, 2011. The workshop brought together leading researchers from the Basic Energy Sciences (BES) facilities and Advanced Scientific Computing Research (ASCR), and was co-sponsored by these two Offices to identify opportunities and needs for data analysis, ownership, storage, mining, provenance and data transfer at light sources, neutron sources, microscopy centers and other facilities. Their charge was to identify current and anticipated issues in the acquisition, analysis, communication and storage of experimental data that could impact the progress of scientific discovery; to ascertain what knowledge, methods and tools are needed to mitigate present and projected shortcomings; and to create the foundation for information exchanges and collaboration between ASCR- and BES-supported researchers and facilities. The workshop was organized in the context of the impending data tsunami that will be produced by DOE’s BES facilities. Current facilities, like SLAC National Accelerator Laboratory’s Linac Coherent Light Source, can produce up to 18 terabytes (TB) per day, while upgraded detectors at Lawrence Berkeley National Laboratory’s Advanced Light Source will generate ~10 TB per hour, and these rates are expected to increase by over an order of magnitude in the coming decade. The urgency of developing new strategies and methods to stay ahead of this deluge and extract the most science from these facilities was recognized by all. The four focus areas addressed in this workshop were:
    - Workflow Management (Experiment to Science): identifying and managing the data path from experiment to publication.
    - Theory and Algorithms: recognizing the need for new tools for computation at scale, supporting large data sets and realistic theoretical models.
    - Visualization and Analysis: supporting near-real-time feedback for experiment optimization and new ways to extract and communicate critical information from large data sets.
    - Data Processing and Management: outlining needs in the computational and communication approaches and infrastructure required to handle unprecedented data volume and information content.
    Almost all participants recognized that turn-key solutions were unlikely, given the unique, diverse nature of the BES community, where research at adjacent beamlines at a given light source facility often spans everything from biology to materials science to chemistry using scattering, imaging and/or spectroscopy. However, it was also noted that advances supported by other programs in data research, methodologies, and tool development could be implemented on reasonable time scales with modest effort; adapting available standard file formats, robust workflows, and in-situ analysis tools to user facility needs could pay long-term dividends. Workshop participants assessed current requirements as well as future challenges and made the following recommendations toward the ultimate goal of enabling transformative science in current and future BES facilities:
    - Integrate theory and analysis components seamlessly within the experimental workflow.
    - Develop new algorithms for data analysis based on common data formats and toolsets.
    - Move the analysis closer to the experiment, to enable real-time (in-situ) streaming capabilities, live visualization of the experiment and an increase in overall experimental efficiency.
    - Match data management access and capabilities with advancements in detectors and sources: remove bottlenecks, provide interoperability across different facilities/beamlines, and apply forefront mathematical techniques to extract science from the experiments more efficiently.
    This workshop report examines and reviews the status of several BES facilities and highlights the successes and shortcomings of the current data and communication pathways for scientific discovery. It then ascertains what methods and tools are needed to mitigate present and projected data bottlenecks to science over the next 10 years. The goal of this report is to create the foundation for information exchanges and collaborations among ASCR- and BES-supported researchers, the BES scientific user facilities, and ASCR computing and networking facilities. To jumpstart these activities, there was a strong desire to see a joint effort between ASCR and BES along the lines of the highly successful Scientific Discovery through Advanced Computing (SciDAC) program, in which integrated teams of engineers, scientists and computer scientists would tackle a complete end-to-end workflow solution at one or more beamlines and ascertain what challenges must be addressed to handle future increases in data.

  16. Computer-assisted learning in medicine. How to create a novel software for immunology.

    PubMed

    Colsman, Andreas; Sticherling, Michael; Stöpel, Claus; Emmrich, Frank

    2006-06-01

    Teaching medical topics is increasingly demanding due to the permanent progress in the medical sciences. Simultaneously, software applications are rapidly advancing with regard to their availability and ease of use. Here a novel teaching program is presented for immunology, one of the fastest-expanding topics in the medical sciences. The requirements of media didactics were transferred to this e-learning tool for German students. After implementation, medical students evaluated the software, and the different learning approaches met with acceptance. Altogether, this novel software compares favourably with other English-language e-learning tools available on the Internet.

  17. Executable research compendia in geoscience research infrastructures

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel

    2017-04-01

    From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and connect key stakeholders (scientists, publishers, librarians). They are demonstrated using developments by the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References: [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66. [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC. [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf.

  18. Inspired by design and driven by innovation. A conceptual model for radical design driven as a sustainable business model for Malaysian furniture design

    NASA Astrophysics Data System (ADS)

    Yusof, Wan Zaiyana Mohd; Fadzline Muhamad Tamyez, Puteri

    2018-04-01

    The definition of innovation does not help entrepreneurs, business people, or innovators truly grasp what it means to innovate; hence we hear that governments have spent millions of ringgit on "innovation" by doing R&D, yet the results have been of no avail in terms of commercial value. Innovation can be defined as the exploitation or commercialization of an idea or invention to create economic or social value. Most entrepreneurs and business managers regard innovation as creating economic value, forgetting that innovation also creates value for society and the environment. The ultimate goal of an entrepreneur, inventor, or researcher is to exploit innovation to create value. As changes happen in society and the economy, organizations and enterprises have to keep up, and this requires innovation. This conceptual paper studies radical design-driven innovation in the Malaysian furniture industry as a business model; the overall aim of the study is to examine radical design-driven innovation in Malaysia and how it compares with findings from Western studies. The paper will familiarize readers with innovation and describe the radical design-driven perspective adopted in its conceptual framework and design process.

  19. ORBIT: an integrated environment for user-customized bioinformatics tools.

    PubMed

    Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M

    1999-10-01

    There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an HTML form page) cannot be customized from the client side, as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. This implies, however, that a user cannot set advanced program parameters as defaults on the form, or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) in which individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independently of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. Interface customization is relatively quick (between 10 and 60 min), and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.
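
    The customization idea can be sketched in a few lines. The following Python fragment is a hypothetical, minimal analogue of a client-side interface definition for a command-line-driven, server-side program; ORBIT itself is a Java system, and the endpoint URL, tool name, and parameter names below are assumptions for illustration only.

        import json
        import urllib.request

        # Hypothetical per-user interface definition: the user, not the
        # service provider, chooses the defaults and the editable fields.
        user_interface = {
            "tool": "blastall",                       # server-side program
            "defaults": {"-p": "blastn", "-e": "1e-5", "-b": "50"},
            "editable": ["-e", "-b"],                 # fields this user exposes
        }

        def submit_job(server_url, interface, overrides=None):
            """Merge user defaults with overrides and post the job."""
            params = dict(interface["defaults"])
            for key, value in (overrides or {}).items():
                if key not in interface["editable"]:
                    raise ValueError(f"parameter {key} is fixed in this interface")
                params[key] = value
            payload = json.dumps({"tool": interface["tool"],
                                  "params": params}).encode()
            request = urllib.request.Request(
                server_url, data=payload,
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(request) as response:
                return response.read().decode()

        # Example: run with a stricter E-value without touching other settings.
        # submit_job("http://example.org/orbit/jobs", user_interface, {"-e": "1e-10"})

    Because the interface definition lives on the client, two users can expose entirely different defaults and editable fields for the same server-side program, which is the essence of the ORBIT approach.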

  20. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and the intrinsic formative processes of real-world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms, where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design, while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach-scale river morphology. No existing approach, however, allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework drawing from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section, each represented by mathematical functions termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility, examples are provided for mountain, lowland, and hybrid synthetic river valleys. To conclude, recommended advances such as multithread channels are discussed along with potential applications.
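
    The geometric-element idea can be illustrated with a short sketch. The following Python fragment composes three illustrative geometric element equations (a sinusoidal planform, a linearly sloping profile, and a parabolic cross section) into a synthetic valley surface; the functional forms and parameter values are assumptions for illustration, not the equations of the paper.

        import numpy as np

        # Illustrative geometric element equations:
        #   planform  y_c(x): sinusoidal channel centerline
        #   profile   z_t(x): linearly sloping thalweg elevation
        #   section   d(n):   parabolic depth vs. cross-channel offset n

        def centerline(x, amplitude=20.0, wavelength=200.0):
            """Planform: lateral position of the channel centerline."""
            return amplitude * np.sin(2.0 * np.pi * x / wavelength)

        def thalweg(x, slope=0.002, z0=100.0):
            """Profile: thalweg elevation falling downstream at constant slope."""
            return z0 - slope * x

        def section_depth(offset, half_width=15.0, max_depth=2.0):
            """Cross section: parabolic, deepest at the centerline, zero at banks."""
            return np.maximum(max_depth * (1.0 - (offset / half_width) ** 2), 0.0)

        # Compose the elements on a regular grid to synthesize valley topography.
        x = np.linspace(0.0, 1000.0, 501)      # streamwise coordinate (m)
        y = np.linspace(-100.0, 100.0, 201)    # cross-valley coordinate (m)
        X, Y = np.meshgrid(x, y)
        offset = Y - centerline(X)             # distance from channel centerline
        Z = thalweg(X) + 2.0 - section_depth(offset)  # floodplain max_depth above bed
        # A geomorphic covariance structure could now couple element parameters,
        # e.g. widening the section or deepening the pool at planform bends.

    Because each element is an explicit function, a designer can iterate a prescribed surface simply by changing a parameter (e.g. the planform wavelength) and re-evaluating the grid, which is the rapid-iteration property the abstract emphasizes.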
