Sample records for large-scale science projects

  1. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    ERIC Educational Resources Information Center

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  2. The epistemic culture in an online citizen science project: Programs, antiprograms and epistemic subjects.

    PubMed

    Kasperowski, Dick; Hillman, Thomas

    2018-05-01

    In the past decade, some areas of science have begun turning to masses of online volunteers through open calls for generating and classifying very large sets of data. The purpose of this study is to investigate the epistemic culture of a large-scale online citizen science project, the Galaxy Zoo, that turns to volunteers for the classification of images of galaxies. For this task, we chose to apply the concepts of programs and antiprograms to examine the 'essential tensions' that arise in relation to the mobilizing values of a citizen science project and the epistemic subjects and cultures that are enacted by its volunteers. Our premise is that these tensions reveal central features of the epistemic subjects and distributed cognition of epistemic cultures in these large-scale citizen science projects.

  3. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources, such as computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  4. Development and Large-Scale Validation of an Instrument to Assess Arabic-Speaking Students' Attitudes toward Science

    ERIC Educational Resources Information Center

    Abd-El-Khalick, Fouad; Summers, Ryan; Said, Ziad; Wang, Shuai; Culbertson, Michael

    2015-01-01

    This study is part of a large-scale project focused on "Qatari students' Interest in, and Attitudes toward, Science" (QIAS). QIAS aimed to gauge Qatari student attitudes toward science in grades 3-12, examine factors that impact these attitudes, and assess the relationship between student attitudes and prevailing modes of science…

  5. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.

    PubMed

    Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Citizen Science (CS) as a term encompasses a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created to provide evidence to answer this question in the context of coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale and long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.

  6. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environments and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the continuing need for additional large-scale water structures in the developing world, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning and management, with proper consideration of potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").

  7. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER

    PubMed Central

    Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Background: Citizen Science (CS) as a term encompasses a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information: The pilot CS project COMBER was created to provide evidence to answer this question in the context of coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale and long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507

  8. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term "New Big Science" refers to a phase shift in the kind of large-scale science carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators, rather than high-energy physics accelerators, became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  9. Primary Teachers Conducting Inquiry Projects: Effects on Attitudes towards Teaching Science and Conducting Inquiry

    ERIC Educational Resources Information Center

    van Aalderen-Smeets, Sandra I.; Walma van der Molen, Juliette H.; van Hest, Erna G. W. C. M.; Poortman, Cindy

    2017-01-01

    This study used an experimental, pretest-posttest control group design to investigate whether participation in a large-scale inquiry project would improve primary teachers' attitudes towards teaching science and towards conducting inquiry. The inquiry project positively affected several elements of teachers' attitudes. Teachers felt less anxious…

  10. What Will the Neighbors Think? Building Large-Scale Science Projects Around the World

    ScienceCinema

    Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug

    2017-12-22

    Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.

  11. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment and larger data sets than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.

  12. The Power of Engaging Citizen Scientists for Scientific Progress

    PubMed Central

    Garbarino, Jeanne; Mason, Christopher E.

    2016-01-01

    Citizen science has become a powerful force for scientific inquiry, providing researchers with access to a vast array of data points while connecting nonscientists to the authentic process of science. This citizen-researcher relationship creates an incredible synergy, allowing for the creation, execution, and analysis of research projects that would otherwise prove impossible in traditional research settings, namely due to the scope of needed human or financial resources (or both). However, citizen-science projects are not without their challenges. For instance, as projects are scaled up, there is concern regarding the rigor and usability of data collected by citizens who are not formally trained in research science. While these concerns are legitimate, we have seen examples of highly successful citizen-science projects from multiple scientific disciplines that have enhanced our collective understanding of science, such as how RNA molecules fold or determining the microbial metagenomic snapshot of an entire public transportation system. These and other emerging citizen-science projects show how improved protocols for reliable, large-scale science can realize both an improvement of scientific understanding for the general public and novel views of the world around us. PMID:27047581

  13. Science Diplomacy in Large International Collaborations

    NASA Astrophysics Data System (ADS)

    Barish, Barry C.

    2011-04-01

    What opportunities and challenges does the rapidly growing internationalization of science, especially large-scale science and technology projects, present for US science policy? On one hand, the interchange of scientists, the sharing of technology and facilities, and working together on common scientific goals promote better understanding and better science. On the other hand, challenges arise because the science cannot be divorced from government policies, and solutions must be found for issues ranging from visas to making reliable international commitments.

  14. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks and algorithms for the dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks, and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.

  15. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  16. The Aeolus project: Science outreach through art.

    PubMed

    Drumm, Ian A; Belantara, Amanda; Dorney, Steve; Waters, Timothy P; Peris, Eulalia

    2015-04-01

    With a general decline in the number of people choosing to pursue science and engineering degrees, there has never been a greater need to raise awareness of lesser-known fields such as acoustics. Given this context, a large-scale public engagement project, the 'Aeolus project', was created to raise awareness of acoustics science through a major collaboration between an acclaimed artist and acoustics researchers. It centred on touring the large singing sculpture Aeolus during 2011/12, though the project also included an extensive outreach programme of talks, exhibitions, community workshops and resources for schools. Described here are the motivations behind the project and the artwork itself, the ways in which scientists and an artist collaborated, and the public engagement activities designed as part of the project. Evaluation results suggest that the project achieved its goal of inspiring interest in the discipline of acoustics through the exploration of an other-worldly work of art. © The Author(s) 2013.

  17. Living the lesson: can the Lifestyle Project be used to achieve deep learning in environmental earth science?

    NASA Astrophysics Data System (ADS)

    Padden, M.; Whalen, K.

    2013-12-01

    Students in a large, second-year environmental earth science class made significant changes to their daily lives over a three-week period to learn how small-scale actions interact with global-scale issues such as water and energy supplies, waste management and agriculture. The Lifestyle Project (Kirk and Thomas, 2003) was slightly adapted to fit a large-class setting (350 students). Students made changes to their lifestyle in self-selected categories (water, home heating, transportation, waste, food) and kept journals over a three-week period as the changes increased in difficulty. The goal of this study is to understand which aspects of the project played a pivotal role in long-term learning. Content analysis of the journal entries and follow-up interviews are used to investigate whether the Lifestyle Project is having a lasting impact on the students 18 months after the initial assignment.

  18. Performance Assessments in Science: Hands-On Tasks and Scoring Guides.

    ERIC Educational Resources Information Center

    Stecher, Brian M.; Klein, Stephen P.

    In 1992, RAND received a grant from the National Science Foundation to study the technical quality of performance assessments in science and to evaluate their feasibility for use in large-scale testing programs. The specific goals of the project were to assess the reliability and validity of hands-on science testing and to investigate the cost and…

  19. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  20. The Snowmastodon Project: cutting-edge science on the blade of a bulldozer

    USGS Publications Warehouse

    Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.

    2015-01-01

    Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.

  21. The Use of the Nature of Scientific Knowledge Scale as an Entrance Assessment in a Large, Online Citizen Science Project

    NASA Astrophysics Data System (ADS)

    Price, Aaron

    2010-01-01

    Citizen Sky is a three-year astronomical citizen science project launched in June 2009 with funding from the National Science Foundation. This paper reports on early results of an assessment delivered to 1000 participants when they first joined the project. The goal of the assessment, based on the Nature of Scientific Knowledge Scale (NSKS), is to characterize their attitudes towards the nature of scientific knowledge. The NSKS components of the assessment achieved high levels of reliability. Both reliability and overall scores fall within the range reported from other NSKS studies in the literature. Correlation analysis with other components of the assessment reveals that some factors, such as age and understanding of scientific evidence, may be reflected in scores on subscales of NSKS items. Further work will be done using online discourse analysis and interviews. Overall, we find that the NSKS can be used as an entrance assessment for an online citizen science project.

  22. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve these goals, the following technical tasks were planned and executed from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. 
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in various projects within the center's large multi-disciplinary research effort. These students and researchers are now employed by the wind industry, national labs and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  23. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey, rather than hypothesis-driven, mode of observation to generate comprehensive, high quality data on a large-scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. 
Finally, community governance structures were put in place to ensure a focus on science needs and goals, to provide an informed review of the project's results, and to carefully balance consistency of observations with technical evolution. We will summarize lessons learned from USArray and how these can be applied to future efforts such as SZO.

  24. A general method for large-scale fabrication of Cu nanoislands/dragonfly wing SERS flexible substrates

    NASA Astrophysics Data System (ADS)

    Wang, Yuhong; Wang, Mingli; Shen, Lin; Zhu, Yanying; Sun, Xin; Shi, Guochao; Xu, Xiaona; Li, Ruifeng; Ma, Wanli

    2018-01-01

    No abstract available. Project supported by the Youth Fund Project of University Science and Technology Plan of Hebei Provincial Department of Education, China (Grant No. QN2015004) and the Doctoral Fund of Yanshan University, China (Grant No. B924).

  25. The Human Genome Project: big science transforms biology and medicine.

    PubMed

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  26. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of Their High School Science Classroom

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2016-01-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars more…

  27. Trinity to Trinity 1945-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moniz, Ernest; Carr, Alan; Bethe, Hans

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer, Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today's advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  8. Trinity to Trinity 1945-2015

    ScienceCinema

    Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John

    2018-01-16

The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer, Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today’s advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  9. Biological science in conservation

    Treesearch

    David M. Johns

    2000-01-01

    Large-scale wildlands reserve systems offer one of the best hopes for slowing, if not reversing, the loss of biodiversity and wilderness. Establishing such reserves requires both sound biology and effective advocacy. The attempt by The Wildlands Project and its cooperators to meld science and advocacy in the service of conservation is working, but is not without some...

  10. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

    Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  11. Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.

    PubMed

    Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke

    2018-05-21

    Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad extent, fine grain approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
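    The quality-control route the authors describe for massive one-off participation (algorithm voting and statistical pruning of volunteer classifications) can be illustrated with a minimal sketch. The galaxy labels, item names, and thresholds below are hypothetical, not drawn from any particular project:

    ```python
    from collections import Counter

    def consensus_classify(votes_by_item, min_votes=5, min_agreement=0.6):
        """Aggregate volunteer classifications by plurality vote.

        votes_by_item maps an item id to the list of labels volunteers
        assigned it. Items with too few votes, or too little agreement,
        are pruned (returned as None) rather than admitted to the
        science dataset.
        """
        results = {}
        for item, votes in votes_by_item.items():
            if len(votes) < min_votes:
                results[item] = None  # too few classifications to trust
                continue
            label, count = Counter(votes).most_common(1)[0]
            # statistical pruning: keep only high-consensus labels
            results[item] = label if count / len(votes) >= min_agreement else None
        return results

    votes = {
        "galaxy_001": ["spiral"] * 8 + ["elliptical"] * 2,  # strong consensus
        "galaxy_002": ["spiral", "elliptical", "merger"],    # too few votes
        "galaxy_003": ["spiral"] * 3 + ["elliptical"] * 3,   # split vote
    }
    print(consensus_classify(votes))
    ```

    Real projects layer further checks on top of this (per-volunteer weighting, gold-standard items, computational modeling), but the core idea is the same: simple tasks at scale, with consensus filtering applied post hoc.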

  12. Investigating the Impact of NGSS-Aligned Professional Development on PreK-3 Teachers' Science Content Knowledge and Pedagogy

    ERIC Educational Resources Information Center

    Tuttle, Nicole; Kaderavek, Joan N.; Molitor, Scott; Czerniak, Charlene M.; Johnson-Whitt, Eugenia; Bloomquist, Debra; Namatovu, Winnifred; Wilson, Grant

    2016-01-01

    This pilot study investigates the impact of a 2-week professional development Summer Institute on PK-3 teachers' knowledge and practices. This Summer Institute is a component of [program], a large-scale early-childhood science project that aims to transform PK-3 science teaching. The mixed-methods study examined concept maps, lesson plans, and…

  13. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
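    The core pattern of in situ processing can be sketched in a few lines: rather than writing every full field to disk for post hoc analysis, the simulation reduces each field to small summary products while it is still resident in memory. The diffusion-like update below is a hypothetical stand-in for a real science code, not anything from the project itself:

    ```python
    import random
    import statistics

    def simulate_with_in_situ_analysis(n_steps=100, n_cells=1024, analyze_every=10):
        """Toy time-stepping loop with in situ reduction.

        Only a short time series of summaries is retained, instead of
        n_steps full snapshots of the field.
        """
        rng = random.Random(0)
        field = [rng.gauss(0.0, 1.0) for _ in range(n_cells)]
        series = []  # small in situ products, not full snapshots
        for step in range(n_steps):
            # stand-in physics: nearest-neighbour smoothing plus small noise
            field = [0.5 * field[i] + 0.25 * field[i - 1]
                     + 0.25 * field[(i + 1) % n_cells]
                     + 0.01 * rng.gauss(0.0, 1.0)
                     for i in range(n_cells)]
            if step % analyze_every == 0:
                # in situ analysis: O(1)-sized summary instead of O(n_cells) output
                series.append({"step": step,
                               "mean": statistics.fmean(field),
                               "max": max(field)})
        return series

    summaries = simulate_with_in_situ_analysis()
    print(len(summaries))  # 10 summaries retained instead of 100 full fields
    ```

    The I/O saving here is a factor of roughly n_cells per analyzed step; at the scales the report describes, that gap is exactly what makes post hoc analysis of every snapshot infeasible.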

  14. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data-integration methodologies. In this introduction we review the definition of data integration and characterize current efforts in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  15. An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.

    PubMed

    2012-11-01

    Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.

  16. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing highly demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements in the form of tailored High Performance Computing (HPC) facilities require a greener approach to the Information and Communications Technologies (ICT) adopted for the data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs, improving system power monitoring, and managing the generation of electricity at system level are paramount to avoiding future inefficiencies and higher costs, and to enabling fulfillment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  17. APDA's Contribution to Current Research and Citizen Science

    NASA Astrophysics Data System (ADS)

    Barker, Thurburn; Castelaz, M. W.; Cline, J. D.; Hudec, R.

    2010-01-01

    The Astronomical Photographic Data Archive (APDA) is dedicated to the collection, restoration, preservation, and digitization of astronomical photographic data that can eventually be accessed via the Internet by the global community of scientists, researchers and students. Located on the Pisgah Astronomical Research Institute campus, APDA now includes collections from North America totaling more than 100,000 photographic plates and films. Two new large-scale research projects and one citizen science project have now been developed from the archived data. One unique photographic data collection covering the southern hemisphere contains the signatures of diffuse interstellar bands (DIBs) within the stellar spectra on objective prism plates. We plan to digitize the spectra, identify the DIBs, and map out the large-scale spatial extent of DIBs. The goal is to understand the Galactic environment suitable to the DIB molecules. Another collection contains spectra with nearly the same dispersion as the GAIA Satellite low dispersion slitless spectrophotometers, BP and RP. The plates will be used to develop standards for GAIA spectra. To bring the data from APDA to the general public, we have developed the citizen science project called Stellar Classification Online - Public Exploration (SCOPE). SCOPE allows the citizen scientist to classify up to a half million stars on objective prism plates. We will present the status of each of these projects.

  18. The Human Genome Project: big science transforms biology and medicine

    PubMed Central

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project. PMID:24040834

  19. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  20. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  1. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, those undertakings permanently have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to support universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes, in order to recruit, support and retain the brightest heads for a project.

  2. How Robust Is Your Project? From Local Failures to Global Catastrophes: A Complex Networks Approach to Project Systemic Risk.

    PubMed

    Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders

    2015-01-01

    Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still in its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by a single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for structural intricacies of a project's underlying task precedence structure, as they can provide the conditions upon which large-scale catastrophes materialise.
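    The cascade mechanism described in the abstract can be sketched in its simplest deterministic form: when a task fails, the failure propagates along the task precedence edges, and the tasks most capable of impacting the project are those with the largest downstream reach. The paper's actual model is stochastic and richer; the tiny project schedule below is hypothetical:

    ```python
    from collections import deque

    def cascade_size(successors, start):
        """Number of tasks knocked out when `start` fails and the failure
        propagates along task-precedence edges (breadth-first traversal)."""
        failed = {start}
        queue = deque([start])
        while queue:
            task = queue.popleft()
            for nxt in successors.get(task, ()):
                if nxt not in failed:
                    failed.add(nxt)
                    queue.append(nxt)
        return len(failed)

    # hypothetical project schedule: task -> tasks that depend on it
    project = {
        "design":    ["procure", "prototype"],
        "procure":   ["build"],
        "prototype": ["build"],
        "build":     ["test"],
        "test":      ["deliver"],
        "deliver":   [],
    }
    impact = {task: cascade_size(project, task) for task in project}
    print(max(impact, key=impact.get))  # "design" triggers the largest cascade
    ```

    Even in this toy example the impact distribution is skewed: one task can take down the whole schedule while another takes down only itself, echoing the paper's observation that a handful of tasks trigger surprisingly large cascades.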

  3. Barriers Inhibiting Inquiry-Based Science Teaching and Potential Solutions: Perceptions of Positively Inclined Early Adopters

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; Danaia, Lena; McKinnon, David H.

    2017-07-01

    In recent years, calls for the adoption of inquiry-based pedagogies in the science classroom have formed a part of the recommendations for large-scale high school science reforms. However, these pedagogies have been problematic to implement at scale. This research explores the perceptions of 34 positively inclined early-adopter teachers in relation to their implementation of inquiry-based pedagogies. The teachers were part of a large-scale Australian high school intervention project based around astronomy. In a series of semi-structured interviews, the teachers identified a number of common barriers that prevented them from implementing inquiry-based approaches. The most important barriers identified include the extreme time restrictions on all scales, the poverty of their common professional development experiences, their lack of good models and definitions for what inquiry-based teaching actually is, and the lack of good resources enabling the capacity for change. Implications for expectations of teachers and their professional learning during educational reform and curriculum change are discussed.

  4. Rethinking Big Science. Modest, mezzo, grand science and the development of the Bevalac, 1971-1993.

    PubMed

    Westfall, Catherine

    2003-03-01

    Historians of science have tended to focus exclusively on scale in investigations of large-scale research, perhaps because it has been easy to assume that comprehending a phenomenon dubbed "Big Science" hinges on an understanding of bigness. A close look at Lawrence Berkeley Laboratory's Bevalac, a medium-scale "mezzo science" project formed by uniting two preexisting machines--the modest SuperHILAC and the grand Bevatron--shows what can be gained by overcoming this preoccupation with bigness. The Bevalac story reveals how interconnections, connections, and disconnections ultimately led to the development of a new kind of science that transformed the landscape of large-scale research in the United States. Important lessons in historiography also emerge: the value of framing discussions in terms of networks, the necessity of constantly expanding and refining methodology, and the importance of avoiding the rhetoric of participants and instead finding words to tell our own stories.

  5. Promoting Student Progressions in Science Classrooms: A Video Study

    ERIC Educational Resources Information Center

    Jin, Hui; Johnson, Michele E.; Shin, Hyo Jeong; Anderson, Charles W.

    2017-01-01

    This study was conducted in a large-scale environmental literacy project. In the project, we developed a Learning Progression Framework (LPF) for matter and energy in social-ecological systems; the LPF contains four achievement levels. Based on the LPF, we designed a Plant Unit to help Levels 2 and 3 students advance to Level 4 of the LPF. In the…

  6. Designing Citizen Science Projects in the Era of Mega-Information and Connected Activism

    NASA Astrophysics Data System (ADS)

    Pompea, S. M.

    2010-12-01

    The design of citizen science projects must take many factors into account in order to be successful. Currently, there is a wide variety of citizen science projects with different aims, audiences, reporting methods, and degrees of scientific rigor and usefulness. Projects function on local, national, and worldwide scales and range in time from limited campaigns to around-the-clock projects. For current and future projects, advanced cell phones and mobile computing allow an unprecedented degree of connectivity and data transfer. These advances will greatly influence the design of citizen science projects. An unprecedented amount of data is available for data mining by interested citizen scientists; how can projects take advantage of this? Finally, a variety of citizen science projects have social activism and change as part of their mission and goals. How can this be harnessed in a constructive and efficient way? The design of projects must also select the proper role for experts and novices, provide quality control, and motivate users to encourage long-term involvement. Effective educational and instructional materials design can be used to create responsive and effective projects in a more highly connected age with access to very large amounts of information.

  7. NEON Citizen Science: Planning and Prototyping

    NASA Astrophysics Data System (ADS)

    Newman, S. J.; Henderson, S.; Gardiner, L. S.; Ward, D.; Gram, W.

    2011-12-01

    The National Ecological Observatory Network (NEON) will be a national resource for ecological research and education. NEON citizen science projects are being designed to increase awareness and educate citizen scientists about the impacts of climate change, land-use change, and invasive species on continental-scale ecological processes as well as expand NEON data collection capacity by enabling laypersons to collect geographically distributed data. The citizen science area of the NEON web portal will enable citizen scientists to collect, contribute, interpret, and visualize scientific data, as well as access training modules, collection protocols and targeted learning experiences related to citizen science project topics. For NEON, citizen science projects are a means for interested people to interact with and contribute to NEON science. Investigations at vast spatial and temporal scales often require rapid acquisition of large amounts of data from a geographically distributed population of "human sensors." As a continental-scale ecological observatory, NEON is uniquely positioned to develop strategies to effectively integrate data collected by non-scientists into scientific databases. Ultimately, we plan to work collaboratively to transform the practice of science to include "citizens" or non-scientists in the process. Doing science is not limited to scientists, and breaking down the barriers between scientists and citizens will help people better understand the power of using science in their own decision making. In preparation for fully developing the NEON citizen science program, we are partnering with Project BudBurst (PBB), a citizen science project focused on monitoring plant phenology. The educational goals of PBB are to: (1) increase awareness of climate change, (2) educate citizen scientists about the impacts of climate change on plants and the environment, and (3) increase science literacy by engaging participants in the scientific process. 
Phenology was chosen as the focus of this citizen science campaign because it is a visible and comprehensible way of demonstrating the effects of climate change. In addition, plants are readily accessible in nearly every neighborhood, park, and wild area across the continent, so people can make observations whether they live near an inner-city park or in the rural countryside. Recently, NEON developed data visualization tools for Project BudBurst to engage citizen science participants in "doing science" beyond data collection. By prototyping NEON citizen science through Project BudBurst, NEON is developing a better understanding of how to build a citizen science program that addresses areas of awareness, mastery, and leadership of scientific information like that which NEON will produce over the next 30 years.

  8. MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration

    NASA Astrophysics Data System (ADS)

    Taylor, A. Russ; Jarvis, Matt

    2017-05-01

    The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed later this decade on the African continent. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project, including several fields totaling 20 square degrees to microJy sensitivities and an ultra-deep image of a single 1 square degree field of view, MIGHTEE will explore dark matter and large-scale structure, the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment, the emergence and evolution of magnetic fields in galaxies, and the magnetic counterpart to the large-scale structure of the universe.

  9. Education for Professional Engineering Practice

    ERIC Educational Resources Information Center

    Bramhall, Mike D.; Short, Chris

    2014-01-01

    This paper reports on a funded collaborative large-scale curriculum innovation and enhancement project undertaken as part of a UK National Higher Education Science, Technology Engineering and Mathematics (STEM) programme. Its aim was to develop undergraduate curricula to teach appropriate skills for professional engineering practice more…

  10. Neutron Tomography at the Los Alamos Neutron Science Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, William Riley

    Neutron imaging is an incredibly powerful tool for non-destructive sample characterization and materials science. Neutron tomography is one technique that results in a three-dimensional model of the sample, representing the interaction of the neutrons with the sample. This relies both on reliable data acquisition and on image processing after acquisition. Over the course of the project, the focus has changed from the former to the latter, culminating in a large-scale reconstruction of a meter-long fossilized skull. The full reconstruction is not yet complete, though tools have been developed to improve the speed and accuracy of the reconstruction. This project helps to improve the capabilities of LANSCE and LANL with regard to imaging large or unwieldy objects.

  11. HNSciCloud - Overview and technical Challenges

    NASA Astrophysics Data System (ADS)

    Gasthuber, Martin; Meinhard, Helge; Jones, Robert

    2017-10-01

    HEP is only one of many sciences with sharply increasing compute requirements that cannot be met by profiting from Moore’s law alone. Commercial clouds potentially allow for realising larger economies of scale. While some small-scale experience requiring dedicated effort has been collected, public cloud resources have not yet been integrated with the standard workflows of science organisations in their private data centres; in addition, European science has not yet ramped up to significant scale. The HELIX NEBULA Science Cloud project - HNSciCloud, partly funded by the European Commission, addresses these points. Ten organisations under CERN’s leadership, covering particle physics, bioinformatics, photon science and other sciences, have joined to procure public cloud resources as well as dedicated development effort towards this integration. The HNSciCloud project faces the challenge of accelerating developments performed by the selected commercial providers. In order to guarantee cost-efficient usage of IaaS resources across a wide range of scientific communities, the technical requirements had to be carefully constructed. With respect to current IaaS offerings, data-intensive science is the biggest challenge; other points that need to be addressed concern identity federations, network connectivity and how to match the business practices of large IaaS providers with those of public research organisations. In the first section, this paper gives an overview of the project and explains the findings so far. The last section explains the key points of the technical requirements and presents first results of the procurers' experience with the services in comparison to their 'on-premise' infrastructure.

  12. Success in large high-technology projects: What really works?

    NASA Astrophysics Data System (ADS)

    Crosby, P.

    2014-08-01

    Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both national budgets and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss how mega-science projects sit between being merely complicated and being chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including the capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and the transfer of lessons learned to subsequent programs and projects.

  13. Space research - At a crossroads

    NASA Technical Reports Server (NTRS)

    Mcdonald, Frank B.

    1987-01-01

    Efforts which must be expended if U.S. space research is to regain vitality in the next few years are discussed. Small-scale programs are the cornerstone for big science projects, giving both researchers and students a chance to practice the development of space missions and hardware and to identify promising goals for larger projects. Small projects can be carried aloft by balloons, sounding rockets, the Shuttle and ELVs. It is recommended that NASA continue the development of remote sensing systems, and join with other government agencies to fund space-based materials science, space biology and medical research. Increased international cooperation in space projects is necessary for affording moderate- to large-scale missions, for political reasons, and to maximize available space resources. Finally, the establishment and funding of long-range goals in space, particularly the development of the infrastructure and technologies for the exploration and colonization of the planets, must be viewed as the normal outgrowth of the capabilities being developed for LEO operations.

  14. Primary teachers conducting inquiry projects: effects on attitudes towards teaching science and conducting inquiry

    NASA Astrophysics Data System (ADS)

    van Aalderen-Smeets, Sandra I.; Walma van der Molen, Juliette H.; van Hest, Erna G. W. C. M.; Poortman, Cindy

    2017-01-01

    This study used an experimental, pretest-posttest control group design to investigate whether participation in a large-scale inquiry project would improve primary teachers' attitudes towards teaching science and towards conducting inquiry. The inquiry project positively affected several elements of teachers' attitudes. Teachers felt less anxious about teaching science and felt less dependent on contextual factors compared to the control group. With regard to attitude towards conducting inquiry, teachers felt less anxious and more able to conduct an inquiry project. There were no effects on other attitude components, such as self-efficacy beliefs or relevance beliefs, or on self-reported science teaching behaviour. These results indicate that practitioner research may have a partially positive effect on teachers' attitudes, but that it may not be sufficient to fully change primary teachers' attitudes and their actual science teaching behaviour. In comparison, a previous study showed that attitude-focused professional development in science education has a more profound impact on primary teachers' attitudes and science teaching behaviour. In our view, future interventions aiming to stimulate science teaching should combine both approaches, an explicit focus on attitude change together with familiarisation with inquiry, in order to improve primary teachers' attitudes and classroom practices.

  15. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys).

    PubMed

    Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).

  16. Science capabilities of the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Devost, Daniel; McConnachie, Alan; Flagey, Nicolas; Cote, Patrick; Balogh, Michael; Driver, Simon P.; Venn, Kim

    2017-01-01

    The Maunakea Spectroscopic Explorer (MSE) project will transform the CFHT 3.6m optical telescope into a 10m-class dedicated multi-object spectroscopic facility, with the ability to simultaneously measure thousands of objects at spectral resolutions spanning 2,000 to 20,000. The project is currently in the design phase, with full science operations nominally starting in 2025. MSE will enable transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large-scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy- and cluster-scale hosts; and reverberation mapping of supermassive black holes in quasars. MSE is an essential follow-up facility to current and next generations of multi-wavelength imaging surveys, including LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for E-ELT, TMT and GMT. I will give an update on the status of the project and review some of the most exciting scientific capabilities of the observatory.

  17. Development and Large-Scale Validation of an Instrument to Assess Arabic-Speaking Students' Attitudes Toward Science

    NASA Astrophysics Data System (ADS)

    Abd-El-Khalick, Fouad; Summers, Ryan; Said, Ziad; Wang, Shuai; Culbertson, Michael

    2015-11-01

    This study is part of a large-scale project focused on 'Qatari students' Interest in, and Attitudes toward, Science' (QIAS). QIAS aimed to gauge Qatari student attitudes toward science in grades 3-12, examine factors that impact these attitudes, and assess the relationship between student attitudes and prevailing modes of science teaching in Qatari schools. This report details the development and validation of the 'Arabic-Speaking Students' Attitudes toward Science Survey' (ASSASS), which was specifically developed for the purposes of the QIAS project. The theories of reasoned action and planned behavior (TRAPB) [Ajzen, I., & Fishbein, M. (2005). The influence of attitudes on behavior. In D. Albarracín, B. T. Johnson, & M. P. Zanna (Eds.), The handbook of attitudes (pp. 173-221). Mahwah, NJ: Erlbaum] guided the instrument development. Development and validation of the ASSASS proceeded in 3 phases. First, a 10-member expert panel examined an initial pool of 74 items, which were revised and consolidated into a 60-item version of the instrument. This version was piloted with 369 Qatari students from the target schools and grade levels. Analyses of pilot data resulted in a refined version of the ASSASS, which was administered to a national probability sample of 3027 participants representing all students enrolled in grades 3-12 in the various types of schools in Qatar. Of the latter, 1978 students completed the Arabic version of the instrument. Analyses supported a robust, 5-factor model for the instrument, which is consistent with the TRAPB framework. The factors were: Attitudes toward science and school science, unfavorable outlook on science, control beliefs about ability in science, behavioral beliefs about the consequences of engaging with science, and intentions to pursue science.

  18. Support of an Active Science Project by a Large Information System: Lessons for the EOS Era

    NASA Technical Reports Server (NTRS)

    Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.

    1993-01-01

    The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.

  19. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys)

    PubMed Central

    Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041

  20. Three-dimensional presentation of the earth and planets in classrooms and science centers with a spherical screen

    NASA Astrophysics Data System (ADS)

    Saito, A.; Tsugawa, T.; Odagi, Y.; Nishi, N.; Miyazaki, S.; Ichikawa, H.

    2012-12-01

    Educational programs have been developed for earth and planetary science using a three-dimensional presentation system for the Earth and planets with a spherical screen. They have been used in classrooms of universities, high schools, and elementary schools, and in science centers. The two-dimensional map is a standard tool for presenting data about the Earth and planets, but distortion of shape is inevitable, especially for maps of wide areas. Three-dimensional presentation of the Earth, such as a globe, is the only way to avoid this distortion. Several projects present earth and planetary science results in three dimensions digitally, such as Science on a Sphere (SOS) by NOAA and Geo-cosmos by the National Museum of Emerging Science and Innovation (Miraikan), Japan. These projects are relatively large-scale in instrumentation and cost, and are difficult to use in classrooms and small-scale science centers. We therefore developed a portable, scalable and affordable system for three-dimensional presentation of the Earth and planets, Dagik Earth. This system uses a spherical screen and a PC projector. Several educational programs have been developed using Dagik Earth in collaboration among researchers in earth and planetary science and science education, school teachers, and curators of science centers, and have been used in schools and museums in Japan, Taiwan and other countries. The system helps learners achieve a proper cognition of the shape and size of phenomena on the Earth and planets. The current status and future development of the project will be introduced in the presentation.

  1. Reality check in the project management of EU funding

    NASA Astrophysics Data System (ADS)

    Guo, Chenbo

    2015-04-01

    A talk addressing the workload, focuses, impacts and outcomes of project management (hereinafter PM). Two FP7 projects serve as objects of investigation. In the Earth science sector, NACLIM is a large-scale collaborative project with 18 partners from North and West Europe. NACLIM aims at investigating and quantifying the predictability of North Atlantic/Arctic sea surface temperature and sea ice variability and change on seasonal to decadal time scales, which have a crucial impact on weather and climate in Europe. PRIMO, from political science, is a global PhD programme funded by the Marie Curie ITN instrument with 11 partners from Europe, Eurasia and BRICS countries, focusing on the rise of regional powers and its impact on international politics at large. Although the two projects are granted under different FP7 funding instruments, stem from different cultural backgrounds and have different goals, the inherent processes and the key focus of the PM are quite alike; only the operational management differs at some points. From the administrative point of view, understanding both EU requirements and country-specific regulations is essential; it also helps us identify the grey areas in order to carry out the projects more efficiently. The talk will focus on our observations of day-to-day PM flows - primarily project implementation - with a few particular cases: transparency issues, e.g. priority setting by non-research stakeholders, including conflicts in the human resources field; end-user integration; gender issues raised during a monitoring visit; and ethical aspects of field research. Through a brief comparison of both projects we summarize a range of dos and don'ts, an "acting instead of reacting" line of action, and the conclusion that systematic overall management is preferable to exclusive project controlling.
In a nutshell, the talk aims at providing the audience with a summary of our observations on the management methodologies and toolkits applied in both projects, our best practices, and lessons learnt in coordinating large international consortia.

  2. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Wes

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and the ability to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis on only a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, an approach known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, engaged in software technology R&D, and formed close partnerships with DOE science code teams to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
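    The core idea of in situ processing described above — analyzing data while it is still resident in memory, rather than writing everything out for later post-processing — can be illustrated with a minimal, self-contained sketch. This is not code from the project; all names here are hypothetical, and the "simulation" is a toy random-number generator standing in for a science code:

```python
import math
import random

class RunningStats:
    """Streaming mean/variance via Welford's algorithm. The analysis
    consumes each value while it is in memory, so no raw output
    ever needs to be stored for later analysis."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

def simulate(steps, analyze):
    """Toy 'simulation' that hands each result to an in situ analysis
    hook each step instead of appending it to an output file."""
    rng = random.Random(0)
    for _ in range(steps):
        analyze(rng.gauss(10.0, 2.0))

stats = RunningStats()
simulate(100_000, stats.update)
print(f"mean={stats.mean:.2f} std={math.sqrt(stats.variance):.2f}")
```

    The design point is the callback: the simulation never materializes its full output, which is exactly the trade that closes the compute/storage gap the abstract describes — at the cost of having to decide the analysis before the run.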

  3. Final Report for DE-SC0002298 Agency Number: DE-PS02-09ER09-01 An Advanced Network and distributed Storage Laboratory (ANDSL) for Data Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    2014-08-17

    The original intent of this project was to build and operate an Advanced Network and Distributed Storage Laboratory (ANDSL) for Data Intensive Science that would prepare the Open Science Grid (OSG) community for a new generation of wide-area communication capabilities operating at a 100Gb rate. Given the significant cut in our proposed budget, we changed the scope of the ANDSL to focus on the software aspects of the laboratory – workload generators and monitoring tools – and on the offering of experimental data to the ANI project. The main contributions of our work are twofold: early end-user input and experimental data for the ANI project, and software tools for conducting large-scale end-to-end data placement experiments.

  4. [Structural Study in the Platform for Drug Discovery, Informatics, and Structural Life Science].

    PubMed

    Senda, Toshiya

    2016-01-01

    The Platform for Drug Discovery, Informatics, and Structural Life Science (PDIS), launched in FY2012, is a national project in the field of structural biology. The PDIS consists of three cores - structural analysis, control, and informatics - and aims to support life science researchers who are not familiar with structural biology. The PDIS project provides full-scale support for structural biology research, including protein purification with various expression systems, large-scale protein crystallization, crystal structure determination, small-angle X-ray scattering (SAXS), NMR, electron microscopy, and bioinformatics. In order to make use of this support, PDIS users need to submit an application form to the one-stop service office; submitted applications are reviewed by three referees. PDIS users are strongly encouraged to discuss their plans with researchers in the PDIS project before submitting the application. This discussion is very useful in the process of project design, particularly for beginners in structural biology. In addition to this user support, the PDIS project has conducted R&D, including the development of synchrotron beamlines. In the PDIS project, PF and SPring-8 have developed beamlines for micro-crystallography, high-throughput data collection, supramolecular assemblies, and native single-wavelength anomalous dispersion (SAD) phasing. The newly developed beamlines have been open to all users and have accelerated structural biology research. Beamlines for SAXS have also been developed, which has dramatically increased the number of bio-SAXS users.

  5. Activating social strategies: Face-to-face interaction in technology-mediated citizen science.

    PubMed

    Cappa, Francesco; Laut, Jeffrey; Nov, Oded; Giustiniano, Luca; Porfiri, Maurizio

    2016-11-01

    The use of crowds in research activities by public and private organizations is growing under different forms. Citizen science is a popular means of engaging the general public in research activities led by professional scientists. By involving a large number of amateur scientists, citizen science enables distributed data collection and analysis on a scale that would be otherwise difficult and costly to achieve. While advancements in information technology in the past few decades have fostered the growth of citizen science through online participation, several projects continue to fail due to limited participation. Such web-based projects may isolate the citizen scientists from the researchers. By adopting the perspective of social strategy, we investigate within a measure-manipulate-measure experiment if motivations to participate in a citizen science project can be positively influenced by a face-to-face interaction with the scientists leading the project. Such an interaction provides the participants with the possibility of asking questions on the spot and obtaining a detailed explanation of the citizen science project, its scientific merit, and environmental relevance. Social and cultural factors that moderate the effect brought about by face-to-face interactions on the motivations are also dissected and analyzed. Our findings provide an exploratory insight into a means for motivating crowds to participate in online environmental monitoring projects, also offering possible selection criteria of target audience. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. The Inspiring Science Education project and the resources for HEP analysis by university students

    NASA Astrophysics Data System (ADS)

    Fassouliotis, Dimitris; Kourkoumelis, Christine; Vourakis, Stylianos

    2016-11-01

    The Inspiring Science Education outreach project has been running for more than two years, creating a large number of inquiry-based educational resources for high-school teachers and students. Its goal is the promotion of science education in schools through new methods built on inquiry-based education techniques, involving large consortia of European partners and the implementation of large-scale pilots in schools. Recent hands-on activities developing and testing the above-mentioned innovative applications are reviewed. In general, there is a lack of educational scenarios and laboratory courses earmarked for more advanced, namely university, students. At the University of Athens, for the last four years the HYPATIA on-line event analysis tool has been used as a lab course for fourth-year undergraduate physics students majoring in HEP. Until recently, the course was limited to visual inspection of a few tens of ATLAS events; it has now been enriched with additional analysis exercises that involve large samples of events. Through a user-friendly interface, students can analyse the samples and optimize the cut selection in order to search for new physics. The implementation of this analysis is described.

  7. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated, large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects being explored include how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise in processing SAR datasets into derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud?
We will also present some of our findings from applying machine learning and data analytics to the processed SAR data streams, along with lessons learned on how to ease the SAR community into interfacing with these cloud-based SAR science data systems.
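    The scene-partitioning idea mentioned in this abstract — splitting a large scene into tiles that independent workers can process, with stitching deferred to a late stage — can be sketched in a few lines. This is a toy illustration under assumed names, not code from the ARIA pipeline; the scene dimensions and tile size are arbitrary:

```python
def partition(width, height, tile):
    """Split a (width x height) scene into tile bounding boxes
    (x0, y0, x1, y1); tiles at the right and bottom edges may be smaller."""
    boxes = []
    for y0 in range(0, height, tile):
        for x0 in range(0, width, tile):
            boxes.append((x0, y0, min(x0 + tile, width), min(y0 + tile, height)))
    return boxes

def process_tile(box):
    # Stand-in for per-tile processing (e.g. unwrapping one chunk of an
    # interferogram); here it just returns the tile's pixel count.
    x0, y0, x1, y1 = box
    return (x1 - x0) * (y1 - y0)

# Tiles can be dispatched to independent cloud workers; the late "stitch"
# step is reduced here to verifying the tiles cover the scene exactly once.
boxes = partition(25000, 17000, 4096)
covered = sum(process_tile(b) for b in boxes)
print(len(boxes), covered == 25000 * 17000)
```

    The key property for scaling is that each tile carries its own bounding box, so no worker needs the full scene in memory and the stitching stage can reassemble results in any order.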

  8. NEON Citizen Science: Planning and Prototyping (Invited)

    NASA Astrophysics Data System (ADS)

    Gram, W.

    2010-12-01

    The National Ecological Observatory Network (NEON) will be a national resource for ecological research and education. NEON citizen science projects are being designed to increase awareness and educate citizen scientists about the impacts of climate change, land-use change, and invasive species on continental-scale ecological processes as well as expand NEON data collection capacity by enabling laypersons to collect geographically distributed data. The citizen science area of the NEON web portal will enable citizen scientists to collect, contribute, interpret, and visualize scientific data, as well as access training modules, collection protocols and targeted learning experiences related to citizen science project topics. For NEON, citizen science projects are a means for interested people to interact with and contribute to NEON science. Investigations at vast spatial and temporal scales often require rapid acquisition of large amounts of data from a geographically distributed population of “human sensors.” As a continental-scale ecological observatory, NEON is uniquely positioned to develop strategies to effectively integrate data collected by non-scientists into scientific databases. Ultimately, we plan to work collaboratively to transform the practice of science to include “citizens” or non-scientists in the process. Doing science is not limited to scientists, and breaking down the barriers between scientists and citizens will help people better understand the power of using science in their own decision making. In preparation for fully developing the NEON citizen science program, we are partnering with Project BudBurst (PBB), a citizen science project focused on monitoring plant phenology. The educational goals of PBB are to: (1) increase awareness of climate change, (2) educate citizen scientists about the impacts of climate change on plants and the environment, and (3) increase science literacy by engaging participants in the scientific process. 
Phenology was chosen as the focus of this citizen science campaign because it is a visible and comprehensible way of demonstrating the effects of climate change. In addition, plants are readily accessible in nearly every neighborhood, park, and wild area across the continent, so people can make observations whether they live near an inner-city park or in the rural countryside. Recently, NEON built three web tools that enable users to visualize PBB data: a mapping function that displays selected PBB distributional data, an animated map that shows “green up” through time and space, and a graphing tool that compares the number of species flowering or leafing out with day length. This prototyping will help NEON better understand how to engage citizen science participants in “doing science” beyond data collection.

  9. Satellite Imagery Production and Processing Using Apache Hadoop

    NASA Astrophysics Data System (ADS)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework that can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large-scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, a feature-set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations, and finally challenges and ongoing activities. It will also present how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
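    The abstract does not spell out how Landsat scenes map onto Hadoop's programming model, but the pattern it relies on is classic map-reduce: map each scene to a (key, value) pair, shuffle by key, then reduce each group. A stdlib-only sketch of that pattern follows; the scene IDs, path/row numbers, and function names are hypothetical, and the per-scene "processing" is reduced to bookkeeping:

```python
from collections import defaultdict

# Toy scene inventory: (scene_id, WRS path, WRS row). In a real Hadoop job
# each map task would read and process one Landsat scene from the archive.
scenes = [("SCENE_A_P29R30", 29, 30),
          ("SCENE_B_P29R31", 29, 31),
          ("SCENE_C_P29R30", 29, 30)]

def map_phase(scene):
    scene_id, path, row = scene
    # Emit a (key, value) pair, keyed by the path/row footprint.
    return ((path, row), scene_id)

def reduce_phase(key, values):
    # One reduce call per footprint: e.g. assemble a per-tile time series.
    return key, sorted(values)

# Shuffle: group mapped pairs by key, as Hadoop does between the phases.
grouped = defaultdict(list)
for key, value in map(map_phase, scenes):
    grouped[key].append(value)

results = dict(reduce_phase(k, v) for k, v in grouped.items())
print(results[(29, 30)])
```

    Because each map call touches only one scene and each reduce call only one footprint, the same structure scales out across a cluster with no shared state, which is what makes the framework a natural fit for per-scene satellite processing.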

  10. Problems in merging Earth sensing satellite data sets

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Goldberg, Michael J.

    1987-01-01

    Satellite remote sensing systems provide a tremendous source of data flow to the Earth science community. These systems provide scientists with data of types and on a scale previously unattainable. Looking forward to the capabilities of Space Station and the Earth Observing System (EOS), the full realization of the potential of satellite remote sensing will be handicapped by inadequate information systems. There is a growing emphasis in Earth science research to ask questions which are multidisciplinary in nature and global in scale. Many of these research projects emphasize the interactions of the land surface, the atmosphere, and the oceans through various physical mechanisms. Conducting this research requires large and complex data sets and teams of multidisciplinary scientists, often working at remote locations. A review of the problems of merging these large volumes of data into spatially referenced and manageable data sets is presented.

  11. The age of citizen science: Stimulating future environmental research

    NASA Astrophysics Data System (ADS)

    Burgess, S. N.

    2010-12-01

Public awareness of the state of the ocean is growing with issues such as climate change, over-harvesting, marine pollution, coral bleaching, ocean acidification and sea level rise appearing regularly in popular media outlets. Society is also placing greater value on the range of ecosystem services the ocean provides. This increased consciousness of environmental change due to a combination of anthropogenic activities and impacts from climate change offers scientists the opportunity of engaging citizens in environmental research. The term citizen science refers to scientific research carried out by citizens and led by professionals, which involves large-scale data collection whilst simultaneously engaging and educating those who participate. Most projects that engage citizen scientists have been specifically designed to provide an educational benefit to the volunteer and benefit the scientific inquiry by collecting extensive data sets over large geographical areas. Engaging the public in environmental science is not a new concept, and successful projects (such as the Audubon Christmas Bird Count and Earthwatch) have been running for several decades, resulting in hundreds of thousands of people conducting long-term field research in partnership with scientists based at universities worldwide. The realm of citizen science projects is continually expanding, with public engagement options ranging from online science, to backyard afternoon studies, to fully immersive experiential learning projects running for weeks at a time. Some organisations, such as Earthwatch, also work in partnership with private industry, giving scientists access to more funding opportunities than those avenues traditionally available.
These scientist-industry partnerships provide mutual benefits, as the results of research projects in environments such as coastal ecosystems feed directly back into business risk strategies; for example, mitigating shoreline erosion, storm surges, overfishing and warming water temperatures. Citizen science projects also fulfill the requirements of government granting institutions for outreach and scientific communication. This presentation will highlight marine research projects that have engaged citizens in the scientific process, and will discuss the impacts of the associated outreach, capacity building and community environmental stewardship.

  12. Data management for community research projects: A JGOFS case study

    NASA Technical Reports Server (NTRS)

    Lowry, Roy K.

    1992-01-01

    Since the mid 1980s, much of the marine science research effort in the United Kingdom has been focused into large scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project incorporated large scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was to both work up the data from the ship's sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC their long term availability as part of the UK national data archive is assured. The success of the topical data center model for UK Community Research Project data management has been founded upon the strong working relationships forged between the data center and project scientists. These can only be established by frequent personal contact and hence the relatively small size of the UK has been a critical factor. 
However, projects covering a larger, even international, scale could be successfully supported by a network of topical data centers managing online databases interconnected by object-oriented distributed data management systems over wide area networks.

  13. DOE Joint Genome Institute 2008 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, David

    2009-03-12

While initially a virtual institute, the driving force behind the creation of the DOE Joint Genome Institute in Walnut Creek, California in the Fall of 1999 was the Department of Energy's commitment to sequencing the human genome. With the publication in 2004 of a trio of manuscripts describing the finished 'DOE Human Chromosomes', the Institute successfully completed its human genome mission. In the time between the creation of the Department of Energy Joint Genome Institute (DOE JGI) and completion of the Human Genome Project, sequencing and its role in biology spread to fields extending far beyond what could be imagined when the Human Genome Project first began. Accordingly, the targets of the DOE JGI's sequencing activities changed, moving from a single human genome to the genomes of large numbers of microbes, plants, and other organisms, and the community of users of DOE JGI data similarly expanded and diversified. Transitioning into operating as a user facility, the DOE JGI modeled itself after other DOE user facilities, such as synchrotron light sources and supercomputer facilities, empowering the science of large numbers of investigators working in areas of relevance to energy and the environment. The JGI's approach to being a user facility is based on the concept that by focusing state-of-the-art sequencing and analysis capabilities on the best peer-reviewed ideas drawn from a broad community of scientists, the DOE JGI will effectively encourage creative approaches to DOE mission areas and produce important science. This clearly has occurred, only partially reflected in the fact that the DOE JGI has played a major role in more than 45 papers published in just the past three years alone in Nature and Science. The involvement of a large and engaged community of users working on important problems has helped maximize the impact of JGI science. A seismic technological change is presently underway at the JGI. 
The Sanger capillary-based sequencing process that dominated how sequencing was done in the last decade is being replaced by a variety of new processes and sequencing instruments. The JGI, with an increasing number of next-generation sequencers, whose throughput is 100- to 1,000-fold greater than the Sanger capillary-based sequencers, is increasingly focused in new directions on projects of scale and complexity not previously attempted. These new directions for the JGI come, in part, from the 2008 National Research Council report on the goals of the National Plant Genome Initiative as well as the 2007 National Research Council report on the New Science of Metagenomics. Both reports outline a crucial need for systematic large-scale surveys of the plant and microbial components of the biosphere as well as an increasing need for large-scale analysis capabilities to meet the challenge of converting sequence data into knowledge. The JGI is extensively discussed in both reports as vital to progress in these fields of major national interest. JGI's future plan for plants and microbes includes a systematic approach for investigation of these organisms at a scale requiring the special capabilities of the JGI to generate, manage, and analyze the datasets. JGI will generate and provide not only community access to these plant and microbial datasets, but also the tools for analyzing them. These activities will produce essential knowledge that will be needed if we are to be able to respond to the world's energy and environmental challenges. As the JGI Plant and Microbial programs advance, the JGI as a user facility is also evolving. The Institute has been highly successful in bending its technical and analytical skills to help users solve large complex problems of major importance, and that effort will continue unabated. 
The JGI will increasingly move from a central focus on 'one-off' user projects coming from small user communities to much larger scale projects driven by systematic and problem-focused approaches to selection of sequencing targets. Entire communities of scientists working in a particular field, such as feedstock improvement or biomass degradation, will be users of this information. Despite this new emphasis, an investigator-initiated user program will remain. This program in the future will replace small projects that increasingly can be accomplished without the involvement of JGI, with imaginative large-scale 'Grand Challenge' projects of foundational relevance to energy and the environment that require a new scale of sequencing and analysis capabilities. Close interactions with the DOE Bioenergy Research Centers, and with other DOE institutions that may follow, will also play a major role in shaping aspects of how the JGI operates as a user facility. Based on increased availability of high-throughput sequencing, the JGI will increasingly provide to users, in addition to DNA sequencing, an array of both pre- and post-sequencing value-added capabilities to accelerate their science.

  14. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.
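To make the dataset-creation problem concrete, here is a minimal sketch of one common recipe (our illustration, not MSFC's actual pipeline): tile large scenes into fixed-size patches, attach a label per patch, and split deterministically so the resulting benchmark is reproducible. The scene ID, sizes, and labeling rule are all hypothetical.

```python
# Build a labeled-patch manifest from large scenes. A hash-based split keeps
# train/val assignment stable across runs, with no ordering bias.
import hashlib

def tile_scene(scene_id, width, height, patch=256):
    """Yield patch bounding boxes covering the scene on a regular grid."""
    for y in range(0, height - patch + 1, patch):
        for x in range(0, width - patch + 1, patch):
            yield {"scene": scene_id, "x": x, "y": y, "size": patch}

def assign_split(item, val_fraction=0.2):
    """Deterministic split: hash the patch identity into 100 buckets."""
    key = f'{item["scene"]}:{item["x"]}:{item["y"]}'.encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 100
    return "val" if bucket < val_fraction * 100 else "train"

def build_manifest(scenes, labeler):
    """Pair every patch with a label (e.g. an analyst-reviewed class)."""
    manifest = []
    for scene_id, width, height in scenes:
        for item in tile_scene(scene_id, width, height):
            item["label"] = labeler(item)
            item["split"] = assign_split(item)
            manifest.append(item)
    return manifest

# Hypothetical 1024x512 scene; the toy labeler calls the top row "cloud".
manifest = build_manifest(
    [("LC08_L1TP_021036", 1024, 512)],
    labeler=lambda item: "cloud" if item["y"] == 0 else "clear",
)
```

In practice the labeler is the expensive part (analyst review or an existing product used as weak labels), which is exactly why the abstract argues such datasets are as valuable as the algorithms trained on them.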

  15. Moving forward: Responding to and mitigating effects of the MPB epidemic [Chapter 8

    Treesearch

    Claudia Regan; Barry Bollenbacher; Rob Gump; Mike Hillis

    2014-01-01

    The final webinar in the Future Forest Webinar Series provided an example of how managers utilized available science to address questions about post-epidemic forest conditions. Assessments of current conditions and projected trends, and how these compare with historical patterns, provide important information for land management planning. Large-scale disturbance events...

  16. AFRL/Cornell Information Assurance Institute

    DTIC Science & Technology

    2007-03-01

...collaborations involving Cornell and AFRL researchers, with AFRL researchers able to participate in Cornell research projects, facilitating technology ... approach to developing a science base and technology for supporting large-scale reliable distributed systems. First, solutions to core problems were...

  17. Composites for Exploration Upper Stage

    NASA Technical Reports Server (NTRS)

    Fikes, J. C.; Jackson, J. R.; Richardson, S. W.; Thomas, A. D.; Mann, T. O.; Miller, S. G.

    2016-01-01

    The Composites for Exploration Upper Stage (CEUS) was a 3-year, level III project within the Technology Demonstration Missions program of the NASA Space Technology Mission Directorate. Studies have shown that composites provide important programmatic enhancements, including reduced weight to increase capability and accelerated expansion of exploration and science mission objectives. The CEUS project was focused on technologies that best advanced innovation, infusion, and broad applications for the inclusion of composites on future large human-rated launch vehicles and spacecraft. The benefits included near- and far-term opportunities for infusion (NASA, industry/commercial, Department of Defense), demonstrated critical technologies and technically implementable evolvable innovations, and sustained Agency experience. The initial scope of the project was to advance technologies for large composite structures applicable to the Space Launch System (SLS) Exploration Upper Stage (EUS) by focusing on the affordability and technical performance of the EUS forward and aft skirts. The project was tasked to develop and demonstrate critical composite technologies with a focus on full-scale materials, design, manufacturing, and test using NASA in-house capabilities. This would have demonstrated a major advancement in confidence and matured the large-scale composite technology to a Technology Readiness Level 6. This project would, therefore, have bridged the gap for providing composite application to SLS upgrades, enabling future exploration missions.

  18. Consortium biology in immunology: the perspective from the Immunological Genome Project.

    PubMed

    Benoist, Christophe; Lanier, Lewis; Merad, Miriam; Mathis, Diane

    2012-10-01

    Although the field has a long collaborative tradition, immunology has made less use than genetics of 'consortium biology', wherein groups of investigators together tackle large integrated questions or problems. However, immunology is naturally suited to large-scale integrative and systems-level approaches, owing to the multicellular and adaptive nature of the cells it encompasses. Here, we discuss the value and drawbacks of this organization of research, in the context of the long-running 'big science' debate, and consider the opportunities that may exist for the immunology community. We position this analysis in light of our own experience, both positive and negative, as participants of the Immunological Genome Project.

  19. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  20. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

In Japan, large-scale health databases, such as the national health insurance claims and health checkup database (NDB) and the Japanese Sentinel project, were constructed within a few years. However, several legal issues complicate striking an adequate balance between privacy and public benefit when using such databases. The NDB is operated under the act on health care for elderly persons, but that act says nothing about using the database for the general public benefit. Researchers who use the database are therefore forced to devote considerable attention to anonymization and information security, which may hamper the research work itself. The Japanese Sentinel project is a national project that detects adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad consent to such future uses for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, researchers conducting studies for the public benefit will not infringe patients' privacy, but vague and complex legislative requirements for personal data protection may hinder the research. Medical science does not progress without the use of clinical information, so legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions on such acts and regulations.
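One concrete anonymization criterion that comes up in such discussions is k-anonymity: a release is k-anonymous if every combination of quasi-identifiers (e.g. age band, sex, region) is shared by at least k records. The sketch below is our illustrative example of the check, not the NDB's actual procedure; the records and fields are invented.

```python
# Compute the k-anonymity of a table over a chosen set of quasi-identifiers:
# the size of the smallest group of records sharing the same combination.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest equivalence-class size over the quasi-identifiers."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"age_band": "40-49", "sex": "F", "region": "Kanto",  "dx": "I10"},
    {"age_band": "40-49", "sex": "F", "region": "Kanto",  "dx": "E11"},
    {"age_band": "40-49", "sex": "F", "region": "Kanto",  "dx": "J45"},
    {"age_band": "50-59", "sex": "M", "region": "Kansai", "dx": "I10"},
]

k = k_anonymity(records, ["age_band", "sex", "region"])
# k is 1 here: the lone 50-59/M/Kansai record is re-identifiable, so this
# release would need coarser bands or suppression before sharing.
```

The tension the abstract describes is visible even in this toy: raising k protects privacy but coarsens exactly the detail that makes the data scientifically useful.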

  1. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
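The screening idea can be sketched in a few lines (our illustration, not the authors' code): fit a fast statistical surrogate on descriptors of a few structures computed accurately, then rank a large library by predicted uptake instead of computing every candidate ab initio. The descriptors, MOF names, and values below are invented, and a real model would use richer features and a stronger learner.

```python
# Surrogate-model screening: ordinary least squares on one geometric
# descriptor, used only to rank candidates for follow-up calculations.

def fit_ols(xs, ys):
    """Closed-form ordinary least squares for a single descriptor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical training set: (surface area in m^2/g, computed uptake in cm^3/g).
train_area   = [1500.0, 2500.0, 3500.0, 4500.0]
train_uptake = [110.0, 160.0, 205.0, 260.0]
slope, intercept = fit_ols(train_area, train_uptake)

# Screen a (toy) library cheaply; only the top candidates would go on to
# expensive ab initio study.
library = {"MOF-A": 5200.0, "MOF-B": 1800.0, "MOF-C": 4100.0}
ranked = sorted(library, key=lambda m: slope * library[m] + intercept,
                reverse=True)
```

The trade-off the abstract names is explicit here: the surrogate is orders of magnitude cheaper than ab initio evaluation, and its predictions are statistical, so they guide rather than replace the accurate calculations.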

  2. Development and Validation of a Project Package for Junior Secondary School Basic Science

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi

    2014-01-01

    This was a Research and Developmental study designed to develop and validate projects for Junior Secondary School Basic Science instruction and evaluation. The projects were developed using the project blueprint and sent for validation by experts in science education and measurement and evaluation; using a project validation scale. They were to…

  3. Exposing and deposing hyper-economized school science

    NASA Astrophysics Data System (ADS)

    Bencze, John Lawrence

    2010-06-01

    Despite indications of the problematic nature of laissez faire capitalism, such as the convictions of corporate leaders and the global financial crisis that appeared to largely stem from a de-regulated financial services industry, it seems clear that societies and environments continue to be strongly influenced by hyper-economized worldviews and practices. Given the importance of societal acceptance of a potentially dominant ideological perspective, it is logical to assume that it would be critical for students to be prepared to function in niches prioritizing unrestricted for-profit commodity exchanges. Indeed, in their article in this issue, Lyn Carter and Ranjith Dediwalage appear to support this claim in their analyses of the large-scale and expensive Australian curriculum and instruction project, Sustainability by the Bay. More specifically, they effectively demonstrate that this project manifests several characteristics that would suggest neoliberal and neoconservative influences—ideological perspectives that they argue are largely fundamental to the functioning of the global economic system. In this forum article, possible adverse effects of neoliberalism and neoconservatism on school science are discussed—with further justification for Carter and Dediwalage's concerns. Additionally, however, this article raises the possibility of subverting neoliberalism and neoconservatism in science education through application of communitarian ideals.

  4. Can Citizen Science Assist in Determining Koala (Phascolarctos cinereus) Presence in a Declining Population?

    PubMed

    Flower, Emily; Jones, Darryl; Bernede, Lilia

    2016-07-14

    The acceptance and application of citizen science has risen over the last 10 years, with this rise likely attributed to an increase in public awareness surrounding anthropogenic impacts affecting urban ecosystems. Citizen science projects have the potential to expand upon data collected by specialist researchers as they are able to gain access to previously unattainable information, consequently increasing the likelihood of an effective management program. The primary objective of this research was to develop guidelines for a successful regional-scale citizen science project following a critical analysis of 12 existing citizen science case studies. Secondly, the effectiveness of these guidelines was measured through the implementation of a citizen science project, Koala Quest, for the purpose of estimating the presence of koalas in a fragmented landscape. Consequently, this research aimed to determine whether citizen-collected data can augment traditional science research methods, by comparing and contrasting the abundance of koala sightings gathered by citizen scientists and professional researchers. Based upon the guidelines developed, Koala Quest methodologies were designed, the study conducted, and the efficacy of the project assessed. To combat the high variability of estimated koala populations due to differences in counting techniques, a national monitoring and evaluation program is required, in addition to a standardised method for conducting koala population estimates. Citizen science is a useful method for monitoring animals such as the koala, which are sparsely distributed throughout a vast geographical area, as the large numbers of volunteers recruited by a citizen science project are capable of monitoring a similarly broad spatial range.

  5. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  6. Neuroscience thinks big (and collaboratively).

    PubMed

    Kandel, Eric R; Markram, Henry; Matthews, Paul M; Yuste, Rafael; Koch, Christof

    2013-09-01

    Despite cash-strapped times for research, several ambitious collaborative neuroscience projects have attracted large amounts of funding and media attention. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain, and the Allen Institute for Brain Science has embarked upon a 10-year project to understand the mouse visual cortex (the MindScope project). US President Barack Obama's announcement of the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies Initiative) in April 2013 highlights the political commitment to neuroscience and is expected to further foster interdisciplinary collaborations, accelerate the development of new technologies and thus fuel much needed medical advances. In this Viewpoint article, five prominent neuroscientists explain the aims of the projects and how they are addressing some of the questions (and criticisms) that have arisen.

  7. The Role of APEX as a Pathfinder for AtLAST

    NASA Astrophysics Data System (ADS)

    Wyrowski, Friedrich

    2018-01-01

Now more than 12 years in operation, the Atacama Pathfinder Experiment (APEX) 12 m submillimeter telescope has significantly contributed to a wide variety of submillimeter astronomy science areas, ranging from the discoveries of new molecules to large and deep imaging of the submillimeter sky. While ALMA operation is in full swing, APEX is strengthening its role not only as a pathfinder for studying large source samples and spatial scales to prepare detailed high angular resolution ALMA follow-ups, but also as a fast-response instrument to complement new results from ALMA. Furthermore, APEX ensures southern hemisphere access for submillimeter projects complementing archival Herschel research as well as new SOFIA science. With new broadband and multipixel receivers as well as large cameras for wide-field continuum imaging, APEX will pave the way towards the science envisioned with AtLAST. In this contribution, the current status and ongoing upgrades of APEX will be discussed, with an emphasis on the importance of continuous cutting edge science and state-of-the-art instrumentation that will bridge the gap towards AtLAST.

  8. ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications

    NASA Astrophysics Data System (ADS)

    Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.

    2003-12-01

The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real-time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well off shore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added or removed from the system; provides for real-time processing and monitoring of data streams--detecting events, and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools which must be developed do not exist, although limited prototype systems are available. 
This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.
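The "detecting events and triggering computations" requirement above can be illustrated with a classic streaming detector. The sketch below is our example, not ROADNet code: a short-term/long-term average (STA/LTA) trigger, widely used on seismic streams, that fires when a new sample pushes the energy ratio over a threshold. Window lengths, the threshold, and the sample values are arbitrary.

```python
# STA/LTA event trigger over a real-time sample stream. Each update() call
# consumes one sample and reports whether the trigger condition is met.
from collections import deque

class StaLtaTrigger:
    def __init__(self, sta_len=5, lta_len=20, threshold=3.0):
        self.sta = deque(maxlen=sta_len)   # short-term energy window
        self.lta = deque(maxlen=lta_len)   # long-term energy window
        self.threshold = threshold

    def update(self, sample):
        """Feed one stream sample; return True when an event triggers."""
        energy = sample * sample
        self.sta.append(energy)
        self.lta.append(energy)
        if len(self.lta) < self.lta.maxlen:
            return False               # not enough history yet
        ratio = (sum(self.sta) / len(self.sta)) / (sum(self.lta) / len(self.lta))
        return ratio > self.threshold

trigger = StaLtaTrigger()
quiet = [0.1, -0.1] * 15               # background noise
burst = [5.0, -5.0] * 3                # sudden arrival
events = [trigger.update(s) for s in quiet + burst]
```

In the architecture the abstract envisions, a True result would launch the downstream actions it lists: triggering computations, re-tasking sensors or data loggers, and updating catalogs.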

  9. Managing Risk and Uncertainty in Large-Scale University Research Projects

    ERIC Educational Resources Information Center

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  10. Collaboratively Architecting a Scalable and Adaptable Petascale Infrastructure to Support Transdisciplinary Scientific Research for the Australian Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.

    2014-12-01

The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop supercomputer (Raijin), ~20 PB of data storage using Lustre filesystems and a 3000-core high-performance cloud have created a hybrid platform for high-performance computing and data-intensive science to enable large-scale earth and climate systems modelling and analysis. There are > 3000 users actively logging in and > 600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large-scale data collections. NCI makes available major and long-tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology and 6) astronomy, bio and social. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access. Similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allows complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small-scale, 'stove-piped' science efforts to large-scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. 
Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability and publication. There are also human resource challenges as highly skilled HPC/HPD specialists, specialist programmers, and data scientists are required whose skills can support scaling to the new paradigm of effective and efficient data-intensive earth science analytics on petascale, and soon to be exascale systems.
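The provenance, versioning and repeatability that such workflow engines must integrate can be illustrated with a minimal sketch: each step records what ran, on which inputs, and under which code version. The `run_step` helper and the "regrid" step below are hypothetical examples, not part of the NCI software stack.

```python
import hashlib
import json
import time


def run_step(name, func, inputs, code_version):
    """Run one workflow step and return (result, provenance record)."""
    started = time.time()
    result = func(*inputs)
    record = {
        "step": name,
        "code_version": code_version,
        # Hash the serialized inputs so the step is traceable and repeatable.
        "input_digest": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()
        ).hexdigest(),
        "started": started,
        "elapsed_s": time.time() - started,
    }
    return result, record


# Example: a trivial stand-in for a data-processing step.
result, prov = run_step("regrid", lambda xs: [x * 2 for x in xs],
                        ([1, 2, 3],), "v1.4.2")
print(result)        # [2, 4, 6]
print(prov["step"])  # regrid
```

A real engine would persist these records alongside the outputs so that any published product can be traced back to its inputs and code version.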

  11. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in those "laboratory resources" is a large investment in computational, network and information technologies that will be used both to build better, more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. This talk describes the ACME Workflow component's plans to meet these future needs and highlights early implementation examples.

  12. Community-based native seed production for restoration in Brazil - the role of science and policy.

    PubMed

    Schmidt, I B; de Urzedo, D I; Piña-Rodrigues, F C M; Vieira, D L M; de Rezende, G M; Sampaio, A B; Junqueira, R G P

    2018-05-20

Large-scale restoration programmes in the tropics require large volumes of high-quality, genetically diverse and locally adapted seeds from a large number of species. However, scarcity of native seeds is a critical restriction on achieving restoration targets. In this paper, we analyse three successful community-based networks that supply native seeds and seedlings for Brazilian Amazon and Cerrado restoration projects. In addition, we propose directions on local participation and on legal, technical and commercialisation issues for up-scaling the market for native seeds for restoration with high quality and social justice. We argue that effective community-based restoration arrangements should follow some principles: (i) seed production must be based on real market demand; (ii) non-governmental and governmental organisations have a key role in supporting local organisation, legal requirements and selling processes; (iii) local ecological knowledge and labour should be valued, enabling local communities to undertake large-scale seed production; (iv) applied research can help develop appropriate techniques and solve technical issues. The case studies and principles from Brazil presented here can be useful for up-scaling restoration ecology efforts in many other parts of the world, especially in tropical countries where improving rural community income is a strategy for biodiversity conservation and restoration. © 2018 German Society for Plant Sciences and The Royal Botanical Society of the Netherlands.

  13. Citizen Science Data and Scaling

    NASA Astrophysics Data System (ADS)

    Henderson, S.; Wasser, L. A.

    2013-12-01

There is rapid growth in the collection of environmental data by non-experts. So-called 'citizen scientists' are collecting data on plant phenology, precipitation patterns, bird migration and winter feeding, mating calls of frogs in the spring, and numerous other topics and phenomena related to environmental science. These data are generally submitted to online programs (e.g. Project BudBurst, CoCoRaHS, Project FeederWatch, FrogWatch USA) and are freely available to scientists, educators, land managers, and decision makers. While the data are often used to address specific science questions, they also provide the opportunity to explore their utility in the context of ecosystem scaling. Citizen science data are being collected and submitted at an unprecedented rate, and at a spatial and temporal scale previously not possible; the amount vastly exceeds what scientists or land managers can collect on their own. As such, these data provide opportunities to address scaling in the environmental sciences. This presentation will explore data from several citizen science programs in the context of scaling.

  14. Science Support: The Building Blocks of Active Data Curation

    NASA Astrophysics Data System (ADS)

    Guillory, A.

    2013-12-01

While the scientific method is built on reproducibility and transparency, and results are published in peer-reviewed literature, we have entered the digital age of very large datasets (now of the order of petabytes, and soon exabytes) that cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. It also raises awareness in the science community of data and metadata standards and best practice, engendering a culture where data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data-generating project and provides an agreed record of the data management needs and issues within it. The DMP is agreed with project investigators to ensure that a high-quality, documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities and rights associated with the data. Project-specific needs are also identified for data processing, visualization tools and data sharing services.
As part of the National Centre for Atmospheric Science (NCAS) and the National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfils this science support role, helping atmospheric and Earth observation data-generating projects manage their data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and to large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, which depended on CEDA to provide the high-performance storage, transfer capability and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better-documented datasets with a growing collection of metadata associated with the archived data, easier data sharing through the use of standards in formats and metadata, and data citation. These establish high-quality data management, ensuring long-term preservation and enabling reuse by peer scientists, which ultimately leads to faster-paced progress in science.

  15. Conceptual astronomy. II. Replicating conceptual gains, probing attitude changes across three semesters

    NASA Astrophysics Data System (ADS)

    Zeilik, Michael; Schau, Candace; Mattern, Nancy

    1999-10-01

    We report on a long-term, large-scale study of a one-semester, conceptually based, introductory astronomy course with data from more than 400 students over three semesters at the University of New Mexico. Using traditional and alternative assessment tools developed for the project, we examined the pre- and postcourse results for Fall 1994, Spring 1995, and Fall 1995. We find our results are robust: novice students show large, positive gains on assessments of conceptual understanding and connected understanding of the knowledge structure of astronomy. We find no relationship between course achievement and completion of prior courses in science or math; we do find a small to moderate relationship between students' science self-image and course achievement. Also, we detect little change over each semester in students' mildly positive incoming attitudes about astronomy and science.

  16. Earth Science: 49 Science Fair Projects Series.

    ERIC Educational Resources Information Center

    Bonnet, Robert L.; Keen, G. Daniel

    This book offers a large collection of Earth science projects and project ideas for students, teachers, and parents. The projects described are complete but can also be used as spring boards to create expanded projects. Overviews, organizational direction, suggested hypotheses, materials, procedures, and controls are provided. The projects…

  17. Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. Analytics within NEX occurs at several levels - data, workflows, science and knowledge. At the data level, we are focusing on collecting and analyzing any information that is relevant to efficient acquisition, processing and management of data at the smallest granularity, such as files or collections. This includes processing and analyzing all local and many external metadata that are relevant to data quality, size, provenance, usage and other attributes. This then helps us better understand usage patterns and improve efficiency of data handling within NEX. When large-scale workflows are executed on NEX, we capture information that is relevant to processing and that can be analyzed in order to improve efficiencies in job scheduling, resource optimization, or data partitioning that would improve processing throughput. At this point we also collect data provenance as well as basic statistics of intermediate and final products created during the workflow execution. These statistics and metrics form basic process and data QA that, when combined with analytics algorithms, helps us identify issues early in the production process. We have already seen impact in some petabyte-scale projects, such as global Landsat processing, where we were able to reduce processing times from days to hours and enhance process monitoring and QA. While the focus so far has been mostly on support of NEX operations, we are also building a web-based infrastructure that enables users to perform direct analytics on science data - such as climate predictions or satellite data. 
Finally, as one of the main goals of NEX is knowledge acquisition and sharing, we began gathering and organizing information that associates users and projects with data, publications, locations and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.
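The basic statistics and QA checks on intermediate products described above can be sketched in a few lines. The function names, fill value and thresholds below are illustrative assumptions, not the actual NEX implementation.

```python
import math


def qa_stats(values, fill_value=-9999.0):
    """Basic QA summary for one intermediate product (e.g. a raster band)."""
    valid = [v for v in values if v != fill_value and not math.isnan(v)]
    n = len(values)
    return {
        "n": n,
        "valid_fraction": len(valid) / n if n else 0.0,
        "min": min(valid) if valid else None,
        "max": max(valid) if valid else None,
        "mean": sum(valid) / len(valid) if valid else None,
    }


def flag_anomalies(stats, expected_range, min_valid=0.5):
    """Flag a product whose QA summary falls outside expectations."""
    flags = []
    if stats["valid_fraction"] < min_valid:
        flags.append("too many fill/NaN pixels")
    lo, hi = expected_range
    if stats["min"] is not None and (stats["min"] < lo or stats["max"] > hi):
        flags.append("values outside expected physical range")
    return flags


# A reflectance-like band with one fill pixel and one NaN.
stats = qa_stats([0.1, 0.4, -9999.0, 0.9, float("nan")])
print(stats["valid_fraction"])                           # 0.6
print(flag_anomalies(stats, expected_range=(0.0, 1.0)))  # []
```

Running such checks as each workflow stage completes is what allows production issues to be caught early rather than after a multi-day run.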

  18. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  19. IN13B-1660: Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Technical Reports Server (NTRS)

    Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris

    2016-01-01

We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines covering both the production process and the data products, and to share results with the community. Our project is being developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.

  20. Analytics and Visualization Pipelines for Big ­Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.

    2016-12-01

We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines covering both the production process and the data products, and to share results with the community. Our project is being developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.

  1. Taking Stock: Existing Resources for Assessing a New Vision of Science Learning

    ERIC Educational Resources Information Center

    Alonzo, Alicia C.; Ke, Li

    2016-01-01

    A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E

In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.

  3. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    NASA Technical Reports Server (NTRS)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

Destiny is a simple, direct, low-cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65 m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize ~3000 SN Ia events over the redshift interval 0.4 < z < 1.7; it will then survey ~1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  4. Sample Identification at Scale - Implementing IGSN in a Research Agency

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Golodoniuc, P.; Wyborn, L. A.; Devaraju, A.; Fraser, R.

    2015-12-01

Earth sciences are largely observational and rely on natural samples, the types of which vary significantly between science disciplines. Sharing and referencing samples in the scientific literature and across the Web requires globally unique identifiers, which are essential for disambiguation. This practice is very common in other fields (e.g. ISBNs in publishing, DOIs in the scientific literature); in the Earth sciences, however, it is still often done in an ad-hoc manner without unique identifiers. The International Geo Sample Number (IGSN) system provides a persistent, globally unique label for identifying environmental samples. As an IGSN allocating agency, CSIRO implements the IGSN registration service at the organisational scale with contributions from multiple research groups. The Capricorn Distal Footprints project is one of the first adopters of the technology in Australia. For this project, IGSN provides a mechanism for identifying new and legacy samples as well as derived sub-samples. It will ensure transparency and reproducibility in geochemical sampling campaigns that involve a diversity of sampling methods: diverse geochemical and isotopic results can be linked back to the parent sample, particularly where multiple children of that sample have also been analysed. The IGSN integration for this project is still at an early stage and requires further consultation on the governance mechanisms, such as naming conventions and service interfaces, needed for efficient collaboration between CSIRO and its partners on the project. In this work, we present the results of the initial implementation of IGSN in the context of the Capricorn Distal Footprints project. The study has so far demonstrated the effectiveness of the proposed approach while maintaining the flexibility to adapt to various media types, which is critical in a multi-disciplinary project.
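The parent/sub-sample linkage that IGSN enables can be sketched with a toy local registry. Real IGSNs are minted through an allocating agency's registration service; the `CSXYZ` prefix and sequential numbering scheme below are purely hypothetical.

```python
import itertools


class SampleRegistry:
    """Toy registry illustrating parent/sub-sample linkage.

    Real IGSNs are assigned by an allocating agency under an agreed
    naming convention; this prefix and counter are illustrative only.
    """

    def __init__(self, namespace):
        self.namespace = namespace
        self._counter = itertools.count(1)
        self.parent_of = {}

    def register(self, parent=None):
        """Mint an identifier, optionally recording its parent sample."""
        sample_id = f"{self.namespace}{next(self._counter):06d}"
        self.parent_of[sample_id] = parent
        return sample_id

    def lineage(self, sample_id):
        """Walk child -> parent -> ... back to the original field sample."""
        chain = [sample_id]
        while self.parent_of.get(chain[-1]):
            chain.append(self.parent_of[chain[-1]])
        return chain


reg = SampleRegistry("CSXYZ")
core = reg.register()              # original field sample
split = reg.register(parent=core)  # sub-sample sent for geochemistry
print(core)                # CSXYZ000001
print(reg.lineage(split))  # ['CSXYZ000002', 'CSXYZ000001']
```

The `lineage` walk is what lets geochemical and isotopic results on a sub-sample be traced back to the parent sample, as the abstract describes.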

  5. The Development, Field Test and Validation of Scales to Assess Teachers' Attitudes Toward Teaching Elementary School Science.

    ERIC Educational Resources Information Center

    Moore, Richard W.

    The project described in this report is an attempt to develop scales to assess teachers' attitudes toward teaching elementary school science. The instrument produced, Science Teaching Attitude Scales, consists of six scales, each of which has a statement of the attitude to be assessed and five statements to determine the extent to which the…

  6. The Projects for Onboard Autonomy (PROBA2) Science Centre: Sun Watcher Using APS Detectors and Image Processing (SWAP) and Large-Yield Radiometer (LYRA) Science Operations and Data Products

    NASA Astrophysics Data System (ADS)

    Zender, J.; Berghmans, D.; Bloomfield, D. S.; Cabanas Parada, C.; Dammasch, I.; De Groof, A.; D'Huys, E.; Dominique, M.; Gallagher, P.; Giordanengo, B.; Higgins, P. A.; Hochedez, J.-F.; Yalim, M. S.; Nicula, B.; Pylyser, E.; Sanchez-Duarte, L.; Schwehm, G.; Seaton, D. B.; Stanger, A.; Stegen, K.; Willems, S.

    2013-08-01

    The PROBA2 Science Centre (P2SC) is a small-scale science operations centre supporting the Sun observation instruments onboard PROBA2: the EUV imager Sun Watcher using APS detectors and image Processing (SWAP) and Large-Yield Radiometer (LYRA). PROBA2 is one of ESA's small, low-cost Projects for Onboard Autonomy (PROBA) and part of ESA's In-Orbit Technology Demonstration Programme. The P2SC is hosted at the Royal Observatory of Belgium, co-located with both Principal Investigator teams. The P2SC tasks cover science planning, instrument commanding, instrument monitoring, data processing, support of outreach activities, and distribution of science data products. PROBA missions aim for a high degree of autonomy at mission and system level, including the science operations centre. The autonomy and flexibility of the P2SC is reached by a set of web-based interfaces allowing the operators as well as the instrument teams to monitor quasi-continuously the status of the operations, allowing a quick reaction to solar events. In addition, several new concepts are implemented at instrument, spacecraft, and ground-segment levels allowing a high degree of flexibility in the operations of the instruments. This article explains the key concepts of the P2SC, emphasising the automation and the flexibility achieved in the commanding as well as the data-processing chain.

  7. NASA/Drexel program. [research effort in large-scale technical programs management for application to urban problems

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The results are reported of the NASA/Drexel research effort which was conducted in two separate phases. The initial phase stressed exploration of the problem from the point of view of three primary research areas and the building of a multidisciplinary team. The final phase consisted of a clinical demonstration program in which the research associates consulted with the County Executive of New Castle County, Delaware, to aid in solving actual problems confronting the County Government. The three primary research areas of the initial phase are identified as technology, management science, and behavioral science. Five specific projects which made up the research effort are treated separately. A final section contains the conclusions drawn from total research effort as well as from the specific projects.

  8. Citizen science on a smartphone: Participants' motivations and learning.

    PubMed

    Land-Zandstra, Anne M; Devilee, Jeroen L A; Snik, Frans; Buurmeijer, Franka; van den Broek, Jos M

    2016-01-01

Citizen science provides researchers with the means to gather or analyse large datasets. At the same time, citizen science projects offer an opportunity for non-scientists to be part of, and learn from, the scientific process. In the Dutch iSPEX project, a large number of citizens turned their smartphones into actual measurement devices to measure aerosols. This study examined participants' motivations and the perceived learning impacts of this unique project. Most respondents joined iSPEX because they wanted to contribute to the scientific goals of the project or because they were interested in the project topics (the health and environmental impact of aerosols). In terms of learning impact, respondents reported a gain in knowledge about citizen science and the topics of the project. However, many respondents had an incomplete understanding of the science behind the project, possibly caused by the complexity of the measurements. © The Author(s) 2015.

  9. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within three consecutive days, inducing very serious landslides and sediment-related disasters. A satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas in southern Taiwan were characterised by disaster type, scale, topography, major bedrock formations and geologic structures during this period of extremely heavy rainfall. Characteristics and mechanisms of large-scale landslides were compiled through field investigation integrated with GPS/GIS/RS techniques. To decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be established and put into operation as soon as possible. Establishing critical rainfall values for predicting large-scale landslides induced by heavy rainfall has therefore become an important issue of serious concern to the government and the people of Taiwan.
This research addresses the mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the extreme climate conditions of the past 10 years. The results are intended to serve as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large-scale landslides, critical rainfall value
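A critical rainfall value of the kind proposed is commonly expressed as an intensity-duration threshold of the power-law form I = a·D^(-b). The sketch below uses Caine's (1980) global coefficients purely for illustration; an operational system for Taiwan would fit a and b to local rain-gauge and landslide records, as this study proposes.

```python
def critical_intensity(duration_h, a=14.82, b=0.39):
    """Caine-style intensity-duration threshold I = a * D**-b, in mm/h.

    Default coefficients are Caine's (1980) global values, used here
    only as an illustration, not as a threshold fitted to Taiwan.
    """
    return a * duration_h ** (-b)


def exceeds_threshold(total_rain_mm, duration_h, a=14.82, b=0.39):
    """True if the event's mean intensity exceeds the threshold."""
    mean_intensity = total_rain_mm / duration_h
    return mean_intensity > critical_intensity(duration_h, a, b)


# Typhoon Morakot, 2009: ~2,900 mm in ~72 h, i.e. a mean of ~40 mm/h,
# far above the global threshold for that duration (~2.8 mm/h).
print(round(critical_intensity(72.0), 2))
print(exceeds_threshold(2900.0, 72.0))  # True
```

Fitting local coefficients shifts the curve up or down, but the warning logic stays the same: compare observed intensity at a given duration against the fitted threshold.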

  10. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    NASA Astrophysics Data System (ADS)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle II at the KEK collider, the genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network-based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address, to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science.
All of these areas must be addressed to enable large-scale, widely distributed data analysis systems, and the experience of the LHC can be applied to other scientific disciplines. In particular, specific analogies to the SKA will be cited in the talk.
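Why "error-free transmission" matters on these paths can be seen from the well-known Mathis et al. (1997) model, in which single-stream TCP throughput falls off as 1/sqrt(loss rate) and with growing round-trip time. The path parameters below (jumbo frames, a 100 ms trans-Atlantic RTT) are illustrative.

```python
import math


def mathis_throughput_gbps(mss_bytes, rtt_s, loss_rate):
    """Upper bound on single-stream TCP throughput (Mathis et al., 1997):
    rate ~ (MSS / RTT) * (C / sqrt(p)), with C ~ 1.22 for periodic loss.
    """
    rate_bps = (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss_rate))
    return rate_bps / 1e9


# Illustrative trans-Atlantic path: 9000-byte segments, 100 ms RTT.
clean = mathis_throughput_gbps(9000, 0.100, 1e-8)  # near-error-free path
lossy = mathis_throughput_gbps(9000, 0.100, 1e-4)  # 1-in-10,000 packet loss
print(round(clean, 1))  # ~8.8 Gb/s per stream
print(round(lossy, 3))  # ~0.088 Gb/s
```

A hundred-fold increase in loss rate costs a factor of ten in throughput, which is why long-RTT, high-rate science transfers demand essentially loss-free networks and the continuous monitoring listed as item 4 above.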

  11. Reviews Book: At Home: A Short History of Private Life Book: The Story of Mathematics Book: Time Travel: A Writer's Guide to the Real Science of Plausible Time Travel Equipment: Rotational Inertial Wands DVD: Planets Book: The Fallacy of Fine-Tuning Equipment: Scale with Dial Equipment: Infrared Thermometers Book: 300 Science and History Projects Book: The Nature of Light and Colour in the Open Air Equipment: Red Tide Spectrometer Web Watch

    NASA Astrophysics Data System (ADS)

    2011-09-01

WE RECOMMEND The Story of Mathematics Book shows the link between maths and physics Time Travel: A Writer's Guide to the Real Science of Plausible Time Travel Book explains how to write good time-travelling science fiction Rotational Inertial Wands Wands can help explore the theory of inertia Infrared Thermometers Kit measures temperature differences Red Tide Spectrometer Spectrometer gives colour spectra WORTH A LOOK At Home: A Short History of Private Life Bryson explores the history of home life The Fallacy of Fine-Tuning Book wades into the science/religion debate Scale with Dial Cheap scales can be turned into Newton measuring scales 300 Science and History Projects Fun science projects for kids to enjoy The Nature of Light and Colour in the Open Air Text looks at fascinating optical effects HANDLE WITH CARE Planets DVD takes a trip through the solar system WEB WATCH Websites offer representations of nuclear chain reactions

  12. Extremely Large Telescope Project Selected in ESFRI Roadmap

    NASA Astrophysics Data System (ADS)

    2006-10-01

In its first Roadmap, the European Strategy Forum on Research Infrastructures (ESFRI) chose the European Extremely Large Telescope (ELT), for which ESO is presently developing a Reference Design, as one of the large-scale projects to be conducted in astronomy, and the only one in optical astronomy. The aim of the ELT project is to build, before the end of the next decade, an optical/near-infrared telescope with a diameter in the 30-60 m range. The ESFRI Roadmap states: "Extremely Large Telescopes are seen world-wide as one of the highest priorities in ground-based astronomy. They will vastly advance astrophysical knowledge allowing detailed studies of inter alia planets around other stars, the first objects in the Universe, super-massive Black Holes, and the nature and distribution of the Dark Matter and Dark Energy which dominate the Universe. The European Extremely Large Telescope project will maintain and reinforce Europe's position at the forefront of astrophysical research." Said Catherine Cesarsky, Director General of ESO: "In 2004, the ESO Council mandated ESO to play a leading role in the development of an ELT for Europe's astronomers. To that end, ESO has undertaken conceptual studies for ELTs and is currently also leading a consortium of European institutes engaged in studying enabling technologies for such a telescope. The inclusion of the ELT in the ESFRI roadmap, together with the comprehensive preparatory work already done, paves the way for the next phase of this exciting project, the design phase." ESO is currently working, in close collaboration with the European astronomical community and industry, on a baseline design for an Extremely Large Telescope. The plan is a telescope with a primary mirror between 30 and 60 metres in diameter and a financial envelope of about 750 million euros.
It aims at more than a factor-of-ten improvement in overall performance compared to the current leader in ground-based astronomy: the ESO Very Large Telescope at the Paranal Observatory. The draft Baseline Reference Design will be presented to the wider scientific community on 29-30 November 2006 at a dedicated ELT Workshop in Marseille (France) and will be further refined. The design is then to be presented to the ESO Council at the end of 2006. The goal is to start the detailed E-ELT design work in the first half of 2007. Launched in April 2002, the European Strategy Forum on Research Infrastructures was set up following a recommendation of the European Union Council, with the role of supporting a coherent approach to policy-making on research infrastructures in Europe and of acting as an incubator for international negotiations about concrete initiatives. In particular, ESFRI has prepared a European Roadmap identifying new Research Infrastructures of pan-European interest corresponding to the long-term needs of the European research communities, covering all scientific areas, regardless of possible location, and likely to be realised in the next 10 to 20 years. The Roadmap was presented on 19 October. It is the result of an intensive two-year consultation and peer-review process involving over 1000 high-level European and international experts. The Roadmap identifies 35 large-scale infrastructure projects, at various stages of development, in seven key research areas: Environmental Sciences; Energy; Materials Sciences; Astrophysics, Astronomy, Particle and Nuclear Physics; Biomedical and Life Sciences; Social Sciences and the Humanities; and Computation and Data Treatment.

  13. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields whose potential discoveries depend on their ability to distribute, process and analyze globally distributed petascale to exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams focuses on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in previous and recently completed projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goal of supporting the network needs of the LHC and other science programs with similar requirements, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.

  14. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences are entering an era of big data driven by breakthroughs in science and technology, and ever more big data-related projects and activities are being carried out around the world. Life sciences data generated by new technologies continue to grow rapidly, not only in size but also in variety and complexity. For big data to have a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and its heterogeneous, complex variety are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web that aims to provide information that not only humans but also computers can process semantically at large scale. This paper presents a survey of big data in the life sciences, big data-related projects and Semantic Web technologies. It introduces the main Semantic Web technologies and their current state, provides a detailed analysis of how they address the heterogeneous variety of life sciences big data, and thereby helps to clarify the role of Semantic Web technologies in the big data era and how they offer a promising solution for big data in the life sciences.
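    The kind of cross-source integration the survey describes can be sketched with a toy RDF-style triple store. This is an illustrative sketch only: the predicates, identifiers and the in-memory store below are hypothetical, and a real Semantic Web stack would use RDF serializations and SPARQL (e.g. via rdflib) rather than hand-rolled pattern matching.

```python
# Toy triple store illustrating RDF-style integration of two
# heterogeneous life-science sources via a shared identifier.
# All predicates and IDs here are hypothetical examples.

triples = set()

def add(s, p, o):
    triples.add((s, p, o))

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Source 1: a gene-annotation dataset
add("gene:BRCA1", "rdf:type", "bio:Gene")
add("gene:BRCA1", "bio:located_on", "chr:17")

# Source 2: a disease-association dataset using the same URI scheme
add("gene:BRCA1", "bio:associated_with", "disease:breast_cancer")

# A join across both sources: diseases associated with genes on chr 17
genes_on_17 = {s for s, _, _ in query(p="bio:located_on", o="chr:17")}
diseases = [o for s, p, o in query(p="bio:associated_with")
            if s in genes_on_17]
print(diseases)  # ['disease:breast_cancer']
```

    The final query joins the two "sources" through the shared subject URI; that shared, globally unique identifier is the essential mechanism by which Semantic Web technologies connect heterogeneous life-science datasets.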

  15. The Next Level in Automated Solar Flare Forecasting: the EU FLARECAST Project

    NASA Astrophysics Data System (ADS)

    Georgoulis, M. K.; Bloomfield, D.; Piana, M.; Massone, A. M.; Gallagher, P.; Vilmer, N.; Pariat, E.; Buchlin, E.; Baudin, F.; Csillaghy, A.; Soldati, M.; Sathiapal, H.; Jackson, D.; Alingery, P.; Argoudelis, V.; Benvenuto, F.; Campi, C.; Florios, K.; Gontikakis, C.; Guennou, C.; Guerra, J. A.; Kontogiannis, I.; Latorre, V.; Murray, S.; Park, S. H.; Perasso, A.; Sciacchitano, F.; von Stachelski, S.; Torbica, A.; Vischi, D.

    2017-12-01

    We give an informative description of the Flare Likelihood And Region Eruption Forecasting (FLARECAST) project, the European Commission's first large-scale investment in exploring the limits of reliability and accuracy achievable in forecasting major solar flares. We outline the consortium, top-level objectives and first results of the project, highlighting the diversity and fusion of expertise needed to deliver what was promised. The project's final product, an openly accessible, fully modular and freely downloadable flare-forecasting facility, will be delivered in early 2018. The project's three objectives, namely science, research-to-operations (R2O) and dissemination/communication, are also discussed. In terms of science, we encapsulate our close-to-final assessment of how close (or far) we are from practically exploitable solar flare forecasting. In terms of R2O, we briefly describe the architecture of the FLARECAST infrastructure, which includes rigorous validation of each forecasting step. Of the project's three communication levers, we finally focus on lessons learned from the two-way interaction with the community of stakeholders and governmental organizations. The FLARECAST project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 640216.

  16. Weak lensing by galaxy troughs in DES Science Verification data

    DOE PAGES

    Gruen, D.; Friedrich, O.; Amara, A.; ...

    2015-11-29

    In this study, we measure the weak lensing shear around galaxy troughs, i.e. the radial alignment of background galaxies relative to underdensities in projections of the foreground galaxy field over a wide range of redshift in Science Verification data from the Dark Energy Survey. Our detection of the shear signal is highly significant (10σ-15σ for the smallest angular scales) for troughs with the redshift range z ∈ [0.2, 0.5] of the projected galaxy field and angular diameters of 10 arcmin to 1°. These measurements probe the connection between the galaxy, matter density, and convergence fields. By assuming galaxies are biased tracers of the matter density with Poissonian noise, we find agreement of our measurements with predictions in a fiducial Λ cold dark matter model. The prediction for the lensing signal on large trough scales is virtually independent of the details of the underlying model for the connection of galaxies and matter. Our comparison of the shear around troughs with that around cylinders with large galaxy counts is consistent with a symmetry between galaxy and matter over- and underdensities. In addition, we measure the two-point angular correlation of troughs with galaxies which, in contrast to the lensing signal, is sensitive to galaxy bias on all scales. The lensing signal of troughs and their clustering with galaxies is therefore a promising probe of the statistical properties of matter underdensities and their connection to the galaxy field.

  17. Project BudBurst: Continental-scale citizen science for all seasons

    NASA Astrophysics Data System (ADS)

    Henderson, S.; Newman, S. J.; Ward, D.; Havens-Young, K.; Alaback, P.; Meymaris, K.

    2011-12-01

    Project BudBurst's (budburst.org) recent move to the National Ecological Observatory Network (NEON) has benefitted both programs. NEON has been able to use Project BudBurst as a testbed to learn best practices, network with experts in the field, and prototype potential tools for engaging people in continental-scale ecology as NEON develops its citizen science program. Participation in Project BudBurst has grown significantly since the move to NEON. Project BudBurst is a national citizen science initiative designed to engage the public in observations of phenological (plant life cycle) events, raise awareness of climate change, and create a cadre of informed citizen scientists. Citizen science programs such as Project BudBurst provide the opportunity for students and interested laypersons to actively participate in scientific research. Such programs are important not only from an educational perspective, but also because they enable scientists to broaden the geographic and temporal scale of their observations. The goals of Project BudBurst are to 1) increase awareness of phenology as an area of scientific study; 2) increase awareness of the impacts of changing climates on plants at a continental scale; and 3) increase science literacy by engaging participants in the scientific process. Since its launch in February 2008, this online educational and data-entry program has engaged participants of all ages and walks of life in recording the timing of the leafing and flowering of wild and cultivated species found across the continent. Thus far, thousands of participants from all 50 states have submitted data. This presentation will provide an overview of Project BudBurst, report on the results of the 2010 field campaign, and discuss plans to expand Project BudBurst in 2012, including the use of mobile phone applications for data collection and reporting from the field. 
Project BudBurst is co-managed by the National Ecological Observatory Network and the Chicago Botanic Garden.

  18. Scale in Education Research: Towards a Multi-Scale Methodology

    ERIC Educational Resources Information Center

    Noyes, Andrew

    2013-01-01

    This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…

  19. Rasch Analysis of Scientific Literacy in an Astronomical Citizen Science Project

    NASA Astrophysics Data System (ADS)

    Price, A.

    2012-06-01

    (Abstract only) We investigate change in attitudes towards science and belief in the nature of science by participants in a citizen science project about astronomy. A pre-test was given to 1,385 participants and a post-test was given six months later to 165 participants. Nine participants were interviewed. Responses were analyzed using the Rasch Rating Scale Model to place Likert data on an interval scale, allowing for more sensitive parametric analysis. Results show that overall attitudes did not change, p = .225. However, there was significant change in attitudes relating to science news (positive) and scientific self-efficacy (negative), p = .001 and p = .035, respectively. This change was related to social activity in the project. Beliefs in the nature of science exhibited a small but significant increase, p = .04. Relative positioning of scores on the belief items suggests the increase is mostly due to reinforcement of current beliefs.

  20. Rasch Analysis of Scientific Literacy in an Astronomical Citizen Science Project

    NASA Astrophysics Data System (ADS)

    Price, Aaron

    2011-05-01

    We investigate change in attitudes towards science and belief in the nature of science by participants in a citizen science project about astronomy. A pre-test was given to 1,385 participants and a post-test was given six months later to 165 participants. Nine participants were interviewed. Responses were analyzed using the Rasch Rating Scale Model to place Likert data on an interval scale, allowing for more sensitive parametric analysis. Results show that overall attitudes did not change, p = .225. However, there was significant change in attitudes relating to science news (positive) and scientific self-efficacy (negative), p < .001 and p = .035, respectively. This change was related to social activity in the project. Beliefs in the nature of science exhibited a small but significant increase, p = .04. Relative positioning of scores on the belief items suggests the increase is mostly due to reinforcement of current beliefs.
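    The Rasch step mentioned in this record, placing Likert data on an interval scale, rests on the logit transform. The sketch below shows only that transform, with made-up item endorsement rates; it is not the full Rasch Rating Scale Model used in the study, which jointly estimates person and item parameters.

```python
import math

# Crude illustration of placing item endorsement on a logit (interval)
# scale, the idea motivating the Rasch analysis used in the study.
# NOT the full Rasch Rating Scale Model. Data below are hypothetical.

# Fraction of respondents endorsing each attitude item
endorsement = {"item_news": 0.80, "item_self_efficacy": 0.35}

def logit(p):
    """Map a proportion in (0, 1) onto an unbounded interval scale."""
    return math.log(p / (1.0 - p))

measures = {item: logit(p) for item, p in endorsement.items()}
for item, m in sorted(measures.items()):
    print(f"{item}: {m:+.2f} logits")
```

    Unlike raw Likert sums, differences on the logit scale are comparable across the whole range, which is what licenses the parametric pre/post comparisons the abstract describes.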

  1. Changing Permafrost in the Arctic and its Global Effects in the 21st Century (PAGE21): A very large international and integrated project to measure the impact of permafrost degradation on the climate system

    NASA Astrophysics Data System (ADS)

    Lantuit, Hugues; Boike, Julia; Dahms, Melanie; Hubberten, Hans-Wolfgang

    2013-04-01

    The northern permafrost region contains approximately 50% of the estimated global below-ground organic carbon pool and more than twice as much as is contained in the current atmospheric carbon pool. The sheer size of this carbon pool, together with the large amplitude of predicted arctic climate change, implies that there is a high potential for global-scale feedbacks from arctic climate change if these carbon reservoirs are destabilized. Nonetheless, significant gaps exist in our current state of knowledge that prevent us from producing accurate assessments of the vulnerability of the arctic permafrost to climate change, or of the implications of future climate change for global greenhouse gas (GHG) emissions. Specifically: • Our understanding of the physical and biogeochemical processes at play in permafrost areas is still insufficient in some key aspects. • Size estimates for the high latitude continental carbon and nitrogen stocks vary widely between regions and research groups. • The representation of permafrost-related processes in global climate models still tends to be rudimentary, and is one reason for the frequently poor performances of climate models at high latitudes. 
The key objectives of PAGE21 are: • to improve our understanding of the processes affecting the size of the arctic permafrost carbon and nitrogen pools through detailed field studies and monitoring, in order to quantify their size and their vulnerability to climate change, • to produce, assemble and assess high-quality datasets in order to develop and evaluate representations of permafrost and related processes in global models, • to improve these models accordingly, • to use these models to reduce the uncertainties in feedbacks from arctic permafrost to global change, thereby providing the means to assess the feasibility of stabilization scenarios, and • to ensure widespread dissemination of our results in order to provide direct input into the ongoing debate on climate-change mitigation. The concept of PAGE21 is to directly address these questions through a close interaction between monitoring activities, process studies and modeling on the pertinent temporal and spatial scales. Field sites have been selected to cover a wide range of environmental conditions for the validation of large-scale models, the development of permafrost monitoring capabilities, the study of permafrost processes, and for overlap with existing monitoring programs. PAGE21 will contribute to upgrading the project sites with the objective of providing a measurement baseline, both for process studies and for modeling programs. PAGE21 is determined to break down the traditional barriers in permafrost sciences between observational and model-supported site studies and large-scale climate modeling. Our concept for the interaction between site-scale studies and large-scale modeling is to establish and maintain a direct link between these two areas for developing and evaluating, on all spatial scales, the land-surface modules of leading European global climate models taking part in the Coupled Model Intercomparison Project Phase 5 (CMIP5), designed to inform the IPCC process. 
The timing of this project is such that the main scientific results from PAGE21, and in particular the model-based assessments, will build entirely on new outputs and results from the CMIP5 Climate Model Intercomparison Project designed to inform the IPCC Fifth Assessment Report. However, PAGE21 is designed to leave a legacy that will endure beyond the lifetime of the projections that it produces. This legacy will comprise • an improved understanding of the key processes and parameters that determine the vulnerability of arctic permafrost to climate change, • the production of a suite of major European coupled climate models including detailed and validated representations of permafrost-related processes, which will reduce uncertainties in future climate projections produced well beyond the lifetime of PAGE21, and • the training of a new generation of permafrost scientists who will bridge the long-standing gap between permafrost field science and global climate modeling, for the long-term benefit of science and society.

  2. Changes in Participants’ Scientific Attitudes and Epistemological Beliefs During an Astronomical Citizen Science Project

    NASA Astrophysics Data System (ADS)

    Price, Aaron

    2012-01-01

    Citizen science projects offer opportunities for non-scientists to take part in scientific research. While their contribution to scientific data collection has been well documented, there is limited research on changes that may occur in their volunteer participants. In this study, we investigated (1) how volunteers' attitudes towards science and beliefs in the nature of science changed over six months of participation in an astronomy-themed citizen science project and (2) how the level of project participation accounted for these changes. To measure attitudes towards science and beliefs about the nature of science, identical pre- and post-tests were used. We used pre-test data from 1,375 participants and post-test data collected from 175 participants. Responses were analyzed using the Rasch Rating Scale Model. The pre-test sample was used to create the Rasch scales for the two scientific literacy measures. For the pre/post-test comparisons, data from those who completed both tests were used. Fourteen participants who took the pre/post-tests were interviewed. Results show that overall scientific attitudes did not change, p = .812. However, we did find significant changes in two scientific attitude items about science in the news (positive change; p < .001 and p < .05) and one item related to scientific self-efficacy (negative change, p < .05). These changes were related to the participants' social activity in the project. Beliefs in the nature of science significantly increased between the pre- and post-tests, p = .014. Relative positioning of individual items on the belief scale did not change much, and this change was not related to any of our recorded project activity variables. The interviews suggest that the social aspect of the project is important to participants and that the change in self-efficacy reflects not a lowering of esteem but a greater appreciation for what they have yet to learn.

  3. Global Collaborations - Prospects and Problems

    NASA Astrophysics Data System (ADS)

    Corbett, Ian

    2005-04-01

    International collaboration has long been a feature of science. Collaborative investments in joint facilities and projects have grown considerably over the past 20-40 years, and many projects have been multinational from the start. This has been particularly true in Europe, where intergovernmental organizations such as CERN, ESA, and ESO have enabled European countries to carry out forefront science with state-of-the-art facilities which would have been beyond the capabilities of any one country. A brief survey of these organizations, their structure, and the possible reasons behind their success is given. The transition from regional to global creates new problems. Global-scale projects face a range of generic issues which must be addressed and overcome if the project is to be a success. Each project has its own specific boundary conditions and each adopts an approach best fitted to its own objectives and constraints. Experience with billion-dollar projects such as the SSC, LHC, and ITER shows the key problem areas and demonstrates the importance of preparatory work in the early stages to settle issues such as schedule, funding, location, legal and managerial structure, and oversight. A range of current and proposed intercontinental or global projects, so-called "Megascience Projects", is reviewed. Such projects, originally a feature of space and particle physics, are now becoming more common, and very large projects in astronomy, for example ALMA and 50-100 m telescopes, and other areas of physics now fall into the 'global' category. These projects are on such a large scale, from any scientific, managerial, financial or political perspective, and have such global importance, that they have necessarily been conceived as international from the outset. Increasing financial pressures on governments and funding agencies in the developed countries place additional demands on project planning. 
The contrasting approaches, problems faced, and progress made in various projects will be analyzed and possible lessons drawn out. The role which can be played in the early stages by bodies such as the OECD Global Science Forum and G-8 Carnegie Meetings, where science policy makers meet, is examined. Experience shows that these valuable 'scene-setting' discussions have to be informed by coordinated input from the scientific community and must be followed up by more detailed discussions between funding agencies or their equivalents, because decision making requires the development of a consensus amongst the participants. This process can be illustrated most effectively by the care with which the ideas for the International Linear Collider have been and are being developed. Agreement on building and operating a facility is not the end of the story. The legitimate desire of scientists in all other countries to be able to participate in exploiting a major new facility has to be taken into account, and that introduces a range of proprietary and sociological issues over data access and rights and, now, with the explosion in computing and storage power, in data archiving support. These are issues which can be addressed within the scientific community and taken to the political arena via such bodies as the OECD Global Science Forum.

  4. ML-o-Scope: A Diagnostic Visualization System for Deep Machine Learning Pipelines

    DTIC Science & Technology

    2014-05-16

    ML-o-scope: a diagnostic visualization system for deep machine learning pipelines. Daniel Bruckner, Electrical Engineering and Computer Sciences...the system as a support for tuning large-scale object-classification pipelines. 1 Introduction A new generation of pipelined machine learning models

  5. Citizen Science and Lifelong Learning

    ERIC Educational Resources Information Center

    Edwards, Richard

    2014-01-01

    Citizen science projects have grown in number, scale and scope in recent years. Such projects engage members of the public in working with professional scientists in a diverse range of practices. Yet there has been little educational exploration of such projects to date. In particular, there has been limited exploration of the educational…

  6. Investigation of multilayer domains in large-scale CVD monolayer graphene by optical imaging

    NASA Astrophysics Data System (ADS)

    Yu, Yuanfang; Li, Zhenzhen; Wang, Wenhui; Guo, Xitao; Jiang, Jie; Nan, Haiyan; Ni, Zhenhua

    2017-03-01

    CVD graphene is a promising candidate for optoelectronic applications due to its high quality and high yield. However, multilayer domains inevitably form at nucleation centers during growth. Here, we propose an optical imaging technique to precisely identify the multilayer domains and the ratio of their coverage in large-scale CVD monolayer graphene. We have also shown that the stacking disorder in twisted bilayer graphene, as well as impurities on the graphene surface, can be distinguished by optical imaging. Finally, we investigated the effects of bilayer domains on the optical and electrical properties of CVD graphene, and found that the carrier mobility of CVD graphene is seriously limited by scattering from bilayer domains. Our results could be useful for guiding future optoelectronic applications of large-scale CVD graphene. Project supported by the National Natural Science Foundation of China (Nos. 61422503, 61376104), the Open Research Funds of Key Laboratory of MEMS of Ministry of Education (SEU, China), and the Fundamental Research Funds for the Central Universities.

  7. Ethical considerations of research policy for personal genome analysis: the approach of the Genome Science Project in Japan.

    PubMed

    Minari, Jusaku; Shirai, Tetsuya; Kato, Kazuto

    2014-12-01

    As evidenced by high-throughput sequencers, genomic technologies have recently undergone radical advances. These technologies enable comprehensive sequencing of personal genomes considerably more efficiently and less expensively than heretofore. These developments present a challenge to the conventional framework of biomedical ethics; under these changing circumstances, each research project has to develop a pragmatic research policy. Based on experience with a new large-scale project, the Genome Science Project, this article presents a novel approach to developing a specific policy for personal genome research in the Japanese context. In creating an original informed-consent form template for the project, we used a two-tiered process: drafting the template following an analysis of national and international policies, then refining the draft in conjunction with genome project researchers for practical application. Through practical use of the template, we have gained valuable experience in addressing challenges in the ethical review process, such as the importance of sharing details of the latest developments in genomics with members of research ethics committees. We discuss certain limitations of the conventional concept of informed consent and its governance system, and suggest the potential of an alternative process using information technology.

  8. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    NASA Astrophysics Data System (ADS)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2) using the physically based land surface model Soil Water - Atmosphere - Plants (SWAP), developed in the Institute of Water Problems of the Russian Academy of Sciences, and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First, SWAP was calibrated and validated against monthly values of measured river runoff using forcing data from the WATCH data set, and all GCM projections were bias-corrected to WATCH. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate the uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of the projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
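    The bias-correction step mentioned in this record can be illustrated with a minimal empirical quantile-mapping sketch, a common approach in which each GCM value is replaced by the observed value at the same quantile. The numbers below are made up, and the abstract does not specify the method used (ISI-MIP employed a trend-preserving variant), so treat this only as the basic idea.

```python
# Minimal empirical quantile-mapping sketch: map each GCM value to the
# observed (e.g. WATCH) value at the same quantile. Illustrative only;
# the sample values below are made up.

def quantile_map(gcm_hist, obs_hist, value):
    """Bias-correct `value` by matching its quantile in the GCM
    historical sample to the same quantile of the observations."""
    g = sorted(gcm_hist)
    o = sorted(obs_hist)
    # empirical quantile of `value` within the GCM sample
    rank = sum(1 for x in g if x <= value)
    q = rank / len(g)
    # corresponding index in the observed sample
    idx = min(int(q * len(o)), len(o) - 1)
    return o[idx]

gcm_hist = [2.0, 3.0, 4.0, 5.0, 6.0]   # hypothetical GCM runs (too wet)
obs_hist = [1.0, 1.5, 2.0, 2.5, 3.0]   # hypothetical observations

corrected = quantile_map(gcm_hist, obs_hist, 4.0)
print(corrected)  # 2.5
```

    The mapping removes the systematic offset between model and observations while preserving where a value sits in the model's own distribution, which is why it is applied before feeding GCM forcing into a hydrological model such as SWAP.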

  9. Examining Physics Career Interests: Recruitment and Persistence into College

    NASA Astrophysics Data System (ADS)

    Lock, R. M.; Hazari, Z.; Sadler, P. M.; Sonnert, G.

    2012-03-01

    As a fraction of the undergraduate population, the number of students obtaining physics degrees has been declining since the 1960s. This trend continues despite the increasing number of students taking introductory physics courses in high school and college. Our work uses an ex post facto design to study the factors that influence students' decision to pursue a career in physics at the beginning of college. These factors include high school physics classroom experiences, other science-related experiences, and students' career motivations. The data used in this study are drawn from the Persistence Research in Science and Engineering (PRiSE) Project, a large-scale study that surveyed a nationally representative sample of college/university students enrolled in introductory English courses about their interests and prior experiences in science.

  10. Impact of SCALE-UP on science teaching self-efficacy of students in general education science courses

    NASA Astrophysics Data System (ADS)

    Cassani, Mary Kay Kuhr

    The objective of this study was to evaluate the effect of two pedagogical models used in general education science on non-majors' science teaching self-efficacy. Science teaching self-efficacy can be influenced by inquiry and cooperative learning, through cognitive mechanisms described by Bandura (1997). The Student Centered Activities for Large Enrollment Undergraduate Programs (SCALE-UP) model of inquiry and cooperative learning incorporates cooperative learning and inquiry-guided learning in large enrollment combined lecture-laboratory classes (Oliver-Hoyo & Beichner, 2004). SCALE-UP was adopted by a small but rapidly growing public university in the southeastern United States in three undergraduate, general education science courses for non-science majors in the Fall 2006 and Spring 2007 semesters. Students in these courses were compared with students in three other general education science courses for non-science majors taught with the standard teaching model at the host university. The standard model combines lecture and laboratory in the same course, with smaller enrollments and utilizes cooperative learning. Science teaching self-efficacy was measured using the Science Teaching Efficacy Belief Instrument - B (STEBI-B; Bleicher, 2004). A science teaching self-efficacy score was computed from the Personal Science Teaching Efficacy (PTSE) factor of the instrument. Using non-parametric statistics, no significant difference was found between teaching models, between genders, within models, among instructors, or among courses. The number of previous science courses was significantly correlated with PTSE score. Student responses to open-ended questions indicated that students felt the larger enrollment in the SCALE-UP room reduced individual teacher attention but that the large round SCALE-UP tables promoted group interaction. 
Students responded positively to cooperative and hands-on activities, and would encourage inclusion of more such activities in all of the courses. The large enrollment SCALE-UP model as implemented at the host university did not increase science teaching self-efficacy of non-science majors, as hypothesized. This was likely due to limited modification of standard cooperative activities according to the inquiry-guided SCALE-UP model. It was also found that larger SCALE-UP enrollments did not decrease science teaching self-efficacy when standard cooperative activities were used in the larger class.
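The factor-score computation described above can be sketched as a simple Likert-sum with reverse coding. This is an illustrative sketch only: the item names, the choice of reverse-coded items, and the 5-point coding are assumptions for demonstration, not the actual STEBI-B scoring key.

```python
def ptse_score(responses, reverse_items, scale_max=5):
    """Sum a Likert-scale factor score, reverse-coding negatively
    worded items (illustrative of computing a PTSE-style score).

    responses: dict mapping item id -> response (1..scale_max)
    reverse_items: set of item ids that are negatively worded
    """
    total = 0
    for item, value in responses.items():
        if not 1 <= value <= scale_max:
            raise ValueError(f"item {item}: response {value} out of range")
        # Reverse-code negatively worded items: 1 <-> 5, 2 <-> 4, etc.
        total += (scale_max + 1 - value) if item in reverse_items else value
    return total

# Hypothetical 4-item factor with one reverse-coded item
answers = {"q1": 4, "q2": 5, "q3": 2, "q4": 3}
score = ptse_score(answers, reverse_items={"q3"})  # 4 + 5 + (6-2) + 3 = 16
```

Because the resulting sums are ordinal rather than interval data, comparing them across groups with non-parametric tests, as the study reports, is the standard choice.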

  11. Delivering the EarthScope Transportable Array as a Community Asset

    NASA Astrophysics Data System (ADS)

    Busby, R. W.; Woodward, R.; Simpson, D. W.; Hafner, K.

    2009-12-01

The Transportable Array element of EarthScope/USArray is a culmination of years of coordination and planning for a large science initiative via the NSF MREFC program. US researchers and the IRIS Consortium conceived of the science objectives for a continental-scale array and, together with the geodetic (PBO) and fault drilling (SAFOD) communities and NSF, successfully merged these scientific objectives with a compelling scientific and technical proposal, accompanied by the budget and schedule to accomplish it. The Transportable Array is now an efficient and exacting execution of an immense technical challenge that, by many measures, is yielding exciting science return, both expected and unanticipated. The technical facility is first-rate in its implementation, yet responsive to science objectives and discovery, actively engaging the community in discussion and new direction. The project is carried out by a core of dedicated and professional staff, guided and advised through considerable feedback from science users who have unprecedented access to high-quality data. This, in a sense, lets seismologists focus on research, rather than serve as administrators, drivers, shippers, battery mules, electronic technicians and radio hams. Now that USArray is operational, it is interesting to reflect on whether the TA, as a professionally executed project, could succeed as well if it were an independent endeavor, managed and operated outside of the resources developed and available through IRIS and its core programs. We detail how the support the USArray facility provides improves data accessibility and enhances interdisciplinary science. We suggest that the resources and community leadership provided by the IRIS Consortium, and the commitment to the principle of free and open data access, have been basic underpinnings for the success of the TA. 
This involvement of community-based, scientific leadership in the development of large facilities should be considered in planning future large Earth science or even basic science endeavors. The Global Seismographic Network provides another example where, with strong scientific leadership, the technical objectives have returned far more than expected results from all manner of application of new techniques to high quality data. Again, the key ingredient may be that the project oversight is driven by scientists with free and open access to data and broad and evolving expectations as to how the facility might be applied towards research objectives. Major projects must clearly follow defined plans and budgets; but, while it is important to have managers to motivate schedules and control costs, the energy, vigor and effort to optimize new measures and discover new applications derive from the insights and enthusiasm of the science community.

  12. The Mexican National Programs on Teaching Mathematics and Science with Technology: The Legacy of a Decade of Experiences of Transformation of School Practices and Interactions

    NASA Astrophysics Data System (ADS)

    Sacristán, Ana Isabel; Rojano, Teresa

Here we give an overview of the Mexican experience of a national program, begun in 1997, of gradual implementation of computational tools in lower secondary-school classrooms (children 12-15 years old) for mathematics and science. This project illustrates, through the benefit of long-term hindsight, the successes and difficulties of large-scale implementation of technologies in schools. The key factors for success and for transforming school practices seem to be: adequate planning, gradual implementation, continuous training and support, and enough time (years) for assimilation and integration.

  13. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Hua, H.

    2016-12-01

Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments: at large cloud scales, scaling and cost issues must be addressed. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth science data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
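The spot-market trade-off the abstract describes can be illustrated with a toy expected-cost model: the discount is large, but evictions force reprocessing. All numbers and the model itself are illustrative assumptions, not figures from the HySDS project.

```python
def effective_spot_cost(on_demand_rate, discount, interrupt_prob, retry_overhead):
    """Toy model of expected spot-market cost per successful job.

    A job pays the discounted spot rate, but with probability
    `interrupt_prob` an attempt is evicted and must be retried,
    repeating a fraction `retry_overhead` of the job's compute.
    Expected attempts follow a geometric series: 1 / (1 - p).
    """
    spot_rate = on_demand_rate * (1.0 - discount)
    expected_attempts = 1.0 / (1.0 - interrupt_prob)
    return spot_rate * (1.0 + retry_overhead * (expected_attempts - 1.0))

# e.g. an 80% discount with a 10% eviction rate, where a retry
# (thanks to checkpointing) repeats only half of the job's work
cost = effective_spot_cost(1.00, discount=0.80, interrupt_prob=0.10,
                           retry_overhead=0.5)
savings = 1.0 - cost  # fraction saved vs. on-demand pricing
```

The point of the sketch is that checkpointing (a small `retry_overhead`) is what preserves the headline 75%-90% savings once eviction risk is priced in.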

  14. Validating Common Measures of Self-Efficacy and Career Attitudes within Informal Health Education for Middle and High School Students.

    PubMed

    Peterman, Karen; Withy, Kelley; Boulay, Rachel

    2018-06-01

    A common challenge in the evaluation of K-12 science education is identifying valid scales that are an appropriate fit for both a student's age and the educational outcomes of interest. Though many new scales have been validated in recent years, there is much to learn about the appropriate educational contexts and audiences for these measures. This study investigated two such scales, the DEVISE Self-Efficacy for Science scale and the Career Interest Questionnaire (CIQ), within the context of two related health sciences projects. Consistent patterns were found in the reliability of each scale across three age groups (middle school, high school, early college) and within the context of each project. As expected, self-efficacy and career interest, as measured through these scales, were found to be correlated. The pattern of results for CIQ scores was also similar to that reported in other literature. This study provides examples of how practitioners can validate established measures for new and specific contexts and provides some evidence to support the use of the scales studied in health science education contexts.
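Scale-validation work of this kind typically reports internal-consistency reliability per age group. As a hedged illustration (the paper does not specify its reliability statistic here; Cronbach's alpha is the common choice, and the data below are invented):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal-consistency reliability.

    item_scores: list of columns, one per scale item, each a list of
    respondent scores.  alpha = k/(k-1) * (1 - sum(item variances)
    / variance(total scores)), using population variances.
    """
    k = len(item_scores)
    n = len(item_scores[0])
    item_var = sum(pvariance(col) for col in item_scores)
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical responses: 3 items (columns -> lists), 4 respondents
items = [[4, 5, 3, 5],
         [4, 4, 3, 5],
         [5, 5, 2, 5]]
alpha = cronbach_alpha(items)  # high: items move together across respondents
```

Running the same computation separately for middle school, high school, and early college samples is how "consistent patterns in the reliability of each scale across three age groups" would be checked in practice.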

  15. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

Large-scale complex engineering encompasses various functions, and each function is realized through the completion of one or more projects, so the combinations of projects that affect each function must be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio techniques based on the functional objectives of projects were introduced, and the principles for applying these techniques were studied and proposed. In addition, the processes for combining projects were constructed. With the help of these function-oriented portfolio techniques, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  16. The Holocene Geoarchaeology of the Desert Nile in Northern Sudan

    NASA Astrophysics Data System (ADS)

    Woodward, Jamie; Macklin, Mark; Spencer, Neal; Welsby, Derek; Dalton, Matthew; Hay, Sophie; Hardy, Andrew

    2016-04-01

Invited Paper. Forty years ago Colin Renfrew declared that "every archaeological problem starts as a problem in geoarchaeology" (Renfrew, 1976, p. 2). With this assertion in mind, this paper draws upon the findings from field research in two sectors of the Nile Valley of Northern Sudan dedicated to the exploration of human-environment interactions during the middle and late Holocene. This part of the Nile corridor contains a rich cultural record and an exceptionally well preserved Holocene fluvial archive. A distinctive feature of these records is the variety of evidence for interaction between desert and river over a range of spatial and temporal scales. This interaction presented both challenges and opportunities for its ancient inhabitants. This paper will present evidence for large-scale landscape changes driven by shifts in global climate. It will also show how we have integrated the archaeological and geological records in the Northern Dongola Reach and at Amara West - where long-term field projects led by archaeologists from the British Museum have recognised the importance of a sustained commitment to interdisciplinary research to achieve a fully integrated geoarchaeological approach across a range of scales. The former project is a large-scale landscape survey with multiple sites across an 80 km reach of the Nile whilst the latter has a strong focus on a single New Kingdom town site and changes in its environmental setting. By combining multiple archaeological and geological datasets - and pioneering the use of OSL dating and strontium isotope analysis in the Desert Nile - we have developed a new understanding of human responses to Holocene climate and landscape change in this region. Renfrew, C. (1976) Archaeology and the earth sciences. In: D.A. Davidson and M.I. Shackley (eds) Geoarchaeology: Earth Science and the Past, Duckworth, London, 1-5.

  17. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    PubMed

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  18. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    NASA Astrophysics Data System (ADS)

    Kuwayama, Akira

The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods in ensuring stable operation of power grids with a large-scale PV system. NEDO (New Energy and Industrial Technology Development Organization of Japan) conducted the project, and HEPCO (Hokkaido Electric Power Co., Inc.) managed it overall.
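One common output-control method in such projects is ramp-rate smoothing: the battery absorbs or supplies the difference between the fluctuating PV output and a grid-friendly, rate-limited output. The sketch below is a deliberately simplified toy model (lossless battery, per-unit energy, unit time steps, no power rating, state of charge silently clamped), not the control scheme actually deployed at Wakkanai.

```python
def smooth_pv_output(pv, ramp_limit, capacity, soc=0.5):
    """Limit the ramp rate of a PV output series by charging or
    discharging a battery (toy model: lossless, unit time steps).

    pv: per-unit PV output per time step
    ramp_limit: max allowed change in grid output per step
    capacity: battery energy capacity (per-unit), soc: initial state
    Returns the grid-injected power series.
    """
    stored = soc * capacity
    out = [pv[0]]
    for p in pv[1:]:
        lo, hi = out[-1] - ramp_limit, out[-1] + ramp_limit
        target = min(max(p, lo), hi)   # clamp grid output to ramp limits
        delta = p - target             # surplus (+) charges the battery
        # Simplification: clamp silently instead of modeling saturation
        stored = min(max(stored + delta, 0.0), capacity)
        out.append(target)
    return out

# A cloud passage: PV output drops sharply, then recovers
grid = smooth_pv_output([1.0, 0.2, 0.2, 1.0], ramp_limit=0.3, capacity=2.0)
```

Even this toy version shows the core trade-off the project evaluated: a tighter ramp limit gives the grid smoother power but demands more energy throughput (and capacity) from the battery.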

  19. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph-analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting. PMID:27081299
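The MLEM update the paper parallelizes is compact: x ← x · Aᵀ(y / Ax) / Aᵀ1, where A is the system matrix and y the measured counts. Below is a pure-Python sketch of that iteration on a tiny dense system, to show the algorithm itself; it is not the paper's Spark/GraphX code, and the system matrix and counts are invented.

```python
def mlem(A, y, n_iter=50):
    """Maximum-likelihood expectation-maximization reconstruction:
    repeatedly apply  x <- x * A^T(y / Ax) / A^T 1.

    A: list of rows (detector bins x image voxels), y: measured counts.
    The multiplicative update keeps the estimate nonnegative.
    """
    n = len(A[0])
    sens = [sum(row[j] for row in A) for j in range(n)]  # A^T 1 (sensitivity)
    x = [1.0] * n                                        # positive initial image
    for _ in range(n_iter):
        proj = [sum(a * b for a, b in zip(row, x)) for row in A]      # Ax
        ratio = [yi / max(pi, 1e-12) for yi, pi in zip(y, proj)]      # y / Ax
        back = [sum(A[i][j] * ratio[i] for i in range(len(A)))        # A^T ratio
                for j in range(n)]
        x = [xj * bj / sj for xj, bj, sj in zip(x, back, sens)]
    return x

# Tiny 3-bin, 2-voxel system with noise-free projections of x_true = [2, 3]
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [2.0, 3.0, 5.0]
x = mlem(A, y)  # converges toward [2, 3]
```

The forward projection, ratio, and backprojection steps are all element-wise or sparse matrix-vector products, which is exactly why they map naturally onto GraphX's distributed graph/sparse-linear-algebra primitives.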

  20. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph-analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting.

  1. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
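A classic instance of the sampling and data-reduction methods mentioned in the abstract is reservoir sampling, which keeps a uniform random sample from a stream too large to hold in memory. A minimal sketch (the example stream is illustrative):

```python
import random

def reservoir_sample(stream, k, rng=random):
    """Algorithm R: keep a uniform random sample of k items from a
    stream of unknown length, using only O(k) memory."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)     # keep item with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample

# Pick 5 items from a million-element stream without materializing it
picks = reservoir_sample(range(1_000_000), 5)
```

Because each pass touches every record exactly once, the same pattern extends naturally to the large simulation outputs and streaming pipelines the article discusses.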

  2. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  3. Consideration of Experimental Approaches in the Physical and Biological Sciences in Designing Long-Term Watershed Studies in Forested Landscapes

    NASA Astrophysics Data System (ADS)

    Stallard, R. F.

    2011-12-01

The importance of biological processes in controlling weathering, erosion, stream-water composition, soil formation, and overall landscape development is generally accepted. The U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) Project in eastern Puerto Rico and Panama and the Smithsonian Tropical Research Institute (STRI) Panama Canal Watershed Experiment (PCWE) are landscape-scale studies based in the humid tropics, where the warm temperatures, moist conditions, and luxuriant vegetation promote especially rapid biological and chemical processes - photosynthesis, respiration, decay, and chemical weathering. In both studies, features of small-watershed, large-watershed, and landscape-scale-biology experiments are blended to satisfy the research needs of the physical and biological sciences. The WEBB Project has successfully synthesized its first fifteen years of data and has addressed the influence of land cover and of geologic, topographic, and hydrologic variability, including huge storms, on a wide range of hydrologic, physical, and biogeochemical processes. The ongoing PCWE should provide a similar synthesis of a moderate-sized humid tropical watershed. The PCWE and the Agua Salud Project (ASP) within the PCWE are now addressing the role of land cover (mature forests, pasture, invasive-grass dominated, secondary succession, native species plantation, and teak) at scales ranging from small watersheds to the whole Panama Canal watershed. Biologists have participated in the experimental design at both watershed scales, and small (0.1 ha) to large (50 ha) forest-dynamic plots have a central role in interfacing between physical scientists and biologists. 
In these plots, repeated, high-resolution mapping of all woody plants greater than 1-cm diameter provides a description of population changes through time presumably reflecting individual life histories, interactions with other organisms and the influence of landscape processes and climate, thereby bridging the research needs and conceptual scales of hydrologists and biogeochemists with those of biologists. Both experiments are embedded in larger data-collection networks: the WEBB within the hydrological and meteorological monitoring programs of the USGS and other federal agencies, and the PCWE in the long-term monitoring conducted by the Panama Canal Authority (ACP), its antecedents, and STRI. Examination of landscape-scale processes in a changing world requires the development of detailed landscape-scale data sets, including a formulation of reference states that can act as surrogate experimental controls. For example, the concept of a landscape steady state provides a convenient reference in which present-day observations can be interpreted. Extreme hydrological states must also be described, and both WEBB and PCWE have successfully examined the role of droughts and large storms and their impact on geomorphology, biogeochemistry, and biology. These experiments also have provided platforms for research endeavors never contemplated in the original objectives, a testament to the importance of developing approaches that consider the needs of physical and biological sciences.

  4. What Works? Common Practices in High Functioning Afterschool Programs across the Nation in Math, Reading, Science, Arts, Technology, and Homework--A Study by the National Partnership. The Afterschool Program Assessment Guide. CRESST Report 768

    ERIC Educational Resources Information Center

    Huang, Denise; Cho, Jamie; Mostafavi, Sima; Nam, Hannah H.; Oh, Christine; Harven, Aletha; Leon, Seth

    2010-01-01

    In an effort to identify and incorporate exemplary practices into existing and future afterschool programs, the U.S. Department of Education commissioned a large-scale evaluation of the 21st Century Community Learning Center (CCLC) program. The purpose of this evaluation project was to develop resources and professional development that addresses…

  5. Diurnal Cycle of Convection and Interaction with the Large-Scale Circulation

    NASA Technical Reports Server (NTRS)

    Salby, Murry

    2002-01-01

The science in this effort was scheduled in the project's 3rd and 4th years, after a long record of high-resolution Global Cloud Imagery (GCI) had been produced. Unfortunately, political disruptions that interfered with this project led to its funding being terminated after only two years of support. Nevertheless, the availability of intermediate data opened the door to a number of important scientific studies. Beyond considerations of the diurnal cycle addressed in this grant, the GCI makes possible a wide range of studies surrounding convection, cloud and precipitation. Several are already underway with colleagues in the US and abroad, who have requested the GCI.

  6. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  7. FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Peter Brewer; Dr. James Barry

    2002-09-30

We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO{sub 2} sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) Published a significant workshop report (58 pages) entitled ''Direct Ocean Sequestration Expert's Workshop'', based upon a meeting held at MBARI in 2001. The report is available both in hard copy, and on the NETL web site. (2) Carried out three major deep-ocean (3600 m) cruises to examine the physical chemistry, and biological consequences, of several-liter quantities of CO{sub 2} released on the ocean floor. (3) Carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 C) CO{sub 2} released at relatively shallow ocean depth. (4) Carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO{sub 2} into a hydrate slurry at {approx}1000m depth. (5) In collaboration with Prof. Jill Pasteris (Washington University) we have successfully accomplished the first field test of a deep ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO{sub 2} system. (6) Submitted the first major paper on biological impacts as determined from our field studies. (7) Submitted a paper on our measurements of the fate of a rising stream of liquid CO{sub 2} droplets to Environmental Science & Technology. (8) Have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success. 
(9) Have had two papers submitted for the Greenhouse Gas Technology-6 Conference (Kyoto) accepted. (10) Been nominated by the U.S. Dept. of State to attend the Nov. 2002 IPCC Workshop on Carbon Capture and Storage. (11) Given presentations at national meetings, including the AGU Ocean Sciences Meeting, the American Chemical Society, the Minerals, Materials, and Metals Society, the National Academy of Engineering, and given numerous invited lectures.

  8. Science Goals of the U.S. Department of the Interior Southeast Climate Science Center

    USGS Publications Warehouse

    Dalton, Melinda S.

    2011-01-01

In 2011, the U.S. Department of the Interior Southeast Climate Science Center (CSC) finalized the first draft of its goals for research needed to address the needs of natural and cultural partners for climate science in the Southeastern United States. The science themes described in this draft plan were established to address the information needs of ecoregion conservation partnerships, such as the Landscape Conservation Cooperatives (LCCs) and other regional conservation-science and resource-management partners. These themes were developed using priorities defined by partners and stakeholders in the Southeast and on a large-scale, multidisciplinary project, the Southeast Regional Assessment Project (SERAP), developed in concert with those partners. Science products developed under these themes will provide models of potential future conditions, assessments of likely impacts, and tools that can be used to inform the conservation management decisions of LCCs and other partners. This information will be critical as managers try to anticipate and adapt to climate change. Resource managers in the Southeast are requesting this type of information, in many cases as a result of observed climate change effects. The Southeast CSC draft science plan identifies six science themes and frames the activities (tasks, with examples of recommended near-term work for each task included herein) related to each theme that are needed to achieve the objectives of the Southeast CSC.

  9. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-03-01

To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.

  10. Conceptualizing the Science Curriculum: 40 Years of Developing Assessment Frameworks in Three Large-Scale Assessments

    ERIC Educational Resources Information Center

    Kind, Per Morten

    2013-01-01

    The paper analyzes conceptualizations in the science frameworks in three large-scale assessments, Trends in Mathematics and Science Study (TIMSS), Programme for International Student Assessment (PISA), and National Assessment of Educational Progress (NAEP). The assessments have a shared history, but have developed different conceptualizations. The…

  11. Can Citizen Science Assist in Determining Koala (Phascolarctos cinereus) Presence in a Declining Population?

    PubMed Central

    Flower, Emily; Jones, Darryl; Bernede, Lilia

    2016-01-01

Simple Summary: Current scientific methods used to determine national population estimates for species like the koala, where individuals are scattered over a vast area, have failed to deliver an accurate and widely accepted result. Current citizen science projects aimed at mapping koala sightings reported by the public all use different methods and store their data in their own databases, each collecting scattered pieces of a much larger puzzle. To bring these pieces together, this study developed guidelines for a national citizen science project highlighting the importance of using one single method for data collection, and in turn assisting in the development of a national koala population database. Abstract: The acceptance and application of citizen science has risen over the last 10 years, with this rise likely attributed to an increase in public awareness surrounding anthropogenic impacts affecting urban ecosystems. Citizen science projects have the potential to expand upon data collected by specialist researchers as they are able to gain access to previously unattainable information, consequently increasing the likelihood of an effective management program. The primary objective of this research was to develop guidelines for a successful regional-scale citizen science project following a critical analysis of 12 existing citizen science case studies. Secondly, the effectiveness of these guidelines was measured through the implementation of a citizen science project, Koala Quest, for the purpose of estimating the presence of koalas in a fragmented landscape. Consequently, this research aimed to determine whether citizen-collected data can augment traditional science research methods, by comparing and contrasting the abundance of koala sightings gathered by citizen scientists and professional researchers. Based upon the guidelines developed, Koala Quest methodologies were designed, the study conducted, and the efficacy of the project assessed. 
To combat the high variability of estimated koala populations due to differences in counting techniques, a national monitoring and evaluation program is required, in addition to a standardised method for conducting koala population estimates. Citizen science is a useful method for monitoring animals such as the koala, which are sparsely distributed throughout a vast geographical area, as the large numbers of volunteers recruited by a citizen science project are capable of monitoring a similarly broad spatial range. PMID:27429008

  12. Teaching Sustainability as a Large Format Environmental Science Elective

    NASA Astrophysics Data System (ADS)

    Davies, C.; Frisch, M.; Wagner, J.

    2012-12-01

    A challenge in teaching sustainability is engaging students in the global scale and immediacy of environmental impacts, and the degree of societal change required to address environmental challenges. Succeeding in a large-format Environmental Science elective course with as many as 100 students is an even greater challenge. ENVSC 322 Environmental Sustainability is an innovative new course integrating multiple disciplines, a wide range of external expert speakers and a hands-on community engagement project. The course, in its third year, has been highly successful and impactful for the students, community and faculty involved. The determination of success is based on student and community impacts. Students covered science topics on Earth systems, ecosystem complexity and services through readings and specialist speakers. The interconnection of society and climate was approached through global and local examples with a strong environmental justice component. Experts in a wide range of professional fields were engaged to speak with students on the role and impacts of sustainability in their particular field. Some examples are: the Region VII Environmental Protection Agency Environmental Justice Director engaged students in both urban and rural aspects of environmental justice; a Principal Architect and national leader in green architecture and redevelopment spoke with students regarding the necessity and potential for green urbanism; and industry innovators presented closed-cycle and alternative energy projects. The capstone project and highlight of the course was an individual or team community engagement project on sustainability, designed and implemented by the students. Community engagement projects completed throughout the Kansas City metro area have increased each year in number, quality and impact, from 35 the first year to 70 projects this past spring. 
Students directly engage their communities and through this experience integrate knowledge of environmental systems with how their own society uses and impacts these systems. The direct experience of "doing" a project, regardless of its success, can be, and has been, transformative for many students.

  13. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of their High School Science Classroom

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2016-12-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars, more aligned with the `ideal' picture of school science. Through the use of two instruments, one focused on content knowledge gains and the other on student views of school science, we explore the impact of this design. Overall, students made moderate content knowledge gains, although these gains were heavily dependent on the individual teacher, the number of times a teacher implemented the intervention, and the depth to which an individual teacher engaged with the provided materials. In terms of students' views, there were significant global changes in their views of their experience of the science classroom. However, there were some areas that showed no change, or slightly negative changes, some of which were expected and some of which were not. From these results, we comment on the necessity of sustained long-period implementations rather than single interventions, the requirement for similarly sustained professional development, and the importance of monitoring the impact of inquiry-based implementations. This is especially important as inquiry-based approaches to science are required by many new curriculum reforms, most notably, in this context, the new Australian curriculum currently being rolled out.

  14. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is in the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
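    The graph-based methods mentioned in this abstract can be illustrated with a minimal climate-network sketch: grid points become nodes, and an edge joins any pair of points whose time series are strongly correlated. This thresholded-correlation construction is a common one in the climate-network literature, not necessarily the project's exact method; the synthetic data and threshold below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for climate time series: 6 "grid points" x 200 steps.
# Points 0-2 share a common signal (e.g. one teleconnection region);
# points 3-5 are independent noise.
common = rng.standard_normal(200)
series = np.vstack(
    [common + 0.1 * rng.standard_normal(200) for _ in range(3)]
    + [rng.standard_normal(200) for _ in range(3)]
)

# Climate-network construction: nodes are grid points; an edge joins any
# pair whose absolute Pearson correlation exceeds a chosen threshold.
corr = np.corrcoef(series)
threshold = 0.5
edges = [
    (i, j)
    for i in range(corr.shape[0])
    for j in range(i + 1, corr.shape[0])
    if abs(corr[i, j]) > threshold
]

print(edges)  # only the three correlated points should be linked
```

    Community detection or degree statistics on the resulting graph then expose spatial structure (such as teleconnections) that pointwise analysis would miss.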

  15. Understanding life together: A brief history of collaboration in biology

    PubMed Central

    Vermeulen, Niki; Parker, John N.; Penders, Bart

    2013-01-01

    The history of science shows a shift from single-investigator ‘little science’ to increasingly large, expensive, multinational, interdisciplinary and interdependent ‘big science’. In physics and allied fields this shift has been well documented, but the rise of collaboration in the life sciences and its effect on scientific work and knowledge has received little attention. Research in biology exhibits different historical trajectories and organisation of collaboration in field and laboratory – differences still visible in contemporary collaborations such as the Census of Marine Life and the Human Genome Project. We employ these case studies as strategic exemplars, supplemented with existing research on collaboration in biology, to expose the different motives, organisational forms and social dynamics underpinning contemporary large-scale collaborations in biology and their relations to historical patterns of collaboration in the life sciences. We find the interaction between research subject, research approach as well as research organisation influencing collaboration patterns and the work of scientists. PMID:23578694

  16. Scale Interactions in the Tropics from a Simple Multi-Cloud Model

    NASA Astrophysics Data System (ADS)

    Niu, X.; Biello, J. A.

    2017-12-01

    Our lack of a complete understanding of the interaction between moist convection and equatorial waves remains an impediment to the numerical simulation of large-scale organization, such as the Madden-Julian Oscillation (MJO). The aim of this project is to understand interactions across spatial scales in the tropics within a simplified framework for scale interactions, while using a simplified description of the basic features of moist convection. Using multiple asymptotic scales, Biello and Majda[1] derived a multi-scale model of moist tropical dynamics (IMMD[1]), which separates three regimes: the planetary-scale climatology, the synoptic-scale waves, and the planetary-scale anomalies. The scales and strength of the observed MJO would place it in the regime of planetary-scale anomalies, which are themselves forced by non-linear upscale fluxes from the synoptic-scale waves. In order to close this model and determine whether it provides a self-consistent theory of the MJO, a model for diabatic heating due to moist convection must be implemented along with the IMMD. The multi-cloud parameterization is a model proposed by Khouider and Majda[2] to describe the three basic cloud types (congestus, deep and stratiform) that are most responsible for tropical diabatic heating. We implement a simplified version of the multi-cloud model that is based on results derived from large eddy simulations of convection [3]. We present this simplified multi-cloud model and show results of numerical experiments beginning with a variety of convective forcing states. Preliminary results on upscale fluxes, from synoptic scales to planetary-scale anomalies, will be presented. [1] Biello J A, Majda A J. Intraseasonal multi-scale moist dynamics of the tropical atmosphere[J]. Communications in Mathematical Sciences, 2010, 8(2): 519-540. [2] Khouider B, Majda A J. A simple multicloud parameterization for convectively coupled tropical waves. 
Part I: Linear analysis[J]. Journal of the atmospheric sciences, 2006, 63(4): 1308-1323. [3] Dorrestijn J, Crommelin D T, Biello J A, et al. A data-driven multi-cloud model for stochastic parametrization of deep convection[J]. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 2013, 371(1991): 20120374.

  17. National Climate Change and Wildlife Science Center project accomplishments: highlights

    USGS Publications Warehouse

    Holl, Sally

    2011-01-01

    The National Climate Change and Wildlife Science Center (NCCWSC) has invested more than $20M since 2008 to put cutting-edge climate science research in the hands of resource managers across the Nation. With NCCWSC support, more than 25 cooperative research initiatives led by U.S. Geological Survey (USGS) researchers and technical staff are advancing our understanding of habitats and species to provide guidance to managers in the face of a changing climate. Projects focus on quantifying and predicting interactions between climate, habitats, species, and other natural resources such as water. Spatial scales of the projects range from the continent of North America, to a regional scale such as the Pacific Northwest United States, to a landscape scale such as the Florida Everglades. Time scales range from the outset of the 20th century to the end of the 21st century. Projects often lead to workshops, presentations, publications and the creation of new websites, computer models, and data visualization tools. Partnership-building is also a key focus of the NCCWSC-supported projects. New and on-going cooperative partnerships have been forged and strengthened with resource managers and scientists at Federal, tribal, state, local, academic, and non-governmental organizations. USGS scientists work closely with resource managers to produce timely and relevant results that can assist managers and policy makers in current resource management decisions. This fact sheet highlights accomplishments of five NCCWSC projects.

  18. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  19. Large-Scale 3D Printing: The Way Forward

    NASA Astrophysics Data System (ADS)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  20. GEWEX America Prediction Project (GAPP) Science and Implementation Plan

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose of this Science and Implementation Plan is to describe GAPP science objectives and the activities required to meet these objectives, both specifically for the near-term and more generally for the longer-term. The GEWEX Americas Prediction Project (GAPP) is part of the Global Energy and Water Cycle Experiment (GEWEX) initiative that is aimed at observing, understanding and modeling the hydrological cycle and energy fluxes at various time and spatial scales. The mission of GAPP is to demonstrate skill in predicting changes in water resources over intraseasonal-to-interannual time scales, as an integral part of the climate system.

  1. Setting up crowd science projects.

    PubMed

    Scheliga, Kaja; Friesike, Sascha; Puschmann, Cornelius; Fecher, Benedikt

    2016-11-29

    Crowd science is scientific research that is conducted with the participation of volunteers who are not professional scientists. Thanks to the Internet and online platforms, project initiators can draw on a potentially large number of volunteers. This crowd can be involved to support data-rich or labour-intensive projects that would otherwise be unfeasible. So far, research on crowd science has mainly focused on analysing individual crowd science projects. In our research, we focus on the perspective of project initiators and explore how crowd science projects are set up. Based on multiple case study research, we discuss the objectives of crowd science projects and the strategies of their initiators for accessing volunteers. We also categorise the tasks allocated to volunteers and reflect on the issue of quality assurance as well as feedback mechanisms. With this article, we contribute to a better understanding of how crowd science projects are set up and how volunteers can contribute to science. We suggest that our findings are of practical relevance for initiators of crowd science projects, for science communication as well as for informed science policy making. © The Author(s) 2016.

  2. Compensatory mitigation for streams under the Clean Water Act: reassessing science and redirecting policy

    USDA-ARS?s Scientific Manuscript database

    Considerable public funds are annually expended on stream restoration projects, but available science suggests that stream restoration as currently practiced is not effective in recovering ecosystem functional integrity. The physical scale of most stream restoration projects is insufficient because...

  3. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment, and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project, and begins the translation between the Federal and private sector operating environments.

  4. The influence of lightning activity and anthropogenic factors on large-scale characteristics of natural fires

    NASA Astrophysics Data System (ADS)

    Eliseev, A. V.; Mokhov, I. I.; Chernokulsky, A. V.

    2017-01-01

    A module for simulating natural fires (NFs) in the climate model of the A.M. Obukhov Institute of Atmospheric Physics, Russian Academy of Sciences (IAP RAS CM), is extended with respect to the influence of lightning activity and population density on ignition frequency and fire suppression. The IAP RAS CM is used to perform numerical experiments in accordance with the conditions of CMIP5 (Coupled Model Intercomparison Project, phase 5). The frequency of lightning flashes was assigned in accordance with the LIS/OTD satellite data. In the calculations performed, anthropogenic ignitions play an important role in NF occurrences, except in regions at subpolar latitudes and, to a lesser degree, tropical and subtropical regions. Taking into account the dependence of fire frequency on lightning activity and population density intensifies the influence of natural fire characteristics on climate changes in the tropics and subtropics, as compared to the version of the IAP RAS CM that does not take ignition sources into consideration.

  5. AIMES Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Daniel S; Jha, Shantenu; Weissman, Jon

    2017-01-31

    This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented as software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (bundles), derives a suitable execution strategy for the given skeleton, and enacts its execution by means of pilots on one or more resources, depending on the application requirements and resource availabilities and capabilities.
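    As a rough illustration of how the four abstractions named in this report might fit together, the sketch below models skeletons, resource bundles, pilots and an execution strategy as toy Python structures. All class names, fields, and the greedy placement heuristic are hypothetical, invented for this sketch; they are not the actual AIMES middleware API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the four AIMES-style abstractions; all names
# and the placement heuristic are illustrative, not the real AIMES API.

@dataclass
class Skeleton:
    """Application layer: tasks with abstract resource requirements."""
    tasks: list

@dataclass
class ResourceBundle:
    """Resource layer: aggregated information on available resources."""
    resources: list

@dataclass
class Pilot:
    """Placeholder job that acquires a resource slice and runs tasks."""
    resource: str
    cores: int
    queue: list = field(default_factory=list)

def execution_strategy(skeleton, bundle):
    """Derive a toy plan: place each task on the resource with the most
    free cores, creating one pilot per resource actually used."""
    pilots = {}
    for task in skeleton.tasks:
        best = max(bundle.resources, key=lambda r: r["free_cores"])
        pilot = pilots.setdefault(
            best["name"], Pilot(best["name"], best["free_cores"])
        )
        pilot.queue.append(task)
        best["free_cores"] -= task["cores"]
    return list(pilots.values())

plan = execution_strategy(
    Skeleton(tasks=[{"cores": 4}, {"cores": 8}]),
    ResourceBundle(resources=[{"name": "cluster_a", "free_cores": 16},
                              {"name": "cluster_b", "free_cores": 4}]),
)
print([(p.resource, len(p.queue)) for p in plan])  # [('cluster_a', 2)]
```

    The point of the separation is that the application description (skeleton) and the resource description (bundle) stay independent, with the execution strategy mediating between them.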

  7. The Hubble Space Telescope's Student ERO Pilot Project: Implementing Formal and Informal Collaborative Projects

    NASA Astrophysics Data System (ADS)

    Eisenhamer, Bonnie; Ryer, H.; McCallister, D.; Taylor, J.; Bishop, M.

    2010-05-01

    The Hubble Space Telescope's Early Release Observations (EROs) were revealed to the public on September 9, 2009, and K-12 students and educators in six states across the country are joining in the celebration. Students and educators in Maryland, Ohio, New York, California, New Mexico, and Delaware have been invited to participate in the Hubble Space Telescope's Student ERO Pilot Project. This is an interdisciplinary project created by STScI's Office of Public Outreach in which students research the four ERO objects and create various types of projects. In recognition of their participation, the projects are displayed at host institutions in each state (museum, science center, school, planetarium or library) during a special public event for participating students, their families, and teachers. As part of its evaluation program, STScI's Office of Public Outreach has been conducting an evaluation of the project to determine the viability and potential of conducting large-scale, formal/informal collaborative projects in the future. This poster will highlight preliminary findings and share lessons learned.

  8. Large Scale GW Calculations on the Cori System

    NASA Astrophysics Data System (ADS)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node-level and system-scale optimizations. We highlight multiple large-scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  9. Conservation economics. Comment on "Using ecological thresholds to evaluate the costs and benefits of set-asides in a biodiversity hotspot".

    PubMed

    Finney, Christopher

    2015-02-13

    Banks-Leite et al. (Reports, 29 August 2014, p. 1041) conclude that a large-scale program to restore the Brazilian Atlantic Forest using payments for environmental services (PES) is economically feasible. They do not analyze transaction costs, which are quantified infrequently and incompletely in the literature. Transaction costs can exceed 20% of total project costs and should be included in future research. Copyright © 2015, American Association for the Advancement of Science.

  10. Advances in the NASA Earth Science Division Applied Science Program

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Bonniksen, C. K.; Escobar, V. M.

    2016-12-01

    The NASA Earth Science Division's Applied Science Program advances the understanding of, and the ability to use, remote sensing data in support of socio-economic needs. The integration of socio-economic considerations into NASA Earth Science projects has advanced significantly. The large variety of acquisition methods used has required innovative implementation options. The integration of application themes and the implementation of application science activities in flight projects is continuing to evolve. The creation of the recently released Earth Science Division Directive on Project Applications Program, and the addition of an application science requirement in the recent EVM-2 solicitation, document NASA's current intent. Continuing improvements in the Applied Science Program are expected in the areas of thematic integration, Project Applications Program tailoring for Class D missions, and transfer of knowledge between scientists and projects.

  11. Scientific literacy of adult participants in an online citizen science project

    NASA Astrophysics Data System (ADS)

    Price, Charles Aaron

    Citizen science projects offer opportunities for non-scientists to take part in scientific research. Scientific results from these projects have been well documented. However, there is limited research about how these projects affect their volunteer participants. In this study, I investigate how participation in an online, collaborative astronomical citizen science project can be associated with the scientific literacy of its participants. Scientific literacy is measured through three elements: attitude towards science, belief in the nature of science and competencies associated with learning science. The first two elements are measured through a pre-test given to 1,385 participants when they join the project and a post-test given six months later to 125 participants. Attitude towards science was measured using nine Likert items custom designed for this project, and beliefs in the nature of science were measured using a modified version of the Nature of Science Knowledge scale. Responses were analyzed using the Rasch Rating Scale Model. Competencies are measured through analysis of discourse occurring in online asynchronous discussion forums using the Community of Inquiry framework, which describes three types of presence in the online forums: cognitive, social and teaching. Results show that overall attitudes did not change, p = .225. However, there were significant changes in attitudes about science in the news (positive) and scientific self-efficacy (negative), p < .001 and p = .035 respectively. Beliefs in the nature of science exhibited a small but significant increase, p = .04. Relative positioning of scores on the belief items did not change much, suggesting the increase is mostly due to reinforcement of current beliefs. The cognitive and teaching presence in the online forums did not change, p = .807 and p = .505 respectively. However, the social presence did change, p = .011. 
Overall, these results suggest that multi-faceted, collaborative citizen science projects can have an impact on some aspects of scientific literacy. Using the Rasch Model allowed us to uncover effects that may have otherwise been hidden. Future projects may want to include social interactivity between participants and also make participants specifically aware of how they are contributing to the entire scientific process.

  12. eBird—Using citizen-science data to help solve real-world conservation challenges (Invited)

    NASA Astrophysics Data System (ADS)

    Sullivan, B. L.; Iliff, M. J.; Wood, C. L.; Fink, D.; Kelling, S.

    2010-12-01

    eBird (www.ebird.org) is an Internet-based citizen-science project that collects bird observations worldwide. eBird is foremost a tool for birders, providing users with a resource for bird information and a way to keep track of their personal bird lists, thus establishing a model for sustained participation and new project growth. Importantly, eBird data are shared with scientists and conservationists working to save birds and their habitats. Here we highlight two different ways these data are used: as a real-time data gathering and visualization tool; and as the primary resource for developing large-scale bird distribution models that explore species-habitat associations and climate change scenarios. eBird provides data across broad temporal and spatial scales, and is a valuable tool for documenting and monitoring bird populations facing a multitude of anthropogenic and environmental impacts. For example, a focused effort to monitor birds on Gulf Coast beaches using eBird is providing essential baseline data and enabling long-term monitoring of bird populations throughout the region. Additionally, new data visualization tools that incorporate data from eBird, NOAA, and Google, are specifically designed to highlight the potential impacts of the Gulf oil spill on bird populations. Through a collaboration of partners in the DataONE network, such as the Oak Ridge National Laboratory, we will use supercomputing time from the National Science Foundation’s TeraGrid to allow Lab scientists to model bird migration phenology at the population level based on eBird data. The process involves combining bird observations with remotely sensed variables such as landcover and greening index to predict bird movements. Preliminary results of these models allow us to animate bird movements across large spatial scales, and to explore how migration timing might be affected under different climate change scenarios.
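    The kind of modelling described here, predicting bird occurrence from remotely sensed covariates such as landcover and greening index, can be sketched in highly simplified form as a logistic regression fit by gradient ascent. The covariates, simulated data and coefficients below are invented for illustration; eBird's actual distribution models are far more sophisticated spatio-temporal models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site covariates (invented for illustration): a greening
# index in [0, 1] and a binary forest landcover flag.
n = 500
greening = rng.uniform(0.0, 1.0, n)
forest = rng.integers(0, 2, n).astype(float)

# Simulated detections: birds favour green, forested sites.
logit = -2.0 + 3.0 * greening + 1.5 * forest
observed = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

# Fit a logistic regression by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), greening, forest])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (observed - p) / n

# Signs of the fitted coefficients should match the simulated effects.
print(np.round(w, 2))
```

    Applied per time window across a spatial grid, fitted models of this general shape are what allow predicted occurrence probabilities to be animated as migration maps.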

  13. Research data management support for large-scale, long-term, interdisciplinary collaborative research centers with a focus on environmental sciences

    NASA Astrophysics Data System (ADS)

    Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.

    2017-12-01

    Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme `Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, an RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) `Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 `Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects, scientists from various disciplines collect heterogeneous data during field campaigns or through modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) was designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211's needs (www.crc1211db.uni-koeln.de) in 2016. 
Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download, statistics as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.

  14. Hydropower licensing and evolving climate: climate knowledge to support risk assessment for long-term infrastructure decisions

    NASA Astrophysics Data System (ADS)

    Ray, A. J.; Walker, S. H.; Trainor, S. F.; Cherry, J. E.

    2014-12-01

    This presentation focuses on linking climate knowledge to the complicated decision process for hydropower dam licensing and the affected parties involved in that process. The U.S. Federal Energy Regulatory Commission issues licenses for nonfederal hydroelectric operations, typically for 30-50 years, with infrastructure lifespans even longer, a time frame similar to that of the anticipated risks of changing climate and hydrology. Resources managed by other federal and state agencies, such as the NOAA National Marine Fisheries Service, may be affected by new or re-licensed projects. The federal Integrated Licensing Process gives affected parties the opportunity to recommend issues for consultative investigation and possible mitigation, such as impacts to downstream fisheries. New or re-licensed projects have the potential to "pre-adapt" by considering and incorporating risks of climate change into their planned operations as license terms and conditions. Hundreds of hydropower facilities will be up for relicensing in the coming years (over 100 in the western Sierra Nevada alone, as well as large-scale water projects such as the proposed Lake Powell Pipeline), along with proposed new dams such as the Susitna project in Alaska. Therefore, there is a need for comprehensive guidance on delivering climate analysis to support understanding of the risks hydropower projects pose to other affected resources, and to support decisions on licensing. While each project will have a specific context, many of the questions will be similar. We will also discuss best practices for the use of climate science in water project planning and management, and how creating the best and most appropriate science is still a developing art. We will discuss the potential reliability of that science for consideration in long-term planning, licensing, and mitigation planning for those projects. For science to be "actionable," it must be understood and accepted by the potential users. 
This process is a negotiation, with climate scientists needing to understand the concerns of users and respond, and users developing a better understanding of the state of climate science in order to make an informed choice. We will also discuss what is needed to streamline providing that analysis for the many re-licensing decisions expected in the upcoming years.

  15. Pulsed laser vaporization synthesis of boron loaded few layered graphene (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Tennyson, Wesley D.; Tian, Mengkun; More, Karren L.; Geohegan, David B.; Puretzky, Alexander A.; Papandrew, Alexander B.; Rouleau, Christopher M.; Yoon, Mina

    2017-02-01

    The bulk production of loose graphene flakes and their doped variants is important for energy applications including batteries, fuel cells, and supercapacitors, as well as optoelectronic and thermal applications. While laser-based methods have been reported for large-scale synthesis of single-wall carbon nanohorns (SWNHs), similar large-scale production of graphene has not been reported. Here we explored the synthesis of doped few-layered graphene by pulsed laser vaporization (PLV) with the goal of producing an oxidation-resistant electrode support for solid acid fuel cells. PLV of graphite targets containing various amounts of boron was carried out in either Ar or Ar/H2 mixtures at 0.1 MPa at elevated temperatures, under conditions typically used for synthesis of SWNHs. Either the addition of hydrogen to the background argon or the addition of boron to the carbon target was found to shift the formation of carbon nanohorns to two-dimensional flakes of a new form of few-layer graphene material, with sizes up to microns in dimension as confirmed by XRD and TEM. The materials made with boron exhibited superior resistance to carbon corrosion in the solid acid fuel cell and superior thermal oxidation resistance in air compared to similar product made without boron. Mechanisms for the synthesis and oxidation resistance of these materials will be discussed based upon detailed characterization and modeling. Synthesis science was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences (BES), Materials Sciences and Engineering Division. Material processing and characterization science were supported by ARPA-E under Cooperative Agreement Number DE-AR0000499 and as a user project at the Center for Nanophase Materials Sciences, a Department of Energy Office of Science User Facility.

  16. The Practicalities of Crowdsourcing: Lessons from the Tea Bag Index - UK

    NASA Astrophysics Data System (ADS)

    Duddigan, Sarah; Alexander, Paul; Shaw, Liz; Collins, Chris

    2017-04-01

    The Tea Bag Index - UK is a collaborative project between the University of Reading and the Royal Horticultural Society (RHS), working with members of the gardening community as citizen scientists. This project aims to quantify how decomposition varies across the country, and whether decomposition is influenced by how gardeners manage their soil, particularly with respect to the application of compost. Launched in 2015 as part of a PhD project, the Tea Bag Index - UK project asks willing volunteers to bury tea bags in their gardens as part of a large-scale, litter-bag-style decomposition rate study. Over 450 sets of tea bags have been dispatched to participants across the length and breadth of the UK. The group was largely recruited via social media, magazine articles and public engagement events, and active discourse was maintained with these citizen scientists using Facebook, Twitter and regular email communication. In order to run a successful crowdsourcing citizen science project, there are a number of stages that need to be considered, including (but not limited to): planning; launch and recruitment; communications; and feedback. Throughout a project of this nature, an understanding of the motivations of your volunteers is vital. Reflecting on these motivations while publicising the project, and communicating regularly with its participants, is incredibly important for a successful project.

  17. An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Mary C.

    1985-01-01

    There exists a large number of large-scale bibliographic Information Storage and Retrieval (IS&R) systems containing large amounts of valuable data of interest in a wide variety of research applications. These systems are not used to capacity because the end users, i.e., the researchers, have not been trained in the techniques of accessing such systems. This thesis describes the development of a transportable, university-level course in methods of querying on-line interactive Information Storage and Retrieval systems as a solution to this problem. The course is designed to instruct upper-division science and engineering students, enabling these end users to access such systems directly. It is designed to be taught by instructors who are not specialists in either computer science or research skills, and is independent of any particular IS&R system or computer hardware. The project is sponsored by NASA and conducted by the University of Southwestern Louisiana and Southern University.

  18. Designing an External Evaluation of a Large-Scale Software Development Project.

    ERIC Educational Resources Information Center

    Collis, Betty; Moonen, Jef

    This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…

  19. 78 FR 18348 - Submission for OMB Review; Use of Project Labor Agreements for Federal Construction Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... agreement (PLA), as they may decide appropriate, on large-scale construction projects, where the total cost... procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor organizations that... the use of a project labor agreement (PLA), as they may decide appropriate, on large-scale...

  20. Overview of Proposal on High Resolution Climate Model Simulations of Recent Hurricane and Typhoon Activity: The Impact of SSTs and the Madden Julian Oscillation

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried; Kang, In-Sik; Reale, Oreste

    2009-01-01

    This talk gives an update on the progress and further plans for a coordinated project to carry out and analyze high-resolution simulations of tropical storm activity with a number of state-of-the-art global climate models. Issues addressed include the mechanisms by which SSTs control tropical storm activity on inter-annual and longer time scales, the modulation of that activity by the Madden Julian Oscillation on sub-seasonal time scales, as well as the sensitivity of the results to model formulation. The project also encourages companion coarser-resolution runs to help assess resolution dependence, and the ability of the models to capture the large-scale and long-term changes in the parameters important for hurricane development. Addressing the above science questions is critical to understanding the nature of the variability of the Asian-Australian monsoon and its regional impacts, and thus CLIVAR RAMP fully endorses the proposed tropical storm simulation activity. The project is open to all interested organizations and investigators, and the results from the runs will be shared among the participants, as well as made available to the broader scientific community for analysis.

  1. The Terra Data Fusion Project: An Update

    NASA Astrophysics Data System (ADS)

    Di Girolamo, L.; Bansal, S.; Butler, M.; Fu, D.; Gao, Y.; Lee, H. J.; Liu, Y.; Lo, Y. L.; Raila, D.; Turner, K.; Towns, J.; Wang, S. W.; Yang, K.; Zhao, G.

    2017-12-01

    Terra is the flagship of NASA's Earth Observing System. Launched in 1999, Terra's five instruments continue to gather data that enable scientists to address fundamental Earth science questions. By design, the strength of the Terra mission has always been rooted in its five instruments and the ability to fuse the instrument data together for obtaining greater quality of information for Earth Science compared to individual instruments alone. As the data volume grows and the central Earth Science questions move towards problems requiring decadal-scale data records, the need for data fusion and the ability for scientists to perform large-scale analytics with long records have never been greater. The challenge is particularly acute for Terra, given its growing volume of data (> 1 petabyte), the storage of different instrument data at different archive centers, the different file formats and projection systems employed for different instrument data, and the inadequate cyberinfrastructure for scientists to access and process whole-mission fusion data (including Level 1 data). Sharing newly derived Terra products with the rest of the world also poses challenges. As such, the Terra Data Fusion Project aims to resolve two long-standing problems: 1) How do we efficiently generate and deliver Terra data fusion products? 2) How do we facilitate the use of Terra data fusion products by the community in generating new products and knowledge through national computing facilities, and disseminate these new products and knowledge through national data sharing services? Here, we will provide an update on significant progress made in addressing these problems by working with NASA and leveraging national facilities managed by the National Center for Supercomputing Applications (NCSA). 
The problems that we faced in deriving and delivering Terra L1B2 basic, reprojected and cloud-element fusion products, such as data transfer, data fusion, processing on different computer architectures, science, and sharing, will be presented with quantitative specifics. Results from several science-specific drivers for Terra fusion products will also be presented. We demonstrate that the Terra Data Fusion Project itself provides an excellent use-case for the community addressing Big Data and cyberinfrastructure problems.

  2. Lessons learned from post-wildfire monitoring and implications for land management and regional drinking water treatability in Southern Rockies of Alberta

    NASA Astrophysics Data System (ADS)

    Diiwu, J.; Silins, U.; Kevin, B.; Anderson, A.

    2008-12-01

    Like many areas of the Rocky Mountains, Alberta's forests on the eastern slopes of the Rockies have been shaped by decades of successful fire suppression. These forests are at high risk of fire and large-scale insect infestation, and climate change will continue to increase these risks. These headwaters forests provide the vast majority of usable surface water supplies to a large region of the province, and large-scale natural disasters can have dramatic effects on water quality and water availability. The population in the region has steadily increased, and this area is now the main source of water for many Alberta municipalities, including the City of Calgary, which has a population of over one million. In 2003 a fire burned 21,000 ha in the southern foothills area. Government land managers were concerned about the downstream implications of the fire and salvage operations; however, there was very limited scientific information to guide decision making. This led to the establishment of the Southern Rockies Watershed Project, a partnership between Alberta Sustainable Resource Development (the provincial government department responsible for land management) and the University of Alberta. After five years of data collection, the project has produced quantitative information that was not previously available about the effects of fire and management interventions such as salvage logging on headwaters and regional water quality. This information can be used to make decisions on forest operations, fire suppression, and post-fire salvage operations. In the past few years this project has captured the interest of large municipalities and water treatment researchers who are keen to investigate the potential implications of large natural disturbances for large and small drinking water treatment facilities. Examples from this project will be used to highlight the challenges and successes encountered while bridging the gap between science and land management policy.

  3. Project Management Life Cycle Models to Improve Management in High-rise Construction

    NASA Astrophysics Data System (ADS)

    Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana

    2018-03-01

    The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting the whole large-scale project into a "project chain" improves the manageability of large-scale building projects and increases the efficiency of the activities of all participants in such projects.

  4. Global Patterns in Students' Views of Science and Interest in Science

    NASA Astrophysics Data System (ADS)

    van Griethuijsen, Ralf A. L. F.; van Eijck, Michiel W.; Haste, Helen; den Brok, Perry J.; Skinner, Nigel C.; Mansour, Nasser; Savran Gencer, Ayse; BouJaoude, Saouma

    2015-08-01

    International studies have shown that interest in science and technology among primary and secondary school students in Western European countries is low and seems to be decreasing. In many countries outside Europe, and especially in developing countries, interest in science and technology remains strong. As part of the large-scale European Union funded `Science Education for Diversity' project, a questionnaire probing potential reasons for this difference was completed by students in the UK, Netherlands, Turkey, Lebanon, India and Malaysia. This questionnaire sought information about favourite courses, extracurricular activities and views on the nature of science. Over 9,000 students aged mainly between 10 and 14 years completed the questionnaire. Results revealed that students in countries outside Western Europe showed a greater interest in school science, in careers related to science and in extracurricular activities related to science than did Western European students. Non-European students were also more likely to hold an empiricist view of the nature of science and to believe that science can solve many problems faced by the world. Multilevel analysis revealed a strong correlation between interest in science and having such a view of the Nature of Science.

  5. Investigating the Impact of NGSS-Aligned Professional Development on PreK-3 Teachers' Science Content Knowledge and Pedagogy

    NASA Astrophysics Data System (ADS)

    Tuttle, Nicole; Kaderavek, Joan N.; Molitor, Scott; Czerniak, Charlene M.; Johnson-Whitt, Eugenia; Bloomquist, Debra; Namatovu, Winnifred; Wilson, Grant

    2016-11-01

    This pilot study investigates the impact of a 2-week professional development Summer Institute on PK-3 teachers' knowledge and practices. This Summer Institute is a component of [program], a large-scale early-childhood science project that aims to transform PK-3 science teaching. The mixed-methods study examined concept maps, lesson plans, and classroom observations to measure possible changes in PK-3 teachers' science content knowledge and classroom practice from 11 teachers who attended the 2014 Summer Institute. Analysis of the concept maps demonstrated statistically significant growth in teachers' science content knowledge. Analysis of teachers' lesson plans demonstrated that the teachers could design high quality science inquiry lessons aligned to the Next Generation Science Standards following the professional development. Finally, examination of teachers' pre- and post-Summer Institute videotaped inquiry lessons showed evidence that teachers were incorporating new inquiry practices into their teaching, especially regarding classroom discourse. Our results suggest that an immersive inquiry experience is effective at beginning a shift towards reform-aligned science and engineering instruction but that early elementary educators require additional support for full mastery.

  6. PoPLAR: Portal for Petascale Lifescience Applications and Research

    PubMed Central

    2013-01-01

    Background We are focusing specifically on fast data analysis and retrieval in bioinformatics that will have a direct impact on the quality of human health and the environment. The exponential growth of data generated in biology research, from small atoms to big ecosystems, necessitates an increasingly large computational component to perform analyses. Novel DNA sequencing technologies and complementary high-throughput approaches--such as proteomics, genomics, metabolomics, and meta-genomics--drive data-intensive bioinformatics. While individual research centers or universities could once provide for these applications, this is no longer the case. Today, only specialized national centers can deliver the level of computing resources required to meet the challenges posed by rapid data growth and the resulting computational demand. Consequently, we are developing massively parallel applications to analyze the growing flood of biological data and contribute to the rapid discovery of novel knowledge. Methods The efforts of previous National Science Foundation (NSF) projects provided for the generation of parallel modules for widely used bioinformatics applications on the Kraken supercomputer. We have profiled and optimized the code of some of the scientific community's most widely used desktop and small-cluster-based applications, including BLAST from the National Center for Biotechnology Information (NCBI), HMMER, and MUSCLE; scaled them to tens of thousands of cores on high-performance computing (HPC) architectures; made them robust and portable to next-generation architectures; and incorporated these parallel applications in science gateways with a web-based portal. Results This paper will discuss the various developmental stages, challenges, and solutions involved in taking bioinformatics applications from the desktop to petascale with a front-end portal for very-large-scale data analysis in the life sciences. 
Conclusions This research will help to bridge the gap between the rate of data generation and the speed at which scientists can study this data. The ability to rapidly analyze data at such a large scale is having a significant, direct impact on science achieved by collaborators who are currently using these tools on supercomputers. PMID:23902523

  7. Ecological science and sustainability for the 21st century

    USGS Publications Warehouse

    Palmer, Margaret A.; Bernhardt, Emily S.; Chornesky, Elizabeth A.; Collins, Scott L.; Dobson, Andrew P.; Duke, Clifford S.; Gold, Barry; Jacobson, Robert B.; Kingsland, Sharon E.; Kranz, Rhonda H.; Mappin, Michael J.; Martinez, M. Luisa; Micheli, Fiorenza; Morse, Jennifer L.; Pace, Michael L.; Pascual, Mercedes; Palumbi, Stephen S.; Reichman, O. J.; Townsend, Alan R.; Turner, Monica G.

    2005-01-01

    Ecological science has contributed greatly to our understanding of the natural world and the impact of humans on that world. Now, we need to refocus the discipline towards research that ensures a future in which natural systems and the humans they include coexist on a more sustainable planet. Acknowledging that managed ecosystems and intensive exploitation of resources define our future, ecologists must play a greatly expanded role in communicating their research and influencing policy and decisions that affect the environment. To accomplish this, they will have to forge partnerships at scales and in forms they have not traditionally used. These alliances must act within three visionary areas: enhancing the extent to which decisions are ecologically informed; advancing innovative ecological research directed at the sustainability of the planet; and stimulating cultural changes within the science itself, thereby building a forward-looking and international ecology. We recommend: (1) a research initiative to enhance research project development, facilitate large-scale experiments and data collection, and link science to solutions; (2) procedures that will improve interactions among researchers, managers, and decision makers; and (3) efforts to build public understanding of the links between ecosystem services and humans.

  8. Crowd-Sourcing with K-12 citizen scientists: The Continuing Evolution of the GLOBE Program

    NASA Astrophysics Data System (ADS)

    Murphy, T.; Wegner, K.; Andersen, T. J.

    2016-12-01

    Twenty years ago, the Internet was still in its infancy, citizen science was a relatively unknown term, and the idea of a global citizen science database was unheard of. Then the Global Learning and Observations to Benefit the Environment (GLOBE) Program was proposed and this all changed. GLOBE was one of the first K-12 citizen science programs on a global scale. An initial large scale ramp-up of the program was followed by the establishment of a network of partners in countries and within the U.S. Now in the 21st century, the program has over 50 protocols in atmosphere, biosphere, hydrosphere and pedosphere, almost 140 million measurements in the database, a visualization system, collaborations with NASA satellite mission scientists (GPM, SMAP) and other scientists, as well as research projects by GLOBE students. As technology changed over the past two decades, it was integrated into the program's outreach efforts to existing and new members with the result that the program now has a strong social media presence. In 2016, a new app was launched which opened up GLOBE and data entry to citizen scientists of all ages. The app is aimed at fresh audiences, beyond the traditional GLOBE K-12 community. Groups targeted included: scouting organizations, museums, 4H, science learning centers, retirement communities, etc. to broaden participation in the program and increase the number of data available to students and scientists. Through the 20 years of GLOBE, lessons have been learned about changing the management of this type of large-scale program, the use of technology to enhance and improve the experience for members, and increasing community involvement in the program.

  9. Anthropogenic aerosols and the distribution of past large-scale precipitation change

    DOE PAGES

    Wang, Chien

    2015-12-28

    The climate response of precipitation to the effects of anthropogenic aerosols is a critical yet not fully understood aspect of climate science. Results of selected models that participated in the Coupled Model Intercomparison Project Phase 5 and data from the Twentieth Century Reanalysis Project suggest that, throughout the tropics and also in the extratropical Northern Hemisphere, aerosols largely dominated the distribution of precipitation changes relative to the preindustrial era in the second half of the last century. Aerosol-induced cooling has offset some of the warming caused by greenhouse gases from the tropics to the Arctic and has thus formed the gradients of surface temperature anomaly that enable the revealed precipitation change patterns to occur. Improved representation of aerosol-cloud interaction has been demonstrated to be the key factor for models to reproduce distributions of past precipitation change consistent with the reanalysis data.

  10. Environmental Data-Driven Inquiry and Exploration (EDDIE)- Water Focused Modules for interacting with Big Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Meixner, T.; Gougis, R.; O'Reilly, C.; Klug, J.; Richardson, D.; Castendyk, D.; Carey, C.; Bader, N.; Stomberg, J.; Soule, D. C.

    2016-12-01

    High-frequency sensor data are driving a shift in the Earth and environmental sciences. The availability of high-frequency data creates an engagement opportunity for undergraduate students in primary research by using large, long-term, sensor-based data directly in the scientific curriculum. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) has developed flexible classroom activity modules designed to meet a series of pedagogical goals that include (1) developing skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. In this presentation we will focus on a sequence of modules of particular interest to hydrologists: stream discharge, water quality and nutrient loading. Assessment results show that our modules are effective at making students more comfortable analyzing data, improving their understanding of statistical concepts, and strengthening their data analysis capabilities. This project is funded by an NSF TUES grant (NSF DEB 1245707).
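The kind of dataset manipulation the stream-discharge module asks of students, aggregating high-frequency readings into daily summaries and examining their variation, can be sketched as follows. All numbers are synthetic and the layout is invented for illustration; this is not project data or course material.

```python
import statistics
from collections import defaultdict

# Hypothetical 15-minute stream discharge readings: (day, discharge in m^3/s).
# 96 readings per day with a small within-day drift, over a one-week window.
readings = [(day, 2.0 + 0.5 * (day % 3) + 0.01 * i)
            for day in range(1, 8) for i in range(96)]

# Group the high-frequency record by day.
by_day = defaultdict(list)
for day, q in readings:
    by_day[day].append(q)

# Daily mean captures the signal; daily standard deviation captures variation,
# the statistical concept the modules emphasize.
daily_mean = {day: statistics.mean(qs) for day, qs in by_day.items()}
daily_sd = {day: statistics.stdev(qs) for day, qs in by_day.items()}

print(round(daily_mean[1], 3))  # 2.975
print(round(daily_sd[1], 3))    # 0.279
```

With real sensor exports the grouping step is the same; only the parsing of timestamps changes.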

  11. Position paper: the science of deep specification.

    PubMed

    Appel, Andrew W; Beringer, Lennart; Chlipala, Adam; Pierce, Benjamin C; Shao, Zhong; Weirich, Stephanie; Zdancewic, Steve

    2017-10-13

    We introduce our efforts within the project 'The science of deep specification' to work out the key formal underpinnings of industrial-scale formal specifications of software and hardware components, anticipating a world where large verified systems are routinely built out of smaller verified components that are also used by many other projects. We identify an important class of specification that has already been used in a few experiments that connect strong component-correctness theorems across the work of different teams. To help popularize the unique advantages of that style, we dub it deep specification, and we say that it encompasses specifications that are rich, two-sided, formal and live (terms that we define in the article). Our core team is developing a proof-of-concept system (based on the Coq proof assistant) whose specification and verification work is divided across largely decoupled subteams at our four institutions, encompassing hardware microarchitecture, compilers, operating systems and applications, along with cross-cutting principles and tools for effective specification. We also aim to catalyse interest in the approach, not just by basic researchers but also by users in industry. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).
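As a toy illustration of the "rich, two-sided" flavour of specification the authors describe, here is a hypothetical Lean 4 sketch (not drawn from the project's actual Coq developments): a small function together with a machine-checked theorem that both its implementer and its clients can rely on.

```lean
-- Hypothetical example: a formal, machine-checked specification of `mymax`.
def mymax (a b : Nat) : Nat :=
  if a ≤ b then b else a

-- The theorem states properties clients may assume and the implementation
-- must guarantee; the proof is checked by the proof assistant.
theorem mymax_spec (a b : Nat) : a ≤ mymax a b ∧ b ≤ mymax a b := by
  unfold mymax
  split <;> omega
```

The article's point is that such specifications become "live" when many independently developed components are verified against the same formal statements.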

  12. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
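    The multi-criteria evaluation this record describes is commonly implemented as a weighted overlay: each criterion is normalized to a common scale and combined under user-defined weights. The report's abstract does not give its actual algorithm, so the following is a minimal, hypothetical sketch of that generic technique; the layer names, cell scores, and weights are invented for illustration.

    ```python
    def weighted_overlay(layers, weights):
        """Combine normalized criterion layers into a single suitability surface.

        layers:  {layer_name: {cell_id: score in [0, 1]}}, higher = more suitable
        weights: {layer_name: non-negative user-defined importance}
        Returns  {cell_id: weighted-mean suitability in [0, 1]}.
        """
        total = sum(weights.values())
        cells = next(iter(layers.values())).keys()
        return {c: sum(w * layers[name][c] for name, w in weights.items()) / total
                for c in cells}

    # Illustrative two-criterion example on three candidate cells (values invented).
    layers = {
        "solar_resource": {"A": 0.9, "B": 0.7, "C": 0.4},
        "slope":          {"A": 0.3, "B": 0.8, "C": 0.9},  # 1.0 = flat terrain
    }
    weights = {"solar_resource": 3.0, "slope": 1.0}  # stakeholder favors resource
    scores = weighted_overlay(layers, weights)
    best = max(scores, key=scores.get)  # cell "A" wins under these weights
    ```

    A real tool would operate on rasters and apply hard exclusion masks (protected areas, steep slopes) before weighting, but the ranking step reduces to this weighted mean, which makes the stakeholder-dependent trade-offs explicit.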

  13. Project EDDIE: Improving Big Data skills in the classroom

    NASA Astrophysics Data System (ADS)

    Soule, D. C.; Bader, N.; Carey, C.; Castendyk, D.; Fuller, R.; Gibson, C.; Gougis, R.; Klug, J.; Meixner, T.; Nave, L. E.; O'Reilly, C.; Richardson, D.; Stomberg, J.

    2015-12-01

    High-frequency sensor-based datasets are driving a paradigm shift in the study of environmental processes. The online availability of high-frequency data creates an opportunity to engage undergraduate students in primary research by using large, long-term, sensor-based datasets in science courses. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) is developing flexible classroom activity modules designed to (1) improve quantitative and reasoning skills; (2) develop the ability to engage in scientific discourse and argument; and (3) increase students' engagement in science. A team of interdisciplinary faculty from private and public research universities and undergraduate institutions has developed these modules to meet a series of pedagogical goals that include (1) developing the skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. Assessment data from questionnaires and recordings collected during the 2014-2015 academic year show that our modules are effective at making students more comfortable analyzing data. Continued development is focused on improving student learning outcomes with statistical concepts like variation, randomness, and sampling, and on fostering scientific discourse during module engagement. In the coming year, an increased sample size will expand our assessment opportunities to comparison groups in upper-division courses and allow for evaluation of module-specific conceptual knowledge learned. This project is funded by an NSF TUES grant (NSF DEB 1245707).

  14. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  15. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
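    The efficiency metrics themselves are not reproduced in the abstract, but the core quantity behind "jobs use booked resources less efficiently" is the fraction of a job's booked core hours that were actually consumed. Below is a minimal, hypothetical sketch of such a metric; the field names and example jobs are invented, not drawn from the UPPMAX databases.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Job:
        project: str           # e.g. "ngs-0001" (hypothetical naming scheme)
        cores_booked: int      # cores reserved by the scheduler
        wall_hours: float      # elapsed wall-clock time of the job
        cpu_core_hours: float  # CPU time actually consumed, summed over cores
        peak_ram_gb: float     # maximum resident memory observed

    def core_hour_efficiency(job: Job) -> float:
        """Fraction of booked core hours actually used (1.0 = fully efficient)."""
        booked = job.cores_booked * job.wall_hours
        return job.cpu_core_hours / booked if booked else 0.0

    def mean_efficiency(jobs, predicate) -> float:
        """Average efficiency over the jobs matching a project predicate."""
        selected = [core_hour_efficiency(j) for j in jobs if predicate(j)]
        return sum(selected) / len(selected) if selected else 0.0

    # Invented example: an NGS job books 16 cores but mostly runs single-threaded,
    # while a non-NGS job keeps its booked cores busy.
    jobs = [
        Job("ngs-0001",  16, 10.0,  40.0, 120.0),
        Job("phys-0001", 16, 10.0, 150.0,  30.0),
    ]
    ngs_eff = mean_efficiency(jobs, lambda j: j.project.startswith("ngs"))
    ```

    Grouping such per-job scores by project type is one way to quantify the NGS vs. non-NGS efficiency gap the study reports; peak RAM can be compared against booked memory in the same fashion.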

  16. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Abstract Background Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  17. Blueprints for green biotech: development and application of standards for plant synthetic biology.

    PubMed

    Patron, Nicola J

    2016-06-15

    Synthetic biology aims to apply engineering principles to the design and modification of biological systems and to the construction of biological parts and devices. The ability to programme cells by providing new instructions written in DNA is a foundational technology of the field. Large-scale de novo DNA synthesis has accelerated synthetic biology by offering custom-made molecules at ever decreasing costs. However, for large fragments and for experiments in which libraries of DNA sequences are assembled in different combinations, assembly in the laboratory is still desirable. Biological assembly standards allow DNA parts, even those from multiple laboratories and experiments, to be assembled together using the same reagents and protocols. The adoption of such standards for plant synthetic biology has been cohesive for the plant science community, facilitating the application of genome editing technologies to plant systems and streamlining progress in large-scale, multi-laboratory bioengineering projects. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  18. Who Really Wants an Ambitious Large-Scale Restoration of the Seine Estuary? A Strategic Analysis of a Science-Policy Interface Locked in a Stalemate.

    PubMed

    Coreau, Audrey; Narcy, Jean-Baptiste; Lumbroso, Sarah

    2018-05-01

    The development of ecosystem knowledge is an essential condition for effective environmental management, but using available knowledge to solve environmental controversies is still difficult in "real" situations. This paper explores the conditions under which ecological knowledge could contribute to the environmental strategies and actions of stakeholders at the science-policy interface. Ecological restoration of the Seine estuary is an example of an environmental issue whose overall management has run into difficulties despite the production of a large amount of knowledge by a dedicated organization, GIP Seine Aval. Through an action-research project based on a futures study, we analyze the reasons for these difficulties and help GIP Seine Aval adopt a robust strategy to overcome them. According to our results, most local stakeholders involved in the large-scale restoration project emphasize the need for a clear divide between knowledge production and environmental action. This kind of divide may be strategic in a context where the robustness of environmental decisions depends strongly on the mobilization of "neutral" scientific knowledge. In our case study, however, the divide instead blocks action, because some powerful stakeholders continuously ask for more knowledge before taking action. The construction and analysis of possible future scenarios led to three alternative strategies for breaking this stalemate: (1) circumvent difficulties by creating indirect links between knowledge and actions; (2) use knowledge to sustain advocacy for the interests of each and every stakeholder; (3) involve citizens in decisions about knowledge production and use, so that environmental issues weigh more heavily on the local political agenda.

  19. NOVA making stuff: Season 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leombruni, Lisa; Paulsen, Christine Andrews

    Over the course of four weeks in fall 2013, 11.7 million Americans tuned in to PBS to follow host David Pogue as he led them in search of engineering and scientific breakthroughs poised to change our world. Levitating trains, quantum computers, robotic bees, and bomb-detecting plants—these were just a few of the cutting-edge innovations brought into the living rooms of families across the country in NOVA’s four-part series, Making Stuff: Faster, Wilder, Colder, and Safer. Each of the four one-hour programs gave viewers a behind-the-scenes look at novel technologies poised to change our world—showing them how basic research and scientific discovery can hold the keys to transforming how we live. Making Stuff Season 2 (MS2) combined true entertainment with educational value, creating a popular and engaging series that brought accessible science into the homes of millions. NOVA’s goal to engage the public with such technological innovation and basic research extended beyond the broadcast series, including a variety of online, educational, and promotional activities: original online science reporting, web-only short-form videos, a new online quiz-game, social media engagement and promotion, an educational outreach “toolkit” for science educators to create their own “makerspaces,” an online community of practice, a series of nationwide Innovation Cafés, educator professional development, a suite of teacher resources, an “Idealab,” participation in national conferences, and specialized station relations and marketing. A summative evaluation of the MS2 project indicates that overall, these activities helped make a significant impact on the viewers, users, and participants that NOVA reached. The final evaluation conducted by Concord Evaluation Group (CEG) confidently concluded that the broadcast, website, and outreach activities were successful at achieving the project’s intended impacts. 
CEG reported that the MS2 series and website content were successful in raising awareness and sparking interest in innovation, and increased public awareness that basic research leads to technological innovation; this interest was also sustained over a six month period. Efforts to create an online community of practice were also successful: the quality of collaboration increased, and community members felt supported while using Maker pedagogy. These findings provide clear evidence that large-scale science media projects like MS2 are an effective means of “moving the needle” on attitudes about and excitement for science. NOVA’s broadcast audience and ratings have always indicated that a large portion of the population is interested in and engages with educational science media on a weekly basis. Yet these evaluation results provide the empirical evidence that beyond being capable of attracting, maintaining, and growing a dedicated group of citizens interested in science, these shows—with their diverse content provided on a variety of media channels—are capable of sparking new interest in science, raising public awareness of the importance of science, and maintaining and growing that interest over time. In a country where approximately a quarter of the population doesn’t know the earth rotates around the sun,1 roughly half still don’t accept evolution,2 and about 20% don’t think climate change is happening,3 the importance of these findings cannot be overstated. The success of MS2 suggests that large-scale media projects dedicated to and linked by coverage of scientific “big ideas” are an effective means of shifting public opinion on—and improving understanding of—science. REFERENCES 1, 2 National Science Foundation, Science and Engineering Indicators (2014). Chapter 7: Science and Technology: Public Attitudes and Understanding. 3 Leiserowitz, A., Maibach, E., Roser-Renouf, C., Feinberg, G., & Rosenthal, S. (2014) Climate change in the American mind: April, 2014. 
Yale University and George Mason University. New Haven, CT: Yale Project on Climate Change Communication.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both with the well-tested preferred circulation regime approach and with very recently developed measures, the finite-amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence of extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  1. Visualization and Analysis of Multi-scale Land Surface Products via Giovanni Portals

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Kempler, Steven J.; Gerasimov, Irina V.

    2013-01-01

    Large volumes of MODIS land data products at multiple spatial resolutions have been integrated into the Giovanni online analysis system to support studies on land cover and land use changes, focused on the Northern Eurasia and Monsoon Asia regions through the LCLUC program. Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), providing a simple and intuitive way to visualize, analyze, and access Earth science remotely sensed and modeled data. Customized Giovanni Web portals (Giovanni-NEESPI and Giovanni-MAIRS) have been created to integrate land, atmospheric, cryospheric, and societal products, enabling researchers to do quick exploration and basic analyses of land surface changes, and their relationships to climate, at global and regional scales. This presentation shows a sample Giovanni portal page, lists selected data products in the system, and illustrates potential analyses with images and time series at global and regional scales, focusing on climatology and anomaly analysis. More information is available at the GES DISC MAIRS data support project portal: http://disc.sci.gsfc.nasa.gov/mairs.

  2. The Computer Science Technical Report (CS-TR) Project: A Pioneering Digital Library Project Viewed from a Library Perspective.

    ERIC Educational Resources Information Center

    Anderson, Greg; And Others

    1996-01-01

    Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries, which pioneered multi-institutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)

  3. How Technicians Can Lead Science Improvements in Any School: A Small-Scale Study in England

    ERIC Educational Resources Information Center

    Jones, Beth; Quinnell, Simon

    2015-01-01

    This article describes how seven schools in England improved their science provision by focusing on the professional development of their science technicians. In September 2013, the Gatsby Charitable Foundation funded the National Science Learning Centre to lead a project connecting secondary schools with experienced senior science technicians…

  4. Project BudBurst - Meeting the Needs of Climate Change Educators and Scientists

    NASA Astrophysics Data System (ADS)

    Henderson, S.

    2015-12-01

    It is challenging for many people to grasp what climate change means, because it plays out over long periods of time - decades or more. However, a number of citizen-science-based projects, including NEON's Project BudBurst, provide the opportunity both to learn about climate change and to advance scientific knowledge. In this presentation, we will share lessons learned from Project BudBurst. Project BudBurst is a national citizen science initiative designed to engage the public in observations of phenological (plant life cycle) events and to increase climate literacy. Project BudBurst is important from an educational perspective, but also because it enables scientists to broaden the geographic and temporal scale of their observations. The goals of Project BudBurst are to (1) increase awareness of phenology as an area of scientific study; (2) increase awareness of the impacts of changing climates on plants at a continental scale; and (3) increase science literacy by engaging participants in the scientific process. It was important to better understand if and how Project BudBurst is meeting these goals. Specifically, does participation by non-experts advance scientific knowledge? Does participation advance educational goals and outcomes? Is participation an effective approach to advancing science education in both formal and informal settings? Critical examination shows that Project BudBurst supports the advancement of scientific knowledge and the realization of educational objectives. Observations and measurements collected by citizen scientists are being used by researchers, as evidenced by the increase of such data in scientific publications. In addition, we found a significant increase in educators utilizing citizen science as part of their instruction, due in part to the resources and professional development materials available to them. 
Working with partners also demonstrated that the needs of both science and education are being met. Project BudBurst partners with the PhenoCam Network, National Geographic Society, US Fish and Wildlife Service, National Park Service, botanic gardens, science centers, and other organizations with both a scientific and an educational mission.

  5. JPRS Report, Science & Technology, Japan, MITI’s Large-Scale R&D Projects Reviewed

    DTIC Science & Technology

    1990-02-08

    [Scanned-document OCR residue; largely unreadable. Legible fragments mention pollution and red-tide remediation, active enzymes for cleaners and detergents, and intermediates and raw materials for cosmetics, medicines, and moisturizers.]

  6. Science-based requirements and operations development for the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    McConnachie, Alan W.; Flagey, Nicolas; Murowinski, Rick; Szeto, Kei; Salmon, Derrick; Withington, Kanoa; Mignot, Shan

    2016-07-01

    MSE is a wide-field telescope (1.5 square degree field of view) with an aperture of 11.25 m. It is dedicated to multi-object spectroscopy at several different spectral resolutions in the range R ~ 2500-40000 over a broad wavelength range (0.36-1.8 μm). MSE enables transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; the in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy and cluster-scale hosts; reverberation mapping of supermassive black holes in quasars. Here, we summarize the Observatory and describe the development of the top level science requirements and operational concepts. Specifically, we define the Science Requirements as the set of capabilities that allow certain high impact science programs to be conducted. We cross reference these science cases to the science requirements to illustrate the traceability of this approach. We further discuss the operations model for MSE and describe the development of the Operations Concept Document, one of the foundational documents for the project. We also discuss the next stage in the science based development of MSE, specifically the development of the initial Legacy Survey that will occupy a majority of time on the telescope over the first few years of operation.

  7. On the history of the quantum. Introduction to the HQ4 special issue

    NASA Astrophysics Data System (ADS)

    Navarro, Jaume; Blum, Alexander; Lehner, Christoph

    2017-11-01

    Eight years ago, a special issue in this journal published a dozen papers with new studies on the history of quantum physics. That issue was an output of a conference in Utrecht one year earlier, the second in a series organized by the then existing large-scale project coordinated by the Max Planck Institute for the History of Science and the Fritz Haber Institute. Since then, that project has produced a number of publications, workshops and other academic outcomes, but more importantly, it triggered the consolidation of an international community of historians and philosophers of science producing novel work on the history of quantum physics. Five years after the third meeting, which took place in Berlin in 2010, many of the scholars from that group and some new ones met for four days in Donostia/San Sebastian for the HQ4 meeting. The time was ripe for new results to be shared and discussed, and this issue collects some of the papers presented at that gathering.

  8. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  9. Citizen Science, Crowdsourcing and Big Data: A Scientific and Social Framework for Natural Resources and Environments

    NASA Astrophysics Data System (ADS)

    Glynn, P. D.; Jones, J. W.; Liu, S. B.; Shapiro, C. D.; Jenter, H. L.; Hogan, D. M.; Govoni, D. L.; Poore, B. S.

    2014-12-01

    We describe a conceptual framework for Citizen Science that can be applied to improve the understanding and management of natural resources and environments. For us, Citizen Science represents an engagement from members of the public, usually volunteers, in collaboration with paid professionals and technical experts to observe and understand natural resources and environments for the benefit of science and society. Our conceptual framework for Citizen Science includes crowdsourcing of observations (or sampling). It considers a wide range of activities, including volunteer and professional monitoring (e.g. weather and climate variables, water availability and quality, phenology, biota, image capture and remote sensing), as well as joint fact finding and analyses, and participatory mapping and modeling. Spatial distribution and temporal dynamics of the biophysical processes that control natural resources and environments are taken into account within this conceptual framework, as are the availability, scaling and diversity of tools and efforts that are needed to properly describe these biophysical processes. Opportunities are sought within the framework to properly describe, quality-control, archive, and make readily accessible the large amounts of information and traceable knowledge required to better understand and manage natural resources and environments. The framework also considers human motivational needs, primarily through a modern version of Maslow's hierarchy of needs. We examine several USGS-based Citizen Science efforts within the context of our framework, including the project called "iCoast - Did the Coast Change?", to understand the utility of the framework, its costs and benefits, and to offer concrete examples of how to expand and sustain specific projects. We make some recommendations that could aid its implementation on a national or larger scale. 
For example, implementation might be facilitated (1) through greater engagement of paid professionals, and (2) through the involvement of integrating entities, including institutions of learning and agencies with broad science responsibilities.

  10. Pavilion Lake Research Project - using multi-scaled approaches to understanding the provenance, maintenance and morphological characteristics of microbialites

    NASA Astrophysics Data System (ADS)

    Lim, D. S.; Brady, A. L.; Cardman, Z.; Cowie, B. R.; Forrest, A.; Marinova, M.; Shepard, R.; Laval, B.; Slater, G. F.; Gernhardt, M.; Andersen, D. T.; Hawes, I.; Sumner, D. Y.; Trembanis, A. C.; McKay, C. P.

    2009-12-01

    Microbialites can be metre-scale or larger discrete structures that cover kilometre-scale regions, for example in Pavilion Lake, British Columbia, Canada, while the organisms associated with their growth and development are much smaller (less than millimeter scale). As such, a multi-scaled approach to understanding their provenance, maintenance and morphological characteristics is required. Research members of the Pavilion Lake Research Project (PLRP) (www.pavilionlake.com) have been working to understand microbialite morphogenesis in Pavilion Lake, B.C., Canada and the potential for biosignature preservation in these carbonate rocks using a combination of field and lab based techniques. PLRP research participants have been: (1) exploring the physical and chemical limnological properties of the lake, especially as these characteristics pertain to microbialite formation, (2) using geochemical and molecular tools to test the hypothesized biological origin of the microbialites and the associated meso-scale processes, and (3) using geochemical and microscopic tools to characterize potential biosignature preservation in the microbialites on the micro scale. To address these goals, PLRP identified the need to (a) map Pavilion Lake to gain a contextual understanding of microbialite distribution and possible correlation between their lake-wide distribution and the ambient growth conditions, and (b) sample the microbialites, including those from deepest regions of the lake (60m). Initial assessments showed that PLRP science diving operations did not prove adequate for mapping and sample recovery in the large and deep (0.8 km x 5.7 km; 65m max depth) lake. As such, the DeepWorker Science and Exploration (DSE) program was established by the PLRP. At the heart of this program are two DeepWorker (DW) submersibles, single-person vehicles that offer Scientist-Pilots (SP) an opportunity to study the lake in a 1 atm pressurized environment. 
In addition, the use of Autonomous Underwater Vehicles (AUVs) for landscape-level geophysical mapping (side-scan and multibeam) provides an additional large-scale context for the microbialite associations. The multi-scaled approach undertaken by the PLRP team members has created an opportunity to weave together a comprehensive understanding of the modern microbialites in Pavilion Lake, and of their relevance to interpreting ancient carbonate fabrics. An overview of the team’s findings to date and on-going research will be presented.

  11. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  12. Cascading failure in the wireless sensor scale-free networks

    NASA Astrophysics Data System (ADS)

    Liu, Hao-Ran; Dong, Ming-Ru; Yin, Rong-Rong; Han, Li

    2015-05-01

    In practical wireless sensor networks (WSNs), the cascading failure caused by a failed node has a serious impact on network performance. In this paper, we investigate in depth the cascading failure of scale-free topologies in WSNs. First, a cascading failure model for scale-free topology in WSNs is studied. By analyzing the influence of node load on cascading failure, the critical load that triggers large-scale cascading failure is obtained. Then, based on the critical load, a control method for cascading failure is presented. Simulation experiments are performed to validate the effectiveness of the control method. The results show that the control method can effectively prevent cascading failure. Project supported by the Natural Science Foundation of Hebei Province, China (Grant No. F2014203239), the Autonomous Research Fund of Young Teacher in Yanshan University (Grant No. 14LGB017) and the Yanshan University Doctoral Foundation, China (Grant No. B867).
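    The load-capacity mechanism behind such cascades can be illustrated with a minimal, self-contained sketch (a generic Motter-Lai-style simulation, not the authors' model; the graph generator, the load rule and all parameter values here are assumptions for illustration only):

    ```python
    import random
    from collections import defaultdict

    def ba_graph(n, m, seed=42):
        """Grow a scale-free graph by preferential attachment (Barabasi-Albert style)."""
        rng = random.Random(seed)
        edges, targets, repeated = set(), list(range(m)), []
        for new in range(m, n):
            for t in set(targets):
                edges.add((new, t))
            repeated.extend(targets)        # nodes recur in proportion to their degree
            repeated.extend([new] * m)
            targets = rng.sample(repeated, m)
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        return adj

    def cascade(adj, alpha=0.2):
        """Fail the highest-load node and propagate: each node starts with
        load = degree and capacity = (1 + alpha) * load; a failed node's load
        is shared among its surviving neighbours, which fail in turn if
        overloaded.  Returns the total number of failed nodes."""
        load = {v: len(adj[v]) for v in adj}
        cap = {v: (1 + alpha) * load[v] for v in adj}
        failed, queue = set(), [max(adj, key=lambda v: load[v])]
        while queue:
            v = queue.pop()
            if v in failed:
                continue
            failed.add(v)
            alive = adj[v] - failed
            share = load[v] / max(len(alive), 1)
            for u in alive:
                load[u] += share
                if load[u] > cap[u]:
                    queue.append(u)
        return len(failed)

    adj = ba_graph(200, 2)
    print(cascade(adj, alpha=0.05), cascade(adj, alpha=1.0))
    ```

    With a tight tolerance margin (small alpha) the single failure sweeps through much of the network, while a generous margin confines it to the initial node; a critical load of the kind derived in the paper separates exactly these two regimes.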

  13. Critical role for hierarchical geospatial analyses in the design of fluvial research, assessment, and management

    EPA Science Inventory

    River science and management can be conducted at a range of spatiotemporal scales from reach to basin levels as long as the project goals and questions are matched correctly with the study design’s spatiotemporal scales and dependent variables. These project goals should also inc...

  14. Multiscale Cloud System Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell W.

    2009-01-01

    The central theme of this paper is to describe how cloud-system-resolving models (CRMs), with grid spacings of approximately 1 km, have been applied to various important problems in atmospheric science across a wide range of spatial and temporal scales, and how these applications relate to other modeling approaches. A long-standing problem concerns the representation of organized precipitating convective cloud systems in weather and climate models. Since CRMs resolve the mesoscale to large scales of motion (i.e., 10 km to global), they explicitly address the cloud system problem. By explicitly representing organized convection, CRMs bypass restrictive assumptions associated with convective parameterization, such as the scale gap between cumulus and large-scale motion. Dynamical models provide insight into the physical mechanisms involved in scale interaction and convective organization. Multiscale CRMs simulate convective cloud systems in computational domains up to global scale and have been applied in place of contemporary convective parameterizations in global models. Multiscale CRMs pose a new challenge for model validation, which is met in an integrated approach involving CRMs, operational prediction systems, observational measurements, and dynamical models in a new international project: the Year of Tropical Convection, which has an emphasis on organized tropical convection and its global effects.

  15. SAFRR AND Physics-Based Scenarios: The Power of Scientifically Credible Stories

    NASA Astrophysics Data System (ADS)

    Cox, D. A.; Jones, L.

    2015-12-01

    USGS's SAFRR (Science Application for Risk Reduction) Project and its predecessor, the Multi Hazards Demonstration Project, use the latest earth science to develop scenarios so that communities can improve disaster resilience. SAFRR has created detailed physics-based natural-disaster scenarios of a M7.8 San Andreas earthquake in southern California (ShakeOut), atmospheric-river storms rivaling the Great California Flood of 1862 (ARkStorm), a Tohoku-sized earthquake and tsunami in the eastern Aleutians (SAFRR Tsunami), and now a M7.05 quake on the Hayward Fault in the San Francisco Bay area (HayWired), as novel ways of providing science for decision making. Each scenario is scientifically plausible, deterministic, and large enough to demand attention but not too large to be believable. The scenarios address interacting hazards, requiring involvement of multiple science disciplines and user communities. The scenarios routinely expose hitherto unknown or ignored vulnerabilities, most often in cascading effects missed when impacts are considered in isolation. They take advantage of storytelling to provide decision makers with clear explanations and justifications for mitigation and preparedness actions, and have been used for national-to-local disaster response exercises and planning. Effectiveness is further leveraged by downscaling the scenarios to local levels. For example, although the ARkStorm scenario describes state-scale events and has been used that way by NASA and the Navy, SAFRR also partnered with FEMA to focus on two local areas, Ventura County in the coastal plain and the mountain setting of Lake Tahoe with downstream impacts in Reno, Sparks and Carson City. Downscaling and focused analyses increased usefulness to user communities, drawing new participants into the study. 
SAFRR scenarios have also motivated new research to answer questions uncovered by stakeholders, closing the circle of co-evolving disaster-science and disaster-response improvements.

  16. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    PubMed

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  17. Large scale silver nanowires network fabricated by MeV hydrogen (H+) ion beam irradiation

    NASA Astrophysics Data System (ADS)

    Honey, S.; Naseem, S.; Ishaq, A.; Maaza, M.; Bhatti, M. T.; Wan, D.

    2016-04-01

    A random two-dimensional large-scale nano-network of silver nanowires (Ag-NWs) is fabricated by MeV hydrogen (H+) ion beam irradiation. Ag-NWs are irradiated under the H+ ion beam at different ion fluences at room temperature. The Ag-NW network is fabricated by H+ ion beam-induced welding of Ag-NWs at intersecting positions. H+ ion beam-induced welding is confirmed by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). Moreover, the structure of the Ag-NWs remains stable under the H+ ion beam, and the networks are optically transparent. Morphology also remains stable under H+ ion beam irradiation. No slicing or cutting of Ag-NWs is observed under MeV H+ ion beam irradiation. The results show that the formation of the Ag-NW network proceeds through three steps: ion beam-induced thermal spikes lead to local heating of the Ag-NWs, then to the formation of simple junctions on a small scale, and finally to the formation of a large-scale network. This observation is useful for Ag-NW-based devices in space, where protons are abundant in the energy range from MeV to GeV. This high-quality Ag-NW network can also be used as a transparent electrode for optoelectronic devices. Project supported by the National Research Foundation of South Africa (NRF), the French Centre National pour la Recherche Scientifique, iThemba-LABS, the UNESCO-UNISA Africa Chair in Nanosciences & Nanotechnology, the Third World Academy of Science (TWAS), the Organization of Women in Science for the Developing World (OWSDW), the Abdus Salam ICTP via the Nanosciences African Network (NANOAFNET), and the Higher Education Commission (HEC) of Pakistan.

  18. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify the experimental process, validate data quality and associate trust values with scientific results. Current industrial-scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large-scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called the T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called the Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views, called materialized provenance views (MPV), to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in the literature.
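    The materialized-view idea can be sketched generically (this is not the PMS implementation, which works over an RDF store; the class, method and entity names below are illustrative): a lineage query that would otherwise walk the derivation graph on every call is answered from a precomputed view, which is invalidated whenever new provenance is recorded.

    ```python
    from collections import defaultdict

    class ProvenanceStore:
        """Toy provenance store: edges point from a result to the data it was
        derived from (assumed acyclic, as derivation histories are)."""
        def __init__(self):
            self.derived_from = defaultdict(set)
            self._lineage_view = None  # materialized view: entity -> full ancestor set

        def record(self, result, *sources):
            for s in sources:
                self.derived_from[result].add(s)
            self._lineage_view = None  # invalidate the view on update

        def _materialize(self):
            """Precompute every entity's full lineage (transitive closure)."""
            view = {}
            def ancestors(e):
                if e not in view:
                    view[e] = set()
                    for s in self.derived_from.get(e, ()):
                        view[e] |= {s} | ancestors(s)
                return view[e]
            for e in list(self.derived_from):
                ancestors(e)
            self._lineage_view = view

        def lineage(self, entity):
            """Answer a lineage query from the materialized view (a lookup
            once the view is built, instead of a graph traversal per query)."""
            if self._lineage_view is None:
                self._materialize()
            return self._lineage_view.get(entity, set())

    store = ProvenanceStore()
    store.record("gel_image", "sample_A", "protocol_1")
    store.record("genotype_call", "gel_image")
    print(sorted(store.lineage("genotype_call")))
    ```

    A production system would materialize such views inside the triple store and rewrite incoming queries against them; the cache-and-invalidate pattern is the same.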

  19. The Monsoon-90 / SALSA / EOS / SUDMED / SAHRA / HELP / USPP Experience: A Progression of Interdisciplinary Integration of Science and Decision Making over 20 years.

    NASA Astrophysics Data System (ADS)

    Chehbouni, G.; Goodrich, D.; Kustas, B.; Sorooshian, S.; Shuttleworth, J.; Richter, H.

    2008-12-01

    The Monsoon'90 Experiment conducted at the USDA-ARS Walnut Gulch Experimental Watershed in southeast Arizona was the start of a long arc of subsequent experiments and research that were larger, longer-term, more international, more interdisciplinary, and led to more direct integration of science for decision making and watershed management. In this era, much of our research and science must be more directly relevant to decision makers and natural resource managers as they increasingly require sophisticated levels of expert findings and scientific results (e.g., interdisciplinary) to make informed decisions. Significant effort beyond focused, single-discipline research is required to conduct the interdisciplinary science typical of large-scale field experiments. Even greater effort is required to effectively integrate our research across the physical and ecological sciences for direct use by policy and decision makers. This presentation will provide an overview of the evolution of this arc of experiments and long-term projects into a mature integrated science and decision-making program. It will discuss the transition in project focus from science and research for understanding; through science for addressing a need; to integrated science and policy development. At each stage the research conducted became more interdisciplinary, first across abiotic disciplines (hydrology, remote sensing, atmospheric science), then by merging abiotic and biotic disciplines (adding ecology and plant physiology), and finally through further integration of economic and social sciences with policy and decision making for resource management. Lessons learned from this experience will be reviewed with the intent of providing guidance to ensure that the resulting research is socially and scientifically relevant and will not only result in cutting-edge science but will also directly address the needs of policy makers and resource managers.

  20. Unravelling the structure of species extinction risk for predictive conservation science.

    PubMed

    Lee, Tien Ming; Jetz, Walter

    2011-05-07

    Extinction risk varies across species and space owing to the combined and interactive effects of ecology/life history and geography. For predictive conservation science to be effective, large datasets and integrative models that quantify the relative importance of potential factors and separate rapidly changing from relatively static threat drivers are urgently required. Here, we integrate and map in space the relative and joint effects of key correlates of The International Union for Conservation of Nature-assessed extinction risk for 8700 living birds. Extinction risk varies significantly with species' broad-scale environmental niche, geographical range size, and life-history and ecological traits such as body size, developmental mode, primary diet and foraging height. Even at this broad scale, simple quantifications of past human encroachment across species' ranges emerge as key in predicting extinction risk, supporting the use of land-cover change projections for estimating future threat in an integrative setting. A final joint model explains much of the interspecific variation in extinction risk and provides a remarkably strong prediction of its observed global geography. Our approach unravels the species-level structure underlying geographical gradients in extinction risk and offers a means of disentangling static from changing components of current and future threat. This reconciliation of intrinsic and extrinsic, and of past and future extinction risk factors may offer a critical step towards a more continuous, forward-looking assessment of species' threat status based on geographically explicit environmental change projections, potentially advancing global predictive conservation science.

  1. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  2. Partnering Community Decision Makers with Early Career Scientists - The NASA DEVELOP Method for Dual Capacity Building

    NASA Astrophysics Data System (ADS)

    Ross, K. W.; Childs-Gleason, L. M.; Cripps, G. S.; Clayton, A.; Remillard, C.; Watkins, L. E.; Allsbrook, K. N.; Rogers, L.; Ruiz, M. L.

    2017-12-01

    The NASA DEVELOP National Program carries out many projects every year with the goal of bringing the benefits of NASA Earth science to bear on decision-making challenges that are local in scale. Every DEVELOP project partners end users with early-career or transitioning science professionals. Many of these projects invited communities to consider NASA science data in new ways to help them make informed decisions. All of these projects shared three characteristics: they were rapid, nimble and risk-taking. These projects work well for some communities, but might be best suited as feasibility studies that build community and institutional capacity towards eventual solutions. This presentation will discuss DEVELOP's lessons learned and best practices in conducting short-term feasibility projects with communities, as well as highlight several past successes.

  3. ESF EUROCORES Programmes In Geosciences And Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Jonckheere, I. G.

    2007-12-01

    In close cooperation with its Member Organisations, the European Science Foundation (ESF) has, since late 2003, launched a series of European Collaborative Research (EUROCORES) Programmes. Their aim is to enable researchers in different European countries to develop cooperation and scientific synergy in areas where European scale and scope are required in a global context. The EUROCORES Scheme provides an open, flexible and transparent framework that allows national science funding and science performing agencies to join forces to support excellent European-led research, following a selection among many science-driven suggestions for new Programme themes submitted by the scientific community. The EUROCORES instrument represents the first large-scale attempt of national research (funding) agencies to act together against fragmentation, asynchronicity and duplication of research (funding) within Europe. There are presently 7 EUROCORES Programmes specifically dealing with cutting-edge science in the fields of Earth, Climate and Environmental Sciences. The EUROCORES Programmes consist of a number of international, multidisciplinary collaborative research projects running for 3-4 years, selected through independent peer review. Under the overall responsibility of the participating funding agencies, those projects are coordinated and networked together through the scientific guidance of a Scientific Committee, with the support of a Programme Coordinator, responsible at ESF for providing planning, logistics, and the integration and dissemination of science. Strong links are aimed for with other major international programmes and initiatives worldwide. In this framework, linkage to IYPE would be of major interest for the scientific communities involved. Each Programme mobilises 5 to 13 million Euros in direct science funding from 9 to 27 national agencies from 8 to 20 countries. 
Additional funding for coordination, networking and dissemination is allocated by the ESF through these distinctive research initiatives, to build on the national research efforts and contribute to the capacity building, in relation with typically about 15-20 post-doc positions and/or PhD studentships supported nationally within each Programme. Typical networking activities are topical workshops, open sessions in a larger conference, Programme conference, (summer / winter) schools, exchange visits across projects or programmes. Overall, EUROCORES Programmes are supported by more than 60 national agencies from 30 countries and by the European Science Foundation (ESF) with support by the European Commission, DG Research (Sixth Framework Programme, contract ERAS-CT-2003-980409). In the framework of AGU, a series of present EUROCORES Programmes in the field of Geosciences and Environmental Sciences are presented (e.g., EuroDIVERSITY, EuroDEEP, EUROMARGINS, EuroCLIMATE, and EuroMinScI).

  4. The Advanced Communications Technology Satellite (ACTS) capabilities for serving science

    NASA Technical Reports Server (NTRS)

    Meyer, Thomas R.

    1990-01-01

    Results of research on potential science applications of the NASA Advanced Communications Technology Satellite (ACTS) are presented. Discussed here are: (1) general research on communications related issues; (2) a survey of science-related activities and programs in the local area; (3) interviews of selected scientists and associated telecommunications support personnel whose projects have communications requirements; (4) analysis of linkages between ACTS functionality and science user communications activities and modes of operation; and (5) an analysis of survey results and the projection of conclusions to a national scale.

  5. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. 
This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundational infrastructure work on our Dyninst binary analysis and instrumentation toolkits and the MRNet scalability infrastructure.

  6. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-01-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…

  7. Homogenisation in project management for large German research projects in the Earth system sciences: overcoming the institutional coordination bias

    NASA Astrophysics Data System (ADS)

    Rauser, Florian; Vamborg, Freja

    2016-04-01

    The interdisciplinary project on High Definition Clouds and Precipitation for advancing climate prediction, HD(CP)2 (hdcp2.eu), is an example of the trend in fundamental research in Europe to increasingly focus on large national and international research programs that require strong scientific coordination. The current system has traditionally been host-based: project coordination activities and funding are placed at the host institute of the central lead PI of the project. This approach is simple and has the advantage of strong collaboration between project coordinator and lead PI, while exhibiting a list of strong, inherent disadvantages that are also mentioned in this session's description: no community best-practice development, lack of integration between similar projects, inefficient methodology development and usage, and finally poor career development opportunities for the coordinators. Project coordinators often leave the project before it is finalized, leaving some of the fundamentally important closing processes to the PIs. This systematically prevents the creation of professional science management expertise within academia, which leads to an automatic imbalance that hinders the outcomes of large research programs from informing future funding decisions. Project coordinators in academia often do not work in a professional project office environment that could distribute activities and use professional tools and methods across different projects. Instead, every new project manager has to take on methodological work anew (communication infrastructure, meetings, reporting), even though the technological needs of large research projects are similar. This decreases the efficiency of the coordination and leads to funding that is effectively misallocated. We propose to challenge this system by creating a permanent, virtual "Centre for Earth System Science Management CESSMA" (cessma.com), changing the approach from host-based to centre-based. 
This should complement the current system by creating permanent, sustained options for interaction between large research projects in similar fields. In the long run such a centre might improve on the host-based system, because the centre-based solution allows multiple projects to be coordinated in conjunction by experienced science managers, sharing overlap in meeting organization, reporting, infrastructure, travel and so on. To still maintain close cooperation between project managers and lead PIs, we envision a virtual centre that creates extensive collaborative opportunities by organizing yearly retreats, a shared technical database, et cetera. As "CESSMA" is work in progress (we have applied for funding for 2016-18), we would like to use this opportunity to discuss the prospects, potential problems, experiences and options for this attempt to institutionalise the very reason for this session: improved, coordinated, effective science coordination; and to create a central focal point for public/academia interactions.

  8. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructure (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed on desktops, laptops or computing clusters to effectively leverage national- and regional-scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 an 'Applied Cyberinfrastructure Concepts' project-based learning course (ISTA 420/520) at the University of Arizona focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course comprised 25 students with varying levels of computational skill and no prior domain background in the geosciences, who collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA Campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. 
The students were able to identify all necessary dependencies, recompile the source code for the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over Spring 2015. All of the code, documentation and workflow descriptions are currently available on GitHub, and a public data portal is in development. We present a case study of how students reacted to the challenge of a real science problem, their interactions with end users, what went right, and what could be done better in the future.
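    The tile-and-distribute pattern behind this kind of workflow scaling can be sketched in a few lines (a toy illustration, not the course's actual Makeflow/Work Queue workflow: the per-tile function is a stand-in for the real EEMT computation, and a thread pool stands in for distributed workers):

    ```python
    from concurrent.futures import ThreadPoolExecutor  # local pool stands in for remote workers
    from itertools import product

    def solar_proxy(tile):
        """Per-tile computation.  NOTE: a stand-in for the real EEMT formula,
        just a toy function of elevation."""
        (r0, c0), cells = tile
        return (r0, c0), [[0.001 * z for z in row] for row in cells]

    def split_tiles(grid, tile_size):
        """Cut a 2-D grid into (origin, subgrid) tiles, the unit of distributed work."""
        rows, cols = len(grid), len(grid[0])
        for r0, c0 in product(range(0, rows, tile_size), range(0, cols, tile_size)):
            yield (r0, c0), [row[c0:c0 + tile_size] for row in grid[r0:r0 + tile_size]]

    def run(grid, tile_size=2, workers=4):
        """Fan the tiles out to workers and stitch the results back by origin."""
        out = [[None] * len(grid[0]) for _ in grid]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for (r0, c0), res in pool.map(solar_proxy, split_tiles(grid, tile_size)):
                for i, row in enumerate(res):
                    out[r0 + i][c0:c0 + len(row)] = row
        return out

    dem = [[100 * r + c for c in range(4)] for r in range(4)]  # toy 4x4 "DEM"
    print(run(dem))
    ```

    Because the tiles are independent, the same decomposition runs unchanged whether the pool is a laptop, a campus cluster, or Work Queue workers scattered across a grid; only the executor changes.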

  9. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  10. From Access to Success: Identity Contingencies & African-American Pathways to Science

    ERIC Educational Resources Information Center

    Brown, Bryan A.; Henderson, J. Bryan; Gray, Salina; Donovan, Brian; Sullivan, Shayna

    2013-01-01

    We conducted a mixed-methodological study of matriculation issues for African-American students in science. The project compares the experiences of students currently majoring in science (N = 304) with the experiences of those who have succeeded in earning science degrees (N = 307). Using a 57-item Likert scale questionnaire, participants were…

  11. Science Base and Tools for Evaluating Stream Restoration Project Proposals.

    NASA Astrophysics Data System (ADS)

    Cluer, B.; Thorne, C.; Skidmore, P.; Castro, J.; Pess, G.; Beechie, T.; Shea, C.

    2008-12-01

    Stream restoration, stabilization, or enhancement projects typically employ site-specific designs, and site-scale habitat improvement projects have become the default solution to many habitat problems and constraints. Such projects are often planned and implemented without thorough consideration of the broader-scale problems that may be contributing to habitat degradation, attention to project resiliency to flood events, accounting for possible changes in climate or watershed land use, or ensuring the long-term sustainability of the project. To address these issues, NOAA Fisheries and USFWS have collaboratively commissioned research to develop a science document and accompanying tools to support more consistent and comprehensive review of stream management and restoration project proposals by Service staff responsible for permitting. The science document synthesizes the body of knowledge in fluvial geomorphology and presents it in a way that is accessible to the Services' staff biologists, who are not trained experts in this field. Accompanying the science document are two electronic tools: a Project Information Checklist to assist in evaluating whether a proposal includes all the information necessary to allow critical and thorough project evaluation; and a Project Evaluation Tool (in flow chart format) that guides reviewers through the steps necessary to critically evaluate the quality of the information submitted, the goals and objectives of the project, project planning and development, project design, geomorphic-habitat-species relevance, and risks to listed species. Materials for training Services staff and others in the efficient use of the science document and tools have also been developed. 
The longer term goals of this effort include: enabling consistent and comprehensive reviews that are completed in a timely fashion by regulators; facilitating improved project planning and design by proponents; encouraging projects that are attuned to their watershed and geomorphic contexts; questioning perceived constraints on project design; reducing the use of hard structures and encouraging deformability; promoting designs that address both risk and uncertainty in applying engineering design standards; allowing for future climate and land use changes; and encouraging post-project monitoring, appraisal and project aftercare.

  12. Experienced and Novice Teachers' Concepts of Spatial Scale

    ERIC Educational Resources Information Center

    Jones, M. Gail; Tretter, Thomas; Taylor, Amy; Oppewal, Tom

    2008-01-01

    Scale is one of the thematic threads that runs through nearly all of the sciences and is considered one of the major prevailing ideas of science. This study explored novice and experienced teachers' concepts of spatial scale with a focus on linear sizes from very small (nanoscale) to very large (cosmic scale). Novice teachers included…

  13. Dagik Earth: A Digital Globe Project for Classrooms, Science Museums, and Research Institutes

    NASA Astrophysics Data System (ADS)

    Saito, A.; Tsugawa, T.

    2017-12-01

    A digital globe system is a powerful tool for helping audiences understand phenomena on the Earth and planets in an intuitive way. The Geo-cosmos at Miraikan, Japan, uses a 6-m spherical LED display and is one of the largest digital globe systems. Science on a Sphere (SOS) by NOAA is the digital globe system most widely used in science museums around the world. These systems are so expensive that the use of digital globes has been largely limited to large-scale science museums. Dagik Earth is a digital globe project that promotes educational programs using a low-cost digital globe, aimed especially at classroom use. The cost of a Dagik Earth digital globe starts at several US dollars if a PC and projector are already available: it uses white spheres, such as balloons and balance balls, as the screen. The software is provided by the project free of charge for educational use and runs on Windows, Mac, and iOS devices; English and Chinese versions of the PC software are available in addition to the Japanese version. There are about 1,400 registered users of Dagik Earth in Japan. About 60% of them belong to schools, 30% to universities and research institutes, and 8% to science museums. In schools, it is used in classes by teachers and in science activities by students; several teachers have used the system for five years or more. In one student activity, Dagik Earth content on typhoons, a solar eclipse, and a satellite launch was created and presented at a school festival, a good example of using Dagik Earth for STEM education. In the presentation, the system and activities of Dagik Earth will be described, and the future expansion of the project will be discussed.

  14. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, whose data volumes and throughput rates are orders of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full physics data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled highly fault-tolerant computing in order to achieve large-scale processing as well as operational cost savings.
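The core idea behind tolerating spot-market reclamation is that workers can disappear at any time, so tasks must be idempotent and re-runnable. A minimal sketch of that retry pattern is below; the task function, exception type, and return value are illustrative stand-ins, not part of HySDS.

```python
import time

def run_with_retries(task, max_attempts=5, base_delay=0.0):
    """Re-run an idempotent task until it succeeds, tolerating
    spot-instance-style interruptions surfaced as exceptions."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except InterruptedError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

# Simulated worker that is "reclaimed" twice before completing.
failures = iter([True, True, False])

def flaky_task():
    if next(failures):
        raise InterruptedError("spot instance reclaimed")
    return "L2 product complete"
```

In a real system the retry would be driven by a task queue that re-dispatches work to a freshly bid instance, but the control flow is the same.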

  15. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papka, M.; Messina, P.; Coffey, R.

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. 
The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.

  16. Puget Sound Shorelines and the Impacts of Armoring-Proceedings of a State of the Science Workshop, May 2009

    USGS Publications Warehouse

    Shipman, Hugh; Dethier, Megan N.; Gelfenbaum, Guy R.; Fresh, Kurt L.; Dinicola, Richard S.

    2010-01-01

    The widespread extent and continued construction of seawalls and bulkheads on Puget Sound's beaches has emerged as a significant issue in shoreline management and coastal restoration in the region. Concerns about the impacts of shoreline armoring and managing the potential risks to coastal property are in many ways similar to those in other places, but Puget Sound also poses unique challenges related to its sheltered setting, glacially formed geology, rich estuarine ecology, and historical development pattern. The effects of armoring on shorelines are complex, involving both physical and biological science and requiring consideration of the cumulative impacts of small-scale activities over large scales of space and time. In addition, the issue is controversial, as it often places strongly held private interests in protecting shoreline property against broad public mandates to preserve shorelines for public uses and to protect environmental resources. Communities making difficult decisions about regulating shoreline activities and prioritizing restoration projects need to be informed by the best science available. To address these issues, a scientific workshop was convened in May 2009, specifically to bring local and national experts together to review the state of the science regarding the physical and biological impacts of armoring on sheltered shorelines such as those of Puget Sound.

  17. Making continental-scale environmental programs relevant locally for educators with Project BudBurst

    NASA Astrophysics Data System (ADS)

    Goehring, L.; Henderson, S.; Wasser, L.; Newman, S. J.; Ward, D.

    2012-12-01

    Project BudBurst is a national citizen science initiative designed to engage non-professionals in observations of phenological (plant life cycle) events that raise awareness of climate change and create a cadre of informed citizen scientists. Citizen science programs such as Project BudBurst provide excellent opportunities for educators and their students to actively participate in scientific research. Such programs are important not only from an educational perspective, but because they also enable scientists to broaden the geographic and temporal scale of their observations. The goals of Project BudBurst are to 1) increase awareness of phenology as an area of scientific study; 2) increase awareness of the impacts of changing climates on plants at a continental scale; and 3) increase science literacy by engaging participants in the scientific process. Since its 2008 launch, this online program has engaged participants of all ages and walks of life in recording the timing of the leafing and flowering of wild and cultivated species found across the continent, and in contemplating the meaning of such data in their local environments. Thus far, thousands of participants from all 50 states have submitted data. This presentation will provide an overview of Project BudBurst educational resources and share lessons learned from educators in implementing the program in formal and informal education settings. Lesson plans and tips from educators will be highlighted. Project BudBurst is co-managed by the National Ecological Observatory Network and the Chicago Botanic Garden.

  18. Molecular Dynamics-based Simulations of Bulk/Interfacial Structures and Diffusion Behaviors in Nuclear Waste Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Jincheng; Rimsza, Jessica; Deng, Lu

    This NEUP project aimed to generate accurate atomic structural models of nuclear waste glasses using large-scale molecular dynamics-based computer simulations, and to use these models to investigate self-diffusion behaviors, interfacial structures, and the hydrated gel structures formed during dissolution of these glasses. The goal was to obtain realistic and accurate short- and medium-range structures of these complex oxide glasses, to provide a mechanistic understanding of their dissolution behaviors, and to generate reliable information with predictive power for designing nuclear waste glasses for long-term geological storage. Looking back on the research accomplishments of this project, most of the scientific goals initially proposed have been achieved through intensive research over the three-and-a-half-year period of the project. The project has also generated a wealth of scientific data and vibrant discussions with various groups through collaborations within and outside the project. Throughout the project, one book chapter and 14 peer-reviewed journal publications have been generated (including one under review), and 16 presentations (including 8 invited talks) have been made to disseminate the results at national and international conferences. Furthermore, the project has trained several outstanding graduate students and young researchers for the future workforce in nuclear-related fields, especially nuclear waste immobilization. One postdoc and four PhD students have been fully or partially supported through the project, with intensive training in materials science and engineering and expertise in glass science and nuclear waste disposal.

  19. Intercomparison Project on Parameterizations of Large-Scale Dynamics for Simulations of Tropical Convection

    NASA Astrophysics Data System (ADS)

    Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.

    2013-12-01

    Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. 
Interested groups will be invited to join (it will not be too late), and preliminary results will be presented.
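The weak temperature gradient idea named above can be illustrated with a schematic one-level calculation: the free-tropospheric temperature anomaly is assumed to be removed by adiabatic vertical motion over a short relaxation timescale, which diagnoses the large-scale vertical velocity. This is a heavily simplified sketch of the closure, with illustrative names and a nominal timescale; the actual WTG and DGW implementations in the intercomparison are vertically resolved and model-specific.

```python
def wtg_vertical_velocity(theta_anom, dtheta_dz, tau=3 * 3600.0):
    """Single-level sketch of a weak temperature gradient (WTG) closure:
    diagnose the large-scale vertical velocity w (m/s) such that the
    adiabatic term w * dtheta/dz relaxes the potential-temperature
    anomaly theta_anom (K) over a timescale tau (s)."""
    return theta_anom / (tau * dtheta_dz)
```

For a 1 K warm anomaly, a static stability of 5 K/km, and a 2-hour timescale, this gives an ascent of a few centimeters per second, the right order of magnitude for convecting tropical regions.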

  20. Accelerators for society: succession of European infrastructural projects: CARE, EuCARD, TIARA, EuCARD2

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2013-10-01

    Accelerator science and technology is one of the key enablers of developments in particle physics and photon physics, as well as of applications in medicine and industry. The paper presents a digest of research results in the domain of accelerator science and technology in Europe, obtained during the realization of CARE (Coordinated Accelerator R&D) and EuCARD (European Coordination of Accelerator R&D) and during the annual review meeting of TIARA (Test Infrastructure of European Research Area in Accelerator R&D). The European projects on accelerator technology started in 2003 with CARE. TIARA is a European collaboration on accelerator technology which, by running research, technical, networking, and infrastructural projects, has a duty to integrate the research and technical communities and infrastructures across Europe. The collaboration gathers all research centers with large accelerator infrastructures; other institutions, such as universities, are affiliated as associate members. TIARA-PP (preparatory phase) is a European infrastructural project run by this consortium and realized inside EU-FP7. The paper presents a general overview of CARE, EuCARD, and especially TIARA activities, with an introduction containing a portrait of contemporary accelerator technology and a digest of its applications in modern society. CARE, EuCARD, and TIARA activities have integrated the European accelerator community in a very effective way, and these projects are widely expected to be continued.

  1. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
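The defuzzification step described in the abstract can be made concrete for the common case of triangular fuzzy numbers. The exact operator used in the paper is not reproduced here; the form below is one widely used expected-value operator with an optimistic-pessimistic index, shown only as an assumed illustration.

```python
def expected_value_triangular(a, b, c, lam=0.5):
    """Expected value of a triangular fuzzy number (a, b, c), a <= b <= c,
    under an optimistic-pessimistic index lam in [0, 1]: lam = 1 weights
    the optimistic (upper) side fully, lam = 0 the pessimistic (lower)
    side. With lam = 0.5 this reduces to the familiar (a + 2b + c) / 4."""
    return ((1 - lam) * (a + b) + lam * (b + c)) / 2.0
```

Applied to, say, a fuzzy activity duration (2, 4, 10) days, a neutral decision maker (lam = 0.5) would schedule with a crisp value of 5 days, while a fully optimistic one (lam = 1) would use 7 days of the upper support; the defuzzified durations then feed the particle swarm scheduler.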

  2. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  3. Authentic Research Experience and “Big Data” Analysis in the Classroom: Maize Response to Abiotic Stress

    PubMed Central

    Makarevitch, Irina; Frechette, Cameo; Wiatros, Natalia

    2015-01-01

    Integration of inquiry-based approaches into curriculum is transforming the way science is taught and studied in undergraduate classrooms. Incorporating quantitative reasoning and mathematical skills into authentic biology undergraduate research projects has been shown to benefit students in developing various skills necessary for future scientists and to attract students to science, technology, engineering, and mathematics disciplines. While large-scale data analysis became an essential part of modern biological research, students have few opportunities to engage in analysis of large biological data sets. RNA-seq analysis, a tool that allows precise measurement of the level of gene expression for all genes in a genome, revolutionized molecular biology and provides ample opportunities for engaging students in authentic research. We developed, implemented, and assessed a series of authentic research laboratory exercises incorporating a large data RNA-seq analysis into an introductory undergraduate classroom. Our laboratory series is focused on analyzing gene expression changes in response to abiotic stress in maize seedlings; however, it could be easily adapted to the analysis of any other biological system with available RNA-seq data. Objective and subjective assessment of student learning demonstrated gains in understanding important biological concepts and in skills related to the process of science. PMID:26163561
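The kind of "big data" computation students perform on RNA-seq output can be as simple as a per-gene fold-change calculation over a count table. The sketch below is a classroom-style illustration with hypothetical gene names and counts, not the course's actual pipeline, and is no substitute for a statistical package such as DESeq2.

```python
import math

def log2_fold_change(stress_count, control_count, pseudocount=1.0):
    """Per-gene log2 fold change from raw RNA-seq read counts, with a
    pseudocount to avoid division by zero for unexpressed genes."""
    return math.log2((stress_count + pseudocount) / (control_count + pseudocount))

# Hypothetical (stress, control) read counts for two genes.
counts = {"gene_a": (40, 10), "gene_b": (5, 5)}
changes = {g: log2_fold_change(s, c) for g, (s, c) in counts.items()}
```

Here `gene_a` is up-regulated roughly four-fold under stress while `gene_b` is unchanged; in practice counts would first be normalized for library size before comparison.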

  4. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute of Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. In Defense of the National Labs and Big-Budget Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. 
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Tokamak Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. 
Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET in Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.

  6. Marine anthropogenic litter on British beaches: A 10-year nationwide assessment using citizen science data.

    PubMed

    Nelms, S E; Coombes, C; Foster, L C; Galloway, T S; Godley, B J; Lindeque, P K; Witt, M J

    2017-02-01

    Growing evidence suggests that anthropogenic litter, particularly plastic, represents a highly pervasive and persistent threat to global marine ecosystems. Multinational research is progressing to characterise its sources, distribution and abundance so that interventions aimed at reducing future inputs and clearing extant litter can be developed. Citizen science projects, whereby members of the public gather information, offer a low-cost method of collecting large volumes of data with considerable temporal and spatial coverage. Furthermore, such projects raise awareness of environmental issues and can lead to positive changes in behaviours and attitudes. We present data collected over a decade (2005-2014 inclusive) by Marine Conservation Society (MCS) volunteers during beach litter surveys carried out along the British coastline, with the aim of increasing knowledge on the composition, spatial distribution and temporal trends of coastal debris. Unlike many citizen science projects, the MCS beach litter survey programme gathers information on the number of volunteers, duration of surveys and distances covered. This comprehensive information provides an opportunity to standardise data for variation in sampling effort among surveys, enhancing the value of outputs and robustness of findings. We found that plastic is the main constituent of anthropogenic litter on British beaches and the majority of traceable items originate from land-based sources, such as public littering. We identify the coast of the Western English Channel and Celtic Sea as experiencing the highest relative litter levels. Increasing trends over the 10-year time period were detected for a number of individual item categories, yet no statistically significant change in total (effort-corrected) litter was detected. We discuss the limitations of the dataset and make recommendations for future work. 
The study demonstrates the value of citizen science data in providing insights that would otherwise not be possible due to logistical and financial constraints of running government-funded sampling programmes on such large scales. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
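The effort-standardisation idea described above (normalising litter counts by the length of beach surveyed so that surveys of different extents become comparable) can be sketched as follows. This is an illustrative simplification only; the function and field names are hypothetical and the MCS programme also records volunteer numbers and survey duration, which a full standardisation would use.

```python
# Hypothetical sketch: convert raw litter counts to items per 100 m surveyed,
# so surveys of different lengths can be compared. Names are illustrative.

def litter_density(item_count, distance_m, per_metres=100):
    """Items per `per_metres` metres of beach surveyed."""
    if distance_m <= 0:
        raise ValueError("survey distance must be positive")
    return item_count * per_metres / distance_m

# Two hypothetical surveys of different lengths:
surveys = [
    {"items": 412, "distance_m": 500},
    {"items": 90, "distance_m": 150},
]
densities = [litter_density(s["items"], s["distance_m"]) for s in surveys]
print(densities)
```

After this correction, trends are compared on a per-distance basis rather than on raw counts, which is what allows the abstract's "no statistically significant change in total (effort-corrected) litter" conclusion to be meaningful across surveys of varying effort.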

  7. Post-Cold War Science and Technology at Los Alamos

    NASA Astrophysics Data System (ADS)

    Browne, John C.

    2002-04-01

    Los Alamos National Laboratory serves the nation through the development and application of leading-edge science and technology in support of national security. Our mission supports national security by: ensuring the safety, security, and reliability of the U.S. nuclear stockpile; reducing the threat of weapons of mass destruction in support of counterterrorism and homeland defense; and solving national energy, environment, infrastructure, and health security problems. We require crosscutting fundamental and advanced science and technology research to accomplish our mission. The Stockpile Stewardship Program develops and applies advanced experimental science, computational simulation, and technology to ensure the safety and reliability of U.S. nuclear weapons in the absence of nuclear testing. This effort in itself is a grand challenge. However, the terrorist attack of September 11, 2001, reminded us of the importance of robust and vibrant research and development capabilities to meet new and evolving threats to our national security. Today, through rapid prototyping, we are applying new, innovative science and technology for homeland defense to address the threats of nuclear, chemical, and biological weapons globally. Synergistically with the capabilities that we require for our core mission, we contribute to many other areas of scientific endeavor. For example, our Laboratory has been part of the NASA effort on mapping water on the moon and NSF/DOE projects studying high-energy astrophysical phenomena, understanding fundamental scaling phenomena of life, exploring high-temperature superconductors, investigating quantum information systems, applying neutrons to condensed-matter and nuclear physics research, developing large-scale modeling and simulations to understand complex phenomena, and exploring nanoscience that bridges the atomic to macroscopic scales. 
In this presentation, I will highlight some of these post-Cold War science and technology advances, including our national security contributions, and discuss some of the challenges for Los Alamos in the future.

  8. Landscape and climate science and scenarios for Florida

    USGS Publications Warehouse

    Terando, Adam; Traxler, Steve; Collazo, Jaime

    2014-01-01

    The Peninsular Florida Landscape Conservation Cooperative (PFLCC) is part of a network of 22 Landscape Conservation Cooperatives (LCCs) that extend from Alaska to the Caribbean. LCCs are regional, applied conservation-science partnerships among Federal agencies, regional organizations, States, tribes, nongovernmental organizations (NGOs), private stakeholders, universities, and other entities within a geographic area. The goal of these conservation-science partnerships is to help inform managers and decision makers at a landscape scale to further the principles of adaptive management and strategic habitat conservation. A major focus for LCCs is to help conservation managers and decision makers respond to large-scale ecosystem and habitat stressors, such as climate change, habitat fragmentation, invasive species, and water scarcity. The purpose of the PFLCC is to facilitate planning, design, and implementation of conservation strategies for fish and wildlife species at the landscape level using the adaptive management framework of strategic habitat conservation—integrating planning, design, delivery, and evaluation. Florida faces a set of unique challenges when responding to regional and global stressors because of its unique ecosystems and assemblages of species, its geographic location at the crossroads of temperate and tropical climates, and its exposure to both rapid urbanization and rising sea levels as the climate warms. In response to these challenges, several landscape-scale science projects were initiated with the goal of informing decision makers about how potential changes in climate and the built environment could impact habitats and ecosystems of concern in Florida and the Southeast United States. In June 2012, the PFLCC and North Carolina State University convened a workshop at the U.S. Geological Survey (USGS) Coastal and Marine Science Center in St. 
Petersburg to assess the results of these integrated assessments and to foster an open dialogue about science gaps and future research needs.

  9. Swiss Life Sciences - a science communication project for both schools and the wider public led by the foundation Science et Cité.

    PubMed

    Röthlisberger, Michael

    2012-01-01

    The foundation Science et Cité was founded in 1998 with the aim of informing the wider Swiss public about current scientific topics and generating a dialogue between science and society. Initiated as an independent foundation by the former State Secretary for Science and Research, Dr. Charles Kleiber, Science et Cité is now attached to the Swiss Academies of Arts and Sciences as a competence center for dialogue with the public. Due to its branches in all language regions of the country, the foundation is ideally suited to initiate and implement communication projects on a nationwide scale. These projects are subdivided into three categories: i) science communication for children and adolescents, ii) establishing a dialogue between science and the wider public, and iii) conducting the role of a national center of competence and networking in science communication. Swiss Life Sciences is a project that fits into all of these categories: a year-round program for schools is complemented by an annual event for the wider public. With the involvement of most of the major Swiss universities, the Swiss National Science Foundation, the foundation Gen Suisse and many other partners, Swiss Life Sciences also sets an example of national networking within the science communication community.

  10. Assessing Science Reasoning and Conceptual Understanding in the Primary Grades Using Standardized and Performance-Based Assessments

    ERIC Educational Resources Information Center

    Kim, Kyung Hee; VanTassel-Baska, Joyce; Bracken, Bruce A.; Feng, Annie; Stambaugh, Tamra

    2014-01-01

    Project Clarion, a Jacob K. Javits-funded project, focused on the scale-up of primary-grade science curricula. Curriculum units, based on an Integrated Curriculum Model (ICM), were developed for high-ability learners, but tried out with all students in Title I settings to study the efficacy of the units with all learners. The units focus on the…

  11. Zooniverse - Web scale citizen science with people and machines. (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lynn, S.; Lintott, C.; Simpson, R.

    2013-12-01

    The Zooniverse (zooniverse.org) began in 2007 with the launch of Galaxy Zoo, a project in which more than 175,000 people provided shape analyses of more than 1 million galaxy images sourced from the Sloan Digital Sky Survey. These galaxy 'classifications', some 60 million in total, have since been used to produce more than 50 peer-reviewed publications based not only on the original research goals of the project but also on serendipitous discoveries made by the volunteer community. Based upon the success of Galaxy Zoo, the team has gone on to develop more than 25 web-based citizen science projects, all with a strong research focus, in a range of subjects from astronomy to zoology where human-based analysis still exceeds that of machine intelligence. Over the past 6 years Zooniverse projects have collected more than 300 million data analyses from over 1 million volunteers, providing fantastically rich datasets not only for the individuals working to produce research from their project but also for the machine learning and computer vision research communities. The Zooniverse platform has always been developed to be the 'simplest thing that works', implementing only the most rudimentary algorithms for functionality such as task allocation and user-performance metrics - simplifications necessary to scale the Zooniverse such that the core team of developers and data scientists can remain small and the cost of running the computing infrastructure relatively modest. To date these simplifications have been appropriate for the data volumes and analysis tasks being addressed. This situation however is changing: next-generation telescopes such as the Large Synoptic Survey Telescope (LSST) will produce data volumes dwarfing those previously analyzed. 
If citizen science is to have a part to play in analyzing these next-generation datasets, then the Zooniverse will need to evolve into a smarter system capable, for example, of modeling the abilities of users and the complexities of the data being classified in real time. In this session I will outline the current architecture of the Zooniverse platform and introduce new functionality being developed to enable true 'social machines'. Our platform is evolving into a system capable of integrating human and machine intelligence in a live environment, and thus capable of addressing some of the biggest challenges in big-data science.
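One simple, hypothetical form of the user-ability modeling mentioned above is to weight each volunteer's classification vote by an estimated per-user accuracy. The sketch below is not the Zooniverse's actual algorithm; the function, user IDs, labels, and accuracy values are all illustrative.

```python
# Hypothetical weighted-vote aggregation: each volunteer's vote counts in
# proportion to an estimated accuracy, so reliable classifiers carry more
# weight. Names and numbers are illustrative, not from the Zooniverse.
from collections import defaultdict

def weighted_vote(classifications, user_accuracy):
    """classifications: list of (user_id, label); returns the winning label."""
    scores = defaultdict(float)
    for user, label in classifications:
        # Unknown users get 0.5, i.e. treated as a coin flip.
        scores[label] += user_accuracy.get(user, 0.5)
    return max(scores, key=scores.get)

votes = [("a", "spiral"), ("b", "spiral"), ("c", "elliptical")]
accuracy = {"a": 0.6, "b": 0.55, "c": 0.9}
label = weighted_vote(votes, accuracy)
```

Here two low-accuracy votes for "spiral" (0.6 + 0.55 = 1.15) still outweigh one high-accuracy vote for "elliptical" (0.9); a production system would additionally update the accuracy estimates from gold-standard data in real time.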

  12. Amphibian and reptile road-kills on tertiary roads in relation to landscape structure: using a citizen science approach with open-access land cover data.

    PubMed

    Heigl, Florian; Horvath, Kathrin; Laaha, Gregor; Zaller, Johann G

    2017-06-26

    Amphibians and reptiles are among the most endangered vertebrate species worldwide. However, little is known about how they are affected by road-kills on tertiary roads and whether the surrounding landscape structure can explain road-kill patterns. The aim of our study was to examine the applicability of open-access remote sensing data for a large-scale citizen science approach to describe spatial patterns of road-killed amphibians and reptiles on tertiary roads. Using a citizen science app we monitored road-kills of amphibians and reptiles along 97.5 km of tertiary roads covering agricultural, municipal and interurban roads as well as cycling paths in eastern Austria over two seasons. Surrounding landscape was assessed using open-access land cover classes for the region (Coordination of Information on the Environment, CORINE). Hotspot analysis was performed using kernel density estimation (KDE+). Relations between land cover classes and amphibian and reptile road-kills were analysed with conditional probabilities and general linear models (GLM). We also estimated the potential cost-efficiency of a large-scale citizen science monitoring project. We recorded 180 amphibian and 72 reptile road-kills comprising eight species, mainly occurring on agricultural roads. KDE+ analyses revealed a significant clustering of road-killed amphibians and reptiles, which is important information for authorities aiming to mitigate road-kills. Overall, hotspots of amphibian and reptile road-kills were next to the land cover classes arable land, suburban areas and vineyards. Conditional probabilities and GLMs identified road-kills especially next to preferred habitats of the green toad, common toad and grass snake, the most often found road-killed species. A citizen science approach appeared to be more cost-efficient than monitoring by professional researchers only when more than 400 km of road are monitored. 
Our findings showed that freely available remote sensing data in combination with a citizen science approach would be a cost-efficient method aiming to identify and monitor road-kill hotspots of amphibians and reptiles on a larger scale.
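The hotspot analysis above used the KDE+ method. As a rough illustration of the underlying idea only (a plain one-dimensional Gaussian kernel density estimate, not KDE+ itself), positions of road-kill records along a road can be smoothed into a density curve, and stretches where the density exceeds a threshold flagged as hotspots. All positions, bandwidths, and thresholds below are synthetic and illustrative.

```python
# Illustrative sketch (not the KDE+ method): smooth synthetic road-kill
# positions along a 10 km road with a 1-D Gaussian KDE and flag stretches
# whose density exceeds twice the uniform expectation.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical road-kill positions (km along the road): one cluster plus noise.
positions = np.concatenate([rng.normal(2.0, 0.1, 40),   # a cluster near km 2
                            rng.uniform(0.0, 10.0, 20)])  # background records

kde = gaussian_kde(positions, bw_method=0.1)
grid = np.linspace(0.0, 10.0, 501)
density = kde(grid)

# Flag grid points whose density exceeds twice the uniform expectation.
threshold = 2.0 / (grid[-1] - grid[0])
hotspots = grid[density > threshold]
print(f"hotspot stretch: {hotspots.min():.2f}-{hotspots.max():.2f} km")
```

KDE+ additionally accounts for the linear network structure of roads and tests cluster significance; this sketch only conveys why density estimation concentrates scattered point records into interpretable hotspot stretches.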

  13. Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment

    NASA Astrophysics Data System (ADS)

    Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.

    2015-12-01

    The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.

  14. Signal to noise quantification of regional climate projections

    NASA Astrophysics Data System (ADS)

    Li, S.; Rupp, D. E.; Mote, P.

    2016-12-01

    One of the biggest challenges in interpreting climate model outputs for impacts studies and adaptation planning is understanding the sources of disagreement among models (which is often used, imperfectly, as a stand-in for system uncertainty). Internal variability is a primary source of uncertainty in climate projections, especially for precipitation, for which models disagree about even the sign of changes in large areas like the continental US. Taking advantage of a large initial-condition ensemble of regional climate simulations, this study quantifies the magnitude of changes forced by increasing greenhouse gas concentrations relative to internal variability. Results come from a large initial-condition ensemble of regional climate model simulations generated by weather@home, a citizen science computing platform, in which the western United States climate was simulated for the recent past (1985-2014) and future (2030-2059) using a 25-km horizontal resolution regional climate model (HadRM3P) nested in a global atmospheric model (HadAM3P). We quantify grid-point-level signal-to-noise ratios not just for temperature and precipitation responses, but also for the energy and moisture flux terms related to temperature and precipitation responses, to provide important insights regarding uncertainty in climate change projections at local and regional scales. These results will aid modelers in determining appropriate ensemble sizes for different climate variables and help users of climate model output with interpreting climate model projections.
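The signal-to-noise computation described above can be sketched minimally: at each grid point, the forced change (ensemble-mean future minus ensemble-mean past) is compared with internal variability (spread across ensemble members). The arrays below are synthetic stand-ins, not actual weather@home output, and the array shapes and variable names are illustrative.

```python
# Minimal sketch of grid-point signal-to-noise from an initial-condition
# ensemble. Synthetic data: 100 members over a tiny 4x5 grid.
import numpy as np

rng = np.random.default_rng(42)
n_members, ny, nx = 100, 4, 5  # ensemble members x lat x lon

# Hypothetical temperature fields; the "future" is ~2 units warmer.
past = rng.normal(10.0, 1.0, (n_members, ny, nx))
future = rng.normal(12.0, 1.0, (n_members, ny, nx))

signal = future.mean(axis=0) - past.mean(axis=0)   # forced change per grid point
noise = np.sqrt(past.var(axis=0, ddof=1))          # internal variability (member spread)
snr = signal / noise
```

Where the signal-to-noise ratio is large, the forced response stands out from internal variability; where it is small (as for precipitation in much of the continental US), a larger ensemble is needed before a robust change can be detected, which is the ensemble-size guidance the abstract refers to.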

  15. Big Science, Team Science, and Open Science for Neuroscience.

    PubMed

    Koch, Christof; Jones, Allan

    2016-11-02

    The Allen Institute for Brain Science is a non-profit private institution dedicated to basic brain science, with an internal organization more commonly found in large physics projects: large teams generating complete, accurate and permanent resources for the mouse and human brain. It can also be viewed as an experiment in the sociology of neuroscience. We here describe some of the singular differences from more academic, PI-focused institutions. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains, such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. 
With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.

  17. U-Science (Invited)

    NASA Astrophysics Data System (ADS)

    Borne, K. D.

    2009-12-01

    The emergence of e-Science over the past decade as a paradigm for Internet-based science was an inevitable evolution of science that built upon the web protocols and access patterns that were prevalent at that time, including Web Services, XML-based information exchange, machine-to-machine communication, service registries, the Grid, and distributed data. We now see a major shift in web behavior patterns to social networks, user-provided content (e.g., tags and annotations), ubiquitous devices, user-centric experiences, and user-led activities. The inevitable accrual of these social networking patterns and protocols by scientists and science projects leads to U-Science as a new paradigm for online scientific research (i.e., ubiquitous, user-led, untethered, You-centered science). U-Science applications include components from semantic e-science (ontologies, taxonomies, folksonomies, tagging, annotations, and classification systems), which is much more than Web 2.0-based science (Wikis, blogs, and online environments like Second Life). Among the best examples of U-Science are Citizen Science projects, including Galaxy Zoo, Stardust@Home, Project Budburst, Volksdata, CoCoRaHS (the Community Collaborative Rain, Hail and Snow network), and projects utilizing Volunteer Geographic Information (VGI). There are also scientist-led projects for scientists that engage a wider community in building knowledge through user-provided content. Among the semantic-based U-Science projects for scientists are those that specifically enable user-based annotation of scientific results in databases. These include the Heliophysics Knowledgebase, BioDAS, WikiProteins, The Entity Describer, and eventually AstroDAS. 
Such collaborative tagging of scientific data addresses several petascale data challenges for scientists: how to find the most relevant data, how to reuse those data, how to integrate data from multiple sources, how to mine and discover new knowledge in large databases, how to represent and encode the new knowledge, and how to curate the discovered knowledge. This talk will address the emergence of U-Science as a type of Semantic e-Science, and will explore challenges, implementations, and results. Semantic e-Science and U-Science applications and concepts will be discussed within the context of one particular implementation (AstroDAS: Astronomy Distributed Annotation System) and its applicability to petascale science projects such as the LSST (Large Synoptic Survey Telescope), coming online within the next few years.

  18. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    EPA Pesticide Factsheets

    Data from a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project), demonstrating the use of predictive computational models on high-throughput screening data to screen thousands of chemicals against the estrogen receptor. This dataset is associated with the following publication: Mansouri, K., A. Abdelaziz, A. Rybacka, A. Roncaglioni, A. Tropsha, A. Varnek, A. Zakharov, A. Worth, A. Richard, C. Grulke, D. Trisciuzzi, D. Fourches, D. Horvath, E. Benfenati, E. Muratov, E.B. Wedebye, F. Grisoni, G.F. Mangiatordi, G.M. Incisivo, H. Hong, H.W. Ng, I.V. Tetko, I. Balabin, J. Kancherla, J. Shen, J. Burton, M. Nicklaus, M. Cassotti, N.G. Nikolov, O. Nicolotti, P.L. Andersson, Q. Zang, R. Politi, R.D. Beger, R. Todeschini, R. Huang, S. Farag, S.A. Rosenberg, S. Slavov, X. Hu, and R. Judson. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project. ENVIRONMENTAL HEALTH PERSPECTIVES. National Institute of Environmental Health Sciences (NIEHS), Research Triangle Park, NC, USA, 1-49, (2016).

  19. A report on the USL NASA/RECON project. Part 1: The development of a transportable, university level, IS and R educational program

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Suzy; Granier, Martin

    1984-01-01

    A project is described which has as its goal the production of a set of system-independent, discipline-independent, transportable, college-level courses to educate science and engineering students in the use of large-scale information storage and retrieval systems. This project is being conducted with the cooperation and sponsorship of NASA by R&D teams at the University of Southwestern Louisiana and Southern University. Chapter 1 is an introduction, providing an overview and a listing of the management phases. Chapter 2 furnishes general information regarding accomplishments in areas under development. Chapter 3 deals with the development of the course materials by presenting a series of diagrams and keys to depict the progress and interrelationships of various tasks and sub-tasks. Chapter 4 presents plans for activities to be conducted to complete and deliver course materials. The final chapter is a summary of project objectives, methods, plans, and accomplishments.

  20. Diurnal Cycle of Convection and Interaction with the Large-Scale Circulation

    NASA Technical Reports Server (NTRS)

    Salby, Murry L.

    2002-01-01

    The science in this effort was scheduled in the project's third and fourth years, after a long record of high-resolution Global Cloud Imagery (GCI) had been produced. Unfortunately, political disruptions that interfered with this project led to its funding being terminated after only two years of support. Nevertheless, the availability of intermediate data opened the door to a number of important scientific studies. Beyond considerations of the diurnal cycle addressed in this grant, the GCI makes possible a wide range of studies surrounding convection, cloud, and precipitation. Several are already underway with colleagues in the US and abroad, including global cloud simulations, a global precipitation product, global precipitation simulations, upper tropospheric humidity, asynoptic sampling studies, convective organization studies, equatorial wave simulations, and the tropical tropopause.

  1. The beginnings of German governmental sponsorship in astronomy: the solar eclipse expeditions of 1868 as a prelude to the Venus transit expeditions of 1874 and 1882

    NASA Astrophysics Data System (ADS)

    Duerbeck, Hilmar W.

    The origins of the North German expeditions to observe the total solar eclipse of August 18, 1868, are outlined. The initiative came from politician and science writer Aaron Bernstein, the financing was provided by the North German Federation, and the project was handled by members of the Astronomische Gesellschaft. The astronomical expeditions to Mulwar in India and Aden in South Arabia are summarized, and the subsequent archaeological expedition to Upper Egypt is also considered. The activities of the participating scientists, including their preparation of popular accounts, are described. Finally, the impact of these expeditions on the planning of the large-scale project to observe the Venus transits of 1874 and 1882 is investigated.

  2. Flat-plate solar array project. Volume 6: Engineering sciences and reliability

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Smokler, M. I.

    1986-01-01

    The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided, along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.

  3. Design of the ARES Mars Airplane and Mission Architecture

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Wright, Henry S.; Croom, Mark A.; Levine, Joel S.; Spencer, David A.

    2006-01-01

    Significant technology advances have enabled planetary aircraft to be considered as viable science platforms. Such systems fill a unique planetary science measurement gap, that of regional-scale, near-surface observation, while providing a fresh perspective for potential discovery. Recent efforts have produced mature mission and flight system concepts, ready for flight project implementation. This paper summarizes the development of a Mars airplane mission architecture that balances science, implementation risk and cost. Airplane mission performance, flight system design and technology maturation are described. The design, analysis and testing completed demonstrates the readiness of this science platform for use in a Mars flight project.

  4. Adapting to large-scale changes in Advanced Placement Biology, Chemistry, and Physics: the impact of online teacher communities

    NASA Astrophysics Data System (ADS)

    Frumin, Kim; Dede, Chris; Fischer, Christian; Foster, Brandon; Lawrenz, Frances; Eisenkraft, Arthur; Fishman, Barry; Jurist Levy, Abigail; McCoy, Ayana

    2018-03-01

    Over the past decade, the field of teacher professional learning has coalesced around core characteristics of high quality professional development experiences (e.g. Borko, Jacobs, & Koellner, 2010. Contemporary approaches to teacher professional development. In P. L. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia of education (Vol. 7, pp. 548-556). Oxford: Elsevier.; Darling-Hammond, Hyler, & Gardner, 2017. Effective teacher professional development. Palo Alto, CA: Learning Policy Institute). Many countries have found these advances of great interest because of a desire to build teacher capacity in science education and across the full curriculum. This paper continues this progress by examining the role and impact of an online professional development community within the top-down, large-scale curriculum and assessment revision of Advanced Placement (AP) Biology, Chemistry, and Physics. This paper is part of a five-year, longitudinal, U.S. National Science Foundation-funded project to study the relative effectiveness of various types of professional development in enabling teachers to adapt to the revised AP course goals and exams. Of the many forms of professional development our research has examined, preliminary analyses indicated that participation in the College Board's online AP Teacher Community (APTC) - where teachers can discuss teaching strategies, share resources, and connect with each other - had positive, direct, and statistically significant association with teacher self-reported shifts in practice and with gains in student AP scores (Fishman et al., 2014). This study explored how usage of the online APTC might be useful to teachers and examined a more robust estimate of these effects. Findings from the experience of AP teachers may be valuable in supporting other large-scale curriculum changes, such as the U.S. Next Generation Science Standards or Common Core Standards, as well as parallel curricular shifts in other countries.

  5. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is not a single standard that would cover all the needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.

  6. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    PubMed

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  7. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    PubMed Central

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  8. Development of a database system for mapping insertional mutations onto the mouse genome with large-scale experimental data

    PubMed Central

    2009-01-01

    Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework, Struts-Spring-Hibernate, is used to ensure constructive and flexible support to the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used framework Struts-Spring-Hibernate. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505

  9. Handling the Diversity in the Coming Flood of InSAR Data with the InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G. F.; Agram, P. S.; Lavalle, M.; Zebker, H. A.

    2014-12-01

    The NASA ESTO-developed InSAR Scientific Computing Environment (ISCE) provides a computing framework for geodetic image processing for InSAR sensors that is modular, flexible, and extensible, enabling scientists to reduce measurements directly from a diverse array of radar satellites and aircraft to new geophysical products. ISCE can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. This is accomplished through rigorous componentization of processing codes, abstraction and generalization of data models, and an XML-based input interface with multi-level prioritized control of the component configurations depending on the science processing context. The proposed NASA-ISRO SAR (NISAR) Mission would deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned to become a key element in processing projected NISAR data into higher level data products, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches. NISAR would be but one mission in a constellation of radar satellites in the future delivering such data. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity to address larger data processing problems in a "production" context and its ability to be controlled by individual science users on the cloud for large data problems.
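    The multi-level prioritized configuration pattern described above can be illustrated with a small sketch. This is not ISCE's actual API; the element names, property names, and merge layers below are hypothetical, showing only the general idea of later (higher-priority) layers overriding earlier ones.

```python
import xml.etree.ElementTree as ET

def parse_properties(xml_text):
    """Extract {name: value} pairs from a simple <component> XML document."""
    root = ET.fromstring(xml_text)
    return {p.get("name"): p.text for p in root.iter("property")}

def resolve(*layers):
    """Merge configuration layers; later layers take priority."""
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

# Three priority levels: built-in defaults < mission file < user overrides
defaults = {"posting": "90", "filter_strength": "0.5"}
mission_xml = '<component name="insar"><property name="posting">30</property></component>'
user_xml = '<component name="insar"><property name="filter_strength">0.7</property></component>'

config = resolve(defaults, parse_properties(mission_xml), parse_properties(user_xml))
# config: {"posting": "30", "filter_strength": "0.7"}
```

    The same merge logic extends naturally to any number of configuration levels, which is what allows one processing code to be re-targeted to different science contexts without modification.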

  10. A cloud, precipitation and electrification modeling effort for COHMEX

    NASA Technical Reports Server (NTRS)

    Orville, Harold D.; Helsdon, John H.; Farley, Richard D.

    1991-01-01

    In mid-1987, the Modeling Group of the Institute of Atmospheric Sciences (IAS) began to simulate and analyze cloud runs that were made during the Cooperative Huntsville Meteorological Experiment (COHMEX) Project and later. The cloud model was run nearly every day during the summer 1986 COHMEX Project. The Modeling Group was then funded to analyze the results, make further modeling tests, and help explain the precipitation processes in the Southeastern United States. The main science objectives of COHMEX were: (1) to observe the prestorm environment and understand the physical mechanisms leading to the formation of small convective systems and processes controlling the production of precipitation; (2) to describe the structure of small convective systems producing precipitation including the large and small scale events in the environment surrounding the developing and mature convective system; (3) to understand the interrelationships between electrical activity within the convective system and the process of precipitation; and (4) to develop and test numerical models describing the boundary layer, tropospheric, and cloud scale thermodynamics and dynamics associated with small convective systems. The latter three of these objectives were addressed by the modeling activities of the IAS. A series of cloud models was used to simulate the clouds that formed during the operational project. The primary models used to date on the project were a two dimensional bulk water model, a two dimensional electrical model, and to a lesser extent, a two dimensional detailed microphysical cloud model. All of the models are based on fully interacting microphysics, dynamics, thermodynamics, and electrical equations. Only the 20 July 1986 case was analyzed in detail, although all of the cases run during the summer were analyzed as to how well they did in predicting the characteristics of the convection for that day.

  11. Education and Public Outreach at The Pavilion Lake Research Project: Fusion of Science and Education using Web 2.0

    NASA Astrophysics Data System (ADS)

    Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.

    2009-12-01

    The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet savvy generation. The seamless integration of multiple technologies including Google Earth, Wordpress, Youtube, Twitter and Facebook facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly available through multiple social media channels, partly due to the ease of integration of these multiple technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP, and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggests that remote communication via Web 2.0 technologies was an effective tool for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.

  12. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  13. Learning about Earth Science: Tables and Tabulations. Superific Science Book X. A Good Apple Science Activity Book for Grades 5-8+.

    ERIC Educational Resources Information Center

    Conway, Lorraine

    In an effort to provide science teachers with the tables and scales most often used in teaching earth science, this document was designed to coordinate each table with meaningful activities, projects and experiments. The major areas covered by the booklet are: (1) electromagnetic waves (with activities about light waves and sound waves); (2) the…

  14. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and the development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).

  15. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments.
SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications. We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration.
Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics including Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects to mature the system for production use.
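    The workflow mapping goal of minimizing end-to-end delay can be sketched in miniature: topologically order the task DAG, then propagate earliest finish times along dependencies. The task names and durations below are illustrative, not taken from SWAMP.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical four-stage workflow: durations in minutes, deps as a DAG
# mapping each task to the set of tasks it depends on
durations = {"stage": 2, "simulate": 10, "filter": 3, "visualize": 4}
deps = {"simulate": {"stage"}, "filter": {"simulate"}, "visualize": {"filter"}}

order = list(TopologicalSorter(deps).static_order())

# Earliest finish of each task = latest predecessor finish + own duration
finish = {}
for task in order:
    start = max((finish[d] for d in deps.get(task, ())), default=0)
    finish[task] = start + durations[task]

end_to_end_delay = max(finish.values())  # critical-path length: 19
```

    A real scheduler would additionally map tasks onto heterogeneous resources and account for network transfer times, but the critical-path computation above is the core quantity being minimized.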

  16. (abstract) Science-Project Interaction in the Low-Cost Mission

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1994-01-01

    Large, complex, and highly optimized missions have performed most of the preliminary reconnaissance of the solar system. As a result we have now mapped significant fractions of its total surface (or surface-equivalent) area. Now, however, scientific exploration of the solar system is undergoing a major change in scale, and existing missions find it necessary to limit costs while fulfilling existing goals. In the future, NASA's Discovery program will continue the reconnaissance, exploration, and diagnostic phases of planetary research using lower cost missions, which will include lower cost mission operations systems (MOS). Historically, one of the more expensive functions of MOS has been its interaction with the science community. Traditional MOS elements that embrace this interaction include mission planning, science (and engineering) event conflict resolution, sequence optimization and integration, data production (e.g., assembly, enhancement, quality assurance, documentation, archive), and other science support services. In the past, the payoff from these efforts has been that use of mission resources has been highly optimized, constraining resources have been generally completely consumed, and data products have been accurate and well documented. But because these functions are expensive we are now challenged to reduce their cost while preserving the benefits. In this paper, we will consider ways of revising the traditional MOS approach that might save project resources while retaining a high degree of service to the Projects' customers. Pre-launch, science interaction can be made simpler by limiting the number of instruments and by providing greater redundancy in mission plans. Post-launch, possibilities include prioritizing data collection into a few categories, easing requirements on real-time or quick-look data delivery, and closer integration of scientists into the mission operation.

  17. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.
The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  18. The Disappearing Fourth Wall: John Marburger, Science Policy, and the SSC

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2015-04-01

    John H. Marburger (1941-2011) was a skilled science administrator who had a fresh and unique approach to science policy and science leadership. His posthumously published book Science Policy up Close contains recollections of key science policy episodes in which he participated or observed closely. One was the administration of the Superconducting Supercollider (SSC); Marburger was Chairman of the Universities Research Association, the group charged with managing the SSC, from 1988 to 1994. Many accounts of the SSC saga attribute its demise to a combination of transitory factors: poor management, rising cost estimates, the collapse of the Soviet Union and thus of the Cold War threat, complaints by ``small science'' that the SSC's ``big science'' was consuming their budget, Congress's desire to cut spending, unwarranted contract regulations imposed by the Department of Energy (DOE) in response to environmental lapses at nuclear weapons laboratories, and so forth. Marburger tells a subtler story whose implications for science policy are more significant and far-reaching. The story involves changes in the attitude of the government towards large scientific projects that reach back to management reforms introduced by the administrations of Presidents Johnson, Nixon, and Carter in the 1960s and 1970s. This experience impressed Marburger with the inevitability of public oversight of large scientific projects, and with the need for planners of such projects to establish and make public a cost and schedule tracking system that would model the project's progress and expenditures.

  19. Geospatial Data Science Modeling | Geospatial Data Science | NREL

    Science.gov Websites

    NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers.

  20. Large-Scale Spacecraft Fire Safety Experiments in ISS Resupply Vehicles

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.; Urban, David

    2013-01-01

    Our understanding of the fire safety risk in manned spacecraft has been limited by the small scale of the testing we have been able to conduct in low-gravity. Fire growth and spread cannot be expected to scale linearly with sample size so we cannot make accurate predictions of the behavior of realistic scale fires in spacecraft based on the limited low-g testing to date. As a result, spacecraft fire safety protocols are necessarily very conservative and costly. Future crewed missions are expected to be longer in duration than previous exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this concern, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. This project is supported by the NASA Advanced Exploration Systems Program Office in the Human Exploration and Operations Mission Directorate. The activity of this project is supported by an international topical team of fire experts from other space agencies to maximize the utility of the data and to ensure the widest possible scrutiny of the concept. The large-scale space flight experiment will be conducted on three missions; each in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew allows the fire products to be released into the cabin. 
The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. The international topical team is collaborating with the NASA team in the definition of the experiment requirements and performing supporting analysis, experimentation and technology development.

  1. Hydrologic Impacts of Climate Change: Quantification of Uncertainties (Alexander von Humboldt Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Mujumdar, Pradeep P.

    2014-05-01

    Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns, convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem, which arises because the GCMs operate at spatial scales much larger than those required by distributed hydrologic models, and the GCMs' inability to satisfactorily simulate the variables of interest to hydrology, are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change, addressing scale issues and quantification of uncertainties, is provided.
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.

  2. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  3. Optical mapping and its potential for large-scale sequencing projects.

    PubMed

    Aston, C; Mishra, B; Schwartz, D C

    1999-07-01

    Physical mapping has been rediscovered as an important component of large-scale sequencing projects. Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.

  4. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, in which an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by a standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to other areas of research that give rise to large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
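    The first of the two steps described above can be illustrated with a minimal numerical sketch (an illustration of the general idea only, not the authors' algorithm): for a symmetric matrix with a pronounced dominant diagonal, keep only the basis states with the lowest diagonal entries and diagonalize the resulting smaller effective matrix. The test matrix, subspace size, and coupling strength below are assumptions chosen for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 200, 40

    # Symmetric test matrix with a pronounced dominant diagonal:
    # diagonal "energies" 0..n-1 plus weak off-diagonal coupling
    noise = rng.standard_normal((n, n))
    H = np.diag(np.arange(n, dtype=float)) + 0.1 * (noise + noise.T) / 2

    # Project onto the k "low energy" basis states (smallest diagonal entries)
    idx = np.argsort(np.diag(H))[:k]
    H_eff = H[np.ix_(idx, idx)]

    # Lowest eigenvalues of the projected effective model vs. the full problem
    approx = np.linalg.eigvalsh(H_eff)[:5]
    exact = np.linalg.eigvalsh(H)[:5]
    ```

    For this weakly coupled example the projected spectrum tracks the exact low-lying eigenvalues closely; the contribution of the technique in the abstract is precisely the numerical refinement that removes the truncation error a one-shot projection like this leaves behind.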

  5. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are used pervasively in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. It collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students hold normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge-integration ability.
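    The Rasch Partial Credit Model underlying the analysis has a standard closed form: the probability that a person of ability θ earns score x on an m-step item is exp(Σ_{k≤x}(θ − δ_k)) divided by the sum of the corresponding terms over all scores, with an empty sum for score 0. A minimal sketch of that formula (illustrative only; the ability and step-difficulty values are assumptions, not estimates from this study):

    ```python
    import numpy as np

    def pcm_probs(theta, deltas):
        """Rasch Partial Credit Model category probabilities.

        theta: person ability; deltas: m item step difficulties.
        Returns probabilities for scores 0..m (empty sum for score 0).
        """
        steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas, float))))
        p = np.exp(steps - steps.max())  # numerically stable softmax
        return p / p.sum()

    # A 4-category (scores 0-3) item with assumed step difficulties:
    # higher ability shifts probability mass toward higher scores
    probs = pcm_probs(theta=1.5, deltas=[-1.0, 0.0, 1.0])
    ```

    Polytomous rubrics of the kind the study recommends are scored by fitting the δ_k per item and θ per student so that these category probabilities best match the observed responses.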

  6. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark; Seel, Joachim

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW AC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.

  7. Communicating the Needs of Climate Change Policy Makers to Scientists

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Escobar, Vanessa M.; Lovell, Heather

    2012-01-01

    This chapter describes the challenges that earth scientists face in developing science data products relevant to decision-maker and policy needs, and strategies that can improve the two-way communication between the scientist and the policy maker. Climate change policy and decision making happen at a variety of scales - from local governments implementing solar-homes policies to international negotiations through the United Nations Framework Convention on Climate Change. Scientists can work to provide data at these different scales, but if they are not aware of the needs of decision makers or do not understand the challenges the policy maker is facing, they are likely to be less successful in influencing policy makers than they wished. This is because the science questions they are addressing may be compelling, but not relevant to the challenges at the forefront of policy concerns. In this chapter we examine case studies of science-policy partnerships and the strategies each partnership uses to engage the scientist at a variety of scales. We examine three case studies: the global Carbon Monitoring System pilot project developed by NASA, a forest biomass mapping effort for the SilvaCarbon project, and a forest canopy cover project being conducted for forest management in Maryland. In each of these case studies, relationships between scientists and policy makers were critical for ensuring the focus of the science as well as the success of the decision making.

  8. MeerKAT Science: On the Pathway to the SKA

    NASA Astrophysics Data System (ADS)

    MeerKAT is a next-generation radio telescope under construction on the African SKA central site in the Karoo plateau of South Africa. When completed in 2017, MeerKAT will be a 64-element array of 13.5-m parabolic antennas distributed over an area with a diameter of 8 km. With its combination of wide bandwidth and field of view, large number of antennas and total collecting area, MeerKAT will be one of the world’s most powerful imaging telescopes operating at GHz frequencies. MeerKAT is a science and technology precursor of the SKA mid-frequency dish array and, following several years of operation as a South African telescope, will be incorporated into the SKA phase-one facility. The MeerKAT science program will consist of a combination of key science, legacy-style large survey projects and smaller projects based on proposals for open time. This workshop, which took place in Stellenbosch in the Western Cape, was held to discuss and plan the broad range of scientific investigations that will be undertaken during the pre-SKA phase of MeerKAT. Topics covered included: technical development and roll-out of the MeerKAT science capabilities, details of the large survey projects presented by the project teams, science program concepts for open time, commensal programs such as the Search for Extraterrestrial Intelligence, and the impact of MeerKAT on global Very Long Baseline Interferometry. These proceedings serve as a record of the scientific vision of MeerKAT in the year before its completion, foreshadowing a new era of radio astronomy on the African continent.

  9. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in the life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  10. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in the life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  11. Big data, open science and the brain: lessons learned from genomics.

    PubMed

    Choudhury, Suparna; Fishman, Jennifer R; McGowan, Michelle L; Juengst, Eric T

    2014-01-01

    The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data on the order of yottabytes (10^24 bytes). Its scale, investment and organization are being compared to the Human Genome Project (HGP), which has exemplified "big science" for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as of the European Human Brain Project, rests on the possibility of amassing vast quantities of data to model the complex interactions between the brain and behavior and to inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this "data-driven" paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access in order to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify it in terms of new "open neuroscience" projects. Then, drawing on the frequently invoked model of data sharing in genomics, we demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent "open neuroscience" movement.

  12. Large-scale precipitation estimation using Kalpana-1 IR measurements and its validation using GPCP and GPCC data

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Mahesh, C.; Gairola, Rakesh M.

    2011-12-01

    Large-scale precipitation estimation is very important for climate science because precipitation is a major component of the earth's water and energy cycles. In the present study, the GOES precipitation index technique has been applied to three-hourly Kalpana-1 satellite infrared (IR) images (0000, 0300, 0600, …, 2100 UTC) for rainfall estimation, as preparatory work for INSAT-3D. Once the temperatures of all the pixels in a grid are known, they are binned into a three-hourly 24-class histogram of IR (10.5-12.5 μm) brightness temperatures for each 1.0° × 1.0° latitude/longitude box. Daily, monthly, and seasonal rainfall have been estimated from these three-hourly rain estimates for the entire south-west monsoon period of 2009. To investigate the potential of these rainfall estimates, the monthly and seasonal estimates have been validated against Global Precipitation Climatology Project and Global Precipitation Climatology Centre data. The validation results show that the present technique works very well for large-scale precipitation estimation, qualitatively as well as quantitatively. The results also suggest that this simple IR-based technique can be used to estimate rainfall over tropical areas at larger temporal scales for climatological applications.
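    The GOES precipitation index (GPI) referenced above is conventionally computed as a fixed conditional rain rate (about 3 mm/h) multiplied by the fraction of grid-box IR pixels colder than a brightness-temperature threshold of roughly 235 K and by the length of the accumulation period. A minimal sketch of that convention (the constants follow the standard GPI formulation and are not values reported in this study):

    ```python
    import numpy as np

    def gpi_rain(tb_kelvin, hours=3.0, threshold=235.0, rate=3.0):
        """GOES Precipitation Index estimate for one grid box (mm).

        rain = rate (mm/h) x fraction of pixels with Tb <= threshold x hours
        """
        frac = float(np.mean(np.asarray(tb_kelvin, float) <= threshold))
        return rate * frac * hours

    # A box fully covered by cold cloud over one 3-h window:
    # 3 mm/h * 1.0 * 3 h -> 9.0 mm
    rain = gpi_rain([220.0, 228.0, 233.0, 235.0])
    ```

    Daily and monthly totals of the kind validated in the study follow by summing these three-hourly estimates over the accumulation period.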

  13. Problems with the application of hydrogeological science to regulation of Australian mining projects: Carmichael Mine and Doongmabulla Springs

    NASA Astrophysics Data System (ADS)

    Currell, Matthew J.; Werner, Adrian D.; McGrath, Chris; Webb, John A.; Berkman, Michael

    2017-05-01

    Understanding and managing impacts from mining on groundwater-dependent ecosystems (GDEs) and other groundwater users requires development of defensible science supported by adequate field data. This usually leads to the creation of predictive models and analysis of the likely impacts of mining and their accompanying uncertainties. The identification, monitoring and management of impacts on GDEs are often a key component of mine approvals, which need to consider and attempt to minimise the risks that negative impacts may arise. Here we examine a case study where approval for a large mining project in Australia (Carmichael Coal Mine) was challenged in court on the basis that it may result in more extensive impacts on a GDE (Doongmabulla Springs) of high ecological and cultural significance than predicted by the proponent. We show that throughout the environmental assessment and approval process, significant data gaps and scientific uncertainties remained unresolved. Evidence shows that the assumed conceptual hydrogeological model for the springs could be incorrect, and that at least one alternative conceptualisation (that the springs are dependent on a deep fault) is consistent with the available field data. Assumptions made about changes to spring flow as a consequence of mine-induced drawdown also appear problematic, with significant implications for the spring-fed wetlands. Despite the large scale of the project, it appears that critical scientific data required to resolve uncertainties and construct robust models of the springs' relationship to the groundwater system were lacking at the time of approval, contributing to uncertainty and conflict. For this reason, we recommend changes to the approval process that would require a higher standard of scientific information to be collected and reviewed, particularly in relation to key environmental assets during the environmental impact assessment process in future projects.

  14. A scale-based approach to interdisciplinary research and expertise in sports.

    PubMed

    Ibáñez-Gijón, Jorge; Buekers, Martinus; Morice, Antoine; Rao, Guillaume; Mascret, Nicolas; Laurin, Jérome; Montagne, Gilles

    2017-02-01

    More than 20 years after the introduction of ecological and dynamical approaches in sports research, their promise for interdisciplinary research has not yet been fulfilled. The complexity of the research process and the theoretical and empirical difficulties associated with an integrated ecological-dynamical approach have been the major factors hindering the generalisation of interdisciplinary projects in sports sciences. To facilitate this generalisation, we integrate the major concepts from the ecological and dynamical approaches to study behaviour as a multi-scale process. Our integration gravitates around the distinction between functional (ecological) and execution (organic) scales, and their reciprocal intra- and inter-scale constraints. We propose an (epistemological) scale-based definition of constraints that accounts for the concept of synergies as emergent coordinative structures. To illustrate how we can operationalise the notion of multi-scale synergies we use an interdisciplinary model of locomotor pointing. To conclude, we show the value of this approach for interdisciplinary research in sport sciences, as we discuss two examples of task-specific dimensionality-reduction techniques in the context of an ongoing project that aims to unveil the determinants of expertise in basketball free-throw shooting. These techniques provide relevant empirical evidence to help bootstrap the challenging modelling efforts required in sport sciences.

  15. Education for sustainable development - Resources for physics and sciences teachers

    NASA Astrophysics Data System (ADS)

    Miličić, Dragana; Jokić, Ljiljana; Blagdanić, Sanja; Jokić, Stevan

    2016-03-01

    With this article we would like to stress that science teachers must do practical work and communicate on the basis of scientific knowledge and developments, but must also give their students the opportunity to discover knowledge through inquiry. During the last five years the Serbian project Ruka u testu (a semi-mirror of the French project La main à la pâte), as well as the European FIBONACCI and SUSTAIN projects, have offered our teachers wide-scale learning opportunities based on Inquiry Based Science Education (IBSE) and Education for Sustainable Development (ESD). Our current efforts are based on pedagogical guidance, several modules and experimental kits, a website, exhibitions, and trainings and workshops for students and teachers.

  16. Discover the Cosmos - Bringing Cutting Edge Science to Schools across Europe

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    2015-03-01

    The fast-growing number of science data repositories is opening enormous possibilities to scientists all over the world, and the emergence of citizen science projects is engaging large numbers of citizens globally in science discovery. Astronomical research is now possible for anyone with a computer and some form of data access. This opens a very interesting and strategic possibility to engage large audiences in the making and understanding of science. From another perspective, it is natural to imagine that soon enough data mining will be an active part of the academic path of university or even secondary-school students. The possibility is very exciting but the road not very promising: even in the most developed nations, where all schools are equipped with modern ICT facilities, the use of such possibilities is still a very rare episode. The Galileo Teacher Training Program (GTTP), a legacy of IYA2009, is participating in some of the most emblematic projects funded by the European Commission targeting modern tools, resources and methodologies for science teaching. One of these projects, Discover the Cosmos, aims to address this issue by empowering educators with the necessary skills to embark on this innovative path: teaching science while doing science.

  17. Investigating potential transferability of place-based research in land system science

    NASA Astrophysics Data System (ADS)

    Václavík, Tomáš; Langerwisch, Fanny; Cotter, Marc; Fick, Johanna; Häuser, Inga; Hotes, Stefan; Kamp, Johannes; Settele, Josef; Spangenberg, Joachim H.; Seppelt, Ralf

    2016-09-01

    Much of our knowledge about land use and ecosystem services in interrelated social-ecological systems is derived from place-based research. While local and regional case studies provide valuable insights, it is often unclear how relevant this research is beyond the study areas. Drawing generalized conclusions about practical solutions to land management from local observations and formulating hypotheses applicable to other places in the world requires that we identify patterns of land systems that are similar to those represented by the case study. Here, we utilize the previously developed concept of land system archetypes to investigate potential transferability of research from twelve regional projects implemented in a large joint research framework that focus on issues of sustainable land management across four continents. For each project, we characterize its project archetype, i.e. the unique land system based on a synthesis of more than 30 datasets of land-use intensity, environmental conditions and socioeconomic indicators. We estimate the transferability potential of project research by calculating the statistical similarity of locations across the world to the project archetype, assuming higher transferability potentials in locations with similar land system characteristics. Results show that areas with high transferability potentials are typically clustered around project sites but for some case studies can be found in regions that are geographically distant, especially when values of considered variables are close to the global mean or where the project archetype is driven by large-scale environmental or socioeconomic conditions. Using specific examples from the local case studies, we highlight the merit of our approach and discuss the differences between local realities and information captured in global datasets. The proposed method provides a blueprint for large research programs to assess potential transferability of place-based studies to other geographical areas and to indicate possible gaps in research efforts.

  18. Teaching Real Science with a Microcomputer.

    ERIC Educational Resources Information Center

    Naiman, Adeline

    1983-01-01

    Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology," "Health Awareness…

  19. Science in Society: Bridging the gap to connect science to decision makers

    NASA Astrophysics Data System (ADS)

    Jones, L.; Bwarie, J.; Pearce, I.

    2016-12-01

    The gap between science and decision making in our society can be large and multi-faceted, involving communication, process, cultural and even subconscious differences. In sweeping generalization, scientists reject anecdotes, focus on uncertainty and details, and expect conflict as part of the scientific process, while non-scientists respond to stories, want certainty and the big picture, and see conflict as a reason to reject the message. Bridging this gap often requires ongoing collaboration to find the intersection of three independent domains: what science can provide, the technical information decision makers need to make the most effective choices, and what information decision makers need to motivate action. For ten years, the USGS has experimented with improving the usefulness of its science through the SAFRR (Science Application for Risk Reduction) Project and its predecessor, the Multi Hazards Demonstration Project in Southern California. Through leading and participating in these activities, we have recognized three steps that have been essential to successful partnerships between scientists and decision makers. First, determining what makes for a successful product cannot be done in isolation by either scientists or users. The users may want something science cannot produce (e.g., accurate short-term earthquake predictions), while the scientists can fail to see that the product they know how to make may not be relevant to the decisions that need to be made. Real discussions, with real exchange and absorption of information on both sides, make for the most useful products. Second, most scientific results need work beyond what belongs in a journal to create a product that can be used. This is not just a different style of communication, but analyses that focus on the community's local questions rather than on scientific advances. Third, probabilities of natural hazards almost never motivate action to mitigate. The probabilities are usually low on human time scales and focus on the uncertainty, conveying a strong message that the hazard may not happen. Presenting the hazard as absolutely certain to occur, just on an unknown time scale, is scientifically identical, but conveys a very different emotional message.

  20. Infrastructure for Training and Partnerships: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media, in addition to a 622 Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) the Portable Batch System (PBS) computational task scheduler, and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  1. Equally sloped tomography based X-ray full-field nano-CT at Shanghai Synchrotron Radiation Facility

    NASA Astrophysics Data System (ADS)

    Wang, Yudan; Ren, Yuqi; Zhou, Guangzhao; Du, Guohao; Xie, Honglan; Deng, Biao; Xiao, Tiqiao

    2018-07-01

    X-ray full-field nano-computed tomography (nano-CT) offers non-destructive three-dimensional imaging with high spatial resolution and has been widely applied to investigate morphology and structure in various areas. Conventional tomography reconstructs a 3D object from a large number of equal-angle projections; for nano-CT this requires long collection times due to the large number of projections and long exposure times. Here, equally sloped tomography (EST) based nano-CT was implemented on the X-ray imaging beamline at the Shanghai Synchrotron Radiation Facility (SSRF) to overcome or alleviate these difficulties. Preliminary results show that hard X-ray transmission microscopy (TXM) with a spatial resolution of 100 nm and EST-based nano-CT with non-destructive 3D characterization capability have been realized. This technique extends hard X-ray imaging capability to the nanoscale at SSRF and could find applications in many fields, including nanomaterials, new energy and the life sciences. The study will be helpful for the construction of the new full-field X-ray nano-imaging beamline with a spatial resolution of 20 nm in the SSRF phase-II project.
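    In EST, projections are acquired at angles whose tangents (slopes), rather than the angles themselves, are equally spaced, so that the measured lines fall on a pseudo-polar grid. A minimal sketch of generating such an angle set (an illustration of the sampling idea only, not the beamline's acquisition code; the number of views is an assumption):

    ```python
    import numpy as np

    def est_angles(n):
        """Return n equally sloped projection angles (radians).

        Two fans of n//2 views each: slopes s = tan(angle) equally
        spaced in [-1, 1), one fan about the horizontal axis (angle 0)
        and one about the vertical axis (angle pi/2).
        """
        s = np.linspace(-1.0, 1.0, n // 2, endpoint=False)
        fan_h = np.arctan(s)              # angles in [-pi/4, pi/4)
        fan_v = np.pi / 2 - np.arctan(s)  # angles in (pi/4, 3*pi/4]
        return np.sort(np.concatenate((fan_h, fan_v)))

    angles = est_angles(64)
    ```

    Unlike an equal-angle scan, the angular spacing here varies across each fan (denser near the fan edges), which is what lets EST reconstructions use exact pseudo-polar Fourier transforms and tolerate fewer views.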

  2. Higher Education Teachers' Descriptions of Their Own Learning: A Large-Scale Study of Finnish Universities of Applied Sciences

    ERIC Educational Resources Information Center

    Töytäri, Aija; Piirainen, Arja; Tynjälä, Päivi; Vanhanen-Nuutinen, Liisa; Mäki, Kimmo; Ilves, Vesa

    2016-01-01

    In this large-scale study, higher education teachers' descriptions of their own learning were examined with qualitative analysis involving application of principles of phenomenographic research. This study is unique: it is unusual to use large-scale data in qualitative studies. The data were collected through an e-mail survey sent to 5960 teachers…

  3. The impact of large-scale, long-term optical surveys on pulsating star research

    NASA Astrophysics Data System (ADS)

    Soszyński, Igor

    2017-09-01

    The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars and in the next years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project which currently offers the largest photometric database among surveys for stellar variability.

  4. Se substitution and micro-nano-scale porosity enhancing thermoelectric Cu2Te

    NASA Astrophysics Data System (ADS)

    Shi, Xiaoman; Wang, Guoyu; Wang, Ruifeng; Zhou, Xiaoyuan; Xu, Jingtao; Tang, Jun; Ang, Ran

    2018-04-01

    Project supported by the National Natural Science Foundation of China (Grant Nos. 51771126 and 11774247), the Youth Foundation of the Science and Technology Department of Sichuan Province, China (Grant No. 2016JQ0051), Sichuan University Outstanding Young Scholars Research Funding (Grant No. 2015SCU04A20), the World First-Class University Construction Funding, and the Fundamental and Frontier Research Project in Chongqing (Grant No. CSTC2015JCYJBX0026).

  5. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad range of scientific disciplines, including the earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest system available for open science. Its computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward the earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in the earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes toward the performance levels required for future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
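
IFS achieves this with Fortran 2008 coarrays; the general pattern of starting a communication and doing useful computation while it is in flight can be sketched, purely illustratively, in Python (the halo-exchange stand-in and all names here are hypothetical, not from the IFS code):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def halo_exchange(chunk):
    """Stand-in for a one-sided communication step (e.g. a coarray put);
    the sleep simulates network latency."""
    time.sleep(0.01)
    return chunk

def local_compute(chunk):
    """Work that does not depend on the data currently in flight."""
    return [x * 2 for x in chunk]

chunks = [[1, 2], [3, 4], [5, 6]]
results = []
with ThreadPoolExecutor(max_workers=1) as comm:
    for chunk in chunks:
        fut = comm.submit(halo_exchange, chunk)  # communication starts...
        results.append(local_compute(chunk))     # ...while we keep computing
        fut.result()                             # synchronise before next step
```

The win comes from the middle line: the communication latency is hidden behind computation instead of being paid serially.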

  6. Teaching Science with Web-Based Inquiry Projects: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Webb, Aubree M.; Knight, Stephanie L.; Wu, X. Ben; Schielack, Jane F.

    2014-01-01

    The purpose of this research is to explore a new computer-based interactive learning approach to assess the impact on student learning and attitudes toward science in a large university ecology classroom. A comparison was done with an established program to measure the relative impact of the new approach. The first inquiry project, BearCam, gives…

  7. The project ownership survey: measuring differences in scientific inquiry experiences.

    PubMed

    Hanauer, David I; Dolan, Erin L

    2014-01-01

    A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Study of undergraduate lab learning experiences has demonstrated that the design of the experience influences the extent to which students report ownership of the project, and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for the collection of larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92 and thus has high internal consistency. Known-groups validity was assessed through the instrument's ability to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and the findings paralleled previous results on the characteristics of project ownership.
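
    The reported coefficient alpha of 0.92 is Cronbach's alpha, computable directly from item-level responses; a minimal sketch (the data and function name are ours, not from the POS study):

```python
import random
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's (coefficient) alpha for internal consistency.

    item_scores: list of k lists, each holding one item's scores across
    all respondents. alpha = k/(k-1) * (1 - sum of item variances /
    variance of respondents' total scores). Sample variances are used.
    """
    k = len(item_scores)
    item_vars = sum(statistics.variance(item) for item in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    return k / (k - 1) * (1 - item_vars / statistics.variance(totals))

# Illustrative data only (not the POS data): 4 items sharing a common
# factor across 50 respondents, so alpha should come out high.
random.seed(0)
base = [random.gauss(3, 1) for _ in range(50)]
items = [[b + random.gauss(0, 0.5) for b in base] for _ in range(4)]
alpha = cronbach_alpha(items)
```

    Identical items give alpha = 1; uncorrelated items drive it toward 0, which is why high alpha is read as high internal consistency.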

  8. From Atmospheric Scientist to Data Scientist

    NASA Astrophysics Data System (ADS)

    Knuth, S. L.

    2015-12-01

    Most of my career has been spent analyzing data from research projects in the atmospheric sciences. I spent twelve years researching boundary layer interactions in the polar regions, which included five field seasons in the Antarctic. During this time, I earned both an M.S. and a Ph.D. in atmospheric science, and I learned most of my data science and programming skills as part of my research projects. When I graduated with my Ph.D., I was looking for a fresh opportunity to enhance the skills I already had while learning more advanced technical skills. I found a position at the University of Colorado Boulder as a Data Research Specialist with Research Computing, a group that provides cyberinfrastructure services, including high-speed networking, large-scale data storage, and supercomputing, to university students and researchers. My position is the perfect marriage of advanced technical skills and "softer" skills, combined with an understanding of exactly what a busy scientist needs to know about their data. I have had the opportunity to help shape our university's data education system, a development that is still evolving. This presentation will detail my career story, the lessons I have learned, my daily work in my new position, and some of the exciting opportunities that have opened up in my new career.

  9. Universal quantum computation using all-optical hybrid encoding

    NASA Astrophysics Data System (ADS)

    Guo, Qi; Cheng, Liu-Yong; Wang, Hong-Fu; Zhang, Shou

    2015-04-01

    By employing displacement operations, single-photon subtractions, and weak cross-Kerr nonlinearity, we propose an alternative way of implementing several universal quantum logic gates for all-optical hybrid qubits encoded in both single-photon polarization states and coherent states. Because these schemes can be implemented using only local operations, without a teleportation procedure, they require fewer physical resources and simpler operations than existing schemes. With the help of displacement operations, a large phase shift of the coherent state can be obtained via the tiny cross-Kerr nonlinearities currently available. Thus, all of these schemes are nearly deterministic and feasible with current technology, which makes them suitable for large-scale quantum computing. Project supported by the National Natural Science Foundation of China (Grant Nos. 61465013, 11465020, and 11264042).
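
    The reason a tiny cross-Kerr phase becomes usable on a bright coherent state is the standard overlap relation |<alpha|alpha e^{i theta}>| = exp(-|alpha|^2 (1 - cos theta)): the two phase-rotated states become nearly orthogonal, and hence distinguishable, once |alpha| is large. A numeric illustration (the amplitudes and phase are values we chose, not taken from the paper):

```python
import math

def coherent_overlap(amp, theta):
    """|<alpha | alpha e^{i theta}>| for a coherent state of real
    amplitude `amp` whose phase is rotated by `theta` (standard
    quantum-optics overlap formula)."""
    return math.exp(-amp ** 2 * (1 - math.cos(theta)))

tiny_phase = 1e-2                             # weak cross-Kerr phase shift
weak = coherent_overlap(2.0, tiny_phase)      # close to 1: barely distinguishable
bright = coherent_overlap(500.0, tiny_phase)  # close to 0: nearly orthogonal
```

    Displacement operations effectively move the scheme from the `weak` regime to the `bright` regime without needing a stronger nonlinearity.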

  10. Evolution of the indoor biome.

    PubMed

    Martin, Laura J; Adams, Rachel I; Bateman, Ashley; Bik, Holly M; Hawks, John; Hird, Sarah M; Hughes, David; Kembel, Steven W; Kinney, Kerry; Kolokotronis, Sergios-Orestis; Levy, Gabriel; McClain, Craig; Meadow, James F; Medina, Raul F; Mhuireach, Gwynne; Moreau, Corrie S; Munshi-South, Jason; Nichols, Lauren M; Palmer, Clare; Popova, Laura; Schal, Coby; Täubel, Martin; Trautwein, Michelle; Ugalde, Juan A; Dunn, Robert R

    2015-04-01

    Few biologists have studied the evolutionary processes at work in indoor environments. Yet indoor environments comprise approximately 0.5% of ice-free land area, an area as large as the subtropical coniferous forest biome. Here we review the emerging subfield of 'indoor biome' studies. After defining the indoor biome and tracing its deep history, we discuss some of its evolutionary dimensions. We restrict our examples to the species found in human houses, a subset of the environments constituting the indoor biome, and offer preliminary hypotheses to advance the study of indoor evolution. Studies of the indoor biome are situated at the intersection of evolutionary ecology, anthropology, architecture, and human ecology and are well suited for citizen science projects, public outreach, and large-scale international collaborations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A Global Perspective: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    NASA Technical Reports Server (NTRS)

    Zhang, Taiping; Stackhouse, Paul W., Jr.; Chandler, William S.; Hoell, James M.; Westberg, David; Whitlock, Charles H.

    2007-01-01

    The Prediction of the Worldwide Energy Resources (POWER) Project, initiated under the NASA Science Mission Directorate Applied Science Energy Management Program, synthesizes and analyzes data on a global scale that are invaluable to the renewable energy industries, especially to the solar and wind energy sectors. The POWER project derives its data primarily from NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Version 2.9) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (Version 4). The latest development of the NASA POWER Project and its plans for the future are presented in this paper.

  12. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    This report introduces publications that present the results of a project that aimed to design a computational framework enabling computational experimentation at scale while supporting the model of "submit locally, compute globally". The project focuses on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during runs.

  13. Materials Science and Physics at Micro/Nano-Scales. FINAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Judy Z.

    2009-09-07

    The scope of this project is to study nanostructures of semiconductors and superconductors, which have been regarded as promising building blocks for nanoelectronic and nanoelectric devices. The emphasis of this project is on developing novel synthesis approaches for the fabrication of nanostructures with desired physical properties. The ultimate goal is to achieve full control of nanostructure growth at microscopic scales. The major experimental achievements obtained are summarized.

  14. The Costa Rica GLOBE (Global Learning and Observations to Benefit the Environment) Project as a Learning Science Environment

    NASA Astrophysics Data System (ADS)

    Castro Rojas, María Dolores; Zuñiga, Ana Lourdes Acuña; Ugalde, Emmanuel Fonseca

    2015-12-01

    GLOBE is a global educational program for the elementary and high school levels, and its main purpose in Costa Rica is to develop scientific thinking and interest in science among high school students through hydrology research projects that allow them to relate science to environmental issues in their communities. Youth between 12 and 17 years old from public schools participate in science clubs outside their regular school schedule. A comparison study was performed between different groups in order to assess GLOBE's suitability as a science learning environment and the motivation and interest it generates in students toward science. Internationally applied scales, adapted to the Costa Rican context, were used to measure these indicators. The results provide statistically significant evidence that students perceive the GLOBE setting as an enriched environment for science learning in comparison with the traditional science class. Students also feel more confident, motivated, and interested in science than their peers who do not participate in the project, although the results were not statistically significant in this last respect.

  15. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    PubMed

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development, and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.
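
    The power limitation of small samples can be made concrete with the standard normal-approximation sample-size formula for comparing two group means, n = 2(z_{1-alpha/2} + z_power)^2 / d^2; a sketch (the function name is ours):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, power=0.80, alpha=0.05):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means with standardized effect size d (Cohen's d):
    n = 2 * (z_{1-alpha/2} + z_{power})**2 / d**2, rounded up."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value, two-sided
    z_b = NormalDist().inv_cdf(power)           # quantile for target power
    return math.ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)

small_effect = n_per_group(0.2)   # small effects demand hundreds per group
medium_effect = n_per_group(0.5)
```

    A study powered only for medium effects will miss the small ones common in developmental research, hence the appeal of existing large-scale data sets.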

  16. Physiology Education and the Linguistic Jungle of Science

    ERIC Educational Resources Information Center

    Nordquist, Lina

    2008-01-01

    Life sciences can be complicated enough without getting into the names of all its Dalton-scale participants. For many students, composing a project plan or writing a paper is rather like learning a foreign language. In this article, the author argues that there is a linguistic jungle of science, and it may well discourage students from pursuing a…

  17. Social Activism in Elementary Science Education: A Science, Technology, and Society Approach to Teach Global Warming

    ERIC Educational Resources Information Center

    Lester, Benjamin T.; Ma, Li; Lee, Okhee; Lambert, Julie

    2006-01-01

    As part of a large-scale instructional intervention research, this study examined elementary students' science knowledge and awareness of social activism with regard to an increased greenhouse effect and global warming. The study involved fifth-grade students from five elementary schools of varying demographic makeup in a large urban school…

  18. The Pilot Lunar Geologic Mapping Project: Summary Results and Recommendations from the Copernicus Quadrangle

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Gaddis, L. R.; Hagerty, J. J.

    2010-01-01

    The first systematic lunar geologic maps were completed at 1:1M scale for the lunar near side during the 1960s using telescopic and Lunar Orbiter (LO) photographs [1-3]. The program under which these maps were completed established precedents for map base, scale, projection, and boundaries in order to avoid widely discrepant products. A variety of geologic maps were subsequently produced for various purposes, including 1:5M scale global maps [4-9] and large-scale maps of high scientific interest (including the Apollo landing sites) [10]. Since that time, lunar science has benefited from an abundance of surface information, including high resolution images and diverse compositional data sets, which have yielded a host of topical planetary investigations. The existing suite of lunar geologic maps and topical studies provides exceptional context in which to unravel the geologic history of the Moon. However, there has been no systematic approach to lunar geologic mapping since the flight of the post-Apollo scientific orbiters. Geologic maps provide a spatial and temporal framework wherein observations can be reliably benchmarked and compared. As such, the lack of a systematic mapping program means that modern (post-Apollo) data sets, their scientific ramifications, and the lunar scientists who investigate these data are all marginalized with regard to geologic mapping. Marginalization weakens the overall understanding of the geologic evolution of the Moon and unnecessarily partitions lunar research. To bridge these deficiencies, we began a pilot geologic mapping project in 2005 as a means to assess the interest, relevance, and technical methods required for a renewed lunar geologic mapping program [11]. Herein, we provide a summary of the pilot geologic mapping project, which focused on the geologic materials and stratigraphic relationships within the Copernicus quadrangle (0°-30°N, 0°-45°W).

  19. Enhancing fire science exchange: The Joint Fire Science Program's National Network of Knowledge Exchange Consortia

    Treesearch

    Vita Wright; Crystal Kolden; Todd Kipfer; Kristine Lee; Adrian Leighton; Jim Riddering; Leana Schelvan

    2011-01-01

    The Northern Rocky Mountain region is one of the most fire-prone regions in the United States. With a history of large fires that have shaped national policy, including the fires of 1910 and 2000 in Idaho and Montana and the Yellowstone fires of 1988, this region is projected to have many large severe fires in the future. Communication about fire science needs and...

  20. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    NASA Astrophysics Data System (ADS)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theory correctly describes them, and the scale at which soil and water systems need to be managed calls for further development of scaling procedures in unsaturated-zone science. It also calls for better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e. the interdisciplinary research domain in which modelling and monitoring science are linked, is currently evolving significantly in unsaturated-zone hydrology. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow the identification of research needs in the interdisciplinary domain of modelling and monitoring, and improve the integration of unsaturated-zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.
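
    For concreteness, the non-linear unsaturated transport the abstract refers to is conventionally described by the Richards equation (standard background, not stated in the abstract itself):

```latex
\frac{\partial \theta(h)}{\partial t}
  = \nabla \cdot \left[ K(h)\, \nabla (h + z) \right]
```

    where $\theta$ is the volumetric water content, $h$ the pressure head, $z$ the elevation, and $K(h)$ the unsaturated hydraulic conductivity. The strongly non-linear dependence of both $\theta$ and $K$ on $h$ is precisely what makes these fluxes hard to characterize and to transfer across scales.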

  1. Plans for Embedding ICTs into Teaching and Learning through a Large-Scale Secondary Education Reform in the Country of Georgia

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja

    2015-01-01

    Integrating ICTs into international development projects is common. However, how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to integrating technology into the design of a large-scale, five-year teacher and leader professional development project in the country of Georgia.…

  2. NEPTUNE: an under-sea plate scale observatory

    NASA Technical Reports Server (NTRS)

    Beauchamp, P. M.; Heath, G. R.; Maffei, A.; Chave, A.; Howe, B.; Wilcock, W.; Delaney, J.; Kirkham, H.

    2002-01-01

    The NEPTUNE project will establish a linked array of undersea observatories on the Juan de Fuca tectonic plate. This observatory will provide a new kind of research platform for real-time, long-term, plate-scale studies in the ocean and Earth sciences.

  3. Evolution of Cognitive Rehabilitation After Stroke From Traditional Techniques to Smart and Personalized Home-Based Information and Communication Technology Systems: Literature Review

    PubMed Central

    Cogollor, José M; Rojo-Lacal, Javier; Hermsdörfer, Joachim; Arredondo Waldmeyer, Maria Teresa; Giachritsis, Christos; Armstrong, Alan; Breñosa Martinez, Jose Manuel; Bautista Loza, Doris Anabelle; Sebastián, José María

    2018-01-01

    Background Neurological patients after stroke usually present cognitive deficits that cause dependence in their daily living. These deficits mainly affect the performance of some daily activities, and for that reason stroke patients need long-term cognitive rehabilitation. Considering that classical techniques are focused on acting as guides and depend on help from therapists, significant efforts are being made to improve current methodologies and to use eHealth and Web-based architectures to implement information and communication technology (ICT) systems that achieve reliable, personalized, home-based platforms, increasing efficiency and attractiveness for patients and carers. Objective The goal of this work was to provide an overview of the practices implemented for the assessment of stroke patients and for cognitive rehabilitation, bringing together traditional methods and the most recent personalized platforms based on ICT and the Internet of Things. Methods A literature review was distributed to a multidisciplinary team of researchers from the engineering, psychology, and sport science fields. The systematic review focused on published scientific research, other European projects, and the most innovative current large-scale initiatives in the area. A total of 3469 results retrieved from Web of Science, 284 studies from the Journal of Medical Internet Research, and 15 European research projects from the Community Research and Development Information Service covering the last 15 years were reviewed for classification and selection according to their relevance. Results A total of 7 relevant studies on the screening of stroke patients are presented, along with 6 additional methods for the analysis of kinematics and 9 studies on the execution of goal-oriented activities. Meanwhile, the classical methods of cognitive rehabilitation are classified into the 5 main techniques implemented.
The review concludes with the selection of 8 different ICT-based approaches found in scientific-technical studies, 9 European Commission-funded projects that offer eHealth architectures, and other large-scale activities such as smart houses and the City4Age initiative. Conclusions Stroke is one of the conditions with the most negative socioeconomic effects on countries. The design of new ICT-based systems should provide 4 main features for efficient and personalized cognitive rehabilitation: support in the execution of complex daily tasks, automatic error detection, home-based performance, and accessibility. Only 33% of the European projects presented fulfilled all of these requirements at once. For this reason, current and future large-scale initiatives focused on eHealth and smart environments should address this gap by providing more complete and sophisticated platforms. PMID:29581093

  4. Data management for interdisciplinary field experiments: OTTER project support

    NASA Technical Reports Server (NTRS)

    Angelici, Gary; Popovici, Lidia; Skiles, J. W.

    1993-01-01

    The ability of investigators of an interdisciplinary science project to properly manage the data that are collected during the experiment is critical to the effective conduct of science. When the project becomes large, possibly including several scenes of large-format remotely sensed imagery shared by many investigators requiring several services, the data management effort can involve extensive staff and computerized data inventories. The OTTER (Oregon Transect Ecosystem Research) project was supported by the PLDS (Pilot Land Data System) with several data management services, such as data inventory, certification, and publication. After a brief description of these services, experiences in providing them are compared with earlier data management efforts and some conclusions regarding data management in support of interdisciplinary science are discussed. In addition to providing these services, a major goal of this data management capability was to adopt characteristics of a pro-active attitude, such as flexibility and responsiveness, believed to be crucial for the effective conduct of active, interdisciplinary science. These are also itemized and compared with previous data management support activities. Identifying and improving these services and characteristics can lead to the design and implementation of optimal data management support capabilities, which can result in higher quality science and data products from future interdisciplinary field experiments.

  5. Analysis of central enterprise architecture elements in models of six eHealth projects.

    PubMed

    Virkanen, Hannu; Mykkänen, Juha

    2014-01-01

    Large-scale initiatives for eHealth services have been established in many countries at the regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification, and implementation of large-scale initiatives, including the governance of business changes as well as of information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in the different projects.

  6. Solar System Visualization (SSV) Project

    NASA Technical Reports Server (NTRS)

    Todd, Jessida L.

    2005-01-01

    The Solar System Visualization (SSV) project aims to enhance scientific and public understanding through visual representations and modeling procedures. The SSV project's objectives are to (1) create new visualization technologies, (2) organize science observations and models, and (3) visualize science results and mission plans. The SSV project currently supports the Mars Exploration Rovers (MER) mission, the Mars Reconnaissance Orbiter (MRO), and Cassini. In support of these missions, the SSV team has produced pan and zoom animations of large mosaics to reveal details of surface features and topography, created 3D animations of science instruments and procedures, formed 3D anaglyphs from left and right stereo pairs, and animated registered multi-resolution mosaics to provide context for microscopic images.
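
    The red-cyan anaglyph step mentioned above follows a standard recipe: take the red channel from the left-eye image and the green and blue channels from the right-eye image. A minimal sketch for grayscale inputs (the representation and function name are ours, not from the SSV pipeline):

```python
def anaglyph(left, right):
    """Combine two grayscale stereo views (2D lists of 0-255 intensities)
    into a red-cyan anaglyph: red from the left eye, green and blue
    (together, cyan) from the right eye."""
    return [[(lum_l, lum_r, lum_r) for lum_l, lum_r in zip(row_l, row_r)]
            for row_l, row_r in zip(left, right)]

left = [[10, 20], [30, 40]]
right = [[50, 60], [70, 80]]
img = anaglyph(left, right)   # img[0][0] == (10, 50, 50)
```

    Viewed through red-cyan glasses, each eye then sees only its own view, recovering the depth cue from the stereo pair.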

  7. Factors That Affect the Centrality Controllability of Scale-Free Networks

    NASA Astrophysics Data System (ADS)

    Hu, Dong; Sun, Xian; Li, Ping; Chen, Yan; Zhang, Jie

    2015-12-01

    Supported by the Foundations of the Sichuan Educational Committee under Grant No 13ZB0198, the National Natural Science Foundation of China under Grant Nos 61104224, 81373531, 61104143 and 61573107, and the Science and Technology Fund Project of SWPU (2013XJR011).

  8. CONSIDERATIONS FOR A REGULATORY FRAMEWORK FOR LARGE-SCALE GEOLOGIC SEQUESTRATION OF CARBON DIOXIDE: A NORTH AMERICAN PERSPECTIVE

    EPA Science Inventory

    Large scale geologic sequestration (GS) of carbon dioxide poses a novel set of challenges for regulators. This paper focuses on the unique needs of large scale GS projects in light of the existing regulatory regimes in the United States and Canada and identifies several differen...

  9. Always the bridesmaid and never the bride! Arts, Archaeology and the e-Science agenda

    NASA Astrophysics Data System (ADS)

    Gaffney, V.; Fletcher, R. P.

    There is, without doubt, a strong tradition amongst the Arts and Humanities community of gifted individuals: academics who can, and do, labour long and hard alone in libraries or museums to produce significant scholarly works. The creation and organisation of large data sets, the desire for enhanced access to data held in disparate locations, and the increasing complexity of our theoretical and methodological aspirations inevitably push us towards greater use of technology and a reliance on interdisciplinary teams to facilitate that use. How far such a process has become established, however, is a moot point. As the director of an Arts-based Visualisation laboratory[1] that possesses a UKlight connection, I would observe that the Arts and Humanities community has largely remained aloof from many of the recent developments in large-scale, ubiquitous technologies, with the notable exception of those that permit resource discovery. It remains a fact that the emergence of GRID technologies in other disciplines, for instance, has not yet had the impact one might have expected on the Arts and Humanities. It seems certain that this reticence is not the consequence of a lack of data within the Arts. Some disciplines, including archaeology, sit at the edge of the natural sciences and are prodigious generators of data in their own right, or consumers of digital data generated by other disciplines. Another assertion that may be considered is that Arts research is not amenable to large-scale distributed computing. To a certain extent, successful Grid applications are linked to the ability of researchers to agree on methodologies that permit a "mechanistic" world view amenable to such analysis. In contrast, however, it is not true that Arts research is so individual, so chaotic, or so anarchic that it is impossible to demonstrate the value, at least, of e-science applications to our disciplines.
Lighting the Blue Touchpaper for UK e-Science - Closing Conference of ESLEA Project The George Hotel, Edinburgh, UK 26-28 March, 200

  10. Focal Plant Observations as a Standardised Method for Pollinator Monitoring: Opportunities and Limitations for Mass Participation Citizen Science.

    PubMed

    Roy, Helen E; Baxter, Elizabeth; Saunders, Aoine; Pocock, Michael J O

    2016-01-01

    Recently there has been increasing focus on monitoring pollinating insects, due to concerns about their declines, and interest in the role of volunteers in monitoring pollinators, particularly bumblebees, via citizen science. The Big Bumblebee Discovery was a one-year citizen science project run by a partnership of EDF Energy, the British Science Association and the Centre for Ecology & Hydrology which sought to assess the influence of the landscape at multiple scales on the diversity and abundance of bumblebees. Timed counts of bumblebees (Bombus spp.; identified to six colour groups) visiting focal plants of lavender (Lavandula spp.) were carried out by about 13 000 primary school children (7-11 years old) from over 4000 schools across the UK. 3948 reports were received, totalling 26 868 bumblebees. We found that while the wider landscape type had no significant effect on reported bumblebee abundance, the local proximity to flowers had a significant effect (fewer bumblebees where other flowers were reported to be >5 m away from the focal plant). However, the rate of mis-identification, revealed by photographs uploaded by participants and a photo-based quiz, was high. Our citizen science results support recent research on the importance of local floral resources for pollinator abundance. Timed counts of insects visiting a lure plant are potentially an effective approach for standardised pollinator monitoring, engaging a large number of participants with a simple protocol. However, the relatively high rate of mis-identification (compared with reports from previous pollinator citizen science projects) highlights the importance of investing in resources to train volunteers. Also, to be a scientifically valid method of enquiry, citizen science data need to be of sufficiently high quality, so receiving supporting evidence (such as photographs) allows this to be tested and records to be verified.
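
    When a verified subsample (here, the uploaded photographs) yields an identification error rate, reported counts can be adjusted accordingly. A deliberately simplified sketch, assuming a single uniform error rate rather than the full colour-group confusion matrix a real analysis would use; the 80-of-100 verification figures are hypothetical:

```python
def corrected_count(reported, checked, correct):
    """Scale a reported count by the fraction of a verified subsample
    that was correctly identified. Simplification: assumes the
    subsample error rate applies uniformly to all reports."""
    if checked <= 0:
        raise ValueError("need a non-empty verified subsample")
    return reported * correct / checked

# 26 868 bumblebees were reported; suppose 80 of 100 checked photos
# were correctly identified (hypothetical verification figures).
estimate = corrected_count(26868, 100, 80)
```

    The larger and more representative the verified subsample, the tighter the correction, which is the quantitative argument for collecting photographic evidence alongside counts.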

  11. Microwave Remote Sensing and the Cold Land Processes Field Experiment

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)

    2001-01-01

    The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX Plan has been developed through the efforts of over 60 interested scientists that have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with the task of assessing, planning and implementing the required background science, technology, and application infrastructure to support successful land surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive, legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. 
The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land surface models and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km²) approach to providing the comprehensive data set necessary to address several experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.

  12. StePS: Stereographically Projected Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László

    2018-05-01

StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N²) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N²) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
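The direct-summation approach the abstract describes can be sketched in a few lines. The following is an illustrative O(N²) pairwise gravitational sum with a softening length, not code from StePS itself; the function name, units, and softening value are invented for the example.

```python
# Hedged sketch of a naive O(N^2) direct-summation force calculation,
# the kind of kernel StePS-style codes map onto GPUs. Illustrative only.
import numpy as np

def pairwise_accelerations(pos, mass, G=1.0, softening=1e-3):
    """Direct-summation accelerations for N particles: O(N^2) pairs."""
    n = len(pos)
    acc = np.zeros_like(pos, dtype=float)
    for i in range(n):
        d = pos - pos[i]                       # vectors from particle i, shape (n, 3)
        r2 = (d ** 2).sum(axis=1) + softening ** 2
        r2[i] = np.inf                         # exclude self-interaction
        acc[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc
```

Because every particle's inner loop is independent and identical, this kernel parallelizes trivially across GPU threads, which is the adaptability the abstract points to.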

  13. State of the Art in Large-Scale Soil Moisture Monitoring

    NASA Technical Reports Server (NTRS)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; hide

    2013-01-01

Soil moisture is an essential climate variable influencing land–atmosphere interactions, an essential hydrologic variable impacting rainfall–runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  14. Bioinformatics by Example: From Sequence to Target

    NASA Astrophysics Data System (ADS)

    Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj

    2002-12-01

With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The presence of these additional bioinformatics tools stirs great anxiety among experimental researchers (as well as pedagogues), since they are now faced with a wider and deeper knowledge of differing disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets those individuals who are interested in using computational methods in their teaching or research. By analyzing a real-life, pharmaceutical, multicomponent, target-based example, the reader will experience this fascinating new discipline.

  15. On the Attitude of Secondary 1 Students towards Science

    NASA Astrophysics Data System (ADS)

    Kuppan, L.; Munirah, S. K.; Foong, S. K.; Yeung, A. S.

    2010-07-01

Understanding students' attitudes towards science gives a sense of direction when designing pedagogical approaches and lesson packages, so that reasons for not liking science are addressed and, eventually, the nation's need for a science-oriented workforce is met. This study is part of a 3-year research project entitled PbI1@School: A large scale study on the effect of "Physics by Inquiry" pedagogy on Secondary One students' attitude and aptitude in science, involving schools, the National Institute of Education (NIE) Singapore, the University of Washington at Seattle and the Ministry of Education (MOE) of Singapore. The results from a survey conducted on a sample of 215 Secondary 1 students indicate that fun in studying science is a major reason for their interest in the subject. Those who do not like science dislike surface learning such as memorizing facts and information. Moreover, all the students in our sample appear to be inquisitive. We believe that the teaching and learning system needs to be modified to increase, or at least sustain, students' interest in science and to capitalize on their inquisitiveness. Although the results obtained are interesting and give insight into Secondary 1 students' attitudes towards science, we intend to carry out a more rigorous study to identify correlations between students' responses to different attitude questions, to understand their attitudes more deeply.

  16. Designing the Bridge: Perceptions and Use of Downscaled Climate Data by Climate Modelers and Resource Managers in Hawaii

    NASA Astrophysics Data System (ADS)

    Keener, V. W.; Brewington, L.; Jaspers, K.

    2016-12-01

To build an effective bridge from the climate modeling community to natural resource managers, we assessed the existing landscape to see where different groups diverge in their perceptions of climate data and needs. An understanding of a given community's shared knowledge and differences can help design more actionable science. Resource managers in Hawaii are eager to have future climate projections at spatial scales relevant to the islands. National initiatives to downscale climate data often exclude US insular regions, so researchers in Hawaii have generated regional dynamically and statistically downscaled projections. Projections of precipitation diverge, however, leading to difficulties in communication and use. Recently, a two-day workshop was held with scientists and managers to evaluate available models and determine a set of best practices for moving forward with decision-relevant downscaling in Hawaii. To seed the discussion, the Pacific Regional Integrated Sciences and Assessments (RISA) program conducted a pre-workshop survey (N=65) of climate modelers and freshwater, ecosystem, and wildfire managers working in Hawaii. Scientists reported spending less than half of their time on operational research, although the majority were eager to partner with managers on specific projects. Resource managers had varying levels of familiarity with downscaled climate projections, but reported needing more information about uncertainty for decision making, and were less interested in the technical model details. There were large differences between groups of managers, with 41.7% of freshwater managers reporting that they used climate projections regularly, while a majority of ecosystem and wildfire managers reported having "no familiarity". Scientists and managers rated which spatial and temporal scales were most relevant to decision making.
Finally, when asked to compare how confident they were in projections of specific climate variables between the dynamical and statistical data, 80-90% of managers responded that they had no opinion. Workshop attendees were very interested in the survey results, adding to evidence of a need for sustained engagement between modeler and user groups, as well as different strategies for working with different types of resource managers.

  17. Increasing Higher Level Thinking Skills in Science of Gifted Students in Grades 1-4 through "Hands-On" Activities.

    ERIC Educational Resources Information Center

    Dindial, Myrna J.

    This practicum was designed to increase higher level thinking skills of gifted students in primary school. The project sought to retrain students from recalling science information from the textbook to a more challenging and active form of learning through individual projects and small group and large group activities. Students were given…

  18. Urban Elementary STEM Initiative

    ERIC Educational Resources Information Center

    Parker, Carolyn; Abel, Yolanda; Denisova, Ekaterina

    2015-01-01

    The new standards for K-12 science education suggest that student learning should be more integrated and should focus on crosscutting concepts and core ideas from the areas of physical science, life science, Earth/space science, and engineering/technology. This paper describes large-scale, urban elementary-focused science, technology, engineering,…

  19. Polar continental margins: Studies off East Greenland

    NASA Astrophysics Data System (ADS)

    Mienert, J.; Thiede, J.; Kenyon, N. H.; Hollender, F.-J.

The passive continental margin off east Greenland has been shaped by tectonic and sedimentary processes, and typical physiographic patterns have evolved over the past few million years under the influence of the late Cenozoic Northern Hemisphere glaciations. The Greenland ice sheet has been particularly affected. GLORIA (Geological Long Range Inclined Asdic), the Institute of Oceanographic Sciences' (IOS) long-range, side-scan sonar, was used on a 1992 RV Livonia cruise to map large-scale changes in sedimentary patterns along the east Greenland continental margin. The overall objective of this research program was to determine the variety of large-scale seafloor processes to improve our understanding of the interaction between ice sheets, current regimes, and sedimentary processes. In cooperation with IOS and the RV Livonia, a high-quality set of seafloor data has been produced. GLORIA's first survey of east Greenland's continental margin covered several 1000 km × 50 km swaths (Figure 1) and yielded an impressive sidescan sonar image of the complete Greenland Basin and margin (about 250,000 km²). A mosaic of the data was made at a scale of 1:375,000. The base map was prepared with a polar stereographic projection having a standard parallel of 71°.
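The base map's projection can be illustrated with the spherical form of the north-polar stereographic projection, where the scale constant k0 is chosen so that scale is true at the 71° standard parallel. This spherical-Earth sketch is an approximation for illustration; published charts typically use the ellipsoidal formulas, and the function name and radius here are my own choices.

```python
# Hedged sketch: spherical north-polar stereographic projection with a
# standard parallel, as used (in ellipsoidal form) for polar base maps.
import math

def polar_stereographic(lat_deg, lon_deg, lat_ts_deg=71.0, lon0_deg=0.0,
                        radius=6371000.0):
    """Return (x, y) in meters for a spherical polar stereographic map.

    Scale is true at the standard parallel lat_ts_deg.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg - lon0_deg)
    lat_ts = math.radians(lat_ts_deg)
    # choose k0 so the scale factor k = 2*k0 / (1 + sin(lat)) equals 1
    # at the standard parallel
    k0 = (1.0 + math.sin(lat_ts)) / 2.0
    rho = 2.0 * radius * k0 * math.tan(math.pi / 4.0 - lat / 2.0)
    return rho * math.sin(lon), -rho * math.cos(lon)
```

At the pole the radial distance rho collapses to zero, and points south of the pole on the central meridian map to negative y, which matches the usual chart orientation.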

  20. The Exploratorium Guide to Scale and Structure: Activities for the Elementary Classroom.

    ERIC Educational Resources Information Center

    Kluger-Bell, Barry; And Others

    The theme of Scale and Structure (or simply Scale) appears in many state science frameworks and projects of national importance. The major idea of this theme is that a change in scale will affect the nature of the given structure. This book is designed as a guide and set of activities for third- through eighth-grade teachers. The activities…

  1. Utilizing a scale model solar system project to visualize important planetary science concepts and develop technology and spatial reasoning skills

    NASA Astrophysics Data System (ADS)

    Kortenkamp, Stephen J.; Brock, Laci

    2016-10-01

Scale model solar systems have been used for centuries to help educate young students and the public about the vastness of space and the relative sizes of objects. We have adapted the classic scale model solar system activity into a student-driven project for an undergraduate general education astronomy course at the University of Arizona. Students are challenged to construct and use their three-dimensional models to demonstrate an understanding of numerous concepts in planetary science, including: 1) planetary obliquities, eccentricities, inclinations; 2) phases and eclipses; 3) planetary transits; 4) asteroid sizes, numbers, and distributions; 5) giant planet satellite and ring systems; 6) the Pluto system and Kuiper belt; 7) the extent of space travel by humans and robotic spacecraft; 8) the diversity of extrasolar planetary systems. Secondary objectives of the project allow students to develop better spatial reasoning skills and gain familiarity with technology such as Excel formulas, smart-phone photography, and audio/video editing. During our presentation we will distribute a formal description of the project and discuss our expectations of the students as well as present selected highlights from preliminary submissions.
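The core computation behind such a model is a single scale factor applied to real diameters and orbital distances. The 1:10¹⁰ scale and the rounded astronomical values below are illustrative choices for a sketch, not figures from the course materials.

```python
# Hedged sketch: converting real solar-system dimensions to a scale
# model. Scale choice and rounded values are illustrative assumptions.
SCALE = 1e10  # model is 10 billion times smaller than reality

# approximate real values, in meters
BODIES = {
    "Sun":     {"diameter": 1.39e9, "distance": 0.0},
    "Earth":   {"diameter": 1.27e7, "distance": 1.50e11},
    "Jupiter": {"diameter": 1.40e8, "distance": 7.78e11},
    "Neptune": {"diameter": 4.95e7, "distance": 4.50e12},
}

def model_size(meters, scale=SCALE):
    """Scaled dimension in meters."""
    return meters / scale

for name, b in BODIES.items():
    print(f"{name:8s} diameter {model_size(b['diameter']) * 1000:8.2f} mm, "
          f"distance {model_size(b['distance']):7.1f} m")
```

At this scale the Sun is about a 14 cm ball, the Earth a millimeter-sized dot 15 m away, and Neptune sits nearly half a kilometer off, which is exactly the vastness the activity is meant to convey.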

  2. Environmental challenges threatening the growth of urban agriculture in the United States.

    PubMed

    Wortman, Sam E; Lovell, Sarah Taylor

    2013-09-01

    Urban agriculture, though often difficult to define, is an emerging sector of local food economies in the United States. Although urban and agricultural landscapes are often integrated in countries around the world, the establishment of mid- to large-scale food production in the U.S. urban ecosystem is a relatively new development. Many of the urban agricultural projects in the United States have emerged from social movements and nonprofit organizations focused on urban renewal, education, job training, community development, and sustainability initiatives. Although these social initiatives have traction, critical knowledge gaps exist regarding the science of food production in urban ecosystems. Developing a science-based approach to urban agriculture is essential to the economic and environmental sustainability of the movement. This paper reviews abiotic environmental factors influencing urban cropping systems, including soil contamination and remediation; atmospheric pollutants and altered climatic conditions; and water management, sources, and safety. This review paper seeks to characterize the limited state of the science on urban agricultural systems and identify future research questions most relevant to urban farmers, land-use planners, and environmental consultants. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  3. The Superconducting Supercollider and US Science Policy

    NASA Astrophysics Data System (ADS)

    Marburger, John H.

    2014-06-01

    Reasons for the Superconducting Supercollider's (SSC's) termination include significant changes in the attitude of the government towards large scientific projects originating with management reforms introduced decades earlier. In the 1980s, the government insisted on inclusion of elements of these reforms in the SSC's management contract, including increased demands for accountability, additional liability for contractors, and sanctions for infractions. The SSC's planners could not have opted out of the reforms, which were by then becoming part of all large publicly funded projects. Once these reforms were in place, management mistakes in the SSC's planning and construction became highly visible, leading to termination of the machine. This episode contains two key lessons about science policy. One is that the momentum of the government's management reforms was unstoppable, and its impact on large scientific facilities and projects could not be reversed. The other is that specific measures such as cost and schedule-tracking systems to provide measures of program performance and impact were also inevitable; large scientific projects needed new parameters of accountability and transparency in what can be called the Principle of Assurance.

  4. Design of scale model of plate-shaped absorber in a wide frequency range

    NASA Astrophysics Data System (ADS)

    Yuan, Li-Ming; Xu, Yong-Gang; Gao, Wei; Dai, Fei; Wu, Qi-Lin

    2018-04-01

Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 61601299 and 11404213), the Shanghai Municipal Science and Technology Commission, China (Grant Nos. 17210730900 and 15ZR1439600), and the Defense Industrial Technology, China (Grant No. B2120132001).

  5. When Good Intentions and Reality Meet: Large-Scale Reform of Science Teaching in Urban Schools with Predominantly Latino ELL Students

    ERIC Educational Resources Information Center

    Johnson, Carla C.; Bolshakova, Virginia L. J.; Waldron, Tammy

    2016-01-01

    This study examined the ability of Transformative Professional Development (TPD) to transform science teacher quality and associated impact on science achievement, including particular focus on English Language Learners (ELL). TPD was implemented in a large, low-performing, urban district in the southwest with predominantly Latino ELL populations.…

  6. 78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN) ; Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination...://www.nitrd.gov/nitrdgroups/index.php?title=Joint_Engineering_Team_ (JET)#title. SUMMARY: The JET...

  7. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project's inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...

  8. The Heritage of Earth Science Applications in Policy, Business, and Management of Natural Resources

    NASA Astrophysics Data System (ADS)

    Macauley, M.

    2012-12-01

    From the first hand-held cameras on the Gemini space missions to present day satellite instruments, Earth observations have enhanced the management of natural resources including water, land, and air. Applications include the development of new methodology (for example, developing and testing algorithms or demonstrating how data can be used) and the direct use of data in decisionmaking and policy implementation. Using well-defined bibliographic search indices to systematically survey a broad social science literature, this project enables identification of a host of well-documented, practical and direct applications of Earth science data in resource management. This literature has not previously been well surveyed, aggregated, or analyzed for the heritage of lessons learned in practical application of Earth science data. In the absence of such a survey, the usefulness of Earth science data is underestimated and the factors that make people want to use -- and able to use -- the data are poorly understood. The project extends and updates previous analysis of social science applications of Landsat data to show their contemporary, direct use in new policy, business, and management activities and decisionmaking. The previous surveys (for example, Blumberg and Jacobson 1997; National Research Council 1998) find that the earliest attempts to use data are almost exclusively testing of methodology rather than direct use in resource management. Examples of methodology prototyping include Green et al. (1997) who demonstrate use of remote sensing to detect and monitor changes in land cover and use, Cowen et al. (1995) who demonstrate design and integration of GIS for environmental applications, Hutchinson (1991) who shows uses of data for famine early warning, and Brondizio et al. (1996) who show the link of thematic mapper data with botanical data. Blumberg and Jacobson (in Acevedo et al. 
1996) show use of data in a study of urban development in the San Francisco Bay and the Baltimore-Washington metropolitan regions. The earliest direct application of Earth science information to actual decisionmaking began with the use of Landsat data in large-scale government demonstration programs and later, in smaller state and local agency projects. Many of these applications served as experiments to show how to use the data and to test their limitations. These activities served as precursors to more recent applications. Among the newest applications are the use of data to provide essential information to underpin monetary estimates of ecosystem services and the development of "credit" programs for these services. Another example is participatory (citizen science) resource management. This project also identifies the heritage of adoption factors - that is, determinants of the decision to use Earth science data. These factors include previous experience with Earth science data, reliable and transparent validation and verification techniques for new data, the availability and thoroughness of metadata, the ease of access and use of the data products, and technological innovation in computing and software (factors largely outside of the Earth science enterprise but influential in ease of direct use of Earth science data).

  9. Project BALLOTS: Bibliographic Automation of Large Library Operations Using a Time-Sharing System. Progress Report (3/27/69 - 6/26/69).

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…

  10. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    ERIC Educational Resources Information Center

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  11. Interfacial nanobubbles produced by long-time preserved cold water

    NASA Astrophysics Data System (ADS)

    Zhou, Li-Min; Wang, Shuo; Qiu, Jie; Wang, Lei; Wang, Xing-Ya; Li, Bin; Zhang, Li-Juan; Hu, Jun

    2017-09-01

Abstract not available. Project supported by the Key Laboratory of Interfacial Physics and Technology, Chinese Academy of Sciences, the Open Research Project of the Large Scientific Facility of the Chinese Academy of Sciences, the National Natural Science Foundation of China (Grant Nos. 11079050, 11290165, 11305252, 11575281, and U1532260), the National Key Basic Research Program of China (Grant Nos. 2012CB825705 and 2013CB932801), the National Natural Science Foundation for Outstanding Young Scientists, China (Grant No. 11225527), the Shanghai Academic Leadership Program, China (Grant No. 13XD1404400), and the Program of the Chinese Academy of Sciences (Grant Nos. KJCX2-EW-W09 and QYZDJ-SSW-SLH019).

  12. GSDC: A Unique Data Center in Korea for HEP research

    NASA Astrophysics Data System (ADS)

    Ahn, Sang-Un

    2017-04-01

The Global Science experimental Data hub Center (GSDC) at the Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea, established to promote fundamental research fields by supporting them with expertise in Information and Communication Technology (ICT) and infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC) and networking. GSDC has supported various research fields in South Korea dealing with large-scale data, e.g. the RENO experiment for neutrino research, the LIGO experiment for gravitational wave detection, genome sequencing projects for bio-medicine, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for the ALICE experiment at the LHC at CERN since 2013. In this talk, we present an overview of the computing infrastructure that GSDC runs for these research fields and discuss the data center infrastructure management system deployed at GSDC.

  13. White House announces “big data” initiative

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2012-04-01

The world is now generating zettabytes—which is 10 to the 21st power, or a billion trillion bytes—of information every year, according to John Holdren, director of the White House Office of Science and Technology Policy. With data volumes growing exponentially from a variety of sources such as computers running large-scale models, scientific instruments including telescopes and particle accelerators, and even online retail transactions, a key challenge is to better manage and utilize the data. The Big Data Research and Development Initiative, launched by the White House at a 29 March briefing, initially includes six federal departments and agencies providing more than $200 million in new commitments to improve tools and techniques for better accessing, organizing, and using data for scientific advances. The agencies and departments include the National Science Foundation (NSF), Department of Energy, U.S. Geological Survey (USGS), National Institutes of Health (NIH), Department of Defense, and Defense Advanced Research Projects Agency.

  14. On the Photometric Calibration of FORS2 and the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Bramich, D.; Moehler, S.; Coccato, L.; Freudling, W.; Garcia-Dabó, C. E.; Müller, P.; Saviane, I.

    2012-09-01

An accurate absolute calibration of photometric data to place them on a standard magnitude scale is very important for many science goals. Absolute calibration requires the observation of photometric standard stars and analysis of the observations with an appropriate photometric model including all relevant effects. In the FORS Absolute Photometry (FAP) project, we have developed a standard star observing strategy and modelling procedure that enables calibration of science target photometry to better than 3% accuracy on photometrically stable nights given sufficient signal-to-noise. In the application of this photometric modelling to large photometric databases, we have investigated the Sloan Digital Sky Survey (SDSS) and found systematic trends in the published photometric data. The amplitudes of these trends are similar to the reported typical precision (~1% and ~2%) of the SDSS photometry in the griz- and u-bands, respectively.
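The kind of photometric model described can be illustrated in its simplest form: a zero point plus a linear atmospheric extinction term in airmass, m_std = m_inst + ZP − k·X, fit to standard-star observations by least squares. The synthetic data, coefficient values, and function name below are invented for the sketch and do not reproduce the FAP pipeline, which includes more effects (e.g. colour terms).

```python
# Hedged sketch of a minimal photometric calibration model fit:
#   m_std = m_inst + ZP - k * X   (ZP: zero point, k: extinction, X: airmass)
# Synthetic data; not the FAP modelling procedure.
import numpy as np

def fit_zeropoint_extinction(m_inst, m_std, airmass):
    """Solve m_std - m_inst = ZP - k*X for (ZP, k) by linear least squares."""
    A = np.column_stack([np.ones_like(airmass), -airmass])
    zp, k = np.linalg.lstsq(A, m_std - m_inst, rcond=None)[0]
    return zp, k

# simulate 50 standard-star observations with true ZP = 25.0, k = 0.12
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 2.0, 50)
m_inst = rng.uniform(14.0, 18.0, 50)
m_std = m_inst + 25.0 - 0.12 * X + rng.normal(0.0, 0.005, 50)
zp, k = fit_zeropoint_extinction(m_inst, m_std, X)
```

With per-star scatter of a few millimagnitudes, the fit recovers the zero point and extinction coefficient to well under the 1-2% level quoted for survey photometry.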

  15. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...

  16. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...

  17. Authentic Research Experience and "Big Data" Analysis in the Classroom: Maize Response to Abiotic Stress.

    PubMed

    Makarevitch, Irina; Frechette, Cameo; Wiatros, Natalia

    2015-01-01

    Integration of inquiry-based approaches into curriculum is transforming the way science is taught and studied in undergraduate classrooms. Incorporating quantitative reasoning and mathematical skills into authentic biology undergraduate research projects has been shown to benefit students in developing various skills necessary for future scientists and to attract students to science, technology, engineering, and mathematics disciplines. While large-scale data analysis became an essential part of modern biological research, students have few opportunities to engage in analysis of large biological data sets. RNA-seq analysis, a tool that allows precise measurement of the level of gene expression for all genes in a genome, revolutionized molecular biology and provides ample opportunities for engaging students in authentic research. We developed, implemented, and assessed a series of authentic research laboratory exercises incorporating a large data RNA-seq analysis into an introductory undergraduate classroom. Our laboratory series is focused on analyzing gene expression changes in response to abiotic stress in maize seedlings; however, it could be easily adapted to the analysis of any other biological system with available RNA-seq data. Objective and subjective assessment of student learning demonstrated gains in understanding important biological concepts and in skills related to the process of science. © 2015 I. Makarevitch et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
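One elementary step students might perform on RNA-seq count data is a per-gene log2 fold change between stress and control samples. The gene names and counts below are invented for illustration, and a real course workflow would add read-depth normalization and statistical testing before drawing conclusions.

```python
# Hedged sketch: per-gene log2 fold change from RNA-seq read counts,
# one small step in the kind of analysis the laboratory series uses.
# Genes and counts are invented; a pseudocount guards against log(0).
import math

def log2_fold_change(stress, control, pseudocount=1.0):
    """log2((stress + c) / (control + c)) for one gene's read counts."""
    return math.log2((stress + pseudocount) / (control + pseudocount))

counts = {              # gene: (stress reads, control reads)
    "hsp101": (930, 57),   # strongly up-regulated under stress
    "actin1": (410, 399),  # roughly unchanged housekeeping gene
}
lfc = {gene: log2_fold_change(s, c) for gene, (s, c) in counts.items()}
```

Positive values flag genes induced by the stress treatment, negative values repressed ones, which gives students a concrete quantity to rank and interpret.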

  18. Talking About The Smokes: a large-scale, community-based participatory research project.

    PubMed

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. Processes describing consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  19. Piers Sellers

    NASA Image and Video Library

    2017-12-08

    Piers Sellers is currently Deputy Director of the Sciences and Exploration Directorate and Acting Director of the Earth Sciences Division at NASA/GSFC. He was born and educated in the United Kingdom and moved to the U.S. in 1982 to carry out climate research at NASA/GSFC. From 1982 to 1996, he worked on global climate problems, particularly those involving interactions between the biosphere and the atmosphere, and was involved in constructing computer models of the global climate system, satellite data interpretation, and conducting large-scale field experiments in the USA, Canada, Africa, and Brazil. He served as project scientist for the first large Earth Observing System platform, Terra, launched in 1999. He joined the NASA astronaut corps in 1996 and flew to the International Space Station (ISS) in 2002, 2006, and 2010, carrying out six spacewalks and working on ISS assembly tasks. He returned to Goddard Space Flight Center in June 2011. Credit: NASA/Goddard/Rebecca Roth NASA image use policy. NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.

  20. The NASA Carbon Monitoring System

    NASA Astrophysics Data System (ADS)

    Hurtt, G. C.

    2015-12-01

    Greenhouse gas emission inventories, forest carbon sequestration programs (e.g., Reducing Emissions from Deforestation and Forest Degradation (REDD and REDD+)), cap-and-trade systems, self-reporting programs, and their associated monitoring, reporting and verification (MRV) frameworks depend upon data that are accurate, systematic, practical, and transparent. A sustained, observationally-driven carbon monitoring system using remote sensing data has the potential to significantly improve the relevant carbon cycle information base for the U.S. and world. Initiated in 2010, NASA's Carbon Monitoring System (CMS) project is prototyping and conducting pilot studies to evaluate technological approaches and methodologies to meet carbon monitoring and reporting requirements for multiple users and over multiple scales of interest. NASA's approach emphasizes exploitation of the satellite remote sensing resources, computational capabilities, scientific knowledge, airborne science capabilities, and end-to-end system expertise that are major strengths of the NASA Earth Science program. Through user engagement activities, the NASA CMS project is taking specific actions to be responsive to the needs of stakeholders working to improve carbon MRV frameworks. The first phase of NASA CMS projects focused on developing products for U.S. biomass/carbon stocks and global carbon fluxes, and on scoping studies to identify stakeholders and explore other potential carbon products. The second phase built upon these initial efforts, with a large expansion in prototyping activities across a diversity of systems, scales, and regions, including research focused on prototype MRV systems and utilization of COTS technologies.
Priorities for the future include: 1) utilizing future satellite sensors, 2) prototyping with commercial off-the-shelf technology, 3) expanding the range of prototyping activities, 4) rigorous evaluation, uncertainty quantification, and error characterization, 5) stakeholder engagement, 6) partnerships with other U.S. agencies and international partners, and 7) modeling and data assimilation.

  1. Investigating the Quality of Project-Based Science and Technology Learning Environments in Elementary School: A Critical Review of Instruments

    ERIC Educational Resources Information Center

    Thys, Miranda; Verschaffel, Lieven; Van Dooren, Wim; Laevers, Ferre

    2016-01-01

    This paper provides a systematic review of instruments that have the potential to measure the quality of project-based science and technology (S&T) learning environments in elementary school. To this end, a comprehensive literature search was undertaken for the large field of S&T learning environments. We conducted a horizontal bottom-up…

  2. Geospatial considerations for a multiorganizational, landscape-scale program

    USGS Publications Warehouse

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  3. Overview of the SHIELDS Project at LANL

    NASA Astrophysics Data System (ADS)

    Jordanova, V.; Delzanno, G. L.; Henderson, M. G.; Godinez, H. C.; Jeffery, C. A.; Lawrence, E. C.; Meierbachtol, C.; Moulton, D.; Vernon, L.; Woodroffe, J. R.; Toth, G.; Welling, D. T.; Yu, Y.; Birn, J.; Thomsen, M. F.; Borovsky, J.; Denton, M.; Albert, J.; Horne, R. B.; Lemon, C. L.; Markidis, S.; Young, S. L.

    2015-12-01

    The near-Earth space environment is a highly dynamic system, coupled through a complex set of physical processes across a large range of scales, that responds nonlinearly to driving by the time-varying solar wind. Predicting variations in this environment that can affect technologies in space and on Earth, i.e. "space weather", remains a major challenge in space physics. We present a recently funded project through the Los Alamos National Laboratory (LANL) Directed Research and Development (LDRD) program that is developing a new capability to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms, the SHIELDS framework. The project goals are to specify the dynamics of the hot (keV) particles (the seed population for the radiation belts) on both macro- and micro-scales, including important physics of rapid particle injection and acceleration associated with magnetospheric storms/substorms and plasma waves. This challenging problem is addressed using a team of world-class experts in the fields of space science and computational plasma physics and state-of-the-art models and computational facilities. New data assimilation techniques employing data from LANL instruments on the Van Allen Probes and geosynchronous satellites are developed in addition to physics-based models. This research will provide a framework for understanding of key radiation belt drivers that may accelerate particles to relativistic energies and lead to spacecraft damage and failure. The ability to reliably distinguish between various modes of failure is critically important in anomaly resolution and forensics. SHIELDS will enhance our capability to accurately specify and predict the near-Earth space environment where operational satellites reside.

  4. Automated Topographic Change Detection via Dem Differencing at Large Scales Using The Arcticdem Database

    NASA Astrophysics Data System (ADS)

    Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.

    2016-12-01

    In the last decade, high-resolution satellite imagery has become an increasingly accessible tool for geoscientists to quantify changes in the Arctic land surface due to geophysical, ecological and anthropogenic processes. However, the trade-off between spatial coverage and spatio-temporal resolution has limited detailed, process-level change detection over large (i.e. continental) scales. The ArcticDEM project utilized over 300,000 Worldview image pairs to produce an elevation model with nearly 100% coverage above 60°N, offering the first polar, high-resolution (2-8 m, by region) dataset, often with multiple repeats in areas of particular interest to geoscientists. A dataset of this size (nearly 250 TB) offers endless new avenues of scientific inquiry, but quickly becomes unmanageable, computationally and logistically, for the computing resources available to the average scientist. Here we present TopoDiff, a framework for a generalized, automated workflow that requires minimal input from the end user about a study site, and utilizes cloud computing resources to provide a temporally sorted and differenced dataset, ready for geostatistical analysis. This hands-off approach allows the end user to focus on the science, without having to manage thousands of files or petabytes of data. At the same time, TopoDiff provides a consistent and accurate workflow for image sorting, selection, and co-registration, enabling cross-comparisons between research projects.
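    The temporal-sorting-and-differencing core of such a workflow can be sketched in a few lines. The DEM tiles and dates below are invented, and real ArcticDEM strips must be co-registered onto a common grid first; this illustrates only the bookkeeping, not TopoDiff itself.

```python
import numpy as np
from datetime import date

# Hypothetical stack of co-registered DEM tiles (same grid), keyed by
# acquisition date; NaN marks nodata pixels.
dems = {
    date(2014, 6, 1): np.array([[100.0, 101.0], [102.0, np.nan]]),
    date(2016, 6, 1): np.array([[ 99.0, 101.5], [103.0, 104.0]]),
    date(2015, 6, 1): np.array([[ 99.5, 101.2], [102.5, np.nan]]),
}

def difference_stack(dems):
    """Temporally sort DEMs and difference consecutive pairs.

    Returns (t0, t1, dz) tuples where dz = dem[t1] - dem[t0]; a pixel
    that is nodata in either epoch stays NaN in the difference.
    """
    dates = sorted(dems)
    return [(t0, t1, dems[t1] - dems[t0])
            for t0, t1 in zip(dates, dates[1:])]

for t0, t1, dz in difference_stack(dems):
    print(t0, "->", t1, "mean dz:", np.nanmean(dz))
```

    Elevation change maps produced this way are then ready for masking, filtering, and geostatistical analysis.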

  5. Mission and Objectives for the X-1 Advanced Radiation Source*

    NASA Astrophysics Data System (ADS)

    Rochau, Gary E.; Ramirez, Juan J.; Raglin, Paul S.

    1998-11-01

    Sandia National Laboratories PO Box 5800, MS-1178, Albuquerque, NM 87185 The X-1 Advanced Radiation Source represents a next step in providing the U.S. Department of Energy's Stockpile Stewardship Program with the high-energy, large volume, laboratory x-ray source for the Radiation Effects Science and Simulation, Inertial Confinement Fusion, and Weapon Physics Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator provide sufficient basis for pursuing the development of X-1. The X-1 plan follows a strategy based on scaling the 2 MJ x-ray output on Z via a 3-fold increase in z-pinch load current. The large volume (>5 cm3), high temperature (>150 eV), temporally long (>10 ns) hohlraums are unique outside of underground nuclear weapon testing. Analytical scaling arguments and hydrodynamic simulations indicate that these hohlraums at temperatures of 230-300 eV will ignite thermonuclear fuel and drive the reaction to a yield of 200 to 1,200 MJ in the laboratory. Non-ignition sources will provide cold x-ray environments (<15 keV) and high yield fusion burn sources will provide high fidelity warm x-ray environments (15 keV-80 keV). This paper will introduce the X-1 Advanced Radiation Source Facility Project, describe the project mission, objective, and preliminary schedule.

  6. Flat Plate Solar Array Project: Proceedings of the 20th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1982-01-01

    Progress made by the Flat-Plate Solar Array Project during the period November 1981 to April 1982 is reported. Project analysis and integration, technology research in silicon material, large-area silicon sheet and environmental isolation, cell and module formation, engineering sciences, and module performance and failure analysis are covered.

  7. QA Activities on Two Large RARE Projects at the US EPA, RTP, NC ─ from Fish to Humans

    EPA Science Inventory

    Two RARE (Regional Applied Research Effort) projects are being managed by Janet Diliberto, Linda Birnbaum, and Thomas Hughes. Janet is the Project Officer, Linda is the science advisor and Thomas is the QA and Records Manager for these two RARE projects. These are high visibili...

  8. Replacement of SSE with NASA's POWER Project GIS-enabled Web Data Portal

    Atmospheric Science Data Center

    2018-04-30

    Replacement of SSE (Release 6) with NASA's Prediction of Worldwide Energy Resource (POWER) Project GIS-enabled Web Data Portal, Friday, March ... 2018. The POWER Project is funded largely by the NASA Earth Applied Sciences program. The new POWER web portal ...

  9. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; /Fermilab; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  10. The 300 Area Integrated Field Research Challenge Quality Assurance Project Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, N. J.

    Pacific Northwest National Laboratory and a group of expert collaborators are using the U.S. Department of Energy Hanford Site 300 Area uranium plume within the footprint of the 300-FF-5 groundwater operable unit as a site for an Integrated Field-Scale Subsurface Research Challenge (IFRC). The IFRC is entitled Multi-Scale Mass Transfer Processes Controlling Natural Attenuation and Engineered Remediation: An IFRC Focused on the Hanford Site 300 Area Uranium Plume Project. The theme is investigation of multi-scale mass transfer processes. A series of forefront science questions on mass transfer are posed for research that relate to the effect of spatial heterogeneities; the importance of scale; coupled interactions between biogeochemical, hydrologic, and mass transfer processes; and measurements/approaches needed to characterize and model a mass transfer-dominated system. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the 300 Area IFRC Project. This plan is designed to be used exclusively by project staff.

  11. NASA: Assessments of Selected Large-Scale Projects

    DTIC Science & Technology

    2011-03-01

    Report documentation (SF 298) extract. Report date: March 2011. Title: Assessments of Selected Large-Scale Projects. Acronyms defined in the report include: ...Volatile EvolutioN; MEP, Mars Exploration Program; MIB, Mishap Investigation Board; MMRTG, Multi Mission Radioisotope Thermoelectric Generator; MMS, Magnetospheric... The projects assessed range from "...probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the earth, to telescopes intended to explore the..."

  12. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
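    The "template" idea, reusable patterns such as running tasks in sequence or fanning one task out over many inputs, can be illustrated with a minimal sketch. This mimics the concept only; the function names and signatures below are invented and are not the Tigres API.

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(data, *tasks):
    """Template: run tasks one after another, piping each output
    into the next task."""
    for task in tasks:
        data = task(data)
    return data

def parallel(items, task, workers=4):
    """Template: apply one task to many items concurrently."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, items))

# Toy pipeline: parse each record in parallel, then reduce in sequence.
records = [" 3 ", "1", " 2"]
cleaned = parallel(records, lambda s: int(s.strip()))
total = sequence(cleaned, sorted, sum)
print(total)  # 6
```

    The appeal of the pattern is that the same composition runs unchanged whether the executor behind `parallel` is a thread pool on a laptop or a batch scheduler at an HPC center.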

  13. Using Online Citizen Science to Assess Giant Kelp Abundances Across the Globe with Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Byrnes, J.; Cavanaugh, K. C.; Haupt, A. J.; Trouille, L.; Rosenthal, I.; Bell, T. W.; Rassweiler, A.; Pérez-Matus, A.; Assis, J.

    2017-12-01

    Global scale long-term data sets that document the patterns and variability of human impacts on marine ecosystems are rare. This lack is particularly glaring for underwater species, and even more so for ecologically important ones. Here we demonstrate how online citizen science combined with Landsat satellite imagery can help build a picture of change in the dynamics of giant kelp, an important coastal foundation species around the globe, from 1984 to the present. Giant kelp canopy is visible from Landsat images, but these images defy easy machine classification. To get useful data, images must be processed by hand. While academic researchers have applied this method successfully at sub-regional scales, unlocking the value of the full global dataset has not been possible, given the massive effort required. Here we present Floating Forests (http://floatingforests.org), an international collaboration between kelp forest researchers and the citizen science organization Zooniverse. Floating Forests provides an interface that allows citizen scientists to identify canopy cover of giant kelp on Landsat images, enabling us to scale up the dataset to the globe. We discuss lessons learned from the initial version of the project launched in 2014, a prototype of an image processing pipeline to bring Landsat imagery to citizen science platforms, methods of assessing the accuracy of citizen scientists, and preliminary data from our relaunch of the project. Through this project we have developed generalizable tools to facilitate citizen science-based analysis of Landsat and other satellite and aerial imagery. We hope that this will create a powerful dataset to unlock our understanding of how global change has altered these critically important species in the sea.
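    A common way to turn many volunteers' judgments into one label per image is majority voting with an agreement threshold. The sketch below is a generic illustration; the tile names, labels, and threshold are assumptions, not the Floating Forests data model or pipeline.

```python
from collections import Counter

# Hypothetical classifications: for each Landsat tile, several
# volunteers vote "kelp" or "no_kelp".
votes = {
    "tile_001": ["kelp", "kelp", "no_kelp", "kelp"],
    "tile_002": ["no_kelp", "no_kelp", "kelp"],
}

def consensus(labels, threshold=0.6):
    """Majority-vote label, or None when agreement falls below
    the threshold (i.e. the tile needs more classifications)."""
    label, n = Counter(labels).most_common(1)[0]
    return label if n / len(labels) >= threshold else None

for tile, labels in votes.items():
    print(tile, consensus(labels))
```

    Tiles that fail to reach consensus can be re-queued for more volunteers, which is one way projects trade classification effort against accuracy.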

  14. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoliang; Stauffer, Philip H.

    This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  15. Science initial teacher education and superdiversity: educating science teachers for a multi-religious and globalised science classroom

    NASA Astrophysics Data System (ADS)

    De Carvalho, Roussel

    2016-06-01

    Steven Vertovec (2006, 2007) has recently offered a re-interpretation of population diversity in large urban centres due to a considerable increase in immigration patterns in the UK. This complex scenario called superdiversity has been conceptualised to help illuminate significant interactions of variables such as religion, language, gender, age, nationality, labour market and population distribution on a larger scale. The interrelationships of these themes have fundamental implications in a variety of community environments, but especially within our schools. Today, London schools have over 300 languages being spoken by students, all of whom have diverse backgrounds, bringing with them a wealth of experience and, most critically, their own set of religious beliefs. At the same time, Science is a compulsory subject in England's national curriculum, where it requires teachers to deal with important scientific frameworks about the world; teaching about the origins of the universe, life on Earth, human evolution and other topics, which are often in conflict with students' religious views. In order to cope with this dynamic and thought-provoking environment, science initial teacher education (SITE)—especially those catering to large urban centres—must evolve to equip science teachers with a meaningful understanding of how to handle a superdiverse science classroom, taking the discourse of inclusion beyond its formal boundaries. Thus, this original position paper addresses how the role of SITE may be re-conceptualised and re-framed in light of the immense challenges of superdiversity as well as how science teachers, as enactors of the science curriculum, must adapt to cater to these changes. This is also the first in a series of papers emerging from an empirical research project trying to capture science teacher educators' own views on religio-scientific issues and their positions on the place of these issues within science teacher education and the science classroom.

  16. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  17. Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Beatty, Brenda; Hill, Graham

    2013-12-01

    Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.

  18. Managing a big ground-based astronomy project: the Thirty Meter Telescope (TMT) project

    NASA Astrophysics Data System (ADS)

    Sanders, Gary H.

    2008-07-01

    TMT is a big science project and its scale is greater than previous ground-based optical/infrared telescope projects. This paper will describe the ideal "linear" project and how the TMT project departs from that ideal. The paper will describe the needed adaptations to successfully manage real world complexities. The progression from science requirements to a reference design, the development of a product-oriented Work Breakdown Structure (WBS) and an organization that parallels the WBS, the implementation of system engineering, requirements definition and the progression through Conceptual Design to Preliminary Design will be summarized. The development of a detailed cost estimate structured by the WBS, and the methodology of risk analysis to estimate contingency fund requirements will be summarized. Designing the project schedule defines the construction plan and, together with the cost model, provides the basis for executing the project guided by an earned value performance measurement system.
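    The earned value performance measurement mentioned above rests on a handful of standard formulas relating planned value (PV), earned value (EV), and actual cost (AC). The sketch below uses hypothetical numbers, not TMT figures.

```python
def earned_value_metrics(pv, ev, ac):
    """Standard earned-value metrics from planned value (PV),
    earned value (EV), and actual cost (AC), all in the same units."""
    return {
        "SV": ev - pv,    # schedule variance (negative: behind plan)
        "CV": ev - ac,    # cost variance (negative: over budget)
        "SPI": ev / pv,   # schedule performance index (<1: behind)
        "CPI": ev / ac,   # cost performance index (<1: over budget)
    }

# Hypothetical status: $120M of work planned, $100M of work
# accomplished, $110M actually spent.
m = earned_value_metrics(pv=120.0, ev=100.0, ac=110.0)
print(m)
```

    An SPI and CPI both below 1, as here, flag a project that is simultaneously behind schedule and over cost, which is the kind of early signal such systems are meant to surface.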

  19. An Overview of Science Education in the Caribbean: Research, Policy and Practice.

    ERIC Educational Resources Information Center

    Sweeney, Aldrin E.

    2003-01-01

    Analyzes science education in the Caribbean and provides examples of science education policy and practice. Emphasizes large-scale national efforts in Barbados, Bermuda, and Jamaica. Discusses and provides recommendations for future directions in science education in these countries. (Contains 88 references.) (Author/NB)

  20. Collaborative Working for Large Digitisation Projects

    ERIC Educational Resources Information Center

    Yeates, Robin; Guy, Damon

    2006-01-01

    Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…

  1. Can Observation Skills of Citizen Scientists Be Estimated Using Species Accumulation Curves?

    PubMed

    Kelling, Steve; Johnston, Alison; Hochachka, Wesley M; Iliff, Marshall; Fink, Daniel; Gerbracht, Jeff; Lagoze, Carl; La Sorte, Frank A; Moore, Travis; Wiggins, Andrea; Wong, Weng-Keen; Wood, Chris; Yu, Jun

    2015-01-01

    Volunteers are increasingly being recruited into citizen science projects to collect observations for scientific studies. An additional goal of these projects is to engage and educate these volunteers. Thus, there are few barriers to participation, resulting in volunteer observers with varying ability to complete the project's tasks. To improve the quality of a citizen science project's outcomes it would be useful to account for inter-observer variation, and to assess the rarely tested presumption that participating in a citizen science project results in volunteers becoming better observers. Here we present a method for indexing observer variability based on the data routinely submitted by observers participating in the citizen science project eBird, a broad-scale monitoring project in which observers collect and submit lists of the bird species observed while birding. Our method for indexing observer variability uses species accumulation curves, lines that describe how the total number of species reported increases with increasing time spent collecting observations. We find that differences in species accumulation curves among observers equate to differences in rates of species accumulation, particularly for harder-to-identify species, and reveal increased species accumulation rates with continued participation. We suggest that these properties of our analysis provide a measure of observer skill, and that the potential to derive post-hoc, data-derived measurements of participant ability should be more widely explored by analysts of data from citizen science projects. We see the potential for inferential results from analyses of citizen science data to be improved by accounting for observer skill.
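    A species accumulation curve of the kind described can be computed directly from a sequence of checklists: it is the running count of distinct species seen so far. The observers and species below are invented; the real method additionally models effort (time spent birding) rather than checklist count.

```python
def accumulation_curve(checklists):
    """Cumulative number of distinct species after each checklist."""
    seen, curve = set(), []
    for checklist in checklists:
        seen.update(checklist)
        curve.append(len(seen))
    return curve

# Two hypothetical observers submitting three checklists each.
novice = [{"robin"}, {"robin", "jay"}, {"jay"}]
expert = [{"robin", "jay"}, {"vireo", "wren"}, {"kinglet"}]

print(accumulation_curve(novice))  # [1, 2, 2]
print(accumulation_curve(expert))  # [2, 4, 5]
```

    The steeper curve of the second observer is the signal the authors propose as a post-hoc index of observer skill.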

  2. User's Guide for MapIMG 2: Map Image Re-projection Software Package

    USGS Publications Warehouse

    Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.

    2006-01-01

    BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
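    The adaptation described above, that point-to-point projection formulas alone are not sufficient for raster data, can be illustrated with an inverse-mapping resampler: each output pixel is mapped back to source coordinates and filled from the containing input pixel. The toy "projection" (a 2x scaling) and grid below are illustrative only, not MapIMG's implementation.

```python
import numpy as np

def reproject_nearest(src, out_shape, inverse):
    """Fill an output grid by inverse mapping.

    inverse(row, col) -> (src_row, src_col) in source pixel
    coordinates; pixels mapping outside the source stay NaN (nodata).
    """
    dst = np.full(out_shape, np.nan)
    rows, cols = src.shape
    for r in range(out_shape[0]):
        for c in range(out_shape[1]):
            sr, sc = inverse(r, c)
            ir, ic = int(np.floor(sr)), int(np.floor(sc))
            if 0 <= ir < rows and 0 <= ic < cols:
                dst[r, c] = src[ir, ic]
    return dst

src = np.arange(4.0).reshape(2, 2)  # [[0, 1], [2, 3]]
# Toy inverse transform: output is the source upsampled 2x.
out = reproject_nearest(src, (4, 4), lambda r, c: (r / 2, c / 2))
print(out)
```

    Working backwards from output pixels guarantees every destination cell gets a defined value (or explicit nodata), which forward-projecting source points cannot, and nearest-neighbor sampling preserves categorical values such as land-cover classes.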

  3. Moderate point: Balanced entropy and enthalpy contributions in soft matter

    NASA Astrophysics Data System (ADS)

    He, Baoji; Wang, Yanting

    2017-03-01

    Various soft materials share some common features, such as significant entropic effect, large fluctuations, sensitivity to thermodynamic conditions, and mesoscopic characteristic spatial and temporal scales. However, no quantitative definitions have yet been provided for soft matter, and the intrinsic mechanisms leading to their common features are unclear. In this work, from the viewpoint of statistical mechanics, we show that soft matter works in the vicinity of a specific thermodynamic state named moderate point, at which entropy and enthalpy contributions among substates along a certain order parameter are well balanced or have a minimal difference. Around the moderate point, the order parameter fluctuation, the associated response function, and the spatial correlation length maximize, which explains the large fluctuation, the sensitivity to thermodynamic conditions, and mesoscopic spatial and temporal scales of soft matter, respectively. Possible applications to switching chemical bonds or allosteric biomachines determining their best working temperatures are also briefly discussed. Project supported by the National Basic Research Program of China (Grant No. 2013CB932804) and the National Natural Science Foundation of China (Grant Nos. 11274319 and 11421063).
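    The balance condition described above can be stated schematically for a two-substate system (a sketch in generic notation, not the authors' formalism): at the moderate point the free-energy difference between substates vanishes,

```latex
\Delta G(T) = \Delta H - T\,\Delta S, \qquad
\Delta G(T^{*}) \approx 0 \;\Longrightarrow\; T^{*} \approx \frac{\Delta H}{\Delta S}.
```

    At $T^{*}$ the substate populations are comparable, so the order-parameter fluctuation, and with it the associated response function, is maximal, consistent with the large fluctuations and thermodynamic sensitivity the abstract attributes to soft matter.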

  4. Computational biomedicine: a challenge for the twenty-first century.

    PubMed

    Coveney, Peter V; Shublaq, Nour W

    2012-01-01

    With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge for computational science insofar as we need to provide seamless yet secure access to large-scale heterogeneous personal healthcare data, typically integrated into complex workflows, some parts of which may need to run on high-performance computers, and delivered in a facile way within clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high performance networks.

  5. Making the Case: Workforce, Education, Public Outreach and Communications as Mission-Critical Activities

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Brewer, Janesse; Dawson, Sandra; Program Organizing Committee "Making the Case" workshop 2017

    2018-01-01

    Increasingly, next-generation science projects will never see first light, or will lose their “right to operate”, if they are unable to be responsive to emerging societal values and interests. Science projects with a robust and professional Workforce, Education, Public Outreach and Communications (WEPOC) architecture are able to engage and welcome public discourse about science, trade-offs, and what it means to be a good neighbor in a community. In this talk I will give an update on the latest WEPOC efforts for TMT & NASA projects at Caltech/IPAC, and highlight how WEPOC has entered the critical path for many large, international science projects. I will also present a draft working document being developed by many of the world's largest astronomy and high-energy physics WEPOC leaders as an outcome from a "Making the Case" conference held at Caltech in spring 2017.

  6. The Planet Formation Imager (PFI) Project

    NASA Astrophysics Data System (ADS)

    Aarnio, Alicia; Monnier, John; Kraus, Stefan; Ireland, Michael

    2016-07-01

    Among the most fascinating and hotly-debated areas in contemporary astrophysics are the means by which planetary systems are assembled from the large rotating disks of gas and dust which attend a stellar birth. Although important work is being done both in theory and observation, a full understanding of the physics of planet formation can only be achieved by opening observational windows able to directly witness the process in action. The key requirement is then to probe planet-forming systems at the natural spatial scales over which material is being assembled. By definition, this is the so-called Hill Sphere, which delineates the region of influence of a gravitating body within its surrounding environment. The Planet Formation Imager project has crystallized around this challenging goal: to deliver resolved images of Hill-Sphere-sized structures within candidate planet-hosting disks in the nearest star-forming regions. In this contribution I outline the primary science case of PFI, give an overview of the work of the PFI science and technical working groups, and present radiation-hydrodynamics simulations from which we derive preliminary specifications that guide the design of the facility. Finally, I give an overview of the technologies that we are investigating in order to meet the specifications.

  7. Pathways to policy: Lessons learned in multisectoral collaboration for physical activity and built environment policy development from the Coalitions Linking Action and Science for Prevention (CLASP) initiative.

    PubMed

    Politis, Christopher E; Mowat, David L; Keen, Deb

    2017-06-16

    The Canadian Partnership Against Cancer funded 12 large-scale knowledge to action cancer and chronic disease prevention projects between 2009 and 2016 through the Coalitions Linking Action and Science for Prevention (CLASP) initiative. Two projects, Healthy Canada by Design (HCBD) and Children's Mobility, Health and Happiness (CMHH), developed policies to address physical activity and the built environment through a multisectoral approach. A qualitative analysis involving a review of 183 knowledge products and 8 key informant interviews was conducted to understand what policy changes occurred, and the underlying critical success factors, through these projects. Both projects worked at the local level to change physical activity and built environment policy in 203 sites, including municipalities and schools. Both projects brought multisectoral expertise (e.g., public health, land use planning, transportation engineering, education, etc.) together to inform the development of local healthy public policy in the areas of land use, transportation and school travel planning. Through the qualitative analysis of the knowledge products and key informant interviews, 163 policies were attributed to HCBD and CMHH work. Fourteen "pathways to policy" were identified as critical success factors facilitating and accelerating the development and implementation of physical activity and built environment policy. Of the 14 pathways to policy, 8 had a focus on multisectoral collaboration. The lessons learned from the CLASP experience could support enhanced multisectoral collaborations to accelerate the development and implementation of physical activity and built environment policy in new jurisdictions across Canada and internationally.

  8. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2009-09-30

    Modeling of Burning Emissions (FLAMBE) project, and other related parameters. Our plans to embed NAAPS inside NOGAPS may need to be put on hold... AOD, FLAMBE and FAROP at FNMOC are supported by 6.4 funding from PMW-120 for “Large-scale Atmospheric Models”, “Small-scale Atmospheric Models

  9. Resolving the Milky Way and Nearby Galaxies with WFIRST

    NASA Astrophysics Data System (ADS)

    Kalirai, Jasonjot

    High-resolution studies of nearby stellar populations have served as a foundation for our quest to understand the nature of galaxies. Today, studies of resolved stellar populations constrain fundamental relations -- such as the initial mass function of stars, the time scales of stellar evolution, the timing of mass loss and amount of energetic feedback, the color-magnitude relation and its dependency on age and metallicity, the stellar-dark matter connection in galaxy halos, and the build up of stellar populations over cosmic time -- that represent key ingredients in our prescription to interpret light from the Universe and to measure the physical state of galaxies. More than in any other area of astrophysics, WFIRST will yield a transformative impact in measuring and characterizing resolved stellar populations in the Milky Way and nearby galaxies. The proximity of these populations, and the level of detail at which they must be studied, map directly onto all three pillars of WFIRST's capabilities: sensitivity from a 2.4-meter space-based telescope, resolution from 0.1" pixels, and a large 0.3-degree field of view from multiple detectors. Our WFIRST GO Science Investigation Team (F) will develop three WFIRST (notional) GO programs related to resolved stellar populations to fully stress WFIRST's Wide Field Instrument. The programs will include a Survey of the Milky Way, a Survey of Nearby Galaxy Halos, and a Survey of Star-Forming Galaxies. Specific science goals for each program will be validated through a wide range of observational data sets, simulations, and new algorithms. As an output of this study, our team will deliver optimized strategies and tools to maximize stellar population science with WFIRST.
This will include: new grids of IR-optimized stellar evolution and synthetic spectroscopic models; pipelines and algorithms for optimal data reduction at the WFIRST sensitivity and pixel scale; wide field simulations of MW environments and galaxy halos; cosmological simulations of nearby galaxy halos matched to WFIRST observations; strategies and automated algorithms to find substructure and dwarf galaxies in WFIRST IR data sets; and documentation. Our team will work closely with the WFIRST Science Center to translate our notional programs into inputs that can help achieve readiness for WFIRST science operations. This includes building full observing programs with target definitions, observing sequences, scheduling constraints, data processing needs, and calibration requirements. Our team has been chosen carefully. Team members are leading scientists in stellar population work that will be a core science theme for WFIRST and are also involved in all large future astronomy projects that will operate in the WFIRST era. The team is intentionally small, and each member will "own" significant science projects. The team will aggressively advocate for WFIRST through innovative initiatives. The team is also diverse in geographical location, observers and theorists, and gender.

  10. NASA Land Cover and Land Use Change (LCLUC): an interdisciplinary research program.

    PubMed

    Justice, Chris; Gutman, Garik; Vadrevu, Krishna Prasad

    2015-01-15

    Understanding Land Cover/Land Use Change (LCLUC) in diverse regions of the world and at varied spatial scales is one of the important challenges in global change research. In this article, we provide a brief overview of the NASA LCLUC program, its focus areas, and the importance of satellite remote sensing observations in LCLUC research including future directions. The LCLUC Program was designed to be a cross-cutting theme within NASA's Earth Science program. The program aims to develop and use remote sensing technologies to improve understanding of human interactions with the environment. Since 1997, the NASA LCLUC program has supported nearly 280 research projects on diverse topics such as forest loss and carbon, urban expansion, land abandonment, wetland loss, agricultural land use change and land use change in mountain systems. The NASA LCLUC program emphasizes studies where land-use changes are rapid or where there are significant regional or global LCLUC implications. Over a period of years, the LCLUC program has contributed to large regional science programs such as Land Biosphere-Atmosphere (LBA), the Northern Eurasia Earth Science Partnership Initiative (NEESPI), and the Monsoon Area Integrated Regional Study (MAIRS). The primary emphasis of the program will remain on using remote sensing datasets for LCLUC research. The program will continue to emphasize integration of physical and social sciences to address regional to global scale issues of LCLUC for the benefit of society. Copyright © 2014. Published by Elsevier Ltd.

  11. Constructivism in Practice: an Exploratory Study of Teaching Patterns and Student Motivation in Physics Classrooms in Finland, Germany and Switzerland

    NASA Astrophysics Data System (ADS)

    Beerenwinkel, Anne; von Arx, Matthias

    2017-04-01

    For the last three decades, moderate constructivism has become an increasingly prominent perspective in science education. Researchers have defined characteristics of constructivist-oriented science classrooms, but the implementation of such science teaching in daily classroom practice seems difficult. Against this background, we conducted a sub-study within the tri-national research project Quality of Instruction in Physics (QuIP) analysing 60 videotaped physics classes involving a large sample of students (N = 1192) from Finland, Germany and Switzerland in order to investigate the kinds of constructivist components and teaching patterns that can be found in regular classrooms without any intervention. We applied a newly developed coding scheme to capture constructivist facets of science teaching and conducted principal component and cluster analyses to explore which components and patterns were most prominent in the classes observed. Two underlying components were found, resulting in two scales—Structured Knowledge Acquisition and Fostering Autonomy—which describe key aspects of constructivist teaching. Only the first scale was rather well established in the lessons investigated. Classes were clustered based on these scales. The analysis of the different clusters suggested that teaching physics in a structured way combined with fostering students' autonomy contributes to students' motivation. However, our regression models indicated that content knowledge is a more important predictor for students' motivation, and there was no homogeneous pattern for all gender- and country-specific subgroups investigated. The results are discussed in light of recent discussions on the feasibility of constructivism in practice.

  12. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
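    The "reduce then sample" idea can be illustrated with a toy Metropolis sampler run against a cheap surrogate in place of an expensive forward model. Everything below (the models, prior, noise level, and step scale) is an illustrative stand-in, not SAGUARO's algorithms:

```python
import math
import random

def metropolis(log_post, x0, steps=5000, scale=0.5, seed=0):
    """Random-walk Metropolis sampler for a 1-D posterior (generic sketch)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        xp = x + rng.gauss(0, scale)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:  # Metropolis accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return samples

# "Reduce then sample": an expensive forward model G(x) is replaced by a
# cheap reduced model G_r(x) built offline; MCMC then queries only G_r.
def G(x):    return math.sin(x) + 0.1 * x        # "expensive" full model (toy)
def G_r(x):  return x - x**3 / 6 + 0.1 * x      # cheap reduced model (toy Taylor fit)

y_obs, noise = 0.5, 0.1
log_post = lambda x: -0.5 * ((G_r(x) - y_obs) / noise) ** 2 - 0.5 * x**2
chain = metropolis(log_post, 0.0)
est = sum(chain) / len(chain)  # posterior mean under the reduced model
```

    As the abstract notes, the payoff is that every posterior evaluation costs a surrogate call instead of a full forward simulation, at the price of whatever bias the reduced model introduces.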

  13. The Nobel Prize as a Reward Mechanism in the Genomics Era: Anonymous Researchers, Visible Managers and the Ethics of Excellence

    PubMed Central

    2010-01-01

    The Human Genome Project (HGP) is regarded by many as one of the major scientific achievements in recent science history, a large-scale endeavour that is changing the way in which biomedical research is done and expected, moreover, to yield considerable benefit for society. Thus, since the completion of the human genome sequencing effort, a debate has emerged over the question whether this effort merits a Nobel Prize and if so, who should be the one(s) to receive it, as (according to current procedures) no more than three individuals can be selected. In this article, the HGP is taken as a case study to consider the ethical question to what extent it is still possible, in an era of big science, of large-scale consortia and global team work, to acknowledge and reward individual contributions to important breakthroughs in biomedical fields. Is it still viable to single out individuals for their decisive contributions in order to reward them in a fair and convincing way? Whereas the concept of the Nobel Prize as such seems to reflect an archetypical view of scientists as solitary researchers who, at a certain point in their careers, make their one decisive discovery, this vision has proven to be problematic from the very outset. Already during the first decade of the Nobel era, Ivan Pavlov was denied the Prize several times before finally receiving it, on the basis of the argument that he had been active as a research manager (a designer and supervisor of research projects) rather than as a researcher himself. The question then is whether, in the case of the HGP, a research effort that involved the contributions of hundreds or even thousands of researchers worldwide, it is still possible to “individualise” the Prize? The “HGP Nobel Prize problem” is regarded as an exemplary issue in current research ethics, highlighting a number of quandaries and trends involved in contemporary life science research practices more broadly. PMID:20730106

  14. Development of a large-scale transportation optimization course.

    DOT National Transportation Integrated Search

    2011-11-01

    "In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what : similar courses exist in other universities, and the methodology used to gath...

  15. Abraham Pais Prize for History of Physics Lecture: Big, Bigger, Too Big? From Los Alamos to Fermilab and the SSC

    NASA Astrophysics Data System (ADS)

    Hoddeson, Lillian

    2012-03-01

    The modern era of big science emerged during World War II. Oppenheimer's Los Alamos laboratory offered the quintessential model of a government-funded, mission-oriented facility directed by a strong charismatic leader. The postwar beneficiaries of this model included the increasingly ambitious large laboratories that participated in particle physics--in particular, Brookhaven, SLAC, and Fermilab. They carried the big science they practiced into a new realm where experiments eventually became as large and costly as entire laboratories had been. Meanwhile the available funding grew more limited causing the physics research to be concentrated into fewer and bigger experiments that appeared never to end. The next phase in American high-energy physics was the Superconducting Super Collider, the most costly pure physics project ever attempted. The SSC's termination was a tragedy for American science, but for historians it offers an opportunity to understand what made the success of earlier large high-energy physics laboratories possible, and what made the continuation of the SSC impossible. The most obvious reason for the SSC's failure was its enormous and escalating budget, which Congress would no longer support. Other factors need to be recognized however: no leader could be found with directing skills as strong as those of Wilson, Panofsky, Lederman, or Richter; the scale of the project subjected it to uncomfortable public and Congressional scrutiny; and the DOE's enforcement of management procedures of the military-industrial complex that clashed with those typical of the scientific community led to the alienation and withdrawal of many of the most creative scientists, and to the perception and the reality of poor management. These factors, exacerbated by negative pressure from scientists in other fields and a post-Cold War climate in which physicists had little of their earlier cultural prestige, discouraged efforts to gain international support. 
They made the SSC crucially different from its predecessors and sealed its doom.

  16. A 10-year ecosystem restoration community of practice tracks large-scale restoration trends

    EPA Science Inventory

    In 2004, a group of large-scale ecosystem restoration practitioners across the United States convened to start the process of sharing restoration science, management, and best practices under the auspices of a traditional conference umbrella. This forum allowed scientists and dec...

  17. Radio Continuum Surveys with Square Kilometre Array Pathfinders

    NASA Astrophysics Data System (ADS)

    Norris, Ray P.; Afonso, J.; Bacon, D.; Beck, Rainer; Bell, Martin; Beswick, R. J.; Best, Philip; Bhatnagar, Sanjay; Bonafede, Annalisa; Brunetti, Gianfranco; Budavári, Tamás; Cassano, Rossella; Condon, J. J.; Cress, Catherine; Dabbech, Arwa; Feain, I.; Fender, Rob; Ferrari, Chiara; Gaensler, B. M.; Giovannini, G.; Haverkorn, Marijke; Heald, George; Van der Heyden, Kurt; Hopkins, A. M.; Jarvis, M.; Johnston-Hollitt, Melanie; Kothes, Roland; Van Langevelde, Huib; Lazio, Joseph; Mao, Minnie Y.; Martínez-Sansigre, Alejo; Mary, David; Mcalpine, Kim; Middelberg, E.; Murphy, Eric; Padovani, P.; Paragi, Zsolt; Prandoni, I.; Raccanelli, A.; Rigby, Emma; Roseboom, I. G.; Röttgering, H.; Sabater, Jose; Salvato, Mara; Scaife, Anna M. M.; Schilizzi, Richard; Seymour, N.; Smith, Dan J. B.; Umana, Grazia; Zhao, G.-B.; Zinn, Peter-Christian

    2013-03-01

    In the lead-up to the Square Kilometre Array (SKA) project, several next-generation radio telescopes and upgrades are already being built around the world. These include APERTIF (The Netherlands), ASKAP (Australia), e-MERLIN (UK), VLA (USA), e-EVN (based in Europe), LOFAR (The Netherlands), MeerKAT (South Africa), and the Murchison Widefield Array. Each of these new instruments has different strengths, and coordination of surveys between them can help maximise the science from each of them. A radio continuum survey is being planned on each of them with the primary science objective of understanding the formation and evolution of galaxies over cosmic time, and the cosmological parameters and large-scale structures which drive it. In pursuit of this objective, the different teams are developing a variety of new techniques, and refining existing ones. To achieve these exciting scientific goals, many technical challenges must be addressed by the survey instruments. Given the limited resources of the global radio-astronomical community, it is essential that we pool our skills and knowledge. We do not have sufficient resources to enjoy the luxury of re-inventing wheels. We face significant challenges in calibration, imaging, source extraction and measurement, classification and cross-identification, redshift determination, stacking, and data-intensive research. As these instruments extend the observational parameters, we will face further unexpected challenges in calibration, imaging, and interpretation. If we are to realise the full scientific potential of these expensive instruments, it is essential that we devote enough resources and careful study to understanding the instrumental effects and how they will affect the data. We have established an SKA Radio Continuum Survey working group, whose prime role is to maximise science from these instruments by ensuring we share resources and expertise across the projects. 
Here we describe these projects, their science goals, and the technical challenges which are being addressed to maximise the science return.

  18. Meteor Observations as Big Data Citizen Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.

    2016-12-01

    Meteor science represents an excellent example of a citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increase in interest in meteor science triggered by the Chelyabinsk fireball helps build the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, Dave; Garzoglio, Gabriele; Kim, Hyunwoo

    As of 2012, a number of US Department of Energy (DOE) National Laboratories have access to a 100 Gb/s wide-area network backbone. The ESnet Advanced Networking Initiative (ANI) project is intended to develop a prototype network, based on emerging 100 Gb/s Ethernet technology. The ANI network will support DOE's science research programs. A 100 Gb/s network test bed is a key component of the ANI project. The test bed offers the opportunity for early evaluation of 100 Gb/s network infrastructure for supporting the high-impact data movement typical of science collaborations and experiments. In order to make effective use of this advanced infrastructure, the applications and middleware currently used by the distributed computing systems of large-scale science need to be adapted and tested within the new environment, with gaps in functionality identified and corrected. As a user of the ANI test bed, Fermilab aims to study the issues related to end-to-end integration and use of 100 Gb/s networks for the event simulation and analysis applications of physics experiments. In this paper we discuss our findings from evaluating existing HEP physics middleware and application components, including GridFTP, Globus Online, etc., in the high-speed environment. These findings include recommendations to system administrators and to application and middleware developers on the changes needed for production use of 100 Gb/s networks, including data storage, caching and wide-area access.
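    One concrete reason applications need adaptation at 100 Gb/s is the bandwidth-delay product, which sets how much unacknowledged data a single flow must keep in flight; a back-of-the-envelope sketch (the RTT figure is illustrative, not from the paper):

```python
def bandwidth_delay_product(bits_per_second, rtt_seconds):
    """Bytes in flight needed to keep a path full: BDP = bandwidth x RTT."""
    return bits_per_second * rtt_seconds / 8

# A 100 Gb/s path with an assumed 50 ms cross-country RTT:
bdp = bandwidth_delay_product(100e9, 0.050)
print(f"{bdp / 1e9:.3f} GB of buffering per flow")  # 0.625 GB
```

    Buffers and TCP windows sized for 1-10 Gb/s paths fall short of this by one to two orders of magnitude, which is why middleware such as GridFTP parallelizes transfers across many streams.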

  20. The effects of topic choice in project-based instruction on undergraduate physical science students' interest, ownership, and motivation

    NASA Astrophysics Data System (ADS)

    Milner-Bolotin, Marina

    2001-07-01

    Motivating nonscience majors in science and mathematics studies has become one of the most interesting and important challenges in contemporary science and mathematics education. Therefore, designing and studying a learning environment which enhances students' motivation is an important task. This experimental study sought to explore the implications of student autonomy in topic choice in a project-based Physical Science Course for nonscience majors on students' motivational orientation. It also suggested and tested a model explaining the motivational outcomes of a project-based learning environment through increased student ownership of science projects. A project, How Things Work, was designed and implemented in this study. The focus of the project was the application of physical science concepts learned in the classroom to everyday life situations. Participants of the study (N = 59) were students enrolled in three selected sections of a Physical Science Course designed to fulfill science requirements for nonscience majors. These sections were taught by the same instructor over an entire 16-week semester at a large public research university. The study focused on four main variables: student autonomy in choosing a project topic, their motivational orientation, student ownership of the project, and interest in the project topic. Achievement Goal Orientation theory served as the theoretical framework for the study. Student motivational orientation, defined as mastery or performance goal orientation, was measured by an Achievement Goal Orientation Questionnaire. Student ownership was measured using an original instrument, the Ownership Measurement Questionnaire, designed and tested by the researchers. A repeated-measures yoked design, ANOVA, ANCOVA, and multivariate regression analysis were implemented in the study. Qualitative analysis was used to complement and verify the quantitative results.
It has been found that student autonomy in the project choice did not make a significant impact on their motivational orientation, while their initial interest in the project topic did. The latter was found to be related to students' ownership of the project, which was found to lead to improved mastery goal orientation. These findings indicate that incorporating project-based learning in science teaching may lead to increased student mastery goal orientation, and may result in improved science learning.

  1. Ordering Unstructured Meshes for Sparse Matrix Computations on Leading Parallel Systems

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Li, Xiaoye; Heber, Gerd; Biswas, Rupak

    2000-01-01

    The ability of computers to solve hitherto intractable problems and simulate complex processes using mathematical models makes them an indispensable part of modern science and engineering. Computer simulations of large-scale realistic applications usually require solving a set of non-linear partial differential equations (PDEs) over a finite region. For example, one thrust area in the DOE Grand Challenge projects is to design future accelerators such as the Spallation Neutron Source (SNS). Our colleagues at SLAC need to model complex RFQ cavities with large aspect ratios. Unstructured grids are currently used to resolve the small features in a large computational domain; dynamic mesh adaptation will be added in the future for additional efficiency. The PDEs for electromagnetics are discretized by the finite-element method (FEM), which leads to a generalized eigenvalue problem Kx = λMx, where K and M are the stiffness and mass matrices, and are very sparse. In a typical cavity model, the number of degrees of freedom is about one million. For such large eigenproblems, direct solution techniques quickly reach the memory limits. Instead, the most widely used methods are Krylov subspace methods, such as Lanczos or Jacobi-Davidson. In all the Krylov-based algorithms, sparse matrix-vector multiplication (SPMV) must be performed repeatedly. Therefore, the efficiency of SPMV usually determines the eigensolver speed. SPMV is also one of the most heavily used kernels in large-scale numerical simulations.
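    A textbook CSR (compressed sparse row) kernel illustrates the SPMV operation that dominates these Krylov solvers; this is a generic sketch, not the SLAC code:

```python
def spmv_csr(values, col_idx, row_ptr, x):
    """Sparse matrix-vector product y = A @ x, with A stored in CSR form.

    CSR keeps only the nonzeros: `values[k]` is the k-th nonzero,
    `col_idx[k]` its column, and `row_ptr[i]:row_ptr[i+1]` delimits the
    nonzeros of row i. Each nonzero is touched exactly once.
    """
    n = len(row_ptr) - 1
    y = [0.0] * n
    for i in range(n):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# A = [[4, 0, 1],
#      [0, 2, 0],
#      [1, 0, 3]]  stored as CSR:
values, col_idx, row_ptr = [4, 1, 2, 1, 3], [0, 2, 1, 0, 2], [0, 2, 3, 5]
print(spmv_csr(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [5.0, 2.0, 4.0]
```

    The irregular, indirect access through `col_idx` is what makes SPMV memory-bound and why mesh ordering (the subject of this record) matters for performance.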

  2. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point to a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed to complete the picture. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
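    The hardness that makes certification necessary comes from the fact that boson sampling output probabilities are given by matrix permanents, whose exact evaluation scales exponentially. A minimal Ryser-formula sketch illustrates the cost (this is background illustration, not part of the proposed certification protocol):

    ```python
    from itertools import combinations

    def permanent(A):
        """Matrix permanent via Ryser's formula, O(2^n * n) row-sum products."""
        n = len(A)
        total = 0.0
        for k in range(1, n + 1):
            for cols in combinations(range(n), k):
                # Product over rows of the row-sum restricted to columns in S.
                prod = 1.0
                for row in A:
                    prod *= sum(row[c] for c in cols)
                total += (-1) ** k * prod
        return (-1) ** n * total

    A = [[1, 2], [3, 4]]
    print(permanent(A))  # 1*4 + 2*3 = 10
    ```

    Even this best-known exact algorithm doubles in cost with each added photon, which is why 20-30 photon experiments are expected to outrun classical simulation and why indirect fingerprints such as mode correlations are attractive for certification.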

  3. The Integrated Marine Postdoc Network: Support for postdoctoral researchers in marine sciences in Kiel, Germany

    NASA Astrophysics Data System (ADS)

    Braker, Gesche; Schelten, Christiane K.

    2016-04-01

    Despite the important role postdoctoral researchers play in the German academic system, their status is largely undefined: Being challenged by a multitude of tasks, their employment situation is often characterized by short term contracts and a well-defined and articulated academic career path is lacking. Moreover, their employment situation becomes increasingly insecure as the time post Ph.D. increases unless they manage to shift into a tenured professorship or into similar opportunities in the non-academic employment sector. All this results in insecurity in terms of career perspectives. The support of postdoctoral researchers through the 'Integrated Marine Postdoc Network (IMAP)' has been identified as one of the strategic goals of the Cluster of Excellence 'The Future Ocean' in Kiel, Germany, a large collaborative research project funded through the German Excellence Initiative. To improve the situation of researchers post Ph.D., IMAP has identified three main actions: Building a vibrant community of postdoctoral researchers, engaging in a strategic dialogue on structural changes within the academic system in Germany with special emphasis on more predictable career paths below the professorship level and enhancing the competitiveness of postdoctoral researchers in marine sciences in Kiel through tailored schemes for career support. Since 2012 IMAP has developed into a vibrant network of researchers post Ph.D. who engage in the diverse disciplines of marine sciences in Kiel - in natural, social and medical sciences, computing, economics, and law. With currently more than 90 members working at one of the partner institutions of the Cluster in Kiel - Kiel University, GEOMAR Helmholtz Centre for Ocean Research, and the Institute for the World Economy - the network hosts broad scientific expertise in integrated ocean research. 
It is professionally coordinated and operates at the interface between the partner institutions and large scale collaborative research projects, e.g. the SFB 754 'Climate-biogeochemical interactions in the tropical ocean' in Kiel, to provide a structure for complementary support of researchers post Ph.D. (This contribution is linked to the presentation by Dr. Christiane K. Schelten.)

  4. The Value of Methodical Management: Optimizing Science Results

    NASA Astrophysics Data System (ADS)

    Saby, Linnea

    2016-01-01

    As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.

  5. Lessons from a Large-Scale Assessment: Results from Conceptual Inventories

    ERIC Educational Resources Information Center

    Thacker, Beth; Dulli, Hani; Pattillo, Dave; West, Keith

    2014-01-01

    We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) (physics education research-informed materials) into a department where most instruction has previously been traditional and a significant…

  6. What’s Needed from Climate Modeling to Advance Actionable Science for Water Utilities?

    NASA Astrophysics Data System (ADS)

    Barsugli, J. J.; Anderson, C. J.; Smith, J. B.; Vogel, J. M.

    2009-12-01

    “…perfect information on climate change is neither available today nor likely to be available in the future, but … over time, as the threats climate change poses to our systems grow more real, predicting those effects with greater certainty is non-discretionary. We’re not yet at a level at which climate change projections can drive climate change adaptation.” (Testimony of WUCA Staff Chair David Behar to the House Committee on Science and Technology, May 5, 2009) To respond to this challenge, the Water Utility Climate Alliance (WUCA) has sponsored a white paper titled “Options for Improving Climate Modeling to Assist Water Utility Planning for Climate Change.” This report concerns how investments in the science of climate change, and in particular climate modeling and downscaling, can best be directed to make climate projections more actionable. “Model improvement” can mean very different things to a climate model developer and to a water manager trying to incorporate climate projections into planning. We first surveyed the WUCA members on present and potential uses of climate model projections and on climate inputs to their various system models. Based on those surveys and on subsequent discussions, we identified four dimensions along which improvement in modeling would make the science more “actionable”: improved model agreement on change in key parameters; narrowing the range of model projections; providing projections at spatial and temporal scales that match water utilities’ system models; and providing projections that match water utility planning horizons. With these goals in mind we developed four options for improving global-scale climate modeling and three options for improving downscaling that will be discussed. 
However, there does not seem to be a single investment - the proverbial “magic bullet” - that will substantially reduce the range of model projections at the scales at which utility planning is conducted. In the near term we feel strongly that water utilities and climate scientists should work together to leverage the upcoming Coupled Model Intercomparison Project, Phase 5 (CMIP5; a coordinated set of climate model experiments that will support the upcoming IPCC Fifth Assessment) to better benefit water utilities. In the longer term, even with model and downscaling improvements, it is very likely that substantial uncertainty about future climate change at the desired spatial and temporal scales will remain. Nonetheless, there is no doubt the climate is changing, and the challenge is to work with what we have, or what we can reasonably expect to have in the coming years, to make the best decisions we can.

  7. Accurate double many-body expansion potential energy surface of HS2(A2A′) by scaling the external correlation

    NASA Astrophysics Data System (ADS)

    Lu-Lu, Zhang; Yu-Zhi, Song; Shou-Bao, Gao; Yuan, Zhang; Qing-Tian, Meng

    2016-05-01

    A globally accurate single-sheeted double many-body expansion potential energy surface is reported for the first excited state of HS2 by fitting accurate ab initio energies, which are calculated at the multireference configuration interaction level with the aug-cc-pVQZ basis set. By using the double many-body expansion-scaled external correlation method, such calculated ab initio energies are then slightly corrected by scaling their dynamical correlation. A grid of 2767 ab initio energies is used in the least-squares fitting procedure, with a total root-mean-square deviation of 1.406 kcal·mol-1. The topographical features of the HS2(A2A′) global potential energy surface are examined in detail. The attributes of the stationary points are presented and compared with the corresponding ab initio results as well as experimental and other theoretical data, showing good agreement. The resulting potential energy surface of HS2(A2A′) can be used as a building block for constructing the global potential energy surfaces of larger S/H molecular systems and is recommended for dynamics studies on the title molecular system. Project supported by the National Natural Science Foundation of China (Grant No. 11304185), the Taishan Scholar Project of Shandong Province, China, the Shandong Provincial Natural Science Foundation, China (Grant No. ZR2014AM022), the Shandong Province Higher Educational Science and Technology Program, China (Grant No. J15LJ03), the China Postdoctoral Science Foundation (Grant No. 2014M561957), and the Post-doctoral Innovation Project of Shandong Province, China (Grant No. 201402013).
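    The fit-then-report-RMSD workflow can be illustrated generically. The sketch below fits a hypothetical 1-D curve by linear least squares and computes the root-mean-square deviation of the fit; the real surface uses a three-dimensional many-body expansion, so the functional form and data here are illustrative only:

    ```python
    import numpy as np

    # Hypothetical "ab initio" energies on a grid of bond distances
    # (Morse-like curve plus small noise, arbitrary units).
    r = np.linspace(0.8, 3.0, 40)
    rng = np.random.default_rng(0)
    energies = 5.0 * (1.0 - np.exp(-1.5 * (r - 1.3))) ** 2 \
        + rng.normal(0.0, 0.02, r.size)

    # Linear least-squares fit of a polynomial model
    # (a stand-in for the many-body expansion basis).
    coeffs = np.polyfit(r, energies, deg=6)
    fitted = np.polyval(coeffs, r)

    # Root-mean-square deviation of the fit, the quality measure
    # analogous to the 1.406 kcal/mol quoted for the real surface.
    rmsd = np.sqrt(np.mean((fitted - energies) ** 2))
    print(f"RMSD = {rmsd:.4f}")
    ```

    The key design point carried over from the abstract is that fit quality is judged globally, over the whole grid of energies, not just near the minimum.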

  8. Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.

    2016-12-01

    Integrating semantic information into legacy metadata catalogs is a challenging issue and has so far mostly been done on a limited scale. We present the experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels, including surveys and domain resource inventories. The pipeline examines available metadata descriptions using the text parsing, vocabulary management, semantic annotation and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies and taxonomies, including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via the CINERGI Annotator user interface. 
We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality and completeness. The inventory is accessible at http://cinergi.sdsc.edu, and the CINERGI project web page is http://earthcube.org/group/cinergi
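    The core annotation step, matching metadata text against a faceted vocabulary to emit keywords, can be sketched minimally. The terms and facet names below are hypothetical stand-ins for the GeoSciGraph ontology services, which in reality also handle synonyms, multi-word terms, and graph navigation:

    ```python
    # Hypothetical facet vocabulary: term -> facet, standing in for ontology lookups.
    VOCABULARY = {
        "salinity": "measuredProperty",
        "ctd": "equipment",
        "estuary": "geospatialFeature",
        "upwelling": "process",
    }

    def annotate(metadata_text):
        """Return (term, facet) keyword pairs found in a metadata description."""
        tokens = set(metadata_text.lower().replace(",", " ").split())
        return sorted((term, facet) for term, facet in VOCABULARY.items()
                      if term in tokens)

    record = "CTD casts measuring salinity across the estuary during upwelling"
    print(annotate(record))
    ```

    Facet-tagged keywords like these are what make cross-domain discovery possible: a search UI can group results by equipment, process, or measured property even when the source catalogs share no controlled vocabulary.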

  9. Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring

    USGS Publications Warehouse

    Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.

    2015-04-14

    Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complementary reach-scale metrics and methods that may improve the ability of CTUIR and others to detect instream and riparian changes at large restoration projects.

  10. Understanding forest ecology from the landscape to the project level

    Treesearch

    Ward McCaughey

    2007-01-01

    Several researchers in the Forestry Sciences Laboratory have been actively involved in BEMRP since its inception in the early 1990s. The recent research on the Trapper Bunkhouse Land Stewardship Project began in 2004. In ecosystem management, sometimes we need to look at the big picture, or the landscape scale, and sometimes we need to work on a more local, or project-...

  11. Magnetic properties and magnetocaloric effects in HoPd intermetallic

    NASA Astrophysics Data System (ADS)

    Zhao-Jun, Mo; Jun, Shen; Xin-Qiang, Gao; Yao, Liu; Jian-Feng, Wu; Bao-Gen, Shen; Ji-Rong, Sun

    2015-03-01

    A large reversible magnetocaloric effect accompanied by a second-order magnetic phase transition from the paramagnetic (PM) to the ferromagnetic (FM) state is observed in the HoPd compound. Under a magnetic field change, the magnetic entropy change -ΔSM and the refrigerant capacity RC for the compound are evaluated to be 20 J/(kg·K) and 342 J/kg, respectively. In particular, a large -ΔSM (11.3 J/(kg·K)) and RC (142 J/kg) are achieved under a low magnetic field change of 0-2 T with no thermal hysteresis and magnetic hysteresis loss. The large reversible magnetocaloric effect (both the large -ΔSM and the high RC) indicates that HoPd is a promising material for magnetic refrigeration at low temperature. Project supported by the National Natural Science Foundation of China (Grant Nos. 51322605, 11104337, 51271192, and 11274357) and the Knowledge Innovation Project of the Chinese Academy of Sciences.

  12. The applications of carbon nanomaterials in fiber-shaped energy storage devices

    NASA Astrophysics Data System (ADS)

    Wu, Jingxia; Hong, Yang; Wang, Bingjie

    2018-01-01

    As a promising candidate for future demand, fiber-shaped electrochemical energy storage devices, such as supercapacitors and lithium-ion batteries, have attracted considerable attention from academia to industry. Carbon nanomaterials, such as carbon nanotubes and graphene, have been widely investigated as electrode materials due to their merits of light weight, flexibility and high capacitance. In this review, recent progress of carbon nanomaterials in flexible fiber-shaped energy storage devices is summarized, following the development of fibrous electrodes, including diversified electrode preparation, functional and intelligent device structures, and large-scale production of fibrous electrodes and devices. Project supported by the National Natural Science Foundation of China (Nos. 21634003, 21604012).

  13. General Catalogue of Variable Stars: Current Status and New Name-Lists

    NASA Astrophysics Data System (ADS)

    Samus, N. N.; Kazarovets, E. V.; Kireeva, N. N.; Pastukhova, E. N.; Durlevich, O. V.

    2010-12-01

    A short history of variable-star catalogs is presented. After the Second World War, the International Astronomical Union asked astronomers of the Soviet Union to become responsible for variable-star catalogs. Currently, the catalog is kept electronically and is a joint project of the Institute of Astronomy (Russian Academy of Sciences) and the Sternberg Astronomical Institute (Moscow University). We review recent trends in the field of variable-star catalogs, discuss problems and new prospects related to modern large-scale automatic photometric sky surveys, outline the subject of discussions on the future of the variable-star catalogs in the profile commissions of the IAU, and call for suggestions from the astronomical community.

  14. Building brains for bodies

    NASA Technical Reports Server (NTRS)

    Brooks, Rodney Allen; Stein, Lynn Andrea

    1994-01-01

    We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dexterous manipulation, all controlled by a continuously operating large-scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.

  15. Going wild: what a global small-animal tracking system could do for experimental biologists.

    PubMed

    Wikelski, Martin; Kays, Roland W; Kasdin, N Jeremy; Thorup, Kasper; Smith, James A; Swenson, George W

    2007-01-01

    Tracking animals over large temporal and spatial scales has revealed invaluable and spectacular biological information, particularly when the paths and fates of individuals can be monitored on a global scale. However, only large animals (greater than approximately 300 g) currently can be followed globally because of power and size constraints on the tracking devices. And yet the vast majority of animals are small. Tracking small animals is important because they are often part of evolutionary and ecological experiments, they provide important ecosystem services, and they are of conservation concern or pose harm to human health. Here, we propose a small-animal satellite tracking system that would enable the global monitoring of animals down to the size of the smallest birds, mammals (bats), marine life and eventually large insects. To create the scientific framework necessary for such a global project, we formed the ICARUS initiative (www.IcarusInitiative.org), the International Cooperation for Animal Research Using Space. ICARUS also highlights how small-animal tracking could address some of the 'Grand Challenges in Environmental Sciences' identified by the US National Academy of Sciences, such as the spread of infectious diseases or the relationship between biological diversity and ecosystem functioning. Small-animal tracking would allow the quantitative assessment of dispersal and migration in natural populations and thus help solve enigmas regarding population dynamics, extinctions and invasions. Experimental biologists may find a global small-animal tracking system helpful in testing, validating and expanding laboratory-derived discoveries in wild, natural populations. We suggest that the relatively modest investment into a global small-animal tracking system will pay off by providing unprecedented insights into both basic and applied biology. 
Tracking small animals over large spatial and temporal scales could prove to be one of the most powerful techniques of the early 21st century, offering potential solutions to a wide range of biological and societal questions that date back two millennia to the Greek philosopher Aristotle's enigma about songbird migration. Several of the more recent Grand Challenges in Environmental Sciences, such as the regulation and functional consequences of biological diversity or the surveillance of the population ecology of zoonotic hosts, pathogens or vectors, could also be addressed by a global small-animal tracking system. Our discussion is intended to contribute to an emerging groundswell of scientific support to make such a new technological system happen.

  16. Life science research and drug discovery at the turn of the 21st century: the experience of SwissBioGrid.

    PubMed

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-04-22

    It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). 
The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.

  17. WISE Science: Web-based Inquiry in the Classroom. Technology, Education--Connections

    ERIC Educational Resources Information Center

    Slotta, James D.; Linn, Marcia C.

    2009-01-01

    This book shares the lessons learned by a large community of educational researchers and science teachers as they designed, developed, and investigated a new technology-enhanced learning environment known as WISE: The Web-Based Inquiry Science Environment. WISE offers a collection of free, customizable curriculum projects on topics central to the…

  18. Beyond PARR - PMEL's Integrated Data Management Strategy

    NASA Astrophysics Data System (ADS)

    Burger, E. F.; O'Brien, K.; Manke, A. B.; Schweitzer, R.; Smith, K. M.

    2016-12-01

    NOAA's Pacific Marine Environmental Laboratory (PMEL) hosts a wide range of scientific projects that span a number of scientific and environmental research disciplines. Each of these 14 research projects has its own data streams, which are as diverse as the research. With its requirements for public access to federally funded research results and data, the 2013 White House Office of Science and Technology memo on Public Access to Research Results (PARR) changed the data management landscape for Federal agencies. In 2015, with support from the PMEL Director, Dr. Christopher Sabine, PMEL's Science Data Integration Group (SDIG) initiated a multi-year effort to formulate and implement an integrated data-management strategy for PMEL research efforts. Instead of using external requirements, such as PARR, to define our approach, we focused on strategies to provide PMEL science projects with a unified framework for data submission, interoperable data access, data storage, and easier data archival to National Data Centers. This improves data access for PMEL scientists, their collaborators, and the public, and also provides a unified lab framework that allows our projects to meet their data management objectives, as well as those required by the PARR. We are implementing this solution in stages that allow us to test technology and architecture choices before committing to a large-scale implementation. SDIG developers have completed the first year of development, in which our approach has been to reuse and leverage existing frameworks and standards. This presentation will describe our data management strategy, explain our phased implementation approach and our software and framework choices, and show how these elements help us meet the objectives of this strategy. We will share the lessons learned in dealing with diverse and complex datasets in this first year of implementation and how these outcomes will shape our decisions for this ongoing effort. 
The data management capabilities now available to scientific projects, and other services being developed to manage and preserve PMEL's scientific data assets for our researchers, their collaborators, and future generations, will be described.

  19. The success of the horse-chestnut leaf-miner, Cameraria ohridella, in the UK revealed with hypothesis-led citizen science.

    PubMed

    Pocock, Michael J O; Evans, Darren M

    2014-01-01

    Citizen science is an increasingly popular way of undertaking research and simultaneously engaging people with science. However, most emphasis of citizen science in environmental science is on long-term monitoring. Here, we demonstrate the opportunities provided by short-term hypothesis-led citizen science. In 2010, we ran the 'Conker Tree Science' project, in which over 3500 people in Great Britain provided data at a national scale of an insect (horse-chestnut leaf-mining moth, Cameraria ohridella) undergoing rapid range-expansion. We addressed two hypotheses, and found that (1) the levels of damage caused to leaves of the horse-chestnut tree, Aesculus hippocastanum, and (2) the level of attack by parasitoids of C. ohridella larvae were both greatest where C. ohridella had been present the longest. Specifically there was a rapid rise in leaf damage during the first three years that C. ohridella was present and only a slight rise thereafter, while estimated rates of parasitism (an index of true rates of parasitism) increased from 1.6 to 5.9% when the time C. ohridella had been present in a location increased from 3 to 6 years. We suggest that this increase is due to recruitment of native generalist parasitoids, rather than the adaptation or host-tracking of more specialized parasitoids, as appears to have occurred elsewhere in Europe. Most data collected by participants were accurate, but the counts of parasitoids from participants showed lower concordance with the counts from experts. We statistically modeled this bias and propagated this through our analyses. Bias-corrected estimates of parasitism were lower than those from the raw data, but the trends were similar in magnitude and significance. 
With appropriate checks for data quality, and statistically correcting for biases where necessary, hypothesis-led citizen science is a potentially powerful tool for carrying out scientific research across large spatial scales while simultaneously engaging many people with science.
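    The bias-correction-and-propagation idea, calibrating volunteer counts against expert counts and carrying the correction through to the final estimate, can be sketched under simplified assumptions. The multiplicative-bias model and bootstrap below are illustrative stand-ins, not the paper's actual statistical model, and all counts are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical calibration subset scored by both volunteers and experts.
    expert_counts = np.array([4, 7, 2, 9, 5, 6])
    volunteer_counts = np.array([3, 5, 2, 6, 4, 5])

    # Simple multiplicative bias: volunteers undercount by this factor on average.
    bias = volunteer_counts.sum() / expert_counts.sum()

    # Correct the full (simulated) volunteer dataset for the undercount,
    # then propagate uncertainty in the corrected mean by bootstrap resampling.
    volunteer_full = rng.poisson(4.0, size=500)
    corrected = volunteer_full / bias
    boot = [rng.choice(corrected, corrected.size).mean() for _ in range(1000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"corrected mean {corrected.mean():.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    ```

    The design point mirrored from the study is that the bias estimate comes from a subset double-scored by experts, and the corrected estimate is reported with propagated uncertainty rather than as a point value from the raw counts.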

  20. The EuroDIVERSITY Programme: Challenges of Biodiversity Science in Europe

    NASA Astrophysics Data System (ADS)

    Jonckheere, I.

    2009-04-01

    In close cooperation with its Member Organisations, the European Science Foundation (ESF) has, since late 2003, launched a series of European Collaborative Research (EUROCORES) Programmes. Their aim is to enable researchers in different European countries to develop cooperation and scientific synergy in areas where European scale and scope are required in a global context. The EUROCORES instrument represents the first large-scale attempt of national research (funding) agencies to act together against fragmentation, asynchronicity and duplication of research (funding) within Europe. Although covering all scientific fields, there are presently 13 EUROCORES Programmes dealing with cutting-edge science in the fields of Earth, Climate and Environmental Sciences. The aim of the EuroDIVERSITY Programme is to support the emergence of an integrated biodiversity science based on an understanding of fundamental ecological and social processes that drive biodiversity changes and their impacts on ecosystem functioning and society. Ecological systems across the globe are being threatened or transformed at unprecedented rates from local to global scales due to the ever-increasing human domination of natural ecosystems. In particular, massive biodiversity changes are currently taking place, and this trend is expected to continue over the coming decades, driven by the increasing extension and globalisation of human affairs. The EuroDIVERSITY Programme meets the research need triggered by the increasing human footprint worldwide with a focus on generalisations across particular systems and on the generation and validation of theory relevant to experimental and empirical data. The EuroDIVERSITY Programme tries to bridge the gaps between the natural and social sciences, between research work on terrestrial, freshwater and marine ecosystems, and between research work on plants, animals and micro-organisms. 
The Programme was launched in April 2006 and includes 10 international, multidisciplinary collaborative research projects, which are expected to contribute to this goal by initiating or strengthening major collaborative research efforts. Some projects are dealing primarily with microbial diversity (COMIX, METHECO, MiCROSYSTEMS), others try to investigate the biogeochemistry in grassland and forest ecosystems (BEGIN, BioCycle), the landscape and community ecology of biodiversity changes (ASSEMBLE, AGRIPOPES, EcoTRADE), and others focus on the diversity in freshwater (BIOPOOL, MOLARCH). In 2009, the EuroDIVERSITY Programme will integrate the different European research teams involved with collaborative field work campaigns over Europe, international workshops and conferences, as well as joint peer-review publications. For more information about the Programme and its activities, please check the Programme website: www.esf.org/eurodiversity

  1. Extreme Events and Energy Providers: Science and Innovation

    NASA Astrophysics Data System (ADS)

    Yiou, P.; Vautard, R.

    2012-04-01

    Most socio-economic regulations related to resilience to climate extremes, from infrastructure and network design to insurance premiums, are based on the present-day climate with an assumption of stationarity. Climate extremes (heat waves, cold spells, droughts, storms and wind stilling) affect energy production, supply, demand and security in several ways. While national, European and international projects have generated vast amounts of climate projections for the 21st century, their practical use in long-term planning remains limited. Estimating probabilistic diagnostics of energy-relevant variables from those multi-model projections will help the energy sector elaborate medium- to long-term plans, and will allow the assessment of climate risks associated with those plans. The project "Extreme Events for Energy Providers" (E3P) aims at filling the gap between climate science and its practical use in the energy sector, creating in turn favourable conditions for new business opportunities. The value chain ranges from addressing research questions directly related to energy-significant climate extremes to providing innovative tools for information and decision making (including methodologies, best practices and software) and climate science training for the energy sector, with a focus on extreme events. Those tools will integrate the scientific knowledge developed by the research community and translate it into a usable probabilistic framework. The project will deliver projection tools assessing the probabilities of future energy-relevant climate extremes at spatial scales ranging from pan-European to local. The E3P project is funded by the Knowledge and Innovation Community (KIC Climate). We will present the mechanisms of interaction between academic partners, SMEs and industrial partners in this project. Those mechanisms are elementary building blocks of a climate service.
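The abstract does not specify the probabilistic framework; as a minimal sketch of the kind of diagnostic involved, the following (hypothetical, using synthetic data rather than real model projections) fits a Generalized Extreme Value distribution to a series of annual maxima and derives a return level, a standard approach in extreme-value analysis of climate model output:

```python
# Illustrative only: GEV fit to synthetic annual maxima standing in for
# one climate-model projection of an energy-relevant variable.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maximum temperatures (deg C) over a 60-year projection
annual_maxima = 35 + 3 * rng.gumbel(size=60)

# Fit the GEV; note scipy's shape-parameter sign convention (c = -xi)
shape, loc, scale = genextreme.fit(annual_maxima)

# 50-year return level: the value exceeded with probability 1/50 per year
return_level_50 = genextreme.ppf(1 - 1 / 50, shape, loc=loc, scale=scale)

# Probability that a given design threshold (here 45 deg C) is exceeded
p_exceed = genextreme.sf(45.0, shape, loc=loc, scale=scale)

print(f"50-year return level: {return_level_50:.1f} C")
print(f"P(annual max > 45 C): {p_exceed:.3f}")
```

In practice such a fit would be repeated across the members of a multi-model ensemble to obtain a spread of return levels rather than a single estimate.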

  2. Science Competencies That Go Unassessed

    ERIC Educational Resources Information Center

    Gilmer, Penny J.; Sherdan, Danielle M.; Oosterhof, Albert; Rohani, Faranak; Rouby, Aaron

    2011-01-01

    Present large-scale assessments require the use of item formats, such as multiple choice, that can be administered and scored efficiently. This limits the competencies that can be measured by these assessments. An alternative approach to large-scale assessments is being investigated that would include the use of complex performance assessments. As…

  3. ARCUS Project Managers and the Intangible Infrastructure of Large Interdisciplinary Arctic Research Networks

    NASA Astrophysics Data System (ADS)

    Myers, B.; Wiggins, H. V.; Turner-Bogren, E. J.; Warburton, J.

    2017-12-01

    Project Managers at the Arctic Research Consortium of the U.S. (ARCUS) lead initiatives to convene, communicate with, and connect the Arctic research community across challenging disciplinary, geographic, temporal, and cultural boundaries. They regularly serve as the organizing hubs, archivists, and memory-keepers for collaborative projects composed of many loosely affiliated partners. As leading organizers of large open science meetings and other outreach events, they also monitor the interdisciplinary landscape of community needs, concerns, opportunities, and emerging research directions. However, leveraging the ARCUS Project Manager role to strategically build out the intangible infrastructure necessary to advance Arctic research requires a unique set of knowledge, skills, and experience. Drawing on a range of lessons learned from past and ongoing experiences with collaborative science, education, and outreach programming, this presentation will highlight a model of ARCUS project management that we believe works best to support and sustain our community in its long-term effort to conquer the complexities of Arctic research.

  4. Pioneering University/Industry Venture Explores VLSI Frontiers.

    ERIC Educational Resources Information Center

    Davis, Dwight B.

    1983-01-01

    Discusses industry-sponsored programs in semiconductor research, focusing on Stanford University's Center for Integrated Systems (CIS). CIS, while pursuing research in semiconductor very-large-scale integration, is merging the fields of computer science, information science, and physical science. Issues related to these university/industry…

  5. The Data Conservancy

    NASA Astrophysics Data System (ADS)

    Choudhury, S.; Duerr, R. E.

    2009-12-01

    NSF's Sustainable Digital Data Preservation and Access Network Partners program is an ambitious attempt to integrate a wide variety of expertise and infrastructure into a network for providing "reliable digital preservation, access, integration, and analysis capabilities for science." One of the first two DataNet award recipients, the Data Conservancy, is itself a network of widely diverse partners led by the libraries at the Johns Hopkins University. The Data Conservancy is built on existing exemplar scientific projects, communities, and virtual organizations that have deep engagement with their user communities and extensive experience with large-scale distributed system development. Data Conservancy members embrace a shared vision that data curation is not an end but rather a means to collect, organize, validate, and preserve the data needed to address the grand research challenges that face society. Data Conservancy members' holdings encompass the entire range of earth, life, and space science data. New to the Data Conservancy is the concept that university libraries will be part of the distributed network of data centers and that data science will become a path in library and information science curricula. As noted by Winston Tabb (JHU Dean of Libraries), "Data Centers are the new library stacks."

  6. Evaluating Introductory Physics Classes in Light of the ABET Criteria: An Example from the SCALE-UP Project.

    ERIC Educational Resources Information Center

    Saul, Jeffery M.; Deardorff, Duane L.; Abbott, David S.; Allain, Rhett J.; Beichner, Robert J.

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project at North Carolina State University (NCSU) is developing a curriculum to promote learning through in-class group activities in introductory physics classes of up to 100 students. The authors are currently in Phase II of the project, using a specially designed…

  7. Sustainable Systems SFA 2.0

    ScienceCinema

    Hubbard, Susan

    2018-05-07

    Berkeley Lab Earth Sciences Division Director Susan Hubbard, the Project Lead for the Sustainable Systems Scientific Focus Area (SFA) 2.0, gives an overview of the project and its mission to develop a predictive understanding of terrestrial environments, from the genome to the watershed scale, to enable a new class of solutions to environmental and energy challenges.

  8. Sustainable Systems SFA 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubbard, Susan

    2015-12-19

    Berkeley Lab Earth Sciences Division Director Susan Hubbard, the Project Lead for the Sustainable Systems Scientific Focus Area (SFA) 2.0, gives an overview of the project and its mission to develop a predictive understanding of terrestrial environments, from the genome to the watershed scale, to enable a new class of solutions to environmental and energy challenges.

  9. Science to support the understanding of Ohio's water resources, 2014-15

    USGS Publications Warehouse

    Shaffer, Kimberly; Kula, Stephanie P.

    2014-01-01

    The U.S. Geological Survey (USGS) works in cooperation with local, State, and other Federal agencies, as well as universities, to furnish decision makers, policy makers, USGS scientists, and the general public with reliable scientific information and tools to assist them in management, stewardship, and use of Ohio’s natural resources. The diversity of scientific expertise among USGS personnel enables them to carry out large- and small-scale multidisciplinary studies. The USGS is unique among government organizations because it has neither regulatory nor developmental authority—its sole product is impartial, credible, relevant, and timely scientific information, equally accessible and available to everyone. The USGS Ohio Water Science Center provides reliable hydrologic and water-related ecological information to aid in the understanding of the use and management of the Nation’s water resources, in general, and Ohio’s water resources, in particular. This fact sheet provides an overview of current (2014) or recently completed USGS studies and data activities pertaining to water resources in Ohio. More information regarding projects of the USGS Ohio Water Science Center is available at http://oh.water.usgs.gov/.

  10. Systems Engineering Challenges for GSFC Space Science Mission Operations

    NASA Technical Reports Server (NTRS)

    Thienel, Julie; Harman, Richard R.

    2017-01-01

    The NASA Goddard Space Flight Center Space Science Mission Operations (SSMO) project currently manages 19 missions for the NASA Science Mission Directorate, within the Planetary, Astrophysics, and Heliophysics Divisions. The mission lifespans range from just a few months to more than 20 years. The WIND spacecraft, the oldest SSMO mission, was launched in 1994. SSMO spacecraft reside in low Earth, geosynchronous, highly elliptical, libration point, lunar, heliocentric, and Martian orbits. SSMO spacecraft range in size from 125 kg (Aeronomy of Ice in the Mesosphere (AIM)) to over 4000 kg (Fermi Gamma-Ray Space Telescope (Fermi)). The attitude modes include both spin and three-axis stabilized, with varying requirements on pointing accuracy. The spacecraft are operated from control centers at Goddard and off-site control centers; the Lunar Reconnaissance Orbiter (LRO), the Solar Dynamics Observatory (SDO), and Magnetospheric Multiscale (MMS) missions were built at Goddard. The Advanced Composition Explorer (ACE) and Wind are operated out of a multi-mission operations center, which will also host several SSMO-managed CubeSats in 2017. This paper focuses on the systems engineering challenges for such a large and varied fleet of spacecraft.

  11. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Kauffmann, Guinevere

    2018-03-01

    The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal-line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ~10^10 M⊙ and photometric redshifts in the range 0.4 < z < 0.8, selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc^3. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  12. 7 CFR 3405.7 - Joint project proposals.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... agricultural sciences. The goals of such joint initiatives should include maximizing the use of limited...), increasing cost-effectiveness through achieving economies of scale, strengthening the scope and quality of a...

  13. 7 CFR 3405.7 - Joint project proposals.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... agricultural sciences. The goals of such joint initiatives should include maximizing the use of limited...), increasing cost-effectiveness through achieving economies of scale, strengthening the scope and quality of a...

  14. 7 CFR 3405.7 - Joint project proposals.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... agricultural sciences. The goals of such joint initiatives should include maximizing the use of limited...), increasing cost-effectiveness through achieving economies of scale, strengthening the scope and quality of a...

  15. Evolution of Cognitive Rehabilitation After Stroke From Traditional Techniques to Smart and Personalized Home-Based Information and Communication Technology Systems: Literature Review.

    PubMed

    Cogollor, José M; Rojo-Lacal, Javier; Hermsdörfer, Joachim; Ferre, Manuel; Arredondo Waldmeyer, Maria Teresa; Giachritsis, Christos; Armstrong, Alan; Breñosa Martinez, Jose Manuel; Bautista Loza, Doris Anabelle; Sebastián, José María

    2018-03-26

    Neurological patients after stroke usually present cognitive deficits that cause dependencies in their daily living. These deficits mainly affect the performance of some of their daily activities, so stroke patients need long-term processes of cognitive rehabilitation. Considering that classical techniques are focused on acting as guides and depend on help from therapists, significant efforts are being made to improve current methodologies and to use eHealth and Web-based architectures to implement information and communication technology (ICT) systems that provide reliable, personalized, and home-based platforms, increasing efficiency and their attractiveness to patients and carers. The goal of this work was to provide an overview of the practices implemented for the assessment of stroke patients and cognitive rehabilitation. This study brings together traditional methods and the most recent personalized platforms based on ICT and the Internet of Things. The literature review was distributed among a multidisciplinary team of researchers from the engineering, psychology, and sport science fields. The systematic review focused on published scientific research, European projects, and the most current innovative large-scale initiatives in the area. A total of 3469 results retrieved from Web of Science, 284 studies from the Journal of Medical Internet Research, and 15 European research projects from the Community Research and Development Information Service from the last 15 years were reviewed for classification and selection regarding their relevance. A total of 7 relevant studies on the screening of stroke patients are presented, with 6 additional methods for the analysis of kinematics and 9 studies on the execution of goal-oriented activities. Meanwhile, the classical methods of cognitive rehabilitation are classified into the 5 main techniques implemented.
The review concludes with the selection of 8 different ICT-based approaches found in scientific-technical studies, 9 European projects funded by the European Commission that offer eHealth architectures, and other large-scale activities such as smart houses and the City4Age initiative. Stroke is one of the conditions with the most negative socioeconomic impact on countries. The design of new ICT-based systems should provide 4 main features for efficient and personalized cognitive rehabilitation: support in the execution of complex daily tasks, automatic error detection, home-based performance, and accessibility. Only 33% of the European projects presented fulfilled all of those requirements at the same time. For this reason, current and future large-scale initiatives focused on eHealth and smart environments should try to remedy this situation by providing more complete and sophisticated platforms. ©José M Cogollor, Javier Rojo-Lacal, Joachim Hermsdörfer, Manuel Ferre, Maria Teresa Arredondo Waldmeyer, Christos Giachritsis, Alan Armstrong, Jose Manuel Breñosa Martinez, Doris Anabelle Bautista Loza, José María Sebastián. Originally published in JMIR Rehabilitation and Assistive Technology (http://rehab.jmir.org), 26.03.2018.

  16. Alternative projections of the impacts of private investment on southern forests: a comparison of two large-scale forest sector models of the United States.

    Treesearch

    Ralph Alig; Darius Adams; John Mills; Richard Haynes; Peter Ince; Robert Moulton

    2001-01-01

    The TAMM/NAPAP/ATLAS/AREACHANGE (TNAA) system and the Forest and Agriculture Sector Optimization Model (FASOM) are two large-scale forestry sector modeling systems that have been employed to analyze the U.S. forest resource situation. The TNAA system of static, spatial equilibrium models has been applied to make 50-year projections of the U.S. forest sector for more...

  17. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision-making situations arising in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered, with payoff functions of a special type for which standard existence theorems and algorithms for searching for Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].
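The paper's payoff functions are of a special type to which standard algorithms do not apply; for the baseline case of finite strategy sets, however, the pure-strategy Nash equilibrium concept it builds on can be illustrated with a brute-force check. The payoff matrices below are arbitrary placeholders, not the model from the paper:

```python
# Illustrative sketch: exhaustive search for pure-strategy Nash equilibria
# of a two-player game on finite strategy grids with placeholder payoffs.
import numpy as np

n = 5  # number of strategies per player
rng = np.random.default_rng(0)
A = rng.random((n, n))  # payoff to player 1 when strategies (i, j) are played
B = rng.random((n, n))  # payoff to player 2

def is_nash(i, j):
    # (i, j) is a pure Nash equilibrium if neither player can improve
    # its payoff by deviating unilaterally.
    return A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max()

equilibria = [(i, j) for i in range(n) for j in range(n) if is_nash(i, j)]
print("Pure Nash equilibria:", equilibria)
```

For generic payoffs such an equilibrium need not exist in pure strategies, which is precisely the kind of situation where the specialized results developed in the paper become necessary.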

  18. Quantitative evaluation of space charge effects of laser-cooled three-dimensional ion system on a secular motion period scale

    NASA Astrophysics Data System (ADS)

    Du, Li-Jun; Song, Hong-Fang; Chen, Shao-Long; Huang, Yao; Tong, Xin; Guan, Hua; Gao, Ke-Lin

    2018-04-01

    Abstract not available. Project supported by the National Key Research and Development Program of China (Grant No. 2017YFA0304401), the National Natural Science Foundation of China (Grant Nos. 11622434, 11474318, 91336211, and 11634013), the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB21030100), the Hubei Province Science Fund for Distinguished Young Scholars (Grant No. 2017CFA040), and the Youth Innovation Promotion Association of the Chinese Academy of Sciences (Grant No. 2015274).

  19. Detection and Characterisation of Meteors as a Big Data Citizen Science project

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.

    2017-12-01

    Out of a total of around 50,000 meteorites currently known to science, the atmospheric passage was recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while observations of meteors add thousands of new entries per month to existing databases, it is extremely rare that they lead to meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to broaden its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increase in interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements.
We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth network (http://bigskyearth.eu/).

  20. Collaboration in the Humanities, Arts and Social Sciences in Australia

    ERIC Educational Resources Information Center

    Haddow, Gaby; Xia, Jianhong; Willson, Michele

    2017-01-01

    This paper reports on the first large-scale quantitative investigation into collaboration, demonstrated in co-authorship, by Australian humanities, arts and social sciences (HASS) researchers. Web of Science data were extracted for Australian HASS publications, with a focus on the softer social sciences, over the period 2004-2013. The findings…

  1. Taking Stock: Implications of a New Vision of Science Learning for State Science Assessment

    ERIC Educational Resources Information Center

    Wertheim, Jill

    2016-01-01

    This article presents the author's response to the article "Taking Stock: Existing Resources for Assessing a New Vision of Science Learning" by Alonzo and Ke (this issue), which identifies numerous challenges that the Next Generation Science Standards (NGSS) pose for large-scale assessment. Jill Wertheim comments that among those…

  2. Global Patterns in Students' Views of Science and Interest in Science

    ERIC Educational Resources Information Center

    van Griethuijsen, Ralf A. L. F.; van Eijck, Michiel W.; Haste, Helen; den Brok, Perry J.; Skinner, Nigel C.; Mansour, Nasser; Savran Gencer, Ayse; BouJaoude, Saouma

    2015-01-01

    International studies have shown that interest in science and technology among primary and secondary school students in Western European countries is low and seems to be decreasing. In many countries outside Europe, and especially in developing countries, interest in science and technology remains strong. As part of the large-scale European Union…

  3. International Cooperation of Space Science and Application in Chinese Manned Space Program

    NASA Astrophysics Data System (ADS)

    Gao, Ming; Guo, Jiong; Yang, Yang

    Early in the China Manned Space Program, many space science and application projects were carried out using the SZ-series manned spaceships and the TG-1 space lab, and remarkable achievements were attained with the efforts of international partners. Around 2020, China is going to build its space station and carry out space science and application research on a larger scale. Along with the scientific utilization plan for the Chinese space station, experiment facilities are being designed especially with international scientific cooperation in mind, and preparations for the management of international cooperation projects are being made as well. This paper reviews the history and achievements of international scientific cooperation in the previous missions of the China Manned Space Program. The general resources and facilities that will support potential cooperation projects are then presented. Finally, the international cooperation modes and approaches for utilizing the Chinese Space Station are discussed.

  4. Planet Formation Imager (PFI): science vision and key requirements

    NASA Astrophysics Data System (ADS)

    Kraus, Stefan; Monnier, John D.; Ireland, Michael J.; Duchêne, Gaspard; Espaillat, Catherine; Hönig, Sebastian; Juhasz, Attila; Mordasini, Chris; Olofsson, Johan; Paladini, Claudia; Stassun, Keivan; Turner, Neal; Vasisht, Gautam; Harries, Tim J.; Bate, Matthew R.; Gonzalez, Jean-François; Matter, Alexis; Zhu, Zhaohuan; Panic, Olja; Regaly, Zsolt; Morbidelli, Alessandro; Meru, Farzana; Wolf, Sebastian; Ilee, John; Berger, Jean-Philippe; Zhao, Ming; Kral, Quentin; Morlok, Andreas; Bonsor, Amy; Ciardi, David; Kane, Stephen R.; Kratter, Kaitlin; Laughlin, Greg; Pepper, Joshua; Raymond, Sean; Labadie, Lucas; Nelson, Richard P.; Weigelt, Gerd; ten Brummelaar, Theo; Pierens, Arnaud; Oudmaijer, Rene; Kley, Wilhelm; Pope, Benjamin; Jensen, Eric L. N.; Bayo, Amelia; Smith, Michael; Boyajian, Tabetha; Quiroga-Nuñez, Luis Henry; Millan-Gabet, Rafael; Chiavassa, Andrea; Gallenne, Alexandre; Reynolds, Mark; de Wit, Willem-Jan; Wittkowski, Markus; Millour, Florentin; Gandhi, Poshak; Ramos Almeida, Cristina; Alonso Herrero, Almudena; Packham, Chris; Kishimoto, Makoto; Tristram, Konrad R. W.; Pott, Jörg-Uwe; Surdej, Jean; Buscher, David; Haniff, Chris; Lacour, Sylvestre; Petrov, Romain; Ridgway, Steve; Tuthill, Peter; van Belle, Gerard; Armitage, Phil; Baruteau, Clement; Benisty, Myriam; Bitsch, Bertram; Paardekooper, Sijme-Jan; Pinte, Christophe; Masset, Frederic; Rosotti, Giovanni

    2016-08-01

    The Planet Formation Imager (PFI) project aims to provide a strong scientific vision for ground-based optical astronomy beyond the upcoming generation of Extremely Large Telescopes. We make the case that a breakthrough in angular resolution imaging capabilities is required in order to unravel the processes involved in planet formation. PFI will be optimised to provide a complete census of the protoplanet population at all stellocentric radii and over the age range from 0.1 to 100 Myr. Within this age range, planetary systems undergo dramatic changes and the final architecture of planetary systems is determined. Our goal is to study planet birth on the natural spatial scale where the material is assembled, namely the "Hill sphere" of the forming planet, and to characterise the protoplanetary cores by measuring their masses and physical properties. Our science working group has investigated the observational characteristics of these young protoplanets as well as the migration mechanisms that might alter the system architecture. We have simulated the imprints that the planets leave in the disk and studied how PFI could revolutionise areas ranging from exoplanet to extragalactic science. In this contribution we outline the key science drivers of PFI and discuss the requirements that will guide the technology choices, the site selection, and potential science/technology tradeoffs.

  5. Exploring Event and Status Based Phenological Monitoring in Citizen Science Projects: Lessons Learned from Project BudBurst

    NASA Astrophysics Data System (ADS)

    Ward, D.; Henderson, S.; Newman, S. J.

    2012-12-01

    Citizen science projects in ecology are in a unique position to address the needs of both the science and education communities. Such projects can provide needed data to further the understanding of ecological processes at multiple spatial scales while also increasing public understanding of the importance of the ecological sciences. To balance the needs of both communities, it is important that citizen science programs provide different 'entry' points to appeal to diverse segments of society. In the case of NEON's Project BudBurst, a national plant phenology citizen science program, two approaches were developed to address the ongoing challenge of recruiting and retaining participants. Initially, Project BudBurst was designed as an event-based phenology program: participants were asked to identify a plant and report on the timing of specific phenoevents throughout the year. This approach requires a certain level of commitment which, while yielding useful results, does not appeal to the broadest possible audience. To broaden participation, in 2011 and 2012 Project BudBurst added campaigns targeted at engaging individuals in making simple status-based reports on a plant of their choosing. Three targeted field campaigns were designed to take advantage of times when people notice changes to plants in their environment, using simple status-based protocols: Fall Into Phenology, Cherry Blossom Blitz, and Summer Solstice Snapshot. The interest and participation in these single-report, status-based campaigns exceeded initial expectations. For example, Fall Into Phenology attracted individuals who otherwise had not considered participating in an ongoing field campaign. In the past, observations of fall phenology events submitted to Project BudBurst had been limited. By providing the opportunity to submit simple, single reports, the number of both new participants and submitted observations increased significantly.

  6. National Academy of Sciences Recommends Continued Support of ALMA Project

    NASA Astrophysics Data System (ADS)

    2000-05-01

    A distinguished panel of scientists today announced their support for the continued funding of the Atacama Large Millimeter Array (ALMA) Project at a press conference given by the National Academy of Sciences. The ALMA Project is an international partnership between U.S. and European astronomy organizations to build a complete imaging telescope that will produce astronomical images at millimeter and submillimeter wavelengths. The U.S. partner is the National Science Foundation, through Associated Universities, Inc., (AUI), led by Dr. Riccardo Giacconi, and the National Radio Astronomy Observatory (NRAO). "We are delighted at this show of continued support from our peers in the scientific community," said Dr. Robert Brown, ALMA U.S. Project Director and Deputy Director of NRAO. "The endorsement adds momentum to the recent strides we've made toward the building of this important telescope." In 1998, the National Research Council, the working arm of the National Academy of Sciences, charged the Astronomy and Astrophysics Survey Committee to "survey the field of space- and ground-based astronomy and astrophysics" and to "recommend priorities for the most important new initiatives of the decade 2000-2010." In a report released today, the committee wrote that it "re-affirms the recommendations of the 1991 Astronomy and Astrophysics Survey Committee by endorsing the completion of . . . the Millimeter Array (MMA, now part of the Atacama Large Millimeter Array)." In the 1991 report "The Decade of Discovery," a previous committee chose the Millimeter Array as one of the most important projects of the decade 1990-2000. Early last year, the National Science Foundation signed a Memorandum of Understanding with a consortium of European organizations that effectively merged the MMA Project with the European Large Southern Array project. The combined project was christened the Atacama Large Millimeter Array. 
ALMA, expected to consist of 64 antennas with 12-meter diameter dishes, will be built at a high-altitude, extremely dry mountain site in Chile's Atacama desert. The array is scheduled to be completed sometime in this decade. Millimeter-wave astronomy studies the universe in the spectral region where most of its energy lies, between the long-wavelength radio waves and the shorter-wavelength infrared waves. In this realm, ALMA will study the structure of the early universe and the evolution of galaxies; gather crucial data on the formation of stars, protoplanetary disks, and planets; and provide new insights on the familiar objects of our own solar system. "Most of the photons in the Universe lie in the millimeter wavelength regime; among existing or planned instruments only ALMA can image the sources of these photons with the crispness required to understand the events of galaxy, star and planet formation which launched them into space," said NRAO's Dr. Alwyn Wootten, U.S. ALMA Project Scientist. ALMA is an international partnership between the United States (National Science Foundation) and Europe. European participants include the European Southern Observatory, the Centre National de la Recherche Scientifique (France), the Max-Planck Gesellschaft (Germany), the Netherlands Foundation for Research in Astronomy, the United Kingdom Particle Physics and Astronomy Research Council, the Oficina de Ciencia Y Tecnologia/Instituto Geografico Nacional (Spain), and the Swedish Natural Science Research Council. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

  7. Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains

    PubMed Central

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.

    2014-01-01

    Biomedical taxonomies, thesauri and ontologies, in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development.
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
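    The core technique described in this abstract, estimating a first-order Markov chain of editing actions from usage logs, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the action names and log sessions below are hypothetical.

```python
from collections import Counter, defaultdict

def transition_matrix(sequences):
    """Estimate first-order Markov transition probabilities from
    sequences of editing actions (e.g. properties changed in order)."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):  # count each consecutive pair
            counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Hypothetical usage-log sessions: each list is one editor's action sequence.
logs = [["title", "definition", "synonym"],
        ["title", "definition", "definition"],
        ["definition", "synonym"]]
P = transition_matrix(logs)
# In this toy data, every edit of "title" is followed by "definition",
# the kind of "beaten path" the paper mines from real logs at scale.
```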

  8. Discovering beaten paths in collaborative ontology-engineering projects using Markov chains.

    PubMed

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A; Noy, Natalya F

    2014-10-01

    Biomedical taxonomies, thesauri and ontologies, in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development.
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Activities for the Promotion of Gender Equality in Japan—Japan Society of Applied Physics

    NASA Astrophysics Data System (ADS)

    Kodate, Kashiko; Tanaka, Kazuo

    2005-10-01

    Since 1946, the Japan Society of Applied Physics (JSAP) has strived to promote research and development in applied physics for benefits beyond national boundaries. Activities of JSAP involve multidisciplinary fields, from physics and engineering to life sciences. Of its 23,000 members, 48% are from industry, 29% from academia, and about 7% from semi-autonomous national research laboratories. Its large industrial membership is one of the distinctive features of JSAP. In preparation for the First IUPAP International Conference on Women in Physics (Paris, 2002), JSAP members took the first step under the strong leadership of then-JSAP President Toshio Goto, setting up the Committee for the Promotion of Equal Participation of Men and Women in Science and Technology. Equality, rather than women's advancement, is highlighted to further development in science and technology. Attention is also paid to balancing the number of researchers from different age groups and affiliations. The committee has 22 members: 12 female and 10 male; 7 from corporations, 12 from universities, and 3 from semi-autonomous national research institutes. Its main activities are to organize symposia and meetings, conduct surveys among JSAP members, and provide child-care facilities at meetings and conferences. In 2002 the Japan Physics Society and the Chemical Society of Japan jointly created the Japan Inter-Society Liaison Association for the Promotion of Equal Participation of Men and Women in Science and Engineering. Membership has grown to 44 societies (of which 19 are observers) ranging from mathematics, information, and life sciences to civil engineering. Joint activities cut across sectors and empower the whole. The Gender Equality Bureau in the Cabinet Office recently launched a large-scale project called "Challenge Campaign" to encourage girls to major in natural science and engineering, which JSAP is co-sponsoring.

  10. The 19th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1981-01-01

    The Flat-Plate Solar Array Project is described. Project analysis and integration is discussed. Technology research is reported in silicon material; large-area silicon sheet and environmental isolation; cell and module formation; engineering sciences; and module performance and failure analysis. It includes a report on, and copies of visual presentations made at, the 19th Project Integration Meeting held at Pasadena, California, on November 11, 1981.

  11. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted on desktop-based GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed very large data sets.
The GDP project has developed a reusable architecture and advanced processing services that currently accesses archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offer an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
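    Point 2 above turns on OPeNDAP-style subsetting: a client encodes the slice it wants as a constraint expression in the request URL, so only that hyperslab crosses the network. A minimal sketch of building such a request follows; the host, dataset path, variable name, and index ranges are hypothetical, chosen only to illustrate the URL shape.

```python
def opendap_subset_url(base_url, variable, **index_ranges):
    """Build an OPeNDAP constraint-expression URL requesting one
    hyperslab of one variable, avoiding a full-file download.
    Each keyword gives an inclusive (start, stop) index pair, in the
    order the dataset declares that variable's dimensions."""
    slabs = "".join(f"[{start}:{stop}]" for start, stop in index_ranges.values())
    return f"{base_url}?{variable}{slabs}"

# Hypothetical downscaled-climate dataset: one year of daily maximum
# temperature over a small lat/lon window.
url = opendap_subset_url(
    "http://example.org/thredds/dodsC/downscaled.nc",
    "tasmax", time=(0, 364), lat=(100, 120), lon=(200, 240))
# -> http://example.org/thredds/dodsC/downscaled.nc?tasmax[0:364][100:120][200:240]
```

In practice a client library (e.g. one that speaks the DAP protocol) would issue this request and decode the response; the point here is only that the subset lives in the URL, not in a local copy of the file.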

  12. Portraiture in the Large Lecture: Storying One Chemistry Professor's Practical Knowledge

    NASA Astrophysics Data System (ADS)

    Eddleton, Jeannine E.

    Practical knowledge, as defined by Freema Elbaz (1983), is a complex, practically oriented set of understandings which teachers use to actively shape and direct their work. The goal of this study is the construction of a social science portrait that illuminates the practical knowledge of a large lecture professor of general chemistry at a public research university in the southeast. This study continues Elbaz's (1981) work on practical knowledge with the incorporation of a qualitative and intentionally interventionist methodology which "blurs the boundaries of aesthetics and empiricism in an effort to capture the complexity, dynamics, and subtlety of human experience and organizational life" (Lawrence-Lightfoot & Davis, 1997). This collection of interviews, observations, writings, and reflections is designed for an eclectic audience with the intent of initiating conversation on the topic of the large lecture and is a purposeful attempt to link research and practice. Social science portraiture is uniquely suited to this intersection of researcher and researched, the perfect combination of methodology and analysis for a project that is both product and praxis. The following research questions guide the study.
    • Are aspects of Elbaz's practical knowledge identifiable in the research conversations conducted with a large lecture college professor?
    • Is practical knowledge identifiable during observations of Patricia's large lecture?
    Freema Elbaz conducted research conversations with Sarah, a high school classroom and writing resource teacher who conducted much of her teaching work one on one with students. Patricia's practice differs significantly from Sarah's with respect to subject matter and to scale.

  13. On the Application of Science Systems Engineering and Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael

    2017-04-01

    Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
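    The Monte-Carlo-style sampling the abstract describes can be sketched generically: draw each uncertain parameter from its credible range, run the forward model, and report a central estimate with an uncertainty bracket. The toy "model" and parameter names below are stand-ins for illustration only, not ISSM or its actual inputs.

```python
import random

def monte_carlo_projection(model, param_ranges, n=10000, seed=42):
    """Sample uncertain parameters uniformly over credible ranges, run
    the forward model for each draw, and return the median projection
    with a 5th-95th percentile uncertainty bracket."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in param_ranges.items()}
        outputs.append(model(**params))
    outputs.sort()
    return outputs[n // 2], (outputs[int(0.05 * n)], outputs[int(0.95 * n)])

# Toy stand-in for an ice-sheet forward model: century-scale sea level
# contribution (mm) as melt sensitivity times forcing. Hypothetical ranges.
toy = lambda melt_sens, forcing: melt_sens * forcing * 100.0
median, (lo, hi) = monte_carlo_projection(
    toy, {"melt_sens": (0.5, 1.5), "forcing": (0.8, 1.2)})
```

Sensitivity studies like those described would additionally attribute how much of the bracket width each parameter contributes, which is what prioritizes the measurement list.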

  14. A new mixed subgrid-scale model for large eddy simulation of turbulent drag-reducing flows of viscoelastic fluids

    NASA Astrophysics Data System (ADS)

    Li, Feng-Chen; Wang, Lu; Cai, Wei-Hua

    2015-07-01

    A mixed subgrid-scale (SGS) model based on coherent structures and temporal approximate deconvolution (MCT) is proposed for turbulent drag-reducing flows of viscoelastic fluids. The main idea of the MCT SGS model is to perform spatial filtering for the momentum equation and temporal filtering for the conformation tensor transport equation of the turbulent flow of a viscoelastic fluid, respectively. The MCT model is suitable for large eddy simulation (LES) of turbulent drag-reducing flows of viscoelastic fluids in engineering applications since the model parameters can be easily obtained. LES of forced homogeneous isotropic turbulence (FHIT) with polymer additives and of turbulent channel flow with surfactant additives based on the MCT SGS model shows excellent agreement with direct numerical simulation (DNS) results. Compared with LES results obtained using the temporal approximate deconvolution model (TADM) for FHIT with polymer additives, the mixed MCT SGS model behaves better as simulation parameters such as the Reynolds number increase. Because turbulent flows at high Reynolds numbers are of interest in scientific and engineering research, the MCT model is thus better suited for LES of turbulent drag-reducing flows of viscoelastic fluids with polymer or surfactant additives. Project supported by the China Postdoctoral Science Foundation (Grant No. 2011M500652), the National Natural Science Foundation of China (Grant Nos. 51276046 and 51206033), and the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20112302110020).
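    For context on the filtering strategy described above: in the standard LES formalism, spatially filtering the incompressible momentum equation of a viscoelastic fluid leaves an unclosed subgrid-scale stress that the SGS model must supply. A generic form (textbook notation assumed, not the paper's exact equations; solvent kinematic viscosity ν_s, filtered polymer stress σ̄ᵖ) is:

```latex
\frac{\partial \bar{u}_i}{\partial t}
  + \frac{\partial (\bar{u}_i \bar{u}_j)}{\partial x_j}
  = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
  + \nu_s \frac{\partial^2 \bar{u}_i}{\partial x_j \partial x_j}
  + \frac{1}{\rho}\frac{\partial \bar{\sigma}^{p}_{ij}}{\partial x_j}
  - \frac{\partial \tau_{ij}}{\partial x_j},
\qquad
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j
```

The MCT model's contribution is how τ_ij is closed (via coherent structures) and how the conformation tensor equation is handled (via temporal approximate deconvolution).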

  15. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  16. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    NASA Astrophysics Data System (ADS)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled together with regional and local modes of variability. For example, we know that the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, correlates with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike.
    Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. Interoperability is the degree to which each of the following is mapped between observatories (entities): i) science requirements linked with science questions; ii) traceability of measurements to nationally and internationally accepted standards; iii) how data products are derived, i.e., the algorithms, procedures, and methods; and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  17. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel, and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections.
To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on the understanding and modelling of both large-scale processes and their teleconnections with Middle East climate and localized processes involved in orographic precipitation.

  18. Philippine Academy of Rehabilitation Medicine emergency basic relief and medical aid mission project (November 2013-February 2014): the role of physiatrists in Super Typhoon Haiyan.

    PubMed

    Ganchoon, Filipinas; Bugho, Rommel; Calina, Liezel; Dy, Rochelle; Gosney, James

    2017-06-09

    Physiatrists have provided humanitarian assistance in recent large-scale global natural disasters. Super Typhoon Haiyan, the deadliest and most costly typhoon in modern Philippine history, made landfall on 8 November 2013 resulting in significant humanitarian needs. Philippine Academy of Rehabilitation Medicine physiatrists conducted a project of 23 emergency basic relief and medical aid missions in response to Super Typhoon Haiyan from November 2013 to February 2014. The final mission was a medical aid mission to the inland rural community of Burauen, Leyte. Summary data were collected, collated, and tabulated; project and mission evaluation was performed. During the humanitarian assistance project, 31,254 basic relief kits containing a variety of food and non-food items were distributed and medical services including consultation, treatment, and medicines were provided to 7255 patients. Of the 344 conditions evaluated in the medical aid mission to Burauen, Leyte 85 (59%) were physical and rehabilitation medicine conditions comprised of musculoskeletal (62 [73%]), neurological (17 [20%]), and dermatological (6 [7%]) diagnoses. Post-mission and project analysis resulted in recommendations and programmatic changes to strengthen response in future disasters. Physiatrists functioned as medical providers, mission team leaders, community advocates, and in other roles. This physiatrist-led humanitarian assistance project met critical basic relief and medical aid needs of persons impacted by Super Typhoon Haiyan, demonstrating significant roles performed by physiatrists in response to a large-scale natural disaster. Resulting disaster programing changes and recommendations may inform a more effective response by PARM mission teams in the Philippines as well as by other South-Eastern Asia teams comprising rehabilitation professionals to large-scale, regional natural disasters. 
Implications for rehabilitation: Large-scale natural disasters including tropical cyclones can have a catastrophic impact on the affected population. In response to Super Typhoon Haiyan, physiatrists representing the Philippine Academy of Rehabilitation Medicine conducted a project of 23 emergency basic relief and medical aid missions from November 2013 to February 2014. Project analysis indicates that medical mission teams responding in similar settings may expect to evaluate a significant number of physical medicine and rehabilitation conditions. Medical rehabilitation with participation by rehabilitation professionals including rehabilitation doctors is essential to the emergency medical response in large-scale natural disasters.

  19. Automated Scheduling of Science Activities for Titan Encounters by Cassini

    NASA Technical Reports Server (NTRS)

    Ray, Trina L.; Knight, Russel L.; Mohr, Dave

    2014-01-01

    In an effort to demonstrate the efficacy of automated planning and scheduling techniques for large missions, we have adapted ASPEN (Activity Scheduling and Planning Environment) [1] and CLASP (Compressed Large-scale Activity Scheduling and Planning) [2] to the domain of scheduling high-level science goals into conflict-free operations plans for Titan encounters by the Cassini spacecraft.
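    The core of turning science goals into a conflict-free plan can be illustrated with a classic greedy interval-scheduling pass: sort candidate activities by end time and accept each one that does not overlap an already accepted activity. This is a toy stand-in for what ASPEN/CLASP do (which also handle priorities, resources, and constraints); the activity names and time windows are hypothetical.

```python
def schedule_conflict_free(requests):
    """Greedy interval scheduling: accept, in order of earliest end
    time, every activity that does not overlap one already accepted.
    Each request is (name, start, end); returns the plan sorted by start."""
    plan = []
    for name, start, end in sorted(requests, key=lambda r: r[2]):
        if all(end <= s or start >= e for _, s, e in plan):  # no overlap
            plan.append((name, start, end))
    return sorted(plan, key=lambda r: r[1])

# Hypothetical Titan-encounter observation windows, in minutes relative
# to closest approach.
requests = [("radar", -30, 20), ("imaging", -60, -20), ("spectra", 10, 40)]
plan = schedule_conflict_free(requests)
# -> accepts "imaging" and "spectra"; "radar" conflicts with both.
```

Greedy earliest-end-time selection is provably optimal for maximizing the count of non-overlapping activities, though real mission planners optimize weighted science value rather than raw counts.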

  20. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS?s Scientific Manuscript database

    A large, multi-institutional, international, research project with the goal of bringing genomicists and plant breeders together was funded by USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  1. PREFACE: 11th European Conference on Applied Superconductivity (EUCAS2013)

    NASA Astrophysics Data System (ADS)

    Farinon, Stefania; Pallecchi, Ilaria; Malagoli, Andrea; Lamura, Gianrico

    2014-05-01

    During the 11th edition of the European Conference on Applied Superconductivity, successfully held in Genoa from 15-19 September 2013, more than one thousand participants from over 40 countries were registered and contributions of 7 plenary lectures, 23 invited talks, 203 oral talks and 550 posters were presented. The present issue of Journal of Physics: Conference Series (JPCS) collects the 218 submitted papers that were peer reviewed and accepted in the Conference Proceedings. Similarly to the Superconductor Science and Technology special issue "EUCAS 11th European Conference on Applied Superconductivity", which contains some plenary and invited contributions, as well as some selected contributions, in this issue the papers are sorted according to the four traditional topics of interest of EUCAS, namely Materials (56 papers), Wires and Tapes (47 papers), Large Scale Applications (64 papers) and Electronics (51 papers). While the Superconductor Science and Technology special issue focuses on the scientific and technological highlights of the conference, this collection provides an overall view of the worldwide research activity on applied superconductivity, mirroring the main guidelines and the hottest issues, which range from basic studies on newly discovered superconducting compounds to the state-of-the-art advances in large scale applications, wires and tapes fabrication and electronics. We would like to point out that, among the JPCS contributions, six papers present works financed by ongoing EU-Japan projects, three papers belong to the session on junctions and SQUIDs dedicated to the memory of Antonio Barone and one paper belongs to the session on pinning and flux dynamics dedicated to the memory of John Clem. Finally, we would like to thank all the people whose careful work contributed to the preparation of this JPCS issue, in particular the session chairs as well as the peer reviewers.
The Editors Stefania Farinon (Editor in Chief, Large Scale), Ilaria Pallecchi (Materials), Andrea Malagoli (Wires and Tapes), and Gianrico Lamura (Electronics)

  2. The Success of the Horse-Chestnut Leaf-Miner, Cameraria ohridella, in the UK Revealed with Hypothesis-Led Citizen Science

    PubMed Central

    Pocock, Michael J. O.; Evans, Darren M.

    2014-01-01

    Citizen science is an increasingly popular way of undertaking research and simultaneously engaging people with science. However, most emphasis of citizen science in environmental science is on long-term monitoring. Here, we demonstrate the opportunities provided by short-term hypothesis-led citizen science. In 2010, we ran the ‘Conker Tree Science’ project, in which over 3500 people in Great Britain provided data at a national scale of an insect (horse-chestnut leaf-mining moth, Cameraria ohridella) undergoing rapid range-expansion. We addressed two hypotheses, and found that (1) the levels of damage caused to leaves of the horse-chestnut tree, Aesculus hippocastanum, and (2) the level of attack by parasitoids of C. ohridella larvae were both greatest where C. ohridella had been present the longest. Specifically there was a rapid rise in leaf damage during the first three years that C. ohridella was present and only a slight rise thereafter, while estimated rates of parasitism (an index of true rates of parasitism) increased from 1.6 to 5.9% when the time C. ohridella had been present in a location increased from 3 to 6 years. We suggest that this increase is due to recruitment of native generalist parasitoids, rather than the adaptation or host-tracking of more specialized parasitoids, as appears to have occurred elsewhere in Europe. Most data collected by participants were accurate, but the counts of parasitoids from participants showed lower concordance with the counts from experts. We statistically modeled this bias and propagated this through our analyses. Bias-corrected estimates of parasitism were lower than those from the raw data, but the trends were similar in magnitude and significance. 
With appropriate checks for data quality, and statistically correcting for biases where necessary, hypothesis-led citizen science is a potentially powerful tool for carrying out scientific research across large spatial scales while simultaneously engaging many people with science. PMID:24465973
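    The bias-correction step this abstract describes, calibrating volunteer counts against expert counts and propagating the correction into the estimate, can be sketched with a simple ratio adjustment. The study used a fuller statistical model; the counts below are hypothetical and chosen only to show the mechanics.

```python
def bias_corrected_rate(part_counts, part_totals, calib_part, calib_expert):
    """Estimate a parasitism rate from volunteer parasitoid counts,
    corrected by the volunteer-vs-expert ratio on a calibration subset
    that both groups scored. A simple ratio correction for illustration."""
    # Detection-bias factor: < 1 when volunteers over-count relative to experts.
    factor = sum(calib_expert) / sum(calib_part)
    corrected = [c * factor for c in part_counts]
    return sum(corrected) / sum(part_totals)

# Hypothetical data: volunteers reported 30 parasitoids among 600 larvae;
# on a calibration subset, experts confirmed 8 of 12 volunteer detections.
rate = bias_corrected_rate([30], [600], [12], [8])
# 30 * (8/12) = 20 corrected parasitoids -> 20/600, about 3.3% parasitism.
```

A full analysis would also propagate the uncertainty in the correction factor (e.g. by bootstrapping the calibration subset) rather than treating it as a fixed ratio.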

  3. SCALING-UP INFORMATION IN LAND-COVER DATA FOR LARGE-SCALE ENVIRONMENTAL ASSESSMENTS

    EPA Science Inventory

    The NLCD project provides national-scope land-cover data for the conterminous United States. The first land-cover data set was completed in 2000, and the continuing need for recent land-cover information has motivated continuation of the project to provide current and change info...

  4. Increasing Geoscience Literacy and Public Support for the Earthscope National Science Initiative Through Informal Education

    NASA Astrophysics Data System (ADS)

    Aubele, J. C.

    2005-12-01

    Geology and geophysics are frequently perceived by the student, teacher, or adult non-geologist as "difficult to understand"; however, most non-geologists of all ages appreciate geological landforms such as mountains, volcanoes and canyons, and are interested in phenomena such as earthquakes and natural resources. Most people are also interested in local connections and newsworthy programs and projects. Therefore, the EarthScope Project is a perfect opportunity to excite and educate the public about solid-Earth geoscience research and to increase the non-geologist's understanding of Earth's dynamic processes. As the EarthScope Project sweeps across the country, the general public must be made aware of the magnitude, scope, excitement, and achievements of this national initiative. However, EarthScope science is difficult for the non-scientist to understand. The project is large-scale and long-term, and its data sets consist of maps, structural graphics, 3D and 4D visualizations, and the integration of many different geophysical instruments, all elements that are hard to convey to a general audience. Targeted programs for students, teachers, and visitors to the National Parks will disseminate EarthScope information; in addition, museums and other informal science education centers can also play an important role in translating scientific research for the general public. Research on learning in museums has shown that museums educate an audience that is self-selected and self-directed (non-captive), that includes family groups, multigenerational parties, and repeat visitors, and that requires presentation of information for a variety of learning styles.
Informal science centers have the following advantages in geoscience-related education: (1) graphics/display expertise; (2) flexibility in approach and programming; (3) ability to quickly produce exhibits, educational programming, and curricula themed to specific topics of interest; (4) inclusion of K-12 teachers in the development of educational programs and materials for students, pre-service and in-service teachers; (5) family learning opportunities; (6) community-wide audience ranging from pre-K through senior citizen; (7) accessible, visitor-friendly and non-threatening resource site for science information for the community. Museums and other science centers provide concise, factual, reliable and entertaining presentations of the relevant information. It is not enough simply to report on the scientific research; museums educate through object-based and inquiry-based learning and experiential programming.

  5. The Role of Reading Comprehension in Large-Scale Subject-Matter Assessments

    ERIC Educational Resources Information Center

    Zhang, Ting

    2013-01-01

    This study was designed with the overall goal of understanding how difficulties in reading comprehension are associated with early adolescents' performance in large-scale assessments in subject domains including science and civic-related social studies. The current study extended previous research by taking a cognition-centered approach based on…

  6. Mining the Galaxy Zoo Database: Machine Learning Applications

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Wallin, J.; Vedachalam, A.; Baehr, S.; Lintott, C.; Darg, D.; Smith, A.; Fortson, L.

    2010-01-01

    The new Zooniverse initiative is addressing the data flood in the sciences through a transformative partnership between professional scientists, volunteer citizen scientists, and machines. As part of this project, we are exploring the application of machine learning techniques to data mining problems associated with the large and growing database of volunteer science results gathered by the Galaxy Zoo citizen science project. We will describe the basic challenge, some machine learning approaches, and early results. One of the motivators for this study is the acquisition (through the Galaxy Zoo results database) of approximately 100 million classification labels for roughly one million galaxies, yielding a tremendously large and rich set of training examples for improving automated galaxy morphological classification algorithms. In our first case study, the goal is to learn which morphological and photometric features in the Sloan Digital Sky Survey (SDSS) database correlate most strongly with user-selected galaxy morphological class. As a corollary to this study, we are also aiming to identify which galaxy parameters in the SDSS database correspond to galaxies that have been the most difficult to classify (based upon large dispersion in their volunteer-provided classifications). Our second case study will focus on similar data mining analyses and machine learning algorithms applied to the Galaxy Zoo catalog of merging and interacting galaxies. The outcomes of this project will have applications in future large sky surveys, such as the LSST (Large Synoptic Survey Telescope) project, which will generate a catalog of 20 billion galaxies and will produce an additional astronomical alert database of approximately 100 thousand events each night for 10 years; the capabilities and algorithms that we are exploring will assist in the rapid characterization and classification of such massive data streams. This research has been supported in part through NSF award #0941610.
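One simple proxy for the "large dispersion in volunteer-provided classifications" mentioned above is the Shannon entropy of each galaxy's vote distribution: high entropy flags galaxies the crowd disagrees on. The sketch below is purely illustrative; the class names and vote counts are invented, and the actual Galaxy Zoo pipeline applies much more elaborate weighting and debiasing.

```python
# Illustrative sketch: rank galaxies by how contested their volunteer
# classifications are, using Shannon entropy of the vote distribution.
import math

def vote_entropy(votes):
    """Shannon entropy (bits) of a dict mapping class label -> vote count."""
    total = sum(votes.values())
    h = 0.0
    for n in votes.values():
        if n > 0:
            p = n / total
            h -= p * math.log2(p)
    return h

# Hypothetical vote tallies for two galaxies.
galaxies = {
    "easy_spiral": {"spiral": 95, "elliptical": 3, "merger": 2},
    "ambiguous":   {"spiral": 40, "elliptical": 35, "merger": 25},
}

# High-entropy galaxies are the "difficult to classify" candidates whose
# SDSS parameters one might then examine, as described in the abstract.
hard = {name: v for name, v in galaxies.items() if vote_entropy(v) > 1.0}
```

A near-unanimous galaxy scores close to 0 bits, while an even three-way split would score log2(3) ≈ 1.58 bits, so a threshold around 1 bit separates the two hypothetical galaxies here.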

  7. Successful contracting of prevention services: fighting malnutrition in Senegal and Madagascar.

    PubMed

    Marek, T; Diallo, I; Ndiaye, B; Rakotosalama, J

    1999-12-01

    There are very few documented large-scale successes in nutrition in Africa, and virtually no consideration of contracting for preventive services. This paper describes two successful large-scale community nutrition projects in Africa as examples of what can be done in prevention using the contracting approach in rural as well as urban areas. The two case-studies are the Secaline project in Madagascar, and the Community Nutrition Project in Senegal. The article explains what is meant by 'success' in the context of these two projects, how these results were achieved, and how certain bottlenecks were avoided. Both projects are very similar in the type of service they provide, and in combining private administration with public finance. The article illustrates that contracting out is a feasible option to be seriously considered for organizing certain prevention programmes on a large scale. There are strong indications from these projects of success in terms of reducing malnutrition, replicability and scale, and community involvement. When choosing that option, a government can tap available private local human resources through contracting out, rather than delivering those services by the public sector. However, as was done in both projects studied, consideration needs to be given to using a contract management unit for execution and monitoring, which costs 13-17% of the total project's budget. Rigorous assessments of the cost-effectiveness of contracted services are not available, but improved health outcomes, targeting of the poor, and basic cost data suggest that the programmes may well be relatively cost-effective. Although the contracting approach is not presented as the panacea to solve the malnutrition problem faced by Africa, it can certainly provide an alternative in many countries to increase coverage and quality of services.

  8. Networking for large-scale science: infrastructure, provisioning, transport and application mapping

    NASA Astrophysics Data System (ADS)

    Rao, Nageswara S.; Carter, Steven M.; Wu, Qishi; Wing, William R.; Zhu, Mengxia; Mezzacappa, Anthony; Veeraraghavan, Malathi; Blondin, John M.

    2005-01-01

    Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1 Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configurations and protocols that provide multiple Gbps flows from the Cray X1 to external hosts.
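Item (c), mapping a visualization pipeline onto a network to minimize end-to-end delay, can be framed in simplified form as a dynamic program: assign ordered pipeline stages to nodes along a path so that the sum of per-stage processing delays and inter-node transfer delays is minimal. The formulation and all costs below are illustrative assumptions, not the authors' actual scheme.

```python
# Illustrative sketch: dynamic-programming assignment of pipeline stages
# to network nodes, minimizing total processing + transfer delay.

def map_pipeline(compute, transfer):
    """
    compute[i][j]  = processing delay of pipeline stage i on node j.
    transfer[j][k] = delay of moving a stage's output from node j to node k.
    Stages are placed in non-decreasing node order along the path.
    Returns (minimum end-to-end delay, node index chosen for each stage).
    """
    m, n = len(compute), len(compute[0])
    INF = float("inf")
    dp = [[INF] * n for _ in range(m)]   # dp[i][j]: best delay with stage i on node j
    back = [[0] * n for _ in range(m)]
    for j in range(n):
        dp[0][j] = compute[0][j]
    for i in range(1, m):
        for j in range(n):
            for k in range(j + 1):       # previous stage sits at node k <= j
                cand = dp[i - 1][k] + transfer[k][j] + compute[i][j]
                if cand < dp[i][j]:
                    dp[i][j] = cand
                    back[i][j] = k
    end = min(range(n), key=lambda j: dp[m - 1][j])
    assign = [end]
    for i in range(m - 1, 0, -1):        # backtrack the chosen nodes
        assign.append(back[i][assign[-1]])
    assign.reverse()
    return dp[m - 1][end], assign

# Example: two stages over a two-node path; both stages land on the fast node.
compute = [[5, 1], [5, 1]]
transfer = [[0, 2], [2, 0]]
delay, nodes = map_pipeline(compute, transfer)
```

With these invented costs, paying the one-off transfer to the fast node is never needed since the first stage can start there, so the optimum places both stages on node 1 for a total delay of 2.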

  9. The Effect of Student-Centered Approaches on Students' Interest and Achievement in Science: Relevant Topic-Based, Open and Guided Inquiry-Based, and Discussion-Based Approaches

    NASA Astrophysics Data System (ADS)

    Kang, Jingoo; Keinonen, Tuula

    2017-04-01

    Because students have been losing interest in school science, several student-centered approaches, such as using topics that are relevant for students, inquiry-based learning, and discussion-based learning, have been implemented to attract pupils into science. However, the effect of these approaches was usually measured in small-scale research, and thus, the large-scale evidence supporting student-centered approaches in general use is insufficient. Accordingly, this study aimed to investigate the effect of student-centered approaches on students' interest and achievement by analyzing a large-scale data set derived from the Program for International Student Assessment (PISA) 2006, to add evidence for advocating these approaches in school science, and to generalize the effects on a large population. We used Finnish PISA 2006 data, which is the most recent data that measures science literacy and that contains relevant variables for the constructs of this study. As a consequence of the factor analyses, four teaching methods were grouped as student-centered approaches (relevant topic-based, open and guided inquiry-based, and discussion-based approaches in school science) from the Finnish PISA 2006 sample. The structural equation modeling result indicated that using topics relevant for students positively affected students' interest and achievement in science. Guided inquiry-based learning was also indicated as a strong positive predictor for students' achievement, and its effect was also positively associated with students' interest. On the other hand, open inquiry-based learning was indicated as a strong negative predictor for students' achievement, as was using discussion in school science. Implications and limitations of the study are discussed.

  10. Relationships between Induced Seismicity and Fluid Injection: Development of Strategies to Manage Injection

    NASA Astrophysics Data System (ADS)

    Eichhubl, Peter; Frohlich, Cliff; Gale, Julia; Olson, Jon; Fan, Zhiqiang; Gono, Valerie

    2014-05-01

    Induced seismicity during or following the subsurface injection of waste fluids such as well stimulation flow-back and production fluids has recently received heightened public and industry attention. It is understood that induced seismicity occurs by reactivation of existing faults that are generally present in the injection intervals. We seek to address the question of why fluid injection triggers earthquakes in some areas and not in others, with the aim of developing improved injection methods that optimize injection volume and cost while avoiding induced seismicity. A GIS database has been built of natural and induced earthquakes in four hydrocarbon-producing basins: the Fort Worth Basin, South Texas, East Texas/Louisiana, and the Williston Basin. These areas are associated with disposal from the Barnett, Eagle Ford, Bakken, and Haynesville Shales, respectively. In each region we analyzed data that were collected using temporary seismographs of the National Science Foundation's USArray Transportable Array. Injection well locations, formations, histories, and volumes are also mapped using public and licensed datasets. Faults are mapped at a range of scales for selected areas that show different levels of seismic activity, and scaling relationships are used to extrapolate between the seismic and wellbore scale. Reactivation potential of these faults is assessed using fault occurrence and in-situ stress conditions, identifying areas of high and low fault reactivation potential. A correlation analysis between fault reactivation potential, induced seismicity, and fluid injection will use spatial statistics to quantify the probability of seismic fault reactivation for a given injection pressure in the studied reservoirs. The limiting conditions inducing fault reactivation will be compared to actual injection parameters (volume, rate, injection duration and frequency) where available. 
The objective of this project is a statistical reservoir- to basin-scale assessment of fault reactivation and seismicity induced by fluid injection. By assessing the occurrence of earthquakes (M>2) evenly across large geographic regions, this project differs from previous studies of injection-induced seismicity that focused on earthquakes large enough to cause public concern in well-populated areas. The understanding of triggered seismicity gained through this project is expected to allow for improved design strategies for waste fluid injection to industry and public decision makers.

  11. Different effect of NiMnCo or FeNiCo on the growth of type-IIa large diamonds with Ti/Cu as nitrogen getter

    NASA Astrophysics Data System (ADS)

    Li, Shang-Sheng; Zhang, He; Su, Tai-Chao; Hu, Qiang; Hu, Mei-Hua; Gong, Chun-Sheng; Ma, Hon-An; Jia, Xiao-Peng; Li, Yong; Xiao, Hong-Yu

    2017-06-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant No. 11604246), the China Postdoctoral Science Foundation (Grant No. 2016M592714), the Professional Practice Demonstration Base for Professional Degree Graduate in Material Engineering of Henan Polytechnic University, China (Grant No. 2016YJD03), the Funds from the Education Department of Henan Province, China (Grant Nos. 12A430010 and 17A430020), and the Project for Key Science and Technology Research of Henan Province, China (Grant No. 162102210275).

  12. Information on a Major New Initiative: Mapping and Sequencing the Human Genome (1986 DOE Memorandum)

    DOE R&D Accomplishments Database

    DeLisi, Charles (Associate Director, Health and Environmental Research, DOE Office of Energy Research)

    1986-05-06

    In the history of the Human Genome Program, Dr. Charles DeLisi and Dr. Alvin Trivelpiece of the Department of Energy (DOE) were instrumental in moving the seeds of the program forward. This May 1986 memo from DeLisi to Trivelpiece, Director of DOE's Office of Energy Research, documents this fact. Following the March 1986 Santa Fe workshop on the subject of mapping and sequencing the human genome, DeLisi's memo outlines workshop conclusions, explains the relevance of this project to DOE and the importance of the Department's laboratories and capabilities, notes the critical experience of DOE in managing projects of this scale and potential magnitude, and recognizes the fact that the project will impact biomedical science in ways which could not be fully anticipated at the time. Subsequently, program guidance was further sought from the DOE Health Effects Research Advisory Committee (HERAC) and the April 1987 HERAC report recommended that DOE and the nation commit to a large, multidisciplinary, scientific and technological undertaking to map and sequence the human genome.

  13. Basin-Scale Hydrologic Impacts of CO2 Storage: Regulatory and Capacity Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Zhou, Q.

    Industrial-scale injection of CO{sub 2} into saline sedimentary basins will cause large-scale fluid pressurization and migration of native brines, which may affect valuable groundwater resources overlying the deep sequestration reservoirs. In this paper, we discuss how such basin-scale hydrologic impacts can (1) affect regulation of CO{sub 2} storage projects and (2) reduce current storage capacity estimates. Our assessment arises from a hypothetical future carbon sequestration scenario in the Illinois Basin, which involves twenty individual CO{sub 2} storage projects in a core injection area suitable for long-term storage. Each project is assumed to inject five million tonnes of CO{sub 2} per year for 50 years. A regional-scale three-dimensional simulation model was developed for the Illinois Basin that captures both the local-scale CO{sub 2}-brine flow processes and the large-scale groundwater flow patterns in response to CO{sub 2} storage. The far-field pressure buildup predicted for this selected sequestration scenario suggests that (1) the area that needs to be characterized in a permitting process may comprise a very large region within the basin if reservoir pressurization is considered, and (2) permits cannot be granted on a single-site basis alone because the near- and far-field hydrologic response may be affected by interference between individual sites. Our results also support recent studies in that environmental concerns related to near-field and far-field pressure buildup may be a limiting factor on CO{sub 2} storage capacity. In other words, estimates of storage capacity, if solely based on the effective pore volume available for safe trapping of CO{sub 2}, may have to be revised based on assessments of pressure perturbations and their potential impact on caprock integrity and groundwater resources, respectively. 
We finally discuss some of the challenges in making reliable predictions of large-scale hydrologic impacts related to CO{sub 2} sequestration projects.

  14. ‘Sciencenet’—towards a global search and share engine for all scientific knowledge

    PubMed Central

    Lütjohann, Dominic S.; Shah, Asmi H.; Christen, Michael P.; Richter, Florian; Knese, Karsten; Liebel, Urban

    2011-01-01

    Summary: Modern biological experiments create vast amounts of data which are geographically distributed. These datasets consist of petabytes of raw data and billions of documents. Yet to the best of our knowledge, a search engine technology that searches and cross-links all different data types in life sciences does not exist. We have developed a prototype distributed scientific search engine technology, ‘Sciencenet’, which facilitates rapid searching over this large data space. By ‘bringing the search engine to the data’, we do not require server farms. This platform also allows users to contribute to the search index and publish their large-scale data to support e-Science. Furthermore, a community-driven method guarantees that only scientific content is crawled and presented. Our peer-to-peer approach is sufficiently scalable for the science web without performance or capacity tradeoff. Availability and Implementation: The free to use search portal web page and the downloadable client are accessible at: http://sciencenet.kit.edu. The web portal for index administration is implemented in ASP.NET, the ‘AskMe’ experiment publisher is written in Python 2.7, and the backend ‘YaCy’ search engine is based on Java 1.6. Contact: urban.liebel@kit.edu Supplementary Material: Detailed instructions and descriptions can be found on the project homepage: http://sciencenet.kit.edu. PMID:21493657

  15. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    NASA Astrophysics Data System (ADS)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted many times. Among the controlling factors, gravitational acceleration (g) on the scale models was treated as a constant (Earth's gravity) in most analogue model studies, and only a few considered larger gravitational accelerations by using a centrifuge (an apparatus that generates large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down factors and accelerated deformation driven by density differences, such as salt diapirism, the possible model size is typically limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows scale models with a surface area of up to 70 by 70 cm under the maximum capacity of 240 g-tons. Using this centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of a back-arc basin. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
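The trade-off behind centrifuge modelling follows from the standard scaling relation: spinning a 1/N-scale model at N times Earth's gravity keeps stresses (rho·g·h) equal to the prototype's, so a model length of L at N g represents a prototype length of N·L. In the sketch below, only the 240 g-ton capacity and the 70 cm model dimension come from the abstract; the 2-ton payload mass is an assumed figure for illustration.

```python
# Illustrative sketch of centrifuge scaling arithmetic (1:N length scale
# at N g for stress similarity). Payload mass is a hypothetical value.

def max_g_level(model_mass_tons, capacity_g_tons=240.0):
    """Largest g-level at which the machine can spin this payload."""
    return capacity_g_tons / model_mass_tons

def prototype_length(model_length_m, g_level):
    """Prototype length represented by a model length run at N g."""
    return model_length_m * g_level

n = max_g_level(2.0)             # a hypothetical 2-ton model box -> up to 120 g
span = prototype_length(0.7, n)  # the 0.7 m box then represents ~84 m
```

Heavier payloads thus force lower g-levels and smaller prototype scales, which is why the 240 g-ton capacity, rather than model area alone, bounds what can be simulated.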

  16. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  17. Telecommunications technology and rural education in the United States

    NASA Technical Reports Server (NTRS)

    Perrine, J. R.

    1975-01-01

    The rural sector of the US is examined from the point of view of whether telecommunications technology can augment the development of rural education. Migratory farm workers and American Indians were the target groups which were examined as examples of groups with special needs in rural areas. The general rural population and the target groups were examined to identify problems and to ascertain specific educational needs. Educational projects utilizing telecommunications technology in target group settings were discussed. Large scale regional ATS-6 satellite-based experimental educational telecommunications projects were described. Costs and organizational factors were also examined for large scale rural telecommunications projects.

  18. Large-scale magnetic fields at high Reynolds numbers in magnetohydrodynamic simulations.

    PubMed

    Hotta, H; Rempel, M; Yokoyama, T

    2016-03-25

    The 11-year solar magnetic cycle shows a high degree of coherence in spite of the turbulent nature of the solar convection zone. It has been found in recent high-resolution magnetohydrodynamics simulations that the maintenance of a large-scale coherent magnetic field is difficult with small viscosity and magnetic diffusivity (≲10^12 square centimeters per second). We reproduced previous findings that indicate a reduction of the energy in the large-scale magnetic field for lower diffusivities and demonstrate the recovery of the global-scale magnetic field using unprecedentedly high resolution. We found an efficient small-scale dynamo that suppresses small-scale flows, which mimics the properties of large diffusivity. As a result, the global-scale magnetic field is maintained even in the regime of small diffusivities, that is, large Reynolds numbers. Copyright © 2016, American Association for the Advancement of Science.

  19. From darwin to the census of marine life: marine biology as big science.

    PubMed

    Vermeulen, Niki

    2013-01-01

    With the development of the Human Genome Project, a heated debate emerged on biology becoming 'big science'. However, biology already has a long tradition of collaboration, as natural historians were part of the first collective scientific efforts: exploring the variety of life on earth. Such mappings of life still continue today, and while field biology is gradually becoming an important subject of studies into big science, research into life in the world's oceans has not yet been taken into account. This paper therefore explores marine biology as big science, presenting the historical development of marine research towards the international 'Census of Marine Life' (CoML), which makes an inventory of life in the world's oceans. Discussing various aspects of collaboration (including size, internationalisation, research practice, technological developments, application, and public communication), I will ask whether CoML still resembles traditional collaborations to collect life. While showing both continuity and change, I will argue that marine biology is a form of natural history: a specific way of working together in biology that has transformed substantially in interaction with recent developments in the life sciences and society. As a result, the paper not only gives an overview of transformations towards large-scale research in marine biology, but also sheds new light on big biology, suggesting new ways to deepen the understanding of collaboration in the life sciences by distinguishing between different 'collective ways of knowing'.

  20. The emerging science of precision medicine and pharmacogenomics for Parkinson's disease.

    PubMed

    Payami, Haydeh

    2017-08-01

    Current therapies for Parkinson's disease are problematic because they are symptomatic and have adverse effects. New drugs have failed in clinical trials because of inadequate efficacy. At the core of the problem is trying to make one drug work for all Parkinson's disease patients, when we know this premise is wrong because (1) Parkinson's disease is not a single disease, and (2) no two individuals have the same biological makeup. Precision medicine is the goal to strive for, but we are only at the beginning stages of building the infrastructure for one of the most complex projects in the history of science, and it will be a long time before Parkinson's disease reaps the benefits. Pharmacogenomics, a cornerstone of precision medicine, has already proven successful for many conditions and could also propel drug discovery and improve treatment for Parkinson's disease. To make progress in the pharmacogenomics of Parkinson's disease, we need to change course from small inconclusive candidate gene studies to large-scale rigorously planned genome-wide studies that capture the nuclear genome and the microbiome. Pharmacogenomic studies must use homogenous subtypes of Parkinson's disease or apply the brute force of statistical power to overcome heterogeneity, which will require large sample sizes achievable only via internet-based methods and electronic databases. Large-scale pharmacogenomic studies, together with biomarker discovery efforts, will yield the knowledge necessary to design clinical trials with precision to alleviate confounding by disease heterogeneity and interindividual variability in drug response, two of the major impediments to successful drug discovery and effective treatment. © 2017 International Parkinson and Movement Disorder Society.

  1. A Profile of California's Secondary Teachers

    ERIC Educational Resources Information Center

    Tripp, Paula J.

    2006-01-01

    The critical and urgent need for large numbers of secondary family and consumer sciences (FCS) teachers has been thoroughly documented (American Association of Family & Consumer Sciences, 1999; Miller & Meszaros, 1996; Rehm & Jackman, 1995) over the past decade. Projected shortages have become a reality because the demand for…

  2. Focal Plant Observations as a Standardised Method for Pollinator Monitoring: Opportunities and Limitations for Mass Participation Citizen Science

    PubMed Central

    Roy, Helen E.; Baxter, Elizabeth; Saunders, Aoine; Pocock, Michael J. O.

    2016-01-01

    Background Recently there has been increasing focus on monitoring pollinating insects, due to concerns about their declines, and interest in the role of volunteers in monitoring pollinators, particularly bumblebees, via citizen science. Methodology / Principal Findings The Big Bumblebee Discovery was a one-year citizen science project run by a partnership of EDF Energy, the British Science Association and the Centre for Ecology & Hydrology, which sought to assess the influence of the landscape at multiple scales on the diversity and abundance of bumblebees. Timed counts of bumblebees (Bombus spp.; identified to six colour groups) visiting focal plants of lavender (Lavandula spp.) were carried out by about 13 000 primary school children (7–11 years old) from over 4000 schools across the UK. In total, 3948 reports were received, totalling 26 868 bumblebees. We found that while the wider landscape type had no significant effect on reported bumblebee abundance, the local proximity to flowers had a significant effect (fewer bumblebees where other flowers were reported to be >5m away from the focal plant). However, the rate of mis-identification, revealed by photographs uploaded by participants and a photo-based quiz, was high. Conclusions / Significance Our citizen science results support recent research on the importance of local floral resources on pollinator abundance. Timed counts of insects visiting a lure plant are potentially an effective approach for standardised pollinator monitoring, engaging a large number of participants with a simple protocol. However, the relatively high rate of mis-identifications (compared to reports from previous pollinator citizen science projects) highlights the importance of investing in resources to train volunteers. 
Also, to be a scientifically valid method of enquiry, citizen science data need to be of sufficiently high quality, so collecting supporting evidence (such as photographs) would allow data quality to be tested and records to be verified. PMID:26985824
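The photo-based quiz described above yields estimates of how often each colour group is mis-identified, which could in principle be used to correct the reported counts via a confusion-matrix model. The sketch below is a hedged illustration only: it uses two colour groups instead of six, and every probability and count is invented.

```python
# Illustrative sketch: correct reported counts for mis-identification
# using a quiz-derived confusion matrix. All numbers are hypothetical.

def invert_2x2(m):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# conf[i][j] = P(volunteer reports group j | true group i), from the quiz.
conf = [[0.9, 0.1],
        [0.2, 0.8]]

# Reported counts per colour group across all surveys (hypothetical).
observed = [480, 320]

# Expected observed counts = conf^T @ true, so true = (conf^T)^-1 @ observed.
conf_t = [[conf[0][0], conf[1][0]],
          [conf[0][1], conf[1][1]]]
inv = invert_2x2(conf_t)
true_est = [inv[r][0] * observed[0] + inv[r][1] * observed[1]
            for r in range(2)]
```

Because each row of the confusion matrix sums to one, the correction redistributes counts between groups without changing the total, shifting them here from the over-reported group toward the under-reported one.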

  3. Coastal hazards in a changing world: projecting and communicating future coastal flood risk at the local-scale using the Coastal Storm Modeling System (CoSMoS)

    NASA Astrophysics Data System (ADS)

    O'Neill, Andrea; Barnard, Patrick; Erikson, Li; Foxgrover, Amy; Limber, Patrick; Vitousek, Sean; Fitzgibbon, Michael; Wood, Nathan

    2017-04-01

    The risk of coastal flooding will increase for many low-lying coastal regions as predominant contributions to flooding, including sea level, storm surge, wave setup, and storm-related fluvial discharge, are altered with climate change. Community leaders and local governments therefore look to science to provide insight into how climate change may affect their areas. Many studies of future coastal flooding vulnerability consider sea level and tides, but ignore other important factors that elevate flood levels during storm events, such as waves, surge, and discharge. Here we present a modelling approach that considers a broad range of relevant processes contributing to elevated storm water levels for open coast and embayment settings along the U.S. West Coast. Additionally, we present online tools for communicating community-relevant projected vulnerabilities. The Coastal Storm Modeling System (CoSMoS) is a numerical modeling system developed to predict coastal flooding due to both sea-level rise (SLR) and plausible 21st century storms for active-margin settings like the U.S. West Coast. CoSMoS applies a predominantly deterministic framework of multi-scale models encompassing large geographic scales (100s to 1000s of kilometers) to small-scale features (10s to 1000s of meters), resulting in flood extents that can be projected at a local resolution (2 meters). In the latest iteration of CoSMoS applied to Southern California, U.S., efforts were made to incorporate water level fluctuations in response to regional storm impacts, locally wind-generated waves, coastal river discharge, and decadal-scale shoreline and cliff changes. Coastal hazard projections are available in a user-friendly web-based tool (www.prbo.org/ocof), where users can view variations in flood extent, maximum flood depth, current speeds, and wave heights in response to a range of potential SLR and storm combinations, providing direct support to adaptation and management decisions. 
In order to capture the societal aspect of the hazard, projections are combined with socioeconomic exposure to produce clear, actionable information (https://www.usgs.gov/apps/hera/); this integrated approach to hazard displays provides an example of how to effectively translate complex climate impacts projections into simple, societally-relevant information.
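The full CoSMoS system resolves these contributions with coupled hydrodynamic models, but the way the individual components stack into a flood level can be illustrated with a simple additive sketch. All function names and numbers below are invented for demonstration, not CoSMoS values.

```python
# Illustrative additive total-water-level sketch; a real system models the
# nonlinear interaction of these terms rather than simply summing them.

def total_water_level(slr, tide, surge, wave_setup, discharge_effect=0.0):
    """Sum illustrative contributions to the storm water level (metres)."""
    return slr + tide + surge + wave_setup + discharge_effect

def is_flooded(twl, ground_elevation):
    """A location floods when the water level exceeds its ground elevation."""
    return twl > ground_elevation

twl = total_water_level(slr=1.0, tide=0.8, surge=0.4, wave_setup=0.3)
print(twl, is_flooded(twl, ground_elevation=2.2))  # 2.5 True
```

The sketch shows why studies that ignore surge, waves and discharge can understate flood extent: each omitted term lowers the computed water level.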

  4. Strain localisation in the continental lithosphere, a scale-dependent process

    NASA Astrophysics Data System (ADS)

    Jolivet, Laurent; Burov, Evguenii

    2013-04-01

    Strain localisation in continents is a general question tackled by specialists of various disciplines in Earth Sciences. Field geologists working at regional scale are able to describe the succession of events leading to the formation of large strain zones that accommodate large displacements within plate boundaries. At the other end of the spectrum, laboratory experiments provide numbers that quantitatively describe the rheology of rock material at the scale of a few millimetres and at deformation rates 8-10 orders of magnitude faster than in nature. Extrapolating from the scale of the experiment to the scale of the continental lithosphere is a considerable leap across 8-10 orders of magnitude in both space and time, and it is quite obvious that different processes are at work at each scale considered. At the scale of a grain aggregate, diffusion within individual grains, dislocation creep or grain boundary sliding, depending on temperature and fluid conditions, are of primary importance. But at the scale of a mountain belt, a major detachment or a strike-slip shear zone that has accommodated tens or hundreds of kilometres of relative displacement, other parameters take over, such as structural softening and the heterogeneity of the crust inherited from past tectonic events that have juxtaposed rock units of very different compositions and induced a strong orientation of rocks. Once deformation is localised along major shear zones, grain size reduction, interaction between rocks and fluids, metamorphic reactions and other small-scale processes tend to localise the strain further. Because the crust is colder and more lithologically complex, this heterogeneity is likely much more prominent in the crust than in the mantle, and so the relative importance of "small-scale" and "large-scale" parameters will be very different in the crust and in the mantle.
Thus, depending upon the relative thickness of the crust and mantle in the deforming lithosphere, each mechanism will have more or less important consequences for strain localisation. This complexity sometimes leads modellers to disregard experimental parameters in large-scale thermo-mechanical models and to use instead ad hoc "large-scale" numbers that better fit the observed geological history. The goal of the ERC RHEOLITH project is to associate with each tectonic process the relevant rheological parameters depending upon the scale considered, in an attempt to elaborate a generalized "Preliminary Rheology Model Set for Lithosphere" (PReMSL), which will cover the entire temporal and spatial scale range of deformation.
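The laboratory numbers mentioned above are typically expressed as power-law creep flow laws, which make the scale of the extrapolation concrete. A minimal sketch, with placeholder material constants rather than values for any real rock:

```python
import math

# Generic power-law (dislocation) creep flow law of the kind calibrated in
# laboratory experiments: strain_rate = A * stress^n * exp(-Q / (R * T)).
# The constants A, n and Q below are placeholders, not measured values.
R = 8.314  # gas constant, J/(mol*K)

def strain_rate(stress_mpa, temp_k, A=1e-5, n=3.0, Q=3.5e5):
    """Steady-state strain rate (1/s) for a given stress (MPa) and temperature (K)."""
    return A * stress_mpa**n * math.exp(-Q / (R * temp_k))

# Hot, high-stress laboratory conditions versus cooler, lower-stress nature:
lab = strain_rate(stress_mpa=100.0, temp_k=1200.0)
nature = strain_rate(stress_mpa=10.0, temp_k=800.0)
print(lab / nature)  # many orders of magnitude apart
```

The stress exponent and the Arrhenius term together are what make the 8-10 orders-of-magnitude gap between laboratory and natural strain rates unavoidable.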

  5. Applied aerodynamics experience for secondary science teachers and students

    NASA Technical Reports Server (NTRS)

    Abbitt, John D., III; Carroll, Bruce F.

    1992-01-01

    The Department of Aerospace Engineering, Mechanics & Engineering Science at the University of Florida in conjunction with the Alachua County, Florida School Board has embarked on a four-year project of university-secondary school collaboration designed to enhance mathematics and science instruction in secondary school classrooms. The goals are to provide teachers with a fundamental knowledge of flight sciences, and to stimulate interest among students, particularly women and minorities, toward careers in engineering, mathematics, and science. In the first year of the project, all thirteen of the eighth grade physical science teachers and all 1200 of the eighth grade physical science students in the county participated. The activities consisted of a three-day seminar taught at the college level for the teachers, several weeks of classroom instruction for all the students, and an airport field trip for a subgroup of about 430 students that included an orientation flight in a Cessna 172 aircraft. The project brought together large numbers of middle school students, teachers, undergraduate and graduate engineering students, school board administrators, and university engineering faculty.

  6. UK Environmental Prediction - integration and evaluation at the convective scale

    NASA Astrophysics Data System (ADS)

    Lewis, Huw; Brunet, Gilbert; Harris, Chris; Best, Martin; Saulter, Andrew; Holt, Jason; Bricheno, Lucy; Brerton, Ashley; Reynard, Nick; Blyth, Eleanor; Martinez de la Torre, Alberto

    2015-04-01

    It has long been understood that accurate prediction and warning of the impacts of severe weather requires an integrated approach to forecasting. This was well demonstrated in the UK throughout winter 2013/14, when an exceptional run of severe winter storms, often with damaging high winds and intense rainfall, led to significant damage from large waves and storm surge along coastlines, and from saturated soils, high river flows and significant flooding inland. The substantial impacts on individuals, businesses and infrastructure indicate a pressing need to better understand the value that might be delivered through more integrated environmental prediction. To address this need, the Met Office, Centre for Ecology & Hydrology and National Oceanography Centre have begun to develop the foundations of a coupled, high-resolution, probabilistic forecast system for the UK at km-scale. This links together existing model components of the atmosphere, coastal ocean, land surface and hydrology. Our initial focus, a 2-year Prototype project, will demonstrate the UK coupled prediction concept in research mode, including an analysis of the winter 2013/14 storms and their impacts. By linking science development to operational collaborations such as the UK Natural Hazards Partnership, we can ensure that science priorities are rooted in user requirements. This presentation will provide an overview of UK environmental prediction activities and an update on progress during the first year of the Prototype project. We will present initial results from the coupled model development and discuss the challenges of realising the potential of integrated regional coupled forecasting for improving predictions and applications.

  7. Three-dimensional presentation of the earth and space science data in collaboration among schools, science museums and scientists

    NASA Astrophysics Data System (ADS)

    Saito, Akinori; Tsugawa, Takuya

    Three-dimensional presentation is an excellent tool for displaying earth and space science data. It can show correct shapes on the Earth, whereas any two-dimensional map distorts shapes, and it helps audiences understand the scale of phenomena on the earth and planets in an intuitive way. There are several projects for 3-D presentation of the Earth, such as Science on a Sphere (SOS) by NOAA and Geo-cosmos by Miraikan, Japan. We are developing a simple, portable and affordable 3-D presentation system called Dagik Earth. It uses a spherical or hemispherical screen onto which data and images are projected using an ordinary PC and projector. The minimum size is 8 cm and the largest is 8 m in diameter. The Dagik Earth project has developed the 3-D projection software in collaboration with scientists, and provides the software to science museums and school teachers. Because the same system can be used in museums and schools, several science museums play the role of hubs for training school teachers in earth and planetary science classes with Dagik Earth. International collaboration with Taiwan, Thailand, and other countries is in progress. In this presentation, we introduce the Dagik Earth system and the activities using it in collaboration among schools, science centers, universities and research institutes.

  8. The ELAA 2 Citizen Science Project: The Case for Science, Equity, and Critical Thinking in Adult English Language Instruction

    NASA Astrophysics Data System (ADS)

    Basham, M.

    2012-08-01

    This article summarizes a paper presented at the ASP conference Connecting People to Science, held in Baltimore in 2011. This action research study, currently in progress, aims to explore the impact of integrating science into English language instruction (English Language Acquisition for Adults, or ELAA) serving largely Hispanic immigrants at an adult learning center based in Phoenix, Arizona.

  9. Can Observation Skills of Citizen Scientists Be Estimated Using Species Accumulation Curves?

    PubMed Central

    Kelling, Steve; Johnston, Alison; Hochachka, Wesley M.; Iliff, Marshall; Fink, Daniel; Gerbracht, Jeff; Lagoze, Carl; La Sorte, Frank A.; Moore, Travis; Wiggins, Andrea; Wong, Weng-Keen; Wood, Chris; Yu, Jun

    2015-01-01

    Volunteers are increasingly being recruited into citizen science projects to collect observations for scientific studies. An additional goal of these projects is to engage and educate these volunteers, so there are few barriers to participation, resulting in volunteer observers with varying ability to complete the project's tasks. To improve the quality of a citizen science project's outcomes it would be useful to account for inter-observer variation, and to assess the rarely tested presumption that participating in a citizen science project results in volunteers becoming better observers. Here we present a method for indexing observer variability based on the data routinely submitted by observers participating in the citizen science project eBird, a broad-scale monitoring project in which observers collect and submit lists of the bird species observed while birding. Our method for indexing observer variability uses species accumulation curves, lines that describe how the total number of species reported increases with increasing time spent collecting observations. We find that species accumulation curves differ among observers, with more skilled observers accumulating species at higher rates, particularly for harder-to-identify species, and that accumulation rates increase with continued participation. We suggest that these properties of our analysis provide a measure of observer skill, and that the potential to derive post-hoc, data-derived measurements of participant ability should be more widely explored by analysts of data from citizen science projects. We see the potential for inferential results from analyses of citizen science data to be improved by accounting for observer skill. PMID:26451728
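The species accumulation curves at the heart of the method can be sketched in a few lines: from each observer's sequence of checklists, track how the cumulative number of distinct species grows. The checklist data below are invented for illustration; the eBird analysis itself models accumulation rates statistically rather than comparing raw curves.

```python
# Minimal species-accumulation sketch: the cumulative number of distinct
# species reported after each checklist. A steeper curve suggests a more
# skilled (or luckier) observer; data here are invented.

def accumulation_curve(checklists):
    """Cumulative count of distinct species after each checklist (a set of names)."""
    seen, curve = set(), []
    for checklist in checklists:
        seen.update(checklist)
        curve.append(len(seen))
    return curve

novice = [{"robin"}, {"robin", "crow"}, {"crow"}]
expert = [{"robin", "crow", "wren"}, {"dunnock", "crow"}, {"siskin"}]
print(accumulation_curve(novice))  # [1, 2, 2]
print(accumulation_curve(expert))  # [3, 4, 5]
```

Comparing curves only makes sense once effort (time spent observing) is held equal, which is why the paper indexes curves against observation time rather than checklist count alone.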

  10. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  11. Stemming the Tide: Retaining and Supporting Science Teachers

    ERIC Educational Resources Information Center

    Pirkle, Sheila F.

    2011-01-01

    Chronically high rates of new and experienced science teacher attrition and the findings of new large-scale mentoring programs indicate that administrators should adopt new approaches. A science teacher's role encompasses demanding responsibilities, such as observing laboratory safety and OSHA mandates, as well as management of a business-like,…

  12. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). 
This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
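The pattern the report describes, splitting a processor-intensive calculation across workers, can be sketched with a standard process pool. This is a generic illustration, not the USGS implementation, and the per-tile workload below is artificial; a real urban-dynamics run would distribute spatial tiles of input data.

```python
# Toy distributed-processing sketch: a pool of worker processes shares a list
# of independent "tiles", the pattern behind Beowulf-style parallel runs.
from multiprocessing import Pool

def process_tile(tile):
    """Stand-in for a processor-intensive per-tile calculation."""
    return sum(i * i for i in range(tile))

if __name__ == "__main__":
    tiles = [10_000] * 8                      # eight independent work units
    with Pool(processes=4) as pool:           # four workers share the tiles
        results = pool.map(process_tile, tiles)
    print(len(results), results[0] == process_tile(10_000))
```

Because the tiles are independent, the wall-clock time scales down roughly with the number of workers, which is exactly the gain the fifty-day single-threaded calibration run motivates.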

  13. Emily Evans | NREL

    Science.gov Websites

    Emily Evans, Project Controller, Emily.Evans@nrel.gov | 303-275-3125. Emily joined NREL in 2010. As a Project Administrator in the Integrated Applications Center, Emily works with project managers and teams to develop and maintain project management excellence on large-scale, multi-year projects.

  14. Research to Real Life, 2006: Innovations in Deaf-Blindness

    ERIC Educational Resources Information Center

    Leslie, Gail, Ed.

    2006-01-01

    This publication presents several projects that support children who are deaf-blind. These projects are: (1) Learning To Learn; (2) Project SALUTE; (3) Project SPARKLE; (4) Bringing It All Back Home; (5) Project PRIIDE; and (6) Including Students With Deafblindness In Large Scale Assessment Systems. Each project lists components, key practices,…

  15. Close Range Calibration of Long Focal Length Lenses in a Changing Environment

    NASA Astrophysics Data System (ADS)

    Robson, Stuart; MacDonald, Lindsay; Kyle, Stephen; Shortis, Mark R.

    2016-06-01

    University College London is currently developing a large-scale multi-camera system for dimensional control tasks in manufacturing, including part machining, assembly and tracking, as part of the Light Controlled Factory project funded by the UK Engineering and Physical Sciences Research Council. In parallel, as part of the EU LUMINAR project funded by the European Association of National Metrology Institutes, refraction models of the atmosphere in factory environments are being developed with the intent of modelling and eliminating the effects of temperature and other variations. The accuracy requirements for both projects are extremely demanding, so improvements in the modelling of both camera imaging and the measurement environment are essential. At the junction of these two projects lies close range camera calibration. The accurate and reliable calibration of cameras across a realistic range of atmospheric conditions in the factory environment is vital in order to eliminate systematic errors. This paper demonstrates the challenge of experimentally isolating environmental effects at the level of a few tens of microns. Longer lines of sight promote the use and calibration of a near-perfect perspective projection from a Kern 75 mm lens with maximum radial distortion of the order of 0.5 µm. Coordination of a reference target array, representing a manufactured part, is achieved to better than 0.1 mm at a standoff of 8 m. More widely, the results contribute to better sensor understanding, improved mathematical modelling of factory environments and more reliable coordination of targets to 0.1 mm and better over large volumes.
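Close range camera calibration of the kind discussed here typically estimates polynomial radial distortion coefficients. A minimal sketch of such a model follows; the coefficient is illustrative, not a calibrated value for the lens above.

```python
# Generic polynomial radial-distortion model: an ideal image point is displaced
# radially by a factor that grows with distance from the principal point.
# The coefficient k1 below is invented for illustration.

def radial_distortion(x, y, k1=1e-7, k2=0.0):
    """Displace an ideal image point (x, y), in mm from the principal point,
    by polynomial radial distortion with coefficients k1, k2 (per mm^2, mm^4)."""
    r2 = x * x + y * y
    factor = k1 * r2 + k2 * r2 * r2
    return x + x * factor, y + y * factor

xd, yd = radial_distortion(10.0, 0.0)
print((xd - 10.0) * 1000.0)  # radial shift in micrometres at r = 10 mm
```

Calibration inverts this: given many observed target points, it solves for k1, k2 (and the interior orientation) so that systematic residuals of this shape vanish.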

  16. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-12-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization that the educational design in the initial early stages of the project was ineffective. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addressing relevant external school and community factors, and backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, allowing teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design is also outlined; it has seen success in bringing real astronomical data and access to telescopes into the high school classroom.

  17. Using Professional Development to Achieve Classroom Reform and Science Proficiency: An Urban Success Story from Southern Nevada, USA

    ERIC Educational Resources Information Center

    Crippen, Kent J.; Biesinger, Kevin D.; Ebert, Ellen K.

    2010-01-01

    This paper provides a detailed description and evaluation of a three-year professional development project in a large urban setting in the southwestern United States. The impetus for the project was curriculum development focused on integrated scientific inquiry. Project goals included the development of a professional learning community, reformed…

  18. Computational data sciences for assessment and prediction of climate extremes

    NASA Astrophysics Data System (ADS)

    Ganguly, A. R.

    2011-12-01

    Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at the scales, or for the extremes, of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations to information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
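One of the extreme-value techniques mentioned above, block-maxima analysis, can be sketched briefly. The data below are synthetic and the return level is read off empirically; a real analysis would fit a generalized extreme value (GEV) distribution to the block maxima.

```python
# Block-maxima sketch: reduce a long daily series to per-"year" maxima, then
# read off an empirical return level. Synthetic data, illustrative only.
import random

def block_maxima(series, block_size):
    """Maximum of each consecutive block (e.g. annual maxima of daily data)."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series), block_size)]

def empirical_return_level(maxima, return_period):
    """Value exceeded on average once per `return_period` blocks."""
    ranked = sorted(maxima)
    idx = max(0, len(ranked) - len(ranked) // return_period - 1)
    return ranked[idx]

random.seed(0)
daily = [random.expovariate(1.0) for _ in range(365 * 30)]   # 30 "years"
annual_max = block_maxima(daily, 365)
print(len(annual_max), empirical_return_level(annual_max, 10))
```

The empirical estimate illustrates the core difficulty the abstract points to: with only 30 block maxima, long-return-period estimates are highly uncertain, which is why parametric tail models and physics-based constraints are combined.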

  19. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    NASA Astrophysics Data System (ADS)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images onto very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Because they are independent of a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concert performances, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida, Stone Mountain Park in Georgia and the 1984 Olympics in Los Angeles, as well as hundreds of corporate trade shows and thousands of musical performances. With new ColorRay™ technology, costly and fragile lasers are no longer necessary: fiber optic technology duplicates the functionality of lasers for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.

  20. Distributed Relevance Ranking in Heterogeneous Document Collections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abe Lederman

    This report contains the comprehensive summary of the work performed on the SBIR Phase II project (“Distributed Relevance Ranking in Heterogeneous Document Collections”) at Deep Web Technologies (http://www.deepwebtech.com). We have successfully completed all of the tasks defined in our SBIR Proposal work plan (See Table 1 - Phase II Tasks Status). The project was completed on schedule and we have successfully deployed an initial production release of the software architecture at DOE-OSTI for the Science.gov Alliance's search portal (http://www.science.gov). We have implemented a set of grid services that supports the extraction, filtering, aggregation, and presentation of search results from numerous heterogeneous document collections. Illustration 3 depicts the services required to perform QuickRank™ filtering of content as defined in our architecture documentation. Functionality that has been implemented is indicated by the services highlighted in green. We have successfully tested our implementation in a multi-node grid deployment both within the Deep Web Technologies offices and in a heterogeneous, geographically distributed grid environment. We have performed a series of load tests in which we successfully simulated 100 concurrent users submitting search requests to the system. This testing was performed on deployments of one-, two-, and three-node grids with services distributed in a number of different configurations. The preliminary results from these tests indicate that our architecture will scale well across multi-node grid deployments, but more work will be needed, beyond the scope of this project, to perform testing and experimentation to determine scalability and resiliency requirements. We are pleased to report that a production quality version (1.4) of the Science.gov Alliance's search portal based on our grid architecture was released in June of 2006. This demonstration portal is currently available at http://science.gov/search30 .
The portal allows the user to select from a number of collections grouped by category and enter a query expression (See Illustration 1 - Science.gov 3.0 Search Page). After the user clicks “search”, a results page is displayed that provides a list of results from the selected collections, ordered by relevance based on the query expression the user provided. Our grid-based solution to deep web search and document ranking has already gained attention within DOE, other government agencies and a Fortune 50 company. We are committed to the continued development of grid-based solutions to large-scale data access, filtering, and presentation problems within the domain of Information Retrieval and the more general categories of content management, data mining and data analysis.
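The merge step in this kind of federated ("deep web") search can be illustrated generically: scores from heterogeneous collections are first normalised onto a common scale, then interleaved into one ranked list. The QuickRank™ internals are not public; the functions, documents and scores below are invented to show only the general pattern.

```python
# Generic federated-search merge sketch: each collection scores results on its
# own scale, so scores are normalised per collection before merging.

def normalise(results):
    """Rescale one collection's (doc, score) pairs into [0, 1] by its max score."""
    top = max(score for _, score in results)
    return [(doc, score / top) for doc, score in results]

def merge(*collections):
    """Merge per-collection result lists into one list sorted by normalised score."""
    merged = [hit for coll in collections for hit in normalise(coll)]
    return sorted(merged, key=lambda hit: hit[1], reverse=True)

osti = [("doc-a", 42.0), ("doc-b", 21.0)]   # one collection's raw scores
nasa = [("doc-c", 0.9), ("doc-d", 0.3)]     # another, on a different scale
print(merge(osti, nasa))
```

Max-normalisation is the simplest choice; production systems weight collections by estimated quality or download full documents for re-scoring, which is where most of the engineering effort in distributed relevance ranking goes.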
