Big questions, big science: meeting the challenges of global ecology.
Schimel, David; Keller, Michael
2015-04-01
Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's lab or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than on a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, subject to external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.
What makes computational open source software libraries successful?
NASA Astrophysics Data System (ADS)
Bangerth, Wolfgang; Heister, Timo
2013-01-01
Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. Specifically, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.
SALTON SEA SCIENTIFIC DRILLING PROJECT: SCIENTIFIC PROGRAM.
Sass, J.H.; Elders, W.A.
1986-01-01
The Salton Sea Scientific Drilling Project was spudded on 24 October 1985 and reached a total depth of 10,564 ft (3.2 km) on 17 March 1986. There followed a period of logging, a flow test, and downhole scientific measurements. The scientific goals were integrated smoothly with the engineering and economic objectives of the program, and the ideal of 'science driving the drill' in continental scientific drilling projects was achieved in large measure. The principal scientific goals of the project were to study the physical and chemical processes involved in an active, magmatically driven hydrothermal system. To facilitate these studies, high priority was attached to four areas of sample and data collection, namely: (1) core and cuttings, (2) formation fluids, (3) geophysical logging, and (4) downhole physical measurements, particularly temperatures and pressures.
The Superconducting Supercollider and US Science Policy
NASA Astrophysics Data System (ADS)
Marburger, John H.
2014-06-01
Reasons for the Superconducting Supercollider's (SSC's) termination include significant changes in the attitude of the government towards large scientific projects, originating with management reforms introduced decades earlier. In the 1980s, the government insisted on inclusion of elements of these reforms in the SSC's management contract, including increased demands for accountability, additional liability for contractors, and sanctions for infractions. The SSC's planners could not have opted out of the reforms, which were by then becoming part of all large publicly funded projects. Once these reforms were in place, management mistakes in the SSC's planning and construction became highly visible, leading to termination of the machine. This episode contains two key lessons about science policy. One is that the momentum of the government's management reforms was unstoppable, and their impact on large scientific facilities and projects could not be reversed. The other is that specific mechanisms, such as cost- and schedule-tracking systems that gauge program performance and impact, were also inevitable; large scientific projects needed new parameters of accountability and transparency, in what can be called the Principle of Assurance.
Project management in the development of scientific software
NASA Astrophysics Data System (ADS)
Platz, Jochen
1986-08-01
This contribution is a rough outline of a comprehensive project management model for the development of software for scientific applications. The model was tested in the unique environment of the Siemens AG Corporate Research and Technology Division. Its focal points are the structuring of project content - the so-called phase organization, the project organization and the planning model used, and its particular applicability to innovative projects. The outline focuses largely on actual project management aspects rather than associated software engineering measures.
Systems aspects of COBE science data compression
NASA Technical Reports Server (NTRS)
Freedman, I.; Boggess, E.; Seiler, E.
1993-01-01
A general approach to compression of diverse data from large scientific projects has been developed, and this paper addresses the appropriate system and scientific constraints together with the algorithm development and test strategy. This framework has been implemented for the COsmic Background Explorer spacecraft (COBE) by retrofitting the existing VAX-based data management system with high-performance compression software permitting random access to the data. Algorithms which incorporate scientific knowledge and consume relatively few system resources are preferred over ad hoc methods. COBE exceeded its planned storage by a large and growing factor, and the retrieval of data significantly affects the processing, delaying the availability of data for scientific use and software testing. Embedded compression software is planned to make the project tractable by reducing the data storage volume to an acceptable level during normal processing.
Technologies for Large Data Management in Scientific Computing
NASA Astrophysics Data System (ADS)
Pace, Alberto
2014-01-01
In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data, and for associated analyses that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
Citizen science on a smartphone: Participants' motivations and learning.
Land-Zandstra, Anne M; Devilee, Jeroen L A; Snik, Frans; Buurmeijer, Franka; van den Broek, Jos M
2016-01-01
Citizen science provides researchers means to gather or analyse large datasets. At the same time, citizen science projects offer an opportunity for non-scientists to be part of and learn from the scientific process. In the Dutch iSPEX project, a large number of citizens turned their smartphones into actual measurement devices to measure aerosols. This study examined participants' motivation and perceived learning impacts of this unique project. Most respondents joined iSPEX because they wanted to contribute to the scientific goals of the project or because they were interested in the project topics (health and environmental impact of aerosols). In terms of learning impact, respondents reported a gain in knowledge about citizen science and the topics of the project. However, many respondents had an incomplete understanding of the science behind the project, possibly caused by the complexity of the measurements. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Simon, Josep; Cuenca-Lorente, Mar
2012-02-01
Although a large number of Spanish secondary schools have preserved an important scientific heritage, including large scientific instrument collections, this heritage has never been officially protected. Their current state is very diverse, and although several research projects have attempted to initiate their recovery and use, their lack of coordination and wide range of methodological approaches has limited their impact. This paper presents a case study integrated in a new project supported by the Catalan Scientific Instrument Commission (COMIC), whose final aim is the establishment of a research hub for the preservation, study and use of Spanish scientific instrument collections. Major aims of this project are promoting better coordination of Spanish projects in this field, and furthering international research on science pedagogy and the material culture of science. The major focus of COMIC is currently the recovery of secondary school collections. This paper provides, first, a historical account of the development of secondary education in Spain, and the contemporary establishment of physics and chemistry school collections. Second, we focus on a case study of three Spanish schools (Valencia, Castellón, and Alicante). Finally, we provide a brief overview of current projects to preserve Spanish school collections, and discuss how COMIC can contribute to coordinating them, and to taking a step forward in interdisciplinary research in this context.
The Fermi LAT Very Important Project (VIP) List of Active Galactic Nuclei
NASA Astrophysics Data System (ADS)
Thompson, David J.; Fermi Large Area Telescope Collaboration
2018-01-01
Using nine years of Fermi Gamma-ray Space Telescope Large Area Telescope (LAT) observations, we have identified 30 projects for Active Galactic Nuclei (AGN) that appear to provide strong prospects for significant scientific advances. This Very Important Project (VIP) AGN list includes AGNs that have good multiwavelength coverage, are regularly detected by the Fermi LAT, and offer scientifically interesting timing or spectral properties. Each project has one or more LAT scientists identified who are actively monitoring the source. They will be regularly updating the LAT results for these VIP AGNs, working together with multiwavelength observers and theorists to maximize the scientific return during the coming years of the Fermi mission. See https://confluence.slac.stanford.edu/display/GLAMCOG/VIP+List+of+AGNs+for+Continued+Study
NASA Astrophysics Data System (ADS)
Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.
2016-01-01
This paper presents the !CHAOS open source project, aimed at developing a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment, to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster.
Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
Optimizing procedures for a human genome repository. Final report, June 1, 1988--November 30, 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nierman, W.C.
1991-03-01
Large numbers of clones will be generated during the Human Genome Project. As each is characterized, subsets will be identified which are useful to the scientific community at large. These subsets are most readily distributed through public repositories. The American Type Culture Collection (ATCC) is experienced in repository operation, but before this project had no history in managing clones and associated information in large batches instead of individually. This project permitted the ATCC to develop several procedures for automating and thus reducing the cost of characterizing, preserving, and maintaining information about clones.
NASA Astrophysics Data System (ADS)
Price, Aaron
2010-01-01
Citizen Sky is a new three-year astronomical citizen science project launched in June 2009 with funding from the National Science Foundation. This paper reports on early results of an assessment delivered to 1000 participants when they first joined the project. The goal of the assessment, based on the Nature of Scientific Knowledge Scale (NSKS), is to characterize their attitudes towards the nature of scientific knowledge. Our results show that the NSKS components of the assessment achieved high levels of reliability. Both reliability and overall scores fall within the range reported from other NSKS studies in the literature. Correlation analysis with other components of the assessment reveals that some factors, such as age and understanding of scientific evidence, may be reflected in scores on subscales of NSKS items. Further work will be done using online discourse analysis and interviews. Overall, we find that the NSKS can be used as an entrance assessment for an online citizen science project.
ENVIRONMENTAL IMPLICATIONS OF PLANTS MODIFIED TO CONTAIN INSECTICIDAL GENES
Genetically modified (GM) crops are being grown on large acreages in the United States. Before they were approved for sale, sufficient scientific evidence allowed the EPA to determine that they are safe. The results of this research project will strengthen the scientific basis EPA u...
The Power of Engaging Citizen Scientists for Scientific Progress
Garbarino, Jeanne; Mason, Christopher E.
2016-01-01
Citizen science has become a powerful force for scientific inquiry, providing researchers with access to a vast array of data points while connecting nonscientists to the authentic process of science. This citizen-researcher relationship creates an incredible synergy, allowing for the creation, execution, and analysis of research projects that would otherwise prove impossible in traditional research settings, namely due to the scope of needed human or financial resources (or both). However, citizen-science projects are not without their challenges. For instance, as projects are scaled up, there is concern regarding the rigor and usability of data collected by citizens who are not formally trained in research science. While these concerns are legitimate, we have seen examples of highly successful citizen-science projects from multiple scientific disciplines that have enhanced our collective understanding of science, such as how RNA molecules fold or determining the microbial metagenomic snapshot of an entire public transportation system. These and other emerging citizen-science projects show how improved protocols for reliable, large-scale science can realize both an improvement of scientific understanding for the general public and novel views of the world around us. PMID:27047581
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.
Rauschenbach, Ines; Keddis, Ramaydalis; Davis, Diane
2018-01-01
We have redesigned a tried-and-true laboratory exercise into an inquiry-based team activity exploring microbial growth control, and implemented this activity as the basis for preparing a scientific poster in a large, multi-section laboratory course. Spanning most of the semester, this project culminates in a poster presentation of data generated from a student-designed experiment. Students use and apply the scientific method and improve written and verbal communication skills. The guided inquiry format of this exercise provides the opportunity for student collaboration through cooperative learning. For each learning objective, a percentage score was tabulated (learning objective score = points awarded/total possible points). A score of 80% was our benchmark for achieving each objective. At least 76% of the student groups participating in this project over two semesters achieved each learning goal. Student perceptions of the project were evaluated using a survey. Nearly 90% of participating students felt they had learned a great deal in the areas of formulating a hypothesis, experimental design, and collecting and analyzing data; 72% of students felt this project had improved their scientific writing skills. In a separate survey, 84% of students who responded felt that peer review was valuable in improving their final poster submission. We designed this inquiry-based poster project to improve student scientific communication skills. This exercise is appropriate for any microbiology laboratory course whose learning outcomes include the development of scientific inquiry and literacy.
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.
Envision: An interactive system for the management and visualization of large geophysical data sets
NASA Technical Reports Server (NTRS)
Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.
1995-01-01
Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.
NASA Technical Reports Server (NTRS)
Eller, E. L.
1976-01-01
The project scientist is in a position which rates very high in terms of behavioral study recommendations. His influence over objectives is generally considered to be important. He is highly autonomous in a moderately coordinated environment. He has diverse managerial and technical functions, and the performance of these functions requires him to grow beyond his role as an experimenter. However, the position within the line organization for those interviewed is also very stimulating, rating almost as high by the same criteria. The role of project scientist may not be the dominant means of professional growth for experienced scientific investigators. The influence which the project scientist exerts on the project, and the stimulation of that position for him, are determined largely by his position outside the defined project scientist role. The role of the project scientist is changing because the environment of those who become project scientists is changing.
Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzpatrick, Anne C.
1999-07-01
The American system of nuclear weapons research and development was conceived and developed not as a result of technological determinism, but by a number of individual architects who promoted the growth of this large technologically-based complex. While some of the technological artifacts of this system, such as the fission weapons used in World War II, have been the subject of many historical studies, their technical successors--fusion (or hydrogen) devices--are representative of the largely unstudied, highly secret realms of nuclear weapons science and engineering. In the postwar period a small number of Los Alamos Scientific Laboratory's staff and affiliates were responsible for theoretical work on fusion weapons, yet the program was subject to both the provisions and constraints of the US Atomic Energy Commission, of which Los Alamos was a part. The Commission leadership's struggle to establish a mission for its network of laboratories, not least to keep them operating, affected Los Alamos's leaders' decisions as to the course of weapons design and development projects. Adapting Thomas P. Hughes's ''large technological systems'' thesis, I focus on the technical, social, political, and human problems that nuclear weapons scientists faced while pursuing the thermonuclear project, demonstrating why the early American thermonuclear bomb project was an immensely complicated scientific and technological undertaking. I concentrate mainly on Los Alamos Scientific Laboratory's Theoretical, or T, Division, and its members' attempts to complete an accurate mathematical treatment of the ''Super''--the most difficult problem in physics in the postwar period--and other fusion weapon theories. Although tackling a theoretical problem, theoreticians had to address technical and engineering issues as well. I demonstrate the relative value and importance of H-bomb research over time in the postwar era to scientific, political, and military participants in this project.
I analyze how and when participants in the H-bomb project recognized both blatant and subtle problems facing the project, how scientists solved them, and the relationship this process had to official nuclear weapons policies. Consequently, I show how the practice of nuclear weapons science in the postwar period became an extremely complex, technologically-based endeavor.
In Defense of the National Labs and Big-Budget Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodwin, J R
2008-07-29
The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure.
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Tokamak Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed.
Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schrödinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.
ERIC Educational Resources Information Center
Crippen, Kent J.; Biesinger, Kevin D.; Ebert, Ellen K.
2010-01-01
This paper provides a detailed description and evaluation of a three-year professional development project in a large urban setting in the southwestern United States. The impetus for the project was curriculum development focused on integrated scientific inquiry. Project goals included the development of a professional learning community, reformed…
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key for a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers in interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher level links for users.
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Williams, Dean; Aloisio, Giovanni
2016-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases.
With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.
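The "datacube" abstraction described above can be illustrated with a minimal sketch. This is a hypothetical stand-in using NumPy in place of Ophidia's actual server-side operators; the function name `reduce_cube` and the (time, lat, lon) layout are assumptions for illustration only:

```python
# Minimal sketch of a datacube-style reduction, in the spirit of Ophidia.
# Hypothetical stand-in: NumPy in-memory arrays instead of Ophidia's
# parallel, server-side operators over fragmented storage.
import numpy as np

def reduce_cube(cube, axis_name, axes=("time", "lat", "lon"), op=np.mean):
    """Collapse one named dimension of a (time, lat, lon) datacube."""
    return op(cube, axis=axes.index(axis_name))

# 4 time steps over a 3x5 grid of (synthetic) values
cube = np.arange(60, dtype=float).reshape(4, 3, 5)
climatology = reduce_cube(cube, "time")  # time-mean field, shape (3, 5)
```

In Ophidia itself such a reduction runs in parallel against chunked server-side storage; the sketch only shows the shape of the operation, i.e. collapsing one named dimension of an n-dimensional array.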
Electronic Collaboration Logbook
NASA Astrophysics Data System (ADS)
Gysin, Suzanne; Mandrichenko, Igor; Podstavkov, Vladimir; Vittone, Margherita
2012-12-01
In HEP, scientific research is performed by large collaborations of organizations and individuals. The logbook of a scientific collaboration is an important part of the collaboration record. Often it contains experimental data. At Fermi National Accelerator Laboratory (FNAL), we developed an Electronic Collaboration Logbook (ECL) application, which is used by about 20 different collaborations, experiments and groups at FNAL. The ECL is the latest iteration of the project formerly known as the Control Room Logbook (CRL). We have been working on mobile (iOS and Android) clients for the ECL. We will present the history, current status and future plans of the project, as well as design, implementation and support solutions made by the project.
Krilowicz, B I; Henter, H; Kamhi-Stein, L
1997-06-01
Providing large numbers of general education students with an introduction to science is a challenge. To meet this challenge, a quarter-long neurophysiology project was developed for use in an introductory biology course. The primary goals of this multistep project were to introduce students to the scientific method, scientific writing, on-line scientific bibliographic databases, and the scientific literature, while improving their academic literacy skills. Students began by collecting data on their own circadian rhythms in autonomic, motor, and cognitive function, reliably demonstrating the predicted circadian changes in heart rate, eye-hand coordination, and adding speed. Students wrote a journal-style article using pooled class data. Students were prepared to write the paper by several methods that were designed to improve academic language skills, including a library training exercise, "modeling" of the writing assignment, and drafting of subsections of the paper. This multistep neurophysiology project represents a significant commitment of time by both students and instructors, but produces a valuable finished product and ideally gives introductory students a positive first experience with science.
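The pooling of class data described above amounts to simple descriptive statistics over repeated measurements. A hypothetical sketch (the sampling hours and heart-rate numbers are invented purely for illustration):

```python
# Sketch of pooling per-student circadian measurements, as in the course
# project described above. All data values here are invented examples.
from statistics import mean

# Heart rate (bpm) sampled at fixed hours of the day; each student in the
# class contributes one reading per time point.
class_data = {
    8:  [68, 72, 70],   # morning
    14: [76, 80, 78],   # afternoon
    22: [62, 66, 64],   # late evening
}

# Pooling across students reduces individual noise and exposes the
# class-wide circadian pattern.
pooled = {hour: mean(rates) for hour, rates in class_data.items()}
```

Pooled means like these are what the students would tabulate and discuss in their journal-style article.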
Software Engineering for Scientific Computer Simulations
NASA Astrophysics Data System (ADS)
Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.
2004-11-01
Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented.
Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse-grained for distributed task orchestration, and fine-grained at the level of a single data analytics cluster instance) will be presented and discussed.
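The trend-analysis task mentioned above can be sketched outside Ophidia as a per-gridpoint least-squares fit over the time axis. This is a hypothetical NumPy stand-in; the function name `linear_trend` and the (time, lat, lon) layout are assumptions for illustration:

```python
# Sketch of a gridded trend analysis, as in the CMIP5 workflows above.
# Hypothetical stand-in for a server-side Ophidia trend operator.
import numpy as np

def linear_trend(cube):
    """Least-squares slope along the time axis for every grid point.

    cube: array of shape (time, lat, lon); returns a (lat, lon) slope field.
    """
    t = np.arange(cube.shape[0])
    flat = cube.reshape(cube.shape[0], -1)   # (time, lat*lon)
    slopes = np.polyfit(t, flat, deg=1)[0]   # first row holds the slopes
    return slopes.reshape(cube.shape[1:])

# Synthetic cube warming uniformly by 0.1 units per time step
cube = 0.1 * np.arange(5)[:, None, None] * np.ones((5, 2, 3))
trend = linear_trend(cube)
```

In a real intercomparison workflow this per-model trend field would then feed the ensemble and outlier-analysis steps.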
The age of citizen science: Stimulating future environmental research
NASA Astrophysics Data System (ADS)
Burgess, S. N.
2010-12-01
Public awareness of the state of the ocean is growing with issues such as climate change, over-harvesting, marine pollution, coral bleaching, ocean acidification and sea level rise appearing regularly in popular media outlets. Society is also placing greater value on the range of ecosystem services the ocean provides. This increased consciousness of environmental change due to a combination of anthropogenic activities and impacts from climate change offers scientists the opportunity to engage citizens in environmental research. The term citizen science refers to scientific research carried out by citizens and led by professionals, which involves large scale data collection whilst simultaneously engaging and educating those who participate. Most projects that engage citizen scientists have been specifically designed to provide an educational benefit to the volunteer and benefit the scientific inquiry by collecting extensive data sets over large geographical areas. Engaging the public in environmental science is not a new concept, and successful projects (such as the Audubon Christmas Bird Count and Earthwatch) have been running for several decades, resulting in hundreds of thousands of people conducting long-term field research in partnership with scientists based at universities worldwide. The realm of citizen science projects is continually expanding, with public engagement options ranging from science online; to backyard afternoon studies; to fully immersive experiential learning projects running for weeks at a time. Some organisations, such as Earthwatch, also work in partnership with private industry, giving scientists access to more funding opportunities than those avenues traditionally available.
These scientist-industry partnerships provide mutual benefits, as the results of research projects in environments such as coastal ecosystems feed directly back into business risk strategies; for example, mitigating shoreline erosion, storm surges, overfishing and warming water temperatures. Citizen science projects also fulfill the requirements of government granting institutions for outreach and scientific communication. This presentation will highlight marine research projects that have engaged citizens in the scientific process, and will also discuss the impacts of associated outreach, capacity building and community environmental stewardship.
Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David
2013-01-01
The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.
NASA Astrophysics Data System (ADS)
Cristini, Luisa
2017-04-01
Scientific and technological research carried out within universities and public research institutions often involves large collaborations across several countries. Despite the considerable budget (typically millions of Euros), the high expectations (high impact scientific findings, new technological developments and links with policy makers, industry and civil society) and the length of the project over several years, these international projects often rely heavily on the personal skills of the management team (project coordinator, project manager, principal investigators) without a structured, transferable framework. While this approach has become an established practice, it is not ideal and can jeopardise the success of the entire effort, with consequences that range from schedule delays, loss of templates and systems, and financial charges to outright project failure. In this presentation I will show the advantages of integrating a globally recognised standard for professional project management, such as the PMP® by the Project Management Institute, into academic research. I will cover the project management knowledge areas (integration management, scope management, time management, cost management, quality management, human resources management, risk management, procurement management, and stakeholder management) and the processes within these throughout the phases of the project lifetime (project initiation, planning, executing, monitoring and controlling, and closure). I will show how application of standardised, transferable procedures, developed within the business & administration sector, can benefit academia and more generally scientific research.
ERIC Educational Resources Information Center
Sands, Ashley Elizabeth
2017-01-01
Ground-based astronomy sky surveys are massive, decades-long investments in scientific data collection. Stakeholders expect these datasets to retain scientific value well beyond the lifetime of the sky survey. However, the necessary investments in knowledge infrastructures for managing sky survey data are not yet in place to ensure the long-term…
ERIC Educational Resources Information Center
Mitchell, Ross
2010-01-01
In the summer of 2008, the Spanish legislature resolved to grant great apes (though not all simians) basic human rights. While the decision to grant such rights came about largely through the lobbying efforts of the Great Ape Project (GAP), the decision has potential reverberations throughout the scientific world and beyond in its implications for…
Big Science and the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Giudice, Gian Francesco
2012-03-01
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.
Halley's comet exploration and the Japanese Usuda large antenna
NASA Technical Reports Server (NTRS)
Nomura, T.
1986-01-01
An overview of the Japanese PLANET-A project to investigate Halley's Comet is given. The objectives and scientific challenges involved in the project are described, along with the nature of the contribution made by the large antenna array located at Usuda-Cho, Nagano Prefecture, Japan. The structural designs of the MS-T5 and PLANET-A probes are given, as well as the tracking and control network for the probes. The construction, design, operating system and site selection for the Usuda antenna station are discussed.
John A. Stanturf; Daniel A. Marion; Martin Spetich; Kenneth Luckow; James M. Guldin; Hal O. Liechty; Calvin E. Meier
2000-01-01
The Ouachita Mountains Ecosystem Management Research Project (OEMP) is a large interdisciplinary research project designed to provide the scientific foundation for landscape management at the scale of watersheds. The OEMP has progressed through three phases: developing natural regeneration alternatives to clearcutting and planting; testing of these alternatives at the...
NAESA Augmentation Pilot Project
NASA Technical Reports Server (NTRS)
Hoover, John J.
1998-01-01
This project was one project within the Native American Earth and Space Academy (NAESA). NAESA is a national initiative comprised of several organizations that support programs which focus on 1) enhancing the technological, scientific and pedagogical skills of K-14 teachers who instruct Native Americans, 2) enhancing the understanding and applications of science, technology, and engineering of college-bound Native Americans and teaching them general college "survival skills" (e.g., test taking, time management, study habits), 3) enhancing the scientific and pedagogical skills of the faculty of tribally-controlled colleges and community colleges with large Native American enrollments, and 4) strengthening the critical relationships between students, their parents, tribal elders, and their communities. This Augmentation Pilot Project focused on the areas of community-school alliances and internet technology use in teaching, learning, and daily living, addressing five major objectives.
Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae
2004-01-01
The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...
Helios: Understanding Solar Evolution Through Text Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randazzese, Lucien
This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.
The Atacama Large Aperture Submm/mm Telescope (AtLAST) Project
NASA Astrophysics Data System (ADS)
Bertoldi, Frank
2018-01-01
In the past decade a strong case has been made for the construction of a next generation, 25 to 40-meter large submillimeter telescope, most notably through the CCAT and the Japanese LST projects. Although much effort had been spent on detailed science cases and technological studies, none of these projects have yet secured funding to advance to construction. We invite the interested community to join a study of the scientific merit, technical implementation, and financial path toward what we coin the "Atacama Large Submillimeter Telescope" (AtLAST). Through this community workshop, working groups, and a final report to be released in early 2019, we hope to motivate the global astronomy community to value and support the realization of such a facility.
Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Koo, Michelle; Cao, Yu
Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze performance when terabytes or petabytes of workflow data, or of measurement data from the executions, are produced by complex workflows running over a large number of nodes with multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
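The log ingestion and feature extraction described above can be sketched at a very small scale. This is a hypothetical illustration: the log format, field names, and the function `extract_features` are invented here, not taken from the framework itself, which operates on real NERSC system logs with distributed processing tools:

```python
# Minimal sketch of extracting per-task performance features from logs.
# The log format and all values below are hypothetical examples.
import re
from statistics import mean

LOG_LINE = re.compile(r"task=(\w+)\s+node=(\d+)\s+runtime=([\d.]+)s")

def extract_features(lines):
    """Group runtimes by task name and report mean, max, and count per task."""
    runtimes = {}
    for line in lines:
        m = LOG_LINE.search(line)
        if m:
            task, _node, secs = m.groups()
            runtimes.setdefault(task, []).append(float(secs))
    return {t: {"mean": mean(v), "max": max(v), "n": len(v)}
            for t, v in runtimes.items()}

logs = [
    "2016-01-01 task=subtract node=12 runtime=4.0s",
    "2016-01-01 task=subtract node=13 runtime=6.0s",
    "2016-01-01 task=photometry node=12 runtime=2.5s",
]
features = extract_features(logs)
```

At scale, the same grouping and aggregation would be expressed in a parallel data processing engine rather than a Python loop; the extracted features then feed the statistical and data mining steps.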
The scientific data acquisition system of the GAMMA-400 space project
NASA Astrophysics Data System (ADS)
Bobkov, S. G.; Serdin, O. V.; Gorbunov, M. S.; Arkhangelskiy, A. I.; Topchiev, N. P.
2016-02-01
The description of the scientific data acquisition system (SDAS) designed by SRISA for the GAMMA-400 space project is presented. We consider the problem of unifying electronics at different levels: a set of reliable, fault-tolerant integrated circuits fabricated in 0.25 μm Silicon-on-Insulator CMOS technology, and the high-speed interfaces and reliable modules used in the space instruments. The characteristics of the reliable, fault-tolerant very large scale integration (VLSI) technology designed by SRISA for developing computation systems for space applications are considered. The scalable network structure of the SDAS, based on the Serial RapidIO interface and including the real-time operating system BAGET, is also described.
Gene regulation knowledge commons: community action takes care of DNA binding transcription factors
Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin
2016-01-01
A large gap remains between the amount of knowledge in scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue that cooperation between the scientific community and professional curators can increase the capacity of capturing precise knowledge from literature. We demonstrate this with a project in which we mobilize biological domain experts who curate large numbers of DNA binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.
Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Citizen Science (CS) as a term covers a great range of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data-set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.
The Non-Impact of Scientific Reviews of Oil Sands Environmental Impact Assessments
NASA Astrophysics Data System (ADS)
Kienzle, S. W.; Byrne, J.
2008-12-01
Schindler (Science, Vol. 192: 509; 1976) stated that Environmental Impact Assessments authors "conduct the studies regardless of how quickly results are demanded, write large, diffuse reports containing reams of uninterpreted and incomplete descriptive data, and in some cases, construct "predictive" models, irrespective of the quality of the data base." Schindler offered a solution: "If we are to protect both our resources and scientific integrity, environmental scientists must seek to put their studies on a scientifically credible basis-to see that problems, terms of reference, funding, time constraints, reports, and conclusions are all within a bona fide scientific framework." When the first scientific panel was formed in 2003 by the Mikisew Cree First Nations (MCFN), Alberta, to objectively review EIAs of proposed oil sands mining projects, the scientific panel uncovered many severe omissions, errors, and a significant lack of substance that could not withstand scientific scrutiny. Neither the Terms of Reference for two major oilsands projects, estimated to be worth approximately CAD 15 billion, nor the EIAs (one single EIA was over 11,000 pages long) contained the terms "climate change", "trend analysis", or "risk analysis", and nearly all environmental impacts were described by the proponents as "negligible". The Hydrology Section (over 950 pages in length) of one EIA did not contain a single peer-reviewed scientific publication. In summary, nothing had changed since Schindler's observations 27 years earlier. Since 2003, the authors have reviewed more than a dozen EIAs of proposed oilsands projects in northern Alberta. The "non-impact" of scientific reviews on the quality of EIAs and the insincerity of the stewards of the land are very sobering: apart from cosmetic improvements in the requirements of the Terms of Reference and the writing of the EIAs, no meaningful improvement of scientific content has been made.
Key environmental concerns around water resource utilization and contamination, massive boreal forest ecosystem disruption and destruction, insignificant reclamation, and dramatic increases in emissions of acidic pollutants and GHGs have never been adequately addressed. Spills of contaminated tailings fluids into the Athabasca River have occurred in the past, and Mikisew Cree Elders have both anecdotal and physical evidence of contamination of downstream areas through to Lake Athabasca. As the Alberta government has declared a sell-out of very large areas of boreal forest for fast profit, the scientific reviews have been ignored. With the exception of a few cosmetic improvements to the EIAs (e.g. climate change is now discussed, albeit with incomplete data and incorrect interpretations), the scientific quality of EIAs has not improved. In fact, the Terms of Reference differ in content requirements from one oilsands project to the next, producing inconsistency between EIAs and preventing any evaluation of how the Terms of Reference have evolved. The consequences of this disregard of scientific standards are enormous environmental impacts in terms of carbon dioxide output, acid rain, severe health risks to the local population, degraded water quality, and negligible reclamation efforts.
The ImageJ ecosystem: An open platform for biomedical image analysis.
Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly, due largely to being freely available and to its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suite of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is in turn influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Geeson, N.; van den Elsen, E.; Brandt, J.; Quaranta, G.; Salvia, R.
2012-04-01
In the last twenty years the advent of the internet has made it much easier to share the results of scientific research with a wider range of audiences. Where once there were only scientific journals and books, it is now possible to deliver messages and dissemination products instantly, by email or other media, to huge circulation lists, thereby also addressing non-scientific audiences. Most scientific projects now host a website, but until recently few have exploited the communication possibilities to maximum advantage. DESIRE has been a large interdisciplinary and international project working to mitigate desertification by selecting and trialling sustainable land management practices with stakeholders. It has therefore been very important to use a general project website, and a separate Harmonised Information System, to ensure that partners and stakeholders are able to understand the sustainable options and learn from one another. The project website has included many useful features, such as general project and partner information, a schedule of future meetings, and repositories of publicly (and project-only) downloadable documents. Lessons have been learned about communication preferences between groups with different interests. For example, an on-line forum seemed a good way of allowing project partners to have their say on various topics. However, it was not well used, and it was concluded that partners preferred to communicate simply by email, a medium that they access most days for many purposes. Whereas the project website focuses on the latest news, the Harmonised Information System has been used to document the history of the project, stage by stage, filling in each section as results became available. Information can be accessed from the perspective of both the research aims and each study site. Interactive tools and drop-down menus are among the features used to make the information as attractive and as accessible as possible.
Although English is the language of scientists, material must be presented in local languages for local people; at least nine such languages are implemented in DESIRE. Of course, not everyone can use a computer or view the internet, so some dissemination products also have to be printed and distributed by hand. The DESIRE website can be accessed at http://www.desire-project.eu/, and the DESIRE Harmonised Information System at http://www.desire-his.eu/
NASA Astrophysics Data System (ADS)
Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.
2012-12-01
The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, Visual Perception, Schemata formation and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address or ameliorate challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations and other data visualizations.
The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user interface and visualizations so that it doesn't exceed the amount of information the learner can actively process; 2) drawing attention to important features and patterns; and 3) enabling customization of visualizations and tools to meet the needs of diverse learners.
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). 
This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
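The master/worker decomposition that underlies this kind of distributed processing can be sketched in miniature. The following hedged example is not USGS code and all names are illustrative: it splits a large input grid into independent tiles and hands each tile to a worker, with a thread pool standing in for the cluster nodes that a real Beowulf system would reach over the network (e.g. via MPI).

```python
# Single-machine analogue of Beowulf-style master/worker processing:
# the master splits the data into independent tiles (scatter), workers
# process tiles concurrently, and the master combines results (gather).
# Illustrative sketch only; a real cluster distributes tiles to nodes.
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    # Stand-in for a processor-intensive per-tile calculation
    # (e.g. one region of an urban-growth simulation).
    return sum(v * v for v in tile)

def process_grid(grid, tile_size=4, workers=2):
    # Scatter: cut the grid into fixed-size tiles.
    tiles = [grid[i:i + tile_size] for i in range(0, len(grid), tile_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Gather: combine the per-tile partial results.
        return sum(pool.map(process_tile, tiles))

total = process_grid(list(range(16)))  # sum of squares 0..15 = 1240
```

Because the tiles are independent, adding workers (or nodes) shortens the wall-clock time without changing the result, which is the property the paper's urban-dynamics example relies on.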
The Crucial Role of Amateur-Professional Networks in the Golden Age of Large Surveys (Abstract)
NASA Astrophysics Data System (ADS)
Rodriguez, J. E.
2017-06-01
(Abstract only) With ongoing projects such as HATNet, SuperWASP, KELT, MEarth, and the CoRoT and Kepler/K2 missions, we are in a golden era of large photometric surveys. In addition, LSST and TESS will be coming online in the next three to five years. The combination of all these projects will increase the number of photometrically monitored stars by orders of magnitude. It is expected that these surveys will enhance our knowledge of circumstellar architecture and the early stages of stellar and planetary formation, while providing a better understanding of exoplanet demographics. However, the success of these surveys will depend on simultaneous and continued follow-up by large networks. With federal scientific funding reduced over the past few years, the availability of astronomical observations has been directly affected. Fortunately, ground-based amateur-professional networks like the AAVSO and the KELT Follow-up Network (KELT-FUN) are already providing access to an international, independent resource for professional-grade astronomical observations. These networks have both multi-band photometric and spectroscopic capabilities. I provide an overview of the ongoing and future surveys, highlight past and current contributions by amateur-professional networks to scientific discovery, and discuss the role of these networks in upcoming projects.
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
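The kind of computation the project experimented with, simultaneous linear equations solved by a pool of cooperating workers reading a shared iterate, can be sketched as follows. This is a hedged illustration, not the DICE environment itself: a thread pool stands in for the workstation cluster, and the Jacobi method is chosen only because each row update is independent and reads the same shared vector, mirroring the distributed shared-memory model in miniature.

```python
# Sketch of the worker-pool pattern for solving Ax = b in parallel.
# Each worker computes one component of the next Jacobi iterate from
# the current (shared) iterate. Illustrative only; DICE coordinated
# real workstations over a LAN rather than threads in one process.
from concurrent.futures import ThreadPoolExecutor

A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]   # diagonally dominant, so Jacobi converges
b = [6.0, 12.0, 14.0]

def jacobi_step(x, pool):
    def update(i):
        # Row i's update reads the whole shared iterate x.
        s = sum(A[i][j] * x[j] for j in range(len(x)) if j != i)
        return (b[i] - s) / A[i][i]
    return list(pool.map(update, range(len(x))))

def jacobi(iters=60):
    x = [0.0] * len(b)
    with ThreadPoolExecutor(max_workers=2) as pool:
        for _ in range(iters):
            x = jacobi_step(x, pool)
    return x

x = jacobi()  # converges to [1.0, 2.0, 3.0] for this system
```

The same structure scales to the cluster setting the abstract describes: only the pool changes, from threads to cooperating workstations, while the shared-iterate algorithm stays the same.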
The Current State of Data Transmission Channels from Pushchino to Moscow and Perspectives
NASA Astrophysics Data System (ADS)
Dumsky, D. V.; Isaev, E. A.; Samodurov, V. A.; Shatskaya, M. V.
Since the operation of the unique space radio telescope in the international VLBI project "Radioastron" has been extended to 2017, the transmission and storage of the large volumes of scientific and telemetry data obtained during the experiments remain pressing tasks. The project is carried out by the Astro Space Center of the Lebedev Physical Institute in Moscow, Russia. It requires us to keep in operating state the high-speed link that merges the buffer data center in Pushchino and the scientific information center in Moscow into a single LAN. Monitoring of the channel equipment and the storage systems also remains relevant, as do timely hardware replacement, software upgrades, backups, and documentation of the network infrastructure.
The Many Faces of a Software Engineer in a Research Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinovici, Maria C.; Kirkham, Harold
2013-10-14
The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an ‘adaptive’ skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.
The space telescope: A study of NASA, science, technology, and politics
NASA Technical Reports Server (NTRS)
Smith, Robert William
1989-01-01
Scientific, technological, economic, and political aspects of NASA efforts to orbit a large astronomical telescope are examined in a critical historical review based on extensive interviews with participants and analysis of published and unpublished sources. The scientific advantages of large space telescopes are explained; early plans for space observatories are summarized; the history of NASA and its major programs is surveyed; the redesign of the original Large Space Telescope for Shuttle deployability is discussed; the impact of the yearly funding negotiations with Congress on the development of the final Hubble Space Telescope (HST) is described; and the implications of the HST story for the future of large space science projects are explored. Drawings, photographs, a description of the HST instruments and systems, and lists of the major contractors and institutions participating in the HST program are provided.
[Scientific journalism and epidemiological risk].
Luiz, Olinda do Carmo
2007-01-01
The importance of the communications media in the construction of symbols has been widely acknowledged. Many of the articles on health published in the daily newspapers mention medical studies, sourced from scientific publications focusing on new risks. The disclosure of risk studies in the mass media is also a topic for editorials and articles in scientific journals, focusing on the problem of distortions and the appearance of contradictory news items. The purpose of this paper is to explore the meaning and content of disclosing scientific risk studies in large-circulation daily newspapers, analyzing news items published in Brazil and the scientific publications used as their sources during 2000. The "risk" is presented in the scientific research projects as a "black box" in Latour's sense, with the news items downplaying scientific disputes and underscoring associations between behavioral habits and the occurrence of diseases, emphasizing individual aspects of the epidemiological approach to the detriment of the group.
MEMO2 - MEthane goes MObile - MEasurements and Modelling - Part 2
NASA Astrophysics Data System (ADS)
Röckmann, Thomas; Walter, Sylvia
2017-04-01
As mitigation of climate change is a key scientific and societal challenge, the 2015 United Nations Climate Change Conference in Paris (COP21) agreed to limit global warming "well below" 2 °C and, if possible, below 1.5 °C. Reaching this target requires massive reductions of greenhouse gas emissions, and achieving significant reduction of greenhouse gas emissions is a logical headline target of EU climate action and of the H2020 strategy. CH4 emissions are a major contributor to Europe's global warming impact, yet these emissions are not well quantified: there are significant discrepancies between official inventories of emissions and estimates derived from direct atmospheric measurements. Effective emission reduction can only be achieved if sources are properly quantified and mitigation efforts are verified. New advanced combinations of measurement and modelling are needed to achieve such quantification. MEMO2 will contribute to the targets of the EU with a focus on methane (CH4). The project will bridge the gap between large-scale scientific estimates from in situ monitoring programs and the 'bottom-up' estimates of emissions from local sources that are used in national reporting by I) developing new and advanced mobile methane (CH4) measurement tools and networks, II) isotopic source identification, and III) modelling at different scales. Within the project, qualified scientists will be educated in the use and implementation of the interdisciplinary knowledge and techniques that are essential to meet and verify emission reduction goals. MEMO2 will facilitate intensive collaboration between the largely academic greenhouse gas monitoring community and non-academic partners who are responsible for evaluating and reporting greenhouse gas emissions to policy makers. MEMO2 is a European Training Network with more than 20 collaborators from 7 countries.
It is a four-year project, and we will present the project and its objectives to the scientific community to foster collaboration and scientific exchange from the beginning.
Armenian media coverage of science topics
NASA Astrophysics Data System (ADS)
Mkhitaryan, Marie
2016-12-01
The article discusses features and issues of Armenian media coverage of scientific topics and provides recommendations on how to promote scientific topics in the media. The media is more interested in social or public reaction than in scientific information itself. Medical science has a large share of global media coverage, followed by articles about the environment, space, technology, physics and other areas. Armenian media mainly tend to focus on a scientific topic if, at first sight, it contains something revolutionary. The media first consider whether a scientific study can affect the Armenian economy and only then decide whether to cover it. Unfortunately, the perception of science in the media is somewhat distorted today: news headlines often announce that a scientist has made "an invention", and the border between scientist and inventor has become hard to see. In fact, the technological term "invention" attracts the media by creating an illusion of sensation and ensuring a large audience. The report also addresses the "Gitamard" ("A science-man") special project started in 2016 in Mediamax that tells about scientists and their motivations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments.
SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications. We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration.
Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
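The DAG-structured workflows described in this abstract are executed in dependency order: a task runs only after all of its upstream tasks complete. A minimal sketch of that core mechanism (Kahn's topological ordering, not SWAMP's actual implementation; all task names are illustrative) is:

```python
# Dependency-driven execution of a workflow DAG (Kahn's algorithm):
# tasks become "ready" when all of their upstream tasks have finished.
from collections import deque

def run_workflow(tasks, deps):
    """tasks: {name: callable}; deps: {name: [upstream names]}."""
    indegree = {t: len(deps.get(t, [])) for t in tasks}
    downstream = {t: [] for t in tasks}
    for t, ups in deps.items():
        for u in ups:
            downstream[u].append(t)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()              # execute the task
        order.append(t)
        for d in downstream[t]:
            indegree[d] -= 1    # upstream dependency satisfied
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(tasks):
        raise ValueError("workflow graph contains a cycle")
    return order

log = []
order = run_workflow(
    {"acquire": lambda: log.append("acquire"),
     "filter": lambda: log.append("filter"),
     "analyze": lambda: log.append("analyze"),
     "visualize": lambda: log.append("visualize")},
    {"filter": ["acquire"], "analyze": ["filter"], "visualize": ["analyze"]},
)
```

A production workflow engine layers scheduling, monitoring, and resource mapping on top of exactly this ready-queue structure; independent ready tasks can also be dispatched to distributed resources concurrently.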
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER
Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Abstract Background Citizen Science (CS) as a term covers a great range of approaches and scopes, involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale and long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507
Harvard Observing Project (HOP): Involving Undergraduates in Research Projects
NASA Astrophysics Data System (ADS)
Bieryla, Allyson
2017-01-01
The Harvard Observing Project (HOP) is designed to get students excited about observational astronomy while collecting data valuable to the scientific community. The primary goal is to give undergraduates a chance to try out observing with “no strings attached”. Observations are led by experienced observers, mostly graduate students. This not only gives graduate students extra opportunities to interact with and teach undergraduates, but also a chance for them to get more observing experience. Each semester, we choose an interesting target and monitor it each week over the course of the semester using Harvard University’s 16-inch DFM Clay Telescope. These observing projects often produce large amounts of data, providing an excellent dataset for a young undergraduate to analyze. Some successful semester-long observing projects have included variable stars, supernovae, and binary systems. Short-term projects have included exoplanet candidate follow-up, asteroid and comet follow-up, and collaborating with the Pro-Am White Dwarf Monitoring (PAWM) project in attempts to detect a transiting Earth-sized planet orbiting a white dwarf. Each dataset is an opportunity for an undergraduate to be introduced to scientific research and present the results to the community.
The human genome: Some assembly required. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-12-31
The Human Genome Project promises to be one of the most rewarding endeavors in modern biology. The cost and the ethical and social implications, however, have made this project the source of considerable debate both in the scientific community and in the public at large. The 1994 Graduate Student Symposium addresses the scientific merits of the project, the technical issues involved in accomplishing the task, as well as the medical and social issues which stem from the wealth of knowledge which the Human Genome Project will help create. To this end, speakers were brought together who represent the diverse areas of expertise characteristic of this multidisciplinary project. The keynote speaker addresses the project's motivations and goals in the larger context of the biological and medical sciences. The first two sessions address relevant technical issues: data collection, with a focus on high-throughput sequencing methods, and data analysis, with an emphasis on identification of coding sequences. The third session explores recent advances in the understanding of genetic diseases and possible routes to treatment. Finally, the last session addresses some of the ethical, social and legal issues which will undoubtedly arise from having a detailed knowledge of the human genome.
NASA Astrophysics Data System (ADS)
Michel, Eric; Belkacem, Kevin; Samadi, Reza; Assis Peralta, Raphael de; Renié, Christian; Abed, Mahfoudh; Lin, Guangyuan; Christensen-Dalsgaard, Jørgen; Houdek, Günter; Handberg, Rasmus; Gizon, Laurent; Burston, Raymond; Nagashima, Kaori; Pallé, Pere; Poretti, Ennio; Rainer, Monica; Mistò, Angelo; Panzera, Maria Rosa; Roth, Markus
2017-10-01
The growing amount of seismic data available from space missions (SOHO, CoRoT, Kepler, SDO,…) but also from ground-based facilities (GONG, BiSON, ground-based large programmes…), stellar modelling and numerical simulations creates new scientific perspectives, such as characterizing stellar populations in our Galaxy or planetary systems by providing model-independent global properties of stars such as mass, radius, and surface gravity within several percent accuracy, as well as constraints on the age. These applications address a broad scientific community beyond the solar and stellar one and require combining indices elaborated with data from different databases (e.g. seismic archives and ground-based spectroscopic surveys). It is thus a basic requirement to develop simple and efficient access to these various data resources and dedicated tools. In the framework of the European project SpaceInn (FP7), several data sources have been developed or upgraded. The Seismic Plus Portal has been developed, where synthetic descriptions of the most relevant existing data sources can be found, as well as tools allowing users to locate existing data for a given object or period and assisting with data queries. This project has been developed within the Virtual Observatory (VO) framework. In this paper, we give a review of the various facilities and tools developed within this programme. The SpaceInn project (Exploitation of Space Data for Innovative Helio- and Asteroseismology) has been initiated by the European Helio- and Asteroseismology Network (HELAS).
Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve
2005-01-01
Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.
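The core difficulty the testbed addresses, mining archives too large to hold in memory, can be illustrated with a one-pass streaming detector. The sketch below is purely illustrative (synthetic data, Welford running statistics, an invented 4-sigma event rule); it is not the Intelligent Archives design:

```python
import random

def stream_chunks(n_chunks, chunk_size, seed=0):
    """Simulate an archive too large for memory by yielding fixed-size
    chunks of synthetic sensor readings (hypothetical data)."""
    rng = random.Random(seed)
    for _ in range(n_chunks):
        yield [rng.gauss(10.0, 2.0) for _ in range(chunk_size)]

def detect_events(chunks, z_threshold=4.0):
    """One-pass mining: flag readings far from a running mean/stddev.
    Welford's algorithm keeps memory use O(1) regardless of volume."""
    n, mean, m2 = 0, 0.0, 0.0
    events = []
    for chunk in chunks:
        for x in chunk:
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)
            if n > 30:  # wait for stable estimates before flagging
                std = (m2 / (n - 1)) ** 0.5
                if abs(x - mean) > z_threshold * std:
                    events.append(x)
    return n, events

total, events = detect_events(stream_chunks(n_chunks=100, chunk_size=1000))
```

Because the detector never materializes more than one chunk, the same loop scales in principle from gigabytes to the petabyte volumes the paper discusses, with I/O rather than memory as the limit.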
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
The Marshall Islands Data Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoker, A.C.; Conrado, C.L.
1995-09-01
This report is a resource document of the methods and procedures currently used in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. A useful scientific database requires careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. The success in combining and organizing all radionuclide analyses, sample information and statistical results into a readily accessible form is critical to our project.
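The consolidation step described above, joining sample information with radionuclide results, is the everyday work of such a relational design. The following sketch uses SQLite with invented table and column names; it is not the project's actual schema:

```python
import sqlite3

# Illustrative sketch only: the tables and columns below are invented
# for this example, not taken from the Marshall Islands database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sample (
    sample_id INTEGER PRIMARY KEY,
    atoll     TEXT NOT NULL,
    medium    TEXT NOT NULL,   -- e.g. soil, vegetation, water
    collected TEXT NOT NULL    -- ISO date of collection
);
CREATE TABLE analysis (
    analysis_id INTEGER PRIMARY KEY,
    sample_id   INTEGER NOT NULL REFERENCES sample(sample_id),
    nuclide     TEXT NOT NULL, -- e.g. Cs-137
    activity_bq REAL NOT NULL  -- measured activity in becquerels
);
""")
conn.execute("INSERT INTO sample VALUES (1, 'Bikini', 'soil', '1994-06-01')")
conn.execute("INSERT INTO analysis VALUES (1, 1, 'Cs-137', 42.5)")

# Consolidation query: combine sample tracking info with analysis results.
row = conn.execute("""
    SELECT s.atoll, a.nuclide, a.activity_bq
    FROM sample s JOIN analysis a ON a.sample_id = s.sample_id
""").fetchone()
```

Keeping sample metadata and analytical results in separate, keyed tables is what makes it possible to incorporate new measurements as they are generated without restructuring existing data.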
NASA Technical Reports Server (NTRS)
Schmerling, E. R.
1977-01-01
Spacelab was developed by the European Space Agency for conducting scientific and technological experiments in space. Spacelab can be taken into earth orbit by the Space Shuttle and returned to earth after a period of 1-3 weeks. The Spacelab modular system of pallets, pressurized modules, and racks can contain large payloads with high power and telemetry requirements. A working group has defined the 'Atmospheres, Magnetospheres, and Plasmas-in-Space' project. The project objectives include the absolute measurement of solar flux in a number of carefully selected bands at the same time at which atmospheric measurements are made. NASA is committed to the concept that the scientist is to play a key role in its scientific programs.
Lakeside: Merging Urban Design with Scientific Analysis
Guzowski, Leah; Catlett, Charlie; Woodbury, Ed
2018-01-16
Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding about the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.
THE VIRTUAL INSTRUMENT: SUPPORT FOR GRID-ENABLED MCELL SIMULATIONS
Casanova, Henri; Berman, Francine; Bartol, Thomas; Gokcay, Erhan; Sejnowski, Terry; Birnbaum, Adam; Dongarra, Jack; Miller, Michelle; Ellisman, Mark; Faerman, Marcio; Obertelli, Graziano; Wolski, Rich; Pomerantz, Stuart; Stiles, Joel
2010-01-01
Ensembles of widely distributed, heterogeneous resources, or Grids, have emerged as popular platforms for large-scale scientific applications. In this paper we present the Virtual Instrument project, which provides an integrated application execution environment that enables end-users to run and interact with running scientific simulations on Grids. This work is performed in the specific context of MCell, a computational biology application. While MCell provides the basis for running simulations, its capabilities are currently limited in terms of scale, ease-of-use, and interactivity. These limitations preclude usage scenarios that are critical for scientific advances. Our goal is to create a scientific “Virtual Instrument” from MCell by allowing its users to transparently access Grid resources while being able to steer running simulations. In this paper, we motivate the Virtual Instrument project and discuss a number of relevant issues and accomplishments in the area of Grid software development and application scheduling. We then describe our software design and report on the current implementation. We verify and evaluate our design via experiments with MCell on a real-world Grid testbed. PMID:20689618
QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations
NASA Astrophysics Data System (ADS)
Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas
2008-10-01
Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt et al., Angew. Chem. Int. Ed. 47, 924 (2008); C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
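To give a flavor of the repetitive sampling workload that PRC distributes across volunteer PCs, the sketch below runs classical Metropolis Monte Carlo on a 1D Ising ring. This is a deliberately simplified classical stand-in, not the quantum Monte Carlo method implemented in Spinhenge@home:

```python
import math
import random

def ising_energy(spins, J=1.0):
    """Energy of a 1D Ising ring with nearest-neighbour coupling J."""
    n = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def metropolis(n_spins=50, n_steps=20000, beta=2.0, seed=1):
    """Classical Metropolis sampling: propose single-spin flips and
    accept with probability min(1, exp(-beta * dE))."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n_spins)]
    energy = ising_energy(spins)
    for _ in range(n_steps):
        i = rng.randrange(n_spins)
        # Energy change of flipping spin i: only its two neighbours matter.
        dE = 2.0 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
            energy += dE
    return spins, energy

spins, energy = metropolis()
```

Each such run is cheap and independent, which is exactly why the many repetitions over parameters (temperature, field, seed) map naturally onto a BOINC-style pool of part-time volunteer machines.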
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shea, M.
1995-09-01
The proper isolation of radioactive waste is one of today's most pressing environmental issues. Research is being carried out by many countries around the world in order to answer critical and perplexing questions regarding the safe disposal of radioactive waste. Natural analogue studies are an increasingly important facet of this international research effort. The Pocos de Caldas Project represents a major effort of the international technical and scientific community towards addressing one of modern civilization's most critical environmental issues - radioactive waste isolation.
Lehner, Thomas; Senthil, Geetha; Addington, Anjené M
2015-01-01
After many years of unfulfilled promise, psychiatric genetics has seen an unprecedented number of successes in recent years. We hypothesize that the field has reached an inflection point through a confluence of four key developments: advances in genomics; the orientation of the scientific community around large collaborative team-science projects; the development of sample and data repositories; and a policy framework for sharing and accessing these resources. We discuss these domains and their effect on scientific progress and provide a perspective on why we think this is only the beginning of a new era in scientific discovery. Published by Elsevier Inc.
Queuing Models of Tertiary Storage
NASA Technical Reports Server (NTRS)
Johnson, Theodore
1996-01-01
Large-scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
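A minimal example of the queuing approach is an M/M/1 simulation of a single archive server, which can be checked against the closed-form mean sojourn time 1/(μ − λ). This is an illustrative sketch, far simpler than the multi-component RSL models of the paper:

```python
import random

def mm1_sojourn(arrival_rate, service_rate, n_jobs=200000, seed=7):
    """Simulate an M/M/1 queue (Poisson arrivals, exponential service),
    a toy stand-in for one RSL component such as the robot arm.
    Returns the mean time a request spends waiting plus in service."""
    rng = random.Random(seed)
    clock = 0.0    # arrival time of the current job
    free_at = 0.0  # time at which the server next becomes idle
    total = 0.0
    for _ in range(n_jobs):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, free_at)              # wait if server is busy
        service = rng.expovariate(service_rate)
        free_at = start + service
        total += free_at - clock                 # sojourn = wait + service
    return total / n_jobs

# Theory: mean sojourn time of a stable M/M/1 queue is 1 / (mu - lambda).
sim = mm1_sojourn(arrival_rate=0.5, service_rate=1.0)
theory = 1.0 / (1.0 - 0.5)
```

Validating a simulated component against a known analytic result like this is a standard sanity check before composing components into the kind of queuing network model the paper builds.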
NASA Astrophysics Data System (ADS)
Currell, Matthew J.; Werner, Adrian D.; McGrath, Chris; Webb, John A.; Berkman, Michael
2017-05-01
Understanding and managing impacts from mining on groundwater-dependent ecosystems (GDEs) and other groundwater users requires development of defensible science supported by adequate field data. This usually leads to the creation of predictive models and analysis of the likely impacts of mining and their accompanying uncertainties. The identification, monitoring and management of impacts on GDEs are often a key component of mine approvals, which need to consider and attempt to minimise the risks that negative impacts may arise. Here we examine a case study where approval for a large mining project in Australia (Carmichael Coal Mine) was challenged in court on the basis that it may result in more extensive impacts on a GDE (Doongmabulla Springs) of high ecological and cultural significance than predicted by the proponent. We show that throughout the environmental assessment and approval process, significant data gaps and scientific uncertainties remained unresolved. Evidence shows that the assumed conceptual hydrogeological model for the springs could be incorrect, and that at least one alternative conceptualisation (that the springs are dependent on a deep fault) is consistent with the available field data. Assumptions made about changes to spring flow as a consequence of mine-induced drawdown also appear problematic, with significant implications for the spring-fed wetlands. Despite the large scale of the project, it appears that critical scientific data required to resolve uncertainties and construct robust models of the springs' relationship to the groundwater system were lacking at the time of approval, contributing to uncertainty and conflict. For this reason, we recommend changes to the approval process that would require a higher standard of scientific information to be collected and reviewed, particularly in relation to key environmental assets during the environmental impact assessment process in future projects.
Using Group Research Projects to Stimulate Undergraduate Astronomy Major Learning
NASA Astrophysics Data System (ADS)
McGraw, Allison M.; Hardegree-Ullman, K. K.; Turner, J. D.; Shirley, Y. L.; Walker-LaFollette, A. M.; Robertson, A. N.; Carleton, T. M.; Smart, B. M.; Towner, A. P. M.; Wallace, S. C.; Smith, C. W.; Small, L. C.; Daugherty, M. J.; Guvenen, B. C.; Crawford, B. E.; Austin, C. L.; Schlingman, W. M.
2012-05-01
The University of Arizona Astronomy Club has been working on two large group research projects since 2009. One is a transiting extrasolar planet project that is fully student-led and student-run. We observed the transiting exoplanets TrES-3b and TrES-4b with the 1.55-meter Kuiper Telescope in near-UV and optical filters in order to detect any asymmetries between filters. The second project is a radio astronomy survey utilizing the Arizona Radio Observatory 12m telescope on Kitt Peak to study molecular gas in cold cores identified by the Planck all-sky survey. This project provides a unique opportunity for a large group of students to get hands-on experience observing with a world-class radio observatory. These projects involve students in every step of the process, including: proposal writing to obtain telescope time on various Southern Arizona telescopes, observing at these telescopes, data reduction and analysis, managing large data sets, and presenting results at scientific meetings and in journal publications. The primary goal of these projects is to involve students in cutting-edge research early in their undergraduate studies. The projects are designed to be continuous, long-term projects so that new students can easily join. As of January 2012 the extrasolar planet project became an official independent study class. New students learn from the more experienced students on the projects, creating a learner-centered environment.
NASA Astrophysics Data System (ADS)
Peppoloni, Silvia; Di Capua, Giuseppe; Haslinger, Florian
2017-04-01
Over recent years, attention to the ethical and social aspects of scientific research has grown remarkably. Large scientific projects that deal with the environment, resources, or natural hazards assign great importance to the topics of big data and data management, environmental impact, and science dissemination and education. These topics are also analyzed from an ethical and social perspective, recognizing their close relation to, and evident repercussions on, the life and activity of the human communities touched by those projects. ENVRIplus is a Horizon 2020 project in which ethics applied to geosciences features as a fundamental issue, at the base of scientific activities. It brings together Environmental and Earth System Research Infrastructures (RIs), projects, and networks, with technical specialist partners to create a more coherent, interdisciplinary and interoperable cluster of Environmental Research Infrastructures across Europe (http://www.envriplus.eu/). Within the theme "Societal relevance and understanding", an entire work package aims at developing an ethical framework for RIs. Its objectives are: • increase the awareness of both scientists and the public of the importance of ethical aspects in Earth and Environmental sciences; • establish a shared ethical reference framework, to be adopted by RIs' governing bodies; • increase the awareness of RIs' management and operational levels, and of the individual scientists involved, of their social role in conducting research activities and in the research work environment; • assess the ethical and social aspects related to the results achieved and deliverables released within the project.
As one element of this work we created a questionnaire to investigate how each RI participating in ENVRIplus faces ethical issues in relation to its activities, and thus to understand how aware the researchers and technicians involved in the project are of the ethical implications of their scientific activities. Here we present and discuss the results of this survey, together with the next steps towards the formulation of an ethical reference framework.
Scientific Visualization Tools for Enhancement of Undergraduate Research
NASA Astrophysics Data System (ADS)
Rodriguez, W. J.; Chaudhury, S. R.
2001-05-01
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Typically, researchers must fully understand the intricacies of the dataset and either write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate scientific visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions where the researcher can change the scales in the three dimensions, color tables and degree of smoothing interactively to focus on particular phenomena.
SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher order decisions based on large multidimensional sets of data while diminishing the level of frustration that results from dealing with the details of processing large data sets.
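The kind of "atmospheric slice" a tool like EzSAGE exposes interactively amounts to fixing one index of a multidimensional grid. The toy sketch below uses nested Python lists and an invented synthetic field rather than real SAGE II data:

```python
def make_grid(n_lat=4, n_lon=5, n_alt=3):
    """Synthetic 3D field standing in for a gridded aerosol product.
    Each value encodes its (lat, lon, alt) indices so the slicing
    below is easy to verify by eye."""
    return [[[lat * 100 + lon * 10 + alt for alt in range(n_alt)]
             for lon in range(n_lon)]
            for lat in range(n_lat)]

def altitude_slice(grid, alt):
    """Extract a 2D lat x lon slice at a fixed altitude level, the
    basic operation behind an interactive 'atmospheric slice' view."""
    return [[column[alt] for column in row] for row in grid]

grid = make_grid()
layer = altitude_slice(grid, alt=2)
```

Hiding this indexing behind an interactive control is precisely what spares students from writing format-specific extraction code before they can start interpreting the data.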
ERIC Educational Resources Information Center
Meydan, Ali
2017-01-01
Scientific research projects competitions for high school students have been held by TUBITAK (The Scientific and Technological Research Council of Turkey) since 1969. Whereas only projects on science were taken into the scope of competition for long years, the projects appropriate to the interdisciplinary approach such as social sciences projects…
NASA Astrophysics Data System (ADS)
Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.
2007-12-01
Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. 
Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
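Attribute-value pairing in plain text, as used for the Earthworks seismological metadata, can be consumed by downstream codes with a few lines of parsing. The "key = value" layout and field names below are assumed for illustration, not the actual SCEC format:

```python
def parse_metadata(text):
    """Parse plain-text attribute-value metadata into a dict.
    Blank lines and '#' comments are skipped; everything before the
    first '=' is the attribute name, everything after is the value."""
    meta = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        meta[key.strip()] = value.strip()
    return meta

# Hypothetical metadata emitted by an upstream simulation code.
example = """
# seismological metadata (illustrative field names)
event_id   = shakeout_1
magnitude  = 7.8
grid_spacing_m = 200
"""
meta = parse_metadata(example)
```

The simplicity is the appeal, and also the limitation the authors note: values carry no types or units, which is one motivation for the "more advanced handling methods" they identify as a need.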
The BioMart community portal: an innovative alternative to large, centralized data repositories
USDA-ARS?s Scientific Manuscript database
The BioMart Community Portal (www.biomart.org) is a community-driven effort to provide a unified interface to biomedical databases that are distributed worldwide. The portal provides access to numerous database projects supported by 30 scientific organizations. It includes over 800 different biologi...
A Climate Change Course for Undergraduate Students
ERIC Educational Resources Information Center
Nam, Y.; Ito, E.
2011-01-01
For the past 10 years, a climate change course has been offered in a large Midwest university. This course has been focusing on improving college students' scientific knowledge of climate change and human interactions using historical evidence as well as improving their information literacy in science through a course project that requires…
The CAVE (TM) automatic virtual environment: Characteristics and applications
NASA Technical Reports Server (NTRS)
Kenyon, Robert V.
1995-01-01
Virtual reality may best be defined as the wide-field presentation of computer-generated, multi-sensory information that tracks a user in real time. In addition to the more well-known modes of virtual reality -- head-mounted displays and boom-mounted displays -- the Electronic Visualization Laboratory at the University of Illinois at Chicago recently introduced a third mode: a room constructed from large screens on which the graphics are projected onto three walls and the floor. The CAVE is a multi-person, room-sized, high-resolution, 3D video and audio environment. Graphics are rear-projected in stereo onto three walls and the floor, and viewed with stereo glasses. As a viewer wearing a location sensor moves within its display boundaries, the correct perspective and stereo projections of the environment are updated, and the image moves with and surrounds the viewer. The other viewers in the CAVE are like passengers in a bus, along for the ride. 'CAVE,' the name selected for the virtual reality theater, is both a recursive acronym (Cave Automatic Virtual Environment) and a reference to 'The Simile of the Cave' found in Plato's 'Republic,' in which the philosopher explores the ideas of perception, reality, and illusion. Plato used the analogy of a person facing the back of a cave alive with shadows that are his/her only basis for ideas of what real objects are. Rather than having evolved from video games or flight simulation, the CAVE has its motivation rooted in scientific visualization and the SIGGRAPH 92 Showcase effort. The CAVE was designed to be a useful tool for scientific visualization. The Showcase event was an experiment; the Showcase chair and committee advocated an environment for computational scientists to interactively present their research at a major professional conference in a one-to-many format on high-end workstations attached to large projection screens.
The CAVE was developed as a 'virtual reality theater' with scientific content and projection that met the criteria of Showcase.
Future Sky Surveys: New Discovery Frontiers
NASA Astrophysics Data System (ADS)
Tyson, J. Anthony; Borne, Kirk D.
2012-03-01
Driven by the availability of new instrumentation, there has been an evolution in astronomical science toward comprehensive investigations of new phenomena. Major advances in our understanding of the Universe over the history of astronomy have often arisen from dramatic improvements in our capability to observe the sky to greater depth, in previously unexplored wavebands, with higher precision, or with improved spatial, spectral, or temporal resolution. Substantial progress in the important scientific problems of the next decade (determining the nature of dark energy and dark matter, studying the evolution of galaxies and the structure of our own Milky Way, opening up the time domain to discover faint variable objects, and mapping both the inner and outer Solar System) can be achieved through the application of advanced data mining methods and machine learning algorithms operating on the numerous large astronomical databases that will be generated from a variety of revolutionary future sky surveys. Over the next decade, astronomy will irrevocably enter the era of big surveys and of really big telescopes. New sky surveys (some of which will produce petabyte-scale data collections) will begin their operations, and one or more very large telescopes (ELTs = Extremely Large Telescopes) will enter the construction phase. These programs and facilities will generate a remarkable wealth of data of high complexity, endowed with enormous scientific knowledge discovery potential. New parameter spaces will be opened, in multiple wavelength domains as well as the time domain, across wide areas of the sky, and down to unprecedented faint source flux limits. 
The synergies of grand facilities, massive data collections, and advanced machine learning algorithms will come together to enable discoveries within most areas of astronomical science, including Solar System, exo-planets, star formation, stellar populations, stellar death, galaxy assembly, galaxy evolution, quasar evolution, and cosmology. Current and future sky surveys, comprising an alphabet soup of project names (e.g., Pan-STARRS, WISE, Kepler, DES, VST, VISTA, GAIA, EUCLID, SKA, LSST, and WFIRST; some of which are discussed in Chapters 17, 18, and 20), will contribute to the exponential explosion of complex data in astronomy. The scientific goals of these projects are as monumental as the programs themselves. The core scientific output of all of these will be their scientific data collection. Consequently, data mining and machine learning algorithms and specialists will become a common component of future astronomical research with these facilities. This synergistic combination and collaboration among multiple disciplines are essential in order to maximize the scientific discovery potential, the science output, the research efficiency, and the success of these projects.
NASA Astrophysics Data System (ADS)
Bundschuh, V.; Grueter, J. W.; Kleemann, M.; Melis, M.; Stein, H. J.; Wagner, H. J.; Dittrich, A.; Pohlmann, D.
1982-08-01
A preliminary study was undertaken before a large-scale project for the construction and survey of about a hundred solar houses was launched. The notion of a solar house was defined, and possible uses of solar energy (hot water preparation, room heating, swimming pool heating, or a combination of these) were examined. A coherent measuring program was set up. Advantages and disadvantages of the large-scale project were reviewed. Production of hot water, evaluation of different concepts and fabrications of solar systems, coverage of the different systems, conservation of energy, failure frequency and failure statistics, durability of the installation, and investment, maintenance and energy costs were retained as study parameters. Different solar hot water production systems and the heat meter used for measurements are described.
A decade of human genome project conclusion: Scientific diffusion about our genome knowledge.
Moraes, Fernanda; Góes, Andréa
2016-05-06
The Human Genome Project (HGP) was initiated in 1990 and completed in 2003. It aimed to sequence the whole human genome. Although it represented an advance in understanding the human genome and its complexity, many questions remained unanswered. Other projects were launched in order to unravel the mysteries of our genome, including the ENCyclopedia of DNA Elements (ENCODE). This review aims to analyze the evolution of scientific knowledge related to both the HGP and ENCODE projects. Data were retrieved from scientific articles published in 1990-2014, a period comprising the development and the 10 years following the HGP completion. One of the most striking HGP results is that only about 20,000 genes are protein- and RNA-coding. A new concept of genome organization arose. The ENCODE project was initiated in 2003 and aimed to map the functional elements of the human genome. This project revealed that the human genome is pervasively transcribed. Therefore, it was determined that a large part of the non-protein-coding regions are functional. Finally, a more sophisticated view of chromatin structure emerged. The mechanistic functioning of the genome has been redrafted, revealing a much more complex picture. Besides, a gene-centric conception of the organism has to be reviewed. A number of criticisms have emerged against the ENCODE project approaches, raising the question of whether non-conserved but biochemically active regions are truly functional. Thus, the HGP and ENCODE projects produced a great map of the human genome, but the data generated still require further in-depth analysis. © 2016 by The International Union of Biochemistry and Molecular Biology, 44:215-223, 2016.
NANOCOSMOS: a trip to the nanoworld
NASA Astrophysics Data System (ADS)
Ruiz Zelmanovitch, N.; Castellanos, M.
2017-03-01
Cosmic dust is made in evolved stars; however, the processes involved in its formation and evolution remain largely unknown. The project "Gas and dust from stars to the laboratory: exploring the NANOCOSMOS" takes advantage of the new observational capabilities (increased angular resolution) of the Atacama Large Millimeter/submillimeter Array (ALMA) to unveil the physical and chemical conditions in the dust-formation zone of evolved stars. These observations, combined with novel top-level ultra-high-vacuum experiments and astrophysical modelling, will provide a cutting-edge view of cosmic dust. The importance of publishing results based on NANOCOSMOS in the scientific literature goes without saying, but it is also a stated NANOCOSMOS objective to disseminate the project's scientific and technological achievements to a wider audience. In this presentation we discuss the tools used to bring those results to society. The presentation is structured as follows: 1. What is Astrochemistry?; 2. What is NANOCOSMOS?; 3. Outreach in the NANOCOSMOS programme; 4. Conclusions.
Application of logic models in a large scientific research program.
O'Keefe, Christine M; Head, Richard J
2011-08-01
It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts mission-driven scientific research focussed on delivering results with relevance and impact for Australia, where impact is defined and measured in economic, environmental and social terms at the national level. The Australian Government has recently signalled an increasing emphasis on performance assessment and evaluation, which in the CSIRO context implies an increasing emphasis on ensuring and demonstrating the impact of its research programs. CSIRO continues to develop and improve its approaches to impact planning and evaluation, including conducting a trial of a program logic approach in the CSIRO Preventative Health National Research Flagship. During the trial, improvements were observed in clarity of the research goals and path to impact, as well as in alignment of science and support function activities with national challenge goals. Further benefits were observed in terms of communication of the goals and expected impact of CSIRO's research programs both within CSIRO and externally. The key lesson learned was that significant value was achieved through the process itself, as well as the outcome. Recommendations based on the CSIRO trial may be of interest to managers of scientific research considering developing similar logic models for their research projects. The CSIRO experience has shown that there are significant benefits to be gained, especially if the project participants have a major role in the process of developing the logic model. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Squibb, Gael F.
1984-10-01
The operation teams for the Infrared Astronomical Satellite (IRAS) included scientists from the IRAS International Science Team. The scientific decisions on an hour-to-hour basis, as well as the long-term strategic decisions, were made by science team members. The IRAS scientists were involved in the analysis of the instrument performance, the analysis of the quality of the data, the decision to reacquire data that was contaminated by radiation effects, the strategy for acquiring the survey data, and the process for using the telescope for additional observations, as well as the processing decisions required to ensure the publication of the final scientific products by the end of flight operations plus one year. Early in the project, two science team members were selected to be responsible for the scientific operational decisions. One, located at the operations control center in England, was responsible for the scientific aspects of the satellite operations; the other, located at the scientific processing center in Pasadena, was responsible for the scientific aspects of the processing. These science team members were responsible for approving the design and test of the tools to support their responsibilities and then, after launch, for using these tools in making their decisions. The ability of the project to generate the final science data products one year after the end of flight operations is due in large measure to the active participation of the science team members in the operations. This paper presents a summary of the operational experiences gained from this scientific involvement.
Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.
The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, the project is implementing and optimizing new computationally efficient tracer-advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
Communicating the Issue of Underwater Noise Pollution: The Deaf as a Fish Project.
Sebastianutto, Linda; Stocker, Michael; Picciulin, Marta
2016-01-01
Aquatic noise pollution is largely ignored by the lay public. How experts communicate this issue is critical to move public opinion. In 2010, the Cassa di Risparmio di Gorizia (CaRiGO) bank sponsored the Deaf as a Fish project that included local underwater noise monitoring, a boat census, a pamphlet for nonexperts, and some seminars and public meetings. This project allowed us to raise interest in this issue. Using accurate and understandable language in a light-humored setting goes far toward cultivating trust from a public audience that can be intimidated or suspicious of complicated scientific messaging.
Traveling Wave Amplifier Driven by a Large Diameter Annular Electron Beam in a Disk-Loaded Structure
2015-10-30
Project Officer: Mary Lou Robinson, DR-IV, Chief, High Power Electromagnetics Division. This report is published in the interest of scientific and... unlimited. Supplementary notes: OPS-15-9244. Abstract: This project studies the viability of a high-power traveling wave tube (TWT) using a novel... CHRISTINE codes. Fair agreement was observed. The preliminary conclusion is that the disk-on-rod TWT is a viable, high-power extension to the conventional...
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
The SERENDIP 2 SETI project: Current status
NASA Technical Reports Server (NTRS)
Bowyer, C. S.; Werthimer, D.; Donnelly, C.; Herrick, W.; Lampton, M.
1991-01-01
Over the past 30 years, interest in extraterrestrial intelligence has progressed from philosophical discussion to rigorous scientific attempts to make contact. Since it is impossible to assess the probability of success or the amount of telescope time needed for detection, Search for Extraterrestrial Intelligence (SETI) projects are plagued by the problem of obtaining the large amounts of time needed on the world's precious few large radio telescopes. To circumvent this problem, the Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations (SERENDIP) instrument operates autonomously in a piggyback mode, utilizing whatever observing plan is chosen by the primary observer. In this way, large quantities of high-quality data can be collected in a cost-effective and unobtrusive manner. During normal operations, SERENDIP logs statistically significant events for further offline analysis. Because of the large number of terrestrial and near-space transmitters, a major element of the SERENDIP project involves identifying and rejecting spurious signals from these sources. Another major element of the SERENDIP project (as of most other SETI efforts) is the follow-up of candidate extraterrestrial intelligence (ETI) signals: events selected as candidates are studied further in a targeted search program that utilizes 24 to 48 hours of dedicated telescope time each year.
ERIC Educational Resources Information Center
Kollasch, Aurelia Wiktoria
2012-01-01
Today large research projects require substantial involvement of researchers from different organizations, disciplines, or cultures working in groups or teams to accomplish a common goal of producing, sharing, and disseminating scientific knowledge. This study focuses on the international research team that was launched in response to pressing…
ERIC Educational Resources Information Center
Schmitt-Harsh, Mikaela; Harsh, Joseph A.
2013-01-01
In the past decade, systematic studies have indicated a significant regression in scientific literacy in nonscience students and students across science, technology, engineering, and mathematics disciplines in higher education. Of particular concern, evaluations of introductory lecture-based undergraduate courses have indicated deficiencies in…
On the Merits of "Unusual Field Trips."
ERIC Educational Resources Information Center
Howarth, Dean E.
1999-01-01
Describes the organization and completion of a scientific field trip for a group of high school physics students that was organized primarily around a study of the Manhattan Project. The group visited the Trinity site, Los Alamos, several museums, and the National Radio Astronomy Observatory Very Large Array. Contact information for the various…
Scientific Literacy of Adult Participants in an Online Citizen Science Project
ERIC Educational Resources Information Center
Price, Charles Aaron
2011-01-01
Citizen Science projects offer opportunities for non-scientists to take part in scientific research. Scientific results from these projects have been well documented. However, there is limited research about how these projects affect their volunteer participants. In this study, I investigate how participation in an online, collaborative…
National Science Board Approves VLA Expansion
NASA Astrophysics Data System (ADS)
2001-11-01
The National Science Board, the governing body for the National Science Foundation (NSF), has approved an expansion project for the Very Large Array (VLA) radio telescope in New Mexico. The board recommended an NSF award of approximately $58.3 million for the project over the next decade. The action came at the Board's meeting in Washington on Nov. 15. "This approval means that the VLA, already the most scientifically productive ground-based telescope in all of astronomy, will remain at the cutting edge of astrophysical research through the coming decades," said Paul Vanden Bout, director of the National Radio Astronomy Observatory (NRAO). The expansion project will replace aging equipment left over from the VLA's construction during the 1970s with modern technology, improving the VLA's scientific capabilities more than tenfold. Using the existing 27 dish antennas, each weighing 230 tons, the Expanded VLA will have greatly improved ability to image distant celestial objects and to decipher the physical nature of those objects. In addition to the $58.3 million NSF allocation, the governments of Canada and Mexico plan to provide funding for the VLA expansion. The VLA Expansion Project was formally proposed last year to the NSF, which owns the VLA. Also last year, the project received a strong endorsement from the Astronomy and Astrophysics Survey Committee of the National Research Council, the working arm of the National Academies of Sciences and Engineering. That committee had been given the task of setting nationwide priorities for astronomy spending over the next decade. The Survey Committee report listed the Expanded VLA as an important contributor to new understanding in three high-priority research areas for the next decade: studies of star and planet formation; research into black holes; and unraveling details about the "dawn of the modern universe."
Dedicated in 1980, the VLA is the most powerful, flexible, and widely used radio telescope in the world. It brought dramatically improved observational capabilities to the scientific community two decades ago and has contributed significantly to nearly every branch of astronomy. More than 2,200 scientists have used the VLA for more than 10,000 separate observing projects, and astronomers seek more than twice as much VLA observing time as can be provided. Since the VLA's dedication, many technical improvements have made it much more capable than its original design contemplated. However, some of the technologies incorporated into the VLA during its construction, while highly advanced for their time, now limit its capabilities. The VLA Expansion Project will replace those older technologies with modern equipment, allowing the VLA to realize its full potential as a tool for scientific research. "Keeping the VLA at the forefront of technology is an important priority, and we are fortunate that Sen. Pete Domenici (R-NM) has recognized this for many years. We appreciate his longtime support for this valuable scientific facility," Vanden Bout said. "Senator Domenici is one of the VLA's strongest advocates and, as a leader in the U.S. Senate, has continually supported the VLA and its expansion in Congress and the Federal Government," Vanden Bout added. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
MeerKAT Science: On the Pathway to the SKA
NASA Astrophysics Data System (ADS)
MeerKAT is a next-generation radio telescope under construction on the African SKA central site in the Karoo plateau of South Africa. When completed in 2017, MeerKAT will be a 64-element array of 13.5-m parabolic antennas distributed over an area with a diameter of 8 km. With a combination of wide bandwidth and field of view, and with its large number of antennas and total collecting area, MeerKAT will be one of the world's most powerful imaging telescopes operating at GHz frequencies. MeerKAT is a science and technology precursor of the SKA mid-frequency dish array and, following several years of operation as a South African telescope, will be incorporated into the SKA phase-one facility. The MeerKAT science program will consist of a combination of key science, legacy-style large survey projects, and smaller projects based on proposals for open time. This workshop, which took place in Stellenbosch in the Western Cape, was held to discuss and plan the broad range of scientific investigations that will be undertaken during the pre-SKA phase of MeerKAT. Topics covered included: technical development and roll-out of the MeerKAT science capabilities, details of the large survey projects presented by the project teams, science program concepts for open time, commensal programs such as the Search for Extraterrestrial Intelligence, and the impact of MeerKAT on global Very Long Baseline Interferometry. These proceedings serve as a record of the scientific vision of MeerKAT in the year before its completion, foreshadowing a new era of radio astronomy on the African continent.
Science Diplomacy in Large International Collaborations
NASA Astrophysics Data System (ADS)
Barish, Barry C.
2011-04-01
What opportunities and challenges does the rapidly growing internationalization of science, especially large scale science and technology projects, present for US science policy? On one hand, the interchange of scientists, the sharing of technology and facilities and the working together on common scientific goals promotes better understanding and better science. On the other hand, challenges are presented, because the science cannot be divorced from government policies, and solutions must be found for issues varying from visas to making reliable international commitments.
Solar System Visualization (SSV) Project
NASA Technical Reports Server (NTRS)
Todd, Jessida L.
2005-01-01
The Solar System Visualization (SSV) project aims to enhance scientific and public understanding through visual representations and modeling procedures. The SSV project's objectives are to (1) create new visualization technologies, (2) organize science observations and models, and (3) visualize science results and mission plans. The SSV project currently supports the Mars Exploration Rovers (MER) mission, the Mars Reconnaissance Orbiter (MRO), and Cassini. In support of these missions, the SSV team has produced pan-and-zoom animations of large mosaics to reveal details of surface features and topography, created 3-D animations of science instruments and procedures, formed 3-D anaglyphs from left and right stereo pairs, and animated registered multi-resolution mosaics to provide context for microscopic images.
ERIC Educational Resources Information Center
Price, C. Aaron; Lee, Hee-Sun
2013-01-01
Citizen science projects provide non-scientists with opportunities to take part in scientific research. While their contribution to scientific data collection has been well documented, there is limited research on how participation in citizen science projects may affect their scientific literacy. In this study, we investigated (1) how volunteers'…
Cedeno, Diana; Krawicz, Alexandra; Moore, Gary F
2015-06-06
Artificial photosynthesis has been described as the great scientific and moral challenge of our time. We imagine a future in which a significant portion of our energy is supplied by such technologies; however, many scientific, engineering, and policy challenges must be addressed before that future can be realized. The scientific challenges include developing effective strategies to couple light absorption, electron transfer, and catalysis for efficient conversion of light energy to chemical energy, as well as constructing and studying structurally diverse assemblies to carry out these processes. In this article, we review recent efforts from our own research to develop a modular approach to interfacing molecular fuel-production catalysts with visible-light-absorbing semiconductors, and we discuss the role of the interfacing material as a protection layer for both the catalysts and the underpinning semiconductor. In concluding, we briefly discuss the potential benefits of a globally coordinated project on artificial photosynthesis that brings together teams of scientists, engineers, and policymakers, and we offer cautions that such a large, interconnected organization should consider. This article is inspired by, and draws largely from, an invited presentation given by the corresponding author at the Royal Society at Chicheley Hall, home of the Kavli Royal Society International Centre, Buckinghamshire, on the themed meeting topic 'Do we need a global project on artificial photosynthesis?'
2014-04-15
NOAA's GOES-13 satellite watched a large, pesky front that stretched from Maine to Louisiana, in imagery spanning April 13 at 16:15 UTC (12:15 p.m. EDT) through April 16 at 12:15 p.m. EDT. This weather pattern did not bode well for people who wanted to see the lunar eclipse on April 15. The GOES-13 satellite images and animations are created at NASA/NOAA's GOES Project at the NASA Goddard Space Flight Center, Greenbelt, Md. Credit: NOAA/NASA GOES Project. NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
The ICDP Information Network and the Chinese Continental Scientific Drilling CCSD
NASA Astrophysics Data System (ADS)
Conze, R.; Su, D.
2002-12-01
ICDP is an international program investigating the 'System Earth' in multidisciplinary co-operation. Funded drilling projects are characterized by detailed fieldwork at world-class geological sites on the continents and by the global scope of research objectives. During project work, partnering researchers from all over the world work together at remote drill sites and in laboratories at their institutions. Researchers apply a range of highly diverse scientific methodologies, thereby acquiring huge data sets. Multinational co-operation and increasing amounts of scientific data require completely new concepts and practices for scientific work, and place heavy demands on information and communications management. This is achieved by means of the ICDP Information Network. Scientists working on ICDP-related data need a central long-term data archive with powerful tools for navigation, data modeling, and analysis. The Chinese Continental Scientific Drilling (CCSD) project is a national key scientific and engineering project of the PR China supported by ICDP. The current drill site of CCSD is located in Donghai, Jiangsu Province, in the eastern part of the Dabie-Sulu UHP metamorphic belt, which possesses global geological significance. From spudding-in on June 25, 2001 to April 6, 2002, the 2000-m pilot hole was completed with a total core recovery of 88.7% and an average inclination angle of 3-4 degrees. The pilot hole has been transformed into the main hole by hole opening; deepening and coring of the CCSD-1 main hole is currently in progress. Most of the basic scientific documentation and measurements are done in a large field laboratory directly beside the drill rig, which was set up following the standard of the former German Continental Scientific Drilling program (KTB). It includes a powerful infrastructure for computing and electronic communication as well as a comprehensive twofold data and information management: 1. The CCSD-DMIS is a special Data Management Information System for the Chinese project management, used for internal controlling and decision making. 2. The CCSD-DIS is the specifically designed on-site Drilling Information System, used for documentation and archiving of all kinds of scientific and technical information. Both are used in a local Intranet within the field lab, but they also provide certain information via secured Internet services. The CCSD-DIS feeds the current reports and new recordings day by day to the CCSD Web portal within the ICDP Information Network (http://www.icdp-online.org/html/sites/donghai/news/news.html). This portal provides Chinese and English news and information for the public, as well as scientific and technical material that is available only to the international CCSD Science Team. Using the example of the CCSD project, a poster and an on-line presentation will show the main components and value-added services of the ICDP Information Network, such as: the common portal for dissemination of project information by the ICDP Clearinghouse; capture of scientific drilling data using individual On-Site Drilling Information Systems (DIS); virtual global field laboratories based on eXtended DIS; and integrated evaluation and analysis of data supported by the ICDP Data Webhouse.
NASA Technical Reports Server (NTRS)
1989-01-01
Important and fundamental scientific progress can be attained through space observations at wavelengths longward of 1 micron. The formation of galaxies, stars, and planets; the origin of quasars and the nature of active galactic nuclei; the large-scale structure of the Universe; and the problem of the missing mass are among the major scientific issues that can be addressed by these observations. Significant advances in many areas of astrophysics can be made over the next 20 years by implementing the outlined program, which combines large observatories with smaller projects in an overall scheme that emphasizes complementarity and synergy, advanced technology, community support and development, and the training of the next generation of scientists. Key aspects of the program include the Space Infrared Telescope Facility; the Stratospheric Observatory for Infrared Astronomy; a robust program of small missions; and the creation of the technology base for future major observatories.
Science friction: data, metadata, and collaboration.
Edwards, Paul N; Mayernik, Matthew S; Batcheller, Archer L; Bowker, Geoffrey C; Borgman, Christine L
2011-10-01
When scientists from two or more disciplines work together on related problems, they often face what we call 'science friction'. As science becomes more data-driven, collaborative, and interdisciplinary, demand increases for interoperability among data, tools, and services. Metadata--usually viewed simply as 'data about data', describing objects such as books, journal articles, or datasets--serve key roles in interoperability. Yet we find that metadata may be a source of friction between scientific collaborators, impeding data sharing. We propose an alternative view of metadata, focusing on its role in an ephemeral process of scientific communication, rather than as an enduring outcome or product. We report examples of highly useful, yet ad hoc, incomplete, loosely structured, and mutable, descriptions of data found in our ethnographic studies of several large projects in the environmental sciences. Based on this evidence, we argue that while metadata products can be powerful resources, usually they must be supplemented with metadata processes. Metadata-as-process suggests the very large role of the ad hoc, the incomplete, and the unfinished in everyday scientific work.
The Disappearing Fourth Wall: John Marburger, Science Policy, and the SSC
NASA Astrophysics Data System (ADS)
Crease, Robert
2015-04-01
John H. Marburger (1941-2011) was a skilled science administrator who had a fresh and unique approach to science policy and science leadership. His posthumously published book Science Policy up Close contains recollections of key science policy episodes in which he participated or observed closely. One was the administration of the Superconducting Super Collider (SSC); Marburger was Chairman of the Universities Research Association, the group charged with managing the SSC, from 1988 to 1994. Many accounts of the SSC saga attribute its demise to a combination of transitory factors: poor management, rising cost estimates, the collapse of the Soviet Union and thus of the Cold War threat, complaints by "small science" that the SSC's "big science" was consuming their budget, Congress's desire to cut spending, unwarranted contract regulations imposed by the Department of Energy (DOE) in response to environmental lapses at nuclear weapons laboratories, and so forth. Marburger tells a subtler story whose implications for science policy are more significant and far-reaching. The story involves changes in the attitude of the government towards large scientific projects that reach back to management reforms introduced by the administrations of Presidents Johnson, Nixon, and Carter in the 1960s and 1970s. This experience impressed Marburger with the inevitability of public oversight of large scientific projects, and with the need for planners of such projects to establish and make public a cost and schedule tracking system that would model the project's progress and expenditures.
Tullos, Desiree
2009-07-01
The need to understand and minimize negative environmental outcomes associated with large dams has both contributed to and benefited from the introduction and subsequent improvements in the Environmental Impact Assessment (EIA) process. However, several limitations in the EIA process remain, including those associated with the uncertainty and significance of impact projections. These limitations are directly related to the feedback between science and policy, with information gaps in scientific understanding discovered through the EIA process contributing valuable recommendations on critical focus areas for prioritizing and funding research within the fields of ecological conservation and river engineering. This paper presents an analysis of the EIA process for the Three Gorges Project (TGP) in China as a case study for evaluating this feedback between the EIA and science and policy. For one of the best-studied public development projects in the world, this paper presents an investigation into whether patterns exist between the scientific interest (via number of publications) in environmental impacts and (a) the identification of impacts as uncertain or priority by the EIA, (b) decisions or political events associated with the dam, and (c) impact type. This analysis includes the compilation of literature on TGP, characterization of ecosystem interactions and responses to TGP through a hierarchy of impacts, coding of EIA impacts as "uncertain" impacts that require additional study and "priority" impacts that have particularly high significance, mapping of an event chronology to relate policies, institutional changes, and decisions about TGP as "events" that could influence the focus and intensity of scientific investigation, and analysis of the number of publications by impact type and order within the impact hierarchy. 
From these analyses, it appears that the availability and consistency of scientific information limit the accuracy of environmental impact projections. These analyses also suggest a lack of direct feedback between the EIA process and emerging science, as indicated by the failure of literature to focus on issues related to the design and management of TGP, ultimately challenging the environmental sustainability of the project. While the EIA process has enormous potential for improving both the basic sciences and the planning and sustainability of hydrodevelopment, important institutional changes need to occur for this potential to be realized. This paper concludes with recommendations about those institutional changes needed to improve the feedback between the science and policy, and ultimately the environmental sustainability, of large dams.
Establishment of a National Wind Energy Center at University of Houston
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Su Su
The objectives of this DOE-supported project are to establish a National Wind Energy Center (NWEC) at the University of Houston and to conduct research addressing critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific and technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale wind energy production systems; (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston; and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods for future wind-turbine systems, enabling cost-effective production of offshore wind energy. To achieve these goals, the following technical tasks were planned and executed from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both the basic science and the engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation.
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in the center's large multi-disciplinary projects. These students and researchers are now employed by the wind industry, national labs, and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.
Delivering the EarthScope Transportable Array as a Community Asset
NASA Astrophysics Data System (ADS)
Busby, R. W.; Woodward, R.; Simpson, D. W.; Hafner, K.
2009-12-01
The Transportable Array element of EarthScope/USArray is a culmination of years of coordination and planning for a large science initiative via the NSF MREFC program. US researchers and the IRIS Consortium conceived of the science objectives for a continental-scale array and, together with the geodetic (PBO) and fault drilling (SAFOD) communities and NSF, successfully merged these scientific objectives with a compelling scientific and technical proposal, accompanied by the budget and schedule to accomplish it. The Transportable Array is now an efficient and exacting execution of an immense technical challenge that, by many measures, is yielding exciting science return, both expected and unanticipated. The technical facility is first-rate in its implementation, yet responsive to science objectives and discovery, actively engaging the community in discussion and new direction. The project is carried out by a core of dedicated and professional staff, guided and advised through considerable feedback from science users who have unprecedented access to high-quality data. This, in a sense, lets seismologists focus on research, rather than be administrators, drivers, shippers, battery mules, electronic technicians and radio hams. Now that USArray is operational, it is interesting to reflect on whether the TA, as a professionally executed project, could succeed as well if it were an independent endeavor, managed and operated outside of the resources developed and available through IRIS and its core programs. We detail how the support the USArray facility provides improves data accessibility and enhances interdisciplinary science. We suggest that the resources and community leadership provided by the IRIS Consortium, and the commitment to the principle of free and open data access, have been basic underpinnings for the success of the TA.
This involvement of community-based, scientific leadership in the development of large facilities should be considered in planning future large Earth science or even basic science endeavors. The Global Seismographic Network provides another example where, with strong scientific leadership, the technical objectives have returned far more than expected results from all manner of application of new techniques to high quality data. Again, the key ingredient may be that the project oversight is driven by scientists with free and open access to data and broad and evolving expectations as to how the facility might be applied towards research objectives. Major projects must clearly follow defined plans and budgets; but, while it is important to have managers to motivate schedules and control costs, the energy, vigor and effort to optimize new measures and discover new applications derive from the insights and enthusiasm of the science community.
The Role of Empathy in Preparing Teachers to Tackle Bullying
ERIC Educational Resources Information Center
Murphy, Helena; Tubritt, John; Norman, James O'Higgins
2018-01-01
Much research on bullying behaviour in schools among students has been carried out since the 1970s, when Olweus started a large-scale project in Norway which is now generally regarded as the first scientific study on bullying. Yet, there has been little research on how teachers respond to reports of bullying and tackle bullying behaviour in…
Two Undergraduate Projects for Data Acquisition and Control
NASA Astrophysics Data System (ADS)
Hiersche, Kelly; Pena, Tara; Grogan, Tanner; Wright, Matthew
We are designing two separate instruments for use in our undergraduate laboratory. In the first project, a Raspberry Pi is used to simultaneously monitor a large number of current and voltage readings and store them in a database. In our second project, we are constructing our own microcontrollers to work as a general-purpose interface based on work described in Review of Scientific Instruments 84, 103101 (2013). That design emphasizes low cost and simple construction, making it ideal for undergraduate-level work. This circuit has room for two interchangeable daughter boards, giving it the capability to work as a general lab interface, lock-in detector, or waveform generator.
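The logging half of the Raspberry Pi project can be sketched in a few lines. This is a hedged illustration only: the table layout, channel names, and simulated readings below are our own assumptions, not details from the abstract, but they show the basic pattern of timestamping each reading into a relational store.

```python
import sqlite3
import time

def log_reading(conn, channel, value):
    """Insert one timestamped channel reading into the database."""
    conn.execute(
        "INSERT INTO readings (ts, channel, value) VALUES (?, ?, ?)",
        (time.time(), channel, value),
    )
    conn.commit()

# An in-memory database stands in for the Pi's on-disk store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts REAL, channel TEXT, value REAL)")

# On the actual instrument these values would come from an ADC;
# here we log a few invented current/voltage samples.
for ch, v in [("V0", 4.98), ("I0", 0.012), ("V1", 3.31)]:
    log_reading(conn, ch, v)

rows = conn.execute("SELECT channel, value FROM readings").fetchall()
print(rows)  # three (channel, value) pairs
```

Parameterized queries keep the logging loop safe and fast; on real hardware the loop body would poll the ADC instead of iterating over a fixed list.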
A critique of Lilienfeld et al.'s (2000) "The scientific status of projective techniques".
Hibbard, Stephen
2003-06-01
Lilienfeld, Wood, and Garb (2000) published a largely negative critique of the validity and reliability of projective methods, concentrating on the Comprehensive System for the Rorschach (Exner, 1993), 3 systems for coding the Thematic Apperception Test (TAT; Murray, 1943) cards, and human figure drawings. This article is an effort to document and correct what I perceive as errors of omission and commission in the Lilienfeld et al. article. When projective measures are viewed in the light of these corrections, the evidence for the validity and clinical usefulness of the Rorschach and TAT methods is more robust than Lilienfeld et al. represented.
BRAVO (Brazilian Astrophysical Virtual Observatory): data mining development
NASA Astrophysics Data System (ADS)
De Carvalho, R. R.; Capelato, H. V.; Velho, H. C.
2007-08-01
The primary goal of the BRAVO project is to generate investment in information technology, with particular emphasis on data mining and statistical analysis. From a scientific standpoint, the participants assembled to date are engaged in several scientific projects in various fields of cosmology, astrophysics, and data analysis, with significant contributions from international partners. These scientists conduct research on clusters of galaxies, small groups of galaxies, elliptical galaxies, population synthesis, N-body simulations, and a variety of studies in stellar astrophysics. One of the main aspects of this project is the incorporation of these disparate areas of astrophysical research within the context of the coherent development of database technology. Observational cosmology is one of the branches of science experiencing the largest growth in the past few decades. Large photometric and spectroscopic surveys have been carried out in both hemispheres. As a result, an extraordinary amount of data in all portions of the electromagnetic spectrum exists, but without standard techniques for storage and distribution. This project will utilize several specific astronomical databases, created to store data generated by several instruments (including SOAR, Gemini, BDA, etc.), uniting them within a common framework and with standard interfaces. We are inviting members of the entire Brazilian astronomical community to partake in this effort. This will certainly impact both education and outreach efforts, as well as the future development of astrophysical research. Finally, this project will provide a constant investment in human resources. First, it will do so by stimulating ongoing short technical visits to Johns Hopkins University and Caltech. These will allow us to bring software technology and expertise in data mining back to Brazil.
Second, we will organize the Summer School on Software Technology in Astrophysics, which will be designed to ensure that the Brazilian scientific community can take full advantage of the benefits offered by the VO project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinath Vadlamani; Scott Kruger; Travis Austin
Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL via PETSc of the DOE SciDAC TOPS for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We implemented the multigrid solvers on the fusion test problem that allows for real matrix systems with success, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
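The abstract gives no implementation details of the HYPRE/PETSc preconditioners, but the core multigrid idea it relies on can be illustrated with a toy two-grid V-cycle for a 1D Poisson problem. Everything here (grid size, smoother, transfer operators) is our own illustrative choice; real extended-MHD operators are far harder to invert.

```python
import numpy as np

def smooth(u, f, h, iters=3, omega=2/3):
    """Weighted-Jacobi smoothing for the 1D Poisson problem -u'' = f."""
    for _ in range(iters):
        u[1:-1] += omega * 0.5 * (u[:-2] + u[2:] - 2*u[1:-1] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2*u[1:-1] + u[2:]) / (h*h)
    return r

def coarse_solve(r2, h2):
    """Direct solve of the coarse Poisson problem (interior points only)."""
    m = len(r2) - 2
    A = (2*np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / (h2*h2)
    e2 = np.zeros_like(r2)
    e2[1:-1] = np.linalg.solve(A, r2[1:-1])
    return e2

def two_grid(u, f, h):
    """One V-cycle: pre-smooth, coarse-grid correction, post-smooth."""
    u = smooth(u, f, h)
    r = residual(u, f, h)
    r2 = r[::2].copy()                 # restriction (full weighting below)
    r2[1:-1] = 0.25*r[1:-2:2] + 0.5*r[2:-1:2] + 0.25*r[3::2]
    e2 = coarse_solve(r2, 2*h)
    e = np.zeros_like(u)               # linear-interpolation prolongation
    e[::2] = e2
    e[1:-1:2] = 0.5 * (e[:-2:2] + e[2::2])
    u += e
    return smooth(u, f, h)

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)   # discrete solution approximates sin(pi x)
u = np.zeros(n)
r0 = np.linalg.norm(residual(u, f, h))
for _ in range(20):
    u = two_grid(u, f, h)
r1 = np.linalg.norm(residual(u, f, h))
print(f"residual: {r0:.2e} -> {r1:.2e}")  # drops by many orders of magnitude
```

The point of the sketch is the scalability argument in the abstract: each V-cycle costs O(n) work yet reduces the residual by a roughly grid-independent factor, which is why multigrid is attractive as a petascale preconditioner.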
Use of a metadata documentation and search tool for large data volumes: The NGEE arctic example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devarakonda, Ranjeet; Hook, Leslie A; Killeffer, Terri S
The Online Metadata Editor (OME) is a web-based tool to help document scientific data in a well-structured, popular scientific metadata format. In this paper, we will discuss the newest tool that Oak Ridge National Laboratory (ORNL) has developed to generate, edit, and manage metadata, and how it is helping data-intensive science centers and projects, such as the U.S. Department of Energy's Next Generation Ecosystem Experiments (NGEE) in the Arctic, to prepare metadata and make their big data produce big science and lead to new discoveries.
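A metadata record of the kind such a tool manages can be sketched as structured data with a minimal completeness check. The field names and values below are purely illustrative assumptions, not the actual OME schema or any real NGEE dataset.

```python
import json

# Hypothetical minimal metadata record; fields are invented for illustration.
record = {
    "title": "Soil temperature, hypothetical NGEE Arctic site",
    "investigator": "Jane Doe",
    "temporal_coverage": {"start": "2013-06-01", "end": "2014-08-31"},
    "spatial_coverage": {"lat": 71.28, "lon": -156.60},
    "variables": ["soil_temperature"],
    "units": {"soil_temperature": "degC"},
}

def validate(rec, required=("title", "investigator", "temporal_coverage")):
    """Report any required fields missing from a metadata record."""
    return [k for k in required if k not in rec]

missing = validate(record)
serialized = json.dumps(record, indent=2)  # ready for archival or exchange
print(missing)  # empty list -> record passes the minimal check
```

The value of tools like OME is exactly this kind of enforced structure: records that validate against a shared schema can be searched, harvested, and archived at national data centers without manual curation.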
NASA Astrophysics Data System (ADS)
Borne, K. D.
2009-12-01
The emergence of e-Science over the past decade as a paradigm for Internet-based science was an inevitable evolution of science that built upon the web protocols and access patterns that were prevalent at that time, including Web Services, XML-based information exchange, machine-to-machine communication, service registries, the Grid, and distributed data. We now see a major shift in web behavior patterns to social networks, user-provided content (e.g., tags and annotations), ubiquitous devices, user-centric experiences, and user-led activities. The inevitable accrual of these social networking patterns and protocols by scientists and science projects leads to U-Science as a new paradigm for online scientific research (i.e., ubiquitous, user-led, untethered, You-centered science). U-Science applications include components from semantic e-science (ontologies, taxonomies, folksonomies, tagging, annotations, and classification systems), which is much more than Web 2.0-based science (Wikis, blogs, and online environments like Second Life). Among the best examples of U-Science are Citizen Science projects, including Galaxy Zoo, Stardust@Home, Project Budburst, Volksdata, CoCoRaHS (the Community Collaborative Rain, Hail and Snow network), and projects utilizing Volunteer Geographic Information (VGI). There are also scientist-led projects for scientists that engage a wider community in building knowledge through user-provided content. Among the semantic-based U-Science projects for scientists are those that specifically enable user-based annotation of scientific results in databases. These include the Heliophysics Knowledgebase, BioDAS, WikiProteins, The Entity Describer, and eventually AstroDAS. 
Such collaborative tagging of scientific data addresses several petascale data challenges for scientists: how to find the most relevant data, how to reuse those data, how to integrate data from multiple sources, how to mine and discover new knowledge in large databases, how to represent and encode the new knowledge, and how to curate the discovered knowledge. This talk will address the emergence of U-Science as a type of Semantic e-Science, and will explore challenges, implementations, and results. Semantic e-Science and U-Science applications and concepts will be discussed within the context of one particular implementation (AstroDAS: Astronomy Distributed Annotation System) and its applicability to petascale science projects such as the LSST (Large Synoptic Survey Telescope), coming online within the next few years.
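The collaborative-tagging pattern described above can be sketched as a small annotation store. The class, object identifiers, and tags below are invented for illustration and do not reflect any actual AstroDAS or BioDAS interface.

```python
from collections import defaultdict

class AnnotationStore:
    """Toy illustration of collaborative tagging: many users attach
    tags to shared data objects, and anyone can query by tag."""

    def __init__(self):
        self.by_object = defaultdict(list)   # object id -> [(user, tag)]
        self.by_tag = defaultdict(set)       # tag -> {object ids}

    def annotate(self, obj_id, user, tag):
        self.by_object[obj_id].append((user, tag))
        self.by_tag[tag].add(obj_id)

    def objects_with(self, tag):
        return sorted(self.by_tag[tag])

store = AnnotationStore()
# Invented object names and tags, standing in for survey sources.
store.annotate("SDSS J1234+5678", "alice", "candidate-lens")
store.annotate("SDSS J1234+5678", "bob", "spiral")
store.annotate("SDSS J0001-0001", "alice", "candidate-lens")

print(store.objects_with("candidate-lens"))
```

The two indexes capture the petascale-data payoff the talk describes: user-provided tags make relevant objects findable by topic without scanning the underlying survey database.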
Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.
Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke
2018-05-21
Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad-extent, fine-grain approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high-quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
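The "algorithm voting" style of quality control mentioned above can be sketched as a simple consensus filter. The thresholds, subject names, and labels are illustrative assumptions, not values from the paper.

```python
from collections import Counter

def consensus(classifications, min_votes=3, min_agreement=0.75):
    """Accept a volunteer label only when enough votes agree.

    classifications: mapping of subject id -> list of volunteer labels.
    Returns subject id -> accepted label; subjects failing the vote
    are pruned rather than passed on as science data.
    """
    accepted = {}
    for subject, labels in classifications.items():
        if len(labels) < min_votes:
            continue  # statistical pruning: too few classifications
        label, n = Counter(labels).most_common(1)[0]
        if n / len(labels) >= min_agreement:
            accepted[subject] = label
    return accepted

votes = {
    "subj-1": ["spiral", "spiral", "spiral", "elliptical"],
    "subj-2": ["spiral", "elliptical", "merger"],   # no consensus
    "subj-3": ["merger", "merger"],                 # too few votes
}
print(consensus(votes))  # only subj-1 survives the vote
```

Tuning `min_votes` and `min_agreement` is where project design meets quality control: stricter thresholds trade data volume for reliability.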
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increasing resolution of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments.
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
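A minimal flavor of the multi-model analysis such a testbed supports is the per-grid-cell ensemble mean and spread. The sketch below uses synthetic fields (grid size, values, and model count are invented) rather than real CMIP5 data; in the server-side paradigm described above, this reduction would run next to the data rather than after a bulk download.

```python
import numpy as np

# Hypothetical: three models' surface-temperature fields on a shared
# 4x5 lat-lon grid (values in K), standing in for CMIP model output.
rng = np.random.default_rng(0)
models = [288.0 + rng.normal(0.0, 1.5, size=(4, 5)) for _ in range(3)]
stack = np.stack(models)             # shape (model, lat, lon)

ensemble_mean = stack.mean(axis=0)   # per-grid-cell mean across models
ensemble_spread = stack.std(axis=0)  # inter-model standard deviation

print(ensemble_mean.shape, round(float(ensemble_mean.mean()), 2))
```

The reduction shrinks three full fields to two summary fields, which is the data-movement argument in miniature: ship the small result, not the large inputs.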
Chelle L. Gentemann Receives 2013 Charles S. Falkenberg Award: Response
NASA Astrophysics Data System (ADS)
Gentemann, Chelle L.
2014-01-01
Receiving the 2013 Charles S. Falkenberg Award of the American Geophysical Union came completely as a surprise, wonderful but humbling. It is attributable to those who have made my work possible. Peter Minnett is first on the list. He is a great friend and colleague, an example for us all of how to conduct scientific research. Unstintingly generous with his time, resources, and ideas, he always puts scientific advancement ahead of personal gain. Eric Lindstrom, program manager for NASA's physical oceanography program, has been a role model on how to run large projects and still stay focused on scientific results. His support of this project from the beginning has been instrumental in its success. I also have been lucky enough to work with Frank Wentz, one of the smartest scientists I know. My husband, David White, has put up with much as I have focused on this work, as have our 3-year-old sons, Austin and Bennett. The rest of my family has given their support, love, and inspiration. I wish that my grandfather, who encouraged my interest in science, could be here to share this honor.
Yoho, Rachel A; Vanmali, Binaben H
2016-03-01
The biological sciences encompass topics considered controversial by the American public, such as evolution and climate change. We believe that the development of climate change education in the biology classroom is better informed by an understanding of the history of the teaching of evolution. A common goal for science educators should be to engender a greater respect for and appreciation of science among students while teaching specific content knowledge. Citizen science has emerged as a viable yet underdeveloped method for engaging students of all ages in key scientific issues that impact society through authentic data-driven scientific research. Where successful, citizen science may open avenues of communication and engagement with the scientific process that would otherwise be more difficult to achieve. Citizen science projects demonstrate versatility in education and the ability to test hypotheses by collecting large amounts of often publishable data. Based on our review of other authors' findings, we see great potential for science education research in the incorporation of citizen science projects into curricula, especially with respect to "hot topics" of socioscientific debate. Journal of Microbiology & Biology Education.
Exploiting the Use of Social Networking to Facilitate Collaboration in the Scientific Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppock, Edrick G.
The goal of this project was to exploit social networking to facilitate scientific collaboration. The project objective was to research and identify scientific collaboration styles that are best served by social networking applications and to model the most effective social networking applications to substantiate how social networking can support scientific collaboration. To achieve this goal and objective, the project was to develop an understanding of the types of collaborations conducted by scientific researchers, through classification, data analysis and identification of unique collaboration requirements. Another technical objective in support of this goal was to understand the current state of technology in collaboration tools. In order to test hypotheses about which social networking applications effectively support scientific collaboration, the project was to create a prototype scientific collaboration system. The ultimate goal for testing the hypotheses and research of the project was to refine the prototype into a functional application that could effectively facilitate and grow collaboration within the U.S. Department of Energy (DOE) research community.
Mason, Ann M; Borgert, Christopher J; Bus, James S; Moiz Mumtaz, M; Simmons, Jane Ellen; Sipes, I Glenn
2007-09-01
Risk assessments are enhanced when policy and other decision-makers have access to experimental science designed to specifically inform key policy questions. Currently, our scientific understanding and science policy for environmental mixtures are based largely on extrapolating from and combining data in the observable range of single chemical toxicity to lower environmental concentrations and composition, i.e., using higher dose data to extrapolate and predict lower dose toxicity. There is a growing consensus that the default assumptions underlying those mixtures risk assessments that are conducted in the absence of actual mixtures data rest on an inadequate scientific database. Future scientific research should both build upon the current science and advance toxicology into largely uncharted territory. More precise approaches to better characterize toxicity of mixtures are needed. The Society of Toxicology (SOT) sponsored a series of panels, seminars, and workshops to help catalyze and improve the design and conduct of experimental toxicological research to better inform risk assessors and decision makers. This paper summarizes the activities of the SOT Mixtures Program and serves as the introductory paper to a series of articles in this issue, which we hope will inspire innovative research and challenge the status quo.
A Sneak Preview of the E-ELT Design Reference Science Plan Questionnaire Results
NASA Astrophysics Data System (ADS)
Kissler-Patig, M.; Küpcü Yoldaş, A.; Liske, J.
2009-12-01
The European Extremely Large Telescope is in its detailed design phase until the end of 2010. During this period, the telescope design is being consolidated and instrument and operation concepts are being studied. The scientific users are feeding back requirements into the project in numerous ways. One of them, the Design Reference Science Plan, was an opportunity for the entire community to provide direct feedback to the project. Here, we summarise the first results from this study. The full report will appear in the first half of 2010.
Enhancing Transdisciplinary Research Through Collaborative Leadership
Gray, Barbara
2008-01-01
Transcending the well-established and familiar boundaries of disciplinary silos poses challenges for even the most interpersonally competent scientists. This paper explores the challenges inherent in leading transdisciplinary projects, detailing the critical roles that leaders play in shepherding transdisciplinary scientific endeavors. Three types of leadership tasks are considered: cognitive, structural, and processual. Distinctions are made between leading small, co-located projects and large, dispersed ones. Finally, social-network analysis is proposed as a useful tool for conducting research on leadership, and, in particular, on the role of brokers, on complex transdisciplinary teams. PMID:18619392
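One simple broker measure usable in the kind of social-network analysis proposed above counts, for each team member, the pairs of their collaborators who have no direct tie of their own. The toy team and the specific measure below are invented for illustration; published work typically uses richer metrics such as betweenness centrality.

```python
from itertools import combinations

def brokerage(edges):
    """For each node, count neighbor pairs it connects that lack a
    direct tie of their own -- a simple proxy for the broker role."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    scores = {}
    for node, nbrs in adj.items():
        scores[node] = sum(
            1 for u, v in combinations(sorted(nbrs), 2)
            if v not in adj[u]
        )
    return scores

# Hypothetical team: 'carol' bridges two otherwise unconnected subgroups.
edges = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
         ("carol", "dave"), ("dave", "erin"), ("carol", "erin")]
scores = brokerage(edges)
print(scores["carol"])  # carol brokers the most non-adjacent pairs
```

High scorers are the members whose removal would fragment the collaboration, which is exactly the leadership role the paper highlights for dispersed transdisciplinary teams.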
[The research project: financing and management].
Schena, F P
2003-01-01
Basic and clinical research is accomplished through projects. The design of a project rests not only on its scientific content but also on its financing and management. This article illustrates the correct approach to financing and managing a scientific project.
A Formula for Fixing Troubled Projects: The Scientific Method Meets Leadership
NASA Technical Reports Server (NTRS)
Wagner, Sandra
2006-01-01
This presentation focuses on project management, specifically addressing project issues using the scientific method of problem-solving. Two sample projects where this methodology has been applied are provided.
Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research focuses on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex Systems Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low-cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
Beyond PARR - PMEL's Integrated Data Management Strategy
NASA Astrophysics Data System (ADS)
Burger, E. F.; O'Brien, K.; Manke, A. B.; Schweitzer, R.; Smith, K. M.
2016-12-01
NOAA's Pacific Marine Environmental Laboratory (PMEL) hosts a wide range of scientific projects that span a number of scientific and environmental research disciplines. Each of these 14 research projects has its own data streams, which are as diverse as the research. With its requirements for public access to federally funded research results and data, the 2013 White House Office of Science and Technology memo on Public Access to Research Results (PARR) changed the data management landscape for Federal agencies. In 2015, with support from the PMEL Director, Dr. Christopher Sabine, PMEL's Science Data Integration Group (SDIG) initiated a multi-year effort to formulate and implement an integrated data-management strategy for PMEL research efforts. Instead of using external requirements, such as PARR, to define our approach, we focused on strategies to provide PMEL science projects with a unified framework for data submission, interoperable data access, data storage, and easier data archival to National Data Centers. This improves data access for PMEL scientists, their collaborators, and the public, and also provides a unified lab framework that allows our projects to meet their data management objectives, as well as those required by the PARR. We are implementing this solution in stages that allow us to test technology and architecture choices before committing to a large-scale implementation. SDIG developers have completed the first year of development, where our approach is to reuse and leverage existing frameworks and standards. This presentation will describe our data management strategy, our phased implementation approach, and our software and framework choices, and explain how these elements help us meet the objectives of this strategy. We will share the lessons learned in dealing with diverse and complex datasets in this first year of implementation and how these outcomes will shape our decisions for this ongoing effort.
The data management capabilities now available to scientific projects, and other services being developed to manage and preserve PMEL's scientific data assets for our researchers, their collaborators, and future generations, will be described.
E-Labs - Learning with Authentic Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardeen, Marjorie G.; Wayne, Mitchell
the success teachers have had providing an opportunity for students to: • Organize and conduct authentic research. • Experience the environment of scientific collaborations. • Possibly make real contributions to a burgeoning scientific field. We've created projects that are problem-based, student driven and technology dependent. Students reach beyond classroom walls to explore data with other students and experts and share results, publishing original work to a worldwide audience. Students can discover and extend the research of other students, modeling the processes of modern, large-scale research projects. From start to finish e-Labs are student-led, teacher-guided projects. Students need only a Webmore » browser to access computing techniques employed by professional researchers. A Project Map with milestones allows students to set the research plan rather than follow a step-by-step process common in other online projects. Most importantly, e-Labs build the learning experience around the students' own questions and let them use the very tools that scientists use. Students contribute to and access shared data, most derived from professional research databases. They use common analysis tools, store their work and use metadata to discover, replicate and confirm the research of others. This is where real scientific collaboration begins. Using online tools, students correspond with other research groups, post comments and questions, prepare summary reports, and in general participate in the part of scientific research that is often left out of classroom experiments. Teaching tools such as student and teacher logbooks, pre- and post-tests and an assessment rubric aligned with learner outcomes help teachers guide student work. Constraints on interface designs and administrative tools such as registration databases give teachers the "one-stop-shopping" they seek for multiple e-Labs. 
Teaching and administrative tools also allow us to track usage and assess the impact on student learning.
Hanauer, David I; Graham, Mark J; Betancur, Laura; Bobrownicki, Aiyana; Cresawn, Steven G; Garlena, Rebecca A; Jacobs-Sera, Deborah; Kaufmann, Nancy; Pope, Welkin H; Russell, Daniel A; Jacobs, William R; Sivanathan, Viknesh; Asai, David J; Hatfull, Graham F
2017-12-19
Engaging undergraduate students in scientific research promises substantial benefits, but it is not accessible to all students and is rarely implemented early in college education, when it will have the greatest impact. An inclusive Research Education Community (iREC) provides a centralized scientific and administrative infrastructure enabling engagement of large numbers of students at different types of institutions. The Science Education Alliance-Phage Hunters Advancing Genomics and Evolutionary Science (SEA-PHAGES) is an iREC that promotes engagement and continued involvement in science among beginning undergraduate students. The SEA-PHAGES students show strong gains correlated with persistence relative to those in traditional laboratory courses regardless of academic, ethnic, gender, and socioeconomic profiles. This persistent involvement in science is reflected in key measures, including project ownership, scientific community values, science identity, and scientific networking. Copyright © 2017 the Author(s). Published by PNAS.
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is that a project no longer depends on access to a large local cyberinfrastructure to be funded or performed. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are the computational cost of, and the uncertainty in, meteorological forecasting and climate projections. The two problems are closely related: uncertainty can usually be reduced when computational resources are available to reproduce a phenomenon more faithfully or to perform a larger number of experiments. Here we present results of applying cloud computing resources from three major vendors to climate modeling with two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.
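As a back-of-the-envelope illustration of the kind of cost analysis described above, on-demand cloud cost scales roughly as instances × wall-clock hours × hourly price. The sketch below uses made-up prices; real vendor pricing varies by instance type, region, and discount model.

```python
def cloud_run_cost(n_instances, hours, price_per_instance_hour):
    """Estimated on-demand cost of a cloud model run: number of
    instances x wall-clock hours x hourly price per instance."""
    return n_instances * hours * price_per_instance_hour

# A hypothetical 16-instance ensemble run lasting 48 hours at
# $0.50 per instance-hour (all figures are illustrative):
cost = cloud_run_cost(16, 48, 0.50)
# cost -> 384.0 (dollars)
```

The same formula makes the trade-off with a supercomputer explicit: the cloud bill accrues only while experiments run, whereas an owned machine carries fixed maintenance cost regardless of utilization.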
Scientific, Social, and Institutional Constraints Facing Coastal Restoration in Louisiana
NASA Astrophysics Data System (ADS)
Kleiss, B.; Shabman, L. A.; Brown, G.
2017-12-01
Due to multiple stressors, including subsidence, accelerated sea level rise, canal construction, tropical storm damage, and basin-wide river management decisions, southern Louisiana is experiencing some of the world's highest rates of coastal land loss. Although ideas abound, the solutions proposed to mitigate land loss are often uncertain, complex, expensive, and difficult. There are significant scientific uncertainties associated with fundamental processes, including the spatial distribution of rates of subsidence, the anticipated impacts of increased inundation on marsh plant species, and questions about the resilience of engineered solutions. Socially and politically, there is the need to balance navigation, flood risk management, and environmental restoration with the fact that the land involved is largely privately owned and includes many communities and towns. Layered within this, there are federal and state regulatory constraints which seek to follow a myriad of existing State and Federal laws, protect the benefits realized from previous federal investments, and balance the conflicting interests of a large number of stakeholders. Additionally, current practice when implementing some environmental regulations is to assess impacts against the baseline of current conditions, not projected future, non-project conditions, making it difficult to receive a permit for projects which may have a short-term detriment but promise a long-term benefit. The resolution (or lack thereof) of these issues will serve to inform similar future struggles in other low-lying coastal areas around the globe.
The Role of Dissemination as a Fundamental Part of a Research Project.
Marín-González, Esther; Malmusi, Davide; Camprubí, Lluís; Borrell, Carme
2017-04-01
Dissemination and communication of research should be considered an integral part of any research project. Both help increase the visibility of research outputs, public engagement in science and innovation, and the confidence of society in research. Effective dissemination and communication are vital to ensure that the conducted research has a social, political, or economic impact. They draw the attention of governments and stakeholders to research results and conclusions, enhancing their visibility, comprehension, and implementation. In the European project SOPHIE (Evaluating the Impact of Structural Policies on Health Inequalities and Their Social Determinants and Fostering Change), dissemination was an essential component of the project in order to achieve the purpose of fostering policy change based on research findings. Here we provide our experience and make some recommendations based on our learning. A strong use of online communication (website, Twitter, and Slideshare accounts), the production of informative videos, research partnerships with civil society organizations, and the organization of final concluding scientific events, among other instruments, helped to reach a large public within the scientific community, civil society, and the policy-making arena and to influence the public view of the impact of certain policies on health and equity.
NASA Technical Reports Server (NTRS)
Kring, David A.; Zurcher, Lukas; Horz, Friedrich
2003-01-01
The Chicxulub Scientific Drilling Project recovered a continuous core from the Yaxcopoil-1 (YAX-1) borehole, which is approx. 60-65 km from the center of the Chicxulub structure, approx. 15 km beyond the limit of the estimated approx. 50 km radius transient crater (excavation cavity), but within the rim of the estimated approx. 90 km radius final crater. Approximately 100 m of melt-bearing impactites were recovered from a depth of 794 to 895 m, above approx. 600 m of underlying megablocks of Cretaceous target sediments, before bottoming at 1511 m. Compared to lithologies at impact craters like the Ries, the YAX-1 impactite sequence is remarkably rich in impact melts of unusual textural variety and complexity. The impactite sequence has also been altered by hydrothermal activity that may have largely been produced by the impact event.
OAI and NASA's Scientific and Technical Information
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Rocker, JoAnne; Harrison, Terry L.
2002-01-01
The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is an evolving protocol and philosophy regarding interoperability for digital libraries (DLs). Previously, "distributed searching" models were popular for DL interoperability. However, experience has shown distributed searching systems across large numbers of DLs to be difficult to maintain in an Internet environment. The OAI-PMH is a move away from distributed searching, focusing on the arguably simpler model of "metadata harvesting". We detail NASA's involvement in defining and testing the OAI-PMH and experience to date with adapting existing NASA distributed searching DLs (such as the NASA Technical Report Server) to use the OAI-PMH and metadata harvesting. We discuss some of the entirely new DL projects that the OAI-PMH has made possible, such as the Technical Report Interchange project. We explain the strategic importance of the OAI-PMH to the mission of NASA's Scientific and Technical Information Program.
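The harvesting model described above is concrete enough to sketch: an OAI-PMH request is an HTTP GET whose query string carries a protocol verb plus verb-specific arguments. A minimal request builder follows; the endpoint URL is hypothetical, while the `ListRecords` verb and `metadataPrefix` argument come from the OAI-PMH v2.0 specification.

```python
from urllib.parse import urlencode

def build_oai_request(base_url, verb, **kwargs):
    """Build an OAI-PMH request URL: an HTTP GET whose query string
    carries the protocol verb plus verb-specific arguments such as
    metadataPrefix or resumptionToken."""
    params = {"verb": verb}
    params.update(kwargs)
    return base_url + "?" + urlencode(params)

# Hypothetical repository endpoint, used purely for illustration:
url = build_oai_request("https://example.org/oai", "ListRecords",
                        metadataPrefix="oai_dc")
# url -> "https://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc"
```

A harvester issues such requests on a schedule, follows the `resumptionToken` the repository returns for large result sets, and indexes the harvested Dublin Core records locally, which is what replaces live distributed searching.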
Astro-WISE: Chaining to the Universe
NASA Astrophysics Data System (ADS)
Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.
2007-10-01
The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power across the high-energy physics community, while the international astronomical community is building an Internet-geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities, and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution to the problem (Valentijn & Kuijken 2004), initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This breakthrough is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature, and thus so is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by forcing full backward and forward chaining in the data modelling.
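The backward chaining described here can be pictured as a dependency graph in which every data product records its direct inputs; tracing lineage back to the raw experiment is then a walk over parents. The following is a minimal illustration of that idea, not the Astro-WISE implementation (class and item names are invented).

```python
class DataItem:
    """A data product that records its direct inputs (parents),
    enabling backward chaining from any result to the raw data."""
    def __init__(self, name, parents=()):
        self.name = name
        self.parents = list(parents)

    def lineage(self):
        """Every ancestor this item depends on, rawest data first,
        each listed once."""
        seen = []
        for parent in self.parents:
            for item in parent.lineage() + [parent]:
                if item not in seen:
                    seen.append(item)
        return seen

# Illustrative pipeline: raw exposure + calibration -> reduced image -> catalog
raw = DataItem("raw_exposure")
flat = DataItem("flat_field")
reduced = DataItem("reduced_image", [raw, flat])
catalog = DataItem("source_catalog", [reduced])

names = [item.name for item in catalog.lineage()]
# names -> ["raw_exposure", "flat_field", "reduced_image"]
```

Forward chaining is the mirror image: given `raw`, find every product that (transitively) lists it among its parents, which is what lets a system invalidate or recompute derived products when an input changes.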
The Brazilian Science Data Center (BSDC)
NASA Astrophysics Data System (ADS)
de Almeida, Ulisses Barres; Bodmann, Benno; Giommi, Paolo; Brandt, Carlos H.
Astrophysics and Space Science are becoming increasingly characterised by what is now known as "big data", with the bottlenecks for progress partly shifting from data acquisition to "data mining". The truth is that the amount and rate of data accumulation in many fields already surpasses the local capabilities for its processing and exploitation, and the efficient conversion of scientific data into knowledge is everywhere a challenge. The result is that, to a large extent, isolated data archives risk progressively becoming "data graveyards", where the information stored is not reused for scientific work. Responsible and efficient use of these large datasets means democratising access and extracting the most science possible from them, which in turn signifies improving data accessibility and integration. Improving data processing capabilities is another important issue specific to researchers and computer scientists of each field. The project presented here wishes to exploit the enormous potential opened up by information technology in our age to advance a model for a science data center in astronomy which aims to expand data accessibility and integration to the largest possible extent and with the greatest efficiency for scientific and educational use. Greater access to data means more people producing and benefiting from information, whereas larger integration of related data from different origins means a greater research potential and increased scientific impact. The BSDC project is concerned, primarily, with providing tools and solutions for the Brazilian astronomical community. It nevertheless capitalizes on extensive international experience, and is developed in full cooperation with the ASI Science Data Center (ASDC) of the Italian Space Agency, granting it an essential ingredient of internationalisation. The BSDC is Virtual Observatory-compliant and part of the "Open Universe", a global initiative built under the auspices of the United Nations.
iSPHERE - A New Approach to Collaborative Research and Cloud Computing
NASA Astrophysics Data System (ADS)
Al-Ubaidi, T.; Khodachenko, M. L.; Kallio, E. J.; Harry, A.; Alexeev, I. I.; Vázquez-Poletti, J. L.; Enke, H.; Magin, T.; Mair, M.; Scherf, M.; Poedts, S.; De Causmaecker, P.; Heynderickx, D.; Congedo, P.; Manolescu, I.; Esser, B.; Webb, S.; Ruja, C.
2015-10-01
The project iSPHERE (integrated Scientific Platform for HEterogeneous Research and Engineering), which has been proposed for Horizon 2020 (EINFRA-9-2015, [1]), aims at creating a next-generation Virtual Research Environment (VRE) that embraces existing and emerging technologies and standards in order to provide a versatile platform for scientific investigations and collaboration. The presentation will introduce the large project consortium, provide a comprehensive overview of iSPHERE's basic concepts and approaches, and outline general user requirements that the VRE will strive to satisfy. An overview of the envisioned architecture will be given, focusing on the adapted Service Bus concept, i.e. the "Scientific Service Bus" as it is called in iSPHERE. The bus will act as a central hub for all communication and user access, and will be implemented in the course of the project. The agile approach [2] that has been chosen for detailed elaboration and documentation of user requirements, as well as for the actual implementation of the system, will be outlined and its motivation and basic structure will be discussed. The presentation will show which user communities will benefit and which concrete problems facing scientific investigations today will be tackled by the system. Another focus of the presentation is iSPHERE's seamless integration of cloud computing resources and how these will benefit scientific modeling teams by providing a reliable, web-based environment for cloud-based model execution, storage of results, and comparison with measurements, including fully web-based tools for data mining, analysis and visualization. The envisioned creation of a dedicated data model for experimental plasma physics will also be discussed. It will be shown why the Scientific Service Bus provides an ideal basis to integrate a number of data models and communication protocols and to provide mechanisms for data exchange across multiple and even multidisciplinary platforms.
Europe Unveils 20-Year Plan for Brilliant Future in Astronomy
NASA Astrophysics Data System (ADS)
2008-11-01
Astronomy is enjoying a golden age of fundamental, exciting discoveries. Europe is at the forefront, thanks to 50 years of progress in cooperation. To remain ahead over the next two to three decades, Europe must prioritise and coordinate the investment of its financial and human resources even more closely. The ASTRONET network, backed by the entire European scientific community, supported by the European Commission, and coordinated by the CNRS, today presents its Roadmap for a brilliant future for European astronomy. ESO's European Extremely Large Telescope is ranked as one of two top-priority large ground-based projects. [ESO PR Photo 43a/08: ASTRONET and the E-ELT] Europe is a leader in astronomy today, with the world's most successful optical observatory, ESO's Very Large Telescope, and cutting-edge facilities in radio astronomy and in space. In an unprecedented effort demonstrating the potential of European scientific cooperation, all of European astronomy is now joining forces to define the scientific challenges for the future and construct a common plan to address them in a cost-effective manner. In 2007, a top-level Science Vision was prepared to assess the most burning scientific questions over the next quarter century, ranging from dark energy to life on other planets. European astronomy now presents its Infrastructure Roadmap, a comprehensive 20-year plan to coordinate national and community investments to meet these challenges in a cost-effective manner. The Roadmap not only prioritises the necessary new frontline research facilities from radio telescopes to planetary probes, in space and on the ground, but also considers such key issues as existing facilities, human resources, ICT infrastructure, education and outreach, and cost -- of operations as well as construction.
This bold new initiative -- ASTRONET -- was created by the major European funding agencies with support from the European Commission and is coordinated by the National Institute for Earth Sciences and Astronomy (INSU) of the CNRS. To build consensus on priorities in a very diverse community, the Science Vision and Roadmap were developed in an open process involving intensive interaction with the community through large open meetings and feedback via e-mail and the web. The result is a plan now backed by astronomers in 28 Member and Associated States of the EU, with over 500 million inhabitants. Over 60 selected experts from across Europe contributed to the construction of the ASTRONET Roadmap, ensuring that European astronomy has the tools to compete successfully in answering the challenges of the Science Vision. They identified and prioritised a set of new facilities to observe the Universe from radio waves to gamma rays, to open up new ways of probing the cosmos, such as gravitational waves, and to advance in the exploration of our Solar System. In the process, they considered all the elements needed by a successful scientific enterprise, from global-scale cooperation on the largest mega-project to the need for training and recruiting skilled young scientists and engineers. One of two top-priority large ground-based projects is ESO's European Extremely Large Telescope. Its 42-metre diameter mirror will make the E-ELT the largest optical/near-infrared telescope in the world -- "the biggest eye on the sky". The science to be done with the E-ELT is extremely exciting and includes studies of exoplanets and discs, galaxy formation and dark energy. ESO Director General Tim de Zeeuw says: "The top ranking of the E-ELT in the Roadmap is a strong endorsement from the European astronomical community. This flagship project will indisputably raise the European scientific, technological and industrial profile". 
Among other recommendations, the Roadmap considers how to maximise the future scientific impact of existing facilities in a cost-effective manner. It also identifies a need for better access to state-of-the-art computing and laboratory facilities, and for a stronger involvement of European high-tech industry in the development of future facilities. Moreover, success depends critically upon an adequate supply of qualified scientists, and of engineers in fields ranging from IT to optics. Finally, the Roadmap proposes a series of measures to enhance the public understanding of astronomy as a means to boost recruitment in science and technology in schools and universities across Europe. Europe currently spends approximately €2 billion a year on astronomy in the broadest sense. Implementing the ASTRONET Roadmap will require a funding increase of around 20% -- less than €1 per year per European citizen. Global cooperation will be needed -- and is being planned -- for several of the largest projects.
Barriers and Solutions to Conducting Large International, Interdisciplinary Research Projects
NASA Astrophysics Data System (ADS)
Pischke, Erin C.; Knowlton, Jessie L.; Phifer, Colin C.; Gutierrez Lopez, Jose; Propato, Tamara S.; Eastmond, Amarella; de Souza, Tatiana Martins; Kuhlberg, Mark; Picasso Risso, Valentin; Veron, Santiago R.; Garcia, Carlos; Chiappe, Marta; Halvorsen, Kathleen E.
2017-12-01
Global environmental problems such as climate change are not bounded by national borders or scientific disciplines, and therefore require international, interdisciplinary teamwork to develop understandings of their causes and solutions. Interdisciplinary scientific work is difficult enough, but these challenges are often magnified when teams also work across national boundaries. The literature on the challenges of interdisciplinary research is extensive. However, research on international, interdisciplinary teams is nearly non-existent. Our objective is to fill this gap by reporting on results from a study of a large interdisciplinary, international National Science Foundation Partnerships for International Research and Education (NSF-PIRE) research project across the Americas. We administered a structured questionnaire to team members about challenges they faced while working together across disciplines and outside of their home countries in Argentina, Brazil, and Mexico. Analysis of the responses indicated five major types of barriers to conducting interdisciplinary, international research: integration, language, fieldwork logistics, personnel and relationships, and time commitment. We discuss the causes and recommended solutions to the most common barriers. Our findings can help other interdisciplinary, international research teams anticipate challenges, and develop effective solutions to minimize the negative impacts of these barriers to their research.
Highlighted scientific findings of the Interior Columbia Basin Ecosystem Management Project.
Thomas M. Quigley; Heidi Bigler Cole
1997-01-01
Decisions regarding 72 million acres of Forest Service- and Bureau of Land Management- administered lands will be based on scientific findings brought forth in the Interior Columbia Basin Ecosystem Management Project. Some highlights of the scientific findings are presented here. Project scientists drew three general conclusions: (1) Conditions and trends differ widely...
Genomics, "Discovery Science," Systems Biology, and Causal Explanation: What Really Works?
Davidson, Eric H
2015-01-01
Diverse and non-coherent sets of epistemological principles currently inform research in the general area of functional genomics. Here, from the personal point of view of a scientist with over half a century of immersion in hypothesis-driven scientific discovery, I compare and deconstruct the ideological bases of prominent recent alternatives, such as "discovery science," some productions of the ENCODE project, and aspects of large data set systems biology. The outputs of these types of scientific enterprise qualitatively reflect their radical definitions of scientific knowledge, and of its logical requirements. Their properties emerge in high relief when contrasted (as an example) to a recent, system-wide, predictive analysis of a developmental regulatory apparatus that was instead based directly on hypothesis-driven experimental tests of mechanism.
The ImageJ ecosystem: an open platform for biomedical image analysis
Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large – but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368
GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing
NASA Astrophysics Data System (ADS)
Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.
2016-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace as VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.
NASA Astrophysics Data System (ADS)
Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.
2009-12-01
The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. 
First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
Data management strategies for multinational large-scale systems biology projects.
Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A
2014-01-01
Good accessibility of publicly funded research data is essential to secure an open scientific system and is eventually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a strong argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which were initially introduced for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects. PMID:23047157
NASA Technical Reports Server (NTRS)
Moore, A. W.; Neilan, R. E.; Springer, T. A.; Reigber, Ch.
2000-01-01
A strong multipurpose aspect of the International GPS Service (IGS) is revealed by a glance at the titles of current projects and working groups within the IGS: IGS/BIPM Time Transfer Project; Ionosphere Working Group; Troposphere Working Group; International GLONASS Experiment; Working Group on Low-Earth Orbiter Missions; and Tide Gauges, CGPS, and the IGS. The IGS network infrastructure, in large part originally commissioned for geodynamical investigations, has proved to be a valuable asset in developing application-oriented subnetworks whose requirements overlap the characteristics of existing IGS stations and future station upgrades. Issues encountered thus far in the development of multipurpose or multitechnique IGS projects as well as future possibilities will be reviewed.
NASA Astrophysics Data System (ADS)
Martinez-Rey, J.; Brockmann, P.; Cadule, P.; Nangini, C.
2016-12-01
Earth System Models allow us to understand the interactions between climate and biogeological processes. These models generate very large amounts of data, which are usually reduced to a handful of static figures in highly specialized scientific publications. However, the potential impacts of climate change demand a broader perspective on how climate model results of this kind are disseminated, particularly in the amount and variety of data and the target audience. This issue is especially important for scientific projects that seek broad dissemination of their key results to different audiences. The MGClimDeX project, which assesses the climate change impact on La Martinique island in the Lesser Antilles, will provide tools and means to help the key stakeholders responsible for addressing critical social, economic, and environmental issues take the appropriate adaptation and mitigation measures to prevent future risks associated with climate variability and change and their effects on human activities. The MGClimDeX project will do so using model output and data visualization techniques within the next year, showing the cross-connected impacts of climate change on various sectors (agriculture, forestry, ecosystems, water resources and fisheries). To address the challenge of representing large sets of model output, we use back-end data processing and front-end web-based visualization techniques, going from conventional netCDF model output stored on hub servers to highly interactive, data-powered visualizations in the browser. We use the well-known JavaScript library D3.js, extended with DC.js (a dimensional charting library that handles the front-end interactive filtering), in combination with Bokeh, a Python library used to synthesize the data, all framed in standard HTML+CSS. 
The resulting websites exist as standalone information units or are embedded into journals or science-related information hubs. These visualizations encompass all the relevant findings, allowing individual model intercomparisons in the context of observations and socioeconomic references. In this way, the full spectrum of results of the MGClimDeX project is available to the public in general and policymakers in particular.
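The back-end reduction step described above, from gridded netCDF output to the flat, record-oriented JSON that crossfilter-style charting libraries such as DC.js consume, can be sketched minimally as follows (a toy stand-in, not MGClimDeX code; the variable name and values are invented):

```python
import json

def grid_to_records(times, values, variable):
    """Flatten a per-time-step series of spatially averaged model values
    into 'tidy' records, one dict per observation, the shape that
    dimensional charting front-ends typically expect."""
    return [
        {"time": t, "variable": variable, "value": v}
        for t, v in zip(times, values)
    ]

# Toy stand-in for a spatially averaged netCDF time series.
records = grid_to_records(["2000-01", "2000-02"], [14.2, 14.6], "tas")
print(json.dumps(records))
```

In a real pipeline the spatial averaging would be done over the netCDF grids on the server, and the resulting JSON served to the browser for interactive filtering.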
Using "Big Data" in a Classroom Setting for Student-Developed Projects
NASA Astrophysics Data System (ADS)
Hayes-Gehrke, Melissa; Vogel, Stuart N.
2018-01-01
The advances in exploration of the optical transient sky anticipated with major facilities such as the Zwicky Transient Facility (ZTF) and Large Synoptic Survey Telescope (LSST) provide an opportunity to integrate large public research datasets into the undergraduate classroom. As a step in this direction, the NSF PIRE-funded GROWTH (Global Relay of Observatories Watching Transients Happen) collaboration provided funding for curriculum development using data from the precursor to ZTF, the Intermediate Palomar Transient Factory (iPTF). One of the iPTF portals, the PTF Variable Marshal, was used by 56 Astronomy majors in the fall 2016 and 2017 semesters of the required Observational Astronomy course at the University of Maryland. Student teams learned about the iPTF survey and how to use the PTF Variable Marshal and then developed their own hypotheses about variable stars to test using data they gathered from the Variable Marshal. Through this project, students gained experience in how to develop scientific questions that can be explored using large datasets and became aware of the limitations and difficulties of such projects. This work was supported in part by NSF award OISE-1545949.
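As a hypothetical example of the kind of simple statistic student teams might compute when testing hypotheses about variable stars in such a dataset (illustrative only, not part of the course materials; the magnitude values are invented):

```python
from statistics import mean, pstdev

def variability_index(mags):
    """Standard deviation of the magnitudes divided by their mean:
    a crude screen for flagging candidate variables in a catalog
    of light curves."""
    return pstdev(mags) / mean(mags)

quiet = [15.01, 15.02, 14.99, 15.00]   # roughly constant star
eclipser = [15.0, 15.0, 16.2, 15.0]    # one deep dip in brightness
print(variability_index(quiet) < variability_index(eclipser))  # True
```

Students comparing such indices across thousands of light curves would quickly encounter the limitations the abstract mentions: cadence gaps, photometric noise, and selection effects in the survey.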
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Jincheng; Rimsza, Jessica; Deng, Lu
This NEUP project aimed to generate accurate atomic structural models of nuclear waste glasses using large-scale molecular dynamics-based computer simulations, and to use these models to investigate self-diffusion behaviors, interfacial structures, and hydrated gel structures formed during dissolution of these glasses. The goal was to obtain realistic and accurate short- and medium-range structures of these complex oxide glasses, to provide a mechanistic understanding of their dissolution behaviors, and to generate reliable information with predictive power for designing nuclear waste glasses for long-term geological storage. Looking back at the research accomplishments of this project, most of the scientific goals initially proposed were achieved through intensive research over the three and a half years of the project. The project also generated a wealth of scientific data and vibrant discussions with various groups through collaborations within and outside the project. Throughout the project, one book chapter and 14 peer-reviewed journal publications were produced (including one under review), and 16 presentations (including 8 invited talks) were made to disseminate the results at national and international conferences. Furthermore, the project trained several outstanding graduate students and young researchers for the future workforce in nuclear-related fields, especially nuclear waste immobilization. One postdoc and four PhD students were fully or partially supported through the project, with intensive training in materials science and engineering and expertise in glass science and nuclear waste disposal.
SFB 754 - Managing a large interdisciplinary collaborative research centre: what matters?
NASA Astrophysics Data System (ADS)
Schelten, Christiane; Antia, Avan; Braker, Gesche; Kamm, Ruth; Mehrtens, Hela
2016-04-01
The German Research Foundation (DFG) funds Collaborative Research Centres (CRCs; in German: Sonderforschungsbereiche, SFBs) that are generally applied for by one university but may also incorporate neighbouring universities or non-university research institutions. SFBs cross the boundaries of disciplines as well as faculties, departments, institutions and institutes. An SFB can be funded for up to 12 years (3 x 4 years). Kiel University and GEOMAR Helmholtz Centre for Ocean Research Kiel received funding for the SFB 754 'Climate-biogeochemical interactions in the tropical ocean' in 2008. Currently, the centre is in its third phase, comprising 17 scientific subprojects, one outreach project, a central coordination and management subproject and a subproject covering the research expeditions, with a total project budget of 12 million euros. Around 100 scientists from interdisciplinary research fields (e.g. physical oceanography, microbiology, palaeontology, chemistry, modelling) are actively involved. Besides generating high-profile research, gender equality, early career support and data management are complementary goals of SFBs requested by the DFG. Within the SFB 754, the scientific coordination office is responsible for developing concepts and strategies to cover these additional requirements, and over the past eight years the SFB 754 has been successful in setting up profound programmes and various measures. Some of the SFB 754 practices have been taken up by other projects, which has allowed the SFB 754 to serve as a 'best practice' role model within marine sciences in Kiel. A main reason for the success of the SFB 754 in working towards the additional goals set out in the DFG's SFB programme is that the project is well tied into existing structures and builds upon outstanding management expertise available in Kiel. 
Three examples are highlighted here:
• young scientists programme: closely linked to a graduate school (Integrated School of Marine Sciences) and a postdoctoral network (Integrated Marine Postdoc Network), both set up by 'The Future Ocean', a project funded within the German Excellence Initiative
• gender measures: close cooperation with the Central Office for Gender Equality, Diversity & Family at Kiel University
• data management: part of a joint GEOMAR data management group
Thus, a motivated and creative coordination team interested in pioneering work is essential to manage a large interdisciplinary research community. Overall, networking, transparent management tools linked to active communication, and fairness in processes such as the distribution of funds are basic prerequisites of trustful cooperation in large scientific consortia. (This presentation is linked to posters by Dr. Nina Bergmann, Dr. Gesche Braker, Dr. Ruth Kamm and Dr. Hela Mehrtens.)
7 CFR 3400.20 - Grantee review prior to award.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the funded project has changed significantly, other scientific discoveries have affected the project... scientific peer review conducted in accordance with § 3400.21. For education and extension projects, such...
NASA Astrophysics Data System (ADS)
Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.
2013-12-01
Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. 
Interested groups will be invited to join (it is not too late to do so), and preliminary results will be presented.
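For readers unfamiliar with the weak temperature gradient idea underlying the project: the large-scale vertical velocity is diagnosed level by level so that adiabatic cooling balances the column's heating anomaly, roughly w ≈ Q'/S. A minimal sketch of that balance (the profile values are invented, and real implementations add vertical smoothing, boundary conditions, and coupling back to the column model):

```python
def wtg_vertical_velocity(heating, static_stability):
    """Diagnose the large-scale vertical velocity level by level from
    the WTG balance w * S = Q': heating anomaly Q' in K/day and static
    stability S in K/km give w in km/day."""
    return [q / s for q, s in zip(heating, static_stability)]

heating = [1.0, 2.0, 1.5]    # K/day at three model levels (invented)
stability = [4.0, 4.0, 5.0]  # K/km (invented)
print(wtg_vertical_velocity(heating, stability))  # [0.25, 0.5, 0.3]
```

The diagnosed velocity then feeds vertical advection back into the SCM or CRM column, closing the two-way interaction between convection and large-scale dynamics that the intercomparison is designed to probe.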
Grachev, S V; Gorodnova, E A
2008-01-01
The authors present original material on a first experience of teaching the theoretical foundations of venture financing for scientific innovation projects in a medical university. The results and conclusions are based on data from a questionnaire administered by the authors. More than 90% of young physician-scientists recognized the relevance of this problem for translating their research results into practice. This experience thus supports the further development and inclusion of the module 'Venture financing of scientific innovation projects in biomedicine' in the training plan.
NASA Astrophysics Data System (ADS)
Guiquan, Xi; Lin, Cong; Xuehui, Jin
2018-05-01
As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in science, sociology and national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.
7 CFR 3400.20 - Grantee review prior to award.
Code of Federal Regulations, 2014 CFR
2014-01-01
... significantly, other scientific discoveries have affected the project, or the need for the project has changed... in this subpart. For research projects, such review must be a scientific peer review conducted in...
7 CFR 3400.20 - Grantee review prior to award.
Code of Federal Regulations, 2012 CFR
2012-01-01
... significantly, other scientific discoveries have affected the project, or the need for the project has changed... in this subpart. For research projects, such review must be a scientific peer review conducted in...
7 CFR 3400.20 - Grantee review prior to award.
Code of Federal Regulations, 2013 CFR
2013-01-01
... significantly, other scientific discoveries have affected the project, or the need for the project has changed... in this subpart. For research projects, such review must be a scientific peer review conducted in...
7 CFR 3400.20 - Grantee review prior to award.
Code of Federal Regulations, 2011 CFR
2011-01-01
... significantly, other scientific discoveries have affected the project, or the need for the project has changed... in this subpart. For research projects, such review must be a scientific peer review conducted in...
Petersen, David W; Kawasaki, Ernest S
2007-01-01
DNA microarray technology has become a powerful tool in the arsenal of the molecular biologist. Capitalizing on high precision robotics and the wealth of DNA sequences annotated from the genomes of a large number of organisms, the manufacture of microarrays is now possible for the average academic laboratory with the funds and motivation. Microarray production requires attention to both biological and physical resources, including DNA libraries, robotics, and qualified personnel. While the fabrication of microarrays is a very labor-intensive process, production of quality microarrays individually tailored on a project-by-project basis will help researchers shed light on future scientific questions.
NAROM - a national laboratory for space education and student rockets
NASA Astrophysics Data System (ADS)
Hansen, Arne Hjalmar; Larsen, May Aimee; Østbø, Morten
2001-08-01
Despite a considerable growth in space related industry and scientific research over the past few decades, space related education has largely been neglected in our country. NAROM - the National Centre for Space Related Education - was formed last year to organize space related educational activities, to promote recruitment, to promote appreciation for the benefits of space activities, and to stimulate interest in science in general. This year, nine students from Narvik Engineering College have participated in the Hotel Payload Project (HPP) at Andøya Rocket Range. They have thus played an active and essential role in an ongoing engineering project.
NAROM- a national Laboratory for space education
NASA Astrophysics Data System (ADS)
Hansen, Arne Hjalmar; Østbø, Morten
2002-07-01
Despite a considerable growth in space related industry and scientific research over the past few decades, space related education has largely been neglected in our country. NAROM - the National Centre for Space Related Education - was formed last year to organize space related educational activities, to promote recruitment, to promote appreciation for the benefits of space activities, and to stimulate interest for science in general. This year, nine students from Narvik Engineering College have participated in the Hotel Payload Project (HPP) at Andøya Rocket Range. They have thus played an active and essential role in an ongoing engineering project.
Bedmap2; Mapping, visualizing and communicating the Antarctic sub-glacial environment.
NASA Astrophysics Data System (ADS)
Fretwell, Peter; Pritchard, Hamish
2013-04-01
The Bedmap2 project has been a large cooperative effort to compile, model, map and visualize the ice-rock interface beneath the Antarctic ice sheet. Here we present the final output of that project: the Bedmap2 printed map. The map is an A1, double-sided print showing 2D and 3D visualizations of the dataset. It includes scientific interpretations, cross sections and comparisons with other areas. Paper copies of the colour double-sided map will be freely distributed at this session.
RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chokchai "Box" Leangsuksun
2011-05-31
Our project is a multi-institutional research effort that adopts an interplay of reliability, availability, and serviceability (RAS) aspects to solve resilience issues in high-end scientific computing for the next generation of supercomputers. Key results lie in the following tracks: failure prediction in large-scale HPC systems; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime support and tools.
ERIC Educational Resources Information Center
Zagallo, Patricia; Meddleton, Shanice; Bolger, Molly S.
2016-01-01
We present our design for a cell biology course to integrate content with scientific practices, specifically data interpretation and model-based reasoning. A 2-year research project within this course allowed us to understand how students interpret authentic biological data in this setting. Through analysis of written work, we measured the extent…
ERIC Educational Resources Information Center
Pavlik, John V.; Laufer, Peter D.; Burns, David P.; Ataya, Ramzi T.
2012-01-01
Journalism and mass communication higher education in Iraq is well established but largely isolated from global developments since the 1970s. In the post-Iraq war period, the United Nations Educational, Scientific and Cultural Organization (UNESCO) implemented a multiyear project to work with the leadership of Iraqi higher education to help update…
The Snowmastodon Project: cutting-edge science on the blade of a bulldozer
Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.
2015-01-01
Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.
NASA Astrophysics Data System (ADS)
Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei
2018-01-01
In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent electrical conductivity, with sheet resistance as low as 4.5 Ω/□, and strong tolerance to mechanical bending. The methodology is also applicable to substrates with various wettabilities, suggesting a general strategy to enhance the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).
NASA Astrophysics Data System (ADS)
Favali, Paolo; Beranzoli, Laura; Best, Mairi; Franceschini, PierLuigi; Materia, Paola; Peppoloni, Silvia; Picard, John
2014-05-01
EMSO (European Multidisciplinary Seafloor and Water Column Observatory) is a large-scale European Research Infrastructure (RI). It is a geographically distributed infrastructure composed of several deep-seafloor and water-column observatories, which will be deployed at key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean, to the Black Sea, with the basic scientific objective of real-time, long-term monitoring of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere. EMSO is one of the environmental RIs on the ESFRI roadmap. The ESFRI roadmap identifies new RIs of pan-European importance that correspond to the long-term needs of European research communities. EMSO will be the sub-sea segment of the EU's large-scale Earth observation programme, Copernicus (previously known as GMES - Global Monitoring for Environment and Security), and will significantly enhance the observational capabilities of European member states. An open data policy compliant with the recommendations being developed within the GEOSS initiative (Global Earth Observation System of Systems) will allow for shared use of the infrastructure and the exchange of scientific information and knowledge. The processes that occur in the oceans have a direct impact on human societies; it is therefore crucial to improve our understanding of how they operate and interact. 
To encompass the breadth of these major processes, sustained and integrated observations are required that appreciate the interconnectedness of atmospheric, surface ocean, biological pump, deep-sea, and solid-Earth dynamics and that can address:
• natural and anthropogenic change;
• interactions between ecosystem services, biodiversity, biogeochemistry, physics, and climate;
• impacts of exploration and extraction of energy, minerals, and living resources;
• geo-hazard early warning capability for earthquakes, tsunamis, gas-hydrate release, and slope instability and failure;
• connecting scientific outcomes to stakeholders and policy makers, including government decision-makers.
The development of large research infrastructure initiatives like EMSO must continuously take into account wide-reaching environmental and socio-economic implications and objectives. For this reason, an Ethics Committee was established early in EMSO's initial Preparatory Phase with responsibility for overseeing the key ethical and social aspects of the project. 
These include:
• promoting inclusive science communication and data dissemination services to civil society according to Open Access principles;
• guaranteeing top-quality scientific information and data as the results of top-quality research;
• promoting the increased adoption of eco-friendly, sustainable technologies through the dissemination of advanced scientific knowledge and best practices to the private sector and to policy makers;
• developing education strategies in cooperation with academia and industry aimed at informing and sensitizing the general public to the environmental and socio-economic implications and benefits of large research infrastructure initiatives such as EMSO;
• carrying out excellent science following strict criteria of research integrity, as expressed in the Montreal Statement (2013);
• promoting geoethical awareness and innovation by spurring innovative approaches in the management of environmental aspects of large research projects;
• supporting technological innovation by working closely in support of SMEs;
• providing a constant, qualified and authoritative one-stop reference point and advisory service for politicians and decision-makers.
The paper shows how geoethics is an essential tool for guiding methodological and operational choices and the management of a European project with great impact on the environment and society.
Yoho, Rachel A.; Vanmali, Binaben H.
2016-01-01
The biological sciences encompass topics considered controversial by the American public, such as evolution and climate change. We believe that the development of climate change education in the biology classroom is better informed by an understanding of the history of the teaching of evolution. A common goal for science educators should be to engender a greater respect for and appreciation of science among students while teaching specific content knowledge. Citizen science has emerged as a viable yet underdeveloped method for engaging students of all ages in key scientific issues that impact society through authentic, data-driven scientific research. Where successful, citizen science may open avenues of communication and engagement with the scientific process that would otherwise be difficult to achieve. Citizen science projects demonstrate versatility in education and the ability to test hypotheses by collecting large amounts of often publishable data. Based on our review of other authors' findings, we see great potential for science education research in the incorporation of citizen science projects into curricula, especially with respect to "hot topics" of socioscientific debate. Journal of Microbiology & Biology Education PMID:27047604
NASA Technical Reports Server (NTRS)
Clark, Ian G.; Adler, Mark; Manning, Rob
2015-01-01
NASA's Low-Density Supersonic Decelerator (LDSD) project is developing and testing the next generation of supersonic aerodynamic decelerators for planetary entry. A key element of that development is the testing of full-scale articles in conditions relevant to their intended use, primarily the tenuous Mars atmosphere. To achieve this testing, the LDSD project developed a test architecture similar to that used by the Viking Project in the early 1970s for the qualification of its supersonic parachute. A large, helium-filled scientific balloon is used to hoist a 4.7 m blunt-body test vehicle to an altitude of approximately 32 kilometers. The test vehicle is released from the balloon, spun up for gyroscopic stability, and accelerated to over four times the speed of sound and an altitude of 50 kilometers using a large solid rocket motor. Once at those conditions, the vehicle is despun and the test period begins. The first flight of this architecture occurred on June 28, 2014. Though primarily a shakeout flight of the new test system, it also achieved an early test of two of the LDSD technologies: a large, 6 m diameter Supersonic Inflatable Aerodynamic Decelerator (SIAD) and a large, 30.5 m nominal diameter supersonic parachute. This paper summarizes this first flight.
Visualization techniques to aid in the analysis of multispectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.
Aymerich, Marta; Carrion, Carme; Gallo, Pedro; Garcia, Maria; López-Bermejo, Abel; Quesada, Miquel; Ramos, Rafel
2012-08-01
Most ex-post evaluations of research funding programs are based on bibliometric methods and, although this approach has been widely used, it only examines one facet of the project's impact, that is, scientific productivity. More comprehensive models of payback assessment of research activities are designed for large-scale projects with extensive funding. The purpose of this study was to design and implement a methodology for the ex-post evaluation of small-scale projects that would take into account both the fulfillment of projects' stated objectives as well as other wider benefits to society as payback measures. We used a two-phase ex-post approach to appraise impact for 173 small-scale projects funded in 2007 and 2008 by a Spanish network center for research in epidemiology and public health. In the internal phase we used a questionnaire to query the principal investigator (PI) on the outcomes as well as actual and potential impact of each project; in the external phase we sent a second questionnaire to external reviewers with the aim of assessing (by peer-review) the performance of each individual project. Overall, 43% of the projects were rated as having completed their objectives "totally", and 40% "considerably". The research activities funded were reported by PIs as socially beneficial their greatest impact being on research capacity (50% of payback to society) and on knowledge translation (above 11%). The method proposed showed a good discriminating ability that makes it possible to measure, reliably, the extent to which a project's objectives were met as well as the degree to which the project contributed to enhance the group's scientific performance and of its social payback. Copyright © 2012 Elsevier Ltd. All rights reserved.
The European ALMA Regional Centre: a model of user support
NASA Astrophysics Data System (ADS)
Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.
2014-08-01
The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieve ALMA full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has been so far highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.
ERIC Educational Resources Information Center
Akçöltekin, Alptürk
2016-01-01
The main purpose of this study is to develop positive attitudes in high school teachers towards scientific research and project competitions by training them in scientific research and project preparation subjects. The study group consists of 90 high school teachers. As a result of the study, a significant difference was found in favor of…
ERIC Educational Resources Information Center
Núñez, Cristina; Guinea, Ana; Callau, Sara; Bengoa, Christophe; Basco, Josep; Gavaldà, Jordi
2017-01-01
The Bachelor's Degree Final Project (BDFP) of our school aims to develop a real constructive project, enhance cooperative teamwork and increase productivity of students. We present a real case study, related with engineering and scientific innovation results obtained by BDFP, which has led to an innovative scientific study presented at the 7th…
NASA Astrophysics Data System (ADS)
Peticolas, L. M.; Yan, D.; Cable, C.; Zevin, D.; Johnson, C.; Bender, M.
2017-12-01
The "Eclipse Megamovie" project aimed to gather scientifically useful photographs of the corona from the public at large during the Aug 21, 2017 total solar eclipse. The project used many different mechanisms for gathering 3 types of volunteers: the over 1,000 trained photographers positioned along the path of totality, members of the public along the path of totality using the Megamovie App, and members of the public who took photographs on their own and then uploaded photographs. In order to interest the public in becoming volunteers to provide photographs for this scientific effort, we drove across the path of totality providing presentations in a town hall fashion. We drove through nine states in week-long trips with a total of six trips. The first week took place in August, 2016 through Oregon. The remaining trips took place February-June, 2017. The tour gained press in each town seeded our recruitment efforts, which then gained momentum via articles and press releases in the Spring and Summer, 2017. By Aug 2, 2017 over 1,000 photographers had signed up to be trained volunteers. This presentation will present information on the tours and their impact in seeding the overall recruitment effort for the Eclipse Megamovie Project.
Genome Improvement at JGI-HAGSC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimwood, Jane; Schmutz, Jeremy J.; Myers, Richard M.
Since the completion of the sequencing of the human genome, the Joint Genome Institute (JGI) has rapidly expanded its scientific goals in several DOE mission-relevant areas. At the JGI-HAGSC, we have kept pace with this rapid expansion of projects with our focus on assessing, assembling, improving and finishing eukaryotic whole genome shotgun (WGS) projects for which the shotgun sequence is generated at the Production Genomic Facility (JGI-PGF). We follow this by combining the draft WGS with genomic resources generated at JGI-HAGSC or in collaborator laboratories (including BAC end sequences, genetic maps and FLcDNA sequences) to produce an improved draft sequence.more » For eukaryotic genomes important to the DOE mission, we then add further information from directed experiments to produce reference genomic sequences that are publicly available for any scientific researcher. Also, we have continued our program for producing BAC-based finished sequence, both for adding information to JGI genome projects and for small BAC-based sequencing projects proposed through any of the JGI sequencing programs. We have now built our computational expertise in WGS assembly and analysis and have moved eukaryotic genome assembly from the JGI-PGF to JGI-HAGSC. We have concentrated our assembly development work on large plant genomes and complex fungal and algal genomes.« less
NASA Astrophysics Data System (ADS)
Mayer, A. S.; Vye, E.
2016-12-01
The Michigan Tech GlobalWatershed GK-12 Fellowship program bridges the gap between K-12 learning institutions and the scientific community with a focus on watershed research. Michigan Tech graduate students (fellows) work in tandem with teachers on the development of relevant hands-on, inquiry based lesson plans and activities based on their doctoral research projects in watershed science. By connecting students and teachers to state of the art academic research in watershed science, teachers are afforded a meaningful way in which to embed scientific research as a component of K-12 curricula, while mentoring fellows on the most pertinent and essential topics for lesson plan development. Fellows fulfill their vital responsibility of communicating their academic research to a broader public while fostering improved teaching and communication skills. A goal of the project is to increase science literacy among students so they may understand, communicate and participate in decisions made at local, regional, and global levels. The project largely works with schools located in Michigan's western Upper Peninsula but also partners with K-12 systems in Sonora, Mexico. While focusing on local and regional issues, the international element of the project helps expand student, teacher, and fellow worldviews and global awareness of watershed issues and creates meaningful partnerships. Lesson plans are available online and teacher workshops are held regularly to disseminate the wealth of information and resources available to the broader public. Evaluation results indicate that fellows' skill and confidence in their ability to communicate science increased as a results of their participation of the program, as well as their desire to communicate science in their future careers. 
Teachers' confidence in their capacity to present watershed science to their students increased, along with their understanding of how scientific research contributes to understanding of water-related issues. The GlobalWatershed GK-12 Fellowship program serves as a model for broadening scientific impacts among a wider public through shared communication and partnership.
Risk and value analysis of SETI
NASA Technical Reports Server (NTRS)
Billingham, J.
1986-01-01
The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveal that both projects provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.
Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring
Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.
2015-04-14
Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complimentary reach-scale metrics and methods that may improve the ability of CTUIR and others to detect instream and riparian changes at large restoration projects.
Drilling to investigate processes in active tectonics and magmatism
NASA Astrophysics Data System (ADS)
Shervais, J.; Evans, J.; Toy, V.; Kirkpatrick, J.; Clarke, A.; Eichelberger, J.
2014-12-01
Coordinated drilling efforts are an important method to investigate active tectonics and magmatic processes related to faults and volcanoes. The US National Science Foundation (NSF) recently sponsored a series of workshops to define the nature of future continental drilling efforts. As part of this series, we convened a workshop to explore how continental scientific drilling can be used to better understand active tectonic and magmatic processes. The workshop, held in Park City, Utah, in May 2013, was attended by 41 investigators from seven countries. Participants were asked to define compelling scientific justifications for examining problems that can be addressed by coordinated programs of continental scientific drilling and related site investigations. They were also asked to evaluate a wide range of proposed drilling projects, based on white papers submitted prior to the workshop. Participants working on faults and fault zone processes highlighted two overarching topics with exciting potential for future scientific drilling research: (1) the seismic cycle and (2) the mechanics and architecture of fault zones. Recommended projects target fundamental mechanical processes and controls on faulting, and range from induced earthquakes and earthquake initiation to investigations of detachment fault mechanics and fluid flow in fault zones. Participants working on active volcanism identified five themes: the volcano eruption cycle; eruption sustainability, near-field stresses, and system recovery; eruption hazards; verification of geophysical models; and interactions with other Earth systems. Recommended projects address problems that are transferrable to other volcanic systems, such as improved methods for identifying eruption history and constraining the rheological structure of shallow caldera regions. 
Participants working on chemical geodynamics identified four major themes: large igneous provinces (LIPs), ocean islands, continental hotspot tracks and rifts, and convergent plate margins (subduction zones). This workshop brought together a diverse group of scientists with a broad range of scientific experience and interests. A particular strength was the involvement of both early-career scientists, who will initiate and carry out these new research programs, and more senior researchers with many years of experience in scientific drilling and active tectonics research. Each of the themes and questions outlined above has direct benefits to society, including improving hazard assessment, direct monitoring of active systems for early warning, renewable and non-renewable resource and energy exploitation, and predicting the environmental impacts of natural hazards, emphasizing the central role that scientific drilling will play in future scientific and societal developments.
NASA Astrophysics Data System (ADS)
Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.
2017-12-01
Science conducted in collaborative, cross-institutional research projects, requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, also publications, conference contributions, reports, pictures etc. should be managed. Both, knowledge and data sharing is essential to create synergies. Within the coordinated programme `Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, a RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) `Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 `Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data at field campaigns or by modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016. 
Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download, statistics as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.
NASA Astrophysics Data System (ADS)
Burgos-Martin, J.; Sanchez-Padron, M.; Sanchez, F.; Martinez-Roger, Carlos
2004-07-01
Large-Scale observing facilities are scarce and costly. Even so, the perspective to enlarge or to increase the number of these facilities are quite real and several projects are undertaking their first steps in this direction. These costly facilities require the cooperation of highly qualified institutions, able to undertake the project from the scientific and technological point of view, as well as the vital collaboration and effective support of several countries, at the highest level, able to provide the necessary investment for their construction. Because of these technological implications and the financial magnitude of these projects, their impact goes well beyond the international astrophysical community. We propose to carry out a study on the socio-economic impact from the construction and operation of an Extremely Large Telescope of class 30 - 100 m. We plan to approach several aspects such as its impact in the promotion of the employment; social, educational and cultural integration of the population; the impulse of industries; its impact on the national and international policies on research; environmental issues; etc. We will also analyze the financial instruments available, and those special aids only accessible for some countries and regions to encourage their participation in projects of this magnitude.
4th Annual DOE-ERSP PI Meeting: Abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazen, Terry C.
2009-03-01
This contains abstracts from the 2009 Annual Environmental Remediation Sciences Program (ERSP) Principal Investigators (PI) Meeting. The ERSP seeks to advance fundamental science to understand, predict, and mitigate the impacts of environmental contamination from past nuclear weapons production and provide a scientific basis for the long-term stewardship of nuclear waste disposal. These ambitious goals cannot be achieved by any one project alone. Therefore, ERSP funds a combination of research programs at the DOE national laboratories, individual projects at universities and federal agencies, and large long(er)-term field site research. Integration of these activities to advance the ERSP goals is a constantmore » challenge, but made significantly simpler by bringing together all funded ERSP researchers once a year to discuss the very latest research results. It is at these meetings where new ideas and/or scientific advancements in support of ERSP goals can be discussed and openly debated among all PIs in the program. The ERSP thrives, in part, on the new ideas, concepts, scientific connections, and collaborations generated as a result of these meetings. The annual PI Meeting is very much a working meeting with three major goals: (1) to provide opportunities for scientific interaction among the ERSP scientists, a critical element for the program; (2) to provide the ERSP program staff with an opportunity to evaluate the progress of each program and project; and (3) to showcase the ERSP to interested parties within DOE and within other federal agencies In addition to program managers from within OBER, there will be representatives from other offices within DOE and other federal agencies in attandance at the meeting.« less
iCollections methodology: workflow, results and lessons learned
Penn, Malcolm; Sadka, Mike; Hine, Adrian; Brooks, Stephen; Siebert, Darrell J.; Sleep, Chris; Cafferty, Steve; Cane, Elisa; Martin, Geoff; Toloni, Flavia; Wing, Peter; Chainey, John; Duffell, Liz; Huxley, Rob; Ledger, Sophie; McLaughlin, Caitlin; Mazzetta, Gerardo; Perera, Jasmin; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Scialabba, Elisabetta; Honey, Martin; Huertas, Blanca; Howard, Theresa; Carter, Victoria; Albuquerque, Sara; Paterson, Gordon; Kitching, Ian J.
2017-01-01
Abstract The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme was to undertake a series of pilot projects to develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects – iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies, 181,545 specimens representing 89 species from the British Isles and Ireland. The data digitised includes, species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. A previous paper explained the way the data were obtained and the background to the collections that made up the project. The present paper describes the technical, logistical, and economic aspects of managing the project. PMID:29104442
iCollections methodology: workflow, results and lessons learned
Penn, Malcolm; Sadka, Mike; Hine, Adrian; Brooks, Stephen; Siebert, Darrell J.; Sleep, Chris; Cafferty, Steve; Cane, Elisa; Martin, Geoff; Toloni, Flavia; Wing, Peter; Chainey, John; Duffell, Liz; Huxley, Rob; Ledger, Sophie; McLaughlin, Caitlin; Mazzetta, Gerardo; Perera, Jasmin; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Honey, Martin; Huertas, Blanca; Howard, Theresa; Carter, Victoria; Albuquerque, Sara; Paterson, Gordon; Kitching, Ian J.
2017-01-01
Abstract The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme was to undertake a series of pilot projects to develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects – iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies, 181,545 specimens representing 89 species from the British Isles and Ireland. The data digitised includes, species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. A previous paper explained the way the data were obtained and the background to the collections that made up the project. The present paper describes the technical, logistical, and economic aspects of managing the project. PMID:29104435
iCollections methodology: workflow, results and lessons learned.
Blagoderov, Vladimir; Penn, Malcolm; Sadka, Mike; Hine, Adrian; Brooks, Stephen; Siebert, Darrell J; Sleep, Chris; Cafferty, Steve; Cane, Elisa; Martin, Geoff; Toloni, Flavia; Wing, Peter; Chainey, John; Duffell, Liz; Huxley, Rob; Ledger, Sophie; McLaughlin, Caitlin; Mazzetta, Gerardo; Perera, Jasmin; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Honey, Martin; Huertas, Blanca; Howard, Theresa; Carter, Victoria; Albuquerque, Sara; Paterson, Gordon; Kitching, Ian J
2017-01-01
The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme was to undertake a series of pilot projects to develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects - iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies, 181,545 specimens representing 89 species from the British Isles and Ireland. The data digitised includes, species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. A previous paper explained the way the data were obtained and the background to the collections that made up the project. The present paper describes the technical, logistical, and economic aspects of managing the project.
The Roland Maze Project — Cosmic Ray Registration at Schools
NASA Astrophysics Data System (ADS)
Feder, J.; JȨDRZEJCZAK, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Tokarski, P.; Wibig, T.
Experimental studies of cosmic rays at the highest energies (above 1018 eV) are the main scientific goal of the projected large area network of extensive air shower detectors. Placing the detectors on the roofs of high school buildings will lower the cost by using the existing urban infrastructure (INTERNET, power supply, etc.), and can be a very efficient way of science popularisation by engaging high school students in the research program. 30 high schools in Łódź are already involved in the project. The project has recently obtained some financial support from the City Council of Łódź. The donation enabled us to start experimental work on detector construction details. A cycle of lectures and seminars devoted to different aspects of project realization (detector construction, on-line data acquisition system, C++ programming) has been organized for students at our Institute and at schools.
NASA Astrophysics Data System (ADS)
Glushkova, Yu O.; Gordashnukova, O. Yu; Pahomova, A. V.; Shatohina, S. P.; Filippov, D. V.
2018-05-01
The modern markets are characterized by fierce competition, constantly changing demand, increasing demands of consumers, shortening of the life cycle of goods and services in connection with scientific and technological progress. Therefore, for survival, modern logistic systems of industrial enterprises must be constantly improved. Modern economic literature is represented by a large volume of publications on various aspects of the studied issues. They consider the issues of project management in the logistics system that inevitably encounter with triple Limited. It initially describes the balance between project content, cost, and time. Later it was suggested to either replace the content with quality or add a fourth criterion. Therefore it is possible to name such limitation as triple or four-criteria limitation.
Embracing Diversity: The Exploration of User Motivations in Citizen Science Astronomy Projects
NASA Astrophysics Data System (ADS)
Lee, Lo
2018-06-01
Online citizen science projects ask members of the public to donate spare time on their personal computers to process large datasets. A critical challenge for these projects is volunteer recruitment and retention. Many of these projects use Berkeley Open Infrastructure for Network Computing (BOINC), a piece of middleware, to support their operations. This poster analyzes volunteer motivations in two large, BOINC-based astronomy projects, Einstein@Home and Milkyway@Home. Volunteer opinions are addressed to assess whether and how competitive elements, such as credit and ranking systems, motivate volunteers. Findings from a study of project volunteers, comprising surveys (n=2,031) and follow-up interviews (n=21), show that altruism is the main incentive for participation because volunteers consider scientific research to be critical for humans. Multiple interviewees also revealed a passion for extrinsic motivations, i.e. those that involve recognition from other people, such as opportunities to become co-authors of publications or to earn financial benefits. Credit and ranking systems motivate nearly half of interviewees. By analyzing user motivations in astronomical BOINC projects, this research provides scientists with deeper understandings about volunteer communities and various types of volunteers. Building on these findings, scientists can develop different strategies, for example, awarding volunteers badges, to recruit and retain diverse volunteers, and thus enhance long-term user participation in astronomical BOINC projects.
COMUNICA Project: a commitment for strategic communication on Earth Sciences
NASA Astrophysics Data System (ADS)
Cortes-Picas, Jordi; Diaz, Jordi; Fernandez-Turiel, Jose-Luis
2016-04-01
The Institute of Earth Sciences Jaume Almera (ICTJA-CSIC) celebrated its 50th anniversary last year. It is a reference research center in the Earth Sciences at both the national and international level. The Institute comprises four research groups that focus their scientific activity on the structure and dynamics of the Earth, environmental changes in the geological record, geophysical and geochemical modelling, and crystallography and optical properties. Interaction between ICTJA-CSIC researchers and traditional media occurs mainly when large geological disasters happen, chiefly earthquakes and volcanic eruptions, and is limited by the fact that the Institute's mission is scientific research; it has no responsibilities in the area of civil protection. This limited relationship reduces public awareness of our activity. To overcome this situation, the ICTJA-CSIC has decided to take an active role in the social dissemination of geological and geophysical knowledge and has launched the COMUNICA Project. The project aims to increase the social visibility of the ICTJA-CSIC and to promote the outreach activity of its researchers. To this end, the ICTJA-CSIC has created a Communication Unit, which is in charge of designing communication strategies to give different audiences (media, students in secondary and higher education, the general public) an overview of the scientific and institutional activity of the ICTJA-CSIC. A global communication plan is being designed to define strategic actions, both internal and external. An important role has been reserved for digital channels, promoting ICTJA-CSIC activity on social networks such as Twitter, Facebook and YouTube, besides a major effort in the renovation and maintenance of the corporate website. A strong effort will be made to collect and disseminate, through press releases, the major scientific milestones achieved by the researchers, to attract the interest of the mass media.
The communication plan also includes institutional participation in scientific dissemination events, talks addressed to the general public, and workshops and seminars for students in secondary and higher education.
NASA Astrophysics Data System (ADS)
Kapon, S.; Ganiel, U.; Eylon, B.
2009-09-01
Many large scientific projects and scientific centres incorporate some kind of outreach programme. Almost all of these outreach programmes include public scientific lectures delivered by practising scientists. In this article, we examine such lectures from the perspectives of: (i) lecturers (7) who are practising scientists acknowledged to be good public lecturers and (ii) audiences composed of high-school students (169) and high-school physics teachers (80) who attended these lectures. We identify and discuss the main goals as expressed by the lecturers and the audiences, and the correspondence between these goals. We also discuss how the lecturers' goals impact on the design of their lectures and examine how the lecture affects audiences with different attitudes towards (and interests in) physics. Our findings suggest that the goals of the participating lecturers and the expectations of their audiences were highly congruent. Both believe that a good public scientific lecture must successfully communicate state-of-the-art scientific knowledge to the public, while inspiring interest in and appreciation of science. Our findings also suggest that exemplary public scientific lectures incorporate content, structure and explanatory means that explicitly adhere to the lecturers' goals. We identify and list several design principles.
NASA Astrophysics Data System (ADS)
Kusnadi, K.; Rustaman, N. Y.; Redjeki, S.; Aryantha, I. N. P.
2017-09-01
An inquiry-laboratory-based project to enhance the scientific inquiry literacy of prospective biology teachers was implemented in a Microbiology course. The project was designed in three stages: debriefing of basic microbiology lab skills, guided inquiry, and free inquiry. The study was quasi-experimental with a control-group pretest-posttest design. The subjects were 80 prospective biology teachers. The scientific inquiry literacy instrument refers to the ScInqLiT by Wenning. The results showed a significant difference in scientific inquiry literacy posttest scores between the experimental and control groups (α = 0.05), with N-gain scores of 0.49 (medium) for the experimental group and 0.24 (low) for the control group. Formative assessment showed that students' scientific attitudes and their research and microbiology lab skills increased while conducting the project. Students' research skills improved, especially in identifying variables, constructing hypotheses, communicating, and concluding. During the inquiry project, students also carried out minds-on, hands-on, and collaborative group-investigation lab activities. Our findings may aid in reforming higher education, particularly microbiology laboratory activities, to better promote scientific inquiry literacy, scientific attitudes, and research and laboratory skills.
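The reported N-gain values follow Hake's normalized gain, g = (post − pre) / (max − pre), the fraction of the possible improvement actually achieved. A minimal sketch, with hypothetical group-mean scores chosen only to reproduce the reported 0.49 and 0.24 (the actual ScInqLiT scores are not given in the abstract):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: achieved improvement over possible improvement."""
    return (post - pre) / (max_score - pre)

# Hypothetical group means, chosen only to illustrate the reported gains.
experiment = normalized_gain(pre=40.0, post=69.4)   # medium gain
control = normalized_gain(pre=40.0, post=54.4)      # low gain
print(round(experiment, 2), round(control, 2))
```

By convention, g < 0.3 is a low gain, 0.3 ≤ g < 0.7 medium, and g ≥ 0.7 high, which matches the abstract's labels.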
New Technologies Promise Dramatic Increase In Capabilities of the Very Large Array
NASA Astrophysics Data System (ADS)
1996-06-01
The National Science Foundation's Very Large Array (VLA) radio telescope in New Mexico is an exceedingly powerful scientific instrument, and has transformed many areas of astronomy in its more than 15 years of operation. It has been used by more astronomers and has produced more scientific papers than any other radio telescope. Though its position as one of the world's premier radio telescopes will remain unchallenged for a long time, new technologies could increase its scientific capabilities more than tenfold. Details were presented today at the American Astronomical Society meeting in Madison, Wisconsin. An enhanced VLA, incorporating state-of-the-art technologies, would provide scientists with a number of important, new capabilities, including detailed investigations of the physics of solar radio bursts; improved radar probes of planets, asteroids and comets; the ability to image protoplanetary disks around young stars; more rapid response and effective observations of transient events such as supernovae; new types of information about gas both within our own Galaxy and in other galaxies; and greatly improved ability to study clusters of galaxies and extremely distant objects in the Universe. In addition, the enhanced VLA will serve as an improved partner with the Very Long Baseline Array (VLBA), a continent-wide radio telescope, also part of the National Radio Astronomy Observatory (NRAO). "The VLA upgrade proposes an essentially new instrument, created from two existing instruments, with power and capability far exceeding that of either one alone," said Rick Perley, NRAO Project Scientist for the VLA Upgrade Project. "It builds on the existing staff and infrastructure and would hardly affect operations costs. In today's fiscal climate, this provides the benefit of a `new' instrument with outstanding scientific capability at the least cost," Perley added. The VLA was built in the 1970s and dedicated in 1980. 
At the time of its completion, it was a state-of-the-art instrument. Even today, "it exceeds all other radio astronomy facilities with its combination of sensitivity, flexibility, speed, and overall imaging quality," Perley said. However, many of the technologies used by the VLA, such as computing, high-speed data transfer, and radio receivers, have greatly advanced over the past 15 years. "The VLA has in place all the needed infrastructure to take maximum advantage of these technological advances at minimum cost," Perley said. The VLA of the future, Perley said, could have: * Sensitivity improved by a factor of 2 to 15, depending on frequency; * A capacity for gathering information on spectral lines increased by a factor of 16; * Complete frequency coverage, versus very spotty current coverage; * Resolution increased by a factor of about 8; and * Complete integration with the VLBA (a long-term project). This would produce an instrument with "an outstanding, unique capability: continuous frequency coverage over a factor of 500 and continuous resolution coverage over a factor of a million, with the best sensitivity of any current instrument," Perley said. The scientific capability of the VLA now is limited in many areas by the aging technology currently employed. These limitations can be solved inexpensively by replacing the older equipment with new, state-of-the-art technology. The National Radio Astronomy Observatory began the VLA Upgrade Project with a scientific workshop held in Socorro, NM, in January of 1995. Scientists from many specialties within astronomy and planetary science were invited to this workshop to present their needs for future observations. The participants of this workshop produced a book outlining the goals of the VLA Upgrade Project. Another scientific workshop is planned for 1997. NRAO scientists and engineers now are working in groups to focus on specific aspects of the upgrade project. 
"We continue to solicit feedback from all interested members of the scientific community on how we can best serve their needs with an improved VLA for the next century," Perley said. For more information about the VLA Upgrade Project, and other NRAO instruments, visit the NRAO World Wide Web Home Page.
NASA Astrophysics Data System (ADS)
Goulet, C. A.; Abrahamson, N. A.; Al Atik, L.; Atkinson, G. M.; Bozorgnia, Y.; Graves, R. W.; Kuehn, N. M.; Youngs, R. R.
2017-12-01
The Next Generation Attenuation project for Central and Eastern North America (CENA), NGA-East, is a major multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER). The project was co-sponsored by the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), the Electric Power Research Institute (EPRI) and the U.S. Geological Survey (USGS). NGA-East involved a large number of participating researchers from various organizations in academia, industry and government and was carried out as a combination of 1) a scientific research project and 2) a model-building component following the NRC Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 process. The science part of the project led to several data products and technical reports, while the SSHAC component aggregated the various results into a ground motion characterization (GMC) model. The GMC model consists of a set of ground motion models (GMMs) for the median and standard deviation of ground motions and their associated weights, combined into logic trees for use in probabilistic seismic hazard analyses (PSHA). NGA-East addressed many technical challenges, most of them related to the relatively small number of earthquake recordings available for CENA. To resolve this shortcoming, the project relied on ground motion simulations to supplement the available data. Other important scientific issues were addressed through research projects on topics such as the regionalization of seismic sources, path and attenuation of motions, the treatment of variability and uncertainties, and the evaluation of site effects. Seven working groups were formed to cover the complexity and breadth of topics in the NGA-East project, each focused on a specific technical area. This presentation provides an overview of the NGA-East research project and its key products.
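One common way a logic tree of GMMs is summarized in PSHA is as a weighted mean of the models' log-median predictions. A minimal sketch with hypothetical models, predictions, and branch weights (not the actual NGA-East GMMs or weights):

```python
# Hypothetical log10 median PGA predictions (in g) from three GMMs
# for a single magnitude-distance scenario.
gmm_log_medians = {"model_A": -1.10, "model_B": -1.25, "model_C": -1.05}

# Logic-tree branch weights; they must sum to 1.
weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}
assert abs(sum(weights.values()) - 1.0) < 1e-9

# Weighted mean in log space, then back-transformed to a median PGA.
mean_log = sum(weights[m] * gmm_log_medians[m] for m in weights)
median_pga = 10 ** mean_log
print(f"weighted log10 median: {mean_log:.3f}, median PGA: {median_pga:.4f} g")
```

In a full PSHA the branches are kept separate so that epistemic uncertainty propagates into the hazard curve; collapsing to a weighted mean is only a convenient single-number summary.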
Apparatus for real-time acoustic imaging of Rayleigh-Bénard convection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuehn, Kerry K.
We have successfully designed, built and tested an experimental apparatus which is capable of providing the first real-time ultrasound images of Rayleigh-Bénard convection in optically opaque fluids confined to large aspect ratio experimental cells. The apparatus employs a modified version of a commercially available ultrasound camera to capture images (30 frames per second) of flow patterns in a fluid undergoing Rayleigh-Bénard convection. The apparatus was validated by observing convection rolls in 5 cSt polydimethylsiloxane (PDMS) polymer fluid. Our first objective, after having built the apparatus, was to use it to study the sequence of transitions from diffusive to time-dependent heat transport in liquid mercury. The aim was to provide important information on pattern formation in the largely unexplored regime of very low Prandtl number fluids. Based on the theoretical stability diagram for liquid mercury, we anticipated that straight rolls should be stable over a range of Rayleigh numbers, between 1708 and approximately 1900. Though some of our power spectral densities were suggestive of the existence of weak convection, we have been unable to unambiguously visualize stable convection rolls above the theoretical onset of convection in liquid mercury. Currently, we are seeking ways to increase the sensitivity of our apparatus, such as (i) improving the acoustic impedance matching between our materials in the ultrasound path and (ii) reducing the noise level in our acoustic images due to turbulence and cavitation in the cooling fluids circulating above and below our experimental cell. If we are able to convincingly improve the sensitivity of our apparatus, and we still do not observe stable convection rolls in liquid mercury, then it may be the case that the theoretical stability diagram requires revision. 
In that case, either (i) straight rolls are not stable in a large aspect ratio cell at the Prandtl numbers associated with liquid mercury, or (ii) they are stable, but not in the region of the stability diagram which has been studied by this experimenter. Our second objective was to use the apparatus to study other optically opaque fluids. To this end, we have obtained the first ultrasound images of Rayleigh-Bénard convection in a ferrofluid (EFH1). This project has provided a vehicle for the scientific training of five undergraduate research assistants during the past four years. It allowed students at Wisconsin Lutheran College, a small undergraduate liberal arts college in Milwaukee, to become directly involved in a significant scientific project from its inception through publication of scientific results. The funding of this project has also strengthened the research and teaching infrastructure at the Wisconsin Lutheran College in three major ways. The project has funded the PI and his students in the design and construction of a major piece of scientific apparatus which is capable of performing novel studies of Rayleigh-Bénard convection in opaque fluids. With the acquisition of this apparatus, we are able to embark on a broad research program to study problems in pattern formation in alloys, ferro-fluids, opaque gels, and liquid metals under thermal or magnetic stresses. This project has allowed the PI to purchase auxiliary equipment necessary for establishing a fluid dynamics research laboratory at the College. And this project has served as an impetus for the College to invest in a new machine shop in the basement of the Science Building at the College in order to support this, and other, scientific projects at the College. The PI has presented work funded by this grant at physics and engineering colloquia at a nearby university and at the keynote presentation at an undergraduate research symposium at Wisconsin Lutheran College. 
Also, the work was featured in local magazine and newspaper articles, and is described on the PI's research webpage. Such scientific outreach serves to advance the cause of science by making it interesting and accessible to a wider audience, and to bring attention to the work done by the Office of Basic Energy Sciences of the Department of Energy.
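The onset threshold cited in this abstract is the critical Rayleigh number, Ra_c ≈ 1708 for rigid boundaries, where Ra = g α ΔT d³ / (ν κ). A minimal sketch of this standard estimate, using approximate room-temperature properties of mercury and a hypothetical cell depth (illustrative values only, not taken from the report):

```python
# Approximate room-temperature properties of liquid mercury (illustrative).
g = 9.81        # gravitational acceleration, m/s^2
alpha = 1.8e-4  # thermal expansion coefficient, 1/K
nu = 1.1e-7     # kinematic viscosity, m^2/s
kappa = 4.3e-6  # thermal diffusivity, m^2/s

def rayleigh(delta_T, depth):
    """Ra = g * alpha * dT * d^3 / (nu * kappa)."""
    return g * alpha * delta_T * depth**3 / (nu * kappa)

RA_CRITICAL = 1708  # onset of convection, rigid-rigid boundaries

# Hypothetical cell: 5 mm fluid depth, 1 K temperature difference.
ra = rayleigh(delta_T=1.0, depth=5e-3)
print(f"Ra = {ra:.0f}, convecting: {ra > RA_CRITICAL}")

# The very low Prandtl number Pr = nu / kappa is what makes mercury
# such an unexplored regime for pattern formation.
print(f"Pr = {nu / kappa:.3f}")
```

Because Ra scales with d³, a modest change in cell depth or ΔT moves the system across the 1708-1900 window where straight rolls were predicted to be stable.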
ARAGO: a robotic observatory for the variable sky
NASA Astrophysics Data System (ADS)
Boer, Michel; Acker, Agnes; Atteia, Jean-Luc; Buchholtz, Gilles; Colas, Francois; Deleuil, Magali; Dennefeld, Michel; Desert, Jean-Michel; Dolez, Noel; Eysseric, J.; Ferlet, Roger; Ferrari, Marc; Jean, Pierre; Klotz, Alain; Kouach, Driss; Lecavelier des Etangs, Alain; Lemaitre, Gerard R.; Marcowith, Alexandre; Marquette, Jean-Babtiste; Meunier, Jean-Pierre; Mochkovitch, Robert; Pain, Reynald; Pares, Laurent; Pinna, Henri; Pinna, Roger; Provost, Lionel; Roques, Sylvie; Schneider, Jean; Sivan, Jean-Pierre; Soubiran, Caroline; Thiebaut, Carole; Vauclair, Gerard; Verchere, Richard; Vidal-Madjar, Alfred
2002-12-01
We present the Advanced Robotic Agile Observatory (ARAGO), a project for a large variability survey of the sky in the frequency range 10^-8 Hz (about one cycle per year) to 1 Hz. Among its scientific objectives are the detection of cosmic gamma-ray bursts, both on alert and serendipitously, orphan afterglows, extrasolar planets, AGNs, quasar microlensing, variable and flare stars, trans-Neptunian asteroids, Earth-grazers, orbital debris, etc. A large Education and Public Outreach program will be an important part of the project. The telescope itself will be made of silicon carbide, allowing, among other advantages, a very light weight and agile operation. ARAGO will be fully autonomous: there will be no human intervention from the observation request through data processing and result dissemination, nor to assist night or day operations. ARAGO will start routine observations by mid-2005.
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Ostrouchov; W. E. Doll; D. A. Wolf
2003-07-01
Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Application of earlier protocols for sampling UXO sites has revealed the shortcomings of those procedures and led to a call for the development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.
Anderson-Schmidt, Heike; Adler, Lothar; Aly, Chadiga; Anghelescu, Ion-George; Bauer, Michael; Baumgärtner, Jessica; Becker, Joachim; Bianco, Roswitha; Becker, Thomas; Bitter, Cosima; Bönsch, Dominikus; Buckow, Karoline; Budde, Monika; Bührig, Martin; Deckert, Jürgen; Demiroglu, Sara Y; Dietrich, Detlef; Dümpelmann, Michael; Engelhardt, Uta; Fallgatter, Andreas J; Feldhaus, Daniel; Figge, Christian; Folkerts, Here; Franz, Michael; Gade, Katrin; Gaebel, Wolfgang; Grabe, Hans-Jörgen; Gruber, Oliver; Gullatz, Verena; Gusky, Linda; Heilbronner, Urs; Helbing, Krister; Hegerl, Ulrich; Heinz, Andreas; Hensch, Tilman; Hiemke, Christoph; Jäger, Markus; Jahn-Brodmann, Anke; Juckel, Georg; Kandulski, Franz; Kaschka, Wolfgang P; Kircher, Tilo; Koller, Manfred; Konrad, Carsten; Kornhuber, Johannes; Krause, Marina; Krug, Axel; Lee, Mahsa; Leweke, Markus; Lieb, Klaus; Mammes, Mechthild; Meyer-Lindenberg, Andreas; Mühlbacher, Moritz; Müller, Matthias J; Nieratschker, Vanessa; Nierste, Barbara; Ohle, Jacqueline; Pfennig, Andrea; Pieper, Marlenna; Quade, Matthias; Reich-Erkelenz, Daniela; Reif, Andreas; Reitt, Markus; Reininghaus, Bernd; Reininghaus, Eva Z; Riemenschneider, Matthias; Rienhoff, Otto; Roser, Patrik; Rujescu, Dan; Schennach, Rebecca; Scherk, Harald; Schmauss, Max; Schneider, Frank; Schosser, Alexandra; Schott, Björn H; Schwab, Sybille G; Schwanke, Jens; Skrowny, Daniela; Spitzer, Carsten; Stierl, Sebastian; Stöckel, Judith; Stübner, Susanne; Thiel, Andreas; Volz, Hans-Peter; von Hagen, Martin; Walter, Henrik; Witt, Stephanie H; Wobrock, Thomas; Zielasek, Jürgen; Zimmermann, Jörg; Zitzelsberger, Antje; Maier, Wolfgang; Falkai, Peter G; Rietschel, Marcella; Schulze, Thomas G
2013-12-01
The German Association for Psychiatry and Psychotherapy (DGPPN) has committed itself to establish a prospective national cohort of patients with major psychiatric disorders, the so-called DGPPN-Cohort. This project will enable the scientific exploitation of high-quality data and biomaterial from psychiatric patients for research. It will be set up using harmonised data sets and procedures for sample generation and guided by transparent rules for data access and data sharing regarding the central research database. While the main focus lies on biological research, it will be open to all kinds of scientific investigations, including epidemiological, clinical or health-service research.
Final Report. Institute for Ultrascale Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois
The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to develop advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunity to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared to face the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.
The Role of Project Science in the Chandra X-Ray Observatory
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Weisskopf, Martin C.
2006-01-01
The Chandra X-Ray Observatory, one of NASA's Great Observatories, has an outstanding record of scientific and technical success. This success results from the efforts of a team comprising NASA, its contractors, the Smithsonian Astrophysical Observatory, the instrument groups, and other elements of the scientific community, including thousands of scientists who utilize this powerful facility for astrophysical research. We discuss the role of NASA Project Science in the formulation, development, calibration, and operation of the Chandra X-ray Observatory. In addition to representing the scientific community within the Project, Project Science performed what we term "science systems engineering". This activity encompasses translation of science requirements into technical requirements and assessment of the scientific impact of programmatic and technical trades. We briefly describe several examples of science systems engineering conducted by Chandra Project Science.
The Roland Maze Project school-based extensive air shower network
NASA Astrophysics Data System (ADS)
Feder, J.; Jȩdrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Wibig, T.
2006-01-01
We plan to construct a large-area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Łódź. Detection points will be connected via the Internet to a central server, and their operation will be synchronized by GPS. The main scientific goal of the project is the study of ultra-high-energy cosmic rays. Using existing town infrastructure (Internet, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of popularising science. We performed simulations of the projected network's capability to register extensive air showers and reconstruct the energies of primary particles. Results of the simulations and the current status of the project's realisation will be presented.
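GPS time-stamping allows triggers from spatially separated school stations to be searched offline for coincidences that indicate a common shower front. A minimal sketch of such a coincidence search over hypothetical trigger times (the network's actual reconstruction is far more elaborate):

```python
def find_coincidences(station_times, window=1e-4):
    """Return groups of triggers from >= 2 distinct stations within `window` seconds.

    station_times: dict mapping station id -> list of GPS timestamps (seconds).
    Simplification: each group is anchored at its earliest trigger.
    """
    # Flatten to (time, station) pairs and sort chronologically.
    events = sorted((t, s) for s, times in station_times.items() for t in times)
    coincidences = []
    i = 0
    while i < len(events):
        j = i
        group = [events[i]]
        while j + 1 < len(events) and events[j + 1][0] - events[i][0] <= window:
            j += 1
            group.append(events[j])
        if len({s for _, s in group}) >= 2:  # at least two distinct stations
            coincidences.append(group)
        i = j + 1
    return coincidences

# Hypothetical triggers (seconds): one genuine coincidence near t = 100.0.
triggers = {"school_A": [100.00002, 250.0], "school_B": [100.00005], "school_C": [400.0]}
print(find_coincidences(triggers))
```

The window would in practice depend on station spacing and the shower-front geometry; 100 microseconds here is purely illustrative.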
Teaching toward a More Scientifically Literate Society
ERIC Educational Resources Information Center
LoGiudici, Raymond; Ende, Fred
2010-01-01
To teach scientific literacy to eighth graders, the authors created a yearlong project that emphasizes the various components and skills required to be a scientifically literate citizen. This project is broken into four separate components: skeptical thinking (pseudoscience), current-event article analysis, fiction and nonfiction literature, and…
An Interpretive Study of Meanings Citizen Scientists Make When Participating in Galaxy Zoo
NASA Astrophysics Data System (ADS)
Mankowski, T. S.; Slater, S. J.; Slater, T. F.
2011-09-01
As the Web 2.0 world lurches forward, so do intellectual opportunities for students and the general public to meaningfully engage in the scientific enterprise. In an effort to assess the intrinsic motivation afforded by participation in Galaxy Zoo, we have inductively analyzed more than 1,000 contributions in the Galaxy Zoo Forum and coded posts thematically. We find that participants overwhelmingly want to meaningfully contribute to a larger scientific enterprise as well as have seemingly unique access to high quality, professional astronomical data. While other citizen science projects work through large data sets, Galaxy Zoo is unique in its motivations and retention abilities. Many of these motivations originate in the aesthetic power of astronomical images, which Galaxy Zoo successfully harnesses, while not compromising the scientific value of the project. From the data emerged several trends of motivation, the primary being the sense of community created within the project that promotes professional-amateur collaboration; fulfilling a dream of being an astronomer, physicist, or astronaut; tapping into a potential well of interest created during the space race era; the spiritual aspect generated when the imagination interacts with Galaxy Zoo; and, uniting them all, the aesthetic appeal of the galaxy images. In addition, a very powerful tool also emerged as a method of retention unique to Galaxy Zoo. This tool, known as variable ratio reinforcement in behavioral psychology, uses the most appealing images as positive reinforcement to maintain classification rates over time.
Big data analytics workflow management for eScience
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni
2015-04-01
In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support, to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (e.g. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. 
Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity will be discussed in detail.
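The cube operations described above (sub-setting, aggregation, array primitives) can be illustrated generically with NumPy; this is a sketch of the kinds of operators involved, not the Ophidia API:

```python
import numpy as np

# Hypothetical data cube: time x lat x lon (e.g. 12 months on a 4x5 grid).
rng = np.random.default_rng(0)
cube = rng.normal(15.0, 5.0, size=(12, 4, 5))

# (i) Sub-setting: slice out months 0-5 and a lat/lon sub-box ("dicing").
subset = cube[0:6, 1:3, 2:5]

# (ii) Aggregation: reduce along the time axis (avg) or over the whole cube (max).
monthly_avg = cube.mean(axis=0)   # shape (4, 5)
overall_max = cube.max()

# (iii) Array-based primitive applied element-wise across the cube:
# anomalies relative to the time mean, via broadcasting.
anomalies = cube - monthly_avg    # shape (12, 4, 5)

print(subset.shape, monthly_avg.shape, anomalies.shape)
```

In a workflow framework, each of these steps would be one operator in a chain, with the engine handling parallelism and intermediate storage rather than in-memory arrays.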
Report of the Fermilab ILC Citizens' Task Force
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Fermi National Accelerator Laboratory convened the ILC Citizens' Task Force to provide guidance and advice to the laboratory to ensure that community concerns and ideas are included in all public aspects of planning and design for a proposed future accelerator, the International Linear Collider. In this report, the members of the Task Force describe the process they used to gather and analyze information on all aspects of the proposed accelerator and its potential location at Fermilab in northern Illinois. They present the conclusions and recommendations they reached as a result of the learning process and their subsequent discussions and deliberations. While the Task Force was charged to provide guidance on the ILC, it became clear during the process that the high cost of the proposed accelerator made a near-term start for the project at Fermilab unlikely. Nevertheless, based on a year of extensive learning and dialogue, the Task Force developed a series of recommendations for Fermilab to consider as the laboratory develops all successor projects to the Tevatron. The Task Force recognizes that bringing a next-generation particle physics project to Fermilab will require both a large international effort and the support of the local community. While the Task Force developed its recommendations in response to the parameters of a future ILC, the principles they set forth apply directly to any large project that may be conceived at Fermilab, or at other laboratories, in the future. With this report, the Task Force fulfills its task of guiding Fermilab from the perspective of the local community on how to move forward with a large-scale project while building positive relationships with surrounding communities. The report summarizes the benefits, concerns and potential impacts of bringing a large-scale scientific project to northern Illinois.
Global Collaborations - Prospects and Problems
NASA Astrophysics Data System (ADS)
Corbett, Ian
2005-04-01
International collaboration has long been a feature of science. Collaborative investments in joint facilities and projects have grown considerably over the past 20-40 years, and many projects have been multinational from the start. This has been particularly true in Europe, where intergovernmental organizations such as CERN, ESA, and ESO have enabled European countries to carry out forefront science with state-of-the-art facilities which would have been beyond the capabilities of any one country. A brief survey of these organizations, their structure, and the possible reasons behind their success is given. The transition from regional to global creates new problems. Global scale projects face a range of generic issues which must be addressed and overcome if the project is to be a success. Each project has its own specific boundary conditions and each adopts an approach best fitted to its own objectives and constraints. Experience with billion dollar projects such as the SSC, LHC, and ITER shows the key problem areas and demonstrates the importance of preparatory work in the early stages to settle issues such as schedule, funding, location, legal and managerial structure, and oversight. A range of current and proposed intercontinental or global projects - so-called "Megascience Projects" - is reviewed. Such projects, originally a feature of space and particle physics, are now becoming more common, and very large projects in astronomy, for example ALMA and 50 - 100m telescopes, and other areas of physics now fall into the `global' category. These projects are on such a large scale, from any scientific, managerial, financial or political perspective, and have such global importance, that they have necessarily been conceived as international from the outset. Increasing financial pressures on governments and funding agencies in the developed countries place additional demands on the project planning. 
The contrasting approaches, problems faced, and progress made in various projects will be analyzed and possible lessons drawn out. The role which can be played in the early stages by bodies such as the OECD Global Science Forum and G-8 Carnegie Meetings, where science policy makers meet, is examined. Experience shows that these valuable 'scene-setting' discussions have to be informed by coordinated input from the scientific community and must be followed up by more detailed discussions between funding agencies or their equivalent, because decision making requires the development of a consensus amongst the participants. This process is illustrated most effectively by the care with which the ideas for the International Linear Collider have been and are being developed. Agreement on building and operating a facility is not the end of the story. The legitimate desire of scientists in all other countries to be able to participate in exploiting a major new facility has to be taken into account, and that introduces a range of proprietary and sociological issues over data access and rights, and now, with the explosion in computing and storage power, over data archiving support. These are issues which can be addressed within the scientific community and taken to the political arena via such bodies as the OECD Global Science Forum.
NASA Astrophysics Data System (ADS)
Laursen, S. L.; Dauber, R.; Molnar, P. H.; Smith, L. K.
2009-12-01
Making wise decisions about daunting societal and environmental problems requires understanding of both scientific concepts and the limits of scientific knowledge. While K-12 school standards now include topics on scientific inquiry and the nature of science, few science teachers have personal knowledge of these ideas through conducting science research first-hand. In their own education, most have experienced primarily fact-packed lecture courses rather than deep engagement with gathering, interpreting and communicating about scientific evidence. Teachers are thus at a disadvantage in teaching about the nature of science. Moreover, few curriculum materials directly address these ideas. Instead, instructors at all levels tend to rely on students gleaning ideas from their lab work, without ever making them explicit. The result is a poor understanding of the nature of science among many students and citizens. Thus the nature of science is an important and fruitful area for “broader impacts” efforts by NSF-funded projects across the entire spectrum of science. To address this gap, we have created a 20-minute educational documentary film focused on the nature and processes of science. The film is a broader impacts effort for a large, NSF-funded, multidisciplinary, collaborative research project to study the uplift of the Tibetan plateau and its impact on atmospheric and climate processes. The film, Upward and Outward: Scientific Inquiry on the Tibetan Plateau, focuses on the process of science, as seen through the lens of a specific project. Viewers follow an international team of scientists as they work in the laboratory and in the field, build new instruments and computer models, travel to exotic locales, argue about their findings, and enjoy collaboration and conversation. 
By gaining an insider’s glimpse into both the intellectual process of scientific inquiry and the everyday social and professional activities of science, students learn how science is a human process for building knowledge, not just a body of fact. While originally targeted to students in grades 8-12, the film has also proven effective with undergraduates in introductory science courses, and with teachers in professional development courses. The 20-minute length ensures that the film can be readily screened and discussed within a single class session, and teachers are supported with suggested pre/post writing prompts, discussion questions, teaching tips, and background materials on the film's scientific content. The presentation will describe the making of the film, its relationship to the scientific project, its use with students and teachers, and some data on their responses. We will show a short clip and make copies of the DVD available to educators and professional developers who attend the session. More information about the film, a short clip, and supporting information for educators can be found at our web site.
Scientific literacy of adult participants in an online citizen science project
NASA Astrophysics Data System (ADS)
Price, Charles Aaron
Citizen science projects offer opportunities for non-scientists to take part in scientific research. Scientific results from these projects have been well documented. However, there is limited research about how these projects affect their volunteer participants. In this study, I investigate how participation in an online, collaborative astronomical citizen science project can be associated with the scientific literacy of its participants. Scientific literacy is measured through three elements: attitude towards science, belief in the nature of science, and competencies associated with learning science. The first two elements were measured through a pre-test given to 1,385 participants when they joined the project and a post-test given six months later to 125 participants. Attitude towards science was measured using nine Likert items custom designed for this project, and beliefs in the nature of science were measured using a modified version of the Nature of Science Knowledge scale. Responses were analyzed using the Rasch Rating Scale Model. Competencies were measured through analysis of discourse occurring in online asynchronous discussion forums using the Community of Inquiry framework, which describes three types of presence in the online forums: cognitive, social, and teaching. Results show that overall attitudes did not change, p = .225. However, there was significant change in attitudes about science in the news (positive) and scientific self-efficacy (negative), p < .001 and p = .035 respectively. Beliefs in the nature of science exhibited a small but significant increase, p = .04. Relative positioning of scores on the belief items did not change much, suggesting the increase is mostly due to reinforcement of current beliefs. The cognitive and teaching presence in the online forums did not change, p = .807 and p = .505 respectively. However, the social presence did change, p = .011.
Overall, these results suggest that multi-faceted, collaborative citizen science projects can have an impact on some aspects of scientific literacy. Using the Rasch Model allowed us to uncover effects that may have otherwise been hidden. Future projects may want to include social interactivity between participants and also make participants specifically aware of how they are contributing to the entire scientific process.
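The pre/post comparison described above can be illustrated with a toy calculation. This is a minimal sketch using hypothetical Likert scores and a plain paired t-test; the study itself analyzed responses with the Rasch Rating Scale Model, which this simple example does not reproduce.

```python
# Minimal paired pre/post comparison on hypothetical Likert scores.
# The actual study used the Rasch Rating Scale Model; this sketch
# only shows the basic pre/post logic with a paired t statistic.
import math

def paired_t(pre, post):
    """Return (mean difference, t statistic) for paired samples."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean, mean / se

pre = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]    # hypothetical pre-test scores
post = [4, 4, 3, 5, 3, 5, 3, 3, 4, 4]   # hypothetical post-test scores
mean_diff, t_stat = paired_t(pre, post)
print(round(mean_diff, 2), round(t_stat, 2))  # prints: 0.5 3.0
```

In practice the t statistic would be compared against a t distribution with n-1 degrees of freedom to obtain the kind of p-values reported above.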
PREDON Scientific Data Preservation 2014
NASA Astrophysics Data System (ADS)
Diaconu, C.; Kraml, S.; Surace, C.; Chateigner, D.; Libourel, T.; Laurent, A.; Lin, Y.; Schaming, M.; Benbernou, S.; Lebbah, M.; Boucon, D.; Cérin, C.; Azzag, H.; Mouron, P.; Nief, J.-Y.; Coutin, S.; Beckmann, V.
Scientific data collected with modern sensors or dedicated detectors very often exceed the scope of the initial scientific design. These data are obtained with increasingly large material and human effort. A large class of scientific experiments is in fact unique because of its scale, with very little chance of being repeated or superseded by new experiments in the same domain: high energy physics and astrophysics experiments, for instance, involve multi-annual developments, and simply duplicating that effort in order to reproduce old data is not affordable. Other scientific experiments are unique by nature: in the earth sciences, medical sciences, etc., the collected data are "time-stamped" and thereby non-reproducible by new experiments or observations. In addition, scientific data collection has increased dramatically in recent years, contributing to the so-called "data deluge" and inviting common reflection in the context of "big data" investigations. The new knowledge obtained from these data should be preserved long term, such that access and re-use remain possible and enhance the initial investment. Data observatories, based on open-access policies and coupled with multi-disciplinary techniques for indexing and mining, may lead to truly new paradigms in science. It is therefore of utmost importance to pursue a coherent and vigorous approach to the long-term preservation of scientific data. Preservation nevertheless remains a challenge due to the complexity of data structures, the fragility of custom-made software environments, and the lack of rigorous approaches to workflows and algorithms. To address this challenge, the PREDON project was initiated in France in 2012 within the MASTODONS program, a Big Data scientific challenge initiated and supported by the Interdisciplinary Mission of the National Centre for Scientific Research (CNRS).
PREDON is a study group formed by researchers from different disciplines and institutes. Several meetings and workshops led to a rich exchange of ideas, paradigms and methods. The present document includes contributions of the participants to the PREDON Study Group, as well as invited papers, related to the scientific case, methodology and technology. This document should be read as a "fact-finding" resource pointing to a concrete and significant scientific interest in long-term research data preservation, as well as to cutting-edge methods and technologies to achieve this goal. A sustained, coherent and long-term action in the area of scientific data preservation would be highly beneficial.
NASA Astrophysics Data System (ADS)
Brändström; Gustavsson, Björn; Pellinen-Wannberg, Asta; Sandahl, Ingrid; Sergienko, Tima; Steen, Ake
2005-08-01
The Auroral Large Imaging System (ALIS) was first proposed at the ESA-PAC meeting in Lahnstein in 1989. The first spectroscopic imaging station was operational in 1994, and since then up to six stations have been in simultaneous operation. Each station has a scientific-grade CCD detector and a six-position filter wheel for narrow-band interference filters. The field of view is around 70°. Each imager is mounted in a positioning system, enabling imaging of a common volume from several sites. This enables triangulation and tomography. Raw data from ALIS are freely available at http://alis.irf.se and ALIS is open for scientific collaboration. ALIS made the first unambiguous observations of radio-induced optical emissions at high latitudes, and the detection of water in a Leonid meteor trail. Coordination with both rockets and satellites is being considered for future observations with ALIS.
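The triangulation enabled by multi-station imaging can be sketched in two dimensions. This is a toy illustration with made-up numbers under a flat-Earth assumption, not actual ALIS geometry or processing: two stations a known baseline apart observe the same feature at measured elevation angles, which fixes its altitude.

```python
# Toy 2-D triangulation: two stations separated by a known baseline
# observe the same feature at measured elevation angles; solving the
# two tangent relations gives the feature's altitude. The numbers are
# illustrative only (flat-Earth approximation, feature between stations).
import math

def altitude_from_two_stations(baseline_km, elev1_deg, elev2_deg):
    """Altitude of a point seen at elev1 from station 1 and elev2 from
    station 2, with the point between the two stations."""
    t1 = math.tan(math.radians(elev1_deg))
    t2 = math.tan(math.radians(elev2_deg))
    # h / x = t1 and h / (baseline - x) = t2  =>  h = L * t1*t2 / (t1 + t2)
    return baseline_km * t1 * t2 / (t1 + t2)

# Symmetric case: both stations see the feature at 45 degrees,
# so it sits midway at half the baseline.
print(round(altitude_from_two_stations(200.0, 45.0, 45.0), 1))  # prints: 100.0
```

With several stations imaging the same volume, many such line-of-sight constraints are combined, which is what makes tomographic reconstruction possible.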
Huntington, Henry; Callaghan, Terry; Fox, Shari; Krupnik, Igor
2004-11-01
Recent environmental changes are having, and are expected to continue to have, significant impacts in the Arctic as elsewhere in the world. Detecting those changes and determining the mechanisms that cause them are far from trivial problems. The use of multiple methods of observation can increase confidence in individual observations, broaden the scope of information available about environmental change, and contribute to insights concerning mechanisms of change. In this paper, we examine the ways that using traditional ecological knowledge (TEK) together with scientific observations can achieve these objectives. A review of TEK observations in comparison with scientific observations demonstrates the promise of this approach, while also revealing several challenges to putting it into practice on a large scale. Further efforts are suggested, particularly in undertaking collaborative projects designed to produce parallel observations that can be readily compared and analyzed in greater detail than is possible in an opportunistic sample.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.
Curiosity: the Mars Science Laboratory Project
NASA Technical Reports Server (NTRS)
Cook, Richard A.
2012-01-01
The Curiosity rover landed successfully in Gale Crater, Mars on August 5, 2012. This event was a dramatic high point in the decade-long effort to design, build, test and fly the most sophisticated scientific vehicle ever sent to Mars. The real achievements of the mission have only just begun, however, as Curiosity is now searching for signs that Mars once possessed habitable environments. The Mars Science Laboratory Project has been one of the most ambitious and challenging planetary projects that NASA has undertaken. It started in the successful aftermath of the 2003 Mars Exploration Rover project and was designed to take significant steps forward in both engineering and scientific capabilities. This included a new landing system capable of emplacing a large mobile vehicle over a wide range of potential landing sites, advanced sample acquisition and handling capabilities that can retrieve samples from both rocks and soil, and a high-reliability avionics suite that is designed to permit long-duration surface operations. It also includes a set of ten sophisticated scientific instruments that will investigate the geological context of the landing site and analyze samples to understand the chemical and organic composition of rocks and soil found there. The Gale Crater site was specifically selected as a promising location where ancient habitable environments may have existed and for which evidence may be preserved. Curiosity will spend a minimum of one Mars year (about two Earth years) looking for this evidence. This paper will report on the progress of the mission over the first few months of surface operations, and look retrospectively at lessons learned during both the development and cruise operations phases of the mission.
NASA Technical Reports Server (NTRS)
Skinner, J. A., Jr.; Gaddis, L. R.; Hagerty, J. J.
2010-01-01
The first systematic lunar geologic maps were completed at 1:1M scale for the lunar near side during the 1960s using telescopic and Lunar Orbiter (LO) photographs [1-3]. The program under which these maps were completed established precedents for map base, scale, projection, and boundaries in order to avoid widely discrepant products. A variety of geologic maps were subsequently produced for various purposes, including 1:5M-scale global maps [4-9] and large-scale maps of high scientific interest (including the Apollo landing sites) [10]. Since that time, lunar science has benefitted from an abundance of surface information, including high resolution images and diverse compositional data sets, which have yielded a host of topical planetary investigations. The existing suite of lunar geologic maps and topical studies provide exceptional context in which to unravel the geologic history of the Moon. However, there has been no systematic approach to lunar geologic mapping since the flight of post-Apollo scientific orbiters. Geologic maps provide a spatial and temporal framework wherein observations can be reliably benchmarked and compared. As such, a lack of a systematic mapping program means that modern (post-Apollo) data sets, their scientific ramifications, and the lunar scientists who investigate these data are all marginalized in regard to geologic mapping. Marginalization weakens the overall understanding of the geologic evolution of the Moon and unnecessarily partitions lunar research. To bridge these deficiencies, we began a pilot geologic mapping project in 2005 as a means to assess the interest, relevance, and technical methods required for a renewed lunar geologic mapping program [11]. Herein, we provide a summary of the pilot geologic mapping project, which focused on the geologic materials and stratigraphic relationships within the Copernicus quadrangle (0°-30° N, 0°-45° W).
Crowdfunding To Support University Research and Public Outreach
NASA Astrophysics Data System (ADS)
Jackson, Brian
2016-10-01
Crowdfunding involves raising (usually small) monetary contributions from a large number of people, often via the internet. Several universities have adopted this model to support small-dollar, high-profile projects and provide the seed money for research efforts. By contrast with traditional scientific funding, crowdfunding provides a novel way to engage the public in the scientific process and sometimes involves donor rewards in the form of acknowledgments in publications and direct involvement in the research itself. In addition to Kickstarter.com and Indiegogo.com, which support a range of enterprises, there are several organizations tailored to scientific research and development, including Experiment.com and the now-defunct PetriDish.org. Private companies are also available to help universities establish their own crowdfunding platforms. At Boise State University, we recently engaged the services of ScaleFunder to launch the PonyUp platform (https://ponyup.boisestate.edu/), inaugurated in Fall 2015 with requests for support for projects ranging from the health effects of organic food during pregnancy to censuses of hummingbirds. In this presentation, I'll discuss my own crowdfunding project to support the rehabilitation of Boise State's on-campus observatory. As the first project launched on PonyUp, it was an enormous success -- we met our original donation goal of $8k just two weeks into the four-week campaign and so upped the goal to $10k, which we achieved two weeks later. In addition to the very gratifying monetary support of the broader Boise community, we received personal stories from many of our donors about their connections to Boise State and the observatory. I'll talk about our approach to social and traditional media platforms and discuss how we leveraged an unlikely cosmic syzygy to boost the campaign.
Diurnal Cycle of Convection and Interaction with the Large-Scale Circulation
NASA Technical Reports Server (NTRS)
Salby, Murry
2002-01-01
The science in this effort was scheduled in the project's 3rd and 4th years, after a long record of high-resolution Global Cloud Imagery (GCI) had been produced. Unfortunately, political disruptions that interfered with this project led to its funding being terminated after only two years of support. Nevertheless, the availability of intermediate data opened the door to a number of important scientific studies. Beyond considerations of the diurnal cycle addressed in this grant, the GCI makes possible a wide range of studies surrounding convection, cloud and precipitation. Several are already underway with colleagues in the US and abroad, who have requested the GCI.
Interfacial nanobubbles produced by long-time preserved cold water
NASA Astrophysics Data System (ADS)
Zhou, Li-Min; Wang, Shuo; Qiu, Jie; Wang, Lei; Wang, Xing-Ya; Li, Bin; Zhang, Li-Juan; Hu, Jun
2017-09-01
Project supported by the Key Laboratory of Interfacial Physics and Technology, Chinese Academy of Sciences, the Open Research Project of the Large Scientific Facility of the Chinese Academy of Sciences, the National Natural Science Foundation of China (Grant Nos. 11079050, 11290165, 11305252, 11575281, and U1532260), the National Key Basic Research Program of China (Grant Nos. 2012CB825705 and 2013CB932801), the National Natural Science Foundation for Outstanding Young Scientists, China (Grant No. 11225527), the Shanghai Academic Leadership Program, China (Grant No. 13XD1404400), and the Program of the Chinese Academy of Sciences (Grant Nos. KJCX2-EW-W09 and QYZDJ-SSW-SLH019)
Setting up crowd science projects.
Scheliga, Kaja; Friesike, Sascha; Puschmann, Cornelius; Fecher, Benedikt
2016-11-29
Crowd science is scientific research that is conducted with the participation of volunteers who are not professional scientists. Thanks to the Internet and online platforms, project initiators can draw on a potentially large number of volunteers. This crowd can be involved to support data-rich or labour-intensive projects that would otherwise be unfeasible. So far, research on crowd science has mainly focused on analysing individual crowd science projects. In our research, we focus on the perspective of project initiators and explore how crowd science projects are set up. Based on multiple case study research, we discuss the objectives of crowd science projects and the strategies of their initiators for accessing volunteers. We also categorise the tasks allocated to volunteers and reflect on the issue of quality assurance as well as feedback mechanisms. With this article, we contribute to a better understanding of how crowd science projects are set up and how volunteers can contribute to science. We suggest that our findings are of practical relevance for initiators of crowd science projects, for science communication as well as for informed science policy making. © The Author(s) 2016.
Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists.
Archiving for Rosetta: Lessons for IPDA
NASA Astrophysics Data System (ADS)
Heather, David
The Rosetta Project is unusual, possibly unique, in that all data must be archived both in NASA's Planetary Data System (PDS) and in ESA's Planetary Science Archive (PSA), according to an inter-agency agreement that predates the existence of ESA's PSA. This requires that all data are formatted according to NASA's PDS3 Standards. Scientific peer reviews of the data content for Rosetta have been carried out both in the US and in Europe, and there was a very large overlap in the issues raised, illustrating the general scientific agreement, independent of geography, on what an archive must contain to be useful to the broader community of planetary scientists. However, validation of the data against the PDS Standards using both PSA- and PDS-developed software has led to the discovery that many of the items that are validated are unstated assumptions in the written PDS Standards and are related, at least in large part, to how the two archiving systems operate rather than to the actual content that a scientist needs to use the data. The talk will illustrate some of these discrepancies with examples and suggest how to avoid such issues in future, optimizing the scientific return on the investment in archiving while minimizing the costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geveci, Berk; Maynard, Robert
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators, combining their respective features into a new visualization toolkit called VTK-m.
Science Fair Projects: The Environment.
ERIC Educational Resources Information Center
Bonnet, Bob; Keen, Dan
This book approaches the development of science fair projects from the point of view that science should be enjoyable, interesting, and thought-provoking. The scientific concepts introduced here will later help young students to understand more advanced scientific principles. These projects develop skills such as classification, making measured…
Assessing the Role of Technology in Citizen Science: A Volunteer's Perspective
NASA Astrophysics Data System (ADS)
Wei, J. H.; Force, A.
2017-12-01
From a volunteer's perspective, citizen science can provide a direct connection between outdoor enthusiasts and the scientists who study these natural environments. These experiences are both rewarding and engaging, as participants become aware of field sites, the scientific method, and their own environmental impacts. Recent technological advances (e.g., smartphones and mobile applications) have the potential to transform citizen science, specifically as technology can both enable and modernize the networks between a large community of potential volunteers and the scientists using these data. By providing volunteers who venture into largely understudied and remote areas with an easy method for data collection and entry, it becomes easier to encourage volunteer engagement in science while maintaining quality control over the data collection process. Participating in Adventure Scientists' projects demonstrates the application of technology as an effective engagement tool, especially when compared to the traditional pen-and-paper surveys often conducted. Pairing volunteers with simple, familiar technology increases engagement, particularly for volunteers otherwise intimidated by the scientific process. When equipped with useful features, such as GPS functionality, smartphone apps offer a simple and standardized method of data collection and description. Yet a variety of factors can complicate field sampling; final choices are ultimately left to the judgment of the volunteer and perhaps could be guided by use of a phone or app. Importantly, Adventure Scientists conducts follow-ups and volunteer surveys, which are critical to the continued evaluation of volunteer experiences and the sampling methods themselves. For future projects, creating a forum in which scientists and volunteers can interact (perhaps also through a phone app) could provide scientific context for volunteers, further investing them in the scientific process and their continued participation.
Overall, the Adventure Scientists' approach provides a model for connecting science with the growing outdoor recreation community; by embracing technology already in our pockets, scientists can hope to connect with more participants and future generations to come.
NASA Astrophysics Data System (ADS)
Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.
2009-01-01
The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standards of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher-education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We will describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. In this limited presentation, we will confine the discussion mainly to the scientific reasoning perspective, but we will briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We will discuss preliminary results from our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund from the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.
ERIC Educational Resources Information Center
Harland, Darci
2013-01-01
Research projects are worth doing. They raise student interest in science and offer experience in authentic scientific practices. Implementing independent research projects among students requires the teacher to be skilled not only in scientific research but also in project management. Teachers' duties include--but are not limited…
U.S. Navy's Superconductivity Programs: Scientific Curiosity to Fleet Utility
2010-10-01
Performing organization: Naval Research Laboratory, Washington, DC 20375. ... classes of materials studied for superconductivity were ternary alloys [13] and organic materials [14]. The dilution refrigerator largely replaced
NASA Technical Reports Server (NTRS)
Barclay, Rebecca O.; Pinelli, Thomas E.
1997-01-01
The large and complex aerospace industry, which employed approximately 850,000 people in 1994 (Aerospace Facts, 1994-95, p. 11), plays a vital role in the nation's economy. Although only a small percentage of those employed in aerospace are technical communicators, they perform a wide variety of communication duties in government and the private sector.
Haiyang Qiangguo: China as a Maritime Power
2016-03-15
areas, including provincial development projects in coastal areas, much of China's offshore oil and gas industry, and a large range of other maritime ... it clear that resource extraction extends to deep ocean mining, deep sea fishing, and oil and gas extraction beyond Chinese-claimed waters [40] ... Promote the Reaching of New Height in China's Polar Scientific Survey Undertaking."; Wang Yilin. "CNOOC: How the Maritime Oil Industry Can Assist in
NASA Astrophysics Data System (ADS)
George, L. A.; Parra, J.; Rao, M.; Offerman, L.
2007-12-01
Research experiences for science teachers are an important mechanism for increasing classroom teachers' science content knowledge and facility with "real world" research processes. We have developed and implemented a summer scientific research and education workshop model for high school teachers and students which promotes classroom science inquiry projects and produces important research results supporting our overarching scientific agenda. The summer training includes development of a scientific research framework, design and implementation of preliminary studies, extensive field research and training in and access to instruments, measurement techniques and statistical tools. The development and writing of scientific papers is used to reinforce the scientific research process. Using these skills, participants collaborate with scientists to produce research quality data and analysis. Following the summer experience, teachers report increased incorporation of research inquiry in their classrooms and student participation in science fair projects. This workshop format was developed for an NSF Biocomplexity Research program focused on the interaction of urban climates, air quality and human response and can be easily adapted for other scientific research projects.
How to Teach High-School Students "How Science Really Works?"
NASA Astrophysics Data System (ADS)
Losiak, Anna; Students, High-School; Winiarska, Anna; Parys-Wasylkiewicz, Magdalena
2016-04-01
One of the largest problems in Poland (as in much of the developed world) is that people do not understand how science works. Based on what they learned at school, they think that science is an aggregation of facts to be learned by heart. Based on media coverage of science topics, they think it is a collection of curiosities about two-headed snakes. Based on the way science is shown in movies and TV series, they envision it as magic performed in a white coat, with colorful fluids and magic spells such as "transformative hermeneutics of quantum gravity". As a result, our societies include a large number of people who "do not believe" in evolution, think that vaccinations cause autism, and think that anthropogenic global warming is a myth. This is not very surprising, given that most people have never had a chance to perform a real scientific experiment. Most people, if they are lucky, see some science demonstrations in the classroom. These are of course very useful, but it is quite clear to everyone that (if everything goes well) a demonstration can end in only one, pre-defined way. A "real" scientific experiment, as part of the scientific process, is one whose outcome is unknown until the end of the entire process. In order to teach high-school students "How Science Really Works" we developed a one-year project (grant from the Foundation for Polish Science 26/UD/SKILLS/2015): 1) At first, students learned about the scientific method and the history of science and performed a simple scientific experiment. 2) Later, students developed an experiment that addressed a real, unanswered scientific problem (posed by the Leading Scientist). The aim of the project was to determine the influence of the albedo and emissivity of rock particles lying on the surface of a glacier on the rate of cryoconite hole formation. 
The results of this experiment can be used to better determine the melting rate of terrestrial glaciers and of the Martian North Polar Residual Cap. 3) Students were responsible for physically preparing the scientific equipment (within a given budget). 4) Students prepared the detailed procedures used during the experiment. The experiment was performed by the Austrian Space Forum analog astronauts during the Mars Analog Mission AMADEE-15 between 2nd and 14th of August 2015 at the Kaunertal Glacier in Austria. 5) During and after the mission, students analyzed the data collected during the experiment. 6) Students presented their findings during the regional science fair (Dolnoslaski Festiwal Nauki). Although the quality of the data produced during the experiment was not satisfactory, the project was a success in terms of showing students "How Science Really Works" (e.g., how much depends on properly designed and executed procedures).
Food for Thought ... Mechanistic Validation
Hartung, Thomas; Hoffmann, Sebastian; Stephens, Martin
2013-01-01
Summary Validation of new approaches in regulatory toxicology is commonly defined as the independent assessment of the reproducibility and relevance (the scientific basis and predictive capacity) of a test for a particular purpose. In large ring trials, the emphasis to date has been mainly on reproducibility and predictive capacity (comparison to the traditional test), with less attention given to the scientific or mechanistic basis. Assessing predictive capacity is difficult for novel approaches (which are based on mechanism), such as pathways of toxicity or the complex networks within the organism (systems toxicology). This is highly relevant for implementing Toxicology for the 21st Century, either by high-throughput testing in the ToxCast/Tox21 project or omics-based testing in the Human Toxome Project. This article explores the mostly neglected assessment of a test's scientific basis, which moves mechanism and causality to the foreground when validating/qualifying tests. Such mechanistic validation faces the problem of establishing causality in complex systems. However, pragmatic adaptations of the Bradford Hill criteria, as well as bioinformatic tools, are emerging. Because toxic mechanisms perturb critical infrastructures of the organism, we argue that focusing on the target of toxicity and its vulnerability, in addition to the way it is perturbed, can anchor the identification of the mechanism and its verification. PMID:23665802
Teaching Cell Biology to Dental Students with a Project-Based Learning Approach.
Costa-Silva, Daniela; Côrtes, Juliana A; Bachinski, Rober F; Spiegel, Carolina N; Alves, Gutemberg G
2018-03-01
Although the discipline of cell biology (CB) is part of the curricula of predoctoral dental schools, students often fail to recognize its practical relevance. The aim of this study was to assess the effectiveness of a practical-theoretical project-based course in closing the gaps among CB, scientific research, and dentistry for dental students. A project-based learning course was developed with nine sequential lessons to evaluate 108 undergraduate dental students enrolled in CB classes of a Brazilian school of dentistry during 2013-16. To highlight the relevance of in vitro studies in the preclinical evaluation of dental materials at the cellular level, the students were challenged to complete the process of drafting a protocol and performing a cytocompatibility assay for a bone substitute used in dentistry. Class activities included small group discussions, scientific database search and article presentations, protocol development, lab experimentation, and writing of a final scientific report. A control group of 31 students attended only one laboratory class on the same theme, and the final reports were compared between the two groups. The results showed that the project-based learning students had superior outcomes in acknowledging the relevance of in vitro methods during biocompatibility testing. Moreover, they produced scientifically sound reports with more content on methodological issues, the relationship with dentistry, and the scientific literature than the control group (p<0.05). The project-based learning students also recognized a higher relevance of scientific research and CB to dental practice. These results suggest that a project-based approach can help contextualize scientific research in dental curricula.
Project EDDIE: Improving Big Data skills in the classroom
NASA Astrophysics Data System (ADS)
Soule, D. C.; Bader, N.; Carey, C.; Castendyk, D.; Fuller, R.; Gibson, C.; Gougis, R.; Klug, J.; Meixner, T.; Nave, L. E.; O'Reilly, C.; Richardson, D.; Stomberg, J.
2015-12-01
High-frequency sensor-based datasets are driving a paradigm shift in the study of environmental processes. The online availability of high-frequency data creates an opportunity to engage undergraduate students in primary research by using large, long-term, sensor-based datasets in science courses. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) is developing flexible classroom activity modules designed to (1) improve quantitative and reasoning skills; (2) develop the ability to engage in scientific discourse and argument; and (3) increase students' engagement in science. A team of interdisciplinary faculty from private and public research universities and undergraduate institutions has developed these modules to meet a series of pedagogical goals that include (1) developing the skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. Assessment data from questionnaires and recordings collected during the 2014-2015 academic year show that our modules are effective at making students more comfortable analyzing data. Continued development is focused on improving student learning outcomes with statistical concepts like variation, randomness, and sampling, and on fostering scientific discourse during module engagement. In the coming year, an increased sample size will expand our assessment opportunities to comparison groups in upper-division courses and allow for evaluation of module-specific conceptual knowledge learned. This project is funded by an NSF TUES grant (NSF DEB 1245707).
Enabling responsible public genomics.
Conley, John M; Doerr, Adam K; Vorhaus, Daniel B
2010-01-01
As scientific understandings of genetics advance, researchers require increasingly rich datasets that combine genomic data from large numbers of individuals with medical and other personal information. Linking individuals' genetic data and personal information precludes anonymity and produces medically significant information--a result not contemplated by the established legal and ethical conventions governing human genomic research. To pursue the next generation of human genomic research and commerce in a responsible fashion, scientists, lawyers, and regulators must address substantial new issues, including researchers' duties with respect to clinically significant data, the challenges to privacy presented by genomic data, the boundary between genomic research and commerce, and the practice of medicine. This Article presents a new model for understanding and addressing these new challenges--a "public genomics" premised on the idea that ethically, legally, and socially responsible genomics research requires openness, not privacy, as its organizing principle. Responsible public genomics combines the data contributed by informed and fully consenting information altruists and the research potential of rich datasets in a genomic commons that is freely and globally available. This Article examines the risks and benefits of this public genomics model in the context of an ambitious genetic research project currently under way--the Personal Genome Project. This Article also (i) demonstrates that large-scale genomic projects are desirable, (ii) evaluates the risks and challenges presented by public genomics research, and (iii) determines that the current legal and regulatory regimes restrict beneficial and responsible scientific inquiry while failing to adequately protect participants. The Article concludes by proposing a modified normative and legal framework that embraces and enables a future of responsible public genomics.
Project BudBurst - Meeting the Needs of Climate Change Educators and Scientists
NASA Astrophysics Data System (ADS)
Henderson, S.
2015-12-01
It is challenging for many people to get a sense of what climate change means, because it unfolds over long periods of time, such as decades, that can be difficult to grasp. However, a number of citizen-science-based projects, including NEON's Project BudBurst, provide the opportunity both to learn about climate change and to advance scientific knowledge. In this presentation, we will share lessons learned from Project BudBurst. Project BudBurst is a national citizen science initiative designed to engage the public in observations of phenological (plant life cycle) events and to increase climate literacy. Project BudBurst is important from an educational perspective, but also because it enables scientists to broaden the geographic and temporal scale of their observations. The goals of Project BudBurst are to (1) increase awareness of phenology as an area of scientific study; (2) increase awareness of the impacts of changing climates on plants at a continental scale; and (3) increase science literacy by engaging participants in the scientific process. It was important to better understand if and how Project BudBurst is meeting these goals. Specifically, does participation by non-experts advance scientific knowledge? Does participation advance educational goals and outcomes? Is participation an effective approach to advance or enhance science education in both formal and informal settings? Critical examination shows that Project BudBurst supports both the advancement of scientific knowledge and the realization of educational objectives. Observations and measurements collected by citizen scientists are being used by researchers, as evidenced by the increase of such data in scientific publications. In addition, we found that there is a significant increase in educators utilizing citizen science as part of their instruction. Part of this increase is due to the resources and professional development materials available to educators. 
Working with partners also demonstrated that the needs of both science and education are being met. Project BudBurst partners with the PhenoCam Network, National Geographic Society, US Fish and Wildlife Service, National Park Service, botanic gardens, science centers, and other organizations with both a scientific and an educational mission.
Art meets science: The Cosmopolitan Chicken Research Project.
Stinckens, A; Vereijken, A; Ons, E; Konings, P; Van As, P; Cuppens, H; Moreau, Y; Sakai, R; Aerts, J; Goddeeris, B; Buys, N; Vanmechelen, K; Cassiman, J J
2015-01-01
The Cosmopolitan Chicken Project is an artistic undertaking of the renowned artist Koen Vanmechelen. In this project, the artist interbreeds domestic chickens from different countries, aiming at the creation of a true Cosmopolitan Chicken as a symbol for global diversity. The unifying theme is the chicken and the egg, symbols that link scientific, political, philosophical and ethical issues. The Cosmopolitan Chicken Research Project (CCP) is the scientific component of this artwork. Based on state-of-the-art genomic techniques, the project studies the effect of the crossing of chickens on genetic diversity. This research is also potentially applicable to the human population. The setup of the CCP is quite different from traditional breeding experiments: starting from the crossbreed of two purebred chickens (Mechelse Koekoek x Poule de Bresse), every generation is crossed with a few animals from another breed. For 26 of these purebred and crossbred populations, genetic diversity was measured (1) under the assumption that populations were sufficiently large to maintain all informative SNPs within a generation and (2) under the circumstances of the CCP breeding experiment. Under the first assumption, a steady increase in genetic diversity was witnessed over the consecutive generations, thus indeed indicating the creation of a "Cosmopolitan Chicken Genome". However, under the conditions of the CCP, which reflect the reality within the human population, diversity is seen to fluctuate within given boundaries instead of steadily increasing. One possible interpretation is that, in humans, an evolutionary optimum in genetic diversity has been reached.
First Light for ASTROVIRTEL Project
NASA Astrophysics Data System (ADS)
2000-04-01
Astronomical data archives increasingly resemble virtual gold mines of information. A new project, known as ASTROVIRTEL ("Accessing Astronomical Archives as Virtual Telescopes"), aims to exploit these astronomical treasure troves by allowing scientists to use the archives as virtual telescopes. The competition for observing time on large space- and ground-based observatories such as the ESA/NASA Hubble Space Telescope and the ESO Very Large Telescope (VLT) is intense. On average, less than a quarter of applications for observing time are successful. The fortunate scientist who obtains observing time usually has one year of so-called proprietary time to work with the data before they are made publicly accessible and can be used by other astronomers. Precious data from these large research facilities retain their value far beyond their first birthday and may still be useful decades after they were first collected. The enormous quantity of valuable astronomical data now stored in the archives of the European Southern Observatory (ESO) and the Space Telescope-European Coordinating Facility (ST-ECF) is increasingly attracting the attention of astronomers. Scientists are aware that one set of observations can serve many different scientific purposes, including some that were not considered at all when the observations were first made. ASTROVIRTEL is supported by the European Commission (EC) within the "Access to Research Infrastructures" action under the "Improving Human Potential & the Socio-economic Knowledge Base" programme of the EC (EU Fifth Framework Programme). 
ASTROVIRTEL has been established on behalf of the European Space Agency (ESA) and the European Southern Observatory (ESO) in response to rapid developments currently taking place in the fields of telescope and detector construction, computer hardware, data processing, archiving, and telescope operation. Nowadays astronomical telescopes can image increasingly large areas of the sky. They use more and more different instruments and are equipped with ever-larger detectors. The quantity of astronomical data collected is rising dramatically, generating a corresponding increase in potentially interesting research projects. These large collections of valuable data have led to the useful concept of "data mining", whereby large astronomical databases are exploited to support original research. However, it has become obvious that scientists need additional support to cope efficiently with the massive amounts of data available and so to exploit the true potential of the databases. The strengths of ASTROVIRTEL ASTROVIRTEL is the first virtual astronomical telescope dedicated to data mining. It is currently being established at the joint ESO/Space Telescope-European Coordinating Facility Archive in Garching (Germany). Scientists from EC member countries and associated states will be able to apply for support for a scientific project based on access to and analysis of data from the Hubble Space Telescope (HST), Very Large Telescope (VLT), New Technology Telescope (NTT), and Wide Field Imager (WFI) archives, as well as a number of other related archives, including the Infrared Space Observatory (ISO) archive. Scientists will be able to visit the archive site and collaborate with the archive specialists there. Special software tools that incorporate advanced methods for exploring the enormous quantities of information available will be developed. 
Statements The project co-ordinator, Piero Benvenuti , Head of ST-ECF, elaborates on the advantages of ASTROVIRTEL: "The observations by the ESA/NASA Hubble Space Telescope and, more recently, by the ESO Very Large Telescope, have already been made available on-line to the astronomical community, once the proprietary period of one year has elapsed. ASTROVIRTEL is different, in that astronomers are now invited to regard the archive as an "observatory" in its own right: a facility that, when properly used, may provide an answer to their specific scientific questions. The architecture of the archives as well as their suite of software tools may have to evolve to respond to the new demand. ASTROVIRTEL will try to drive this evolution on the basis of the scientific needs of its users." Peter Quinn , the Head of ESO's Data Management and Operations Division, is of the same opinion: "The ESO/HST Archive Facility at ESO Headquarters in Garching is currently the most rapidly growing astronomical archive resource in the world. This archive is projected to contain more than 100 Terabytes (100,000,000,000,000 bytes) of data within the next four years. The software and hardware technologies for the archive will be jointly developed and operated by ESA and ESO staff and will be common to both HST and ESO data archives. The ASTROVIRTEL project will provide us with real examples of scientific research programs that will push the capabilities of the archive and allow us to identify and develop new software tools for data mining. The growing archive facility will provide the European astronomical community with new digital windows on the Universe." Note [1] This is a joint Press Release by the European Southern Observatory (ESO) and the Space Telescope European Coordinating Facility (ST-ECF). 
Additional information More information about ASTROVIRTEL can be found at the dedicated website at: http://www.stecf.org/astrovirtel The European Southern Observatory (ESO) is an intergovernmental organisation, supported by eight European countries: Belgium, Denmark, France, Germany, Italy, The Netherlands, Sweden and Switzerland. The European Space Agency is an intergovernmental organisation supported by 15 European countries: Austria, Belgium, Denmark, Finland, France, Germany, Ireland, Italy, Netherlands, Norway, Portugal, Spain, Sweden, Switzerland and the United Kingdom. The Space Telescope European Coordinating Facility (ST-ECF) is a co-operation between the European Space Agency and the European Southern Observatory. The Hubble Space Telescope (HST) is a project of international co-operation between NASA and ESA.
Scientific data bases on a VAX-11/780 running VMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benkovitz, C.M.; Tichler, J.L.
At Brookhaven National Laboratory, several current projects are developing and applying data management techniques to compile, analyze, and distribute scientific data sets resulting from various multi-institutional experiments and data-gathering projects. This paper presents an overview of a few of these data management projects.
The Nature of Ultraluminous Galaxies: Infrared Space Observatory Analysis and Instrument Team
NASA Technical Reports Server (NTRS)
Satyapal, Shobita
2001-01-01
The scientific goal of the proposed research was to investigate the physical conditions in the nuclear regions of infrared luminous galaxies by carrying out detailed infrared spectroscopic observations of a large sample of infrared luminous galaxies. During the past year, these observations have been successfully analyzed and extensive modeling using photoionization and photodissociation codes has been carried out. Two first-author publications and a second-author publication have been submitted to the Astrophysical Journal and results were presented at two invited talks. Four additional journal papers are in preparation and will be submitted during year 2 of the grant. The secondary project included in this program was the development of a near-infrared cryogenic Fabry-Perot interferometer for use on future large aperture telescopes. System integration and room temperature testing was successfully carried out for this project during year 1.
Time Projection Chamber Polarimeters for X-ray Astrophysics
NASA Astrophysics Data System (ADS)
Hill, Joanne; Black, Kevin; Jahoda, Keith
2015-04-01
Time Projection Chamber (TPC) based X-ray polarimeters achieve the sensitivity required for practical and scientifically significant astronomical observations, both galactic and extragalactic, with a combination of high analyzing power and good quantum efficiency. TPC polarimeters at the focus of an X-ray telescope have low background and large collecting areas, providing the ability to measure the polarization properties of faint persistent sources. TPCs based on drifting negative ions rather than electrons permit large detector collecting areas with minimal readout electronics, enabling wide-field-of-view polarimeters for observing unpredictable, bright transient sources such as gamma-ray bursts. We describe here the design and expected performance of two different TPC polarimeters proposed for small explorer missions: the PRAXyS (Polarimetry of Relativistic X-ray Sources) X-ray Polarimeter Instrument, optimized for observations of faint persistent sources, and the POET (Polarimetry of Energetic Transients) Low Energy Polarimeter, designed to detect and measure bright transients.
Queue observing at the Observatoire du Mont-Mégantic 1.6-m telescope
NASA Astrophysics Data System (ADS)
Artigau, Étienne; Lamontagne, Robert; Doyon, René; Malo, Lison
2010-07-01
Queue planning of observations and service observing are generally seen as specific to large, world-class astronomical observatories that draw proposals from a large community. One of the common grievances, justified or not, against queue planning and service observing is the fear of training a generation of astronomers without hands-on observing experience. At the Observatoire du Mont-Mégantic (OMM) 1.6-m telescope, we are developing a student-run service observing program. Queue planning and service observing are used as training tools to expose students to a variety of scientific projects and instruments beyond what they would normally use for their own research project. The queue mode at the OMM specifically targets relatively shallow observations that can be completed in less than a few hours and are too short to justify a multi-night classical observing run.
Listening to the Deep: live monitoring of ocean noise and cetacean acoustic signals.
André, M; van der Schaar, M; Zaugg, S; Houégnigan, L; Sánchez, A M; Castell, J V
2011-01-01
The development and broad use of passive acoustic monitoring techniques have the potential to help assess the large-scale influence of artificial noise on marine organisms and ecosystems. Deep-sea observatories can play a key role in understanding these recent acoustic changes. LIDO (Listening to the Deep Ocean Environment) is an international project enabling the real-time, long-term monitoring of marine ambient noise as well as marine mammal sounds at cabled and standalone observatories. Here, we present the overall development of the project and the use of passive acoustic monitoring (PAM) techniques to provide the scientific community with real-time data at large spatial and temporal scales. Special attention is given to the extraction and identification of high-frequency cetacean echolocation signals, given the relevance of detecting target species, e.g. beaked whales, in mitigation processes, e.g. during military exercises. Copyright © 2011. Published by Elsevier Ltd.
Taming the Data Deluge to Unravel the Mysteries of the Universe
NASA Astrophysics Data System (ADS)
Johnston-Hollitt, M.
2017-04-01
Modern astrophysics is one of the most data-intensive research fields in the world and is driving many of the required innovations in the "big data" space. Foremost in astronomy in terms of data generation is radio astronomy, and in the last decade an increase in global interest and investment in the field has led to a large number of new or upgraded facilities, each currently generating petabytes of data per annum. The peak of this so-called 'radio renaissance' will be the Square Kilometre Array (SKA) - a global observatory designed to uncover the mysteries of the Universe. The SKA will create the highest-resolution, fastest-frame-rate movie of the evolving Universe ever and in doing so will generate 160 terabytes of data a second, or close to 5 zettabytes of data per annum. Furthermore, due to the extreme faintness of extraterrestrial radio signals, the telescope elements for the SKA must be located in radio-quiet parts of the world with very low population density. Thus the project aims to build the most data-intensive scientific experiment ever, in some of the most remote places on Earth. Generating and serving scientific data products of this scale to a global community of researchers from remote locations is just the first of the "big data" challenges the project faces. Coordination of a global network of tiered data resources will be required, along with software tools to exploit the vast sea of results generated. In fact, to fully realize the enormous scientific potential of this project, we will need not only better data distribution and coordination mechanisms, but also improved algorithms, artificial intelligence and ontologies to extract knowledge in an automated way at a scale not yet attempted in science. In this keynote I will present an overview of the SKA project, outline the "big data" challenges the project faces and discuss some of the approaches we are taking to tame the astronomical data deluge we face.
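The quoted data rates can be sanity-checked with a few lines of arithmetic. This is a back-of-the-envelope sketch using decimal (SI) prefixes, not an official SKA figure:

```python
# Convert the abstract's quoted rate (160 terabytes per second) into an
# annual volume, to check the "close to 5 zettabytes per annum" claim.
BYTES_PER_TB = 10**12          # SI terabyte
BYTES_PER_ZB = 10**21          # SI zettabyte
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 s

rate_bytes_per_s = 160 * BYTES_PER_TB
bytes_per_year = rate_bytes_per_s * SECONDS_PER_YEAR
zettabytes_per_year = bytes_per_year / BYTES_PER_ZB

print(f"{zettabytes_per_year:.1f} ZB/yr")  # prints "5.0 ZB/yr"
```

The arithmetic lands at about 5.05 ZB per year, consistent with the "close to 5 zettabytes" figure in the abstract.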
MouseNet database: digital management of a large-scale mutagenesis project.
Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M
2000-07-01
The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet©, runs on a Sybase platform and will eventually store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet© will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet© will consist of three major software components: the Animal Management System (AMS), the Sample Tracking System (STS), and the Result Documentation System (RDS). MouseNet© provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data-locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.
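The three components described above map naturally onto a relational schema. The sketch below is a toy illustration of that structure (using SQLite rather than Sybase, and with hypothetical table and column names, not the actual MouseNet design):

```python
# Toy relational sketch of the AMS/STS/RDS split described in the
# abstract. All names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Animal Management System (AMS): breeding-facility animal records.
cur.execute("""CREATE TABLE animals (
    animal_id INTEGER PRIMARY KEY,
    strain    TEXT NOT NULL,
    facility  TEXT NOT NULL)""")

# Sample Tracking System (STS): samples drawn from registered animals.
cur.execute("""CREATE TABLE samples (
    sample_id    INTEGER PRIMARY KEY,
    animal_id    INTEGER REFERENCES animals(animal_id),
    collected_on TEXT)""")

# Result Documentation System (RDS): screening results per sample.
cur.execute("""CREATE TABLE results (
    result_id INTEGER PRIMARY KEY,
    sample_id INTEGER REFERENCES samples(sample_id),
    screen    TEXT,
    value     REAL)""")

cur.execute("INSERT INTO animals VALUES (1, 'ENU-F1', 'Munich-A')")
cur.execute("INSERT INTO samples VALUES (1, 1, '2000-07-01')")
cur.execute("INSERT INTO results VALUES (1, 1, 'clinical-chemistry', 4.2)")

# A plausibility check of the kind the abstract mentions: every result
# must trace back through a sample to a registered animal.
row = cur.execute("""SELECT a.strain, r.screen
    FROM results r
    JOIN samples s ON r.sample_id = s.sample_id
    JOIN animals a ON s.animal_id = a.animal_id""").fetchone()
print(row)  # prints ('ENU-F1', 'clinical-chemistry')
```

The foreign-key chain results → samples → animals mirrors the workflow handoff from the breeding facility (AMS) through sample logistics (STS) to the screening groups (RDS).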
Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.
2013-01-01
Community-based participatory research (CBPR) has become essential in health disparities and environmental justice research; however, the scientific integrity of CBPR projects has become a concern. Some concerns, such as inadequate research training and lack of access to resources and finances, have been discussed as possibly limiting the scientific integrity of a project. Prior to understanding what threatens scientific integrity in CBPR, it is vital to understand what scientific integrity means for the professional and community investigators who are involved in CBPR. This analysis explores the interpretation of scientific integrity in CBPR among 74 professional and community research team members from 25 CBPR projects in nine states in the southeastern United States in 2012. It describes a basic definition of scientific integrity and then explores variations in its interpretation in CBPR. Variations in the interpretations were associated with team members' identities as professional or community investigators. Professional investigators understood scientific integrity in CBPR as either conceptually or logistically flexible, as challenging to balance with community needs, or as no different from traditional scientific integrity. Community investigators interpreted other factors as important to scientific integrity, such as trust, accountability, and overall benefit to the community. This research demonstrates that the variations in the interpretation of scientific integrity in CBPR call for a new definition of scientific integrity in CBPR that takes into account the understanding and needs of all investigators. PMID:24161098
NEON Citizen Science: Planning and Prototyping
NASA Astrophysics Data System (ADS)
Newman, S. J.; Henderson, S.; Gardiner, L. S.; Ward, D.; Gram, W.
2011-12-01
The National Ecological Observatory Network (NEON) will be a national resource for ecological research and education. NEON citizen science projects are being designed to increase awareness and educate citizen scientists about the impacts of climate change, land-use change, and invasive species on continental-scale ecological processes as well as expand NEON data collection capacity by enabling laypersons to collect geographically distributed data. The citizen science area of the NEON web portal will enable citizen scientists to collect, contribute, interpret, and visualize scientific data, as well as access training modules, collection protocols and targeted learning experiences related to citizen science project topics. For NEON, citizen science projects are a means for interested people to interact with and contribute to NEON science. Investigations at vast spatial and temporal scales often require rapid acquisition of large amounts of data from a geographically distributed population of "human sensors." As a continental-scale ecological observatory, NEON is uniquely positioned to develop strategies to effectively integrate data collected by non-scientists into scientific databases. Ultimately, we plan to work collaboratively to transform the practice of science to include "citizens" or non-scientists in the process. Doing science is not limited to scientists, and breaking down the barriers between scientists and citizens will help people better understand the power of using science in their own decision making. In preparation for fully developing the NEON citizen science program, we are partnering with Project BudBurst (PBB), a citizen science project focused on monitoring plant phenology. The educational goals of PBB are to: (1) increase awareness of climate change, (2) educate citizen scientists about the impacts of climate change on plants and the environment, and (3) increase science literacy by engaging participants in the scientific process. 
Phenology was chosen as the focus of this citizen science campaign because it is a visible and comprehensible way of demonstrating the effects of climate change. In addition, plants are readily accessible in nearly every neighborhood, park, and wild area across the continent, so people can make observations whether they live near an inner-city park or in the rural countryside. Recently, NEON developed data visualization tools for Project BudBurst to engage citizen science participants in "doing science" beyond data collection. By prototyping NEON citizen science through Project BudBurst, NEON is developing a better understanding of how to build a citizen science program that addresses areas of awareness, mastery, and leadership of scientific information like that which NEON will produce over the next 30 years.
Kondolf, G. Mathias; Angermeier, Paul L.; Cummins, Kenneth; Dunne, Thomas; Healey, Michael; Kimmerer, Wim; Moyle, Peter B.; Murphy, Dennis; Patten, Duncan; Railsback, Steve F.; Reed, Denise J.; Spies, Robert B.; Twiss, Robert
2008-01-01
Despite increasingly large investments, the potential ecological effects of river restoration programs are still small compared to the degree of human alterations to physical and ecological function. Thus, it is rarely possible to “restore” pre-disturbance conditions; rather restoration programs (even large, well-funded ones) will nearly always involve multiple small projects, each of which can make some modest change to selected ecosystem processes and habitats. At present, such projects are typically selected based on their attributes as individual projects (e.g., consistency with programmatic goals of the funders, scientific soundness, and acceptance by local communities), and ease of implementation. Projects are rarely prioritized (at least explicitly) based on how they will cumulatively affect ecosystem function over coming decades. Such projections require an understanding of the form of the restoration response curve, or at least that we assume some plausible relations and estimate cumulative effects based thereon. Drawing on our experience with the CALFED Bay-Delta Ecosystem Restoration Program in California, we consider potential cumulative system-wide benefits of a restoration activity extensively implemented in the region: isolating/filling abandoned floodplain gravel pits captured by rivers to reduce predation of outmigrating juvenile salmon by exotic warmwater species inhabiting the pits. We present a simple spreadsheet model to show how different assumptions about gravel pit bathymetry and predator behavior would affect the cumulative benefits of multiple pit-filling and isolation projects, and how these insights could help managers prioritize which pits to fill.
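The abstract's spreadsheet model is not reproduced here, but the prioritization logic it describes can be sketched under stated assumptions: suppose each pit's predation loss on outmigrating salmon scales with pit volume times a predator-density factor, with a saturating response curve, so pits can be ranked by the estimated benefit of filling them. The pit names, volumes, density factors, and the constant `k` below are all hypothetical, not values from the CALFED analysis.

```python
# Minimal sketch (not the authors' spreadsheet model) of ranking gravel-pit
# fill/isolation projects by assumed cumulative benefit. All numbers are
# hypothetical illustrations.

pits = [
    # (name, volume in 1e3 m^3, predator-density factor in [0, 1])
    ("Pit A", 120, 0.8),
    ("Pit B", 300, 0.3),
    ("Pit C", 60, 0.9),
]

def benefit(volume, density, k=0.01):
    """Assumed saturating response curve: benefit of removing a pit
    rises with volume * density but levels off for very large pits."""
    return 1 - 1 / (1 + k * volume * density)

# Rank pits so managers would fill the highest-benefit pits first.
ranked = sorted(pits, key=lambda p: benefit(p[1], p[2]), reverse=True)
for name, vol, dens in ranked:
    print(name, round(benefit(vol, dens), 3))
```

Changing the assumed response curve (e.g., linear instead of saturating) reorders the priorities, which is precisely the sensitivity the authors' spreadsheet exercise explores.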
Designing Citizen Science Projects in the Era of Mega-Information and Connected Activism
NASA Astrophysics Data System (ADS)
Pompea, S. M.
2010-12-01
The design of citizen science projects must take many factors into account in order to be successful. Currently, there are a wide variety of citizen science projects with different aims, audiences, reporting methods, and degrees of scientific rigor and usefulness. Projects function on local, national, and worldwide scales and range in time from limited campaigns to around the clock projects. For current and future projects, advanced cell phones and mobile computing allow an unprecedented degree of connectivity and data transfer. These advances will greatly influence the design of citizen science projects. An unprecedented amount of data is available for data mining by interested citizen scientists; how can projects take advantage of this? Finally, a variety of citizen scientist projects have social activism and change as part of their mission and goals. How can this be harnessed in a constructive and efficient way? The design of projects must also select the proper role for experts and novices, provide quality control, and must motivate users to encourage long-term involvement. Effective educational and instructional materials design can be used to design responsive and effective projects in a more highly connected age with access to very large amounts of information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Allcock, William; Beggio, Chris
2014-10-17
U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.
Terre des Lasers: the new Aquitaine outreach and communication center in photonics
NASA Astrophysics Data System (ADS)
Prulhiere, Jean Paul; Sarger, Laurent
2009-06-01
The competitive cluster "Route des Lasers" was labeled by the French Government in July 2005. In this context, it launched in September 2005, in cooperation with the Commissariat à l'Energie Atomique (CEA) and the Regional Council, a project involving scientific exhibitions, called "Terre des Lasers ®", in order to create an exhibition and an area of communication and science discovery for a very large target audience (the public, schools, industry) in the fields of optics, lasers, optronics and imaging. This initiative is part of the strategy of the "Route des Lasers" center, which aims to promote technologies developed in the areas of photonics, targeting in particular children and teenagers and raising their awareness of this industrial and scientific topic.
German activities in optical space instrumentation
NASA Astrophysics Data System (ADS)
Hartmann, G.
2018-04-01
In the years of space exploration since the mid-sixties, a wide experience in optical space instrumentation has developed in Germany. This experience ranges from large telescopes in the 1 m and larger category, with the accompanying focal plane detectors and spectrometers for all regimes of the electromagnetic spectrum (infrared, visible, ultraviolet, x-rays), to miniature cameras for cometary and planetary explorations. The technologies originally developed for space science are now also utilized in the fields of earth observation and even optical telecommunication. The presentation will cover all these areas, with examples of specific technological or scientific highlights. Special emphasis will be given to the current state-of-the-art instrumentation technologies in scientific institutions and industry, and to the future perspective in approved and planned projects.
U.S. Geological Survey coastal and marine geology research; recent highlights and achievements
Williams, S. Jeffress; Barnes, Peter W.; Prager, Ellen J.
2000-01-01
The USGS Coastal and Marine Geology Program has large-scale national and regional research projects that focus on environmental quality, geologic hazards, natural resources, and information transfer. This Circular highlights recent scientific findings of the program, which play a vital role in the USGS endeavor to understand human interactions with the natural environment and to determine how the fundamental geologic processes controlling the Earth work. The scientific knowledge acquired through USGS research and monitoring is critically needed by planners, government agencies, and the public. Effective communication of the results of this research will enable the USGS Coastal and Marine Geology Program to play an integral part in assisting the Nation in responding to the pressing Earth science challenges of the 21st century.
The European perspective for LSST
NASA Astrophysics Data System (ADS)
Gangler, Emmanuel
2017-06-01
LSST is a next generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary with other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST indeed include not only the analysis of LSST big data, but also the practical efficiency of the data access.
Symstad, Amy J.; Long, Andrew J.; Stamm, John; King, David A.; Bachelet, Dominque M.; Norton, Parker A.
2014-01-01
Wind Cave National Park (WICA) protects one of the world’s longest caves, has large amounts of high quality, native vegetation, and hosts a genetically important bison herd. The park’s relatively small size and unique purpose within its landscape requires hands-on management of these and other natural resources, all of which are interconnected. Anthropogenic climate change presents an added challenge to WICA natural resource management because it is characterized by large uncertainties, many of which are beyond the control of park and National Park Service (NPS) staff. When uncertainty is high and control of this uncertainty low, scenario planning is an appropriate tool for determining future actions. In 2009, members of the NPS obtained formal training in the use of scenario planning in order to evaluate it as a tool for incorporating climate change into NPS natural resource management planning. WICA served as one of two case studies used in this training exercise. Although participants in the training exercise agreed that the scenario planning process showed promise for its intended purpose, they were concerned that the process lacked the scientific rigor necessary to defend the management implications derived from it in the face of public scrutiny. This report addresses this concern and others by (1) providing a thorough description of the process of the 2009 scenario planning exercise, as well as its results and management implications for WICA; (2) presenting the results of a follow-up, scientific study that quantitatively simulated responses of WICA’s hydrological and ecological systems to specific climate projections; (3) placing these climate projections and the general climate scenarios used in the scenario planning exercise in the broader context of available climate projections; and (4) comparing the natural resource management implications derived from the two approaches. 
Exploring "The World around Us" in a Community of Scientific Enquiry
ERIC Educational Resources Information Center
Dunlop, Lynda; Compton, Kirsty; Clarke, Linda; McKelvey-Martin, Valerie
2013-01-01
The primary Communities of Scientific Enquiry project is one element of the outreach work in Science in Society in Biomedical Sciences in partnership with the School of Education at the University of Ulster. The project aims to develop scientific understanding and skills at key stage 2 and is a response to several contemporary issues in primary…
NASA Technical Reports Server (NTRS)
Swanson, P. N.; Gulkis, S.; Kulper, T. B. H.; Kiya, M.
1983-01-01
The history and background of the Large Deployable Reflector (LDR) are reviewed. The results of the June 1982 Asilomar (CA) workshop are incorporated into the LDR science objectives and telescope concept. The areas where the LDR may have the greatest scientific impact are in the study of star formation and planetary systems in our own and nearby galaxies and in cosmological studies of the structure and evolution of the early universe. The observational requirements for these and other scientific studies give rise to a set of telescope functional requirements. These, in turn, are satisfied by an LDR configuration which is a Cassegrain design with a 20 m diameter, actively controlled, segmented, primary reflector, diffraction limited at a wavelength of 30 to 50 microns. Technical challenges in the LDR development include construction of high tolerance mirror segments, surface figure measurement, figure control, vibration control, pointing, cryogenics, and coherent detectors. Project status and future plans for the LDR are discussed.
Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community
NASA Astrophysics Data System (ADS)
Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.
2016-12-01
The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less about building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in the domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons over the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community. Over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate each will bring its own unique challenges.
NASA Astrophysics Data System (ADS)
McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.
2007-12-01
One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. 
They also receive support via Elluminate, a Web-conferencing software program. The software allows participants to see the facilitator's computer as the analysis techniques of an EET chapter are demonstrated. If needed, the facilitator can also view individual participant's computers, assisting with technical difficulties. In addition, it enables a large number of end users, often widely distributed, to engage in interactive, real-time instruction. In this presentation, we will describe the elements of an EET Workshop pair, highlighting the capabilities and use of Elluminate. We will share lessons learned through several years of conducting this type of professional development. We will also share findings from survey data gathered from teachers who have participated in our workshops.
NASA Astrophysics Data System (ADS)
Lyons, Renee
Educational programs created to provide opportunities for all in reality often reflect social inequalities. Such is the case for Public Participation in Scientific Research (PPSR) projects. PPSR projects have been proposed as an effective way to engage more diverse audiences in science, yet the demographics of PPSR participants do not correspond with the demographic makeup of the United States. The field of PPSR as a whole has struggled to recruit low-SES and underrepresented populations to participate in project research efforts. This research study explores factors that may affect an underrepresented community's willingness to engage in scientific research, and provides advice on how to overcome these barriers from PPSR project leaders in the field who have been able to engage underrepresented communities in scientific research. Finally, the study investigates the theoretical construct of a Third Space within a PPSR project. The research-based recommendations for PPSR projects desiring to initiate and sustain research partnerships with underrepresented communities align well with the theoretical construct of a Third Space. This study examines a specific scientific research partnership between an underrepresented community and scientific researchers to determine if and to what extent a Third Space was created. Using qualitative methods to understand the interactions and processes involved in initiating and sustaining a scientific research partnership, this study provides advice on how PPSR research partnerships can engage underrepresented communities in scientific research. Study results show that inequality and mistrust of powerful institutions stood as participation barriers for underrepresented community members.
Despite these barriers, PPSR project leaders suggest that they can be confronted through open dialogue with communities about the abuse and alienation they have faced, by signaling respect for the community, and by entering the community through someone the community already trusts. Finally, although many of the principles of a Third Space align well with the larger level of activity that existed in the PPSR project examined in this study, the findings challenge others to critically examine assumptions behind the idea of a Third Space in PPSR and urge other PPSR project leaders towards a transformed view of science.
Innovative Tools for Scientific and Technological Education in Italian Secondary Schools
ERIC Educational Resources Information Center
Santucci, Annalisa; Mini, Roberta; Ferro, Elisa; Martelli, Paola; Trabalzini, Lorenza
2004-01-01
This paper describes the project "Biotech a Scuola" ("Biotech at School"), financed by the Italian Ministry of Education within the SeT program (Special Project for Scientific-Technological Education). The project involved the University of Siena, five senior and junior secondary schools in the Siena area, and a private…
NASA Technical Reports Server (NTRS)
Stuchlik, David W.; Lanzi, Raymond J.
2017-01-01
The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF), part of the Goddard Space Flight Center (GSFC), has developed a unique pointing control system for instruments aboard scientific balloon gondolas. The ability to point large telescopes and instruments with arc-second accuracy and stability is highly desired by multiple scientific disciplines, such as Planetary, Earth Science, Heliospheric and Astrophysics, and the availability of a standardized system supplied by NASA alleviates the need for the science user to develop and provide their own system. In addition to the pointing control system, a star tracker has been developed with both daytime and nighttime capability to augment the WASP and provide an absolute pointing reference. The WASP Project has successfully completed five test flights and one operational science mission, and is currently supporting an additional test flight in 2017, along with three science missions with flights scheduled between 2018 and 2020. The WASP system has demonstrated precision pointing and high reliability, and is available to support scientific balloon missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis; /Fermilab; Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip
2015-05-01
Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity-the oral microbiome-by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and postsurvey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.
Lyman L. McDonald; Robert Bilby; Peter A. Bisson; Charles C. Coutant; John M. Epifanio; Daniel Goodman; Susan Hanna; Nancy Huntly; Erik Merrill; Brian Riddell; William Liss; Eric J. Loudenslager; David P. Philipp; William Smoker; Richard R. Whitney; Richard N. Williams
2007-01-01
The year 2006 marked two milestones in the Columbia River Basin and the Pacific Northwest region's efforts to rebuild its once great salmon and steelhead runs: the 25th anniversary of the creation of the Northwest Power and Conservation Council and the 10th anniversary of an amendment to the Northwest Power Act that formalized scientific peer review of the council...
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
This final report on computational methods and software systems for dynamics and control of large space structures covers progress to date, projected developments in the final months of the grant, and conclusions. Pertinent reports and papers that have not appeared in scientific journals (or have not yet appeared in final form) are enclosed. The grant has supported research in two key areas of crucial importance to the computer-based simulation of large space structure. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area, as reported here, involves massively parallel computers.
iCollections - Digitising the British and Irish Butterflies in the Natural History Museum, London.
Paterson, Gordon; Albuquerque, Sara; Blagoderov, Vladimir; Brooks, Stephen; Cafferty, Steve; Cane, Elisa; Carter, Victoria; Chainey, John; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Duffell, Liz; Hine, Adrian; Honey, Martin; Huertas, Blanca; Howard, Theresa; Huxley, Rob; Kitching, Ian; Ledger, Sophie; McLaughlin, Caitlin; Martin, Geoff; Mazzetta, Gerardo; Penn, Malcolm; Perera, Jasmin; Sadka, Mike; Scialabba, Elisabetta; Self, Angela; Siebert, Darrell J; Sleep, Chris; Toloni, Flavia; Wing, Peter
2016-01-01
The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme has been to undertake a series of pilot projects that develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects, iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies: 181,545 specimens representing 89 species from Britain and Ireland. The data digitised include species name, georeferenced location, collector, and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. This paper explains the way the data were obtained and the background to the collections which made up the project. Specimen-level data associated with British and Irish butterfly specimens have not been available before, and the iCollections project has released this valuable resource through the NHM data portal.
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system, and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.
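The abstract above describes scheduling a quarter-million dependent jobs onto heterogeneous resources. At its core, such a workflow is a directed acyclic graph executed in dependency order; the following toy sketch illustrates that idea with Python's standard-library `graphlib` (it is not the actual SCEC/CyberShake tooling, and the job names are invented for illustration).

```python
# Toy workflow scheduling sketch: a DAG of dependent jobs, executed in
# dependency order via a topological sort (stdlib graphlib, Python 3.9+).
from graphlib import TopologicalSorter

# Hypothetical miniature hazard-curve workflow; each job maps to the jobs
# it depends on. These names are invented, not CyberShake's.
workflow = {
    "fetch_velocity_model": [],
    "generate_ruptures": [],
    "compute_seismograms": ["fetch_velocity_model", "generate_ruptures"],
    "extract_peak_amplitudes": ["compute_seismograms"],
    "build_hazard_curve": ["extract_peak_amplitudes"],
}

ts = TopologicalSorter(workflow)
schedule = list(ts.static_order())  # one valid serial execution order
print(schedule)
```

A real workflow system additionally batches independent jobs (here, the first two) onto whatever compute and storage resources are available, but the ordering constraint it must respect is exactly this one.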
Accelerator Based Tools of Stockpile Stewardship
NASA Astrophysics Data System (ADS)
Seestrom, Susan
2017-01-01
The Manhattan Project had to solve difficult challenges in physics and materials science. During the cold war a large nuclear stockpile was developed. In both cases, the approach was largely empirical. Today that stockpile must be certified without nuclear testing, a task that becomes more difficult as the stockpile ages. I will discuss the role of modern accelerator based experiments, such as x-ray radiography, proton radiography, neutron and nuclear physics experiments, in stockpile stewardship. These new tools provide data of exceptional sensitivity and are answering questions about the stockpile, improving our scientific understanding, and providing validation for the computer simulations that are relied upon to certify today's stockpile.
ERIC Educational Resources Information Center
Alozie, Nonye M.; Moje, Elizabeth Birr; Krajcik, Joseph S.
2010-01-01
One goal of project-based science is to promote the development of scientific discourse communities in classrooms. Holding rich high school scientific discussions is challenging, especially when the demands of content and norms of high school science pose challenges to their enactment. There is little research on how high school teachers enact…
National Laboratory for Advanced Scientific Visualization at UNAM - Mexico
NASA Astrophysics Data System (ADS)
Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo
2016-04-01
In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing plays a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning many areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system Cave, the high-resolution parallel visualization system Powerwall, and the high-resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large room, 3.6 m wide, with images projected on the front, left, and right walls, as well as the floor. Specialized crystal eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization, such as geophysical, meteorological, climate, and ecological data.
The HPCC-ADA is a 1000+ core computing system, which offers parallel computing resources to applications that require large quantities of memory as well as large, fast parallel storage systems. The temperature of the entire system is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.
Three-dimensional imaging for large LArTPCs
NASA Astrophysics Data System (ADS)
Qian, X.; Zhang, C.; Viren, B.; Diwan, M.
2018-05-01
High-performance event reconstruction is critical for current and future massive liquid argon time projection chambers (LArTPCs) to realize their full scientific potential. LArTPCs with readout using wire planes provide a limited number of 2D projections. In general, without a pixel-type readout it is challenging to achieve unambiguous 3D event reconstruction. As a remedy, we present a novel 3D imaging method, Wire-Cell, which incorporates the charge and sparsity information in addition to the time and geometry through simple and robust mathematics. The resulting 3D image of ionization density provides an excellent starting point for further reconstruction and enables the true power of 3D tracking calorimetry in LArTPCs.
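The Wire-Cell abstract above notes that wire-plane readout yields only a limited set of projections, so 3D reconstruction is ambiguous without extra information such as charge and sparsity. The following toy numpy sketch (my illustration of the general tomography problem, not the actual Wire-Cell method) shows why: three 1D "wire" projections of a small 2D charge image give fewer independent constraints than there are pixels, so the measurements alone do not pin down the true sparse image.

```python
# Toy illustration of the underdetermined wire-readout problem (not the
# Wire-Cell algorithm itself): project a sparse 2D charge image along three
# wire directions, then solve the resulting linear system.
import numpy as np

n = 4                       # n x n image of ionization charge
x_true = np.zeros((n, n))
x_true[1, 2] = 5.0          # two sparse "hits"
x_true[3, 0] = 3.0

def projection_matrix(n):
    """Each row sums the image along one wire: rows, columns, anti-diagonals."""
    rows = []
    for i in range(n):                       # horizontal wires
        m = np.zeros((n, n)); m[i, :] = 1; rows.append(m.ravel())
    for j in range(n):                       # vertical wires
        m = np.zeros((n, n)); m[:, j] = 1; rows.append(m.ravel())
    for d in range(2 * n - 1):               # diagonal wires
        m = np.zeros((n, n))
        for i in range(n):
            j = d - i
            if 0 <= j < n:
                m[i, j] = 1
        rows.append(m.ravel())
    return np.array(rows)

A = projection_matrix(n)                     # 15 measurements, 16 unknowns
b = A @ x_true.ravel()                       # measured wire signals
x_min_norm, *_ = np.linalg.lstsq(A, b, rcond=None)

# The minimum-norm solution reproduces the measurements exactly, but A has a
# nontrivial null space, so it need not equal the true sparse image. Resolving
# that ambiguity is what the charge and sparsity information is for.
residual = np.linalg.norm(A @ x_min_norm - b)
rank = np.linalg.matrix_rank(A)
print(rank, A.shape)
```

The rank deficit (rank < 16 unknowns) is the toy analogue of the reconstruction ambiguity the paper addresses.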
ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it
NASA Astrophysics Data System (ADS)
Lecocq, Thomas; Megies, Tobias; Krischer, Lion; Sales de Andrade, Elliott; Barsch, Robert; Beyreuther, Moritz
2016-04-01
ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides:
* read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface,
* a comprehensive signal processing toolbox tuned to the needs of seismologists,
* integrated access to all large data centers, web services and databases, and
* convenient wrappers to third party codes like libmseed and evalresp.
Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often has to be translated to stable and production-ready environments. It furthermore offers many freely available high-quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community, with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it
NASA Astrophysics Data System (ADS)
Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; Beyreuther, M.
2015-12-01
ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, a comprehensive signal processing toolbox tuned to the needs of seismologists, integrated access to all large data centers, web services and databases, and convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often has to be translated to stable and production-ready environments. It furthermore offers many freely available high-quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community, with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
NASA Astrophysics Data System (ADS)
Price, Aaron
2012-01-01
Citizen science projects offer opportunities for non-scientists to take part in scientific research. While their contribution to scientific data collection has been well documented, there is limited research on changes that may occur in the volunteer participants themselves. In this study, we investigated (1) how volunteers' attitudes towards science and beliefs in the nature of science changed over six months of participation in an astronomy-themed citizen science project and (2) how the level of project participation accounted for these changes. To measure attitudes towards science and beliefs about the nature of science, identical pre- and post-tests were used. We used pre-test data from 1,375 participants and post-test data collected from 175 participants. Responses were analyzed using the Rasch Rating Scale Model. The pre-test sample was used to create the Rasch scales for the two scientific literacy measures. For the pre/post-test comparisons, data from those who completed both tests were used. Fourteen participants who took the pre/post-tests were interviewed. Results show that overall scientific attitudes did not change, p = .812. However, we did find significant changes related to two scientific attitude items about science in the news (positive change; p < .001, p < .05) and one related to scientific self-efficacy (negative change, p < .05). These changes were related to the participants' social activity in the project. Beliefs in the nature of science significantly increased between the pre- and post-tests, p = .014. Relative positioning of individual items on the belief scale did not change much, and this change was not related to any of our recorded project activity variables. The interviews suggest that the social aspect of the project is important to participants and that the change in self-efficacy is not due to a lowering of esteem but rather a greater appreciation for what they have yet to learn.
NASA Astrophysics Data System (ADS)
Simila, G.; McNally, K.; Quintero, R.; Segura, J.
2006-12-01
The seismic strong motion array project (SSMAP) for the Nicoya Peninsula in northwestern Costa Rica is composed of 10-13 sites including Geotech A900/A800 accelerographs (three-component), Ref-Teks (three-component velocity), and Kinemetric Episensors. The main objectives of the array are to: 1) record and locate strong subduction zone mainshocks [and foreshocks, "early aftershocks", and preshocks] in the Nicoya Peninsula, at the entrance of the Nicoya Gulf, and in the Papagayo Gulf regions of Costa Rica, and 2) record and locate any moderate to strong upper plate earthquakes triggered by a large subduction zone earthquake in the above regions. Our digital accelerograph array has been deployed as part of our ongoing research on large earthquakes in conjunction with the Earthquake and Volcano Observatory (OVSICORI) at the Universidad Nacional in Costa Rica. The country-wide seismographic network has been operating continuously since the 1980s, with the first earthquake bulletin published more than 20 years ago, in 1984. The recording of seismicity and strong motion data for large earthquakes along the Middle America Trench (MAT) has been a major research priority over these years, and this network spans nearly half the time of a "repeat cycle" (~50 years) for large (Ms 7.5-7.7) earthquakes beneath the Nicoya Peninsula, the last such event occurring in 1950. Our long-time co-collaborators include the seismology group at OVSICORI, with coordination for this project by Dr. Ronnie Quintero and Mr. Juan Segura. Numerous international investigators are also studying this region with GPS and seismic stations (US, Japan, Germany, Switzerland, etc.). In addition, various strong motion instruments are operated by local engineers for building purposes, mainly concentrated in the population centers of the Central Valley.
The major goal of our project is to contribute unique scientific information pertaining to a large subduction zone earthquake and its related seismic activity when the next large earthquake occurs in Nicoya. A centralized database will be created within the main seismic network files at OVSICORI, with local personnel working in teams responsible for collecting data within 3 days following a large mainshock.
NASA Astrophysics Data System (ADS)
Simila, G.; Lafromboise, E.; McNally, K.; Quintereo, R.; Segura, J.
2007-12-01
The seismic strong motion array project (SSMAP) for the Nicoya Peninsula in northwestern Costa Rica is composed of 10-13 sites including Geotech A900/A800 accelerographs (three-component), Ref-Teks (three-component velocity), and Kinemetric Episensors. The main objectives of the array are to: 1) record and locate strong subduction zone mainshocks [and foreshocks, "early aftershocks", and preshocks] in the Nicoya Peninsula, at the entrance of the Nicoya Gulf, and in the Papagayo Gulf regions of Costa Rica, and 2) record and locate any moderate to strong upper plate earthquakes triggered by a large subduction zone earthquake in the above regions. Our digital accelerograph array has been deployed as part of our ongoing research on large earthquakes in conjunction with the Earthquake and Volcano Observatory (OVSICORI) at the Universidad Nacional in Costa Rica. The country-wide seismographic network has been operating continuously since the 1980s, with the first earthquake bulletin published more than 20 years ago, in 1984. The recording of seismicity and strong motion data for large earthquakes along the Middle America Trench (MAT) has been a major research priority over these years, and this network spans nearly half the time of a "repeat cycle" (~50 years) for large (Ms ~7.5-7.7) earthquakes beneath the Nicoya Peninsula, the last such event occurring in 1950. Our long-time co-collaborators include the seismology group at OVSICORI, with coordination for this project by Dr. Ronnie Quintero and Mr. Juan Segura. The major goal of our project is to contribute unique scientific information pertaining to a large subduction zone earthquake and its related seismic activity when the next large earthquake occurs in Nicoya. We are now collecting a database of strong motion records for moderate-sized events to document this last stage prior to the next large earthquake.
A recent event (08/18/06; M=4.3) located 20 km northwest of Samara was recorded by two stations (Playa Carrillo and Nicoya) at distances of 25-30 km with maximum acceleration of 0.2g.
Methods for structuring scientific knowledge from many areas related to aging research.
Zhavoronkov, Alex; Cantor, Charles R
2011-01-01
Aging and age-related disease represents a substantial share of current natural, social, and behavioral science research efforts. Presently, no centralized system exists for tracking aging research projects across the numerous research disciplines involved. The multidisciplinary nature of this research complicates the understanding of underlying project categories, the establishment of project relations, and the development of a unified project classification scheme. We have developed a highly visual database, the International Aging Research Portfolio (IARP), available at AgingPortfolio.org, to address this issue. The database integrates information on research grants, peer-reviewed publications, and issued patent applications from multiple sources. Additionally, the database uses flexible project classification mechanisms and tools for analyzing project associations and trends. This system enables scientists to search the centralized project database, to classify and categorize aging projects, and to analyze funding across multiple research disciplines. The IARP is designed to improve the allocation and prioritization of scarce research funding, to reduce project overlap, and to improve scientific collaboration, thereby accelerating scientific and medical progress in a rapidly growing area of research. Grant applications often precede publications, and some grants do not result in publications; the system therefore offers an earlier and broader view of research activity across many disciplines. This project is a first attempt to provide a centralized database system for research grants and to categorize aging research projects into multiple subcategories utilizing both advanced machine algorithms and a hierarchical environment for scientific collaboration.
NASA Astrophysics Data System (ADS)
Guernion, Muriel; Hoeffner, Kevin; Guillocheau, Sarah; Hotte, Hoël; Cylly, Daniel; Piron, Denis; Cluzeau, Daniel; Hervé, Morgane; Nicolai, Annegret; Pérès, Guénola
2017-04-01
Scientists have become more and more interested in earthworms because of their impact on soil functioning and their importance in the provision of many ecosystem services. To improve knowledge on soil biodiversity and integrate earthworms into soil quality diagnostics, it appeared necessary to gather a large amount of data on their distribution. Since 2011, the University of Rennes 1 has developed a collaborative science project called Observatoire Participatif des Vers de Terre (OPVT, participative earthworm observatory). It has several purposes: i) to offer a simple tool for soil biodiversity evaluation in natural and anthropic soils through earthworm assessment; ii) to offer training to farmers, territory managers, gardeners, and pupils on soil ecology; iii) to build a database of reference values on earthworms in different habitats; iv) to propose a website (https://ecobiosoil.univ-rennes1.fr/OPVT_accueil.php) providing, for example, general scientific background (earthworm ecology and impacts of soil management), sampling protocols, and online visualization of results (data processing and earthworm mapping). Up to now, more than 5000 plots have been prospected since the opening of the project in 2011. Initially available to anyone on a voluntary basis, this project is also used by the French Ministry of Agriculture to carry out a scientific survey throughout the French territory.
Encouraging Data Use in the Classroom-DLESE Workshop Evaluation Results
NASA Astrophysics Data System (ADS)
Lynds, S. E.; Buhr, S. M.; Ledley, T. S.
2005-12-01
For the last two years, the Data Services Team of the Digital Library for Earth Systems Education (DLESE) has offered annual workshops, bringing scientists, technology specialists, and education professionals together to develop ways of using scientific data in education. Teams composed of representatives from each of five professional roles (scientist, curriculum developer, data provider, teacher, tool developer) worked on developing online educational units of the Earth Exploration Toolbook (EET--http://serc.carleton.edu/eet/). Workshop evaluation projects elicited a large amount of feedback from participants at both workshops. Consistently, attendees most highly valued the opportunity to network with those in other professional roles and to collaborate on a real-world education project. Technology and science specialists emphasized their desire for a greater understanding of practical applications for scientific data in the classroom and of what educators need for successful curricula. The evaluation project also revealed similarities in the limitations that many attendees reported in using online data: technological barriers such as data format, bandwidth limitations, and proprietary data were mentioned by participants regardless of professional role. This talk will discuss the barriers to and advantages of collaborations between scientists, technology specialists, and educators, and the potential for this format to produce data-rich curriculum elements.
Project Icarus: Stakeholder Scenarios for an Interstellar Exploration Program
NASA Astrophysics Data System (ADS)
Hein, A. M.; Tziolas, A. C.; Osborne, R.
The Project Icarus Study Group's objective is to design a mainly fusion-propelled interstellar probe. The starting point is the results of the Daedalus study, conducted by the British Interplanetary Society during the 1970s. As the Daedalus study already indicated, interstellar probes will be the result of a large-scale, decades-long development program. To sustain a program over such long periods, the commitment of key stakeholders is vital. Although previous publications identified political and societal preconditions for an interstellar exploration program, more specific scientific and political stakeholder scenarios are lacking. This paper develops stakeholder scenarios that allow for a more detailed sustainability assessment of future programs. For this purpose, key stakeholder groups and their needs are identified, and scientific and political scenarios are derived. Political scenarios are based on patterns of past space programs, but unprecedented scenarios are considered as well. Although it is very difficult to sustain an interstellar exploration program, there are scenarios in which this seems possible, e.g. the discovery of life within the solar system or on an exoplanet, a global technology development program, and dual-use of technologies for defence and security purposes. This is a submission of the Project Icarus Study Group.
Project management for complex ground-based instruments: MEGARA plan
NASA Astrophysics Data System (ADS)
García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge
2014-08-01
The project management of complex instruments for ground-based large telescopes is a challenge in itself. Good management is key to project success in terms of performance, schedule, and budget. Being on time has become a strict requirement for two reasons: to assure arrival at the telescope, given the pressure of demand for new instrumentation at these first world-class telescopes, and to avoid cost overruns. The budget and cash flow are not always as expected and have to be properly handled across administrative departments at funding centers distributed worldwide. The complexity of the organizations, the technological and scientific return to the Consortium partners, and the participation in the project of all kinds of professional centers working in astronomical instrumentation (universities, research centers, small and large private companies, workshops, providers, etc.) make the project management strategy, and the tools and procedures tuned to the project needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument of the 10.4m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.
ERIC Educational Resources Information Center
Baker, Dale R.; Lewis, Elizabeth B.; Purzer, Senay; Watts, Nievita Bueno; Perkins, Gita; Uysal, Sibel; Wong, Sissy; Beard, Rachelle; Lang, Michael
2009-01-01
This study reports on the context and impact of the Communication in Science Inquiry Project (CISIP) professional development to promote teachers' and students' scientific literacy through the creation of science classroom discourse communities. The theoretical underpinnings of the professional development model are presented and key professional…
Investigation the Scientific Creativity of Gifted Students through Project-Based Activities
ERIC Educational Resources Information Center
Karademir, Ersin
2016-01-01
In this research, it is aimed to identify the scientific creativity of gifted students through project-based activities. In accordance with this purpose, a study has been carried out with 13 gifted students studying in third and fifth grade. In the study, students have been informed about the project development stages and they have been asked…
Paper to Plastics: An Interdisciplinary Summer Outreach Project in Sustainability
ERIC Educational Resources Information Center
Tamburini, Fiona; Kelly, Thomas; Weerapana, Eranthie; Byers, Jeffery A.
2014-01-01
Paper to Plastics (P2P) is an interdisciplinary program that combines chemistry and biology in a research setting. The goal of this project is 2-fold: to engage students in scientific research and to educate them about sustainability and biodegradable materials. The scientific aim of the project is to recycle unwanted office paper to the useful…
ERIC Educational Resources Information Center
Boss, Suzie
2002-01-01
Idaho secondary students learn the scientific method through outdoor environmental projects related to water quality monitoring. A program trains teachers to design project-based learning and provides extensive followup support. Five-day summer workshops immerse teachers in the types of projects they will orchestrate with their own students.…
The Great War as a Crucial Point in the History of Russian Science and Technology.
Saprykin, Dmitry L
2016-01-01
The paper is devoted to one of the most important and, at the same time, relatively unexplored phases in the history of Russian science and technology. The Great War coincided with the beginning of a heyday in science, engineering education, and technology in Russia. It was precisely the time in which Russia's era of "Big Science" was emerging. Many Russian and Soviet technical projects and scientific schools were rooted in the time of the Great War. The "engineerization" of science and a "physical-technical" way of thinking had already begun before the war. But it was precisely the war which encouraged a large proportion of the Russian academic community to take part in industrial projects. Academics also played a significant role in developing concepts and implementing strategic plans during the Great War. This article also discusses how the organization of science and the academic community was transformed during, and after, the Great War. And it looks at the impact that the war had on Russia's participation in the international scientific community.
[Earth and Space Sciences Project Services for NASA HPCC
NASA Technical Reports Server (NTRS)
Merkey, Phillip
2002-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.
[Earth Science Technology Office's Computational Technologies Project
NASA Technical Reports Server (NTRS)
Fischer, James (Technical Monitor); Merkey, Phillip
2005-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.
Western Pacific emergent constraint lowers projected increase in Indian summer monsoon rainfall
NASA Astrophysics Data System (ADS)
Li, Gen; Xie, Shang-Ping; He, Chao; Chen, Zesheng
2017-10-01
The agrarian-based socioeconomic livelihood of densely populated South Asian countries is vulnerable to modest changes in Indian summer monsoon (ISM) rainfall. How the ISM rainfall will evolve is a question of broad scientific and socioeconomic importance. In response to increased greenhouse gas (GHG) forcing, climate models commonly project an increase in ISM rainfall. This wetter ISM projection, however, does not account for large model errors in both the mean state and the ocean warming pattern. Here we identify a relationship between biases in the simulated present climate and future ISM projections in a multi-model ensemble: models with excessive present-day precipitation over the tropical western Pacific tend to project a larger increase in ISM rainfall under GHG forcing because of too strong a negative cloud-radiation feedback on sea surface temperature. The excessive negative feedback suppresses the local ocean surface warming, strengthening ISM rainfall projections via atmospheric circulation. We calibrate the ISM rainfall projections using this 'present-future relationship' and observed western Pacific precipitation. The correction reduces the projected rainfall increase over the broad ISM region by about 50%. Our study identifies an improved simulation of western Pacific convection as a priority for reliable ISM projections.
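The calibration described above is an emergent constraint: regress the projected change on a present-day observable across the model ensemble, then read the regression off at the observed value. A minimal numpy sketch of that procedure follows; all numbers are synthetic and invented for illustration, not the paper's CMIP data.

```python
# Emergent-constraint calibration sketch with a synthetic, noise-free ensemble.
import numpy as np

rng = np.random.default_rng(0)
n_models = 20
# Hypothetical present-day western Pacific precipitation per model (mm/day).
x = rng.uniform(4.0, 10.0, n_models)
# Assume projected ISM rainfall change (%) is linear in x across models.
slope_true, intercept_true = 0.8, -2.0
y = slope_true * x + intercept_true

slope, intercept = np.polyfit(x, y, 1)  # across-model regression
x_obs = 5.0                             # hypothetical observed value
y_raw = y.mean()                        # uncalibrated multi-model mean
y_calibrated = slope * x_obs + intercept

# If models are on average too wet over the western Pacific (x.mean() > x_obs)
# and the slope is positive, the calibrated projection falls below the raw
# ensemble mean, mirroring the paper's ~50% downward correction.
print(round(y_raw, 2), round(y_calibrated, 2))
```

Real applications also propagate regression and observational uncertainty into the calibrated estimate; this sketch shows only the central-value correction.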
Extremely Large Telescope Project Selected in ESFRI Roadmap
NASA Astrophysics Data System (ADS)
2006-10-01
In its first Roadmap, the European Strategy Forum on Research Infrastructures (ESFRI) chose the European Extremely Large Telescope (ELT), for which ESO is presently developing a Reference Design, as one of the large-scale projects to be conducted in astronomy, and the only one in optical astronomy. The aim of the ELT project is to build, before the end of the next decade, an optical/near-infrared telescope with a diameter in the 30-60m range. (ESO PR Photo 40/06) The ESFRI Roadmap states: "Extremely Large Telescopes are seen world-wide as one of the highest priorities in ground-based astronomy. They will vastly advance astrophysical knowledge allowing detailed studies of inter alia planets around other stars, the first objects in the Universe, super-massive Black Holes, and the nature and distribution of the Dark Matter and Dark Energy which dominate the Universe. The European Extremely Large Telescope project will maintain and reinforce Europe's position at the forefront of astrophysical research." Said Catherine Cesarsky, Director General of ESO: "In 2004, the ESO Council mandated ESO to play a leading role in the development of an ELT for Europe's astronomers. To that end, ESO has undertaken conceptual studies for ELTs and is currently also leading a consortium of European institutes engaged in studying enabling technologies for such a telescope. The inclusion of the ELT in the ESFRI roadmap, together with the comprehensive preparatory work already done, paves the way for the next phase of this exciting project, the design phase." ESO is currently working, in close collaboration with the European astronomical community and industry, on a baseline design for an Extremely Large Telescope. The plan is a telescope with a primary mirror between 30 and 60 metres in diameter and a financial envelope of about 750 million euros.
It aims at more than a factor of ten improvement in overall performance compared to the current leader in ground-based astronomy: the ESO Very Large Telescope at the Paranal Observatory. The draft Baseline Reference Design will be presented to the wider scientific community on 29-30 November 2006 at a dedicated ELT Workshop Meeting in Marseille (France) and will be further refined. The design is then to be presented to the ESO Council at the end of 2006. The goal is to start the detailed E-ELT design work by the first half of 2007. Launched in April 2002, the European Strategy Forum on Research Infrastructures was set up following a recommendation of the European Union Council, with the role of supporting a coherent approach to policy-making on research infrastructures in Europe, and of acting as an incubator for international negotiations about concrete initiatives. In particular, ESFRI has prepared a European Roadmap identifying new Research Infrastructures of pan-European interest corresponding to the long-term needs of the European research communities, covering all scientific areas, regardless of possible location, and likely to be realised in the next 10 to 20 years. The Roadmap was presented on 19 October. It is the result of an intensive two-year consultation and peer review process involving over 1000 high-level European and international experts. The Roadmap identifies 35 large-scale infrastructure projects, at various stages of development, in seven key research areas: Environmental Sciences; Energy; Materials Sciences; Astrophysics, Astronomy, Particle and Nuclear Physics; Biomedical and Life Sciences; Social Sciences and the Humanities; and Computation and Data Treatment.
NASA Technical Reports Server (NTRS)
Vonzahn, U.
1989-01-01
The project Winter in Northern Europe (WINE) of the international Middle Atmosphere Program (MAP) comprised a multinational study of the structure, dynamics and composition of the middle atmosphere in winter at high latitudes. Coordinated field measurements were performed during the winter of 1983 to 1984 by a large number of ground-based, airborne, rocket-borne and satellite-borne instruments. Many of the individual experiments were performed in the European sector of the high-latitude and polar atmosphere. Studies of the stratosphere were, in addition, expanded to hemispheric scales by the use of data obtained from remote-sensing satellites. Beyond its direct scientific results, which are reviewed, MAP/WINE has stimulated quite a number of follow-on experiments and projects which address the aeronomy of the middle atmosphere at high and polar latitudes.
Assessing risk from a stakeholder perspective
NASA Technical Reports Server (NTRS)
Cooper, L. P.
2003-01-01
Planetary exploration missions are subject to a vast array of interpretations of 'success' based on the concerns of multiple stakeholder groups. While project risk management generally focuses on issues of cost/schedule constraints or reliability issues, a broader interpretation of 'risk' as it applies to stakeholders such as sponsors (e.g., NASA), the public at large, the scientific community, the home organization, and the project team itself can provide important insights into the full spectrum of risk that needs to be managed. This paper presents a stakeholder view of risk which is divided into failure, not-a-failure, success, and stunning-success zones. Using the Mars Pathfinder mission as an example, an alternative interpretation of the risks to that mission is presented from the view of key stakeholders. The implications of the stakeholder perspective to project risk management are addressed.
Devi, V; Abraham, R R; Adiga, A; Ramnarayan, K; Kamath, A
2010-01-01
Healthcare decision-making is largely reliant on evidence-based medicine; building skills in scientific reasoning and thinking among medical students therefore becomes an important part of medical education. Medical students in India have no formal path to becoming physician-scientists or academicians. This study examines students' perceptions regarding the improvement of their research skills after participating in the Mentored Student Project programme at Melaka Manipal Medical College, Manipal Campus, India. Additionally, this paper describes the initiatives taken for the continual improvement of the Mentored Student Project programme based on faculty and student perspectives. At Melaka Manipal Medical College, the Mentored Student Project was implemented in the curriculum during the second year of the Bachelor of Medicine and Bachelor of Surgery programme with the intention of developing research skills essential to the career development of medical students. The study design was cross-sectional. To inculcate the spirit of teamwork, students were grouped (n=3 to 5) and each group was asked to select a research project. The students' research projects were guided by their mentors. A questionnaire (five-point Likert scale) on students' perceptions regarding improvement in research skills after undertaking the projects, and on the guidance received from their mentors, was administered to medical students after they had completed their Mentored Student Project. The responses of students were summarised using percentages. The median grade with inter-quartile range was reported for each item in the questionnaire. The median grade for all the items related to perceptions regarding improvement in research skills was 4, which reflects that the majority of the students felt that the Mentored Student Project had improved their research skills. The problems encountered by the students during the Mentored Student Project were related to time management and to their mentors.
This study shows that students acknowledged that their research skills were improved after participating in the Mentored Student Project programme. The Mentored Student Project programme was successful in fostering positive attitudes among medical students towards scientific research. The present study also provides scope for further improvement of the Mentored Student Project programme based on students' and faculty perspectives.
NASA Astrophysics Data System (ADS)
Rauser, Florian; Vamborg, Freja
2016-04-01
The interdisciplinary project on High Definition Clouds and Precipitation for advancing climate prediction, HD(CP)2 (hdcp2.eu), is an example of the trend in fundamental research in Europe to focus increasingly on large national and international research programs that require strong scientific coordination. The current system has traditionally been host-based: project coordination activities and funding are placed at the host institute of the central lead PI of the project. This approach is simple and has the advantage of strong collaboration between project coordinator and lead PI, while exhibiting several strong, inherent disadvantages that are also mentioned in this session's description: no development of community best practices, lack of integration between similar projects, inefficient methodology development and usage, and finally poor career development opportunities for the coordinators. Project coordinators often leave the project before it is finalized, leaving some of the fundamentally important closing processes to the PIs. This systematically prevents the creation of professional science management expertise within academia, which leads to an imbalance that prevents the outcomes of large research programs from informing future funding decisions. Project coordinators in academia often do not work in a professional project office environment that could distribute activities and use professional tools and methods across different projects. Instead, every new project manager has to start methodological work anew (communication infrastructure, meetings, reporting), even though the technological needs of large research projects are similar. This decreases the efficiency of the coordination and leads to funding that is effectively misallocated. We propose to challenge this system by creating a permanent, virtual "Centre for Earth System Science Management CESSMA" (cessma.com), and changing the approach from host-based to centre-based.
This should complement the current system, by creating permanent, sustained options for interactions between large research projects in similar fields. In the long run such a centre might improve on the host-based system because the centre-based solution allows multiple projects to be coordinated in conjunction by experienced science managers, using overlap in meeting organization, reporting, infrastructure, travel and so on. To still maintain close cooperation between project managers and lead PIs, we envision a virtual centre that creates extensive collaborative opportunities by organizing yearly retreats, a shared technical data base, et cetera. As "CESSMA" is work in progress (we have applied for funding for 2016-18), we would like to use this opportunity to discuss chances, potential problems, experiences and options for this attempt to institutionalise the very reason for this session: improved, coordinated, effective science coordination; and to create a central focal point for public / academia interactions.
ERIC Educational Resources Information Center
Ibe, Mary; Deutscher, Rebecca
This study investigated the effects on student scientific efficacy after participation in the Goldstone Apple Valley Radio Telescope (GAVRT) project. In the GAVRT program, students use computers to record extremely faint radio waves collected by the telescope and analyze real data. Scientific efficacy is a type of self-knowledge a person uses to…
The role of fluctuations and interactions in pedestrian dynamics
NASA Astrophysics Data System (ADS)
Corbetta, Alessandro; Meeusen, Jasper; Benzi, Roberto; Lee, Chung-Min; Toschi, Federico
Understanding quantitatively the statistical behaviour of pedestrians walking in crowds is a major scientific challenge of paramount societal relevance. Walking humans exhibit a rich (stochastic) dynamics whose small and large deviations are driven, among other factors, by their own will as well as by environmental conditions. Via 24/7 automatic pedestrian tracking from multiple overhead Microsoft Kinect depth sensors, we collected large ensembles of pedestrian trajectories (on the order of tens of millions) in different real-life scenarios. These scenarios include both narrow corridors and large urban hallways, enabling us to cover and compare a wide spectrum of typical pedestrian dynamics. We investigate the pedestrian motion by measuring probability density functions (PDFs), e.g. those of position, velocity and acceleration, at unprecedentedly high statistical resolution. We consider the dependence of the PDFs on flow conditions, focusing on diluted dynamics and pair-wise interactions ("collisions") for mutual avoidance. By means of Langevin-like models we model the measured data, including typical fluctuations and rare events. This work is part of the JSTP research programme "Vision driven visitor behaviour analysis and crowd management" with Project Number 341-10-001, which is financed by the Netherlands Organisation for Scientific Research (NWO).
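As an illustration of the kind of Langevin-like description mentioned in this abstract, a minimal sketch (with entirely hypothetical parameter values, not the measured ones) models a pedestrian's walking speed as relaxation toward a preferred speed plus Gaussian noise, integrated with the Euler-Maruyama scheme:

```python
# Minimal, assumption-laden Langevin-like model of a pedestrian's walking
# speed u: deterministic relaxation toward a preferred speed u_p plus
# Gaussian white noise. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
u_p, tau, sigma = 1.3, 0.5, 0.25   # preferred speed (m/s), relaxation time (s), noise
dt, n_steps = 0.01, 20000          # time step (s), number of steps

u = np.empty(n_steps)
u[0] = 0.0                          # pedestrian starts from rest
for t in range(1, n_steps):
    drift = -(u[t - 1] - u_p) / tau                 # pull toward preferred speed
    noise = sigma * np.sqrt(dt) * rng.standard_normal()
    u[t] = u[t - 1] + drift * dt + noise            # Euler-Maruyama update

# stationary statistics: mean near u_p, std near sqrt(sigma**2 * tau / 2)
print(u[5000:].mean(), u[5000:].std())
```

This is the Ornstein-Uhlenbeck special case; the models in the work above are fitted to measured PDFs and also capture pair-wise avoidance, which this sketch omits.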
Enhancing the Environmental Legacy of the International Polar Year 2007- 2008
NASA Astrophysics Data System (ADS)
Tin, T.; Roura, R.; Perrault, M.
2006-12-01
The International Geophysical Year (IGY) left a legacy of peace and international cooperation in the form of the 1959 Antarctic Treaty. Since the IGY, the 1991 Protocol on Environmental Protection to the Antarctic Treaty was signed and entered into force. The Protocol establishes that the protection of the environment and the wilderness values of Antarctica "shall be fundamental considerations in the planning and conduct of all activities in the Antarctic Treaty area". Fifty years on, the IPY 2007-08 can, in turn, leave behind a positive environmental legacy - one where the sharing of facilities and logistics is encouraged, the human footprint in Antarctica is minimized, and a future generation of environmentally aware scientists, logisticians and visitors is fostered. Based on an analysis of all Expressions of Interest submitted to the IPY, we found that about three-quarters of IPY's Antarctic projects plan to have fieldwork components. About one-third of these field projects expect to leave physical infrastructure in Antarctica. A number of projects plan to develop large-scale infrastructure, such as stations and observatories, in hitherto pristine areas. Fewer than one percent of Antarctic field projects address the issue of their environmental legacy: four projects indicated that the site will be cleaned up or the equipment will be removed at the end of the project; two projects indicated that their results may be useful for the management of the Antarctic environment, e.g., in the control of invasive species or the setting up of marine protected areas. With the goal of increasing the environmental awareness of Antarctic field scientists, our contribution will review current research on the impacts of human activities - science, tourism, exploitation of marine resources and global climate change - on the Antarctic environment. A preliminary analysis of the cumulative impacts of IPY activities will be presented.
Case studies of scientific projects in Antarctica with a potentially positive environmental legacy will be highlighted, and suggestions of actions that could be taken to increase the environmental friendliness of scientific projects will be discussed.
Reengineering observatory operations for the time domain
NASA Astrophysics Data System (ADS)
Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.
2014-07-01
Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change, increasingly characterized by large time-domain surveys, robotic telescopes and automated infrastructures, and - most significantly - by operating modes and scientific consortia that span our individual facilities, joining them into complex networked entities. Observatories must adapt, and numerous initiatives are in progress that focus on redesigning individual components of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols, and the astroinformatics data-mining techniques layered on them, are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics face the national and international astronomical communities, it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.
Active space debris removal by using laser propulsion
NASA Astrophysics Data System (ADS)
Rezunkov, Yu. A.
2013-03-01
At present, a few projects on space debris removal using high-power lasers are under development. One of the established projects is ORION, proposed by Claude Phipps from Photonics Associates and supported by NASA (USA) [1]. But the technical feasibility of the concept is limited to small debris objects (from 1 to 10 cm) because of the small thrust impulse generated by laser ablation of the debris materials. At the same time, the removal of rocket upper stages and satellites that have reached the end of their lives has been carried out in only a very small number of cases, and most of them remain in Low Earth Orbit (LEO). To reduce the number of these large objects, the design of space systems capable of deorbiting upper rocket stages and removing large satellite remnants from economically and scientifically useful orbits to disposal orbits is considered. The suggested system is based on high-power laser propulsion. A Laser-Orbital Transfer Vehicle (LOTV) with the developed aerospace laser propulsion engine is considered as applied to the problem of mitigating man-made large-size space debris in LEO.
NASA Astrophysics Data System (ADS)
Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma
2017-04-01
High-energy solar energetic particles (SEPs) emitted from the Sun are a major space weather hazard, motivating the development of predictive capabilities. In this work, the current state of knowledge on the origin and forecasting of SEP events will be reviewed. Subsequently, we will present the EU HORIZON 2020 HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis) project, its structure, its main scientific objectives and operational forecasting tools, as well as its added value to SEP research from both the observational and the modelling perspectives. The project addresses, through multi-frequency observations and simulations, the chain of processes from particle acceleration in the corona, through particle transport in the magnetically complex corona and interplanetary space, to detection near 1 AU. Furthermore, publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters that can be compared with space-borne measurements at lower energies is provided for the first time by HESPERIA. In order to achieve these goals, HESPERIA is exploiting already available large datasets stored in databases such as the neutron monitor database (NMDB) and SEPServer, which were developed under EU FP7 projects from 2008 to 2013. Forecasting results of the two novel operational SEP forecasting tools published via the HESPERIA consortium server will be presented, as well as some key scientific results on the acceleration, transport and impact on Earth of high-energy particles. Acknowledgement: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324.
Haux, Reinhold; Kuballa, Stefanie; Schulze, Mareike; Böhm, Claudia; Gefeller, Olaf; Haaf, Jan; Henning, Peter; Mielke, Corinna; Niggemann, Florian; Schürg, Andrea; Bergemann, Dieter
2016-12-07
Based on today's information and communication technologies, the open access paradigm has become an important approach for adequately communicating new scientific knowledge. Our objectives are to summarize the present situation of journal transformation, to present criteria for adequate transformation as well as a specific approach for it, and to describe our exemplary implementation of such a journal transformation. To this end, we studied the respective literature and discussed the topic in various discussion groups and meetings (primarily of editors and publishers, but also of authors and readers), drawing on long-term experience as editors and/or publishers of scientific publications. There is a clear will, particularly among political and funding organizations, towards open access publishing. In spite of this, a large amount of scientific knowledge is still communicated through subscription-based journals. For successfully transforming such journals into open access, sixteen criteria for a goal-oriented, stepwise, sustainable, and fair transformation are suggested, and the Tandem Model is introduced as a transformation approach. Our exemplary implementation is the Trans-O-MIM project, which explores strategies, models and evaluation metrics for journal transformation. As an instance, the journal Methods of Information in Medicine will apply the Tandem Model from 2017 onwards. Within Trans-O-MIM we will reach at least nine of the sixteen criteria for adequate transformation. It was positive to implement Trans-O-MIM as an international research project. After the first steps in transforming Methods have been made successfully, challenges will remain, among others, in identifying appropriate incentives for open access publishing in order to support its transformation.
Golet, Gregory H; Roberts, Michael D; Larsen, Eric W; Luster, Ryan A; Unger, Ron; Werner, Gregg; White, Gregory G
2006-06-01
Studies have shown that ecological restoration projects are more likely to gain public support if they simultaneously increase important human services that natural resources provide to people. River restoration projects have the potential to influence many of the societal functions (e.g., flood control, water quality) that rivers provide, yet most projects fail to consider this in a comprehensive manner. Most river restoration projects also fail to take into account opportunities for revitalization of large-scale river processes, focusing instead on opportunities presented at individual parcels. In an effort to avoid these pitfalls while planning restoration of the Sacramento River, we conducted a set of coordinated studies to evaluate societal impacts of alternative restoration actions over a large geographic area. Our studies were designed to identify restoration actions that offer benefits to both society and the ecosystem and to meet the information needs of agency planning teams focusing on the area. We worked with local partners and public stakeholders to design and implement studies that assessed the effects of alternative restoration actions on flooding and erosion patterns, socioeconomics, cultural resources, and public access and recreation. We found that by explicitly and scientifically melding societal and ecosystem perspectives, it was possible to identify restoration actions that simultaneously improve both ecosystem health and the services (e.g., flood protection and recreation) that the Sacramento River and its floodplain provide to people. Further, we found that by directly engaging with local stakeholders to formulate, implement, and interpret the studies, we were able to develop a high level of trust that ultimately translated into widespread support for the project.
NASA Astrophysics Data System (ADS)
Sorensen, A. E.; Jordan, R.
2016-12-01
Recent literature has suggested public participatory research models (e.g., citizen science and similar) as a key opportunity for scientists to meaningfully engage and communicate with the public, to increase support for science, and to encourage pro-science behavior. In this, there has been an inherent assumption that all models of engagement yield similar participant results, with few examples of assessment of these programs. While many of these programs do share superficial similarities in their modes of participant engagement and participant motivation, there is a large disparity in participant engagement between them. This disparity suggests that the framing of these projects (e.g., citizen science versus crowdsourcing) also plays an important role in decisions about participation. Additionally, participant outcomes between these two project types, in terms of beliefs about scientific practices and scientific trust, have not yet been investigated. To investigate the impact of framing, participants were recruited to a web-based tree phenology public participatory research program in which half the participants were engaged in a program framed as citizen science and the other half in a project framed as crowdsourcing. The participants in each frame were engaged in the same task (reporting leaf budding/leaf drop), but the way the projects were framed differed. Post-participation, we see that there are indeed statistically significant differences in participant outcomes between individuals who participated as citizen scientists versus as crowdsourcers, particularly in terms of their views of science, identity, and trust of science. This work is, to the authors' knowledge, the first that aims to evaluate whether such projects can be treated synonymously when discussing their potential for public engagement and broader trust and literacy outcomes.
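A between-frame comparison of participant outcomes like the one described above could be analyzed, for example, with a simple permutation test on post-participation scores. The sketch below uses synthetic Likert-style data; the group sizes, score distributions, and variable names are hypothetical, not the study's data:

```python
# Hypothetical permutation test comparing "trust in science" scores between
# a citizen-science-framed group and a crowdsourcing-framed group.
import numpy as np

rng = np.random.default_rng(7)
# synthetic post-participation Likert scores (1-5), illustrative distributions
citizen_science = rng.choice([3, 4, 5], size=80, p=[0.2, 0.4, 0.4])
crowdsourcing = rng.choice([2, 3, 4, 5], size=80, p=[0.2, 0.4, 0.3, 0.1])

observed = citizen_science.mean() - crowdsourcing.mean()
pooled = np.concatenate([citizen_science, crowdsourcing])

# null hypothesis: framing does not matter, so group labels are exchangeable
n_perm, count = 10000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[:80].mean() - pooled[80:].mean()
    if abs(diff) >= abs(observed):
        count += 1
p_value = count / n_perm

print(f"mean difference {observed:.2f}, permutation p = {p_value:.4f}")
```

A permutation test is a reasonable fit here because Likert data violate the normality assumptions of a t-test; the study itself may well have used a different statistic.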
Sample Identification at Scale - Implementing IGSN in a Research Agency
NASA Astrophysics Data System (ADS)
Klump, J. F.; Golodoniuc, P.; Wyborn, L. A.; Devaraju, A.; Fraser, R.
2015-12-01
Earth sciences are largely observational and rely on natural samples, the types of which vary significantly between science disciplines. Sharing and referencing samples in the scientific literature and across the Web requires globally unique identifiers, which are essential for disambiguation. This practice is very common in other fields, e.g. the ISBN in publishing and the DOI in scientific literature. In the Earth sciences, however, identification is still often done in an ad hoc manner without unique identifiers. The International Geo Sample Number (IGSN) system provides a persistent, globally unique label for identifying environmental samples. As an IGSN allocating agency, CSIRO implements the IGSN registration service at the organisational scale with contributions from multiple research groups. The Capricorn Distal Footprints project is one of the first pioneers and early adopters of the technology in Australia. For this project, IGSN provides a mechanism for identifying new and legacy samples, as well as derived sub-samples. It will ensure transparency and reproducibility in various geochemical sampling campaigns that will involve a diversity of sampling methods. Hence, diverse geochemical and isotopic results can be linked back to the parent sample, particularly where multiple children of that sample have also been analysed. The IGSN integration for this project is still at an early stage and requires further consultation on the governance mechanisms that we need to put in place to allow efficient collaboration between CSIRO and its partners on the project, including naming conventions, service interfaces, etc. In this work, we present the results of the initial implementation of IGSN in the context of the Capricorn Distal Footprints project. This study has so far demonstrated the effectiveness of the proposed approach, while maintaining the flexibility to adapt to various media types, which is critical in the context of a multi-disciplinary project.
Idler, Nadja; Huber, Johanna; von Mutius, Sabine; Welbergen, Lena; Fischer, Martin R
2016-01-01
Objective: During the 2015 summer semester at Munich's Ludwig Maximilian University (LMU) medical school, the pilot project "MeMPE Summer University - An Interprofessional Seminar on Prevention and Health Promotion" was implemented as a compulsory elective subject. In 90 teaching units of 45 minutes each, 20 students from the degree programs of Medicine, Master of Public Health and Master of Science Epidemiology (MeMPE) completed modules in theoretical introduction, scientific project work, as well as practical assignments and conference attendance. Methods: The project was evaluated by students using pre- and post-project questionnaires (26 and 57 items, rated on a five-level Likert scale from 1="fully agree" to 5="fully disagree"). The evaluation interviews with the instructors were recorded, transcribed and analyzed according to Mayring's qualitative content analysis. Results: The questionnaire response rate was 100%. In the pre/post comparison, the students reported an improvement in factual knowledge (pre median=3.0; post median=2.0; p<0.0001), in scientific work (pre median=3.0; post median=1.0; p<0.0001) and in interprofessional work (pre median=2.0; post median=1.0; p=0.024). In 18 interviews, the instructors largely expressed their motivation to participate in the project again. Conclusion: The MeMPE Summer University can serve as an example of best practice for the interprofessional communication of prevention and health-promotion topics in theory and practice. The evaluation results show that the project enjoyed a high level of acceptance among students and instructors, and that it should be conducted in a revised version again in 2016.
The SWITCH-ON Virtual Water-Science Laboratory
NASA Astrophysics Data System (ADS)
Arheimer, Berit; Boot, Gerben; Calero, Joan; Ceola, Serena; Gyllensvärd, Frida; Hrachowitz, Markus; Little, Lorna; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Wagener, Thorsten
2017-04-01
The SWITCH-ON Virtual Water-Science Laboratory (VWSL) aims to facilitate collaboration and support reproducible experiments in water research. The goal is to overcome geographical distance in comparative hydrology and to increase transparency when using computational tools in the hydrological sciences. The VWSL gives access to open data through dedicated software tools for data search and upload, and helps create collaborative protocols for joint experiments in the virtual environment. The VWSL will help scientists with: • Cooperation around the world - straightforward connections with other scientists in comparative analyses and collaboration, as a means to accelerate scientific advance in hydrology. • Repeatability of experiments - thorough review of a large variety of numerical experiments, which is a foundational principle of scientific research, and improvement of research standards. • New forms of scientific research - by using online 'living' protocols, scientists can elaborate ideas incrementally with a large group of colleagues and share data, tools, models, etc. in open science. The VWSL was developed within the EU project "Sharing Water Information to Tackle Changes in Hydrology - for Operational Needs" (Grant agreement No 603587). Visitors can choose to Define, Participate in or Review experiments by clicking the start buttons (http://www.switch-on-vwsl.eu/). Anyone can view protocols without logging in (this is important for Open Science), but to create, participate in and edit protocols, users need to log in for security reasons. During the work process, the protocol is moved from one view to another as the experiment evolves from idea, to on-going, to completed. Users of the Lab also get access to useful tools for running collaborative experiments, for instance: Open Data Search, Data (and Metadata) Upload, and Create Protocol tools.
So far, eight collaborative experiments have been completed in the VWSL and have resulted in research papers (published or submitted), and there are currently four on-going experiments, which also involve external participants not paid by the project. The VWSL has now been launched and is open to everyone, and it will be continuously developed and sustained after the project ends. This presentation will give an online demonstration of the major features of the present VWSL and discuss some future visions and major challenges for this e-infrastructure.
Characterizing uncertain sea-level rise projections to support investment decisions.
Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus
2018-01-01
Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. 
Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions.
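The cost-benefit test at the heart of this scenario analysis can be sketched with a toy calculation. The sketch below is purely illustrative: the costs, linear damage model, discount rate, and scenario grid are all invented assumptions, not numbers from the Port of Los Angeles study.

```python
# Illustrative sketch of a Robust-Decision-Making-style scenario
# discovery step: over which sea-level-rise futures does hardening
# pass a cost-benefit test? All parameters are hypothetical.

def npv_of_hardening(slr_m, horizon_yr=50, rate=0.03,
                     harden_cost=100.0, damage_per_m=400.0):
    """Net present value (arbitrary $M) of hardening now versus not.

    Avoided damages are assumed proportional to end-of-horizon
    sea-level rise (slr_m, metres), spread evenly over the horizon
    and discounted at a constant rate.
    """
    annual_avoided = damage_per_m * slr_m / horizon_yr
    pv_avoided = sum(annual_avoided / (1 + rate) ** t
                     for t in range(1, horizon_yr + 1))
    return pv_avoided - harden_cost

# Sweep a grid of candidate futures (0.0 to 2.0 m of rise) and keep
# the scenarios in which the investment has positive net value.
scenarios = [i / 100 for i in range(0, 201, 10)]
passing = [s for s in scenarios if npv_of_hardening(s) > 0]
print(f"hardening passes cost-benefit for SLR >= {min(passing):.1f} m")
```

Decision makers would then judge, from projections of differing confidence, how likely the passing scenarios are, rather than starting from a single aggregated probability distribution.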
NASA Astrophysics Data System (ADS)
Charania, A.
2002-01-01
At the end of the first decade of the 21st century, the International Space Station (ISS) will stand as a testament to the engineering capabilities of the international community. The choices for the next logical step for this community remain vast and conflicting: a Mars mission, moon colonization, Space Solar Power (SSP), etc. This examination focuses on positioning SSP as one such candidate for consideration. A marketing roadmap is presented that reveals the potential benefits of SSP to both the space community and the global populace at large. Recognizing that scientific efficiency itself has no constituency large enough to persuade entities to outlay funds for such projects, a holistic approach is taken to positioning SSP. This includes the scientific, engineering, exploratory, economic, political, and development capabilities of the system. SSP can be seen as both a space exploration project and a resource project for undeveloped nations. Coupling these two non-traditional areas yields a broader constituency for the project than either one alone could generate. Space exploration is often seen as irrelevant to the condition of the populace of the planet from which the money for such projects comes. In a new century when billions of people on the planet still have never made a phone call or lack access to clean water, the origins of this skepticism are understandable. An area of concern is the problem of not living up to the claims of overeager program marketers. Just as the ISS may never live up to the claims of its advocates in terms of space research, any SSP program must be careful not to promise utopian global solutions to a future energy-starved world. Technically, SSP is a very difficult problem, even harder than creating the ISS, yet the promise it holds for both space exploration and Earth development can lead to a renaissance of the relevance of space to the lives of the citizens of the world.
ERIC Educational Resources Information Center
Gungor, Sema Nur; Ozer, Dilek Zeren; Ozkan, Muhlis
2013-01-01
This study re-evaluated 454 science projects that were prepared by primary school students between 2007 and 2011 within the scope of the Science Projects Event for Primary School Students and submitted to the TUBITAK BIDEB Bursa regional science board by MNE regional work groups, in accordance with scientific research methods and techniques, including…
Salton Sea Scientific Drilling Program
Sass, J.H.
1988-01-01
The Salton Sea Scientific Drilling Program (SSSDP) was the first large-scale drilling project undertaken by the U.S. Continental Scientific Drilling Program. The objectives of the SSSDP were (1) to drill a deep well into the Salton Sea Geothermal Field in the Imperial Valley of California, (2) to retrieve a high percentage of core and cuttings along the entire depth of the well, (3) to obtain a comprehensive suite of geophysical logs, (4) to conduct flow tests at two depths (and to take fluid samples therefrom), and (5) to carry out several downhole experiments. These activities enabled the U.S. Geological Survey and cooperating agencies to study the physical and chemical processes involved in an active hydrothermal system driven by a molten-rock heat source. This program, originally conceived by Wilfred A. Elders, professor of geology at the University of California at Riverside, was coordinated under an inter-agency accord among the Geological Survey, the U.S. Department of Energy, and the National Science Foundation.
Observing Mars with Schiaparelli's telescope
NASA Astrophysics Data System (ADS)
Bernagozzi, Andrea; Testa, Antonella; Tucci, Pasquale
2004-03-01
We took the occasion of the 2003 Mars opposition to carry out observations of the red planet with the 218 mm Merz refractor, built in 1863-1865 and recently restored, which Giovanni Virginio Schiaparelli used to observe Mars from the opposition of August 1877 until that of 1883-84. On that occasion we launched a five-day initiative for the dissemination of scientific culture addressed to students and the public at large. We organized direct observations and adapted a webcam to the antique instrument. The images were sent to the Milan Planetarium, where about 300 people every night could take part in the event. Moreover, a big screen was set up in the garden around the Planetarium to allow more people to participate. The projection of the images was part of a two-hour program of short lectures on the historical and current aspects of Mars. This initiative was successful: but what about scientific culture? What kind of scientific information did the public perceive?
"An expedition to heal the wounds of war". The 1919 eclipse and Eddington as Quaker adventurer.
Stanley, Matthew
2003-03-01
The 1919 eclipse expedition's confirmation of general relativity is often celebrated as a triumph of scientific internationalism. However, British scientific opinion during World War I leaned toward the permanent severance of intellectual ties with Germany. That the expedition came to be remembered as a progressive moment of internationalism was largely the result of the efforts of A. S. Eddington. A devout Quaker, Eddington imported into the scientific community the strategies being used by his coreligionists in the national dialogue: humanize the enemy through personal contact and dramatic projects that highlight the value of peace and cooperation. The essay also addresses the common misconception that Eddington's sympathy for Einstein led him intentionally to misinterpret the expedition's results. The evidence gives no reason to think that Eddington or his coworkers were anything but rigorous. Eddington's pacifism is reflected not in manipulated data but in the meaning of the expedition and the way it entered the collective memory as a celebration of international cooperation in the wake of war.
CALLISTO: scientific projects performed by high school students
NASA Astrophysics Data System (ADS)
Boer, Michel
The Callisto project was initiated in 2002 by the "Lycée de l'Arc" (a high school in Orange, France) and the Observatoire de Haute Provence (OHP). Its goal is to motivate students for scientific and technical studies: they have the possibility to carry out scientific projects together with professional astronomers. The pupils work in groups of 3 to 4, each having a specific theme: geophysics, variable stars, small bodies of the solar system, mechanical and optical instrumentation. They follow a complete scientific approach, from question to answer: instrumental setup, acquisition, data reduction, and publication. During a week they are invited to observe using the OHP 1.20 m and 0.80 m telescopes, with the support of a professional astronomer. Some projects have in fact been derived from actual proposals accepted at OHP (e.g. rotation curves of binary asteroids). The best projects are entered in competitions such as the ESO "Catch a Star" and the "Olympiades de Physique". Since 2005, three high schools have participated in the project. The Callisto initiative has also provided the basis of a teacher-training course. Callisto is an example of a successful collaboration between an interdisciplinary team of teachers (physics, maths, philosophy, English...), a research institution (the OHP), and researchers.
Integrated design and management of complex and fast track projects
NASA Astrophysics Data System (ADS)
Mancini, Dario
2003-02-01
Modern scientific and technological projects are increasingly in competition over scientific aims, technological innovation, performance, time and cost. They require a dedicated and innovative organization able to simultaneously satisfy the various technical and logistic constraints imposed by the final user, and to guarantee the satisfaction of technical specifications identified on the basis of scientific aims. To achieve all of the above, management has to be strategically innovative and intuitive, removing first of all the bottlenecks that are usually pointed out only at the end of a project as the causes of general dissatisfaction. More than 30 years spent working on complex multidisciplinary systems and 20 years of formative experience in managing scientific, technological and industrial projects in parallel have given the author the possibility to study, test and validate strategies for parallel project management and integrated design, merged into a single optimized task, using the newly coined word "Technomethodology". The paper highlights useful information to be taken into consideration during project organization to minimize deviations from the expected goals, and describes some of the basic elements of this new method, which is the key to the parallel, successful management of multiple interdisciplinary activities.
2014-03-01
streamlines) from two types of diffusion weighted imaging scans, diffusion tensor imaging ( DTI ) and diffusion spectrum imaging (DSI). We examined...individuals. Importantly, the results also showed that this effect was greater for the DTI method than the DSI method. This suggested that DTI can better...compared to level surface walking. This project combines experimental EEG data and electromyography (EMG) data recorded from seven muscles of the leg
2006-10-08
FINAL REPORT to Air Force Office of Scientific Research (AFOSR) Project Title Influence of Surface Roughness on the Second Order Transport of...large amount of research has been performed to quantify the effects of Mach number, roughness, and wall curvature on turbulent boundary layers. However...18 a) b) c) Figure 3: a) A. D. Smith high pressure storage tank. b) Morin B series actuator controlling Virgo Engineers Trunion Mounted Ball Valve. c
Status of GDL - GNU Data Language
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Gales, J.; Arabas, S.; Boquien, M.; Chanial, P.; Messmer, P.; Fillmore, D.; Poplawski, O.; Maret, S.; Marchal, G.; Galmiche, N.; Mermet, T.
2010-12-01
Gnu Data Language (GDL) is an open-source interpreted language aimed at numerical data analysis and visualisation. It is a free implementation of the Interactive Data Language (IDL) widely used in Astronomy. GDL has a full syntax compatibility with IDL, and includes a large set of library routines targeting advanced matrix manipulation, plotting, time-series and image analysis, mapping, and data input/output including numerous scientific data formats. We will present the current status of the project, the key accomplishments, and the weaknesses - areas where contributions are welcome!
NASA Technical Reports Server (NTRS)
Schreiber, Robert; Simon, Horst D.
1992-01-01
We are surveying current projects in the area of parallel supercomputers. The machines considered here will become commercially available in the 1990 - 1992 time frame. All are suitable for exploring the critical issues in applying parallel processors to large scale scientific computations, in particular CFD calculations. This chapter presents an overview of the surveyed machines, and a detailed analysis of the various architectural and technology approaches taken. Particular emphasis is placed on the feasibility of a Teraflops capability following the paths proposed by various developers.
NASA Astrophysics Data System (ADS)
Galison, Peter
2010-02-01
Secrecy in matters of national defense goes back far past antiquity. But our modern form of national secrecy owes a huge amount to the large-scale, systematic, and technical system of scientific secrecy that began in the Radar and Manhattan Projects of World War II and came to its current form in the Cold War. Here I would like to capture some of this trajectory and to present some of the paradoxes and deep conundrums that our secrecy system offers us in the post-Cold War world.
Complementarity of NGST, ALMA, and Far IR Space Observatories
NASA Technical Reports Server (NTRS)
Mather, John C.
2004-01-01
The Next Generation Space Telescope (NGST) and the Atacama Large Millimeter Array (ALMA) will both start operations long before a new far IR observatory to follow SIRTF into space can be launched. What will be unknown even after they are operational, and what will a far IR space observatory be able to add? I will compare the telescope design concepts and capabilities and the advertised scientific programs for the projects and attempt to forecast the research topics that will be at the forefront in 2010.
Complementarity of NGST, ALMA, and far IR Space Observatories
NASA Technical Reports Server (NTRS)
Mather, John C.; Fisher, Richard R. (Technical Monitor)
2002-01-01
The Next Generation Space Telescope (NGST) and the Atacama Large Millimeter Array (ALMA) will both start operations long before a new far IR observatory in space can be launched. What will be unknown even after they are operational, and what will a far IR space observatory be able to add? I will compare the telescope design concepts and capabilities and the advertised scientific programs for the projects and attempt to forecast the research topics that will be at the forefront in 2010.
NASA Technical Reports Server (NTRS)
Clifford, Stephen M.; Greeley, Ronald; Haberle, Robert M.
1988-01-01
The scientific highlights of the Mars: Evolution of its Climate and Atmosphere (MECA) study project are reviewed and some of the important issues in Martian climate research that remain unresolved are discussed.
Innovative tools for scientific and technological education in italian secondary schools.
Santucci, Annalisa; Mini, Roberta; Ferro, Elisa; Martelli, Paola; Trabalzini, Lorenza
2004-03-01
This paper describes the project "Biotech a Scuola" ("Biotech at School"), financed by the Italian Ministry of Education within the SeT program (Special Project for Scientific-Technological Education). The project involved the University of Siena, five senior and junior secondary schools in the Siena area, and a private company. Twenty-three teachers from diverse fields and 318 students from 15 classes were involved. The aim of the project was to improve scientific-technological teaching by providing schools with the support and materials necessary to understand some fundamental aspects of biotechnology. With this project we propose a model of close cooperation among various educational sectors with the goal of teaching junior and senior high school students some of the theory and practice of modern biotechnology. Copyright © 2004 International Union of Biochemistry and Molecular Biology, Inc.
Physicists Get INSPIREd: INSPIRE Project and Grid Applications
NASA Astrophysics Data System (ADS)
Klem, Jukka; Iwaszkiewicz, Jan
2011-12-01
INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.
Power to the people: To what extent has public involvement in applied health research achieved this?
Green, Gill
2016-01-01
Public involvement is required for applied health research funded in the UK. One of the largest funders, the National Institute for Health Research (NIHR), makes it clear that it values the knowledge of patients and the public. As a result, there are now many resources to make sure that the public voice is included in decision-making about research. However, there is concern that the public voice still has limited impact on research decision-making. This article asks to what extent power has shifted from the scientific research community to the public. It looks at how much power and impact patients and members of the public have over research by asking: How do the public contribute to deciding which research areas and which research projects should be funded? How do they influence how the research is carried out? The article argues that there is evidence that the public voice is present in research decision-making. However, there is less evidence of a change in the power dynamic between the scientific research community and the public. The public involved in research are not always equal partners. The scientific research community still has the loudest voice, and patients and the public do not always feel sufficiently empowered to challenge it. Public involvement in applied health research is a pre-requisite for funding from many funding bodies. In particular, the National Institute for Health Research (NIHR) in the UK clearly states that it values lay knowledge, and there is an expectation that members of the public will participate as research partners in research. As a result, a large public involvement infrastructure has emerged to facilitate this. However, there is concern that despite the flurry of activity in promoting public involvement, lay knowledge is marginalised and has limited impact on research decision-making. This article asks to what extent power has shifted from the scientific research community to the public.
It discusses the meaning of power and models of public involvement and examines the development of public involvement in applied health research. It identifies public involvement in a range of decision-making: identifying priority areas for commissioning research; making decisions about which projects are funded; decisions about details of research design. Whilst there is evidence that the public voice is present in the composition of research proposals submitted to NIHR and in the decision-making about which projects are funded and how they are carried out, there is less evidence of a change in the power dynamic manifest in social relations between the scientific research community and the public. As a result the biomedical model remains dominant and largely unchallenged in research decision-making.
Solving Large Problems Quickly: Progress in 2001-2003
NASA Technical Reports Server (NTRS)
Mowry, Todd C.; Colohan, Christopher B.; Brown, Angela Demke; Steffan, J. Gregory; Zhai, Antonia
2004-01-01
This document describes the progress we have made and the lessons we have learned in 2001 through 2003 under the NASA grant entitled "Solving Important Problems Faster". The long-term goal of this research is to accelerate large, irregular scientific applications which have enormous data sets and which are difficult to parallelize. To accomplish this goal, we are exploring two complementary techniques: (i) using compiler-inserted prefetching to automatically hide the I/O latency of accessing these large data sets from disk; and (ii) using thread-level data speculation to enable the optimistic parallelization of applications despite uncertainty as to whether data dependences exist between the resulting threads which would normally make them unsafe to execute in parallel. Overall, we made significant progress in 2001 through 2003, and the project has gone well.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Director who is responsible for the scientific and technical direction of the project. (d) Grantee means... Department to whom the authority to issue or modify research project grant instruments has been delegated. (j... experience in particular scientific or technical fields to give expert advice, in accordance with the...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Director who is responsible for the scientific and technical direction of the project. (d) Grantee means... Department to whom the authority to issue or modify research project grant instruments has been delegated. (j... experience in particular scientific or technical fields to give expert advice, in accordance with the...
Code of Federal Regulations, 2013 CFR
2013-01-01
... Director who is responsible for the scientific and technical direction of the project. (d) Grantee means... Department to whom the authority to issue or modify research project grant instruments has been delegated. (j... experience in particular scientific or technical fields to give expert advice, in accordance with the...
DIVE: A Graph-based Visual Analytics Framework for Big Data
Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie
2014-01-01
The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197
Automated metadata--final project report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schissel, David
This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for 6 months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and the Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback was very positive on the project's toolkit and the value of such automatic workflow documentation to the scientific endeavor.
Energy Exascale Earth System Model (E3SM) Project Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, D.
The E3SM project will assert and maintain an international scientific leadership position in the development of Earth system and climate models at the leading edge of scientific knowledge and computational capabilities. With its collaborators, it will demonstrate its leadership by using these models to achieve the goal of designing, executing, and analyzing climate and Earth system simulations that address the most critical scientific questions for the nation and DOE.
Three-dimensional imaging for large LArTPCs
Qian, X.; Zhang, Chao; Viren, B.; ...
2018-05-29
High-performance event reconstruction is critical for current and future massive liquid argon time projection chambers (LArTPCs) to realize their full scientific potential. LArTPCs with readout using wire planes provide a limited number of 2D projections. In general, without a pixel-type readout it is challenging to achieve unambiguous 3D event reconstruction. As a remedy, we present a novel 3D imaging method, Wire-Cell, which incorporates the charge and sparsity information in addition to the time and geometry through simple and robust mathematics. Furthermore, the resulting 3D image of ionization density provides an excellent starting point for further reconstruction and enables the true power of 3D tracking calorimetry in LArTPCs.
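The flavor of this inverse problem can be illustrated with a toy calculation (this is not the actual Wire-Cell algorithm): recover a sparse, non-negative charge distribution from summed wire measurements in two overlapping "planes", using projected gradient descent with an L1 sparsity penalty. The geometry, penalty weight, and all parameters below are invented for the sketch.

```python
import numpy as np

# Toy setup: 16 voxels along one drift slice, with two ionization
# deposits. Each "wire" sums the charge of a pair of adjacent voxels,
# and the two planes use offset pairings, so neither view alone
# determines the 3D (here 1D) charge distribution.
n = 16
true_q = np.zeros(n)
true_q[[3, 9]] = [5.0, 2.0]

A1 = np.kron(np.eye(n // 2), np.ones((1, 2)))   # plane 1 pairings
A2 = np.roll(A1, shift=1, axis=1)               # plane 2, offset by one
A = np.vstack([A1, A2])
b = A @ true_q                                  # measured wire charges

# Projected gradient descent: minimize ||A q - b||^2 + lam * sum(q)
# subject to q >= 0 (charge is non-negative, and true tracks are sparse).
q = np.zeros(n)
step, lam = 0.1, 0.05
for _ in range(2000):
    grad = A.T @ (A @ q - b)        # data-fidelity gradient
    q = q - step * (grad + lam)     # lam acts as an L1 penalty on q >= 0
    q = np.maximum(q, 0.0)          # project onto the non-negative cone

print(np.round(q, 2))               # deposits recovered at voxels 3 and 9
```

The combined system has a one-dimensional ambiguity that the non-negativity and sparsity constraints resolve, which is the rough analogue of how charge and sparsity information disambiguate wire-plane projections.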
Creative cross-organizational collaboration: coming to a project near you.
Reeve, Brock C
2012-03-01
Historically, the pharmaceutical industry has provided investors with robust growth and patients with a range of life-enhancing treatments; academic institutions conducted early-stage research largely supported by the government; disease foundations funded projects in their areas of interest; and venture capital built exciting new startups with bold ambitions. Today, those institutions are all facing scientific, economic and operating challenges. As a result, they are experimenting with new organizational and funding models. We consider some of those models in the life sciences in general, as well as in the development and delivery of novel regenerative medicines. In particular, the changing roles of the venture capital and disease foundation communities are considered in the context of academic and commercial collaborations.
Burlamaque-Neto, A C; Santos, G R; Lisbôa, L M; Goldim, J R; Machado, C L B; Matte, U; Giugliani, R
2012-02-01
In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for the teaching/learning of the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by students of health sciences who participate in scientific projects in an academic research laboratory. An observational, descriptive, cross-sectional study was conducted using Edgar Morin's complexity as the theoretical reference. In a semi-structured interview, students were asked to solve an abstract logical puzzle - TanGram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. The solving of the TanGram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected in the students' opinions about the characteristics of a successful researcher. Students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.
Research on cost control and management in high voltage transmission line construction
NASA Astrophysics Data System (ADS)
Xu, Xiaobin
2017-05-01
Cost control is of vital importance to construction enterprises. It is the key to the profitability of a transmission line project and is tied to the survival and development of electric power construction enterprises. Because of the long construction lines, complex and changeable construction terrain, and large construction costs of transmission lines, it is difficult to exercise accurate and effective cost control over the implementation of an entire transmission line project. The cost control of a transmission line project is therefore a complicated and arduous task, and it is of great theoretical and practical significance to study cost control schemes for transmission line projects in a more scientific and efficient way. Based on the characteristics of transmission line construction projects, this paper analyzes the construction cost structure of transmission line projects and the current cost control problems of such projects, and demonstrates the necessity and feasibility of studying more accurate cost control schemes. In this way, a dynamic, cyclical cost control process, including planning, implementation, feedback, correction, modification, and re-implementation, is achieved to realize accurate and effective cost control over an entire electric power transmission line project.
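The dynamic plan-implement-feedback-correct cycle described above can be sketched as a simple control loop; the numbers and the 10% variance threshold below are illustrative, not from the paper.

```python
# A minimal sketch of the cost-control cycle: compare actual cost to the
# plan each period (feedback) and flag periods whose relative variance
# exceeds a threshold, which is where correction/modification would occur.

def cost_control(planned, actuals, threshold=0.10):
    """Return the indices of periods whose cost variance exceeds the threshold."""
    flagged = []
    for period, (p, a) in enumerate(zip(planned, actuals)):
        if abs(a - p) / p > threshold:   # feedback step of the cycle
            flagged.append(period)       # correction is triggered here
    return flagged

planned = [100, 120, 150, 130]   # planned cost per period (illustrative units)
actuals = [105, 140, 148, 131]   # actual cost reported back per period
flagged = cost_control(planned, actuals)
print(flagged)  # [1]: only period 1 (variance ~16.7%) triggers correction
```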
MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems
NASA Astrophysics Data System (ADS)
Gellis, B. S.
2017-12-01
Pollen data can inform the reconstruction of early-floral environments by providing data for artistic representations of what early-terrestrial ecosystems looked like, and how existing terrestrial landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. Collected pollen data elucidate flora that are visible in the fossil record - associated with the Late Pleistocene - and have been illustrated and described in botanical literature. The project aims to make scientific data accessible and interesting to all audiences through a series of interactive digital sculptures, large-scale photography, and field-based videography. While this project is driven by scientific data, it is rooted in deeply artistic and outreach-based practices, including digital design, illustration, photography, video, and sound design. Using 3D modeling and printing technology, MicroEcos centers around a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming and Montana border, the Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and the Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures provide audiences with glimpses of three-dimensional Late-Pleistocene environments and help create dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences better contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill, and Natural Trap Cave sites surround these 3D-digital reconstructions.
Development Of An Agroforestry Sequestration Project In Khammam District Of India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.
2007-06-01
Large potential for agroforestry as a mitigation option has given rise to scientific and policy questions. This paper addresses methodological issues in estimating carbon sequestration potential, baseline determination, additionality, and leakage in Khammam district, Andhra Pradesh, in the southern part of India. Technical potential for afforestation was determined considering the various land-use options. For estimating the technical potential, culturable wastelands, fallow, and marginal croplands were considered for Eucalyptus clonal plantations. Field studies of aboveground and belowground biomass, woody litter, and soil organic carbon under the baseline and project scenarios were conducted to estimate the carbon sequestration potential. The baseline carbon stock was estimated to be 45.33 tC/ha. The additional carbon sequestration potential under the project scenario for 30 years is estimated to be 12.82 tC/ha/year, inclusive of harvest regimes and carbon emissions due to biomass burning and fertilizer application. Although the project scenario has a higher benefit-cost ratio than the baseline scenario, the initial investment cost is high. An investment barrier exists for adopting agroforestry in the district.
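As a quick consistency check of the quoted figures, the additional sequestration accumulated over the 30-year project period follows directly from the per-hectare rate:

```python
# Back-of-the-envelope arithmetic on the values quoted in the abstract
# (the totals below are derived here, not stated in the paper).

baseline_stock = 45.33      # tC/ha, baseline carbon stock
additional_rate = 12.82     # tC/ha/year, additional sequestration under the project
project_years = 30

additional_total = additional_rate * project_years   # tC/ha over the project life
print(round(additional_total, 1))   # 384.6 tC/ha of additional sequestration
```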
NASA Astrophysics Data System (ADS)
Collard, F.; Quartly, G. D.; Konik, M.; Johannessen, J. A.; Korosov, A.; Chapron, B.; Piolle, J.-F.; Herledan, S.; Darecki, M.; Isar, A.; Nafornita, C.
2015-12-01
Ocean Virtual Laboratory is an ESA-funded project to prototype the concept of a single point of access for all satellite remote-sensing data with ancillary model output and in situ measurements for a given region. The idea is to provide the non-specialist with easy access to both data and state-of-the-art processing techniques, and to enable straightforward analysis and display. The project, led by OceanDataLab, is being trialled in the region of the Agulhas Current, as it contains signals of strong contrast (due to very energetic upper ocean dynamics) and special SAR data acquisitions have been recorded there. The project also encourages the take-up of Earth Observation data by developing training material to help those not in large scientific or governmental organizations make the best use of what data are available. The website for access is: http://ovlproject.oceandatalab.com/
Using offsets to mitigate environmental impacts of major projects: A stakeholder analysis.
Martin, Nigel; Evans, Megan; Rice, John; Lodhia, Sumit; Gibbons, Philip
2016-09-01
Global patterns of development suggest that as more projects are initiated, business will need to find acceptable measures to conserve biodiversity. The application of environmental offsets allows firms to combine their economic interests with those of the environment and society. This article presents the results of a multi-stakeholder analysis related to the design of offsets principles, policies, and regulatory processes, in the context of large infrastructure projects. The results indicate that business was primarily interested in using direct offsets and other compensatory measures, known internationally as indirect offsets, to discharge their environmental management obligations. In contrast, the environmental sector argued that highly principled and scientifically robust offsets programs should be implemented and maintained for enduring environmental protection. Stakeholder consensus stressed the importance of offsets registers with commensurate monitoring and enforcement. Our findings provide instructive insights into the countervailing views of offsets policy stakeholders. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cosmic Visions Dark Energy: Small Projects Portfolio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Kyle; Frieman, Josh; Heitmann, Katrin
Understanding cosmic acceleration is one of the key science drivers for astrophysics and high-energy physics in the coming decade (2014 P5 Report). With the Large Synoptic Survey Telescope (LSST), the Dark Energy Spectroscopic Instrument (DESI), and other new facilities beginning operations soon, we are entering an exciting phase during which we expect an order of magnitude improvement in constraints on dark energy and the physics of the accelerating Universe. This is a key moment for a matching Small Projects portfolio that can (1) greatly enhance the science reach of these flagship projects, (2) have immediate scientific impact, and (3) lay the groundwork for the next stages of the Cosmic Frontier Dark Energy program. In this White Paper, we outline a balanced portfolio that can accomplish these goals through a combination of observational, experimental, and theory and simulation efforts.
Colling, D.; Britton, D.; Gordon, J.; Lloyd, S.; Doyle, A.; Gronbech, P.; Coles, J.; Sansum, A.; Patrick, G.; Jones, R.; Middleton, R.; Kelsey, D.; Cass, A.; Geddes, N.; Clark, P.; Barnby, L.
2013-01-01
The Large Hadron Collider (LHC) is one of the greatest scientific endeavours to date. The construction of the collider itself and the experiments that collect data from it represent a huge investment, both financially and in terms of human effort, in our hope to understand the way the Universe works at a deeper level. Yet the volumes of data produced are so large that they cannot be analysed at any single computing centre. Instead, the experiments have all adopted distributed computing models based on the LHC Computing Grid. Without the correct functioning of this grid infrastructure the experiments would not be able to understand the data that they have collected. Within the UK, the Grid infrastructure needed by the experiments is provided by the GridPP project. We report on the operations, performance and contributions made to the experiments by the GridPP project during the years of 2010 and 2011—the first two significant years of the running of the LHC. PMID:23230163
NASA Astrophysics Data System (ADS)
Diiwu, J.; Silins, U.; Kevin, B.; Anderson, A.
2008-12-01
Like many areas of the Rocky Mountains, Alberta's forests on the eastern slopes of the Rockies have been shaped by decades of successful fire suppression. These forests are at high risk of fire and large-scale insect infestation, and climate change will continue to increase these risks. These headwaters forests provide the vast majority of usable surface water supplies to a large region of the province, and large-scale natural disasters can have dramatic effects on water quality and water availability. The population in the region has steadily increased, and this area is now the main source of water for many Alberta municipalities, including the City of Calgary, which has a population of over one million. In 2003 a fire burned 21,000 ha in the southern foothills area. Government land managers were concerned about the downstream implications of the fire and salvage operations; however, there was very limited scientific information to guide decision making. This led to the establishment of the Southern Rockies Watershed Project, a partnership between Alberta Sustainable Resource Development, the provincial government department responsible for land management, and the University of Alberta. After five years of data collection, the project has produced quantitative information that was not previously available about the effects of fire and management interventions, such as salvage logging, on headwaters and regional water quality. This information can be used to make decisions on forest operations, fire suppression, and post-fire salvage operations. In the past few years this project has captured the interest of large municipalities and water treatment researchers who are keen to investigate the potential implications of large natural disturbances for large and small drinking water treatment facilities. Examples from this project will be used to highlight the challenges and successes encountered while bridging the gap between science and land management policy.
Earth Science Project Office (ESPO) Field Experiences During ORACLES, ATom, KORUS and POSIDON
NASA Technical Reports Server (NTRS)
Salazar, Vidal; Zavaleta, Jhony
2017-01-01
Very often, scientific field campaigns entail years of planning and incur substantial cost, especially if they involve the operation of large research aircraft in remote locations. Deploying and operating these aircraft even for short periods of time poses challenges that, if not addressed properly, can have significant negative consequences and potentially jeopardize the success of a scientific campaign. Challenges vary from country to country and range from safety, health, and security risks to differences in cultural and social norms. Our presentation will focus on sharing experiences from the field campaigns conducted by ESPO in 2016: ORACLES, ATom, KORUS and POSIDON. We will focus on the best practices, lessons learned, and international relations and coordination aspects of the country-specific experiences. This presentation will be part of the 2nd International Conference on Airborne Research for the Environment (ICARE 2017), which will focus on "Developing the infrastructure to meet future scientific challenges". This unique conference and gathering of facility support experts will not only allow for dissemination and sharing of knowledge but also promote collaboration and networking among groups that support scientific research using airborne platforms around the globe.
A DBMS architecture for global change research
NASA Astrophysics Data System (ADS)
Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.
1993-08-01
The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators, which is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographical information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.
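The object-oriented paradigm described, spatio-temporal entities carrying metadata and an audit trail of processing operations, can be sketched as follows; all class, field, and operation names here are invented for illustration and are not from the project.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a spatio-temporal entity: spatial extent,
# temporal extent, metadata (e.g. map scale), and an audit trail that
# records every operation applied, as the abstract describes.

@dataclass
class GeoEntity:
    name: str
    bbox: tuple          # (min_lon, min_lat, max_lon, max_lat)
    time_range: tuple    # (start_year, end_year)
    metadata: dict = field(default_factory=dict)
    audit_trail: list = field(default_factory=list)

    def apply(self, operation: str):
        """Apply a named operation and record it in the audit trail."""
        self.audit_trail.append(operation)

river = GeoEntity("Example River", (-71.6, 41.9, -71.3, 42.3), (1980, 1990),
                  metadata={"map_scale": "1:24000"})
river.apply("reproject:UTM19N")
river.apply("validate:hydrography")
print(river.audit_trail)  # ['reproject:UTM19N', 'validate:hydrography']
```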
NASA Technical Reports Server (NTRS)
1977-01-01
The joint U.S.-USSR experiments and the U.S. conducted unilateral experiments performed during the Apollo Soyuz Test Project are described. Scientific concepts and experiment design and operation are discussed along with scientific results of postflight analysis.
Systems engineering real estate development projects
NASA Astrophysics Data System (ADS)
Gusakova, Elena; Titarenko, Boris; Stepanov, Vitaliy
2017-10-01
In recent years, real estate development has accumulated a wealth of experience in implementing major projects, and this experience requires comprehension and systematization. This article examines the scientific tools of systems engineering and interprets them substantively with reference to real estate development projects. The most promising approaches and models are substantiated, allowing the life cycle of a project as a whole to be planned strategically and the project's engineering interface problems to be solved. The article also shows the relevance of further scientific study of the regularities and specifics of the life cycle of real estate development projects, conducted at the Moscow State University of Economics and Management at the ISTA department.
NASA Astrophysics Data System (ADS)
Hoffmann, Friederike; Meyer, Stefanie; de Vareilles, Mahaut
2017-04-01
In the past years there has been a strong push in Norway for increasing participation in the EU Framework Programmes for Research and Innovation. EU projects coordinated by the University of Bergen (UiB) usually receive management support from the central administration (mostly financial) in collaboration with a full- or part-time scientific project manager working on a fixed-term contract at the same institute as the project's principal scientist. With an increasing number of granted EU projects, the number of scientific project managers employed across the whole university has also increased, and a need for coordination and professionalization of this service became obvious. Until recently, UiB had no unified structures and routines for training newly recruited project managers, or for archiving and transferring routines and skills after the end of a project and the manager's employment contract. To overcome this administrative knowledge gap, the "Forum for scientific EU project managers at UiB" was founded in spring 2016 as an informal communication platform. Its purpose is to bring together current and previous scientific EU project managers from different disciplines to share their experiences. The main aim of the forum is to transfer and improve knowledge, skills and routines for effective management of EU-funded projects, but also to function as a discussion forum where issues arising from handling international consortia can be reviewed. The group meets monthly and discusses current challenges from on-going EU projects as well as routines for specific project operation tasks. These routines are archived in an online best-practice guide which the group is currently developing. The regular personal meetings are supplemented by intense communication via a group mailing list and several individual mail- and phone-meetings.
Since lessons learned during project implementation may improve future proposals, UiB research advisors for proposal support frequently interact with the members of the forum. The forum is also used to spread relevant information received from other sources. We already realize that the forum and its products lead to increased competence of scientific EU project managers and research advisors at UiB. To further harvest these synergy effects, we aim to increase our interaction with similar groups, networks, and online platforms in and beyond Europe.
The NGEE Arctic Data Archive -- Portal for Archiving and Distributing Data and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boden, Thomas A; Palanisamy, Giri; Devarakonda, Ranjeet
2014-01-01
The Next-Generation Ecosystem Experiments (NGEE Arctic) project is committed to implementing a rigorous and high-quality data management program. The goal is to implement innovative and cost-effective guidelines and tools for collecting, archiving, and sharing data within the project, the larger scientific community, and the public. The NGEE Arctic web site is the framework for implementing these data management and data sharing tools. The open sharing of NGEE Arctic data among project researchers, the broader scientific community, and the public is critical to meeting the scientific goals and objectives of the NGEE Arctic project and to advancing the mission of the Department of Energy (DOE), Office of Science, Biological and Environmental Research (BER) Terrestrial Ecosystem Science (TES) program.
Cloud Feedbacks on Climate: A Challenging Scientific Problem
Norris, Joe
2017-12-22
One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. In contrast, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.
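The sign argument above can be made concrete with a standard zero-dimensional energy-balance estimate; the forcing and feedback values below are common textbook numbers, not figures from the presentation.

```python
# Illustrative only: equilibrium warming dT = F / (lambda_planck - f_cloud),
# so a positive cloud feedback (f_cloud > 0) amplifies warming and a
# negative one damps it. Values are standard assumed textbook numbers.

F = 3.7            # W/m^2, radiative forcing from doubled CO2 (assumed)
lam_planck = 3.2   # W/m^2/K, no-feedback (Planck) response (assumed)

warming = {}
for f_cloud in (-0.5, 0.0, 0.5):   # W/m^2/K, hypothetical cloud feedbacks
    warming[f_cloud] = F / (lam_planck - f_cloud)
    print(f"cloud feedback {f_cloud:+.1f} W/m^2/K -> {warming[f_cloud]:.2f} K of warming")
```

Even this crude estimate shows why the sign matters: swinging the cloud feedback between -0.5 and +0.5 W/m²/K changes the computed warming by roughly a third.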
Vallée, Geneviève C; Muñoz, Daniella Santos; Sankoff, David
2016-11-11
Of the approximately two hundred sequenced plant genomes, how many and which ones were sequenced motivated by strictly or largely scientific considerations, and how many by chiefly economic incentives, in a wide sense? And how large a role does publication opportunity play? In an integration of multiple disparate databases and other sources of information, we collect and analyze data on the size (number of species) of the plant orders and families containing sequenced genomes, on the trade value of these species and of all the same-family or same-order species, and on the publication priority within the family and order. These data are subjected to multiple regression and other statistical analyses. We find that despite the initial importance of model organisms, it is clearly economic considerations that outweigh others in the choice of genome to be sequenced. This has important implications for generalizations about plant genomes, since human choices of plants to harvest (and cultivate) will have incurred many biases with respect to phenotypic characteristics and hence genomic properties, and recent genomic evolution will also have been affected by human agricultural practices.
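The multiple regression step can be sketched on synthetic data; the variables and coefficients below are invented stand-ins, not the paper's dataset or results.

```python
import numpy as np

# Toy multiple regression in the spirit of the analysis described above:
# regress a "sequencing priority" outcome on trade value and taxon size,
# then read off the fitted coefficients. Data are synthetic and noiseless,
# so least squares recovers the generating coefficients exactly.

rng = np.random.default_rng(0)
n = 200
trade_value = rng.uniform(0, 10, n)   # stand-in for economic importance
family_size = rng.uniform(1, 50, n)   # stand-in for number of species

# Synthetic outcome: economics weighted more heavily, mimicking the finding.
y = 2.0 * trade_value + 0.1 * family_size + 1.0

X = np.column_stack([trade_value, family_size, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))  # recovers [2.0, 0.1, 1.0]
```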
Reaves, Erik J; Valle, Ruben; Chandrasekera, Ruvani M; Soto, Giselle; Burke, Ronald L; Cummings, James F; Bausch, Daniel G; Kasper, Matthew R
2017-05-01
Scientific publication in academic literature is a key venue in which the U.S. Department of Defense's Global Emerging Infections Surveillance and Response System (GEIS) program disseminates infectious disease surveillance data. Bibliometric analyses are tools to evaluate scientific productivity and impact of published research, yet are not routinely used for disease surveillance. Our objective was to incorporate bibliometric indicators to measure scientific productivity and impact of GEIS-funded infectious disease surveillance, and assess their utility in the management of the GEIS surveillance program. Metrics on GEIS program scientific publications, project funding, and countries of collaborating institutions from project years 2006 to 2012 were abstracted from annual reports and program databases and organized by the six surveillance priority focus areas: respiratory infections, gastrointestinal infections, febrile and vector-borne infections, antimicrobial resistance, sexually transmitted infections, and capacity building and outbreak response. Scientific productivity was defined as the number of scientific publications in peer-reviewed literature derived from GEIS-funded projects. Impact was defined as the number of citations of a GEIS-funded publication by other peer-reviewed publications, and the Thomson Reuters 2-year journal impact factor. Indicators were retrieved from the Web of Science and Journal Citation Report. To determine the global network of international collaborations between GEIS partners, countries were organized by the locations of collaborating institutions. Between 2006 and 2012, GEIS distributed approximately US $330 million to support 921 total projects. On average, GEIS funded 132 projects (range 96-160) with $47 million (range $43 million-$53 million), annually. 
The predominant surveillance focus areas were respiratory infections, with 317 (34.4%) projects and $225 million, and febrile and vector-borne infections, with 274 (29.8%) projects and $45 million. The number of annual respiratory infections-related projects peaked in 2006 and 2009. The number of febrile and vector-borne infections projects increased from 29 projects in 2006 to 58 in 2012. There were 651 articles published in 147 different peer-reviewed journals, with an average Thomson Reuters 2-year journal impact factor of 4.2 (range 0.3-53.5). On average, 93 articles were published per year (range 67-117), at an average cost of about $510,000 per publication. Febrile and vector-borne, respiratory, and gastrointestinal infections had 287, 167, and 73 articles published, respectively. Of the 651 articles published, 585 (89.9%) were cited at least once (range 1-1,045). Institutions from 90 countries located in all six World Health Organization regions collaborated on surveillance projects. These findings summarize the GEIS-funded surveillance portfolio between 2006 and 2012 and demonstrate the scientific productivity and impact of the program in each of the six disease surveillance priority focus areas. GEIS might benefit from further financial investment in both the febrile and vector-borne and the sexually transmitted infections surveillance priority focus areas, and from increasing peer-reviewed publications of surveillance data derived from respiratory infections projects. Bibliometric indicators are useful for measuring scientific productivity and impact in surveillance systems, and this methodology can be used as a management tool to assess future changes to GEIS surveillance priorities. Additional metrics should be developed when peer-reviewed literature is not used to disseminate noteworthy accomplishments. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
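The headline bibliometric indicators quoted above can be reproduced from the reported totals; the computation below uses only figures stated in the abstract.

```python
# Consistency check of the abstract's bibliometric indicators from its
# reported totals: funding per publication and citation coverage.

total_funding = 330_000_000     # US$, GEIS funding 2006-2012 (from abstract)
publications = 651              # peer-reviewed articles (from abstract)
cited_at_least_once = 585       # articles cited >= 1 time (from abstract)

cost_per_publication = total_funding / publications
citation_coverage = cited_at_least_once / publications

print(round(cost_per_publication))   # ~507000, close to the quoted "$510,000 per publication"
print(f"{citation_coverage:.1%}")    # 89.9%, matching the abstract
```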
Nokleberg, Warren J.; Miller, Robert J.; Naumova, Vera V.; Khanchuk, Alexander I.; Parfenov, Leonid M.; Kuzmin, Mikhail I.; Bounaeva, Tatiana M.; Obolenskiy, Alexander A.; Rodionov, Sergey M.; Seminskiy, Zhan V.; Diggles, Michael F.
2003-01-01
This is the Web version of a CD-ROM publication. This report consists of summaries of the major compilations and syntheses accomplished in the six-year project through April 2003 for the study of the Mineral Resources, Metallogenesis, and Tectonics of Northeast Asia (Eastern and Southern Siberia, Mongolia, Northeastern China, South Korea, and Japan). The major scientific goals and benefits of the project are to: (1) provide a comprehensive international database on the mineral resources of the region that is the first extensive knowledge available in English; (2) provide major new interpretations of the origin and crustal evolution of mineralizing systems and their host rocks, thereby enabling enhanced, broad-scale tectonic reconstructions and interpretations; and (3) promote trade and scientific and technical exchanges between North America and Northeast Asia. Data from the project are providing sound scientific data and interpretations for commercial firms, governmental agencies, universities, and individuals that are developing new ventures and studies in the project area, and for land-use planning studies that deal with mineral potential issues. Northeast Asia has vast potential for known and undiscovered mineral deposits; however, little information existed in English in the West until publication of products from this project. Consequently, data and interpretations from the project are providing basic knowledge for major scientific, commercial, national, and international endeavors by other interested individuals and groups.
The PACA Project Ecology: Observing Campaigns, Outreach and Citizen Science
NASA Astrophysics Data System (ADS)
Yanamandra-Fisher, P. A.
2016-12-01
The PACA Project has three main components: observational campaigns aligned with scientific research; outreach to engage all forms of audiences; and citizen science projects that aim to produce specific scientific results by engaging professional and amateur scientific communities and a variety of audiences. The primary observational projects are defined by specific scientific goals set by professionals, resulting in global observing campaigns involving a variety of observers and observing techniques. Some of PACA's observing campaigns have included global characterization of comets (e.g., C/ISON, Siding Spring, 67P/Churyumov-Gerasimenko, Lovejoy) and planets (Jupiter, Saturn and Mars), and are currently expanding to include polarimetric exploration of solar system objects with small apertures and collaboration with CITIZEN CATE, a citizen science observing campaign to observe the 2017 Continental America Total Eclipse. Our outreach campaigns leverage multiple social media platforms for at least two important reasons: (i) the immediate dissemination of observations and interaction with the global network, and (ii) free or inexpensive resources for most of the participants. The use of social media is becoming prevalent in citizen science projects due to these factors. The final stage of the PACA ecosystem is the integration of these components into a publication. We shall highlight some of the interesting challenges and solutions of the PACA Project so far and provide a view of future projects in all three categories, with new partnerships and collaborations.
Homeland Security Technologies Creating an Asymmetric Advantage
2002-04-01
solution. Sustaining the Asymmetric Advantage: A Manhattan Project for the 21st Century. This nation’s greatest technological endeavors have been inspired... bomb stimulated the Manhattan Project, a top-secret engineering venture that engaged the best available scientific expertise and delivered a working... term Manhattan Project is now “a byword for an enormous breakneck effort involving vast resources and the best scientific minds in the world.” [15] The
Lidar system for air-pollution monitoring over urban areas
NASA Astrophysics Data System (ADS)
Moskalenko, Irina V.; Shcheglov, Djolinard A.; Molodtsov, Nikolai A.
1997-05-01
The atmospheric environmental situation over the urban area of a large city is determined by a complex combination of anthropogenic pollution and meteorological factors. An efficient way to provide three-dimensional mapping of gaseous pollutants over wide areas is to use lidar systems employing tunable narrowband transmitters. This paper describes the activity of the RRC 'Kurchatov Institute' in the field of lidar atmospheric monitoring. The project 'Mobile remote sensing system based on tunable laser transmitter for environmental monitoring' is being developed with financial support from the International Science and Technology Center (Moscow). The objective of the project is the design, construction and field testing of a system based on the DIAL technique. The lidar transmitter consists of an excimer-laser-pumped dye laser, a BBO crystal frequency doubler, and a scanning flat mirror. Sulfur dioxide and atomic mercury have been selected as pollutants for field tests of the lidar system under development. A recent large increase in Moscow traffic has also prompted consideration of remote sensing of lower-troposphere ozone because of the photochemical smog problem. The status of the project is briefly discussed. Current activity also includes the collection of environmental data relevant to lidar remote sensing, with particular attention to pollutant concentration levels over the city of Moscow and the Moscow district.
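The DIAL (differential absorption lidar) technique named above infers a mean gas concentration over a range cell from the ratio of backscatter returns at an absorbed ('on') and a nearby non-absorbed ('off') wavelength. A minimal sketch of the standard retrieval formula (the signal values and cross-section below are illustrative assumptions, not the Kurchatov system's parameters):

```python
import numpy as np

# Standard DIAL retrieval: the mean number density between ranges r1 and r2
# follows from the on/off-line return ratio:
#   N = 1 / (2 * dsigma * (r2 - r1)) * ln( P_off(r2) P_on(r1) / (P_off(r1) P_on(r2)) )

def dial_concentration(p_on_r1, p_on_r2, p_off_r1, p_off_r2, dsigma, dr):
    """Mean molecule number density (molecules/cm^3) over one range cell.

    dsigma: differential absorption cross-section (cm^2), dr: cell length (cm).
    """
    ratio = (p_off_r2 * p_on_r1) / (p_off_r1 * p_on_r2)
    return np.log(ratio) / (2.0 * dsigma * dr)

# Synthetic returns for a 100 m (1e4 cm) cell and an SO2-like differential
# cross-section of ~1e-18 cm^2 (order of magnitude only, hypothetical values).
n = dial_concentration(p_on_r1=1.0, p_on_r2=0.5, p_off_r1=1.0, p_off_r2=0.6,
                       dsigma=1e-18, dr=1e4)
```

The stronger on-line attenuation (0.5 vs 0.6) yields a positive concentration; in practice the returns would be range-corrected and averaged over many shots.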
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni
2017-04-01
In many scientific domains, such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and ecosystems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, which provides access to 2.5 PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate-model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (multi-terabyte scale) related to the output of several climate model simulations, as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets in the context of a large-scale distributed testbed across the EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general context of the case study involves multi-model data analysis intercomparison challenges, addressed on CMIP5 data made available through the IS-ENES/ESGF infrastructure.
The added value of the solution proposed in the INDIGO-DataCloud project is summarized as follows: (i) it implements a different paradigm (from client-side to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.), since everything is managed on the server side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a tool for scientists to run multi-model experiments; and (vii) it can drastically reduce the time-to-solution for these experiments, from weeks to hours. At the time of writing, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search and discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing and analytics requirements by providing a paradigm shift based on server-side, high-performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.
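The server-side paradigm in points (i) and (ii) means a reduction such as a precipitation trend runs where the data live and only a small 2-D product crosses the network. A minimal NumPy sketch of such a reduction (array shapes, units and values are illustrative assumptions, not the project's actual server-side operators):

```python
import numpy as np

def precipitation_trend(precip, years):
    """Least-squares linear trend per grid point for a (time, lat, lon) array.

    Returns a (lat, lon) array of slopes (precip units per year): the kind of
    small, server-side-computed product that replaces moving the full
    multi-terabyte dataset to the client.
    """
    nt, nlat, nlon = precip.shape
    # Flatten the spatial dims so one polyfit call fits every grid point.
    flat = precip.reshape(nt, nlat * nlon)
    slopes = np.polyfit(years, flat, deg=1)[0]   # row 0 holds the linear terms
    return slopes.reshape(nlat, nlon)

# Synthetic example: 30 years on a tiny 2x2 grid with a known +0.5 mm/yr trend.
years = np.arange(1980, 2010)
precip = 800.0 + 0.5 * (years - 1980)[:, None, None] * np.ones((30, 2, 2))
trend = precipitation_trend(precip, years)
```

On the trend map, only `nlat * nlon` floats leave the server, however long the time axis is.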
Le management des projets scientifiques
NASA Astrophysics Data System (ADS)
Perrier, Françoise
2000-12-01
We describe in this paper a new approach to the management of scientific projects. This approach is the result of a long reflection carried out within the MQDP (Methodology and Quality in Project Development) group of INSU-CNRS, and continued with Guy Serra. Our reflection began with the study of the so-called `North-American Paradigm', which was initially considered the only relevant management model. Through our active participation in several astrophysical projects, we realized that this model could not be applied to our laboratories without major modifications. Therefore, step by step, we have constructed our own methodology, making the fullest use of the human resources existing in our research field, with their habits and skills. We have also participated in various working groups in industrial and scientific organizations for the benefit of CNRS. The management model presented here is based on a systemic and complex approach, which lets us describe the multiple aspects of a scientific project, especially taking into account the human dimension. The project-system model includes three major interconnected systems, immersed within an environment that both influences and is influenced by them: the `System to be Realized', which defines the scientific and technical tasks leading to the scientific goals; the `Realizing System', which describes procedures, processes and organization; and the `Actors' System', which implements and boosts all the processes. Each of these systems exists only through a series of successive models, elaborated at predefined dates of the project called `key points'. The systems evolve with time and under often-unpredictable circumstances, and the models have to take this into account. At each key point, each model is compared to reality, and the difference between the predicted and realized tasks is evaluated in order to define the data for the next model. This model can be applied to any kind of project.
EuroStemCell: A European infrastructure for communication and engagement with stem cell research.
Barfoot, Jan; Doherty, Kate; Blackburn, C Clare
2017-10-01
EuroStemCell is a large and growing network of organizations and individuals focused on public engagement with stem cells and regenerative medicine, a fluid and contested domain where scientific, political, ethical, legal and societal perspectives intersect. Rooted in the European stem cell research community, this project has developed collaborative and innovative approaches to information provision and to direct and online engagement that reflect and respond to the dynamic growth of the field itself. EuroStemCell started as the communication and outreach component of a research consortium and subsequently continued as a stand-alone engagement initiative. The involvement of established European stem cell scientists has grown year on year, facilitating their participation in public engagement by allowing them to make high-value contributions with broad reach. The project has now had sustained support from partners and funders for over twelve years, and thus provides a model for longevity in public engagement efforts. This paper considers the evolution of the EuroStemCell project in response to, and in dialogue with, its evolving environment. In it, we aim to reveal the mechanisms and approaches taken by EuroStemCell, such that others within the scientific community can explore these ideas and be further enabled in their own public engagement endeavours. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)
NASA Astrophysics Data System (ADS)
Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.
2013-12-01
Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially the analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, the Automated Event Service (AES), pioneers the exploration of Big Data technologies for data-intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates the array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: a shared-nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set in a regular latitude-longitude grid. The first use case is the discovery and identification of blizzards using NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) data sets. The other finds diurnal signals over an 8-year period by correlating parameters retrieved from SSM/I data from three different instruments with different equator crossing times. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.
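To illustrate the kind of array query the blizzard use case runs, here is a minimal NumPy sketch of a coincidence test over a (time, lat, lon) grid. The variables and thresholds are hypothetical stand-ins, not the AES project's actual blizzard criteria, and a real deployment would push this predicate down into an array database like SciDB rather than evaluate it client-side:

```python
import numpy as np

def blizzard_mask(snowfall, wind_speed, snow_thresh=1.0, wind_thresh=15.0):
    """Flag grid cells meeting simplified blizzard-like criteria.

    snowfall: (time, lat, lon) snowfall rate (mm/h, hypothetical units)
    wind_speed: same shape, surface wind speed (m/s).
    The real criteria are more involved; this shows the element-wise
    coincidence query that shared-nothing array systems parallelize.
    """
    return (snowfall >= snow_thresh) & (wind_speed >= wind_thresh)

# Synthetic 4-timestep, 3x3 grid: one cell meets both criteria at t=2.
snow = np.zeros((4, 3, 3))
wind = np.zeros((4, 3, 3))
snow[2, 1, 1] = 2.5
wind[2, 1, 1] = 20.0
mask = blizzard_mask(snow, wind)
events = np.argwhere(mask)   # (t, i, j) indices of flagged cells
```

Because the test is element-wise, each node of a shared-nothing cluster can evaluate it on its own array chunks independently.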
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R R; Brugger, E; Cook, R
The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool and providing classes on how to use it to performing data analysis and visualization for customers. The Information Management and Graphics (IMG) Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large-scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects, including the development of visualization techniques for large-scale data exploration, that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls.
The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had 4 releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk-usage view, which works on all types of connections, and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to the development tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflect the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high-performance processors within a shared-memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL.
We recommend that the institution fully fund: (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
Management evolution in the LSST project
NASA Astrophysics Data System (ADS)
Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine
2010-07-01
The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry partners. The public-private collaboration aims to complete the estimated $450M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable, while the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced the project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools, for risk management, configuration control and resource-loaded scheduling, have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager with overall authority.
ERIC Educational Resources Information Center
Chace, Jameson F.
2014-01-01
Between 2007 and 2010, three types of semester research projects were assigned in BIO 140 Humans and Their Environment, a nonmajors introductory course at Salve Regina University. Specific environmental impact-type assessments were used to foster scientific inquiry and achieve higher scientific literacy. Quantitative and qualitative measurements…
Water-Resources Manpower: Supply and Demand Patterns to 1980.
ERIC Educational Resources Information Center
Lewis, James E.
Relating the supply of scientific manpower to the educational potential of the general population and the productive capacity of the educational system, this study disaggregates independent projections of scientific manpower supply and demand to yield projections for water resources manpower. This supply of engineers, natural scientists, and…
The Interior Columbia Basin Ecosystem Management Project: scientific assessment.
1999-01-01
This CD-ROM contains digital versions (PDF) of the major scientific documents prepared for the Interior Columbia Basin Ecosystem Management Project (ICBEMP). "A Framework for Ecosystem Management in the Interior Columbia Basin and Portions of the Klamath and Great Basins" describes a general planning model for ecosystem management. The "Highlighted...
Comparison of Scientific Research Projects of Education Faculties
ERIC Educational Resources Information Center
Altunay, Esen; Tonbul, Yilmaz
2015-01-01
Many studies indicate that knowledge and knowledge production are the main predictors of social development, welfare and the ability to face the future with confidence. It could be argued that knowledge production is mainly carried out by universities. This study compares 1266 scientific research projects (SRPs) completed by faculties of education…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amerio, S.; Behari, S.; Boyd, J.
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. Lastly, these efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.
2017-04-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
What Will Science Gain From Mapping the World Ocean Floor?
NASA Astrophysics Data System (ADS)
Jakobsson, M.
2017-12-01
It is difficult to estimate how much of the World Ocean floor topography (bathymetry) has been mapped. Estimates range from a few to more than ten percent of the World Ocean area. The most recent version of the bathymetric grid compiled by the General Bathymetric Chart of the Oceans (GEBCO) has bathymetric control points in 18% of its 30 x 30 arc-second grid cells. The depth values for the rest of the cells are obtained through interpolation guided by satellite altimetry in deep water. With this statistic at hand, it seems tenable to suggest that there are many scientific discoveries to be made from a complete high-resolution mapping of the World Ocean floor. In this presentation, some of our recent scientific discoveries based on modern multibeam bathymetric mapping will be highlighted and discussed: for example, how multibeam mapping provided evidence for a km-thick ice shelf covering the entire Arctic Ocean during peak glacial conditions, a hypothesis proposed nearly half a century ago, and how groundwater escape features are visible in high-resolution bathymetry of the Baltic Sea, with potential implications for the freshwater budget and the distribution of nutrients and pollutants. The examples presented will be placed in the context of mapping resolution, systematic surveys versus mapping along transits, and scientific-hypothesis-driven mapping versus ocean exploration. The newly announced Nippon Foundation - GEBCO Seabed 2030 project has the vision of having 100% of the World Ocean floor mapped by 2030. Are there specific scientific areas where we can expect new discoveries from the mapping data collected through the Seabed 2030 project? Are there outstanding hypotheses that can be tested against a fully mapped World Ocean floor?
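The 18% figure quoted above is a cell-occupancy statistic: divide the globe into 30 x 30 arc-second cells and count the fraction holding at least one sounding. A minimal sketch of that bookkeeping (the positions and the simple equirectangular binning are illustrative assumptions, not GEBCO's actual gridding code):

```python
import numpy as np

def mapped_fraction(lons, lats, cell_deg=30.0 / 3600.0):
    """Fraction of global grid cells (default 30 arc-second) that contain
    at least one depth sounding.

    lons, lats: 1-D arrays of sounding positions in degrees.
    """
    # Bin each sounding into an integer (i, j) cell index.
    i = np.floor((np.asarray(lons) + 180.0) / cell_deg).astype(np.int64)
    j = np.floor((np.asarray(lats) + 90.0) / cell_deg).astype(np.int64)
    n_total = int(round(360.0 / cell_deg)) * int(round(180.0 / cell_deg))
    n_hit = len(set(zip(i.tolist(), j.tolist())))
    return n_hit / n_total

# Three synthetic soundings: the first two fall in the same 30-arc-second
# cell, so only 2 of the ~933 million global cells count as "mapped".
frac = mapped_fraction(np.array([0.001, 0.002, 10.001]),
                       np.array([0.001, 0.002, 10.001]))
```

The denominator (43200 x 21600 cells) is why even millions of soundings can still leave most cells interpolated.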
Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing
Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong
2014-01-01
This paper presents further research on facilitating large-scale scientific computing on grid and desktop-grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data-anticipation migration. The block-based Gauss-Jordan algorithm is used, as a real example of large-scale scientific computing, to evaluate the issues presented above. The results show that the high-level program interface makes building complex scientific applications on a large-scale platform easier, though a little overhead is unavoidable. Also, the data-anticipation migration mechanism can improve the efficiency of a platform that needs to process big-data-based scientific applications. PMID:24574931
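The block-based Gauss-Jordan algorithm mentioned above inverts a matrix by repeatedly inverting a diagonal pivot block and updating the remaining blocks, which is what makes it decomposable into coarse-grained tasks for a (desktop) grid. A sequential NumPy sketch of the block elimination itself (no pivoting, so it assumes well-conditioned pivot blocks; the paper's grid-distributed version is not reproduced here):

```python
import numpy as np

def block_gauss_jordan_inverse(a, b):
    """Invert a square matrix by block Gauss-Jordan elimination.

    a: (n, n) array with n divisible by the block size b. Each iteration
    inverts the pivot block, rescales its block row, and updates every
    other block -- the per-block tasks a grid scheduler would distribute.
    """
    n = a.shape[0]
    p = n // b
    a = a.copy()
    blocks = lambda i, j: np.s_[i * b:(i + 1) * b, j * b:(j + 1) * b]
    for k in range(p):
        piv = np.linalg.inv(a[blocks(k, k)])          # invert pivot block
        a[blocks(k, k)] = piv
        for j in range(p):                            # rescale pivot row
            if j != k:
                a[blocks(k, j)] = piv @ a[blocks(k, j)]
        for i in range(p):                            # update other rows
            if i == k:
                continue
            aik = a[blocks(i, k)].copy()              # pre-update column block
            for j in range(p):
                if j != k:
                    a[blocks(i, j)] -= aik @ a[blocks(k, j)]
            a[blocks(i, k)] = -aik @ piv
    return a

# Sanity check on a small, well-conditioned SPD matrix (4x4, 2x2 blocks).
rng = np.random.default_rng(0)
m = rng.standard_normal((4, 4))
m = m @ m.T + 4 * np.eye(4)
inv = block_gauss_jordan_inverse(m, 2)
```

With b = 1 this reduces to the classical scalar Gauss-Jordan exchange algorithm; larger blocks trade finer parallelism for fewer, heavier tasks.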
Globalization and WMD Proliferation Networks: The Policy Landscape
2006-07-01
scientific advances, it moved to shut down this network by classifying all information relating to the Manhattan Project. This security action had only... As with the U.S. efforts during World War II to deny access to Manhattan Project... the scientific discoveries paving the way for the atomic bomb, as well as of the U.S. government’s subsequent classification of Manhattan Project information
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2013 CFR
2013-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2012 CFR
2012-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2014 CFR
2014-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
‘Sciencenet’—towards a global search and share engine for all scientific knowledge
Lütjohann, Dominic S.; Shah, Asmi H.; Christen, Michael P.; Richter, Florian; Knese, Karsten; Liebel, Urban
2011-01-01
Summary: Modern biological experiments create vast amounts of data which are geographically distributed. These datasets consist of petabytes of raw data and billions of documents. Yet, to the best of our knowledge, a search engine technology that searches and cross-links all the different data types in the life sciences does not exist. We have developed a prototype distributed scientific search engine technology, ‘Sciencenet’, which facilitates rapid searching over this large data space. By ‘bringing the search engine to the data’, we do not require server farms. This platform also allows users to contribute to the search index and publish their large-scale data to support e-Science. Furthermore, a community-driven method guarantees that only scientific content is crawled and presented. Our peer-to-peer approach is sufficiently scalable for the science web without performance or capacity tradeoffs. Availability and Implementation: The free-to-use search portal web page and the downloadable client are accessible at: http://sciencenet.kit.edu. The web portal for index administration is implemented in ASP.NET, the ‘AskMe’ experiment publisher is written in Python 2.7, and the backend ‘YaCy’ search engine is based on Java 1.6. Contact: urban.liebel@kit.edu Supplementary Material: Detailed instructions and descriptions can be found on the project homepage: http://sciencenet.kit.edu. PMID:21493657
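'Bringing the search engine to the data' means each peer indexes its own holdings and answers queries locally, so no central server farm is needed. A toy inverted index sketches that per-site building block (an illustration of the idea only, not Sciencenet/YaCy code; document ids and texts are hypothetical):

```python
from collections import defaultdict

class LocalIndex:
    """Toy inverted index: the per-peer building block of a distributed
    search design, where each site indexes its own data and serves matches.
    """
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of document ids
        self.docs = {}

    def add(self, doc_id, text):
        """Index a document under naive whitespace tokenization."""
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        """Return ids of documents containing every query term (AND query)."""
        terms = query.lower().split()
        if not terms:
            return set()
        result = set(self.postings.get(terms[0], set()))
        for term in terms[1:]:
            result &= self.postings.get(term, set())
        return result

idx = LocalIndex()
idx.add("exp-001", "zebrafish embryo confocal imaging dataset")
idx.add("exp-002", "zebrafish genome sequencing raw reads")
hits = idx.search("zebrafish dataset")   # only exp-001 matches both terms
```

In the peer-to-peer setting, a query would fan out to many such local indexes and the small id sets, not the raw data, would travel back.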
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir" ("to come from"), describes the lineage or history of a data entity. Provenance is critical in scientific applications for verifying experimental processes, validating data quality, and associating trust values with scientific results. Current industrial-scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable the analysis of large-scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms that support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, developed as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T. cruzi). This provenance infrastructure, called the T. cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called the Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views, called materialized provenance views (MPV), to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance of scientific results for publication in the literature.
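The materialized-view idea behind MPV can be sketched in plain Python: instead of re-deriving an entity's full lineage on every query, a frequently used ancestry view is precomputed once and subsequent queries become simple lookups. This is a conceptual illustration of materialized provenance views only, not PMS or Parasite Experiment ontology code; the provenance graph and names are invented.

```python
# Toy provenance graph: each data entity maps to the entities it was derived from.
derived_from = {
    "figure":  ["stats"],
    "stats":   ["cleaned"],
    "cleaned": ["raw_run1", "raw_run2"],
}

def lineage(entity, graph):
    """Recursively collect the full ancestry of an entity (on-demand query)."""
    parents = graph.get(entity, [])
    result = set(parents)
    for p in parents:
        result |= lineage(p, graph)
    return result

def materialize(graph):
    """Precompute the lineage of every derived entity once (the materialized view)."""
    return {e: lineage(e, graph) for e in graph}

view = materialize(derived_from)
# Queries against the view are now single dictionary lookups.
assert view["figure"] == {"stats", "cleaned", "raw_run1", "raw_run2"}
```

A real provenance store would express this over RDF with SPARQL rather than dictionaries, but the trade-off is the same: storage and maintenance cost for the view in exchange for query time that no longer grows with lineage depth.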
magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation
NASA Astrophysics Data System (ADS)
Angleraud, Christophe
2014-06-01
The ever-increasing amount of data and processing capability, following the well-known Moore's law, is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, Geographic Information Systems allow visually appealing maps to be built, but these often become cluttered as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, allowing the analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques in which spatio-temporal points of interest are detected through the human brain's integration of moving images. Magellium has been involved in high-performance image-processing chains for satellite imagery, as well as scientific signal analysis and geographic information management, since its creation in 2003. We believe that recent work on big data, GPUs, and peer-to-peer collaborative processing can enable a breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping, and understanding. The magHD (Magellium Hyper-Dimension) project aims to develop software solutions that bring highly interactive tools for the analysis and exploration of complex datasets to commodity hardware, targeting small- to medium-scale clusters with expansion capabilities to large cloud-based clusters.
The Moon in the Russian scientific-educational project: Kazan-GeoNa-2010
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.; Petrova, N.
Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have played the role of scientific-organizational and cultural-educational center of the Volga region of Russia. For the further successful development of educational and scientific activity in the Russian Federation and the Republic of Tatarstan, Kazan proposes a national project: the International Center of Science and Internet Technologies GeoNa (Geometry of Nature), embodying wisdom, enthusiasm, pride, and grandeur. GeoNa includes a modern complex of conference halls seating up to 4,000, an Internet technologies center, a 3D planetarium devoted to the exploration of the Moon, PhysicsLand (an active museum of the natural sciences), an oceanarium, the training complex Spheres of Knowledge, and botanical and landscape oases. The GeoNa center will host conferences and congresses, fundamental scientific research on the Moon, scientific-educational events, presentations of international scientific programs on lunar research, modern lunar databases, exhibitions of high-tech equipment, and extensive cultural-educational, tourist, and cognitive programs. GeoNa will enable scientists and teachers of Russian universities to engage with advanced achievements of science and information technology and to establish scientific collaborations with foreign colleagues on high-technology and educational projects with the world's space centers.
75 FR 39548 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-09
...: Center for Scientific Review Special Emphasis Panel; Program Project: NeuroAIDS. Date: August 4-5, 2010... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: AIDS Molecular Biology and...
Planet X probe: A fresh new look at an old familiar place
NASA Technical Reports Server (NTRS)
Nicholson, James; Obrien, Tom; Brower, Sharon; Canright, Shelley
1988-01-01
Planet X Probe utilizes a Get Away Special (GAS) payload to provide a large student population with a remote Earth-sensing experimental package. To provide a cooperative as well as a competitive environment, the effort is targeted at all grade levels and at schools in different geographical regions. LANDSAT capability allows students to investigate the Earth, its physical makeup, its resources, and the impact of man. This project also serves as an educational device to get students to stand back and take a fresh look at their home planet. The key element is to treat the familiar Earth as an unknown planet, with knowledge based only on what is observable and provable from the images obtained. Through participation, students gain a whole range of experiences, including: (1) mission planning; (2) research and pilot projects to train teams; (3) identification and recruitment of scientific mentors and dialogue; (4) selection of a student advisory team to be available during the mission; (5) analysis of data and compilation of findings; (6) report preparation, constructed along sound scientific principles; and (7) presentation and defense of findings before a meeting of competing student groups and scientists in the field.
BTDI detector technology for reconnaissance application
NASA Astrophysics Data System (ADS)
Hilbert, Stefan; Eckardt, Andreas; Krutz, David
2017-11-01
The Institute of Optical Sensor Systems (OS) at the Robotics and Mechatronics Center of the German Aerospace Center (DLR) has more than 30 years of experience with high-resolution imaging technology. This paper presents the institute's scientific results on leading-edge detector design in a BTDI (Bidirectional Time Delay and Integration) architecture. The project demonstrates a proven technological design for high-resolution or multi-spectral spaceborne instruments. DLR OS and BAE Systems have been driving the technology of new detectors and FPA designs for future projects, with new manufacturing accuracy, in order to keep pace with ambitious scientific and user requirements. Driven by customer requirements and available technologies, the current generation of spaceborne sensor systems focuses on VIS/NIR high spectral resolution to meet the requirements of Earth and planetary observation systems. The combination of a large swath and high spectral resolution with intelligent control applications and new focal plane concepts opens the door to new remote sensing and smart deep-space instruments. The paper gives an overview of the detector development and verification program at DLR at the detector module level, covering key parameters such as SNR, linearity, spectral response, quantum efficiency, PRNU, DSNU, and MTF.
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis, and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims to enable broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the end-to-end application service and virtualized computing framework of HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping, and visualization, and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data, without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that drive the software development, and will seek feedback from the user communities.
Lam, Tram Kim; Schully, Sheri D; Rogers, Scott D; Benkeser, Rachel; Reid, Britt; Khoury, Muin J
2013-04-01
In a time of scientific and technological development and budgetary constraint, the National Cancer Institute's (NCI) Provocative Questions Project offers a novel funding mechanism for cancer epidemiologists. We review the purposes underlying the Provocative Questions Project, present information on the contributions of epidemiologic research to the current Provocative Questions portfolio, and outline opportunities that the cancer epidemiology community might capitalize on to advance a research agenda that spans a translational continuum from scientific discovery to population health impact.
Legal & ethical compliance when sharing biospecimen.
Klingstrom, Tomas; Bongcam-Rudloff, Erik; Reichel, Jane
2018-01-01
When obtaining samples from biobanks, resolving ethical and legal concerns is a time-consuming task in which researchers need to balance the needs of privacy, trust, and scientific progress. The Biobanking and Biomolecular Resources Research Infrastructure-Large Prospective Cohorts project has resolved numerous such issues through intense communication between the researchers and experts involved in its mission to unite large prospective study sets in Europe. To facilitate efficient communication, it is useful for nonexperts to have at least a basic understanding of the regulatory system for managing biological samples. Laws regulating research oversight are based on national law and normally share core principles founded on international charters. In interview studies among donors, the chief concerns are privacy, efficient sample utilization, and access to information generated from their samples. Despite a lack of clear evidence as to which concern takes precedence, scientific as well as public discourse has largely focused on privacy concerns and the right of donors to control the usage of their samples. It is therefore important to deal proactively with ethical and legal issues to avoid complications that delay or prevent samples from being accessed. To help biobank professionals avoid unnecessary mistakes, we have developed this basic primer covering the relationship between ethics and law, the concept of informed consent, and considerations for returning findings to donors. © The Author 2017. Published by Oxford University Press.
Governance of the International Linear Collider Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, B.; /Oxford U.; Barish, B.
Governance models for the International Linear Collider Project are examined in the light of experience from similar international projects around the world. Recommendations for one path which could be followed to realize the ILC successfully are outlined. The International Linear Collider (ILC) is a unique endeavour in particle physics; fully international from the outset, it has no 'host laboratory' to provide infrastructure and support. The realization of this project therefore presents unique challenges, in scientific, technical and political arenas. This document outlines the main questions that need to be answered if the ILC is to become a reality. It describes the methodology used to harness the wisdom displayed and lessons learned from current and previous large international projects. From this basis, it suggests both general principles and outlines a specific model to realize the ILC. It recognizes that there is no unique model for such a laboratory and that there are often several solutions to a particular problem. Nevertheless it proposes concrete solutions that the authors believe are currently the best choices in order to stimulate discussion and catalyze proposals as to how to bring the ILC project to fruition. The ILC Laboratory would be set up by international treaty and be governed by a strong Council to whom a Director General and an associated Directorate would report. Council would empower the Director General to give strong management to the project. It would take its decisions in a timely manner, giving appropriate weight to the financial contributions of the member states. The ILC Laboratory would be set up for a fixed term, capable of extension by agreement of all the partners. The construction of the machine would be based on a Work Breakdown Structure and value engineering and would have a common cash fund sufficiently large to allow the management flexibility to optimize the project's construction.
Appropriate contingency, clearly apportioned at both a national and global level, is essential if the project is to be realized. Finally, models for running costs and decommissioning at the conclusion of the ILC project are proposed. This document represents an interim report of the bodies and individuals studying these questions inside the structure set up and supervised by the International Committee for Future Accelerators (ICFA). It represents a request for comment to the international community in all relevant disciplines: scientific, technical and, most importantly, political. Many areas require further study and some, in particular the site selection process, have not yet progressed sufficiently to be addressed in detail in this document. Discussion raised by this document will be vital in framing the final proposals due to be published in 2012 in the Technical Design Report being prepared by the Global Design Effort of the ILC.
Citizen Science - What's policy got to do with it? (Invited)
NASA Astrophysics Data System (ADS)
Shanley, L.
2013-12-01
Sensing capabilities, computing power, and data storage have grown rapidly and become increasingly ubiquitous. In 2012, the number of smartphones worldwide topped one billion, and it is expected to double by 2015. A growing segment of the population now has the ability to collect and share information instantly. Social media and crowdsourcing platforms help to amplify and focus online information sharing and collaboration. We have seen exciting uses of these new tools and approaches to foster broad public participation in scientific research, from classifying galaxies and collecting environmental data to collectively solving the structure of an AIDS-related enzyme through a protein-folding game. The U.S. Geological Survey (USGS), for example, is using social media and crowdsourcing to learn more about earthquakes. These techniques provide inexpensive and rapid data to augment and extend the capabilities provided by traditional monitoring techniques. A new report by the Wilson Center, Transforming Earthquake Detection and Science Through Citizen Seismology, describes these groundbreaking citizen science projects. These efforts include the Tweet Earthquake Dispatch, which uses an algorithm to provide seismologists with initial alerts of earthquakes felt around the globe via Twitter in less than two minutes. The report also examines the Quake Catcher Network, which equips the public with low-cost sensors to collect information on seismic activity, as well as Did You Feel It, which uses the Internet to survey individuals about their experiences in earthquakes, including location and extent of the damage. Projects like these, however, do not happen overnight. Citizen-based science projects at the federal level must navigate a web of practical, legal and policy considerations to make them a reality.
Projects must take into account the limitations of the Privacy Act, advising people on how the information they contribute might be used and respecting fair information practices. They also must address the Paperwork Reduction Act, receiving Office of Management and Budget approval before beginning information collection. This presentation will examine many of these legal and policy issues, and provide lessons learned so that others may apply them to their unique missions. The opportunity for large-scale public contributions is great. With planning and support, citizen science could improve our scientific enterprise, facilitate greater public awareness and understanding of scientific issues, and change how the public interacts with government and the scientific community.
Scientific Data Purchase Project Overview Presentation
NASA Technical Reports Server (NTRS)
Holekamp, Kara; Fletcher, Rose
2001-01-01
The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3 year purchase/distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.
“Brevity is the Soul of Wit”: Use of a Stepwise Project to Teach Concise Scientific Writing
Cyr, Nicole E.
2017-01-01
Skillful writing is essential for professionals in science and medicine. Consequently, many undergraduate institutions have adjusted their curriculum to include in-depth instruction and practice in writing for students majoring in the sciences. In neuroscience, students are often asked to write a laboratory report in the style of a primary scientific article or a term paper structured like a review article. Typically, students write section by section and build up to the final draft of a complete paper. In this way, students learn how to write a scientific paper. While learning to write such a paper is important, this is not the only type of written communication relevant to scientific careers. Here, I describe a stepwise writing project aimed to improve editing, succinctness, and the ability to synthesize the literature. Furthermore, I provide feedback from the students, and discuss the advantages and challenges of this project. PMID:29371841
Python-based geometry preparation and simulation visualization toolkits for STEPS
Chen, Weiliang; De Schutter, Erik
2014-01-01
STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754
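The core of Gillespie's SSA, which STEPS extends spatially, can be summarized in a short well-mixed (non-spatial) sketch. This is a generic direct-method implementation for illustration only, not STEPS code or its Python API; the reaction system shown is invented.

```python
import math
import random

def gillespie_direct(x, rates, stoich, t_end, rng=random.Random(42)):
    """Direct-method SSA for a well-mixed system.

    x      -- list of species counts (mutated in place)
    rates  -- list of functions mapping state -> propensity
    stoich -- per-reaction list of species-count changes
    """
    t = 0.0
    while t < t_end:
        props = [r(x) for r in rates]
        a0 = sum(props)
        if a0 == 0.0:                        # no reaction can fire
            break
        t += -math.log(rng.random()) / a0    # exponential waiting time
        # Choose a reaction with probability proportional to its propensity.
        pick, acc = rng.random() * a0, 0.0
        for j, a in enumerate(props):
            acc += a
            if pick <= acc:
                for i, d in enumerate(stoich[j]):
                    x[i] += d
                break
    return t, x

# Irreversible decay A -> B with rate constant 0.1 per unit time.
t, state = gillespie_direct(
    x=[100, 0],
    rates=[lambda s: 0.1 * s[0]],
    stoich=[[-1, +1]],
    t_end=50.0,
)
```

STEPS replaces the single well-mixed volume assumed here with a tetrahedral mesh, adding diffusion events between mesh compartments to the reaction channels.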
Leveraging Python Interoperability Tools to Improve Sapphire's Usability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gezahegne, A; Love, N S
2007-12-10
The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
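As a much lighter-weight alternative to the wrapper generators evaluated for Sapphire, the general pattern of exposing compiled C/C++ routines to Python can be shown with the standard-library ctypes module. The sketch below wraps the C math library's cos rather than Sapphire itself, whose C++ API is not reproduced here; the library lookup assumes a Unix-like system.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm); name resolution is platform-dependent.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double cos(double). Without this, ctypes would
# default to int arguments/results and silently corrupt the values.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

def cos(x: float) -> float:
    """Thin Python wrapper over the compiled routine."""
    return libm.cos(x)
```

Tools such as SWIG or Boost.Python automate exactly this kind of signature declaration at scale, which is what makes them attractive for a large C++ library like Sapphire's.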
Astrophysics and Cosmology: International Partnerships
NASA Astrophysics Data System (ADS)
Blandford, Roger
2016-03-01
Most large projects in astrophysics and cosmology are international. This raises many challenges including: --Aligning the sequence of: proposal, planning, selection, funding, construction, deployment, operation, data mining in different countries --Managing to minimize cost growth through reconciling different practices --Communicating at all levels to ensure a successful outcome --Stabilizing long term career opportunities. There has been considerable progress in confronting these challenges. Lessons learned from past collaborations are influencing current facilities but much remains to be done if we are to optimize the scientific and public return on the expenditure of financial and human resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Supinski, B.; Caliga, D.
2017-09-28
The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array (FPGA)-based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory-accessing technology represents an important step toward making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.
C3: A Collaborative Web Framework for NASA Earth Exchange
NASA Astrophysics Data System (ADS)
Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.
2010-12-01
The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an online collaborative environment for sharing Earth science models, data, analysis tools, and scientific results among researchers. In addition, the NEX portal serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic areas of interest, fields of study, etc. Features of the NEX web portal include member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs), and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov, one of NASA's first science social media websites. The core component of the web portal is the C3 framework, which was built using Django and is being deployed as a common framework for a number of collaborative sites throughout NASA.
ASSESSMENT OF MAST IN EUROPEAN PATIENT-CENTERED TELEMEDICINE PILOTS.
Ekeland, Anne Granstrøm; Grøttland, Astrid
2015-01-01
The Model for ASsessment of Telemedicine Applications (MAST) is a health technology assessment (HTA)-inspired framework for assessing the effectiveness and contribution to quality of telemedicine applications on the basis of rigorous scientific data. This paper reports a study of how MAST was used and perceived in twenty-one pilots of the European project RENEWING HEALTH (RH). The objectives of RH were to implement large-scale, real-life test beds for the validation and subsequent evaluation of innovative patient-centered telemedicine services. The study is a contribution to the appraisal of HTA methods. A questionnaire was administered to the project leaders of the pilots. It included questions about the use and usefulness of MAST for (i) preceding considerations, (ii) evaluation of outcomes within seven domains, and (iii) considerations of transferability. Free-text spaces allowed respondents to propose improvements. The responses covered all pilots. A quantitative summary of use and a qualitative analysis of usefulness were performed. MAST was used and considered useful for pilot evaluations. Challenges included difficulty in scientifically determining alternative service options and outcomes within the seven domains. Proposals for improvement included process studies and adding the domains of technological usability, responsible innovation, health literacy, behavior change, caregiver perspectives, and motivational issues of professionals. MAST was used according to its structure. Its usefulness in patient-centered pilots can be improved by adding new stakeholder groups. Interdependencies between scientific rigor, resources, and timeliness should be addressed. Operational options for improvement include process studies, literature reviews, and sequential mini-HTAs to identify areas for more elaborate investigation.
77 FR 59198 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Drug Discovery for the... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Program Projects...
Pervasive healthcare as a scientific discipline.
Bardram, J E
2008-01-01
The OECD countries face a set of core challenges: an increasing elderly population; an increasing number of chronic and lifestyle-related diseases; the expanding scope of what medicine can do; and an increasing shortage of medical professionals. Pervasive healthcare asks how pervasive computing technology can be designed to meet these challenges. The objective of this paper is to discuss 'pervasive healthcare' as a research field and to establish how novel and distinct it is compared to related work in biomedical engineering, medical informatics, and ubiquitous computing. The paper presents the research questions, approaches, technologies, and methods of pervasive healthcare and compares them to those of related scientific disciplines. A set of central research themes is presented: monitoring and body sensor networks; pervasive assistive technologies; pervasive computing for hospitals; and preventive and persuasive technologies. Two projects illustrate the kind of research being done in pervasive healthcare. The first is targeted at home-based monitoring of hypertension; the second designs context-aware technologies for hospitals. Both projects approach the healthcare challenges in a new way, apply a new type of research method, and arrive at new kinds of technological solutions. 'Clinical proof-of-concept' is recommended as a new method for pervasive healthcare research; it helps design and test pervasive healthcare technologies and ascertain their clinical potential before large-scale clinical tests are needed. The paper concludes that pervasive healthcare as a research field and agenda is novel: it addresses new emerging research questions, represents a novel approach, designs new types of technologies, and applies a new kind of research method.
How to Grow Project Scientists: A Systematic Approach to Developing Project Scientists
NASA Technical Reports Server (NTRS)
Kea, Howard
2011-01-01
The Project Manager is one of the key individuals who can determine the success or failure of a project. NASA is fully committed to the training and development of project managers across the agency to ensure that highly capable individuals are equipped with the competencies and experience to successfully lead a project. An equally critical position is that of the Project Scientist. The Project Scientist provides the scientific leadership necessary for the scientific success of a project by ensuring that the mission meets or exceeds its scientific requirements. Traditionally, NASA Goddard project scientists were appointed and approved by the Center Science Director based on their knowledge, experience, and other qualifications. However, the process for obtaining the necessary knowledge, skills, and abilities was not documented or carried out in a systematic way. NASA Goddard's current Science Director, Nicholas White, saw the need to create a pipeline for developing new project scientists, and appointed a team to develop a process for training potential project scientists. The team members were Dr. Harley Thronson, Chair; Dr. Howard Kea; Mr. Mark Goldman, DACUM facilitator; and the late Dr. Michael VanSteenberg. The DACUM process, an occupational analysis and evaluation system, was used to produce a picture of the project scientist's duties, tasks, knowledge, and skills. The output resulted in a 3-day introductory course detailing all the knowledge, skills, and abilities a scientist must develop over time to qualify for selection as a Project Scientist.
Reading, Writing, and Conducting Inquiry about Science in Kindergarten
ERIC Educational Resources Information Center
Patrick, Helen; Mantzicopoulos, Panayota; Samarapungavan, Ala
2009-01-01
Over the past three years, the authors have worked with kindergarten teachers to develop study units with sequences of integrated science inquiry and literacy activities appropriate for kindergartners. Their work, which is part of the Scientific Literacy Project, has been very successful. The success of the Scientific Literacy Project (SLP) is in…
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Norvig, Peter (Technical Monitor)
2000-01-01
NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; and 3) monitoring and controlling semi-autonomous remote experimentation.
42 CFR 52h.8 - What are the review criteria for grants?
Code of Federal Regulations, 2010 CFR
2010-10-01
... research, from a scientific or technical standpoint; (b) The adequacy of the approach and methodology... SCIENTIFIC PEER REVIEW OF RESEARCH GRANT APPLICATIONS AND RESEARCH AND DEVELOPMENT CONTRACT PROJECTS § 52h.8... review group shall assess the overall impact that the project could have on the research field involved...
76 FR 13195 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
...: Center for Scientific Review Special Emphasis Panel; Program Project: Presynaptic Mechanisms of Neural...: AIDS/HIV Innovative Research Applications. Date: March 30, 2011. Time: 11 a.m. to 7 p.m. Agenda: To... Special Emphasis Panel; Program Project: Mitochondrial Metabolism. Date: April 4-5, 2011. Time: 8 a.m. to...
Prime the Pipeline Project (P[cube]): Putting Knowledge to Work
ERIC Educational Resources Information Center
Greenes, Carole; Wolfe, Susan; Weight, Stephanie; Cavanagh, Mary; Zehring, Julie
2011-01-01
With funding from NSF, the Prime the Pipeline Project (P[cube]) is responding to the need to strengthen the science, technology, engineering, and mathematics (STEM) pipeline from high school to college by developing and evaluating the scientific village strategy and the culture it creates. The scientific village, a community of high school…
Project Citizen: Promoting Action-Oriented Citizen Science in the Classroom
ERIC Educational Resources Information Center
Green, Carie; Medina-Jerez, William
2012-01-01
In recent years, citizen science projects have emerged as a means to involve students in scientific inquiry, particularly in the fields of ecology and environmental science. A citizen scientist is "a volunteer who collects and/or processes data as part of a scientific inquiry" (Silverton 2009, p. 467). Participation in citizen science…
ERIC Educational Resources Information Center
Hartt, Richard W.
This report discusses the characteristics, operations, and automation requirements of technical libraries providing services to organizations involved in aerospace and defense scientific and technical work, and describes the Local Automation Model project. This on-going project is designed to demonstrate the concept of a fully integrated library…
Reports of planetary astronomy - 1991
NASA Technical Reports Server (NTRS)
Rahe, Jurgen (Editor)
1993-01-01
This publication provides information about currently funded scientific research projects conducted in the Planetary Astronomy Program during 1991, and consists of two main sections. The first section gives a summary of research objectives, past accomplishments, and projected future investigations, as submitted by each principal investigator. In the second section, recent scientifically significant accomplishments within the Program are highlighted.
Inferring cortical function in the mouse visual system through large-scale systems neuroscience.
Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof
2016-07-05
The scientific mission of Project MindScope is to understand the neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that the neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We focus on the mouse as a mammalian model organism whose genetics, physiology, and behavior can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. Here we focus on the contribution that computer modeling and theory make to this long-term effort.
Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson
2012-01-01
To propose a measure (index) of expected risks for evaluating and following up the performance of research projects, incorporating financial and structural parameters relevant to their development. A ranking of acceptable results for research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publications, fundraising, and patent registry. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analyses that characterizes risk with a modeling tool combining multiple variables interacting in semi-quantitative environments. The method was tested with data from three different projects in our institution (projects A, B and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) relative to its development and expected results given initial or full investment. The results showed that this model contributes significantly to risk analysis and planning, as well as to defining the necessary investments, considering contingency actions that benefit the different stakeholders: the investor or donor, the project manager, and the researchers.
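The ulcer index the abstract builds on comes from finance: it is the root-mean-square percentage drawdown of a series from its running peak, so deep or prolonged dips inflate the index while steady growth keeps it near zero. A minimal sketch of the underlying calculation (the function name and example series are illustrative, not taken from the RoSI paper):

```python
import math

def ulcer_index(series):
    """Root-mean-square percentage drawdown from the running peak.

    A series that stays near its best-so-far level scores near zero;
    sharp dips (e.g. cost overruns or funding gaps in a project
    performance series) drive the index up.
    """
    peak = series[0]
    squared_drawdowns = []
    for value in series:
        peak = max(peak, value)
        drawdown_pct = 100.0 * (value - peak) / peak  # always <= 0
        squared_drawdowns.append(drawdown_pct ** 2)
    return math.sqrt(sum(squared_drawdowns) / len(squared_drawdowns))

# Hypothetical cumulative performance scores for two projects:
steady = [100, 102, 101, 105, 107]
volatile = [100, 80, 95, 70, 110]
print(ulcer_index(steady) < ulcer_index(volatile))  # steadier series scores lower
```

A flat series scores exactly zero, which is why the paper can use the index as a comparative risk ranking across projects with otherwise incommensurable variables.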
Large space telescope, phase A. Volume 4: Scientific instrument package
NASA Technical Reports Server (NTRS)
1972-01-01
The design and characteristics of the scientific instrument package for the Large Space Telescope are discussed. The subjects include: (1) general scientific objectives, (2) package system analysis, (3) scientific instrumentation, (4) imaging photoelectric sensors, (5) environmental considerations, and (6) reliability and maintainability.
Achievements of ATS-6 beacon experiment over Indian sub-continent
NASA Technical Reports Server (NTRS)
Deshpande, M. R.; Rastogi, R. G.; Vats, H. O.; Sethia, G.; Chandra, H.; Davies, K.; Grubb, R. N.; Jones, J. E.
1978-01-01
The repositioning of the ATS-6 satellite at 34 deg E enabled the scientific community of India to use the satellite's radio beacon for ionospheric studies. Two scientific projects were undertaken. The objective of the first project was to map ionospheric electron content, range rate errors, traveling ionospheric phenomena, solar flare effect, and magnetic phenomena. The second project was aimed at studying geophysical phenomena associated with the equatorial electrojet. The principal results of these studies are described.
NASA Astrophysics Data System (ADS)
Zambito, Anna Maria; Curcio, Francesco; Meli, Antonella; Saverio Ambesi-Impiombato, Francesco
The "MoMa" project, "From Molecules to Man: Space Research Applied to the Improvement of the Quality of Life of the Ageing Population on Earth," started on June 16, 2006 and finished on schedule on June 25, 2009. It has been the biggest of the three projects funded by ASI in the "Medicine and Biotechnology" sector. In recent years the scientific community had formed a national chain of biomedical space research spanning different research areas. MoMa responded to the need to unify within ASI the two areas "Radiobiology and Protection" and "Cellular and Molecular Biotechnology" into a joint line of research, "Biotechnological Applications," where the interests of all groups would be combined and unified toward a goal of social relevance. MoMa is the largest project ever developed in the biomedical area in Italy. The idea was born from the phenomenon of accelerated aging observed in space, already described in the literature, with the aim of studying the effects of the space environment at the cellular, molecular, and whole-organism level. MoMa was divided into three primary areas of study, Molecules, Cells, and Man, with an industrial area alongside. This made it possible to optimize the work and information flows within culturally homogeneous areas of scientific research and allowed seamless industrial integration in a project of great scientific importance. Within the three scientific areas, 10 scientific lines were identified in total, each coordinated by a subcontractor. The rapid and efficient exchange of information between the different scientific areas and the development of industrial applications in the various areas of interest were assured by strong scientific coordination, systems engineering, and quality control.
After three years of intense and coordinated activity within the MoMa project, the objectives achieved are very significant, not only as regards the scientific results and the important hardware produced but also with regard to employment targets, with the award of approximately 250 scholarships for researchers and doctoral students and financing for Italian industries and SMEs. The MoMa scientific and industrial community is aware that such an important and challenging project cannot simply expire, and it is now ready to take advantage of the huge potential gained to compete successfully at the international level in this new phase of space exploration.
Silventoinen, Karri; Jelenkovic, Aline; Sund, Reijo; Honda, Chika; Aaltonen, Sari; Yokoyama, Yoshie; Tarnoki, Adam D; Tarnoki, David L; Ning, Feng; Ji, Fuling; Pang, Zengchang; Ordoñana, Juan R; Sánchez-Romera, Juan F; Colodro-Conde, Lucia; Burt, S Alexandra; Klump, Kelly L; Medland, Sarah E; Montgomery, Grant W; Kandler, Christian; McAdams, Tom A; Eley, Thalia C; Gregory, Alice M; Saudino, Kimberly J; Dubois, Lise; Boivin, Michel; Haworth, Claire M A; Plomin, Robert; Öncel, Sevgi Y; Aliev, Fazil; Stazi, Maria A; Fagnani, Corrado; D'Ippolito, Cristina; Craig, Jeffrey M; Saffery, Richard; Siribaddana, Sisira H; Hotopf, Matthew; Sumathipala, Athula; Spector, Timothy; Mangino, Massimo; Lachance, Genevieve; Gatz, Margaret; Butler, David A; Bayasgalan, Gombojav; Narandalai, Danshiitsoodol; Freitas, Duarte L; Maia, José Antonio; Harden, K Paige; Tucker-Drob, Elliot M; Christensen, Kaare; Skytthe, Axel; Kyvik, Kirsten O; Hong, Changhee; Chong, Youngsook; Derom, Catherine A; Vlietinck, Robert F; Loos, Ruth J F; Cozen, Wendy; Hwang, Amie E; Mack, Thomas M; He, Mingguang; Ding, Xiaohu; Chang, Billy; Silberg, Judy L; Eaves, Lindon J; Maes, Hermine H; Cutler, Tessa L; Hopper, John L; Aujard, Kelly; Magnusson, Patrik K E; Pedersen, Nancy L; Aslan, Anna K Dahl; Song, Yun-Mi; Yang, Sarah; Lee, Kayoung; Baker, Laura A; Tuvblad, Catherine; Bjerregaard-Andersen, Morten; Beck-Nielsen, Henning; Sodemann, Morten; Heikkilä, Kauko; Tan, Qihua; Zhang, Dongfeng; Swan, Gary E; Krasnow, Ruth; Jang, Kerry L; Knafo-Noam, Ariel; Mankuta, David; Abramson, Lior; Lichtenstein, Paul; Krueger, Robert F; McGue, Matt; Pahlen, Shandell; Tynelius, Per; Duncan, Glen E; Buchwald, Dedra; Corley, Robin P; Huibregtse, Brooke M; Nelson, Tracy L; Whitfield, Keith E; Franz, Carol E; Kremen, William S; Lyons, Michael J; Ooki, Syuichi; Brandt, Ingunn; Nilsen, Thomas Sevenius; Inui, Fujio; Watanabe, Mikio; Bartels, Meike; van Beijsterveldt, Toos C E M; Wardle, Jane; Llewellyn, Clare H; Fisher, Abigail; Rebato, 
Esther; Martin, Nicholas G; Iwatani, Yoshinori; Hayakawa, Kazuo; Rasmussen, Finn; Sung, Joohon; Harris, Jennifer R; Willemsen, Gonneke; Busjahn, Andreas; Goldberg, Jack H; Boomsma, Dorret I; Hur, Yoon-Mi; Sørensen, Thorkild I A; Kaprio, Jaakko
2015-08-01
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
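The classical twin design sketched in this abstract estimates heritability from the gap between monozygotic and dizygotic twin correlations. The textbook first approximation is Falconer's formula, h² = 2(r_MZ − r_DZ); large consortium studies like this one fit more elaborate structural models, so the sketch below is only the back-of-envelope version, with invented correlations:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate: h2 = 2 * (r_MZ - r_DZ).

    Under the classical ACE assumptions, MZ twins share all additive
    genetic effects and DZ twins share half of them on average, so
    doubling the gap between the intraclass correlations isolates the
    additive genetic component of variance.
    """
    return 2.0 * (r_mz - r_dz)

def shared_environment_c2(r_mz, r_dz):
    """c2 = 2 * r_DZ - r_MZ: the common-environment share under ACE."""
    return 2.0 * r_dz - r_mz

# Illustrative intraclass correlations for adult height
# (invented for the example, not results from this project):
r_mz, r_dz = 0.90, 0.50
print(falconer_h2(r_mz, r_dz))            # heritability estimate
print(shared_environment_c2(r_mz, r_dz))  # shared-environment estimate
```

Pooling many cohorts, as the CODATwins project does, lets such estimates be compared across birth cohorts, countries, and ages rather than computed once on a single sample.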
Diurnal Cycle of Convection and Interaction with the Large-Scale Circulation
NASA Technical Reports Server (NTRS)
Salby, Murry L.
2002-01-01
The science in this effort was scheduled in the project's third and fourth years, after a long record of high-resolution Global Cloud Imagery (GCI) had been produced. Unfortunately, political disruptions that interfered with this project led to its funding being terminated after only two years of support. Nevertheless, the availability of intermediate data opened the door to a number of important scientific studies. Beyond considerations of the diurnal cycle addressed in this grant, the GCI makes possible a wide range of studies surrounding convection, cloud, and precipitation. Several are already underway with colleagues in the US and abroad, including global cloud simulations, a global precipitation product, global precipitation simulations, upper tropospheric humidity, asynoptic sampling studies, convective organization studies, equatorial wave simulations, and the tropical tropopause.
The Undergraduate ALFALFA Team: Collaborative Research Projects
NASA Astrophysics Data System (ADS)
Cannon, John M.; Koopmann, Rebecca A.; Haynes, Martha P.; Undergraduate ALFALFA Team, ALFALFA Team
2016-01-01
The NSF-sponsored Undergraduate ALFALFA (Arecibo Legacy Fast ALFA) Team (UAT) has allowed faculty and students from a wide range of public and private colleges, especially those with small astronomy programs, to learn how science is accomplished in a large collaboration while contributing to the scientific goals of a legacy radio astronomy survey. The UAT has achieved this through close collaboration with the ALFALFA PIs to identify research areas accessible to undergraduates. In this talk we summarize the main research efforts of the UAT, including multiwavelength follow-up observations of ALFALFA sources, the UAT Collaborative Groups Project, the Survey of HI in Extremely Low-mass Dwarfs (SHIELD), and the Arecibo Pisces-Perseus Supercluster Survey. This work has been supported by NSF grants AST-0724918/0902211, AST-075267/0903394, AST-0725380, and AST-1211005.
Data management integration for biomedical core facilities
NASA Astrophysics Data System (ADS)
Zhang, Guo-Qiang; Szymanski, Jacek; Wilson, David
2007-03-01
We present the design, development, and pilot-deployment experiences of MIMI, a web-based Multi-modality, Multi-resource Information Integration environment for biomedical core facilities. MIMI is an easily customizable, web-based software tool that integrates scientific and administrative support for a biomedical core facility around a common set of entities: researchers; projects; equipment and devices; support staff; services; samples and materials; experimental workflow; and large, complex data. With this software, one can register users, manage projects, schedule resources, bill services, perform site-wide searches, and archive, back up, and share data. With its customizable, expandable, and scalable characteristics, MIMI not only provides a cost-effective solution, unavailable in the marketplace, to the overarching data management problem of biomedical core facilities, but also lays a foundation for data federation to facilitate and support discovery-driven research.
Land Cover Applications, Landscape Dynamics, and Global Change
Tieszen, Larry L.
2007-01-01
The Land Cover Applications, Landscape Dynamics, and Global Change project at U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) seeks to integrate remote sensing and simulation models to better understand and seek solutions to national and global issues. Modeling processes related to population impacts, natural resource management, climate change, invasive species, land use changes, energy development, and climate mitigation all pose significant scientific opportunities. The project activities use remotely sensed data to support spatial monitoring, provide sensitivity analyses across landscapes and large regions, and make the data and results available on the Internet with data access and distribution, decision support systems, and on-line modeling. Applications support sustainable natural resource use, carbon cycle science, biodiversity conservation, climate change mitigation, and robust simulation modeling approaches that evaluate ecosystem and landscape dynamics.
Citizen Science: Opportunities for Girls' Development of Science Identity
NASA Astrophysics Data System (ADS)
Brien, Sinead Carroll
Many students in the United States, particularly girls, have lost interest in science by the time they reach high school and do not pursue higher degrees or careers in science. Several science education researchers have found that the ways in which youth see themselves and position themselves in relation to science can influence whether they pursue science studies and careers. I suggest that participation in a citizen science program, which I define as a program in which girls interact with professional scientists and collect data that contributes to scientific research, could contribute to changing girls' perceptions of science and scientists and promote their science identity work. I refer to science identity as self-recognition, and recognition by others, that one thinks scientifically and does scientific work. I examined a case study to document and analyze the relationship between girls' participation in a summer citizen science project and their development of science identity. I observed six girls between the ages of 16 and 18 during the Milkweed and Monarch Project, taking field notes on the focal girls' interactions with other youth, adults, and the scientist; conducted highly structured interviews both before and after the girls' program participation; and interviewed the project scientist and educator. I qualitatively analyzed field notes and interview responses for themes in the girls' discussion of what it meant to think scientifically, the roles they took on, and how they recognized themselves as thinking scientifically. I found that girls who saw themselves as thinking scientifically during the program seemed to demonstrate shifts in their science identity. The aspects of the citizen science program that seemed most to influence shifts in these girls' science identities were 1) the framing of the project work as "real science," 2) that it involved ecological field work, and 3) that it created a culture that valued data and scientific work.
However, some of the girls saw themselves only as completing a repetitive data collection task, and they evidenced no change in science identity. This indicates that science identity work might require more explicit attention by educators and scientists to girls' perceptions of science and scientific thinking, and discussion of how these relate to the project work and the roles girls play within the citizen science project.
Selling science 2.0: What scientific projects receive crowdfunding online?
Schäfer, Mike S; Metag, Julia; Feustle, Jessica; Herzog, Livia
2016-09-19
Crowdfunding has emerged in recent years as an additional source for financing research. The study at hand identifies and tests explanatory factors influencing the success of scientific crowdfunding projects by drawing on news value theory, the "reputation signaling" approach, and economic theories of online payment. A standardized content analysis of 371 projects on English- and German-language platforms reveals that each theory provides factors influencing crowdfunding success. It shows that projects presented on science-only crowdfunding platforms have a higher success rate. At the same time, projects are more likely to be successful if their presentation includes visualizations and humor, if their funding target is lower, if potential donors have to relinquish less personal data, and if more interaction between researchers and donors is possible. This suggests that after donors decide to visit a scientific crowdfunding platform, factors unrelated to science matter more for subsequent funding decisions, raising questions about the potential and implications of crowdfunding science. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Hagan, Wendy L.
Project G.R.O.W. is an ecology-based research project developed for high school biology students. The curriculum was designed around how students learn and around awareness of the nature of science and scientific practices, so that students would design and carry out scientific investigations using real data from a local coastal wetland. It was a scientist-teacher collaboration between a CSULB biologist and a high school biology teacher. Prior to the three-week research project, students had multiple opportunities to build the requisite skills via 55 lessons focusing on the nature of science, scientific practices, technology, the Common Core State Standards of reading, writing, listening, and speaking, and the Next Generation Science Standards. Project G.R.O.W. culminated with student-generated research papers and oral presentations. Outcomes reveal that students struggled with constructing explanations and with using Excel to create meaningful graphs. They showed gains in data organization, analysis, teamwork, and aspects of the nature of science.
Object classification and outliers analysis in the forthcoming Gaia mission
NASA Astrophysics Data System (ADS)
Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.
2010-12-01
Astrophysics is evolving towards the rational optimization of costly observational material through the intelligent exploitation of large astronomical databases from both ground-based telescopes and space mission archives. However, there has been relatively little progress in developing the highly scalable data exploitation and analysis tools needed to generate scientific returns from these large and expensively obtained datasets. Among upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This work reviews some of the work being developed by the Gaia Data Processing and Analysis Consortium for object classification and the analysis of outliers in the forthcoming mission.
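Outlier analysis in a survey archive typically starts by flagging sources that sit far from the bulk of the population in some feature. A minimal, generic sketch of that idea, using a robust (median/MAD) z-score so that extreme objects cannot inflate the spread and mask themselves; this is an illustration of the general technique, not the DPAC pipeline:

```python
import statistics

def robust_outliers(values, threshold=3.5):
    """Return indices whose modified z-score exceeds the threshold.

    Uses the median and the median absolute deviation (MAD) instead of
    mean and standard deviation, so a few extreme sources do not
    distort the scale they are judged against.
    """
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    if mad == 0:
        return []  # no spread: nothing can be flagged
    # 0.6745 rescales MAD to the standard deviation of a normal distribution
    return [i for i, v in enumerate(values)
            if abs(0.6745 * (v - med) / mad) > threshold]

# Hypothetical colour index for a handful of sources; one is anomalous:
colours = [0.51, 0.48, 0.52, 0.50, 0.49, 3.2]
print(robust_outliers(colours))  # -> [5]
```

Survey-scale classifiers extend the same idea to many features at once, but the one-dimensional version already shows why robustness matters when the outliers are exactly what one is looking for.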
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
Pulido, Diony; Robledo, Rocío; Agudelo, Carlos A
2009-01-01
A collaboration network involving six countries in Europe, Latin America, and the Caribbean has embarked on a project (Network of Collaboration Between Europe and Latin American Caribbean Countries, NECOBELAC; www.necobelac.eu) aimed at improving scientific writing, open access, and scholarly communication, and at spreading know-how regarding current and future issues and information related to health. The NECOBELAC project is sponsored by the European Community (7th Framework Programme) and will last three years. The project recognises the challenge arising from socio-cultural differences between the participating countries and will generate networks of institutions working in close collaboration to carry out training and know-how exchange programmes aimed at producing open-access information and spreading it (including its technical and ethical aspects). The NECOBELAC project currently involves the Istituto Superiore di Sanità (ISS) from Italy (coordinating the project), the Consejo Superior de Investigaciones Científicas (CSIC) from Spain, the University of Nottingham (SHERPA) from the United Kingdom, BIREME from Brazil, the Instituto de Salud Pública (ISP) from Colombia and the Universidade do Minho from Portugal.
NASA Astrophysics Data System (ADS)
Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.
2014-04-01
Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
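The heart of decision scaling is a stress test run in "climate-first reversed" order: perturb the decision-relevant input statistics over a grid, evaluate system performance at each point, locate where performance crosses the planning threshold, and only then overlay climate projections on that map. A toy sketch of the scan step, with an invented linear yield model standing in for the Melbourne bulk-supply simulation (all numbers and names are illustrative):

```python
def system_yield(mean_flow_factor):
    """Toy stand-in for a water-supply simulation: yield (GL/yr) scales
    with the mean-inflow perturbation factor. The real study instead
    runs 1000 synthetic multisite streamflow traces through a bulk
    supply system model for each perturbation."""
    return 480.0 * mean_flow_factor

def critical_flow_factor(demand, factors):
    """Scan an ascending grid of inflow perturbations and return the
    smallest factor at which the system still meets demand -- the
    decision-critical climate condition."""
    for f in factors:
        if system_yield(f) >= demand:
            return f
    return None  # demand unmet everywhere on the grid

# Perturbation grid from a 50% reduction up to historical inflows:
grid = [round(0.5 + 0.05 * i, 2) for i in range(11)]  # 0.5 .. 1.0
threshold = critical_flow_factor(demand=430.0, factors=grid)
print(threshold)  # first grid point where yield meets demand
```

With the threshold expressed in climatic units, each of the 138 projections can be plotted as a single point on either side of it, with no hydrological model run per projection.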
Science to support the understanding of Ohio's water resources, 2014-15
Shaffer, Kimberly; Kula, Stephanie P.
2014-01-01
The U.S. Geological Survey (USGS) works in cooperation with local, State, and other Federal agencies, as well as universities, to furnish decision makers, policy makers, USGS scientists, and the general public with reliable scientific information and tools to assist them in management, stewardship, and use of Ohio’s natural resources. The diversity of scientific expertise among USGS personnel enables them to carry out large- and small-scale multidisciplinary studies. The USGS is unique among government organizations because it has neither regulatory nor developmental authority—its sole product is impartial, credible, relevant, and timely scientific information, equally accessible and available to everyone. The USGS Ohio Water Science Center provides reliable hydrologic and water-related ecological information to aid in the understanding of the use and management of the Nation’s water resources, in general, and Ohio’s water resources, in particular. This fact sheet provides an overview of current (2014) or recently completed USGS studies and data activities pertaining to water resources in Ohio. More information regarding projects of the USGS Ohio Water Science Center is available at http://oh.water.usgs.gov/.
NASA Astrophysics Data System (ADS)
Doran, Rosa; Ferlet, Roger; Gómez de Castro, Ana I.; Hill, Robert; Horellou, Cathy; Mankiewicz, Lech; Melchior, Anne-Laure; Metaxa, Margarita; Zanazzi, Alessandra
2007-08-01
Hands-On Universe is a project born at UC Berkeley, devoted to enriching the teaching of astronomy in the classroom through a different approach, more connected to new technologies. Its main goals are not only to promote the use of such technologies but also to reawaken in students a taste for STEM (science, technology, engineering, and math) subjects and to increase their scientific culture. Eight countries in Europe decided to adopt the method and, funded by MINERVA, formed the European Hands-On Universe (EU-HOU). Several resources were produced and data reduction software was developed (http://www.euhou.net/). Other European countries are interested and should join this coordinated effort in the near future. At an international level, 20 countries are using this approach, and there are plans to develop scientific cooperation among them. Pilot scientific research projects in schools are being tested in EU-HOU schools, Russia and the USA. A game is also being developed as a new tool for teaching scientific content in the classroom. The next step is to develop an international network of scientific and educational collaboration.
The structure of control and data transfer management system for the GAMMA-400 scientific complex
NASA Astrophysics Data System (ADS)
Arkhangelskiy, A. I.; Bobkov, S. G.; Serdin, O. V.; Gorbunov, M. S.; Topchiev, N. P.
2016-02-01
A description is given of the control and data transfer management system for the scientific instrumentation involved in the GAMMA-400 space project. All specialized equipment supporting the operation of the scientific instrumentation and the satellite support systems is unified in a single structure. The scientific instruments are controlled by one-time pulse radio commands as well as by program commands in the form of 16-bit code words, transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, the engineering solutions and the selection of electronic components has been verified by experimental testing of a prototype of the GAMMA-400 scientific complex under laboratory conditions.
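As an illustration of 16-bit command words of the kind mentioned above, the sketch below packs and unpacks a hypothetical command. The field layout (4-bit target, 6-bit opcode, 6-bit argument) is an assumption chosen for illustration; the actual GAMMA-400 command format is not described in this summary:

```python
# Hypothetical 16-bit command word layout: 4-bit target, 6-bit opcode, 6-bit argument.
def pack_command(target: int, opcode: int, arg: int) -> int:
    if not (0 <= target < 16 and 0 <= opcode < 64 and 0 <= arg < 64):
        raise ValueError("field out of range")
    return (target << 12) | (opcode << 6) | arg

def unpack_command(word: int) -> tuple:
    return (word >> 12) & 0xF, (word >> 6) & 0x3F, word & 0x3F

word = pack_command(target=3, opcode=17, arg=42)
assert word < (1 << 16)                     # fits in 16 bits
assert unpack_command(word) == (3, 17, 42)  # lossless round trip
```

Fixed-width bit fields like these let the onboard control system route a command to a subsystem without any variable-length parsing.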
The Human Genome Diversity (HGD) Project. Summary document
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
In 1991 a group of human geneticists and molecular biologists proposed to the scientific community that a worldwide survey be undertaken of variation in the human genome. To aid its considerations, the organizing committee decided to hold a small series of international workshops to explore the major scientific issues involved. The intention was to define a framework for the project which could provide a basis for much wider and more detailed discussion and planning; it was recognized that the successful implementation of the proposed project, which has come to be known as the Human Genome Diversity (HGD) Project, would not only involve scientists but also various national and international non-scientific groups, all of which should contribute to the project's development. The international HGD workshop held in Sardinia in September 1993 was the last in the initial series of planning workshops. As such it not only explored new ground but also pulled together into a more coherent form much of the formal and informal discussion that had taken place in the preceding two years. This report presents the deliberations of the Sardinia workshop within a consideration of the overall development of the HGD Project to date.
Documenting genomics: Applying archival theory to preserving the records of the Human Genome Project
Shaw, Jennifer
2016-01-01
The Human Genome Archive Project (HGAP) aimed to preserve the documentary heritage of the UK's contribution to the Human Genome Project (HGP) by using archival theory to develop a suitable methodology for capturing the results of modern, collaborative science. After assessing past projects and different archival theories, the HGAP used an approach based on the theory of documentation strategy to try to capture the records of a scientific project that had an influence beyond the purely scientific sphere. The HGAP was an archival survey that ran for two years. It led to ninety scientists being contacted and has, so far, led to six collections being deposited in the Wellcome Library, with additional collections being deposited in other UK repositories. In applying documentation strategy the HGAP was attempting to move away from traditional archival approaches to science, which have generally focused on retired Nobel Prize winners. It has been partially successful in this aim, having managed to secure collections from people who are not ‘big names’, but who made an important contribution to the HGP. However, the attempt to redress the gender imbalance in scientific collections and to improve record-keeping in scientific organisations has continued to be difficult to achieve. PMID:26388555
Eyring, Veronika; Bony, Sandrine; Meehl, Gerald A.; ...
2016-05-26
By coordinating the design and distribution of global climate model simulations of the past, current, and future climate, the Coupled Model Intercomparison Project (CMIP) has become one of the foundational elements of climate science. However, the need to address an ever-expanding range of scientific questions arising from more and more research communities has made it necessary to revise the organization of CMIP. After a long and wide-ranging community consultation, a new and more federated structure has been put in place. It consists of three major elements: (1) a handful of common experiments, the DECK (Diagnostic, Evaluation and Characterization of Klima) and CMIP historical simulations (1850-near present), that will maintain continuity and help document basic characteristics of models across different phases of CMIP; (2) common standards, coordination, infrastructure, and documentation that will facilitate the distribution of model outputs and the characterization of the model ensemble; and (3) an ensemble of CMIP-Endorsed Model Intercomparison Projects (MIPs) that will be specific to a particular phase of CMIP (now CMIP6) and that will build on the DECK and CMIP historical simulations to address a large range of specific questions and fill the scientific gaps of the previous CMIP phases. The DECK and CMIP historical simulations, together with the use of CMIP data standards, will be the entry cards for models participating in CMIP. Participation in CMIP6-Endorsed MIPs by individual modelling groups will be at their own discretion and will depend on their scientific interests and priorities. With the Grand Science Challenges of the World Climate Research Programme (WCRP) as its scientific backdrop, CMIP6 will address three broad questions: How does the Earth system respond to forcing? What are the origins and consequences of systematic model biases? How can we assess future climate changes given internal climate variability, predictability, and uncertainties in scenarios? This CMIP6 overview paper presents the background and rationale for the new structure of CMIP, provides a detailed description of the DECK and CMIP6 historical simulations, and includes a brief introduction to the 21 CMIP6-Endorsed MIPs.
Human Genome Project discoveries: Dialectics and rhetoric in the science of genetics
NASA Astrophysics Data System (ADS)
Robidoux, Charlotte A.
The Human Genome Project (HGP), a $437 million effort that began in 1990 to chart the chemical sequence of our three billion base pairs of DNA, was completed in 2003, the 50th anniversary of the determination of the definitive structure of the DNA molecule. This study considered how dialectical and rhetorical arguments functioned in the science, political, and public forums over a 20-year period, from 1980 to 2000, to advance human genome research and to establish the official project. I argue that Aristotle's continuum of knowledge--which ranges from the probable on one end to certified or demonstrated knowledge on the other--provides useful distinctions for analyzing scientific reasoning. While contemporary scientific research seeks to discover certified knowledge, investigators generally employ the hypothetico-deductive or scientific method, which often yields probable rather than certain findings, making these dialectical in nature. Analysis of the discourse describing human genome research revealed the use of numerous rhetorical figures and topics. Persuasive and probable reasoning were necessary for scientists to characterize unknown genetic phenomena, to secure interest in and funding for large-scale human genome research, to solve scientific problems, to issue probable findings, to convince colleagues and government officials that the findings were sound and to disseminate information to the public. Both government and private venture scientists drew on these tools of reasoning to promote their methods of mapping and sequencing the genome. The debate over how to carry out sequencing was rooted in conflicting values. Scientists representing the academic tradition valued a more conservative method that would establish high quality results, and those supporting private industry valued an unconventional approach that would yield products and profits more quickly. Values in turn influenced political and public forum arguments. 
Agency representatives and investors sided with the approach that reflected the values they supported. Fascinated by this controversy and the convincing comparisons, the media often endorsed Celera's work for its efficiency. The analysis of discourse from the science, political, and public forums revealed that value systems influenced the accuracy and quality of the arguments more than the type or number of figures used to describe the research to various audiences.
From Sky to Archive: Long Term Management of Sky Survey Data
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.; Borgman, Christine; Golshan, Milena S.; Traweek, Sharon
2017-01-01
Sky survey data may remain scientifically valuable long beyond the end of a survey’s operational period, both for continuing inquiry and for calibrating and testing instruments for subsequent generations of surveys. Astronomy infrastructure has many stakeholders, including those concerned with data management. Research libraries are increasingly partnering with scholars to sustain access to data. The Sloan Digital Sky Survey (SDSS) was among the first major scientific projects to partner with libraries in this way, embarking on a data transfer process with two university libraries. We report on a qualitative case study of this process. Ideally, long-term sustainability of sky survey data would be a key part of planning and construction, but rarely does this occur. Teams are under pressure to deliver a project on time and on budget that produces high-quality data during its operational period, leaving few resources available to plan long-term data management. The difficulty of planning is further compounded by the complexity of predicting circumstances and needs of the astronomy community in future decades. SDSS team members regarded libraries, long-lived institutions concerned with access to scholarship, as a potential solution to long-term data sustainability. As the SDSS data transfer was the first of this scale attempted - 160 TB of data - astronomers and library staff were faced with scoping the range of activities involved. They spent two years planning this five-year process. While successful overall as demonstration projects, the libraries encountered many obstacles. We found all parties experienced difficulty in articulating their notions of “scientific data,” “archiving,” “serving,” and “providing access” to the datasets. Activities and interpretations of the data transfer process varied by institutional motivations for participation and by available infrastructure. 
We conclude that several “library solutions” for long-term data management, rather than a single one, should be considered. Life cycle models popular in the library community are insufficient to conceptualize data management at this scale. We also identify institutional and policy challenges for curating large scientific datasets.
Using a Feature Film to Promote Scientific Enquiry
ERIC Educational Resources Information Center
Hadzigeorgiou, Yannis; Kodakos, Tassos; Garganourakis, Vassilios
2010-01-01
This article reports on an action research project undertaken with the primary aim of investigating the extent to which a feature film, whose plot included Tesla's demonstrations on the wireless transmission of electrical energy, can promote scientific enquiry. The class that participated in this project was an 11th grade class in a rural area of…
ERIC Educational Resources Information Center
Friedrich, Jon M.
2014-01-01
Engaging freshman and sophomore students in meaningful scientific research is challenging because of their developing skill set and their necessary time commitments to regular classwork. A project called the Chondrule Analysis Project was initiated to engage first- and second-year students in an initial research experience and also accomplish…
Get Your Feet Wet--Scientifically: A Guide to Water Testing as a School Science Project.
ERIC Educational Resources Information Center
Sattler, Edward D.; Zalkin, Larry
1989-01-01
Describes a project involving students in hands-on scientific experiment to locate and identify areas of water pollution, based on Delta Laboratories Adopt-A-Stream Program. Describes getting started, working cooperatively, community support, recording and using data. Includes data sheet, checklist, and photographs of students at study site. (TES)
Cyclic stress induced phase transformation in super-bainitic microstructure
NASA Astrophysics Data System (ADS)
Xiu, Wencui; Han, Ying; Liu, Cheng; Wu, Hua; Liu, Yunxu
2017-03-01
Project supported by the National Natural Science Foundation of China (Grant Nos. 51171030 and 51604034), the Scientific and Technological Planning Project of Jilin Province, China (Grant No. 20150520030JH), and the Scientific and Technological Research Fund of Jilin Provincial Education Department during the Twelfth Five-year Plan Period, China (Grant No. 2015-95).
Citizen Science Initiatives: Engaging the Public and Demystifying Science
Van Vliet, Kim; Moore, Claybourne
2016-01-01
The Internet and smart phone technologies have opened up new avenues for collaboration among scientists around the world. These technologies have also expanded citizen science opportunities and public participation in scientific research (PPSR). Here we discuss citizen science, what it is, who does it, and the variety of projects and methods used to increase scientific knowledge and scientific literacy. We describe a number of different types of citizen-science projects. These greatly increase the number of people involved, helping to speed the pace of data analysis and allowing science to advance more rapidly. As a result of the numerous advantages of citizen-science projects, these opportunities are likely to expand in the future and increase the rate of novel discoveries. PMID:27047582
Science Support: The Building Blocks of Active Data Curation
NASA Astrophysics Data System (ADS)
Guillory, A.
2013-12-01
While the scientific method is built on reproducibility and transparency, and results are published in peer reviewed literature, we have come to the digital age of very large datasets (now of the order of petabytes and soon exabytes) which cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. Another purpose of science support is to raise awareness in the science community of data and metadata standards and best practice, engendering a culture where data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data generating project. It provides an agreed record of the data management needs and issues within the project. The DMP is agreed upon with project investigators to ensure that a high quality documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities and rights associated with the data. Project specific needs are also identified for data processing, visualization tools and data sharing services. 
As part of the National Centre for Atmospheric Science (NCAS) and National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfills this science support role, facilitating atmospheric and Earth observation data generating projects to ensure successful management of the data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and to large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, which depends on CEDA to provide the high-performance storage, transfer capability and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better-documented datasets with a growing collection of metadata associated with the archived data, easier data sharing through the use of standard formats and metadata, and data citation. These establish high-quality data management, ensuring long-term preservation and enabling re-use by peer scientists, which ultimately leads to faster-paced progress in science.
DBMS as a Tool for Project Management
NASA Technical Reports Server (NTRS)
Linder, H.
1984-01-01
The scientific objectives of the crustal dynamics project are listed, along with the contents of the project's centralized data information system. The system provides project observation schedules, configuration control information, and site information.
Astronomy in the Russian Scientific-Educational Project: "KAZAN-GEONA-2010"
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.
2006-08-01
The European Union promotes the Sixth Framework Programme, one of whose goals is the opening of national research and training programmes. A special role in the history of Kazan University was played by the great mathematician Nikolai Lobachevsky, the founder of non-Euclidean geometry (1826). Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have served as the scientific, organizational, and cultural-educational centre of the Volga region. For the continued successful development of educational and scientific activity in the Russian Federation and the Republic of Tatarstan, a national project was proposed for Kazan: the International Center of Sciences and Internet Technologies "GeoNa" (Geometry of Nature - GeoNa - connoting wisdom, enthusiasm, pride, grandeur). This is a modern complex of conference halls including a Center for Internet Technologies, a 3D planetarium devoted to the exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, and a training complex, "Spheres of Knowledge". Center GeoNa provides a direct and effective channel of cooperation with scientific centers around the world. GeoNa will host conferences, congresses, and fundamental research sessions on the Moon and planets, as well as scientific-educational events such as presentations of international scientific programmes on lunar research and modern lunar databases. A more intensive programme of exchange between scientific centers and organizations is proposed, to improve knowledge and planning of their astronomical curricula and to introduce the teaching of astronomy. Center GeoNa will enable scientists and teachers of Russian universities with advanced achievements in science and information technologies to establish scientific links with foreign colleagues in high-technology and educational projects with world scientific centers.
NASA Astrophysics Data System (ADS)
Thompson, Nick; Watters, Robert J.; Schiffman, Peter
2008-04-01
Hawaiian Island flank failures are recognized as the largest landslide events on Earth, reaching volumes of several thousand cubic kilometers and lengths of over 200 km, and occurring on average once every 100,000 years. The 3.1 km deep Hawaii Scientific Drilling Project (HSDP) borehole enabled an investigation of rock mass strength variations on the island of Hawaii [Schiffman, P., Watters, R.J., Thompson, N., Walton, A.W., 2006. Hyaloclastites and the slope stability of Hawaiian volcanoes: insights from the Hawaiian Scientific Drilling Project's 3-km drill core. Journal of Volcanology and Geothermal Research, 151 (1-3): 217-228]. This study builds on that of Schiffman et al. (2006) by considering more in-depth rock mass classification and strength testing methods applied to the HSDP core. Geotechnical core logging techniques combined with laboratory strength testing show that rock strength differences exist within the edifice. Comparing the rock strength parameters obtained from the various volcano lithologies identified weak zones, suggesting the possible location of future slip surfaces for large flank failures. Relatively weak rock layers were recognized within poorly consolidated hyaloclastite zones, with strength increasing with degree of alteration. Subaerial and submarine basalt flows are found to be significantly stronger. With the aid of digital elevation models, cross-sections have been developed for key flank areas on the island of Hawaii. Limit equilibrium slope stability analyses are performed on each cross-section using various failure criteria for the rock mass strength calculations. Based on the stability analyses, the majority of the slopes analyzed are considered stable. 
In cases where instability (i.e. failure) is predicted, the decreased rock mass quality (strength) of the highly altered and poorly consolidated lithologies is found to have a significant influence. These lithologies are present throughout the Hawaiian Islands, representing potential failure surfaces for large flank collapses. Failure criterion input parameters are considered in sensitivity analyses, as are the influences of external stability factors such as sea level variation and seismic loading.
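As a minimal illustration of the limit equilibrium approach mentioned above, the sketch below evaluates the classical infinite-slope factor of safety with Mohr-Coulomb strength parameters; a factor above 1 indicates stability. The function and all input values are generic textbook assumptions, not the cross-section models or strength data used in this study:

```python
import math

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, u=0.0):
    """Infinite-slope limit equilibrium with Mohr-Coulomb strength.
    c: cohesion (kPa); phi_deg: friction angle (deg); gamma: unit weight (kN/m^3);
    z: depth to the failure plane (m); beta_deg: slope angle (deg); u: pore pressure (kPa).
    FS > 1 indicates a stable slope."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    normal = gamma * z * math.cos(beta) ** 2 - u           # effective normal stress
    driving = gamma * z * math.sin(beta) * math.cos(beta)  # shear stress on the plane
    return (c + normal * math.tan(phi)) / driving

# Illustrative contrast: weak hyaloclastite vs. intact basalt on the same slope
fs_weak = factor_of_safety(c=50.0, phi_deg=25.0, gamma=22.0, z=100.0, beta_deg=20.0)
fs_strong = factor_of_safety(c=500.0, phi_deg=40.0, gamma=27.0, z=100.0, beta_deg=20.0)
assert fs_strong > fs_weak > 1.0  # both stable, but the basalt has far more margin
```

The same structure (resisting strength divided by driving stress) underlies the more elaborate cross-section analyses described in the abstract, where pore pressure and seismic loading terms can tip a marginal slope below FS = 1.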
The use of the German V-2 in US for upper atmosphere research
NASA Technical Reports Server (NTRS)
Curtis, S. A.
1979-01-01
Early U.S. space experiments involving the German liquid-propellant V-2 are discussed. Although the primary objective of the experiments conducted under Project Hermes after World War II was initially the development of missile technology, scientific objectives were soon given priority. The missile was modified for scientific experiments, and the payload increased from 6.8% to 47% between 1946 and 1949. Among other instruments, the payload included a cosmic ray telescope, an ionosphere transmitter and a spectrograph for solar spectral measurements. While the scientific success of the program established a positive public attitude towards space research, the Upper Atmosphere Research Panel, formed to coordinate the project, set a pattern for future scientific advisory bodies.
VanBlaricom, Glenn R.; Belting, Traci F.; Triggs, Lisa H.
2015-01-01
Studies of sea otters in captivity began in 1932, producing important insights for conservation. Soviet (initiated in 1932) and United States (1951) studies provided information on captive otter husbandry, setting the stage for eventual large-scale translocations as tools for population restoration. Early studies also informed the effective housing of animals in zoos and aquaria, with sea otters first publicly displayed in 1954. Surveys credited displayed otters with convincing the public of the value of conservation. After these early studies, initial scientific data for captive sea otters in aquaria came from work initiated in 1956, and from dedicated research facilities beginning in 1968. Significant achievements have been made in studies of behavior, physiology, reproduction, and high-priority management issues. Larger-scale projects involving translocation and oil spill response provided extensive insights into stress reactions, water quality issues in captivity, and the effects of oil spills.
Data preservation at the Fermilab Tevatron
Amerio, S.; Behari, S.; Boyd, J.; ...
2017-01-22
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. Lastly, these efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
Boyd, J.; Herner, K.; Jayatilaka, B.; ...
2015-12-23
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. Furthermore, these efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
White House announces “big data” initiative
NASA Astrophysics Data System (ADS)
Showstack, Randy
2012-04-01
The world is now generating zettabytes—which is 10 to the 21st power, or a billion trillion bytes—of information every year, according to John Holdren, director of the White House Office of Science and Technology Policy. With data volumes growing exponentially from a variety of sources such as computers running large-scale models, scientific instruments including telescopes and particle accelerators, and even online retail transactions, a key challenge is to better manage and utilize the data. The Big Data Research and Development Initiative, launched by the White House at a 29 March briefing, initially includes six federal departments and agencies providing more than $200 million in new commitments to improve tools and techniques for better accessing, organizing, and using data for scientific advances. The agencies and departments include the National Science Foundation (NSF), Department of Energy, U.S. Geological Survey (USGS), National Institutes of Health (NIH), Department of Defense, and Defense Advanced Research Projects Agency.
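The unit arithmetic in the abstract can be checked directly; a minimal sketch (the petabyte comparison is my own illustration, not from the article):

```python
# A zettabyte is 10**21 bytes: a billion (10**9) times a trillion (10**12).
zettabyte = 10**21
assert zettabyte == 10**9 * 10**12

# For scale, compare against a petabyte (10**15 bytes), the unit used for
# the Tevatron datasets described elsewhere in this collection.
petabyte = 10**15
print(zettabyte // petabyte)  # 1,000,000 petabytes per zettabyte
```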
Evolution of Scientific and Technical Information Distribution
NASA Technical Reports Server (NTRS)
Esler, Sandra; Nelson, Michael L.
1998-01-01
World Wide Web (WWW) and related information technologies are transforming the distribution of scientific and technical information (STI). We examine 11 recent, functioning digital libraries focusing on the distribution of STI publications, including journal articles, conference papers, and technical reports. We introduce four main categories of digital library projects, classified by architecture (distributed vs. centralized) and contributor (traditional publisher vs. authoring individual/organization). Many digital library prototypes merely automate existing publishing practices or focus solely on digitizing the output of the publishing cycle, without sampling and capturing elements of its input. Still others do not consider the large body of "gray literature" for distribution. We address these deficiencies in the current model of STI exchange by suggesting methods for expanding the scope and target of digital libraries: focusing on a greater source of technical publications and using "buckets," an object-oriented construct for grouping logically related information objects, to include holdings other than technical publications.
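The "bucket" construct the abstract describes—one archival unit grouping logically related information objects (report, datasets, software)—could be sketched as a simple container. Class and field names here are illustrative assumptions, not the actual NASA implementation:

```python
# Hypothetical sketch of a "bucket": an object-oriented grouping of
# logically related information objects under one archival identifier.
from dataclasses import dataclass, field

@dataclass
class InfoObject:
    kind: str  # e.g. "report", "dataset", "software"
    uri: str

@dataclass
class Bucket:
    identifier: str
    elements: list = field(default_factory=list)

    def add(self, obj: InfoObject) -> None:
        self.elements.append(obj)

    def holdings(self, kind: str) -> list:
        """Return all grouped objects of a given kind."""
        return [o for o in self.elements if o.kind == kind]

# Illustrative usage: a report and its supporting dataset travel together.
b = Bucket("NASA-TM-1998-0001")  # hypothetical identifier
b.add(InfoObject("report", "https://example.org/report.pdf"))
b.add(InfoObject("dataset", "https://example.org/data.tar"))
print(len(b.holdings("report")))  # 1
```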
Adapting California’s ecosystems to a changing climate
Elizabeth Chornesky,; David Ackerly,; Paul Beier,; Frank Davis,; Flint, Lorraine E.; Lawler, Joshua J.; Moyle, Peter B.; Moritz, Max A.; Scoonover, Mary; Byrd, Kristin B.; Alvarez, Pelayo; Heller, Nicole E.; Micheli, Elisabeth; Weiss, Stuart
2017-01-01
Significant efforts are underway to translate improved understanding of how climate change is altering ecosystems into practical actions for sustaining ecosystem functions and benefits. We explore this transition in California, where adaptation and mitigation are advancing relatively rapidly, through four case studies that span large spatial domains and encompass diverse ecological systems, institutions, ownerships, and policies. The case studies demonstrate the context specificity of societal efforts to adapt ecosystems to climate change and involve applications of diverse scientific tools (e.g., scenario analyses, downscaled climate projections, ecological and connectivity models) tailored to specific planning and management situations (alternative energy siting, wetland management, rangeland management, open space planning). They illustrate how existing institutional and policy frameworks provide numerous opportunities to advance adaptation related to ecosystems and suggest that progress is likely to be greatest when scientific knowledge is integrated into collective planning and when supportive policies and financing enable action.
Fermi Spots a Record Flare from Blazar
2015-07-10
Blazar 3C 279's historic gamma-ray flare can be seen in this image from the Large Area Telescope (LAT) on NASA's Fermi satellite. Gamma rays with energies from 100 million to 100 billion electron volts (eV) are shown; for comparison, visible light has energies between 2 and 3 eV. The image spans 150 degrees, is shown in a stereographic projection, and represents an exposure from June 11 at 00:28 UT to June 17 at 08:17 UT. Read more: go.nasa.gov/1TqBAdJ Credit: NASA/DOE/Fermi LAT Collaboration NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission. Follow us on Twitter Like us on Facebook Find us on Instagram
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Boyd, J.; Herner, K.; Jayatilaka, B.; Roser, R.; Sakumoto, W.
2015-12-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and DO experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. These efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
SCCR Digital Learning System for Scientific Conceptual Change and Scientific Reasoning
ERIC Educational Resources Information Center
She, H. C.; Lee, C. Q.
2008-01-01
This study reports an adaptive digital learning project, scientific concept construction and reconstruction (SCCR), that was developed based on the theories of Dual Situated Learning Model (DSLM) and scientific reasoning. In addition, the authors investigated the effects of an SCCR related to a "combustion" topic for sixth grade students…
Laboratory directed research and development fy1999 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Ayat, R A
2000-04-11
The Lawrence Livermore National Laboratory (LLNL) was founded in 1952 and has been managed since its inception by the University of California (UC) for the U.S. Department of Energy (DOE). Because of this long association with UC, the Laboratory has been able to recruit a world-class workforce, establish an atmosphere of intellectual freedom and innovation, and achieve recognition in relevant fields of knowledge as a scientific and technological leader. This environment and reputation are essential for sustained scientific and technical excellence. As a DOE national laboratory with about 7,000 employees, LLNL has an essential and compelling primary mission to ensure that the nation's nuclear weapons remain safe, secure, and reliable and to prevent the spread and use of nuclear weapons worldwide. The Laboratory receives funding from the DOE Assistant Secretary for Defense Programs, whose focus is stewardship of our nuclear weapons stockpile. Funding is also provided by the Deputy Administrator for Defense Nuclear Nonproliferation, many Department of Defense sponsors, other federal agencies, and the private sector. As a multidisciplinary laboratory, LLNL has applied its considerable skills in high-performance computing, advanced engineering, and the management of large research and development projects to become the science and technology leader in those areas of its mission responsibility. The Laboratory Directed Research and Development (LDRD) Program was authorized by the U.S. Congress in 1984. The Program allows the Director of each DOE laboratory to fund advanced, creative, and innovative research and development (R&D) activities that will ensure scientific and technical vitality in the continually evolving mission areas at DOE and the Laboratory. In addition, the LDRD Program provides LLNL with the flexibility to nurture and enrich essential scientific and technical competencies, which attract the most qualified scientists and engineers.
The LDRD Program also enables many collaborations with the scientific community in academia, national and international laboratories, and industry. The projects in the FY1999 LDRD portfolio were carefully selected to continue vigorous support of the strategic vision and the long-term goals of DOE and the Laboratory. Projects chosen for LDRD funding undergo stringent selection processes, which look for high-potential scientific return, emphasize strategic relevance, and feature technical peer reviews by external and internal experts. The FY1999 projects described in this annual report focus on supporting the Laboratory's national security needs: stewardship of the U.S. nuclear weapons stockpile, responsibility for the counter- and nonproliferation of weapons of mass destruction, development of high-performance computing, and support of DOE environmental research and waste management programs. In the past, LDRD investments have significantly enhanced LLNL scientific capabilities and greatly contributed to the Laboratory's ability to meet its national security programmatic requirements. Examples of past investments include technical precursors to the Accelerated Strategic Computing Initiative (ASCI), special-materials processing and characterization, and biodefense. Our analysis of the FY1999 portfolio shows that it strongly supports the Laboratory's national security mission. About 95% of the LDRD dollars directly supported LLNL's national security activities in FY1999, far exceeding the 63% share of LLNL's overall budget supported by National Security Programs in that year.
Examining the Predictive Validity of NIH Peer Review Scores
Lindner, Mark D.; Nakamura, Richard K.
2015-01-01
The predictive validity of peer review at the National Institutes of Health (NIH) has not yet been demonstrated empirically. It might be assumed that the most efficient and expedient test of the predictive validity of NIH peer review would be an examination of the correlation between percentile scores from peer review and bibliometric indices of the publications produced from funded projects. The present study used a large dataset to examine the rationale for such a study, to determine if it would satisfy the requirements for a test of predictive validity. The results show significant restriction of range in the applications selected for funding. Furthermore, those few applications that are funded despite slightly worse peer review scores are not selected at random, nor are they representative of other applications in the same range. The funding institutes also negotiate with applicants to address issues identified during peer review. Therefore, the peer review scores assigned to the submitted applications, especially for those few funded applications with slightly worse peer review scores, do not reflect the changed and improved projects that are eventually funded. In addition, citation metrics by themselves are not valid or appropriate measures of scientific impact. The use of bibliometric indices on their own to measure scientific impact would likely increase the inefficiencies and problems with replicability already largely attributed to the current over-emphasis on bibliometric indices. Therefore, retrospective analyses of the correlation between percentile scores from peer review and bibliometric indices of the publications resulting from funded grant applications are not valid tests of the predictive validity of peer review at the NIH. PMID:26039440
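The restriction-of-range problem the study identifies can be illustrated with a small simulation (my own sketch, not the study's data or method): when only the top-scoring applications are "funded", the score-outcome correlation within the funded subset is sharply attenuated relative to the full applicant pool.

```python
# Illustrative simulation of restriction of range attenuating a correlation.
import random

random.seed(0)
n = 10_000
scores = [random.gauss(0, 1) for _ in range(n)]
# Hypothetical outcome: genuinely related to score, plus independent noise.
outcomes = [s + random.gauss(0, 1) for s in scores]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

full_r = corr(scores, outcomes)                      # full applicant pool
funded = sorted(zip(scores, outcomes))[-n // 10:]    # top ~10% "funded"
sub_r = corr([s for s, _ in funded], [o for _, o in funded])
print(round(full_r, 2), round(sub_r, 2))  # subset correlation is much weaker
```

This is one reason the abstract argues that a simple score-bibliometrics correlation among funded grants is not a valid test of predictive validity.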
REXUS/BEXUS: launching student experiments -a step towards a stronger space science community
NASA Astrophysics Data System (ADS)
Fittock, Mark; Stamminger, Andreas; Maria, Roth; Dannenberg, Kristine; Page, Helen
The REXUS/BEXUS (Rocket/Balloon Experiments for University Students) programme provides opportunities for teams of European student scientists and engineers to fly experiments on sounding rockets and high-altitude balloons. This is an opportunity for students and the scientific community to benefit from encouragement and support for experiments. An important feature of the programme is that the students experience a full project life-cycle, which is typically not a part of their university education and which helps to prepare them for further scientific work. They have to plan, organize, and control their project in order to develop and build an experiment, but must also work on the scientific aspects. Many of the students continue to work in the field on which they focused in the programme and can often build upon both the experience and the results from flight. Within the REXUS/BEXUS project cycle, they are encouraged to write and present papers about their experiments and results; increasing amounts of scientific output are seen from the students who participate. Not only do the students learn and develop from REXUS/BEXUS, but the scientific community also reaps significant benefits. Another major benefit of the programme is the promotion that the students are able to bring to the whole space community. Not only is the public made more aware of advanced science and technical concepts, but there is also an advantage in the contact that participating students have with other university-level students. Students are less restricted in their publicity and attract large public followings online as well as presenting themselves in more traditional media outlets. Many teams' creative approach to outreach is astonishing. The benefits are not only for the space science community as a whole; institutes, universities, and departments can see increased interest following the support of participating students in the programme.
The programme is realized under a bilateral Agency Agreement between the German Aerospace Center (DLR) and the Swedish National Space Board (SNSB). The Swedish share of the payload has been made available to students from other European countries through collaboration with the European Space Agency (ESA). EuroLaunch, a cooperation between the Esrange Space Center of the Swedish Space Corporation (SSC) and the Mobile Rocket Base (MORABA) of DLR, is responsible for the campaign management and operations of the launch vehicles. Project coordination is carried out at DLR's Institute of Space Systems and SSC's Esrange. Experts from DLR, SSC and ESA provide technical support to the student teams throughout their project cycles. The REXUS/BEXUS programme has been carried out in its current format since 2007. In that time, it has developed significantly, building upon strengths to provide a richer experience and increasing the educational, scientific, and promotional outputs. The programme is now showing the potential for students to reach out to a truly broad audience and promote the space science community with youthful enthusiasm and an accessible image.
Progress on Suffa Large Radiotelescope Project
NASA Astrophysics Data System (ADS)
Shanin, G. I.; Hojaev, A. S.
2006-08-01
The large-scale radio astronomy facility complex (analogous to the GBT at NRAO) is being created not far from Samarkand (Uzbekistan) on the Suffa plateau at 2300 m (Trimble, 2001). Originally it was designed as a basic part of the Earth-Space VLBI system (Kardashev et al., 1995; URL http://www.asc.rssi.ru/suffa/) and contains the radio telescope for the 0.8-60 mm band with a 70 m main reflector and two removable subreflectors; a satellite communication station; a data receiving and processing system; and other necessary infrastructure. The adaptive optics principle will be used to control the surface of the main mirror, which consists of 1200 trapezoidal panels. The site location provides good seeing conditions for the cm-mm range. Averaged annual atmospheric transmission coefficients at zenith were derived as 0.90-0.98 for the 3.1 mm and 5.8 mm wavelengths and about 0.60 for 1.36 mm (Hojaev & Shanin, 1996). The project, started back in the Soviet era, was stalled after the Union's disintegration. Quite recently a firm decision on completing the project was endorsed by our governments, with Russia to invest in it; the project's layouts have therefore been considerably modernized and updated in order to build a state-of-the-art instrument. It should be operational in 2009. We are now organizing a scientific consortium to explore the Suffa site more deeply and to learn the main 'radio astro climate' parameters by means of a new technology ('radioseeing', radio transparency in different submm, mm and cm bands, PWV, their intercorrelation and correlation with meteoparameters) for atmosphere modelling at the site, and to try to forecast the "radio-weather" for reliable planning of the future telescope's scientific schedule.
References: Kardashev N.S., Andreyanov V.V., Gvamichava A.S., Likhachev S.F., and Slysh V.I., 1995, Acta Astronautica, vol. 37, p. 271; Hojaev A.S., Shanin G.I., 1996, JKAS, v. 29, p. S411; Trimble, V., 2001, A Year of Discovery: Astronomy Highlights of 2000, Sky and Telescope, vol. 101, N. 2, p. 51
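The zenith transmission figures quoted in the abstract can be extended off-zenith under the standard plane-parallel atmosphere model, where transmission scales as T_zenith raised to the airmass, with airmass ≈ 1/cos(z). This model is my own assumption for illustration, not a calculation from the project:

```python
# Illustrative off-zenith extrapolation of atmospheric transmission using
# the plane-parallel airmass approximation (valid away from the horizon).
import math

def transmission(t_zenith, zenith_deg):
    """Transmission at zenith angle z, given zenith transmission t_zenith."""
    airmass = 1.0 / math.cos(math.radians(zenith_deg))
    return t_zenith ** airmass

# 3.1 mm band with T_zenith = 0.90 (value from the abstract), at z = 60 deg
# the airmass is 2, so transmission drops to 0.90**2 = 0.81.
print(round(transmission(0.90, 60.0), 2))  # 0.81
```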
NASA Astrophysics Data System (ADS)
Sayers, J.
2003-12-01
Teachers and students at Northview High School in Brazil, Indiana have the opportunity to engage in authentic scientific research through our participation in two national projects, TLRBSE and PEPP. Teacher Leaders in Research Based Science Education (TLRBSE) is a teacher professional development and retention program coupled with authentic scientific research projects in astronomy. Teacher-Leaders are trained in research-based pedagogy, serve as mentors to less experienced colleagues, and work with students to develop science research methods and research projects for the classroom. Astronomical data collected at Kitt Peak by astronomers and teachers is made available on CD for classroom use. Northview is in its second year as a TLRBSE school. The Princeton Earth Physics Project (PEPP) trains mentor teachers in fundamentals of research in seismology. Teachers and students then gain hands-on experience in science research through operation of a research-quality seismic station sited at the high school. Data from the Northview seismometer are stored locally and also transmitted over the Internet to a database at Indiana University. Students have access to local data as well as seismic databases accessible through the Internet to use for research projects. The Northview Seismic Station has been in operation since 1998. In this presentation, I will describe how these projects have been incorporated into the physics and earth science programs at Northview High School. I will discuss how our teachers and students have benefited from the opportunity to take part in hands-on scientific research under the guidance of university faculty. In particular, I will describe our participation in a regional seismic network through seismic data acquisition, data analysis using seismological software, and students' experiences in a university-based student research symposium.
I reflect on some of the successes of and barriers to high-school teachers' and students' involvement in scientific research programs. I conclude with a discussion of a successful student seismology project that was a finalist in the 2003 INTEL International Science and Engineering Fair.
GEO-6 project for Galileo data scientific utilization
NASA Astrophysics Data System (ADS)
Buresova, Dalia; Lastovicka, Jan; Boska, Josef; Sauli, Petra; Kouba, Daniel; Mosna, Zbysek
The future GNSS Galileo system offers a number of benefits (e.g., better positioning accuracy, new frequency bands allowing the implementation of specific techniques, provable time-stamp and location data using SIS authorisation, integrity, better support for ad-hoc data-analysis algorithms, and service guarantees for liability and regulated applications) that are widely spread among different disciplines. Applications that are less interesting from the commercial and market point of view could also contribute successfully to numerous social benefits and support innovation in international research. The aim of the GEO-6 project "Scientific Research Using GNSS" is to propose and broaden the scientific utilization of future GNSS Galileo system data in research. It is a joint project of seven institutions from six countries, led by the Atos Origin company from Spain. The core of the project consists of six projects in five priority areas: PA-1 Remote sensing of the ocean using GNSS reflections; PA-2a Investigating GNSS ionospheric data assimilation; PA-2b 3-D gravity wave detection and determination (both PA-2a and PA-2b are ionospheric topics); PA-3 Demonstration of capability for operational forecasting of atmospheric delays; PA-4 GNSS seismometer; PA-5 Spacecraft formation flying using global navigation satellite systems. The Institute of Atmospheric Physics, Prague, Czech Republic is responsible for project PA-2b, in which we developed and tested (to the extent allowed by available data) an algorithm and computer code for the 3-D detection of gravity waves and determination of their characteristics. The main drivers of the GEO-6 project are high levels of accuracy, even with the support of local elements, and the sharing of solutions and results with the worldwide scientific community. The paper will present a basic description of the project with more details concerning the Czech participation in it.
NASA Astrophysics Data System (ADS)
Bronarska, K.; Michalek, G.
2018-07-01
Since 1995 coronal mass ejections (CMEs) have been routinely observed thanks to the sensitive Large Angle and Spectrometric Coronagraphs (LASCO) on board the Solar and Heliospheric Observatory (SOHO) mission. Their observed characteristics are stored, among others, in the SOHO/LASCO catalog. These parameters are commonly used in scientific studies. Unfortunately, coronagraphic observations of CMEs are subject to projection effects. This makes it practically impossible to determine the true properties of CMEs and therefore makes it more difficult to forecast their geoeffectiveness. In this study, using quadrature observations with the two Solar Terrestrial Relations Observatory (STEREO) spacecraft, we estimate the projection effect affecting the velocity of CMEs included in the SOHO/LASCO catalog. It is demonstrated that this effect depends significantly on the width and source location of CMEs. It can be very significant for narrow events originating from the disk center. The effect diminishes with increasing width and absolute longitude of the source location of CMEs. For very wide (width ⩾ 250°) or limb (|longitude| ⩾ 70°) events, projection effects completely disappear.
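The longitude dependence described in the abstract can be illustrated with a simple geometric sketch (my own simplification, not the paper's method): for a CME propagating radially from a source at heliographic longitude lon, a coronagraph near the Sun-Earth line measures only the plane-of-sky component, roughly V_true · sin(|lon|), so the projection effect is largest for disk-center events and vanishes at the limb.

```python
# Illustrative plane-of-sky speed of a radially propagating CME
# (latitude ignored; observer on the Sun-Earth line).
import math

def projected_speed(v_true_kms, lon_deg):
    """Measured (plane-of-sky) speed for a radial CME at longitude lon_deg."""
    return v_true_kms * math.sin(math.radians(abs(lon_deg)))

v = 1000.0  # km/s, assumed true radial speed
for lon in (0, 30, 70, 90):
    print(lon, round(projected_speed(v, lon)))
# Disk-center events (lon = 0) appear slowest; limb events show the true speed.
```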
Student cognition and motivation during the Classroom BirdWatch citizen science project
NASA Astrophysics Data System (ADS)
Tomasek, Terry Morton
The purpose of this study was to examine and describe the ways various stakeholders (CBW project developer/coordinator, elementary and middle school teachers, and 5th through 8th grade students) envisioned, implemented and engaged in the citizen science project, eBird/Classroom BirdWatch. A multiple case study mixed-methods research design was used to examine student engagement in the cognitive processes associated with scientific inquiry as part of citizen science participation. Student engagement was described based on a sense of autonomy, competence, relatedness and intrinsic motivation. A goal of this study was to expand the taxonomy of differences between authentic scientific inquiry and simple inquiry to include those inquiry tasks associated with participation in citizen science by describing how students engaged in this type of science. This research study built upon the existing framework of cognitive processes associated with scientific inquiry described by Chinn and Malhotra (2002). This research provides a systematic analysis of the scientific processes and related reasoning tasks associated with the citizen science project eBird and the corresponding curriculum Classroom BirdWatch . Data consisted of responses to surveys, focus group interviews, document analysis and individual interviews. I suggest that citizen science could be an additional form of classroom-based science inquiry that can promote more authentic features of scientific inquiry and engage students in meaningful ways.
Opal web services for biomedical applications.
Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W
2010-07-01
Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
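The abstract describes wrapping scientific applications as web services by writing simple XML configuration files. A minimal sketch of generating such a configuration programmatically follows; the element names (`appConfig`, `binaryLocation`, etc.) are illustrative assumptions, not necessarily the actual Opal schema:

```python
# Hypothetical sketch: build an XML application-wrapping configuration of
# the kind the abstract describes. Element names are assumptions.
import xml.etree.ElementTree as ET

app = ET.Element("appConfig")
ET.SubElement(app, "usage").text = "MEME motif discovery"
ET.SubElement(app, "binaryLocation").text = "/opt/meme/bin/meme"  # assumed path
ET.SubElement(app, "parallel").text = "false"

xml_text = ET.tostring(app, encoding="unicode")
print(xml_text)
```

The design point the abstract emphasizes is that the scientific code itself is untouched: the service layer is driven entirely by such declarative configuration.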
The scientific programme between ESRO and ESA: Choosing new projects (1973-1977)
NASA Astrophysics Data System (ADS)
Russo, Arturo
1995-02-01
The ESA History Study Reports are preliminary reports of studies carried out within the framework of an ESA (European Space Agency) contract. They will form the basis of a comprehensive study of European Space activities covering the period 1959-1987. The transformation of ESRO (European Space Research Organization) into ESA found the Organization's bodies involved in a new round of the decision-making process to select future scientific satellite projects. In this report, the three main phases of the decision-making process are discussed. In the first, from June 1973 to April 1974, the European scientific community and its representatives in ESA's advisory committee structure were invited to agree on a set of space missions for which a definition study was recommended. At the end of this phase, thirteen missions were selected for such a definition study by ESRO's Scientific Program Board (SPB). The second phase covers the period from that decision up to March 1975, when a much more important decision was required, namely to select a restricted number of missions for which a feasibility study was to be performed. The aims of such feasibility studies were to establish the technical and financial feasibility of each project, to propose a well-defined project concept, to identify the research and technology effort required to support it, and to state a preliminary cost estimate to completion. The third phase covers the first two years of the new Agency's life, and concludes with the selection of the projects to be adopted in ESA's scientific program.
A new window on the cosmos: The Stratospheric Observatory for Infrared Astronomy (SOFIA)
NASA Astrophysics Data System (ADS)
Gehrz, R. D.; Becklin, E. E.; de Pater, I.; Lester, D. F.; Roellig, T. L.; Woodward, C. E.
2009-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a joint US/German Project to develop and operate a gyrostabilized 2.5-m telescope in a Boeing 747-SP. This observatory will allow astronomical observations from 0.3 μm to sub-millimeter wavelengths at stratospheric altitudes as high as 45,000 ft where the atmosphere is not only cloud-free, but largely transparent at infrared wavelengths. The dynamics and chemistry of interstellar matter, and the details of embedded star formation will be key science goals. In addition, SOFIA's unique portability will enable large-telescope observations at sites required to observe transient phenomena and location specific events. SOFIA will offer the convenient accessibility of a ground-based telescope for servicing, maintenance, and regular technology upgrades, yet will also have many of the performance advantages of a space-based telescope. Initially, SOFIA will fly with nine first-generation focal plane instruments that include broad-band imagers, moderate resolution spectrographs that will resolve broad features from dust and large molecules, and high resolution spectrometers capable of studying the chemistry and detailed kinematics of molecular and atomic gas. First science flights will begin in 2010, leading to a full operations schedule of about 120 8-10 h flights per year by 2014. The next call for instrument development that can respond to scientifically exciting new technologies will be issued in 2010. We describe the SOFIA facility and outline the opportunities for observations by the general scientific community with cutting edge focal plane technology. We summarize the operational characteristics of the first-generation instruments and give specific examples of the types of fundamental scientific studies these instruments are expected to make.
The NSF ITR Project: Framework for the National Virtual Observatory
NASA Astrophysics Data System (ADS)
Szalay, A. S.; Williams, R. D.; NVO Collaboration
2002-05-01
Technological advances in telescope and instrument design during the last ten years, coupled with the exponential increase in computer and communications capability, have caused a dramatic and irreversible change in the character of astronomical research. Large-scale surveys of the sky from space and ground are being initiated at wavelengths from radio to x-ray, thereby generating vast amounts of high quality irreplaceable data. The potential for scientific discovery afforded by these new surveys is enormous. Entirely new and unexpected scientific results of major significance will emerge from the combined use of the resulting datasets, science that would not be possible from such sets used singly. However, their large size and complexity require tools and structures to discover the complex phenomena encoded within them. We plan to build the NVO framework both through coordinating diverse efforts already in existence and providing a focus for the development of capabilities that do not yet exist. The NVO we envisage will act as an enabling and coordinating entity to foster the development of further tools, protocols, and collaborations necessary to realize the full scientific potential of large astronomical datasets in the coming decade. The NVO must be able to change and respond to the rapidly evolving world of IT technology. In spite of its underlying complex software, the NVO should be no harder for the average astronomer to use than today's brick-and-mortar observatories and telescopes. Development of these capabilities will require close interaction and collaboration with the information technology community and other disciplines facing similar challenges. We need to ensure that the tools we need exist or are built, while avoiding duplication of effort and drawing on the relevant experience of others.
NASA Astrophysics Data System (ADS)
Gorgas, Thomas; Conze, Ronald; Lorenz, Henning; Elger, Kirsten; Ulbricht, Damian; Wilkens, Roy; Lyle, Mitchell; Westerhold, Thomas; Drury, Anna Joy; Tian, Jun; Hahn, Annette
2017-04-01
Scientific ocean drilling over the past more than 40 years, and corresponding efforts on land (now spanning more than 20 years), have led to the accumulation of an enormous amount of valuable petrophysical, geochemical, biological and geophysical data obtained through laboratory and field experiments across a multitude of spatial and temporal scales. Such data can be utilized comprehensively in a holistic fashion, providing a basis for an enhanced "Core-Log-Integration" and for modeling from small-scale basin processes to large-scale Earth phenomena, while also storing and managing all relevant information in an "Open Access" fashion. Since the early 1990s, members of our team have acquired and measured a large dataset of physical and geochemical properties representing both terrestrial and marine geological environments. This dataset covers a variety of macro- to microscale dimensions, thereby allowing this type of interdisciplinary data examination. Over time, data management and processing tools have been developed and were recently merged with modern data publishing methods, which allow data and associated publications to be identified and tracked in a concise manner. Our current presentation summarizes an important part of the value chain in geosciences, comprising: 1) The state-of-the-art in data management for continental and lake drilling projects, performed with and through ICDP's Drilling Information System (DIS). 2) The CODD (Code for Ocean Drilling Data) as a numerical, programmable data processing toolbox applicable to both continental and marine drilling projects. 3) The implementation of Persistent Identifiers, such as the International Geo Sample Number (IGSN), to identify and track sample material as part of Digital Object Identifier (DOI)-tagged operation reports and research publications.
4) A list of contacts provided for scientists with an interest in learning and applying the methods and techniques we offer in the form of basic and advanced training courses at our respective research institutions and facilities around the world.
NASA Astrophysics Data System (ADS)
Tatge, C. B.; Slater, S. J.; Slater, T. F.; Schleigh, S.; McKinnon, D.
2016-12-01
Historically, an important part of the scientific research cycle is to situate any research project within the landscape of the existing scientific literature. In the field of discipline-based astronomy education research, grappling with the existing literature base has proven difficult because research reports from around the world, particularly early ones, are hard to obtain. In order to better survey and efficiently utilize the wide and fractured range of astronomy education research methods and results, the iSTAR (international Study of Astronomical Reasoning) database project was initiated. The project aims to host a living, online repository of dissertations, theses, journal articles, and grey-literature resources to serve the world's discipline-based astronomy education research community. The first domain of research artifacts ingested into the iSTAR database were doctoral dissertations. To the authors' great surprise, nearly 300 astronomy education research dissertations were found from the last 100 years. Few, if any, of the literature reviews from recent astronomy education dissertations come close to summarizing this many dissertations, most of which have not been published in traditional journals, as re-publishing one's dissertation research as a journal article was not a widespread custom in the education research community until recently. A survey of the iSTAR database dissertations reveals that the vast majority of work was largely quantitative in nature until the last decade. We also observe that modern-era astronomy education research writings reach as far back as 1923 and that the majority of dissertations come from the same eight institutions. Moreover, most of the astronomy education research work has covered learners' grasp of broad knowledge of astronomy rather than delving into specific learning targets, which has been more in vogue during the last two decades.
The surprisingly wide breadth of largely unknown research revealed in the iSTAR database motivates us to begin to synthesize the research and look for broader themes using widely accepted meta-analysis techniques.
Scientific impact: opportunity and necessity.
Cohen, Marlene Z; Alexander, Gregory L; Wyman, Jean F; Fahrenwald, Nancy L; Porock, Davina; Wurzbach, Mary E; Rawl, Susan M; Conn, Vicki S
2010-08-01
Recent National Institutes of Health changes have focused attention on the potential scientific impact of research projects. Research with the excellent potential to change subsequent science or health care practice may have high scientific impact. Only rigorous studies that address highly significant problems can generate change. Studies with high impact may stimulate new research approaches by changing understanding of a phenomenon, informing theory development, or creating new research methods that allow a field of science to move forward. Research with high impact can transition health care to more effective and efficient approaches. Studies with high impact may propel new policy developments. Research with high scientific impact typically has both immediate and sustained influence on the field of study. The article includes ideas to articulate potential scientific impact in grant applications as well as possible dissemination strategies to enlarge the impact of completed projects.
Dead Sea deep cores: A window into past climate and seismicity
NASA Astrophysics Data System (ADS)
Stein, Mordechai; Ben-Avraham, Zvi; Goldstein, Steven L.
2011-12-01
The area surrounding the Dead Sea was the locus of humankind's migration out of Africa and thus has been the home of peoples since the Stone Age. For this reason, understanding the climate and tectonic history of the region provides valuable insight into archaeology and studies of human history and helps to gain a better picture of future climate and tectonic scenarios. The deposits at the bottom of the Dead Sea are a geological archive of the environmental conditions (e.g., rains, floods, dust storms, droughts) during ice ages and warm ages, as well as of seismic activity in this key region. An International Continental Scientific Drilling Program (ICDP) deep drilling project was performed in the Dead Sea between November 2010 and March 2011. The project was funded by the ICDP and agencies in Israel, Germany, Japan, Norway, Switzerland, and the United States. Drilling was conducted using the new Large Lake Drilling Facility (Figure 1), a barge with a drilling rig run by DOSECC, Inc. (Drilling, Observation and Sampling of the Earth's Continental Crust), a nonprofit corporation dedicated to advancing scientific drilling worldwide. The main purpose of the project was to recover a long, continuous core to provide a high resolution record of the paleoclimate, paleoenvironment, paleoseismicity, and paleomagnetism of the Dead Sea Basin. With this, scientists are beginning to piece together a record of the climate and seismic history of the Middle East during the past several hundred thousand years in millennial to decadal to annual time resolution.
Störmer, A; Franz, R
2009-12-01
Most food packages and food-contact materials are manufactured using adhesives. The European Union regulates all food-contact materials: their constituents must not contaminate food or endanger consumers' health. In contrast to plastics, which are regulated by positive lists of authorized ingredients, adhesives are not yet covered by a specific regulation. The MIGRESIVES project aimed to elaborate a scientific global risk-assessment approach to meet current general European Union regulatory requirements, to serve as a basis for future specific European Union legislation, and to provide industry, especially small and medium-sized enterprises (SMEs), with a tool to ensure that migration from adhesives complies with the regulatory requirements. The idea was to demonstrate that consumers' exposure to chemicals released by adhesives is in many cases below levels of concern. Technical and scientific knowledge from industry and research institutes will be merged into a collective research endeavour gathering all stakeholders. The major milestones are (1) the classification of adhesives according to chemistry and uses, (2) test strategies based on the physico-chemical behaviour of adhesives, (3) modelling migration/exposure from adhesives, (4) guidelines to integrate the risk-assessment approach into the daily life of companies, (5) the feasibility of applying the toxicological approach from the European Union BIOSAFEPAPER project, and (6) extensive training/education for SMEs and broad dissemination for general adoption of the concept in Europe.
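Milestone (3), modelling migration/exposure, is commonly approached first with a worst-case screening estimate before any diffusion modelling. The sketch below is a minimal illustration of that screening idea, assuming 100% transfer of a substance from the adhesive layer into the food and the EU convention of 6 dm² of contact area per kg of food; the function name and parameter values are hypothetical, not taken from the MIGRESIVES deliverables.

```python
def worst_case_migration(c_adhesive_mg_per_kg: float,
                         coat_weight_g_per_m2: float,
                         area_dm2: float = 6.0,
                         food_kg: float = 1.0) -> float:
    """Worst-case migration (mg substance per kg food), assuming 100%
    transfer of the substance from the adhesive layer into the food.

    Uses the EU convention of 6 dm^2 of contact area per 1 kg of food.
    """
    area_m2 = area_dm2 / 100.0                        # 1 m^2 = 100 dm^2
    adhesive_kg = coat_weight_g_per_m2 * area_m2 / 1000.0
    migrated_mg = c_adhesive_mg_per_kg * adhesive_kg  # everything migrates
    return migrated_mg / food_kg

# Screening example: 500 mg/kg residual substance in a 5 g/m^2 adhesive layer
m = worst_case_migration(500.0, 5.0)
print(f"{m:.4f} mg/kg food")   # → 0.1500 mg/kg food
```

If even this total-transfer bound stays below the relevant limit, no refined diffusion modelling or migration testing is needed, which is the practical value of such a tiered approach for SMEs.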
An expert panel process to evaluate habitat restoration actions in the Columbia River estuary.
Krueger, Kirk L; Bottom, Daniel L; Hood, W Gregory; Johnson, Gary E; Jones, Kim K; Thom, Ronald M
2017-03-01
We describe a process for evaluating proposed ecosystem restoration projects intended to improve survival of juvenile salmon in the Columbia River estuary (CRE). Changes in the Columbia River basin (northwestern USA), including hydropower development, have contributed to the listing of 13 salmon stocks as endangered or threatened under the U.S. Endangered Species Act. Habitat restoration in the CRE, from Bonneville Dam to the ocean, is part of a basin-wide, legally mandated effort to mitigate federal hydropower impacts on salmon survival. An Expert Regional Technical Group (ERTG) was established in 2009 to improve and implement a process for assessing and assigning "survival benefit units" (SBUs) to restoration actions. The SBU concept assumes site-specific restoration projects will increase juvenile salmon survival during migration through the 234 km CRE. Assigned SBUs are used to inform selection of restoration projects and gauge mitigation progress. The ERTG standardized the SBU assessment process to improve its scientific integrity, repeatability, and transparency. In lieu of experimental data to quantify the survival benefits of individual restoration actions, the ERTG adopted a conceptual model composed of three assessment criteria (certainty of success, fish opportunity improvements, and habitat capacity improvements) to evaluate restoration projects. Based on these criteria, an algorithm assigned SBUs by integrating potential fish density as an indicator of salmon performance. Between 2009 and 2014, the ERTG assessed SBUs for 55 proposed projects involving a total of 181 restoration actions located across 8 of 9 reaches of the CRE, largely relying on information provided in a project template based on the conceptual model, presentations, discussions with project sponsors, and site visits. Most projects restored tidal inundation to emergent wetlands, improved riparian function, and removed invasive vegetation.
The scientific relationship of geomorphic and salmonid responses to restoration actions remains the foremost concern. Although not designed to establish a broad strategy for estuary restoration, the scoring process has adaptively influenced the types, designs, and locations of restoration proposals. The ERTG process may be a useful model for others who have unique ecosystem restoration goals and share some of our common challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.
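The three-criteria conceptual model lends itself to a simple weighted-scoring illustration. The sketch below is only a toy version of such an algorithm; the function name, the 0-1 scales, and the way the criteria are combined are all hypothetical stand-ins, not the ERTG's actual SBU formula.

```python
def survival_benefit_units(certainty: float,
                           opportunity: float,
                           capacity: float,
                           fish_density: float) -> float:
    """Toy SBU-style score: average the two habitat-improvement criteria,
    scale by potential fish density (the salmon-performance indicator),
    and discount by certainty of success.

    All inputs are assumed to be expert scores on a 0-1 scale.
    """
    for v in (certainty, opportunity, capacity, fish_density):
        if not 0.0 <= v <= 1.0:
            raise ValueError("criterion scores must lie in [0, 1]")
    habitat_benefit = (opportunity + capacity) / 2.0
    return certainty * habitat_benefit * fish_density

# A project with solid certainty, moderate habitat gains, good fish density
score = survival_benefit_units(0.8, 0.6, 0.4, 0.7)
```

Multiplying by certainty (rather than adding it) reflects one design choice such a scheme could make: a project with near-zero certainty of success earns near-zero credit regardless of its potential habitat benefits.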
Curriculum and Faculty Development for the Teaching of Academic Research Ethics.
ERIC Educational Resources Information Center
Stern, Judy E.; Elliott, Deni
This report summarizes a three-year project to design a graduate level course in ethics and scientific research at Dartmouth College (New Hampshire). The goals of the project were: (1) to train faculty to teach a course in research ethics, (2) to pilot-teach a graduate course in ethics and scientific research, and (3) to develop teaching materials…
ERIC Educational Resources Information Center
Shakuto, Elena A.; Dorozhkin, Evgenij M.; Kozlova, Anastasia A.
2016-01-01
The relevance of the subject under analysis is determined by the lack of theoretical development of the problem of management of teacher scientific-methodical work in vocational educational institutions based upon innovative approaches in the framework of project paradigm. The purpose of the article is to develop and test a science-based…
A technology program for the development of the large deployable reflector for space based astronomy
NASA Technical Reports Server (NTRS)
Kiya, M. K.; Gilbreath, W. P.; Swanson, P. N.
1982-01-01
Technologies for the development of the Large Deployable Reflector (LDR), a NASA project for the 1990s, for infrared and submillimeter astronomy are presented. The proposed LDR is a 10-30 m diameter spaceborne observatory operating in the spectral region from 30 microns to one millimeter, where ground observations are nearly impossible. Scientific rationales for such a system include the study of ancient signals from galaxies at the edge of the universe, the study of star formation, and the observation of fluctuations in the cosmic background radiation. System requirements include the ability to observe faint objects at large distances and to map molecular clouds and H II regions. From these requirements, mass, photon noise, and tolerance budgets are developed. A strawman concept is established, and some alternate concepts are considered, but research is still necessary in the areas of segment, optical control, and instrument technologies.
Global artificial photosynthesis project: a scientific and legal introduction.
Faunce, Thomas
2011-12-01
With the global human population set to exceed 10 billion by 2050, its collective energy consumption set to rise from 400 to over 500 EJ/yr, and with the natural environment under increasing pressure from these sources as well as from anthropogenic climate change, political solutions such as the creation of an efficient carbon price and trading scheme may arrive too late. In this context, the scientific community is exploring technological remedies. Central to these options is artificial photosynthesis: the creation, particularly through nanotechnology, of devices capable of doing what plants have done for millions of years, transforming sunlight, water and carbon dioxide into food and fuel. This article argues that a Global Artificial Photosynthesis (GAP) project can raise the public profile and encourage the pace, complexity and funding of scientific collaborations in artificial photosynthesis research. The legal structure of a GAP project will be critical to prevent issues such as state sovereignty over energy and food resources and corporate intellectual monopoly privileges from unduly inhibiting the important contribution of artificial photosynthesis to global public health and environmental sustainability. The article presents an introduction to the scientific and legal concepts behind a GAP project.
The control and data acquisition structure for the GAMMA-400 space gamma-telescope
NASA Astrophysics Data System (ADS)
Arkhangelskiy, Andrey
2016-07-01
The GAMMA-400 space project is intended for precision investigation of cosmic gamma-ray emission in the energy band from the keV region up to several TeV, of electron and positron fluxes from ~1 GeV up to ~10 TeV, and of high-energy cosmic-ray nuclei fluxes. A description of the control and data acquisition structure for the gamma-telescope involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment supporting the functioning of the scientific instrumentation and satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands and program commands transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, engineering solutions and electronic component base selection has been verified experimentally with the scientific complex prototype under laboratory conditions.
Introduction to the HL-LHC Project
NASA Astrophysics Data System (ADS)
Rossi, L.; Brüning, O.
The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. It has been exploring the new energy frontier since 2010, gathering a global user community of 7,000 scientists. To extend its discovery potential, the LHC will need a major upgrade in the 2020s to increase its luminosity (rate of collisions) by a factor of five beyond its design value and the integrated luminosity by a factor of ten. As a highly complex and optimized machine, such an upgrade of the LHC must be carefully studied and requires about ten years to implement. The novel machine configuration, called High Luminosity LHC (HL-LHC), will rely on a number of key innovative technologies, representing exceptional technological challenges, such as cutting-edge 11-12 tesla superconducting magnets, very compact superconducting cavities for beam rotation with ultra-precise phase control, new technology for beam collimation and 300-meter-long high-power superconducting links with negligible energy dissipation. HL-LHC federates the efforts and R&D of a large community in Europe, in the US and in Japan, which will facilitate the implementation of the construction phase as a global project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Daniel S; Jha, Shantenu; Weissman, Jon
2017-01-31
This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented as software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (bundles), derives a suitable execution strategy for the given skeleton, and enacts its execution by means of pilots on one or more resources, depending on the application requirements and resource availabilities and capabilities.
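The four abstractions and the skeleton-to-strategy derivation can be pictured as a thin pipeline. The sketch below is an illustrative stand-in, not the actual AIMES middleware API: the class names, fields, and the largest-bundle heuristic are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Skeleton:                 # application layer: shape of the workload
    tasks: int
    runtime_per_task_s: float

@dataclass
class Bundle:                   # resource layer: shape of one resource pool
    name: str
    cores: int

@dataclass
class ExecutionStrategy:        # derived plan: where and how to run
    bundle: Bundle
    waves: int                  # sequential waves of tasks on the pilot

def derive_strategy(skeleton: Skeleton, bundles: list[Bundle]) -> ExecutionStrategy:
    """Toy derivation: place a pilot on the largest bundle and compute how
    many sequential waves of tasks it needs to drain the skeleton."""
    best = max(bundles, key=lambda b: b.cores)
    waves = -(-skeleton.tasks // best.cores)      # ceiling division
    return ExecutionStrategy(best, waves)

app = Skeleton(tasks=1000, runtime_per_task_s=60.0)
plan = derive_strategy(app, [Bundle("clusterA", 128), Bundle("clusterB", 256)])
print(plan.bundle.name, plan.waves)   # → clusterB 4
```

The point of the separation is the one the report makes: application-side information (the skeleton) and resource-side information (the bundles) stay independent, and only the strategy-derivation step couples them.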