Forget the hype or reality. Big data presents new opportunities in Earth Science.
NASA Astrophysics Data System (ADS)
Lee, T. J.
2015-12-01
Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing large volumes of highly diverse data. We were dealing with big data before the term existed. For example, while developing the EOS program in the 1980s, NASA built the EOS Data and Information System (EOSDIS) to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has remained a shining example of a modern science data system for the past two decades. With the explosion of the internet, the rise of social media, and the proliferation of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day, and there is a significant shortage of algorithms and skilled people to mine these data. In response, the federal government created big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared with the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
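A minimal sketch of the MapReduce pattern mentioned above, assuming hypothetical satellite grid-cell records (the names and values are illustrative, not from the talk): each record is mapped to a (cell, value) pair, then all values for a cell are reduced to a per-cell mean.

```python
from collections import defaultdict

# Hypothetical records: (grid_cell_id, measured_value), e.g. from a satellite swath.
records = [("cell_001", 280.4), ("cell_002", 275.1),
           ("cell_001", 281.0), ("cell_002", 276.3)]

def map_phase(record):
    """Map step: emit a (key, value) pair for each input record."""
    cell, value = record
    return cell, value

def reduce_phase(pairs):
    """Reduce step: group values by key and reduce each group to a mean."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

if __name__ == "__main__":
    mapped = [map_phase(r) for r in records]   # map
    cell_means = reduce_phase(mapped)          # shuffle + reduce
    print(cell_means)  # {'cell_001': 280.7, 'cell_002': 275.7}
```

In a real framework the map and reduce functions would run in parallel over distributed partitions; the sketch only shows the shape of the computation.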
Big Science and the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Giudice, Gian Francesco
2012-03-01
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.
NASA Astrophysics Data System (ADS)
Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.
2017-12-01
Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers that follow in this ESSI session.
ERIC Educational Resources Information Center
Ward, Tony J.; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij
2016-01-01
"Air Toxics Under the Big Sky" is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. This research explored: (1)…
Evolution of the Air Toxics under the Big Sky Program
ERIC Educational Resources Information Center
Marra, Nancy; Vanek, Diana; Hester, Carolyn; Holian, Andrij; Ward, Tony; Adams, Earle; Knuth, Randy
2011-01-01
As a yearlong exploration of air quality and its relation to respiratory health, the "Air Toxics Under the Big Sky" program offers opportunities for students to learn and apply science process skills through self-designed inquiry-based research projects conducted within their communities. The program follows a systematic scope and sequence…
A Proposed Concentration Curriculum Design for Big Data Analytics for Information Systems Students
ERIC Educational Resources Information Center
Molluzzo, John C.; Lawler, James P.
2015-01-01
Big Data is becoming a critical component of the Information Systems curriculum. Educators are gradually enhancing the concentration curriculum for Big Data in schools of computer science and information systems. This paper proposes a creative curriculum design for Big Data Analytics for a program at a major metropolitan university. The design…
MiTEP's Collaborative Field Course Design Process Based on Earth Science Literacy Principles
NASA Astrophysics Data System (ADS)
Engelmann, C. A.; Rose, W. I.; Huntoon, J. E.; Klawiter, M. F.; Hungwe, K.
2010-12-01
Michigan Technological University has developed a collaborative process for designing summer field courses for teachers as part of its National Science Foundation funded Math Science Partnership program, called the Michigan Teacher Excellence Program (MiTEP). This design process was implemented and then piloted during two two-week courses: Earth Science Institute I (ESI I) and Earth Science Institute II (ESI II). Participants consisted of a small group of Michigan urban science teachers who are members of the MiTEP program. The Earth Science Literacy Principles (ESLP) served as the framework for course design, in conjunction with input from participating MiTEP teachers as well as research on common teacher and student misconceptions in Earth Science. Research on the Earth Science misconception component, aligned to the ESLP, is more fully addressed in GSA Abstracts with Programs Vol. 42, No. 5, "Recognizing Earth Science Misconceptions and Reconstructing Knowledge through Conceptual-Change-Teaching". The ESLP were released to the public in January 2009 by the Earth Science Literacy Organizing Committee and can be found at http://www.earthscienceliteracy.org/index.html. Each of the first nine days of both Institutes focused on one of the nine ESLP Big Ideas; the tenth day emphasized integration of concepts across all of the ESLP Big Ideas. Throughout each day, Michigan Tech graduate student facilitators and professors from Michigan Tech and Grand Valley State University consistently focused teaching and learning on the day's Big Idea. Many Earth Science experts from Michigan Tech and Grand Valley State University joined the MiTEP teachers in the field or on campus, giving presentations on the latest research in their area related to that Big Idea. Field sites were chosen for their unique geological features as well as for the "sense of place" each site provided. Preliminary research findings indicate that this collaborative design process, piloted as ESI I and ESI II, was successful in improving MiTEP teachers' understanding of Earth Science content and that the ESLP framework was helpful. Ultimately, a small sample of student scores will be used to examine the impact on student learning in the MiTEP teachers' classrooms.
Big data science: A literature review of nursing research exemplars.
Westra, Bonnie L; Sylvia, Martha; Weinfurter, Elizabeth F; Pruinelli, Lisiane; Park, Jung In; Dodd, Dianna; Keenan, Gail M; Senk, Patricia; Richesson, Rachel L; Baukner, Vicki; Cruz, Christopher; Gao, Grace; Whittenburg, Luann; Delaney, Connie W
Big data and cutting-edge analytic methods in nursing research challenge nurse scientists to extend the data sources and analytic methods used for discovering and translating knowledge. The purpose of this study was to identify, analyze, and synthesize exemplars of big data nursing research applied to practice and disseminated in key nursing informatics, general biomedical informatics, and nursing research journals. A literature review was conducted of studies published between 2009 and 2015. There were 650 journal articles identified in 17 key nursing informatics, general biomedical informatics, and nursing research journals in the Web of Science database. After screening for inclusion and exclusion criteria, 17 studies published in 18 articles were identified as big data nursing research applied to practice. Nurses clearly are beginning to conduct big data research applied to practice. These studies represent multiple data sources and settings. Although numerous analytic methods were used, the fundamental issue remains how to define the types of analyses consistent with big data analytic methods. There is a need to increase the visibility of big data and data science research conducted by nurse scientists, to further examine the state of the science in data analytics, and to continue to expand the availability and use of a variety of scientific, governmental, and industry data resources. A major implication of this literature review is the question of whether nursing faculty and the preparation of future scientists (PhD programs) are ready for big data and data science. Copyright © 2016 Elsevier Inc. All rights reserved.
The NASA Beyond Einstein Program
NASA Technical Reports Server (NTRS)
White, Nicholas E.
2006-01-01
Einstein's legacy is incomplete: his theory of General Relativity raises -- but cannot answer -- three profound questions: What powered the Big Bang? What happens to space, time, and matter at the edge of a black hole? And what is the mysterious dark energy pulling the Universe apart? The Beyond Einstein program within NASA's Office of Space Science aims to answer these questions, employing a series of missions linked by powerful new technologies and complementary approaches towards shared science goals. The Beyond Einstein program has three linked elements which advance science and technology towards two visions: to directly detect gravitational wave signals from the earliest possible moments of the Big Bang, and to image the event horizon of a black hole. The central element is a pair of Einstein Great Observatories, Constellation-X and LISA. Constellation-X is a powerful new X-ray observatory dedicated to X-ray spectroscopy. LISA is the first space-based gravitational wave detector. These powerful facilities will blaze new paths to the questions about black holes, the Big Bang, and dark energy. The second element is a series of competitively selected Einstein Probes, each focused on one of the science questions; this element includes a mission dedicated to resolving the Dark Energy mystery. The third element is a program of technology development, theoretical studies, and education. The Beyond Einstein program is a new element in the proposed NASA budget for 2004. This talk will give an overview of the program and the missions contained within it.
NASA Astrophysics Data System (ADS)
Campbell, J. L.; Burrows, S.; Gower, S. T.; Cohen, W. B.
1999-09-01
The BigFoot Project is funded by the Earth Science Enterprise to collect and organize data to be used in the EOS Validation Program. The data collected by the BigFoot Project are unique in being ground-based observations coincident with satellite overpasses. In addition to collecting data, the BigFoot Project will develop and test new algorithms for scaling point measurements to the same spatial scales as the EOS satellite products. This BigFoot Field Manual will be used to achieve completeness and consistency of data collected at four initial BigFoot sites and at future sites that may collect similar validation data. Therefore, validation datasets submitted to the ORNL DAAC that have been compiled in a manner consistent with the field manual will be especially valuable in the validation program.
Beyond Einstein: From the Big Bang to Black Holes
NASA Astrophysics Data System (ADS)
White, N.
Beyond Einstein is a science-driven program of missions, education and outreach, and technology, to address three questions: What powered the Big Bang? What happens to space, time, and matter at the edge of a black hole? What is the mysterious dark energy pulling the universe apart? To address the science objectives, Beyond Einstein contains several interlinked elements. The strategic missions Constellation-X and LISA primarily investigate the nature of black holes. Constellation-X is a spectroscopic observatory that uses X-ray emitting atoms as clocks to follow the fate of matter falling into black holes. LISA will be the first space-based gravitational wave observatory, using gravitational waves to measure the dynamic structure of space and time around black holes. Moderate-sized probes are fully competed, peer-reviewed missions (300M-450M), launched every 3-5 years to address focused science goals: 1) determine the nature of the dark energy that dominates the universe, 2) search for the signature of the beginning of the Big Bang in the microwave background, and 3) take a census of black holes of all sizes and ages in the universe. The final element is a Technology Program to enable ultimate Vision Missions (after 2015) to directly detect gravitational waves echoing from the beginning of the Big Bang, and to directly image matter near the event horizon of a black hole. An associated Education and Public Outreach Program will inspire the next generation of scientists and support national science standards and benchmarks.
Beyond Einstein: from the Big Bang to black holes
NASA Astrophysics Data System (ADS)
White, Nicholas E.; Diaz, Alphonso V.
2004-01-01
How did the Universe begin? Does time have a beginning and an end? Does space have edges? Einstein's theory of relativity replied to these ancient questions with three startling predictions: that the Universe is expanding from a Big Bang; that black holes so distort space and time that time stops at their edges; and that a dark energy could be pulling space apart, sending galaxies forever beyond the edge of the visible Universe. Observations confirm these remarkable predictions, the last only four years ago. Yet Einstein's legacy is incomplete. His theory raises - but cannot answer - three profound questions: What powered the Big Bang? What happens to space, time and matter at the edge of a black hole? And what is the mysterious dark energy pulling the Universe apart? The Beyond Einstein program within NASA's Office of Space Science aims to answer these questions, employing a series of missions linked by powerful new technologies and complementary approaches to shared science goals. The program also serves as a potent force with which to enhance science education and science literacy.
Small Bodies, Big Concepts: Bringing Visual Analysis into the Middle School Classroom
NASA Astrophysics Data System (ADS)
Cobb, W. H.; Lebofsky, L. A.; Ristvey, J. D.; Buxner, S.; Weeks, S.; Zolensky, M. E.
2012-03-01
A multi-disciplinary professional development (PD) model digs into high-end planetary science, backed by a pedagogical framework, Designing Effective Science Instruction. NASA activities are sequenced to promote visual analysis of emerging data from Discovery Program missions.
Bigfoot Field Manual, Version 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, J.L.; Burrows, S.; Gower, S.T.
1999-09-01
The BigFoot Project is funded by the Earth Science Enterprise to collect and organize data to be used in the National Aeronautics and Space Administration's Earth Observing System (EOS) Validation Program. The data collected by the BigFoot Project are unique in being ground-based observations coincident with satellite overpasses. In addition to collecting data, the BigFoot Project will develop and test new algorithms for scaling point measurements to the same spatial scales as the EOS satellite products. This BigFoot Field Manual will be used to achieve completeness and consistency of data collected at four initial BigFoot sites and at future sites that may collect similar validation data. Therefore, validation datasets submitted to the Oak Ridge National Laboratory Distributed Active Archive Center that have been compiled in a manner consistent with the field manual will be especially valuable in the validation program.
Ward, Tony J; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij
2016-01-01
Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. A quasi-experimental design was used in order to understand: 1) how the program affects student understanding of scientific inquiry and research and 2) how the open inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom.
NASA Astrophysics Data System (ADS)
Ward, Tony J.; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij
2016-04-01
Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. This research explored: (1) how the program affects student understanding of scientific inquiry and research and (2) how the open-inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom.
Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij
2016-01-01
Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. A quasi-experimental design was used in order to understand: 1) how the program affects student understanding of scientific inquiry and research and 2) how the open inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom. PMID:28286375
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach
Cheung, Mike W.-L.; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
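A minimal sketch of the split/analyze/meta-analyze idea, written in Python rather than the authors' R code and using a simulated dataset (all names and numbers are illustrative, not from the paper): the data are split into chunks, a correlation and its Fisher-z variance are computed per chunk, and the chunk estimates are pooled with fixed-effect, inverse-variance weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "big" dataset: two correlated variables with one million rows.
n = 1_000_000
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)
data = np.column_stack([x, y])

def analyze(chunk):
    """Analyze one split: return the Fisher-z correlation and its variance."""
    r = np.corrcoef(chunk[:, 0], chunk[:, 1])[0, 1]
    z = np.arctanh(r)                # Fisher z-transform
    var = 1.0 / (len(chunk) - 3)     # approximate sampling variance of z
    return z, var

def meta_analyze(results):
    """Pool split-level estimates with fixed-effect (inverse-variance) weights."""
    z = np.array([res[0] for res in results])
    w = 1.0 / np.array([res[1] for res in results])
    z_pooled = np.sum(w * z) / np.sum(w)
    return np.tanh(z_pooled)         # back-transform to a correlation

# Split into 100 chunks, analyze each, then meta-analyze the chunk results.
splits = np.array_split(data, 100)
pooled_r = meta_analyze([analyze(chunk) for chunk in splits])
print(f"pooled correlation estimate: {pooled_r:.3f}")
```

The appeal of the pattern is that each split fits in memory and uses familiar psychometric or multivariate methods, while the meta-analytic pooling recovers an estimate for the full dataset.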
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.
Cheung, Mike W-L; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
Huang, Ying; Zhang, Yi; Youtie, Jan; Porter, Alan L.; Wang, Xuefeng
2016-01-01
How do funding agencies ramp-up their capabilities to support research in a rapidly emerging area? This paper addresses this question through a comparison of research proposals awarded by the US National Science Foundation (NSF) and the National Natural Science Foundation of China (NSFC) in the field of Big Data. Big data is characterized by its size and difficulties in capturing, curating, managing and processing it in reasonable periods of time. Although Big Data has its legacy in longstanding information technology research, the field grew very rapidly over a short period. We find that the extent of interdisciplinarity is a key aspect in how these funding agencies address the rise of Big Data. Our results show that both agencies have been able to marshal funding to support Big Data research in multiple areas, but the NSF relies to a greater extent on multi-program funding from different fields. We discuss how these interdisciplinary approaches reflect the research hot-spots and innovation pathways in these two countries. PMID:27219466
Huang, Ying; Zhang, Yi; Youtie, Jan; Porter, Alan L; Wang, Xuefeng
2016-01-01
How do funding agencies ramp-up their capabilities to support research in a rapidly emerging area? This paper addresses this question through a comparison of research proposals awarded by the US National Science Foundation (NSF) and the National Natural Science Foundation of China (NSFC) in the field of Big Data. Big data is characterized by its size and difficulties in capturing, curating, managing and processing it in reasonable periods of time. Although Big Data has its legacy in longstanding information technology research, the field grew very rapidly over a short period. We find that the extent of interdisciplinarity is a key aspect in how these funding agencies address the rise of Big Data. Our results show that both agencies have been able to marshal funding to support Big Data research in multiple areas, but the NSF relies to a greater extent on multi-program funding from different fields. We discuss how these interdisciplinary approaches reflect the research hot-spots and innovation pathways in these two countries.
Business and Science - Big Data, Big Picture
NASA Astrophysics Data System (ADS)
Rosati, A.
2013-12-01
Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, it defines what it means. But business is very different from science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in the business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.
NASA Astrophysics Data System (ADS)
Hu, X.; Zou, Z.
2017-12-01
For the coming decades, a comprehensive big data application environment is the dominant direction of cyberinfrastructure development for space science. To make the concept of such a BIG cyberinfrastructure (e.g. Digital Space) a reality, several capabilities must be developed and integrated: the science data system, the digital space engine, big data applications (tools and models), and the IT infrastructure. In the past few years, the CAS Chinese Space Science Data Center (CSSDC) has made a helpful attempt in this direction. A cloud-enabled virtual research platform for space science, called the Solar-Terrestrial and Astronomical Research Network (STAR-Network), has been developed to serve the full lifecycle of space science missions and research activities. It integrates a wide range of disciplinary and interdisciplinary resources to provide science-problem-oriented data retrieval and query services, collaborative mission demonstration services, mission operation support services, space weather computing and analysis services, and other self-service capabilities. The platform is supported by persistent infrastructure, including cloud storage, cloud computing, and supercomputing. Different varieties of resources are interconnected: science data can be displayed in the browser by visualization tools, data analysis tools and physical models can be driven by the applicable science data, and computing results can be saved to the cloud, for example. So far, STAR-Network has served a series of space science missions in China, including the Strategic Pioneer Program on Space Science (which has launched space science satellites such as DAMPE, HXMT, and QUESS, with more to follow around 2020) and the Meridian Space Weather Monitor Project. Scientists have obtained new findings using the science data from these missions, with STAR-Network's contribution. We are confident that STAR-Network is an exciting demonstration of a new cyberinfrastructure architecture for space science.
Big Data Science Cafés: High School Students Experiencing Real Research with Scientists
NASA Astrophysics Data System (ADS)
Walker, C. E.; Pompea, S. M.
2017-12-01
The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab - their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), a computer lab activity, and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director, and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program and opportunities to take on responsibilities and learn leadership and communication skills, as well as fostering their continued interest in STEM. The prototype Big Data Science Academy was implemented successfully in spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps. However, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.
Envisioning the future of 'big data' biomedicine.
Bui, Alex A T; Van Horn, John Darrell
2017-05-01
Through the increasing availability of more efficient data collection procedures, biomedical scientists are now confronting ever larger sets of data, often finding themselves struggling to process and interpret what they have gathered, even as still more data continue to accumulate. This torrent of biomedical information necessitates creative thinking about how the data are being generated, how they might be best managed and analyzed, and eventually how they can be transformed into further scientific understanding for improving patient care. Recognizing this as a major challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program - the agency's most ambitious biomedical informatics effort to date. In this commentary, we describe how the NIH has taken on "big data" science head-on, how a consortium of leading research centers is developing the means for handling large-scale data, and how such activities are being marshalled for the training of a new generation of biomedical data scientists. All in all, the NIH BD2K program seeks to position data science at the heart of 21st century biomedical research. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Smith, D. A.; Peticolas, L.; Schwerin, T.; Shipp, S.; Manning, J. G.
2014-07-01
For nearly two decades, NASA has embedded education and public outreach (EPO) in its Earth and space science missions and research programs on the principle that science education is most effective when educators and scientists work hand-in-hand. Four Science EPO Forums organize the respective NASA Science Mission Directorate (SMD) Astrophysics, Earth Science, Heliophysics, and Planetary Science EPO programs into a coordinated, efficient, and effective nationwide effort. The NASA SMD EPO program evaluates EPO impacts that support NASA's policy of providing a direct return-on-investment for the American public, advances STEM education and literacy, and enables students and educators to participate in the practice of science as embodied in the 2013 Next Generation Science Standards. Leads of the four NASA SMD Science EPO Forums provided big-picture perspectives on NASA's effort to incorporate authentic science into the nation's STEM education and scientific literacy, highlighting examples of program effectiveness and impact. Attendees gained an increased awareness of the depth and breadth of NASA SMD's EPO programs and achievements, the magnitude of its impacts through representative examples, and the ways current and future EPO programs can build upon the work being done.
Small Things Draw Big Interest
ERIC Educational Resources Information Center
Green, Susan; Smith III, Julian
2005-01-01
Although the microscope is a basic tool in both the physical and biological sciences, it is notably absent from most elementary school science programs. One reason teachers find it challenging to introduce microscopy at the elementary level is that children can have a hard time connecting the image of an object seen through a microscope with what…
Through the Eyes of NASA: NASA's 2017 Eclipse Education Program
NASA Astrophysics Data System (ADS)
Mayo, L.
2017-12-01
Over the last three years, NASA has been developing plans to bring the August 21st total solar eclipse to the nation, "as only NASA can", leveraging its considerable space assets, technology, scientists, and its unmatched commitment to science education. The eclipse, long anticipated by many groups, represents the largest Big Event education program that NASA has ever undertaken. It is the latest in a long string of successful Big Event international celebrations going back two decades, including both transits of Venus, three solar eclipses, solar maximum, and mission events such as the MSL/Curiosity landing on Mars and the launch of the Lunar Reconnaissance Orbiter (LRO), to name a few. This talk will detail NASA's program development methods, strategic partnerships, and strategies for using this celestial event to engage the nation and improve overall science literacy.
Semantic Web technologies for the big data in life sciences.
Wu, Hongyan; Yamaguchi, Atsuko
2014-08-01
The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
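A minimal sketch of the kind of integration the survey describes, assuming the widely used rdflib Python package and entirely hypothetical resources and predicates (nothing below comes from the paper): records from two notional sources are loaded into one RDF graph and queried together with SPARQL.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/bio/")
g = Graph()

# "Source 1": a gene annotation expressed as RDF triples.
gene = EX["BRCA1"]
g.add((gene, RDF.type, EX.Gene))
g.add((gene, EX.associatedWith, EX["breast_cancer"]))

# "Source 2": a drug record from a different (hypothetical) dataset.
drug = EX["olaparib"]
g.add((drug, RDF.type, EX.Drug))
g.add((drug, EX.targets, gene))

# One SPARQL query spans both sources once they share identifiers.
query = """
    SELECT ?drug ?disease WHERE {
        ?drug  <http://example.org/bio/targets>        ?gene .
        ?gene  <http://example.org/bio/associatedWith> ?disease .
    }
"""
for row in g.query(query):
    print(f"{row.drug} may be relevant to {row.disease}")
```

The shared URIs are what make the heterogeneous records joinable; in practice the graph would be built from published ontologies and linked-data endpoints rather than hand-written triples.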
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.
2014-12-01
Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.
Commentary: Epidemiology in the era of big data.
Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M
2015-05-01
Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.
Epidemiology in the Era of Big Data
Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M
2015-01-01
Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221
Big Biology: Supersizing Science During the Emergence of the 21st Century
Vermeulen, Niki
2017-01-01
Is biology the youngest member of the Big Science family? Increased collaboration in biological research became the subject of heated discussion in the wake of the Human Genome Project, but debates and reflections mostly remained polemical and showed limited appreciation for the diversity and explanatory power of the concept of Big Science. At the same time, scholars of science and technology studies have avoided the term Big Science in their descriptions of the changing research landscape. This interdisciplinary article combines a conceptual analysis of Big Science with diverse data and ideas from a multi-method study of several large research projects in biology. The aim is to develop an empirically grounded, nuanced, and analytically useful understanding of Big Biology and to move beyond the normative debates, with their simple dichotomies and rhetorical positions. While the concept of Big Science can be seen as a fashion in science policy, by now perhaps even an old-fashioned one, I argue that its analytical use directs our attention to the expansion of collaboration in the life sciences. The analysis of Big Biology reveals differences from Big Physics and other forms of Big Science, namely in the patterns of research organization, the technologies used, and the societal contexts in which it operates. Reflections on Big Science, Big Biology, and their relationship to knowledge production can thus place recent claims about fundamental changes in life science research in historical context. PMID:27215209
Adapting bioinformatics curricula for big data.
Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H
2016-01-01
Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.
Adapting bioinformatics curricula for big data
Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.
2016-01-01
Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy's (DOE's) Office of Science's programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE's most powerful leadership-class machines are set to arrive in 2017 and 2018—the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory–industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
Costa, Fabricio F
2014-04-01
The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.
Big Science and Big Big Science
ERIC Educational Resources Information Center
Marshall, Steve
2012-01-01
In his introduction to the science shows feature in "Primary Science" 115, Ian B. Dunne asks the question "Why have science shows?" He lists a host of very sound reasons, starting with "science is fun", so why not engage and entertain, inspire, grab attention, and encourage them to learn? He goes on to state that: "Even in today's…
Nursing Needs Big Data and Big Data Needs Nursing.
Brennan, Patricia Flatley; Bakken, Suzanne
2015-09-01
Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.
Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H
2015-01-01
Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; its current manifestation is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.
Using "The Big Bang Theory's" World in Young High-Potentials Education
NASA Astrophysics Data System (ADS)
Leitner, J. J.; Taubner, R.-S.; Firneis, M. G.; Hitzenberger, R.
2014-04-01
One of the cornerstones of the Research Platform: ExoLife, University of Vienna, Austria, is public outreach and education with respect to astrobiology, exoplanets, and planetary sciences. Since 2009, several initiatives have been started by the Research Platform to focus the interest of students inside and outside the University on the natural sciences. Additionally, there are two special programs - one in adult education and one in the training/education of young high-potentials. In these programs, astrobiology (and, within this context, also the planetary sciences), a highly interdisciplinary scientific discipline that fascinates youngsters and junior scientists, is utilized to direct their thirst for knowledge and their curiosity to natural science topics (see [1, 2]).
ERIC Educational Resources Information Center
Stetson, Emily
1991-01-01
Thanks to two enterprising teachers, a basement greenhouse has energized an inner-city elementary school in Brooklyn, New York. Though the basement jungle is the most visible part of the science program, students tend outdoor plants and integrate horticulture into all curriculum areas. (MLH)
Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona
2013-01-01
Abstract Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease burdens, therapeutics, and diagnostic needs. We report the creation of ten Type 2 micro-grants for citizen science and artisan labs to be administered by the nonprofit Data-Enabled Life Sciences Alliance International (DELSA Global, Seattle). Our hope is that these micro-grants will spur novel forms of disruptive innovation and genomics translation by artisan scientists and citizen scholars alike. We conclude with a neglected voice from the global health frontlines, the American University of Iraq in Sulaimani, and suggest that many similar global regions are now poised for micro-grant enabled collective innovation to harness the 21st century digital commons. PMID:23574338
Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona
2013-04-01
Biomedical science in the 21st century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists-only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion"-the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease burdens, therapeutics, and diagnostic needs. We report the creation of ten Type 2 micro-grants for citizen science and artisan labs to be administered by the nonprofit Data-Enabled Life Sciences Alliance International (DELSA Global, Seattle). Our hope is that these micro-grants will spur novel forms of disruptive innovation and genomics translation by artisan scientists and citizen scholars alike. We conclude with a neglected voice from the global health frontlines, the American University of Iraq in Sulaimani, and suggest that many similar global regions are now poised for micro-grant enabled collective innovation to harness the 21st century digital commons.
Big Machines and Big Science: 80 Years of Accelerators at Stanford
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loew, Gregory
2008-12-16
Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'
ERIC Educational Resources Information Center
MDRC, 2016
2016-01-01
Many social policy and education programs start from the assumption that people act in their best interest. But behavioral science shows that people often weigh intuition over reason, make inconsistent choices, and put off big decisions. The individuals and families who need services and the staff who provide them are no exception. From city…
NASA Astrophysics Data System (ADS)
Torgersen, Thomas
2006-06-01
Multiple issues in hydrologic and environmental sciences are now squarely in the public focus and require both government and scientific study. Two facts also emerge: (1) The new approach being touted publicly for advancing the hydrologic and environmental sciences is the establishment of community-operated "big science" (observatories, think tanks, community models, and data repositories). (2) There have been important changes in the business of science over the last 20 years that make it important for the hydrologic and environmental sciences to demonstrate the "value" of public investment in hydrological and environmental science. Given that community-operated big science (observatories, think tanks, community models, and data repositories) could become operational, I argue that such big science should not mean a reduction in the importance of single-investigator science. Rather, specific linkages between the large-scale, team-built, community-operated big science and the single investigator should provide context data, observatory data, and systems models for a continuing stream of hypotheses by discipline-based, specialized research and a strong rationale for continued, single-PI ("discovery-based") research. I also argue that big science can be managed to provide a better means of demonstrating the value of public investment in the hydrologic and environmental sciences. Decisions regarding policy will still be political, but big science could provide an integration of the best scientific understanding as a guide for the best policy.
Creation a Geo Big Data Outreach and Training Collaboratory for Wildfire Community
NASA Astrophysics Data System (ADS)
Altintas, I.; Sale, J.; Block, J.; Cowart, C.; Crawl, D.
2015-12-01
A major challenge for the geoscience community is the training and education of the current and next generation of big data geoscientists. In wildfire research, there are an increasing number of tools, middleware, and techniques to use for data science related to wildfires. The necessary computing infrastructures are often within reach, and most of the software tools for big data are freely available. What has been lacking is a transparent platform and training program to produce data science experts who can use these integrated tools effectively. Having scientists well versed in big data technologies for geoscience applications is of critical importance to the future of research and knowledge advancement. To address this critical need, we are developing learning modules that teach process-based thinking to capture the value of end-to-end systems of reusable blocks of knowledge and integrate the tools and technologies used in big data analysis in an intuitive manner. WIFIRE is an end-to-end cyberinfrastructure for dynamic data-driven simulation, prediction, and visualization of wildfire behavior. To this end, we are openly extending an environment we have built for "big data training" (biobigdata.ucsd.edu) to similar MOOC-based approaches for the wildfire community. We are building an environment that includes training modules for distributed platforms and systems, Big Data concepts, and scalable workflow tools, along with other basics of data science including data management, reproducibility, and sharing of results. We also plan to provide teaching modules with analytical and dynamic data-driven wildfire behavior modeling case studies which address not only standards-based K-12 science education but also the needs of a well-educated and informed citizenry. Another part of our outreach mission is to educate our community on all aspects of wildfire research. One of the most successful ways of accomplishing this is through high school and undergraduate student internships. Students have worked closely with WIFIRE researchers on various projects, including the development of statistical models of wildfire ignition probabilities for southern California and the development of a smartphone app for crowd-sourced wildfire reporting through social networks such as Twitter and Facebook.
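As a purely illustrative companion to the internship projects mentioned above, the sketch below fits a toy logistic-regression model of wildfire ignition probability; the predictors, data, and coefficients are hypothetical and are not taken from WIFIRE.

```python
# Illustrative sketch only: a toy logistic-regression model of wildfire
# ignition probability. Features and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: temperature (deg C), relative humidity (%), wind speed (m/s)
X = np.column_stack([
    rng.normal(30, 5, n),
    rng.uniform(5, 60, n),
    rng.gamma(2.0, 2.0, n),
])
# Synthetic labels: ignition is more likely when it is hot, dry, and windy
logit = 0.15 * X[:, 0] - 0.08 * X[:, 1] + 0.3 * X[:, 2] - 3.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)
print("ignition probability for a hot, dry, windy day:",
      model.predict_proba([[38.0, 10.0, 9.0]])[0, 1])
```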
ERIC Educational Resources Information Center
Brooks, Jacqueline Grennon
2011-01-01
Strong evidence from recent brain research shows that the intentional teaching of science is crucial in early childhood. "Big Science for Growing Minds" describes a groundbreaking curriculum that invites readers to rethink science education through a set of unifying concepts or "big ideas." Using an integrated learning approach, the author shows…
Toward a manifesto for the 'public understanding of big data'.
Michael, Mike; Lupton, Deborah
2016-01-01
In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.
"Dinosaurs." Kindergarten. Anchorage School District Elementary Science Program.
ERIC Educational Resources Information Center
Herminghaus, Trisha, Ed.
This unit contains 15 lessons on dinosaurs for kindergarten children. It provides a materials list, supplementary materials list, use of process skill terminology, unit objectives, vocabulary, six major dinosaurs, and background information. Lessons are: (1) "Webbing"; (2) "Introduction to the Big Six"; (3) "Paleontology…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinney, M.J.; Jenkins, S.
Project JEM (Jarvis Enhancement of Males) is a pre-college program directed toward stimulating disadvantaged, talented African American males in grades four, five, and six to attend college and major in mathematics, science, computer science, or related technical areas needed by the US Department of Energy. Twenty young African American male students were recruited from Gladewater Independent School District (ISD), Longview ISD, Hawkins ISD, Tyler ISD, Winona ISD, and Big Sandy ISD. Students enrolled in the program range in age from 10 to 13 and are in grades four, five, and six. Participants in the 1997 Project JEM Program attended Saturday Academy sessions and a four-week intensive summer residential program. The information here provides a synopsis of the activities conducted through each program component.
IS Programs Responding to Industry Demands for Data Scientists: A Comparison between 2011-2016
ERIC Educational Resources Information Center
Mills, Robert J.; Chudoba, Katherine M.; Olsen, David H.
2016-01-01
The term data scientist has only been in common use since 2008, but in 2016 it is considered one of the top careers in the United States. The purpose of this paper is to explore the growth of data science content areas such as analytics, business intelligence, and big data in AACSB Information Systems (IS) programs between 2011 and 2016. A…
Teaming to Teach the Information Problem-Solving Process.
ERIC Educational Resources Information Center
Sine, Lynn; Murphy, Becky
1992-01-01
Explains a problem-solving format developed by a school media specialist and first grade teacher that used the framework of Eisenberg and Berkowitz's "Big Six Skills" for library media programs. The application of the format to a science unit on the senses is described. (two references) (MES)
Turning the Ship: The Transformation of DESY, 1993-2009
NASA Astrophysics Data System (ADS)
Heinze, Thomas; Hallonsten, Olof; Heinecke, Steffi
2017-12-01
This article chronicles the most recent history of the Deutsches Elektronen-Synchrotron (DESY) located in Hamburg, Germany, with particular emphasis on how this national laboratory founded for accelerator-based particle physics shifted its research program toward multi-disciplinary photon science. Synchrotron radiation became DESY's central experimental research program through a series of changes in its organizational, scientific, and infrastructural setup and the science policy context. Furthermore, the turn toward photon science is part of a broader transformation in the late twentieth century in which nuclear and particle physics, once the dominating fields in national and international science budgets, gave way to increasing investment in the materials sciences and life sciences. Synchrotron radiation research took a lead position on the experimental side of these growing fields and became a new form of big science, generously funded by governments and with user communities expanding across both academia and industry.
Facilitymetrics for Big Ocean Science: Towards Improved Measurement of Scientific Impact
NASA Astrophysics Data System (ADS)
Juniper, K.; Owens, D.; Moran, K.; Pirenne, B.; Hallonsten, O.; Matthews, K.
2016-12-01
Cabled ocean observatories are examples of "Big Science" facilities requiring significant public investments for installation and ongoing maintenance. Large observatory networks in Canada and the United States, for example, have been established after extensive up-front planning and hundreds of millions of dollars in start-up costs. As such, they are analogous to particle accelerators and astronomical observatories, which may often be required to compete for public funding in an environment of ever-tightening national science budget allocations. Additionally, the globalization of Big Science compels these facilities to respond to increasing demands for demonstrable productivity, excellence and competitiveness. How should public expenditures on "Big Science" facilities be evaluated and justified in terms of benefits to the countries that invest in them? Published literature counts are one quantitative measure often highlighted in the annual reports of large science facilities. But, as recent research has demonstrated, publication counts can lead to distorted characterizations of scientific impact, inviting evaluators to calculate scientific outputs in terms of costs per publication—a ratio that can be simplistically misconstrued to conclude Big Science is wildly expensive. Other commonly promoted measurements of Big Science facilities include technical reliability (a.k.a. uptime), provision of training opportunities for Highly Qualified Personnel, generation of commercialization opportunities, and so forth. "Facilitymetrics" is a new empirical focus for scientometrical studies, which has been applied to the evaluation and comparison of synchrotron facilities. This paper extends that quantitative and qualitative examination to a broader inter-disciplinary comparison of Big Science facilities in the ocean science realm to established facilities in the fields of astronomy and particle physics.
Urgent Call for Nursing Big Data.
Delaney, Connie W
2016-01-01
The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative - Nursing Knowledge and Big Data Science - includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference.
NASA Astrophysics Data System (ADS)
Schwerin, T. G.; Peticolas, L. M.; Shipp, S. S.; Smith, D. A.
2014-12-01
Since 1993, NASA has embedded education and public outreach (EPO) in its Earth and space science missions and research programs on the principle that science education is most effective when educators and scientists work hand-in-hand. Four Science EPO Forums organize the respective NASA Science Mission Directorate (SMD) Astrophysics, Earth Science, Heliophysics, and Planetary Science EPO programs into a coordinated, efficient, and effective nationwide effort. The result is significant, evaluated EPO impacts that support NASA's policy of providing a direct return-on-investment for the American public, advance STEM education and literacy, and enable students and educators to participate in the practices of science and engineering as embodied in the 2013 Next Generation Science Standards. This presentation by the leads of the four NASA SMD Science EPO Forums provides big-picture perspectives on NASA's effort to incorporate authentic science into the nation's STEM education and scientific literacy, highlighting tools that were developed to foster a collaborative community and examples of program effectiveness and impact. The Forums are led by: Astrophysics - Space Telescope Science Institute (STScI); Earth Science - Institute for Global Environmental Strategies (IGES); Heliophysics - University of California, Berkeley; and Planetary Science - Lunar and Planetary Institute (LPI).
Big Data access and infrastructure for modern biology: case studies in data repository utility.
Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R
2017-01-01
Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.
The BRAID: Experiments in Stitching Together Disciplines at a Big Ten University
ERIC Educational Resources Information Center
Luckie, Douglas B.; Bellon, Richard; Sweeder, Ryan D.
2012-01-01
Since 2005 we have pursued a formal research program called the BRAID (Bringing Relationships Alive through Interdisciplinary Discourse), which is designed to develop and test strategies for training first- and second-year undergraduate science students to bridge scientific disciplines. The BRAID's ongoing multiyear investigation points to…
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. See http://www.alcf.anl.gov/alcf-data-science-program. This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
The New Big Science: What's New, What's Not, and What's the Difference
NASA Astrophysics Data System (ADS)
Westfall, Catherine
2016-03-01
This talk will start with a brief recap of the development of the ``Big Science'' epitomized by high energy physics, that is, the science that flourished after WWII based on accelerators, teams, and price tags that grew ever larger. I will then explain the transformation that started in the 1980s and culminated in the 1990s when the Cold War ended and the next big machine needed to advance high energy physics, the multi-billion dollar Superconducting Supercollider (SSC), was cancelled. I will go on to outline the curious series of events that ushered in the New Big Science, a form of research well suited to a post-Cold War environment that valued practical rather than esoteric projects. To show the impact of the New Big Science I will describe how decisions were ``set into concrete'' during the development of experimental equipment at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia.
Schultz, Jane S; Rodgers, V G J
2012-07-01
The Department of Bioengineering at the University of California, Riverside (UCR), was established in 2006 and is the youngest department in the Bourns College of Engineering. It is an interdisciplinary research engine that builds strength from highly recognized experts in biochemistry, biophysics, biology, and engineering, focusing on common critical themes. The range of faculty research interests is notable for its diversity, from basic cell biology through cell function to the physiology of the whole organism, each directed at breakthroughs in biomedical devices for measurement and therapy. The department forges future leaders in bioengineering, mirroring the field in being energetic, interdisciplinary, and fast moving at the frontiers of biomedical discoveries. Our educational programs combine a solid foundation in biological sciences and engineering, diverse communication skills, and training in the most advanced quantitative bioengineering research. Bioengineering at UCR also includes the Bioengineering Interdepartmental Graduate (BIG) program. With its slogan Start-Grow-Be-BIG, it is already recognized for its many accomplishments, including being third in the nation in 2011 for bioengineering students receiving National Science Foundation graduate research fellowships as well as being one of the most ethnically inclusive programs in the nation.
A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science.
Faghmous, James H; Kumar, Vipin
2014-09-01
Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency, data science has had little impact on furthering our understanding of our planet in spite of the abundance of climate data. This is a stark contrast from other fields such as advertising or electronic commerce where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities to mine large climate datasets, with an emphasis on the nuanced difference between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill their promise with regard to climate science applications. More importantly, we highlight research showing that solely relying on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques as well as the results-interpretation process to extract accurate insight from large climate data.
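To make the proposed paradigm concrete, here is a minimal, hypothetical sketch of a theory-guided fit: a data-fitting loss is combined with a penalty for violating an assumed physical bound on the trend. The bound, the data, and the weights are invented for illustration and are not taken from the article.

```python
# Illustrative sketch of a theory-guided loss: fit a linear trend to a
# hypothetical temperature series while penalizing physically implausible
# trends (here, an assumed upper bound on the warming rate).
import numpy as np

years = np.arange(1980, 2020)
temps = 0.02 * (years - 1980) + np.random.default_rng(1).normal(0, 0.1, years.size)

MAX_TREND = 0.05  # assumed physical upper bound, deg C per year
LAMBDA = 10.0     # weight of the theory-based penalty

def loss(slope, intercept):
    pred = slope * (years - 1980) + intercept
    data_term = np.mean((pred - temps) ** 2)
    # Theory-guided term: penalize trends outside the assumed plausible range
    theory_term = max(0.0, abs(slope) - MAX_TREND) ** 2
    return data_term + LAMBDA * theory_term

# Crude grid search over candidate slopes and intercepts
slopes = np.linspace(-0.1, 0.1, 201)
intercepts = np.linspace(-0.5, 0.5, 101)
best = min(((loss(s, b), s, b) for s in slopes for b in intercepts))
print("best slope (deg C/yr): %.3f, intercept: %.3f" % (best[1], best[2]))
```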
From big data to deep insight in developmental science.
Gilmore, Rick O
2016-01-01
The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. © 2016 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.
Perspectives on Policy and the Value of Nursing Science in a Big Data Era.
Gephart, Sheila M; Davis, Mary; Shea, Kimberly
2018-01-01
As data volume explodes, nurse scientists grapple with ways to adapt to the big data movement without jeopardizing its epistemic values and theoretical focus that celebrate while acknowledging the authority and unity of its body of knowledge. In this article, the authors describe big data and emphasize ways that nursing science brings value to its study. Collective nursing voices that call for more nursing engagement in the big data era are answered with ways to adapt and integrate theoretical and domain expertise from nursing into data science.
NASA Astrophysics Data System (ADS)
Bitter, C.
2016-12-01
Talking about your science is just like talking about yourself (although you may be difficult to explain). You are not alone, and even the most famous scientists and engineers struggle because parts of our work are hard to explain. We'll explore the BIG stuff like the best ways to tackle the Scale of the Universe for the public, REALLY big numbers for little kids, and crowd favorites like Deep Time and Climate Change. We'll sweat the small stuff too like subatomic particles, and the unseeables but knowables like exoplanets, ground water and dark matter. Through case studies spanning over a decade of working with and observing scientists and engineers in public programming, education, outreach, and working groups for communicating science through museum exhibits, discover why the best science communicators are straightforward, curious, great storytellers and use everyday objects, humor, excitement and fun to share concepts. We'll examine a few epic fails too, and how to recover, as well as helping your audience feel truly accomplished after communicating with you.
Big Data over a 100G network at Fermilab
Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; ...
2014-06-11
As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the wide area network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the local area network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever-increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. Furthermore, this work presents the new R&D facility and the continuation of the evaluation program.
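For readers unfamiliar with how such throughput numbers are obtained, the sketch below times a single copy with the XRootD command-line client and converts the result to Gbit/s. The endpoint and file path are hypothetical placeholders, and the cited evaluations [4, 8, 9] rely on far more elaborate instrumentation than this.

```python
# Illustrative sketch: time one file transfer with the XRootD `xrdcp`
# client and report the achieved throughput in Gbit/s. The source URL
# and path are hypothetical placeholders.
import os
import subprocess
import time

SOURCE = "root://dtn.example.org//store/test/10GB.dat"  # hypothetical endpoint and path
DEST = "/tmp/10GB.dat"

start = time.monotonic()
subprocess.run(["xrdcp", "-f", SOURCE, DEST], check=True)  # -f overwrites an existing destination
elapsed = time.monotonic() - start

size_bits = os.path.getsize(DEST) * 8
print("achieved throughput: %.2f Gbit/s over %.1f s" % (size_bits / elapsed / 1e9, elapsed))
```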
Nursing Knowledge: Big Data Science-Implications for Nurse Leaders.
Westra, Bonnie L; Clancy, Thomas R; Sensmeier, Joyce; Warren, Judith J; Weaver, Charlotte; Delaney, Connie W
2015-01-01
The integration of Big Data from electronic health records and other information systems within and across health care enterprises provides an opportunity to develop actionable predictive models that can increase the confidence of nursing leaders' decisions to improve patient outcomes and safety and control costs. As health care shifts to the community, mobile health applications add to the Big Data available. There is an evolving national action plan that includes nursing data in Big Data science, spearheaded by the University of Minnesota School of Nursing. For the past 3 years, diverse stakeholders from practice, industry, education, research, and professional organizations have collaborated through the "Nursing Knowledge: Big Data Science" conferences to create and act on recommendations for inclusion of nursing data, integrated with patient-generated, interprofessional, and contextual data. It is critical for nursing leaders to understand the value of Big Data science and the ways to standardize data and workflow processes to take advantage of newer cutting edge analytics to support analytic methods to control costs and improve patient quality and safety.
Gender Problems in Western Theatrical Dance: Little Girls, Big Sissies & the "Baryshnikov Complex"
ERIC Educational Resources Information Center
Risner, Doug
2014-01-01
General education programs, in postsecondary institutions, provide a broad base of learning in the liberal arts and sciences with common goals that prepare undergraduate students for living informed and satisfying lives. In the United States, dance units in public institutions, offering general education coursework for non-majors (dance…
Two Texas Colleges Buy into the Big Time.
ERIC Educational Resources Information Center
Business Week, 1983
1983-01-01
Revenues from oil/gas royalties on University of Texas (Austin) and Texas A&M go into a fund which is used to hire faculty and develop programs, particularly in the sciences. However, Texas legislators are considering a bill that would spread the oil money to other campuses in the Texas university system. (JN)
Ford Foundation Plans Big Increase in Grants to Higher Education.
ERIC Educational Resources Information Center
Desruisseaux, Paul; McMillen, Liz
1988-01-01
The Ford Foundation's increase, by almost one-fourth, of support to higher education is intended to reinvigorate college teaching, especially in the social sciences, and to improve opportunities for minority group faculty. Other new foundation efforts include support of a program to assist victims of Acquired Immune Deficiency Syndrome (AIDS).…
Bakken, Suzanne; Reame, Nancy
2016-01-01
Symptom management research is a core area of nursing science and one of the priorities for the National Institute of Nursing Research, which specifically focuses on understanding the biological and behavioral aspects of symptoms such as pain and fatigue, with the goal of developing new knowledge and new strategies for improving patient health and quality of life. The types and volume of data related to the symptom experience, symptom management strategies, and outcomes are increasingly accessible for research. Traditional data streams are now complemented by consumer-generated (i.e., quantified self) and "omic" data streams. Thus, the data available for symptom science can be considered big data. The purposes of this chapter are to (a) briefly summarize the current drivers for the use of big data in research; (b) describe the promise of big data and associated data science methods for advancing symptom management research; (c) explicate the potential perils of big data and data science from the perspective of the ethical principles of autonomy, beneficence, and justice; and (d) illustrate strategies for balancing the promise and the perils of big data through a case study of a community at high risk for health disparities. Big data and associated data science methods offer the promise of multidimensional data sources and new methods to address significant research gaps in symptom management. If nurse scientists wish to apply big data and data science methods to advance symptom management research and promote health equity, they must carefully consider both the promise and perils.
Big Data in Plant Science: Resources and Data Mining Tools for Plant Genomics and Proteomics.
Popescu, George V; Noutsos, Christos; Popescu, Sorina C
2016-01-01
In modern plant biology, progress is increasingly defined by the scientists' ability to gather and analyze data sets of high volume and complexity, otherwise known as "big data". Arguably, the largest increase in the volume of plant data sets over the last decade is a consequence of the application of the next-generation sequencing and mass-spectrometry technologies to the study of experimental model and crop plants. The increase in quantity and complexity of biological data brings challenges, mostly associated with data acquisition, processing, and sharing within the scientific community. Nonetheless, big data in plant science create unique opportunities in advancing our understanding of complex biological processes at a level of accuracy without precedence, and establish a base for the plant systems biology. In this chapter, we summarize the major drivers of big data in plant science and big data initiatives in life sciences with a focus on the scope and impact of iPlant, a representative cyberinfrastructure platform for plant science.
From Big Data to Knowledge in the Social Sciences.
Hesse, Bradford W; Moser, Richard P; Riley, William T
2015-05-01
One of the challenges associated with high-volume, diverse datasets is whether synthesis of open data streams can translate into actionable knowledge. Recognizing that challenge and other issues related to these types of data, the National Institutes of Health developed the Big Data to Knowledge or BD2K initiative. The concept of translating "big data to knowledge" is important to the social and behavioral sciences in several respects. First, a general shift to data-intensive science will exert an influence on all scientific disciplines, but particularly on the behavioral and social sciences given the wealth of behavior and related constructs captured by big data sources. Second, science is itself a social enterprise; by applying principles from the social sciences to the conduct of research, it should be possible to ameliorate some of the systemic problems that plague the scientific enterprise in the age of big data. We explore the feasibility of recalibrating the basic mechanisms of the scientific enterprise so that they are more transparent and cumulative; more integrative and cohesive; and more rapid, relevant, and responsive.
From Big Data to Knowledge in the Social Sciences
Hesse, Bradford W.; Moser, Richard P.; Riley, William T.
2015-01-01
One of the challenges associated with high-volume, diverse datasets is whether synthesis of open data streams can translate into actionable knowledge. Recognizing that challenge and other issues related to these types of data, the National Institutes of Health developed the Big Data to Knowledge or BD2K initiative. The concept of translating “big data to knowledge” is important to the social and behavioral sciences in several respects. First, a general shift to data-intensive science will exert an influence on all scientific disciplines, but particularly on the behavioral and social sciences given the wealth of behavior and related constructs captured by big data sources. Second, science is itself a social enterprise; by applying principles from the social sciences to the conduct of research, it should be possible to ameliorate some of the systemic problems that plague the scientific enterprise in the age of big data. We explore the feasibility of recalibrating the basic mechanisms of the scientific enterprise so that they are more transparent and cumulative; more integrative and cohesive; and more rapid, relevant, and responsive. PMID:26294799
A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science
Kumar, Vipin
2014-01-01
Abstract Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency, data science has had little impact on furthering our understanding of our planet in spite of the abundance of climate data. This is a stark contrast from other fields such as advertising or electronic commerce where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities to mine large climate datasets, with an emphasis on the nuanced difference between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill their promise with regard to climate science applications. More importantly, we highlight research showing that solely relying on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques as well as the results-interpretation process to extract accurate insight from large climate data. PMID:25276499
Cool Cosmology: ``WHISPER" better than ``BANG"
NASA Astrophysics Data System (ADS)
Carr, Paul
2007-10-01
Cosmologist Fred Hoyle coined ``big bang'' as a term of derision for Belgian priest Georges Lemaitre's prediction that the universe had originated from the expansion of a ``primeval atom'' in space-time. Hoyle referred to Lemaitre's hypothesis sarcastically as ``this big bang idea'' during a program broadcast on March 28, 1949 on the BBC. Hoyle's continuous-creation, or steady state, theory cannot explain the microwave background radiation, or cosmic whisper, discovered by Penzias and Wilson in 1964. The expansion and subsequent cooling of Lemaitre's hot ``primeval atom'' explains the whisper. ``Big bang'' makes no physical sense, as there was no matter (or space) to carry the sound that Hoyle's term implies. The ``big bang'' is a conjecture. New discoveries may be able to predict the observed ``whispering cosmos'' as well as dark matter and the nature of dark energy. The ``whispering universe'' is cooler cosmology than the big bang. Reference: Carr, Paul H. 2006. ``From the 'Music of the Spheres' to the 'Whispering Cosmos.' '' Chapter 3 of Beauty in Science and Spirit. Beech River Books. Center Ossipee, NH, http://www.MirrorOfNature.org.
Slater, Craig E; Cusick, Anne; Louie, Jimmy C Y
2017-11-13
Self-directed learning (SDL) is expected of health science graduates; it is thus a learning outcome in many pre-certification programs. Previous research identified age, gender, discipline and prior education as associated with variations in students' self-directed learning readiness (SDLR). Studies in other fields also propose personality as influential. This study investigated relationships between SDLR and age, gender, discipline, previous education, and personality traits. The Self-Directed Learning Readiness Scale and the 50-item 'big five' personality trait inventory were administered to 584 first-year undergraduate students (n = 312 female) enrolled in a first-session undergraduate interprofessional health sciences subject. Students were from health promotion, health services management, therapeutic recreation, sports and exercise science, occupational therapy, physiotherapy, and podiatry. Four hundred and seven responses (n = 230 females) were complete. SDLR was significantly higher in females and students in occupational therapy and physiotherapy. SDLR increased with age and higher levels of previous education. It was also significantly associated with 'big five' personality trait scores. Regression analysis revealed 52.9% of variance was accounted for by personality factors, discipline and prior experience of tertiary education. Demographic, discipline and personality factors are associated with SDLR in the first year of study. Teachers need to be alert to individual student variation in SDLR.
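As a sketch of the kind of analysis reported above (the proportion of variance in SDLR accounted for by personality, discipline, and prior education), the following uses synthetic data; the variable coding, sample, and coefficients are hypothetical and are not the study's.

```python
# Illustrative sketch: ordinary least squares regression of a readiness
# score on 'big five' trait scores plus demographic factors, reporting
# R^2 (the proportion of variance accounted for). Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 407
X = np.column_stack([
    rng.normal(0, 1, (n, 5)),   # five hypothetical personality trait scores
    rng.integers(0, 2, n),      # prior tertiary education (0/1), hypothetical coding
    rng.integers(0, 7, n),      # discipline coded 0-6, hypothetical coding
])
# Synthetic outcome loosely driven by the first three traits
y = 150 + X[:, 0] * 8 + X[:, 1] * 5 + X[:, 2] * 4 + rng.normal(0, 10, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print("R^2 (variance accounted for): %.3f" % model.rsquared)
```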
Big Data as catalyst for change in Astronomy Libraries - Indian Scenario
NASA Astrophysics Data System (ADS)
Birdie, Christina
2015-08-01
Research in astronomy fosters exciting missions and encourages libraries to engage with big-budget astronomy programs, which are the flagship projects for most astronomers. The scholarly communication resulting from analyzing Big Data has led to new opportunities for astronomy librarians to become involved in the management of publications more intelligently. In India, astronomers have committed to participation in the mega 'TMT' (Thirty Meter Telescope) project, an international partnership science program between Caltech, the University of California, Canada, Japan, China, and India. Participation in the TMT project will provide Indian astronomers an opportunity to carry out frontline research in astronomy. Within India, three major astronomy research institutes, namely the Indian Institute of Astrophysics (IIA), the Inter-University Centre for Astronomy & Astrophysics (IUCAA), and the Aryabhatta Research Institute of Observational Sciences (ARIES), are stakeholders in this program, along with the Indian Government as venture capitalist. This study will examine the potential publishing pattern of those astronomers and technologists within India, with a special focus on these three institutes. Indications of already existing collaborations among them, expertise in instrument building, software development skills, and cutting-edge research capability can be derived from analyzing their publications over the last ten years. An attempt will also be made to examine in-house technical reports, newsletters, conference presentations, etc. from these three institutes with a view to highlighting the hidden potential skills and possible collaborations among Indian astronomers expressed through the grey literature. The incentive to make the astronomy library network within India stronger may evolve from the findings and future requirements. As this project is deemed a national project, with financial support from the science and technology department of the Government of India, the participating libraries have an excellent opportunity to showcase their capabilities and make an impact at the national level.
Rethinking Big Science. Modest, mezzo, grand science and the development of the Bevalac, 1971-1993.
Westfall, Catherine
2003-03-01
Historians of science have tended to focus exclusively on scale in investigations of large-scale research, perhaps because it has been easy to assume that comprehending a phenomenon dubbed "Big Science" hinges on an understanding of bigness. A close look at Lawrence Berkeley Laboratory's Bevalac, a medium-scale "mezzo science" project formed by uniting two preexisting machines--the modest SuperHILAC and the grand Bevatron--shows what can be gained by overcoming this preoccupation with bigness. The Bevalac story reveals how interconnections, connections, and disconnections ultimately led to the development of a new kind of science that transformed the landscape of large-scale research in the United States. Important lessons in historiography also emerge: the value of framing discussions in terms of networks, the necessity of constantly expanding and refining methodology, and the importance of avoiding the rhetoric of participants and instead finding words to tell our own stories.
Presenting the 'Big Ideas' of Science: Earth Science Examples.
ERIC Educational Resources Information Center
King, Chris
2001-01-01
Details an 'explanatory Earth story' on plate tectonics to show how such a 'story' can be developed in an earth science context. Presents five other stories in outline form. Explains the use of these stories as vehicles to present the big ideas of science. (DDR)
Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna
2017-12-01
To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.
'Big data' in pharmaceutical science: challenges and opportunities.
Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John
2014-05-01
Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.
Next Generation Workload Management and Analysis System for Big Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, Kaushik
We report on the activities and accomplishments of a four-year project (a three-year grant followed by a one-year no-cost extension) to develop a next generation workload management system for Big Data. The new system is based on the highly successful PanDA software developed for High Energy Physics (HEP) in 2005. PanDA is used by the ATLAS experiment at the Large Hadron Collider (LHC) and the AMS experiment at the space station. The program of work described here was carried out by two teams of developers working collaboratively at Brookhaven National Laboratory (BNL) and the University of Texas at Arlington (UTA). These teams worked closely with the original PanDA team; for the sake of clarity, the work of the next generation team will be referred to as the BigPanDA project. Their work has led to the adoption of BigPanDA by the COMPASS experiment at CERN, and many other experiments and science projects worldwide.
The Big Bang Theory and the Nature of Science
NASA Astrophysics Data System (ADS)
Arthury, Luiz Henrique Martins; Peduzzi, Luiz O. Q.
2015-12-01
Modern cosmology was constituted, from the twentieth century to the present day, as a very productive field of research, resulting in major discoveries that attest to its explanatory power. The Big Bang Theory, the generic and popular name of the standard model of cosmology, is probably the most daring research program of physics and astronomy, attempting to reconstruct the evolution of our observable universe. Contrary to what one might think, its conjectures possess a degree of refinement and corroborating evidence that makes it our best explanation for the history of our cosmos. The Big Bang Theory is also an excellent field in which to discuss issues regarding scientific activity itself. In this paper we discuss the main elements of this theory with an epistemological look, resulting in a text quite useful for educational activities with related goals.
A Blended Professional Development Program to Help a Teacher Learn to Provide One-to-One Scaffolding
ERIC Educational Resources Information Center
Belland, Brian R.; Burdo, Ryan; Gu, Jiangyue
2015-01-01
Argumentation is central to instruction centered on socio-scientific issues (Sadler & Donnelly in "International Journal of Science Education," 28(12), 1463-1488, 2006. doi: 10.1080/09500690600708717). Teachers can play a big role in helping students engage in argumentation and solve authentic scientific problems. To do so, they need…
Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives
Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.
2014-01-01
Objectives: As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. Methods: A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results: Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to “small data” would also be useful. PMID:25123717
Hansen, M M; Miron-Shatz, T; Lau, A Y S; Paton, C
2014-08-15
As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and, to examine some of the opportunities and challenges. A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to "small data" would also be useful.
The Ethics of Big Data and Nursing Science.
Milton, Constance L
2017-10-01
Big data is a scientific, social, and technological trend referring to the process and size of datasets available for analysis. Ethical implications arise as healthcare disciplines, including nursing, struggle over questions of informed consent, privacy, ownership of data, and its possible use in epistemology. The author offers straight-thinking possibilities for the use of big data in nursing science.
50 CFR 86.11 - What does the national BIG Program do?
Code of Federal Regulations, 2010 CFR
2010-10-01
50 CFR 86.11 (Wildlife and Fisheries, vol. 6, 2010-10-01): United States Fish and Wildlife Service, Department of the Interior, … Grant (BIG) Program, General Information About the Grant Program, § 86.11 What does the national BIG Program do?
Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile
NASA Astrophysics Data System (ADS)
Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco
2014-05-01
The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.
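To make the query-language interface concrete, the sketch below shows how a client might submit a WCPS expression over HTTP to a rasdaman/petascope-style WCS endpoint of the kind the EarthServer services build on. The endpoint URL, coverage name (AvgLandTemp), and axis labels are illustrative assumptions, not values taken from the project deployments.

    # Sketch: submitting an OGC WCPS query to a rasdaman/petascope-style endpoint.
    # The endpoint URL and coverage/axis names are hypothetical examples.
    import requests

    ENDPOINT = "https://example.org/rasdaman/ows"  # hypothetical WCS/WCPS service

    wcps_query = """
    for $c in (AvgLandTemp)
    return encode(
        $c[Lat(35:45), Long(10:20), ansi("2014-01":"2014-05")],
        "image/tiff")
    """

    resp = requests.get(
        ENDPOINT,
        params={
            "service": "WCS",
            "version": "2.0.1",
            "request": "ProcessCoverages",
            "query": wcps_query,
        },
        timeout=60,
    )
    resp.raise_for_status()

    with open("subset.tiff", "wb") as f:
        f.write(resp.content)  # server-side trimmed and encoded coverage subset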
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one to a master's degree program and two to doctoral degree programs).
Beyond Einstein: Exploring the Extreme Universe
NASA Technical Reports Server (NTRS)
Barbier, Louis M.
2005-01-01
This paper will give an overview of the NASA Universe Division Beyond Einstein program. The Beyond Einstein program consists of a series of exploratory missions to investigate some of the most important and pressing problems in modern-day astrophysics - including searches for Dark Energy and studies of the earliest times in the universe, during the inflationary period after the Big Bang. A variety of new technologies are being developed both in the science instrumentation these missions will carry and in the spacecraft that will carry those instruments.
Big Data and Data Science in Critical Care.
Sanchez-Pinto, L Nelson; Luo, Yuan; Churpek, Matthew M
2018-05-09
The digitalization of the health-care system has resulted in a deluge of clinical Big Data and has prompted the rapid growth of data science in medicine. Data science, which is the field of study dedicated to the principled extraction of knowledge from complex data, is particularly relevant in the critical care setting. The availability of large amounts of data in the ICU, the need for better evidence-based care, and the complexity of critical illness make the use of data science techniques and data-driven research particularly appealing to intensivists. Despite the increasing number of studies and publications in the field, thus far there have been few examples of data science projects that have resulted in successful implementations of data-driven systems in the ICU. However, given the expected growth in the field, intensivists should be familiar with the opportunities and challenges of Big Data and data science. The present article reviews the definitions, types of algorithms, applications, challenges, and future of Big Data and data science in critical care. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Meteor Observations as Big Data Citizen Science
NASA Astrophysics Data System (ADS)
Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.
2016-12-01
Meteor science represents an excellent example of a citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.
Biosecurity in the age of Big Data: a conversation with the FBI
Kozminski, Keith G.
2015-01-01
New scientific frontiers and emerging technologies within the life sciences pose many global challenges to society. Big Data is a premier example, especially with respect to individual, national, and international security. Here a Special Agent of the Federal Bureau of Investigation discusses the security implications of Big Data and the need for security in the life sciences. PMID:26543195
Data Science and its Relationship to Big Data and Data-Driven Decision Making.
Provost, Foster; Fawcett, Tom
2013-03-01
Companies have realized they need to hire data scientists, academic institutions are scrambling to put together data-science programs, and publications are touting data science as a hot, even "sexy," career choice. However, there is confusion about what exactly data science is, and this confusion could lead to disillusionment as the concept diffuses into meaningless buzz. In this article, we argue that there are good reasons why it has been hard to pin down exactly what data science is. One reason is that data science is intricately intertwined with other important concepts also of growing importance, such as big data and data-driven decision making. Another reason is the natural tendency to associate what a practitioner does with the definition of the practitioner's field; this can result in overlooking the fundamentals of the field. We believe that trying to define the boundaries of data science precisely is not of the utmost importance. We can debate the boundaries of the field in an academic setting, but in order for data science to serve business effectively, it is important (i) to understand its relationships to other important related concepts, and (ii) to begin to identify the fundamental principles underlying data science. Once we embrace (ii), we can much better understand and explain exactly what data science has to offer. Furthermore, only once we embrace (ii) should we be comfortable calling it data science. In this article, we present a perspective that addresses all these concepts. We close by offering, as examples, a partial list of fundamental principles underlying data science.
The role of administrative data in the big data revolution in social science research.
Connelly, Roxanne; Playford, Christopher J; Gayle, Vernon; Dibben, Chris
2016-09-01
The term big data is currently a buzzword in social science; however, its precise meaning is ambiguous. In this paper we focus on administrative data, which is a distinctive form of big data. Exciting new opportunities for social science research will be afforded by new administrative data resources, but these are currently underappreciated by the research community. The central aim of this paper is to discuss the challenges associated with administrative data. We emphasise that it is critical for researchers to carefully consider how administrative data has been produced. We conclude that administrative datasets have the potential to contribute to the development of high-quality and impactful social science research, and should not be overlooked in the emerging field of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Big Data in Health: a Literature Review from the Year 2005.
de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel
2016-09-01
The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.
Transformative Science for the Next Decade with the Green Bank Observatory
NASA Astrophysics Data System (ADS)
O'Neil, Karen; Frayer, David; Ghigo, Frank; Lockman, Felix; Lynch, Ryan; Maddalena, Ronald; Minter, Anthony; Prestage, Richard
2018-01-01
With new instruments and improved performance, the 100m Green Bank Telescope is now demonstrating its full potential. On this 60th anniversary of the groundbreaking for the Green Bank Observatory, we can look forward to the future of the facility for the next 5, 10, and even 20 years. Here we describe the results from a recent workshop, “Transformative Science for the Next Decade with the Green Bank Observatory: Big Questions, Large Programs, and New Instruments,” and describe the scientific plans for our facility.
Detection and Characterisation of Meteors as a Big Data Citizen Science project
NASA Astrophysics Data System (ADS)
Gritsevich, M.
2017-12-01
Out of a total of around 50,000 meteorites currently known to science, the atmospheric passage was recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while the observations of meteors add thousands of new entries per month to existing databases, it is extremely rare that they lead to meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth http://bigskyearth.eu/ network.
Data science, learning, and applications to biomedical and health sciences.
Adam, Nabil R; Wieder, Robert; Ghosh, Debopriya
2017-01-01
The last decade has seen an unprecedented increase in the volume and variety of electronic data related to research and development, health records, and patient self-tracking, collectively referred to as Big Data. Properly harnessed, Big Data can provide insights and drive discovery that will accelerate biomedical advances, improve patient outcomes, and reduce costs. However, the considerable potential of Big Data remains unrealized owing to obstacles including a limited ability to standardize and consolidate data and challenges in sharing data among a variety of sources, providers, and facilities. Here, we discuss some of these challenges and potential solutions, as well as initiatives that are already underway to take advantage of Big Data. © 2017 New York Academy of Sciences.
Academician Basov, high-power lasers, and the antimissile defense problem
NASA Astrophysics Data System (ADS)
Zarubin, Peter Vasilievich
2013-02-01
A review of the extensive program of the pioneering research and development of high-power lasers and laser radar undertaken in the USSR during the years 1964 to 1978 under the scientific supervision of N.G. Basov is presented. In the course of this program, many high-energy lasers with unique properties were created, new big research and design teams were formed, and the laser production and testing facilities were extended and developed. The program was carried out at many leading research institutions and design bureaus of the USSR Academy of Sciences and the defense industry.
Big physics quartet win government backing
NASA Astrophysics Data System (ADS)
Banks, Michael
2014-09-01
Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.
NASA Astrophysics Data System (ADS)
Orna, Mary Virginia
1999-09-01
We are in the era of Big Science, which also means big institutions where the Big Science is done. However, higher education in the United States is unique in that parallel to the array of big institutions is a system of small liberal arts and sciences colleges where students receive the personal attention and faculty contact that is often not possible at larger institutions. While these smaller institutions are limited in resources and finances, studies have shown that they contribute a disproportionately higher number of leaders across a spectrum of disciplines, including chemistry. This address summarizes my personal odyssey and the reasons for the award. In it, I emphasize the advantages enjoyed by liberal arts and sciences students and faculty that enable them to overcome the view that great things can only be done in large, cosmopolitan settings.
Water Exploration: An Online High School Water Resource Education Program
NASA Astrophysics Data System (ADS)
Ellins, K. K.; McCall, L. R.; Amos, S.; McGowan, R. F.; Mote, A.; Negrito, K.; Paloski, B.; Ryan, C.; Cameron, B.
2010-12-01
The Institute for Geophysics at The University of Texas at Austin and 4empowerment.com, a Texas-based for-profit educational enterprise, teamed up with the Texas Water Development Board to develop and implement a Web-based water resources education program for Texas high school students. The program, Water Exploration, uses a project-based learning approach called the Legacy Cycle model to permit students to conduct research and build an understanding of water science and critical water-related issues, using the Internet and computer technology. The three Legacy Cycle modules in the Water Exploration curriculum are: Water Basics, Water-Earth Dynamics and People Need Water. Within each Legacy Cycle there are three different challenges, or instructional modules, laid out as projects with clearly stated goals for students to carry out. Each challenge addresses themes that map to the water-related “Big Ideas” and supporting concepts found in the new Earth Science Literacy Principles: The Big Ideas and Supporting Concepts of Earth Science. As students work through a challenge they follow a series of steps, each of which is associated (i.e., linked online) with a manageable number of corresponding, high quality, research-based learning activities and Internet resources, including scholarly articles, cyber tools, and visualizations intended to enhance understanding of the concepts presented. The culmination of each challenge is a set of “Go Public” products that are the students’ answers to the challenge and which serve as the final assessment for the challenge. The “Go Public” products are posted to a collaborative workspace on the Internet as the “legacy” of the students’ work, thereby allowing subsequent groups of students who take the challenge to add new products. Twenty-two science educators have been trained on the implementation of the Water Exploration curriculum. A graduate student pursuing a master’s degree in science education through The University of Texas’ UTEACH program is conducting research to track the teachers’ implementation of Water Exploration and assess their comfort with cyber-education through classroom observations, student and teacher surveys, and evaluation of students’ “Go Public” products.
NASA Astrophysics Data System (ADS)
Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca
2013-04-01
The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" -- based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations -- such as parallel and distributed query processing -- ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges for Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment while keeping strict, fine-grained control over user authentication and authorisation. The degree to which the EarthServer implementation fulfils the recommendations made in the recent TERENA Study on AAA Platforms for Scientific Resources in Europe (https://confluence.terena.org/display/aaastudy/AAA+Study+Home+Page) will also be assessed.
The LSSTC Data Science Fellowship Program
NASA Astrophysics Data System (ADS)
Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council
2017-01-01
The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six, one-week long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via github and YouTube. Each session focuses on a series of technical problems which are written in iPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.
From darwin to the census of marine life: marine biology as big science.
Vermeulen, Niki
2013-01-01
With the development of the Human Genome Project, a heated debate emerged on biology becoming 'big science'. However, biology already has a long tradition of collaboration, as natural historians were part of the first collective scientific efforts: exploring the variety of life on earth. Such mappings of life still continue today, and while field biology is gradually becoming an important subject of studies into big science, research into life in the world's oceans has not yet been taken into account. This paper therefore explores marine biology as big science, presenting the historical development of marine research towards the international 'Census of Marine Life' (CoML) making an inventory of life in the world's oceans. Discussing various aspects of collaboration--including size, internationalisation, research practice, technological developments, application, and public communication--I will ask if CoML still resembles traditional collaborations to collect life. While showing both continuity and change, I will argue that marine biology is a form of natural history: a specific way of working together in biology that has transformed substantially in interaction with recent developments in the life sciences and society. As a result, the paper not only gives an overview of transformations towards large-scale research in marine biology, but also shines a new light on big biology, suggesting new ways to deepen the understanding of collaboration in the life sciences by distinguishing between different 'collective ways of knowing'.
Promoting Diversity Through Polar Interdisciplinary Coordinated Education (Polar ICE)
NASA Astrophysics Data System (ADS)
McDonnell, J. D.; Hotaling, L. A.; Garza, C.; Van Dyk, P. B.; Hunter-thomson, K. I.; Middendorf, J.; Daniel, A.; Matsumoto, G. I.; Schofield, O.
2017-12-01
Polar Interdisciplinary Coordinated Education (ICE) is an education and outreach program designed to provide public access to the Antarctic and Arctic regions through polar data and interactions with the scientists. The program provides multi-faceted science communication training for early-career scientists that consists of a face-to-face workshop and opportunities to apply these skills. The key components of the scientist training workshop include cultural competency training, deconstructing/decoding science for non-expert audiences, the art of telling science stories, networking with members of the education and outreach community, and reflecting on communication skills. Scientists partner with educators to provide professional development for K-12 educators and support for student research symposia. Polar ICE has initiated a Polar Literacy initiative that provides both a grounding in big ideas in polar science and science communication training designed to underscore the importance of the Polar Regions to the public while promoting interdisciplinary collaborations between scientists and educators. Our ultimate objective is to promote STEM identity through professional development of scientists and educators while developing career awareness of STEM pathways in Polar science.
The Swift MIDEX Education and Public Outreach Program
NASA Astrophysics Data System (ADS)
Feigelson, E. D.; Cominsky, L. R.; Whitlock, L. A.
1999-12-01
The Swift satellite is dedicated to an understanding of gamma-ray bursts, the most powerful explosions in the Universe since the Big Bang. A multifaceted E/PO program associated with Swift is planned. Web sites will be constructed, including sophisticated interactive learning environments for combining science concepts with exploration and critical thinking for high school students. The award-winning instructional television program "What's in the News?", produced by Penn State Public Broadcasting and reaching several million 4th-7th graders, will create a series of broadcasts on Swift and space astronomy. A teachers' curricular guide on space astronomy will be produced by UC-Berkeley's Lawrence Hall of Science as part of their highly successful GEMS guides promoting inquiry-based science education. Teacher workshops will be conducted in the Appalachian region and nationwide to testbed and disseminate these products. We may also assist the production of gamma-ray burst museum exhibits. All aspects of the program will be overseen by a Swift Education Committee and assessed by a professional educational evaluation firm. This effort will be supported by the NASA Swift MIDEX contract to Penn State.
Zhang, Xinzhi; Pérez-Stable, Eliseo J.; Bourne, Philip E.; Peprah, Emmanuel; Duru, O. Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S.; Wong, David W.S.; Denny, Joshua
2017-01-01
Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them. PMID:28439179
Advances in Data Management in Remote Sensing and Climate Modeling
NASA Astrophysics Data System (ADS)
Brown, P. G.
2014-12-01
Recent commercial interest in "Big Data" information systems has yielded little more than a sense of deja vu among scientists whose work has always required getting their arms around extremely large databases, and writing programs to explore and analyze them. On the flip side, there are some commercial DBMS startups building "Big Data" platforms using techniques taken from earth science, astronomy, high energy physics and high performance computing. In this talk, we will introduce one such platform: Paradigm4's SciDB, the first DBMS designed from the ground up to combine the kinds of quality-of-service guarantees made by SQL DBMS platforms—high-level data model, query languages, extensibility, transactions—with the kinds of functionality familiar to scientific users—arrays as structural building blocks, integrated linear algebra, and client language interfaces that minimize the learning curve. We will review how SciDB is used to manage and analyze earth science data by several teams of scientific users.
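As a rough illustration of the array-oriented operations such a platform executes server-side, the sketch below performs a dimension-range selection followed by an aggregation over a synthetic [time, lat, lon] temperature cube using NumPy. It is a conceptual analogue only, not SciDB code; the array shape, values, and index ranges are invented for demonstration.

    # Conceptual sketch (not SciDB itself): the select-then-aggregate pattern an
    # array DBMS evaluates server-side, shown locally on a synthetic data cube.
    import numpy as np

    rng = np.random.default_rng(0)
    temp = rng.normal(loc=288.0, scale=5.0, size=(12, 180, 360))  # monthly grid, Kelvin

    # Dimension-range selection (analogous to a range predicate on lat/lon axes).
    window = temp[:, 100:140, 200:260]

    # Aggregation over the spatial axes (analogous to aggregate(..., avg(v), time)).
    monthly_mean = window.mean(axis=(1, 2))
    print(monthly_mean.round(2))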
Big data in forensic science and medicine.
Lefèvre, Thomas
2018-07-01
In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have gained their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition of big data is still lacking. The 3Vs paradigm is frequently invoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields such as artificial intelligence are often underestimated, if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary, evidence-based approach in forensic science and medicine. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Tractenberg, Rochelle E; Russell, Andrew J; Morgan, Gregory J; FitzGerald, Kevin T; Collmann, Jeff; Vinsel, Lee; Steinmann, Michael; Dolling, Lisa M
2015-12-01
The use of Big Data--however the term is defined--involves a wide array of issues and stakeholders, thereby increasing the number of complex decisions around issues including data acquisition, use, and sharing. Big Data is becoming a significant component of practice in an ever-increasing range of disciplines; however, since it is not a coherent "discipline" itself, specific codes of conduct for Big Data users and researchers do not exist. While many institutions have created, or will create, training opportunities (e.g., degree programs, workshops) to prepare people to work in and around Big Data, insufficient time, space, and thought have been dedicated to training these people to engage with the ethical, legal, and social issues in this new domain. Since Big Data practitioners come from, and work in, diverse contexts, neither a relevant professional code of conduct nor specific formal ethics training is likely to be readily available. This normative paper describes an approach to conceptualizing ethical reasoning and integrating it into training for Big Data use and research. Our approach is based on a published framework that emphasizes ethical reasoning rather than topical knowledge. We describe the formation of professional community norms from two key disciplines that contribute to the emergent field of Big Data: computer science and statistics. Historical analogies from these professions suggest strategies for introducing trainees and orienting practitioners both to ethical reasoning and to a code of professional conduct itself. We include two semester course syllabi to strengthen our thesis that codes of conduct (including and beyond those we describe) can be harnessed to support the development of ethical reasoning in, and a sense of professional identity among, Big Data practitioners.
Building the biomedical data science workforce
Dunn, Michelle C.; Bourne, Philip E.
2017-01-01
This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH’s internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers. PMID:28715407
Machine learning for Big Data analytics in plants.
Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng
2014-12-01
Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
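As a hedged illustration of the kind of supervised-learning workflow the review surveys, the sketch below trains and cross-validates a random forest on synthetic "expression-like" features. The feature matrix, labels, and dataset size are invented for demonstration and are not drawn from the article.

    # Illustrative sketch only: a supervised-learning workflow on synthetic
    # gene-expression-like features; feature and label semantics are invented.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 200))              # 500 samples x 200 expression features
    y = (X[:, :5].sum(axis=1) > 0).astype(int)   # synthetic binary phenotype

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
    print("cross-validated accuracy:", scores.mean().round(3))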
["Big data" - large data, a lot of knowledge?].
Hothorn, Torsten
2015-01-28
For several years now, the term Big Data has described technologies to extract knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.
From big data to deep insight in developmental science
2016-01-01
The use of the term ‘big data’ has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data ‘big’ and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. WIREs Cogn Sci 2016, 7:112–126. doi: 10.1002/wcs.1379 For further resources related to this article, please visit the WIREs website. PMID:26805777
Aquatic Sciences and Its Appeal for Expeditionary Research Science Education
NASA Astrophysics Data System (ADS)
Aguilar, C.; Cuhel, R. L.
2016-02-01
Our multi-program team studies aim to develop specific "hard" and "soft" STEM skills that integrate both disciplinary and socio-economic aspects of students' lives, including peer mentoring, advisement, enabling, and professional mentorship, as well as honestly productive, career-developing hands-on research. Specifically, we use interdependent, multidisciplinary research experiences; development and honing of a specific disciplinary skill (you have to have something TO network); use of that skill in a team to produce a big-picture product; and interaction with varied, often outside, professionals, so that students finish with self-confidence and a marketable skill. In a given year our umbrella projects involve linked aquatic science disciplines: Analytical Chemistry; Geology; Geochemistry; Microbiology; Engineering (Remotely Operated Vehicles); and recently Policy (scientist-public engagement). We especially use expeditionary research activities aboard our research vessel in Lake Michigan, during which students (a dozen at a time, from multiple programs) experience ocean-scale research cruise activities; apply a learned skill in real time to characterize a large lake; participate in interdisciplinary teamwork; learn the interactions among biology, chemistry, geology, optics, and physics for diverse aquatic habitats; and, importantly, experience leadership as "Chief Scientist-for-a-station". These team efforts achieve beneficial outcomes: they develop self-confidence in the application of skills; enable expression of leadership capabilities; provide an opportunity to assess "love of big water"; produce an invaluable long-term dataset for the studied region (our benefit); and are often voted a top influence on career decisions. Collectively, these experiences have led to positive outcomes for "historical" undergraduate participants: more than half are in STEM graduate programs, and only a few are not still involved in a STEM career at some level or in a related role, for example as a lawyer in environmental policy.
LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments
2015-11-20
(Lexington, MA, U.S.A.) Popularized by Google [36] and Apache Hadoop [37], the map-reduce parallel programming model has become a staple technology of the ever-growing big data community. LLMapReduce brings this model to big data users running on a supercomputer and dramatically simplifies map-reduce programming by providing simple parallel programming…
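For readers unfamiliar with the model, the sketch below shows the map-reduce pattern in miniature: a local word count whose map phase runs in a process pool and whose reduce phase folds the partial results together. It illustrates the programming model only; it is not LLMapReduce, Hadoop, or supercomputer code, and the input lines are invented.

    # Minimal map-reduce sketch: map emits partial word counts, reduce merges them.
    from collections import Counter
    from functools import reduce
    from multiprocessing import Pool

    def map_phase(line):
        """Map: emit a partial count for one line of input."""
        return Counter(line.split())

    def reduce_phase(a, b):
        """Reduce: merge two partial counts."""
        a.update(b)
        return a

    if __name__ == "__main__":
        lines = ["big data needs big tools", "map reduce maps then reduces"]
        with Pool(2) as pool:
            partials = pool.map(map_phase, lines)            # map: one task per line
        totals = reduce(reduce_phase, partials, Counter())   # reduce: fold results
        print(totals.most_common(3))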
Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L
2016-01-01
There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.
ERIC Educational Resources Information Center
Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer
2008-01-01
High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…
NASA Astrophysics Data System (ADS)
2013-09-01
Conference: The Big Bangor Day Meeting
Lecture: Charterhouse plays host to a physics day
Festival: Science on Stage festival 2013 arrives in Poland
Event: Scottish Physics Teachers' Summer School
Meeting: Researchers and educators meet at Lund University
Conference: Exeter marks the spot
Recognition: European Physical Society uncovers an historic site
Education: Initial teacher education undergoes big changes
Forthcoming events
ERIC Educational Resources Information Center
Lara-Alecio, Rafael; Irby, Beverly J.; Tong, Fuhui; Guerrero, Cindy; Koch, Janice; Sutton-Jones, Kara L.
2018-01-01
The overarching purpose of our study was to compare performances of treatment and control condition students who completed a literacy-infused, inquiry-based science intervention through sixth grade as measured by a big idea assessment tool which we refer to as the Big Ideas in Science Assessment (BISA). First, we determine the concurrent validity…
Big-Data-Driven Stem Cell Science and Tissue Engineering: Vision and Unique Opportunities.
Del Sol, Antonio; Thiesen, Hans J; Imitola, Jaime; Carazo Salas, Rafael E
2017-02-02
Achieving the promises of stem cell science to generate precise disease models and designer cell samples for personalized therapeutics will require harnessing pheno-genotypic cell-level data quantitatively and predictively in the lab and clinic. Those requirements could be met by developing a Big-Data-driven stem cell science strategy and community. Copyright © 2017 Elsevier Inc. All rights reserved.
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs with programming languages like Python or R, a skill set not grasped by many researchers in the healthcare data analytics area. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
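A minimal sketch of the general idea, assuming ipywidgets inside a Jupyter Notebook: an analysis step is wrapped in an interactive control so other researchers can re-run it with different parameters without writing code. The dataset, column names, and filtering function are invented for illustration and are not taken from the practice described in the article.

    # Sketch: wrapping a notebook analysis step in an interactive widget so it
    # behaves like a small analytics "APP". Run inside a Jupyter Notebook.
    import pandas as pd
    import ipywidgets as widgets
    from IPython.display import display

    # Invented toy cohort data for demonstration.
    df = pd.DataFrame({
        "age": [34, 51, 29, 62, 45, 70],
        "systolic_bp": [118, 135, 110, 150, 128, 160],
    })

    def summarize(min_age=30):
        # Filter the cohort by the slider value and show summary statistics.
        subset = df[df["age"] >= min_age]
        display(subset.describe())

    # A slider from 20 to 80 in steps of 5 re-runs the analysis interactively.
    widgets.interact(summarize, min_age=(20, 80, 5))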
The BIG Data Center: from deposition to integration to translation
2017-01-01
Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. PMID:27899658
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Henly, Susan J; McCarthy, Donna O; Wyman, Jean F; Heitkemper, Margaret M; Redeker, Nancy S; Titler, Marita G; McCarthy, Ann Marie; Stone, Patricia W; Moore, Shirley M; Alt-White, Anna C; Conley, Yvette P; Dunbar-Jacob, Jacqueline
2015-01-01
The Council for the Advancement of Nursing Science aims to "facilitate and recognize life-long nursing science career development" as an important part of its mission. In light of fast-paced advances in science and technology that are inspiring new questions and methods of investigation in the health sciences, the Council for the Advancement of Nursing Science convened the Idea Festival for Nursing Science Education and appointed the Idea Festival Advisory Committee (IFAC) to stimulate dialogue about linking PhD education with a renewed vision for preparation of the next generation of nursing scientists. Building on the 2005 National Research Council report Advancing The Nation's Health Needs and the 2010 American Association of Colleges of Nursing Position Statement on the Research-Focused Doctorate Pathways to Excellence, the IFAC specifically addressed the capacity of PhD programs to prepare nursing scientists to conduct cutting-edge research in the following key emerging and priority areas of health sciences research: omics and the microbiome; health behavior, behavior change, and biobehavioral science; patient-reported outcomes; big data, e-science, and informatics; quantitative sciences; translation science; and health economics. The purpose of this article is to (a) describe IFAC activities, (b) summarize 2014 discussions hosted as part of the Idea Festival, and (c) present IFAC recommendations for incorporating these emerging areas of science and technology into research-focused doctoral programs committed to preparing graduates for lifelong, competitive careers in nursing science. The recommendations address clearer articulation of program focus areas; inclusion of foundational knowledge in emerging areas of science in core courses on nursing science and research methods; faculty composition; prerequisite student knowledge and skills; and in-depth, interdisciplinary training in supporting area of science content and methods. Copyright © 2015 Elsevier Inc. All rights reserved.
Geologic map of Big Bend National Park, Texas
Turner, Kenzie J.; Berry, Margaret E.; Page, William R.; Lehman, Thomas M.; Bohannon, Robert G.; Scott, Robert B.; Miggins, Daniel P.; Budahn, James R.; Cooper, Roger W.; Drenth, Benjamin J.; Anderson, Eric D.; Williams, Van S.
2011-01-01
The purpose of this map is to provide the National Park Service and the public with an updated digital geologic map of Big Bend National Park (BBNP). The geologic map report of Maxwell and others (1967) provides a fully comprehensive account of the important volcanic, structural, geomorphological, and paleontological features that define BBNP. However, the map is on a geographically distorted planimetric base and lacks topography, which has caused difficulty in conducting GIS-based data analyses and georeferencing the many geologic features investigated and depicted on the map. In addition, the map is outdated, excluding significant data from numerous studies that have been carried out since its publication more than 40 years ago. This report includes a modern digital geologic map that can be utilized with standard GIS applications to aid BBNP researchers in geologic data analysis, natural resource and ecosystem management, monitoring, assessment, inventory activities, and educational and recreational uses. The digital map incorporates new data, many revisions, and greater detail than the original map. Although some geologic issues remain unresolved for BBNP, the updated map serves as a foundation for addressing those issues. Funding for the Big Bend National Park geologic map was provided by the United States Geological Survey (USGS) National Cooperative Geologic Mapping Program and the National Park Service. The Big Bend mapping project was administered by staff in the USGS Geology and Environmental Change Science Center, Denver, Colo. Members of the USGS Mineral and Environmental Resources Science Center completed investigations in parallel with the geologic mapping project. Results of these investigations addressed some significant current issues in BBNP and the U.S.-Mexico border region, including contaminants and human health, ecosystems, and water resources. Funding for the high-resolution aeromagnetic survey in BBNP, and associated data analyses and interpretation, was from the USGS Crustal Geophysics and Geochemistry Science Center. Mapping contributed from university professors and students was mostly funded by independent sources, including academic institutions, private industry, and other agencies.
The New Big Science at the NSLS
NASA Astrophysics Data System (ADS)
Crease, Robert
2016-03-01
The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.
ERIC Educational Resources Information Center
Liou, Pey-Yan
2014-01-01
The purpose of this study is to examine the relationship between student self-concept and achievement in science in Taiwan based on the big-fish-little-pond effect (BFLPE) model using the Trends in International Mathematics and Science Study (TIMSS) 2003 and 2007 databases. Hierarchical linear modeling was used to examine the effects of the…
ERIC Educational Resources Information Center
Bush, Diane; Dreistadt, Steve
This program has 10 units, each to be taught in 40-50 minute periods. Each unit includes a statement of purpose, concepts to be taught, a list of necessary materials, preparation, and graphics. Guidelines are provided for 10-15 minutes of introduction with classroom discussion, 15-20 minutes of activities and 5-10 minutes of wrap-up discussion.…
NASA Astrophysics Data System (ADS)
Agrawal, Ankit; Choudhary, Alok
2016-05-01
Our ability to collect "big data" has greatly surpassed our capability to analyze it, underscoring the emergence of the fourth paradigm of science, which is data-driven discovery. The need for data informatics is also emphasized by the Materials Genome Initiative (MGI), further boosting the emerging field of materials informatics. In this article, we look at how data-driven techniques are playing a big role in deciphering processing-structure-property-performance relationships in materials, with illustrative examples of both forward models (property prediction) and inverse models (materials discovery). Such analytics can significantly reduce time-to-insight and accelerate cost-effective materials discovery, which is the goal of MGI.
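The article as abstracted does not include code; purely as a hedged illustration of what a data-driven forward model (property prediction) can look like in practice, the sketch below fits a regression from synthetic composition/processing descriptors to a synthetic property using scikit-learn. The descriptors, the property, and all parameter choices are assumptions for illustration, not material from the article.

# Hedged sketch of a data-driven forward model: synthetic descriptors -> synthetic property.
# Nothing here comes from the article; descriptors, property, and settings are invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples, n_features = 500, 6          # e.g., composition fractions plus a processing variable
X = rng.random((n_samples, n_features))
# Synthetic "property" with a nonlinear dependence on the descriptors plus noise.
y = 3.0 * X[:, 0] ** 2 + 2.0 * X[:, 1] * X[:, 2] - X[:, 3] + 0.1 * rng.normal(size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)   # forward model
model.fit(X_train, y_train)
print("held-out R^2:", r2_score(y_test, model.predict(X_test)))

An inverse model (materials discovery) would run in the opposite direction, searching the descriptor space for candidates predicted to meet a target property.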
To See the Unseen: A History of Planetary Radar Astronomy
NASA Technical Reports Server (NTRS)
Butrica, Andrew J.
1996-01-01
This book relates the history of planetary radar astronomy from its origins in radar to the present day and secondarily to bring to light that history as a case of 'Big Equipment but not Big Science'. Chapter One sketches the emergence of radar astronomy as an ongoing scientific activity at Jodrell Bank, where radar research revealed that meteors were part of the solar system. The chief Big Science driving early radar astronomy experiments was ionospheric research. Chapter Two links the Cold War and the Space Race to the first radar experiments attempted on planetary targets, while recounting the initial achievements of planetary radar, namely, the refinement of the astronomical unit and the rotational rate and direction of Venus. Chapter Three discusses early attempts to organize radar astronomy and the efforts at MIT's Lincoln Laboratory, in conjunction with Harvard radio astronomers, to acquire antenna time unfettered by military priorities. Here, the chief Big Science influencing the development of planetary radar astronomy was radio astronomy. Chapter Four spotlights the evolution of planetary radar astronomy at the Jet Propulsion Laboratory, a NASA facility, at Cornell University's Arecibo Observatory, and at Jodrell Bank. A congeries of funding from the military, the National Science Foundation, and finally NASA marked that evolution, which culminated in planetary radar astronomy finding a single Big Science patron, NASA. Chapter Five analyzes planetary radar astronomy as a science using the theoretical framework provided by philosopher of science Thomas Kuhn. Chapter Six explores the shift in planetary radar astronomy beginning in the 1970s that resulted from its financial and institutional relationship with NASA Big Science. Chapter Seven addresses the Magellan mission and its relation to the evolution of planetary radar astronomy from a ground-based to a space-based activity. Chapters Eight and Nine discuss the research carried out at ground-based facilities by this transformed planetary radar astronomy, as well as the upgrading of the Arecibo and Goldstone radars. A technical essay appended to this book provides an overview of planetary radar techniques, especially range-Doppler mapping.
Raster Data Partitioning for Supporting Distributed GIS Processing
NASA Astrophysics Data System (ADS)
Nguyen Thai, B.; Olasz, A.
2015-08-01
The big data concept has already made an impact in the geospatial sector. Several studies apply techniques that originated in computer science to GIS processing of huge volumes of geospatial data, while other studies treat geospatial data as if it had always been big data (Lee and Kang, 2015). Data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial, and temporal resolution. A significant portion of big data is geospatial, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). The ever-increasing volume of raw data arrives in different formats and representations and serves different purposes, yet only the information derived from these data sets constitutes a valuable result. Computing capability and processing speed, however, remain limiting factors, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Recently, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing further requires processing algorithms that can be distributed to handle geospatial big data. The MapReduce programming model and distributed file systems have proven their capability to process non-GIS big data, but it is often inconvenient or inefficient to rewrite existing algorithms for MapReduce, and GIS data cannot be partitioned like text-based data by line or by byte. We therefore seek an alternative solution for data partitioning, data distribution, and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments and addresses partitioning, distribution, and distributed processing of GIS raster data and algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution, and processing, and the first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, so data partitioning can be considered a preprocessing step before processing services are applied to the data. As a proof of concept we have implemented a simple tile-based partitioning method that splits an image into smaller grids (NxM tiles) and compared the processing time to existing methods using NDVI calculation. The concept is demonstrated using our own open-source processing framework.
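The framework itself is not reproduced in the abstract; the sketch below is only a minimal stand-in for the tile-based partitioning and per-tile NDVI benchmark it describes, using NumPy on synthetic RED and NIR bands. Tile counts, array sizes, and function names are assumptions, and a real workflow would read the bands with a GIS library rather than generate them.

# Minimal stand-in for N x M tile partitioning of a raster with per-tile NDVI computation.
# Bands are synthetic; a real pipeline would read them with a GIS library (e.g., rasterio).
import numpy as np

def tile_slices(shape, n_rows, n_cols):
    """Yield (row_slice, col_slice) pairs that cover an array of the given shape."""
    rows = np.linspace(0, shape[0], n_rows + 1, dtype=int)
    cols = np.linspace(0, shape[1], n_cols + 1, dtype=int)
    for i in range(n_rows):
        for j in range(n_cols):
            yield slice(rows[i], rows[i + 1]), slice(cols[j], cols[j + 1])

def ndvi(red, nir):
    """NDVI = (NIR - RED) / (NIR + RED), guarding against division by zero."""
    denom = nir + red
    out = np.zeros_like(denom, dtype=float)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

rng = np.random.default_rng(1)
red = rng.random((1024, 1024))          # synthetic RED band
nir = rng.random((1024, 1024))          # synthetic NIR band

result = np.empty_like(red)
for rs, cs in tile_slices(red.shape, 4, 4):   # 4 x 4 tiles; each tile could go to a worker
    result[rs, cs] = ndvi(red[rs, cs], nir[rs, cs])
print("NDVI range:", result.min(), result.max())

Because each tile is processed independently, the same per-tile function could be shipped to distributed workers without rewriting the underlying algorithm, which is the point the abstract makes.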
Earth Science Data Analysis in the Era of Big Data
NASA Technical Reports Server (NTRS)
Kuo, K.-S.; Clune, T. L.; Ramachandran, R.
2014-01-01
Anyone with even a cursory interest in information technology cannot help but recognize that "Big Data" is one of the most fashionable catchphrases of late. From accurate voice and facial recognition, language translation, and airfare prediction and comparison, to monitoring the real-time spread of flu, Big Data techniques have been applied to many seemingly intractable problems with spectacular successes. They appear to be a rewarding way to approach many currently unsolved problems. Few fields of research can claim a longer history with problems involving voluminous data than Earth science. The problems we are facing today with our Earth's future are more complex and carry potentially graver consequences than the examples given above. How has our climate changed? Besides natural variations, what is causing these changes? What are the processes involved and through what mechanisms are these connected? How will they impact life as we know it? In attempts to answer these questions, we have resorted to observations and numerical simulations with ever-finer resolutions, which continue to feed the "data deluge." Plausibly, many Earth scientists are wondering: How will Big Data technologies benefit Earth science research? As an example from the global water cycle, one subdomain among many in Earth science, how would these technologies accelerate the analysis of decades of global precipitation to ascertain the changes in its characteristics, to validate these changes in predictive climate models, and to infer the implications of these changes to ecosystems, economies, and public health? Earth science researchers need a viable way to harness the power of Big Data technologies to analyze large volumes and varieties of data with velocity and veracity. Beyond providing speedy data analysis capabilities, Big Data technologies can also play a crucial, albeit indirect, role in boosting scientific productivity by facilitating effective collaboration within an analysis environment. To illustrate the effects of combining a Big Data technology with an effective means of collaboration, we relate the (fictitious) experience of an early-career Earth science researcher a few years beyond the present, interlaced and contrasted with reminiscences of its recent past (i.e., the present).
Who Owns Educational Theory? Big Data, Algorithms and the Expert Power of Education Data Science
ERIC Educational Resources Information Center
Williamson, Ben
2017-01-01
"Education data science" is an emerging methodological field which possesses the algorithm-driven technologies required to generate insights and knowledge from educational big data. This article consists of an analysis of the Lytics Lab, Stanford University's laboratory for research and development in learning analytics, and the Center…
Big Data: Philosophy, Emergence, Crowdledge, and Science Education
ERIC Educational Resources Information Center
dos Santos, Renato P.
2015-01-01
Big Data has already passed beyond the hype and is now a field that deserves serious academic investigation, and natural scientists should also become familiar with Analytics. On the other hand, there is little empirical evidence that any science taught in school is helping people to lead happier, more prosperous, or more politically well-informed lives. In…
ERIC Educational Resources Information Center
Kinsella, William J.
1999-01-01
Extends a Foucauldian view of power/knowledge to the archetypical knowledge-intensive organization, the scientific research laboratory. Describes the discursive production of power/knowledge at the "big science" laboratory conducting nuclear fusion research and illuminates a critical incident in which the fusion research…
ERIC Educational Resources Information Center
McCullagh, John; Greenwood, Julian
2011-01-01
In this digital age, is primary science being left behind? Computer microscopes provide opportunities to transform science lessons into highly exciting learning experiences and to shift enquiry and discovery back into the hands of the children. A class of 5- and 6-year-olds was just one group of children involved in the Digitally Resourced…
A Guided Inquiry on Hubble Plots and the Big Bang
ERIC Educational Resources Information Center
Forringer, Ted
2014-01-01
In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the…
From Darwin to the Census of Marine Life: Marine Biology as Big Science
Vermeulen, Niki
2013-01-01
With the development of the Human Genome Project, a heated debate emerged on biology becoming ‘big science’. However, biology already has a long tradition of collaboration, as natural historians were part of the first collective scientific efforts: exploring the variety of life on earth. Such mappings of life still continue today, and if field biology is gradually becoming an important subject of studies into big science, research into life in the world's oceans is not taken into account yet. This paper therefore explores marine biology as big science, presenting the historical development of marine research towards the international ‘Census of Marine Life’ (CoML) making an inventory of life in the world's oceans. Discussing various aspects of collaboration – including size, internationalisation, research practice, technological developments, application, and public communication – I will ask if CoML still resembles traditional collaborations to collect life. While showing both continuity and change, I will argue that marine biology is a form of natural history: a specific way of working together in biology that has transformed substantially in interaction with recent developments in the life sciences and society. As a result, the paper does not only give an overview of transformations towards large scale research in marine biology, but also shines a new light on big biology, suggesting new ways to deepen the understanding of collaboration in the life sciences by distinguishing between different ‘collective ways of knowing’. PMID:23342119
The Role of Gender in Youth Mentoring Relationship Formation and Duration
ERIC Educational Resources Information Center
Rhodes, Jean; Lowe, Sarah R.; Litchfield, Leon; Walsh-Samp, Kathy
2008-01-01
The role of gender in shaping the course and quality of adult-youth mentoring relationships was examined. The study drew on data from a large, random assignment evaluation of Big Brothers Big Sisters of America (BBSA) programs [Grossman, J. B., & Tierney, J. P. (1998). Does mentoring work? An impact study of the Big Brothers Big Sisters program.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
Pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in extreme-scale system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, such a system typically has a high-performing network and a compute accelerator. This system architecture is effective not only for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or that convergence. This work introduces a programming abstraction to address the problem. The abstraction is implemented as a software library that runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
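The abstract names the central concept (distributed data structures spanning hierarchical-heterogeneous memory) but not the library's API, so the toy sketch below only illustrates the idea of a globally indexed array whose chunks are tagged with the memory domain they notionally occupy. The class name, domain labels, and partitioning scheme are all invented for illustration and do not describe the actual library.

# Toy illustration only: a globally indexed 1-D array whose chunks are tagged with the
# memory domain they notionally occupy. The real library's API is not shown in the record.
import numpy as np

class PartitionedArray:
    def __init__(self, length, domains):
        """Split the global index space [0, length) evenly across the given memory domains."""
        edges = np.linspace(0, length, len(domains) + 1, dtype=int)
        self.chunks = [{"domain": d,
                        "start": int(edges[k]),
                        "data": np.zeros(int(edges[k + 1] - edges[k]))}
                       for k, d in enumerate(domains)]

    def _locate(self, index):
        for chunk in self.chunks:
            if chunk["start"] <= index < chunk["start"] + len(chunk["data"]):
                return chunk, index - chunk["start"]
        raise IndexError(index)

    def __setitem__(self, index, value):
        chunk, offset = self._locate(index)
        chunk["data"][offset] = value

    def __getitem__(self, index):
        chunk, offset = self._locate(index)
        return chunk["data"][offset]

arr = PartitionedArray(1000, domains=["HBM", "DRAM"])   # two simulated memory tiers
arr[10] = 3.14      # lands in the HBM-tagged chunk
arr[900] = 2.71     # lands in the DRAM-tagged chunk
print(arr[10], arr[900])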
Big data and visual analytics in anaesthesia and health care.
Simpao, A F; Ahumada, L M; Rehman, M A
2015-09-01
Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences.
The BIG Data Center: from deposition to integration to translation.
2017-01-04
Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Derrick, M.
These proceedings document a number of aspects of a big science facility and its impact on science, on technology, and on the continuing program of a major US research institution. The Zero Gradient Synchrotron (ZGS) was a 12.5 GeV weak focusing proton accelerator that operated at Argonne for fifteen years--from 1964 to 1979. It was a major user facility which led to new close links between the Laboratory and university groups: in the research program; in the choice of experiments to be carried out; in the design and construction of beams and detectors; and even in the Laboratory management. For Argonne, it marked a major move from being a Laboratory dominated by nuclear reactor development to one with a stronger basic research orientation. The present meeting covered the progress in accelerator science, in the applications of technology pioneered or developed by people working at the ZGS, as well as in physics research and detector construction. At this time, when the future of the US research programs in science is being questioned as a result of the ending of the Cold War and plans to balance the Federal budget, the specific place of the National Laboratories in the spectrum of research activities is under particular examination. This Symposium highlights one case history of a major science program that was completed more than a decade ago--so that the further developments of both the science and the technology can be seen in some perspective. The subsequent activities of the people who had worked in the ZGS program as well as the redeployment of the ZGS facilities were addressed. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
NASA Astrophysics Data System (ADS)
Little, M. M.; Moe, K.; Komar, G.
2014-12-01
NASA's Earth Science Technology Office (ESTO) manages a wide range of information technology projects under the Advanced Information Systems Technology (AIST) Program. The AIST Program aims to support all phases of NASA's Earth Science program with the goal of enabling new observations and information products, increasing the accessibility and use of Earth observations, and reducing the risk and cost of satellite and ground based information systems. Recent initiatives feature computational technologies to improve information extracted from data streams or model outputs and researchers' tools for Big Data analytics. Data-centric technologies enable research communities to facilitate collaboration and increase the speed with which results are produced and published. In the future NASA anticipates more small satellites (e.g., CubeSats), mobile drones and ground-based in-situ sensors will advance the state-of-the-art regarding how scientific observations are performed, given the flexibility, cost and deployment advantages of new operations technologies. This paper reviews the success of the program and the lessons learned. Infusion of these technologies is challenging and the paper discusses the obstacles and strategies to adoption by the earth science research and application efforts. It also describes alternative perspectives for the future program direction and for realizing the value in the steps to transform observations from sensors to data, to information, and to knowledge, namely: sensor measurement concepts development; data acquisition and management; data product generation; and data exploitation for science and applications.
NASA Astrophysics Data System (ADS)
Gargus, Gerald Vincent
This investigation provides an in-depth understanding of teacher professional development at the Alexander Science Center School, a dependent charter museum school established through a partnership between the California Science Center and Los Angeles Unified School District. Three methods of data collection were used. A survey was distributed and collected from the school's teachers, resulting in a prioritized list of teacher professional development needs, as well as a summary of teachers' opinions about the school's existing professional development program. In addition, six key stakeholders in the school's professional development program were interviewed for the study. Finally, documents related to the school's professional development program were analyzed. Data collected from the interviews and documents were used to develop an understanding of the various components of the Alexander Science Center School's professional development program. Teachers identified seven areas that had a high priority for future professional development, including developing skills for working with below-grade-level students, improving students' analytical skills in mathematics, working with English Language Learners, improving students' overall reading ability levels, developing teachers' content-area knowledge for science, integrating science across the curriculum, and incorporating hands-on, activity-based learning strategies to teach science. Professional development needs identified by Alexander Science Center School teachers were categorized based on their focus on content knowledge, pedagogical content knowledge, or curricular knowledge. Analysis of data collected through interviews and documents revealed that the Alexander Science Center School's professional development program consisted of six venues for providing professional development for teachers: weekly "banked time" sessions taking place within the standard school day, grade-level meetings, teacher support meetings, classroom coaching/Big Lab co-teaching, summer institutes, and off-campus conferences and seminars. Results indicated that the effectiveness of the six venues was closely tied to the level of collaborative planning that took place between the Alexander Science Center School and the associated California Science Center. Examination of teachers' and stakeholders' opinions reflects that after a year-and-a-half of operations, the school's professional development program is perceived as disjointed and ineffective, but that the foundation of a sound program has been established.
Nano-Bio-Genesis: tracing the rise of nanotechnology and nanobiotechnology as 'big science'
Kulkarni, Rajan P
2007-01-01
Nanotechnology research has lately been of intense interest because of its perceived potential for many diverse fields of science. Nanotechnology's tools have found application in diverse fields, from biology to device physics. By the 1990s, there was a concerted effort in the United States to develop a national initiative to promote such research. The success of this effort led to a significant influx of resources and interest in nanotechnology and nanobiotechnology and to the establishment of centralized research programs and facilities. Further government initiatives (at federal, state, and local levels) have firmly cemented these disciplines as 'big science,' with efforts increasingly concentrated at select laboratories and centers. In many respects, these trends mirror certain changes in academic science over the past twenty years, with a greater emphasis on applied science and research that can be more directly utilized for commercial applications. We also compare the National Nanotechnology Initiative and its successors to the Human Genome Project, another large-scale, government-funded initiative. These precedents made the shifts in nanotechnology easier for researchers to accept, as they followed trends already established within most fields of science. Finally, these trends are examined in the design of technologies for detection and treatment of cancer, through the Alliance for Nanotechnology in Cancer initiative of the National Cancer Institute. Federal funding of these nanotechnology initiatives has allowed for expansion into diverse fields and has provided the impetus for expanding the scope of research in several fields, especially biomedicine, though the ultimate utility and impact of all these efforts remains to be seen. PMID:17629932
Assessing Teachers' Comprehension of What Matters in Earth Science
NASA Astrophysics Data System (ADS)
Penuel, W. R.; Kreikemeier, P.; Venezky, D.; Blank, J. G.; Davatzes, A.; Davatzes, N.
2006-12-01
Curricular standards developed for individual U.S. States tell teachers what they should teach. Most sets of standards are too numerous to be taught in a single year, forcing teachers to make decisions about what to emphasize in their curriculum. Ideally, such decisions would be based on what matters most in Earth science, namely, the big ideas that anchor scientific inquiry in the field. A measure of teachers' ability to associate curriculum standards with fundamental concepts in Earth science would help K-12 program and curriculum developers to bridge gaps in teachers' knowledge in order to help teachers make better decisions about what is most important to teach and communicate big ideas to students. This paper presents preliminary results of an attempt to create and validate a measure of teachers' comprehension of what matters in three sub-disciplines of Earth science. This measure was created as part of an experimental study of teacher professional development in Earth science. It is a task that requires teachers to take their state's curriculum standards and identify which standards are necessary or supplemental to developing students' understanding of fundamental concepts in the target sub-disciplines. To develop the task, a team of assessment experts and educational researchers asked a panel of four Earth scientists to identify key concepts embedded within middle school standards for the state of Florida. The Earth science panel reached a consensus on which standards needed to be taught in order to develop understanding of those concepts; this was used as a basis for comparison with teacher responses. Preliminary analysis of the responses of 44 teachers who participated in a pilot validation study identified differences between teachers' and scientists' maps of standards to big ideas in the sub-disciplines. On average, teachers identified just under one-third of the connections seen by expert Earth scientists between the concepts and their state standards. Teachers with higher levels of agreement also had a higher percentage of standards identified that were "off-grade," meaning that they saw connections to standards that they were not themselves required to teach but that nonetheless were relevant to developing student understanding of a particular concept. This result is consistent with the premise that to make good decisions about what to teach, teachers need to be able to identify relevant standards from other grade levels that are connected to the big ideas of a discipline (Shulman, 1986, Educ. Res. 15:4-14).
Frontier Fields: Bringing the Distant Universe into View
NASA Astrophysics Data System (ADS)
Eisenhamer, Bonnie; Lawton, Brandon L.; Summers, Frank; Ryer, Holly
2014-06-01
The Frontier Fields is a multi-cycle program of six deep-field observations of strong-lensing galaxy clusters that will be taken in parallel with six deep “blank fields.” The three-year-long collaborative program centers on observations from NASA’s Great Observatories, which will team up to look deeper into the universe than ever before, and potentially uncover galaxies that are as much as 100 times fainter than what the telescopes can typically see. Because of the unprecedented views of the universe that will be achieved, the Frontier Fields science program is ideal for informing audiences about scientific advances and topics in STEM. For example, the program provides an opportunity to look back on the history of deep field observations and how they changed (and continue to change) astronomy, while exploring the ways astronomers approach big science problems. As a result, the Space Telescope Science Institute’s Office of Public Outreach has initiated an education and public outreach (E/PO) project to follow the progress of the Frontier Fields program - providing a behind-the-scenes perspective of this observing initiative. This poster will highlight the goals of the Frontier Fields E/PO project and the cost-effective approach being used to bring the program’s results to both the public and educational audiences.
Communicating the Nature of Science through "The Big Bang Theory": Evidence from a Focus Group Study
ERIC Educational Resources Information Center
Li, Rashel; Orthia, Lindy A.
2016-01-01
In this paper, we discuss a little-studied means of communicating about or teaching the nature of science (NOS)--through fiction television. We report some results of focus group research which suggest that the American sitcom "The Big Bang Theory" (2007-present), whose main characters are mostly working scientists, has influenced…
ERIC Educational Resources Information Center
Scheider, Walter
2005-01-01
The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…
Big Data and Perioperative Nursing.
Westra, Bonnie L; Peterson, Jessica J
2016-10-01
Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient.
Big Explosions, Strong Gravity: Making Girl Scouts ACEs of Space through Chandra Outreach
NASA Astrophysics Data System (ADS)
Hornschemeier, A. E.; Lochner, J. C.; Ganguly, R.; Feaga, L. M.; Ford, K. E. S.
2005-12-01
Thanks to two years of Chandra E/PO funding, we have carried out a number of successful activities with the Girl Scouts of Central Maryland, focusing on girls in the 11-17 year age range. Our reasons for targeting this age range include the general decline in interest in math and science that occurs at or after children reach this critical age (meaning that we reach them early enough to have a positive effect). We initially target girls due to their underrepresentation in science, but the activities are all gender-neutral and highly adaptable to other groups. The program includes two components, in collaboration with Girl Scouts of Central Maryland. The first component is a well-established one-day Girl Scout patch activity entitled Big Explosions and Strong Gravity (BESG) where the girls earn a patch for their badge sash. The four BESG activities, mostly adapted from existing E/PO material, are available on the World Wide Web for use by others. The activities cover the electromagnetic spectrum as a tool for astronomy, the cosmic abundance of the elements and the supernova origin of many of the elements, black holes and their detection, and supernova explosions/stellar evolution. Thus far approximately 200 girls and their parents have participated in BESG and it has now become part of the council culture. The second activity is new and is part of the relatively new Girl Scout Studio 2B program, which is a girl-led program for the 11-17 year age range. Based on several meetings with small groups of girls and adults, we have formed a Studio 2B "club" called the ACE of Space Club (Astronomical Cosmic Exploration). We'll describe our experiences interacting with the Girl Scouts in this girl-led program.
"Small" data in a big data world: archiving terrestrial ecology data at ORNL DAAC
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Boyer, A.; Deb, D.; Hook, L.; Shrestha, R.; Thornton, M.; Virdi, M.; Wei, Y.; Wright, D.
2016-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC http://daac.ornl.gov), a NASA-funded data center, archives a diverse collection of terrestrial biogeochemistry and ecological dynamics observations and models in support of NASA's Earth Science program. The ORNL DAAC has been addressing the increasing challenge of publishing diverse small data products into an online archive while dealing with the enhanced need for integration and availability of these data to address big science questions. This paper will show examples of "small" diverse data holdings - ranging from the Daymet model output data to site-based soil moisture observation data. We define "small" by the data volume of these data products compared to petabyte scale observations. We will highlight the use of tools and services for visualizing diverse data holdings and subsetting services such as the MODIS land products subsets tool (at ORNL DAAC) that provides big MODIS data in small chunks. Digital Object Identifiers (DOI) and data citations have enhanced the availability of data. The challenge faced by data publishers now is to deal with the increased number of publishable data products and most importantly the difficulties of publishing small diverse data products into an online archive. This paper will also present our experiences designing a data curation system for these types of data. The characteristics of these data will be examined and their scientific value will be demonstrated via data citation metrics. We will present case studies of leveraging specialized tools and services that have enabled small data sets to realize their "big" scientific potential. Overall, we will provide a holistic view of the challenges and potential of small diverse terrestrial ecology data sets from data curation to distribution.
Zhao, Yihong; Castellanos, F Xavier
2016-03-01
Psychiatric science remains descriptive, with a categorical nosology intended to enhance interobserver reliability. Increased awareness of the mismatch between categorical classifications and the complexity of biological systems drives the search for novel frameworks including discovery science in Big Data. In this review, we provide an overview of incipient approaches, primarily focused on classically categorical diagnoses such as schizophrenia (SZ), autism spectrum disorder (ASD), and attention-deficit/hyperactivity disorder (ADHD), but also reference convincing, if focal, advances in cancer biology, to describe the challenges of Big Data and discovery science, and outline approaches being formulated to overcome existing obstacles. A paradigm shift from categorical diagnoses to a domain/structure-based nosology and from linear causal chains to complex causal network models of brain-behavior relationship is ongoing. This (r)evolution involves appreciating the complexity, dimensionality, and heterogeneity of neuropsychiatric data collected from multiple sources ('broad' data) along with data obtained at multiple levels of analysis, ranging from genes to molecules, cells, circuits, and behaviors ('deep' data). Both of these types of Big Data landscapes require the use and development of robust and powerful informatics and statistical approaches. Thus, we describe Big Data analysis pipelines and the promise and potential limitations in using Big Data approaches to study psychiatric disorders. We highlight key resources available for psychopathological studies and call for the application and development of Big Data approaches to dissect the causes and mechanisms of neuropsychiatric disorders and identify corresponding biomarkers for early diagnosis.
A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)
NASA Astrophysics Data System (ADS)
Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.
2013-12-01
Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, Automated Event Service (AES), pioneers the exploration of Big Data technologies for data-intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates the array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: shared-nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set on a regular latitude-longitude grid. The first use case is the discovery and identification of blizzards using NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) data sets. The other finds diurnal signals in the same 8-year period using SSMI data from three different instruments with different equator crossing times by correlating their retrieved parameters. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.
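The SciDB queries behind the two use cases are not included in the abstract; as a loose, hedged stand-in for the chunked, shared-nothing style of array processing described, the sketch below flags "event" cells in a synthetic gridded field one band of latitude rows at a time. The grid size, threshold, and field are placeholders, not MERRA or SSMI data.

# Loose stand-in for chunked, shared-nothing style processing of a gridded field:
# each band of latitude rows is examined independently and cells above a threshold are flagged.
# The field, grid size, and threshold are synthetic placeholders, not MERRA or SSMI data.
import numpy as np

rng = np.random.default_rng(42)
n_lat, n_lon = 361, 576                      # grid size chosen only for illustration
field = rng.gamma(shape=0.5, scale=2.0, size=(n_lat, n_lon))

THRESHOLD = 10.0                             # arbitrary "event" threshold
CHUNK_ROWS = 60                              # rows handled per chunk (per notional node)

event_cells = []
for r0 in range(0, n_lat, CHUNK_ROWS):
    chunk = field[r0:r0 + CHUNK_ROWS]        # each chunk could live on a separate node
    rows, cols = np.nonzero(chunk > THRESHOLD)
    event_cells.extend(zip((rows + r0).tolist(), cols.tolist()))
print(len(event_cells), "grid cells exceed the event threshold")

In a shared-nothing array database the chunks would be distributed across nodes and the per-chunk work would run where the data reside, with only the flagged cells gathered back.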
ERIC Educational Resources Information Center
Black, Susan
1999-01-01
Mentoring programs cannot always obliterate deficiencies in adult/child relationships or better student achievement. Two successful programs are the Big Brother/Big Sister program and the Office of Juvenile Justice's Juvenile Mentoring Program. (JUMP). Social support, not social control, is essential. Sidebars contain program tips and selected…
Big Data: You Are Adding to . . . and Using It
ERIC Educational Resources Information Center
Makela, Carole J.
2016-01-01
"Big data" prompts a whole lexicon of terms--data flow; analytics; data mining; data science; smart you name it (cars, houses, cities, wearables, etc.); algorithms; learning analytics; predictive analytics; data aggregation; data dashboards; digital tracks; and big data brokers. New terms are being coined frequently. Are we paying…
Integrating Antarctic Science Into Geospace System Science
NASA Astrophysics Data System (ADS)
Kelly, J. D.
2010-12-01
Addressing the scientific, technical, and sociological challenges of the future requires both detailed basic research and system based approaches to the entire geospace system from the Earth’s core, through solid Earth, ice, oceans, atmosphere, ionosphere, and magnetosphere to the Sun’s outer atmosphere and even beyond. Fully integrating Antarctic science, and fully exploiting the scientific research possibilities of the Antarctic continent through effective and efficient support infrastructure, will be a very important contribution to future success. Amongst many new facilities and programs which can and are being proposed, the Moveable Antarctic Incoherent Scatter Radar (MAISR) at McMurdo illustrates the potential for innovative future science. This poster uses some of the proposed science programs to show how the scientific community can use the data products of this facility, and how they can contribute to the development of the tools and mechanisms for proposing, executing, and utilizing such new research capabilities. In particular, incoherent scatter radars played a big role in data collection during the recent International Polar Year and plans for future extended operations, including those in Antarctica, will be discussed in the light of lessons learnt in applying observations to global modeling developments.
NASA Astrophysics Data System (ADS)
Botella, J.; Warburton, J.; Bartholow, S.; Reed, L. F.
2014-12-01
The Joint Antarctic School Expedition (JASE) is an international collaboration program between high school students and teachers from the United States and Chile, aimed at providing the skills required for establishing the international scientific collaborations that our globalized world demands and at developing a new approach to science education. The National Antarctic Programs of Chile and the United States worked together on a pilot program that brought high school students and teachers from both countries to Punta Arenas, Chile, in February 2014. The goals of this project included strengthening the partnership between the two countries, and building relationships between future generations of scientists, while developing the students' awareness of global scientific issues and expanding their knowledge and interest in Antarctica and polar science. A big component of the project involved students sharing the acquired knowledge and experiences with the general public. JASE is based on the successful Chilean Antarctic Science Fair developed by Chile's Antarctic Research Institute. For 10 years, small groups of Chilean students, each mentored by a teacher, have performed experimental or bibliographical Antarctic research. Winning teams are awarded an expedition to the Chilean research station on King George Island. In 2014, the Chileans invited US participation in this program in order to strengthen science ties for upcoming generations. On King George Island, students have hands-on experiences conducting experiments and learning about field research. While the total number of students directly involved in the program is relatively small, the sharing of the experience by students with the general public is a novel approach to science education. Research experiences for students, like JASE, are important as they open new directions for students in science learning, build science interest, and help increase science knowledge. We will share experiences with the planning of the pilot program as well as the expedition itself. We will also share the results of the assessment report prepared by an independent party. Lastly, we will offer recommendations for initiating international science education collaborations. United States participation was funded by the NSF Division of Polar Programs.
Data Discovery of Big and Diverse Climate Change Datasets - Options, Practices and Challenges
NASA Astrophysics Data System (ADS)
Palanisamy, G.; Boden, T.; McCord, R. A.; Frame, M. T.
2013-12-01
Developing data search tools is a very common, but often confusing, task for most data-intensive scientific projects. These search interfaces need to be continually improved to handle the ever-increasing diversity and volume of data collections. There are many aspects which determine the type of search tool a project needs to provide to its user community. These include: number of datasets, amount and consistency of discovery metadata, ancillary information such as availability of quality information and provenance, and availability of similar datasets from other distributed sources. The Environmental Data Science and Systems (EDSS) group within the Environmental Science Division at the Oak Ridge National Laboratory has a long history of successfully managing diverse and big observational datasets for various scientific programs via various data centers such as DOE's Atmospheric Radiation Measurement Program (ARM), DOE's Carbon Dioxide Information and Analysis Center (CDIAC), USGS's Core Science Analytics and Synthesis (CSAS) metadata Clearinghouse and NASA's Distributed Active Archive Center (ORNL DAAC). This talk will showcase some of the recent developments for improving data discovery within these centers. The DOE ARM program recently developed a data discovery tool which allows users to search and discover over 4000 observational datasets. These datasets are key to the research efforts related to global climate change. The ARM discovery tool features many new functions such as filtered and faceted search logic, multi-pass data selection, filtering data based on data quality, graphical views of data quality and availability, direct access to data quality reports, and data plots. The ARM Archive also provides discovery metadata to other broader metadata clearinghouses such as ESGF, IASOA, and GOS. In addition to the new interface, ARM is also currently working on providing DOI metadata records to publishers such as Thomson Reuters and Elsevier. The ARM program also provides a standards-based online metadata editor (OME) for PIs to submit their data to the ARM Data Archive. The USGS CSAS metadata Clearinghouse aggregates metadata records from several USGS projects and other partner organizations. The Clearinghouse allows users to search and discover over 100,000 biological and ecological datasets from a single web portal. The Clearinghouse also enabled some new data discovery functions such as enhanced geo-spatial searches based on land and ocean classifications, metadata completeness rankings, data linkage via digital object identifiers (DOIs), and semantically enhanced keyword searches. The Clearinghouse is also currently working on a dashboard that allows data providers to view various statistics, such as the number of their records accessed via the Clearinghouse, the most popular keywords, a metadata quality report, and a DOI creation service. The Clearinghouse also publishes metadata records to broader portals such as NSF DataONE and Data.gov. The author will also present how these capabilities are being reused by recent and upcoming data centers such as DOE's NGEE-Arctic project. References: [1] Devarakonda, R., Palanisamy, G., Wilson, B. E., & Green, J. M. (2010). Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics, 3(1-2), 87-94. [2] Devarakonda, R., Shrestha, B., Palanisamy, G., Hook, L., Killeffer, T., Krassovski, M., ... & Frame, M. (2014, October). OME: Tool for generating and managing metadata to handle BigData. In BigData Conference (pp. 8-10).
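None of the discovery tools' code is given in the abstract; the following sketch only illustrates the faceted-filtering idea behind interfaces like the ARM Data Discovery tool, using a small in-memory list of metadata records. Field names, values, and counts are invented, and production systems work against indexed catalogs rather than Python lists.

# Minimal illustration of faceted filtering over metadata records; all field names and
# values are invented, and real discovery tools query indexed catalogs, not Python lists.
from collections import Counter

records = [
    {"site": "SGP", "instrument": "MWR", "quality": "good", "year": 2011},
    {"site": "SGP", "instrument": "AERI", "quality": "suspect", "year": 2012},
    {"site": "NSA", "instrument": "MWR", "quality": "good", "year": 2012},
    {"site": "TWP", "instrument": "MPL", "quality": "good", "year": 2013},
]

def facet_counts(recs, facet):
    """Count records under each value of a facet (what drives the sidebar counts)."""
    return Counter(r[facet] for r in recs)

def apply_filters(recs, **filters):
    """Keep only records matching every selected facet value."""
    return [r for r in recs if all(r.get(k) == v for k, v in filters.items())]

print(facet_counts(records, "site"))
selected = apply_filters(records, instrument="MWR", quality="good")
print(len(selected), "records after filtering")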
Big Data Science Education: A Case Study of a Project-Focused Introductory Course
ERIC Educational Resources Information Center
Saltz, Jeffrey; Heckman, Robert
2015-01-01
This paper reports on a case study of a project-focused introduction to big data science course. The pedagogy of the course leveraged boundary theory, where students were positioned to be at the boundary between a client's desire to understand their data and the academic class. The results of the case study demonstrate that using live clients…
Big data and clinicians: a review on the state of the science.
Wang, Weiqi; Krishnan, Eswar
2014-01-17
In the past few decades, medically related data collection saw a huge increase, referred to as big data. These huge datasets bring challenges in storage, processing, and analysis. In clinical medicine, big data is expected to play an important role in identifying causality of patient symptoms, in predicting hazards of disease incidence or reoccurrence, and in improving primary-care quality. The objective of this review was to provide an overview of the features of clinical big data, describe a few commonly employed computational algorithms, statistical methods, and software toolkits for data manipulation and analysis, and discuss the challenges and limitations in this realm. We conducted a literature review to identify studies on big data in medicine, especially clinical medicine. We used different combinations of keywords to search PubMed, Science Direct, Web of Knowledge, and Google Scholar for literature of interest from the past 10 years. This paper reviewed studies that analyzed clinical big data and discussed issues related to storage and analysis of this type of data. Big data is becoming a common feature of biological and clinical studies. Researchers who use clinical big data face multiple challenges, and the data itself has limitations. It is imperative that methodologies for data analysis keep pace with our ability to collect and store data.
Columbia University Public Outreach: Looking Beyond the Bright Lights in the Big City
NASA Astrophysics Data System (ADS)
Ash, Summer; Agueros, Marcel A.
2015-01-01
Columbia University astronomers have been inviting the public to come and share in our love of the skies for several decades now, but only within the last ten years has this program become a sustained tool for public outreach and professional development. Columbia's Public Outreach engages with multiple audiences, from the general public to teachers to students of all ages, year-round. In the last three years alone, we have interacted with approximately 7500 people via school visits, teacher-training events, and our public lecture and stargazing series. Our outreach efforts are unique in that they are staffed entirely by graduate students and undergraduate majors who volunteer their time, and coordinated by a dedicated science-trained staff member in the department. Our program is particularly suited to be a vehicle for graduate-student training in science communication and public speaking. We describe the various components of our program and provide an analysis of the populations reached.
Big Crater as Viewed by Pathfinder Lander
NASA Technical Reports Server (NTRS)
1997-01-01
The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.
The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft. This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be. Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The IMP was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.
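The caption's processing steps (enlarging individual filter frames by 500% and co-adding them into an effectively sharper panchromatic frame) were done in Photoshop; the sketch below reproduces only the general upsample-and-average idea with NumPy and SciPy on synthetic frames, so the frame count, scale factor, and noise level are stand-ins rather than the actual IMP data.

# Sketch of the "enlarge then co-add" idea: several noisy frames of the same scene are
# upsampled by a common factor and averaged. Frames here are synthetic; the real frames
# were IMP filter images processed in Photoshop.
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(7)
scene = rng.random((64, 64))                                  # stand-in for the true scene
frames = [scene + 0.05 * rng.normal(size=scene.shape) for _ in range(8)]

SCALE = 5                                                     # 500% enlargement
upsampled = [zoom(f, SCALE, order=1) for f in frames]         # bilinear upsampling
coadded = np.mean(upsampled, axis=0)                          # co-addition averages down the noise

reference = zoom(scene, SCALE, order=1)
print("co-added frame shape:", coadded.shape)                 # (320, 320)
print("noise std, single vs co-added:",
      float(np.std(upsampled[0] - reference)),
      float(np.std(coadded - reference)))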
Schatz, Gottfried
2014-06-01
Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.
New to Teaching: Small Changes Can Produce Big Results!
ERIC Educational Resources Information Center
Shenton, Megan
2017-01-01
In this article, Megan Shenton, a final-year trainee teacher at Nottingham Trent University, describes using "The Big Question" in her science teaching in a move away from objectives. The Big Question is an innovative pedagogical choice, where instead of implementing a learning objective, a question is posed at the start of the session…
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective
NASA Technical Reports Server (NTRS)
Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan
2007-01-01
Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.
Clinton Administration announces FY 2001 budget request
NASA Astrophysics Data System (ADS)
Showstack, Randy
Blessed with a strong U.S. economy, the Clinton Administration on February 7 released a fiscal year 2001 federal budget request totaling a whopping $1,835 billion. Most of the funding request is slated for big-ticket items including Social Security, defense spending, Medicaid, Medicare, and paying down the federal debt. However, within the 19% of the budget that funds non-defense discretionary programs, science agencies receive fairly healthy increases. The National Science Foundation (NSF) budget request would increase NSF funding by 17.3%, or $675 million, and bring the total budget request to $4.6 billion. This includes significant increases for several initiatives: biocomplexity in the environment, information technology research, nanoscale science and engineering, and the 21st century workforce. Among the major Earth science projects are the launch of the EarthScope initiative, which includes USArray and the San Andreas Fault Observatory at Depth (SAFOD), and the National Ecological Observatory Network (NEON).
From ecological records to big data: the invention of global biodiversity.
Devictor, Vincent; Bensaude-Vincent, Bernadette
2016-12-01
This paper is a critical assessment of the epistemological impact that the systematic quantification of nature, through the accumulation of big datasets, has had on the practice and orientation of ecological science. We examine the contents of big databases and argue that they are not just accumulated information; records are translated into digital data in a process that changes their meaning. In order to better understand what is at stake in the 'datafication' process, we explore the context for the emergence and quantification of biodiversity in the 1980s, along with the concept of the global environment. In tracing the origin and development of the Global Biodiversity Information Facility (GBIF), we describe big data biodiversity projects as a techno-political construction dedicated to monitoring a new object: global biodiversity. We argue that biodiversity big data became a powerful driver behind the invention of the concept of the global environment, and a way to embed ecological science in the political agenda.
The Role of Big Data in the Social Sciences
ERIC Educational Resources Information Center
Ovadia, Steven
2013-01-01
Big Data is an increasingly popular term across scholarly and popular literature but lacks a formal definition (Lohr 2012). This is beneficial in that it keeps the term flexible. For librarians, Big Data represents a few important ideas. One is the idea of balancing accessibility with privacy. Librarians tend to want information to be as open…
The Whole Shebang: How Science Produced the Big Bang Model.
ERIC Educational Resources Information Center
Ferris, Timothy
2002-01-01
Offers an account of the accumulation of evidence that has led scientists to have confidence in the big bang theory of the creation of the universe. Discusses the early work of Ptolemy, Copernicus, Kepler, Galileo, and Newton, noting the rise of astrophysics, and highlighting the birth of the big bang model (the cosmic microwave background theory…
Arzuman, Hafiza; Yusoff, Muhamad Saiful Bahri; Chit, Som Phong
2010-07-01
A cross-sectional descriptive study was conducted among Big Sib students to explore their perceptions of the educational environment at the School of Medical Sciences, Universiti Sains Malaysia (USM) and its weak areas using the Dundee Ready Educational Environment Measure (DREEM) inventory. The DREEM inventory is a validated global instrument for measuring educational environments in undergraduate medical and health professional education. The English version of the DREEM inventory was administered to all Year 2 Big Sib students (n = 67) at a regular Big Sib session. The purpose of the study as well as confidentiality and ethical issues were explained to the students before the questionnaire was administered. The response rate was 62.7% (42 out of 67 students). The overall DREEM score was 117.9/200 (SD 14.6). The DREEM indicated that the Big Sib students' perception of educational environment of the medical school was more positive than negative. Nevertheless, the study also revealed some problem areas within the educational environment. This pilot study revealed that Big Sib students perceived a positive learning environment at the School of Medical Sciences, USM. It also identified some low-scored areas that require further exploration to pinpoint the exact problems. The relatively small study population selected from a particular group of students was the major limitation of the study. This small sample size also means that the study findings cannot be generalised.
Database Resources of the BIG Data Center in 2018
Xu, Xingjian; Hao, Lili; Zhu, Junwei; Tang, Bixia; Zhou, Qing; Song, Fuhai; Chen, Tingting; Zhang, Sisi; Dong, Lili; Lan, Li; Wang, Yanqing; Sang, Jian; Hao, Lili; Liang, Fang; Cao, Jiabao; Liu, Fang; Liu, Lin; Wang, Fan; Ma, Yingke; Xu, Xingjian; Zhang, Lijuan; Chen, Meili; Tian, Dongmei; Li, Cuiping; Dong, Lili; Du, Zhenglin; Yuan, Na; Zeng, Jingyao; Zhang, Zhewen; Wang, Jinyue; Shi, Shuo; Zhang, Yadong; Pan, Mengyu; Tang, Bixia; Zou, Dong; Song, Shuhui; Sang, Jian; Xia, Lin; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Zhang, Yang; Sheng, Xin; Lu, Mingming; Wang, Qi; Xiao, Jingfa; Zou, Dong; Wang, Fan; Hao, Lili; Liang, Fang; Li, Mengwei; Sun, Shixiang; Zou, Dong; Li, Rujiao; Yu, Chunlei; Wang, Guangyu; Sang, Jian; Liu, Lin; Li, Mengwei; Li, Man; Niu, Guangyi; Cao, Jiabao; Sun, Shixiang; Xia, Lin; Yin, Hongyan; Zou, Dong; Xu, Xingjian; Ma, Lina; Chen, Huanxin; Sun, Yubin; Yu, Lei; Zhai, Shuang; Sun, Mingyuan; Zhang, Zhang; Zhao, Wenming; Xiao, Jingfa; Bao, Yiming; Song, Shuhui; Hao, Lili; Li, Rujiao; Ma, Lina; Sang, Jian; Wang, Yanqing; Tang, Bixia; Zou, Dong; Wang, Fan
2018-01-01
Abstract The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. PMID:29036542
Making a Big Bang on the small screen
NASA Astrophysics Data System (ADS)
Thomas, Nick
2010-01-01
While the quality of some TV sitcoms can leave viewers feeling cheated out of 30 minutes of their lives, audiences and critics are raving about the science-themed US comedy The Big Bang Theory. First shown on the CBS network in 2007, the series focuses on two brilliant postdoc physicists, Leonard and Sheldon, who are totally absorbed by science. Adhering to the stereotype, they also share a fanatical interest in science fiction, video-gaming and comic books, but unfortunately lack the social skills required to connect with their 20-something nonacademic contemporaries.
Big Data and Clinicians: A Review on the State of the Science
Wang, Weiqi
2014-01-01
Background In the past few decades, medically related data collection has seen a huge increase; the resulting datasets are referred to as big data. These huge datasets bring challenges in storage, processing, and analysis. In clinical medicine, big data is expected to play an important role in identifying causality of patient symptoms, in predicting hazards of disease incidence or reoccurrence, and in improving primary-care quality. Objective The objective of this review was to provide an overview of the features of clinical big data, describe a few commonly employed computational algorithms, statistical methods, and software toolkits for data manipulation and analysis, and discuss the challenges and limitations in this realm. Methods We conducted a literature review to identify studies on big data in medicine, especially clinical medicine. We used different combinations of keywords to search PubMed, Science Direct, Web of Knowledge, and Google Scholar for literature of interest from the past 10 years. Results This paper reviewed studies that analyzed clinical big data and discussed issues related to storage and analysis of this type of data. Conclusions Big data is becoming a common feature of biological and clinical studies. Researchers who use clinical big data face multiple challenges, and the data itself has limitations. It is imperative that methodologies for data analysis keep pace with our ability to collect and store data. PMID:25600256
NASA Astrophysics Data System (ADS)
Jones, A. P.; Bleacher, L.; Glotch, T. D.; Heldmann, J. L.; Bleacher, J. E.; Young, K. E.; Selvin, B.; Firstman, R.; Lim, D. S. S.; Johnson, S. S.; Kobs-Nawotniak, S. E.; Hughes, S. S.
2015-12-01
The Remote, In Situ, and Synchrotron Studies for Science and Exploration (RIS4E) and Field Investigations to Enable Solar System Science and Exploration (FINESSE) teams of NASA's Solar System Exploration Research Virtual Institute conduct research that will help us more safely and effectively explore the Moon, Near Earth Asteroids, and the moons of Mars. These teams are committed to making their scientific research accessible and to using their research as a lens through which students and teachers can better understand the process of science. In partnership with the Alan Alda Center for Communicating Science at Stony Brook University, in spring of 2015 the RIS4E team offered a semester-long course on science journalism that culminated in a 10-day reporting trip to document scientific fieldwork in action during the 2015 RIS4E field campaign on the Big Island of Hawaii. Their work is showcased on ReportingRIS4E.com. The RIS4E science journalism course is helping to prepare the next generation of science journalists to accurately represent scientific research in a way that is appealing and understandable to the public. It will be repeated in 2017. Students and teachers who participate in FINESSE Spaceward Bound, a program offered in collaboration with the Idaho Space Grant Consortium, conduct science and exploration research in Craters of the Moon National Monument and Preserve. Side-by-side with NASA researchers, they hike through lava flows, operate field instruments, participate in science discussions, and contribute to scientific publications. Teachers learn about FINESSE science in the field, and bring it back to their classrooms with support from educational activities and resources. The second season of FINESSE Spaceward Bound is underway in 2015. We will provide more information about the RIS4E and FINESSE education programs and discuss the power of integrating educational programs within scientific programs, the strength institutional partnerships can provide, and the impact participating in immersive field experiences can have on learners.
Moodie, Marjory L; Fisher, Jane
2009-01-30
The Big Brothers Big Sisters (BBBS) program matches vulnerable young people with a trained, supervised adult volunteer as mentor. The young people are typically seriously disadvantaged, with multiple psychosocial problems. Threshold analysis was undertaken to determine whether investment in the program was a worthwhile use of limited public funds. The potential cost savings were based on US estimates of the lifetime costs associated with high-risk youth who drop out of school and become adult criminals. The intervention was modelled for children aged 10-14 years residing in Melbourne in 2004. If the program served 2,208 of the most vulnerable young people, it would cost AUD 39.5 M. Assuming 50% were high-risk, the associated costs of their adult criminality would be AUD 3.3 billion. To break even, the program would need to avert high-risk behaviours in only 1.3% (14/1,104) of participants. This indicative evaluation suggests that the BBBS program represents excellent 'value for money'.
Invisible Roles of Doctoral Program Specialists
ERIC Educational Resources Information Center
Bachman, Eva Burns; Grady, Marilyn L.
2016-01-01
The purpose of this study was to investigate the roles of doctoral program specialists in Big Ten universities. Face-to-face interviews with 20 doctoral program specialists employed in institutions in the Big Ten were conducted. Participants were asked to describe their roles within their work place. The doctoral program specialists reported their…
NASA Astrophysics Data System (ADS)
Murata, K. T.
2014-12-01
Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm), and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT science cloud is designed for big data science in Earth, space, and other disciplines, based on modern informatics and information technologies [1]. Data flow on the cloud proceeds through three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented. We mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS), a data crawling tool (NICTY/DLA), and the Wide-area Observation Network Monitoring (WONM) system. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously, without moving these big data from storage to storage. Data files are managed through our Web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, a workflow tool with high I/O bandwidth. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data, and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. Finally, we introduce a couple of successful results in Earth and space science obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp
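The three-step flow described above (crawling/transfer, preservation, processing) can be illustrated with a toy sketch. The following Python snippet is purely illustrative and does not use the actual NICT tools (NICTY/DLA, Gfarm, WSDBank, Pwrake); the URL, paths, and function names are hypothetical stand-ins that only mirror the shape of the architecture.

```python
# Toy sketch of a crawl -> preserve -> process pipeline. It does not use the
# actual NICT tools (NICTY/DLA, Gfarm, Pwrake); paths and URLs are hypothetical.
import pathlib
import urllib.request

STORAGE = pathlib.Path("archive")           # stand-in for distributed storage

def crawl(url, name):
    """Step 1: transfer a remote file into the archive."""
    STORAGE.mkdir(exist_ok=True)
    dest = STORAGE / name
    urllib.request.urlretrieve(url, dest)
    return dest

def preserve(path):
    """Step 2: record a simple catalogue entry (stand-in for a data bank)."""
    with open(STORAGE / "catalogue.txt", "a") as cat:
        cat.write(f"{path.name}\t{path.stat().st_size}\n")

def process(path):
    """Step 3: a trivial 'workflow' task, e.g. a byte count per file."""
    return len(path.read_bytes())

# Example usage (hypothetical URL):
# entry = crawl("https://example.org/data/obs_20140101.dat", "obs_20140101.dat")
# preserve(entry); print(process(entry))
```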
Teaching Critical Thinking through a course on Science and Religion
NASA Astrophysics Data System (ADS)
Shipman, H. L.; Jordan, J. J.
2004-12-01
The relationship between science and religion is, according to the public debate, rather stormy. It doesn't have to be this way. Since 1998, an astronomer (Shipman) and a philosopher (Jordan) have team-taught a course with a more constructive approach. This course has a recognized role in the University's General Education program and in the philosophy major. As overall course goals, we hope that our students will be able to: - exhibit critical thinking skills in being able to tell the difference between good arguments and bad arguments in this area - recognize that the relationship between science and religion is not necessarily an antagonistic one. We accomplish these goals by focusing the course on four major issues, namely: - Does Big Bang Cosmology leave room for a Creator? - Can a rational person believe in miracle reports? - In the light of modern science, what does it mean to be human? - Can a theist, someone who believes in God, rationally accept the scientific theory of biological evolution? We have evidence in the course to evaluate student progress towards our goals. Student responses to a pre- and post-testing methodology, where they responded to the same assignment at the beginning and at the end of the course, were classified as seeing the relationship between science and religion as confrontational, distinct, convergent, or transitional between distinct and convergent. Preliminary analysis of the student responses shows a significant shift away from a confrontational position and towards a more convergent position. The development of this course was supported by the John Templeton Foundation's Science and Religion course program. H.L.S.'s scholarly work integrating science research and science education research is supported by the National Science Foundation's Distinguished Teaching Scholars Program (DUE-0306557).
The SAMI Galaxy Survey: A prototype data archive for Big Science exploration
NASA Astrophysics Data System (ADS)
Konstantopoulos, I. S.; Green, A. W.; Foster, C.; Scott, N.; Allen, J. T.; Fogarty, L. M. R.; Lorente, N. P. F.; Sweet, S. M.; Hopkins, A. M.; Bland-Hawthorn, J.; Bryant, J. J.; Croom, S. M.; Goodwin, M.; Lawrence, J. S.; Owers, M. S.; Richards, S. N.
2015-11-01
We describe the data archive and database for the SAMI Galaxy Survey, an ongoing observational program that will cover ≈3400 galaxies with integral-field (spatially resolved) spectroscopy. Amounting to some three million spectra, this is the largest sample of its kind to date. The data archive and built-in query engine use the versatile Hierarchical Data Format (HDF5), which obviates the need for external metadata tables and hence the setup and maintenance overhead those carry. The code produces simple outputs that can easily be translated to plots and tables, and the combination of these tools makes for a light system that can handle heavy data. This article acts as a contextual companion to the SAMI Survey Database source code repository, samiDB, which is freely available online and written entirely in Python. We also discuss the decisions related to the selection of tools and the creation of data visualisation modules. It is our aim that the work presented in this article (descriptions, rationale, and source code) will be of use to scientists looking to set up a maintenance-light data archive for a Big Science data load.
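As a rough illustration of the HDF5-based, metadata-table-free design described above, the following Python sketch uses h5py to store data cubes with their metadata as attributes and to query them directly. It is not the samiDB API; the group layout, dataset names, and attribute keys are hypothetical.

```python
# Illustrative sketch of an HDF5-backed archive with no external metadata
# tables (not the actual samiDB API; names and layout are hypothetical).
import h5py
import numpy as np

def ingest(archive_path, galaxy_id, cube, redshift):
    """Store a data cube plus metadata as HDF5 attributes on the dataset."""
    with h5py.File(archive_path, "a") as f:
        grp = f.require_group("galaxies").require_group(galaxy_id)
        if "cube" in grp:
            del grp["cube"]
        dset = grp.create_dataset("cube", data=cube, compression="gzip")
        dset.attrs["redshift"] = redshift

def query(archive_path, z_max):
    """Return the IDs of all galaxies below a redshift limit."""
    with h5py.File(archive_path, "r") as f:
        return [gid for gid, grp in f["galaxies"].items()
                if grp["cube"].attrs["redshift"] < z_max]

# Example usage with a tiny synthetic cube:
ingest("survey.h5", "GAL0001", np.zeros((50, 50, 2048), dtype="f4"), 0.042)
print(query("survey.h5", z_max=0.05))
```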
The Year of the Solar System: An E/PO Community's Approach to Sharing Planetary Science
NASA Astrophysics Data System (ADS)
Shipp, S. S.; Boonstra, D.; Shupla, C.; Dalton, H.; Scalice, D.; Planetary Science E/Po Community
2010-12-01
YSS offers the opportunity to raise awareness, build excitement, and make connections with educators, students and the public about planetary science activities. The planetary science education and public outreach (E/PO) community is engaging and educating their audiences through ongoing mission and program activities. Based on discussion with partners, the community is presenting its products in the context of monthly thematic topics that are tied to the big questions of planetary science: how did the Sun’s family of planets and bodies originate and how have they evolved; and how did life begin and evolve on Earth, has it evolved elsewhere in our solar system, and what are characteristics that lead to the origins of life? Each month explores different compelling aspects of the solar system - its formation, volcanism, ice, life. Resources, activities, and events are interwoven in thematic context, and presented with ideas through which formal and informal educators can engage their audiences. The month-to-month themes place the big questions in a logical sequence of deepening learning experiences - and highlight mission milestones and viewing events. YSS encourages active participation and communication with its audiences. It includes nation-wide activities, such as a Walk Through the Solar System, held between October 2010 and March 2011, in which museums, libraries, science centers, schools, planetariums, amateur astronomers, and others are kicking off YSS by creating their own scale models of the solar system and sharing their events through online posting of pictures, video, and stories. YSS offers the E/PO community the opportunity to collaborate with each other and partners. The thematic approach leverages existing products, providing a home and allowing a “shelf life” that can outlast individual projects and missions. The broad themes highlight missions and programs multiple times. YSS also leverages existing online resources and social media. Hosted on the popular and long-lived Solar System Exploration website (http://solarsystem.nasa.gov/yss), multiple points of entry lead to YSS, ensuring sustained accessibility of thematic topics. Likewise, YSS is being shared through social media avenues of existing missions and programs, reaching a large audience without investment in building a fan-base on YSS-specific social media conduits. Create and share your own YSS event with the tools and resources offered on the website. Join the celebration!
Heike Kamerlingh Onnes and the Road to Superconductivity
NASA Astrophysics Data System (ADS)
van Delft, Dirk
2011-03-01
The discovery of superconductivity on 8 April 1911 came as a big surprise. It was stumbled upon in the Leiden cryogenic laboratory of Heike Kamerlingh Onnes in a moment of serendipity. The liquefaction of helium three years before, on the other hand, had been the culmination of a long battle with nature. It was a meticulously prepared operation, ``big science'' in its first appearance. Until recently, careless notebook entries by Kamerlingh Onnes and his terrible handwriting had hindered a complete view of the road to superconductivity. Even the date of the fascinating discovery was lacking. How did the discovery fit into the Leiden research program? What about the research effort Kamerlingh Onnes had to put in to be sure he had found superconductivity rather than a short circuit? What about superfluidity? Once the right interpretation of the notebooks is clear, the real story can be told.
NASA Astrophysics Data System (ADS)
Malphrus, Benjamin Kevin
1990-01-01
The purpose of this study is to examine the sequence of events that led to the establishment of the NRAO, the construction and development of instrumentation, and the contributions and discovery events, and to relate the significance of these events to the evolution of the sciences of radio astronomy and cosmology. After an overview of the resources, a brief discussion of the early days of the science is given to set the stage for an examination of events that led to the establishment of the NRAO. The developmental and construction phases of the major instruments, including the 85-foot Tatel telescope, the 300-foot telescope, the 140-foot telescope, and the Green Bank Interferometer, are examined. The technical evolution of these instruments is traced and their relevance to scientific programs and discovery events is discussed. The history is told in a narrative format, interspersed with technical and scientific explanations. Through the use of original data, technical and scientific information of historical concern is provided to elucidate major developments and events. An interpretive discussion of selected programs, events, and technological developments that epitomize the contributions of the NRAO to the science of radio astronomy is provided. Scientific programs conducted with the NRAO instruments that were significant to galactic and extragalactic astronomy are presented. NRAO research programs presented include continuum and source surveys, mapping, a high-precision verification of general relativity, and SETI programs. Cosmic phenomena investigated in these programs include galactic and extragalactic HI and HII, emission nebulae, supernova remnants, cosmic masers, giant molecular clouds, radio stars, normal and radio galaxies, and quasars. Modern NRAO instruments, including the VLA and VLBA, and their scientific programs are presented in the final chapter, as well as plans for future NRAO instruments such as the GBT.
NASA Astrophysics Data System (ADS)
Murray, Marissa
This past summer I interned at the American Institute of Physics and helped research and write articles for the FYI Science Policy Bulletin. FYI is an objective digest of science policy developments in Washington, D.C. that impact the greater physical sciences community. Over the course of the summer, I independently attended, analyzed, and reported on a variety of science, technology, and funding related events including congressional hearings, government agency advisory committee meetings, and scientific society events. I wrote and co-wrote three articles on basic energy research legislation, the National Institute of Standards and Technology improvement act, and the National Science Foundation's big ideas for future investment. I had the opportunity to examine some challenging questions such as what is the role of government in funding applied research? How should science priorities be set? What is the right balance of funding across different agencies and programs? I learned about how science policy is a two-way street: science is used to inform policy decisions and policy is made to fund and regulate the conduct of science. I will conclude with how my summer working with FYI showed me the importance of science advocacy, being informed, and voting. Society of Physics Students.
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including “machine learning” algorithms, with both “unsupervised” and “supervised” examples. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398
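As a language-neutral companion to the R packages the review surveys, the following minimal Python sketch (using scikit-learn, which is an assumption of this illustration; the review itself discusses R) shows one "unsupervised" and one "supervised" example on synthetic data.

```python
# Minimal sketch (not from the review): one unsupervised and one supervised
# example on a synthetic expression-like matrix, using scikit-learn.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))           # 200 samples x 50 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # labels for the supervised case

# Unsupervised: group samples without using labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Supervised: learn a label from features, with accuracy estimated by
# cross-validation rather than on the training data itself.
acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(clusters[:10], acc.mean())
```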
Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data
NASA Astrophysics Data System (ADS)
Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.
2014-12-01
The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program-specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and will be constructed with open source software based on industry standards, protocols, and state-of-the-art technology.
NASA SMD Science Education and Public Outreach Forums: A Five-Year Retrospective
NASA Astrophysics Data System (ADS)
Smith, Denise A.; Peticolas, Laura; Schwerin, Theresa; Shipp, Stephanie
2014-06-01
NASA’s Science Mission Directorate (SMD) created four competitively awarded Science Education and Public Outreach Forums (Astrophysics, Heliophysics, Planetary Science, Earth Science) in 2009. The objective is to enhance the overall coherence of SMD education and public outreach (E/PO), leading to more effective, efficient, and sustainable use of SMD science discoveries and learning experiences. We summarize progress and next steps towards achieving this goal with examples drawn from Astrophysics and cross-Forum efforts. Over the past five years, the Forums have enabled leaders of individual SMD mission and grant-funded E/PO programs to work together to place individual science discoveries and learning resources into context for audiences, conveying the big picture of scientific discovery based on audience needs. Forum-organized collaborations and partnerships extend the impact of individual programs to new audiences and provide resources and opportunities for educators to engage their audiences in NASA science. Similarly, Forum resources support scientists and faculty in utilizing SMD E/PO resources. Through Forum activities, mission E/PO teams and grantees have worked together to define common goals and provide unified professional development for educators (NASA’s Multiwavelength Universe); build partnerships with libraries to engage underserved/underrepresented audiences (NASA Science4Girls and Their Families); strengthen use of best practices; provide thematic, audience-based entry points to SMD learning experiences; support scientists in participating in E/PO; and, convey the impact of the SMD E/PO program. The Forums have created a single online digital library (NASA Wavelength, http://nasawavelength.org) that hosts all peer-reviewed SMD-funded education materials and worked with the SMD E/PO community to compile E/PO program metrics (http://nasamissionepometrics.org/). External evaluation shows the Forums are meeting their objectives. Specific examples of Forum-organized resources for use by scientists, faculty, and informal educators are discussed in related presentations (Meinke et al.; Manning et al.).
samiDB: A Prototype Data Archive for Big Science Exploration
NASA Astrophysics Data System (ADS)
Konstantopoulos, I. S.; Green, A. W.; Cortese, L.; Foster, C.; Scott, N.
2015-04-01
samiDB is an archive, database, and query engine to serve the spectra, spectral hypercubes, and high-level science products that make up the SAMI Galaxy Survey. Based on the versatile Hierarchical Data Format (HDF5), samiDB does not depend on relational database structures and hence lightens the setup and maintenance load imposed on science teams by metadata tables. The code, written in Python, covers the ingestion, querying, and exporting of data as well as the automatic setup of an HTML schema browser. samiDB serves as a maintenance-light data archive for Big Science and can be adopted and adapted by science teams that lack the means to hire professional archivists to set up the data back end for their projects.
Data Mining Citizen Science Results
NASA Astrophysics Data System (ADS)
Borne, K. D.
2012-12-01
Scientific discovery from big data is enabled through multiple channels, including data mining (through the application of machine learning algorithms) and human computation (commonly implemented through citizen science tasks). We will describe the results of new data mining experiments on the results from citizen science activities. Discovering patterns, trends, and anomalies in data are among the powerful contributions of citizen science. Establishing scientific algorithms that can subsequently re-discover the same types of patterns, trends, and anomalies in automatic data processing pipelines will ultimately result from the transformation of those human algorithms into computer algorithms, which can then be applied to much larger data collections. Scientific discovery from big data is thus greatly amplified through the marriage of data mining with citizen science.
2013-10-17
The center of the Milky Way galaxy, imaged by NASA's Spitzer Space Telescope, is displayed on a quarter-of-a-billion-pixel, high-definition, 23-foot-wide (7-meter) LCD science visualization screen at NASA Ames Research Center.
pvsR: An Open Source Interface to Big Data on the American Political Sphere.
Matter, Ulrich; Stutzer, Alois
2015-01-01
Digital data from the political sphere is abundant, omnipresent, and more and more directly accessible through the Internet. Project Vote Smart (PVS) is a prominent example of this big public data and covers various aspects of U.S. politics in astonishing detail. Despite the vast potential of PVS' data for political science, economics, and sociology, it is hardly used in empirical research. The systematic compilation of semi-structured data can be complicated and time consuming as the data format is not designed for conventional scientific research. This paper presents a new tool that makes the data easily accessible to a broad scientific community. We provide the software called pvsR as an add-on to the R programming environment for statistical computing. This open source interface (OSI) serves as a direct link between a statistical analysis and the large PVS database. The free and open code is expected to substantially reduce the cost of research with PVS' new big public data in a vast variety of possible applications. We discuss its advantages vis-à-vis traditional methods of data generation as well as already existing interfaces. The validity of the library is documented based on an illustration involving female representation in local politics. In addition, pvsR facilitates the replication of research with PVS data at low costs, including the pre-processing of data. Similar OSIs are recommended for other big public databases.
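The general pattern behind such an open source interface, querying a web API that returns semi-structured records and flattening them into a table for analysis, can be sketched as follows. This Python snippet is generic and hypothetical: the endpoint, parameters, and field names are placeholders and do not represent the actual pvsR functions or the real PVS API.

```python
# Generic sketch of an open-source interface (OSI) to a web API: fetch
# semi-structured JSON and flatten it into a table. The endpoint, parameters,
# and field names below are hypothetical, not the real PVS API or pvsR.
import requests
import pandas as pd

BASE_URL = "https://api.example.org/candidates"  # hypothetical endpoint

def get_candidates(state, office, api_key):
    resp = requests.get(
        BASE_URL,
        params={"state": state, "office": office, "key": api_key},
        timeout=30,
    )
    resp.raise_for_status()
    records = resp.json()["candidates"]          # hypothetical JSON layout
    # Flatten nested records into a rectangular table ready for analysis.
    return pd.json_normalize(records)

# Example usage (requires a valid key for a real service):
# df = get_candidates("CA", "state_legislature", api_key="...")
# print(df[["name", "gender", "district"]].head())
```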
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Jeffrey S.
LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.
Big Data Knowledge in Global Health Education.
Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge
The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.
[Social change and sciences in the 20th century].
Garamvölgyi, J
1995-12-05
The symbiotic interdependence of state, economy and science is one of the most significant structural characteristics of the 20th century. This development results from inherent scientific as well as from social procedures and needs, and it has been favoured by the two World Wars, culminating in the Cold War. This led to new structures: institutions of large scale research, think tanks, and the military-industrial complex. Big government, big business, and big science are depending on each other. Parallel to the new way of thinking in physics (Einstein, Bohr and others), finally accomplished by the revolution in cybernetics (Wiener), the traditional borders between disciplines have been overcome. The production of new knowledge is now of primary importance. Today, information proves to be one of the strategic resources which determines prosperity, power and prestige as well as success in economic and political markets.
Dawn Mission Education and Public Outreach: Science as Human Endeavor
NASA Astrophysics Data System (ADS)
Cobb, W. H.; Wise, J.; Schmidt, B. E.; Ristvey, J.
2012-12-01
Dawn Education and Public Outreach strives to reach diverse learners using multi-disciplinary approaches. In-depth professional development workshops in collaboration with NASA's Discovery Program and the MESSENGER and Stardust-NExT missions, focusing on STEM initiatives that integrate the arts, have met the needs of diverse audiences and received excellent evaluations. Another collaboration, under the NASA ROSES grant Small Bodies, Big Concepts, has helped bridge the learning sequence between the upper elementary and middle school, and the middle and high school, Dawn curriculum modules. Leveraging the Small Bodies, Big Concepts model, educators experience diverse and developmentally appropriate NASA activities that tell the Dawn story, with teachers' pedagogical skills enriched by strategies drawn from NSTA's Designing Effective Science Instruction. Dawn mission members enrich workshops by offering science presentations to highlight events and emerging data. Teachers' awareness of the process of learning new content is heightened, and they use that experience to deepen their science teaching practice. Activities are sequenced to enhance conceptual understanding of big ideas in space science, of Vesta and Ceres, and of the Dawn mission's place within that body of knowledge. Other media add depth to Dawn's resources for reaching students. Instrument and ion engine interactives developed with the respective science team leads help audiences engage with the mission payload and the data each instrument collects. The Dawn Dictionary, an offering in both audio and written formats, makes key vocabulary accessible to a broader range of students and the interested public. Further, as Dawn E/PO has invited the public to learn about mission objectives as the mission explored asteroid Vesta, new inroads into public presentations such as the Dawn MissionCast tell the story of this extraordinary mission. Asteroid Mapper is the latest, exciting citizen science endeavor designed to invite the general public into the thrill of NASA science. Helping teachers develop a picture of the history and evolution of our understanding of the solar system, and honing in on the place of asteroids in helping us answer old questions and discover new ones, students and the general public see the power and excitement underlying planetary science as human endeavor. Research indicates that science inquiry is powerful in the classroom, and mission scientists are real-life models of science inquiry in action. Cross-curricular elements include examining research-based strategies for enhancing English language learners' ability to engage in higher order questions and a professional astronomy artist's insight into how visual analysis requires not just our eyes engaged, but our brains: comparing, synthesizing, questioning, evaluating, and wondering. Dawn Education and Public Outreach will share perspectives and lessons learned, backed by extensive evaluation examining the efficacy of the mission's efforts.
Biosecurity in the age of Big Data: a conversation with the FBI.
You, Edward; Kozminski, Keith G
2015-11-05
New scientific frontiers and emerging technologies within the life sciences pose many global challenges to society. Big Data is a premier example, especially with respect to individual, national, and international security. Here a Special Agent of the Federal Bureau of Investigation discusses the security implications of Big Data and the need for security in the life sciences. © 2015 Kozminski. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data-driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled ‘Next Generation Workload Management and Analysis System for Big Data’ (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
Big Outcrops and Big Ideas in Earth Science K-8 Professional Development
NASA Astrophysics Data System (ADS)
Baldwin, K. A.; Cooper, C. M.; Cavagnetto, A.; Morrison, J.; Adesope, O.
2014-12-01
Washington State has recently adopted the Next Generation Science Standards (NGSS), and state leaders are now working toward supporting teachers' implementation of the new standards and the pedagogical practices that support them. This poster describes one such professional development (PD) effort. The Enhancing Understanding of Concepts and Processes of Science (EUCAPS) project serves 31 K-8 in-service teachers in two southeast Washington school districts. In year two of this three-year PD project, in-service teachers explored the Earth sciences and pedagogical approaches such as the Science Writing Heuristic, concept mapping, and activities that emphasize the epistemic nature of science. The goals of the EUCAPS PD project are to increase in-service teachers' understanding of big ideas in science and to support them as they transition to the NGSS. Teachers used concept maps to document their knowledge of Earth science processes before and after visiting a local field site in Lewiston, Idaho. In the context of immersive inquiries, teachers collected field-based evidence to support their claims about the geological history of the field site. Teachers presented their claims and evidence to their peers in the form of a story about the local geologic history. This poster will present an overview of the PD as well as provide examples of teachers' work and alignment with the NGSS.
Findability of Federal Research Data
NASA Astrophysics Data System (ADS)
Hourcle, J. A.
2013-12-01
Although many federal agencies have been providing access to scientific research data for years if not decades, the findability of the data has been quite lacking. Many discipline-wide efforts have been made in the big science communities, such as PDS for planetary science and the VOs in nighttime astronomy and heliophysics, but there is no single entry point for someone looking for data. The science.gov website contains links to many of these big-science search systems, but doesn't differentiate between links to science-quality data and websites or browse products, making it more difficult to search specifically for data. The data.gov website is a useful repository where PIs of small science projects can stash their data, particularly as it allows interested parties to interact with tabular data. Unfortunately, as each group thinks of their data differently, much of what's now in the system is a mess: collections of data are tracked as individual records with no relationships between them. Big science projects also get tracked as single records, potentially with only a single record for missions with multiple instruments and significantly different data series. We present recommendations on how to improve the findability of federal research data on data.gov, based on years of working on the Virtual Solar Observatory and within the science informatics community.
NASA Astrophysics Data System (ADS)
Waller, J. L.; Brey, J. A.
2014-12-01
"small problems, Big Trouble" (spBT) is an exhibition of artist Judith Waller's paintings accompanied by text panels written by Earth scientist Dr. James A. Brey and several science researchers and educators. The text panels' message is as much the focus of the show as the art--true interdisciplinarity! Waller and Brey's history of art and earth science collaborations include the successful exhibition "Layers: Places in Peril". New in spBT is extended collaboration with other scientists in order to create awareness of geoscience and other subjects (i.e. soil, parasites, dust, pollutants, invasive species, carbon, ground water contaminants, solar wind) small in scale which pose significant threats. The paintings are the size of a mirror, a symbol suggesting the problems depicted are those we increasingly need to face, noting our collective reflections of shared current and future reality. Naturalistic rendering and abstract form in the art helps reach a broad audience including those familiar with art and those familiar with science. The goal is that gallery visitors gain greater appreciation and understanding of both—and of the sober content of the show as a whole. "small problems, Big Trouble" premiers in Wisconsin April, 2015. As in previous collaborations, Waller and Brey actively utilize art and science (specifically geoscience) as an educational vehicle for active student learning. Planned are interdisciplinary university and area high school activities linked through spBT. The exhibition in a public gallery offers a means to enhance community awareness of and action on scientific issues through art's power to engage people on an emotional level. This AGU presentation includes a description of past Waller and Brey activities: incorporating art and earth science in lab and studio classrooms, producing gallery and museum exhibitions and delivering workshops and other presentations. They also describe how walking the paths of several past earth science disasters continues to inspire new chapters in their "Layers: Places in Peril" exhibit! A slide show includes images of paintings for "small problems, Big Trouble". Brey and Waller will lead a discussion on their process of incorporating broader collaboration with geoscientists and others in an educational art exhibition.
Kang, Stella K; Rawson, James V; Recht, Michael P
2017-12-05
Provided methodologic training, more imagers can contribute to the evidence basis on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Moebus, Susanne; Kuhn, Joseph; Hoffmann, Wolfgang
2017-11-01
Big Data is a diffuse term, which can be described as an approach to linking gigantic and often unstructured data sets. Big Data is used in many corporate areas. For Public Health (PH), however, Big Data is not a well-developed topic. In this article, Big Data is explained according to the intention of use, information efficiency, prediction and clustering. Using the example of application in science, patient care, equal opportunities and smart cities, typical challenges and open questions of Big Data for PH are outlined. In addition to the inevitable use of Big Data, networking is necessary, especially with knowledge-carriers and decision-makers from politics and health care practice. © Georg Thieme Verlag KG Stuttgart · New York.
NASA Astrophysics Data System (ADS)
2004-01-01
BOOK REVIEWS (99): Complete A-Z Physics Handbook; Science Magic in the Kitchen; The Science of Cooking; Science Experiments You Can Eat. WEB WATCH (101): These journal themes are pasta joke; Microwave oven Web links. CD REVIEW (104): Electricity and Magnetism, KS3; Big Science Comics.
Peek, N; Holmes, J H; Sun, J
2014-08-15
To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.
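The false-positive risk in high-dimensional datasets noted above can be illustrated with a short simulation: screening thousands of predictors that are unrelated to the outcome, without any multiple-testing correction, still yields many nominally "significant" associations by chance. This is a minimal sketch with invented sample sizes, not an analysis from the paper.

```python
# Minimal simulation (hypothetical numbers): with 5000 noise predictors and a
# noise outcome, roughly 5% of correlations fall below p = 0.05 by chance alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_patients, n_features = 200, 5000
X = rng.normal(size=(n_patients, n_features))   # predictors, unrelated to outcome
y = rng.normal(size=n_patients)                 # outcome is pure noise

p_values = np.array([stats.pearsonr(X[:, j], y)[1] for j in range(n_features)])
print("nominally significant at p < 0.05:", int((p_values < 0.05).sum()))  # ~250 expected
```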
Neoliberal science, Chinese style: Making and managing the 'obesity epidemic'.
Greenhalgh, Susan
2016-08-01
Science and Technology Studies has seen a growing interest in the commercialization of science. In this article, I track the role of corporations in the construction of the obesity epidemic, deemed one of the major public health threats of the century. Focusing on China, a rising superpower in the midst of rampant, state-directed neoliberalization, I unravel the process, mechanisms, and broad effects of the corporate invention of an obesity epidemic. Largely hidden from view, Western firms were central actors at every stage in the creation, definition, and governmental management of obesity as a Chinese disease. Two industry-funded global health entities and the exploitation of personal ties enabled actors to nudge the development of obesity science and policy along lines beneficial to large firms, while obscuring the nudging. From Big Pharma to Big Food and Big Soda, transnational companies have been profiting from the 'epidemic of Chinese obesity', while doing little to effectively treat or prevent it. The China case suggests how obesity might have been constituted an 'epidemic threat' in other parts of the world and underscores the need for global frameworks to guide the study of neoliberal science and policymaking.
Big Ideas in Volcanology-a new way to teach and think about the subject?
NASA Astrophysics Data System (ADS)
Rose, W. I.
2011-12-01
As intense work on identifying and presenting earth science to middle school science teachers in the MiTEP project advances, I have realized that the tools used to connect with teachers and students of earth science in general, and especially to promote higher levels of learning, should be advantageous in graduate teaching as well. In my last of 40 years of teaching graduate volcanology, I have finally organized the class around ideas based on Earth Science Literacy Principles and on common misconceptions. As such, I propose and fully explore the twelve "big ideas" of volcanology at the rate of one per week. This curricular organization highlights the ideas in volcanology that have major impact beyond volcanology itself and explores the roots and global ramifications of these ideas. Together they show how volcanology interfaces with the science world and the "real" world, or how volcanologists interface with "real" people. In addition to big ideas, we explore difficult and misunderstood concepts and the public misconceptions associated with each. The new organization and its focus on understanding relevant and far-reaching concepts and hypotheses provides a refreshing context for advanced learning. It is planned to be the basis for an interactive website.
Unraveling the Complexities of Life Sciences Data.
Higdon, Roger; Haynes, Winston; Stanberry, Larissa; Stewart, Elizabeth; Yandl, Gregory; Howard, Chris; Broomall, William; Kolker, Natali; Kolker, Eugene
2013-03-01
The life sciences have entered into the realm of big data and data-enabled science, where data can either empower or overwhelm. These data bring the challenges of the 5 Vs of big data: volume, veracity, velocity, variety, and value. Both independently and through our involvement with DELSA Global (Data-Enabled Life Sciences Alliance, DELSAglobal.org), the Kolker Lab (kolkerlab.org) is creating partnerships that identify data challenges and solve community needs. We specialize in solutions to complex biological data challenges, as exemplified by the community resource of MOPED (Model Organism Protein Expression Database, MOPED.proteinspire.org) and the analysis pipeline of SPIRE (Systematic Protein Investigative Research Environment, PROTEINSPIRE.org). Our collaborative work extends into the computationally intensive tasks of analysis and visualization of millions of protein sequences through innovative implementations of sequence alignment algorithms and creation of the Protein Sequence Universe tool (PSU). Pushing into the future together with our collaborators, our lab is pursuing integration of multi-omics data and exploration of biological pathways, as well as assigning function to proteins and porting solutions to the cloud. Big data have come to the life sciences; discovering the knowledge in the data will bring breakthroughs and benefits.
Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science
NASA Astrophysics Data System (ADS)
Baru, C.
2014-12-01
Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the applications domain. However, an intriguing aspect of this phenomenon is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.
NASA Astrophysics Data System (ADS)
McInerney, M.; Schnase, J. L.; Duffy, D.; Tamkin, G.; Nadeau, D.; Strong, S.; Thompson, J. H.; Sinno, S.; Lazar, D.
2014-12-01
The climate sciences represent a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with big data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. Within this framework, cloud computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics-as-a-service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the big data challenges in this domain. This poster will highlight specific examples of CAaaS using climate reanalysis data, high-performance cloud computing, MapReduce, and the Climate Data Services API.
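To make the MapReduce pattern mentioned above concrete, here is a minimal sketch in Python of a map-and-reduce style analytic over reanalysis-like records, computing a mean temperature anomaly per year. The record layout, values, and function names are invented for illustration; this is not the Climate Data Services API.

```python
# Hypothetical records: (year, grid cell id, temperature anomaly in kelvin).
from collections import defaultdict

records = [
    (1980, "cell_001", 0.12),
    (1980, "cell_002", -0.05),
    (1981, "cell_001", 0.20),
    (1981, "cell_002", 0.10),
]

def map_phase(record):
    """Map step: emit a (year, anomaly) pair for each record."""
    year, _cell, anomaly = record
    return (year, anomaly)

def reduce_phase(pairs):
    """Reduce step: aggregate anomalies per year into a mean."""
    sums, counts = defaultdict(float), defaultdict(int)
    for year, anomaly in pairs:
        sums[year] += anomaly
        counts[year] += 1
    return {year: sums[year] / counts[year] for year in sums}

print(reduce_phase(map(map_phase, records)))  # mean anomaly per year
```

In a real CAaaS deployment the map and reduce steps would run in parallel on cloud nodes close to the reanalysis archive rather than in a single process.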
ERIC Educational Resources Information Center
Beigel, Allan
1991-01-01
Lessons learned by the University of Arizona through participation in two major scientific projects, construction of an astronomical observatory and a super cyclotron, are discussed. Four criteria for institutional participation in such projects are outlined, including consistency with institutional mission, adequate resources, leadership, and…
Measuring adolescent science motivation
NASA Astrophysics Data System (ADS)
Schumm, Maximiliane F.; Bogner, Franz X.
2016-02-01
To monitor science motivation, 232 tenth graders of the college preparatory level ('Gymnasium') completed the Science Motivation Questionnaire II (SMQ-II). Additionally, personality data were collected using a 10-item version of the Big Five Inventory. A subsequent exploratory factor analysis based on the eigenvalue-greater-than-one criterion extracted a loading pattern which, in principle, followed the SMQ-II frame. Two items were dropped due to inappropriate loadings. The remaining SMQ-II seems to provide a consistent scale matching the findings in the literature. Nevertheless, possible shortcomings of the scale are also discussed. Data showed a higher perceived self-determination in girls, which seems to be compensated by their lower self-efficacy beliefs, leading to equality of females and males in overall science motivation scores. Additionally, the Big Five personality traits and science motivation components show little relationship.
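The eigenvalue-greater-than-one (Kaiser) criterion used above can be illustrated in a few lines of Python. The response matrix here is random placeholder data, not the SMQ-II survey responses; only the counting rule is the point of the sketch.

```python
# Count the factors to retain: eigenvalues of the item correlation matrix > 1.
import numpy as np

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(232, 25)).astype(float)  # 232 students x 25 Likert items (placeholder)

corr = np.corrcoef(responses, rowvar=False)             # item-by-item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]   # sorted, largest first
n_factors = int((eigenvalues > 1.0).sum())              # Kaiser criterion
print(n_factors, eigenvalues[:5])
```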
Enhancing Diversity in Biomedical Data Science
Canner, Judith E.; McEligot, Archana J.; Pérez, María-Eglée; Qian, Lei; Zhang, Xinzhi
2017-01-01
The gap in educational attainment separating underrepresented minorities from Whites and Asians remains wide. Such a gap has significant impact on workforce diversity and inclusion among cross-cutting Biomedical Data Science (BDS) research, which presents great opportunities as well as major challenges for addressing health disparities. This article provides a brief description of the newly established National Institutes of Health Big Data to Knowledge (BD2K) diversity initiatives at four universities: California State University, Monterey Bay; Fisk University; University of Puerto Rico, Río Piedras Campus; and California State University, Fullerton. We emphasize three main barriers to BDS careers (i.e., preparation, exposure, and access to resources) experienced among those pioneer programs and recommendations for possible solutions (i.e., early and proactive mentoring, enriched research experience, and data science curriculum development). The diversity disparities in BDS demonstrate the need for educators, researchers, and funding agencies to support evidence-based practices that will lead to the diversification of the BDS workforce. PMID:28439180
Mertz, Leslie
2012-01-01
When the Defense Advanced Research Projects Agency (DARPA) asks research questions, it goes big. This is, after all, the same agency that put together teams of scientists and engineers to find a way to connect the world's computers and, in doing so, developed the precursor to the Internet. DARPA, the experimental research wing of the U.S. Department of Defense, funds the types of research queries that scientists and engineers dream of tackling. Unlike a traditional granting agency that conservatively metes out its funding, and only to projects with a good chance of success, DARPA puts its money on massive, multi-institutional projects that have no guarantees but have enormous potential. In the 1990s, DARPA began its biological and medical science research to improve the safety, health, and well-being of military personnel, according to DARPA program manager and Army Colonel Geoffrey Ling, Ph.D., M.D. More recently, DARPA has entered the realm of neuroscience and neurotechnology. Its focus with these projects is on its prime customer, the U.S. Department of Defense, but Ling acknowledged that technologies developed in its programs "certainly have potential to cascade into civilian uses."
What Difference Does Quantity Make? On the Epistemology of Big Data in Biology
Leonelli, Sabina
2015-01-01
Is big data science a whole new way of doing research? And what difference does data quantity make to knowledge production strategies and their outputs? I argue that the novelty of big data science does not lie in the sheer quantity of data involved, but rather in (1) the prominence and status acquired by data as commodity and recognised output, both within and outside of the scientific community; and (2) the methods, infrastructures, technologies, skills and knowledge developed to handle data. These developments generate the impression that data-intensive research is a new mode of doing science, with its own epistemology and norms. To assess this claim, one needs to consider the ways in which data are actually disseminated and used to generate knowledge. Accordingly, this paper reviews the development of sophisticated ways to disseminate, integrate and re-use data acquired on model organisms over the last three decades of work in experimental biology. I focus on online databases as prominent infrastructures set up to organise and interpret such data; and examine the wealth and diversity of expertise, resources and conceptual scaffolding that such databases draw upon. This illuminates some of the conditions under which big data need to be curated to support processes of discovery across biological subfields, which in turn highlights the difficulties caused by the lack of adequate curation for the vast majority of data in the life sciences. In closing, I reflect on the difference that data quantity is making to contemporary biology, the methodological and epistemic challenges of identifying and analyzing data given these developments, and the opportunities and worries associated to big data discourse and methods. PMID:25729586
Big data in psychology: Introduction to the special issue.
Harlow, Lisa L; Oswald, Frederick L
2016-12-01
The introduction to this special issue on psychological research involving big data summarizes the highlights of 10 articles that address a number of important and inspiring perspectives, issues, and applications. Four common themes that emerge in the articles with respect to psychological research conducted in the area of big data are mentioned, including: (a) The benefits of collaboration across disciplines, such as those in the social sciences, applied statistics, and computer science. Doing so assists in grounding big data research in sound theory and practice, as well as in affording effective data retrieval and analysis. (b) Availability of large data sets on Facebook, Twitter, and other social media sites that provide a psychological window into the attitudes and behaviors of a broad spectrum of the population. (c) Identifying, addressing, and being sensitive to ethical considerations when analyzing large data sets gained from public or private sources. (d) The unavoidable necessity of validating predictive models in big data by applying a model developed on 1 dataset to a separate set of data or hold-out sample. Translational abstracts that summarize the articles in very clear and understandable terms are included in Appendix A, and a glossary of terms relevant to big data research discussed in the articles is presented in Appendix B. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
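Point (d) above, validating a predictive model on a hold-out sample, can be sketched in a few lines. The data and model below are invented for illustration (simulated predictors and a logistic regression), not an analysis from any of the special-issue articles.

```python
# Fit on a training split, evaluate on a held-out split.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))                                    # 1000 cases, 10 predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_train, X_hold, y_train, y_hold = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_hold, model.predict(X_hold)))
```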
DEVELOPING THE TRANSDISCIPLINARY AGING RESEARCH AGENDA: NEW DEVELOPMENTS IN BIG DATA.
Callaghan, Christian William
2017-07-19
In light of dramatic advances in big data analytics and the application of these advances in certain scientific fields, new potentialities exist for breakthroughs in aging research. Translating these new potentialities to research outcomes for aging populations, however, remains a challenge, as the underlying technologies that have enabled exponential increases in 'big data' have not yet enabled a commensurate era of 'big knowledge', or similarly exponential increases in biomedical breakthroughs. Debates also reveal differences in the literature, with some arguing that big data analytics heralds a new era associated with the 'end of theory', in which the scientific method becomes obsolete, correlation supersedes causation, and science can advance without theory and hypothesis testing. On the other hand, others argue theory cannot be subordinate to data, no matter how comprehensive data coverage can ultimately become. Given these two tensions, namely between exponential increases in data absent exponential increases in biomedical research outputs, and between the promise of comprehensive data coverage and data-driven inductive versus theory-driven deductive modes of enquiry, this paper seeks to provide a critical review of theory and literature that offers useful perspectives on developments in big data analytics and their theoretical implications for aging research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Rethinking big data: A review on the data quality and usage issues
NASA Astrophysics Data System (ADS)
Liu, Jianzheng; Li, Jie; Li, Weifeng; Wu, Jiansheng
2016-05-01
The recent explosive publications of big data studies have well documented the rise of big data and its ongoing prevalence. Different types of 'big data' have emerged and have greatly enriched spatial information sciences and related fields in terms of breadth and granularity. Studies that were difficult to conduct in the past due to data availability can now be carried out. However, big data brings lots of 'big errors' in data quality and data usage, and it cannot be used as a substitute for sound research design and solid theories. We indicate and summarize the problems faced by current big data studies with regard to data collection, processing and analysis: inauthentic data collection, information incompleteness and noise of big data, unrepresentativeness, consistency and reliability, and ethical issues. Cases of empirical studies are provided as evidence for each problem. We propose that big data research should closely follow good scientific practice to provide reliable and scientific 'stories', as well as explore and develop techniques and methods to mitigate or rectify those 'big errors' brought by big data.
NASA Astrophysics Data System (ADS)
2014-05-01
WE RECOMMEND: Level 3 Extended Project Student Guide (a non-specialist, generally useful and nicely put together guide to project work); ASE Guide to Research in Science Education (few words wasted in this handy introduction and reference); The Science of Starlight (slow but steady DVD covers useful ground); SPARKvue (impressive software now available as an app). WORTH A LOOK: My Inventions and Other Writings (science, engineering, autobiography, visions and psychic phenomena mixed in a strange but revealing concoction); The Geek Manifesto: Why Science Matters (more enthusiasm than science, but a good motivator and interesting); A Big Ball of Fire: Your questions about the Sun answered (free iTunes download made by and for students goes down well). APPS: Collider visualises LHC experiments ... Science Museum app enhances school trips ... useful information for the Cambridge Science Festival.
A corpus and a concordancer of academic journal articles.
Kwary, Deny A
2018-02-01
This data article presents a corpus (i.e., a large collection of words in electronic form) and a concordancer (i.e., a tool to show a word in its context of use) of academic journal articles. As the title suggests, the data were collected from research articles published in academic journals. The corpus contains 5,686,428 words selected from 895 journal articles published by Elsevier in 2011-2015. The corpus is classified into four subject areas: Health Sciences, Life Sciences, Physical Sciences, and Social Sciences, following the classifications of Scopus, which is the largest abstract and citation database of peer-reviewed scientific journals, books and conference proceedings. To ease access to and utilization of the corpus, a program to produce the key word in context (KWIC) and word frequency was created and placed on the website: corpus.kwary.net. The corpus is a valuable resource for researchers, teachers, and translators working on academic English.
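To illustrate what a concordancer does, the sketch below generates simple key-word-in-context (KWIC) lines in Python. The sample text and window size are placeholders; this is not the code behind corpus.kwary.net.

```python
# Show every occurrence of a keyword with a few words of left and right context.
import re

def kwic(text, keyword, window=4):
    """Yield (left context, keyword, right context) for each occurrence."""
    tokens = re.findall(r"\w+", text.lower())
    for i, tok in enumerate(tokens):
        if tok == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            yield left, tok, right

sample = ("The corpus contains words selected from journal articles. "
          "The corpus is classified into four subject areas.")
for left, kw, right in kwic(sample, "corpus"):
    print(f"{left:>30} [{kw}] {right}")
```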
NASA Astrophysics Data System (ADS)
Favors, J.
2016-12-01
NASA's Earth Science Division (ESD) seeks to develop a scientific understanding of the Earth as a dynamic, integrated system of diverse components that interact in complex ways - analogous to the human body. The Division approaches this goal through a coordinated series of satellite and airborne missions, sponsored basic and applied research, technology development, and science education. Integral to this approach are strong collaborations and partnerships with a spectrum of organizations that produce substantive benefit to communities - both locally and globally. This presentation will showcase various ways ESD approaches partnering and will highlight best practices, challenges, and provide case studies related to rapid partnerships, co-location of scientists and end-user communities, capacity building, and ESD's new Partnerships Program, which is built around taking an innovative approach to partnering that fosters interdisciplinary teaming and co-production of knowledge to broaden the applicability of Earth observations and answer new, big questions for partners and NASA alike.
NASA Astrophysics Data System (ADS)
Gentry, Robert
2015-04-01
Big bang theory holds its central expansion redshift assumption quickly reduced the theorized radiation flash to ~10^10 K, and then over 13.8 billion years reduced it further to the present 2.73 K CMR. Weinberg claims this 2.73 K value agrees with big bang theory so well that "...we can be sure that this radiation was indeed left over from a time about a million years after the 'big bang.'" (TF3M, p180, 1993 ed.) Actually his conclusion is all based on big bang's in-flight wavelength expansion being a valid physical process. In fact all his surmising is nothing but science fiction because our disproof of GR-induced in-flight wavelength expansion [1] definitely proves the 2.73 K CMR could never have been the wavelength-expanded relic of any radiation, much less the presumed big bang's. This disproof of big bang's premier prediction is a death blow to the big bang as it is also to the idea that the redshifts in Hubble's redshift relation are expansion shifts; this negates Friedmann's everywhere-the-same, no-center universe concept and proves it does have a nearby Center, a place which can be identified in Psalm 103:19 and in Revelation 20:11 as the location of God's eternal throne. Widely published (Science, Nature, ARNS) evidence of Earth's fiat creation will also be presented. The research is supported by the God of Creation. This paper [1] is in for publication.
The New Improved Big6 Workshop Handbook. Professional Growth Series.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
This handbook is intended to help classroom teachers, teacher-librarians, technology teachers, administrators, parents, community members, and students to learn about the Big6 Skills approach to information and technology skills, to use the Big6 process in their own activities, and to implement a Big6 information and technology skills program. The…
NASA Astrophysics Data System (ADS)
Halversen, C.; McDonnell, J. D.; Apple, J. K.; Weiss, E. L.
2016-02-01
Two university courses, 1) Promoting Climate Literacy and 2) Climate and Data Literacy, developed by the University of California Berkeley provide faculty across the country with course materials to help their students delve into the science underlying global environmental change. The courses include culturally responsive content, such as indigenous and place-based knowledge, and examine how people learn and consequently, how we should teach and communicate science. Promoting Climate Literacy was developed working with Scripps Institution of Oceanography, University of Washington, and Western Washington University. Climate and Data Literacy was developed with Rutgers University and Padilla Bay National Estuarine Research Reserve, WA. The Climate and Data Literacy course also focuses on helping students in science majors participating in U-Teach programs and students in pre-service teacher education programs gain skills in using real and near-real time data through engaging in investigations using web-based and locally-relevant data resources. The course helps these students understand and apply the scientific practices, disciplinary concepts and big ideas described in the Framework for K-12 Science Education and the Next Generation Science Standards (NGSS). This course focuses on students interested in teaching middle school science for three reasons: (1) teachers often have relatively weak understandings of the practices of science, and of complex Earth systems science and climate change; (2) the concepts that underlie climate change align well with the NGSS; and (3) middle school is a critical time for promoting student interest in science and for recruitment to STEM careers and lifelong climate literacy. This course is now being field tested in a number of U-Teach programs including Florida State University, Louisiana State University, as well as pre-service teacher education programs at California State University East Bay, and Western Washington University. The Promoting Climate Literacy course is focused on graduate and undergraduate science students interested in learning how to more effectively communicate climate science, while participating in outreach opportunities with the public. The course has been disseminated through a workshop for faculty at 17 universities.
Valenta, Annette L; Meagher, Emma A; Tachinardi, Umberto
2016-01-01
Since the inception of the Clinical and Translational Science Award (CTSA) program in 2006, leaders in education across CTSA sites have been developing and updating core competencies for Clinical and Translational Science (CTS) trainees. By 2009, 14 competency domains, including biomedical informatics, had been identified and published. Since that time, the evolution of the CTSA program, changes in the practice of CTS, the rapid adoption of electronic health records (EHRs), the growth of biomedical informatics, the explosion of big data, and the realization that some of the competencies had proven to be difficult to apply in practice have made it clear that the competencies should be updated. This paper describes the process undertaken and puts forth a new set of competencies that has been recently endorsed by the Clinical Research Informatics Workgroup of AMIA. In addition to providing context and background for the current version of the competencies, we hope this will serve as a model for revision of competencies over time. PMID:27121608
NASA Astrophysics Data System (ADS)
Hertz, P.
2003-03-01
The Structure and Evolution of the Universe (SEU) theme within NASA's Office of Space Science seeks to explore and understand the dynamic transformations of energy in the Universe - the entire web of biological and physical interactions that determine the evolution of our cosmic habitat. This search for understanding will enrich the human spirit and inspire a new generation of explorers, scientists, and engineers. To that end, NASA's strategic planning process has generated a new Roadmap to enable those goals. Called "Beyond Einstein", this Roadmap identifies three science objectives for the SEU theme: (1) Find out what powered the Big Bang; (2) Observe how black holes manipulate space, time, and matter; and (3) Identify the mysterious dark energy pulling the Universe apart. These objectives can be realized through a combination of large observatories (Constellation-X, LISA), moderate-sized, PI-led missions (the Einstein Probes), and a continuing program of technology development, research and analysis, and education/public outreach. In this presentation, NASA's proposed Beyond Einstein Program will be described. The full Roadmap is available at http://universe.nasa.gov/.
Eclipse 2017: Through the Eyes of NASA
NASA Astrophysics Data System (ADS)
Mayo, Louis; NASA Heliophysics Education Consortium
2017-10-01
The August 21, 2017 total solar eclipse across America was, by all accounts, the biggest science education program ever carried out by NASA, significantly larger than the Curiosity Mars landing and the New Horizons Pluto flyby. Initial accounting estimates that over two billion people were reached, with website hits exceeding five billion. The NASA Science Mission Directorate spent over two years planning and developing this enormous public education program, establishing over 30 official NASA sites along the path of totality, providing imagery from 11 NASA space assets, two high-altitude aircraft, and over 50 high-altitude balloons. In addition, a special four focal plane ground-based solar telescope was developed in partnership with Lunt Solar Systems that observed and processed the eclipse in 6K resolution. NASA EDGE and NASA TV broadcasts during the entirety of totality across the country reached hundreds of millions worldwide. This talk will discuss NASA's strategy, results, and lessons learned, and preview some of the big events we plan to feature in the near future.
Small satellites (MSTI-3) for remote sensing: pushing the limits of sensor and bus technology
NASA Astrophysics Data System (ADS)
Jeffrey, William; Fraser, James C.; Gobel, Richard W.; Matlock, Richard S.; Schneider, Garret L.
1995-01-01
The miniature sensor technology integration (MSTI) program sponsored by the United States Department of Defense (DoD) exploits advances in sensor and small satellite bus technology for theater and national missile defense. MSTI-1 and MSTI-2 were used to demonstrate the capability of the common bus and to build up the integration and management infrastructure to allow for 'faster, better, cheaper' missions. MSTI-3 is the newest of the MSTI series and the first to fully exploit the developed infrastructure. Given the foundation laid down by MSTI-1 and MSTI-2, MSTI-3's mission is totally science-driven and demonstrates the quality of science possible from a small satellite in low earth orbit. The MSTI-3 satellite will achieve bus and payload performance historically attributable only to much larger satellites -- while maintaining the cost and schedule advantages inherent in small systems. The MSTI program illustrates the paradigm shift that is beginning to occur and has the mantra: 'faster, better, cheaper.' The disciples of smallsat technology have adopted this mantra as a goal -- whereas the MSTI program is demonstrating its reality. The new paradigm illustrated by MSTI-3 bases its foundation on a development philosophy coined the 'Three Golden Truths of Small Satellites.' First, bus and payload performance do not need to be sacrificed by a smallsat. Second, big science can be done with a smallsat. And third, a quick timeline minimizes budget exposure and increases the likelihood of a hardware program as opposed to a paper study. These themes are elaborated using MSTI-3 as an example of the tremendous potential small satellites have for making space science more affordable and accessible to a large science community.
Opportunities and challenges of big data for the social sciences: The case of genomic data.
Liu, Hexuan; Guo, Guang
2016-09-01
In this paper, we draw attention to one unique and valuable source of big data, genomic data, by demonstrating the opportunities they provide to social scientists. We discuss different types of large-scale genomic data and recent advances in statistical methods and computational infrastructure used to address challenges in managing and analyzing such data. We highlight how these data and methods can be used to benefit social science research. Copyright © 2016 Elsevier Inc. All rights reserved.
Mennes, Maarten
2016-03-01
'Big Data' and 'Population Imaging' are becoming integral parts of inspiring research aimed at delineating the biological underpinnings of psychiatric disorders. The scientific strategies currently associated with big data and population imaging are typically embedded in so-called discovery science, thereby pointing to the hypothesis-generating rather than hypothesis-testing nature of discovery science. In this issue, Yihong Zhao and F. Xavier Castellanos provide a compelling overview of strategies for discovery science aimed at progressing our understanding of neuropsychiatric disorders. In particular, they focus on efforts in genetic and neuroimaging research, which, together with extended behavioural testing, form the main pillars of psychopathology research. © 2016 Association for Child and Adolescent Mental Health.
ERIC Educational Resources Information Center
Trammel, Ming
This research compendium summarizes and reviews the evaluations of 13 out-of-school time programs with positive outcomes for young people. The programs are (1) 4-H; (2) 21st Century Community Learning Centers; (3) The After-School Corporation; (4) Beacons; (5) BELL After School Instructional Curriculum; (6) Big Brothers Big Sisters of America; (7)…
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions are multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
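The Map and Reduce tasks described above can be sketched with Python's functional primitives on a toy clinical dataset, here counting diagnosis codes across patient records. The field names and codes are invented; a real Hadoop job would distribute these same two steps across data blocks stored in HDFS.

```python
# Map emits (code, 1) pairs; Reduce folds them into totals per code.
from functools import reduce

records = [
    {"patient": 1, "codes": ["I10", "E11"]},
    {"patient": 2, "codes": ["E11"]},
    {"patient": 3, "codes": ["I10", "J45"]},
]

def mapper(record):
    """Map step: emit (code, 1) for every diagnosis code in a record."""
    return [(code, 1) for code in record["codes"]]

def reducer(totals, pair):
    """Reduce step: fold (code, count) pairs into per-code totals."""
    code, count = pair
    totals[code] = totals.get(code, 0) + count
    return totals

emitted = [pair for record in records for pair in mapper(record)]
print(reduce(reducer, emitted, {}))  # {'I10': 2, 'E11': 2, 'J45': 1}
```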
Picture of the Week: Making the (reactive) case for explosives science
Streaming Swarm of Nano Space Probes for Modern Analytical Methods Applied to Planetary Science
NASA Astrophysics Data System (ADS)
Vizi, P. G.; Horvath, A. F.; Berczi, Sz.
2017-11-01
Streaming swarms make it possible to collect data from large fields at one time. The whole streaming fleet can behave like one big organization and can be realized as a planetary mission solution with stream-type analytical methods.
Opening the Black Box: Understanding the Science Behind Big Data and Predictive Analytics.
Hofer, Ira S; Halperin, Eran; Cannesson, Maxime
2018-05-25
Big data, smart data, predictive analytics, and other similar terms are ubiquitous in the lay and scientific literature. However, despite the frequency of usage, these terms are often poorly understood, and evidence of their disruption to clinical care is hard to find. This article aims to address these issues by first defining and elucidating the term big data, exploring the ways in which modern medical data, both inside and outside the electronic medical record, meet the established definitions of big data. We then define the term smart data and discuss the transformations necessary to make big data into smart data. Finally, we examine the ways in which this transition from big to smart data will affect what we do in research, retrospective work, and ultimately patient care.
Baumeister, A A; Bacharach, V R; Baumeister, A A
1997-11-01
Controversy about the amount and nature of funding for mental retardation research has persisted since the creation of NICHD. An issue that has aroused considerable debate, within the mental retardation research community as well as beyond, is distribution of funds between large group research grants, such as the program project (PO1) and the individual grant (RO1). Currently within the Mental Retardation and Developmental Disabilities Branch, more money is allocated to the PO1 mechanism than the RO1. We compared the two types of grants, focusing on success rates, productivity, costs, impact, publication practices, and outcome and conducted a comparative analysis of biomedical and behavioral research. Other related issues were considered, including review processes and cost-effectiveness.
Uppal, Rahul; Mandava, Gunasheil; Romagnoli, Katrina M; King, Andrew J; Draper, Amie J; Handen, Adam L; Fisher, Arielle M; Becich, Michael J; Dutta-Moscato, Joyeeta
2016-01-01
The Computer Science, Biology, and Biomedical Informatics (CoSBBI) program was initiated in 2011 to expose the critical role of informatics in biomedicine to talented high school students.[1] By involving them in Science, Technology, Engineering, and Math (STEM) training at the high school level and providing mentorship and research opportunities throughout the formative years of their education, CoSBBI creates a research infrastructure designed to develop young informaticians. Our central premise is that the trajectory necessary to be an expert in the emerging fields of biomedical informatics and pathology informatics requires accelerated learning at an early age. In our 4th year of CoSBBI as a part of the University of Pittsburgh Cancer Institute (UPCI) Academy (http://www.upci.upmc.edu/summeracademy/), and our 2nd year of CoSBBI as an independent informatics-based academy, we enhanced our classroom curriculum, added hands-on computer science instruction, and expanded research projects to include clinical informatics. We also conducted a qualitative evaluation of the program to identify areas that need improvement in order to achieve our goal of creating a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics in the era of big data and personalized medicine.
NASA Astrophysics Data System (ADS)
Haden, C.; Styers, M.; Asplund, S.
2015-12-01
Music and the performing arts can be a powerful way to engage students in learning about science. Research suggests that content-rich songs enhance student understanding of science concepts by helping students develop content-based vocabulary, by providing examples and explanations of concepts, and connecting to personal and situational interest in a topic. Building on the role of music in engaging students in learning, and on best practices in out-of-school time learning, the NASA Discovery and New Frontiers program in association with Jet Propulsion Laboratory, Marshall Space Flight Center, and KidTribe developed Space School Musical. Space School Musical consists of a set of nine songs and 36 educational activities to teach elementary and middle school learners about the solar system and space science through an engaging storyline and the opportunity for active learning. In 2014, NASA's Jet Propulsion Laboratory contracted with Magnolia Consulting, LLC to conduct an evaluation of Space School Musical. Evaluators used a mixed methods approach to address evaluation questions related to educator professional development experiences, program implementation and perceptions, and impacts on participating students. Measures included a professional development feedback survey, facilitator follow-up survey, facilitator interviews, and a student survey. Evaluation results showed that educators were able to use the program in a variety of contexts and in different ways to best meet their instructional needs. They noted that the program worked well for diverse learners and helped to build excitement for science through engaging all learners in the musical. Students and educators reported positive personal and academic benefits to participating students. We present findings from the evaluation and lessons learned about integration of the arts into STEM education.
Teaching Information & Technology Skills: The Big6[TM] in Secondary Schools.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
This companion volume to a previous work focusing on the Big6 Approach in elementary schools provides secondary school classroom teachers, teacher-librarians, and technology teachers with the background and tools necessary to implement an integrated Big6 program. The first part of this book explains the Big6 approach and the rationale behind it.…
NASA Technical Reports Server (NTRS)
1997-01-01
This image is of a landform informally called Jenkins Dune and is thought to be a small barchan dune. This feature is less than 1 foot (0.3 m) tall and perhaps 2-3 meters wide. Inferred wind direction is from the left to the right. Near the crest of the feature is a demarcation that may represent the exposure of a crust on the sediments; similar features were seen on sediments on the rock Big Joe at the Viking landing site.
Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech).
Dove, Edward S; Özdemir, Vural
2015-09-01
The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, "extreme centrism", and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics-separate and together-have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.
Scalability and Validation of Big Data Bioinformatics Software.
Yang, Andrian; Troup, Michael; Ho, Joshua W K
2017-01-01
This review examines two important aspects that are central to modern big data bioinformatics analysis - software scalability and validity. We argue that not only are the issues of scalability and validation common to all big data bioinformatics analyses, they can be tackled by conceptually related methodological approaches, namely divide-and-conquer (scalability) and multiple executions (validation). Scalability is defined as the ability for a program to scale based on workload. It has always been an important consideration when developing bioinformatics algorithms and programs. Nonetheless the surge of volume and variety of biological and biomedical data has posed new challenges. We discuss how modern cloud computing and big data programming frameworks such as MapReduce and Spark are being used to effectively implement divide-and-conquer in a distributed computing environment. Validation of software is another important issue in big data bioinformatics that is often ignored. Software validation is the process of determining whether the program under test fulfils the task for which it was designed. Determining the correctness of the computational output of big data bioinformatics software is especially difficult due to the large input space and complex algorithms involved. We discuss how state-of-the-art software testing techniques that are based on the idea of multiple executions, such as metamorphic testing, can be used to implement an effective bioinformatics quality assurance strategy. We hope this review will raise awareness of these critical issues in bioinformatics.
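The multiple-execution idea behind metamorphic testing mentioned above can be sketched as follows. The program under test is a toy allele-frequency calculator; the metamorphic relation checked is that permuting the input reads must not change the output. Both the program and the relation are invented for illustration.

```python
# Run the program twice, on original and permuted input, and compare the outputs.
import random

def allele_frequency(reads, alt_base="A"):
    """Toy program under test: fraction of reads carrying the alternate base."""
    if not reads:
        return 0.0
    return sum(1 for r in reads if r == alt_base) / len(reads)

def metamorphic_test(reads):
    """Metamorphic relation: shuffling the reads must leave the result unchanged."""
    original = allele_frequency(reads)
    shuffled = reads[:]
    random.shuffle(shuffled)
    assert abs(original - allele_frequency(shuffled)) < 1e-12, "relation violated"

metamorphic_test(["A", "C", "A", "G", "A"])
print("permutation relation holds")
```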
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data scientific communities to extract relevant information from big datasets and provide their value to different applications. Besides common goal of big data analysis, different perspectives and terms on big data are being discussed in scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data scientific fields together with their data type-specific and generic challenges. Firstly, common data science terms used in different data scientific fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets shortly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
pvsR: An Open Source Interface to Big Data on the American Political Sphere
2015-01-01
Digital data from the political sphere is abundant, omnipresent, and more and more directly accessible through the Internet. Project Vote Smart (PVS) is a prominent example of this big public data and covers various aspects of U.S. politics in astonishing detail. Despite the vast potential of PVS’ data for political science, economics, and sociology, it is hardly used in empirical research. The systematic compilation of semi-structured data can be complicated and time consuming as the data format is not designed for conventional scientific research. This paper presents a new tool that makes the data easily accessible to a broad scientific community. We provide the software called pvsR as an add-on to the R programming environment for statistical computing. This open source interface (OSI) serves as a direct link between a statistical analysis and the large PVS database. The free and open code is expected to substantially reduce the cost of research with PVS’ new big public data in a vast variety of possible applications. We discuss its advantages vis-à-vis traditional methods of data generation as well as already existing interfaces. The validity of the library is documented based on an illustration involving female representation in local politics. In addition, pvsR facilitates the replication of research with PVS data at low costs, including the pre-processing of data. Similar OSIs are recommended for other big public databases. PMID:26132154
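The open-source-interface idea described above (pull semi-structured records from a web database and flatten them for statistical analysis) can be sketched in a few lines; the endpoint, credential, and field names below are hypothetical placeholders for illustration and are not the documented pvsR or Project Vote Smart API.

```python
# Illustrative sketch of an open source interface to a web database:
# fetch semi-structured JSON and flatten it into a rectangular table.
# API_URL, API_KEY, and the "candidates" field are hypothetical placeholders.
import requests
import pandas as pd

API_URL = "https://example.org/votesmart/candidates"  # placeholder, not the real endpoint
API_KEY = "YOUR_KEY_HERE"                              # placeholder credential

def fetch_candidates(state):
    resp = requests.get(API_URL, params={"state": state, "key": API_KEY}, timeout=30)
    resp.raise_for_status()
    # Flatten the nested JSON payload into a data frame ready for analysis.
    return pd.json_normalize(resp.json()["candidates"])

# Example usage (requires a working endpoint):
# df = fetch_candidates("CA")
# print(df.groupby("gender").size())
```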
Basalt: Biologic Analog Science Associated with Lava Terrains
NASA Astrophysics Data System (ADS)
Lim, D. S. S.; Abercromby, A.; Kobs-Nawotniak, S. E.; Kobayashi, L.; Hughes, S. S.; Chappell, S.; Bramall, N. E.; Deans, M. C.; Heldmann, J. L.; Downs, M.; Cockell, C. S.; Stevens, A. H.; Caldwell, B.; Hoffman, J.; Vadhavk, N.; Marquez, J.; Miller, M.; Squyres, S. W.; Lees, D. S.; Fong, T.; Cohen, T.; Smith, T.; Lee, G.; Frank, J.; Colaprete, A.
2015-12-01
This presentation will provide an overview of the BASALT (Biologic Analog Science Associated with Lava Terrains) program. BASALT research addresses Science, Science Operations, and Technology. Specifically, BASALT is focused on the investigation of terrestrial volcanic terrains and their habitability as analog environments for early and present-day Mars. Our scientific fieldwork is conducted under simulated Mars mission constraints to evaluate strategically selected concepts of operations (ConOps) and capabilities with respect to their anticipated value for the joint human and robotic exploration of Mars. a) Science: The BASALT science program is focused on understanding habitability conditions of early and present-day Mars in two relevant Mars-analog locations (the Southwest Rift Zone (SWRZ) and the East Rift Zone (ERZ) flows on the Big Island of Hawai'i and the eastern Snake River Plain (ESRP) in Idaho) to characterize and compare the physical and geochemical conditions of life in these environments and to learn how to seek, identify, and characterize life and life-related chemistry in basaltic environments representing these two epochs of martian history. b) Science Operations: The BASALT team will conduct real (non-simulated) biological and geological science at two high-fidelity Mars analogs, all within simulated Mars mission conditions (including communication latencies and bandwidth constraints) that are based on current architectural assumptions for Mars exploration missions. We will identify which human-robotic ConOps and supporting capabilities enable science return and discovery. c) Technology: BASALT will incorporate and evaluate technologies into our field operations that are directly relevant to conducting the scientific investigations regarding life and life-related chemistry in Mars-analogous terrestrial environments. BASALT technologies include the use of mobile science platforms, extravehicular informatics, display technologies, communication & navigation packages, remote sensing, advanced science mission planning tools, and scientifically-relevant instrument packages to achieve the project goals.
Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.
2015-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.
Big agronomic data validates an oxymoron: Sustainable intensification under climate change
USDA-ARS?s Scientific Manuscript database
Crop science is increasingly embracing big data to reconcile the apparent rift between intensification of food production and sustainability of a steadily stressed production base. A strategy based on long-term agroecosystem research and modeling simulation of crops, crop rotations and cropping sys...
The International Big History Association
ERIC Educational Resources Information Center
Duffy, Michael; Duffy, D'Neil
2013-01-01
IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…
Atmospheric Science Data Center
2014-05-15
Article title: Big Island, Hawaii. Multi-angle Imaging SpectroRadiometer (MISR) images of the Big Island of Hawaii, April - June 2000. The images have been rotated so that ... NASA's Goddard Space Flight Center, Greenbelt, MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science ...
Climate change has emerged as the significant environmental challenge of the 21st century. Therefore, understanding our changing world has forced researchers from many different fields of science to join together to tackle complicated research questions. The climate change resear...
ERIC Educational Resources Information Center
Smith, Betty, Ed.
This document presents a collection of papers published in the "Teaching Teachers" column in the elementary-level journal, "Science and Children." Contents include: (1) "Science is Part of the Big Picture: Teachers Become Science Learners" (Anita Greenwood); (2) "Reaching the Reluctant Science Teacher: Learning How To Teach Inquiry-Based Science"…
NASA Astrophysics Data System (ADS)
Pinney, Brian Robert John
The purpose of this study was to characterize ways in which teaching practice in a classroom undergoing first-semester implementation of an argument-based inquiry approach changes in whole-class discussion. Given that argument is explicitly called for in the Next Generation Science Standards and is currently a rare practice in teaching, many teachers will have to transform their teaching practice for inclusion of this feature. Most studies on Argument-Based Inquiry (ABI) agree that development of argument does not come easily and is only acquired through practice. Few studies have examined the ways in which teaching practice changes in relation to the big idea or disciplinary core idea (NGSS), the development of dialogue, and/or the development of argument during first semester implementation of an argument-based inquiry approach. To explore these areas, this study posed three primary research questions: (1) How does a teacher in his first semester of Science Writing Heuristic professional development make use of the "big idea"?, (1a) Is the indicated big idea consistent with NGSS core concepts?, (2) How did the dialogue in whole-class discussion change during the first semester of argument-based inquiry professional development?, (3) How did the argument in whole-class discussion change during the first semester of argument-based inquiry professional development? This semester-long study, which took place in a middle school in a rural Midwestern city, was grounded in interactive constructivism and utilized a qualitative design to identify the ways in which the teacher utilized big ideas and how dialogue and argumentative dialogue developed over time. The purposefully selected teacher in this study provided a unique situation where he was in his first semester of professional development using the Science Writing Heuristic Approach to argument-based inquiry with 19 students who had two prior years' experience in ABI. Multiple sources of data were collected, including classroom video with transcripts, teacher interview, researcher field notes, student journals, teacher lesson plans from previous years, and a student questionnaire. Data analysis used a basic qualitative approach. The results showed that (1) only the first time period had a true big idea, while the other two units contained topics, (2) each semester contained a similar use for the given big idea, though its role in the class was reduced after the opening activity, (3) the types of teacher questions shifted toward students explaining their comprehension of ideas and more students were involved in discussing each idea and for more turns of talk than in earlier time periods, (4) understanding science term definitions became more prominent later in the semester, with more stating science terms occurring earlier in the semester, (5) no significant changes were seen to the use of argument or claims and evidence throughout the study. The findings have informed theory and practice about science argumentation, the practice of whole-class dialogue, and the understanding of practice along four aspects: (1) apparent lack of understanding about big ideas and how to utilize them as the central organizing feature of a unit, (2) independent development of dialogue and argument, (3) apparent lack of understanding about the structure of argument and use of basic terminology with argument and big ideas, (4) challenges of ABI implementation.
This study provides insight into the importance of prolonged and persistent professional development with ABI in teaching practice.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Portfolio Acquisition - How the DoD Can Leverage the Commercial Product Line Model
2015-04-30
canceled (Harrison, 2011). A major contributing factor common to these failures is that the programs tried to do too much at once: they used a big-bang ...requirements in a single, big-bang approach. MDAPs take 10 to 15 years from Milestone A to initial operational capability, with many of the largest...2013). The block upgrade model for B-52, F-15, and F-16 proved successful over decades, yet with its big-bang structure the F-35 program is
The Human Genome Project: big science transforms biology and medicine.
Hood, Leroy; Rowen, Lee
2013-01-01
The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.
Use of combinatorial chemistry to speed drug discovery.
Rádl, S
1998-10-01
IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.
Agreement on FY 1990 budget plan
NASA Astrophysics Data System (ADS)
The Bush administration has reached agreement with congressional leaders over a thumbnail version of the Fiscal Year 1990 budget. The plan contains few details but could have implications for NASA's Space Station Freedom and other big science projects.Overall, the budget agreement would achieve Gramm-Rudman-Hollings targets for budget deficit reduction without raising taxes, mostly through accounting manipulation and unspecified cuts in social programs. But a supplemental bill that calls for $1.2 billion in new spending for FY 1989 is expected to go to the House floor soon. That measure would violate the new agreement and add to the deficit.
Challenges of Big Data in Educational Assessment
ERIC Educational Resources Information Center
Gibson, David C.; Webb, Mary; Ifenthaler, Dirk
2015-01-01
This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…
Expanding Evidence Approaches for Learning in a Digital World
ERIC Educational Resources Information Center
Means, Barbara; Anderson, Kea
2013-01-01
This report describes how big data and an evidence framework can align across five contexts of educational improvement. It explains that before working with big data, there is an important prerequisite: the proposed innovation should align with deeper learning objectives and should incorporate sound learning sciences principles. New curriculum…
BIG: a large-scale data integration tool for renal physiology.
Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya; Knepper, Mark A
2016-10-01
Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: "How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?" This is the type of problem that has motivated the "Big-Data" revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/.
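The core idea above (one query returning every record for a gene product across many indexed data sets) can be sketched conceptually in a few lines; the data sets, column names, and abundance values below are invented for illustration and do not reflect the actual BIG schema or API.

```python
# Conceptual sketch of a "single query over many indexed data sets" gatherer.
# The contents of the data sets are placeholders, not real measurements.
import pandas as pd

datasets = {
    "proteome_collecting_duct": pd.DataFrame(
        {"gene": ["AQP2", "AVPR2"], "abundance": [1520.0, 310.0]}),
    "transcriptome_whole_kidney": pd.DataFrame(
        {"gene": ["AQP2", "SLC12A1"], "tpm": [88.5, 412.0]}),
}

def gather(gene_symbol):
    """Return every record for one gene product across all indexed data sets."""
    hits = {}
    for name, df in datasets.items():
        match = df[df["gene"] == gene_symbol]
        if not match.empty:
            hits[name] = match
    return hits

for source, rows in gather("AQP2").items():
    print(source)
    print(rows.to_string(index=False))
```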
Özdemir, Vural; Kolker, Eugene
2016-02-01
Nutrition is central to sustenance of good health, not to mention its role as a cultural object that brings together or draws lines among societies. Undoubtedly, understanding the future paths of nutrition science in the current era of Big Data remains firmly on science, technology, and innovation strategy agendas around the world. Nutrigenomics, the confluence of nutrition science with genomics, brought about a new focus on and legitimacy for "variability science" (i.e., the study of mechanisms of person-to-person and population differences in response to food, and the ways in which food variably impacts the host, for example, nutrient-related disease outcomes). Societal expectations, both public and private, and claims over genomics-guided and individually-tailored precision diets continue to proliferate. While the prospects of nutrition science, and nutrigenomics in particular, are established, there is a need to integrate the efforts in four Big Data domains that are naturally allied--agrigenomics, nutrigenomics, nutriproteomics, and nutrimetabolomics--that address complementary variability questions pertaining to individual differences in response to food-related environmental exposures. The joint use of these four omics knowledge domains, coined as Precision Nutrition 4.0 here, has sadly not been realized to date, but the potentials for such integrated knowledge innovation are enormous. Future personalized nutrition practices would benefit from a seamless planning of life sciences funding, research, and practice agendas from "farm to clinic to supermarket to society," and from "genome to proteome to metabolome." Hence, this innovation foresight analysis explains the already existing potentials waiting to be realized, and suggests ways forward for innovation in both technology and ethics foresight frames on precision nutrition. We propose the creation of a new Precision Nutrition Evidence Barometer for periodic, independent, and ongoing retrieval, screening, and aggregation of the relevant life sciences data. For innovation in Big Data ethics oversight, we suggest "nested governance" wherein the processes of knowledge production are made transparent in the continuum from life sciences and social sciences to humanities, and where each innovation actor reports to another accountability and transparency layer: scientists to ethicists, and ethicists to scholars in the emerging field of ethics-of-ethics. Such nested innovation ecosystems offer safety against innovation blind spots, calibrate visible/invisible power differences in the cultures of science or ethics, and ultimately reduce the risk of a divergence between "paper values"--what people say--and "real values"--what innovation actors actually do. We are optimistic that the convergence of nutrigenomics with nutriproteomics, nutrimetabolomics, and agrigenomics can build a robust, sustainable, and trustworthy Precision Nutrition 4.0 agenda, as articulated in this Big Data and ethics foresight analysis.
Big Crater as Viewed by Pathfinder Lander - Anaglyph
NASA Technical Reports Server (NTRS)
1997-01-01
The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.
The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft. This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be. The anaglyph view of Big Crater was produced by combining the left and right eye mosaics (above) by assigning the left eye view to the red color plane and the right eye view to the green and blue color planes (cyan), to produce a stereo anaglyph mosaic. This mosaic can be viewed in 3-D on your computer monitor or in color print form by wearing red-blue 3-D glasses. Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The IMP was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator. [Left-eye and right-eye views: figures removed for brevity, see original site.]
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
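The "single computing facility" abstraction described above can be illustrated with a toy broker that accepts jobs and dispatches them to whichever site has free cores; the site names, capacities, and scheduling policy below are invented for illustration and are in no way PanDA code.

```python
# Toy sketch of a workload broker hiding many sites behind one submit() call.
# Everything here (sites, capacities, policy) is a simplifying assumption.
from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    cores: int
    running: int = 0

@dataclass
class Broker:
    sites: list = field(default_factory=list)

    def submit(self, job_id):
        # Deliberately simple policy: send the job to the site with the most free cores.
        site = max(self.sites, key=lambda s: s.cores - s.running)
        if site.cores - site.running <= 0:
            raise RuntimeError("no free cores anywhere")
        site.running += 1
        return f"job {job_id} -> {site.name}"

broker = Broker([Site("CERN-T0", 4), Site("OLCF-Titan", 8), Site("BNL-T1", 2)])
for j in range(6):
    print(broker.submit(j))
```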
Software Architecture for Big Data Systems
2014-03-27
Software Architecture: Trends and New Directions (#SEIswArch, © 2014 Carnegie Mellon University). Software Architecture for Big Data Systems ... WHAT IS BIG DATA? FROM A SOFTWARE ...
ERIC Educational Resources Information Center
Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert
2004-01-01
"Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…
50 CFR 86.10 - What does this regulation do?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 6 2010-10-01: What does this regulation do? 86.10... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM BOATING INFRASTRUCTURE GRANT (BIG... Boating Infrastructure Grant (BIG) Program. “We” and “us” refer to the Fish and Wildlife Service. This...
High School Teen Mentoring Handbook
ERIC Educational Resources Information Center
Alberta Advanced Education and Technology, 2009
2009-01-01
Big Brothers Big Sisters Edmonton & Area, in partnership with Alberta Advanced Education and Technology, are providing the High School Teen Mentoring Program, a school-based mentoring program where mentor-mentee matches meet for one hour per week to engage in relationship-building activities at an elementary school. This initiative aims to…
ERIC Educational Resources Information Center
Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna
2009-01-01
The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…
ERIC Educational Resources Information Center
Fulkerson, Gregory
2009-01-01
This article describes three big programs from Delaware where the less commonly taught languages find their home in Delaware elementary schools. Odyssey Charter School, located in Wilmington, is one of the very few Greek-language-focused public schools in the nation. The school began in 2006 as a Greek immersion program that concentrated on the…
ETV Program Report: Big Fish Septage and High Strength Waste Water Treatment System
Verification testing of the Big Fish Environmental Septage and High Strength Wastewater Processing System for treatment of high-strength wastewater was conducted at the Big Fish facility in Charlevoix, Michigan. Testing was conducted over a 13-month period to address different c...
Advanced Methodologies for NASA Science Missions
NASA Astrophysics Data System (ADS)
Hurlburt, N. E.; Feigelson, E.; Mentzel, C.
2017-12-01
Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed about which computations should be performed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms fully recognize that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.
Measuring Adolescent Science Motivation
ERIC Educational Resources Information Center
Schumm, Maximiliane F.; Bogner, Franz X.
2016-01-01
To monitor science motivation, 232 tenth graders of the college preparatory level ("Gymnasium") completed the Science Motivation Questionnaire II (SMQ-II). Additionally, personality data were collected using a 10-item version of the Big Five Inventory. A subsequent exploratory factor analysis based on the eigenvalue-greater-than-one…
Technology for Mining the Big Data of MOOCs
ERIC Educational Resources Information Center
O'Reilly, Una-May; Veeramachaneni, Kalyan
2014-01-01
Because MOOCs bring big data to the forefront, they confront learning science with technology challenges. We describe an agenda for developing technology that enables MOOC analytics. Such an agenda needs to efficiently address the detailed, low level, high volume nature of MOOC data. It also needs to help exploit the data's capacity to reveal, in…
Big Ideas at the Center for Innovation in Education at Thomas College
ERIC Educational Resources Information Center
Prawat, Ted
2016-01-01
Schools and teachers are looking for innovative ways to teach the "big ideas" emerging in the core curricula, especially in STEAM fields (science technology, engineering, arts and math). As a result, learning environments that support digital learning and educational technology on various platforms and devices are taking on…
The Big Bang: UK Young Scientists' and Engineers' Fair 2010
ERIC Educational Resources Information Center
Allison, Simon
2010-01-01
The Big Bang: UK Young Scientists' and Engineers' Fair is an annual three-day event designed to promote science, technology, engineering and maths (STEM) careers to young people aged 7-19 through experiential learning. It is supported by stakeholders from business and industry, government and the community, and brings together people from various…
USDA-ARS?s Scientific Manuscript database
Most efforts to harness the power of big data for ecology and environmental sciences focus on data and metadata sharing, standardization, and accuracy. However, many scientists have not accepted the data deluge as an integral part of their research because the current scientific method is not scalab...
Mount Sharp Inside Gale Crater, Mars
2012-03-28
Curiosity, the big rover of NASA's Mars Science Laboratory mission, will land in August 2012 near the foot of a mountain inside Gale Crater. The mission's project science group is calling the mountain Mount Sharp.
Combustion Science for Cleaner Fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, Musahid
2014-10-17
Musahid Ahmed discusses how he and his team use the Advanced Light Source (ALS) to study combustion chemistry at our '8 Big Ideas' Science at the Theater event on October 8th, 2014, in Oakland, California.
Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.
Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen
2016-12-27
Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science (network- and machine learning-based modeling and drug repositioning) hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.
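A hedged sketch of the kind of network-based prioritization step the review describes: build a small gene-interaction graph and rank nodes by centrality. The gene symbols are well-known neuroblastoma-associated genes, but the edges and the use of plain degree centrality are placeholders chosen for illustration, not curated data or the review's method.

```python
# Illustrative network-based modeling step: rank genes by how "hub-like"
# they are in a toy interaction graph. Edges below are invented examples.
import networkx as nx

edges = [("MYCN", "ALK"), ("MYCN", "AURKA"), ("ALK", "PHOX2B"),
         ("AURKA", "TP53"), ("TP53", "PHOX2B")]

G = nx.Graph()
G.add_edges_from(edges)

# Degree centrality as a simple first-pass prioritization score.
ranking = sorted(nx.degree_centrality(G).items(), key=lambda kv: kv[1], reverse=True)
for gene, score in ranking:
    print(f"{gene}: {score:.2f}")
```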
The dynamics of big data and human rights: the case of scientific research.
Vayena, Effy; Tasioulas, John
2016-12-28
In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities (the new capabilities and risks) of the rapidly evolving digital environment. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).
Art, science, and immersion: data-driven experiences
NASA Astrophysics Data System (ADS)
West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta
2013-03-01
This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.
NASA Astrophysics Data System (ADS)
Zwintz, Konstanze; Poretti, Ennio
2017-09-01
In 2016 the BRITE-Constellation mission had been operational for more than two years. At that time, several hundreds of bright stars of various types had been observed successfully in the two BRITE filters and astonishing new discoveries had been made. Therefore, the time was ripe to host the Second BRITE-Constellation Science Conference: "Small satellites - big science" from August 22 to 26, 2016, in the beautiful Madonnensaal of the University of Innsbruck, Austria. With this conference, we brought together the scientific community interested in BRITE-Constellation, provided an update on the status of the mission, presented and discussed the latest scientific results, shared our experiences with the data, illustrated successful cooperations between professional and amateur ground-based observers and BRITE scientists, and explored new ideas for future BRITE-Constellation observations.
Performance and scalability evaluation of "Big Memory" on Blue Gene Linux.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshii, K.; Iskra, K.; Naik, H.
2011-05-01
We address memory performance issues observed in Blue Gene Linux and discuss the design and implementation of 'Big Memory' - an alternative, transparent memory space introduced to eliminate the memory performance issues. We evaluate the performance of Big Memory using custom memory benchmarks, NAS Parallel Benchmarks, and the Parallel Ocean Program, at a scale of up to 4,096 nodes. We find that Big Memory successfully resolves the performance issues normally encountered in Blue Gene Linux. For the ocean simulation program, we even find that Linux with Big Memory provides better scalability than does the lightweight compute node kernel designed solely for high-performance applications. Originally intended exclusively for compute node tasks, our new memory subsystem dramatically improves the performance of certain I/O node applications as well. We demonstrate this performance using the central processor of the LOw Frequency ARray radio telescope as an example.
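For readers who want a feel for what a custom memory micro-benchmark of this kind measures, the sketch below times large in-memory copies to estimate effective bandwidth; the buffer size, repetition count, and use of NumPy are arbitrary assumptions and are not drawn from the paper.

```python
# Illustrative memory-bandwidth micro-benchmark: time large array copies
# and report the best effective bandwidth. Sizes are arbitrary choices.
import time
import numpy as np

N = 64 * 1024 * 1024            # 64 Mi doubles = 512 MiB per buffer
src = np.ones(N)
dst = np.empty_like(src)

best = float("inf")
for _ in range(5):
    t0 = time.perf_counter()
    np.copyto(dst, src)
    best = min(best, time.perf_counter() - t0)

bytes_moved = 2 * src.nbytes    # read the source plus write the destination
print(f"effective bandwidth: {bytes_moved / best / 1e9:.1f} GB/s")
```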
30th Annual Drug Information Association (DIA) Europe 2018 (April 17-19, 2018 - Basel, Switzerland).
Hamaui Cuadrado, S; Guinart Vidal, M
2018-05-01
The Drug Information Association (DIA) Europe held its annual meeting from April 17-19, 2018, in Basel, Switzerland. The key topics discussed in the 3-day meeting were related to pharmacovigilance, clinical development, patient engagement, data and data standards, preclinical development and early-phase clinical research, regulatory science, translational medicine and science, and value and access. The program was principally focused on the current opportunities and future landscape of the healthcare system as a result of the increasingly innovative technologies and effective utilization of big data. In addition, the critical need for collaboration and partnership between all the stakeholders of the healthcare system was highlighted. This report covers some of the regulatory sessions presented at the meeting in which regulators, payers, industry and patients presented their perspectives for discussion. Copyright 2018 Clarivate Analytics.
NASA Astrophysics Data System (ADS)
Branch, B. D.; Wegner, K.; Smith, S.; Schulze, D. G.; Merwade, V.; Jung, J.; Bessenbacher, A.
2013-12-01
It has been the tradition of libraries to support literacy. Now, in the realm of the Executive Order "Making Open and Machine Readable the New Default for Government Information" (May 9, 2013), the library has the responsibility to support geospatial data, big data, earth science data, and cyberinfrastructure data that may support STEM for educational pipeline stimulation. (Such information can be found at http://www.whitehouse.gov/the-press-office/2013/05/09/executive-order-making-open-and-machine-readable-new-default-government-.) We present an Educational Data Curation Framework (EDCF), initiated through Purdue research, geospatial data service engagement, and outreach endeavors, for future consideration and application to the data science and climate literacy needs of future global citizens. Endorsement of this framework by the GLOBE program may facilitate further EDCF implementations, discussion points, and prototypes for libraries. In addition, the EDCF will support teacher-led, place-based, and large-scale climate or earth science learning systems in which knowledge of climate or earth science data is effectively transferred from higher-education cyberinfrastructure research, such as that of NOAA or NASA, to K-12 teachers and school systems. The purpose of this effort is to establish best practices for a sustainable K-12 data science delivery system or GLOBE-provided system (http://vis.globe.gov/GLOBE/) in which libraries manage data curation and data appropriateness as reference experts for such digital data. Here, the Purdue University Libraries' GIS department works to support soils, LIDAR, and water science data experiences for teacher training as part of an EDCF development effort. Lastly, it should be noted that interdisciplinary collaboration and demonstration by library-supported outreach partners and national organizations such as the GLOBE program may best foster EDCF development. This trend in data science, where library roles may emerge, is consistent with NASA's Wavelength program at http://nasawavelength.org. Mr. Steven Smith, an outreach coordinator, led this Purdue University outreach activity involving the GLOBE program, with support from the Purdue University Libraries GIS department.
Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science
NASA Astrophysics Data System (ADS)
Riedel, Morris; Ramachandran, Rahul; Baumann, Peter
2014-05-01
The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques reach from hardware deployment models up to various different algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, and a few use cases are examined in detail.
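As a small, self-contained instance of one technique named above (support vector machines for classification), the sketch below trains an SVM on synthetic two-class data; the data and parameters are invented for illustration and do not represent an actual Earth science use case from the survey.

```python
# Minimal SVM classification sketch on synthetic data (illustration only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 5)),   # class 0 samples
               rng.normal(1.5, 1.0, size=(100, 5))])  # class 1 samples
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```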
The Big Bang, Genesis, and Knocking on Heaven's Door
NASA Astrophysics Data System (ADS)
Gentry, Robert
2012-03-01
Michael Shermer recently upped the ante in the big bang-Genesis controversy by citing Lisa Randall's provocative claim (Science 334, 762 (2011)) that ``it is inconceivable that God could continue to intervene without introducing a material trace of his actions.'' So does Randall's and Shermer's agreement that no such evidence exists disprove God's existence? Not in my view because my 1970s Science, Nature and ARNS publications, and my article in the 1982 AAAS Western Division's Symposium Proceedings, Evolution Confronts Creation, all contain validation of God's existence via discovery of His Fingerprints of Creation and falsification of the big bang and geological evolution. These results came to wide public/scientific attention in my testimony at the 1981 Arkansas creation/evolution trial. There ACLU witness G Brent Dalrymple from the USGS -- and 2005 Medal of Science recipient from President Bush -- admitted I had discovered a tiny mystery (primordial polonium radiohalos) in granite rocks that indicated their almost instant creation. As a follow-up in 1992 and 1995 he sent out SOS letters to the entire AGU membership that the polonium halo evidence for fiat creation still existed and that someone needed to urgently find a naturalistic explanation for them. Is the physics community guilty of a Watergate-type cover-up of this discovery of God's existence and falsification of the big bang? For the answer see www.halos.tv.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offer different views of how approaches of correlation and causality offer complementary methods to advance in science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
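The map-reduce pattern mentioned above can be illustrated without any cluster at all; the sketch below runs an in-process word count with only the standard library, standing in for what Hadoop, Twister, or Spark would execute across many nodes. The documents and the word-count task are placeholder choices, not part of the talk.

```python
# Bare-bones illustration of the map-reduce pattern: per-document counts
# (map) merged into a global total (reduce), all in one process.
from collections import Counter
from functools import reduce

documents = ["big data needs analytics",
             "analytics needs algorithms",
             "algorithms need data"]

def mapper(doc):
    # Emit word counts for a single document.
    return Counter(doc.split())

def reducer(acc, partial):
    # Merge one document's partial counts into the running total.
    acc.update(partial)
    return acc

word_counts = reduce(reducer, map(mapper, documents), Counter())
print(word_counts.most_common(3))
```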
Big data in medical science--a biostatistical view.
Binder, Harald; Blettner, Maria
2015-02-27
Inexpensive techniques for measurement and data storage now enable medical researchers to acquire far more data than can conveniently be analyzed by traditional methods. The expression "big data" refers to quantities on the order of magnitude of a terabyte (10^12 bytes); special techniques must be used to evaluate such huge quantities of data in a scientifically meaningful way. Whether data sets of this size are useful and important is an open question that currently confronts medical science. In this article, we give illustrative examples of the use of analytical techniques for big data and discuss them in the light of a selective literature review. We point out some critical aspects that should be considered to avoid errors when large amounts of data are analyzed. Machine learning techniques enable the recognition of potentially relevant patterns. When such techniques are used, certain additional steps should be taken that are unnecessary in more traditional analyses; for example, patient characteristics should be differentially weighted. If this is not done as a preliminary step before similarity detection, which is a component of many data analysis operations, characteristics such as age or sex will be weighted no higher than any one out of 10 000 gene expression values. Experience from the analysis of conventional observational data sets can be called upon to draw conclusions about potential causal effects from big data sets. Big data techniques can be used, for example, to evaluate observational data derived from the routine care of entire populations, with clustering methods used to analyze therapeutically relevant patient subgroups. Such analyses can provide complementary information to clinical trials of the classic type. As big data analyses become more popular, various statistical techniques for causality analysis in observational data are becoming more widely available. This is likely to be of benefit to medical science, but specific adaptations will have to be made according to the requirements of the applications.
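To make the weighting argument concrete, the sketch below standardizes and re-weights a small clinical block (age, sex) so that it carries as much total weight as a 10 000-column expression block before computing patient-to-patient distances; the weighting scheme and the data are illustrative assumptions, not the article's prescription.

```python
# Sketch: re-weight clinical covariates before similarity detection so that
# age and sex are not drowned out by thousands of gene expression values.
# All data and the specific weighting factor are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_patients, n_genes = 6, 10_000
expression = rng.normal(size=(n_patients, n_genes))
age = rng.integers(40, 80, size=(n_patients, 1)).astype(float)
sex = rng.integers(0, 2, size=(n_patients, 1)).astype(float)

def zscore(a):
    # Column-wise standardization; small epsilon guards against zero variance.
    return (a - a.mean(axis=0)) / (a.std(axis=0) + 1e-9)

# One possible choice: give the 2-column clinical block the same total
# variance contribution as the whole expression block.
clinical = np.hstack([zscore(age), zscore(sex)]) * np.sqrt(n_genes / 2)
features = np.hstack([clinical, zscore(expression)])

# Pairwise Euclidean distances as a simple similarity backbone.
dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
print(np.round(dist, 1))
```

Without the re-weighting step, the clinical columns would contribute only 2 of roughly 10 002 equally scaled dimensions to each distance, which is exactly the failure mode the article warns about.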
House cuts science to restore Space Station
NASA Astrophysics Data System (ADS)
The House voted 240 to 173 to fully fund Space Station Freedom at $1.9 billion next year, overriding the House appropriations subcommittee, which eliminated the funding for the station last month. The unexpected action on June 6, taken after a day of heated debate, froze all other programs of the National Aeronautics and Space Administration at this year's levels, confirming the recent suspicion that the rest of the agency would suffer if the space station were funded. The House also took an additional $217 million from public housing subsidies and added it to the station. The National Science Foundation's budget request, funded by the same bill as NASA is, was not affected.NASA administrator Richard H. Truly called the vote “a big victory for all America.” He added, however, that “much work remains to be done to provide a final FY 1992 budget for NASA that is well balanced between science, manned space flight and exploration, aeronautical research, Earth observation, and technology development.”
Challenges and potential solutions for big data implementations in developing countries.
Luna, D; Mayan, J C; García, M J; Almerares, A A; Househ, M
2014-08-15
The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. To describe the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. A non-systematic review of the literature was performed in PubMed and Google Scholar. The following keywords were used: "big data", "developing countries", "data mining", "health information systems", and "computing methodologies". A thematic review of selected articles was performed. There are challenges when implementing any Big Data program including exponential growth of data, special infrastructure needs, need for a trained workforce, need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure their adoption. Developing countries have particular characteristics that hinder further development of these projects. The advent of Big Data promises great opportunities for the healthcare field. In this article, we attempt to describe the challenges developing countries would face and enumerate the options to be used to achieve successful implementations of Big Data programs.
Shadish, William R; Lecy, Jesse D
2015-09-01
This article looks at the impact of meta-analysis and then explores why meta-analysis was developed when it was, and by the scholars it was, in the social sciences of the 1970s. For the first question, impact, it examines the influence of meta-analysis using citation network analysis. That impact is seen across the sciences, arts and humanities, and in such contemporaneous developments as multilevel modeling, medical statistics, qualitative methods, program evaluation, and single-case design. Using a constrained snowball sample of citations, we highlight key articles that are either most highly cited or most central to the systematic review network. The article then examines why meta-analysis came to be in the 1970s in the social sciences through the work of Gene Glass, Robert Rosenthal, and Frank Schmidt, each of whom developed similar theories of meta-analysis at about the same time. The article ends by explaining how Simonton's chance-configuration theory and Campbell's evolutionary epistemology can illuminate why meta-analysis emerged with these scholars when it did, and not in the medical sciences. Copyright © 2015 John Wiley & Sons, Ltd.
The Universe Discovery Guides: A Collaborative Approach to Educating with NASA Science
NASA Astrophysics Data System (ADS)
Manning, Jim; Lawton, Brandon; Berendsen, Marni; Gurton, Suzanne; Smith, Denise A.; NASA SMD Astrophysics E/PO Community, The
2014-06-01
For the 2009 International Year of Astronomy, the then-existing NASA Origins Forum collaborated with the Astronomical Society of the Pacific (ASP) to create a series of monthly “Discovery Guides” for informal educator and amateur astronomer use in educating the public about featured sky objects and associated NASA science themes. Today’s NASA Astrophysics Science Education and Public Outreach Forum (SEPOF), one of a new generation of forums coordinating the work of NASA Science Mission Directorate (SMD) EPO efforts, has, in collaboration with the ASP and NASA SMD missions and programs, adapted the Discovery Guides into “evergreen” educational resources suitable for a variety of audiences. The Guides focus on “deep sky” objects and astrophysics themes (stars and stellar evolution, galaxies and the universe, and exoplanets), showcasing EPO resources from more than 30 NASA astrophysics missions and programs in a coordinated and cohesive “big picture” approach across the electromagnetic spectrum, grounded in best practices to best serve the needs of the target audiences. Each monthly guide features a theme and a representative object well placed for viewing, with an accompanying interpretive story, finding charts, strategies for conveying the topics, and complementary supporting NASA-approved education activities and background information from a spectrum of NASA missions and programs. The Universe Discovery Guides are downloadable from the NASA Night Sky Network web site at nightsky.jpl.nasa.gov. The presenter will share the Forum-led Collaborative’s experience in developing the guides, how they place individual science discoveries and learning resources into context for audiences, and how the Guides can be readily used in scientist public outreach efforts, in college and university introductory astronomy classes, and in other engagements between scientists, students and the public.
Bioinformatic training needs at a health sciences campus.
Oliver, Jeffrey C
2017-01-01
Health sciences research is increasingly focusing on big data applications, such as genomic technologies and precision medicine, to address key issues in human health. These approaches rely on biological data repositories and bioinformatic analyses, both of which are growing rapidly in size and scope. Libraries play a key role in supporting researchers in navigating these and other information resources. With the goal of supporting bioinformatics research in the health sciences, the University of Arizona Health Sciences Library established a bioinformatics program. To shape the support provided by the library, I developed and administered a needs assessment survey to the University of Arizona Health Sciences campus in Tucson, Arizona. The survey was designed to identify the training topics of interest to health sciences researchers and the preferred modes of training. Survey respondents expressed an interest in a broad array of potential training topics, including "traditional" information seeking as well as interest in analytical training. Of particular interest was training in transcriptomic tools and the use of databases linking genotypes and phenotypes. Staff were most interested in bioinformatics training topics, while faculty were the least interested. Hands-on workshops were significantly preferred over any other mode of training. The University of Arizona Health Sciences Library is meeting those needs through internal programming and external partnerships. The results of the survey demonstrate a keen interest in a variety of bioinformatic resources; the challenge to the library is how to address those training needs. The mode of support depends largely on library staff expertise in the numerous subject-specific databases and tools. Librarian-led bioinformatic training sessions provide opportunities for engagement with researchers at multiple points of the research life cycle. When training needs exceed library capacity, partnering with intramural and extramural units will be crucial in library support of health sciences bioinformatic research.
Exploiting big data for critical care research.
Docherty, Annemarie B; Lone, Nazir I
2015-10-01
Over recent years, the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.
Use of a metadata documentation and search tool for large data volumes: The NGEE Arctic example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devarakonda, Ranjeet; Hook, Leslie A; Killeffer, Terri S
The Online Metadata Editor (OME) is a web-based tool to help document scientific data in a well-structured, popular scientific metadata format. In this paper, we will discuss the newest tool that Oak Ridge National Laboratory (ORNL) has developed to generate, edit, and manage metadata, and how it is helping data-intensive science centers and projects, such as the U.S. Department of Energy's Next Generation Ecosystem Experiments (NGEE) in the Arctic, to prepare metadata and make their big data produce big science and lead to new discoveries.
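For orientation, the sketch below shows the kind of structured, machine-readable record such a tool helps produce. The field names, values, and JSON output format are illustrative placeholders only; they are not OME's actual schema or the metadata standard it emits.

```python
# Hypothetical example of a well-structured dataset metadata record; field
# names and values are placeholders, not OME output.
import json

record = {
    "title": "Soil temperature, example Arctic intensive site",
    "investigators": ["A. Researcher", "B. Scientist"],
    "temporal_coverage": {"start": "2013-06-01", "end": "2014-09-30"},
    "spatial_coverage": {"lat": 71.28, "lon": -156.61},
    "variables": ["soil_temperature"],
    "units": {"soil_temperature": "degC"},
}

with open("dataset_metadata.json", "w") as fh:
    json.dump(record, fh, indent=2)
```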
Russell, R J
2001-12-01
The sciences and the humanities, including theology, form an epistemic hierarchy that ensures both constraint and irreducibility. At the same time, theological methodology is analogous to scientific methodology, though with several important differences. This model of interaction between science and theology can be seen illustrated in a consideration of the relation between contemporary cosmology (Big Bang cosmology, cosmic inflation, and quantum cosmology) and Christian systematic and natural theology. In light of developments in cosmology, the question of origins has become theologically less interesting than that of the cosmic evolution of a contingent universe.
NASA Astrophysics Data System (ADS)
Nasyrov, R. K.; Poleshchuk, A. G.
2017-09-01
This paper describes the development and manufacture of diffraction corrector and imitator for the interferometric control of the surface shape of the 6-m main mirror of the Big Azimuthal Telescope of the Russian Academy of Sciences. The effect of errors in manufacture and adjustment on the quality of the measurement wavefront is studied. The corrector is controlled with the use of an off-axis diffraction imitator operating in a reflection mode. The measured error is smaller than 0.0138λ (RMS).
A New Coherent Science Content Storyline Astronomy Course for Pre-Service Teachers at Penn State
NASA Astrophysics Data System (ADS)
Palma, Christopher; Plummer, Julia; Earth and Space Science Partnership
2016-01-01
The Earth and Space Science Partnership (ESSP) is a collaboration among Penn State scientists, science educators and seven school districts across Pennsylvania. One of the ESSP goals has been to provide pre-service teachers with new or improved science course offerings at Penn State in the Earth and Space Science domains. In particular, we aim to provide students with opportunities to learn astronomy content knowledge through teaching methods that engage them in investigations where they experience the practices used by astronomers. We have designed a new course that builds on our research into students' ideas about Solar System astronomy (Plummer et al. 2015) and the curriculum our team created for a professional development workshop for in-service teachers (Palma et al. 2013) with this same theme. The course was offered for the first time in the spring 2015 semester. We designed the course using a coherent science content storyline approach (see, e.g., Palma et al. 2014), which requires all of the student investigations to build towards a big idea in science; in this case, we chose the model for formation of our Solar System. The course led pre-service teachers through a series of investigations that model the type of instruction we hope they will adopt in their own classrooms. They were presented with a series of research questions that all tie in to the big idea of Solar System formation, and they were responsible for collecting and interpreting their own data to draw evidence-based conclusions about one aspect of this model. Students in the course were assessed on their astronomy content knowledge, but also on their ability to construct arguments using scientific reasoning to answer astronomy questions. In this poster, we will present descriptions of the investigations, the assessments used, and our preliminary results about how the course led this group of pre-service teachers to improved understanding of astronomy content and the practices astronomers use in their investigations of the Solar System.We gratefully acknowledge support from the NSF MSP program award DUE#0962792.
Engaging High School Youth in Paleobiology Research
NASA Astrophysics Data System (ADS)
Saltzman, J.; Heim, N. A.; Payne, J.
2013-12-01
The chasm between classroom science and scientific research is bridged by the History of Life Internships at Stanford University. Nineteen interns recorded more than 25,500 linear body size measurements of fossil echinoderms and ostracods spanning more than 11,000 species. The interns were selected from a large pool of applicants, and well-established relationships with local teachers at schools serving underrepresented groups in STEM fields were leveraged to ensure a diverse mix of applicants. The lead investigator has been hosting interns in his research group for seven years, in the process measuring over 36,000 foraminifera species as well as representatives from many other fossil groups. We (a faculty member, a researcher, and educators) all find it very valuable to engage youth in novel research projects. We are able to create an environment where high school students can make genuine contributions to important and unsolved scientific problems, not only through data collection but also through original data analysis. Science often involves long intervals of data collection, which can be tedious, and big questions often require big datasets. Body size evolution is ideally suited to this type of program, as the data collection process requires substantial person-power but not deep technical expertise or expensive equipment. Students are therefore able to engage in the full scientific process, posing previously unanswered questions regarding the evolution of animal size, compiling relevant data, and then analyzing the data in order to test their hypotheses. Some of the projects students developed were truly creative and fun to see come together. Communicating is a critical step in science yet is often lost in the science classroom. The interns submitted seven abstracts to this meeting for the youth session entitled Bright STaRS based on their research projects. To round out the experience, students also learn about the broad field of earth sciences through traditional lectures, active learning exercises, discussions of primary and secondary literature, guest speakers, lab tours and field trips (including to the UC Museum of Paleontology, Hayward fault, fossiliferous Pliocene outcrops, and tidepools). We will use a survey to assess the impact of the History of Life Internships on participant attitudes toward science and careers in science.
Big questions, big science: meeting the challenges of global ecology.
Schimel, David; Keller, Michael
2015-04-01
Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.
A Blended Professional Development Program to Help a Teacher Learn to Provide One-to-One Scaffolding
NASA Astrophysics Data System (ADS)
Belland, Brian R.; Burdo, Ryan; Gu, Jiangyue
2015-04-01
Argumentation is central to instruction centered on socio-scientific issues (Sadler & Donnelly in International Journal of Science Education, 28(12), 1463-1488, 2006. doi: 10.1080/09500690600708717). Teachers can play a big role in helping students engage in argumentation and solve authentic scientific problems. To do so, they need to learn one-to-one scaffolding—dynamic support to help students accomplish tasks that they could not complete unaided. This study explores a middle school science teacher's provision of one-to-one scaffolding during a problem-based learning unit, in which students argued about how to optimize the water quality of their local river. The blended professional development program incorporated three 1.5-h seminars, one 8-h workshop, and 4 weeks of online education activities. Data sources were video of three small groups per period, and what students typed in response to prompts from computer-based argumentation scaffolds. Results indicated that the teacher provided one-to-one scaffolding on a par with inquiry-oriented teachers described in the literature.
Georges Lemaître: The Priest Who Invented the Big Bang
NASA Astrophysics Data System (ADS)
Lambert, Dominique
This contribution gives a concise survey of Georges Lemaître's work and life, shedding some light on less-known aspects. Lemaître was a Belgian Catholic priest who in 1927 gave the first explanation of the Hubble law and who proposed in 1931 the "Primeval Atom Hypothesis", considered the first step towards Big Bang cosmology. But Lemaître's scientific work goes far beyond physical cosmology. Indeed, he also contributed to the theory of cosmic rays, to spinor theory, to analytical mechanics (regularization of the three-body problem), to numerical analysis (the Fast Fourier Transform), to computer science (he introduced and programmed the first computer of Louvain), … Lemaître took part in the "Science and Faith" debate. He defended a position that has some analogy with the NOMA principle, making a sharp distinction between what he called the "two paths to Truth" (a scientific one and a theological one). In particular, he never confused the theological concept of "creation" with the scientific notion of a "natural beginning" (the initial singularity). Lemaître was deeply rooted in his faith and sacerdotal vocation. Remaining a secular priest, he belonged to a community of priests called "The Friends of Jesus", characterized by a deep spirituality and special vows (for example, the vow of poverty). He also had an apostolic activity among Chinese students.
Entrenched Compartmentalisation and Students' Abilities and Levels of Interest in Science
ERIC Educational Resources Information Center
Billingsley, Berry; Nassaji, Mehdi; Abedin, Manzoorul
2017-01-01
This article explores the notion that asking and exploring so-called "big questions" could potentially increase the diversity and number of students who aspire to work in science and science-related careers. The focus is the premise that girls are more interested than boys in the relationships between science and other disciplines. The…
The Big Picture: Pre-Service Teachers' Perceptions of "Expert" Science Teachers
ERIC Educational Resources Information Center
McKinnon, Merryn; Perara, Sean
2015-01-01
This study adapted the Draw-A-Science-Teacher Test to compare 22 pre-service teachers' perceptions of their own strengths as science teachers against their perceived strengths of expert science teachers. The drawings identified a disconnection between theory and practice that we revisit in the literature. Our findings from this pilot study are…
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With the increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data sets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined MaMR, a programming model for material cloud applications that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data-sharing strategy to supply shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver effective performance improvements compared to previous work.
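The following plain-Python sketch illustrates the idea described above, not the MaMR framework itself: two different Map/Reduce pipelines run concurrently over shared in-memory data, and an extra merge phase combines their outputs per key. All data, job definitions, and function names are illustrative assumptions.

```python
# Hedged sketch of concurrent Map/Reduce jobs over shared data plus a merge
# phase; a conceptual illustration only, not the authors' implementation.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

shared_data = [("sampleA", 2.5), ("sampleB", 3.1), ("sampleA", 2.9)]

def run_mapreduce(map_fn, reduce_fn, data):
    """Run one in-memory map-group-reduce pass over the shared data."""
    groups = defaultdict(list)
    for record in data:
        for key, value in map_fn(record):
            groups[key].append(value)
    return {key: reduce_fn(values) for key, values in groups.items()}

# Two different Map/Reduce jobs defined over the same shared data.
mean_job = (lambda r: [(r[0], r[1])], lambda vs: sum(vs) / len(vs))
count_job = (lambda r: [(r[0], 1)], lambda vs: sum(vs))

with ThreadPoolExecutor() as pool:
    mean_result, count_result = pool.map(
        lambda job: run_mapreduce(job[0], job[1], shared_data),
        [mean_job, count_job],
    )

# Merge phase: combine the outputs of the two jobs, key by key.
merged = {k: {"mean": mean_result[k], "count": count_result[k]} for k in mean_result}
print(merged)
```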
Motivation of synthesis, with an example on groundwater quality sustainability
NASA Astrophysics Data System (ADS)
Fogg, G. E.; Labolle, E. M.
2007-12-01
Synthesis of ideas and theories from disparate disciplines is necessary for addressing the major problems faced by society. Such integration happens neither via edict nor via lofty declarations of what is needed or what is best. It happens mainly through two mechanisms: limited scope collaborations (e.g., ~2-3 investigators) in which the researchers believe deeply in their need for each other's expertise and much larger scope collaborations driven by the 'big idea.' Perhaps the strongest motivation for broad, effective synthesis is the 'big idea' that is sufficiently important and inspiring to marshal the appropriate collaborative efforts. Examples include the Manhattan Project, the quest for cancer cures, predicting effects of climate change, and groundwater quality sustainability. The latter is posed as an example of a 'big idea' that would potentially unify research efforts in both the sciences and social sciences toward a common, pressing objective.
VESPA: Developing the Planetary Science Virtual Observatory in H2020
NASA Astrophysics Data System (ADS)
Erard, S.; Cecconi, B.; Le Sidaner, P.; Capria, M. T.; Rossi, A. P.; Schmitt, B.; Andre, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Thuillot, W.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.
2015-12-01
In the framework of the Europlanet-RI program, a prototype Virtual Observatory dedicated to Planetary Science has been set up. Most of the activity was dedicated to the definition of standards to handle data in this field. The aim was to facilitate searches in big archives as well as sparse databases, to make on-line data access and visualization possible, and to allow small data providers to make their data available in an interoperable environment with minimum effort. This system makes intensive use of studies and developments led in Astronomy (IVOA), Solar Science (HELIO), and space archive services (IPDA). A general standard has been devised to handle the specific complexity of Planetary Science, e.g. in terms of measurement types and coordinate frames [1]. A procedure has been identified to install small data services, and several hands-on sessions have already been organized. A specific client (VESPA) has been developed at VO-Paris (http://vespa.obspm.fr), using a resolver for target names. Selected data can be sent to VO visualization tools such as TOPCAT or Aladin through the SAMP protocol. The Europlanet H2020 program, started in September 2015, will provide support for new data services in Europe (30 to 50 expected) and focus on the improvement of the infrastructure. Future steps will include the development of a connection between the VO world and GIS tools, and integration of heliophysics, planetary plasma and reference spectroscopic data. The Europlanet H2020 project is funded by the European Commission under the H2020 Program, grant 654208. [1] Erard et al., Astron. & Comp., 2014
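The sketch below shows one way to push a search result to a SAMP-aware tool such as TOPCAT or Aladin, in the spirit of the workflow described above. It is a minimal example using astropy's SAMP client; it assumes a SAMP hub is already running (TOPCAT starts one automatically), and both the VOTable URL and the table name are placeholders.

```python
# Minimal sketch of broadcasting a VOTable to SAMP clients (e.g. TOPCAT);
# the URL and table name below are placeholders, not real VESPA endpoints.
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient()
client.connect()  # requires a running SAMP hub
try:
    client.notify_all({
        "samp.mtype": "table.load.votable",
        "samp.params": {
            "url": "http://example.org/vespa_result.xml",  # placeholder URL
            "name": "VESPA search result",
        },
    })
finally:
    client.disconnect()
```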
ERIC Educational Resources Information Center
Jucovy, Linda; Herrera, Carla
2009-01-01
This issue of "Public/Private Ventures (P/PV) In Brief" is based on "High School Students as Mentors," a report that examined the efficacy of high school mentors using data from P/PV's large-scale random assignment impact study of Big Brothers Big Sisters school-based mentoring programs. The brief presents an overview of the findings, which…
Curriculum: Big Decisions--Making Healthy, Informed Choices about Sex
ERIC Educational Resources Information Center
Davis, Melanie
2009-01-01
Big Decisions is a 10-lesson abstinence-plus curriculum for ages 12-18 that emphasizes sex as a big decision, abstinence as the healthiest choice, and the mandate that sexually active teens use condoms and be tested for sexually transmitted diseases. This program can be implemented with limited resources and facilitator training when abstinence…
ERIC Educational Resources Information Center
National Endowment for the Arts, 2009
2009-01-01
The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, and attendance at a wide range of events all showed how…
Bringing the Tools of Big Science to Bear on Local Environmental Challenges
ERIC Educational Resources Information Center
Bronson, Scott; Jones, Keith W.; Brown, Maria
2013-01-01
We describe an interactive collaborative environmental education project that makes advanced laboratory facilities at Brookhaven National Laboratory accessible for one-year or multi-year science projects for the high school level. Cyber-enabled Environmental Science (CEES) utilizes web conferencing software to bring multi-disciplinary,…
Reflections on the Use of Tablet Technology
ERIC Educational Resources Information Center
Wise, Nicki; McGregor, Deb; Bird, James
2015-01-01
This article describes a recent Oxfordshire Big Science Event (BSE), which was combined with Science Week in Bure Park Primary School and involved a competition in which primary school children throughout Oxfordshire devised, carried out, and recorded data from science investigations to answer questions that interested them. Teams of children…
NASA EOSDIS Evolution in the BigData Era
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2015-01-01
NASA's EOSDIS system faces several challenges in the Big Data Era. Although volumes are large (but not unmanageably so), the variety of different data collections is daunting. That variety also brings with it a large and diverse user community. One key evolution EOSDIS is working toward is to enable more science analysis to be performed close to the data.
ERIC Educational Resources Information Center
Gaddis, S. Michael
2012-01-01
After 25 years of intense scrutiny, social capital remains an important yet highly debated concept in social science research. This research uses data from youth and mentors in several chapters of Big Brothers/Big Sisters to assess the importance of different mentoring relationship characteristics in creating positive outcomes among youths. The…
Air Toxics under the Big Sky: A Real-World Investigation to Engage High School Science Students
ERIC Educational Resources Information Center
Adams, Earle; Smith, Garon; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Jones, David; Henthorn, Melissa; Striebel, Jim
2008-01-01
This paper describes a problem-based chemistry education model in which students perform scientific research on a local environmentally relevant problem. The project is a collaboration among The University of Montana and local high schools centered around Missoula, Montana. "Air Toxics under the Big Sky" involves high school students in collecting…
Close Encounters of the Best Kind: The Latest Sci-Fi
ERIC Educational Resources Information Center
Kunzel, Bonnie
2008-01-01
Not only is science fiction alive and well--it's flourishing. From the big screen (howdy, Wall-E) to the big books (like Suzanne Collins's The Hunger Games, which has attracted loads of prepublication praise), 2008 has been a great year for sci-fi. Publishers have released truckloads of new sci-fi titles this year, but what's particularly…
ERIC Educational Resources Information Center
Cheng, Liang; Zhang, Wen; Wang, Jiechen; Li, Manchun; Zhong, Lishan
2014-01-01
Geographic information science (GIS) features a wide range of disciplines and has broad applicability. Challenges associated with rapidly developing GIS technology and the currently limited teaching and practice materials hinder universities from cultivating highly skilled GIS graduates. Based on the idea of "small core, big network," a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert P. Crease
2007-10-31
Robert P. Crease, historian for Brookhaven National Laboratory and Chair of the Philosophy Department at Stony Brook University, presents "How Big Science Came to Long Island: The Birth of Brookhaven Lab," covering the founding of the Laboratory, the key figures involved in starting BNL, and the many problems that had to be overcome in creating and designing its first big machines.
Big Data, Big Problems: A Healthcare Perspective.
Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M
2017-01-01
Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small-data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than it solves; in short, when it comes to the use of data in healthcare, "size isn't everything."
ERIC Educational Resources Information Center
Bayer, Amanda; Grossman, Jean Baldwin; DuBois, David L.
2013-01-01
Previous research suggests that school-based mentoring programs like those offered by Big Brothers Big Sisters of America (BBBSA) yield small but statistically significant improvements in the academic performance of mentored students and in their beliefs in their own scholastic efficacy. The present study uses data from a randomized control trial…
Visions 2025 and Linkage to NEXT
NASA Technical Reports Server (NTRS)
Wiscombe, W.; Lau, William K. M. (Technical Monitor)
2002-01-01
This talk will describe the progress to date on creating a science-driven vision for the NASA Earth Science Enterprise (ESE) in the post-2010 period. This effort began in the Fall of 2001 by organizing five science workgroups with representatives from NASA, academia and other agencies: Long-Term Climate, Medium-Term Climate, Extreme Weather, Biosphere & Ecosystems, and Solid Earth, Ice Sheets, & Sea Level. Each workgroup was directed to scope out one Big Question, including not just the science but the observational and modeling requirements, the information system requirements, and the applications and benefits to society. This first set of five Big Questions is now in hand and has been presented to the ESE Director. It includes: water resources, intraseasonal predictability, tropical cyclogenesis, invasive species, and sea level. Each of these topics will be discussed briefly. How this effort fits into the NEXT vision exercise and into Administrator O'Keefe's new vision for NASA will also be discussed.
Big Data Provenance: Challenges, State of the Art and Opportunities.
Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay
2015-01-01
Ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges that are introduced by the volume, variety and velocity of Big Data, also pose related challenges for provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data.
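As a concrete illustration of recording provenance while a workflow executes, the sketch below wraps each workflow step so that its inputs, an output fingerprint, and timing are logged; the lineage of any result can then be reconstructed from the log. This is an illustrative toy, not the authors' workflow system; all function and variable names are assumptions.

```python
# Hedged sketch of step-level provenance capture in a tiny workflow; the
# record fields and hashing choice are illustrative, not a standard schema.
import hashlib
import json
import time
from functools import wraps

provenance_log = []

def provenance_step(func):
    """Wrap a workflow step so its inputs, output hash and timing are logged."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        provenance_log.append({
            "step": func.__name__,
            "inputs": [repr(a) for a in args],
            "output_hash": hashlib.sha256(repr(result).encode()).hexdigest(),
            "started": start,
            "duration_s": time.time() - start,
        })
        return result
    return wrapper

@provenance_step
def clean(values):
    return [v for v in values if v is not None]

@provenance_step
def mean(values):
    return sum(values) / len(values)

result = mean(clean([1.0, None, 3.0]))
print(result)
print(json.dumps(provenance_log, indent=2))
```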
The Big Splat, or How Our Moon Came to Be
NASA Astrophysics Data System (ADS)
MacKenzie, Dana
2003-03-01
The first popular book to explain the dramatic theory behind the Moon's genesis. This lively science history relates one of the great recent breakthroughs in planetary astronomy: a successful theory of the birth of the Moon. Science journalist Dana Mackenzie traces the evolution of this theory, one little known outside the scientific community: a Mars-sized object collided with Earth some four billion years ago, and the remains of this colossal explosion, the Big Splat, came together to form the Moon. Beginning with notions of the Moon in ancient cosmologies, Mackenzie relates the fascinating history of lunar speculation, moving from Galileo and Kepler to George Darwin (son of Charles) and the Apollo astronauts, whose trips to the lunar surface helped solve one of the most enigmatic mysteries of the night sky: who hung the Moon? Dana Mackenzie (Santa Cruz, CA) is a freelance science journalist. His articles have appeared in such magazines as Science, Discover, American Scientist, The Sciences, and New Scientist.
NASA Astrophysics Data System (ADS)
Hellman, Leslie G.
This qualitative study uses children's writing to explore the divide between a conception of Science as a humanistic discipline reliant on creativity, ingenuity and out-of-the-box thinking, and a persistent public perception of science and scientists as rigid and methodical. Artifacts reviewed were 506 scripts written during 2014 and 2016 by 5th graders participating in an out-of-classroom, mentor-supported, free-choice 10-week arts and literacy initiative. 47% (237) of these scripts were found to contain content relating to Science, Scientists, Science Education and the Nature of Science. These 237 scripts were coded for themes; characteristics of named scientist characters were tracked and analyzed. Findings included NOS understandings being expressed by representation of Science and Engineering Practices; ingenuity being primarily linked to engineering tasks; common portrayals of science as magical or scientists as villains; and a persistence in negative stereotypes of scientists, including a lack of gender equity amongst the named scientist characters. Findings suggest that representations of scientists in popular culture highly influence the portrayals of scientists constructed by the students. Recommendations to teachers include encouraging explicit consideration of big-picture NOS concepts such as ethics during elementary school and encouraging the replacement of documentary or educational shows with more engaging fictional media.
Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III
2015-04-30
It is a supervised learning method but best for Big Data with low dimensions. It is an approximate inference good for Big Data and Hadoop ... Each process produces large amounts of information (Big Data). There is a critical need for automation, validation, and discovery to help acquisition ... can inform managers where areas might have higher program risk and how resource and big data management might affect the desired return on investment.
Applying science and mathematics to big data for smarter buildings.
Lee, Young M; An, Lianjun; Liu, Fei; Horesh, Raya; Chae, Young Tae; Zhang, Rui
2013-08-01
Many buildings are now collecting a large amount of data on operations, energy consumption, and activities through systems such as a building management system (BMS), sensors, and meters (e.g., submeters and smart meters). However, the majority of data are not utilized and are thrown away. Science and mathematics can play an important role in utilizing these big data and accurately assessing how energy is consumed in buildings and what can be done to save energy, make buildings energy efficient, and reduce greenhouse gas (GHG) emissions. This paper discusses an analytical tool that has been developed to assist building owners, facility managers, operators, and tenants of buildings in assessing, benchmarking, diagnosing, tracking, forecasting, and simulating energy consumption in building portfolios. © 2013 New York Academy of Sciences.
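To make one of the analytical steps described above concrete, the sketch below fits a simple weather-dependent baseline to metered consumption and flags days that deviate from the expected load. The numbers are synthetic and the quadratic baseline is one common, illustrative choice; this is not the paper's actual tool or model.

```python
# Hedged sketch of baseline fitting and anomaly flagging for building energy
# data; all values are synthetic and the model choice is illustrative.
import numpy as np

outdoor_temp_c = np.array([2, 5, 9, 14, 18, 22, 26, 29])       # daily mean temp
energy_kwh     = np.array([410, 380, 330, 300, 310, 350, 400, 440])  # daily use

# A quadratic baseline captures both heating (cold) and cooling (hot) loads.
coeffs = np.polyfit(outdoor_temp_c, energy_kwh, deg=2)
baseline = np.polyval(coeffs, outdoor_temp_c)

# Flag days whose consumption deviates strongly from the weather baseline.
residual = energy_kwh - baseline
threshold = 2 * residual.std()
anomalous_days = np.where(np.abs(residual) > threshold)[0]

print("expected load:", baseline.round(1))
print("days deviating from baseline:", anomalous_days)
```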
ERIC Educational Resources Information Center
Buche, Fred; Cox, Charles
A competency-based automotive mechanics curriculum was developed at Big Bend Community College (Washington) in order to provide the basis for an advanced placement procedure for high school graduates and experienced adults through a competency assessment. In order to create the curriculum, Big Bend Community College automotive mechanics…
ERIC Educational Resources Information Center
Eriba, Joel O.; Ande, Sesugh
2006-01-01
Gender inequality in science achievement among senior secondary school students has persisted for years the world over. It is observed that males score higher than females in science and science-related examinations. This has created a big psychological alienation or depression in the minds of female students towards science and…
The Big Science Questions About Mercury's Ice-Bearing Polar Deposits After MESSENGER
NASA Astrophysics Data System (ADS)
Chabot, N. L.; Lawrence, D. J.
2018-05-01
Mercury’s polar deposits provide many well-characterized locations that are known to have large expanses of exposed water ice and/or other volatile materials — presenting unique opportunities to address fundamental science questions.
IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.
Chen, Ying; Elenee Argentinis, J D; Weber, Griff
2016-04-01
Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Challenges and Potential Solutions for Big Data Implementations in Developing Countries
Mayan, J.C; García, M.J.; Almerares, A.A.; Househ, M.
2014-01-01
Background: The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. Objectives: To describe the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. Methods: A non-systematic review of the literature was performed in PubMed and Google Scholar. The following keywords were used: “big data”, “developing countries”, “data mining”, “health information systems”, and “computing methodologies”. A thematic review of selected articles was performed. Results: There are challenges when implementing any Big Data program, including exponential growth of data, special infrastructure needs, the need for a trained workforce, the need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure their adoption. Developing countries have particular characteristics that hinder further development of these projects. Conclusions: The advent of Big Data promises great opportunities for the healthcare field. In this article, we attempt to describe the challenges developing countries would face and enumerate the options to be used to achieve successful implementations of Big Data programs. PMID:25123719
What’s So Different about Big Data? A Primer for Clinicians Trained to Think Epidemiologically
Liu, Vincent
2014-01-01
The Big Data movement in computer science has brought dramatic changes in what counts as data, how those data are analyzed, and what can be done with those data. Although increasingly pervasive in the business world, it has only recently begun to influence clinical research and practice. As Big Data draws from different intellectual traditions than clinical epidemiology, the ideas may be less familiar to practicing clinicians. There is an increasing role of Big Data in health care, and it has tremendous potential. This Demystifying Data Seminar identifies four main strands in Big Data relevant to health care. The first is the inclusion of many new kinds of data elements into clinical research and operations, in a volume not previously routinely used. Second, Big Data asks different kinds of questions of data and emphasizes the usefulness of analyses that are explicitly associational but not causal. Third, Big Data brings new analytic approaches to bear on these questions. And fourth, Big Data embodies a new set of aspirations for a breaking down of distinctions between research data and operational data and their merging into a continuously learning health system. PMID:25102315
What's so different about big data?. A primer for clinicians trained to think epidemiologically.
Iwashyna, Theodore J; Liu, Vincent
2014-09-01
The Big Data movement in computer science has brought dramatic changes in what counts as data, how those data are analyzed, and what can be done with those data. Although increasingly pervasive in the business world, it has only recently begun to influence clinical research and practice. As Big Data draws from different intellectual traditions than clinical epidemiology, the ideas may be less familiar to practicing clinicians. There is an increasing role of Big Data in health care, and it has tremendous potential. This Demystifying Data Seminar identifies four main strands in Big Data relevant to health care. The first is the inclusion of many new kinds of data elements into clinical research and operations, in a volume not previously routinely used. Second, Big Data asks different kinds of questions of data and emphasizes the usefulness of analyses that are explicitly associational but not causal. Third, Big Data brings new analytic approaches to bear on these questions. And fourth, Big Data embodies a new set of aspirations for a breaking down of distinctions between research data and operational data and their merging into a continuously learning health system.
[Adherence and fidelity in patients treated with intragastric balloon].
Mazure, R A; Cancer, E; Martínez Olmos, M A; De Castro, M L; Abilés, V; Abilés, J; Bretón, I; Álvarez, V; Peláez, N; Culebras, J M
2014-01-01
A correct treatment of obesity requires a program of habit modification regardless of the technique selected, especially if it is minimally invasive, as is the intragastric balloon (BIG). Adherence of obese patients to recommended drug measures over the medium and long term is less than 50%. Given that the results obtained with the gastric balloon technique are necessarily influenced by adherence to the habit-modification program and its fulfillment, we reviewed the published series with attention to the programs proposed alongside the BIG. The series published to date provide few details about the therapeutic programs used or patients' adherence to them, and even less about the monitoring plan and patients' fidelity to it. We conclude that it would be advisable to agree on a follow-up strategy, at least for the 6 months during which the BIG remains in the stomach.
ERIC Educational Resources Information Center
Wang, Ze
2015-01-01
Using data from the Trends in International Mathematics and Science Study (TIMSS) 2007, this study examined the big-fish-little-pond-effects (BFLPEs) in 49 countries. In this study, the effect of math ability on math self-concept was decomposed into a within- and a between-level components using implicit mean centring and the complex data…
Neutronics Analysis of Water-Cooled Ceramic Breeder Blanket for CFETR
NASA Astrophysics Data System (ADS)
Zhu, Qingjun; Li, Jia; Liu, Songlin
2016-07-01
In order to investigate the nuclear response of the water-cooled ceramic breeder (WCCB) blanket models for CFETR, a detailed 3D neutronics model of a 22.5° torus sector was developed based on the integrated geometry of CFETR, including heterogeneous WCCB blanket models, shield, divertor, vacuum vessel, toroidal and poloidal magnets, and ports. Using the Monte Carlo N-Particle transport code MCNP5 and the IAEA Fusion Evaluated Nuclear Data Library FENDL2.1, the neutronics analyses were performed. The neutron wall loading, tritium breeding ratio (TBR), nuclear heating, neutron-induced atomic displacement damage, and gas production were determined. The results indicate that achieving a global TBR of no less than 1.2 will be a big challenge for the water-cooled ceramic breeder blanket for CFETR. Supported by the National Magnetic Confinement Fusion Science Program of China (Nos. 2013GB108004, 2014GB122000, and 2014GB119000), and the National Natural Science Foundation of China (No. 11175207)
KNMI DataLab experiences in serving data-driven innovations
NASA Astrophysics Data System (ADS)
Noteboom, Jan Willem; Sluiter, Raymond
2016-04-01
Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data from traditional sources (such as observation networks, radars and satellites), the use of open data, crowd sourced data and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning and data visualization techniques are playing an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, experimented and/or piloted by the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis and visualization. In addition, Data Science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, gained experiences and best practices are presented in operating the KNMI DataLab to serve data-driven innovations for weather and climate applications optimally.
ERIC Educational Resources Information Center
Ramsey, Jeffrey T.
2014-01-01
Signed into law in 1972, Title IX of the Education Amendments was designed to eliminate gender discrimination throughout the American educational system. Title IX applied to all educational programs at any level of schooling including admissions, financial aid, academic programs, and social organizations. However, Title IX has primarily been…
The Next Big Thing - Eric Haseltine
Eric Haseltine
2017-12-09
Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.
Diverse Grains in Mars Sandstone Target Big Arm
2015-07-01
This view of a sandstone target called "Big Arm" covers an area about 1.3 inches (33 millimeters) wide in detail that shows differing shapes and colors of sand grains in the stone. Three separate images taken by the Mars Hand Lens Imager (MAHLI) camera on NASA's Curiosity Mars rover, at different focus settings, were combined into this focus-merge view. The Big Arm target on lower Mount Sharp is at a location near "Marias Pass" where a mudstone bedrock is in contact with overlying sandstone bedrock. MAHLI recorded the component images on May 29, 2015, during the 999th Martian day, or sol, of Curiosity's work on Mars. The rounded shape of some grains visible here suggests they traveled long distances before becoming part of the sediment that later hardened into sandstone. Other grains are more angular and may have originated closer to the rock's current location. Lighter and darker grains may have different compositions. MAHLI was built by Malin Space Science Systems, San Diego. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA19677
The Human Genome Project: big science transforms biology and medicine
2013-01-01
The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project. PMID:24040834
Assessment of Provisional MODIS-derived Surfaces Related to the Global Carbon Cycle
NASA Astrophysics Data System (ADS)
Cohen, W. B.; Maiersperger, T. K.; Turner, D. P.; Gower, S. T.; Kennedy, R. E.; Running, S. W.
2002-12-01
The global carbon cycle is one of the most important foci of an emerging global biosphere monitoring system. A key component of such a system is the MODIS sensor, onboard the Terra satellite platform. Biosphere monitoring requires an integrated program of satellite observations, Earth-system models, and in situ data. Related to the carbon cycle, MODIS science teams routinely develop a variety of global surfaces such as land cover, leaf area index, and net primary production using MODIS data and functional algorithms. The quality of these surfaces must be evaluated to determine their effectiveness for global biosphere monitoring. A project called BigFoot (http://www.fsl.orst.edu/larse/bigfoot/) is an organized effort across nine biomes to assess the quality of the abovementioned surfaces: (1) Arctic tundra; (2) boreal evergreen needle-leaved forest; temperate (3) cropland, (4) grassland, (5) evergreen needle-leaved forest, and (6) deciduous broad-leaved forest; desert (7) grassland and (8) shrubland; and (9) tropical evergreen broad-leaved forest. Each biome is represented by a site that has an eddy-covariance flux tower that measures water vapor and CO2 fluxes. Flux tower footprints are relatively small, approximately 1 km2. BigFoot characterizes 25 km2 around each tower, using field data, Landsat ETM+ image data, and ecosystem process models. Our innovative field sampling design incorporates a nested spatial series to facilitate geostatistical analyses, samples the ecological variability at a site, and is logistically efficient. Field data are used both to develop site-specific algorithms for mapping/modeling the variables of interest and to characterize the errors in derived BigFoot surfaces. Direct comparisons of BigFoot- and MODIS-derived surfaces are made to help understand the sources of error in MODIS-derived surfaces and to facilitate improvements to MODIS algorithms. Results from four BigFoot sites will be presented.
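The sketch below illustrates, with synthetic numbers, the kind of direct surface comparison described above: a fine-resolution validation surface is block-averaged to a coarser satellite grid and summarized with simple agreement statistics. The grid sizes, values, and metrics are illustrative assumptions, not BigFoot's actual procedure or data.

```python
# Hedged sketch of comparing a fine-resolution validation surface against a
# coarser satellite-derived surface; all numbers and grid sizes are synthetic.
import numpy as np

rng = np.random.default_rng(1)
fine = rng.gamma(shape=2.0, scale=1.5, size=(40, 40))      # fine-grid surface
modis_like = np.full((5, 5), fine.mean()) + rng.normal(0, 0.3, (5, 5))

# Aggregate the fine surface by block-averaging 8x8 windows to the 5x5 grid.
fine_agg = fine.reshape(5, 8, 5, 8).mean(axis=(1, 3))

bias = float((modis_like - fine_agg).mean())
rmse = float(np.sqrt(((modis_like - fine_agg) ** 2).mean()))
print(f"bias = {bias:.2f}, RMSE = {rmse:.2f}")
```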
NASA Astrophysics Data System (ADS)
Heim, N. A.; Saltzman, J.; Payne, J.
2014-12-01
The chasm between classroom science and scientific research is bridged in the History of Life Internships at Stanford University. The primary foci of the internships are collection of new scientific data and original scientific research. While traditional high school science courses focus on learning content and laboratory skills, students are rarely engaged in real scientific research. Even in experiential learning environments, students investigate phenomena with known outcomes under idealized conditions. In the History of Life Internships, high school youth worked full time during the summers of 2013 and 2014 to collect body size data on fossil Echinoderms and Ostracods, measuring more than 20,000 species in total. These data are contributed to the larger research efforts in the Stanford Paleobiology Lab, but they also serve as a source of data for interns to conduct their own scientific research. Over the course of eight weeks, interns learn about previous research on body size evolution, collect data, develop their own hypotheses, test their hypotheses, and communicate their results to their peers and the larger scientific community: the 2014 interns have submitted eight abstracts to this meeting for the youth session entitled Bright STaRS where they will present their research findings. Based on a post-internship survey, students in the 2013 History of Life cohort had more positive attitudes towards science and had a better understanding of how to conduct scientific research compared to interns in the Earth Sciences General Internship Program, where interns typically do not complete their own research project from start to finish. In 2014, we implemented both pre- and post-internship surveys to determine if these positive attitudes were developed over the course of the internship. Conducting novel research inspires both the students and instructors. Scientific data collection often involves many hours of repetitive work, but answering big questions typically requires big datasets. Our team of 20 used calipers and data-rich compendia of fossil species to collect copious amounts of data. Our interns experienced the joys, frustrations, tedium and excitement of being scientists and discovering something new about the natural world for the first time.
Survey of Cyber Crime in Big Data
NASA Astrophysics Data System (ADS)
Rajeswari, C.; Soni, Krishna; Tandon, Rajat
2017-11-01
Big data involves performing computation and database operations over large amounts of data drawn automatically from the data owner's business. Since a critical strategic advantage of big data is access to information from numerous and diverse areas, security and privacy will play an essential part in big data research and innovation. The limits of standard IT security practices are well known: software deployment channels and software developers can be used to embed malicious code in applications and operating systems, a real and growing risk that is difficult to counter, and one whose impact spreads faster than big data itself. A central question, therefore, is whether existing security and privacy technologies are sufficient to provide controlled access assurance for very large numbers of direct accesses. For effective use of large-scale data, access to data in one domain from another domain must be properly authorized. Trustworthy system development has long built up a rich set of proven security concepts for dealing with determined adversaries, but this work has largely been dismissed by vendors as "needless excess." In this survey, we discuss how big data can take advantage of this mature security and privacy technology and explore the remaining research challenges.
ERIC Educational Resources Information Center
Seaton, Marjorie; Marsh, Herbert W.; Yeung, Alexander Seeshing; Craven, Rhonda
2011-01-01
Big-fish-little-pond effect (BFLPE) research has demonstrated that academic self-concept is negatively affected by attending high-ability schools. This article examines data from large, representative samples of 15-year-olds from each Australian state, based on the three Program for International Student Assessment (PISA) databases that focus on…
Economics and econophysics in the era of Big Data
NASA Astrophysics Data System (ADS)
Cheong, Siew Ann
2016-12-01
There is an undeniable disconnect between theory-heavy economics and the real world, and some cross-pollination of ideas with econophysics, which is more balanced between data and models, might help economics along the way to becoming a truly scientific enterprise. With the coming of the era of Big Data, this transformation of economics into a data-driven science is becoming more urgent. In this article, I use the story of Kepler's discovery of his three laws of planetary motion to enlarge the framework of the scientific approach, from one that focuses on experimental sciences, to one that accommodates observational sciences, and further to one that embraces data mining and machine learning. I distinguish between the ontological values of Kepler's Laws vis-a-vis Newton's Laws, and argue that the latter are more fundamental because they are able to explain the former. I then argue that the fundamental laws of economics lie not in mathematical equations, but in models of adaptive economic agents. With this shift in mindset, it becomes possible to think about how interactions between agents can lead to the emergence of multiple stable states and critical transitions, and about complex adaptive policies and regulations that might actually work in the real world. Finally, I discuss how Big Data, exploratory agent-based modeling, and predictive agent-based modeling can come together in a unified framework to make economics a true science.
None
2017-12-09
Ceremony for the 25th anniversary of CERN, with two speakers: Prof. Weisskopf speaks on the significance and role of CERN, and Prof. Casimir(?) gives a talk on the relations between pure science, applied science, and "big science" ("science légère").
Using a Very Big Rocket to take Very Small Satellites to Very Far Places
NASA Technical Reports Server (NTRS)
Cohen, Barbara
2017-01-01
Planetary science cubesats are being built. InSight (2018) will carry 2 cubesats to provide communication links to Mars. EM-1 (2019) will carry 13 cubesat-class missions to further smallsat science and exploration capabilities. Planetary science cubesats have more in common with large planetary science missions than with LEO cubesats: teams need to work closely with people who have deep-space mission experience.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science and Technology.
These hearings on international cooperation in science focused on three issues: (1) international cooperation in big science; (2) the impact of international cooperation on research priorities; and (3) coordination in management of international cooperative research. Witnesses presenting testimony and/or prepared statements were: Victor Weisskopf;…
ERIC Educational Resources Information Center
Wheeler, Marc E.; Keller, Thomas E.; DuBois, David L.
2010-01-01
Between 2007 and 2009, reports were released on the results of three separate large-scale random assignment studies of the effectiveness of school-based mentoring programs for youth. The studies evaluated programs implemented by Big Brothers Big Sisters of America (BBBSA) affiliates (Herrera et al., 2007), Communities In Schools of San Antonio,…
One year on VESPA, a community-driven Virtual Observatory in Planetary Science
NASA Astrophysics Data System (ADS)
Erard, S.; Cecconi, B.; Le Sidaner, P.; Rossi, A. P.; Capria, M. T.; Schmitt, B.; Andre, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Thuillot, W.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.
2016-12-01
The Europlanet H2020 program started on 1/9/2015 for 4 years. It includes an activity, called VESPA, to adapt Virtual Observatory (VO) techniques to Planetary Science data. The objective is to facilitate searches in big archives as well as sparse databases, to provide simple data access and on-line visualization, and to allow small data providers to make their data available in an interoperable environment with minimum effort. The VESPA system, based on a prototype developed in a previous program [1], has been hugely improved during the first year of Europlanet H2020: the infrastructure has been upgraded to describe data in many fields more accurately; the main user search interface (http://vespa.obspm.fr) has been redesigned to provide more flexibility; alternative ways to access Planetary Science data services from VO tools are being implemented in addition to receiving data from the main interface; VO tools are being improved to handle specificities of Solar System data, e.g. measurements in reflected light, coordinate systems, etc. Existing data services have been updated, and new ones have been designed. The global objective (50 data services) has already been exceeded, with 54 services open or being finalized. A procedure to install data services has been documented, and hands-on sessions are organized twice a year at EGU and EPSC; this is intended to favour the installation of services by individual research teams, e.g. to distribute derived data related to a published study. In addition, regular discussions are held with big data providers, starting with space agencies (IPDA). Common projects with ESA and NASA's PDS have been engaged, which should lead to a connection between PDS4 and EPN-TAP. In parallel, the creation of a Solar System Interest Group has been decided in the IVOA; the goal here is to adapt existing astronomy standards to Planetary Science. Future steps will include the development of a connection between the VO world and GIS tools, and integration of Heliophysics, planetary plasma and mineral spectroscopy data. The Europlanet 2020 Research Infrastructure project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208. [1] Erard et al 2014, Astronomy & Computing 7-8, 71-80. http://arxiv.org/abs/1407.4886
Progress on VESPA, a community-driven Virtual Observatory in Planetary Science
NASA Astrophysics Data System (ADS)
Erard, S.; Cecconi, B.; Le Sidaner, P.; Rossi, A. P.; Capria, M. T.; Schmitt, B.; Genot, V. N.; André, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Carry, B.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.; Fernique, P.
2017-12-01
The Europlanet H2020 program started on 1/9/2015 for 4 years. It includes an activity, called VESPA, to adapt Virtual Observatory (VO) techniques to Planetary Science data. The objective is to facilitate searches in big archives as well as sparse databases, to provide simple data access and on-line visualization, and to allow small data providers to make their data available in an interoperable environment with minimum effort. The VESPA system, based on a prototype developed in a previous program [1], has been hugely improved during the first two years of Europlanet H2020: the infrastructure has been upgraded to describe data in many fields more accurately; the main user search interface (http://vespa.obspm.fr) has been redesigned to provide more flexibility; alternative ways to access Planetary Science data services from VO tools have been implemented; VO tools are being improved to handle specificities of Solar System data, e.g. measurements in reflected light, coordinate systems, etc. Current steps include the development of a connection between the VO world and GIS tools, and integration of Heliophysics, planetary plasmas, and mineral spectroscopy data to support the analysis of observations. Existing data services have been updated, and new ones have been designed. The global objective has already been exceeded, with 34 services open and 20 more being finalized. A procedure to install data services has been documented, and hands-on sessions are organized twice a year at EGU and EPSC; this is intended to favour the installation of services by individual research teams, e.g. to distribute derived data related to a published study. In addition, regular discussions are held with big data providers, starting with space agencies (IPDA). Common projects with ESA and NASA's PDS have been engaged, with the goal of connecting PDS4 and EPN-TAP. In parallel, a Solar System Interest Group has just been started in the IVOA; the goal here is to adapt existing astronomy standards to Planetary Science. The Europlanet 2020 Research Infrastructure project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208. [1] Erard et al 2014, Astronomy & Computing 7-8, 71-80. http://arxiv.org/abs/1407.4886
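As an illustration of how the EPN-TAP protocol mentioned in the two VESPA abstracts above is typically used, the following Python sketch queries a VESPA-compliant service with the pyvo library. The endpoint URL is a placeholder rather than a real service address, and in practice the epn_core table may be schema-qualified; the column names follow the standard EPN-TAP view.

```python
# Hedged sketch: querying a hypothetical EPN-TAP (VESPA) service with pyvo.
# The endpoint URL below is a placeholder; actual services are discoverable
# through the main VESPA portal (http://vespa.obspm.fr).
import pyvo

service = pyvo.dal.TAPService("http://example-vespa-service.org/tap")

# EPN-TAP exposes each dataset through a standard 'epn_core' view (possibly
# schema-qualified), so the same ADQL pattern works across compliant services.
query = """
    SELECT granule_uid, target_name, dataproduct_type, access_url
    FROM epn_core
    WHERE target_name = 'Mars'
"""
results = service.search(query)
for row in results:
    print(row["granule_uid"], row["access_url"])
```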
Meeting at the Museum: Sustained Research Education Partnerships Start in Your Own Back Yard
NASA Astrophysics Data System (ADS)
Morin, P. J.; Hamilton, P.; Campbell, K. M.
2007-12-01
The Science Museum of Minnesota (SMM) and the National Center for Earth-surface Dynamics (NCED) have been formal partners since 2002, when we jointly secured NSF center-level funding. We began in our local community by together creating our own "Big Back Yard", a 1.75 acre outdoor park in which museum visitors, teachers and students explore natural and engineered river systems through miniature golf and interactive exhibits. We went on to jointly design "Earthscapes" programming for students, teachers and graduate students, related directly or indirectly to the park. From there, our partnership led to a major new exhibition that begins touring nationally and around the world in late 2007. A current effort seeks to bring NCED and SMM together with five other geo-science-oriented, NSF-supported Science and Technology Centers (STCs) from around the United States to develop collaborative means by which the research and science of all six STCs can reach larger informal science education audiences. We have learned a lot along the way about how museums can help individual researchers and research teams most effectively reach formal and informal audiences. Successful partnerships require significant joint commitment and funding, dedicated staff, and meaningful formative and summative evaluation. For a research center or an individual researcher, partnering with a museum provides experience, expertise, infrastructure, collegial relationships and community visibility that significantly enhance what the academy alone can offer. For a museum, one successful and highly visible research collaboration opens many new doors in the research community, providing new opportunities to broaden and deepen the scientific content of exhibits and programming.
ERIC Educational Resources Information Center
Nagengast, Benjamin; Marsh, Herbert W.
2012-01-01
Being schooled with other high-achieving peers has a detrimental influence on students' self-perceptions: School-average and class-average achievement have a negative effect on academic self-concept and career aspirations--the big-fish-little-pond effect. Individual achievement, on the other hand, predicts academic self-concept and career…
ERIC Educational Resources Information Center
Stander, Julian; Dalla Valle, Luciana
2017-01-01
We discuss the learning goals, content, and delivery of a University of Plymouth intensive module delivered over four weeks entitled MATH1608PP Understanding Big Data from Social Networks, aimed at introducing students to a broad range of techniques used in modern Data Science. This module made use of R, accessed through RStudio, and some popular…
Daniel J. Murphy; Laurie Yung; Carina Wyborn; Daniel R. Williams
2017-01-01
This paper critically examines the temporal and spatial dynamics of adaptation in climate change science and explores how dynamic notions of 'place' elucidate novel ways of understanding community vulnerability and adaptation. Using data gathered from a narrative scenario-building process carried out among communities of the Big Hole Valley in Montana, the...
Enabling Analytics in the Cloud for Earth Science Data
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Christopher; Bingham, Andrew W.; Quam, Brandi M.
2018-01-01
The purpose of this workshop was to hold interactive discussions where providers, users, and other stakeholders could explore the convergence of three main elements in the rapidly developing world of technology: Big Data, Cloud Computing, and Analytics, [for earth science data].
The nonequilibrium quantum many-body problem as a paradigm for extreme data science
NASA Astrophysics Data System (ADS)
Freericks, J. K.; Nikolić, B. K.; Frieder, O.
2014-12-01
Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can be effectively thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve more accurately and for longer times. We review a number of these different ideas here.
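To make the scale argument in the abstract above concrete, the short calculation below estimates the memory needed just to store a single many-body state vector, assuming a spin-1/2 lattice (Hilbert-space dimension 2^N) and complex double-precision amplitudes; the site counts are purely illustrative.

```python
# Back-of-envelope estimate of why the lattice many-body problem is an
# "extreme data" problem: for a spin-1/2 lattice the Hilbert space has
# dimension 2**N, so even one state vector quickly outgrows any computer.
def state_vector_bytes(n_sites: int, bytes_per_amplitude: int = 16) -> int:
    """Memory for one wavefunction (complex double = 16 bytes/amplitude)."""
    return (2 ** n_sites) * bytes_per_amplitude

for n in (20, 40, 60):
    print(f"{n} sites: {state_vector_bytes(n) / 1e9:.3g} GB")
# ~0.017 GB at 20 sites, ~1.8e4 GB (about 18 TB) at 40 sites,
# and ~1.8e10 GB at 60 sites, far beyond any existing machine.
```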
Exponential Growth and the Shifting Global Center of Gravity of Science Production, 1900-2011
ERIC Educational Resources Information Center
Zhang, Liang; Powell, Justin J. W.; Baker, David P.
2015-01-01
Long historical trends in scientific discovery led mid-20th century scientometricians to mark the advent of "big science"--extensive science production--and predicted that over the next few decades, the exponential growth would slow, resulting in lower rates of increase in production at the upper limit of a logistic curve. They were…
ERIC Educational Resources Information Center
Goldacre, Ben
2007-01-01
In this article, the author talks about pseudoscientific quack, or a big science story in a national newspaper and explains why science in the media is so often pointless, simplistic, boring, or just plain wrong. It is the author's hypothesis that in their choice of stories, and the way they cover them, the media create a parody of science, for…
Thinking, Doing, Talking Science: Evaluation Report and Executive Summary
ERIC Educational Resources Information Center
Hanley, Pam; Slavin, Robert; Elliott, Louise
2015-01-01
Thinking, Doing, Talking Science (TDTS) is a programme that aims to make science lessons in primary schools more practical, creative and challenging. Teachers are trained in a repertoire of strategies that aim to encourage pupils to use higher order thinking skills. For example, pupils are posed 'Big Questions,' such as 'How do you know that the…
ERIC Educational Resources Information Center
Castejon, Juan Luis; Cantero, Ma. Pilar; Perez, Nelida
2008-01-01
Introduction: The main objective of this paper is to establish a profile of socio-emotional competencies characteristic of a sample of students from each of the big academic areas in higher education: legal sciences, social sciences, education, humanities, science and technology, and health. An additional objective was to analyse differences…
ERIC Educational Resources Information Center
Taylor, Amy; Jones, Gail
2009-01-01
The "National Science Education Standards" emphasise teaching unifying concepts and processes such as basic functions of living organisms, the living environment, and scale. Scale influences science processes and phenomena across the domains. One of the big ideas of scale is that of surface area to volume. This study explored whether or not there…
Review of the National Research Council's Framework for K-12 Science Education
ERIC Educational Resources Information Center
Gross, Paul R.
2011-01-01
The new "Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" is a big, comprehensive volume, carefully organized and heavily documented. It is the long-awaited product of the Committee on a Conceptual Framework for New K-12 Science Education Standards. As noted, it is a weighty document (more than 300…
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
BigDataScript: a scripting language for data pipelines
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778
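The "lazy processing" idea described in the two BigDataScript abstracts above can be illustrated with a short sketch. The example below is written in Python rather than BDS syntax and uses made-up file names; it simply re-runs a pipeline step only when its outputs are missing or older than its inputs, which is the mechanism that lets a pipeline resume after a failure instead of recomputing everything.

```python
# Illustration (Python, not BDS syntax) of dependency-driven lazy processing:
# a step runs only if its outputs are missing or older than its inputs.
# File names and shell commands are hypothetical.
import os
import subprocess

def task(outputs, inputs, cmd):
    """Run `cmd` unless every output exists and is newer than every input."""
    have_all = all(os.path.exists(o) for o in outputs)
    newest_in = max((os.path.getmtime(i) for i in inputs if os.path.exists(i)),
                    default=0.0)
    oldest_out = (min(os.path.getmtime(o) for o in outputs)
                  if have_all else -1.0)
    if have_all and oldest_out >= newest_in:
        print(f"skip (up to date): {outputs}")
        return
    subprocess.run(cmd, shell=True, check=True)

# Toy two-step pipeline: compress a file, then checksum the archive.
if not os.path.exists("sample.txt"):
    with open("sample.txt", "w") as f:      # create a toy input once
        f.write("hello pipeline\n")

task(["sample.txt.gz"], ["sample.txt"], "gzip -kf sample.txt")
task(["sample.txt.gz.md5"], ["sample.txt.gz"],
     "md5sum sample.txt.gz > sample.txt.gz.md5")
```

On a second run with unchanged inputs, both steps are skipped, mirroring the recovery-from-error behaviour the abstract attributes to absolute serialization and lazy processing.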
Engaging Scientists in Meaningful E/PO: The Universe Discovery Guides
NASA Astrophysics Data System (ADS)
Meinke, B. K.; Lawton, B.; Gurton, S.; Smith, D. A.; Manning, J. G.
2014-12-01
For the 2009 International Year of Astronomy, the then-existing NASA Origins Forum collaborated with the Astronomical Society of the Pacific (ASP) to create a series of monthly "Discovery Guides" for informal educator and amateur astronomer use in educating the public about featured sky objects and associated NASA science themes. Today's NASA Astrophysics Science Education and Public Outreach Forum (SEPOF), one of a new generation of forums coordinating the work of NASA Science Mission Directorate (SMD) EPO efforts—in collaboration with the ASP and NASA SMD missions and programs--has adapted the Discovery Guides into "evergreen" educational resources suitable for a variety of audiences. The Guides focus on "deep sky" objects and astrophysics themes (stars and stellar evolution, galaxies and the universe, and exoplanets), showcasing EPO resources from more than 30 NASA astrophysics missions and programs in a coordinated and cohesive "big picture" approach across the electromagnetic spectrum, grounded in best practices to best serve the needs of the target audiences. Each monthly guide features a theme and a representative object well-placed for viewing, with an accompanying interpretive story, finding charts, strategies for conveying the topics, and complementary supporting NASA-approved education activities and background information from a spectrum of NASA missions and programs. The Universe Discovery Guides are downloadable from the NASA Night Sky Network web site at nightsky.jpl.nasa.gov. We will share the Forum-led Collaborative's experience in developing the guides, how they place individual science discoveries and learning resources into context for audiences, and how the Guides can be readily used in scientist public outreach efforts, in college and university introductory astronomy classes, and in other engagements between scientists, students and the public.
Introducing Public Libraries to The Big Read: Final Report on the Audio Guide Distribution
ERIC Educational Resources Information Center
Sloan, Kay; Randall, Michelle
2009-01-01
In July 2008, over 14,000 public libraries throughout the U.S. received, free of charge, a set of fourteen Audio Guides introducing them to The Big Read. Since 2007, when the National Endowment for the Arts and the Institute of Museum and Library Services, in partnership with Arts Midwest, debuted The Big Read, the program has awarded grants to…
NASA Astrophysics Data System (ADS)
Westfall, Catherine
2018-03-01
This is the second in a three-part article describing the development of the Thomas Jefferson National Accelerator Facility's experimental program, from the first dreams of incisive electromagnetic probes into the structure of the nucleus through the era in which equipment was designed and constructed and a program crafted so that the long-desired experiments could begin. These developments unfolded against the backdrop of the rise of the more bureaucratic New Big Science and the intellectual tumult that grew from increasing understanding and interest in quark-level physics. Part 2, presented here, focuses on the period from 1986 to 1990. During this period of revolutionary change, laboratory personnel, potential users, and DOE officials labored to proceed from the 1986 laboratory design report, which included detailed accelerator plans and very preliminary experimental equipment sketches, to an approved 1990 experimental equipment conceptual design report, which provided designs complete enough for the onset of experimental equipment construction.
NASA Strategic Roadmap: Origin, Evolution, Structure, and Destiny of the Universe
NASA Technical Reports Server (NTRS)
White, Nicholas E.
2005-01-01
The NASA strategic roadmap on the Origin, Evolution, Structure and Destiny of the Universe is one of 13 roadmaps that outline NASA's approach to implement the vision for space exploration. The roadmap outlines a program to address the questions: What powered the Big Bang? What happens close to a Black Hole? What is Dark Energy? How did the infant universe grow into the galaxies, stars and planets, and set the stage for life? The roadmap builds upon the currently operating and successful missions such as HST, Chandra and Spitzer. The program contains two elements, Beyond Einstein and Pathways to Life, performed in three phases (2005-2015, 2015-2025 and >2025) with priorities set by inputs received from reviews undertaken by the National Academy of Sciences and technology readiness. The program includes the following missions: 2005-2015 GLAST, JWST and LISA; 2015-2025 Constellation-X and a series of Einstein Probes; and >2025 a number of ambitious vision missions which will be prioritized by results from the previous two phases.
Dougherty, Edward R.; Highfield, Roger R.
2016-01-01
The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035
Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R
2016-11-13
The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.
Airport Revenues: McMahon-Wrinkle Airpark; Big Spring, Texas
DOT National Transportation Integrated Search
1997-11-21
Audit objectives were to determine whether the city of Big Spring, Texas (city), was in compliance with the Federal Aviation Administration (FAA) Airport Improvement Program grant assurances to ensure (i) fee and rental structures were maintained...
Big Questions: Missing Antimatter
Lincoln, Don
2018-06-08
Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter, and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he does not answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.
Game-XP: Action Games as Experimental Paradigms for Cognitive Science.
Gray, Wayne D
2017-04-01
Why games? How could anyone consider action games an experimental paradigm for Cognitive Science? In 1973, as one of three strategies he proposed for advancing Cognitive Science, Allen Newell exhorted us to "accept a single complex task and do all of it." More specifically, he told us that rather than taking an "experimental psychology as usual approach," we should "focus on a series of experimental and theoretical studies around a single complex task" so as to demonstrate that our theories of human cognition were powerful enough to explain "a genuine slab of human behavior" with the studies fitting into a detailed theoretical picture. Action games represent the type of experimental paradigm that Newell was advocating and the current state of programming expertise and laboratory equipment, along with the emergence of Big Data and naturally occurring datasets, provide the technologies and data needed to realize his vision. Action games enable us to escape from our field's regrettable focus on novice performance to develop theories that account for the full range of expertise through a twin focus on expertise sampling (across individuals) and longitudinal studies (within individuals) of simple and complex tasks. Copyright © 2017 Cognitive Science Society, Inc.
Mulugeta, Lily Yeruk; Yao, Lynne; Mould, Diane; Jacobs, Brian; Florian, Jeffrey; Smith, Brian; Sinha, Vikram; Barrett, Jeffrey S
2018-01-10
This article discusses the use of big data in pediatric drug development. The article covers key topics discussed at the ACCP annual meeting symposium in 2016 including the extent to which big data or real-world data can inform clinical trial design and substitute for efficacy and safety data typically obtained in clinical trials. The current states of use, opportunities, and challenges with the use of big data in future pediatric drug development are discussed. © 2018 American Society for Clinical Pharmacology and Therapeutics.
2013-04-22
Director of Strategic Communications and Senior Science and Technology Policy Analyst, Office of Science and Technology Policy, Executive Office of the President, Rick Weiss, left, “Big Bang Theory” co-creator Bill Prady, center, and NASA Mars Curiosity Landing mission controller, Bobak "Mohawk Guy" Ferdowsi talk during the White House Science Fair held at the White House, April 22, 2013. The science fair celebrated student winners of a broad range of science, technology, engineering and math (STEM) competitions from across the country. Photo Credit: (NASA/Bill Ingalls)
An Information Literacy Partnership.
ERIC Educational Resources Information Center
Bielich, Paul; Page, Frederick
2002-01-01
Describes a pilot partnership formed by a science teacher and a science library media specialist between Detroit's Northwestern High School and the David Adamany Undergraduate Library at Wayne State University to develop student information literacy in high school. Discusses activities; teacher attitudes; introduction of the Big6 Skills; and…
Welsh, Elaine; Jirotka, Marina; Gavaghan, David
2006-06-15
We examine recent developments in cross-disciplinary science and contend that a 'Big Science' approach is increasingly evident in the life sciences, facilitated by a breakdown of the traditional barriers between academic disciplines and the application of technologies across these disciplines. The first fruits of 'Big Biology' are beginning to be seen in, for example, genomics, (bio)-nanotechnology and systems biology. We suggest that this has profound implications for the research process and presents challenges in technological design, in the provision of infrastructure and training, in the organization of research groups, and in providing suitable research funding mechanisms and reward systems. These challenges need to be addressed if the promise of this approach is to be fully realized. In this paper, we will draw on the work of social scientists to understand how these developments in science and technology relate to organizational culture, organizational change and the context of scientific work. We seek to learn from previous technological developments that seemed to offer similar potential for organizational and social change.
ERIC Educational Resources Information Center
Nowrouzian, Forough L.; Farewell, Anne
2013-01-01
Teamwork has become an integral part of most organisations today, and it is clearly important in Science and other disciplines. In Science, research teams increase in size while the number of single-authored papers and patents declines. Teamwork in the laboratory sciences permits tackling projects that are too big or complex for one individual.…
ERIC Educational Resources Information Center
Dai, David Yun; Rinn, Anne N.; Tan, Xiaoyuan
2013-01-01
The purposes of this study were to (a) examine the presence and prevalence of the big-fish-little-pond effect (BFLPE) in summer programs for the gifted, (b) identify group and individual difference variables that help predict those who are more susceptible to the BFLPE, and (c) put the possible BFLPE on academic self-concept in a larger context of…
Big Data: Are Biomedical and Health Informatics Training Programs Ready?
Hersh, W.; Ganesh, A. U. Jai
2014-01-01
Objectives: The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods: We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results: The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Conclusions: Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals. PMID:25123740
Otero, P; Hersh, W; Jai Ganesh, A U
2014-08-15
The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.
Wagner, Michael M
2002-01-01
The events that followed the launch of Sputnik on Oct 4, 1957, provide a metaphor for the events that are following the first bioterroristic case of pulmonary anthrax in the United States. This paper uses that metaphor to elucidate the nature of the task ahead and to suggest questions such as, Can the goals of the biodefense effort be formulated as concisely and concretely as the goal of the space program? Can we measure success in biodefense as we did for the space project? What are the existing resources that are the equivalents of propulsion systems and rocket engineers that can be applied to the problems of biodefense?
NASA Astrophysics Data System (ADS)
Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.
2015-12-01
The disruption of telecommunications is one of the most critical consequences of natural hazards. With the rapid expansion of mobile communications, the mobile communication infrastructure plays a fundamental role in disaster response and recovery activities, and its disruption can lead to loss of life and property through information delays and errors. Therefore, the disaster preparedness and response of the mobile communication infrastructure itself is very important. In many past disasters, the disruption of mobile communication networks was caused by network congestion and subsequent long-term power outages. Reducing this disruption requires knowledge of communication demands during disasters, and big data analytics provides a promising way to predict those demands by analyzing the large volume of operational data from mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)', supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we investigate the application of big data techniques to the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research we exploit the large volume of operational information about mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shaking distribution of an estimated major earthquake and the power outage map, we are able to predict the number of stranded people who cannot confirm their safety or ask for help because of network disruption. In addition, these results can help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we introduce the results obtained from big data analytics of mobile-user statistical information and discuss their implications.
Cappella, Joseph N
2017-10-01
Simultaneous developments in big data, social media, and computational social science have set the stage for how we think about and understand interpersonal and mass communication. This article explores some of the ways that these developments generate 4 hypothetical "vectors" - directions - into the next generation of communication research. These vectors include developments in network analysis, modeling interpersonal and social influence, recommendation systems, and the blurring of distinctions between interpersonal and mass audiences through narrowcasting and broadcasting. The methods and research in these arenas are occurring in areas outside the typical boundaries of the communication discipline but engage classic, substantive questions in mass and interpersonal communication.
The Next Big Thing - Eric Haseltine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric Haseltine
2009-09-16
Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing" on Sept. 11 at ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.
A Hybrid Cloud Computing Service for Earth Sciences
NASA Astrophysics Data System (ADS)
Yang, C. P.
2016-12-01
Cloud Computing is becoming a norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF Spatiotemporal Innovation Center to meet these demands. This paper reports on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for resource sharing; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service will be included.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produce societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.
2014-01-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produce societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
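The two abstracts above refer to MapReduce analytics over reanalysis variables. As a rough illustration of that pattern only (it does not use the real MERRA files or the MERRA/AS API), the sketch below maps toy (year, variable, value) records to key-value pairs, groups them by key as a shuffle phase would, and reduces each group to an annual mean; the records and values are invented.

```python
# Sketch of the MapReduce pattern, applied to toy records shaped like
# (year, variable, value). Illustrative only; not the MERRA/AS service API.
from collections import defaultdict

records = [
    (1980, "T2M", 287.1), (1980, "T2M", 288.4),
    (1981, "T2M", 287.9), (1981, "T2M", 288.0),
]

def map_phase(recs):
    """Emit (key, value) pairs with key = (year, variable)."""
    for year, var, value in recs:
        yield (year, var), value

def shuffle(pairs):
    """Group values by key, as the MapReduce framework would do."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce each group to its annual mean."""
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

print(reduce_phase(shuffle(map_phase(records))))
# {(1980, 'T2M'): 287.75, (1981, 'T2M'): 287.95}
```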
Big Science, Team Science, and Open Science for Neuroscience.
Koch, Christof; Jones, Allan
2016-11-02
The Allen Institute for Brain Science is a non-profit private institution dedicated to basic brain science with an internal organization more commonly found in large physics projects: large teams generating complete, accurate and permanent resources for the mouse and human brain. It can also be viewed as an experiment in the sociology of neuroscience. We here describe some of the singular differences to more academic, PI-focused institutions. Copyright © 2016 Elsevier Inc. All rights reserved.
Introducing the Big Knowledge to Use (BK2U) challenge.
Perl, Yehoshua; Geller, James; Halper, Michael; Ochs, Christopher; Zheng, Ling; Kapusnik-Uner, Joan
2017-01-01
The purpose of the Big Data to Knowledge initiative is to develop methods for discovering new knowledge from large amounts of data. However, if the resulting knowledge is so large that it resists comprehension, referred to here as Big Knowledge (BK), how can it be used properly and creatively? We call this secondary challenge, Big Knowledge to Use. Without a high-level mental representation of the kinds of knowledge in a BK knowledgebase, effective or innovative use of the knowledge may be limited. We describe summarization and visualization techniques that capture the big picture of a BK knowledgebase, possibly created from Big Data. In this research, we distinguish between assertion BK and rule-based BK (rule BK) and demonstrate the usefulness of summarization and visualization techniques of assertion BK for clinical phenotyping. As an example, we illustrate how a summary of many intracranial bleeding concepts can improve phenotyping, compared to the traditional approach. We also demonstrate the usefulness of summarization and visualization techniques of rule BK for drug-drug interaction discovery. © 2016 New York Academy of Sciences.
Considerations on Geospatial Big Data
NASA Astrophysics Data System (ADS)
LIU, Zhen; GUO, Huadong; WANG, Changlin
2016-11-01
Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which are often inefficient and consume high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
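As a hedged sketch of the kind of model described in the abstract above (not the authors' implementation), the snippet below uses a simple genetic algorithm to search for a job ordering that minimizes makespan when jobs are assigned greedily to the least-loaded of a few identical workers; the job durations and GA parameters are invented for illustration.

```python
# Hedged sketch of a genetic algorithm for scheduling analytics jobs: a
# chromosome is a job order, jobs are assigned greedily to the least-loaded
# worker, and fitness is the makespan (lower is better). Durations and
# parameters are illustrative; this is not the authors' implementation.
import random

JOB_DURATIONS = [5, 3, 8, 2, 7, 4, 6, 1]   # hypothetical job run times
WORKERS = 3

def makespan(order):
    """Finish time of the busiest worker under greedy assignment."""
    loads = [0] * WORKERS
    for job in order:
        loads[loads.index(min(loads))] += JOB_DURATIONS[job]
    return max(loads)

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest from b."""
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):
    """Swap two jobs with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def evolve(pop_size=30, generations=50):
    jobs = list(range(len(JOB_DURATIONS)))
    pop = [random.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                        # lower makespan is fitter
        parents = pop[:pop_size // 2]                 # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=makespan)

best = evolve()
print("best order:", best, "makespan:", makespan(best))
```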
Multi-scale computation methods: Their applications in lithium-ion battery research and development
NASA Astrophysics Data System (ADS)
Siqi, Shi; Jian, Gao; Yue, Liu; Yan, Zhao; Qu, Wu; Wangwei, Ju; Chuying, Ouyang; Ruijuan, Xiao
2016-01-01
Based upon advances in theoretical algorithms, modeling and simulations, and computer technologies, the rational design of materials, cells, devices, and packs in the field of lithium-ion batteries is being realized incrementally and will at some point trigger a paradigm revolution by combining calculations and experiments linked by a big shared database, enabling accelerated development of the whole industrial chain. Theory and multi-scale modeling and simulation, as supplements to experimental efforts, can help greatly to close some of the current experimental and technological gaps, as well as predict path-independent properties and help to fundamentally understand path-independent performance in multiple spatial and temporal scales. Project supported by the National Natural Science Foundation of China (Grant Nos. 51372228 and 11234013), the National High Technology Research and Development Program of China (Grant No. 2015AA034201), and Shanghai Pujiang Program, China (Grant No. 14PJ1403900).
Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing
ERIC Educational Resources Information Center
Maddi, Salvatore R.
2007-01-01
Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…
ERIC Educational Resources Information Center
Stains, Marilyne; Escriu-Sune, Marta; Alverez de Santizo, Myrna Lisseth Molina; Sevian, Hannah
2011-01-01
Development of learning progressions has been at the forefront of science education for several years. While understanding students' conceptual development toward "big ideas" in science is extremely valuable for researchers, science teachers can also benefit from assessment tools that diagnose their students' trajectories along the learning…
2009-12-10
Korean High Level Delegation Visits Ames Center Director and various senior staff: John Hines, Ames Center Chief Technologist (middle left), explains operations at the LADEE lab to Soon-Duk Bae, Deputy Director, Big Science Policy Division, Ministry of Education, Science and Technology; Young-Mok Hyun, Deputy Director, Space Development Division, Ministry of Education, Science and Technology; and Seorium Lee, Senior Researcher, International Relations, Korea Aerospace Research Institute.
SEAS (Surveillance Environmental Acoustic Support Program) Support
1984-02-29
ASEPS software - Provide support for AMES - Support for OUTPOST CREOLE, BIG DIPPER and MFA. First, a summary of the tasks as delineated in the contract... addition, the contractor will provide an engineer/scientist to support the BIG DIPPER data processing activities at NOSC. Task 3: SEAS Inventory - The... SI to provide support to SEAS for the OUTPOST CREOLE III exercise which followed immediately after the BIG DIPPER exercise. OUTPOST CREOLE III
Exascale computing and big data
Reed, Daniel A.; Dongarra, Jack
2015-06-25
Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.
Exascale computing and big data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, Daniel A.; Dongarra, Jack
Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.
Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C
2018-04-14
Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.
Taking a 'Big Data' approach to data quality in a citizen science project.
Kelling, Steve; Fink, Daniel; La Sorte, Frank A; Johnston, Alison; Bruns, Nicholas E; Hochachka, Wesley M
2015-11-01
Data from well-designed experiments provide the strongest evidence of causation in biodiversity studies. However, for many species the collection of these data is not scalable to the spatial and temporal extents required to understand patterns at the population level. Only citizen science projects can gather sufficient quantities of data, but data collected by volunteers are inherently noisy and heterogeneous. Here we describe a 'Big Data' approach to improve the data quality in eBird, a global citizen science project that gathers bird observations. First, eBird's data submission design ensures that all data meet high standards of completeness and accuracy. Second, we take a 'sensor calibration' approach to measure individual variation in eBird participants' ability to detect and identify birds. Third, we use species distribution models to fill in data gaps. Finally, we provide examples of novel analyses exploring population-level patterns in bird distributions.
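eBird's actual models are not reproduced here; as a rough illustration of the 'sensor calibration' idea, the sketch below trains a model in which observer effort and a hypothetical skill score are covariates for detection probability, the kind of adjustment that lets noisy volunteer checklists be compared. The data, covariates, and model choice are assumptions, not the project's pipeline.

```python
# Illustrative sketch only (not eBird's actual pipeline): a random-forest model of
# detection probability from synthetic checklist data, with observer effort and a
# hypothetical calibrated skill score as covariates.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000
effort_hours = rng.uniform(0.1, 5.0, n)      # time spent observing (assumed)
observer_skill = rng.uniform(0.0, 1.0, n)    # hypothetical calibrated skill score
habitat_forest = rng.uniform(0.0, 1.0, n)    # fraction of forest cover nearby

# Simulated "truth": detection more likely with effort, skill, and forest cover.
p_detect = 1 / (1 + np.exp(-(-2.0 + 0.6 * effort_hours
                             + 1.5 * observer_skill + 2.0 * habitat_forest)))
detected = rng.random(n) < p_detect

X = np.column_stack([effort_hours, observer_skill, habitat_forest])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, detected)

# Compare a low-effort checklist with a high-effort one at the same site.
print(model.predict_proba([[0.5, 0.3, 0.7], [4.0, 0.9, 0.7]])[:, 1])
```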
Margolis, Ronald; Derr, Leslie; Dunn, Michelle; Huerta, Michael; Larkin, Jennie; Sheehan, Jerry; Guyer, Mark; Green, Eric D
2014-01-01
Biomedical research has generated and will continue to generate large amounts of data (termed 'big data') in many formats and at all levels. Consequently, there is an increasing need to better understand and mine the data to further knowledge and foster new discovery. The National Institutes of Health (NIH) has initiated a Big Data to Knowledge (BD2K) initiative to maximize the use of biomedical big data. BD2K seeks to better define how to extract value from the data, both for the individual investigator and the overall research community, create the analytic tools needed to enhance utility of the data, provide the next generation of trained personnel, and develop data science concepts and tools that can be made available to all stakeholders. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
The Universe Discovery Guides: A Collaborative Approach to Educating with NASA Science
NASA Astrophysics Data System (ADS)
Manning, James G.; Lawton, Brandon L.; Gurton, Suzanne; Smith, Denise Anne; Schultz, Gregory; Astrophysics Community, NASA
2015-08-01
For the 2009 International Year of Astronomy, the then-existing NASA Origins Forum collaborated with the Astronomical Society of the Pacific (ASP) to create a series of monthly “Discovery Guides” for informal educator and amateur astronomer use in educating the public about featured sky objects and associated NASA science themes. Today’s NASA Astrophysics Science Education and Public Outreach Forum (SEPOF), one of the current generation of forums coordinating the work of NASA Science Mission Directorate (SMD) EPO efforts—in collaboration with the ASP and NASA SMD missions and programs—has adapted the Discovery Guides into “evergreen” educational resources suitable for a variety of audiences. The Guides focus on “deep sky” objects and astrophysics themes (stars and stellar evolution, galaxies and the universe, and exoplanets), showcasing EPO resources from more than 30 NASA astrophysics missions and programs in a coordinated and cohesive “big picture” approach across the electromagnetic spectrum, grounded in best practices to best serve the needs of the target audiences. Each monthly guide features a theme and a representative object well-placed for viewing, with an accompanying interpretive story, finding charts, strategies for conveying the topics, and complementary supporting NASA-approved education activities and background information from a spectrum of NASA missions and programs. The Universe Discovery Guides are downloadable from the NASA Night Sky Network web site at nightsky.jpl.nasa.gov and specifically from http://nightsky.jpl.nasa.gov/news-display.cfm?News_ID=611. The presentation will describe the collaborative’s experience in developing the guides, how they place individual science discoveries and learning resources into context for audiences, and how the Guides can be readily used in scientist public outreach efforts, in college and university introductory astronomy classes, and in other engagements between scientists, instructors, students and the public.
NASA Astrophysics Data System (ADS)
Dufoe, A.; Guertin, L. A.
2012-12-01
This project looks to help teachers utilize iPad technology in their classrooms as an instructional tool for Earth system science and connections to the Big Ideas in Earth Science. The project is part of Penn State University's National Science Foundation (NSF) Targeted Math Science Partnership grant, with one goal of the grant to help current middle school teachers across Pennsylvania engage students with significant and complex questions of Earth science. The free Apple software iBooks Author was used to create an electronic book for the iPad, focusing on a variety of controversial issues impacting the hydrosphere. The iBook includes image slideshows, embedded videos, interactive images and quizzes, and critical thinking questions along Bloom's Taxonomic Scale of Learning Objectives. Outlined in the introductory iBook chapters are the Big Ideas of Earth System Science and an overview of Earth's spheres. Since the book targets the hydrosphere, each subsequent chapter focuses on specific water issues, including glacial melts, aquifer depletion, coastal oil pollution, marine debris, and fresh-water chemical contamination. Each chapter is presented in a case study format that highlights the history of the issue, the development and current status of the issue, and some solutions that have been generated. The next section includes critical thinking questions in an open-ended discussion format that focus on the Big Ideas, proposing solutions for rectifying the situation, and/or assignments specifically targeting an idea presented in the case study chapter. Short, comprehensive multiple-choice quizzes are also in each chapter. Throughout the iBook, students are free to watch videos, explore the content and form their own opinions. As a result, this iBook fulfills the grant objective by engaging teachers and students with an innovative technological presentation that incorporates Earth system science with current case studies regarding global water issues.
Unlocking the Power of Big Data at the National Institutes of Health.
Coakley, Meghan F; Leerkes, Maarten R; Barnett, Jason; Gabrielian, Andrei E; Noble, Karlynn; Weber, M Nick; Huyen, Yentram
2013-09-01
The era of "big data" presents immense opportunities for scientific discovery and technological progress, with the potential to have enormous impact on research and development in the public sector. In order to capitalize on these benefits, there are significant challenges to overcome in data analytics. The National Institute of Allergy and Infectious Diseases held a symposium entitled "Data Science: Unlocking the Power of Big Data" to create a forum for big data experts to present and share some of the creative and innovative methods to gleaning valuable knowledge from an overwhelming flood of biological data. A significant investment in infrastructure and tool development, along with more and better-trained data scientists, may facilitate methods for assimilation of data and machine learning, to overcome obstacles such as data security, data cleaning, and data integration.
Unlocking the Power of Big Data at the National Institutes of Health
Coakley, Meghan F.; Leerkes, Maarten R.; Barnett, Jason; Gabrielian, Andrei E.; Noble, Karlynn; Weber, M. Nick
2013-01-01
The era of “big data” presents immense opportunities for scientific discovery and technological progress, with the potential to have enormous impact on research and development in the public sector. In order to capitalize on these benefits, there are significant challenges to overcome in data analytics. The National Institute of Allergy and Infectious Diseases held a symposium entitled “Data Science: Unlocking the Power of Big Data” to create a forum for big data experts to present and share some of the creative and innovative methods for gleaning valuable knowledge from an overwhelming flood of biological data. A significant investment in infrastructure and tool development, along with more and better-trained data scientists, may facilitate methods for assimilation of data and machine learning, to overcome obstacles such as data security, data cleaning, and data integration. PMID:27442200
Big Questions: Missing Antimatter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
2013-08-27
Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.
Sneak Preview of Berkeley Lab's Science at the Theatre on June 6th, 2011
Sanii, Babak
2017-12-11
Babak Sanii provides a sneak preview of Berkeley Lab's next Science at the Theater Event: Big Thinking: The Power of Nanoscience. Berkeley Lab scientists reveal how nanoscience will bring us cleaner energy, faster computers, and improved medicine. Berkeley Repertory Theatre on June 6th, 2011.
Collective Awareness and the New Institution Science
NASA Astrophysics Data System (ADS)
Pitt, Jeremy; Nowak, Andrzej
The following sections are included: * Introduction * Challenges for Institutions * Collective Awareness * A New Science of Institutions * Complex social ensembles * Interoceptive collective awareness * Planned emergence * Self-organising electronic institutions * Transformative Impact on Society * Social attitudes and processes * Innovative service creation and social innovation * Scientific impact * Big data * Self-regulation * Summary and Conclusions
Evolution: Don't Debate, Educate.
ERIC Educational Resources Information Center
Bybee, Rodger W.
2000-01-01
Discusses controversy over the teaching of biological evolution and other scientific ideas such as Big Bang theory. Recommends that teachers avoid debating creationists, help students develop a greater understanding and appreciation for science as a way of explaining the natural world, and emphasize inquiry and the nature of science. (Contains 19…
How Cosmology Became a Science.
ERIC Educational Resources Information Center
Brush, Stephen G.
1992-01-01
Describes the origin of the science of cosmology and the competing theories to explain the beginning of the universe. The big bang theory for the creation of the universe is contrasted with the steady state theory. The author details discoveries that led to the demise of the steady state theory. (PR)
2013 Student Science Jeopardy Tournament a Big Success | Poster
By Robin Meckley, Contributing Writer The category was “General Science,” and the clue read: “Named for an Italian scientist, it is the scientific number of molecules in 1 gram mole of any substance.” Everything depended on knowing the correct response and wagering enough points.
WHK Interns Win Big at Frederick County Science Fair | Poster
Three Werner H. Kirsten student interns claimed awards at the 35th Annual Frederick County Science and Engineering Fair—and got a shot at the national competition—for imaginative projects that reached out to the rings of Saturn and down to the details of advanced cancer diagnostics.
Sneak Preview of Berkeley Lab's Science at the Theatre on June 6th, 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanii, Babak
Babak Sanii provides a sneak preview of Berkeley Lab's next Science at the Theater Event: Big Thinking: The Power of Nanoscience. Berkeley Lab scientists reveal how nanoscience will bring us cleaner energy, faster computers, and improved medicine. Berkeley Repertory Theatre on June 6th, 2011.
ERIC Educational Resources Information Center
Sapp, Gregg
2007-01-01
The state of science is a moving target, and its ever-shifting horizons can best be gleaned by the contents of scientific journals. However, the bigger picture of the scientific enterprise, which also encompasses its past, its future, and its overarching philosophies, can often be better represented through the more reflective pace of popular…
Waves in Nature, Lasers to Tsunamis and Beyond
LLNL - University of California Television
2017-12-09
Waves are everywhere. Microwaves, laser beams, music, tsunamis. Electromagnetic waves emanating from the Big Bang fill the universe. Learn about the similarities and differences in all of these wavy phenomena with Ed Moses and Rick Sawicki, Lawrence Livermore National Laboratory scientists. Series: Science on Saturday [10/2006] [Science] [Show ID: 11541]
Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?
ERIC Educational Resources Information Center
Robertson, Bill
2016-01-01
Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…
Waves in Nature, Lasers to Tsunamis and Beyond
DOE Office of Scientific and Technical Information (OSTI.GOV)
LLNL - University of California Television
2008-05-01
Waves are everywhere. Microwaves, laser beams, music, tsunamis. Electromagnetic waves emanating from the Big Bang fill the universe. Learn about the similarities and differences in all of these wavy phenomena with Ed Moses and Rick Sawicki, Lawrence Livermore National Laboratory scientists. Series: Science on Saturday [10/2006] [Science] [Show ID: 11541]
An entrepreneurial training model to enhance undergraduate training in biomedical research.
Kamangar, Farin; Silver, Gillian; Hohmann, Christine; Hughes-Darden, Cleo; Turner-Musa, Jocelyn; Haines, Robert Trent; Jackson, Avis; Aguila, Nelson; Sheikhattari, Payam
2017-01-01
Undergraduate students who are interested in biomedical research typically work on a faculty member's research project, conduct one distinct task (e.g., running gels), and, step by step, enhance their skills. This "apprenticeship" model has been helpful in training many distinguished scientists over the years, but it has several potential drawbacks. For example, the students have limited autonomy, and may not understand the big picture, which may result in students giving up on their goals for a research career. Also, the model is costly and may greatly depend on a single mentor. The NIH Building Infrastructure Leading to Diversity (BUILD) Initiative has been established to fund innovative undergraduate research training programs and support institutional and faculty development of the recipient university. The training model at Morgan State University (MSU), namely "A Student-Centered Entrepreneurship Development training model" (ASCEND), is one of the 10 NIH BUILD-funded programs, and offers a novel, experimental "entrepreneurial" training approach. In the ASCEND training model, the students take the lead. They own the research, understand the big picture, and experience the entire scope of the research process, which we hypothesize will lead to a greater sense of self-efficacy and research competency, as well as an enhanced sense of science identity. They are also immersed in environments with substantial peer support, where they can exchange research ideas and share experiences. This is important for underrepresented minority students who might have fewer role models and less peer support in conducting research. In this article, we describe the MSU ASCEND entrepreneurial training model's components, rationale, and history, and how it may enhance undergraduate training in biomedical research that may be of benefit to other institutions. We also discuss evaluation methods, possible sustainability solutions, and programmatic challenges that can affect all types of science training interventions.
Partnering for science: proceedings of the USGS Workshop on Citizen Science
Hines, Megan; Benson, Abigail; Govoni, David; Masaki, Derek; Poore, Barbara; Simpson, Annie; Tessler, Steven
2013-01-01
What U.S. Geological Survey (USGS) programs use citizen science? How can projects be best designed while meeting policy requirements? What are the most effective volunteer recruitment methods? What data should be collected to ensure validation and how should data be stored? What standard protocols are most easily used by volunteers? Can data from multiple projects be integrated to support new research or existing science questions? To help answer these and other questions, the USGS Community of Data Integration (CDI) supported the development of the Citizen Science Working Group (CSWG) in August 2011 and funded the working group’s proposal to hold a USGS Citizen Science Workshop in fiscal year 2012. The stated goals for our workshop were: raise awareness of programs and projects in the USGS that incorporate citizen science, create a community of practice for the sharing of knowledge and experiences, provide a forum to discuss the challenges of—and opportunities for—incorporating citizen science into USGS projects, and educate and support scientists and managers whose projects may benefit from public participation in science. To meet these goals, the workshop brought together 50 attendees (see appendix A for participant details) representing the USGS, partners, and external citizen science practitioners from diverse backgrounds (including scientists, managers, project coordinators, and technical developers, for example) to discuss these topics at the Denver Federal Center in Colorado on September 11–12, 2012. Over two and a half days, attendees participated in four major plenary sessions (Citizen Science Policy and Challenges, Engaging the Public in Scientific Research, Data Collection and Management, and Technology and Tools) comprised of 25 invited presentations and followed by structured discussions for each session designed to address both prepared and ad hoc "big questions." A number of important community support and infrastructure needs were identified from the sessions and discussions, and a subteam was formed to draft a strategic vision statement to guide and prioritize future USGS efforts to support the citizen science community. Attendees also brainstormed proposal ideas for the fiscal year 2013 CDI request for proposals: one possible venue to support the execution of the vision.
Southeast Regional Clearinghouse (SERCH) Mini-grants: Big Impacts on Future Explorers
NASA Astrophysics Data System (ADS)
Runyon, C.; Guimond, K.
2004-12-01
SERCH is one of seven regional Broker/Facilitator programs funded by NASA's Space Science Mission Directorate. Our purpose is to promote space science awareness and to enhance interest in science, math, and technology through the use of NASA's mission data, information, and educational products. We work closely with educators and NASA-funded scientists in 14 states (AL, AR, DC, FL, GA, KY, LA, MD, MS, NC, PR, SC/VI, TN, and VA) throughout the southeastern U.S. to share what NASA is doing in space science. Every year SERCH dedicates money from its budget to support education/outreach initiatives that increase the awareness and understanding of the four major scientific themes, or forums, from NASA's space science program: 1) Sun-Earth Connection, 2) Solar System Exploration, 3) Structure and Evolution of the Universe, and 4) Astronomical Search for Origins and Planetary Systems. SERCH is particularly interested in proposals for education/outreach efforts that establish strong and lasting partnerships between the space science and education communities and that support NASA's education mission. We encourage innovative, inter-disciplinary teams involving both scientists and educators to apply. These peer-reviewed grants are awarded for a period of one year in amounts usually ranging from $5,000 to $10,000. Three examples of highly successful previous grant awards include: 1) Teaching Astronomy and Space Science in Kentucky (KY): Designed to improve knowledge of science core concepts and teaching skills in astronomy and space science and increase expertise in achieving current Kentucky academic expectations; 2) Development of Multi-media Space Science Education/Tutorial Modules (MD): The objective is the production of three "turn-key" internet-based multi-media student tutorial modules to enable the mostly part-time professors/instructors teaching introductory astronomy in community colleges to add exciting and cutting-edge topics to their existing astronomy courses; and 3) Space Science the Special Way (SSS Way) (VA): This conference focused on solutions to the challenges faced when accommodating inclusive earth/space science instruction to students from the following special needs groups: blind and visually impaired, deaf and hard of hearing, and the learning disabled.
Evolvix BEST Names for semantic reproducibility across code2brain interfaces.
Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha
2017-01-01
Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
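The paper's Evolvix examples are not reproduced here; the snippet below is only an invented illustration of the naming trade-off the BEST (Brief, Explicit, Summarizing, Technical) Names concept addresses. All identifiers and the toy growth model are hypothetical.

```python
# Illustrative only; not code from Evolvix or the authors. It contrasts an ambiguous
# name with one closer in spirit to a BEST Name, where the meaning is reproducible
# from the name alone (the "semantic reproducibility" the paper discusses).

# Ambiguous: rate of what, per what time unit, growth or decay?
rate = 0.03

# More explicit: the reader can recover the intended meaning without extra context.
cell_division_rate_per_hour = 0.03

def population_after(hours, initial_count, cell_division_rate_per_hour):
    """Toy exponential growth; the explicit parameter name documents the units."""
    return initial_count * (1 + cell_division_rate_per_hour) ** hours

print(population_after(24, 1000, cell_division_rate_per_hour))
```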
2009-12-10
Korean High Level Delegation Visits Ames Center Director and various senior staff: Dan Andrews gives a presentation about LCROSS/LRO to Seorium Lee, Senior Researcher, International Relations, Korea Aerospace Research Institute; Soon-Duk Bae, Deputy Director, Big Science Policy Division, Ministry of Education, Science and Technology; and Young-Mok Hyun, Deputy Director, Space Development Division, Ministry of Education, Science and Technology.
A Systematic Review of Techniques and Sources of Big Data in the Healthcare Sector.
Alonso, Susel Góngora; de la Torre Díez, Isabel; Rodrigues, Joel J P C; Hamrioui, Sofiane; López-Coronado, Miguel
2017-10-14
The main objective of this paper is to present a review of existing research in the literature on Big Data sources and techniques in the health sector and to identify which of these techniques are the most used in the prediction of chronic diseases. Academic databases and systems such as IEEE Xplore, Scopus, PubMed and Science Direct were searched, considering publication dates from 2006 to the present. Several search criteria were established, such as 'techniques' OR 'sources' AND 'Big Data' AND 'medicine' OR 'health', 'techniques' AND 'Big Data' AND 'chronic diseases', etc., and papers were selected for their description of the techniques and sources of Big Data in healthcare. The search found a total of 110 articles on techniques and sources of Big Data in health, of which only 32 were identified as relevant work. Many of the articles describe the Big Data platforms, sources, and databases used, and identify the techniques most used in the prediction of chronic diseases. From the review of the analyzed research articles, it can be seen that the sources and techniques of Big Data used in the health sector represent a relevant factor in terms of effectiveness, since they allow the application of predictive analysis techniques in tasks such as identifying patients at risk of readmission, preventing hospital infections or chronic diseases, and obtaining predictive models of quality.
The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.
Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen
2016-10-17
The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.
The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness
Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen
2016-01-01
The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing. PMID:27763525
Complexity Science Framework for Big Data: Data-enabled Science
NASA Astrophysics Data System (ADS)
Surjalal Sharma, A.
2016-07-01
The ubiquity of Big Data has stimulated the development of analytic tools to harness the potential for timely and improved modeling and prediction. While much of the data is available near-real time and can be compiled to specify the current state of the system, the capability to make predictions is lacking. The main reason is the basic nature of Big Data - the traditional techniques are challenged in their ability to cope with its velocity, volume and variability to make optimum use of the available information. Another aspect is the absence of an effective description of the time evolution or dynamics of the specific system, derived from the data. Once such dynamical models are developed, predictions can be made readily. This approach of "letting the data speak for itself" is distinct from the first-principles models based on the understanding of the fundamentals of the system. The predictive capability comes from the data-derived dynamical model, with no modeling assumptions, and can address many issues such as causality and correlation. This approach provides a framework for addressing the challenges in Big Data, especially in the case of spatio-temporal time series data. The reconstruction of dynamics from time series data is based on the recognition that in most systems the different variables or degrees of freedom are coupled nonlinearly and in the presence of dissipation the state space contracts, effectively reducing the number of variables, thus enabling a description of its dynamical evolution and consequently prediction of future states. The predictability is analysed from the intrinsic characteristics of the distribution functions, such as Hurst exponents and Hill estimators. In most systems the distributions have heavy tails, which imply higher likelihood for extreme events. The characterization of the probabilities of extreme events is critical in many cases, e.g., natural hazards, for proper assessment of risk and mitigation strategies. Big Data with such new analytics can yield improved risk estimates. The challenges of scientific inference from complex and massive data are addressed by data-enabled science, also referred to as the Fourth paradigm, after experiment, theory and simulation. An example of this approach is the modelling of dynamical and statistical features of natural systems, without assumptions of specific processes. An effective use of the techniques of complexity science to yield the inherent features of a system from extensive data from observations and large scale numerical simulations is evident in the case of Earth's magnetosphere. The multiscale nature of the magnetosphere makes the numerical simulations a challenge, requiring very large computing resources. The reconstruction of dynamics from observational data can however yield the inherent characteristics using typical desktop computers. Such studies for other systems are in progress. A data-enabled approach using the framework of complexity science provides new techniques for modelling and prediction using Big Data. The studies of Earth's magnetosphere provide an example of the potential for a new approach to the development of quantitative analytic tools.
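As a concrete, minimal illustration of one predictability diagnostic named above, the sketch below estimates a Hurst exponent by rescaled-range (R/S) analysis. The synthetic test series, window sizes, and estimator details are assumptions for illustration, not the author's analysis.

```python
# Minimal sketch, not the author's code: estimate a Hurst exponent by rescaled-range
# (R/S) analysis of a stationary series. For uncorrelated Gaussian noise the estimate
# should come out near 0.5; persistent series give values closer to 1.
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    rs_means = []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviations within window
            r = dev.max() - dev.min()           # range of cumulative deviations
            s = seg.std()
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    # Slope of log(R/S) versus log(window size) approximates the Hurst exponent.
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

series = np.random.default_rng(1).standard_normal(4096)  # assumed test series
print("estimated H:", round(hurst_rs(series), 2))  # roughly 0.5 for white noise
```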
NASA Astrophysics Data System (ADS)
Semken, S. C.
2013-12-01
How might we authentically and practically evaluate the effects of a geologic heritage place or program on public Earth science literacy? This pedagogical form of evaluation is distinct from the evaluation of a place for its geological importance, heritage value, economic or cultural impact, and so on. Best evaluation practices from the realms of formal education, informal education, and interpretation start with a coherent set of evaluable learning outcomes, ideally recapitulated in one or more 'big ideas' that capture the essential attributes of the place or program. Learning outcomes may be classified as cognitive, affective, or psychomotor. Cognitive learning outcomes in a geoheritage context are the Earth-science concepts a visitor or student would be expected to uncover through on-site or virtual exploration of the stratigraphy, structure, landforms, and processes in a place. The Earth Science Literacy Principles (ESLP), and similar literacy documents relating to the atmosphere, oceans, and climate, offer a template for mapping localized concepts onto more global ones. Quantitative instruments to evaluate understanding of the ESLP are in development, and the ESLP also map directly onto measures used in formal educational assessment, notably the Next Generation Science Standards in the USA. Nongeological place meanings (a component of sense of place) may suggest other cognitive outcomes. Affective learning outcomes for visitors and students in geoheritage sites are less readily defined, but may include place attachment (also a component of sense of place), attitudes, and interest. Multiple quantitative and qualitative methods of evaluating these outcomes exist. Psychomotor learning outcomes are even muddier, but accessibility (defined by statutes) offers a potential starting point. In practice, evaluation may be conducted synchronously or asynchronously with visitors' or students' interaction with the geoheritage place or program. Evaluation programs are typically constrained by access and practicality. Synchronous methods include observation, semi-structured interviews, rapid prototyping and surveys. Asynchronous methods include interviews, surveys, and tracking. Evaluation tools may require content or instrument validation for the specific context in which they are used. Illustrative examples from evaluation of public engagement with geoheritage places (National Parks) in the Southwest USA will be offered.
STEM Education is Missing This.......
NASA Astrophysics Data System (ADS)
Orr, Laura; Johnson, Milton; Miller, Alexandra; Rebull, Luisa M.
2017-01-01
STEM education gets a lot of attention in schools, media, politics, and funding. But while the acronym grows from STEM to STEAM to STREAM, we still see a lack of student participation in real science (using big data and building partnerships with professionals in the field) and a lack of real student growth in science achievement. After the NITARP experience, we believe that NITARP is a rich, demanding, and authentic experience for dedicated teachers and students that provides a caliber of learning that is hard, if not impossible, to achieve in the traditional classroom. This poster looks at what STEM still needs to be and become for it to be the driving force behind greater student involvement, interest, and increased academic performance in the sciences. We focus on our own experiences and those of our students; our different teaching backgrounds and school environments; and the effects we see on our students using traditional and new STEM education and participation in the NITARP program. We come from backgrounds and situations that range from urban to rural, middle to high school, wide socioeconomic variety, gender differences, as well as different exposures to STEM opportunities. We propose that traditional and current standards for STEM education are falling short of what is needed for students to truly experience, understand, and gain the skills to accurately apply and advance in science. Incoming and current science teachers at all levels are not provided with quality, realistic, or applicable preparation. NITARP is truly a STEM experience because it actually integrates the 4 fields and provides opportunities for students to experience the overlap of the 4 fields in an authentic way. The deep, long-term exposure to authentic research and technology, as well as the opportunity to talk with working scientists in a variety of fields, has a huge impact on the students and teachers alike. Exposure to programs and experiences like NITARP is needed to help drive and support STEM education to meet its goals and intentions. Support provided for this work by the NASA/IPAC Teacher Archive Research Program (NITARP), which receives funding from the NASA ADP program.
NASA Astrophysics Data System (ADS)
Cobb, W. H.; Buxner, S.; Lebofsky, L. A.; Ristvey, J.; Weeks, S.; Zolensky, M.
2011-12-01
Small Bodies, Big Concepts is a multi-disciplinary, professional development project that engages 5th-8th grade teachers in high-end planetary science using a research-based pedagogical framework, Designing Effective Science Instruction (DESI). In addition to developing sound background knowledge with a focus on visual analysis, teachers' awareness of the process of learning new content is heightened, and they use that experience to deepen their science teaching practice. Culling from NASA E/PO educational materials, activities are sequenced to enhance conceptual understanding of big ideas in space science: what do we know, how do we know it, why do we care? Helping teachers develop a picture of the history and evolution of our understanding of the solar system, and homing in on the place of comets and asteroids in helping us answer old questions and discover new ones, teachers see the power and excitement underlying planetary science as a human endeavor. Research indicates that science inquiry is powerful in the classroom and mission scientists are real-life models of science inquiry in action. Using guest scientist facilitators from the Planetary Science Institute, NASA Johnson Space Center, Lockheed Martin, and NASA E/PO professionals from McREL and NASA AESP, teachers practice framing scientific questions, using current visual data, and adapting NASA E/PO activities related to current exploration of asteroids and comets in our Solar System. Cross-curricular elements included examining research-based strategies for enhancing English language learners' ability to engage in higher order questions and a professional astronomy artist's insight into how visual analysis requires not just our eyes engaged, but our brains: comparing, synthesizing, questioning, evaluating, and wondering. This summer we pilot tested the SBBC curriculum with thirteen 5th-10th grade teachers, modeling a variety of instructional approaches over eight days. Each teacher developed lesson plans that incorporate DESI strategies with new space science content to implement during the coming year in their classroom. Initial evaluation of the workshop showed that teachers left with an increased understanding of small bodies in the solar system, current exploration, and ways to integrate this exploration into their current curriculum. We will reconvene the teachers in the spring of 2012 to share their implementation experiences. The professional development is a year-long effort, supported both online and through future face-to-face workshops. Next summer, a field test of the project will be implemented after evaluation data inform the best steps for improvement. The result of the project will be a model for implementing professional development that integrates research-based instructional strategies and science findings from NASA missions to improve teacher practice. Small Bodies, BIG Concepts is based upon work supported by the National Aeronautics and Space Administration (NASA) under Grant/Contract/Agreement No. 09-EPOESS09-0044 issued through the Science Mission Directorate.
ERIC Educational Resources Information Center
Valdata, Patricia
2006-01-01
College or university presidents (or chancellors, depending on the institution) get paid the big bucks to worry about the big picture: capital campaigns, attracting and retaining students, creating and sustaining quality academic programs, shared governance. It's a demanding job even when everything goes well, but when problems arise, challenges…
Ajunwa, Ifeoma; Crawford, Kate; Ford, Joel S
2016-09-01
This essay details the resurgence of wellness programs as employed by large corporations with the aim of reducing healthcare costs. The essay focuses on how Big Data collection practices are being utilized in wellness programs and the potential negative impact on the worker with regard to privacy and employment discrimination. The essay offers an ethical framework to be adopted by wellness program vendors in order to conduct wellness programs that would achieve cost-saving goals without undue burdens on the worker. The essay also offers some innovative approaches to wellness that may better serve the goals of healthcare cost reduction. © 2016 American Society of Law, Medicine & Ethics.
A Guided Inquiry on Hubble Plots and the Big Bang
NASA Astrophysics Data System (ADS)
Forringer, Ted
2014-04-01
In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the first time. The first challenge is in understanding and interpreting Hubble plots. The second is that some of our students have religious or cultural objections to the concept of a "Big Bang" or a universe that is billions of years old. This paper presents a guided inquiry exercise that was created with the goal of introducing students to Hubble plots and giving them the opportunity to discover for themselves why we believe our universe started with an explosion billions of years ago. The exercise is designed to be completed before the topics are discussed in the classroom. We did the exercise during a one hour and 45 minute "lab" time and it was done in groups of three or four students, but it would also work as an individual take-home assignment.
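The guided-inquiry materials themselves are not reproduced here; the short sketch below only illustrates the kind of calculation students arrive at: fitting a slope to velocity-versus-distance data and converting it to a rough expansion age. The five data points are invented, round-number placeholders, not real catalog values.

```python
# A sketch of the analysis behind a Hubble plot exercise (not the authors' materials):
# fit a straight line to velocity-vs-distance data and read off a Hubble constant.
import numpy as np

distance_mpc = np.array([50, 150, 300, 500, 800])            # megaparsecs (assumed)
velocity_kms = np.array([3600, 10400, 21500, 34800, 56300])  # km/s (assumed)

H0, _ = np.polyfit(distance_mpc, velocity_kms, 1)   # slope ~ km/s per Mpc
print(f"Hubble constant ~ {H0:.0f} km/s/Mpc")

# 1/H0 gives a rough expansion age; convert Mpc to km so the units cancel.
mpc_in_km = 3.086e19
age_seconds = mpc_in_km / H0
print(f"rough age of the universe ~ {age_seconds / 3.15e16:.1f} billion years")
```

With the placeholder points above the slope comes out near 70 km/s/Mpc and the age near 14 billion years, which is the qualitative conclusion the guided inquiry aims at.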
The Rise of Big Data in Oncology.
Fessele, Kristen L
2018-05-01
Objectives: To describe big data and data science in the context of oncology nursing care. Data sources: Peer-reviewed and lay publications. Conclusion: The rapid expansion of real-world evidence from sources such as the electronic health record, genomic sequencing, administrative claims and other data sources has outstripped the ability of clinicians and researchers to manually review and analyze it. To promote high-quality, high-value cancer care, big data platforms must be constructed from standardized data sources to support extraction of meaningful, comparable insights. Implications for nursing practice: Nurses must advocate for the use of standardized vocabularies and common data elements that represent terms and concepts that are meaningful to patient care. Copyright © 2018 Elsevier Inc. All rights reserved.
2015-12-04
from back-office big-data analytics to fieldable hot-spot systems providing storage-processing-communication services for off-grid sensors. Speed...and power efficiency are the key metrics. Current state-of-the-art approaches for big data aim toward scaling out to many computers to meet...pursued within Lincoln Laboratory as well as external sponsors. Our vision is to bring new capabilities in big-data and internet-of-things applications
The Axion Dark Matter Experiment: Big Science with a (relatively) Small Team
NASA Astrophysics Data System (ADS)
Carosi, Gianpaolo
2016-03-01
The idea of the solitary physicist tinkering alone in a lab was my image of how science was done growing up (mostly influenced by popular culture). Of course, this is not generally how experimental physics is done nowadays, with experiments at the LHC now involving thousands of scientists. In this talk I will describe my experience in a relatively modest project, the Axion Dark Matter eXperiment (ADMX), which involves only a few dozen scientists at various universities and national labs. I will outline ADMX's humble beginnings at Lawrence Livermore National Laboratory (LLNL), where it began in the mid-1990s, and describe how the collaboration has evolved and grown throughout the years, as we pursue our elusive quarry: the dark-matter axion. Supported by DOE Grants DE-FG02-97ER41029, DE-FG02-96ER40956, DE- AC52-07NA27344, DE-AC03-76SF00098, and the Livermore LDRD program.
Space research - At a crossroads
NASA Technical Reports Server (NTRS)
Mcdonald, Frank B.
1987-01-01
Efforts which must be expended if U.S. space research is to regain vitality in the next few years are discussed. Small-scale programs are the cornerstone for big science projects, giving both researchers and students a chance to practice the development of space missions and hardware and identify promising goals for larger projects. Small projects can be carried aloft by balloons, sounding rockets, the Shuttle and ELVs. It is recommended that NASA continue the development of remote sensing systems, and join with other government agencies to fund space-based materials science, space biology and medical research. Increased international cooperation in space projects is necessary for affording moderate to large scale missions, for political reasons, and to maximize available space resources. Finally, the establishment and funding of long-range goals in space, particularly the development of the infrastructure and technologies for the exploration and colonization of the planets, must be viewed as the normal outgrowth of the capabilities being developed for LEO operations.
Small Bodies, Big Discoveries: NASA's Small Bodies Education Program
NASA Astrophysics Data System (ADS)
Mayo, L.; Erickson, K. J.
2014-12-01
2014 is turning out to be a watershed year for celestial events involving the solar system's unsung heroes, small bodies. This includes the close flyby of comet C/2013 A1 / Siding Spring with Mars in October and the historic Rosetta mission with its Philae lander to comet 67P/Churyumov-Gerasimenko. Beyond 2014, the much anticipated 2015 Pluto flyby by New Horizons and the February Dawn Mission arrival at Ceres will take center stage. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more. This talk will highlight NASA's focused education effort to engage the public in small bodies mission science and the role these objects play in our understanding of the formation and evolution of the solar system.
Discovery informatics in biological and biomedical sciences: research challenges and opportunities.
Honavar, Vasant
2015-01-01
New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze, and construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery or assisting humans in discovery through advances in (i) understanding, formalization, and information processing accounts of the entire scientific process; (ii) design, development, and evaluation of the computational artifacts (representations, processes) that embody such understanding; and (iii) application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).
Infectious Disease Surveillance in the Big Data Era: Towards Faster and Locally Relevant Systems
Simonsen, Lone; Gog, Julia R.; Olson, Don; Viboud, Cécile
2016-01-01
While big data have proven immensely useful in fields such as marketing and earth sciences, public health is still relying on more traditional surveillance systems and awaiting the fruits of a big data revolution. A new generation of big data surveillance systems is needed to achieve rapid, flexible, and local tracking of infectious diseases, especially for emerging pathogens. In this opinion piece, we reflect on the long and distinguished history of disease surveillance and discuss recent developments related to use of big data. We start with a brief review of traditional systems relying on clinical and laboratory reports. We then examine how large-volume medical claims data can, with great spatiotemporal resolution, help elucidate local disease patterns. Finally, we review efforts to develop surveillance systems based on digital and social data streams, including the recent rise and fall of Google Flu Trends. We conclude by advocating for increased use of hybrid systems combining information from traditional surveillance and big data sources, which seems the most promising option moving forward. Throughout the article, we use influenza as an exemplar of an emerging and reemerging infection which has traditionally been considered a model system for surveillance and modeling. PMID:28830112
ERIC Educational Resources Information Center
Nagengast, Benjamin; Marsh, Herbert W.
2011-01-01
Research on the relation between students' achievement (ACH) and their academic self-concept (ASC) has consistently shown a Big-Fish-Little-Pond-Effect (BFLPE); ASC is positively affected by individual ACH, but negatively affected by school-average ACH. Surprisingly, however, there are few good UK studies of the BFLPE and few anywhere in the world…
Using big data to map the network organization of the brain.
Swain, James E; Sripada, Chandra; Swain, John D
2014-02-01
The past few years have shown a major rise in network analysis of "big data" sets in the social sciences, revealing non-obvious patterns of organization and dynamic principles. We speculate that the dependency dimension - individuality versus sociality - might offer important insights into the dynamics of neurons and neuronal ensembles. Connectomic neural analyses, informed by social network theory, may be helpful in understanding underlying fundamental principles of brain organization.
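The following is not from the abstract above; it is a minimal sketch of the kind of network analysis the authors allude to, using the networkx library on a toy graph whose nodes, edges, and "community" structure are purely illustrative.

```python
# Minimal sketch: community detection on a toy "connectome" graph.
# The graph, node labels, and edge list are illustrative only.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Build a small undirected graph standing in for a neuronal ensemble.
G = nx.Graph()
G.add_edges_from([
    ("n1", "n2"), ("n2", "n3"), ("n1", "n3"),   # one densely connected cluster
    ("n4", "n5"), ("n5", "n6"), ("n4", "n6"),   # a second cluster
    ("n3", "n4"),                               # a sparse bridge between them
])

# Modularity-based community detection: groups nodes that interact
# more with each other than with the rest of the network.
communities = list(greedy_modularity_communities(G))
for i, members in enumerate(communities):
    print(f"community {i}: {sorted(members)}")

# A simple "sociality" proxy for each node: its degree centrality.
print(nx.degree_centrality(G))
```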
NASA Astrophysics Data System (ADS)
Boudrias, M. A.; Cantzler, J.; Croom, S.; Huston, C.; Woods, M.
2015-12-01
Courses on sustainability can be taught from multiple perspectives with some focused on specific areas (environmental, socio-cultural, economic, ethics) and others taking a more integrated approach across areas of sustainability and academic disciplines. In conjunction with the Climate Change Education Program efforts to enhance climate change literacy with innovative approaches, resources and communication strategies developed by Climate Education Partners were used in two distinct ways to integrate climate change science and impacts into undergraduate and graduate level courses. At the graduate level, the first lecture in the MBA program in Sustainable Supply Chain Management is entirely dedicated to climate change science, local and global impacts and discussions about key messages to communicate to the business community. Basic science concepts are integrated with discussions about mitigation and adaptation focused on business leaders. The concepts learned are then applied to the semester-long business plan project for the students. At the undergraduate level, a new model of comprehensive integration across disciplines was implemented in Spring 2015 across three courses on Sustainability each with a specific lens: Natural Science, Sociology and Philosophy. All three courses used climate change as the 'big picture' framing concept and had similar learning objectives creating a framework where lens-specific topics, focusing on depth in a discipline, were balanced with integrated exercises across disciplines providing breadth and possibilities for integration. The comprehensive integration project was the creation of the climate action plan for the university with each team focused on key areas of action (water, energy, transportation, etc.) and each team built with at least one member from each class ensuring a natural science, sociological and philosophical perspective. The final project was presented orally to all three classes and an integrated paper included all three perspectives. The best projects are being compiled so they can be shared with the University of San Diego's planning committee.
A Physicist's Odyssey in the Public Schools
NASA Astrophysics Data System (ADS)
Blatt, S. Leslie
2004-03-01
My colleagues and I developed our "Discovering Physics" course a dozen years ago based on the best available research on (predominantly pre-college) student learning in the sciences. The hands-on small-group approach we subsequently adopted works quite nicely in the university environment, as well. As a major side benefit, we began consulting with and eventually working closely with teachers in the Worcester Public Schools. Over the years, we developed a regular collaborative cycle: 1.) A curriculum team of Clark faculty and K-12 teachers meets during the academic year for discussions and to design activities built around a "big idea" in the sciences; 2.) A summer institute is offered, for a larger group of teachers, based on the work of the curriculum team; 3.) A "Ways of Knowing in the Sciences" course is offered in the fall for Education Department students, centered on the previously-tested science content coupled with a variety of pedagogical approaches, as well as observations in the schools; and 4.) The cycle resumes with a new team and a different "big idea." The experience continues to be both rewarding and eye-opening.
NASA Astrophysics Data System (ADS)
2012-09-01
WE RECOMMEND
Nucleus: A Trip into the Heart of Matter. A coffee-table book for everyone to dip into and learn from.
The Wonderful World of Relativity. A charming, stand-out introduction to relativity.
The Physics DemoLab, National University of Singapore. A treasure trove of physics for hands-on science experiences.
Quarks, Leptons and the Big Bang. Perfect to polish up on particle physics for older students.
Victor 70C USB Digital Multimeter. Equipment impresses for usability and value.
WORTH A LOOK
Cosmos Close-Up. Weighty tour of the galaxy that would make a good display.
Shooting Stars. Encourage students to try astrophotography with this ebook.
HANDLE WITH CARE
Head Shot: The Science Behind the JFK Assassination. Exploration of the science behind the crime fails to impress.
WEB WATCH
App-lied science for education: a selection of free Android apps are reviewed and iPhone app options are listed.
Systems biology for nursing in the era of big data and precision health.
Founds, Sandra
2017-12-02
The systems biology framework was previously synthesized with the person-environment-health-nursing metaparadigm. The purpose of this paper is to present a nursing discipline-specific perspective of the association of systems biology with big data and precision health. The fields of systems biology, big data, and precision health are now overviewed, from origins through expansions, with examples of what is being done by nurses in each area of science. Technological advances continue to expand omics and other varieties of big data that inform the person's phenotype and health outcomes for precision care. Meanwhile, millions of participants in the United States are being recruited for health-care research initiatives aimed at building the information commons of digital health data. Implications and opportunities abound via conceptualizing the integration of these fields through the nursing metaparadigm. Copyright © 2017 Elsevier Inc. All rights reserved.
Big Explosives Experimental Facility - BEEF
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The Big Explosives Experimental Facility, or BEEF, is a ten-acre fenced high-explosives testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF, conventional high-explosives experiments are conducted safely with sophisticated diagnostics such as high-speed optics and x-ray radiography.
ERIC Educational Resources Information Center
Gordon, Dan
2011-01-01
When it comes to implementing innovative classroom technology programs, urban school districts face significant challenges stemming from their big-city status. These range from large bureaucracies, to scalability, to how to meet the needs of a more diverse group of students. Because of their size, urban districts tend to have greater distance…
Collins named First Woman Shuttle Commander
NASA Astrophysics Data System (ADS)
Showstack, Randy
Just a few hours after NASA revealed that there is water ice on the Moon, U.S. First Lady Hillary Rodham Clinton introduced Air Force Lieutenant Colonel Eileen Collins to a packed auditorium at Dunbar Senior High School in Washington, D.C., as the first woman who will command a NASA space shuttle mission. With students at this school, which is noted for its pre-engineering program, cheering, Clinton said that Collins' selection “is one big step forward for women and one giant step for humanity.” Clinton added, “It doesn't matter if you are a boy or a girl, you can be an astronaut or a pilot, if you get a first-rate education in math and science.”
2009-12-10
Korean high-level delegation visits the Ames Center Director and various senior staff: John Hines, Ames Center Chief Technologist (middle left), explains PharmaSat (small satellites) to Soon-Duk Bae, Deputy Director, Big Science Policy Division, Ministry of Education, Science and Technology; Young-Mok Hyun, Deputy Director, Space Development Division, Ministry of Education, Science and Technology; and Seorium Lee, Senior Researcher, International Relations, Korea Aerospace Research Institute. Unknown person at the end of the table.
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
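The following is not from the paper; it is a minimal, self-contained sketch of the MapReduce schema described above, written in plain Python so it runs without a Hadoop or Spark installation. The word-count task and the input records are placeholders.

```python
# Minimal MapReduce-style word count in plain Python.
# Illustrates the map -> shuffle -> reduce schema without a cluster;
# the input records are placeholders.
from collections import defaultdict

records = ["big data in astronomy", "big data processing", "data everywhere"]

# Map phase: emit (key, value) pairs from each input record.
mapped = [(word, 1) for record in records for word in record.split()]

# Shuffle phase: group all emitted values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: aggregate the values for each key.
counts = {key: sum(values) for key, values in grouped.items()}
print(counts)  # e.g. {'big': 2, 'data': 3, ...}
```

In a framework such as Spark, the same pipeline would typically be expressed as flatMap, map, and reduceByKey operations over a distributed dataset, with the framework handling the shuffle and the parallelism across the cluster.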
2009-10-01
(DARPA) Legged Squad Support System (LS3) Program. DARPA's LS3 Program is an effort to develop a walking platform, preferably a quadruped, which ... top-scoring UGVs are track- or wheel-based; only the BigDog is a leg-based system. This presented BigDog with certain advantages (particularly ... Technologies, Inc.'s (DTI) first location in Ranlo, North Carolina) is a system capable of wheeled or tracked locomotion and was recently
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Carbogno, Christian; Levchenko, Sergey; Mohamed, Fawzi; Huhs, Georg; Lüders, Martin; Oliveira, Micael; Scheffler, Matthias
2017-11-01
With big-data driven materials research, the new paradigm of materials science, sharing and wide accessibility of data are becoming crucial aspects. Obviously, a prerequisite for data exchange and big-data analytics is standardization, which means using consistent and unique conventions for, e.g., units, zero baselines, and file formats. There are two main strategies to achieve this goal. One accepts the heterogeneous nature of the community, which comprises scientists from physics, chemistry, bio-physics, and materials science, by complying with the diverse ecosystem of computer codes and thus develops "converters" for the input and output files of all important codes. These converters then translate the data of each code into a standardized, code-independent format. The other strategy is to provide standardized open libraries that code developers can adopt for shaping their inputs, outputs, and restart files, directly into the same code-independent format. In this perspective paper, we present both strategies and argue that they can and should be regarded as complementary, if not even synergetic. The formats and conventions presented here were agreed upon by two teams, the Electronic Structure Library (ESL) of the European Center for Atomic and Molecular Computations (CECAM) and the NOvel MAterials Discovery (NOMAD) Laboratory, a European Centre of Excellence (CoE). A key element of this work is the definition of hierarchical metadata describing state-of-the-art electronic-structure calculations.
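The following toy sketch is not part of the abstract; it illustrates the first, "converter" strategy described above by mapping code-specific output fields onto a single code-independent record. The key names, the unit conversion, and the output layout are invented for illustration and are not the actual NOMAD/ESL metadata schema.

```python
# Toy "converter": translate code-specific output fields into one
# code-independent metadata record. Key names, the unit conversion, and
# the target layout are illustrative, not the real NOMAD/ESL schema.
import json

HARTREE_TO_EV = 27.211386245988  # unit-conversion factor (CODATA value)

def convert_code_a(raw: dict) -> dict:
    """Hypothetical code A reports total energy in Hartree under 'etot'."""
    return {
        "program_name": "code_a",
        "total_energy_eV": raw["etot"] * HARTREE_TO_EV,
        "xc_functional": raw["xc"],
    }

def convert_code_b(raw: dict) -> dict:
    """Hypothetical code B already reports eV, but under different key names."""
    return {
        "program_name": "code_b",
        "total_energy_eV": raw["total_energy"],
        "xc_functional": raw["functional"],
    }

outputs = [
    convert_code_a({"etot": -1.1338, "xc": "PBE"}),
    convert_code_b({"total_energy": -30.85, "functional": "PBE"}),
]
print(json.dumps(outputs, indent=2))  # same schema regardless of the source code
```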
2009-12-10
Korean high-level delegation visits the Ames Center Director and various senior staff. From left to right: Gary Martin, Director of New Ventures and Communication, NASA Ames; Chris Giulietti, NASA Headquarters; Soon-Duk Bae, Deputy Director, Big Science Policy Division, Ministry of Education, Science and Technology; Young-Mok Hyun, Deputy Director, Space Development Division, Ministry of Education, Science and Technology; Yvonne Pendleton, Director of the Lunar Science Institute; Terry Pagaduan, Ames Government Relations/Legislative Affairs Office; and Seorium Lee, Senior Researcher, International Relations, Korea Aerospace Research Institute.
Using AER to Improve Teacher Education
NASA Astrophysics Data System (ADS)
Ludwig, Randi R.
2013-06-01
In many ways, the astronomy education community is uniquely poised to influence pre-service and in-service teacher preparation. Astro101 courses are among those most commonly taken to satisfy general education requirements for non-science majors, including 9-25% education majors (Deming & Hufnagel, 2001; Rudolph et al. 2010). In addition, the astronomy community's numerous observatories and NASA centers engage in many efforts to satisfy demand for in-service teacher professional development (PD). These efforts represent a great laboratory in which we can apply conclusions from astronomy education research (AER) studies in particular and science education research (SER) in general. Foremost, we can work to align typical Astro101 and teacher PD content coverage to heavily hit topics in the Next Generation Science Standards (http://www.nextgenscience.org/) and utilize methods of teaching those topics that have been identified as successful in AER studies. Additionally, we can work to present teacher education using methodology that has been identified by the SER community as effective for lasting learning. In this presentation, I will highlight some of the big ideas from AER and SER that may be most useful in teacher education, many of which we implement at UT Austin in the Hands-on-Science program for pre-service teacher education and in-service teacher PD.
Smarter Earth Science Data System
NASA Technical Reports Server (NTRS)
Huang, Thomas
2013-01-01
The explosive growth in Earth observational data in the recent decade demands a better method of interoperability across heterogeneous systems. The Earth science data system community has mastered the art of storing large volumes of observational data, but it is still unclear how this traditional method scales over time as we enter the age of Big Data. Indexed search solutions such as Apache Solr (Smiley and Pugh, 2011) provide fast, scalable search via keywords or phrases, without any reasoning or inference. Modern search solutions such as Google's Knowledge Graph (Singhal, 2012) and Microsoft Bing all utilize semantic reasoning to improve search accuracy. The Earth science user community is demanding an intelligent solution to help them find the right data for their research. The Ontological System for Context Artifacts and Resources (OSCAR) (Huang et al., 2012) was created in response to the DARPA Adaptive Vehicle Make (AVM) program's need for an intelligent context-model management system to empower its terrain simulation subsystem. The core component of OSCAR, the Environmental Context Ontology (ECO), is built using the Semantic Web for Earth and Environmental Terminology (SWEET) (Raskin and Pan, 2005). This paper presents the current data archival methodology within NASA Earth science data centers and discusses using the semantic web to improve the way we capture and serve data to our users.
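The following is not code from the paper; it is a hedged illustration of the reasoning-free keyword search the abstract contrasts with semantic approaches, issued against a Solr index through its standard /select handler. The host, core name, field names, and query are hypothetical.

```python
# Plain keyword search against a Solr index via the standard /select handler.
# Host, core name, and field names are hypothetical; this shows the
# reasoning-free keyword search that the abstract contrasts with semantic search.
import requests

SOLR_SELECT = "http://localhost:8983/solr/earth_science/select"  # hypothetical core

params = {
    "q": 'title:"sea surface temperature"',  # exact keyword/phrase matching only
    "rows": 10,
    "wt": "json",
}
response = requests.get(SOLR_SELECT, params=params, timeout=30)
response.raise_for_status()

for doc in response.json()["response"]["docs"]:
    print(doc.get("id"), doc.get("title"))

# A semantic layer (e.g., an ontology built on SWEET, as in OSCAR's ECO) could
# additionally match related concepts such as "SST" or "ocean temperature",
# which a pure keyword index would miss.
```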
NASA's Swift Education and Public Outreach Program
NASA Astrophysics Data System (ADS)
Cominsky, L. R.; Graves, T.; Plait, P.; Silva, S.; Simonnet, A.
2004-08-01
Few astronomical objects excite students more than big explosions and black holes. Gamma Ray Bursts (GRBs) are both: powerful explosions that signal the births of black holes. NASA's Swift satellite mission, set for launch in Fall 2004, will detect hundreds of black holes over its two-year nominal mission timeline. The NASA Education and Public Outreach (E/PO) group at Sonoma State University is leading the Swift E/PO effort, using the Swift mission to engage students in science and math learning. We have partnered with the Lawrence Hall of Science to create a "Great Explorations in Math and Science" guide entitled "Invisible Universe: from Radio Waves to Gamma Rays," which uses GRBs to introduce students to the electromagnetic spectrum and the scale of energies in the Universe. We have also created new standards-based activities for grades 9-12 using GRBs: one activity puts the students in the place of astronomers 20 years ago, trying to sort out various types of stellar explosions that create high-energy radiation. Another mimics the use of the Interplanetary Network to let students figure out the direction to a GRB. Post-launch materials will include magazine articles about Swift and GRBs, and live updates of GRB information to the Swift E/PO website that will excite and inspire students to learn more about space science.
1989-05-01
AFB Statement of Work. The Phase II Stage 1 investigation of Malmstrom AFB and two off-Base sites at Shelby and Brady, Montana are described in a ... Flathead National Forest Headquarters in Kalispell and the Big Fork Ranger Station in Big Fork. Because there has been litigation involved with this ... reviewed by phone. An interview at the spill site was held with Mr. William Pedersen, U.S. Forest Service Ranger for the Big Fork District. He personally
NASA Technical Reports Server (NTRS)
Hausrath, E. M.; Ming, D. W.; Peretyazhko, T.; Rampe, E. B.
2017-01-01
Water flowing through sediments at Gale Crater, Mars, created environments that were likely habitable and sampled basin-wide hydrological systems. However, many questions remain about these environments and the fluids that generated them. Measurements of multiple fracture zones taken by the Mars Science Laboratory rover Curiosity can help constrain the environments that formed them, because the fracture zones can be compared to nearby associated parent material (Figure 1). For example, measurements of altered fracture zones from the target Greenhorn in the Stimson sandstone can be compared to parent material measured in the nearby Big Sky target, allowing constraints to be placed on the alteration conditions that formed the Greenhorn target from the Big Sky target. Similarly, CheMin measurements of the powdered <150 micron fraction from the drill hole at Big Sky and of sample from the Rocknest eolian deposit indicate that the mineralogies are strikingly similar. The main differences are the presence of olivine in the Rocknest eolian deposit, which is absent in the Big Sky target, and the presence of far more abundant Fe oxides in the Big Sky target. Quantifying the changes between the Big Sky target and the Rocknest eolian deposit can therefore help us understand the diagenetic changes that occurred in forming the Stimson sedimentary unit. In order to interpret these aqueous changes, we performed reactive transport modeling of 1) the formation of the Big Sky target from a Rocknest eolian deposit-like parent material, and 2) the formation of the Greenhorn target from the Big Sky target. This work allows us to test the relationships between the targets and the characteristics of the aqueous conditions that formed the Greenhorn target from the Big Sky target, and the Big Sky target from a Rocknest eolian deposit-like parent material.
Questions Come Alive. 2012 Annual Report
ERIC Educational Resources Information Center
Smithsonian Institution, 2012
2012-01-01
At the Smithsonian, big questions are the common denominators that unite their museums, research centers, and programs. And curiosity is contagious. Today's small child who looks in awe at the big dinosaurs that once roamed the earth and wonders, "What happened?" might just be tomorrow's paleontologist. Among this year's stories is a…
CyberConnect: Use the Internet with Big6[R] Skills To Achieve Standards.
ERIC Educational Resources Information Center
Murray, Janet
2003-01-01
Describes the use of Big6 strategies in guiding student research projects as part of a cooperative program between teachers and the school librarian. Topics include information seeking strategies; evaluating information sources; locating information using search engines; analyzing information sources; and achieving information literacy and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-23
... under the National Coastal Wetland Conservation Grant Program for habitat restoration activities. The... Creek Restoration Final Environmental Impact Statement/Environmental Impact Report, Big Lagoon, Muir... impact report (EIS/EIR) for the Wetland and Creek Restoration at Big Lagoon, Muir Beach, California...
Enabling a new Paradigm to Address Big Data and Open Science Challenges
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward
2017-04-01
Data are not only the lifeblood of the geosciences; they have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies, along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system, are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change, and NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on the area of "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, successfully leveraging the enormous potential of cloud technologies will require data providers and the scientific communities to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers also need to give scientists an ecosystem that includes data, tools, workflows and other services needed to perform analytics, integration, interpretation, and synthesis, all in the same environment or platform. Instead of moving data to processing systems near users, as is the tradition, the cloud permits one to bring processing, computing, analysis and visualization to data, so-called data-proximate workbench capabilities, also known as server-side processing. In this talk, I will present the ongoing work at Unidata to facilitate a new paradigm for doing science by offering a suite of tools, resources, and platforms to leverage cloud technologies for addressing both big data and Open Science/reproducibility challenges. That work includes the development and deployment of new protocols for data access and server-side operations, Docker container images of key applications, JupyterHub Python notebook tools, and cloud-based analysis and visualization capabilities via the CloudIDV tool, to enable reproducible workflows and effective use of the accessed data.
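The following is not from the talk; it is a small, hypothetical illustration of the data-proximate access pattern described above, using Unidata's netCDF4-python library to subset a remote dataset through an OPeNDAP endpoint so that only the requested slice crosses the network. The URL and variable names are placeholders.

```python
# Data-proximate access sketch: subset a remote dataset over OPeNDAP so only
# the requested slice is transferred. The URL and variable names are placeholders.
from netCDF4 import Dataset

OPENDAP_URL = "https://example.org/thredds/dodsC/some/model/output.nc"  # placeholder

with Dataset(OPENDAP_URL) as ds:
    # Slicing a remote variable issues a server-side subset request,
    # so the full multi-terabyte file never has to be downloaded locally.
    temperature = ds.variables["air_temperature"][0, 0, :, :]
    print(temperature.shape, float(temperature.mean()))
```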
Constraint programming based biomarker optimization.
Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng
2015-01-01
Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists. Interactive visualization and exploration of big data has proven to be one of the successful solutions. Most existing feature selection algorithms do not allow interactive input from users during the feature selection optimization process. This study investigates this question by fixing a few user-input features in the finally selected feature subset and formulating these user-input features as constraints for a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than the existing feature selection algorithms, even with the constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet-lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for the interactive exploration of bio-OMIC big data by interactively adding user-defined constraints for modeling.
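The following is not the published fsCoP algorithm; it is a minimal scikit-learn sketch of the underlying idea, in which a few user-specified features are fixed into the final subset and the remaining slots are filled by an ordinary univariate filter. The dataset, feature indices, and parameters are illustrative.

```python
# Sketch of constrained feature selection: user-fixed features are always kept,
# and the remaining slots are filled by a standard univariate filter.
# This illustrates the general idea, not the published fsCoP algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=50, n_informative=8,
                           random_state=0)

user_constraints = [3, 17]   # features the user insists on keeping (illustrative)
n_total = 10                 # size of the final feature subset

# Rank the unconstrained features and keep the best of them.
free = [i for i in range(X.shape[1]) if i not in user_constraints]
selector = SelectKBest(score_func=f_classif, k=n_total - len(user_constraints))
selector.fit(X[:, free], y)
chosen_free = [free[i] for i in np.where(selector.get_support())[0]]

selected = sorted(user_constraints + chosen_free)
scores = cross_val_score(LogisticRegression(max_iter=1000), X[:, selected], y, cv=5)
print("selected features:", selected)
print("cross-validated accuracy: %.3f" % scores.mean())
```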
Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding
NASA Astrophysics Data System (ADS)
McConnell, Tom J.; Parker, Joyce M.; Eberhardt, Jan
2013-06-01
One of the characteristics of effective science teachers is a deep understanding of science concepts. The ability to identify, explain and apply concepts is critical in designing, delivering and assessing instruction. Because some teachers have not completed extensive courses in some areas of science, especially in middle and elementary grades, many professional development programs attempt to strengthen teachers' content knowledge. Assessing this content knowledge is challenging. Concept inventories are reliable and efficient, but do not reveal depth of knowledge. Interviews and observations are time-consuming. The Problem Based Learning Project for Teachers implemented a strategy that includes pre-post instruments in eight content strands that permits blind coding of responses and comparison across teachers and groups of teachers. The instruments include two types of open-ended questions that assess both general knowledge and the ability to apply Big Ideas related to specific science topics. The coding scheme is useful in revealing patterns in prior knowledge and learning, and identifying ideas that are challenging or not addressed by learning activities. The strengths and limitations of the scoring scheme are identified through comparison of the findings to case studies of four participating teachers from middle and elementary schools. The cases include examples of coded pre- and post-test responses to illustrate some of the themes seen in teacher learning. The findings raise questions for future investigation that can be conducted using analyses of the coded responses.
NASA Astrophysics Data System (ADS)
O'Connell, E. A.
2017-12-01
The Frontier Scientists National Science Foundation project titled Science in Alaska: Using Multimedia to Support Science Education produced research products in several formats: videos short and long, blogs, social media, a computer game, and a pop-up book. These formats reached distinctly different audiences. Internet users, public TV viewers, gamers, schools, and parents and young children were drawn to Frontier Scientists' research in direct and indirect ways. The analytics (our big data) derived from this media broadcast have given us insight into what works, what doesn't, and next steps. We have evidence for what is needed to present science as an interesting, vital, and necessary component of the general public's daily information diet and as an important tool for scientists to publicize research and to thrive in their careers. Collaborations with scientists at several universities, the USGS, Native organizations, tourism organizations, and Alaska museums promoted the accuracy of the videos and increased viewing. For example, Erin Marbarger at the Anchorage Museum edited the pop-up book, titled The Adventures of Apun the Arctic Fox, and provided Spark!Lab to test parents' and children's interest in it. Even without a marketing budget, Frontier Scientists' minimal publicity during the three-year project still drew an audience. Frontier Scientists was awarded Best Website 2016 by the Alaska Press Club and won a number of awards for short videos and TV programs.
NASA Astrophysics Data System (ADS)
Charles, Karen Jungblut
For much of this century, mathematics and science have been taught in a didactic manner that is characterized by a passive student and a lecturing teacher. Since the late eighties national standards have encouraged professional developers specializing in mathematics and science education to deliver the messages of inquiry-based learning, active student engagement, and learner-constructed knowledge to the teachers they support. Follow-up studies of professional development programs, however, found that telling teachers was no more effective than telling students. Information transmitted in a passive setting was not transferring into effective classroom practices. This phenomenological case study was conducted to determine the effects of a constructivist-oriented professional development experience, the Technical Assistance Academy, in changing the practices and attitudes of mathematics and science professional developers regarding the use of constructivist strategies in workshop design. This study focused on 45 professional developers who participated in the Technical Assistance Academy. Data from a 2 1/2 year period were collected from session evaluations, journal reflections, a follow-up interview, and site visits that included observations and collaborative planning. Content analysis procedures were used to find common themes among the data. Use of new skills developed as a result of participation in the Technical Assistance Academy was determined using the Concerns-Based Adoption Model Levels of Use framework (Hall & Hord, 1987). Changes in attitude were determined by examining participants' journal reflections related to common constructivist themes such as those discussed by Fosnot (1996c): learning is developmental, disequilibrium and reflection facilitate learning, and the construction of "big ideas" results from the opportunity to struggle with new information. Results verified that all 45 participants demonstrated some level of use, and that most were in the 3 highest of 5 levels of use: mechanical (11%), routine (16%), refinement (27%), integration (24%), and renewal (22%). Participants reported valuing (a) active engagement necessary for the developmental progression of learning to occur, (b) their own disequilibrium, (c) opportunity to reflect, and they acknowledged a clearer understanding and appreciation of the big ideas in workshop design such as networking, collaboration, content and staff development standards, equity, and community building. Results support the conclusion that learning about constructivist instructional strategies in a long-term program that models them positively affects participants' attitudes and enhances their use of similar strategies in the design of professional development experiences for others. Knowledge developed in a constructivist setting transferred into effective facilitator practices.