Sample records for scientific analysis tools

  1. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  2. Scientific Platform as a Service - Tools and solutions for efficient access to and analysis of oceanographic data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hansen, Morten W.; Korosov, Anton

    2017-04-01

    Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval, and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog that stores granular metadata describing the structure, location, and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
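
    Illustrative sketch (not part of the record above): what a granular catalog search could look like from a user's analysis tool, assuming a hypothetical HTTP endpoint and response schema rather than the actual SPaaS API.

      # Hypothetical SPaaS-style catalog query; endpoint and fields are invented.
      import requests

      CATALOG_URL = "https://example.org/spaas/catalog/search"  # placeholder URL

      params = {
          "bbox": "0.0,55.0,30.0,75.0",     # lon/lat bounding box (Norwegian Sea)
          "time": "2016-01-01/2016-01-31",  # ISO 8601 interval
          "platform": "Sentinel-1",         # satellite platform to match
      }

      response = requests.get(CATALOG_URL, params=params, timeout=30)
      response.raise_for_status()

      # Each record is assumed to carry granular metadata, including an OPeNDAP
      # URL so a subset can be streamed without downloading whole files.
      for record in response.json().get("records", []):
          print(record["id"], record["opendap_url"])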

  3. Supporting Scientific Analysis within Collaborative Problem Solving Environments

    NASA Technical Reports Server (NTRS)

    Watson, Velvin R.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high-resolution graphics. To illustrate that the specified design criteria can produce a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that meets the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses. The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high-resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high-resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to its responsiveness in stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1 kbit/second even though the scientists at all sites are viewing high-resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.
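
    Editorial sketch of why the reported peak bandwidth can stay near 1 kbit/second: only compact view-state messages cross the network, while each site re-renders the full-resolution scene from local data. The message fields below are invented for illustration.

      # Toy view-state message for collaborative visualization (illustrative only).
      import json

      def encode_view_state(camera_position, camera_target, time_step):
          """Pack the shared analysis state into a small JSON message."""
          msg = json.dumps({
              "pos": camera_position,  # (x, y, z) camera location
              "tgt": camera_target,    # (x, y, z) look-at point
              "t": time_step,          # current simulation time step
          })
          return msg.encode("utf-8")

      msg = encode_view_state((1.0, 2.0, 5.0), (0.0, 0.0, 0.0), 42)
      # Roughly 100 bytes per update; at a few updates per second this stays
      # near 1 kbit/s, independent of the rendered resolution.
      print(len(msg) * 8, "bits per update")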

  4. Tools for Data Analysis in the Middle School Classroom: A Teacher Professional Development Program

    NASA Astrophysics Data System (ADS)

    Ledley, T. S.; Haddad, N.; McAuliffe, C.; Dahlman, L.

    2006-12-01

    In order for students to learn how to engage with scientific data to answer questions about the real world, it is imperative that their teachers 1) are comfortable with the data and the tools used to analyze it, and 2) feel prepared to support their students in this complex endeavor. TERC's Tools for Data Analysis in the Middle School Classroom (DataTools) professional development program, funded by NSF's ITEST program, prepares middle school teachers to integrate Web-based scientific data and analysis tools into their existing curricula. This 13-month program supports teachers in using a set of freely or commonly available tools with a wide range of data. It also gives them an opportunity to practice teaching these skills to students before teaching in their own classrooms. The ultimate goal of the program is to increase the number of middle school students who work directly with scientific data, who use the tools of technology to import, manipulate, visualize and analyze the data, who come to understand the power of data-based arguments, and who will consider pursuing careers in technical and scientific fields. In this session, we will describe the elements of the DataTools program and the Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet), a Web-based resource that supports Earth system education for teachers and students in grades 6 through 16. The EET provides essential support to DataTools teachers as they use it to learn to locate and download Web-based data and use data analysis tools. We will also share what we have learned during the first year of this three-year program.

  5. Scientific Mobility and International Research Networks: Trends and Policy Tools for Promoting Research Excellence and Capacity Building

    ERIC Educational Resources Information Center

    Jacob, Merle; Meek, V. Lynn

    2013-01-01

    One of the ways in which globalization is manifesting itself in higher education and research is through the increasing importance and emphasis on scientific mobility. This article seeks to provide an overview and analysis of current trends and policy tools for promoting mobility. The article argues that the mobility of scientific labour is an…

  6. Bringing "Scientific Expeditions" into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Kutler, Paul (Technical Monitor)

    1994-01-01

    Schools can obtain scientific information over the information superhighway. However, information suppliers use formats that permit access and analysis only by the "least common denominator" of tools. The result: most sources of dynamic representations of science are in the format of flat movies. We can shorten the time to get "scientific expeditions" into schools and provide a unifying focus to vendors and information suppliers by establishing a target and goals for the "least common denominator" of tools used to access and analyze information over the information superhighway.

  7. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.
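
    As a rough illustration of the "few lines of code" claim, here is an analogous example using the open-source folium library as a stand-in; the actual GABBs map widgets have their own API, which this sketch does not reproduce.

      # Generate a self-contained web map in a few lines (folium, not GABBs).
      import folium

      m = folium.Map(location=[40.43, -86.91], zoom_start=10)  # hypothetical site
      folium.Marker([40.43, -86.91], popup="Station A: 12.3 mg/L").add_to(m)
      m.save("map.html")  # viewable in any browser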

  8. Instruments of scientific visual representation in atomic databases

    NASA Astrophysics Data System (ADS)

    Kazakov, V. V.; Kazakov, V. G.; Meshkov, O. I.

    2017-10-01

    Graphic tools for spectral data representation provided by operating information systems on atomic spectroscopy (ASD NIST, VAMDC, SPECTR-W3, and Electronic Structure of Atoms) in support of scientific research and human-resource development are presented. Tools for the visual representation of scientific data, such as spectrogram and Grotrian diagram plotting, are considered. The possibility of comparative analysis between experimentally obtained spectra and reference spectra of atomic systems formed from a resource's database is described. Techniques for accessing these graphic tools are presented.

  9. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and the international literature ever increasing, authors and reviewers not only benefit from an abundance of information but are also continuously confronted with the risk of erroneously copying another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to check the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages. PMID:21487489

  10. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and the international literature ever increasing, authors and reviewers not only benefit from an abundance of information but are also continuously confronted with the risk of erroneously copying another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to check the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages.

  11. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years, beginning in 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine in Amazon Web Services with CMDA installed. A provenance management system for CMDA was developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described in terms of its scientific goal, the datasets and analysis tools used, the scientific results discovered, analysis outputs such as plots and data files, and a link to the exact analysis service call with all input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model against MODIS total cloud fraction. The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job of simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
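
    A hedged sketch of what invoking such an analysis service might look like; the endpoint and parameter names below are hypothetical stand-ins, not CMDA's documented interface.

      # Hypothetical call to a CMDA-style "difference plot of two variables" service.
      import requests

      SERVICE_URL = "https://example.org/cmda/difference_plot_two_variables"

      params = {
          "dataset1": "NCAR_CAM5_total_cloud_fraction",  # model variable
          "dataset2": "MODIS_total_cloud_fraction",      # observational variable
          "start": "2004-01",
          "end": "2005-12",
      }

      r = requests.get(SERVICE_URL, params=params, timeout=60)
      r.raise_for_status()
      result = r.json()
      # A provenance-aware service can return the plot plus the exact call used,
      # so the analysis can be re-run or shared verbatim.
      print(result.get("plot_url"), result.get("provenance_id"))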

  12. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance of complex workflows that run over a large number of nodes with multiple parallel task executions, given the terabytes or petabytes of workflow and measurement data they produce. To help identify performance bottlenecks and debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical and data mining methods to the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze large amounts of different types of logs and measurements. To illustrate the functionality of the framework, we conduct case studies on workflows from an astronomy project known as the Palomar Transient Factory (PTF) and on job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and big data workflows.
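
    A minimal sketch of the log-ingestion step described above, assuming a simple whitespace-delimited log format; the actual framework runs big-data engines over far larger inputs.

      # Extract per-host performance features from a (hypothetical) system log.
      import pandas as pd

      rows = []
      with open("system.log") as f:
          for line in f:
              # Assumed format: "<timestamp> <host> <metric> <value>"
              ts, host, metric, value = line.split()[:4]
              rows.append((pd.Timestamp(ts), host, metric, float(value)))

      df = pd.DataFrame(rows, columns=["time", "host", "metric", "value"])

      # Summarize I/O wait per host to flag nodes that look like bottlenecks.
      iowait = df[df.metric == "iowait"]
      print(iowait.groupby("host")["value"].agg(["mean", "max"]))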

  13. Biblio-MetReS: A bibliometric network reconstruction application and server

    PubMed Central

    2011-01-01

    Background Reconstruction of gene and/or protein networks from automated analysis of the literature is one of the current targets of text mining in biomedical research. Some user-friendly tools already perform this analysis on precompiled databases of abstracts of scientific papers. Other tools allow expert users to elaborate and analyze the full content of a corpus of scientific documents. However, to our knowledge, no user-friendly tool that simultaneously analyzes the latest set of scientific documents available online and reconstructs the set of genes referenced in those documents is available. Results This article presents such a tool, Biblio-MetReS, and compares its functioning and results to those of other user-friendly applications (iHOP, STRING) that are widely used. Under similar conditions, Biblio-MetReS creates networks that are comparable to those of other user-friendly tools. Furthermore, analysis of full-text documents provides more complete reconstructions than those that result from using only the abstract of the document. Conclusions Literature-based automated network reconstruction is still far from providing complete reconstructions of molecular networks. However, its value as an auxiliary tool is high and it will increase as standards for reporting biological entities and relationships become more widely accepted and enforced. Biblio-MetReS is an application that can be downloaded from http://metres.udl.cat/. It provides an easy-to-use environment for researchers to reconstruct their networks of interest from an always up-to-date set of scientific documents. PMID:21975133
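
    A toy version of the literature-based reconstruction idea: link two genes whenever they are mentioned in the same document. The per-document gene lists are assumed to come from an upstream text-mining step, not computed here.

      # Build a gene co-occurrence network from per-document gene mentions.
      import itertools
      import networkx as nx

      documents = {
          "doc1": {"TP53", "MDM2", "CDKN1A"},
          "doc2": {"TP53", "MDM2"},
          "doc3": {"CDKN1A", "CCND1"},
      }

      G = nx.Graph()
      for genes in documents.values():
          for a, b in itertools.combinations(sorted(genes), 2):
              if G.has_edge(a, b):
                  G[a][b]["weight"] += 1  # count co-occurrences across the corpus
              else:
                  G.add_edge(a, b, weight=1)

      for a, b, d in G.edges(data=True):
          print(f"{a} -- {b}: {d['weight']} co-occurrence(s)")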

  14. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.

    PubMed

    Ahmed, Zeeshan; Dandekar, Thomas

    2015-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present 'Mining Scientific Literature (MSL)', a bioinformatics tool based on a product line architecture, which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML, and image files. Hence, MSL is an easy-to-install-and-use analysis tool for interpreting published scientific literature in PDF format.
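
    An illustrative sketch of the two extraction steps such tools perform, using pdfplumber and pytesseract as stand-ins for the paper's own pipeline; file names are hypothetical.

      # Step 1: extract text from the PDF layout; step 2: OCR text inside figures.
      import pdfplumber
      import pytesseract
      from PIL import Image

      with pdfplumber.open("article.pdf") as pdf:
          for page in pdf.pages:
              text = page.extract_text() or ""
              print(text[:200])  # first characters of each page

      figure = Image.open("figure1.png")          # a figure exported as an image
      print(pytesseract.image_to_string(figure))  # embedded text via OCR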

  15. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratories. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.

  16. On a Modern Philosophy of Evaluating Scientific Publications

    NASA Astrophysics Data System (ADS)

    Guz, A. N.; Rushchitsky, J. J.; Chernyshenko, I. S.

    2005-10-01

    Current approaches to the citation analysis of scientific publications are outlined. The Science Citation Index, Impact Factor, Immediacy Index, and the selection procedure for Essential Science Indicators (a relatively new citation analysis tool) are described. This new citation evaluation tool has not yet been discussed adequately by mechanicians.

  17. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format

    PubMed Central

    Ahmed, Zeeshan; Dandekar, Thomas

    2018-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present ‘Mining Scientific Literature (MSL)’, a bioinformatics tool based on a product line architecture, which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system’s output in different formats including text, PDF, XML, and image files. Hence, MSL is an easy-to-install-and-use analysis tool for interpreting published scientific literature in PDF format. PMID:29721305

  18. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
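
    A minimal sketch of what a shared QC building block can look like; the bounds below are illustrative defaults, not GLEON's actual thresholds.

      # Flag sensor values outside a plausible physical range before analysis.
      def range_check(values, low, high):
          """Return (index, value) pairs that fail the range test."""
          return [(i, v) for i, v in enumerate(values) if not (low <= v <= high)]

      water_temp_c = [4.1, 5.0, 55.2, 6.3, -40.0]  # two obviously bad readings
      print(range_check(water_temp_c, low=-0.5, high=40.0))
      # [(2, 55.2), (4, -40.0)]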

  19. Using Network Analysis to Characterize Biogeographic Data in a Community Archive

    NASA Astrophysics Data System (ADS)

    Wellman, T. P.; Bristol, S.

    2017-12-01

    Informative measures are needed to evaluate and compare data from multiple providers in a community-driven data archive. This study explores insights from network theory and other descriptive and inferential statistics to examine data content and application across an assemblage of publicly available biogeographic data sets. The data are archived in ScienceBase, a collaborative catalog of scientific data supported by the U.S. Geological Survey to enhance scientific inquiry and acuity. In gaining understanding through this investigation and other scientific venues, our goal is to improve scientific insight and data use across a spectrum of scientific applications. Network analysis is a tool for revealing patterns of non-trivial topological features in data that exhibit neither complete regularity nor complete randomness. In this work, network analyses are used to explore shared events and dependencies between measures of data content and application derived from metadata and catalog information and measures relevant to biogeographic study. Descriptive statistical tools are used to explore relations between network analysis properties, while inferential statistics are used to evaluate the degree of confidence in these assessments. Network analyses have been used successfully in related fields to examine social awareness of scientific issues, taxonomic structures of biological organisms, and ecosystem resilience to environmental change. Network analysis also shows promising potential to identify relationships in biogeographic data that inform programmatic goals and scientific interests.
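
    A sketch of the network idea with invented names: build a bipartite graph linking data sets to the taxa they contain, then project it onto data sets to see which ones share content.

      # Bipartite dataset-taxon graph and its dataset projection (illustrative).
      import networkx as nx
      from networkx.algorithms import bipartite

      B = nx.Graph()
      datasets = ["ds1", "ds2", "ds3"]
      B.add_nodes_from(datasets, bipartite=0)
      B.add_edges_from([
          ("ds1", "Ursus arctos"), ("ds1", "Lynx canadensis"),
          ("ds2", "Ursus arctos"), ("ds3", "Castor canadensis"),
      ])

      # An edge in the projection means two data sets share at least one taxon.
      P = bipartite.projected_graph(B, datasets)
      print(list(P.edges()))          # [('ds1', 'ds2')]
      print(nx.degree_centrality(P))  # simple measure of a data set's overlap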

  20. Scientific Reasoning Abilities in Kindergarten: Dynamic Assessment of the Control of Variables Strategy

    ERIC Educational Resources Information Center

    van der Graaf, Joep; Segers, Eliane; Verhoeven, Ludo

    2015-01-01

    A dynamic assessment tool was developed and validated using Mokken scale analysis to assess the extent to which kindergartners are able to construct unconfounded experiments, an essential part of scientific reasoning. Scientific reasoning is one of the learning processes happening within science education. A commonly used, hands-on,…

  1. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  2. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. We share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  3. Application of the enterprise management tools Lean Six Sigma and PMBOK in developing a program of research management.

    PubMed

    Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente

    2012-01-01

    We introduce a program for the management of scientific research in a general hospital employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, with subsequent analysis based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems, and organizational culture. A preliminary analysis showed these solutions to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and they constitute a model that can contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.

  4. Exploring the Potential for Using Inexpensive Natural Reagents Extracted from Plants to Teach Chemical Analysis

    ERIC Educational Resources Information Center

    Hartwell, Supaporn Kradtap

    2012-01-01

    A number of scientific articles report on the use of natural extracts from plants as chemical reagents, where the main objective is to present the scientific applications of those natural plant extracts. The author suggests that natural reagents extracted from plants can be used as alternative low cost tools in teaching chemical analysis,…

  5. PubFinder: a tool for improving retrieval rate of relevant PubMed abstracts.

    PubMed

    Goetz, Thomas; von der Lieth, Claus-Wilhelm

    2005-07-01

    Since it is becoming increasingly laborious to manually extract useful information embedded in the ever-growing volumes of literature, automated intelligent text analysis tools are becoming more and more essential to assist in this task. PubFinder (www.glycosciences.de/tools/PubFinder) is a publicly available web tool designed to improve the retrieval rate of scientific abstracts relevant to a specific scientific topic. All that is required is the selection of a representative set of abstracts that are central to the topic. No special knowledge of the query syntax is necessary. Based on the selected abstracts, a list of discriminating words is automatically calculated, which is subsequently used to score all defined PubMed abstracts for their probability of belonging to the defined scientific topic. This results in a hit list of references in descending order of their likelihood score. The algorithms and procedures implemented in PubFinder help every scientist with the perpetual task of staying up to date with current publications dealing with a specific subject in biomedicine.
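
    A toy reimplementation of the discriminating-word idea: learn words that separate a seed set of on-topic abstracts from background text, then score new abstracts by summing log-odds. Add-one smoothing is an assumption here; the record does not specify PubFinder's exact weighting scheme.

      # Score abstracts by discriminating words learned from a seed set.
      import math
      from collections import Counter

      def word_counts(texts):
          c = Counter()
          for t in texts:
              c.update(t.lower().split())
          return c

      seed = ["glycan binding protein structure", "protein glycosylation analysis"]
      background = ["stock market trends", "protein shake recipes and fitness"]

      s, b = word_counts(seed), word_counts(background)
      vocab = set(s) | set(b)

      def log_odds(w):
          # Add-one smoothing keeps unseen words from giving infinite scores.
          p_seed = (s[w] + 1) / (sum(s.values()) + len(vocab))
          p_bg = (b[w] + 1) / (sum(b.values()) + len(vocab))
          return math.log(p_seed / p_bg)

      def score(abstract):
          return sum(log_odds(w) for w in abstract.lower().split())

      print(score("glycan structure analysis"))   # high: on-topic
      print(score("market trends this quarter"))  # low: off-topic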

  6. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis, and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis, and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits, and libraries, and showcase scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  7. Report on Automated Semantic Analysis of Scientific and Engineering Codes

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided only with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even as hardware costs and execution times continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. This work has a distinguishing emphasis: the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code, and its comprehension is measured by quantitatively evaluating overall recognition on practical codes. This emphasis is necessary if software errors like the MCO error are to be quickly and inexpensively avoided in the future. This report evaluates the progress made on this problem. It presents recommendations and describes the approach, the tool's status, the challenges, related research, and a development strategy.
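
    A toy illustration of the kind of semantic check such recognition enables: tag quantities with physical units so an MCO-style mismatch is caught instead of silently corrupting a trajectory. This is not the report's tool, only a sketch of the payoff.

      # Unit-tagged quantities that refuse to combine across unit systems.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Quantity:
          value: float
          unit: str  # canonical unit label

          def __add__(self, other: "Quantity") -> "Quantity":
              if self.unit != other.unit:
                  raise ValueError(f"unit mismatch: {self.unit} vs {other.unit}")
              return Quantity(self.value + other.value, self.unit)

      LBF_S_TO_N_S = 4.448222  # pound-force seconds to newton seconds

      impulse_si = Quantity(120.0, "N*s")
      impulse_converted = Quantity(27.0 * LBF_S_TO_N_S, "N*s")  # converted first
      total = impulse_si + impulse_converted                    # OK

      impulse_bad = Quantity(27.0, "lbf*s")
      # impulse_si + impulse_bad  # would raise ValueError: unit mismatch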

  8. Science as Structured Imagination

    ERIC Educational Resources Information Center

    De Cruz, Helen; De Smedt, Johan

    2010-01-01

    This paper offers an analysis of scientific creativity based on theoretical models and experimental results of the cognitive sciences. Its core idea is that scientific creativity--like other forms of creativity--is structured and constrained by prior ontological expectations. Analogies provide scientists with a powerful epistemic tool to overcome…

  9. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The MATLAB-based analysis framework and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analyses with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
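
    A sketch of the selection step in beam finding: a FastBit-style range query reduces millions of simulated particles to the energetic few worth further analysis. Here numpy boolean indexing stands in for the index/query system, with invented data.

      # Select beam candidates with a range predicate (analogous to "px > 5").
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000
      px = rng.normal(0.0, 1.0, n)    # longitudinal momentum (arbitrary units)
      x = rng.uniform(0.0, 100.0, n)  # longitudinal position

      beam_mask = px > 5.0
      print(f"{beam_mask.sum()} of {n} particles pass the momentum cut")

      # Downstream clustering and tracing then run on the small subset only.
      beam_x, beam_px = x[beam_mask], px[beam_mask]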

  10. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.; Silva, Claudio

    2013-09-30

    For the past three years, a large analysis and visualization effort, funded by the Department of Energy's Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA), has brought together a wide variety of industry-standard scientific computing libraries and applications to create the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.
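
    A sketch of provenance capture through a single API layer, mimicking the idea rather than UV-CDAT's actual interface: every analysis call is recorded with its inputs and a timestamp so results can be traced and replayed.

      # Record each analysis call in a provenance log (illustrative only).
      import functools
      import time

      PROVENANCE_LOG = []

      def record_provenance(func):
          @functools.wraps(func)
          def wrapper(*args, **kwargs):
              PROVENANCE_LOG.append({
                  "tool": func.__name__,
                  "args": args,
                  "kwargs": kwargs,
                  "time": time.time(),
              })
              return func(*args, **kwargs)
          return wrapper

      @record_provenance
      def regrid(dataset, resolution):
          return f"{dataset}@{resolution}"  # stand-in for a real analysis step

      regrid("tas_Amon_CESM1", resolution="1x1")
      print(PROVENANCE_LOG)  # full, replayable history of analysis steps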

  11. PTAL Database and Website: Developing a Novel Information System for the Scientific Exploitation of the Planetary Terrestrial Analogues Library

    NASA Astrophysics Data System (ADS)

    Veneranda, M.; Negro, J. I.; Medina, J.; Rull, F.; Lantz, C.; Poulet, F.; Cousin, A.; Dypvik, H.; Hellevang, H.; Werner, S. C.

    2018-04-01

    The PTAL website will store multispectral analyses of samples collected from several terrestrial analogue sites and aims to become a cornerstone tool for the scientific community interested in deepening knowledge of the geological processes of Mars.

  12. Interactive 3D visualization for theoretical virtual observatories

    NASA Astrophysics Data System (ADS)

    Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.

    2018-06-01

    Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based data sets, allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.

  13. Examples of Effective Data Sharing in Scientific Publishing

    DOE PAGES

    Kitchin, John R.

    2015-05-11

    Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.

  14. Examples of Effective Data Sharing in Scientific Publishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitchin, John R.

    Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.

  15. Conceptual Level of Understanding about Sound Concept: Sample of Fifth Grade Students

    ERIC Educational Resources Information Center

    Bostan Sarioglan, Ayberk

    2016-01-01

    In this study, students' conceptual change processes related to the sound concept were examined. The study group comprised 325 fifth-grade middle school students. Three multiple-choice questions were used as the data collection tool. In the data analysis process, "scientific response", "scientifically unacceptable response"…

  16. Data, Analysis, and Visualization | Computational Science | NREL

    Science.gov Websites

    At NREL, our data management, data analysis, and scientific visualization capabilities help move research forward; they span approaches to image analysis and computer vision as well as data management and big data systems, software, and tools.

  17. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.

  18. Conceptual Tools for Understanding Nature - Proceedings of the 3rd International Symposium

    NASA Astrophysics Data System (ADS)

    Costa, G.; Calucci, M.

    1997-04-01

    The Table of Contents for the full book PDF is as follows: * Foreword * Some Limits of Science and Scientists * Three Limits of Scientific Knowledge * On Features and Meaning of Scientific Knowledge * How Science Approaches the World: Risky Truths versus Misleading Certitudes * On Discovery and Justification * Thought Experiments: A Philosophical Analysis * Causality: Epistemological Questions and Cognitive Answers * Scientific Inquiry via Rational Hypothesis Revision * Probabilistic Epistemology * The Transferable Belief Model for Uncertainty Representation * Chemistry and Complexity * The Difficult Epistemology of Medicine * Epidemiology, Causality and Medical Anthropology * Conceptual Tools for Transdisciplinary Unified Theory * Evolution and Learning in Economic Organizations * The Possible Role of Symmetry in Physics and Cosmology * Observational Cosmology and/or other Imaginable Models of the Universe

  19. Data Rights and Responsibilities

    PubMed Central

    Wyndham, Jessica M.

    2015-01-01

    A human-rights-based analysis can be a useful tool for the scientific community and policy makers as they develop codes of conduct, harmonized standards, and national policies for data sharing. The human rights framework provides a shared set of values and norms across borders, defines rights and responsibilities of various actors involved in data sharing, addresses the potential harms as well as the benefits of data sharing, and offers a framework for balancing competing values. The right to enjoy the benefits of scientific progress and its applications offers a particularly helpful lens through which to view data as both a tool of scientific inquiry to which access is vital and as a product of science from which everyone should benefit. PMID:26297755

  20. Making Scientific Data Usable and Useful

    NASA Astrophysics Data System (ADS)

    Satwicz, T.; Bharadwaj, A.; Evans, J.; Dirks, J.; Clark Cole, K.

    2017-12-01

    Transforming geological data into information that has broad scientific and societal impact is a process fraught with barriers. Data sets and tools are often reported to have poor user experiences (UX) that make scientific work more challenging than it needs to be. While many other technical fields have benefited from ongoing improvements to the UX of their tools (e.g., healthcare and financial services), scientists are faced with using tools that are labor-intensive and unintuitive. Our research team has been involved in a multi-year effort to understand and improve the UX of scientific tools and data sets. We use a User-Centered Design (UCD) process that involves naturalistic behavioral observation and other qualitative research methods adopted from Human-Computer Interaction (HCI) and related fields. Behavioral observation involves having users complete common tasks on data sets, tools, and websites to identify usability issues and understand their severity. We measure how successfully users complete tasks and diagnose the causes of any failures. Behavioral observation is paired with in-depth interviews in which users describe their process for generating results (from initial inquiry to final results). By asking detailed questions, we unpack common patterns and challenges scientists experience while working with data. We have found that tools built using the UCD process can have a large impact on scientists' workflows and greatly reduce the time it takes to process data before analysis. It is often challenging to understand the organization and nuances of data across scientific fields. By better understanding how scientists work, we can create tools that make routine tasks less labor-intensive and data easier to find, and that solve common issues with discovering new data sets and engaging in interdisciplinary research. There is a tremendous opportunity for advancing scientific knowledge and helping the public benefit from that work by creating intuitive, interactive, and powerful tools and resources for generating knowledge. The pathway to achieving that is to build a detailed understanding of users and their needs, then use this knowledge to inform the design of the data products, tools, and services scientists and non-scientists use to do their work.

  1. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
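
    The kind of tool chaining described above can be sketched without Kepler itself; the snippet below is a minimal, hypothetical pipeline in Python in which each step wraps an external command (a Bioconductor-style R script, a Python filtering tool, an R comparison script). The script names and arguments are placeholders, not the actual tools used in the published workflow.

```python
import subprocess
from pathlib import Path

# A hypothetical three-step microarray meta-analysis pipeline, chained the way
# a Kepler workflow chains its actors: normalize with a Bioconductor-style R
# script, filter probes with a Python tool, then compare groups with another R
# script. Script names and arguments are placeholders, not the real tool CLIs.

def run_step(cmd, outfile):
    """Run one external tool and return the path of the file it produced."""
    subprocess.run(cmd, check=True)   # fail fast, as a workflow engine would
    return Path(outfile)

def pipeline(raw_cel_dir):
    normalized = run_step(
        ["Rscript", "normalize_rma.R", raw_cel_dir, "normalized.txt"],
        "normalized.txt")
    filtered = run_step(
        ["python", "filter_probes.py", str(normalized), "filtered.txt"],
        "filtered.txt")
    return run_step(
        ["Rscript", "compare_groups.R", str(filtered), "results.txt"],
        "results.txt")

# pipeline("cel_files/") would run the three steps in order and return the
# path of the final results table.
```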

  2. Cognitive Affordances of the Cyberinfrastructure for Science and Math Learning

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Peters Burton, Erin E.

    2011-01-01

    The "cyberinfrastructure" is a broad informational network that entails connections to real-time data sensors as well as tools that permit visualization and other forms of analysis, and that facilitates access to vast scientific databases. This multifaceted network, already a major boon to scientific discovery, now shows exceptional promise in…

  3. Recognizing Mechanistic Reasoning in Student Scientific Inquiry: A Framework for Discourse Analysis Developed from Philosophy of Science

    ERIC Educational Resources Information Center

    Russ, Rosemary S.; Scherr, Rachel E.; Hammer, David; Mikeska, Jamie

    2008-01-01

    Science education reform has long focused on assessing student inquiry, and there has been progress in developing tools specifically with respect to experimentation and argumentation. We suggest the need for attention to another aspect of inquiry, namely "mechanistic reasoning." Scientific inquiry focuses largely on understanding causal…

  4. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  5. SERVING EPA'S MISSION: POTENTIAL ROLES OF ENERGETIC TOOLS

    EPA Science Inventory

    Effective environmental protection requires an understanding of environmental systems dynamics that includes socioeconomic activity along with its interactions with environmental processes. Some forms of scientific analysis, such as emergy analysis, do seek to account for the ...

  6. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations, thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers so that it can be transferred to novice users.
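
    The idea of programmable metric calculations can be illustrated with a small sketch: derived metrics are written once as formulas over raw trace counters and then applied uniformly to every thread. The counter names, metric definitions, and values below are illustrative assumptions, not Paramedir's actual syntax or data.

```python
# Programmable performance metrics in the spirit of the abstract: analysts
# express derived metrics as small formulas over raw trace counters instead
# of computing them by hand for each thread. All names and numbers below are
# invented for illustration.
counters = {  # per-thread raw counters from a hypothetical trace
    "thread0": {"cycles": 2.0e9, "instructions": 1.6e9, "useful_time": 0.82},
    "thread1": {"cycles": 2.0e9, "instructions": 1.1e9, "useful_time": 0.55},
}

metrics = {
    "ipc": lambda c: c["instructions"] / c["cycles"],
    "useful_fraction": lambda c: c["useful_time"],
}

for thread, c in counters.items():
    derived = {name: f(c) for name, f in metrics.items()}
    print(thread, {k: round(v, 3) for k, v in derived.items()})

# Load balance across threads: a metric an expert might encode once and then
# share with novice users.
useful = [c["useful_time"] for c in counters.values()]
print("load balance:", round(sum(useful) / (len(useful) * max(useful)), 3))
```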

  7. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG (Moving Picture Experts Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. (5) The scenes can be viewed in 3D using stereo vision. (6) The network bandwidth for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.) This talk will illustrate the use of these new technologies and present a proposal for using these technologies to improve science education.
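
    A rough calculation makes the bandwidth advantage concrete: streaming rendered pixels scales with frame size and rate, whereas streaming only viewing commands does not. The frame rate below is an assumed value for illustration; the 1 Kbit/sec figure is the one reported above.

```python
# Back-of-the-envelope check of the bandwidth claim: streaming rendered pixels
# versus streaming only viewing commands while the data sit on the local
# workstation. The frame rate is an assumption for illustration.
width, height, bits_per_pixel = 1280, 1024, 24
frames_per_second = 10                       # assumed interactive update rate

pixel_stream = width * height * bits_per_pixel * frames_per_second  # bits/s
command_stream = 1_000                       # ~1 Kbit/s, as measured above

print(f"uncompressed pixel stream: {pixel_stream / 1e6:.1f} Mbit/s")
print(f"viewing-command stream:    {command_stream / 1e3:.1f} Kbit/s")
print(f"ratio: ~{pixel_stream / command_stream:,.0f}x")
```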

  8. Tools for Scientific Thinking: Microcomputer-Based Laboratories for the Naive Science Learner.

    ERIC Educational Resources Information Center

    Thornton, Ronald K.

    A promising new development in science education is the use of microcomputer-based laboratory tools that allow for student-directed data acquisition, display, and analysis. Microcomputer-based laboratories (MBL) make use of inexpensive microcomputer-connected probes to measure such physical quantities as temperature, position, and various…

  9. Sight Application Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  10. Lakeside: Merging Urban Design with Scientific Analysis

    ScienceCinema

    Guzowski, Leah; Catlett, Charlie; Woodbury, Ed

    2018-01-16

    Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding about the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.

  11. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how a little programming ability can free scientists from the constraints of existing tools and facilitate a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.
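
    As a minimal illustration of the point about animation, the sketch below uses Python and Matplotlib (rather than the author's C, JavaScript, or Wolfram Language examples) to animate a synthetic traveling wave in a handful of lines.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# A few lines of code turn a static waveform into an animated view.
# The traveling-wave "data" here are synthetic, purely for illustration.
x = np.linspace(0, 10, 500)
fig, ax = plt.subplots()
line, = ax.plot(x, np.sin(x))
ax.set_xlabel("distance")
ax.set_ylabel("amplitude")

def update(frame):
    line.set_ydata(np.sin(x - 0.1 * frame))   # shift the wave each frame
    return line,

anim = FuncAnimation(fig, update, frames=200, interval=30, blit=True)
plt.show()
```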

  12. Chandra X-ray Center Science Data Systems Regression Testing of CIAO

    NASA Astrophysics Data System (ADS)

    Lee, N. P.; Karovska, M.; Galle, E. C.; Bonaventura, N. R.

    2011-07-01

    The Chandra Interactive Analysis of Observations (CIAO) is a software system developed for the analysis of Chandra X-ray Observatory observations. An important component of a successful CIAO release is the repeated testing of the tools across various platforms to ensure consistent and scientifically valid results. We describe the procedures of the scientific regression testing of CIAO and the enhancements made to the testing system to increase the efficiency of run time and result validation.

  13. Chandra monitoring, trends, and response

    NASA Astrophysics Data System (ADS)

    Spitzbart, Brad D.; Wolk, Scott J.; Isobe, Takashi

    2002-12-01

    The Chandra X-ray Observatory was launched in July 1999 and has yielded extraordinary scientific results. Behind the scenes, our Monitoring and Trends Analysis (MTA) system has proven to be a valuable resource. With three years' worth of on-orbit data, we have available a vast array of both telescope diagnostic information and analysis of scientific data to assess Observatory performance. As part of Chandra's Science Operations Team (SOT), the primary goal of MTA is to provide tools for effective decision making leading to the most efficient production of quality science output from the Observatory. We occupy a middle ground between flight operations, chiefly concerned with the health and safety of the spacecraft, and validation and verification, concerned with the scientific validity of the data taken and whether or not they fulfill the observer's requirements. In that role we provide and receive support from systems engineers, instrument experts, operations managers, and scientific users. MTA tools, products, and services include real-time monitoring and alert generation for the most mission-critical components, long-term trending of all spacecraft systems, detailed analysis of various subsystems for life expectancy or anomaly resolution, and creating and maintaining a large SQL database of relevant information. This is accomplished through the use of a wide variety of input data sources and flexible, accessible programming and analysis techniques. This paper will discuss the overall design of the system, its evolution and the resources available.

  14. Visualization techniques to aid in the analysis of multispectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components of the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.

  15. Realist Ontology and Natural Processes: A Semantic Tool to Analyze the Presentation of the Osmosis Concept in Science Texts

    ERIC Educational Resources Information Center

    Spinelli Barria, Michele; Morales, Cecilia; Merino, Cristian; Quiroz, Waldo

    2016-01-01

    In this work, we developed an ontological tool, based on the scientific realism of Mario Bunge, for the analysis of the presentation of natural processes in science textbooks. This tool was applied to analyze the presentation of the concept of osmosis in 16 chemistry and biology books at different educational levels. The results showed that more…

  16. Science Operations Management

    NASA Astrophysics Data System (ADS)

    Squibb, Gael F.

    1984-10-01

    The operation teams for the Infrared Astronomical Satellite (IRAS) included scientists from the IRAS International Science Team. The scientific decisions on an hour-to-hour basis, as well as the long-term strategic decisions, were made by science team members. The IRAS scientists were involved in the analysis of the instrument performance, the analysis of the quality of the data, the decision to reacquire data that was contaminated by radiation effects, the strategy for acquiring the survey data, and the process for using the telescope for additional observations, as well as the processing decisions required to ensure the publication of the final scientific products by end of flight operations plus one year. Early in the project, two science team members were selected to be responsible for the scientific operational decisions. One, located at the operations control center in England, was responsible for the scientific aspects of the satellite operations; the other, located at the scientific processing center in Pasadena, was responsible for the scientific aspects of the processing. These science team members were then responsible for approving the design and test of the tools to support their responsibilities and then, after launch, for using these tools in making their decisions. The ability of the project to generate the final science data products one year after the end of flight operations is due in a large measure to the active participation of the science team members in the operations. This paper presents a summary of the operational experiences gained from this scientific involvement.

  17. Tools and data for meeting America's conservation challenges

    USGS Publications Warehouse

    Gergely, Kevin J.; McKerrow, Alexa

    2013-01-01

    The Gap Analysis Program (GAP) produces data and tools that help meet critical national challenges such as biodiversity conservation, renewable energy development, climate change adaptation, and infrastructure investment. The GAP is managed by the U.S. Geological Survey, Department of the Interior. GAP supports a wide range of national, State, and local agencies as well as nongovernmental organizations and businesses with scientific tools and data. GAP uses a collaborative approach to do research, analysis, and data development, resulting in a history of cooperation with more than 500 agencies and organizations nationally.

  18. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined, then saved as a self-contained configuration which can be re-run without human interaction. PuffinPlot can thus be used as a component of a larger scientific workflow, integrated with workflow management tools such as Kepler, without compromising its capabilities as an exploratory tool. Since both PuffinPlot and the platform it runs on (Java) are Free/Open Source software, even the most fundamental components of an analysis can be verified and reproduced.
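
    The interactive/batch bridge can be sketched generically: parameters chosen during interactive exploration are saved to a small configuration file, and a later batch run replays the analysis from the raw data with no human interaction. The parameter names, file format, and toy "analysis" below are assumptions for illustration, not PuffinPlot's actual scripting interface.

```python
import json
from pathlib import Path
from statistics import mean

# Generic sketch of saving interactively chosen parameters and replaying them
# in batch. The measurement records and the "analysis" are placeholders.
def analyse(measurements, params):
    kept = [m for m in measurements
            if params["step_min"] <= m["step"] <= params["step_max"]]
    return mean(m["intensity"] for m in kept)

raw = [{"step": s, "intensity": 1.0 / (1 + s)} for s in range(10)]

# Interactive session: the user settles on a demagnetisation-step window...
chosen = {"step_min": 2, "step_max": 7}
Path("analysis_config.json").write_text(json.dumps(chosen))

# ...batch replay: the saved config reproduces the result with no interaction.
replayed = json.loads(Path("analysis_config.json").read_text())
print(analyse(raw, replayed))
```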

  19. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  20. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative Web Tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities in order to wrap, manage, and use specialized noncommercial software tools in Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface in order to choose applications, introduce data, and select appropriate constraints in an intuitive and easy way for the user. After that, the application is executed in real time, whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or whatever manipulation therein), are shown via the same web interface or can be downloaded to the user's computer.

  1. Computer applications in scientific balloon quality control

    NASA Astrophysics Data System (ADS)

    Seely, Loren G.; Smith, Michael S.

    Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. The availability of inexpensive and powerful data-processing tools can serve as the basis for analyses that discern quality trends in products. The results of one such analysis are presented here in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of the overall and long-term effects of these methods on product quality.

  2. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  3. Data Rights and Responsibilities: A Human Rights Perspective on Data Sharing.

    PubMed

    Harris, Theresa L; Wyndham, Jessica M

    2015-07-01

    A human-rights-based analysis can be a useful tool for the scientific community and policy makers as they develop codes of conduct, harmonized standards, and national policies for data sharing. The human rights framework provides a shared set of values and norms across borders, defines rights and responsibilities of various actors involved in data sharing, addresses the potential harms as well as the benefits of data sharing, and offers a framework for balancing competing values. The right to enjoy the benefits of scientific progress and its applications offers a particularly helpful lens through which to view data as both a tool of scientific inquiry to which access is vital and as a product of science from which everyone should benefit. © The Author(s) 2015.

  4. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided to the scientific community innovative web-based point-of-access tools, including: the Runs-On-Request System - providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models, Integrated Space Weather Analysis (iSWA) - a powerful dissemination system for space weather information, Advanced Online Visualization and Analysis tools for more accurate interpretation of model results, Standard Data formats for Simulation Data downloads, and Mobile apps to view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, and CCMC's tools that revolutionized the way we do research and improve our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  5. A Participative Tool for Sharing, Annotating and Archiving Submarine Video Data

    NASA Astrophysics Data System (ADS)

    Marcon, Y.; Kottmann, R.; Ratmeyer, V.; Boetius, A.

    2016-02-01

    Oceans cover more than 70 percent of the Earth's surface and are known to play an essential role on all of the Earth systems and cycles. However, less than 5 percent of the ocean bottom has been explored and many aspects of the deep-sea world remain poorly understood. Increasing our ocean literacy is a necessity in order for specialists and non-specialists to better grasp the roles of the ocean on the Earth's system, its resources, and the impact of human activities on it. Due to technological advances, deep-sea research produces ever-increasing amounts of scientific video data. However, using such data for science communication and public outreach purposes remains difficult as tools for accessing/sharing such scientific data are often lacking. Indeed, there is no common solution for the management and analysis of marine video data, which are often scattered across multiple research institutes or working groups and it is difficult to get an overview of the whereabouts of those data. The VIDLIB Deep-Sea Video Platform is a web-based tool for sharing/annotating time-coded deep-sea video data. VIDLIB provides a participatory way to share and analyze video data. Scientists can share expert knowledge for video analysis without the need to upload/download large video files. Also, VIDLIB offers streaming capabilities and has potential for participatory science and science communication in that non-specialists can ask questions on what they see and get answers from scientists. Such a tool is highly valuable in terms of scientific public outreach and popular science. Video data are by far the most efficient way to communicate scientific findings to a non-expert public. VIDLIB is being used for studying the impact of deep-sea mining on benthic communities as well as for exploration in polar regions. We will present the structure and workflow of VIDLIB as well as an example of video analysis. VIDLIB (http://vidlib.marum.de) is funded by the EU EUROFLEET project and the Helmholtz Alliance ROBEX.

  6. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
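
    The extraction step can be approximated in a few lines of Python: pull a spatial resolution and a user's accuracy out of unstructured text and link them. In the project described above the text would come from tools such as Apache Tika and the results would be indexed for search; here the input string is inlined and the regular expressions are illustrative.

```python
import re

# Toy version of the relationship extraction described above: find a spatial
# resolution and an accuracy figure in free text and report them together.
text = ("In this study, hyperspectral images with high spatial resolution (1 m) "
        "were analyzed to detect cutleaf teasel in two areas. Classification of "
        "cutleaf teasel reached a user's accuracy of 82 to 84%.")

resolution = re.search(r"spatial resolution \(([\d.]+\s*m)\)", text)
accuracy = re.search(r"accuracy of (\d+\s*(?:to\s*\d+)?%)", text)

if resolution and accuracy:
    print(f"resolution={resolution.group(1)!r} -> accuracy={accuracy.group(1)!r}")
```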

  7. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
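
    The flavor of the translation problem OMPC addresses can be shown by hand, without the compiler: a one-based, matrix-oriented MATLAB expression and a zero-based NumPy equivalent. This is an illustrative comparison, not OMPC's generated output.

```python
import numpy as np

# The kind of translation a MATLAB-to-Python layer must perform, written out
# by hand: MATLAB indexes from 1 and is matrix-oriented, NumPy indexes from 0.
#
# MATLAB:   A = magic(4);  col_means = mean(A(2:end, :));
A = np.array([[16,  2,  3, 13],
              [ 5, 11, 10,  8],
              [ 9,  7,  6, 12],
              [ 4, 14, 15,  1]])        # the 4x4 magic square, written out

col_means = A[1:, :].mean(axis=0)       # rows 2..end, column-wise mean
print(col_means)
```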

  8. Guiding students to develop an understanding of scientific inquiry: a science skills approach to instruction and assessment.

    PubMed

    Stone, Elisa M

    2014-01-01

    New approaches for teaching and assessing scientific inquiry and practices are essential for guiding students to make the informed decisions required of an increasingly complex and global society. The Science Skills approach described here guides students to develop an understanding of the experimental skills required to perform a scientific investigation. An individual teacher's investigation of the strategies and tools she designed to promote scientific inquiry in her classroom is outlined. This teacher-driven action research in the high school biology classroom presents a simple study design that allowed for reciprocal testing of two simultaneous treatments, one that aimed to guide students to use vocabulary to identify and describe different scientific practices they were using in their investigations (for example, hypothesizing, data analysis, or use of controls), and another that focused on scientific collaboration. A knowledge integration (KI) rubric was designed to measure how students integrated their ideas about the skills and practices necessary for scientific inquiry. KI scores revealed that student understanding of scientific inquiry increased significantly after receiving instruction and using assessment tools aimed at promoting development of specific inquiry skills. General strategies for doing classroom-based action research in a straightforward and practical way are discussed, as are implications for teaching and evaluating introductory life sciences courses at the undergraduate level.

  9. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
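
    The evidence-combination idea behind the BBN tool can be illustrated with a two-node example: a prior probability of an imminent eruption is updated by Bayes' rule when an elevated-seismicity observation arrives. All probabilities below are invented for the example and do not come from the EXPLORIS project.

```python
# Two-node illustration of Bayesian evidence combination for volcanic state:
# update P(eruption imminent) given one monitoring observation. All numbers
# are made up for the example.
p_eruption = 0.05                       # prior P(eruption imminent)
p_obs_given_eruption = 0.80             # P(elevated seismicity | eruption)
p_obs_given_quiet = 0.10                # P(elevated seismicity | no eruption)

p_obs = (p_obs_given_eruption * p_eruption
         + p_obs_given_quiet * (1 - p_eruption))
posterior = p_obs_given_eruption * p_eruption / p_obs

print(f"P(eruption | elevated seismicity) = {posterior:.2f}")   # ~0.30
```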

  10. Plan EJ 2014: Science Tools Development

    EPA Pesticide Factsheets

    Under Plan EJ 2014, EPA has committed to building a strong scientific foundation for supporting environmental justice and conducting disproportionate impact analysis, particularly methods to appropriately characterize and assess cumulative impacts.

  11. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
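
    The closeness-centrality figure quoted above can be reproduced in miniature with a toy graph; the snippet below builds a small, invented contributor network with NetworkX and reports the median normalized closeness centrality, the same statistic used as a knowledge-flow measure.

```python
import networkx as nx
from statistics import median

# Toy contributor/reviewer network; the edges are invented, not ITK's actual
# peer-review graph. Closeness centrality is normalized by default.
G = nx.Graph([("ana", "bo"), ("bo", "carol"), ("carol", "dev"),
              ("ana", "carol"), ("dev", "eli"), ("bo", "eli")])

closeness = nx.closeness_centrality(G)
print({n: round(c, 2) for n, c in closeness.items()})
print("median closeness:", round(median(closeness.values()), 2))
```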

  12. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  13. Linguistic analysis of IPCC summaries for policymakers and associated coverage

    NASA Astrophysics Data System (ADS)

    Barkemeyer, Ralf; Dessai, Suraje; Monge-Sanz, Beatriz; Renzi, Barbara Gabriella; Napolitano, Giulio

    2016-03-01

    The Intergovernmental Panel on Climate Change (IPCC) Summary for Policymakers (SPM) is the most widely read section of IPCC reports and the main springboard for the communication of its assessment reports. Previous studies have shown that communicating IPCC findings to a variety of scientific and non-scientific audiences presents significant challenges to both the IPCC and the mass media. Here, we employ widely established sentiment analysis tools and readability metrics to explore the extent to which information published by the IPCC differs from the presentation of respective findings in the popular and scientific media between 1990 and 2014. IPCC SPMs clearly stand out in terms of low readability, which has remained relatively constant despite the IPCC’s efforts to consolidate and readjust its communications policy. In contrast, scientific and quality newspaper coverage has become increasingly readable and emotive. Our findings reveal easy gains that could be achieved in making SPMs more accessible for non-scientific audiences.
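
    A readability metric of the kind used in this analysis is easy to compute directly; the sketch below implements the Flesch Reading Ease score with a deliberately naive syllable counter (vowel groups), so the numbers are only indicative. The two sample sentences are invented stand-ins for SPM-style and newspaper-style prose.

```python
import re

# Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
# Lower scores mean harder-to-read text.
def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

spm_like = ("Anthropogenic greenhouse gas emissions have increased since the "
            "pre-industrial era, driven largely by economic and population growth.")
news_like = "Humans are putting out more greenhouse gases than ever before."

print(round(flesch_reading_ease(spm_like), 1))
print(round(flesch_reading_ease(news_like), 1))
```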

  14. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    NASA Astrophysics Data System (ADS)

    Angleraud, Christophe

    2014-06-01

    The ever increasing amount of data and processing capabilities - following the well- known Moore's law - is challenging the way scientists and engineers are currently exploiting large datasets. The scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross disciplines fertilization. On the other end, Geographic information Systems allow nice and visually appealing maps to be built but they often get very confused as more layers are added. Moreover, the introduction of time as a fourth analysis dimension to allow analysis of time dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques that allow spatial-temporal points of interests to be detected by integration of moving images by the human brain. Magellium is involved in high performance image processing chains for satellite image processing as well as scientific signal analysis and geographic information management since its creation (2003). We believe that recent work on big data, GPU and peer-to-peer collaborative processing can open a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that will bring highly interactive tools for complex datasets analysis and exploration commodity hardware, targeting small to medium scale clusters with expansion capabilities to large cloud based clusters.

  15. Intrageneric Primer Design: Bringing Bioinformatics Tools to the Class

    ERIC Educational Resources Information Center

    Lima, Andre O. S.; Garces, Sergio P. S.

    2006-01-01

    Bioinformatics is one of the fastest growing scientific areas over the last decade. It focuses on the use of informatics tools for the organization and analysis of biological data. An example of their importance is the availability nowadays of dozens of software programs for genomic and proteomic studies. Thus, there is a growing field (private…

  16. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.
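
    The overlay operation central to such spatial analysis can be illustrated with a minimal example: intersect a toy land-cover polygon with a study-area polygon and report the shared area. The geometries below use arbitrary planar coordinates, not actual Texas data.

```python
from shapely.geometry import Polygon

# Minimal overlay: how much of the study area is covered by the land-cover
# polygon? Coordinates are arbitrary planar units for illustration only.
land_cover = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])
study_area = Polygon([(2, 1), (6, 1), (6, 5), (2, 5)])

overlap = land_cover.intersection(study_area)
print("overlap area:", overlap.area)                      # 2 x 2 = 4.0
print("fraction of study area:", overlap.area / study_area.area)
```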

  17. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
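
    The reduced-representation idea can be sketched in a few lines: summarize each time series by a handful of interpretable features and use distances in feature space to retrieve similar series. The three synthetic series and the particular feature set below are illustrative, not the thousands of methods organized in the paper.

```python
import numpy as np

# Represent each series by a small feature vector and retrieve the most
# similar series by Euclidean distance in feature space.
rng = np.random.default_rng(0)
t = np.arange(500)
series = {
    "noisy_sine": np.sin(0.1 * t) + 0.3 * rng.standard_normal(t.size),
    "random_walk": np.cumsum(rng.standard_normal(t.size)),
    "white_noise": rng.standard_normal(t.size),
}

def features(x):
    x = (x - x.mean()) / x.std()                      # z-score first
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]           # lag-1 autocorrelation
    return np.array([lag1, np.abs(np.diff(x)).mean(), x.max() - x.min()])

vecs = {name: features(x) for name, x in series.items()}
query = "noisy_sine"
dists = {name: np.linalg.norm(vecs[query] - v)
         for name, v in vecs.items() if name != query}
print("closest to", query, "->", min(dists, key=dists.get))
```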

  18. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  19. A Simple Framework for Evaluating Authorial Contributions for Scientific Publications.

    PubMed

    Warrender, Jeffrey M

    2016-10-01

    A simple tool is provided to assist researchers in assessing contributions to a scientific publication, for ease in evaluating which contributors qualify for authorship, and in what order the authors should be listed. The tool identifies four phases of activity leading to a publication: Conception and Design, Data Acquisition, Analysis and Interpretation, and Manuscript Preparation. By comparing a project participant's contribution in a given phase to several specified thresholds, a score of up to five points can be assigned; the contributor's scores in all four phases are summed to yield a total "contribution score", which is compared to a threshold to determine which contributors merit authorship. This tool may be useful in a variety of contexts in which a systematic approach to authorial credit is desired.
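
    The scoring scheme lends itself to a direct transcription; the sketch below implements the four phases, a 0-5 score per phase, the summed contribution score, and an authorship threshold. The per-phase scores and the threshold value are assumed examples, not figures taken from the paper.

```python
# Four phases, up to five points per phase, summed into a contribution score
# and compared against an authorship threshold, as described in the abstract.
PHASES = ("Conception and Design", "Data Acquisition",
          "Analysis and Interpretation", "Manuscript Preparation")
AUTHORSHIP_THRESHOLD = 6        # assumed value for illustration

def contribution_score(phase_scores):
    assert set(phase_scores) == set(PHASES)
    assert all(0 <= s <= 5 for s in phase_scores.values())
    return sum(phase_scores.values())

contributors = {
    "lead":    dict(zip(PHASES, (5, 4, 5, 5))),
    "analyst": dict(zip(PHASES, (1, 0, 4, 2))),
    "advisor": dict(zip(PHASES, (2, 0, 1, 0))),
}

scores = {name: contribution_score(p) for name, p in contributors.items()}
authors = sorted((n for n, s in scores.items() if s >= AUTHORSHIP_THRESHOLD),
                 key=lambda n: -scores[n])
print(scores, "authors in order:", authors)
```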

  20. Tools and Techniques to Teach Earth Sciences to Young People

    NASA Astrophysics Data System (ADS)

    Constantino, R.; Dicelis, G.; Molina, E. C.

    2010-12-01

    This study aims to identify the tools available for disseminating the Earth sciences to young people in Brazil and to propose new techniques that may help in the teaching of such subjects. Scientific dissemination can be a great tool for the consolidation of a scientific culture, especially for a public of young students. The starting point of this study is an important characteristic that is present in virtually all children: curiosity. The young public tries to understand what the world is like and how it works. The use of scientific dissemination and some educational experiences have shown that these students have a great ability to learn and to deal with various topics within the Earth sciences. Another relevant point is the possibility to show that the Earth sciences, e.g., geophysics, oceanography, meteorology, geology and geography, can be an attractive educational option. Several ways of disseminating the Earth sciences are commonly used with the purpose of attracting young people to, and teaching, these subjects, such as websites, interactive museums and cultural and educational spaces. The objectives of this work are: i) to investigate the role of science centers as motivators in disseminating scientific knowledge by examining the communication resources that are being employed and the acceptance, reaction, and interest of children toward these means, and ii) from this analysis, to list suggestions of contents and new tools that could be used for obtaining better results.

  1. A TT&C Performance Simulator for Space Exploration and Scientific Satellites - Architecture and Applications

    NASA Astrophysics Data System (ADS)

    Donà, G.; Faletra, M.

    2015-09-01

    This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed with a model-based approach using standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenario parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator's capabilities.

  2. Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.

    PubMed

    Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A

    2016-01-01

    Imbalances in female career promotion are believed to be strong in academic science. A primary parameter for analyzing gender inequalities is the gender distribution of authorship in scientific publications. Since the presently available data on gender distribution are largely limited to underpowered studies, we developed a new approach for analyzing authors' genders in large bibliometric databases. A SQL-Server-based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analyses is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis, including its visualization, within a short period of time. In summary, a new software suite for analyzing gender representation in scientific articles was established. The system is suitable for the comparative analysis of scientific structures at the level of continents, countries, cities, city regions, institutions, research fields and journals.
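    The core tallying step of such an analysis can be sketched in a few lines. The Python example below is not the Gendermetrics.NET implementation; the tiny first-name lookup table, the toy publication list, and the country field are invented stand-ins for the name databases and bibliometric records a real system would query.

    ```python
    # Minimal sketch of gender-authorship tallying, not the Gendermetrics.NET
    # implementation. The small first-name lookup table stands in for the large
    # name databases such a system would use.

    NAME_GENDER = {"maria": "f", "anna": "f", "david": "m", "michael": "m"}  # stand-in

    publications = [
        {"authors": ["Maria Schmidt", "David Lee"], "country": "DE"},
        {"authors": ["Michael Brown", "Anna Rossi"], "country": "IT"},
    ]

    def gender_of(author_name):
        first = author_name.split()[0].lower()
        return NAME_GENDER.get(first, "unknown")

    # Tally first-author gender per country.
    counts = {}
    for pub in publications:
        key = (pub["country"], gender_of(pub["authors"][0]))
        counts[key] = counts.get(key, 0) + 1

    for (country, gender), n in sorted(counts.items()):
        print(country, gender, n)
    ```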

  3. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, S. George

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the system. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications, and has produced real, published results.
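    The decision-tree classification step that SKICAT automates can be illustrated with a short sketch. The example below uses scikit-learn's CART implementation and synthetic star/galaxy features; it is only an illustration of the idea, not the GID3* software or the actual survey attributes.

    ```python
    # Illustration of decision-tree star/galaxy classification in the spirit of
    # SKICAT. GID3* itself is not used here; scikit-learn's CART implementation
    # and the synthetic catalog attributes below are stand-ins.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 1000
    # Synthetic per-object attributes: magnitude, ellipticity, size (FWHM-like).
    stars = np.column_stack([rng.normal(18, 2, n), rng.uniform(0, 0.1, n), rng.normal(1.2, 0.1, n)])
    galaxies = np.column_stack([rng.normal(20, 2, n), rng.uniform(0, 0.6, n), rng.normal(2.5, 0.6, n)])
    X = np.vstack([stars, galaxies])
    y = np.array([0] * n + [1] * n)  # 0 = star, 1 = galaxy

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
    ```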

  4. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches that address scientific communities' needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with databases designed to support new forms of unstructured or semi-structured data as opposed to traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and the frame of reference for a systematic approach. The talk will provide insights into big data analytics methods in the context of science within various communities, and offer different views of how approaches based on correlation and causality provide complementary methods for advancing science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  5. Analysis of model output and science data in the Virtual Model Repository (VMR).

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2014-12-01

    Big scientific data include not only large repositories of data from scientific platforms such as satellites and ground observation, but also the vast output of numerical models. The Virtual Model Repository (VMR) provides scientific analysis and visualization tools for many numerical models of the Earth-Sun system. Individual runs can be analyzed in the VMR and compared to relevant data through associated metadata, and larger collections of runs can now also be studied, with statistics generated on the accuracy and tendencies of model output. The vast model repository at the CCMC, with over 1000 simulations of the Earth's magnetosphere, was used to look at overall trends in accuracy when compared to satellites such as GOES, Geotail, and Cluster. The methodology for this analysis as well as case studies will be presented.
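    The kind of model-versus-satellite skill statistics described above reduces, at its core, to comparing two time series. The sketch below computes root-mean-square error and correlation for synthetic "runs" against a synthetic observed signal; it is illustrative only and does not use actual VMR, CCMC, or GOES data.

    ```python
    # Sketch of model-vs-observation skill statistics applied to synthetic
    # time series (not actual VMR/CCMC runs or GOES measurements).
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(0, 24, 0.1)                                    # hours
    observed = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)

    def skill(modeled, observed):
        rmse = np.sqrt(np.mean((modeled - observed) ** 2))
        corr = np.corrcoef(modeled, observed)[0, 1]
        return rmse, corr

    # Pretend repository of runs: each run is a slightly biased, noisier model output.
    for run_id in range(3):
        modeled = observed + rng.normal(run_id * 2.0, 5.0, t.size)
        rmse, corr = skill(modeled, observed)
        print(f"run {run_id}: RMSE={rmse:.2f}, r={corr:.3f}")
    ```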

  6. Preparing WIND for the STEREO Mission

    NASA Astrophysics Data System (ADS)

    Schroeder, P.; Ogilve, K.; Szabo, A.; Lin, R.; Luhmann, J.

    2006-05-01

    The upcoming STEREO mission's IMPACT and PLASTIC investigations will provide the first opportunity for long duration, detailed observations of 1 AU magnetic field structures, plasma ions and electrons, suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment will make it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICME) and solar wind structures to CMEs and coronal holes observed at the Sun. To fully exploit these unique data sets, tight integration with similarly equipped missions at L1 will be essential, particularly WIND and ACE. The STEREO mission is building novel data analysis tools to take advantage of the mission's scientific potential. These tools will require reliable access and a well-documented interface to the L1 data sets. Such an interface already exists for ACE through the ACE Science Center. We plan to provide a similar service for the WIND mission that will supplement existing CDAWeb services. Building on tools also being developed for STEREO, we will create a SOAP application program interface (API) which will allow both our STEREO/WIND/ACE interactive browser and third-party software to access WIND data as a seamless and integral part of the STEREO mission. The API will also allow for more advanced forms of data mining than currently available through other data web services. Access will be provided to WIND-specific data analysis software as well. The development of cross-spacecraft data analysis tools will allow a larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.

  7. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity for the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties. We tested the performance of the platform on a taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
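    The taxi-trajectory style of aggregation mentioned above can be sketched with plain PySpark. The example below bins GPS points into grid cells and counts points per cell; it does not use GISpark's own APIs, and the input file name and column names are assumed for illustration.

    ```python
    # Plain PySpark sketch of grid-cell aggregation over trajectory points; it
    # does not use GISpark's APIs. The CSV path and schema are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("trajectory-binning").getOrCreate()

    # Assumed CSV columns: taxi_id, timestamp, lon, lat
    points = spark.read.csv("taxi_points.csv", header=True, inferSchema=True)

    # Bin GPS points into ~0.01 degree grid cells and count points per cell.
    binned = (points
              .withColumn("cell_x", F.floor(F.col("lon") / 0.01))
              .withColumn("cell_y", F.floor(F.col("lat") / 0.01))
              .groupBy("cell_x", "cell_y")
              .count()
              .orderBy(F.desc("count")))

    binned.show(10)
    spark.stop()
    ```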

  8. Co-authorship Network Analysis: A Powerful Tool for Strategic Planning of Research, Development and Capacity Building Programs on Neglected Diseases

    PubMed Central

    Morel, Carlos Medicis; Serruya, Suzanne Jacob; Penna, Gerson Oliveira; Guimarães, Reinaldo

    2009-01-01

    Background: New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases, with a strong focus on the North, Northeast and Center-West regions of the country, where these diseases are prevalent. Methodology/Principal Findings: Based on demographic, epidemiological and burden-of-disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize author and institution names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program and enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points, and readily detect authors or institutions participating in large international scientific collaboration networks. Conclusions/Significance: Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and the impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded the funding agencies a more proactive role in relation to public health and equity goals and to scientific capacity-building objectives, and enabled a more consistent engagement of institutions and authors from endemic regions, based on innovative criteria and parameters anchored in objective scientific data. PMID:19688044
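    A toy version of the co-authorship analysis described above can be sketched with networkx. The publication list and author names below are invented for illustration; the real analysis used the Program's bibliometric data and dedicated social network analysis packages.

    ```python
    # Minimal co-authorship network sketch: build a weighted graph from
    # publications, then find central hubs and cut-points as described above.
    # The toy publication list is invented for illustration.
    import networkx as nx
    from itertools import combinations

    publications = [
        ["Silva", "Souza", "Oliveira"],
        ["Silva", "Costa"],
        ["Costa", "Pereira"],
        ["Oliveira", "Souza"],
    ]

    G = nx.Graph()
    for authors in publications:
        for a, b in combinations(authors, 2):
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1
            else:
                G.add_edge(a, b, weight=1)

    # Central hubs and critical cut-points, as used for strategic planning.
    centrality = nx.betweenness_centrality(G)
    cut_points = list(nx.articulation_points(G))
    print("most central author:", max(centrality, key=centrality.get))
    print("cut-point authors:", cut_points)
    ```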

  9. Climate tools in mainstream Linux distributions

    NASA Astrophysics Data System (ADS)

    McKinstry, Alastair

    2015-04-01

    Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for getting scientific software adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that are normally ignorable. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g. Python modules) to be integrated requires planning by their authors: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.

  10. Multivariate Statistical Analysis Software Technologies for Astrophysical Research Involving Large Data Bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, S. G.

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has produced real, published results.

  11. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  12. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
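    To make the MATLAB-to-Python transition concrete, the fragment below shows a small MATLAB expression and a hand-written numpy equivalent. It is only an illustration of the kind of translation a tool like OMPC automates; it is not OMPC's generated code or API.

    ```python
    # Hand-written illustration of the MATLAB-to-Python mapping that a tool
    # like OMPC automates; this is not OMPC output.
    #
    # MATLAB original:
    #   x = linspace(0, 2*pi, 100);
    #   y = sin(x) .* exp(-x/5);
    #   m = mean(y);
    import numpy as np

    x = np.linspace(0, 2 * np.pi, 100)   # linspace maps directly
    y = np.sin(x) * np.exp(-x / 5)       # elementwise .* becomes *
    m = y.mean()                         # mean(y) becomes y.mean()
    print(m)
    ```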

  13. Guiding Students to Develop an Understanding of Scientific Inquiry: A Science Skills Approach to Instruction and Assessment

    PubMed Central

    Stone, Elisa M.

    2014-01-01

    New approaches for teaching and assessing scientific inquiry and practices are essential for guiding students to make the informed decisions required of an increasingly complex and global society. The Science Skills approach described here guides students to develop an understanding of the experimental skills required to perform a scientific investigation. An individual teacher's investigation of the strategies and tools she designed to promote scientific inquiry in her classroom is outlined. This teacher-driven action research in the high school biology classroom presents a simple study design that allowed for reciprocal testing of two simultaneous treatments, one that aimed to guide students to use vocabulary to identify and describe different scientific practices they were using in their investigations—for example, hypothesizing, data analysis, or use of controls—and another that focused on scientific collaboration. A knowledge integration (KI) rubric was designed to measure how students integrated their ideas about the skills and practices necessary for scientific inquiry. KI scores revealed that student understanding of scientific inquiry increased significantly after receiving instruction and using assessment tools aimed at promoting development of specific inquiry skills. General strategies for doing classroom-based action research in a straightforward and practical way are discussed, as are implications for teaching and evaluating introductory life sciences courses at the undergraduate level. PMID:24591508

  14. AMD NOX REDUCTION IMPACTS

    EPA Science Inventory

    This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...

  15. An Interactive Virtual 3D Tool for Scientific Exploration of Planetary Surfaces

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Hesina, Gerd; Gupta, Sanjeev; Paar, Gerhard

    2014-05-01

    In this paper we present an interactive 3D visualization tool for scientific analysis and planning of planetary missions. At the moment, scientists have to look at individual camera images separately; there is no tool to combine them in three dimensions and examine them seamlessly as a geologist would do (by walking backwards and forwards, viewing features at different scales). For this reason a virtual 3D reconstruction of the terrain that can be interactively explored is necessary. Such a reconstruction has to consider multiple scales, ranging from orbital image data to close-up surface image data from rover cameras. The 3D viewer allows seamless zooming between these various scales, giving scientists the possibility to relate small surface features (e.g. rock outcrops) to larger geological contexts. For a reliable geologic assessment, realistic surface rendering is important. Therefore the material properties of the rock surfaces are considered for real-time rendering. This is achieved by an appropriate Bidirectional Reflectance Distribution Function (BRDF) estimated from the image data. The BRDF is implemented to run on the Graphics Processing Unit (GPU) to enable realistic real-time rendering, which allows a naturalistic perception for scientific analysis. Another important aspect for realism is the consideration of natural lighting conditions, which means skylight to illuminate the reconstructed scene. In our case we provide skylights from Mars and Earth, allowing switching between these two modes of illumination. This gives geologists the opportunity to perceive rock outcrops from Mars as they would appear on Earth, facilitating scientific assessment. Besides viewing the virtual reconstruction at multiple scales, scientists can also perform various measurements, e.g. the geo-coordinates of a selected point or the distance between two surface points. Rover or other models can be placed into the scene and snapped onto certain locations of the terrain. These are important features to support the planning of rover paths. In addition, annotations can be placed directly into the 3D scene; these also serve as landmarks to aid navigation. The presented visualization and planning tool is a valuable asset for scientific analysis of planetary mission data. It complements traditional methods by giving access to an interactive, realistically rendered virtual 3D reconstruction. Representative examples and further information about the interactive 3D visualization tool can be found on the FP7-SPACE Project PRoViDE web page http://www.provide-space.eu/interactive-virtual-3d-tool/. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 312377 'PRoViDE'.
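    For readers unfamiliar with reflectance models, the short numpy sketch below evaluates a simple Lambertian-plus-Phong reflectance for one surface point. It is only an illustration of the kind of function a BRDF-based renderer evaluates per pixel; the actual PRoViDE viewer estimates its BRDF parameters from image data and runs them in GPU shaders, and the parameter values here are arbitrary.

    ```python
    # CPU evaluation of a simple Lambertian-plus-Phong reflectance model, the
    # kind of per-pixel calculation a BRDF-based renderer performs on the GPU.
    # Albedo, specular weight and shininess below are arbitrary illustrations.
    import numpy as np

    def shade(normal, light_dir, view_dir, albedo=0.45, k_spec=0.1, shininess=20):
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        v = view_dir / np.linalg.norm(view_dir)
        diffuse = albedo * max(np.dot(n, l), 0.0)
        r = 2 * np.dot(n, l) * n - l                 # mirror reflection of the light
        specular = k_spec * max(np.dot(r, v), 0.0) ** shininess
        return diffuse + specular

    print(shade(np.array([0.0, 0.0, 1.0]),
                np.array([0.3, 0.2, 1.0]),
                np.array([0.0, 0.0, 1.0])))
    ```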

  16. Grid Stability Awareness System (GSAS) Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuerborn, Scott; Ma, Jian; Black, Clifton

    The project team developed a software suite named Grid Stability Awareness System (GSAS) for near real-time power system stability monitoring and analysis based on synchrophasor measurements. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with the real-time or near real-time stability status of a power grid as well as historical information about system stability. These tools are being considered for real-time use in the operations environment.
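    As an illustration of the oscillation-monitoring idea, the sketch below searches synthetic synchrophasor (PMU) data for a dominant low-frequency oscillation using an FFT. It is not GSAS code; the sampling rate, frequency band, and signal are assumptions chosen only to make the example self-contained.

    ```python
    # Sketch of oscillation detection on synthetic synchrophasor data; GSAS's
    # own algorithms are not reproduced here.
    import numpy as np

    fs = 30.0                                  # assumed PMU reporting rate, Hz
    t = np.arange(0, 60, 1 / fs)               # one minute of data
    rng = np.random.default_rng(2)
    # Synthetic angle signal with a poorly damped 0.7 Hz inter-area oscillation.
    angle = 0.5 * np.sin(2 * np.pi * 0.7 * t) * np.exp(0.01 * t) + rng.normal(0, 0.05, t.size)

    spectrum = np.abs(np.fft.rfft(angle - angle.mean()))
    freqs = np.fft.rfftfreq(angle.size, d=1 / fs)
    band = (freqs > 0.1) & (freqs < 2.0)       # inter-area oscillation band
    peak = freqs[band][np.argmax(spectrum[band])]
    print(f"dominant oscillation near {peak:.2f} Hz")
    ```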

  17. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    NASA Astrophysics Data System (ADS)

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-06-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is dissatisfying from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework developed for understanding the use of tools in human activity, namely cultural-historical activity theory (CHAT). Accordingly, IT-based research tools constitute central moments of scientific research activity and can be seen neither apart from the objectives of that activity nor apart from the culturally and historically determined forms of activity (praxis) in which human subjects participate. Based on empirical data involving students participating in research activity, we point out how an appropriate account of IT-based research tools involves subjects' use of tools with respect to the objectives of the research activity and their contribution to the praxis of research. We propose to reconceptualize the role of IT-based research tools as contributing to scientific literacy when students apply these tools with respect to the objectives of the research activity and contribute to the praxis of research by evaluating and modifying the application of these tools. We conclude this paper by sketching the educational implications of this reconceptualized role of IT-based research tools.

  18. Investigating Image Formation with a Camera Obscura: a Study in Initial Primary Science Teacher Education

    NASA Astrophysics Data System (ADS)

    Muñoz-Franco, Granada; Criado, Ana María; García-Carmona, Antonio

    2018-04-01

    This article presents the results of a qualitative study aimed at determining the effectiveness of the camera obscura as a didactic tool to understand image formation (i.e., how it is possible to see objects and how their image is formed on the retina, and what the image formed on the retina is like compared to the object observed) in a context of scientific inquiry. The study involved 104 prospective primary teachers (PPTs) who were being trained in science teaching. To assess the effectiveness of this tool, an open questionnaire was applied before (pre-test) and after (post-test) the educational intervention. The data were analyzed by combining methods of inter- and intra-rater analysis. The results showed that more than half of the PPTs advanced in their ideas towards the desirable level of knowledge in relation to the phenomena studied. The conclusion reached is that the camera obscura, used in a context of scientific inquiry, is a useful tool for PPTs to improve their knowledge about image formation and experience in the first person an authentic scientific inquiry during their teacher training.

  19. Bringing "Scientific Expeditions" Into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as simulations or measurements of fluid dynamics). The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics (CFD) and wind tunnel testing. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video formats (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept behind this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video formats are: 1. The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video formats transmitted over the network. 2. The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). 3. A rich variety of guided expeditions through the data can be included easily. 4. A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. 5. The scenes can be viewed in 3D using stereo vision. 6. The network bandwidth used for the visualization with this new technology is much smaller than when using video formats. (The measured peak bandwidth used was 1 Kbit/sec, whereas the measured bandwidth for a small video picture was 500 Kbits/sec.)

  20. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
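    One of the proposed components is a library for coordinate transformation and regridding of data sets. The sketch below shows the basic regridding step with scipy: interpolating a field from a coarse latitude-longitude grid onto a finer one. The grids and the field are synthetic, and the proposal itself does not prescribe this particular library or method.

    ```python
    # Sketch of the regridding step: interpolate a field from a coarse lat/lon
    # grid onto a finer one. Grids and the field are synthetic illustrations.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    lat_src = np.linspace(-90, 90, 19)           # 10-degree source grid
    lon_src = np.linspace(0, 360, 37)
    field = np.cos(np.radians(lat_src))[:, None] * np.ones_like(lon_src)

    interp = RegularGridInterpolator((lat_src, lon_src), field)

    lat_dst = np.linspace(-89, 89, 90)           # finer target grid
    lon_dst = np.linspace(0, 359, 180)
    pts = np.array(np.meshgrid(lat_dst, lon_dst, indexing="ij")).reshape(2, -1).T
    regridded = interp(pts).reshape(lat_dst.size, lon_dst.size)
    print(regridded.shape)
    ```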

  1. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  2. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  3. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  4. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
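    To make concrete the kind of repetitive per-frame work a library like LOOS abstracts away, the plain-numpy sketch below computes an RMSD time series over a synthetic trajectory. It deliberately does not use LOOS's classes or its atom selection language, and it skips the structural superposition a real analysis would normally include.

    ```python
    # Plain-numpy sketch of a typical trajectory analysis task (RMSD to the
    # first frame). It illustrates the per-frame loop a library like LOOS
    # abstracts; it does not use LOOS's actual classes, and it omits the
    # superposition (alignment) step a real analysis would perform.
    import numpy as np

    rng = np.random.default_rng(3)
    n_frames, n_atoms = 50, 200
    reference = rng.normal(size=(n_atoms, 3))
    trajectory = reference + rng.normal(scale=0.1, size=(n_frames, n_atoms, 3))

    def rmsd(frame, ref):
        return np.sqrt(np.mean(np.sum((frame - ref) ** 2, axis=1)))

    series = np.array([rmsd(frame, reference) for frame in trajectory])
    print("mean RMSD:", series.mean())
    ```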

  5. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  6. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  7. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  8. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem-solving, teamwork and communication skills. In astronomy, several image processing and analysis software tools have been developed for use in school environments. However, practical implementation in the classroom is often difficult because teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies
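    Several of the measurements above rest on the same small-angle relation between angular size, true size, and distance. The worked example below, with rounded textbook values for the Moon and Sun, shows the arithmetic students perform; the numbers are approximate and for illustration only.

    ```python
    # Worked small-angle example: distance = true diameter / angular diameter
    # (in radians). Input values are rounded textbook figures.
    import math

    def distance_from_angular_size(true_diameter_km, angular_diameter_deg):
        """Small-angle approximation: distance = diameter / angle (radians)."""
        return true_diameter_km / math.radians(angular_diameter_deg)

    # Moon: ~3474 km across, ~0.52 degrees on the sky.
    print(f"Moon distance ~ {distance_from_angular_size(3474, 0.52):,.0f} km")

    # Sun: ~1.392e6 km across, ~0.53 degrees on the sky.
    print(f"Sun distance  ~ {distance_from_angular_size(1.392e6, 0.53):,.0f} km")
    ```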

  9. Using Remote Access to Scientific Instrumentation to Create Authentic Learning Activities in Pharmaceutical Analysis

    PubMed Central

    Albon, Simon P.; Cancilla, Devon A.; Hubball, Harry

    2006-01-01

    Objectives: To pilot test and evaluate a gas chromatography-mass spectrometry (GCMS) case study as a teaching and learning tool. Design: A case study incorporating remote access to a GCMS instrument through the Integrated Laboratory Network (ILN) at Western Washington University was developed and implemented. Student surveys, faculty interviews, and examination score data were used to evaluate learning. Assessment: While the case study did not impact final examination scores, approximately 70% of students and all faculty members felt the ILN-supported case study improved student learning about GCMS. Faculty members felt the “live” instrument access facilitated more authentic teaching. Students and faculty members felt the ILN should continue to be developed as a teaching tool. Conclusion: Remote access to scientific instrumentation can be used to modify case studies to enhance student learning and teaching practice in pharmaceutical analysis. PMID:17149450

  10. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
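    Since the MetaboTools build on constraint-based (flux balance) modeling, the short sketch below solves a toy flux balance problem to show the core calculation. It uses scipy's linear programming rather than the MATLAB/COBRA code the toolbox actually relies on, and the three-reaction network is invented purely for illustration.

    ```python
    # Minimal flux balance calculation illustrating the constraint-based core
    # of toolboxes like MetaboTools; the toy network is invented and scipy is
    # used in place of the MATLAB/COBRA implementation.
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: uptake -> A -> B -> biomass, with uptake bounded at 10 units.
    # Columns: v_uptake, v_conversion, v_biomass ; rows: metabolites A, B.
    S = np.array([
        [1, -1,  0],   # A: produced by uptake, consumed by conversion
        [0,  1, -1],   # B: produced by conversion, consumed by biomass reaction
    ])
    bounds = [(0, 10), (0, None), (0, None)]
    c = [0, 0, -1]                  # maximize biomass flux (minimize its negative)

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal biomass flux:", res.x[2])
    ```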

  11. Motion based parsing for video from observational psychology

    NASA Astrophysics Data System (ADS)

    Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray

    2006-01-01

    In Psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content based video analysis that allow automated parsing of video from one such study involving Dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.
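    The "implicit measures of human motion" mentioned above can be approximated, in the simplest case, by frame differencing. The numpy sketch below computes a per-frame motion-energy signal from synthetic frames and thresholds it to flag high-activity segments; a real tool would read actual video frames (for example with OpenCV) and use more robust motion estimation.

    ```python
    # Frame-differencing sketch of a motion-energy signal used to parse a
    # recording into active and idle segments. Frames are synthetic arrays.
    import numpy as np

    rng = np.random.default_rng(4)
    frames = np.tile(np.linspace(0, 255, 160), (300, 120, 1))   # static scene
    frames += rng.normal(0, 1.0, frames.shape)                  # sensor noise
    frames[100:150] += 30 * rng.random((50, 120, 160))          # burst of "motion"

    motion_energy = np.array([
        np.mean(np.abs(frames[i] - frames[i - 1])) for i in range(1, len(frames))
    ])
    threshold = motion_energy.mean() + 2 * motion_energy.std()
    active = np.flatnonzero(motion_energy > threshold)
    print("frames flagged as high motion:", active[:10], "...")
    ```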

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  13. Community Coordinated Modeling Center (CCMC): Using innovative tools and services to support worldwide space weather scientific communities and networks

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M.; Bakshi, S.; Berrios, D.; Chulaki, A.; Evans, R. M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Maddox, M. M.; Mays, M. L.; Mullinix, R. E.; Ngwira, C. M.; Patel, K.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.

    2012-12-01

    Community Coordinated Modeling Center (CCMC) was established to enhance basic solar-terrestrial research and to aid in the development of models for specifying and forecasting conditions in the space environment. In achieving this goal, CCMC has developed and provides to the scientific community a set of innovative tools, including: the Integrated Space Weather Analysis (iSWA) web-based dissemination system for space weather information; the Runs-On-Request System, providing access to a unique collection of state-of-the-art solar and space physics models (unmatched anywhere in the world); advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and, recently, mobile apps (iPhone/Android) for viewing space weather data anywhere. The number of runs requested and the number of resulting scientific publications and presentations from the research community have not only been an indication of the broad scientific usage of the CCMC and effective participation by space scientists and researchers, but also guarantee active collaboration and coordination amongst the space weather research community. Arising from the course of CCMC activities, CCMC also supports community-wide model validation challenges and research focus group projects for a broad range of programs such as the multi-agency National Space Weather Program, NSF's CEDAR (Coupling, Energetics and Dynamics of Atmospheric Regions), GEM (Geospace Environment Modeling) and SHINE (Solar Heliospheric and INterplanetary Environment) programs. In addition to performing research and model development, CCMC also supports space science education by hosting summer students through local universities; through the provision of simulations in support of classroom programs such as the Heliophysics Summer School (with student research contest) and CCMC Workshops; by training the next generation of junior scientists in space weather forecasting; and by educating the general public about the importance and impacts of space weather effects. Although CCMC is organizationally comprised of United States federal agencies, CCMC services are open to members of the international science community, and interagency and international collaboration is encouraged. In this poster, we provide an overview of using Community Coordinated Modeling Center (CCMC) tools and services to support worldwide space weather scientific communities and networks.

  14. NIH Image to ImageJ: 25 years of Image Analysis

    PubMed Central

    Schneider, Caroline A.; Rasband, Wayne S.; Eliceiri, Kevin W.

    2017-01-01

    For the past twenty-five years, the NIH family of imaging software, NIH Image and ImageJ, has pioneered open tools for scientific image analysis. We discuss the origins, challenges and solutions of these two programs, and how their history can serve to advise and inform other software projects. PMID:22930834

  15. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278
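    One common pre-processing step that such pipelines apply is quantile normalization, which forces every array to share the same intensity distribution. The numpy sketch below illustrates the idea on synthetic probe intensities; it is not the R/Bioconductor code that ArrayAnalysis.org actually wraps.

    ```python
    # Quantile-normalization sketch on synthetic microarray intensities; this
    # illustrates one pre-processing step, not the portal's R/Bioconductor code.
    import numpy as np

    rng = np.random.default_rng(5)
    # Rows = probes, columns = arrays; synthetic intensities with array-level bias.
    intensities = rng.lognormal(mean=6, sigma=1, size=(1000, 4)) * [1.0, 1.3, 0.8, 1.1]

    ranks = intensities.argsort(axis=0).argsort(axis=0)          # rank within each array
    mean_of_sorted = np.sort(intensities, axis=0).mean(axis=1)   # reference distribution
    normalized = mean_of_sorted[ranks]                           # map ranks back to values

    print("column means before:", intensities.mean(axis=0).round(1))
    print("column means after: ", normalized.mean(axis=0).round(1))
    ```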

  16. Environmental risk, precaution, and scientific rationality in the context of WTO/NAFTA trade rules.

    PubMed

    Crawford-Brown, Douglas; Pauwelyn, Joost; Smith, Kelly

    2004-04-01

    This article considers the role of scientific rationality in understanding statements of risk produced by a scientific community. An argument is advanced that, while scientific rationality does impose constraints on valid scientific justifications for restrictions on products and practices, it also provides flexibility in the judgments needed to both develop and apply characterizations of risk. The implications of this flexibility for the understanding of risk estimates in WTO and NAFTA deliberations are explored, with the goal of finding an intermediate ground between the view that science unambiguously justifies or rejects a policy, and the view that science is yet another cultural tool that can be manipulated in support of any decision. The result is a proposal for a dialogical view of scientific rationality in which risk estimates are depicted as confidence distributions that follow from a structured dialogue of scientific panels focused on judgments of evidence, evidential reasoning, and epistemic analysis.

  17. Microtube strip heat exchanger

    NASA Astrophysics Data System (ADS)

    Doty, F. D.

    1991-07-01

    During the last quarter, Doty Scientific, Inc. (DSI) continued to make progress on the microtube strip (MTS) heat exchanger. DSI completed a stress analysis of the ten-module heat exchanger bank and performed a shell-side flow inhomogeneity analysis of the three-module heat exchanger bank. The company produced 50 tubestrips using an in-house CNC milling machine and began pressing them onto tube arrays. DSI also revised some of the prototype tooling required to encapsulate a tube array and press tubestrips into the array.

  18. iGlobe Interactive Visualization and Analysis of Spatial Data

    NASA Technical Reports Server (NTRS)

    Hogan, Patrick

    2012-01-01

    iGlobe is open-source software built on NASA World Wind virtual globe technology. iGlobe provides a growing set of tools for weather science, climate research, and agricultural analysis. Until now, these types of sophisticated tools have been developed in isolation by national agencies, academic institutions, and research organizations. By providing an open-source solution for analyzing and visualizing weather, climate, and agricultural data, the scientific and research communities can more readily advance the solutions needed to better understand the dynamics of our home planet, Earth.

  19. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, the lack of standardized sharing mechanisms and practices often makes reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  20. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, the lack of standardized sharing mechanisms and practices often makes reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  1. Collaborative workbench for cyberinfrastructure to accelerate science algorithm development

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Maskey, M.; Kuo, K.; Lynnes, C.

    2013-12-01

    There are significant untapped resources for information and knowledge creation within the Earth Science community in the form of data, algorithms, services, analysis workflows or scripts, and the related knowledge about these resources. Despite the huge growth in social networking and collaboration platforms, these resources often reside on an investigator's workstation or in a laboratory and are rarely shared. A major reason for this is that there are very few scientific collaboration platforms, and those that exist typically require the use of a new set of analysis tools and paradigms to leverage the shared infrastructure. As a result, adoption of these collaborative platforms for science research is inhibited by the high cost to an individual scientist of switching from his or her own familiar environment and set of tools to a new environment and tool set. This presentation will describe an ongoing project developing an Earth Science Collaborative Workbench (CWB). The CWB approach will eliminate this barrier by augmenting a scientist's current research environment and tool set to allow him or her to easily share diverse data and algorithms. The CWB will leverage evolving technologies such as commodity computing and social networking to design an architecture for scalable collaboration that will support the emerging vision of an Earth Science Collaboratory. The CWB is being implemented on the robust and open-source Eclipse framework and will be compatible with widely used scientific analysis tools such as IDL. The myScience Catalog built into CWB will capture and track metadata and provenance about data and algorithms for the researchers in a non-intrusive manner with minimal overhead. Seamless interfaces to multiple Cloud services will support sharing algorithms, data, and analysis results, as well as access to storage and computer resources. A Community Catalog will track the use of shared science artifacts and manage collaborations among researchers.

  2. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    NASA Astrophysics Data System (ADS)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an EET chapter and discuss ways to use Earth science datasets and tools with their students. In a second, follow-up workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web-conferencing software program. The software allows participants to see the facilitator's computer as the analysis techniques of an EET chapter are demonstrated. If needed, the facilitator can also view individual participants' computers, assisting with technical difficulties. In addition, it enables a large number of end users, often widely distributed, to engage in interactive, real-time instruction. In this presentation, we will describe the elements of an EET Workshop pair, highlighting the capabilities and use of Elluminate. We will share lessons learned through several years of conducting this type of professional development. We will also share findings from survey data gathered from teachers who have participated in our workshops.

  3. Supporting Data Stewardship Throughout the Data Life Cycle in the Solid Earth Sciences

    NASA Astrophysics Data System (ADS)

    Ferrini, V.; Lehnert, K. A.; Carbotte, S. M.; Hsu, L.

    2013-12-01

    Stewardship of scientific data is fundamental to enabling new data-driven research, and ensures preservation, accessibility, and quality of the data, yet researchers, especially in disciplines that typically generate and use small, but complex, heterogeneous, and unstructured datasets, are challenged to fulfill increasing demands of properly managing their data. The IEDA Data Facility (www.iedadata.org) provides tools and services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds upon and brings together over a decade of development and experiences of its component data systems, the Marine Geoscience Data System (MGDS, www.marine-geo.org) and EarthChem (www.earthchem.org). IEDA services include domain-focused data curation and synthesis, tools for data discovery, access, visualization and analysis, as well as investigator support services that include tools for data contribution, data publication services, and data compliance support. IEDA data synthesis efforts (e.g. PetDB and Global Multi-Resolution Topography (GMRT) Synthesis) focus on data integration and analysis while emphasizing provenance and attribution. IEDA's domain-focused data catalogs (e.g. MGDS and EarthChem Library) provide access to metadata-rich long-tail data complemented by extensive metadata including attribution information and links to related publications. IEDA's visualization and analysis tools (e.g. GeoMapApp) broaden access to earth science data for domain specialists and non-specialists alike, facilitating both interdisciplinary research and education and outreach efforts. As a disciplinary data repository, a key role IEDA plays is to coordinate with its user community and to bridge the requirements and standards for data curation with both the evolving needs of its science community and emerging technologies. Development of IEDA tools and services is based first and foremost on the scientific needs of its user community. As data stewardship becomes a more integral component of the scientific workflow, IEDA investigator support services (e.g. Data Management Plan Tool and Data Compliance Reporting Tool) continue to evolve with the goal of lessening the 'burden' of data management for individual investigators by increasing awareness and facilitating the adoption of data management practices. We will highlight a variety of IEDA system components that support investigators throughout the data life cycle, and will discuss lessons learned and future directions.

  4. Reclaiming freshwater sustainability in the Cadillac Desert

    PubMed Central

    Sabo, John L.; Sinha, Tushar; Bowling, Laura C.; Schoups, Gerrit H. W.; Wallender, Wesley W.; Campana, Michael E.; Cherkauer, Keith A.; Fuller, Pam L.; Graf, William L.; Hopmans, Jan W.; Kominoski, John S.; Taylor, Carissa; Trimble, Stanley W.; Webb, Robert H.; Wohl, Ellen E.

    2010-01-01

    Increasing human appropriation of freshwater resources presents a tangible limit to the sustainability of cities, agriculture, and ecosystems in the western United States. Marc Reisner tackles this theme in his 1986 classic Cadillac Desert: The American West and Its Disappearing Water. Reisner's analysis paints a portrait of region-wide hydrologic dysfunction in the western United States, suggesting that the storage capacity of reservoirs will be impaired by sediment infilling, croplands will be rendered infertile by salt, and water scarcity will pit growing desert cities against agribusiness in the face of dwindling water resources. Here we evaluate these claims using the best available data and scientific tools. Our analysis provides strong scientific support for many of Reisner's claims, except the notion that reservoir storage is imminently threatened by sediment. More broadly, we estimate that the equivalent of nearly 76% of streamflow in the Cadillac Desert region is currently appropriated by humans, and this figure could rise to nearly 86% under a doubling of the region's population. Thus, Reisner's incisive journalism led him to the same conclusions as those rendered by copious data, modern scientific tools, and the application of a more genuine scientific method. We close with a prospectus for reclaiming freshwater sustainability in the Cadillac Desert, including a suite of recommendations for reducing region-wide human appropriation of streamflow to a target level of 60%. PMID:21149727

  5. Reclaiming freshwater sustainability in the Cadillac Desert

    USGS Publications Warehouse

    Sabo, John L.; Sinha, Tushar; Bowling, Laura C.; Schoups, Gerrit H.W.; Wallender, Wesley W.; Campana, Michael E.; Cherkauer, Keith A.; Fuller, Pam L.; Graf, William L.; Hopmans, Jan W.; Kominoski, John S.; Taylor, Carissa; Trimble, Stanley W.; Webb, Robert H.; Wohl, Ellen E.

    2010-01-01

    Increasing human appropriation of freshwater resources presents a tangible limit to the sustainability of cities, agriculture, and ecosystems in the western United States. Marc Reisner tackles this theme in his 1986 classic Cadillac Desert: The American West and Its Disappearing Water. Reisner's analysis paints a portrait of region-wide hydrologic dysfunction in the western United States, suggesting that the storage capacity of reservoirs will be impaired by sediment infilling, croplands will be rendered infertile by salt, and water scarcity will pit growing desert cities against agribusiness in the face of dwindling water resources. Here we evaluate these claims using the best available data and scientific tools. Our analysis provides strong scientific support for many of Reisner's claims, except the notion that reservoir storage is imminently threatened by sediment. More broadly, we estimate that the equivalent of nearly 76% of streamflow in the Cadillac Desert region is currently appropriated by humans, and this figure could rise to nearly 86% under a doubling of the region's population. Thus, Reisner's incisive journalism led him to the same conclusions as those rendered by copious data, modern scientific tools, and the application of a more genuine scientific method. We close with a prospectus for reclaiming freshwater sustainability in the Cadillac Desert, including a suite of recommendations for reducing region-wide human appropriation of streamflow to a target level of 60%.

  6. The Papillomavirus Episteme: a central resource for papillomavirus sequence data and analysis.

    PubMed

    Van Doorslaer, Koenraad; Tan, Qina; Xirasagar, Sandhya; Bandaru, Sandya; Gopalan, Vivek; Mohamoud, Yasmin; Huyen, Yentram; McBride, Alison A

    2013-01-01

    The goal of the Papillomavirus Episteme (PaVE) is to provide an integrated resource for the analysis of papillomavirus (PV) genome sequences and related information. The PaVE is a freely accessible, web-based tool (http://pave.niaid.nih.gov) created around a relational database, which enables storage, analysis and exchange of sequence information. From a design perspective, the PaVE adopts an Open Source software approach and stresses the integration and reuse of existing tools. Reference PV genome sequences have been extracted from publicly available databases and reannotated using a custom-created tool. To date, the PaVE contains 241 annotated PV genomes, 2245 genes and regions, 2004 protein sequences and 47 protein structures, which users can explore, analyze or download. The PaVE provides scientists with the data and tools needed to accelerate scientific progress for the study and treatment of diseases caused by PVs.

  7. π Scope: python based scientific workbench with visualization tool for MDSplus data

    NASA Astrophysics Data System (ADS)

    Shiraiwa, S.

    2014-10-01

    πScope is a Python-based scientific data analysis and visualization tool built on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivation for developing the new software is 1) to provide an updated tool to browse MDSplus data, with functionalities beyond dwscope and jScope, and 2) to provide a universal foundation to construct interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features to visualize MDSplus data during tokamak experiments, including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using Python scripts, and publication-quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand, while its open-source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and a script, enabling relatively complex data analysis workflows to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported to πScope, thus allowing LHCD simulations to be run between shots using C-Mod experimental profiles. This workflow is being used to generate a large database to develop an LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
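
    As a hedged, minimal sketch of the overplotting idea described above (several discharges on one panel), the fragment below uses matplotlib with synthetic signals standing in for data that πScope would normally pull from MDSplus; the shot numbers and values are invented.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-ins for signals that would normally be read from
# MDSplus (e.g. via the MDSplus Python bindings, not shown here).
t = np.linspace(0.0, 2.0, 500)                   # time base [s]
shots = {
    1150101001: 1.0 + 0.1 * np.sin(8 * np.pi * t),
    1150101002: 0.9 + 0.2 * np.sin(8 * np.pi * t + 0.5),
}

fig, ax = plt.subplots(figsize=(6, 3))
for shot, current in shots.items():
    ax.plot(t, current, label=f"shot {shot}")    # overplot discharges
ax.set_xlabel("time [s]")
ax.set_ylabel("plasma current [MA]")
ax.legend()
fig.tight_layout()
plt.show()
```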

  8. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    The prototype implementation of an expert system was developed to assist the user in the computer-aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data, the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified, and suggestions for improvements are presented to the user. In this paper, we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate its successful use in full-scale scientific applications.

  9. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while processing multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
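
    As a minimal, hedged sketch of one analysis type LCAT offers (a linear trend fit to a local climate time series), the snippet below uses SciPy on synthetic station data; it is not LCAT's implementation.

```python
import numpy as np
from scipy import stats

# Synthetic annual-mean temperatures for a hypothetical station
years = np.arange(1980, 2021)
rng = np.random.default_rng(0)
temps = 12.0 + 0.02 * (years - 1980) + rng.normal(0.0, 0.3, years.size)

# Least-squares linear trend, the core of a basic trend analysis
result = stats.linregress(years, temps)
print(f"trend: {result.slope * 10:.2f} deg C per decade "
      f"(p = {result.pvalue:.3f})")
```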

  10. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE’s application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years, members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware’s contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  11. E-Labs - Learning with Authentic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardeen, Marjorie G.; Wayne, Mitchell

    the success teachers have had providing an opportunity for students to: • Organize and conduct authentic research. • Experience the environment of scientific collaborations. • Possibly make real contributions to a burgeoning scientific field. We've created projects that are problem-based, student-driven and technology-dependent. Students reach beyond classroom walls to explore data with other students and experts and share results, publishing original work to a worldwide audience. Students can discover and extend the research of other students, modeling the processes of modern, large-scale research projects. From start to finish, e-Labs are student-led, teacher-guided projects. Students need only a Web browser to access computing techniques employed by professional researchers. A Project Map with milestones allows students to set the research plan rather than follow a step-by-step process common in other online projects. Most importantly, e-Labs build the learning experience around the students' own questions and let them use the very tools that scientists use. Students contribute to and access shared data, most derived from professional research databases. They use common analysis tools, store their work and use metadata to discover, replicate and confirm the research of others. This is where real scientific collaboration begins. Using online tools, students correspond with other research groups, post comments and questions, prepare summary reports, and in general participate in the part of scientific research that is often left out of classroom experiments. Teaching tools such as student and teacher logbooks, pre- and post-tests and an assessment rubric aligned with learner outcomes help teachers guide student work. Constraints on interface designs and administrative tools such as registration databases give teachers the "one-stop-shopping" they seek for multiple e-Labs. Teaching and administrative tools also allow us to track usage and assess the impact on student learning.

  12. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  13. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However, many users prefer higher-level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
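
    The report evaluates dedicated interoperability tools; purely as a generic illustration of calling compiled C/C++ code from Python, the sketch below uses ctypes against a hypothetical shared library. The library name and its mean() entry point are invented for the example and are not part of Sapphire's actual API.

```python
import ctypes

# Hypothetical shared library exposing:  double mean(const double*, size_t)
lib = ctypes.CDLL("./libexample.so")
lib.mean.restype = ctypes.c_double
lib.mean.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]

def mean(values):
    """Thin Python wrapper around the compiled routine."""
    arr = (ctypes.c_double * len(values))(*values)
    return lib.mean(arr, len(values))

print(mean([1.0, 2.0, 3.0, 4.0]))   # -> 2.5, if libexample.so is available
```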

  14. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  15. EnviroAtlas Cyanobacteria Assessment Network (CyAN) Dashboard: A Tool for Data Visualization and Exploratory Analysis

    EPA Science Inventory

    Economic, health, and environmental impacts of cyanobacteria and associated harmful algal blooms are increasingly recognized by policymakers, managers, and scientific researchers. However, spatially-distributed, long-term data on cyanobacteria blooms are largely unavailable. The ...

  16. Object classification and outliers analysis in the forthcoming Gaia mission

    NASA Astrophysics Data System (ADS)

    Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.

    2010-12-01

    Astrophysics is evolving towards the rational optimization of costly observational material by the intelligent exploitation of large astronomical databases from both terrestrial telescopes and space mission archives. However, there has been relatively little advance in the development of the highly scalable data exploitation and analysis tools needed to generate the scientific returns from these large and expensively obtained datasets. Among the upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This contribution reviews some of the work being carried out by the Gaia Data Processing and Analysis Consortium on object classification and the analysis of outliers in the forthcoming mission.
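
    The abstract does not name the consortium's algorithms; purely as an illustration of automated outlier analysis over a feature table, the sketch below applies scikit-learn's IsolationForest to synthetic "photometric features". Both the data and the choice of algorithm are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic feature vectors for 200 ordinary sources plus one anomaly
rng = np.random.default_rng(42)
features = rng.normal(0.0, 1.0, size=(200, 3))
features = np.vstack([features, [8.0, -7.5, 9.0]])

# Isolation Forest flags sources that are easy to isolate as outliers
clf = IsolationForest(random_state=0).fit(features)
labels = clf.predict(features)            # +1 = inlier, -1 = outlier
print("outlier indices:", np.where(labels == -1)[0])
```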

  17. The new Planetary Science Archive (PSA): Exploration and discovery of scientific datasets from ESA's planetary missions

    NASA Astrophysics Data System (ADS)

    Martinez, Santa; Besse, Sebastien; Heather, Dave; Barbarisi, Isa; Arviset, Christophe; De Marchi, Guido; Barthelemy, Maud; Docasal, Ruben; Fraga, Diego; Grotheer, Emmanuel; Lim, Tanya; Macfarlane, Alan; Rios, Carlos; Vallejo, Fran; Saiz, Jaime; ESDC (European Space Data Centre) Team

    2016-10-01

    The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific datasets through various interfaces at http://archives.esac.esa.int/psa. All datasets are scientifically peer-reviewed by independent scientists, and are compliant with the Planetary Data System (PDS) standards. The PSA is currently implementing a number of significant improvements, mostly driven by the evolution of the PDS standard, and the growing need for better interfaces and advanced applications to support science exploitation. The newly designed PSA will enhance the user experience and will significantly reduce the complexity for users to find their data, promoting one-click access to the scientific datasets with more specialised views when needed. This includes better integration with planetary GIS analysis tools and planetary interoperability services for searching and retrieving data (supporting, e.g., PDAP and EPN-TAP). It will also be kept up to date with versions 3 and 4 of the PDS standards, as PDS4 will be used for ESA's ExoMars and upcoming BepiColombo missions. Users will have direct access to documentation, information and tools that are relevant to the scientific use of the dataset, including ancillary datasets, Software Interface Specification (SIS) documents, and any tools/help that the PSA team can provide. A login mechanism will provide additional functionalities to the users to aid and ease their searches (e.g. saving queries, managing default views). This contribution will introduce the new PSA, its key features and access interfaces.
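
    Since the new PSA supports table access protocols such as EPN-TAP, a programmatic query could look roughly like the hedged sketch below, which uses the pyvo TAP client with a placeholder service URL and table name; real EPN-TAP services publish their own endpoints and epn_core tables.

```python
from pyvo.dal import TAPService

# Placeholder endpoint and table name, for illustration only
service = TAPService("https://example.org/tap")
query = """
    SELECT TOP 10 granule_uid, target_name, dataproduct_type
    FROM example.epn_core
    WHERE target_name = 'Mars'
"""
results = service.search(query)
for row in results:
    print(row["granule_uid"], row["dataproduct_type"])
```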

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svetlana Shasharina

    The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying them to be more usable.

  19. Scientific approaches to science policy.

    PubMed

    Berg, Jeremy M

    2013-11-01

    The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.

  20. Scientific Workflow Management in Proteomics

    PubMed Central

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  1. Analysis of citation networks as a new tool for scientific research

    DOE PAGES

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.; ...

    2016-12-06

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.
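
    CiteSpace is a standalone application; the sketch below is only a toy illustration of the underlying idea (build a citation graph and score papers with a centrality measure) using networkx, with invented paper identifiers.

```python
import networkx as nx

# Toy citation graph: an edge A -> B means paper A cites paper B
citations = [
    ("P3", "P1"), ("P4", "P1"), ("P5", "P1"),   # P1 is widely cited
    ("P4", "P2"), ("P5", "P2"), ("P5", "P3"),
]
G = nx.DiGraph(citations)

# PageRank over the citation graph is one simple centrality measure
# for spotting seminal (heavily and influentially cited) papers.
scores = nx.pagerank(G)
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
```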

  2. Analysis of citation networks as a new tool for scientific research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.

  3. STEREO In-situ Data Analysis

    NASA Astrophysics Data System (ADS)

    Schroeder, P. C.; Luhmann, J. G.; Davis, A. J.; Russell, C. T.

    2006-12-01

    STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long-duration, detailed observations of 1 AU magnetic field structures, plasma and suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument takes plasma ion composition measurements, completing STEREO's comprehensive in-situ perspective. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment makes it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICMEs) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots has been developed which integrates STEREO's in-situ data with data from a variety of other missions including WIND and ACE. Also, an application program interface (API) is provided allowing users to create custom software that ties directly into STEREO's data set. The API allows for more advanced forms of data mining than currently available through most web-based data services. A variety of data access techniques and the development of cross-spacecraft data analysis tools allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.

  4. SpecViz: Interactive Spectral Data Analysis

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas Michael; STScI

    2016-06-01

    The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is an interactive 1-D spectral visualization and analysis application built with Python in an open-source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, and incorporates advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active IPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straightforward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.
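
    SpecViz performs such measurements interactively; as a hedged, non-interactive illustration of the same kind of spectral-feature analysis, the sketch below fits a Gaussian emission line plus flat continuum to a synthetic spectrum with SciPy. It is not SpecViz code, and the line parameters are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def line_model(wave, amp, mu, sigma, cont):
    """Flat continuum plus a single Gaussian emission line."""
    return cont + amp * np.exp(-0.5 * ((wave - mu) / sigma) ** 2)

# Synthetic spectrum with one emission line near 6563 Angstrom
rng = np.random.default_rng(1)
wave = np.linspace(6500.0, 6620.0, 300)
flux = line_model(wave, 5.0, 6563.0, 4.0, 1.0)
flux += rng.normal(0.0, 0.1, wave.size)

popt, _ = curve_fit(line_model, wave, flux, p0=[3.0, 6560.0, 5.0, 1.0])
amp, mu, sigma, cont = popt
print(f"line centre {mu:.1f} A, FWHM {2.355 * sigma:.1f} A")
```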

  5. Bibliometric analysis of scientific literature on intestinal parasites in Argentina during the period 1985-2014.

    PubMed

    Basualdo, Juan A; Grenóvero, María S; Bertucci, Evangelina; Molina, Nora B

    2016-01-01

    The study of scientific production is a good indicator of the progress in research and knowledge generation. Bibliometrics is a scientific discipline that uses a set of indicators to quantitatively express the bibliographic characteristics of scientific publications. The scientific literature on the epidemiology of intestinal parasites in Argentina is scattered in numerous sources, hindering access and visibility to the scientific community. Our purpose was to perform a quantitative, bibliometric study of the scientific literature on intestinal parasites in humans in Argentina published in the period 1985-2014. This bibliometric analysis showed an increase in the number of articles on intestinal parasites in humans in Argentina published over the past 30 years. Those articles showed a collaboration index similar to that of the literature, with a high index of institutionality for national institutions and a very low one for international collaboration. The original articles were published in scientific journals in the Americas, Europe and Asia. The use of bibliometric indicators can provide a solid tool for the diagnosis and survey of the research on epidemiology of intestinal parasites and contributes to the dissemination and visibility of information on the scientific production developed in Argentina. Copyright © 2016 Asociación Argentina de Microbiología. Published by Elsevier España, S.L.U. All rights reserved.
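
    As a minimal illustration of one indicator used in studies of this kind, the sketch below computes a collaboration index (mean authors per multi-authored article) and the single-author share from a toy record list; the author lists are invented and the paper's exact indicator definitions may differ.

```python
# Toy records: each entry is the author list of one article
records = [
    ["Perez", "Gomez"],
    ["Lopez"],
    ["Diaz", "Ruiz", "Soto", "Vega"],
    ["Mora", "Silva", "Rey"],
]

multi = [r for r in records if len(r) > 1]
collaboration_index = sum(len(r) for r in multi) / len(multi)
single_author_share = sum(len(r) == 1 for r in records) / len(records)

print(f"collaboration index: {collaboration_index:.2f}")
print(f"single-author share: {single_author_share:.0%}")
```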

  6. Virtual Planetary Analysis Environment for Remote Science

    NASA Technical Reports Server (NTRS)

    Keely, Leslie; Beyer, Ross; Edwards. Laurence; Lees, David

    2009-01-01

    All of the data for NASA's current planetary missions and most data for field experiments are collected via orbiting spacecraft, aircraft, and robotic explorers. Mission scientists are unable to employ traditional field methods when operating remotely. We have developed a virtual exploration tool for remote sites with data analysis capabilities that extend human perception quantitatively and qualitatively. Scientists and mission engineers can use it to explore a realistic representation of a remote site. It also provides software tools to "touch" and "measure" remote sites with an immediacy that boosts scientific productivity and is essential for mission operations.

  7. Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state of the art in scientific workflows have focused on the following areas, with progress in each described in subsequent sections: workflow development (developing a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows); generic workflow components and templates (developing generic actors, i.e. workflow components and processes, that can be broadly applied to scientific problems); provenance collection and analysis (designing a flexible provenance collection and analysis infrastructure within the workflow environment); and workflow reliability and fault tolerance (improving the reliability and fault tolerance of workflow environments).

  8. M-Learning and Augmented Reality: A Review of the Scientific Literature on the WoS Repository

    ERIC Educational Resources Information Center

    Fombona, Javier; Pascual-Sevillano, Maria-Angeles; González-Videgara, MariCarmen

    2017-01-01

    Augmented reality emerges as a tool, on which it is necessary to examine its real educational value. This paper shows the results of a bibliometric analysis performed on documents collected from the Web of Science repository, an Internet service that concentrates bibliographic information from more than 7,000 institutions. Our analysis included an…

  9. Measuring, Understanding, and Responding to Covert Social Networks: Passive and Active Tomography

    DTIC Science & Technology

    2017-11-29

    Methods for generating a random sample of networks with desired properties are important tools for the analysis of social, biological, and information... on Theoretical Foundations for Statistical Network Analysis at the Isaac Newton Institute for Mathematical Sciences at Cambridge U. (organized by... The problems span three disciplines (social sciences, statistics, EECS), and scientific focus is needed at the interfaces.
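
    As a hedged illustration of generating random networks with prescribed properties, the sketch below draws a random graph with a given degree sequence using the networkx configuration model; the degree sequence is arbitrary and this is not the project's methodology.

```python
import networkx as nx

# Desired degree sequence (its sum must be even)
degrees = [3, 3, 2, 2, 2, 1, 1]

# Configuration model: a random multigraph realizing this degree sequence
G = nx.configuration_model(degrees, seed=42)

# Collapse parallel edges and drop self-loops to obtain a simple graph
G_simple = nx.Graph(G)
G_simple.remove_edges_from(nx.selfloop_edges(G_simple))

print("degrees requested:", sorted(degrees, reverse=True))
print("degrees realized :", sorted((d for _, d in G_simple.degree()), reverse=True))
```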

  10. Science Partnerships Enabling Rapid Response: Designing a Strategy for Improving Scientific Collaboration during Crisis Response

    NASA Astrophysics Data System (ADS)

    Mease, L.; Gibbs, T.; Adiseshan, T.

    2014-12-01

    The 2010 Deepwater Horizon disaster required unprecedented engagement and collaboration with scientists from multiple disciplines across government, academia, and industry. Although this spurred the rapid advancement of valuable new scientific knowledge and tools, it also exposed weaknesses in the system of information dissemination and exchange among the scientists from those three sectors. Limited government communication with the broader scientific community complicated the rapid mobilization of the scientific community to assist with spill response, evaluation of impact, and public perceptions of the crisis. The lessons and new laws produced from prior spills such as Exxon Valdez were helpful, but ultimately did not lead to the actions necessary to prepare a suitable infrastructure that would support collaboration with non-governmental scientists. As oil demand pushes drilling into increasingly extreme environments, addressing the challenge of effective, science-based disaster response is an imperative. Our study employs a user-centered design process to 1) understand the obstacles to and opportunity spaces for effective scientific collaboration during environmental crises such as large oil spills, 2) identify possible tools and strategies to enable rapid information exchange between government responders and non-governmental scientists from multiple relevant disciplines, and 3) build a network of key influencers to secure sufficient buy-in for scaled implementation of appropriate tools and strategies. Our methods include user ethnography, complex system mapping, individual and system behavioral analysis, and large-scale system design to identify and prototype a solution to this crisis collaboration challenge. In this talk, we will present our insights gleaned from existing analogs of successful scientific collaboration during crises and our initial findings from the 60 targeted interviews we conducted that highlight key collaboration challenges that government agencies, academic research institutions, and industry scientists face during oil spill crises. We will also present a synthesis of leverage points in the system that may amplify the impact of an improved collaboration strategy among scientific stakeholders.

  11. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community; therefore, each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
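
    Common Tool Descriptors are structured (XML) documents listing a command-line tool's inputs, outputs, and parameters. The sketch below parses a made-up, heavily simplified descriptor with Python's standard library; the real CTD schema is richer, and the tool and parameter names are invented.

```python
import xml.etree.ElementTree as ET

# Made-up descriptor in the spirit of a Common Tool Descriptor;
# the actual CTD schema contains far more detail than this fragment.
ctd_xml = """
<tool name="PeakPicker" version="1.0">
  <parameters>
    <parameter name="in" type="input-file" description="raw spectra"/>
    <parameter name="out" type="output-file" description="picked peaks"/>
    <parameter name="signal_to_noise" type="double" value="1.5"/>
  </parameters>
</tool>
"""

root = ET.fromstring(ctd_xml)
print(f"tool: {root.get('name')} v{root.get('version')}")
for p in root.iter("parameter"):
    print(f"  {p.get('name'):16s} type={p.get('type'):12s} value={p.get('value', '-')}")
```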

  12. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE PAGES

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.; ...

    2015-06-19

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  13. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  14. USEPA’s Land‐Based Materials Management Exposure and Risk Assessment Tool System

    EPA Science Inventory

    It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. RIMM (Risk-Informed Materials Management) provides an integrated data gathering and analysis capability to enable scientifically rigorous ...

  15. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software packages. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
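
    The paper's assessment goes far beyond this, but as a minimal illustration of programmatic code inspection, the sketch below uses Python's ast module to report a crude size metric per function in a toy source string; the example functions are invented and this is not the study's methodology.

```python
import ast

# Toy source to inspect; in practice this would be read from a file
source = '''
def rate_matrix(alpha, beta):
    return [[-alpha, alpha], [beta, -beta]]

def simulate(tree, model):
    for node in tree:
        if node.parent is not None:
            node.state = model.sample(node.parent.state)
    return tree
'''

for node in ast.walk(ast.parse(source)):
    if isinstance(node, ast.FunctionDef):
        length = node.end_lineno - node.lineno + 1
        print(f"{node.name}: {length} lines, {len(node.args.args)} arguments")
```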

  16. Visual Data Exploration and Analysis - Report on the Visualization Breakout Session of the SCaLeS Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Frank, Randy; Fulcomer, Sam

    Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas, from computer science and cognitive psychology to art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--the common theme is the ability to present information visually that is rapidly assimilated by human observers, and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens or use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data that can be at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to "see the unseeable." Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its "own language," verbal and visual, used for communication. The visual language for depicting electrical circuits is much different from the visual language for depicting theoretical molecules or trends in the stock market. There is no "one visualization tool" that can serve as a panacea for all science disciplines. Instead, visualization researchers work hand in hand with domain scientists as part of the scientific research process to define, create, adapt and refine software that "speaks the visual language" of each scientific domain.

  17. Exploiting the Use of Social Networking to Facilitate Collaboration in the Scientific Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coppock, Edrick G.

    The goal of this project was to exploit social networking to facilitate scientific collaboration. The project objective was to research and identify scientific collaboration styles that are best served by social networking applications and to model the most effective social networking applications to substantiate how social networking can support scientific collaboration. To achieve this goal and objective, the project was to develop an understanding of the types of collaborations conducted by scientific researchers, through classification, data analysis and identification of unique collaboration requirements. Another technical objective in support of this goal was to understand the current state of technology in collaboration tools. In order to test hypotheses about which social networking applications effectively support scientific collaboration the project was to create a prototype scientific collaboration system. The ultimate goal for testing the hypotheses and research of the project was to refine the prototype into a functional application that could effectively facilitate and grow collaboration within the U.S. Department of Energy (DOE) research community.

  18. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  19. The route to best science in implementation of the Endangered Species Act's consultation mandate: the benefits of structured effects analysis.

    PubMed

    Murphy, Dennis D; Weiland, Paul S

    2011-02-01

    The Endangered Species Act is intended to conserve at-risk species and the ecosystems upon which they depend, and it is premised on the notion that if the wildlife agencies that are charged with implementing the statute use the best available scientific information, they can successfully carry out this intention. We assess effects analysis as a tool for using best science to guide agency decisions under the Act. After introducing effects analysis, we propose a framework that facilitates identification and use of the best available information in the development of agency determinations. The framework includes three essential steps--the collection of reliable scientific information, the critical assessment and synthesis of available data and analyses derived from those data, and the analysis of the effects of actions on listed species and their habitats. We warn of likely obstacles to rigorous, structured effects analyses and describe the extent to which independent scientific review may assist in overcoming these obstacles. We conclude by describing eight essential elements that are required for a successful effects analysis.

  20. Web-based analysis and publication of flow cytometry experiments.

    PubMed

    Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M

    2010-07-01

    Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.

  1. Web-Based Analysis and Publication of Flow Cytometry Experiments

    PubMed Central

    Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.

    2014-01-01

    Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106

  2. How the World Gains Understanding of a Planet: Analysis of Scientific Understanding in Earth Sciences and of the Communication of Earth-Scientific Explanation

    NASA Astrophysics Data System (ADS)

    Voute, S.; Kleinhans, M. G.; de Regt, H.

    2010-12-01

    A scientific explanation for a phenomenon is based on relevant theory and initial and background conditions. Scientific understanding, on the other hand, requires intelligibility, which means that a scientist can recognise qualitative characteristic consequences of the theory without doing the actual calculations, and apply it to develop further explanations and predictions. If explanation and understanding are indeed fundamentally different, then it may be possible to convey understanding of earth-scientific phenomena to laymen without the full theoretical background. The aim of this thesis is to analyze how scientists and laymen gain scientific understanding in Earth Sciences, based on the newest insights in the philosophy of science, pedagogy, and science communication. All three disciplines have something to say about how humans learn and understand, even if at the very different levels of scientists, students, children, or the general public. If different disciplines with different approaches identify and quantify the same theory in the same manner, then there is likely to be something “real” behind the theory. Comparing the methodology and learning styles of the different disciplines within the Earth Sciences and critically analyzing earth-scientific exhibitions in different museums may provide insight into the different approaches to earth-scientific explanation and communication. In order to gain earth-scientific understanding, a broad suite of tools is used, such as maps and images, symbols and diagrams, cross-sections and sketches, categorization and classification, modelling, laboratory experiments, (computer) simulations and analogies, remote sensing, and fieldwork. All these tools have a dual nature, containing both theoretical and embodied components. Embodied knowledge is created by doing the actual modelling, intervening in experiments and doing fieldwork. Scientific practice includes discovery and exploration, data collection and analysis, verification or falsification, and conclusions that must be well grounded and argued. The intelligibility of theories is improved by the combination of these two types of understanding. This is also attested by the fact that both theoretical and embodied skills are considered essential for the training of university students at all levels. However, from the surprised and confounded reactions of the public to natural disasters it appears that just showing scientific results is not enough to convey scientific understanding to the public. By using the tools used by earth scientists to develop explanations and achieve understanding, laymen could achieve understanding as well without rigorous theoretical training. We are presently investigating in science museums whether engaging the public in scientific activities based on embodied skills leads to understanding of earth-scientific phenomena by laymen.

  3. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, meets scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
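
    The abstract above describes the orchestration pattern only at a high level. As a purely illustrative sketch (the endpoint names, payload fields, and polling scheme below are hypothetical and are not the mission toolchain's actual API), one web-triggered analysis step typically amounts to pulling inputs from the trusted repository, submitting a job to a remote tool server, and archiving the tagged result:

```python
import time
import requests  # generic HTTP client; the real toolchain's API is not public

# Hypothetical endpoints standing in for the system model repository and a
# remote domain-specific analysis server described in the abstract.
REPO_URL = "https://example.org/system-model"
TOOL_URL = "https://example.org/power-data-tool"

def run_power_data_simulation(scenario_id: str) -> dict:
    """Sketch of one end-to-end orchestration step: fetch inputs, run, archive."""
    # 1. Pull the current high-level inputs for this scenario from the repository.
    inputs = requests.get(f"{REPO_URL}/scenarios/{scenario_id}/inputs").json()

    # 2. Submit the job to the remote analysis tool and poll until it finishes.
    job = requests.post(f"{TOOL_URL}/jobs", json=inputs).json()
    while True:
        status = requests.get(f"{TOOL_URL}/jobs/{job['id']}").json()
        if status["state"] in ("done", "failed"):
            break
        time.sleep(30)

    # 3. Push results back to the repository, tagged with the model version so
    #    current and historical runs stay traceable (configuration management).
    results = status.get("results", {})
    requests.post(f"{REPO_URL}/scenarios/{scenario_id}/results",
                  json={"model_version": inputs.get("model_version"),
                        "results": results})
    return results
```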

  4. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geophysical sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are neither readily available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theoretical basis of spectroscopy and imaging spectroscopy through 'hands-on' activity. This tool is available online and provides spectral visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as those of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool also allows locally collected sample spectra to be loaded for further analysis.
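
    As a generic illustration of one of the algorithms named above (not the Spec tool's actual implementation), the spectral angle between a measured spectrum and a reference spectrum is simply the angle between the two vectors of reflectance values:

```python
import numpy as np

def spectral_angle(spectrum: np.ndarray, reference: np.ndarray) -> float:
    """Spectral angle (radians) between two spectra sampled on the same bands.

    Smaller angles indicate more similar spectral shapes; the measure is
    insensitive to overall brightness because only the vector direction matters.
    """
    cos_theta = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Toy example with made-up five-band reflectances.
sample = np.array([0.12, 0.18, 0.25, 0.40, 0.42])
reference = np.array([0.10, 0.15, 0.22, 0.38, 0.40])
print(spectral_angle(sample, reference))  # small angle -> similar spectral shape
```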

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springmeyer, R R; Brugger, E; Cook, R

    The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool and providing classes on how to use it to performing data analysis and visualization for customers. The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization techniques for large scale data exploration that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls. The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had 4 releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk usage view, which works on all types of connections, and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to the development tasks.
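
    VisIt itself is scriptable from Python; a minimal sketch of that usage (assuming VisIt's command line interface, "visit -cli", where these functions live in the global namespace; the file and variable names are placeholders) looks like:

```python
# Run inside "visit -cli"; OpenDatabase, AddPlot, etc. are provided by VisIt.
OpenDatabase("example.silo")           # placeholder dataset
AddPlot("Pseudocolor", "temperature")  # placeholder variable name
AddOperator("Slice")                   # cut a 2D slice through the volume
DrawPlots()                            # render the plots in the active window
SaveWindow()                           # write the rendered image to disk
```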

  6. Software and the Scientist: Coding and Citation Practices in Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and the high rate of citations in the references. However, clear instructions from developers on how to cite, and education of users on what to cite, are still lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.

  7. Integrated Analysis Tools for Determination of Structural Integrity and Durability of High temperature Polymer Matrix Composites

    DTIC Science & Technology

    2008-08-18

    fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. Tools and...developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, time varying...degradation processes.
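
    The snippet above is fragmentary, but the Fickian-diffusion ingredient it mentions can be illustrated with a standard explicit finite-difference scheme for one-dimensional diffusion of an oxidant into a slab (a generic textbook sketch, not the program's actual model):

```python
import numpy as np

def fickian_diffusion_1d(c0, D, dx, dt, steps):
    """Explicit FTCS scheme for dc/dt = D * d2c/dx2 with fixed boundary values.

    c0     : initial concentration profile (1D array)
    D      : diffusivity
    dx, dt : grid spacing and time step (stable when D*dt/dx**2 <= 0.5)
    """
    c = np.asarray(c0, dtype=float).copy()
    r = D * dt / dx**2
    for _ in range(steps):
        c[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
    return c

# Oxidant held at the surface (c = 1) diffusing into an initially pristine slab.
profile = np.zeros(101)
profile[0] = 1.0
print(fickian_diffusion_1d(profile, D=1e-2, dx=0.01, dt=0.004, steps=500)[:5])
```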

  8. Sustainable groundwater management in California

    USGS Publications Warehouse

    Phillips, Steven P.; Rogers, Laurel Lynn; Faunt, Claudia

    2015-12-01

    The U.S. Geological Survey (USGS) uses data collection, modeling tools, and scientific analysis to help water managers plan for, and assess, hydrologic issues that can cause “undesirable results” associated with groundwater use. This information helps managers understand trends and investigate and predict effects of different groundwater-management strategies.

  9. Open Science and the Monitoring of Aquatic Ecosystems

    EPA Science Inventory

    Open science represents both a philosophy and a set of tools that can be leveraged for more effective scientific analysis. At the core of the open science movement is the concept that research should be reproducible and transparent, in addition to having long-term provenance thro...

  10. The Great Spray Can Debate.

    ERIC Educational Resources Information Center

    Bassow, Herb

    This booklet, designed to be used in high school classrooms, concerns the technological, economic, and political contexts of the fluorocarbon-ozone depletion controversy. The curriculum is divided into three phases: the scientific dimension, which is a pure science analysis using lab-classroom tools and methodologies; the philosophical dimension,…

  11. Climate Change Education in Earth System Science

    NASA Astrophysics Data System (ADS)

    Hänsel, Stephanie; Matschullat, Jörg

    2013-04-01

    The course "Atmospheric Research - Climate Change" is offered to master's-level Earth System Science students within the specialisation "Climate and Environment" at the Technical University Bergakademie Freiberg. This module takes a comprehensive approach to climate sciences, reaching from the natural sciences background of climate change via the social components of the issue to the statistical analysis of changes in climate parameters. The course aims at qualifying the students to structure the physical and chemical basics of the climate system including relevant feedbacks. The students can evaluate relevant drivers of climate variability and change on various temporal and spatial scales and can transform knowledge from climate history to the present and the future. Special focus is given to the assessment of uncertainties related to climate observations and projections as well as the specific challenges of extreme weather and climate events. At the end of the course the students are able to critically reflect on and evaluate climate-change-related results of scientific studies and related issues in media. The course is divided into two parts - "Climate Change" and "Climate Data Analysis" - and encompasses two lectures, one seminar and one exercise. The weekly "Climate Change" lecture conveys the physical and chemical background for climate variation and change. (Pre)historical, observed and projected climate changes and their effects on various sectors are introduced and discussed regarding their implications for society, economics, ecology and politics. The related seminar presents and discusses the multiple reasons for controversy in climate change issues, based on various texts. Students practice presenting scientific content and discussing aspects of climate change. The biweekly lecture on "Climate Data Analysis" introduces the most relevant statistical tools and methods in climate science. Starting with data quality checks using tools of exploratory data analysis, approaches to climate time series, trend analysis, and extreme event analysis are explained. Tools for describing relations within the data sets and significance tests complement this. Within the weekly exercises that have to be prepared at home, the students work with self-selected climate data sets and apply the learned methods. The presentation and discussion of intermediate results by the students is as much a part of the exercises as the illustration of possible methodological procedures by the teacher using exemplary data sets. The total time expenditure of the course is 270 hours with 90 attendance hours. The remainder consists of individual studies, e.g., preparation of discussions and presentations, statistical data analysis, and scientific writing. Different forms of examination are applied, including written or oral examinations, scientific reports, presentations and portfolio work.
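
    A minimal example of the kind of trend analysis taught in the "Climate Data Analysis" part (generic, using synthetic data rather than the course's data sets) is an ordinary least-squares trend on an annual temperature series:

```python
import numpy as np
from scipy import stats

# Synthetic annual mean temperatures with a prescribed warming trend plus noise.
rng = np.random.default_rng(0)
years = np.arange(1960, 2021)
temps = 8.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Linear trend and its significance (two-sided t-test on the slope).
fit = stats.linregress(years, temps)
print(f"trend: {fit.slope * 10:.2f} K per decade, p-value: {fit.pvalue:.3g}")
```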

  12. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in these outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and custom modifications to FAST, if any, done to further enhance the analysis. Some of the future directions for FAST are also described.

  13. Characterization of the peer review network at the Center for Scientific Review, National Institutes of Health.

    PubMed

    Boyack, Kevin W; Chen, Mei-Ching; Chacko, George

    2014-01-01

    The National Institutes of Health (NIH) is the largest source of funding for biomedical research in the world. This funding is largely effected through a competitive grants process. Each year the Center for Scientific Review (CSR) at NIH manages the evaluation, by peer review, of more than 55,000 grant applications. A relevant management question is how this scientific evaluation system, supported by finite resources, could be continuously evaluated and improved for maximal benefit to the scientific community and the taxpaying public. Towards this purpose, we have created the first system-level description of peer review at CSR by applying text analysis, bibliometric, and graph visualization techniques to administrative records. We identify otherwise latent relationships across scientific clusters, which in turn suggest opportunities for structural reorganization of the system based on expert evaluation. Such studies support the creation of monitoring tools and provide transparency and knowledge to stakeholders.
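
    As a generic illustration of the graph techniques mentioned above (this is not CSR's actual system, and the data are invented), clusters of applications can be linked by a similarity measure derived from text analysis and then partitioned with standard community-detection tools:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Invented similarity scores between hypothetical review clusters, e.g. derived
# from text analysis of the applications assigned to each cluster.
similarities = [
    ("Neuroscience-A", "Neuroscience-B", 0.8),
    ("Neuroscience-B", "Behavioral",     0.5),
    ("Immunology-A",   "Immunology-B",   0.7),
    ("Immunology-B",   "Infectious",     0.6),
    ("Behavioral",     "Infectious",     0.1),
]

G = nx.Graph()
G.add_weighted_edges_from(similarities)

# Community detection suggests candidate groupings of review clusters.
for community in greedy_modularity_communities(G, weight="weight"):
    print(sorted(community))
```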

  14. Enhancing Scientific Practice and Education through Collaborative Digital Libraries.

    ERIC Educational Resources Information Center

    Maini, Gaurav; Leggett, John J.; Ong, Teongjoo; Wilson, Hugh D.; Reed, Monique D.; Hatch, Stephan L.; Dawson, John E.

    The need for accurate and current scientific information in the fast paced Internet-aware world has prompted the scientific community to develop tools that reduce the scientist's time and effort to make digital information available to all interested parties. The availability of such tools has made the Internet a vast digital repository of…

  15. Integrating Data Base into the Elementary School Science Program.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…

  16. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    PubMed

    Lausberg, Hedda; Sloetjes, Han

    2016-09-01

    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.

  17. Data-Oriented Astrophysics at NOAO: The Science Archive & The Data Lab

    NASA Astrophysics Data System (ADS)

    Juneau, Stephanie; NOAO Data Lab, NOAO Science Archive

    2018-06-01

    As we keep progressing into an era of increasingly large astronomy datasets, NOAO’s data-oriented mission is growing in prominence. The NOAO Science Archive, which captures and processes the pixel data from mountaintops in Chile and Arizona, now contains holdings at Petabyte scales. Working at the intersection of astronomy and data science, the main goal of the NOAO Data Lab is to provide users with a suite of tools to work close to this data, the catalogs derived from them, as well as externally provided datasets, and thus optimize the scientific productivity of the astronomy community. These tools and services include databases, query tools, virtual storage space, workflows through our Jupyter Notebook server, and scripted analysis. We currently host datasets from NOAO facilities such as the Dark Energy Survey (DES), the DESI imaging Legacy Surveys (LS), the Dark Energy Camera Plane Survey (DECaPS), and the nearly all-sky NOAO Source Catalog (NSC). We are further preparing for large spectroscopy datasets such as DESI. After a brief overview of the Science Archive, the Data Lab and datasets, I will briefly showcase scientific applications showing use of our data holdings. Lastly, I will describe our vision for future developments as we tackle the next technical and scientific challenges.

  18. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.

    PubMed

    McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S

    2015-10-20

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
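
    A minimal usage sketch of the MDTraj API described above (the file names are placeholders):

```python
import mdtraj as md

# Load a trajectory in any supported format; the topology is supplied separately.
traj = md.load("trajectory.xtc", top="topology.pdb")  # placeholder file names

# Minimal (optimally superposed) RMSD of every frame to the first frame.
rmsd_to_first = md.rmsd(traj, traj, frame=0)

# Secondary-structure assignment (DSSP) for each residue in each frame.
dssp = md.compute_dssp(traj)

print(traj.n_frames, rmsd_to_first.shape, dssp.shape)
```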

  19. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories

    PubMed Central

    McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.

    2015-01-01

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642

  20. Assessing Ecosystem Impacts from Simulant and Decontaminant Use

    DTIC Science & Technology

    1988-05-01

    on the relationship between metabolism and body weight, W: D_MAN = D_ANIMAL * (W_ANIMAL/W_MAN)^0.25 (7). Values of the scaling factor, (W_ANIMAL/W_MAN)^0.25, ...chemical. Structure-activity analysis is a relatively new field, and the available tools are still crude. The user must exercise scientific judgment in
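
    Reconstructed from the garbled snippet, the scaling relation is D_MAN = D_ANIMAL * (W_ANIMAL / W_MAN)^0.25. Assuming D denotes a dose or exposure rate, as is typical for this kind of interspecies extrapolation, a trivial worked example is:

```python
def scale_dose_to_human(dose_animal, weight_animal_kg, weight_human_kg=70.0):
    """Allometric scaling based on the metabolism-body weight relationship:
    D_man = D_animal * (W_animal / W_man) ** 0.25
    """
    return dose_animal * (weight_animal_kg / weight_human_kg) ** 0.25

# Example: scale a value observed in a 0.25 kg rat to a 70 kg human.
print(scale_dose_to_human(dose_animal=10.0, weight_animal_kg=0.25))  # ~2.4
```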

  1. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  2. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-01-30

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  3. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    PubMed

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.

  4. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  5. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.

    PubMed

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-04-15

    Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.

  6. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis

    PubMed Central

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-01-01

    Abstract Motivation Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. Results The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele Contact darrell.hurt@nih.gov PMID:29028892

  7. Theoretical and technological building blocks for an innovation accelerator

    NASA Astrophysics Data System (ADS)

    van Harmelen, F.; Kampis, G.; Börner, K.; van den Besselaar, P.; Schultes, E.; Goble, C.; Groth, P.; Mons, B.; Anderson, S.; Decker, S.; Hayes, C.; Buecheler, T.; Helbing, D.

    2012-11-01

    Modern science is a main driver of technological innovation. The efficiency of the scientific system is of key importance to ensure the competitiveness of a nation or region. However, the scientific system that we use today was devised centuries ago and is inadequate for our current ICT-based society: the peer review system encourages conservatism, journal publications are monolithic and slow, data is often not available to other scientists, and the independent validation of results is limited. The resulting scientific process is hence slow and sloppy. Building on the Innovation Accelerator paper by Helbing and Balietti [1], this paper takes the initial global vision and reviews the theoretical and technological building blocks that can be used for implementing an innovation (in the first place: science) accelerator platform driven by re-imagining the science system. The envisioned platform would rest on four pillars: (i) Redesign the incentive scheme to reduce behavior such as conservatism, herding and hyping; (ii) Advance scientific publications by breaking up the monolithic paper unit and introducing other building blocks such as data, tools, experiment workflows, resources; (iii) Use machine readable semantics for publications, debate structures, provenance etc. in order to include the computer as a partner in the scientific process, and (iv) Build an online platform for collaboration, including a network of trust and reputation among the different types of stakeholders in the scientific system: scientists, educators, funding agencies, policy makers, students and industrial innovators among others. Any such improvements to the scientific system must support the entire scientific process (unlike current tools that chop up the scientific process into disconnected pieces), must facilitate and encourage collaboration and interdisciplinarity (again unlike current tools), must facilitate the inclusion of intelligent computing in the scientific process, and must not only facilitate the core scientific process but also accommodate other stakeholders such as science policy makers, industrial innovators, and the general public. We first describe the current state of the scientific system together with up to a dozen new key initiatives, including an analysis of the role of science as an innovation accelerator. Our brief survey will show that there exist many separate ideas and concepts and diverse stand-alone demonstrator systems for different components of the ecosystem, with many parts still unexplored and overall integration lacking. By analyzing a matrix of stakeholders vs. functionalities, we identify the required innovations. We (non-exhaustively) discuss a few of them: Publications that are meaningful to machines, innovative reviewing processes, data publication, workflow archiving and reuse, alternative impact metrics, tools for the detection of trends, community formation and emergence, as well as modular publications, citation objects and debate graphs. To summarize, the core idea behind the Innovation Accelerator is to develop new incentive models, rules, and interaction mechanisms to stimulate true innovation, revolutionizing the way in which we create knowledge and disseminate information.

  8. Interactive visualization of multi-data-set Rietveld analyses using Cinema:Debye-Scherrer.

    PubMed

    Vogel, Sven C; Biwer, Chris M; Rogers, David H; Ahrens, James P; Hackenberg, Robert E; Onken, Drew; Zhang, Jianzhong

    2018-06-01

    A tool named Cinema:Debye-Scherrer to visualize the results of a series of Rietveld analyses is presented. The multi-axis visualization of the high-dimensional data sets resulting from powder diffraction analyses allows identification of analysis problems, prediction of suitable starting values, identification of gaps in the experimental parameter space and acceleration of scientific insight from the experimental data. The tool is demonstrated with analysis results from 59 U-Nb alloy samples with different compositions, annealing times and annealing temperatures as well as with a high-temperature study of the crystal structure of CsPbBr3. A script to extract parameters from a series of Rietveld analyses employing the widely used GSAS Rietveld software is also described. Both software tools are available for download.
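
    As a rough stand-in for the multi-axis view described above (this is not the Cinema:Debye-Scherrer tool itself, and the column names and values are invented), a table of per-sample Rietveld results can be explored with a parallel-coordinates plot:

```python
import matplotlib.pyplot as plt
import pandas as pd
from pandas.plotting import parallel_coordinates

# Hypothetical per-sample refinement results (columns are illustrative only).
results = pd.DataFrame({
    "sample":       ["U-6Nb-a", "U-6Nb-b", "U-8Nb-a", "U-8Nb-b"],
    "Nb_wt_pct":    [6.0, 6.0, 8.0, 8.0],
    "anneal_hours": [1, 10, 1, 10],
    "lattice_a_A":  [3.520, 3.524, 3.528, 3.531],
    "Rwp":          [6.2, 5.8, 7.1, 6.5],
})

# One axis per refined parameter, one polyline per Rietveld analysis.
parallel_coordinates(results, class_column="sample", colormap="viridis")
plt.tight_layout()
plt.savefig("rietveld_parallel_coordinates.png")
```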

  9. Interactive visualization of multi-data-set Rietveld analyses using Cinema:Debye-Scherrer

    PubMed Central

    Biwer, Chris M.; Rogers, David H.; Ahrens, James P.; Hackenberg, Robert E.; Onken, Drew; Zhang, Jianzhong

    2018-01-01

    A tool named Cinema:Debye-Scherrer to visualize the results of a series of Rietveld analyses is presented. The multi-axis visualization of the high-dimensional data sets resulting from powder diffraction analyses allows identification of analysis problems, prediction of suitable starting values, identification of gaps in the experimental parameter space and acceleration of scientific insight from the experimental data. The tool is demonstrated with analysis results from 59 U–Nb alloy samples with different compositions, annealing times and annealing temperatures as well as with a high-temperature study of the crystal structure of CsPbBr3. A script to extract parameters from a series of Rietveld analyses employing the widely used GSAS Rietveld software is also described. Both software tools are available for download. PMID:29896062

  10. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2014-02-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.
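
    As a purely illustrative sketch of machine scoring of short written explanations (this is not the tool evaluated in the study, and the tiny training set is invented), a bag-of-words classifier can be trained on human-scored responses and applied to new ones:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented mini training set: student explanations labeled by a human rater as
# containing a normative key concept (1) or only naive ideas (0).
explanations = [
    "individuals with the trait survived and reproduced more",
    "the population changed because of differential survival and reproduction",
    "the animals needed to change so they grew longer necks",
    "they adapted because they wanted to reach the food",
]
labels = [1, 1, 0, 0]

scorer = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
scorer.fit(explanations, labels)

new_response = ["organisms that reproduce more pass the trait to their offspring"]
print(scorer.predict(new_response), scorer.predict_proba(new_response))
```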

  11. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  12. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  13. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  14. Supporting Beginning Teacher Planning and Enactment of Investigation-based Science Discussions: The Design and Use of Tools within Practice-based Teacher Education

    NASA Astrophysics Data System (ADS)

    Kademian, Sylvie M.

    Current reform efforts prioritize science instruction that provides opportunities for students to engage in productive talk about scientific phenomena. Given the challenges teachers face enacting instruction that integrates science practices and science content, beginning teachers need support to develop the knowledge and teaching practices required to teach reform-oriented science lessons. Practice-based teacher education shows potential for supporting beginning teachers while they are learning to teach in this way. However, little is known about how beginning elementary teachers draw upon the types of support and tools associated with practice-based teacher education to learn to successfully enact this type of instruction. This dissertation addresses this gap by investigating how a practice-based science methods course using a suite of teacher educator-provided tools can support beginning teachers' planning and enactment of investigation-based science lessons. Using qualitative case study methodologies, this study drew on video-records, lesson plans, class assignments, and surveys from one cohort of 22 pre-service teachers (called interns in this study) enrolled in a year-long elementary education master of the arts and teaching certification program. Six focal interns were also interviewed at multiple time-points during the methods course. Similarities existed across the types of tools and teaching practices interns used most frequently to plan and enact investigation-based discussions. For the focal interns, use of four synergistic teaching practices throughout the lesson enactments (including consideration of students' initial ideas; use of open-ended questions to elicit, extend, and challenge ideas; connecting across students' ideas and the disciplinary core ideas; and use of a representation to organize and highlight students' ideas) appeared to lead to increased opportunities for students to share their ideas and engage in data analysis, argumentation and explanation construction. Student opportunities to engage in practices that prioritize scientific discourse also occurred when interns were using dialogic voice and the tools designed to foster development of teacher knowledge for facilitating investigation-based science discussions. However, several intern characteristics likely moderated or mediated intern use of tools, dialogic voice, and productive teaching practices to capitalize on student contributions. These characteristics included intern knowledge of the science content and practices and initial beliefs about science teaching. Missed opportunities to use a combination of several teaching practices and tools designed to foster the development of knowledge for science teaching resulted in fewer opportunities for students to engage in data analysis, argumentation based on evidence, and construction of scientific explanations. These findings highlight the potential of teacher-educator provided tools for supporting beginning teachers in learning to facilitate investigation-based discussions that capitalize on student contributions. These findings also help the field conceptualize how beginning teachers use tools and teaching practices to plan and enact investigation-based science lessons, and how intern characteristics relate to tool use and planned and enacted lessons. 
By analyzing the investigation-based science lessons holistically, this study begins to unpack the complexities of facilitating investigation-based discussions including the interplay between intern characteristics and tool use, and the ways intern engagement in synergistic teaching practices provide opportunities for students to engage in data analysis, explanation construction, and argumentation. This study also describes methodological implications for this type of whole-lesson analysis and comments on the need for further research investigating beginning teachers' use of tools over time. Finally, I propose the need for iterative design of scaffolds to further support beginning teacher facilitation of investigation-based science lessons.

  15. Thermal performance modeling of NASA's scientific balloons

    NASA Astrophysics Data System (ADS)

    Franco, H.; Cathey, H.

    The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses using either a bulk thermal model approach or simplified representations of the balloon. These approaches to date have provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental analysis stages of development to assess the accuracy of the tool and the required model resolution to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes, and expanded the number of nodes to determine required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped Zero Pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the Zero Pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. The development of the radiative environment and program input files, the development of the modeling techniques for balloons, and the development of appropriate data output handling techniques for both the raw data and data plots will be discussed. A general guideline to match predicted balloon performance with known flight data will also be presented. One long-term goal of this effort is to develop simplified approaches and techniques to include results in performance codes being developed.

  16. A Proposal: Modification for Instruments and Tools Used in the Science Laboratory Setting for Students with Disabilities

    ERIC Educational Resources Information Center

    Kogan, Denis

    2015-01-01

    The purpose of this action research proposal is to create a Modification of Instruments and Tools in Science (MITS) program to address the need for providing Students With Disabilities (SWDs) appropriate access to scientific tools and techniques of scientific inquiry. This proposal contains a review of literature on SWDs, differentiating…

  17. Radio Astronomy Tools in Python: Spectral-cube, pvextractor, and more

    NASA Astrophysics Data System (ADS)

    Ginsburg, A.; Robitaille, T.; Beaumont, C.; Rosolowsky, E.; Leroy, A.; Brogan, C.; Hunter, T.; Teuben, P.; Brisbin, D.

    2015-12-01

    The radio-astro-tools organization has been established to facilitate development of radio and millimeter analysis tools by the scientific community. The first packages developed under its umbrella are: • The spectral-cube package, for reading, writing, and analyzing spectral data cubes • The pvextractor package, for extracting position-velocity slices from position-position-velocity cubes along arbitrary paths • The radio-beam package, to handle Gaussian beams in the context of the astropy quantity and unit framework • casa-python, to enable installation of these packages, and any others, into users' CASA environments without conflicting with the underlying CASA package. Community input in the form of code contributions, suggestions, questions and comments is welcome on all of these tools. They can all be found at http://radio-astro-tools.github.io.
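
    For readers unfamiliar with these packages, a minimal usage sketch follows; the file name, pixel coordinates, and slice width are placeholders.

      from spectral_cube import SpectralCube
      from pvextractor import Path, extract_pv_slice

      # Read a position-position-velocity cube from FITS (placeholder file name).
      cube = SpectralCube.read("cloud_cube.fits")

      # Integrated-intensity (moment 0) map over the spectral axis.
      mom0 = cube.moment(order=0)

      # Position-velocity slice along an arbitrary two-point path, averaged
      # over a swath 5 pixels wide perpendicular to the path.
      path = Path([(20.0, 25.0), (60.0, 70.0)], width=5)
      pvslice = extract_pv_slice(cube, path)
      pvslice.writeto("pv_slice.fits", overwrite=True)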

  18. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer (GALEX) spacecraft, archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data, as well as a suite of command-line tools to generate calibrated light curves, images, and movies from the MAST database.

  19. Understanding How Families Use Magnifiers during Nature Center Walks

    ERIC Educational Resources Information Center

    Zimmerman, Heather Toomey; McClain, Lucy Richardson; Crowl, Michele

    2013-01-01

    This analysis uses a sociocultural learning theory and parent-child interaction framework to understand families' interactions with one type of scientific tool, the magnifier, during nature walks offered by a nature center. Families were video recorded to observe how they organized their activities where they used magnifiers to explore in the…

  20. Open Simulation Laboratories [Guest editors' introduction]

    DOE PAGES

    Alexander, Francis J.; Meneveau, Charles

    2015-09-01

    In this introduction to the special issue on open simulation laboratories (OSLs), the guest editors describe how OSLs will become more common as their potential is better understood and they begin providing access to valuable datasets to much larger segments of the scientific community. Moreover, new analysis tools and ways to do science will inevitably develop as a result.

  1. Uncertainty analysis in ecological studies: an overview

    Treesearch

    Harbin Li; Jianguo Wu

    2006-01-01

    Large-scale simulation models are essential tools for scientific research and environmental decision-making because they can be used to synthesize knowledge, predict consequences of potential scenarios, and develop optimal solutions (Clark et al. 2001, Berk et al. 2002, Katz 2002). Modeling is often the only means of addressing complex environmental problems that occur...

  2. Organizational Approach to the Ergonomic Examination of E-Learning Modules

    ERIC Educational Resources Information Center

    Lavrov, Evgeniy; Kupenko, Olena; Lavryk, Tetiana; Barchenko, Natalia

    2013-01-01

    With a significant increase in the number of e-learning resources, the issue of quality is of current importance. An analysis of existing scientific and methodological literature shows the variety of approaches, methods and tools to evaluate e-learning materials. This paper proposes an approach based on the procedure for estimating parameters of…

  3. Online Reflections about Tinkering in Early Childhood: A Socio-Cultural Analysis

    ERIC Educational Resources Information Center

    Jane, Beverley

    2006-01-01

    Science education research predominantly shows that students improve their scientific understandings when they tinker with (pull apart) tools and simple household machines. In this study, the qualitative data collected took the form of online journal entries by final-year, female, primary teacher trainees, who reflected upon their early childhood…

  4. The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.

    PubMed

    Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall

    2017-02-01

    Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare a VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5-21%, with the highest VOI associated with jointly eliminating uncertainty about the reproductive speed of malaria-transmitting mosquitoes and the initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
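
    The VOI calculation used here has a standard Monte Carlo form: the expected value of perfect information (EVPI) is the gap between deciding after the uncertain parameters are revealed and committing to the best single policy beforehand. The numpy sketch below illustrates the computation with made-up payoff surfaces; the distributions and coefficients are not MDAST's.

      import numpy as np

      rng = np.random.default_rng(42)
      n_draws = 100_000

      # Two illustrative uncertain parameters (stand-ins for larviciding
      # efficacy and vector reproductive speed); distributions are assumed.
      efficacy = rng.beta(4, 6, n_draws)
      repro = rng.lognormal(0.0, 0.3, n_draws)

      # Net benefit of each policy per parameter draw (toy payoff surfaces).
      payoffs = np.column_stack([
          np.zeros(n_draws),            # status quo
          100 * efficacy - 30 * repro,  # larviciding programme
          60 - 20 * repro,              # alternative control policy
      ])

      value_with_info = payoffs.max(axis=1).mean()     # decide after seeing parameters
      value_without_info = payoffs.mean(axis=0).max()  # best fixed decision
      print(f"EVPI ~ {value_with_info - value_without_info:.2f} net-benefit units")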

  5. Citation analysis of publications of NASU mechanicians in the database of the Thomson Reuters Institute for Scientific Information

    NASA Astrophysics Data System (ADS)

    Guz, A. N.; Rushchitsky, J. J.

    2009-07-01

    The paper performs a citation analysis of publications of mechanicians of the National Academy of Sciences of Ukraine (NASU) based on information tools developed by the Thomson Reuters Institute for Scientific Information. Two groups of mechanicians are considered: representatives of the S. P. Timoshenko Institute of Mechanics of the NASU (NASU members, heads of departments) and members (academicians) of the NASU Division of Mechanics. Three elements of the Citation Report (Results Found, Citation Index (Sum of the Times Cited), h-index) are presented for each scientist. This paper may be considered a follow-up on the papers [6-11] published by Prikladnaya Mekhanika (International Applied Mechanics) in 2005-2009.
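
    The h-index reported for each scientist is straightforward to compute from a citation list: it is the largest h such that the author has h papers with at least h citations each. A short illustrative function:

      def h_index(citations):
          """Largest h such that h papers have at least h citations each."""
          h = 0
          for rank, c in enumerate(sorted(citations, reverse=True), start=1):
              if c >= rank:
                  h = rank
              else:
                  break
          return h

      # Example: five papers cited [10, 8, 5, 4, 3] times give h = 4.
      print(h_index([10, 8, 5, 4, 3]))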

  6. Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugent, Peter E.; Simonson, J. Michael

    2011-10-24

    This report is based on the Department of Energy (DOE) Workshop on “Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery” that was held at the Bethesda Marriott in Maryland on October 24-25, 2011. The workshop brought together leading researchers from the Basic Energy Sciences (BES) facilities and Advanced Scientific Computing Research (ASCR). The workshop was co-sponsored by these two Offices to identify opportunities and needs for data analysis, ownership, storage, mining, provenance and data transfer at light sources, neutron sources, microscopy centers and other facilities. Their charge was to identify current and anticipated issues in the acquisition, analysis, communication and storage of experimental data that could impact the progress of scientific discovery, ascertain what knowledge, methods and tools are needed to mitigate present and projected shortcomings, and create the foundation for information exchanges and collaboration between ASCR and BES supported researchers and facilities. The workshop was organized in the context of the impending data tsunami that will be produced by DOE’s BES facilities. Current facilities, like SLAC National Accelerator Laboratory’s Linac Coherent Light Source, can produce up to 18 terabytes (TB) per day, while upgraded detectors at Lawrence Berkeley National Laboratory’s Advanced Light Source will generate ~10 TB per hour. The expectation is that these rates will increase by over an order of magnitude in the coming decade. The urgency to develop new strategies and methods in order to stay ahead of this deluge and extract the most science from these facilities was recognized by all. The four focus areas addressed in this workshop were: Workflow Management - Experiment to Science: identifying and managing the data path from experiment to publication. Theory and Algorithms: recognizing the need for new tools for computation at scale, supporting large data sets and realistic theoretical models. Visualization and Analysis: supporting near-real-time feedback for experiment optimization and new ways to extract and communicate critical information from large data sets. Data Processing and Management: outlining needs in computational and communication approaches and infrastructure needed to handle unprecedented data volume and information content. It should be noted that almost all participants recognized that there were unlikely to be any turn-key solutions available, due to the unique, diverse nature of the BES community, where research at adjacent beamlines at a given light source facility often spans everything from biology to materials science to chemistry using scattering, imaging and/or spectroscopy. However, it was also noted that advances supported by other programs in data research, methodologies, and tool development could be implemented on reasonable time scales with modest effort. Adapting available standard file formats, robust workflows, and in-situ analysis tools for user facility needs could pay long-term dividends. Workshop participants assessed current requirements as well as future challenges and made the following recommendations in order to achieve the ultimate goal of enabling transformative science in current and future BES facilities: integrate theory and analysis components seamlessly within the experimental workflow; develop new algorithms for data analysis based on common data formats and toolsets; move the analysis closer to the experiment to enable real-time (in-situ) streaming capabilities, live visualization of the experiment and an increase of the overall experimental efficiency; match data management access and capabilities with advancements in detectors and sources; and remove bottlenecks, provide interoperability across different facilities/beamlines and apply forefront mathematical techniques to more efficiently extract science from the experiments. This workshop report examines and reviews the status of several BES facilities and highlights the successes and shortcomings of the current data and communication pathways for scientific discovery. It then ascertains what methods and tools are needed to mitigate present and projected data bottlenecks to science over the next 10 years. The goal of this report is to create the foundation for information exchanges and collaborations among ASCR and BES supported researchers, the BES scientific user facilities, and ASCR computing and networking facilities. To jumpstart these activities, there was a strong desire to see a joint effort between ASCR and BES along the lines of the highly successful Scientific Discovery through Advanced Computing (SciDAC) program, in which integrated teams of engineers, scientists and computer scientists would be engaged to tackle a complete end-to-end workflow solution at one or more beamlines and to ascertain what challenges will need to be addressed in order to handle future increases in data.

  7. A Unique Digital Electrocardiographic Repository for the Development of Quantitative Electrocardiography and Cardiac Safety: The Telemetric and Holter ECG Warehouse (THEW)

    PubMed Central

    Couderc, Jean-Philippe

    2010-01-01

    The sharing of scientific data reinforces open scientific inquiry; it encourages diversity of analysis and opinion while promoting new research and facilitating the education of the next generations of scientists. In this article, we present an initiative for the development of a repository containing continuous electrocardiographic information and the associated clinical information. This information is shared with the worldwide scientific community in order to improve quantitative electrocardiology and cardiac safety. First, we present the objectives of the initiative and its mission. Then, we describe the resources available in this initiative under three components: data, expertise and tools. The data available in the Telemetric and Holter ECG Warehouse (THEW) include continuous ECG signals and associated clinical information. The initiative attracted various academic and private partners whose expertise covers a wide range of research arenas related to quantitative electrocardiography; their contribution to the THEW promotes cross-fertilization of scientific knowledge, resources, and ideas that will advance the field of quantitative electrocardiography. Finally, the tools of the THEW include software and servers to access and review the data available in the repository. To conclude, the THEW is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. It is a new repository designed to complement existing ones such as PhysioNet, the AHA-BIH Arrhythmia Database, and the CSE database. The THEW hosts unique datasets from clinical trials and drug safety studies that, so far, were not available to the worldwide scientific community. PMID:20863512

  8. Data processing with Pymicra, the Python tool for Micrometeorological Analyses

    NASA Astrophysics Data System (ADS)

    Chor, T. L.; Dias, N. L.

    2017-12-01

    With the ever-increasing capability of instrumentation to collect high-frequency turbulence data, micrometeorological experiments are now generating significant amounts of data. Clearly, data processing -- and not data collection anymore -- has become the limiting factor for those very large data sets. The ability to extract useful scientific information from those experiments, therefore, hinges on tools that (i) are able to process those data effectively and accurately, (ii) are flexible enough to be adapted to the specific requirements of each investigation, and (iii) are robust enough to make data analysis easily reproducible across different large data sets. We have developed a framework for micrometeorological data analysis called Pymicra which delivers such capabilities while maintaining the investigator's proximity to the data. It is fully written in an open-source, very high-level language, Python, which has been gaining widespread acceptance as a scientific tool. It follows the philosophy of "not reinventing the wheel" and, as a result, relies on existing well-established open-source Python packages such as Numpy and Pandas. Thus, minimum effort is needed to program statistics, array processing, Fourier analysis, etc. Among the things that Pymicra does are reading and organizing data from virtually any format, applying common quality control procedures, extracting fluctuations in a number of ways, correcting for sensor drift, automatic calculation of fluid properties (such as air and dry air density), handling of units, calculation of cross-spectra, calculation of turbulent fluxes and scales, and all other features already provided by Pandas (interpolation, statistical tests, handling of missing data, etc.). Pymicra is freely available on GitHub, and its heavy use of high-level programming makes adding and modifying code considerably easy for any scientific programmer, making it straightforward for other scientists to contribute new functionality and point out room for improvement. Because of that, Pymicra is a candidate to become a community-developed code and to centralize part of the data processing aimed at micrometeorology.
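
    To make the flux-extraction step concrete, here is a minimal pandas sketch of the eddy-covariance computation that a package like Pymicra automates; it illustrates the underlying method rather than Pymicra's actual API, and the file, column names, and constants are assumed.

      import pandas as pd

      # High-frequency sonic-anemometer data; file and column names are assumed.
      df = pd.read_csv("sonic_20hz.csv", parse_dates=["time"], index_col="time")

      RHO_AIR = 1.2  # air density, kg m^-3 (held constant for the sketch)
      CP = 1005.0    # specific heat of air, J kg^-1 K^-1

      # Fluctuations w' and T' relative to 30-minute block means.
      grouper = pd.Grouper(freq="30min")
      fluct = df - df.groupby(grouper).transform("mean")

      # Sensible heat flux H = rho * cp * mean(w'T') in each averaging window.
      H = RHO_AIR * CP * (fluct["w"] * fluct["T"]).groupby(grouper).mean()
      print(H.head())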

   9. Scientific Story Telling & Social Media: The role of social media in effectively communicating science

    NASA Astrophysics Data System (ADS)

    Brinkhuis, D.; Peart, L.

    2012-12-01

    Scientific discourse generally takes place in appropriate journals, using the language and conventions of science. That's fine, as long as the discourse remains in scientific circles. It is only outside those circles that the rules and techniques of engaging social media tools gain importance. A young generation of scientists is eager to share their experiences by using social media, but is this effective? And how can we better integrate all outreach and media channels to engage general audiences? How can Facebook, Twitter, Skype and YouTube be used as synergy tools in scientific storytelling? Case: during IODP Expedition 342 (June-July 2012) onboard the scientific drillship JOIDES Resolution, an onboard educator and videographer worked non-stop for two months on an integrated outreach plan that tried and tested the limits of all social media tools available to interact with an international public while at sea. The results are spectacular!

  10. Publishing Platform for Scientific Software - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data, but also for the modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility, and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews, a blueprint for a scientific software publishing platform and a systematic implementation plan have been designed. In addition, the potential of journals, software repositories and persistent identifiers has been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions, but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge, a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. in earth sciences, climate research and the life sciences. Based on the developed blueprints, a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software-, science-, publishing-, and user-oriented features.

  11. Total Diet Studies as a Tool for Ensuring Food Safety

    PubMed Central

    Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung

    2015-01-01

    With the diversification and internationalization of the food industry and the increased focus on health from a majority of consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. Total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and analyze exposure trends to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: chemicals are analyzed in food in the form in which it will be consumed and it is cost-effective in analyzing composite samples after processing multiple ingredients together. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881

  12. Scientific visualization of volumetric radar cross section data

    NASA Astrophysics Data System (ADS)

    Wojszynski, Thomas G.

    1992-12-01

    For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statistically analyzed to evaluate such a profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool to interpret these data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing the statistical properties of the data. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low-observable air vehicles.

  13. CheS-Mapper - Chemical Space Mapping and Visualization in 3D.

    PubMed

    Gütlein, Martin; Karwath, Andreas; Kramer, Stefan

    2012-03-17

    Analyzing chemical datasets is a challenging task for scientific researchers in the field of chemoinformatics. It is important, yet difficult, to understand the relationship between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects. In that respect, visualization tools can help researchers better comprehend the underlying correlations. Our recently developed 3D molecular viewer CheS-Mapper (Chemical Space Mapper) divides large datasets into clusters of similar compounds and consequently arranges them in 3D space, such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, like structural fragments as well as quantitative chemical descriptors. These features can be highlighted within CheS-Mapper, which aids the chemist in better understanding patterns and regularities and relating the observations to established scientific knowledge. As a final function, the tool can also be used to select and export specific subsets of a given dataset for further analysis.
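
    CheS-Mapper itself is a Java application, but its cluster-then-embed idea can be sketched in a few lines of Python with scikit-learn; the random matrix below is a stand-in for computed chemical descriptors.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      descriptors = rng.normal(size=(500, 20))  # 500 compounds x 20 features (toy data)

      # Step 1: group similar compounds into clusters in feature space.
      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(descriptors)

      # Step 2: embed into 3D so that spatial proximity reflects similarity.
      coords_3d = PCA(n_components=3).fit_transform(descriptors)

      print(coords_3d.shape, np.bincount(labels))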

  14. CheS-Mapper - Chemical Space Mapping and Visualization in 3D

    PubMed Central

    2012-01-01

    Analyzing chemical datasets is a challenging task for scientific researchers in the field of chemoinformatics. It is important, yet difficult, to understand the relationship between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects. In that respect, visualization tools can help researchers better comprehend the underlying correlations. Our recently developed 3D molecular viewer CheS-Mapper (Chemical Space Mapper) divides large datasets into clusters of similar compounds and consequently arranges them in 3D space, such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, like structural fragments as well as quantitative chemical descriptors. These features can be highlighted within CheS-Mapper, which aids the chemist in better understanding patterns and regularities and relating the observations to established scientific knowledge. As a final function, the tool can also be used to select and export specific subsets of a given dataset for further analysis. PMID:22424447

  15. The Methods of Cognitive Visualization for the Astronomical Databases Analyzing Tools Development

    NASA Astrophysics Data System (ADS)

    Vitkovskiy, V.; Gorohov, V.

    2008-08-01

    There are two kinds of computer graphics: the illustrative and the cognitive. Appropriate cognitive pictures not only make the sense of complex and difficult scientific concepts evident and clear, but also, not infrequently, promote the birth of new knowledge. On the basis of the cognitive graphics concept, we developed a software system for visualization and analysis. It allows the researcher to train and sharpen intuition, raises interest and motivation for creative scientific cognition, and supports a process of dialogue with the problems themselves.

  16. Instrument control software requirement specification for Extremely Large Telescopes

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca

    2010-07-01

    Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELTs). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling languages such as SysML.

  17. Cybernetics and Cybernation

    ERIC Educational Resources Information Center

    Hilton, Alice Mary

    1973-01-01

    The use of cybernetics is shown to be a tool for bringing together various scientific fields and the social sciences via logic. Topics are discussed with application to help science teachers employ scientific knowledge and technical tools to build "a truly civilized world of abundance." (DF)

  18. Automatic sentence extraction for the detection of scientific paper relations

    NASA Astrophysics Data System (ADS)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, with paper relations identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
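
    A minimal version of the extraction step, pulling citation-bearing sentences out of raw text with regular expressions, might look like the sketch below; the citation-marker patterns are simplified assumptions, not the study's actual method.

      import re

      # Simplified markers: numeric brackets like [12] or author-year like (Smith et al., 2017).
      CITATION = re.compile(r"\[\d+(?:,\s*\d+)*\]|\([A-Z][A-Za-z-]+(?: et al\.)?,\s*\d{4}\)")
      SENTENCES = re.compile(r"(?<=[.!?])\s+")

      def citation_sentences(text):
          """Return sentences containing at least one citation marker."""
          return [s for s in SENTENCES.split(text) if CITATION.search(s)]

      sample = ("Prior work achieved 85% recall [3]. Our method differs. "
                "It extends the pipeline of (Smith et al., 2017) with new features.")
      print(citation_sentences(sample))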

  19. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

    The Planetary Science Archive (PSA) is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets, which are formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data as well as the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for the long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of the archive's content has been missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall allow anomalies in datasets to be tracked and their completeness to be controlled. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results and those interfacing with the PSA database. The validation software tool is a multi-mission tool designed to give the user the flexibility to define and implement various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.

  20. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  1. Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Graves, S. J.; Vernon, F.; Kerkez, B.; Chandra, C. V.; Keiser, K.; Martin, C.

    2014-12-01

    Access, utilization and management of real-time data continue to be challenging for decision makers, as well as researchers in several scientific fields. This presentation will highlight infrastructure aimed at addressing some of the gaps in handling real-time data, particularly in increasing the accessibility of these data to the scientific community through cloud services. The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) system addresses the ever-increasing importance of real-time scientific data, particularly in mission-critical scenarios, where informed decisions must be made rapidly. Advances in the distribution of real-time data are leading to many new transient phenomena in space-time being observed; however, real-time decision-making is infeasible in many cases that require streaming scientific data, as these data are locked down and sent only to proprietary in-house tools or displays. This lack of accessibility to the broader scientific community prohibits algorithm development and workflows initiated by these data streams. As part of NSF's EarthCube initiative, CHORDS proposes to make real-time data available to the academic community via cloud services. The CHORDS infrastructure will enhance the role of real-time data within the geosciences, specifically expanding the potential of streaming data sources in enabling adaptive experimentation and real-time hypothesis testing. Adherence to community data and metadata standards will promote the integration of CHORDS real-time data with existing standards-compliant analysis, visualization and modeling tools.

  2. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  3. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
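
    For scalar data on a regular grid, the extract-and-quantify step reduces to thresholding plus connected-component labeling, as in the scipy sketch below; the synthetic field and threshold are illustrative, and on unstructured grids (the article's focus) the neighbor relation comes from mesh connectivity rather than the voxel lattice.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(1)
      field = ndimage.gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=3)

      # Extract coherent regions where the scalar exceeds a threshold.
      mask = field > field.std()
      labels, n_regions = ndimage.label(mask)

      # Quantify each region (voxel count) for abstraction and tracking.
      sizes = np.asarray(ndimage.sum(mask, labels, index=np.arange(1, n_regions + 1)))
      print(f"{n_regions} regions; largest has {int(sizes.max())} voxels")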

  4. STEREO In-situ Data Analysis

    NASA Astrophysics Data System (ADS)

    Schroeder, P. C.; Luhmann, J. G.; Davis, A. J.; Russell, C. T.

    2007-05-01

    STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long-duration, detailed observations of 1 AU magnetic field structures, plasma and suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument takes plasma ion composition measurements, completing STEREO's comprehensive in-situ perspective. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment makes it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICMEs) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots has been developed which integrates STEREO's in-situ data with data from a variety of other missions including WIND and ACE. Static summary plots and a key-parameter type data set with a related online browser provide alternative data access. Finally, an application program interface (API) is provided allowing users to create custom software that ties directly into STEREO's data set. The API allows for more advanced forms of data mining than currently available through most web-based data services. A variety of data access techniques and the development of cross-spacecraft data analysis tools allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.

  5. An analysis of South African Grade 9 natural sciences textbooks for their representation of nature of science

    NASA Astrophysics Data System (ADS)

    Dewnarain Ramnarain, Umesh; Chanetsa, Tarisai

    2016-04-01

    This article reports on an analysis and comparison of three South African Grade 9 (13-14 years) Natural Sciences textbooks for their representation of the nature of science (NOS). The analysis was framed by an analytical tool developed and validated by Abd-El-Khalick and a team of researchers in a large-scale study of high school textbooks in the USA. The three textbooks were scored on targeted NOS aspects on a scale of -3 to +3 that reflected the explicitness with which these aspects were addressed. The analysis revealed that the textbooks poorly depict NOS; in particular, scant attention was given to the social dimension of science, science versus pseudoscience, and the 'myth of the scientific method'. These findings are at odds with the strong emphasis, in a reformed school science curriculum, on the need for learners to understand the scientific enterprise and how scientific knowledge develops. In view of this, the findings of this research reinforce the need for a review of the mandate given to textbook publishers and writers so that a stronger focus is placed on the development of materials that better represent the tenets of NOS.

  6. Applied mediation analyses: a review and tutorial.

    PubMed

    Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke; Galatius, Søren

    2017-01-01

    In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis, including the mathematical framework, the required assumptions, and software implementation in the R package medflex. All results are illustrated using a recent study on the causal pathways stemming from the early invasive treatment of acute coronary syndrome, for which the rich Danish population registers allow patients' medication use and more to be followed after discharge from hospital.
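
    The tutorial's software examples use the R package medflex; to keep this document's examples in one language, here is a product-of-coefficients sketch in Python with statsmodels on synthetic data, a deliberately simplified stand-in for the counterfactual machinery the tutorial covers.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 2_000
      treat = rng.integers(0, 2, n).astype(float)                  # exposure A
      mediator = 0.8 * treat + rng.normal(size=n)                  # M = a*A + noise
      outcome = 0.5 * treat + 1.2 * mediator + rng.normal(size=n)  # Y

      # Mediator model M ~ A estimates path coefficient a.
      a = sm.OLS(mediator, sm.add_constant(treat)).fit().params[1]

      # Outcome model Y ~ A + M estimates direct effect c' and path b.
      X = sm.add_constant(np.column_stack([treat, mediator]))
      params = sm.OLS(outcome, X).fit().params
      c_prime, b = params[1], params[2]

      print(f"direct effect ~ {c_prime:.2f}, indirect effect (a*b) ~ {a * b:.2f}")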

  7. Designing Summer Research Experiences for Teachers and Students That Promote Classroom Science Inquiry Projects and Produce Research Results

    NASA Astrophysics Data System (ADS)

    George, L. A.; Parra, J.; Rao, M.; Offerman, L.

    2007-12-01

    Research experiences for science teachers are an important mechanism for increasing classroom teachers' science content knowledge and facility with "real world" research processes. We have developed and implemented a summer scientific research and education workshop model for high school teachers and students which promotes classroom science inquiry projects and produces important research results supporting our overarching scientific agenda. The summer training includes development of a scientific research framework, design and implementation of preliminary studies, extensive field research and training in and access to instruments, measurement techniques and statistical tools. The development and writing of scientific papers is used to reinforce the scientific research process. Using these skills, participants collaborate with scientists to produce research quality data and analysis. Following the summer experience, teachers report increased incorporation of research inquiry in their classrooms and student participation in science fair projects. This workshop format was developed for an NSF Biocomplexity Research program focused on the interaction of urban climates, air quality and human response and can be easily adapted for other scientific research projects.

  8. Tools for observational gait analysis in patients with stroke: a systematic review.

    PubMed

    Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro

    2013-12-01

    Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.

  9. Scientific Visualization Tools for Enhancement of Undergraduate Research

    NASA Astrophysics Data System (ADS)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions, where the researcher can interactively change the scales in the three dimensions, color tables and degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher-order decisions based on large multidimensional sets of data while diminishing the level of frustration that results from dealing with the details of processing large data sets.

  10. Artificial intelligence support for scientific model-building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  11. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Application analysis is facilitated through a number of program profiling tools. The tools vary in their complexity, ease of deployment, design, and profiling detail. Specifically, understanding, analyzing, and optimizing are of particular importance for scientific applications, where minor changes in code paths and data-structure layout can have profound effects. Understanding how intricate data-structures are accessed and how a given memory system responds is a complex task. In this paper we describe a trace profiling tool, Glprof, specifically aimed at lessening the burden on the programmer to pin-point heavily involved data-structures during an application's run-time and understand data-structure run-time usage. Moreover, we showcase the tool's modularity using additional cache simulation components. We elaborate on the tool's design and features. Finally, we demonstrate the application of our tool in the context of SPEC benchmarks using the Glprof profiler and two concurrently running cache simulators, PPC440 and AMD Interlagos.

  13. Comparative Analysis of University-Government-Enterprise Co-Authorship Networks in Three Scientific Domains in the Region of Madrid

    ERIC Educational Resources Information Center

    Olmeda-Gomez, Carlos; Perianes-Rodriguez, Antonio; Ovalle-Perandones, Maria Antonia; Moya-Anegon, Felix

    2008-01-01

    Introduction: In an economy geared to innovation and competitiveness in research and development activities, inter-relationships between the university, private enterprise and government are of considerable interest. Networking constitutes a priority strategy to attain this strategic objective and a tool in knowledge-based economies. Method:…

  14. Alzheimer's Disease under Scrutiny: Short Newspaper Articles as a Case Study Tool.

    ERIC Educational Resources Information Center

    Hudecki, Michael S.

    2001-01-01

    After reading a newspaper article on Alzheimer disease, an incurable medical problem involving gradual and debilitating loss of memory, students examine the key elements of the scientific method as conveyed in the story. During their analysis students explore the workings of the nervous system and consider the role of animal model systems in…

  15. Parallel Index and Query for Large Scale Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50 TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
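
    The query pattern FastQuery accelerates (selecting a handful of interesting particles out of billions by a value predicate) can be sketched with h5py and numpy; the file layout, variable names, and units below are assumptions, and FastQuery's contribution is answering such predicates from bitmap indexes in parallel rather than scanning.

      import h5py
      import numpy as np

      # Hypothetical particle file with a 1D 'energy' array (units assumed: eV).
      with h5py.File("particles.h5", "r") as f:
          energy = f["energy"]
          hits = []
          chunk = 10_000_000
          # Chunked full scan; an indexed system avoids reading most of this.
          for start in range(0, energy.shape[0], chunk):
              block = energy[start:start + chunk]
              (idx,) = np.nonzero(block > 1.0e9)  # predicate: energy > 1 GeV
              hits.append(idx + start)
          selected = np.concatenate(hits) if hits else np.empty(0, dtype=np.int64)

      print(f"{selected.size} particles satisfy the predicate")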

  16. Conservation genomics of natural and managed populations: building a conceptual and practical framework.

    PubMed

    Benestan, Laura Marilyn; Ferchaud, Anne-Laure; Hohenlohe, Paul A; Garner, Brittany A; Naylor, Gavin J P; Baums, Iliana Brigitta; Schwartz, Michael K; Kelley, Joanna L; Luikart, Gordon

    2016-07-01

    The boom of massive parallel sequencing (MPS) technology and its applications in the conservation of natural and managed populations brings new opportunities and challenges for the scientific questions that can be addressed. Conservation genomics offers a wide range of approaches and analytical techniques, with their respective strengths and weaknesses, that rely on several implicit assumptions. However, finding the most suitable approaches and analyses for our scientific question is often difficult and time-consuming. To address this gap, a recent workshop entitled 'ConGen 2015' was held at the University of Montana in order to bring together the knowledge accumulated in this field and to provide training in conceptual and practical aspects of data analysis applied to the field of conservation and evolutionary genomics. Here, we summarize the expertise shared by each instructor, which leads us to consider the importance of keeping the scientific question in mind from sampling to management practices, along with the selection of appropriate genomics tools and attention to bioinformatics challenges. © 2016 John Wiley & Sons Ltd.

  17. Scientific Benchmarks for Guiding Macromolecular Energy Function Improvement

    PubMed Central

    Leaver-Fay, Andrew; O’Meara, Matthew J.; Tyka, Mike; Jacak, Ron; Song, Yifan; Kellogg, Elizabeth H.; Thompson, James; Davis, Ian W.; Pache, Roland A.; Lyskov, Sergey; Gray, Jeffrey J.; Kortemme, Tanja; Richardson, Jane S.; Havranek, James J.; Snoeyink, Jack; Baker, David; Kuhlman, Brian

    2013-01-01

    Accurate energy functions are critical to macromolecular modeling and design. We describe new tools for identifying inaccuracies in energy functions and guiding their improvement, and illustrate the application of these tools to improvement of the Rosetta energy function. The feature analysis tool identifies discrepancies between structures deposited in the PDB and low-energy structures generated by Rosetta; these likely arise from inaccuracies in the energy function. The optE tool optimizes the weights on the different components of the energy function by maximizing the recapitulation of a wide range of experimental observations. We use the tools to examine three proposed modifications to the Rosetta energy function: improving the unfolded-state energy model (reference energies), using bicubic spline interpolation to generate knowledge-based torsional potentials, and incorporating the recently developed Dunbrack 2010 rotamer library (Shapovalov and Dunbrack, 2011). PMID:23422428
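
    As a rough, hypothetical illustration of the weight-fitting idea behind optE (ordinary least squares stands in here for the paper's maximization of recapitulation over many kinds of observations), one can fit weights on unweighted energy-term columns so that their weighted sum reproduces target energies:

        import numpy as np

        # Hypothetical data: rows are conformations, columns are unweighted
        # energy-term values (e.g., van der Waals, solvation, torsion).
        rng = np.random.default_rng(0)
        terms = rng.normal(size=(200, 3))
        true_weights = np.array([1.0, 0.55, 0.8])
        targets = terms @ true_weights + rng.normal(scale=0.05, size=200)

        # Fit the weights that best recapitulate the target energies.
        weights, *_ = np.linalg.lstsq(terms, targets, rcond=None)
        print(weights)   # should land near [1.0, 0.55, 0.8]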

  18. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    PubMed

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. The tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested the tool's impact on the task of protein-protein interaction (PPI) extraction: it improved the F-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
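
    A minimal sketch of the "shot-gun" idea, assuming sentences can be crudely chunked at punctuation and conjunctions (BioSimplify's actual model is far more linguistically informed):

        import itertools
        import re

        def shotgun_variants(sentence, max_drop=2):
            """Split a sentence into clause-like chunks and emit every version
            with up to max_drop of the optional chunks removed."""
            chunks = [c.strip() for c in re.split(r",|;| and ", sentence) if c.strip()]
            head, optional = chunks[0], chunks[1:]
            for k in range(min(max_drop, len(optional)) + 1):
                for kept in itertools.combinations(optional, len(optional) - k):
                    yield ", ".join((head,) + kept) + "."

        sentence = ("BioSimplify splits long sentences, recombines their parts, "
                    "and feeds each variant to a downstream extractor")
        for variant in shotgun_variants(sentence):
            print(variant)

    Each simpler variant is then passed to the downstream extractor, so a relation missed in the original sentence may still be recovered from one of its variants -- the source of the recall gain reported above.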

  19. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    We present the method used to elaborate and formalize current scientific knowledge so as to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized recommendations for prevention or early screening. The approach suggested in this article is in line with medical procedures based on levels of evidence (evidence-based medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process involves several steps: critical analysis according to the evidence-based medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process; the delay between the arrival of new information and its application by the practitioner is thus reduced.

  20. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of the community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without an advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  1. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle: the Data System manages the datasets used by CMDA analysis tools; the Analysis System manages the CMDA analysis tools, which are all web services; the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history; and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable: new datasets can be added to the Data System and scanned to be visible to the other subsystems, and new analysis tools can be registered to become available in the Analysis System and Provenance System. With CMDA, a user starts a data analysis process by discovering datasets relevant to their research topic using the Recommendation System. The user can then customize the discovered datasets for their scientific use (e.g., anomaly calculation, regridding) with tools in the Analysis System, carry out the analysis with tools such as conditional sampling, time averaging, and spatial averaging, and reanalyze the datasets based on analysis provenance previously stored in the Provenance System. Further, they can publish their analysis process and results to the Provenance System to share with other users, and any user can reproduce a published analysis process and its results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.
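
    The provenance idea at the heart of the reanalysis and reproduction steps can be sketched in a few lines of Python (hypothetical names; CMDA's actual tools are web services backed by a provenance database):

        import datetime
        import json

        PROVENANCE = []

        def record_provenance(tool_name):
            """Record every analysis call so it can later be replayed."""
            def wrap(fn):
                def inner(**params):
                    PROVENANCE.append({
                        "tool": tool_name,
                        "params": params,
                        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                    })
                    return fn(**params)
                return inner
            return wrap

        @record_provenance("time_average")
        def time_average(dataset, start, end):
            return f"avg({dataset}, {start}--{end})"   # stand-in for real work

        time_average(dataset="AIRS_T_surf", start="2002-09", end="2010-08")
        print(json.dumps(PROVENANCE, indent=2))

        # Reanalysis/reproduction: replay the stored provenance records.
        for rec in list(PROVENANCE):
            time_average(**rec["params"])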

  2. Tools for Observation: Art and the Scientific Process

    NASA Astrophysics Data System (ADS)

    Pettit, E. C.; Coryell-Martin, M.; Maisch, K.

    2015-12-01

    Art can support the scientific process during different phases of a scientific discovery. Art can help explain and extend scientific concepts for the general public; in this way art is a powerful tool for communication. Art can aid the scientist in processing and interpreting data towards an understanding of concepts and processes; in this way art is a powerful - if often subconscious - tool to inform the process of discovery. Less often acknowledged, art can help engage students and inspire scientists during the initial development of ideas, observations, and questions; in this way art is a powerful tool to develop scientific questions and hypotheses. When we use art as a tool for communication of scientific discoveries, it helps break down barriers and makes science concepts less intimidating and more accessible and understandable for the learner. Scientists themselves use artistic concepts and processes - directly or indirectly - to help deepen their understanding. Teachers are following suit by using art more to stimulate students' creative thinking and problem solving. We show the value of teaching students to use the artistic "way of seeing" to develop their skills in observation, questioning, and critical thinking. In this way, art can be a powerful tool to engage students (from elementary to graduate) in the beginning phase of a scientific discovery, which is catalyzed by inquiry and curiosity. Through qualitative assessment of the Girls on Ice program, we show that many of the specific techniques taught by art teachers are valuable for science students as they develop their observation skills. In particular, the concepts of contour drawing, squinting, gesture drawing, inverted drawing, and others can provide valuable training for student scientists. These art techniques encourage students to let go of preconceptions and "see" the world (the "data") in new ways; they help students focus on both large-scale patterns and small-scale details.

  3. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
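
    The MapReduce pattern behind such a tool can be sketched in plain Python (a conceptual stand-in using a nearest-centroid learner, not the ICP: Data Mining Package itself): the map step trains one model per data shard, and the reduce step combines the per-shard models by majority vote:

        import numpy as np

        def map_train(X, y):
            """Map step: train a tiny nearest-centroid model on one shard."""
            return {int(c): X[y == c].mean(axis=0) for c in np.unique(y)}

        def reduce_vote(models, x):
            """Reduce step: majority vote across the per-shard models."""
            votes = [min(m, key=lambda c: np.linalg.norm(x - m[c])) for m in models]
            return max(set(votes), key=votes.count)

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (300, 2)), rng.normal(3, 1, (300, 2))])
        y = np.array([0] * 300 + [1] * 300)

        # Emulate the distributed split: shuffle, shard, "map" a model per shard.
        shards = np.array_split(rng.permutation(len(y)), 4)
        models = [map_train(X[s], y[s]) for s in shards]
        print(reduce_vote(models, np.array([2.8, 3.1])))   # expect class 1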

  4. The COSPAR Capacity Building Programme

    NASA Astrophysics Data System (ADS)

    Gabriel, C.

    2016-08-01

    The provision of scientific data archives and analysis tools by diverse institutions in the world represents a unique opportunity for the development of scientific activities. An example of this is the European Space Agency's space observatory XMM-Newton, with its Science Operations Centre at the European Space Astronomy Centre near Madrid, Spain. It provides, through its science archive and web pages, not only the raw and processed data from the mission, but also analysis tools and full documentation, greatly helping their dissemination and use. These data and tools, freely accessible to anyone in the world, are the practical elements around which COSPAR (COmmittee on SPAce Research) Capacity Building Workshops have been conceived, developed, and held for a decade and a half in developing countries. The Programme started with X-ray workshops but has since been broadened to the most diverse space science areas. The workshops help to develop science at the highest level in those countries, in a lasting and sustainable way, with a minimal investment (a computer plus a moderate Internet connection). In this paper we discuss the basis, concepts, and achievements of the Capacity Building Programme. Two instances of the Programme have already taken place in Argentina, one devoted to X-ray astronomy and another to infrared astronomy. Several others have been organised for the Latin American region (Brazil, Uruguay and Mexico) with a large participation of young investigators from Argentina.

  5. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is an important lever for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment's lifetime it is crucial that third-party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is started automatically, without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or of the nuclear modification factor in Pb-Pb collisions. The interface can easily be extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  6. SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Cavalli, Marco

    2018-02-01

    There is a growing call within the scientific community for solid theoretical frameworks and usable indices and models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can provide a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended for a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to users' requirements. Furthermore, with its easy-to-use interface and stand-alone character, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only for characterizing sediment dynamics at the catchment scale but also for integration with prediction models and as an aid to geomorphological interpretation.
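
    The index has the form IC = log10(Dup/Ddn), where Dup characterizes the upslope contributing area and Ddn the downslope flow path towards the target. A toy Python sketch following that form (real applications derive every term from a DEM, which this sketch does not attempt):

        import math

        def index_of_connectivity(w_up, s_up, area, path):
            """IC = log10(Dup / Ddn). Dup couples the mean weighting factor
            and slope of the upslope area with its size; Ddn accumulates the
            downslope path length, resisted by each cell's weight and slope."""
            d_up = w_up * s_up * math.sqrt(area)
            d_dn = sum(d / (w * s) for d, w, s in path)
            return math.log10(d_up / d_dn)

        # Downslope path: (length, weighting factor, slope) for each cell.
        path = [(5.0, 0.8, 0.20), (5.0, 0.6, 0.15), (7.0, 0.9, 0.30)]
        print(index_of_connectivity(w_up=0.7, s_up=0.25, area=2500.0, path=path))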

  7. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has since grown to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the server are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web form-based access and programmatic access to all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we successfully executed 239,814 jobs; the number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications via Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
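
    The wrapping idea can be sketched with Python's standard library alone (a bare-bones stand-in with no SOAP, job scheduling or security; echo stands in for a scientific binary):

        import subprocess
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import parse_qs, urlparse

        class WrapHandler(BaseHTTPRequestHandler):
            """Expose an unmodified command-line tool over HTTP."""
            def do_GET(self):
                qs = parse_qs(urlparse(self.path).query)
                args = qs.get("args", [""])[0].split()
                out = subprocess.run(["echo", *args], capture_output=True, text=True)
                self.send_response(200)
                self.end_headers()
                self.wfile.write(out.stdout.encode())

        # curl 'http://localhost:8000/?args=hello+world' returns the tool's
        # stdout; stop the server with Ctrl-C.
        HTTPServer(("localhost", 8000), WrapHandler).serve_forever()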

  8. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-02-01

    The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting and low level programming languages like R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  9. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  10. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    USGS Publications Warehouse

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  11. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  12. Fourth and fifth grade Latino(a) students making meaning of scientific informational texts

    NASA Astrophysics Data System (ADS)

    Croce, Keri-Anne

    Using a socio-psycholinguistic perspective of literacy and a social-semiotic analysis of texts, this study investigates how six students made meaning of informational texts. The students came to school from a variety of English and Spanish language backgrounds. The research question being asked was 'How do Latino(a) fourth and fifth grade students make meaning of English informational texts?' Miscue analysis was used as a tool to investigate how students who have been labeled non-struggling readers by their classroom teacher and are from various language backgrounds approached five informational texts. In order to investigate students' responses to the nature of informational texts, this dissertation draws on commonly occurring structures within texts. Primary data collected included read-alouds and retellings of five texts, retrospective miscue analysis, and interviews with the six participant students. Two of these participants are discussed within this dissertation. Secondary data included classroom observations and teacher interviews. This study proposes that non-native speakers may use scientific concept placeholders as they transact with informational texts. The use of scientific concept placeholders by a reader indicates that the reader is engaged in the meaning-making process and possesses evolving scientific knowledge about a phenomenon. The findings suggest that Latino(a) students' understandings of English informational texts are influenced not only by a student's language development but also by (1) the nature of the text; (2) the reading strategies that a student uses, such as the use of placeholders; and (3) the influence of the researcher during the aided retelling. This study contributes methodological tools for assessing English language learners' reading. The conclusions presented within this study also support the idea that students from a variety of language backgrounds slightly altered their reliance on certain cuing systems as they encountered various sub-genres within an informational text. I conclude that reading assessment should account for how a student approaches different structural elements of a text.

  13. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caschili, Simone, E-mail: s.caschili@ucl.ac.uk; De Montis, Andrea; Ganciu, Amedeo

    2014-07-01

    Academic literature has been growing continuously, at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at the macroscopic (network architecture), mesoscopic (subgraph) and microscopic (node) levels in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that SEA is a multidisciplinary subject; the SEABN belongs to the class of real small-world networks, with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors, while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
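
    The three levels of analysis map directly onto standard network-library calls, as in this small Python sketch with networkx (the node names are made up for illustration; they are not the SEABN data):

        import networkx as nx

        # Toy citation network: an edge u -> v means "u cites v".
        G = nx.DiGraph()
        G.add_edges_from([
            ("P1", "P2"), ("P3", "P1"), ("P4", "P1"),
            ("P5", "P3"), ("P3", "P4"),
        ])

        # Macroscopic level: overall architecture of the network.
        print(nx.density(G), nx.number_weakly_connected_components(G))

        # Microscopic level: most-cited publications by in-degree.
        print(sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)[:3])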

  14. Chandra Interactive Analysis of Observations (CIAO)

    NASA Technical Reports Server (NTRS)

    Dobrzycki, Adam

    2000-01-01

    The Chandra (formerly AXAF) telescope, launched on July 23, 1999, provides X-ray data with unprecedented spatial and spectral resolution. As part of the Chandra scientific support, the Chandra X-ray Observatory Center provides a new data analysis system, CIAO ("Chandra Interactive Analysis of Observations"). We will present the main components of the system: "First Look" analysis; SHERPA, a multi-dimensional, multi-mission modeling and fitting application; the Chandra Imaging and Plotting System; the Detect package of source detection algorithms; and the DM package of generic data manipulation tools. We will set up a demonstration of the portable version of the system and show examples of Chandra data analysis.

  15. Genetic Design Automation: engineering fantasy or scientific renewal?

    PubMed Central

    Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean

    2013-01-01

    Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068

  16. Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples

    NASA Astrophysics Data System (ADS)

    Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.

    2012-12-01

    The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool that can put numbers on, i.e. quantify, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.

  17. A Meta-analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    2014-02-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how pieces of qualitative and quantitative evidence can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations of the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. The paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.

  18. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains, such as climate, data are often n-dimensional and require tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. With regard to interoperability, the talk will present the contributions provided both to the RDA Working Group on Array Databases and to the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large-scale climate model intercomparison data analysis experiments: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as a server-side, parallel analytics engine; and (4) applied to real CMIP5 data sets available through ESGF.

  19. Data mart construction based on semantic annotation of scientific articles: A case study for the prioritization of drug targets.

    PubMed

    Teixeira, Marlon Amaro Coelho; Belloze, Kele Teixeira; Cavalcanti, Maria Cláudia; Silva-Junior, Floriano P

    2018-04-01

    Semantic text annotation enables the association of semantic information (ontology concepts) with text expressions (terms), making them readable by software agents. In the scientific scenario, this is particularly useful because it reveals scientific discoveries that are hidden within academic articles. The biomedical area has more than 300 ontologies, most of them composed of over 500 concepts. These ontologies can be used to annotate scientific papers and thus facilitate data extraction. However, in the context of a scientific research question, a simple keyword-based query using the interface of a digital scientific text library can return more than a thousand hits. The analysis of such a large set of texts, annotated with such numerous and large ontologies, is not an easy task. Therefore, the main objective of this work is to provide a method that facilitates this task. This work describes a method called Text and Ontology ETL (TOETL), to build an analytical view over such texts. First, a corpus of selected papers is semantically annotated using distinct ontologies. Then, the annotation data is extracted, organized and aggregated into the dimensional schema of a data mart. Besides the TOETL method, this work illustrates its application through the development of the TaP DM (Target Prioritization data mart). This data mart focuses on research into gene essentiality, a key concept to be considered when searching for genes showing potential as anti-infective drug targets. This work reveals that the proposed approach is a relevant tool to support decision making in the prioritization of new drug targets, being more efficient than traditional keyword-based tools. Copyright © 2018 Elsevier B.V. All rights reserved.
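
    A minimal sketch of the dimensional (star-schema) layout such a data mart can use, with hypothetical table and column names rather than TaP DM's actual schema:

        import sqlite3

        db = sqlite3.connect(":memory:")
        # One fact table of annotations, two dimensions (article, concept).
        db.executescript("""
            CREATE TABLE dim_article(article_id INTEGER PRIMARY KEY, pmid TEXT);
            CREATE TABLE dim_concept(concept_id INTEGER PRIMARY KEY,
                                     ontology TEXT, label TEXT);
            CREATE TABLE fact_annotation(article_id INT, concept_id INT,
                                         mentions INT);
        """)
        db.execute("INSERT INTO dim_article VALUES (1, 'PMID:123')")
        db.execute("INSERT INTO dim_concept VALUES (1, 'GO', 'gene essentiality')")
        db.execute("INSERT INTO fact_annotation VALUES (1, 1, 4)")

        # Analytical query: rank concepts by how often they are annotated.
        for row in db.execute("""
                SELECT c.label, SUM(f.mentions)
                FROM fact_annotation f JOIN dim_concept c USING (concept_id)
                GROUP BY c.label ORDER BY 2 DESC"""):
            print(row)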

  20. A DBMS architecture for global change research

    NASA Astrophysics Data System (ADS)

    Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.

    1993-08-01

    The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators that is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographical information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.

  1. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gateway to the research project's HPC system. Plugins are able to integrate their results, e.g. post-processed data, into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between the plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system; plugged-in tools thus benefit from transparency and reproducibility. Furthermore, if configurations match when an evaluation plugin is started, the system suggests reusing results already produced by other users - saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
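
    The plugin contract can be sketched as a small Python interface (hypothetical names, not Freva's actual API): each tool declares its parameters and a run method, and the framework records every configured invocation in its history database:

        from abc import ABC, abstractmethod

        class AnalysisPlugin(ABC):
            """Hypothetical Freva-style plugin contract."""
            parameters: dict = {}

            @abstractmethod
            def run(self, **config):
                ...

        class Anomaly(AnalysisPlugin):
            parameters = {"variable": "tas", "reference": "1961-1990"}

            def run(self, **config):
                cfg = {**self.parameters, **config}   # defaults + user choices
                print(f"computing {cfg['variable']} anomaly vs {cfg['reference']}")
                return cfg   # the framework would persist this configuration

        Anomaly().run(variable="pr")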

  2. ON IDENTIFIABILITY OF NONLINEAR ODE MODELS AND APPLICATIONS IN VIRAL DYNAMICS

    PubMed Central

    MIAO, HONGYU; XIA, XIAOHUA; PERELSON, ALAN S.; WU, HULIN

    2011-01-01

    Ordinary differential equations (ODE) are a powerful tool for modeling dynamic processes with wide applications in a variety of scientific fields. Over the last two decades, ODEs have also emerged as a prevailing tool in various biomedical research fields, especially in infectious disease modeling. In practice, it is important and necessary to determine unknown parameters in ODE models based on experimental data. Identifiability analysis is the first step in determining unknown parameters in ODE models, and such analysis techniques for nonlinear ODE models are still under development. In this article, we review identifiability analysis methodologies for nonlinear ODE models developed over the past two decades, including structural identifiability analysis, practical identifiability analysis and sensitivity-based identifiability analysis. Some advanced topics and ongoing research are also briefly reviewed. Finally, some examples from modeling viral dynamics of HIV, influenza and hepatitis viruses are given to illustrate how to apply these identifiability analysis methods in practice. PMID:21785515
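
    A textbook-style example of structural non-identifiability (not taken from the article): consider the model

        \[
          \dot{x}(t) = -\theta_1 \theta_2 \, x(t), \qquad
          y(t) = x(t), \qquad x(0) = x_0 .
        \]

    The output is $y(t) = x_0 e^{-\theta_1 \theta_2 t}$, so the data determine only the product $\theta_1 \theta_2$: the individual parameters $\theta_1$ and $\theta_2$ are structurally non-identifiable, while the reparametrization $\phi = \theta_1 \theta_2$ is identifiable. Structural identifiability analysis detects exactly this kind of redundancy before any data are fitted.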

  3. Methodology for fast detection of false sharing in threaded scientific codes

    DOEpatents

    Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang

    2014-11-25

    A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.

  4. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for handling and reducing Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance and to help avoid the data glut associated with high-throughput rapid analysis.

  5. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakala, Jacqueline Alexandra

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for handling and reducing Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance and to help avoid the data glut associated with high-throughput rapid analysis.

  6. Container-Based Clinical Solutions for Portable and Reproducible Image Analysis.

    PubMed

    Matelsky, Jordan; Kiar, Gregory; Johnson, Erik; Rivera, Corban; Toma, Michael; Gray-Roncal, William

    2018-05-08

    Medical imaging analysis depends on the reproducibility of complex computation. Linux containers enable the abstraction, installation, and configuration of environments so that software can be both distributed in self-contained images and used repeatably by tool consumers. While several initiatives in neuroimaging have adopted approaches for creating and sharing more reliable scientific methods and findings, Linux containers are not yet mainstream in clinical settings. We explore related technologies and their efficacy in this setting, highlight important shortcomings, demonstrate a simple use-case, and endorse the use of Linux containers for medical image analysis.

  7. "Do-It-Ourselves Science": Case Studies of Volunteer-Initiated Citizen Science Involvement

    NASA Astrophysics Data System (ADS)

    Raddick, Jordan; Bracey, G.; Gay, P. L.

    2009-05-01

    Galaxy Zoo is a citizen science website in which members of the public volunteer to classify galaxies, thereby helping astronomers conduct publishable research into galaxy morphologies and environments. Although the site was originally created to answer a few specific questions, some members of the community - both scientists and volunteers - have spontaneously developed an interest in a wider variety of questions. Volunteers have pursued answers to these questions with guidance from professional astronomers; in completing these projects, volunteers have independently used some of the same data viewing and analysis tools that professional astronomers use, and have even developed their own online tools. They have created their own research questions and their own plans for data analysis, and are planning to write scientific papers with the results to be submitted to peer-reviewed scientific journals. Volunteers have identified a number of such projects. These volunteer-initiated projects have extended the scientific reach of Galaxy Zoo, while also giving volunteers first-hand experience with the process of science. We are interested in the process by which volunteers become interested in volunteer-initiated projects, and what tasks they participate in, both initially and as their involvement increases. What motivates a volunteer to become involved in a volunteer-initiated project? How does his or her motivation change with further involvement? We are conducting a program of qualitative education research into these questions, using as data sources the posts that volunteers have made to the Galaxy Zoo forum and transcripts of interviews with volunteers.

  8. Genetic analysis and molecular mapping of an Rf gene from Helianthus angustifolius for a new cytoplasmic male-sterile line

    USDA-ARS?s Scientific Manuscript database

    The combination of cytoplasmic male-sterile (CMS) lines and the corresponding fertility restoration genes (Rf) is a critical tool in large-scale hybrid seed production of sunflower. A new CMS line, 514A, derived from H. tuberosus / 7718B, was obtained from a scientific exchange with the Liaoning Academy of...

  9. Computational Omics Funding Opportunity | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) and the NVIDIA Foundation are pleased to announce funding opportunities in the fight against cancer. Each organization has launched a request for proposals (RFP) that will collectively fund up to $2 million to help to develop a new generation of data-intensive scientific tools to find new ways to treat cancer.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ucilia

    This report has the following articles: (1) Deconstructing Microbes--metagenomic research on bugs in termites relies on new data analysis tools; (2) Popular Science--a nanomaterial research paper in Nano Letters drew strong interest from the scientific community; (3) Direct Approach--researchers employ an algorithm to solve an energy-reduction issue essential in describing complex physical systems; and (4) SciDAC Special--a science journal features research on petascale enabling technologies.

  11. Intentions and actions in molecular self-assembly: perspectives on students' language use

    NASA Astrophysics Data System (ADS)

    Höst, Gunnar E.; Anward, Jan

    2017-04-01

    Learning to talk science is an important aspect of learning to do science. Given that scientists' language frequently includes intentions and purposes in explanations of unobservable objects and events, teachers must interpret whether learners' use of such language reflects a scientific understanding or inaccurate anthropomorphism and teleology. In the present study, a framework consisting of three 'stances' (Dennett, 1987) - intentional, design and physical - is presented as a powerful tool for analysing students' language use. The aim was to investigate how the framework can be differentiated and used analytically for interpreting students' talk about a molecular process. Semi-structured group discussions and individual interviews about the molecular self-assembly process were conducted with engineering biology/chemistry students (n = 15) and biology/chemistry teacher students (n = 6). Qualitative content analysis of the transcripts showed that all three stances were employed by students. The analysis also identified subcategories for each stance, and revealed that intentional language with respect to molecular movement and assumptions about design requirements may be potentially problematic areas. Students' exclusion of physical-stance explanations may indicate literal anthropomorphic interpretations. Implications for practice include providing teachers with a tool for scaffolding their use of metaphorical language and for supporting students' metacognitive development as scientific language users.

  12. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  13. Scientists and Public: Is the Information Flow Direction Starting to Change?

    NASA Astrophysics Data System (ADS)

    Diaz-Doce, D.; Bee, E. J.; Bell, P. D.; Marchant, A. P.; Reay, S.; Richardson, S. L.; Shelley, W. A.

    2014-12-01

    Over half of the population of the UK own a smartphone, and about the same number of people use social media such as Twitter. For the British Geological Survey (BGS) this means millions of potential reporters of real-time events and in-the-field data capturers, creating a new source of scientific information that could help to better understand and predict natural processes. BGS first started collecting citizen data, using crowd-sourcing, through websites and smartphone apps focused on gathering geology-related information (e.g. mySoil and myVolcano). These tools ask volunteers to follow a guided form where they can upload data related to geology and geological events, including location, description, measurements, photos, videos, or even instructions on sending physical samples. This information is used to augment existing data collections. Social media provides a different channel for gathering useful scientific information from the public. BGS is starting to explore this route with the release of GeoSocial-Aurora, a web mapping tool that searches for tweets related to aurora sightings and locates them as markers on a map. Users are actively encouraged to contribute by sending tweets about aurora sightings in a specific format, which contains the #BGSaurora hashtag, the location of the sighting, and any comments or pictures. The tool harvests these tweets through the Twitter REST API and places them on the map, enabling the user to generate clusters and heatmaps. GeoSocial-Aurora provides scientists with a potential tool for gathering useful data for scientific analysis. It collects actual aurora sighting locations, enabling users to check where the aurora is taking place in real time. This may, in time, help scientists to improve future predictions of when and where auroras are visible.

  14. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840

  15. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
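
    As a concrete illustration of the non-intrusive propagation methods reviewed above, a minimal Monte Carlo sketch in Python. The closed-form cantilever deflection formula is a hypothetical stand-in for a finite element solve, and the input distributions are illustrative; in practice each sample would trigger a full FE simulation.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000

        # Uncertain inputs (assumed distributions): stiffness E [Pa] and load P [N]
        E = rng.normal(loc=17e9, scale=1.5e9, size=N)
        P = rng.lognormal(mean=np.log(500), sigma=0.1, size=N)

        L, I = 0.05, 2e-9  # fixed geometry: length [m], second moment of area [m^4]

        def model(E, P):
            """Cantilever tip deflection: the surrogate for the FE model."""
            return P * L**3 / (3 * E * I)

        u = model(E, P)
        print(f"mean deflection = {u.mean():.3e} m, std = {u.std():.3e} m")
        print(f"95% interval    = {np.percentile(u, [2.5, 97.5])}")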

  16. C3: A Collaborative Web Framework for NASA Earth Exchange

    NASA Astrophysics Data System (ADS)

    Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.

    2010-12-01

    The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an on-line collaborative environment for sharing of Earth science models, data, analysis tools and scientific results by researchers. In addition, the NEX portal also serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic area of interest, field of study, etc. Features of the NEX web portal include: member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs) and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov (one of NASA's first science social media websites). The core component of the web portal is the C3 framework, which was built using Django and is being deployed as a common framework for a number of collaborative sites throughout NASA.
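
    For illustration, a minimal sketch of how a Django-based collaboration framework of this kind might model shared resources and comments. The model names and fields below are assumptions for illustration, not the actual NEX/C3 schema.

        from django.contrib.auth.models import User
        from django.db import models

        class Resource(models.Model):
            """A shared item: dataset, algorithm, model, or publication."""
            KINDS = [("data", "Dataset"), ("algo", "Algorithm"),
                     ("model", "Model"), ("pub", "Publication")]
            owner = models.ForeignKey(User, on_delete=models.CASCADE)
            kind = models.CharField(max_length=8, choices=KINDS)
            title = models.CharField(max_length=200)
            tags = models.CharField(max_length=200, blank=True)  # social tagging

        class Comment(models.Model):
            """Communication tool: comments attached to a shared resource."""
            resource = models.ForeignKey(Resource, on_delete=models.CASCADE)
            author = models.ForeignKey(User, on_delete=models.CASCADE)
            body = models.TextField()
            created = models.DateTimeField(auto_now_add=True)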

  17. Geospatial Service Platform for Education and Research

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.

    2014-04-01

    We propose to advance the scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held in the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as concrete examples of how to build service chains for different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
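
    For illustration, a minimal Python sketch of calling an OGC Web Processing Service of the kind used for geoprocessing service chaining. The endpoint, process identifier and input are hypothetical; the request parameters follow the standard WPS 1.0.0 key-value encoding.

        import requests

        WPS = "http://example.org/geoplatform/wps"  # hypothetical endpoint

        # 1. Discover the geoprocessing services offered by the platform
        caps = requests.get(WPS, params={
            "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"})
        print(caps.status_code)

        # 2. Execute one process; its output could feed the next service in a chain
        run = requests.get(WPS, params={
            "service": "WPS", "version": "1.0.0", "request": "Execute",
            "identifier": "gs:Slope",                       # hypothetical process
            "datainputs": "dem=http://example.org/dem.tif"  # hypothetical input
        })
        print(run.status_code)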

  18. Development of high resolution NMR spectroscopy as a structural tool

    NASA Astrophysics Data System (ADS)

    Feeney, James

    1992-06-01

    The discovery of the nuclear magnetic resonance (NMR) phenomenon and its development and exploitation as a scientific tool provide an excellent basis for a case-study for examining the factors which control the evolution of scientific techniques. Since the detection of the NMR phenomenon and the subsequent rapid discovery of all the important NMR spectral parameters in the late 1940s, the method has emerged as one of the most powerful techniques for determining structures of molecules in solution and for analysis of complex mixtures. The method has made a dramatic impact on the development of structural chemistry over the last 30 years and is now one of the key techniques in this area. Support for NMR instrumentation attracts a dominant slice of public funding in most scientifically developed countries. The technique is an excellent example of how instrumentation and technology have revolutionised structural chemistry and it is worth exploring how it has been developed so successfully. Clearly its wide range of application and the relatively direct connection between the NMR data and molecular structure has created a major market for the instrumentation. This has provided several competing manufacturers with the incentive to develop better and better instruments. Understanding the complexity of the basics of NMR spectroscopy has been an ongoing challenge attracting the attention of physicists. The well-organised specialist NMR literature and regular scientific meetings have ensured rapid exploitation of any theoretical advances that have a practical relevance. In parallel, the commercial development of the technology has allowed the fruits of such theoretical advances to be enjoyed by the wider scientific community.

  19. Developing focused wellness programs: using concept analysis to increase business value.

    PubMed

    Byczek, Lance; Kalina, Christine M; Levin, Pamela F

    2003-09-01

    Concept analysis is a useful tool for bringing clarity to an abstract idea as well as providing an objective basis for developing wellness program products, goals, and outcomes. To plan for and develop successful wellness programs, it is critical for occupational health nurses to clearly understand a program concept as applied to a particular community or population. Occupational health nurses can use the outcome measures resulting from the concept analysis process to help demonstrate the business value of their wellness programs. This concept analysis demonstrates a predominance of the performance-related attributes of fitness in the scientific literature.

  20. Reprint of "Citation analysis as a measure of article quality, journal influence and individual researcher performance".

    PubMed

    Nightingale, Julie M; Marshall, Gill

    2013-09-01

    The research-related performance of universities, as well as that of individual researchers, is increasingly evaluated through the use of objective measures, or metrics, which seek to support or in some cases even replace more traditional methods of peer review. In particular there is a growing awareness in research communities, government organisations and funding bodies around the concept of using evaluation metrics to analyse research citations. The tools available for 'citation analysis' are many and varied, enabling a quantification of scientific quality, academic impact and prestige. However there is increasing concern regarding the potential misuse of such tools, which have limitations in certain research disciplines. This article uses 'real world' examples from radiography research and scholarship to illustrate the range of currently available citation analysis tools. It explores the academic debate surrounding their strengths and limitations, and identifies the potential impact of citation analysis on the radiography research community. The article concludes that citation analysis is a valuable tool for researchers to use for personal reflection and research planning, yet there are inherent dangers if it is used inappropriately. Whilst citation analysis can give objective information regarding an individual, research group, journal or higher education institution, it should not be used as a total substitute for traditional qualitative review and peer assessment. Copyright © 2013 Elsevier Ltd. All rights reserved.
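
    For illustration, one metric underlying such tools, the h-index (the largest h such that at least h of an author's papers have at least h citations each), in a minimal Python sketch with illustrative data:

        def h_index(citations):
            """Largest h such that h papers have >= h citations each."""
            h = 0
            for rank, c in enumerate(sorted(citations, reverse=True), start=1):
                if c >= rank:
                    h = rank
                else:
                    break
            return h

        papers = [42, 18, 12, 9, 7, 7, 3, 2, 1, 0]  # citation counts per paper
        print(h_index(papers))  # -> 6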

  1. Databases and Web Tools for Cancer Genomics Study

    PubMed Central

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-01-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repositories and analysis tools, and we hope this introduction will raise awareness and facilitate the use of these resources in the cancer research community. PMID:25707591

  2. SpecTracer: A Python-Based Interactive Solution for Echelle Spectra Reduction

    NASA Astrophysics Data System (ADS)

    Romero Matamala, Oscar Fernando; Petit, Véronique; Caballero-Nieves, Saida Maria

    2018-01-01

    SpecTracer is a newly developed interactive solution to reduce cross-dispersed echelle spectra. The use of widgets spares the user the steep learning curves of currently available reduction software. SpecTracer uses well-established image processing techniques based on IRAF to successfully extract the stellar spectra. Comparisons with other reduction software, like IRAF, show comparable results, with the added advantages of ease of use, platform independence and portability. This tool can produce meaningful scientific data and also serve as a training tool in the procedures of spectroscopic analysis, especially for undergraduates doing research.

  3. WASP (Write a Scientific Paper) using Excel - 4: Histograms.

    PubMed

    Grech, Victor

    2018-02-01

    Plotting data into graphs is a crucial step in data analysis as part of an initial descriptive statistics exercise since it gives the researcher an overview of the shape and nature of the data. Outlier values may also be identified, and these may be incorrect data, or true and important outliers. This paper explains how to access Microsoft Excel's Analysis Toolpak and provides some pointers for the utilisation of the histogram tool within the Toolpak. Copyright © 2018. Published by Elsevier B.V.
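
    For readers working outside Excel, the same binned frequency table can be cross-checked in a few lines of Python; the data and bin edges below are illustrative stand-ins for a worksheet column and the Toolpak's "bin range".

        import numpy as np

        data = [4.9, 5.6, 6.1, 6.3, 7.0, 7.2, 7.8, 8.4, 9.1, 12.5]
        bins = [4, 6, 8, 10, 12, 14]   # analogous to Excel's bin range

        freq, edges = np.histogram(data, bins=bins)
        for lo, hi, n in zip(edges[:-1], edges[1:], freq):
            print(f"{lo:>2}-{hi:<2} | {'#' * n} ({n})")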

  4. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
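
    For illustration, the kind of remote, subset-on-access pattern that makes archives of this scale usable: opening a dataset over OPeNDAP and pulling only a small slice across the network. The URL and variable name are hypothetical, OPeNDAP is only one of several access mechanisms such portals provide, and the netCDF4 library must be built with OPeNDAP support.

        from netCDF4 import Dataset

        url = "http://esg.example.org/thredds/dodsC/cmip/tas_monthly.nc"  # hypothetical
        ds = Dataset(url)

        tas = ds.variables["tas"]            # surface air temperature [K]
        subset = tas[0:12, 40:60, 100:140]   # one year, one regional window
        print(subset.mean())
        ds.close()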

  5. Fungal genome resources at NCBI.

    PubMed

    Robbertse, B; Tatusova, T

    2011-09-01

    The National Center for Biotechnology Information (NCBI) is well known for the nucleotide sequence archive GenBank and the sequence analysis tool BLAST. However, NCBI integrates many types of biomolecular data from a variety of sources and makes them available to the scientific community as interactive web resources as well as organized releases of bulk data. These tools are available to explore and compare fungal genomes. Searching all databases with Fungi [organism] at http://www.ncbi.nlm.nih.gov/ is the quickest way to find resources of interest with fungal entries. Some tools, though, are resource-specific and can be accessed indirectly from a particular database in the Entrez system. These include graphical viewers and comparative analysis tools such as TaxPlot, TaxMap and UniGene DDD (found via UniGene Homepage). Gene and BioProject pages also serve as portals to external data such as community annotation websites, BioGrid and UniProt. There are many different ways of accessing genomic data at NCBI. Depending on the focus and goal of research projects or the level of interest, a user would select a particular route for accessing genomic databases and resources. This review article describes methods of accessing fungal genome data and provides examples that illustrate the use of analysis tools.
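
    For illustration, the programmatic counterpart of the Fungi [organism] search described above, using Biopython's Entrez module; the email address is a placeholder (NCBI requires one) and the database choice is illustrative.

        from Bio import Entrez

        Entrez.email = "you@example.org"  # placeholder; required by NCBI policy

        handle = Entrez.esearch(db="assembly", term="Fungi[Organism]", retmax=5)
        record = Entrez.read(handle)
        handle.close()

        print(record["Count"])   # total matching fungal assemblies
        print(record["IdList"])  # first IDs, usable with Entrez.esummary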

  6. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. 78 FR 43066 - Magnuson-Stevens Act Provisions; National Standard 2-Scientific Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-19

    ... situations where simpler tools and assessment methods are warranted, scientific advice should be accompanied... higher data category and improve assessment methods.'' One commenter also suggested adding... consistent with appropriate scientific methods, undergo scientific review, and peer review, which may include...

  8. [Handbook for the preparation of evidence-based documents. Tools derived from scientific knowledge].

    PubMed

    Carrión-Camacho, M R; Martínez-Brocca, M A; Paneque-Sánchez-Toscano, I; Valencia-Martín, R; Palomino-García, A; Muñoz-Durán, C; Tamayo-López, M J; González-Eiris-Delgado, C; Otero-Candelera, R; Ortega-Ruiz, F; Sobrino-Márquez, J M; Jiménez-García-Bóveda, R; Fernández-Quero, M; Campos-Pareja, A M

    2013-01-01

    This handbook is intended to be an accessible, easy-to-consult guide to help professionals produce or adapt Evidence-Based Documents. Such documents help standardize both clinical practice and decision-making, with quality continuously monitored so that established references are complied with. The Evidence-Based Health Care Committee, part of the "Virgen del Rocío" University Hospital quality structure, proposed the preparation of a handbook for producing Evidence-Based Documents, including a description of the products, characteristics, qualities, uses, methodology of production, and scope of application of each of them. The handbook consists of seven Evidence-Based tools, one chapter on the methodology of critical analysis of the scientific literature, one chapter with internet resources, and appendices with different assessment tools. This handbook provides general practitioners with an opportunity to improve quality and a guideline to standardize clinical healthcare, gives managers a strategy to promote and encourage the development of documents in an effort to reduce clinical practice variability, and gives patients the opportunity to take part in planning their own care. Copyright © 2011 SECA. Published by Elsevier España. All rights reserved.

  9. Mutation extraction tools can be combined for robust recognition of genetic variants in the literature

    PubMed Central

    Jimeno Yepes, Antonio; Verspoor, Karin

    2014-01-01

    As the cost of genomic sequencing continues to fall, the amount of data being collected and studied for the purpose of understanding the genetic basis of disease is increasing dramatically. Much of the source information relevant to such efforts is available only from unstructured sources such as the scientific literature, and significant resources are expended in manually curating and structuring the information in the literature. As such, there have been a number of systems developed to target automatic extraction of mutations and other genetic variation from the literature using text mining tools. We have performed a broad survey of the existing publicly available tools for extraction of genetic variants from the scientific literature. We consider not just one tool but a number of different tools, individually and in combination, and apply the tools in two scenarios. First, they are compared in an intrinsic evaluation context, where the tools are tested for their ability to identify specific mentions of genetic variants in a corpus of manually annotated papers, the Variome corpus. Second, they are compared in an extrinsic evaluation context based on our previous study of text mining support for curation of the COSMIC and InSiGHT databases. Our results demonstrate that no single tool covers the full range of genetic variants mentioned in the literature. Rather, several tools have complementary coverage and can be used together effectively. In the intrinsic evaluation on the Variome corpus, the combined performance is above 0.95 in F-measure, while in the extrinsic evaluation the combined recall performance is above 0.71 for COSMIC and above 0.62 for InSiGHT, a substantial improvement over the performance of any individual tool. Based on the analysis of these results, we suggest several directions for the improvement of text mining tools for genetic variant extraction from the literature. PMID:25285203
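
    For illustration, a minimal Python sketch of the combination strategy evaluated above: take the union of the mentions found by several tools and score it against a gold standard. The variant mentions are toy data.

        def prf(predicted, gold):
            tp = len(predicted & gold)
            p = tp / len(predicted) if predicted else 0.0
            r = tp / len(gold) if gold else 0.0
            f = 2 * p * r / (p + r) if (p + r) else 0.0
            return p, r, f

        gold   = {"p.V600E", "c.35G>A", "p.G12D"}
        tool_a = {"p.V600E", "c.35G>A"}             # misses p.G12D
        tool_b = {"p.V600E", "p.G12D", "p.X999Y"}   # one false positive

        for name, pred in [("A", tool_a), ("B", tool_b), ("A+B", tool_a | tool_b)]:
            p, r, f = prf(pred, gold)
            print(f"{name}: P={p:.2f} R={r:.2f} F={f:.2f}")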

  10. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis

    PubMed Central

    Gadelha, Luiz; Ribeiro-Alves, Marcelo; Porto, Fábio

    2017-01-01

    There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties, either for posterior inspection of results, or for meta-analysis by the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, a great amount of effort is equally required to run in-silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clusterization and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were analyzed. The results are integrated into GeNNet-DB, a database of genes, clusters, experiments and their properties and relationships. The resulting graph database is explored with queries that demonstrate the expressiveness of this data model for reasoning about gene interaction networks. GeNNet is the first platform to integrate the analytical process of transcriptome data with graph databases. It provides a comprehensive set of tools that would otherwise be challenging for non-expert users to install and use. Developers can add new functionality to components of GeNNet. The derived data allows for testing previous hypotheses about an experiment and exploring new ones through the interactive graph database environment. It enables the analysis of data on humans, rhesus monkeys, mice and rats coming from Affymetrix platforms. GeNNet is available as an open source platform at https://github.com/raquele/GeNNet and can be retrieved as a software container with the command docker pull quelopes/gennet. PMID:28695067

  11. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis.

    PubMed

    Costa, Raquel L; Gadelha, Luiz; Ribeiro-Alves, Marcelo; Porto, Fábio

    2017-01-01

    There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties, either for posterior inspection of results, or for meta-analysis by the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, a great amount of effort is equally required to run in-silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clusterization and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were analyzed. The results are integrated into GeNNet-DB, a database of genes, clusters, experiments and their properties and relationships. The resulting graph database is explored with queries that demonstrate the expressiveness of this data model for reasoning about gene interaction networks. GeNNet is the first platform to integrate the analytical process of transcriptome data with graph databases. It provides a comprehensive set of tools that would otherwise be challenging for non-expert users to install and use. Developers can add new functionality to components of GeNNet. The derived data allows for testing previous hypotheses about an experiment and exploring new ones through the interactive graph database environment. It enables the analysis of data on humans, rhesus monkeys, mice and rats coming from Affymetrix platforms. GeNNet is available as an open source platform at https://github.com/raquele/GeNNet and can be retrieved as a software container with the command docker pull quelopes/gennet.
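
    For illustration, a sketch of exploring a gene/cluster graph database with the Neo4j Python driver. That GeNNet-DB is reachable this way, and the node labels, properties and relationship type below, are assumptions for illustration only; consult the project repository for the actual schema and connection details.

        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687",
                                      auth=("neo4j", "password"))  # hypothetical

        query = """
        MATCH (g:Gene)-[:BELONGS_TO]->(c:Cluster)   // hypothetical schema
        WHERE g.log2fc > 1.0
        RETURN c.id AS cluster, count(g) AS upregulated
        ORDER BY upregulated DESC
        """

        with driver.session() as session:
            for rec in session.run(query):
                print(rec["cluster"], rec["upregulated"])
        driver.close()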

  12. BioVeL: a virtual laboratory for data analysis and modelling in biodiversity science and ecology.

    PubMed

    Hardisty, Alex R; Bacall, Finn; Beard, Niall; Balcázar-Vargas, Maria-Paula; Balech, Bachir; Barcza, Zoltán; Bourlat, Sarah J; De Giovanni, Renato; de Jong, Yde; De Leo, Francesca; Dobor, Laura; Donvito, Giacinto; Fellows, Donal; Guerra, Antonio Fernandez; Ferreira, Nuno; Fetyukova, Yuliya; Fosso, Bruno; Giddy, Jonathan; Goble, Carole; Güntsch, Anton; Haines, Robert; Ernst, Vera Hernández; Hettling, Hannes; Hidy, Dóra; Horváth, Ferenc; Ittzés, Dóra; Ittzés, Péter; Jones, Andrew; Kottmann, Renzo; Kulawik, Robert; Leidenberger, Sonja; Lyytikäinen-Saarenmaa, Päivi; Mathew, Cherian; Morrison, Norman; Nenadic, Aleksandra; de la Hidalga, Abraham Nieva; Obst, Matthias; Oostermeijer, Gerard; Paymal, Elisabeth; Pesole, Graziano; Pinto, Salvatore; Poigné, Axel; Fernandez, Francisco Quevedo; Santamaria, Monica; Saarenmaa, Hannu; Sipos, Gergely; Sylla, Karl-Heinz; Tähtinen, Marko; Vicario, Saverio; Vos, Rutger Aldo; Williams, Alan R; Yilmaz, Pelin

    2016-10-20

    Making forecasts about biodiversity and giving support to policy relies increasingly on large collections of data held electronically, and on substantial computational capability and capacity to analyse, model, simulate and predict using such data. However, the physically distributed nature of data resources and of expertise in advanced analytical tools creates many challenges for the modern scientist. Across the wider biological sciences, presenting such capabilities on the Internet (as "Web services") and using scientific workflow systems to compose them for particular tasks is a practical way to carry out robust "in silico" science. However, use of this approach in biodiversity science and ecology has thus far been quite limited. BioVeL is a virtual laboratory for data analysis and modelling in biodiversity science and ecology, freely accessible via the Internet. BioVeL includes functions for accessing and analysing data through curated Web services; for performing complex in silico analysis through exposure of R programs, workflows, and batch processing functions; for on-line collaboration through sharing of workflows and workflow runs; for experiment documentation through reproducibility and repeatability; and for computational support via seamless connections to supporting computing infrastructures. We developed and improved more than 60 Web services with significant potential in many different kinds of data analysis and modelling tasks. We composed reusable workflows using these Web services, also incorporating R programs. Deploying these tools into an easy-to-use and accessible 'virtual laboratory', free via the Internet, we applied the workflows in several diverse case studies. We opened the virtual laboratory for public use and through a programme of external engagement we actively encouraged scientists and third party application and tool developers to try out the services and contribute to the activity. Our work shows we can deliver an operational, scalable and flexible Internet-based virtual laboratory to meet new demands for data processing and analysis in biodiversity science and ecology. In particular, we have successfully integrated existing and popular tools and practices from different scientific disciplines to be used in biodiversity and ecological research.

  13. Making Geoscience Data Relevant for Students, Teachers, and the Public

    NASA Astrophysics Data System (ADS)

    Taber, M.; Ledley, T. S.; Prakash, A.; Domenico, B.

    2009-12-01

    The scientific data collected by government-funded research belongs to the public. As such, the scientific and technical communities are responsible for making scientific data accessible and usable by the educational community. However, much geoscience data are difficult for educators and students to find and use. Such data are generally described by metadata that are narrowly focused and written in scientific language. Thus, data access presents a challenge to educators in determining whether a particular dataset is relevant to their needs, and in effectively accessing and using the data. The AccessData project (EAR-0623136, EAR-0305058) has developed a model for bridging the scientific and educational communities to develop robust inquiry-based activities using scientific datasets in the form of Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet) chapters. EET chapters provide step-by-step instructions for accessing specific data and analyzing it with a software analysis tool to explore issues or concepts in science, technology, and mathematics. The AccessData model involves working directly with small teams made up of data providers from scientific data archives or research teams, data analysis tool specialists, scientists, curriculum developers, and educators (AccessData, http://serc.carleton.edu/usingdata/accessdata). The process involves a number of steps: 1) building of the team; 2) pre-workshop facilitation; 3) a face-to-face 2.5-day workshop; 4) post-workshop follow-up; 5) completion and review of the EET chapter. The AccessData model has evolved over a series of six annual workshops hosting ~10 teams each, and has been expanded to other venues to explore a broader scope and sustainable mechanisms. These venues include 1) workshops focused on the data collected by a large research program (RIDGE, EarthScope); 2) a workshop focused on developing a citizen scientist guide to conducting research; and 3) facilitating a team on an annual basis within the structure of the Federation of Earth Science Information Partners (ESIP Federation), leveraging their semi-annual meetings. In this presentation we describe the AccessData model of making geoscience data accessible and usable in educational contexts from the perspectives of both the organizers and a participating team. We also describe how this model has been adapted to other contexts to facilitate a broader reach of geoscience data.

  14. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model-construction operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
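
    For illustration, a minimal Python sketch of the two-pronged idea: every model-construction operation goes through the toolbox, which appends to a development record that other tools can later replay or inspect. The operation names and the toy model are illustrative assumptions.

        import json

        class ModelDevelopmentRecord:
            def __init__(self):
                self.steps = []

            def log(self, operation, **params):
                self.steps.append({"operation": operation, "params": params})

        class ModelToolbox:
            """All construction goes through the toolbox, so the record is
            guaranteed to describe how the model was built."""
            def __init__(self, record):
                self.record = record
                self.model = {}

            def add_equation(self, name, expr):
                self.model[name] = expr
                self.record.log("add_equation", name=name, expr=expr)

            def set_boundary(self, name, value):
                self.model[f"bc:{name}"] = value
                self.record.log("set_boundary", name=name, value=value)

        record = ModelDevelopmentRecord()
        tb = ModelToolbox(record)
        tb.add_equation("momentum", "rho*du/dt = -grad(p) + mu*laplacian(u)")
        tb.set_boundary("inlet", "u = 1.0")
        print(json.dumps(record.steps, indent=2))  # the replayable record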

  15. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the types of tools are restricted, problems such as incompatible input and output data structures, and dissimilar interfaces to tools that solve similar problems, can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
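
    For illustration, the interface idea behind NAO rendered as a minimal Python sketch (the original library is C++): abstract classes for functions and operators, so tools written against the interfaces compose regardless of implementation. The class names and the toy derivative rule are illustrative.

        from abc import ABC, abstractmethod

        class Function(ABC):
            @abstractmethod
            def evaluate(self, x): ...

        class Operator(ABC):
            """Maps one Function to another, e.g. a differential operator."""
            @abstractmethod
            def apply(self, f: Function) -> Function: ...

        class Polynomial(Function):
            def __init__(self, coeffs):
                self.coeffs = coeffs  # c0 + c1*x + c2*x^2 + ...

            def evaluate(self, x):
                return sum(c * x**i for i, c in enumerate(self.coeffs))

        class Derivative(Operator):
            def apply(self, f: Function) -> Function:
                if isinstance(f, Polynomial):  # exact rule where one is known
                    return Polynomial([i * c for i, c in enumerate(f.coeffs)][1:])
                raise NotImplementedError

        p = Polynomial([1.0, 0.0, 3.0])   # 1 + 3x^2
        dp = Derivative().apply(p)        # 6x
        print(dp.evaluate(2.0))           # 12.0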

  16. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    PubMed Central

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-01

    Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy experiment. PMID:17263870

  17. Conceptual-level workflow modeling of scientific experiments using NMR as a case study.

    PubMed

    Verdi, Kacy K; Ellis, Heidi Jc; Gryk, Michael R

    2007-01-30

    Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy experiment.

  18. Scientific Use Cases for the Virtual Atomic and Molecular Data Center

    NASA Astrophysics Data System (ADS)

    Dubernet, M. L.; Aboudarham, J.; Ba, Y. A.; Boiziot, M.; Bottinelli, S.; Caux, E.; Endres, C.; Glorian, J. M.; Henry, F.; Lamy, L.; Le Sidaner, P.; Møller, T.; Moreau, N.; Rénié, C.; Roueff, E.; Schilke, P.; Vastel, C.; Zwoelf, C. M.

    2014-12-01

    The VAMDC Consortium is a worldwide consortium that federates interoperable Atomic and Molecular databases through an e-science infrastructure. The contained data are of the highest scientific quality and are crucial for many applications: astrophysics, atmospheric physics, fusion, plasma and lighting technologies, health, etc. In this paper we present astrophysical scientific use cases involving the VAMDC e-infrastructure. These cover very different applications, such as: (i) modeling the spectra of interstellar objects using the myXCLASS software tool implemented in the Common Astronomy Software Applications package (CASA) or using the CASSIS software tool, in its stand-alone version or implemented in the Herschel Interactive Processing Environment (HIPE); (ii) the use of Virtual Observatory tools accessing VAMDC databases; (iii) the access of VAMDC from the Paris solar BASS2000 portal; (iv) the combination of tools and database from the APIS service (Auroral Planetary Imaging and Spectroscopy); (v) combination of heterogeneous data for the application to the interstellar medium from the SPECTCOL tool.
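
    For illustration, a minimal Python sketch of querying a VAMDC node through its TAP interface. VSS2 and XSAMS are the consortium's query language and output format, but the node URL and the species filter below are illustrative assumptions.

        import requests

        node = "http://vamdc.example.org/tap/sync"  # hypothetical VAMDC-TAP endpoint

        resp = requests.get(node, params={
            "LANG": "VSS2",
            "FORMAT": "XSAMS",
            "QUERY": "SELECT ALL WHERE MoleculeStoichiometricFormula = 'CO'",
        })
        print(resp.status_code, len(resp.content), "bytes of XSAMS data")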

  19. Return on Scientific Investment - RoSI: a PMO dynamical index proposal for scientific projects performance evaluation and management.

    PubMed

    Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson

    2012-01-01

    To propose a measure (index) of expected risks with which to evaluate and follow up the performance of research projects, taking into account financial and structural parameters for their development. A ranking of acceptable results for research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publications, fund raising, and patent registries. The proposed structured analysis, named RoSI (Return on Scientific Investment), comprises a pipeline of analysis that characterizes risk using a modeling tool in which multiple variables interact in semi-quantitative environments. The method was tested with data from three different projects in our institution (projects A, B and C). The resulting curves of ulcer indexes identified the project with the lowest expected risk (project C) with respect to its development and expected results under initial or full investment. The results showed that this model contributes significantly to risk analysis and planning, as well as to the definition of the necessary investments, while considering contingency actions that benefit the different stakeholders: the investor or donor, the project manager and the researchers.
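
    For illustration, the classical ulcer index that RoSI builds on: the root-mean-square of percentage drawdowns from the running maximum of a series, which penalizes both the depth and the duration of slumps. The monthly project performance values below are hypothetical.

        import math

        def ulcer_index(series):
            peak, sum_sq = series[0], 0.0
            for v in series:
                peak = max(peak, v)
                drawdown_pct = 100.0 * (v - peak) / peak  # <= 0 by construction
                sum_sq += drawdown_pct ** 2
            return math.sqrt(sum_sq / len(series))

        project = [100, 104, 98, 95, 102, 110, 107, 112]  # hypothetical values
        print(f"ulcer index = {ulcer_index(project):.2f}")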

  20. Framing Psychology as a Discipline (1950-1999): A Large-Scale Term Co-Occurrence Analysis of Scientific Literature in Psychology.

    PubMed

    Flis, Ivan; van Eck, Nees Jan

    2017-07-20

    This study investigated the structure of psychological literature as represented by a corpus of 676,393 articles in the period from 1950 to 1999. The corpus was extracted from 1,269 journals indexed by PsycINFO. The data in our analysis consisted of the relevant terms mined from the titles and abstracts of all of the articles in the corpus. Based on the co-occurrences of these terms, we developed a series of chronological visualizations using a bibliometric software tool called VOSviewer. These visualizations produced a stable structure through the 5 decades under analysis, and this structure was analyzed as a data-mined proxy for the disciplinary formation of scientific psychology in the second part of the 20th century. Considering the stable structure uncovered by our term co-occurrence analysis and its visualization, we discuss it in the context of Lee Cronbach's "Two Disciplines of Scientific Psychology" (1957) and conventional history of 20th-century psychology's disciplinary formation and history of methods. Our aim was to provide a comprehensive digital humanities perspective on the large-scale structural development of research in English-language psychology from 1950 to 1999. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
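
    For illustration, the core computation behind such maps: counting how often pairs of terms appear together in the same title or abstract. The documents are toy data; VOSviewer adds normalization, clustering and layout on top of a matrix like this.

        from collections import Counter
        from itertools import combinations

        docs = [
            {"reinforcement", "learning", "behavior"},
            {"psychometrics", "factor", "analysis"},
            {"learning", "behavior", "memory"},
            {"factor", "analysis", "memory"},
        ]  # each document reduced to its set of extracted terms

        cooc = Counter()
        for terms in docs:
            for a, b in combinations(sorted(terms), 2):
                cooc[(a, b)] += 1

        for pair, n in cooc.most_common(5):
            print(pair, n)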

  1. [How to evaluate the application of Clinical Governance tools in the management of hospitalized hyperglycemic patients: results of a multicentric study].

    PubMed

    De Belvis, Antonio Giulio; Specchia, Maria Lucia; Ferriero, Anna Maria; Capizzi, Silvio

    2017-01-01

    Risk management is a key tool in Clinical Governance. Our project aimed to define, share, apply and measure the impact of tools and methodologies for the continuous improvement of quality of care, especially in relation to the multi-disciplinary and integrated management of the hyperglycemic patient in hospital settings. A training project, coordinated by a scientific board of experts in diabetes and health management, together with an Expert Meeting with representatives of all the participating centers, was launched in 2014. The project involved eight hospitals through the organization of meetings with five managers and 25 speakers, including diabetologists, internists, pharmacists and nurses. The analysis showed a wide variability in the adoption of tools and processes towards a comprehensive and coordinated management of hyperglycemic patients.

  2. Virtual Interactomics of Proteins from Biochemical Standpoint

    PubMed Central

    Kubrycht, Jaroslav; Sigler, Karel; Souček, Pavel

    2012-01-01

    Virtual interactomics represents a rapidly developing scientific area on the boundary between bioinformatics and interactomics. Protein-related virtual interactomics then comprises instrumental tools for prediction, simulation, and networking of the majority of interactions important for structural and individual reproduction, differentiation, recognition, signaling, regulation, and metabolic pathways of cells and organisms. Here, we describe the main areas of virtual protein interactomics, that is, structurally based comparative analysis and prediction of functionally important interacting sites, mimotope-assisted and combined epitope prediction, molecular (protein) docking studies, and investigation of protein interaction networks. Detailed information about some interesting methodological approaches and online accessible programs or databases is displayed in our tables. A considerable part of the text deals with the searches for common conserved or functionally convergent protein regions and subgraphs of conserved interaction networks, new outstanding trends and clinically interesting results. In agreement with the presented data and relationships, virtual interactomic tools improve our scientific knowledge, help us to formulate working hypotheses, and they frequently also mediate various important in silico simulations. PMID:22928109

  3. A triangular fuzzy TOPSIS-based approach for the application of water technologies in different emergency water supply scenarios.

    PubMed

    Qu, Jianhua; Meng, Xianlin; Yu, Huan; You, Hong

    2016-09-01

    Because of the increasing frequency and intensity of unexpected natural disasters, providing safe drinking water for the affected population following a disaster has become a global challenge of growing concern. An onsite water supply technology that is portable, mobile, or modular is a more suitable and sustainable solution for the victims than transporting bottled water. In recent years, various water techniques, such as membrane-assisted technologies, have been proposed and successfully implemented in many places. Given the diversity of techniques available, the current challenge is how to scientifically identify the optimum options for different disaster scenarios. Hence, a triangular fuzzy, multi-criteria, group decision-making tool was developed in this research. The approach was then applied to the selection of the most appropriate water technologies for different emergency water supply scenarios. The results show that this tool can facilitate the scientific evaluation and selection of emergency water technologies for ensuring a secure drinking water supply in disaster relief.
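
    For illustration, a minimal Python sketch of triangular fuzzy TOPSIS. The alternatives, criteria and fuzzy scores are toy values; a real application would aggregate expert ratings, apply criterion weights, and distinguish benefit from cost criteria.

        import math

        def d(a, b):  # distance between triangular fuzzy numbers (l, m, u)
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

        # fuzzy scores (l, m, u) on [treatment quality, deployability]
        scores = {
            "membrane unit":       [(7, 9, 10), (3, 5, 7)],
            "mobile chlorination": [(5, 7, 9),  (7, 9, 10)],
            "bottled water":       [(7, 9, 10), (1, 3, 5)],
        }

        n = 2  # number of criteria
        # fuzzy positive/negative ideal solutions, taken column-wise
        fpis = [max((s[j] for s in scores.values()), key=lambda t: t[2]) for j in range(n)]
        fnis = [min((s[j] for s in scores.values()), key=lambda t: t[0]) for j in range(n)]

        for name, s in scores.items():
            d_plus = sum(d(s[j], fpis[j]) for j in range(n))
            d_minus = sum(d(s[j], fnis[j]) for j in range(n))
            cc = d_minus / (d_plus + d_minus)  # closeness coefficient: higher is better
            print(f"{name}: CC = {cc:.3f}")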

  4. A lithospheric magnetic field model derived from the Swarm satellite magnetic field measurements

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Thebault, E.; Vigneron, P.

    2015-12-01

    The Swarm constellation of satellites was launched in November 2013 and has since delivered high-quality scalar and vector magnetic field measurements. A consortium of several research institutions was selected by the European Space Agency (ESA) to provide a number of scientific products which will be made available to the scientific community. Within this framework, specific tools were tailor-made to better extract the magnetic signal emanating from the Earth's lithosphere. These tools rely on the scalar gradient measured by the lower pair of Swarm satellites and on a regional modeling scheme that is more sensitive to small spatial scales and weak signals than the standard spherical harmonic modeling. In this presentation, we report on various activities related to data analysis and processing. We assess the efficiency of this dedicated chain for modeling the lithospheric magnetic field using more than one year of measurements, and finally discuss refinements that are continuously implemented in order to further improve the robustness and the spatial resolution of the lithospheric field model.

  5. Using R for large spatiotemporal data sets

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer

    2017-04-01

    Writing and sharing scientific software is a means to communicate scientific ideas for finding scientific consensus, no more and no less than writing and sharing scientific papers is. Important factors for successful communication are adopting an open source environment, and using a language that is understood by many. For many scientists, R's combination of rich data abstraction and highly exposed data structures makes it an attractive communication tool. This paper discusses the development of spatial and spatiotemporal data handling and analysis with R since 2000, and points to some of R's strengths and weaknesses from a historical perspective. We will also discuss a new, S3-based package for feature data ("Simple Features for R"), and point to a way forward into the data science realm, where pipeline-based workflows are assumed. Finally, we will discuss how, in a similar vein, massive satellite or climate model data sets, potentially held in a cloud environment, can be handled and analyzed with R.

  6. Enabling large-scale next-generation sequence assembly with Blacklight

    PubMed Central

    Couger, M. Brian; Pipes, Lenore; Squina, Fabio; Prade, Rolf; Siepel, Adam; Palermo, Robert; Katze, Michael G.; Mason, Christopher E.; Blood, Philip D.

    2014-01-01

    Summary A variety of extremely challenging biological sequence analyses were conducted on the XSEDE large shared memory resource Blacklight, using current bioinformatics tools and encompassing a wide range of scientific applications. These include genomic sequence assembly, very large metagenomic sequence assembly, transcriptome assembly, and sequencing error correction. The data sets used in these analyses included uncategorized fungal species, reference microbial data, very large soil and human gut microbiome sequence data, and primate transcriptomes, composed of both short-read and long-read sequence data. A new parallel command execution program was developed on the Blacklight resource to handle some of these analyses. These results, initially reported at XSEDE13 and expanded here, represent significant advances for their respective scientific communities. The breadth and depth of the results achieved demonstrate the ease of use, versatility, and unique capabilities of the Blacklight XSEDE resource for scientific analysis of genomic and transcriptomic sequence data, and the power of these resources, together with XSEDE support, in meeting the most challenging scientific problems. PMID:25294974

  7. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  8. Federal Data Repository Research: Recent Developments in Mercury Search System Architecture

    NASA Astrophysics Data System (ADS)

    Devarakonda, R.

    2015-12-01

    New data-intensive project initiatives need a new generation of data system architecture. This presentation will discuss recent developments in the Mercury system [1], including adoption, challenges, and future efforts to handle such data-intensive projects. Mercury is a combination of three main tools: (i) a data/metadata registration tool: the new Online Metadata Editor (OME) is a web-based tool that helps document scientific data in well-structured, popular scientific metadata formats; (ii) a search and visualization tool, which provides a single portal to information contained in disparate data management systems and facilitates distributed metadata management, data discovery, and various visualization capabilities; (iii) a data citation tool: in collaboration with the Mercury Consortium (funded by NASA, USGS and DOE), the Department of Energy's Oak Ridge National Laboratory (ORNL) established a Digital Object Identifier (DOI) service. Mercury is an open source system, developed and managed at Oak Ridge National Laboratory and currently funded by three federal agencies: NASA, USGS and DOE. It provides access to millions of bio-geo-chemical and ecological data records; 30,000 scientists use it each month. Some recent data-intensive projects using the Mercury tool: USGS Science Data Catalog (http://data.usgs.gov/), Next-Generation Ecosystem Experiments (http://ngee-arctic.ornl.gov/), Carbon Dioxide Information Analysis Center (http://cdiac.ornl.gov/), Oak Ridge National Laboratory - Distributed Active Archive Center (http://daac.ornl.gov), SoilSCAPE (http://mercury.ornl.gov/soilscape). References: [1] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.

  9. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF) and aims to create a model system that is able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions. They range from preprocessing tools like the "LeadtimeSelector", which creates lead-time dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
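
    As a generic illustration of the kind of deterministic skill metric such plugins compute (a sketch only, not the actual MurCSS implementation), the mean squared error skill score (MSESS) compares a hindcast against a reference forecast such as climatology:

    ```python
    import numpy as np

    def msess(hindcast, observations, reference):
        """Mean squared error skill score: 1 - MSE(hindcast) / MSE(reference).

        Values near 1 mean the hindcast clearly beats the reference forecast;
        values at or below 0 mean it adds no skill. Inputs are 1-D time series.
        """
        mse_hindcast = np.mean((hindcast - observations) ** 2)
        mse_reference = np.mean((reference - observations) ** 2)
        return 1.0 - mse_hindcast / mse_reference

    # Toy example: the reference is the climatological mean of the observations.
    obs = np.array([0.2, 0.5, 0.1, 0.7, 0.4])
    hind = np.array([0.25, 0.45, 0.2, 0.6, 0.35])
    clim = np.full_like(obs, obs.mean())
    print(round(msess(hind, obs, clim), 3))
    ```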

  10. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.
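
    A minimal sketch of the equivalence-class idea described above (illustrative only, not the STAT implementation; the ranks and stack traces are hypothetical): group parallel tasks by identical stack trace and pick one representative per group as a debug target.

    ```python
    from collections import defaultdict

    # Hypothetical per-rank stack traces gathered from a hung parallel job.
    stack_traces = {
        0: ("main", "solve", "mpi_allreduce"),
        1: ("main", "solve", "mpi_allreduce"),
        2: ("main", "io_write", "mpi_barrier"),
        3: ("main", "solve", "mpi_allreduce"),
    }

    def equivalence_classes(traces):
        """Group ranks that share an identical stack trace."""
        classes = defaultdict(list)
        for rank, trace in traces.items():
            classes[trace].append(rank)
        return classes

    # Attach a traditional debugger to just one representative rank per class.
    for trace, ranks in equivalence_classes(stack_traces).items():
        print(f"{len(ranks)} rank(s) at {' -> '.join(trace)}; debug target: rank {ranks[0]}")
    ```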

  11. Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook

    NASA Astrophysics Data System (ADS)

    Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.

    2012-12-01

    The data deluge is making traditional analysis workflows for many researchers obsolete. Support for parallelism within popular tools such as matlab, IDL and NCO is not well developed and rarely used. However, parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. Therefore, there is an increasing need to develop new analysis platforms which both enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic curiosity-driven analysis usually carried out using domain specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/Scipy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modeling community. JASMIN, with its sister facility CEMS focusing on the earth observation community, has 4.5 PB of fast parallel disk storage alongside over 370 computing cores that provide local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system. Outputs can be summarised and visualised using the full power of Python's many scientific tools, including Scipy, Matplotlib, Pandas and CDAT. This rich user experience is delivered through the user's web browser, maintaining the interactive feel of a workstation-based environment with the parallel power of a remote data-centric processing facility.
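
    A minimal sketch of this interactive-parallel pattern (a generic ipyparallel illustration, assuming a cluster of engines is already running on the data-hosting machine; the analysis function and file paths below are placeholders, not part of the JASMIN setup described above):

    ```python
    import ipyparallel as ipp

    def file_mean(path):
        """Placeholder per-file analysis; a real workflow might open a NetCDF file here."""
        import numpy as np
        return path, float(np.loadtxt(path).mean())   # stand-in for netCDF4/xarray access

    rc = ipp.Client()                   # connect to the running cluster of engines
    view = rc.load_balanced_view()      # hand each task to whichever engine is free

    files = [f"/data/cmip5/tas_{year}.txt" for year in range(2000, 2010)]  # hypothetical paths
    for path, value in view.map_sync(file_mean, files):   # runs in parallel, results in order
        print(path, value)
    ```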

  12. The 3D widgets for exploratory scientific visualization

    NASA Technical Reports Server (NTRS)

    Herndon, Kenneth P.; Meyer, Tom

    1995-01-01

    Computational fluid dynamics (CFD) techniques are used to simulate flows of fluids like air or water around such objects as airplanes and automobiles. These techniques usually generate very large amounts of numerical data which are difficult to understand without using graphical scientific visualization techniques. There are a number of commercial scientific visualization applications available today which allow scientists to control visualization tools via textual and/or 2D user interfaces. However, these user interfaces are often difficult to use. We believe that 3D direct-manipulation techniques for interactively controlling visualization tools will provide opportunities for powerful and useful interfaces with which scientists can more effectively explore their datasets. A few systems have been developed which use these techniques. In this paper, we will present a variety of 3D interaction techniques for manipulating parameters of visualization tools used to explore CFD datasets, and discuss in detail various techniques for positioning tools in a 3D scene.

  13. Unified User Interface to Support Effective and Intuitive Data Discovery, Dissemination, and Analysis at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Petrenko, M.; Hegde, M.; Bryant, K.; Johnson, J. E.; Ritrivi, A.; Shen, S.; Volmer, B.; Pham, L. B.

    2015-01-01

    Goddard Earth Sciences Data and Information Services Center (GES DISC) has been providing access to scientific data sets since the 1990s. Beginning as one of the first Earth Observing System Data and Information System (EOSDIS) archive centers, GES DISC has evolved to offer a wide range of science-enabling services. With a growing understanding of the needs and goals of its science users, GES DISC continues to improve and expand its broad set of data discovery and access tools, sub-setting services, and visualization tools. Nonetheless, the multitude of available tools, a partial overlap of functionality, and the independent and uncoupled interfaces employed by these tools often leave end users confused as to which tools or services are the most appropriate for the task at hand. As a result, some of the services remain underutilized or largely unknown to users, significantly reducing the availability of the data and leading to a great loss of scientific productivity. In order to improve the accessibility of GES DISC tools and services, we have designed and implemented UUI, the Unified User Interface. UUI seeks to provide a simple, unified, and intuitive one-stop shop experience for the key services available at GES DISC, including sub-setting (Simple Subset Wizard), granule file search (Mirador), plotting (Giovanni), and other services. In this poster, we will discuss the main lessons, obstacles, and insights encountered while designing the UUI experience. We will also present the architecture and technology behind UUI, including NodeJS, Angular, and MongoDB, and speculate on the future of the tool at GES DISC as well as in the broader context of Space Science Informatics.

  14. Sentinel-2 ArcGIS Tool for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Plesoianu, Alin; Cosmin Sandric, Ionut; Anca, Paula; Vasile, Alexandru; Calugaru, Andreea; Vasile, Cristian; Zavate, Lucian

    2017-04-01

    This paper addresses one of the biggest challenges regarding Sentinel-2 data: the need for an efficient tool to access and process the large collection of images that are available. Consequently, developing a tool for the automation of Sentinel-2 data analysis is the most immediate need. We developed a series of tools for the automation of Sentinel-2 data download and processing for vegetation health monitoring. The tools automatically perform the following operations: downloading image tiles from ESA's Scientific Hub or other vendors (Amazon), pre-processing the images to extract the 10-m bands, creating image composites, applying a series of vegetation indices (NDVI, OSAVI, etc.), and performing change detection analyses on different temporal data sets. All of these tools run in a dynamic way in the ArcGIS Platform, without the need to create intermediate datasets (rasters, layers), as the images are processed on-the-fly in order to avoid data duplication. Finally, they allow complete integration with the ArcGIS environment and workflows.
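
    As a minimal illustration of the kind of on-the-fly index computation mentioned above (a generic numpy sketch with toy values, not the ArcGIS tool itself), NDVI is computed from the red and near-infrared 10-m bands as (NIR - Red) / (NIR + Red):

    ```python
    import numpy as np

    def ndvi(red, nir):
        """Normalized Difference Vegetation Index from red and near-infrared reflectance arrays."""
        red = red.astype("float64")
        nir = nir.astype("float64")
        denom = nir + red
        return np.where(denom == 0, 0.0, (nir - red) / denom)   # guard against nodata pixels

    # Toy 2x2 reflectances standing in for Sentinel-2 band 4 (red) and band 8 (NIR) at 10 m.
    red = np.array([[0.10, 0.20], [0.05, 0.30]])
    nir = np.array([[0.50, 0.40], [0.45, 0.30]])
    print(ndvi(red, nir))
    ```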

  15. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    NASA Astrophysics Data System (ADS)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether from emission, scattering, or in transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and intermodel comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. Additionally, the results of intermodel comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of various planetary targets.

  16. The COMPASS Project

    NASA Astrophysics Data System (ADS)

    Duley, A. R.; Sullivan, D.; Fladeland, M. M.; Myers, J.; Craig, M.; Enomoto, F.; Van Gilst, D. P.; Johan, S.

    2011-12-01

    The Common Operations and Management Portal for Airborne Science Systems (COMPASS) project is a multi-center collaborative effort to advance and extend the research capabilities of the National Aeronautics and Space Administration's (NASA) Airborne Science Program (ASP). At its most basic, COMPASS provides tools for visualizing the position of aircraft and instrument observations during the course of a mission, and facilitates dissemination, discussion, and analysis of multiple disparate data sources in order to more efficiently plan and execute airborne science missions. COMPASS targets a number of key objectives. First, deliver a common operating picture for improved shared situational awareness to all participants in NASA's Airborne Science missions. These participants include scientists, engineers, managers, and the general public. Second, encourage more responsive and collaborative measurements between instruments on multiple aircraft, satellites, and on the surface in order to increase the scientific value of these measurements. Third, provide flexible entry points for data providers to supply model and advanced analysis products to mission team members. Fourth, provide data consumers with a mechanism to ingest, search and display data products. Finally, embrace an open and transparent platform where common data products, services, and end user components can be shared with the broader scientific community. In pursuit of these objectives, and in concert with requirements solicited by the airborne science research community, the COMPASS project team has delivered a suite of core tools intended to represent the next generation toolset for airborne research. This toolset includes a collection of loosely coupled RESTful web services, a system to curate, register, and search commonly used data sources, end-user tools which leverage WebSocket and other next-generation HTML5 technologies to aid real-time aircraft position and data visualization, and an extensible framework to rapidly accommodate mission-specific requirements and mission tools.

  17. Rating Health and Social Indicators for Use with Indigenous Communities: A Tool for Balancing Cultural and Scientific Utility

    ERIC Educational Resources Information Center

    Daniel, Mark; Cargo, Margaret; Marks, Elisabeth; Paquet, Catherine; Simmons, David; Williams, Margaret; Rowley, Kevin; O'Dea, Kerin

    2009-01-01

    This study reports on the development and evaluation of a rating tool to assess the scientific utility and cultural appropriateness of community-level indicators for application with Indigenous populations. Indicator criteria proposed by the U.S. Institute of Medicine were culturally adapted through reviewing the literature and consultations with…

  1. Scientific Writing: Strategies and Tools for Students and Advisors

    ERIC Educational Resources Information Center

    Singh, Vikash; Mayer, Philipp

    2014-01-01

    Scientific writing is a demanding task and many students need more time than expected to finish their research articles. To speed up the process, we highlight some tools, strategies, and writing guides. We recommend starting to write early in the research process and preparing research articles not after, but in parallel to, the lab or…

  2. Board Games and Board Game Design as Learning Tools for Complex Scientific Concepts: Some Experiences

    ERIC Educational Resources Information Center

    Chiarello, Fabio; Castellano, Maria Gabriella

    2016-01-01

    In this paper the authors report different experiences in the use of board games as learning tools for complex and abstract scientific concepts such as Quantum Mechanics, Relativity or nano-biotechnologies. In particular we describe "Quantum Race," designed for the introduction of Quantum Mechanical principles, "Lab on a chip,"…

  3. Collaborative Planetary GIS with JMARS

    NASA Astrophysics Data System (ADS)

    Dickenshied, S.; Christensen, P. R.; Edwards, C. S.; Prashad, L. C.; Anwar, S.; Engle, E.; Noss, D.; Jmars Development Team

    2010-12-01

    Traditional GIS tools have allowed users to work locally with their own datasets in their own computing environment. More recently, data providers have started offering online repositories of preprocessed data which helps minimize the learning curve required to access new datasets. The ideal collaborative GIS tool provides the functionality of a traditional GIS and easy access to preprocessed data repositories while also enabling users to contribute data, analysis, and ideas back into the very tools they're using. JMARS (Java Mission-planning and Analysis for Remote Sensing) is a suite of geospatial applications developed by the Mars Space Flight Facility at Arizona State University. This software is used for mission planning and scientific data analysis by several NASA missions, including Mars Odyssey, Mars Reconnaissance Orbiter, and the Lunar Reconnaissance Orbiter. It is used by scientists, researchers and students of all ages from more than 40 countries around the world. In addition to offering a rich set of global and regional maps and publicly released orbiter images, the JMARS software development team has been working on ways to encourage the creation of collaborative datasets. Bringing together users from diverse teams and backgrounds allows new features to be developed with an interest in making the application useful and accessible to as wide a potential audience as possible. Actively engaging the scientific community in development strategy and hands on tasks allows the creation of user driven data content that would not otherwise be possible. The first community generated dataset to result from this effort is a tool mapping peer-reviewed papers to the locations they relate to on Mars with links to ancillary data. This allows users of JMARS to browse to an area of interest and then quickly locate papers corresponding to that area. Alternately, users can search for published papers over a specified time interval and visually see what areas of Mars have received the most attention over the requested time span.

  4. Informing Regional Water-Energy-Food Nexus with System Analysis and Interactive Visualizations

    NASA Astrophysics Data System (ADS)

    Yang, Y. C. E.; Wi, S.

    2016-12-01

    Communicating scientific results to non-technical practitioners is challenging due to their differing interests, concerns and agendas. It is further complicated by the growing number of relevant factors that need to be considered, such as climate change and demographic dynamics. Visualization is an effective method for the scientific community to disseminate results, and it represents an opportunity for the future of water resources systems analysis (WRSA). This study demonstrates an intuitive way to communicate WRSA results to practitioners using interactive web-based visualization tools developed with the JavaScript library Data-Driven Documents (D3), with a case study in the Great Ruaha River of Tanzania. The decreasing trend of streamflow during the last decades in the region highlights the need to assess the water usage competition between agricultural production, energy generation, and ecosystem services. Our team conducted advanced water resources systems analysis to inform policy that will affect the water-energy-food nexus. Modeling results are presented in the web-based visualization tools and allow non-technical practitioners to brush the graph directly (e.g., Figure 1). The WRSA suggests that no single measure can completely resolve the water competition. A combination of measures, each of which is acceptable from a social and economic perspective, together with accepting that zero flows cannot be totally eliminated during dry years in the wetland, is likely to be the best way forward.

  5. SIAMOC position paper on gait analysis in clinical practice: General requirements, methods and appropriateness. Results of an Italian consensus conference.

    PubMed

    Benedetti, Maria Grazia; Beghi, Ettore; De Tanti, Antonio; Cappozzo, Aurelio; Basaglia, Nino; Cutti, Andrea Giovanni; Cereatti, Andrea; Stagni, Rita; Verdini, Federica; Manca, Mario; Fantozzi, Silvia; Mazzà, Claudia; Camomilla, Valentina; Campanini, Isabella; Castagna, Anna; Cavazzuti, Lorenzo; Del Maestro, Martina; Croce, Ugo Della; Gasperi, Marco; Leo, Tommaso; Marchi, Pia; Petrarca, Maurizio; Piccinini, Luigi; Rabuffetti, Marco; Ravaschio, Andrea; Sawacha, Zimi; Spolaor, Fabiola; Tesio, Luigi; Vannozzi, Giuseppe; Visintin, Isabella; Ferrarin, Maurizio

    2017-10-01

    Gait analysis is recognized as a useful assessment tool in the field of human movement research. However, doubts remain on its real effectiveness as a clinical tool, i.e. on its capability to change the diagnostic-therapeutic practice. In particular, the conditions in which evidence of a favorable cost-benefit ratio is found and the methodology for properly conducting and interpreting the exam are not clearly identified. To provide guidelines for the use of Gait Analysis in the context of rehabilitation medicine, SIAMOC (the Italian Society of Clinical Movement Analysis) promoted a National Consensus Conference which was held in Bologna on September 14th, 2013. The recommendations were the result of a three-stage process entailing i) the preparation of working documents on specific open issues, ii) the holding of the consensus meeting, and iii) the drafting of consensus statements by an external Jury. The statements were formulated based on scientific evidence or, when the quality/quantity of the relevant literature was deemed insufficient, on experts' opinion. The aim of this work is to disseminate the consensus statements. These are divided into 13 questions grouped in three areas of interest: 1) General requirements and management, 2) Methodological and instrumental issues, and 3) Scientific evidence and clinical appropriateness. SIAMOC hopes that this document will contribute to improving clinical practice and help promote further research in the field. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Using Communication Technology to Facilitate Scientific Literacy: A Framework for Engaged Learning

    NASA Astrophysics Data System (ADS)

    VanBuskirk, Shireen Adele

    The purpose of this research project is to describe how existing communication technologies are used to foster scientific literacy for secondary students. This study develops a new framework as an analytic tool to categorize the activities of teachers and students involved in scientific literacy to describe what elements of scientific literacy are facilitated by such technologies. Four case studies are analyzed using the framework to describe the scientific literacy initiatives. Data collection at each site included interviews with the teacher, student focus groups, student surveys, and classroom observations. Qualitative analysis of the data provided insight into the learning activities and student experiences in the four cases. This study intentionally provides a platform for student voice. Very few previous empirical studies in the area of scientific literacy include the student experience. This represents a significant gap in the current literature on scientific literacy. An interpretation of scientific literacy that promotes student engagement, interaction, and initiative corresponds to a need to listen to students' perspectives on these experiences. Findings of the study indicated that the classroom activities depended on the teacher's philosophy regarding scientific literacy. Communication technology was ubiquitous; where the teacher did not initiate the use of social media in the classroom, the students did. The goal of supporting scientific literacy in students is an objective that extends beyond the boundaries of classroom walls, and it can be facilitated by technologies that seem both abundant and underutilized. Technology-enhanced pedagogy altered the classroom practices and resulted in more student participation and engagement.

  7. A Python-based interface to examine motions in time series of solar images

    NASA Astrophysics Data System (ADS)

    Campos-Rozo, J. I.; Vargas Domínguez, S.

    2017-10-01

    Python is considered a mature programming language and is widely accepted as an engaging option for scientific analysis in multiple areas, as will be presented in this work for the particular case of solar physics research. SunPy is an open-source library based on Python that has been developed recently to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI) based on Python and Qt to effectively compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and some extra tools that allow the selection of different parameters to calculate, visualize, and analyze vector velocity fields of solar data, i.e. time series of solar filtergrams and magnetograms.
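
    A minimal sketch of the local correlation tracking idea (an illustrative block-matching estimate using numpy/scipy with synthetic data, not the interface described in the record): the displacement of a small window between two consecutive frames is taken as the argmax of its cross-correlation with a search region in the later frame.

    ```python
    import numpy as np
    from scipy.signal import correlate2d

    def window_displacement(frame1, frame2, y, x, win=8, search=4):
        """Estimate the (dy, dx) shift of the win x win window at (y, x) between two frames."""
        template = frame1[y:y + win, x:x + win]
        region = frame2[y - search:y + win + search, x - search:x + win + search]
        corr = correlate2d(region - region.mean(), template - template.mean(), mode="valid")
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        return int(dy) - search, int(dx) - search

    # Synthetic test: shift a random frame by (2, 1) pixels and recover that shift.
    rng = np.random.default_rng(0)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(2, 1), axis=(0, 1))
    print(window_displacement(frame1, frame2, y=20, x=20))   # expected: (2, 1)
    ```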

  8. Advancing data management and analysis in different scientific disciplines

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2017-10-01

    Over the past several years, rapid growth of data has affected many fields of science. This has often resulted in the need for overhauling or exchanging the tools and approaches in the disciplines’ data life cycles. However, this allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from the fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases even all stages of the data life cycle, i.e. from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.

  9. KNIME for reproducible cross-domain analysis of life science data.

    PubMed

    Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R

    2017-11-10

    Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next generation sequencing, or image processing. Passing the data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to be platform dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities and differences to KNIME. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Data Integration Tool: Permafrost Data Debugging

    NASA Astrophysics Data System (ADS)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, DIT was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
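
    A minimal sketch of the widget-chaining pattern described above (illustrative only; the widget names and operations are hypothetical stand-ins, not DIT's actual widgets): each widget is a small callable, and a saved workflow is just an ordered list of widgets applied to the data in sequence.

    ```python
    def multiply_by(constant):
        """Return a widget that scales every value by a constant (e.g. a unit conversion)."""
        return lambda values: [v * constant for v in values]

    def sort_values(values):
        """A sorting widget; read, plot, and write widgets would slot in the same way."""
        return sorted(values)

    def run_workflow(data, widgets):
        """Apply widgets in order, mimicking a saved, reproducible workflow."""
        for widget in widgets:
            data = widget(data)
        return data

    workflow = [multiply_by(0.01), sort_values]            # e.g. convert cm to m, then sort
    print(run_workflow([250.0, 130.0, 980.0], workflow))   # -> [1.3, 2.5, 9.8]
    ```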

  11. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automated Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.

  12. Introductory Tools for Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Kuai, L.; Natraj, V.; Yung, Y.

    2006-12-01

    Satellite data are currently so voluminous that, despite their unprecedented quality and potential for scientific application, only a small fraction is analyzed due to two factors: researchers' computational constraints and a relatively small number of researchers actively utilizing the data. Ultimately it is hoped that the terabytes of unanalyzed data being archived can receive scientific scrutiny, but this will require a popularization of the methods associated with the analysis. Since a large portion of the complexity is associated with the proper implementation of the radiative transfer model, it is reasonable and appropriate to make the model as accessible as possible to general audiences. Unfortunately, the algorithmic and conceptual details that are necessary for state-of-the-art analysis also tend to frustrate accessibility for those new to remote sensing. Several efforts have been made to provide web-based radiative transfer calculations, and these are useful for limited calculations, but analysis of more than a few spectra requires the utilization of home- or server-based computing resources. We present a system that is designed to allow for easier access to radiative transfer models, with implementation on a home computing platform, in the hopes that this system can be utilized and expanded upon in advanced high school and introductory college settings. This learning-by-doing process is aided through the use of several powerful tools. The first is a wikipedia-style introduction to the salient features of radiative transfer that references the seminal works in the field and refers to more complicated calculations and algorithms sparingly. The second feature is a technical forum, commonly referred to as a tiki-wiki, that addresses technical and conceptual questions through public postings, private messages, and a ranked searching routine. Together, these tools may be able to facilitate greater interest in the field of remote sensing.

  13. Thermal Imaging in the Science Classroom

    ERIC Educational Resources Information Center

    Short, Daniel B.

    2012-01-01

    Thermal cameras are useful tools for scientific investigation and for teaching scientific concepts to students in the classroom. Demonstrations of scientific phenomena can be greatly enhanced visually by the use of this cutting-edge technology. (Contains 7 figures.)

  14. Widening the adoption of workflows to include human and human-machine scientific processes

    NASA Astrophysics Data System (ADS)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data that help scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, where, combined with cyber-infrastructure environments that provide on-demand access to data and tools, they result in powerful workbenches for scientists of those communities. The focus in these particular fields, however, has been more on automating rather than documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist's perspective about the process of how she or he would collect, filter, curate, and manipulate data to create the artifacts that are relevant to her/his work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.

  15. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M

    2005-01-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  16. Application of non-traditional stable isotopes in analytical ecogeochemistry assessed by MC ICP-MS--A critical review.

    PubMed

    Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.

  17. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    PubMed

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    The availability of a deep well that penetrates deep into the Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study the metamorphic rocks. One such borehole, the Chinese Continental Scientific Drilling Main Hole, is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ, gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are accurate and adequate enough to be tremendously useful in UHP metamorphic rock analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
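
    A minimal sketch of applying principal component analysis to oxide weight-percent logs (a generic scikit-learn illustration with made-up values, not the study's actual workflow or data):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical dry-weight-percent oxide logs: rows are depth samples,
    # columns are SiO2, Al2O3, Fe2O3, CaO, MgO, K2O.
    oxides = np.array([
        [65.1, 15.2, 4.1, 3.0, 2.1, 3.4],
        [63.8, 16.0, 4.5, 3.3, 2.4, 3.1],
        [52.2, 14.1, 9.8, 8.9, 7.2, 1.0],
        [51.7, 13.8, 10.2, 9.4, 7.6, 0.9],
        [70.3, 14.5, 2.8, 2.1, 1.2, 4.2],
    ])

    # Standardize so each oxide contributes comparably, then keep two components.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(oxides))
    print(scores.round(2))   # per-depth component scores summarizing the oxide logs
    ```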

  18. RIEMS: a software pipeline for sensitive and comprehensive taxonomic classification of reads from metagenomics datasets.

    PubMed

    Scheuch, Matthias; Höper, Dirk; Beer, Martin

    2015-03-03

    Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.

  19. Cloud-based data-proximate visualization and analysis

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2017-04-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge then becomes creating tools which are cloud-ready. The solution to this challenge is provided by Application Streaming, a technology that allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  20. Y0: An innovative tool for spatial data analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Jeremy C.

    1993-08-01

    This paper describes an advanced analysis and visualization tool, called Y0 (pronounced "Why not?!"), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researchers in universities affiliated with UCAR. This paper will explore the key concepts in Y0, describe its utility for remote sensing analysis and visualization, and give a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.

  1. Physical Science Experiments for Scientific Glassblowing Technicians.

    ERIC Educational Resources Information Center

    Tillis, Samuel E.; Donaghay, Herbert C.

    The twenty experiments in this text have been designed to give the scientific glassblowing technician the opportunity to use scientific glass apparatus in the study of physical science. Primary emphasis of these experiments is on the practical application of the physical science program as a working tool for the scientific glassblowing technician.…

  2. The cock, the Academy, and the best scientific journal in the world.

    PubMed

    Romanovsky, Andrej A

    2015-01-01

    The reader is invited to travel to Ancient Greece, contemporary Brazil, and other places in a fantasy search for the best scientific journal. This whimsical search does not rely on the impact factor, the most popular tool used in real life for finding good journals. Instead, it takes advantage of the so-called authority factor, a recently proposed alternative to the impact factor. The authority factor of a particular journal is the mean h-index (Hirsch's index) of the most suitable group of this journal's editors. Having no connection to any major function of scientific journals, and also being arbitrary (which group of editors to select?), this factor is poorly suited for any technical analysis, but it seems to work well for "small-talk" editorials and self-promotion by complacent editors. Interestingly, the highest authority factor we could find belongs to the journal Temperature. This claim, however, should not be taken too seriously.

  3. Explore the virtual side of earth science

    USGS Publications Warehouse

    ,

    1998-01-01

    Scientists have always struggled to find an appropriate technology that could represent three-dimensional (3-D) data, facilitate dynamic analysis, and encourage on-the-fly interactivity. In the recent past, scientific visualization has increased the scientist's ability to visualize information, but it has not provided the interactive environment necessary for rapidly changing the model or for viewing the model in ways not predetermined by the visualization specialist. Virtual Reality Modeling Language (VRML 2.0) is a new environment for visualizing 3-D information spaces and is accessible through the Internet with current browser technologies. Researchers from the U.S. Geological Survey (USGS) are using VRML as a scientific visualization tool to help convey complex scientific concepts to various audiences. Kevin W. Laurent, computer scientist, and Maura J. Hogan, technical information specialist, have created a collection of VRML models available through the Internet at Virtual Earth Science (virtual.er.usgs.gov).

  4. WHAM!: a web-based visualization suite for user-defined analysis of metagenomic shotgun sequencing data.

    PubMed

    Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V

    2018-06-25

    Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains as a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms for this purpose is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication quality figures without the need for command line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis providing a user-friendly interface allowing for easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.

  5. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    NASA Astrophysics Data System (ADS)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to the Lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the inter-operability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can directly be used by general VO tools.

  6. The Interface of Art and Science in the Museum: Disclosing a 4th Dimension of Art Preservation and Connoisseurship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casadio, Franceska

    Drawing from her most recent experience at the Art Institute and past experiences in Italy, Dr. Casadio will discuss examples of applications of scientific analysis to the field of Cultural Heritage, including: 1) The use of instrumental analysis to address fundamental questions regarding artists' techniques, and as an aid to unraveling paint technology, as with the fascinating pre-Columbian pigment Maya Blue; 2) The investigation into deterioration of artifacts, the design of innovative conservation materials and the testing of their durability, exemplified with the case study of the conservation of the facade of the gothic Cathedral of Milan; 3) Development of fine-tuned conservation strategies for the cleaning of Michelangelo's David; 4) The study of the effect of environmental parameters on objects in exhibitions and storage to help design compatible display cases; 5) The role of scientific analysis in matters of authentication and dating. Future trends that increasingly see science as a tool for virtual restoration will be discussed.

  7. The Interface of Art and Science in the Museum: Disclosing a 4th Dimension of Art Preservation and Connoisseurship

    ScienceCinema

    Casadio, Franceska [Art Institute of Chicago, Chicago, Illinois, United States]

    2017-12-09

    Drawing from her most recent experience at the Art Institute and past experiences in Italy, Dr. Casadio will discuss examples of applications of scientific analysis to the field of Cultural Heritage, including: 1) The use of instrumental analysis to address fundamental questions regarding artists' techniques, and as an aid to unraveling paint technology, as with the fascinating pre-Columbian pigment Maya Blue; 2) The investigation into deterioration of artifacts, the design of innovative conservation materials and the testing of their durability, exemplified with the case study of the conservation of the facade of the gothic Cathedral of Milan; 3) Development of fine-tuned conservation strategies for the cleaning of Michelangelo's David; 4) The study of the effect of environmental parameters on objects in exhibitions and storage to help design compatible display cases; 5) The role of scientific analysis in matters of authentication and dating. Future trends that increasingly see science as a tool for virtual restoration will be discussed.

  8. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
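
    The gist of the highly comparative approach can be sketched in a few lines: compute a library of features for every series, then rank features by how well they separate labeled groups. The features and selection criterion below are simplified stand-ins invented for illustration (hctsa itself is a MATLAB toolbox with over 7,700 curated features).

        # Toy "highly comparative" feature extraction and ranking.
        import numpy as np

        FEATURES = {
            "mean": np.mean,
            "std": np.std,
            "acf_lag1": lambda x: np.corrcoef(x[:-1], x[1:])[0, 1],
            "abs_diff_mean": lambda x: np.mean(np.abs(np.diff(x))),
        }

        def feature_matrix(series_list):
            """Rows = time series, columns = features."""
            return np.array([[f(x) for f in FEATURES.values()] for x in series_list])

        def rank_features(F, labels):
            """Rank features by group-mean separation in units of overall std
            (a crude stand-in for hctsa's statistical feature selection)."""
            g0, g1 = F[labels == 0], F[labels == 1]
            score = np.abs(g0.mean(0) - g1.mean(0)) / (F.std(0) + 1e-12)
            order = np.argsort(score)[::-1]
            return [(list(FEATURES)[i], score[i]) for i in order]

        # Toy usage: noisy sines vs. white noise.
        rng = np.random.default_rng(0)
        series = [np.sin(np.linspace(0, 20, 500)) + 0.3 * rng.standard_normal(500)
                  for _ in range(20)] + [rng.standard_normal(500) for _ in range(20)]
        labels = np.array([0] * 20 + [1] * 20)
        print(rank_features(feature_matrix(series), labels))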

  9. Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.

    PubMed

    Cartwright, Hugh M

    2008-01-01

    Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data or the deconvolution of spectra to highly practical applications, such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.

  10. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted a high interest by the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  11. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container-the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
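
    Conceptually, a workflow test of this kind runs the workflow on known inputs and compares each produced output with an expected file. The sketch below illustrates that idea only; the runner callable and the test-definition layout are hypothetical and do not reflect the actual wft4galaxy API or test-definition format.

        # Conceptual workflow-test harness; all names are illustrative.
        import hashlib
        from pathlib import Path

        def checksum(path):
            return hashlib.sha256(Path(path).read_bytes()).hexdigest()

        def run_workflow_test(run_galaxy_workflow, test):
            """`run_galaxy_workflow` is a placeholder callable that submits the
            workflow and returns {output_name: produced_file_path}."""
            produced = run_galaxy_workflow(test["workflow"], test["inputs"])
            failures = []
            for name, expected_path in test["expected_outputs"].items():
                if checksum(produced[name]) != checksum(expected_path):
                    failures.append(name)
            return failures   # empty list == test passed

        test = {
            "workflow": "change_case.ga",
            "inputs": {"InputText": "inputs/text.txt"},
            "expected_outputs": {"OutputText": "expected/uppercase.txt"},
        }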

  12. The ImageJ ecosystem: an open platform for biomedical image analysis

    PubMed Central

    Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available - from commercial to academic, special-purpose to Swiss army knife, small to large - but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  13. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available - from commercial to academic, special-purpose to Swiss army knife, small to large - but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.

  14. ThinkerTools. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2012

    2012-01-01

    "ThinkerTools" is a computer-based program that aims to develop students' understanding of physics and scientific modeling. The program is composed of two curricula for middle school students, "ThinkerTools Inquiry" and "Model-Enhanced ThinkerTools". "ThinkerTools Inquiry" allows students to explore the…

  15. Searching for scientific literacy and critical pedagogy in socioscientific curricula: A critical discourse analysis

    NASA Astrophysics Data System (ADS)

    Cummings, Kristina M.

    The omnipresence of science and technology in our society requires the development of a critical and scientifically literate citizenry. However, the inclusion of socioscientific issues, which are open-ended controversial issues informed by both science and societal factors such as politics, economics, and ethics, does not guarantee the development of these skills. The purpose of this critical discourse analysis is to identify and analyze the discursive strategies used in intermediate science texts and curricula that address socioscientific topics, and the extent to which the discourses are designed to promote or suppress the development of scientific literacy and a critical pedagogy. Three curricula that address the issue of energy and climate change were analyzed using Gee's (2011) building tasks and inquiry tools. The curricula were written by an education organization entitled PreSEES, a corporate-sponsored group called NEED, and a non-profit organization named Oxfam. The analysis found that the PreSEES and Oxfam curricula elevated the significance of climate change while the NEED curriculum deemphasized the issue. The PreSEES and Oxfam curricula promoted the development of scientific literacy while the NEED curriculum suppressed its development. The PreSEES and Oxfam curricula both promoted the development of critical pedagogy; however, only the Oxfam curriculum provided authentic opportunities to enact sociopolitical change. The NEED curriculum suppressed the development of critical pedagogy. From these findings, the following conclusions were drawn. When socioscientific issues are presented alongside the development of scientific literacy and critical pedagogy, curricula allow students to develop fact-based opinions about the issue. However, curricula that address socioscientific issues without the inclusion of these skills minimize the significance of the issue and normalize the hegemonic worldview promoted by the curricula's authors. Based on these findings, additional research is necessary to confirm the connection between the way curricula address a socioscientific issue and develop or suppress scientific literacy. Additionally, further analysis is necessary to confirm the connection between corporate-sponsored curricula and the suppression of socioscientific issues, scientific literacy, and critical pedagogy. Finally, this study addressed only the intended outcomes of the curricula. Further research is necessary to measure the actual impacts of these curricula on students.

  16. A conceptual framework for economic optimization of single hazard surveillance in livestock production chains.

    PubMed

    Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W

    2014-06-01

    Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
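
    The objective half of the concept, step (1), can be sketched in a few lines: from simulated surveillance setups, keep the efficient set, i.e., those not dominated on (performance, cost). The setups and numbers below are invented for illustration and do not come from the paper.

        # Pareto-efficient set of surveillance setups (toy data).
        def efficient_set(setups):
            """setups: list of (name, performance, cost); higher performance
            and lower cost are better. Returns the non-dominated subset."""
            keep = []
            for name, perf, cost in setups:
                dominated = any(p2 >= perf and c2 <= cost and (p2, c2) != (perf, cost)
                                for _, p2, c2 in setups)
                if not dominated:
                    keep.append((name, perf, cost))
            return keep

        setups = [("monthly bulk-milk PCR", 0.70, 10_000),
                  ("quarterly serology", 0.55, 6_000),
                  ("continuous syndromic", 0.80, 25_000),
                  ("annual survey", 0.30, 8_000)]   # dominated by quarterly serology
        print(efficient_set(setups))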

  17. Techniques and Tools for Estimating Ionospheric Effects in Interferometric and Polarimetric SAR Data

    NASA Technical Reports Server (NTRS)

    Rosen, P.; Lavalle, M.; Pi, X.; Buckley, S.; Szeliga, W.; Zebker, H.; Gurrola, E.

    2011-01-01

    The InSAR Scientific Computing Environment (ISCE) is a flexible, extensible software tool designed for the end-to-end processing and analysis of synthetic aperture radar data. ISCE inherits the core of the ROI_PAC interferometric tool, but contains improvements at all levels of the radar processing chain, including a modular and extensible architecture, a new focusing approach, better geocoding of the data, handling of multi-polarization data, radiometric calibration, and estimation and correction of ionospheric effects. In this paper we describe the characteristics of ISCE with emphasis on the ionospheric modules. To detect ionospheric anomalies, ISCE implements the Faraday rotation method using quad-polarimetric images, and the split-spectrum technique using interferometric single-, dual- and quad-polarimetric images. The ability to generate co-registered time series of quad-polarimetric images also makes ISCE an ideal tool for polarimetric-interferometric radar applications.
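
    The split-spectrum technique admits a compact closed form: the unwrapped phases of a low-band and a high-band sub-look interferogram are combined to isolate the dispersive (ionospheric) component at the carrier frequency. The sketch below follows the standard split-range-spectrum derivation and is an illustration, not ISCE's implementation; the example numbers are invented.

        # Dispersive (ionospheric) phase at the carrier f0 from two sub-bands.
        def dispersive_phase(phi_low, phi_high, f_low, f_high, f0):
            """Phases in radians (unwrapped), frequencies in Hz."""
            return (f_low * f_high) / (f0 * (f_high**2 - f_low**2)) * (
                phi_low * f_high - phi_high * f_low)

        # Example: L-band carrier (f0 = 1.27 GHz) with sub-bands at +/- 20 MHz.
        phi_iono = dispersive_phase(phi_low=12.40, phi_high=12.10,
                                    f_low=1.25e9, f_high=1.29e9, f0=1.27e9)
        print(phi_iono)   # ionospheric phase contribution, radians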

  18. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To address this shift, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks, and we will present a suite of tools that encompasses the entire process, from retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be plotted simultaneously in a variety of formats and/or time and spatial resolutions. The software allows one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data through the GUI or through an embedded scripting language based on Lua. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data-type agnostic and can be used for virtually any type of data.

  19. STEM Engagement with NASA's Solar System Treks Portals for Lunar and Planetary Mapping and Modeling

    NASA Technical Reports Server (NTRS)

    Law, E. S.; Day, B. H.

    2018-01-01

    This presentation will provide an overview of the uses and capabilities of NASA's Solar System Treks family of online mapping and modeling portals. While the portals are also designed to support mission planning and scientific research, this presentation will focus on the Science, Technology, Engineering, and Math (STEM) engagement and public outreach capabilities of these web-based suites of data visualization and analysis tools.

  20. A method of hidden Markov model optimization for use with geophysical data sets

    NASA Technical Reports Server (NTRS)

    Granat, R. A.

    2003-01-01

    Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
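
    A minimal sketch of this kind of HMM segmentation, using the hmmlearn package: fit a Gaussian HMM to a one-dimensional record and recover a discrete mode per sample. The synthetic data are illustrative, and the paper's specific optimization of the model fit is not reproduced here.

        # Segment a synthetic geophysical record into two hidden modes.
        import numpy as np
        from hmmlearn import hmm

        rng = np.random.default_rng(1)
        # Quiet background interrupted by a noisier, offset episode.
        x = np.concatenate([rng.normal(0.0, 0.5, 300),
                            rng.normal(3.0, 1.5, 100),
                            rng.normal(0.0, 0.5, 300)]).reshape(-1, 1)

        model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                                n_iter=100, random_state=0)
        model.fit(x)
        states = model.predict(x)      # most likely state sequence (Viterbi)
        print(np.bincount(states))     # samples assigned to each mode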

  1. Explaining as Mediated Action: An Analysis of Pre-Service Teachers' Account of Forces of Inertia in Non-Inertial Frames of Reference

    ERIC Educational Resources Information Center

    de Pereira, Alexsandro Pereira; Lima Junior, Paulo; Rodrigues, Renato Felix

    2016-01-01

    Explaining is one of the most important everyday practices in science education. In this article, we examine how scientific explanations can serve as cultural tools for members of a group of pre-service physics teachers. Specifically, we examine their use of explanations about forces of inertia in non-inertial frames of reference. A basic…

  2. The microcomputer scientific software series 9: user's guide to Geo-CLM: geostatistical interpolation of the historical climatic record in the Lake States.

    Treesearch

    Margaret R. Holdaway

    1994-01-01

    Describes Geo-CLM, a computer application (for Mac or DOS) whose primary aim is to perform multiple kriging runs to interpolate the historic climatic record at research plots in the Lake States. It is an exploration and analysis tool. Additional capabilities include climatic databases, a flexible test mode, cross validation, lat/long conversion, English/metric units,...

  3. The International Conference on Intelligent Biology and Medicine (ICIBM) 2016: from big data to big analytical tools.

    PubMed

    Liu, Zhandong; Zheng, W Jim; Allen, Genevera I; Liu, Yin; Ruan, Jianhua; Zhao, Zhongming

    2017-10-03

    The 2016 International Conference on Intelligent Biology and Medicine (ICIBM 2016) was held on December 8-10, 2016 in Houston, Texas, USA. ICIBM included eight scientific sessions, four tutorials, one poster session, four highlighted talks and four keynotes that covered topics on 3D genomics structural analysis, next-generation sequencing (NGS) analysis, computational drug discovery, medical informatics, cancer genomics, and systems biology. Here, we present a summary of the nine research articles selected from the ICIBM 2016 program for publication in BMC Bioinformatics.

  4. UMAMI: A Recipe for Generating Meaningful Metrics through Holistic I/O Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockwood, Glenn K.; Yoo, Wucherl; Byna, Suren

    I/O efficiency is essential to productivity in scientific computing, especially as many scientific domains become more data-intensive. Many characterization tools have been used to elucidate specific aspects of parallel I/O performance, but analyzing components of complex I/O subsystems in isolation fails to provide insight into critical questions: how do the I/O components interact, what are reasonable expectations for application performance, and what are the underlying causes of I/O performance problems? To address these questions while capitalizing on existing component-level characterization tools, we propose an approach that combines on-demand, modular synthesis of I/O characterization data into a unified monitoring and metrics interface (UMAMI) to provide a normalized, holistic view of I/O behavior. We evaluate the feasibility of this approach by applying it to a month-long benchmarking study on two distinct large-scale computing platforms. We present three case studies that highlight the importance of analyzing application I/O performance in context with both contemporaneous and historical component metrics, and we provide new insights into the factors affecting I/O performance. By demonstrating the generality of our approach, we lay the groundwork for a production-grade framework for holistic I/O analysis.
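
    One way to picture a "normalized, holistic" view is to express today's value of each component metric as a percentile of its own history, so that heterogeneous metrics become comparable side by side. The metric names and numbers below are invented; this is an illustration of the idea, not the UMAMI implementation.

        # Normalize heterogeneous I/O metrics against their own histories.
        import numpy as np

        def percentile_of(history, value):
            """Fraction of historical observations at or below `value`."""
            h = np.asarray(history)
            return float((h <= value).mean())

        history = {
            "app_write_GBps":  [28, 35, 31, 40, 26, 38, 33],
            "server_cpu_load": [0.4, 0.7, 0.5, 0.9, 0.6, 0.55, 0.45],
            "fs_fullness":     [0.60, 0.62, 0.66, 0.70, 0.72, 0.75, 0.78],
        }
        today = {"app_write_GBps": 27, "server_cpu_load": 0.85, "fs_fullness": 0.79}

        for metric, value in today.items():
            pct = percentile_of(history[metric], value)
            print(f"{metric:16s} {value:6} -> {pct:.0%} of history")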

  5. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

    Application performance is hindered by a variety of factors, but most notably by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key to optimizing its performance, and various profiling tools facilitate this understanding. Profiling tools vary in complexity, ease of deployment, profiling performance, and the detail of profiled information. Using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex, large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is built as a plug-in tool for the Valgrind instrumentation framework, with the goal of providing fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose the tool's performance characteristics when combining Gleipnir with external tools, such as the cache simulator Gl CSim, to characterize the tool's overall performance. We describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O analysis because the tool's output consists of trace files. Moreover, the tool depends on the run-time system to provide the necessary infrastructure to expose low-level system detail; we therefore also discuss theoretical benefits that could be achieved if such modules were present.
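
    To make the trace structure concrete, the sketch below aggregates a stream of memory-transaction records by variable. The record format shown is hypothetical, invented only to illustrate the process/thread/function/variable roll-up; it is not Gleipnir's actual trace syntax.

        # Roll up a (hypothetical) memory-transaction trace by variable.
        from collections import Counter

        trace = [
            "L 0x7f3a 8 pid=1 tid=1 fn=stencil var=grid_a",   # load
            "S 0x7f42 8 pid=1 tid=1 fn=stencil var=grid_b",   # store
            "L 0x7f3a 8 pid=1 tid=2 fn=stencil var=grid_a",
        ]

        by_variable = Counter()
        for line in trace:
            fields = dict(f.split("=") for f in line.split()[3:])
            by_variable[fields["var"]] += 1

        print(by_variable.most_common())   # accesses per data structure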

  6. Techniques and Tools for Performance Tuning of Parallel and Distributed Scientific Applications

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; VanderWijngaart, Rob F.; Castagnera, Karen (Technical Monitor)

    1994-01-01

    Performance degradation in scientific computing on parallel and distributed computer systems can be caused by numerous factors. In this half-day tutorial we explain the important methodological issues involved in obtaining codes that have good performance potential. We then discuss the possible obstacles to realizing that potential on contemporary hardware platforms, and give an overview of the software tools currently available for identifying performance bottlenecks. Finally, some realistic examples are used to illustrate the actual use and utility of such tools.

  7. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
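
    The propagation at the core of such prioritization can be sketched as a Random Walk with Restarts: scores flow along column-normalized edges, with probability r of restarting at the query nodes. The small network and restart value below are invented for illustration and do not reflect ProphTools' exact formulation.

        # Random Walk with Restarts on a toy network.
        import numpy as np

        def rwr(W, seeds, r=0.3, tol=1e-10):
            """W: adjacency matrix; seeds: indices of query nodes."""
            Wn = W / np.maximum(W.sum(axis=0, keepdims=True), 1e-12)  # column-normalize
            p0 = np.zeros(W.shape[0]); p0[seeds] = 1.0 / len(seeds)
            p = p0.copy()
            while True:
                p_next = (1 - r) * Wn @ p + r * p0   # walk step plus restart
                if np.abs(p_next - p).sum() < tol:
                    return p_next
                p = p_next

        W = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], float)
        print(rwr(W, seeds=[0]))   # steady-state relevance of each node to node 0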

  8. Science Goal Monitor: Science Goal Driven Automation for NASA Missions

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Pell, Melissa; Matusow, David; Bailyn, Charles

    2004-01-01

    Infusion of automation technologies into NASA's future missions will be essential because of the need to: (1) effectively handle an exponentially increasing volume of scientific data, (2) successfully meet dynamic, opportunistic scientific goals and objectives, and (3) substantially reduce mission operations staff and costs. While much effort has gone into automating routine spacecraft operations to reduce human workload and hence costs, applying intelligent automation to the science side, i.e., science data acquisition, data analysis, and reaction to that analysis in a timely and still scientifically valid manner, has been relatively under-emphasized. In order to introduce science-driven automation in missions, we must be able to capture and interpret the science goals of observing programs, represent those goals in machine-interpretable language, and allow a spacecraft's onboard systems to autonomously react to the scientist's goals. In short, we must teach our platforms to dynamically understand, recognize, and react to the scientists' goals. The Science Goal Monitor (SGM) project at NASA Goddard Space Flight Center is a prototype software tool being developed to determine the best strategies for implementing science goal driven automation in missions. The tools being developed in SGM improve the ability to monitor and react to the changing status of scientific events. The SGM system enables scientists to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of science data to identify occurrences of key events previously specified by the scientist. When an event occurs, the system autonomously coordinates the execution of the scientist's desired reactions. Through SGM, we will improve our understanding of the capabilities needed onboard for success, develop metrics to understand the potential increase in science returns, and develop an operational prototype so that the perceived risks associated with increased use of automation can be reduced.
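
    The monitor-and-react pattern described here reduces to a toy rule engine: scientists register (condition, reaction) pairs, and each incoming data record is checked against every rule. The rule and stream contents below are invented for illustration and are not SGM's goal-representation language.

        # Toy goal monitor: registered rules fire reactions on matching events.
        rules = [
            # "Request a follow-up observation when a source brightens 10x."
            (lambda obs: obs["flux"] > 10 * obs["baseline_flux"],
             lambda obs: print(f"re-point telescope at {obs['target']}")),
        ]

        def monitor(stream):
            for obs in stream:
                for condition, reaction in rules:
                    if condition(obs):
                        reaction(obs)

        monitor([{"target": "V404 Cyg", "flux": 1.2, "baseline_flux": 1.0},
                 {"target": "V404 Cyg", "flux": 15.0, "baseline_flux": 1.0}])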

  9. The Science of Filming Science

    NASA Astrophysics Data System (ADS)

    Harned, D.

    2016-12-01

    Filmmaking is a science. It is observation, data collection, analysis, experimentation, structure, and presentation. Filmmaking is a process that is familiar to scientists. Observation - what we know is gained from observation of the world around us. Film allows us to focus this observation, to pick out details, to understand nuance, to direct seeing. Filmmaking is a tool for learning about the world. Data collection - to study what we observe we must see what it is now, and how it is changing. This element of filmmaking is collecting images, video, documenting events, and gathering information. Analysis - to understand the film data we have collected we must understand connections, correlations, and cause and effect. We ask questions. We discover. Experimentation - film allows us to experiment with different scenarios, to test observations and make models. Structure - what we find or what we want to present must be sorted into a structured format using the tools of writing, filming, and editing. Presentation - the final film is the result of what we observe, what observations we collect, what we learn from those observations, how we test what we've learned, and how we organize and show what we find. Online video is transforming the way we see the world. We now have easy access to lectures by the famous and the obscure; we can observe lab experiments, documentaries of field expeditions, and actually see recent research results. Video is omnipresent in our culture and supplements or even replaces writing in many applications. We can easily present our own scientific results to new and important audiences. Video can do a lot for science and scientists: It can provide an expanded audience for scientific news and information, educate thousands, spread the word about scientific developments, help frame controversial science issues, show real scientists at work in the real world, promote interest in scientific publications, and report on science-agency programs. It can also interest young people in future science careers.

  10. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE PAGES

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...

    2016-07-21

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc: they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (a mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
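
    The template idea can be illustrated with two toy helpers: "sequence" chains tasks, piping each output to the next, while "parallel" fans the same input across independent tasks. These helpers imitate the concept only; consult the Tigres documentation for its actual Task and template API.

        # Toy "sequence" and "parallel" workflow templates.
        from concurrent.futures import ThreadPoolExecutor

        def sequence(tasks, data):
            """Run tasks one after another, piping each output to the next."""
            for task in tasks:
                data = task(data)
            return data

        def parallel(tasks, data):
            """Run independent tasks on the same input concurrently."""
            with ThreadPoolExecutor() as pool:
                futures = [pool.submit(t, data) for t in tasks]
                return [f.result() for f in futures]

        clean  = lambda xs: [x for x in xs if x is not None]
        total  = lambda xs: sum(xs)
        extent = lambda xs: max(xs) - min(xs)

        print(sequence([clean], [3, None, 7, 1]))      # [3, 7, 1]
        print(parallel([total, extent], [3, 7, 1]))    # [11, 6]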

  11. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc: they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (a mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.

  12. An Assessment of Civic Scientific Literacy in Japan: Development of a More Authentic Assessment Task and Scoring Rubric

    ERIC Educational Resources Information Center

    Naganuma, Shotaro

    2017-01-01

    Scientific literacy has been measured by a variety of assessment tools in the past few decades. International surveys such as Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA) emphasize the importance of scientific literacy. Scientific literacy is now regarded as a…

  13. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited to space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, along with automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added, and through open-source collaboration we hope to add readers for additional model types, as well as any further tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  14. Inferring Genetic Ancestry: Opportunities, Challenges, and Implications

    PubMed Central

    Royal, Charmaine D.; Novembre, John; Fullerton, Stephanie M.; Goldstein, David B.; Long, Jeffrey C.; Bamshad, Michael J.; Clark, Andrew G.

    2010-01-01

    Increasing public interest in direct-to-consumer (DTC) genetic ancestry testing has been accompanied by growing concern about issues ranging from the personal and societal implications of the testing to the scientific validity of ancestry inference. The very concept of “ancestry” is subject to misunderstanding in both the general and scientific communities. What do we mean by ancestry? How exactly is ancestry measured? How far back can such ancestry be defined and by which genetic tools? How do we validate inferences about ancestry in genetic research? What are the data that demonstrate our ability to do this correctly? What can we say and what can we not say from our research findings and the test results that we generate? This white paper from the American Society of Human Genetics (ASHG) Ancestry and Ancestry Testing Task Force builds upon the 2008 ASHG Ancestry Testing Summary Statement in providing a more in-depth analysis of key scientific and non-scientific aspects of genetic ancestry inference in academia and industry. It culminates with recommendations for advancing the current debate and facilitating the development of scientifically based, ethically sound, and socially attentive guidelines concerning the use of these continually evolving technologies. PMID:20466090

  15. Scientific Data Stewardship in the 21st Century

    NASA Astrophysics Data System (ADS)

    Mabie, J. J.; Redmon, R.; Bullett, T.; Kihn, E. A.; Conkright, R.; Manley, J.; Horan, K.

    2008-12-01

    The Ionosonde Program at the National Geophysical Data Center (NGDC) serves as a case study in how to approach data stewardship in the 21st century. As the number and sophistication of scientific instruments increase, along with the volume and complexity of data that need to be preserved for future generations, the old approach of simply storing data in a library, physical or electronic, is no longer sufficient to ensure the long-term preservation of our important environmental data. To ensure the data can be accessed, understood, and used by future generations, data stewards must be familiar with the observation process before the data reach the archive and with the scientific applications the data may be called upon to serve. This familiarity is best obtained by active participation. In the NGDC Ionosonde Program team, we strive to maintain activity and expertise in ionosonde field operations and scientific data analysis in addition to our core mission of preserving and distributing data and metadata. We believe this approach produces superior data quality, proper documentation, and evaluation tools for data customers as part of the archive process. We present the Ionosonde Program as an example of modern scientific data stewardship.

  16. Looking Forward to the electronic Geophysical Year

    NASA Astrophysics Data System (ADS)

    Kamide, Y.; Baker, D. N.; Thompson, B.; Barton, C.; Kihn, E.

    2004-12-01

    During the International Geophysical Year (1957-1958), member countries established many new capabilities pursuing the major IGY objectives of collecting geophysical data as widely as possible and providing free access to these data for all scientists around the globe. A key achievement of the IGY was the establishment of a worldwide system of data centers and physical observatories. The worldwide scientific community has now endorsed and is promoting an electronic Geophysical Year (eGY) initiative. The proposed eGY concept would both commemorate the 50th anniversary of the IGY in 2007-2008 and provide a forward impetus to geophysics in the 21st century, similar to that provided by the IGY fifty years ago. The eGY concept advocates the establishment of a series of virtual geophysical observatories now being deployed in cyberspace. We discuss plans to aggregate measurements into a readily accessible database along with analysis, visualization, and display tools that will make information available and useful to the scientific community, to the user community, and to the general public. We are examining the possibilities for near-realtime acquisition of data and utilization of forecast tools in order to provide users with advanced space weather capabilities. This program will provide powerful tools for education and public outreach concerning the connected Sun-Earth system.

  17. Fostering Outreach, Education and Exploration of the Moon Using the Lunar Mapping & Modeling Portal

    NASA Astrophysics Data System (ADS)

    Dodge, K.; Law, E.; Malhotra, S.; Chang, G.; Kim, R. M.; Bui, B.; Sadaqathullah, S.; Day, B. H.

    2014-12-01

    The Lunar Mapping and Modeling Portal (LMMP) [1] is a web-based portal and a suite of interactive visualization and analysis tools that let users access mapped lunar data products (including image mosaics, digital elevation models, etc.) from past and current lunar missions (e.g., Lunar Reconnaissance Orbiter, Apollo, etc.). Originally designed as a mission planning tool for the Constellation Program, LMMP has grown into a generalized suite of tools facilitating a wide range of activities in support of lunar exploration, including public outreach, education, lunar mission planning, and scientific research. LMMP fosters outreach, education, and exploration of the Moon by educators, students, amateur astronomers, and the general public. These efforts are enhanced by Moon Tours, LMMP's mobile application, which makes LMMP's information accessible to people of all ages, putting opportunities for real lunar exploration in the palms of their hands. Our talk will include an overview of LMMP and a demonstration of its technologies (web portals, mobile apps), to show how it serves NASA data as commodities for use by advanced visualization facilities (e.g., planetariums) and how it contributes to improving teaching and learning, increasing scientific literacy of the general public, and enriching STEM efforts. References: [1] http://www.lmmp.nasa.gov

  18. Tool Support for Software Lookup Table Optimization

    DOE PAGES

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation, such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
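
    The underlying transformation can be sketched directly: pretabulate an expensive function over the profiled input domain and replace calls with an index lookup, where table size controls the performance/accuracy tradeoff. The sketch below (nearest-neighbor lookup, no interpolation) illustrates the technique; it is not Mesa's generated code, and the domain and table size are arbitrary.

        # Nearest-neighbor lookup table for an elementary function.
        import numpy as np

        class Lut:
            def __init__(self, fn, lo, hi, size):
                self.lo, self.step = lo, (hi - lo) / (size - 1)
                self.table = fn(np.linspace(lo, hi, size))   # precompute once

            def __call__(self, x):
                i = (np.asarray(x) - self.lo) / self.step
                idx = np.clip(np.rint(i).astype(int), 0, len(self.table) - 1)
                return self.table[idx]

        fast_exp = Lut(np.exp, -5.0, 5.0, size=4096)
        x = np.linspace(-5, 5, 1_000_000)
        err = np.max(np.abs(fast_exp(x) - np.exp(x)))
        print(f"max abs error with 4096 entries: {err:.2e}")   # shrinks as size grows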

  19. Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.

    2015-08-01

    The hybrid visualization and interaction tool EarthScape is presented here. The software can simultaneously display LiDAR point clouds, draped videos with moving footprints, volumetric scientific data (using volume rendering, isosurfaces and slice planes), raster data such as still satellite images, vector data, and 3D models such as buildings or vehicles. The application runs on touch-screen devices such as tablets. The software is based on open-source libraries such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multi-source geo-referenced video streams. When all these components are included, EarthScape will be a multi-purpose platform that provides data analysis, hybrid visualization, and complex interaction at the same time. The software is available on demand for free at france@exelisvis.com.

  20. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require considerable forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace. These tools provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), while executing everything in between in a scalable, task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.
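
    The task-parallel (MPI) pattern mentioned above can be sketched with mpi4py: each rank takes a disjoint share of the input files and runs the same analysis on its share. The file names and the analysis function are placeholders, not CASCADE code.

        # Task-parallel file analysis across MPI ranks (mpi4py).
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        files = [f"cascade_run_{i:03d}.nc" for i in range(100)]   # hypothetical inputs
        my_files = files[rank::size]                              # round-robin split

        def analyze(path):
            return len(path)   # stand-in for a real extremes calculation

        local = [analyze(f) for f in my_files]
        results = comm.gather(local, root=0)   # collect partial results on rank 0
        if rank == 0:
            print(sum(len(r) for r in results), "files analyzed")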

  1. Collaboration pathway(s) using new tools for optimizing operational climate monitoring from space

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2014-10-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a solution requires transforming scientific missions into an optimized, robust 'operational' constellation that addresses the needs of decision makers, scientific investigators and global users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent (2014) rule-based decision-engine modeling runs that targeted optimizing the intended NPOESS architecture become a surrogate for global operational climate monitoring architecture(s). These rule-based systems tools provide valuable insight for global climate architectures through the comparison and evaluation of the alternatives considered and the exhaustive range of trade space explored. A representative optimization of global ECV (essential climate variable) climate monitoring architecture(s) is explored and described in some detail, with thoughts on appropriate rule-based valuations. The optimization tools suggest and support global collaboration pathways and will, we hope, elicit responses from the audience and climate science stakeholders.

  2. Social media: a tool to spread information: a case study analysis of twitter conversation at the Cardiac Society of Australia & New Zealand 61st annual scientific meeting 2013.

    PubMed

    Ferguson, Caleb; Inglis, Sally C; Newton, Phillip J; Cripps, Peter J S; MacDonald, Peter S; Davidson, Patricia M

    2014-01-01

    The World Wide Web has changed the way in which people communicate and consume information. More importantly, this innovation has increased the speed and spread of information. There has been a recent increase in the percentage of cardiovascular professionals, including journals and associations, using Twitter to engage with others and exchange ideas. Evaluating reach and impact at scientific meetings is important in promoting the use of social media. This study evaluated Twitter use during the recent 61st Annual Scientific Meeting of the Cardiac Society of Australia and New Zealand. During the meeting, Symplur was used to curate conversations that were publicly posted with the hashtag #CSANZ2013. The hashtag was monitored, with analysis focused on the influencers, latest tweets, tweet statistics, activity comparisons, and tweet activity during the conference. Additionally, Radian6 social media listening software was used to collect data; a summary is provided. There were 669 total tweets sent from 107 unique Twitter accounts between 9 a.m. on 8th August and 1 p.m. on 11th August. This averaged nine tweets per hour and six tweets per participant. This assisted in sharing ideas and disseminating the findings and conclusions of presenters at the conference, with a total of 1,432,573 potential impressions in Twitter users' tweet streams. This analysis of Twitter conversations during a recent scientific meeting highlights the significance and place of social media within research dissemination and collaboration. Researchers and clinicians should consider using this technology to enhance timely communication of findings. The potential to engage with consumers and enhance shared decision-making should be explored further.

  3. Environmental risk assessment of chemicals and nanomaterials--The best foundation for regulatory decision-making?

    PubMed

    Syberg, Kristian; Hansen, Steffen Foss

    2016-01-15

    Environmental risk assessment (ERA) is often considered the most transparent, objective and reliable decision-making tool for informing the risk management of chemicals and nanomaterials. ERAs are based on the assumption that it is possible to provide accurate estimates of hazard and exposure and, subsequently, to quantify risk. In this paper we argue that, since the quantification of risk is dominated by uncertainties, ERAs do not provide a transparent or objective foundation for decision-making, and they should therefore not be considered a "holy grail" for informing risk management. We build this thesis on the analysis of two case studies (of nonylphenol and nanomaterials) as well as a historical analysis in which we address the scientific foundation for ERAs. The analyses show that ERAs do not properly address all aspects of actual risk, such as the mixture effect and the environmentally realistic risk from nanomaterials. Uncertainties have been recognised for decades, and assessment factors are used to compensate for the lack of realism in ERAs. The assessment factors' values were pragmatically determined, thus lowering the scientific accuracy of ERAs. Furthermore, the default choice of standard assay for assessing a hazard might not always be the most biologically relevant; we therefore argue that an ERA should be viewed as one pragmatic decision-making tool among several, and it should not have a special status for informing risk management. In relation to other relevant decision-making tools, we discuss the use of chemical alternative assessments (CAAs) and the precautionary principle. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. SensorWeb Hub infrastructure for open access to scientific research data

    NASA Astrophysics Data System (ADS)

    de Filippis, Tiziana; Rocchi, Leandro; Rapisardi, Elena

    2015-04-01

    The sharing of research data is a new challenge for the scientific community, which stands to benefit from a large amount of information for addressing environmental and sustainability issues in agricultural and urban contexts. A prerequisite for meeting this challenge is the development of an infrastructure that ensures access, management and preservation of data, along with technical support for coordinated and harmonious data management that, in the framework of Open Data policies, encourages reuse and collaboration. The neogeography and citizens-as-sensors approaches highlight that new data sources need a new set of tools and practices to collect, validate, categorize, and access "crowdsourced" data, which complement the data sets produced in the scientific field and thus feed the overall pool of data available for analysis and research. When the scientific community embraces collaboration and sharing, access and reuse, in order to adopt the open innovation approach, it must redesign and reshape its data management processes: the challenges of technological and cultural innovation, enabled by web 2.0 technologies, lead to a scenario where the sharing of structured and interoperable data becomes the unavoidable building block of a new paradigm of scientific research. In this perspective, the Institute of Biometeorology, CNR, whose aim is to contribute to the sharing and development of research data, has developed the "SensorWebHub" (SWH) infrastructure to support the scientific activities carried out in several research projects at national and international level. It is designed to manage both mobile and fixed open-source meteorological and environmental sensors, in order to integrate the existing agro-meteorological and urban monitoring networks. The proposed architecture uses open-source tools to ensure sustainability in the development and deployment of web applications with geographic features and custom analysis, as requested by the different research projects. The SWH components are organized in a typical client-server architecture and interact across the chain from the sensing process to the representation of results to end users. The web application enables users to view and analyse the data stored in the GeoDB. The interface is designed following Internet browser specifications, allowing the visualization of collected data in different formats (tabular, chart and geographic map). The services for the dissemination of geo-referenced information adopt the OGC specifications. SWH is a bottom-up collaborative initiative to share real-time research data and pave the way for an open innovation approach in scientific research. To date, this framework has been used for several WebGIS and web app applications for environmental monitoring at different temporal and spatial scales.

  5. The Telemetric and Holter ECG Warehouse Initiative (THEW): a Data Repository for the Design, Implementation and Validation of ECG-related Technologies

    PubMed Central

    Couderc, Jean-Philippe

    2011-01-01

    We present an initiative supported by the National Heart, Lung, and Blood Institute and the Food and Drug Administration for the development of a repository containing continuous electrocardiographic information to be shared with the worldwide scientific community. We believe that sharing data reinforces open scientific inquiry. It encourages diversity of analysis and opinion while promoting new research and facilitating the education of new researchers. In this paper, we present the resources available in this initiative for the scientific community. We describe the set of ECG signals currently hosted, and we briefly discuss the associated clinical information (medical history, disease, and study-specific endpoints) and the software tools we propose. Currently, the repository contains more than 250 GB of data from eight clinical studies including healthy individuals and cardiac patients. These data are available for the development, implementation and validation of technologies related to body-surface ECGs. To conclude, the Telemetric and Holter ECG Warehouse (THEW) is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. PMID:21097349

  6. The Use of Lego Technologies in Elementary Teacher Preparation

    NASA Astrophysics Data System (ADS)

    Hadjiachilleos, Stella; Avraamidou, Lucy; Papastavrou, Stavros

    2013-10-01

    The need to reform science teacher preparation programs has been pointed out in research (Bryan and Abell in J Res Sci Teach 36:121-140, 1999; Bryan and Atwater in Sci Educ 8(6):821-839, 2002; Harrington and Hathaway in J Teach Educ 46(4):275-284, 1995). Science teachers are charged with the responsibility of incorporating both cognitive and non-cognitive parameters in their everyday teaching practices. This often results in their reluctance to teach science because they often lack disciplinary and/or pedagogical expertise required to promote science learning. The purpose of this study is to propose an alternative instructional approach in which Lego vehicles were used as a tool to promote pre-service elementary teachers' development and to examine whether there are non-cognitive parameters that promote or obstruct them from using Lego Technologies as a teaching tool. The context of the study was defined by a teacher preparation program of a private university in a small Mediterranean country. A sample of 28 pre-service elementary teachers, working in five 5-6-member groups were involved in scientific inquiries, during which they had to use vehicles in order to solve scientific problems related to concepts such as gear functioning, force, and motion. The nature of their cognitive engagement in the scientific inquiry process, non-cognitive parameters contributing to their cognitive engagement, and the impact of their involvement in the process on their development were examined through qualitative analysis of pre- and post-inquiry interviews, presentations of their solutions to the scientific problems and of their personal reflective journals.

  7. Gaming science: the "Gamification" of scientific thinking.

    PubMed

    Morris, Bradley J; Croker, Steve; Zimmerman, Corinne; Gill, Devin; Romig, Connie

    2013-09-09

    Science is critically important for advancing economics, health, and social well-being in the twenty-first century. A scientifically literate workforce is one that is well-suited to meet the challenges of an information economy. However, scientific thinking skills do not routinely develop and must be scaffolded via educational and cultural tools. In this paper we outline a rationale for why we believe that video games have the potential to be exploited for gain in science education. The premise we entertain is that several classes of video games can be viewed as a type of cultural tool that is capable of supporting three key elements of scientific literacy: content knowledge, process skills, and understanding the nature of science. We argue that there are three classes of mechanisms through which video games can support scientific thinking. First, there are a number of motivational scaffolds, such as feedback, rewards, and flow states that engage students relative to traditional cultural learning tools. Second, there are a number of cognitive scaffolds, such as simulations and embedded reasoning skills that compensate for the limitations of the individual cognitive system. Third, fully developed scientific thinking requires metacognition, and video games provide metacognitive scaffolding in the form of constrained learning and identity adoption. We conclude by outlining a series of recommendations for integrating games and game elements in science education and provide suggestions for evaluating their effectiveness.

  8. Gaming science: the “Gamification” of scientific thinking

    PubMed Central

    Morris, Bradley J.; Croker, Steve; Zimmerman, Corinne; Gill, Devin; Romig, Connie

    2013-01-01

    Science is critically important for advancing economics, health, and social well-being in the twenty-first century. A scientifically literate workforce is one that is well-suited to meet the challenges of an information economy. However, scientific thinking skills do not routinely develop and must be scaffolded via educational and cultural tools. In this paper we outline a rationale for why we believe that video games have the potential to be exploited for gain in science education. The premise we entertain is that several classes of video games can be viewed as a type of cultural tool that is capable of supporting three key elements of scientific literacy: content knowledge, process skills, and understanding the nature of science. We argue that there are three classes of mechanisms through which video games can support scientific thinking. First, there are a number of motivational scaffolds, such as feedback, rewards, and flow states that engage students relative to traditional cultural learning tools. Second, there are a number of cognitive scaffolds, such as simulations and embedded reasoning skills that compensate for the limitations of the individual cognitive system. Third, fully developed scientific thinking requires metacognition, and video games provide metacognitive scaffolding in the form of constrained learning and identity adoption. We conclude by outlining a series of recommendations for integrating games and game elements in science education and provide suggestions for evaluating their effectiveness. PMID:24058354

  9. Measuring Academic Productivity and Changing Definitions of Scientific Impact

    PubMed Central

    Sarli, Cathy C.; Carpenter, Christopher R.

    2016-01-01

    This manuscript provides a brief overview of the history of communication of scientific research and reporting of scientific research impact outcomes. Current day practices are outlined along with examples of how organizations and libraries are providing tools to evaluate and document the impact of scientific research to provide a meaningful narrative suitable for a variety of purposes and audiences. PMID:25438359

  10. Pendulum Phenomena and the Assessment of Scientific Inquiry Capabilities

    ERIC Educational Resources Information Center

    Zachos, Paul

    2004-01-01

    Phenomena associated with the "pendulum" present numerous opportunities for assessing higher order human capabilities related to "scientific inquiry" and the "discovery" of natural law. This paper illustrates how systematic "assessment of scientific inquiry capabilities", using "pendulum" phenomena, can provide a useful tool for classroom teachers…

  11. Data-proximate Visualization via Unidata Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.; Weber, J.

    2016-12-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge then becomes creating tools which are cloud-ready. This challenge is addressed by Application Streaming, a technology that allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and includes a brief discussion of the underlying technologies involved.

  12. INA-Rxiv: The Missing Puzzle in Indonesia’s Scientific Publishing Workflow

    NASA Astrophysics Data System (ADS)

    Rahim, R.; Irawan, D. E.; Zulfikar, A.; Hardi, R.; Arliman S, L.; Gultom, E. R.; Ginting, G.; Wahyuni, S. S.; Mesran, M.; Mahjudin, M.; Saputra, I.; Waruwu, F. T.; Suginam, S.; Buulolo, E.; Abraham, J.

    2018-04-01

    INA-Rxiv is the first Indonesian preprint server, marking a new development initiated by the open science community. This study aimed at describing the development of INA-Rxiv and the conversations around it. It used the analyzer of Inarxiv.id, WhatsApp Group Analyzer, and Twitter Analytics as the tools for data analysis, complemented with observation. The results showed that INA-Rxiv users are growing because of the numerous discussions in social media, e.g. WhatsApp, as well as other positive responses from writers who have been using INA-Rxiv. The perspective of a growth mindset and the implications of the INA-Rxiv movement for filling the gap in accelerating the scientific dissemination process are presented at the end of this article.

  13. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
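
    By way of illustration: the ProvONE traces mentioned above extend W3C PROV. Here is a minimal sketch of such a trace, built with the Python `prov` package rather than the 'recordr' or 'matlab-dataone' libraries the record describes; all identifiers are invented.

        from prov.model import ProvDocument

        doc = ProvDocument()
        doc.add_namespace("ex", "http://example.org/")  # hypothetical namespace

        data   = doc.entity("ex:input.csv")        # input data
        script = doc.entity("ex:analysis_script")  # the code that was run
        run    = doc.activity("ex:run-001")        # one execution of the script
        figure = doc.entity("ex:figure1.png")      # derived product

        doc.used(run, data)                        # the run read the data
        doc.used(run, script)                      # ... and the script
        doc.wasGeneratedBy(figure, run)            # the run produced the figure
        doc.wasDerivedFrom(figure, data)           # the figure traces back to the data

        print(doc.get_provn())                     # human-readable PROV-N trace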

  14. A Pipeline Software Architecture for NMR Spectrum Data Translation

    PubMed Central

    Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.

    2012-01-01

    The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
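
    The record names the pattern but includes no code; as a rough Python sketch of the pipeline idea it describes (records pushed through an ordered chain of translation stages), with hypothetical stage names:

        from typing import Callable, Iterable, List

        # A stage is any function mapping one record (here, a dict) to another.
        Stage = Callable[[dict], dict]

        def run_pipeline(records: Iterable[dict], stages: List[Stage]) -> List[dict]:
            """Push each record through the stages in order."""
            out = []
            for rec in records:
                for stage in stages:
                    rec = stage(rec)
                out.append(rec)
            return out

        # Hypothetical stages standing in for format-translation steps.
        def parse_header(rec: dict) -> dict:
            rec["dimensions"] = int(rec.get("raw_header", "0D").rstrip("D"))
            return rec

        def to_target_format(rec: dict) -> dict:
            rec["format"] = "target"
            return rec

        print(run_pipeline([{"raw_header": "2D"}], [parse_header, to_target_format]))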

  15. A web portal for hydrodynamical, cosmological simulations

    NASA Astrophysics Data System (ADS)

    Ragagnin, A.; Dolag, K.; Biffi, V.; Cadolle Bel, M.; Hammer, N. J.; Krukau, A.; Petkova, M.; Steinborn, D.

    2017-07-01

    This article describes a data centre hosting a web portal for accessing and sharing the output of large, cosmological, hydro-dynamical simulations with a broad scientific community. It also allows users to receive related scientific data products by directly processing the raw simulation data on a remote computing cluster. The data centre has a multi-layer structure: a web portal, a job control layer, a computing cluster and an HPC storage system. The outer layer enables users to choose an object from the simulations. Objects can be selected by visually inspecting 2D maps of the simulation data, by performing highly compound, elaborate queries, or graphically by plotting arbitrary combinations of properties. The user can then run analysis tools on the chosen object; these services operate directly on the raw simulation data. The job control layer is responsible for handling and performing the analysis jobs, which are executed on the computing cluster. The innermost layer is formed by an HPC storage system which hosts the large, raw simulation data. The following services are available for users: (I) CLUSTERINSPECT visualizes properties of member galaxies of a selected galaxy cluster; (II) SIMCUT returns the raw data of a sub-volume around a selected object from a simulation, containing all the original, hydro-dynamical quantities; (III) SMAC creates idealized 2D maps of various physical quantities and observables of a selected object; (IV) PHOX generates virtual X-ray observations with specifications of various current and upcoming instruments.

  16. STEREO in-situ data analysis

    NASA Astrophysics Data System (ADS)

    Schroeder, P.; Luhmann, J.; Davis, A.; Russell, C.

    STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long-duration, detailed observations of 1 AU magnetic field structures, plasma and suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument will make plasma ion composition measurements, completing STEREO's comprehensive in-situ perspective. Stereoscopic 3D information from the STEREO SECCHI imagers and SWAVES radio experiment will make it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICMEs) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots is being developed, which will integrate STEREO's in-situ data with data from a variety of other missions, including WIND and ACE. Also, an application program interface (API) will be provided, allowing users to create custom software that ties directly into STEREO's data set. The API will allow for more advanced forms of data mining than currently available through most data web services. A variety of data access techniques and the development of cross-spacecraft data analysis tools will allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and therefore to maximize the scientific return.

  17. Topological and Geometric Tools for the Analysis of Complex Networks

    DTIC Science & Technology

    2013-10-01

    [Report documentation page; only fragments survive extraction.] Contract/Grant: FA9550-09-1-0090. Authors: Ali Jadbabaie (Penn), Shing-Tung Yau (Harvard), Fan Chung. Performing organization: University of Pennsylvania, 34th and Spruce Street, Philadelphia 19104-6303. Sponsoring/monitoring agency: Air Force Office of Scientific Research, 875 North Randolph Street.

  18. A graphics approach in the design of the dual air density Explorer satellites

    NASA Technical Reports Server (NTRS)

    Mcdougal, D. S.

    1975-01-01

    A computer program was developed to generate a graphics display of the Dual Air Density (DAD) Explorer satellites which aids in the engineering and scientific design. The program displays a two-dimensional view of both spacecraft and their surface features from any direction. The graphics have been an indispensable tool in the design, analysis, and understanding of the critical locations of the various surface features for both satellites.

  19. Design Concept for the Advanced Radar Test Bed (ARTB). Volume 2. Appendices.

    DTIC Science & Technology

    1994-12-31

    [Report documentation page; OCR residue, largely unrecoverable.] Appendix A (Statement of Work) to the ARTB Technical Report. A legible fragment cites "…Analysis from Theory to Software" (A. K. Peters Ltd, May 1994), which allows wavelets to be seen as [system] implementation tools, rather than mathematical…

  20. [Algorithms of artificial neural networks--practical application in medical science].

    PubMed

    Stefaniak, Bogusław; Cholewiński, Witold; Tarkowska, Anna

    2005-12-01

    Artificial Neural Networks (ANN) may serve as a tool alternative and complementary to typical statistical analysis. However, in spite of the many ready-to-use computer applications of various ANN algorithms, artificial intelligence is relatively rarely applied to data processing. This paper presents practical aspects of the scientific application of ANN in medicine using widely available algorithms. The main steps of an ANN analysis are discussed, from material selection and its division into groups through to the quality assessment of the obtained results. The most frequent, typical sources of error, as well as a comparison of the ANN method with modeling by regression analysis, are also described.
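
    The paper names no specific software; as a minimal sketch of the workflow it outlines (select material, divide it into groups, train, assess quality), here is the same sequence using scikit-learn's multilayer perceptron and a stand-in dataset:

        from sklearn.datasets import load_breast_cancer
        from sklearn.metrics import classification_report
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Stand-in for the "material": a bundled diagnostic dataset.
        X, y = load_breast_cancer(return_X_y=True)

        # Divide the material into training and test groups.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, stratify=y, random_state=0)

        # Scale inputs, then fit a small feed-forward ANN.
        model = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0))
        model.fit(X_train, y_train)

        # Quality assessment of the obtained results on the held-out group.
        print(classification_report(y_test, model.predict(X_test)))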

  1. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex scientific analyses can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  2. Energy and Exergy Analysis of Vapour Absorption Refrigeration Cycle—A Review

    NASA Astrophysics Data System (ADS)

    Kanabar, Bhaveshkumar Kantilal; Ramani, Bharatkumar Maganbhai

    2016-07-01

    In recent years, the energy crisis and growing energy consumption have become global problems that restrict sustainable growth. In this scenario, the scientific recovery and utilization of various kinds of waste heat become very important. Waste heat can be utilized in many ways, and one of the best practices is to use it to drive a vapour absorption refrigeration system. To ensure efficient working of the absorption cycle and optimum utilization of heat, exergy is the best analysis tool. This paper provides a comprehensive picture of the research and development of absorption refrigeration technology, with practical and theoretical analyses of different arrangements of the cycle.
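
    For reference, the textbook relations underlying such an exergy analysis, in LaTeX notation (T_0 denotes the dead-state temperature; these formulas are standard thermodynamics, not taken from the paper under review):

        % Exergy carried by heat \dot{Q} supplied at temperature T:
        \dot{X}_Q = \left(1 - \frac{T_0}{T}\right)\dot{Q}

        % Exergy destruction due to irreversibility (Gouy-Stodola theorem):
        \dot{X}_{\mathrm{dest}} = T_0\,\dot{S}_{\mathrm{gen}}

        % Second-law (exergetic) efficiency of a refrigeration cycle:
        \eta_{II} = \frac{\mathrm{COP}}{\mathrm{COP}_{\mathrm{rev}}}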

  3. The Gamma-Ray Burst ToolSHED is Open for Business

    NASA Astrophysics Data System (ADS)

    Giblin, Timothy W.; Hakkila, Jon; Haglin, David J.; Roiger, Richard J.

    2004-09-01

    The GRB ToolSHED, a Gamma-Ray Burst SHell for Expeditions in Data-Mining, is now online and available via a web browser to all in the scientific community. The ToolSHED is an online web utility that contains pre-processed burst attributes of the BATSE catalog and a suite of induction-based machine learning and statistical tools for classification and cluster analysis. Users create their own login account and study burst properties within user-defined multi-dimensional parameter spaces. Although new GRB attributes are periodically added to the database for user selection, the ToolSHED has a feature that allows users to upload their own burst attributes (e.g. spectral parameters, etc.) so that additional parameter spaces can be explored. A data visualization feature using GNUplot and web-based IDL has also been implemented to provide interactive plotting of user-selected session output. In an era in which GRB observations and attributes are becoming increasingly more complex, a utility such as the GRB ToolSHED may play an important role in deciphering GRB classes and understanding intrinsic burst properties.

  4. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  5. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boden, Thomas A; Krassovski, Misha B; Yang, Bai

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interest in climate change data and information worldwide has grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also the scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all, and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification). CDIAC has provided data management support for numerous long-term measurement projects crucial to climate change science. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. We share our approaches to satisfying the challenges of delivering AmeriFlux data worldwide to benefit others with similar challenges handling climate change data, to further heighten awareness and use of an outstanding ecological data resource, and to highlight expanded software engineering applications being used for climate change measurement data.

  6. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs is best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  7. Use of a metadata documentation and search tool for large data volumes: The NGEE arctic example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devarakonda, Ranjeet; Hook, Leslie A; Killeffer, Terri S

    The Online Metadata Editor (OME) is a web-based tool to help document scientific data in a well-structured, popular scientific metadata format. In this paper, we will discuss the newest tool that Oak Ridge National Laboratory (ORNL) has developed to generate, edit, and manage metadata, and how it is helping data-intensive science centers and projects, such as the U.S. Department of Energy's Next Generation Ecosystem Experiments (NGEE) in the Arctic, to prepare metadata and make their big data produce big science and lead to new discoveries.

  8. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    NASA Astrophysics Data System (ADS)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  9. The difficulties of systematic reviews.

    PubMed

    Westgate, Martin J; Lindenmayer, David B

    2017-10-01

    The need for robust evidence to support conservation actions has driven the adoption of systematic approaches to research synthesis in ecology. However, applying systematic review to complex or open questions remains challenging, and this task is becoming more difficult as the quantity of scientific literature increases. We drew on the science of linguistics for guidance as to why the process of identifying and sorting information during systematic review remains so labor intensive, and to provide potential solutions. Several linguistic properties of peer-reviewed corpora, including nonrandom selection of review topics, small-world properties of semantic networks, and spatiotemporal variation in word meaning, greatly increase the effort needed to complete the systematic review process. Conversely, the resolution of these semantic complexities is a common motivation for narrative reviews, but this process is rarely enacted with the rigor applied during linguistic analysis. Therefore, linguistics provides a unifying framework for understanding some key challenges of systematic review and highlights 2 useful directions for future research. First, in cases where semantic complexity generates barriers to synthesis, ecologists should consider drawing on existing methods, such as natural language processing or the construction of research thesauri and ontologies, that provide tools for mapping and resolving that complexity. These tools could help individual researchers classify research material in a more robust manner and provide valuable guidance for future researchers on that topic. Second, a linguistic perspective highlights that scientific writing is a rich resource worthy of detailed study, an observation that can sometimes be lost during the search for data during systematic review or meta-analysis. For example, mapping semantic networks can reveal redundancy and complementarity among scientific concepts, leading to new insights and research questions. Consequently, wider adoption of linguistic approaches may facilitate improved rigor and richness in research synthesis. © 2017 Society for Conservation Biology.
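
    As a toy illustration of the semantic-network mapping mentioned above (not code from the paper): terms become nodes, and co-occurrence within an abstract becomes a weighted edge. The three-line corpus is invented.

        from collections import Counter
        from itertools import combinations

        abstracts = [
            "habitat fragmentation reduces species richness",
            "species richness responds to habitat connectivity",
            "connectivity and fragmentation shape dispersal",
        ]

        edges = Counter()
        for text in abstracts:
            terms = sorted(set(text.lower().split()))
            edges.update(combinations(terms, 2))  # one edge per co-occurring pair

        # Heavily weighted edges hint at redundant or complementary concepts.
        for (a, b), weight in edges.most_common(5):
            print(f"{a} -- {b}: {weight}")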

  10. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on Europe's existing computational modelling and data analysis centers, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, and 3) continuously developing and rapidly upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for distant interactive communication between planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated, interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Providing demonstrators of the coordinated use of high performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.

  11. Guidelines for a Scientific Approach to Critical Thinking Assessment

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Murtagh, Michael P.

    2012-01-01

    Assessment of student learning outcomes can be a powerful tool for improvement of instruction when a scientific approach is taken; unfortunately, many educators do not take full advantage of this approach. This article examines benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning,…

  12. Martian Boneyards: Scientific Inquiry in an MMO Game

    ERIC Educational Resources Information Center

    Asbell-Clarke, Jodi; Edwards, Teon; Rowe, Elizabeth; Larsen, Jamie; Sylvan, Elisabeth; Hewitt, Jim

    2012-01-01

    This paper reports on research of a game designed for scientific inquiry in a new and publicly available massively-multiplayer online environment (MMO). Educators and game designers worked together to create a highly immersive environment, a compelling storyline, and research-grounded tools for scientific inquiry within the game. The designers…

  13. Epistemologies in Practice: Making Scientific Practices Meaningful for Students

    ERIC Educational Resources Information Center

    Berland, Leema K.; Schwarz, Christina V.; Krist, Christina; Kenyon, Lisa; Lo, Abraham S.; Reiser, Brian J.

    2016-01-01

    Recent research and policy documents call for engaging students and teachers in scientific practices such that the goal of science education shifts from students "knowing" scientific and epistemic ideas, to students "developing and using" these understandings as tools to make sense of the world. This perspective pushes students…

  14. An efficient framework for Java data processing systems in HPC environments

    NASA Astrophysics Data System (ADS)

    Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül

    2011-11-01

    Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).

  15. Multiscale recurrence analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Marwan, N.; Kurths, J.

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales that can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale approach, which takes spatial patterns of different scales and with different rhythms into account. This mapogram approach thus promises new insights into problems of climatology, ecology, and medicine.
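
    A minimal numpy sketch of the construction described above: pairwise similarities between spatial snapshots thresholded into a recurrence matrix. Plain Euclidean distance stands in here for the paper's mapogram measure, and the data are random toy "maps".

        import numpy as np

        rng = np.random.default_rng(0)
        snapshots = rng.random((200, 16 * 16))  # 200 flattened 16x16 toy maps

        # Pairwise distance between every pair of time points.
        d = np.linalg.norm(snapshots[:, None, :] - snapshots[None, :, :], axis=-1)

        eps = np.quantile(d, 0.1)       # recurrence threshold (10% quantile)
        R = (d <= eps).astype(int)      # R[i, j] = 1 where states i and j recur

        # Recurrence rate: the simplest recurrence quantification measure.
        print("recurrence rate:", R.mean())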

  16. Multiscale recurrence analysis of spatio-temporal data.

    PubMed

    Riedl, M; Marwan, N; Kurths, J

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales that can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale approach, which takes spatial patterns of different scales and with different rhythms into account. This mapogram approach thus promises new insights into problems of climatology, ecology, and medicine.

  17. Salt, time, and metaphor: examining norms in scientific culture

    NASA Astrophysics Data System (ADS)

    Brady, Anna G.

    2017-06-01

    As has been widely discussed, the National Research Council's (NRC) current policy in United States education advocates supporting students toward acquiring skills to engage in scientific practices. NRC policy also suggests that supporting students in the practices of science may require different approaches than what is required for supporting student engagement with scientific content. Further, acquiring skills in scientific practices is not limited to gaining proficiency in utilizing tools that support scientific inquiry: students must also understand how to interpret information generated from such tools. These tools of scientific practices are embedded within scientific culture, which, from Sewell's perspective, comprises both practice and semiotic code (symbols and meanings). To become scientifically literate, students must learn to utilize this code in practice. Author Germà Garcia-Belmonte identifies one example of learning to utilize the semiotic code in scientific practice and considers challenges faced by undergraduate physics and engineering students within that context. Garcia-Belmonte observes that students struggle to interpret symbols and meaning (the visual display generated) while engaging in practice (utilizing an oscilloscope) and posits that two culturally bound, competing linguistic metaphors of time may be the cause. Ultimately, however, the author does not explore beyond hypotheses. Although his theory may be correct, the paper serves as a reminder of the responsibility we have to students. As educators, it is useful and beneficial to make observations and develop theories surrounding why our students struggle. However, in addition to theorizing on why, for example, a particular scientific norm might present challenges for our students, we must remain mindful that challenges may not be uniform and may vary considerably according to students' culture(s). Engaging with students and soliciting specific information regarding the challenges they face allows us, as educators, to both examine whether students' reported challenges align or conflict with our own perceptions of those challenges, and subsequently devise and test methods toward supporting students in overcoming their challenges.

  18. Development of a practical tool to measure the impact of publications on the society based on focus group discussions with scientists.

    PubMed

    Niederkrotenthaler, Thomas; Dorner, Thomas E; Maier, Manfred

    2011-07-25

    A 'societal impact factor' that complements the scientific impact factor would contribute to a more comprehensive evaluation of scientific research. In order to develop a practical tool for its assessment, it is important to learn about perceptions of scientists on how to measure a societal impact factor. This qualitative study presents the development of a practical tool to measure the societal impact of publications based on 8 focus group discussions with 24 biomedical scientists at the Medical University Vienna between May 2008 and May 2009. Topics focused on (1) features of an ideal tool, (2) criteria that should be considered in the assessment, and (3) the identification of practical pitfalls. In an iterative exercise involving the repeated application of the drafted tool to scientific papers, criteria for the assessment were refined. A small-scale exercise to evaluate the tool in terms of its comprehensibility, relevance and practicability was conducted using questionnaires for 6 external experts in leading positions of public health, and yielded acceptable results. The tool developed consists of three quantitative dimensions, that is (1) the aim of a publication, (2) the efforts of the authors to translate their research results, and, if translation was accomplished, (3) (a) the size of the area where translation was accomplished (regional, national or international), (b) its status (preliminary versus permanent) and (c) the target group of the translation (individuals, subgroup of population, total population). Focus group discussions with scientists suggested that the societal impact factor of a publication should consider the effect of the publication in a wide set of non-scientific areas, but also the motivation behind the publication, and efforts by the authors to translate their findings. The proposed tool provides some valuable insights for further research and practical applications in the topic area.

  19. Development of a practical tool to measure the impact of publications on the society based on focus group discussions with scientists

    PubMed Central

    2011-01-01

    Background A 'societal impact factor' that complements the scientific impact factor would contribute to a more comprehensive evaluation of scientific research. In order to develop a practical tool for its assessment, it is important to learn about perceptions of scientists on how to measure a societal impact factor. Methods This qualitative study presents the development of a practical tool to measure the societal impact of publications based on 8 focus group discussions with 24 biomedical scientists at the Medical University Vienna between May 2008 and May 2009. Topics focused on (1) features of an ideal tool, (2) criteria that should be considered in the assessment, and (3) the identification of practical pitfalls. In an iterative exercise involving the repeated application of the drafted tool to scientific papers, criteria for the assessment were refined. A small-scale exercise to evaluate the tool in terms of its comprehensibility, relevance and practicability was conducted using questionnaires for 6 external experts in leading positions of public health, and yielded acceptable results. Results The tool developed consists of three quantitative dimensions, that is (1) the aim of a publication, (2) the efforts of the authors to translate their research results, and, if translation was accomplished, (3) (a) the size of the area where translation was accomplished (regional, national or international), (b) its status (preliminary versus permanent) and (c) the target group of the translation (individuals, subgroup of population, total population). Conclusions Focus group discussions with scientists suggested that the societal impact factor of a publication should consider the effect of the publication in a wide set of non-scientific areas, but also the motivation behind the publication, and efforts by the authors to translate their findings. The proposed tool provides some valuable insights for further research and practical applications in the topic area. PMID:21787432

  20. Scientific Overview /Regional Analyses and Approaches

    EPA Science Inventory

    A workshop, Tools for Assessing Stream Dissolved Minerals, will introduce approaches and EPA tools for regional and site specific development of water quality criteria based on observations from Arkansas streams. In this presentation regional approaches and tools are described. ...

  1. ADOMA: A Command Line Tool to Modify ClustalW Multiple Alignment Output.

    PubMed

    Zaal, Dionne; Nota, Benjamin

    2016-01-01

    We present ADOMA, a command line tool that produces alternative outputs from ClustalW multiple alignments of nucleotide or protein sequences. ADOMA can simplify the output of alignments by showing only the different residues between sequences, which is often desirable when only small differences such as single nucleotide polymorphisms are present (e.g., between different alleles). Another feature of ADOMA is that it can enhance the ClustalW output by coloring the residues in the alignment. This tool is easily integrated into automated Linux pipelines for next-generation sequencing data analysis, and may be useful for researchers in a broad range of scientific disciplines including evolutionary biology and biomedical sciences. The source code is freely available at https://sourceforge.net/projects/adoma/. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
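
    ADOMA's difference-only view can be approximated in a few lines with Biopython; a sketch under that assumption (this is an independent illustration, not ADOMA's implementation, and the input file name is hypothetical):

        from Bio import AlignIO

        aln = AlignIO.read("example.aln", "clustal")  # a ClustalW alignment file

        for i in range(aln.get_alignment_length()):
            column = aln[:, i]          # residues of all sequences at position i
            if len(set(column)) > 1:    # keep only columns where sequences differ
                print(f"pos {i + 1}: {column}")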

  2. DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures.

    PubMed

    Mazandu, Gaston K; Mulder, Nicola J

    2013-09-25

    The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years, providing tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations, as provided by the Gene Ontology Annotation (GOA) project, to precompute GO term information content (IC), enabling rapid response to user queries. The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches, to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis.
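
    For orientation, the information content on which such measures build is IC(t) = -log p(t), with p(t) the annotation frequency of term t. A toy Python sketch with invented GO IDs; propagation of counts to ancestor terms, which real GO IC calculations require, is omitted for brevity:

        import math
        from collections import Counter

        annotations = {                      # protein -> annotated GO terms (toy)
            "P1": {"GO:0008150", "GO:0003674"},
            "P2": {"GO:0008150", "GO:0016301"},
            "P3": {"GO:0008150"},
        }

        counts = Counter(t for terms in annotations.values() for t in terms)
        total = sum(counts.values())
        ic = {t: -math.log(n / total) for t, n in counts.items()}

        for term, value in sorted(ic.items(), key=lambda kv: kv[1]):
            print(f"{term}: IC = {value:.3f}")  # frequent terms get low IC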

  3. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi

    2006-06-01

    We explain the simulation tools of the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined within a few years. It is therefore very important to simulate the data stream generated by astrometric fields in JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object-Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.

  4. Nanohole optical tweezers in heterogeneous mixture analysis

    NASA Astrophysics Data System (ADS)

    Hacohen, Noa; Ip, Candice J. X.; Laxminarayana, Gurunatha K.; DeWolf, Timothy S.; Gordon, Reuven

    2017-08-01

    Nanohole optical trapping has been shown to analyze proteins at the single-molecule level using pure samples. The next step is to detect and study single molecules in dirty samples. We demonstrate that, using our double-nanohole optical tweezing configuration, single particles in an egg white solution can be classified when trapped. Different-sized molecules produce different signal variations in their trapped state, allowing the proteins to be statistically characterized. Root-mean-squared variation and trap stiffness are used on the trapped signals to distinguish between the different proteins. This method of isolating and identifying single molecules in heterogeneous samples has great potential to become a reliable tool within the biomedical and scientific communities.

  5. Reference management: A critical element of scientific writing

    PubMed Central

    Kali, Arunava

    2016-01-01

    With the rapid growth of medical science, the volume of scientific writing contributing to the medical literature has increased significantly in recent years. Owing to the considerable variation in formatting among different citation styles, strict adherence to accurate referencing done manually is labor intensive and challenging. However, the introduction of referencing tools has decreased this complexity to a great extent. These software tools have advanced over time to include newer features that support effective reference management. Since scientific writing is an essential component of the medical curriculum, it is imperative for medical graduates to understand the various referencing systems so as to make effective use of these tools in their dissertations and future research. PMID:26952149

  6. Reference management: A critical element of scientific writing.

    PubMed

    Kali, Arunava

    2016-01-01

    With the rapid growth of medical science, the volume of scientific writing contributing to the medical literature has increased significantly in recent years. Owing to the considerable variation in formatting across different citation styles, strict adherence to accurate referencing done manually is labor intensive and challenging. However, the introduction of referencing tools has greatly reduced this complexity. These software tools have advanced over time to include newer features that support effective reference management. Since scientific writing is an essential component of the medical curriculum, it is imperative for medical graduates to understand the various referencing systems so they can make effective use of these tools in their dissertations and future research.

  7. Development of a Data Citations Database for an Interdisciplinary Data Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; Downs, R. R.; Schumacher, J.; Gerard, A.

    2017-12-01

    The scientific community has long depended on consistent citation of the scientific literature to enable traceability, support replication, and facilitate analysis and debate about scientific hypotheses, theories, assumptions, and conclusions. However, only in the past few years has the community focused on consistent citation of scientific data, e.g., through the application of Digital Object Identifiers (DOIs) to data, the development of peer-reviewed data publications, community principles and guidelines, and other mechanisms. This means that, moving ahead, it should be easier to identify and track data citations and conduct systematic bibliometric studies. However, this still leaves the problem that many legacy datasets and past citations lack DOIs, making it difficult to develop a historical baseline or assess trends. With this in mind, the NASA Socioeconomic Data and Applications Center (SEDAC) has developed a searchable citations database, containing more than 3,400 citations of SEDAC data and information products over the past 20 years. These citations were collected through various indices and search tools and in some cases through direct contacts with authors. The citations come from a range of natural, social, health, and engineering science journals, books, reports, and other media. The database can be used to find and extract citations filtered by a range of criteria, enabling quantitative analysis of trends, intercomparisons between data collections, and categorization of citations by type. We present a preliminary analysis of citations for selected SEDAC data collections, in order to establish a baseline and assess options for ongoing metrics to track the impact of SEDAC data on interdisciplinary science. We also present an analysis of the uptake of DOIs within data citations reported in published studies that used SEDAC data.

  8. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
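
    The kind of network analysis that such a plugin architecture makes easy is sketched below using the third-party networkx package (version 2 or later, which provides from_numpy_array) and a random adjacency matrix standing in for a real connectome; it deliberately does not use the toolkit's own API.

      import numpy as np
      import networkx as nx

      # Random symmetric connectivity matrix standing in for a real connectome that
      # would normally be read from a Connectome File Format container.
      rng = np.random.default_rng(1)
      adjacency = rng.random((8, 8))
      adjacency = (adjacency + adjacency.T) / 2
      np.fill_diagonal(adjacency, 0.0)

      G = nx.from_numpy_array(adjacency)       # weighted, undirected graph

      # Typical graph-theoretic summaries used in connectome analysis
      print("node strengths:", dict(G.degree(weight="weight")))
      print("global efficiency:", nx.global_efficiency(G))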

  9. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  10. Building Bridges Between IPY Scientists and the Educational Community: A Spectrum of IPY Education and Outreach Activities

    NASA Astrophysics Data System (ADS)

    Ledley, T. S.; Dahlman, L.; McAuliffe, C.; Domenico, B.; Taber, M. R.

    2006-12-01

    The International Polar Year is an opportunity to simultaneously increase our scientific understanding of the polar regions and to engage the next generation of Earth scientists and socially responsible citizens. However, building the bridge between the scientific community, which conducts the research, and the education community, which conveys that information to students, requires specific and continuing efforts. The Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet), together with an accompanying spectrum of activities encompassing the development of materials that provide access to and understanding of IPY data and knowledge, and teacher professional development that facilitates the effective use of these materials with students, can help build that bridge. The EET is an online resource that provides an easy way for educators to learn how to use Earth science datasets and data analysis tools to convey science concepts. Modules (called chapters) in the EET provide step-by-step instructions for accessing and analyzing these datasets within compelling case studies, and provide pedagogical information to help educators use the data with their students. New EET chapters, featuring IPY data, can be developed through the use of an EET chapter template that standardizes the content and structure of each chapter. The initiation of new chapters can be facilitated through our Data in Education Workshops (previously DLESE Data Services Workshops, http://swiki.dlese.org/2006-dataservicesworkshop/). During these workshops, IPY data providers, analysis tool specialists, IPY scientists, curriculum developers, and educators participate on teams of 5-6 members to create an outline of a new EET chapter featuring the IPY data and analysis tools represented on the team. New chapters are then completed by a curriculum developer following the workshop. Use of the IPY EET chapters will be facilitated by a range of professional development activities, from two 2-hour telecon-online workshops over the period of a month to a year-long professional development program that includes telecon-online workshops, a two-week summer workshop, follow-up online discussions and one-day meetings. In this paper we discuss the EET and the spectrum of activities that can help build a bridge between the IPY scientific community and future scientists and socially responsible citizens.

  11. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  12. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data that are discoverable in digital libraries, and installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  13. Bringing values and deliberation to science communication.

    PubMed

    Dietz, Thomas

    2013-08-20

    Decisions always involve both facts and values, whereas most science communication focuses only on facts. If science communication is intended to inform decisions, it must be competent with regard to both facts and values. Public participation inevitably involves both facts and values. Research on public participation suggests that linking scientific analysis to public deliberation in an iterative process can help decision making deal effectively with both facts and values. Thus, linked analysis and deliberation can be an effective tool for science communication. However, challenges remain in conducting such processes at the national and global scales, in enhancing trust, and in reconciling diverse values.

  14. Non-negative Tensor Factorization for Robust Exploratory Big-Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian; Vesselinov, Velimir Valentinov; Djidjev, Hristo Nikolov

    Currently, large multidimensional datasets are being accumulated in almost every field. Data are: (1) collected by distributed sensor networks in real time all over the globe, (2) produced by large-scale experimental measurements or engineering activities, (3) generated by high-performance simulations, and (4) gathered by electronic communications and social-network activities, etc. Simultaneous analysis of these ultra-large heterogeneous multidimensional datasets is often critical for scientific discoveries, decision-making, emergency response, and national and global security. The importance of such analyses mandates the development of the next generation of robust machine learning (ML) methods and tools for big-data exploratory analysis.
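
    A minimal sketch of a non-negative tensor (CP/PARAFAC) factorization is shown below. It assumes the third-party tensorly package and a recent release in which non_negative_parafac returns a (weights, factors) pair; the three-way data array is synthetic and purely illustrative.

      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import non_negative_parafac

      # Hypothetical 3-way dataset, e.g. (sensors x time points x variables)
      rng = np.random.default_rng(0)
      data = tl.tensor(rng.random((20, 30, 10)))

      # Rank-5 non-negative CP (PARAFAC) decomposition: the factor matrices give
      # additive, non-negative "parts" that are easier to interpret than raw data.
      weights, factors = non_negative_parafac(data, rank=5, n_iter_max=200)

      for mode, factor in enumerate(factors):
          print(f"mode {mode}: factor matrix of shape {factor.shape}")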

  15. Metabolic modelling and flux analysis of microorganisms from the Atacama Desert used in biotechnological processes.

    PubMed

    Razmilic, Valeria; Castro, Jean Franco; Marchant, Francisca; Asenjo, Juan A; Andrews, Barbara

    2018-02-02

    Metabolic modelling is a useful tool that enables the rational design of metabolic engineering experiments and the study of the unique capabilities of biotechnologically important microorganisms. The extreme abiotic conditions of the Atacama Desert have selected for microbial diversity with exceptional characteristics that can be applied in the mining industry for bioleaching processes and for the production of specialised metabolites with antimicrobial, antifungal, antiviral and antitumoral activities, among others. In this review we summarise the scientific data available on the use of metabolic modelling and flux analysis to improve the performance of Atacama Desert microorganisms in biotechnological applications.

  16. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism for integrating diverse types of available resources (databases, servers, software applications and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss existing workflow systems and the trends in applications of workflow-based systems.

  17. PYCHEM: a multivariate analysis package for python.

    PubMed

    Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston

    2006-10-15

    We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
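
    The sketch below shows one of the workhorse multivariate methods such a toolbox offers, a principal component analysis written directly against NumPy's SVD. It is a generic illustration rather than PyChem's own interface, and the sample-by-variable matrix is synthetic.

      import numpy as np

      def pca(X, n_components=2):
          """Principal component analysis via SVD of the mean-centred data matrix."""
          Xc = X - X.mean(axis=0)
          U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
          scores = U[:, :n_components] * S[:n_components]       # sample coordinates
          loadings = Vt[:n_components].T                        # variable contributions
          explained = (S ** 2 / np.sum(S ** 2))[:n_components]  # variance ratios
          return scores, loadings, explained

      # Hypothetical data: 50 biological samples x 200 spectral variables
      rng = np.random.default_rng(42)
      X = rng.standard_normal((50, 200))
      scores, loadings, explained = pca(X)
      print("explained variance ratios:", explained)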

  18. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  19. Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak

    2010-01-01

    In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.
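
    The following sketch, written with the mpi4py bindings, times MPI_Allreduce for a few message sizes in the spirit of the per-function, per-message-size breakdowns these profilers produce; the collective chosen and the sizes are illustrative, not the applications' actual communication patterns.

      # Run with, e.g.: mpiexec -n 4 python allreduce_timing.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      # Time MPI_Allreduce for increasing message sizes, per rank.
      for n in (1_000, 10_000, 100_000):
          sendbuf = np.ones(n, dtype="d")
          recvbuf = np.empty(n, dtype="d")
          comm.Barrier()                       # synchronise before timing
          t0 = MPI.Wtime()
          comm.Allreduce(sendbuf, recvbuf, op=MPI.SUM)
          elapsed = MPI.Wtime() - t0
          if rank == 0:
              print(f"Allreduce of {n} doubles took {elapsed * 1e6:.1f} us")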

  20. Correcting the bias against interdisciplinary research.

    PubMed

    Shapiro, Ehud

    2014-04-01

    When making decisions about funding and jobs the scientific community should recognise that most of the tools used to evaluate scientific excellence are biased in favour of established disciplines and against interdisciplinary research.

  1. A precise goniometer/tensiometer using a low cost single-board computer

    NASA Astrophysics Data System (ADS)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest in small, cheap, yet precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successful tests on various combinations of liquids and solid surfaces, we conclude that our prototype device would be highly beneficial, compared with commercial solutions, for industrial applications as well as for scientific research on wetting phenomena.
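
    At the core of any ADSA-style fit is the numerical integration of the dimensionless Young-Laplace (Bashforth-Adams) drop-profile equations, whose solution is then matched to the extracted drop contour. The sketch below integrates those equations with SciPy for a hypothetical Bond number; it illustrates only the mathematical framework, not the DropToolKit code, and the sign convention depends on drop orientation.

      import numpy as np
      from scipy.integrate import solve_ivp

      def young_laplace(s, y, bond):
          """Dimensionless axisymmetric drop profile (Bashforth-Adams form).

          y = (x, z, phi): radial coordinate, height and tangent angle as functions
          of arc length s, all scaled by the radius of curvature at the drop apex.
          """
          x, z, phi = y
          # At the apex x -> 0 and sin(phi)/x -> dphi/ds, so use the limiting value 1.
          curvature_term = np.sin(phi) / x if x > 1e-12 else 1.0
          return [np.cos(phi), np.sin(phi), 2.0 + bond * z - curvature_term]

      bond = 0.3   # hypothetical Bond number, (density difference)*g*b^2/surface tension
      sol = solve_ivp(young_laplace, (0.0, 2.0), [0.0, 0.0, 0.0],
                      args=(bond,), max_step=0.01)

      x, z = sol.y[0], sol.y[1]
      print(f"profile computed at {x.size} points, maximum radius {x.max():.3f} apex radii")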

  2. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

    The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all of the Web's possibilities nor allows for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly load even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  3. Charming the Snake: Student Experiences with Python Programming as a Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Booker, Melissa; Ivers, C. B.; Piper, M.; Powers, L.; Ali, B.

    2014-01-01

    During the past year, twelve high school students and one undergraduate student participated in the NASA/IPAC Teacher Archive Research Program (NITARP) alongside three high school educators and one informal educator, gaining experience in using Python as a tool for analyzing the vast amount of photometry data available from the Herschel and Spitzer telescopes in the NGC 281 region. Use of Python appeared to produce two main positive gains: (1) a gain in student ability to successfully write and execute Python programs for the bulk analysis of data, and (2) a change in their perceptions of the utility of computer programming and of the students’ abilities to use programming to solve problems. We outline the trials, tribulations, successes, and failures of the teachers and students through this learning exercise and provide some recommendations for incorporating programming in scientific learning.

  4. Using Scientific Detective Videos to Support the Design of Technology Learning Activities

    ERIC Educational Resources Information Center

    Yu, Kuang-Chao; Fan, Szu-Chun; Tsai, Fu-Hsing; Chu, Yih-hsien

    2013-01-01

    This article examines the effect of scientific detective video as a vehicle to support the design of technology activities by technology teachers. Ten graduate students, including current and future technology teachers, participated in a required technology graduate course that used scientific detective videos as a pedagogical tool to motivate…

  5. Operation ARA: A Computerized Learning Game that Teaches Critical Thinking and Scientific Reasoning

    ERIC Educational Resources Information Center

    Halpern, Diane F.; Millis, Keith; Graesser, Arthur C.; Butler, Heather; Forsyth, Carol; Cai, Zhiqiang

    2012-01-01

    Operation ARA (Acquiring Research Acumen) is a computerized learning game that teaches critical thinking and scientific reasoning. It is a valuable learning tool that utilizes principles from the science of learning and serious computer games. Students learn the skills of scientific reasoning by engaging in interactive dialogs with avatars. They…

  6. Agnotology as a Teaching Tool: Learning Climate Science by Studying Misinformation

    ERIC Educational Resources Information Center

    Bedford, Daniel

    2010-01-01

    Despite the existence of a clear scientific consensus about global warming, opinion surveys find confusion among the American public, regarding both scientific issues and the strength of the scientific consensus. Evidence increasingly points to misinformation as a contributing factor. This situation is both a challenge and an opportunity for…

  7. The Media as an Invaluable Tool for Informal Earth System Science Education

    NASA Astrophysics Data System (ADS)

    James, E.; Gautier, C.

    2001-12-01

    One of the most widely utilized avenues for educating the general public about the Earth's environment is the media, be it print, radio or broadcast. Accurate and effective communication of issues in Earth System Science (ESS), however, is significantly hindered by the public's relative scientific illiteracy. Discussion of ESS concepts requires the laying down of a foundation of complex scientific information, which must first be conveyed to an incognizant audience before any strata of sophisticated social context can be appropriately considered. Despite such a substantial obstacle to be negotiated, the environmental journalist is afforded the unique opportunity of providing a broad-reaching informal scientific education to a largely scientifically uninformed population base. This paper will review the tools used by various environmental journalists to address ESS issues and consider how successful each of these approaches has been at conveying complex scientific messages to a general audience lacking sufficient scientific sophistication. Different kinds of media materials used to this effect will be analyzed for their ideas and concepts conveyed, as well as their effectiveness in reaching the public at large.

  8. Freva - Freie Univ Evaluation System Framework for Scientific HPC Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Schartner, T.; Grieger, J.; Kirchner, I.; Rust, H.; Cubasch, U.; Ulbrich, U.

    2017-12-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science (e.g. www-miklip.dkrz.de, cmip-eval.dkrz.de). Freva runs on high-performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. The integrated web shell (shellinabox) adds a degree of freedom in the choice of working environment and can be used as a gateway to the research project's HPC system. Plugins are able to integrate, for example, their post-processed results into the user's database. This allows post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Furthermore, if configurations match when an evaluation plugin is started, the system suggests using results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.

  9. European Society of Clinical Pharmacy (ESCP) glossary of scientific terms: a tool for standardizing scientific jargon.

    PubMed

    Carollo, Anna; Rieutord, André; Launay-Vacher, Vincent

    2012-04-01

    This glossary is a tool for clinicians who have to confront topics in which medical, scientific and technical jargon is closely linked. It provides definitions for the key concepts and terms of pharmaceutical care, clinical pharmacy, and research in the health care system in clinical settings. It includes items that are not particularly technical, but that should be part of the know-how of staff working in medical and scientific fields. In fact, the glossary can also help clinical technicians who want to understand the precise definition of scientific terms, which often do not coincide with the ones used in the practice setting. PRINCIPAL GOALS AND OBJECTIVES: The aim of this glossary is to aid in the development of more standardized and established terminology for clinical pharmacy, facilitate communication among different stakeholders and, ultimately, contribute to a higher-quality health care system. The glossary contains 165 definitions of concepts and principles in clinical pharmacy, and terms widely used in this field. The criteria for the inclusion of terms were specific applications in health promotion, or terms used in other fields that have a specific meaning or application when used in reference to clinical activity. The glossary arose from the need to standardize terminology in the scientific field. It was not intended as a comprehensive listing that would include all medical terms, but as a useful tool for clinical pharmacists working in this area, and for users who occasionally encounter unusual, often hard to understand, terminology.

  10. The 1000 Genomes Project: data management and community access.

    PubMed

    Clarke, Laura; Zheng-Bradley, Xiangqun; Smith, Richard; Kulesha, Eugene; Xiao, Chunlin; Toneva, Iliana; Vaughan, Brendan; Preuss, Don; Leinonen, Rasko; Shumway, Martin; Sherry, Stephen; Flicek, Paul

    2012-04-27

    The 1000 Genomes Project was launched as one of the largest distributed data collection and analysis projects ever undertaken in biology. In addition to the primary scientific goals of creating both a deep catalog of human genetic variation and extensive methods to accurately discover and characterize variation using new sequencing technologies, the project makes all of its data publicly available. Members of the project data coordination center have developed and deployed several tools to enable widespread data access.

  11. Development of an Automated Modality-Independent Elastographic Image Analysis System for Tumor Screening

    DTIC Science & Technology

    2008-02-01

    journal article. Didactic coursework requirements for the PhD degree have been completed at this time as well as successful presentation of the...Libraries", Modern Software Tools in Scientific Computing. Birkhauser Press, pp. 163-202, 1997. [5] Doyley MM, Weaver JB, Van Houten EEW, Kennedy FE...data from MR, x-ray computed tomography (CT) and digital photography have been used to successfully drive the algorithm in two-dimensional (2D) work

  12. The GTC Scientific Data Centre

    NASA Astrophysics Data System (ADS)

    Solano, E.

    2005-12-01

    Since the early stages of the GTC project, the need for a scientific archive has been identified as important for the scientific exploitation of the data. In this work, the conceptual design and the main functionalities of the Scientific Data Archive of the Gran Telescopio Canarias (GSA) are described. The system will be developed, implemented and maintained at the Laboratorio de Astrofísica Espacial y Física Fundamental (LAEFF).

  13. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  14. The critical steps for successful research: The research proposal and scientific writing: (A report on the pre-conference workshop held in conjunction with the 64th annual conference of the Indian Pharmaceutical Congress-2012).

    PubMed

    Balakumar, Pitchai; Inamdar, Mohammed Naseeruddin; Jagadeesh, Gowraganahalli

    2013-04-01

    An interactive workshop on 'The Critical Steps for Successful Research: The Research Proposal and Scientific Writing' was conducted in conjunction with the 64th Annual Conference of the Indian Pharmaceutical Congress-2012 at Chennai, India. In essence, research is performed to enlighten our understanding of a contemporary issue relevant to the needs of society. To accomplish this, a researcher begins the search for a novel topic based on purpose, creativity, critical thinking, and logic. This leads to the fundamental pieces of the research endeavor: question, objective, hypothesis, experimental tools to test the hypothesis, methodology, and data analysis. When correctly performed, research should produce new knowledge. The four cornerstones of good research are a well-formulated protocol or proposal that is well executed, analyzed, discussed and concluded. This recent workshop educated researchers in the critical steps involved in taking a scientific idea from its development to its successful execution and eventual publication.

  15. Investigating Socioscientific Issues via Scientific Habits of Mind: Development and validation of the Scientific Habits of Mind Survey

    NASA Astrophysics Data System (ADS)

    Çalik, Muammer; Coll, Richard Kevin

    2012-08-01

    In this paper, we describe the Scientific Habits of Mind Survey (SHOMS), developed to explore public, science teachers', and scientists' understanding of habits of mind (HoM). The instrument contains 59 items and captures the seven scientific habits of mind identified by Gauld. The SHOMS was validated by administration to two cohorts of pre-service science teachers: primary science teachers with little science background or interest (n = 145), and secondary school science teachers (who were also science graduates) with stronger science knowledge (n = 145). Face validity was confirmed by the use of a panel of experts and a pilot study employing participants similar in demographics to the intended sample. To confirm convergent and discriminant validity, confirmatory factor analysis was conducted and reliability was evaluated. Statistical data and other data gathered from interviews suggest that the SHOMS will prove to be a useful tool for educators and researchers who wish to investigate HoM for a variety of participants.

  16. Data Mining as a Service (DMaaS)

    NASA Astrophysics Data System (ADS)

    Tejedor, E.; Piparo, D.; Mascetti, L.; Moscicki, J.; Lamanna, M.; Mato, P.

    2016-10-01

    Data Mining as a Service (DMaaS) is a software and computing infrastructure that allows interactive mining of scientific data in the cloud. It allows users to run advanced data analyses by leveraging the widely adopted Jupyter notebook interface. Furthermore, the system makes it easier to share results and scientific code, access scientific software, produce tutorials and demonstrations as well as preserve the analyses of scientists. This paper describes how a first pilot of the DMaaS service is being deployed at CERN, starting from the notebook interface that has been fully integrated with the ROOT analysis framework, in order to provide all the tools for scientists to run their analyses. Additionally, we characterise the service backend, which combines a set of IT services such as user authentication, virtual computing infrastructure, mass storage, file synchronisation, development portals or batch systems. The added value acquired by the combination of the aforementioned categories of services is discussed, focusing on the opportunities offered by the CERNBox synchronisation service and its massive storage backend, EOS.
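
    To give a flavour of the notebook-style analyses such a service hosts, the sketch below uses the ROOT Python bindings to fill and fit a histogram with synthetic data. It assumes a local ROOT installation and is not tied to the actual CERN service backend.

      import ROOT

      # Fill a histogram with synthetic values and fit a Gaussian to it, roughly
      # the "hello world" of ROOT-based data analysis in a notebook.
      hist = ROOT.TH1F("h", "example;value;entries", 100, -5, 5)
      rng = ROOT.TRandom3(0)
      for _ in range(10_000):
          hist.Fill(rng.Gaus(0, 1))

      hist.Fit("gaus", "Q")                    # quiet Gaussian fit
      fit = hist.GetFunction("gaus")
      print("fitted mean:", fit.GetParameter(1), "sigma:", fit.GetParameter(2))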

  17. Neo: an object model for handling electrophysiology data in multiple formats

    PubMed Central

    Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L.; Rodgers, Chris C.; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P.

    2014-01-01

    Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named “Neo,” suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology. PMID:24600386

  18. Neo: an object model for handling electrophysiology data in multiple formats.

    PubMed

    Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L; Rodgers, Chris C; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P

    2014-01-01

    Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named "Neo," suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology.
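
    A minimal usage sketch of this object model is shown below, assuming recent releases of the neo and quantities packages; the signal and spike times are synthetic. The point is only the Block -> Segment -> data-object hierarchy that downstream analysis code can rely on, independent of the original file format.

      import numpy as np
      import quantities as pq
      from neo.core import AnalogSignal, Block, Segment, SpikeTrain

      # Build a minimal in-memory hierarchy: Block -> Segment -> data objects.
      signal = AnalogSignal(np.random.randn(10_000, 1), units="mV",
                            sampling_rate=10 * pq.kHz)
      spikes = SpikeTrain([0.12, 0.35, 0.83] * pq.s, t_stop=1.0 * pq.s)

      segment = Segment(name="trial 1")
      segment.analogsignals.append(signal)
      segment.spiketrains.append(spikes)

      block = Block(name="example recording")
      block.segments.append(segment)

      # Downstream analysis code sees the same representation regardless of which
      # acquisition-system file format the data originally came from.
      print(signal.duration, len(segment.spiketrains[0]))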

  19. Advancements in Large-Scale Data/Metadata Management for Scientific Data.

    NASA Astrophysics Data System (ADS)

    Guntupally, K.; Devarakonda, R.; Palanisamy, G.; Frame, M. T.

    2017-12-01

    Scientific data often comes with complex and diverse metadata that are critical for data discovery and for users. The Online Metadata Editor (OME) tool, which was developed by an Oak Ridge National Laboratory team, effectively manages diverse scientific datasets across several federal data centers, such as DOE's Atmospheric Radiation Measurement (ARM) Data Center and USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L) project. This presentation will focus mainly on recent developments and future strategies for refining the OME tool within these centers. The ARM OME is a standards-based tool (https://www.archive.arm.gov/armome) that allows scientists to create and maintain metadata about their data products. The tool has been improved with new workflows that help metadata coordinators and submitting investigators to submit and review their data more efficiently. The ARM Data Center's newly upgraded Data Discovery Tool (http://www.archive.arm.gov/discovery) uses rich metadata generated by the OME to enable search and discovery of thousands of datasets, while also providing a citation generator and modern order-delivery techniques like Globus (using GridFTP), Dropbox and THREDDS. The Data Discovery Tool also supports incremental indexing, which allows users to find new data as and when they are added. The USGS CSAS&L search catalog employs a custom version of the OME (https://www1.usgs.gov/csas/ome), which has been upgraded with high-level Federal Geographic Data Committee (FGDC) validations and the ability to reserve and mint Digital Object Identifiers (DOIs). The USGS's Science Data Catalog (SDC) (https://data.usgs.gov/datacatalog) allows users to discover a myriad of science data holdings through a web portal. Recent major upgrades to the SDC and ARM Data Discovery Tool include improved harvesting performance and migration using new search software, such as Apache Solr 6.0 for serving up data/metadata to scientific communities. Our presentation will highlight future enhancements of these tools, which will enable users to retrieve search results quickly, along with parallelizing the retrieval process from online and High Performance Storage Systems. In addition, these improvements to the tools will support additional metadata formats like the Large-Eddy Simulation (LES) ARM Symbiotic and Observation (LASSO) bundle data.
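
    Discovery catalogs of this kind are commonly served through an Apache Solr select endpoint. The sketch below issues such a query with the requests library, but the URL, core name and field names are hypothetical placeholders rather than the actual ARM or USGS APIs.

      import requests

      # Hypothetical Solr endpoint and field names; the real ARM and USGS discovery
      # services expose their own URLs and schemas.
      SOLR_URL = "https://example.org/solr/datasets/select"

      params = {
          "q": "title:aerosol AND site:SGP",   # fielded free-text query
          "fl": "id,title,start_date",         # fields to return
          "rows": 10,
          "wt": "json",
      }
      response = requests.get(SOLR_URL, params=params, timeout=30)
      for doc in response.json()["response"]["docs"]:
          print(doc["id"], "-", doc["title"])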

  20. 3-D Imaging of Mars’ Polar Ice Caps Using Orbital Radar Data

    PubMed Central

    Foss, Frederick J.; Putzig, Nathaniel E.; Campbell, Bruce A.; Phillips, Roger J.

    2018-01-01

    Since its arrival in early 2006, various instruments aboard NASA’s Mars Reconnaissance Orbiter (MRO) have been collecting a variety of scientific and engineering data from orbit around Mars. Among these is the SHAllow RADar (SHARAD) instrument, supplied by Agenzia Spaziale Italiana (ASI) and designed for subsurface sounding in the 15–25 MHz frequency band. As of this writing, MRO has completed over 46,000 nearly polar orbits of Mars, 30% of which have included active SHARAD data collection. By 2009, a sufficient density of SHARAD coverage had been obtained over the polar regions to support 3-D processing and analysis of the data. Using tools and techniques commonly employed in terrestrial seismic data processing, we have processed subsets of the resulting collection of SHARAD observations covering the north and south polar regions as SHARAD 3-D volumes, imaging the interiors of the north and south polar ice caps known, respectively, as Planum Boreum and Planum Australe. After overcoming a series of challenges revealed during the 3-D processing and analysis, a completed Planum Boreum 3-D volume is currently being used for scientific research. Lessons learned in the northern work fed forward into our 3-D processing and analysis of the Planum Australe 3-D volume, currently under way. We discuss our experiences with these projects and present results and scientific insights stemming from these efforts. PMID:29400351

  1. Exploring the potential of using stories about diverse scientists and reflective activities to enrich primary students' images of scientists and scientific work

    NASA Astrophysics Data System (ADS)

    Sharkawy, Azza

    2012-06-01

    The purpose of this qualitative study was to explore the potential of using stories about diverse scientists to broaden primary students' images of scientists and scientific work. Stories featuring scientists from diverse socio-cultural backgrounds (i.e., physical ability, gender, ethnicity) were presented to 11 grade one students over a 15-week period. My analysis of pre- and post-intervention audio-taped interview transcripts, draw-a-scientist tests (Chambers 1983), participant observations and student work suggests that the stories about scientists and the follow-up reflective activities provided resources that helped students: (a) acquire images of scientists from less dominant socio-cultural backgrounds; and (b) enrich their views of scientific work from predominantly hands-on/activity-oriented views to ones that include cognitive and positive affective dimensions. One of the limitations of using stories as a tool to extend students' thinking about science is highlighted in a case study of a student who expresses resistance to some of the counter-stereotypic images presented in the stories. I also present two additional case studies that illustrate how shifts in students' views of the nature of scientific work can change their interest in future participation in scientific work.

  2. Gender-fair assessment of young gifted students' scientific thinking skills

    NASA Astrophysics Data System (ADS)

    Dori, Y. J.; Zohar, A.; Fischer-Shachor, D.; Kohan-Mass, J.; Carmi, M.

    2018-04-01

    This paper describes an Israeli national-level research examining the extent to which admissions of elementary school students to the gifted programmes based on standardised tests are gender-fair. In the research, the gifted students consisted of 275 boys, 128 girls, and additional 80 girls who were admitted to the gifted programme through affirmative action (AA). To assess these young students' scientific thinking skills, also referred to as science practices, open-ended questions of case-based questionnaires were developed. The investigated scientific thinking skills were question posing, explanation, graphing, inquiry, and metacognition. Analysis of the students' responses revealed that gifted girls who entered the programmes through AA performed at the same level as the other gifted students. We found significant differences between the three research groups in question posing and graphing skills. We suggest increasing gender-fairness by revising the standard national testing system to include case-based narratives followed by open-ended questions that assess gifted students' scientific thinking skills. This may diminish the gender inequity expressed by the different number of girls and boys accepted to the gifted programmes. We show that open-ended tools for analysing students' scientific thinking might better serve both research and practice by identifying gifted girls and boys equally well.

  3. Toyz: A framework for scientific analysis of large datasets and astronomical images

    NASA Astrophysics Data System (ADS)

    Moolekamp, F.; Mamajek, E.

    2015-11-01

    As the size of images and data products derived from astronomical data continues to increase, new tools are needed to visualize and interact with that data in a meaningful way. Motivated by our own astronomical images taken with the Dark Energy Camera (DECam) we present Toyz, an open source Python package for viewing and analyzing images and data stored on a remote server or cluster. Users connect to the Toyz web application via a web browser, making it a convenient tool for students to visualize and interact with astronomical data without having to install any software on their local machines. In addition it provides researchers with an easy-to-use tool that allows them to browse the files on a server, quickly view very large images (>2 Gb) taken with DECam and other cameras with a large FOV, and create their own visualization tools that can be added on as extensions to the default Toyz framework.

  4. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged indicating that different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of each particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic-center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  5. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing resolution of models in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  6. V-FOR-WaTer - a new virtual research environment for environmental research

    NASA Astrophysics Data System (ADS)

    Strobl, Marcus; Azmi, Elnaz; Hassler, Sibylle; Mälicke, Mirko; Meyer, Jörg; Zehe, Erwin

    2017-04-01

    The preparation of heterogeneous datasets for scientific analysis is still a demanding task. Data preprocessing for hydrological models typically involves gathering datasets from different sources, extensive work within geoinformation systems, data transformation, the generation of computational grids and the definition of initial and boundary conditions. V-FOR-WaTer, a standardized and scalable data hub with compatible analysis tools, will ease comprehensive studies and significantly reduce data preparation time. The idea behind V-FOR-WaTer is to bring together various datasets (e.g. point measurements, 2D/3D data, time series data) from different sources (e.g. gathered in research projects, or as part of regular monitoring by state offices) and to provide common as well as innovative scaling tools in space and time to generate a coherent data grid. Each dataset holds detailed standardized metadata to ensure usability of the data, offer a comprehensive search function and provide reference information for appropriate citation of the dataset creators. V-FOR-WaTer includes a core set of data and tools, but its purpose is to grow as users extend the virtual research environment with their own tools and research data. Researchers who upload new data or tools can receive a digital object identifier, or protect their data and tools from others until publication. Access to the data and tools provided by V-FOR-WaTer is via an easy-to-use web portal. Owing to its modular architecture, the portal is ready to be extended with new tools and features and also offers interfaces to Matlab, Python and R.

  7. Geoinformatic subsystem for real estate market analysis). (Polish Title: Podsystem geoinformatyczny do analizy rynku nieruchomosci)

    NASA Astrophysics Data System (ADS)

    Basista, A.

    2013-12-01

    There are many tools to manage spatial data. They are called Geographic Information Systems (GIS) and, apart from visualizing data in space, they let users perform various spatial analyses. Thanks to them, it is possible to obtain more of the essential information needed for real estate market analysis. Much scientific research presents the use of GIS for future mass valuation, because advanced tools are necessary to manage the huge real estate data sets gathered for mass valuation. In practice, appraisers rarely use these tools for single valuations, because few GIS tools are available to support real estate valuation. The paper presents the functionality of a geoinformatic subsystem that is used to support real estate market analysis and real estate valuation. A detailed description is given of the process of entering attributes into the database and of calculating attribute values based on the proposed definition of attribute scales. This work also presents the algorithm for selecting similar properties that was implemented within the described subsystem. The main stage of this algorithm is the calculation of a price-creative indicator for each real estate, using its attribute values. The set of properties chosen in this way is visualized on the map. The geoinformatic subsystem is used for undeveloped real estate and residential premises. Geographic Information System software was used to develop this project: the basic functionality of gvSIG (open source software) was extended and extra functions were added to support real estate market analysis.
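
    The paper does not spell out the indicator formula, so the following is a hypothetical illustration (attribute names, weights and tolerance are invented) of the described workflow: compute a price-creative indicator for each property from its attribute values, then select as "similar" those properties whose indicator lies close to that of the subject property.

    ```python
    # Hypothetical illustration of the similar-property selection step.
    # Attribute names, weights and the tolerance are invented for the example.

    weights = {"location": 0.4, "area": 0.3, "utilities": 0.2, "shape": 0.1}

    parcels = [
        {"id": 1, "location": 5, "area": 3, "utilities": 4, "shape": 2},
        {"id": 2, "location": 4, "area": 3, "utilities": 4, "shape": 3},
        {"id": 3, "location": 1, "area": 5, "utilities": 1, "shape": 5},
    ]

    def indicator(parcel):
        """Price-creative indicator: weighted sum of attribute scores."""
        return sum(weights[a] * parcel[a] for a in weights)

    subject = parcels[0]
    tolerance = 0.5
    similar = [p for p in parcels[1:]
               if abs(indicator(p) - indicator(subject)) <= tolerance]

    print([p["id"] for p in similar])   # parcels treated as comparable
    ```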

  8. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High-quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform developed as part of Provenance for Clinical and Healthcare Research (ProvCaRe). The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata than existing NLP pipelines such as MetaMap.

  9. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  10. Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases

    NASA Astrophysics Data System (ADS)

    Grant, Glenn Edwin

    Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
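
    As an illustration of the condensation idea (not the dissertation's actual pipeline), the sketch below keeps only strongly anomalous samples from a synthetic land surface temperature series and loads them into a queryable SQLite table; the z-score threshold and table layout are assumptions made for the example.

    ```python
    # Illustrative sketch: condense a temperature series by retaining only
    # anomalous samples and storing them in a small, query-ready database.
    import sqlite3
    import numpy as np

    rng = np.random.default_rng(0)
    temps = rng.normal(loc=250.0, scale=5.0, size=10_000)   # synthetic LST (K)
    days = np.arange(temps.size)

    z = (temps - temps.mean()) / temps.std()
    keep = np.abs(z) > 3.0                                   # retain extremes only

    con = sqlite3.connect("lst_anomalies.db")
    con.execute("CREATE TABLE IF NOT EXISTS anomalies (day INTEGER, temp REAL, z REAL)")
    con.executemany("INSERT INTO anomalies VALUES (?, ?, ?)",
                    zip(days[keep].tolist(), temps[keep].tolist(), z[keep].tolist()))
    con.commit()

    # The condensed table is a small fraction of the original series but can
    # answer queries about extremes immediately.
    print(con.execute("SELECT COUNT(*) FROM anomalies").fetchone())
    ```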

  11. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets, and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the THREDDS Data Server for NetCDF data. For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.
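
    A minimal sketch of reading the kind of geolocated NetCDF data the viewer ingests, using the standard netCDF4 library; the file name and the variable names ("lat", "lon", "tas") are assumptions, not part of TrikeND-iGlobe itself.

    ```python
    # Minimal sketch: read a geolocated scientific dataset and compute a simple
    # reduction a visualization client might request before rendering.
    # The file and variable names are assumptions.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("surface_temperature.nc") as nc:      # hypothetical file
        lats = nc.variables["lat"][:]
        lons = nc.variables["lon"][:]
        temp = nc.variables["tas"][0, :, :]             # first time step

    zonal_mean = np.ma.mean(temp, axis=1)               # mean over longitude
    print(lats.shape, lons.shape, zonal_mean.shape)
    ```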

  12. Analyzing microtomography data with Python and the scikit-image library.

    PubMed

    Gouillart, Emmanuelle; Nunez-Iglesias, Juan; van der Walt, Stéfan

    2017-01-01

    The exploration and processing of images is a vital aspect of the scientific workflows of many X-ray imaging modalities. Users require tools that combine interactivity, versatility, and performance. scikit-image is an open-source image processing toolkit for the Python language that supports a large variety of file formats and is compatible with 2D and 3D images. The toolkit exposes a simple programming interface, with thematic modules grouping functions according to their purpose, such as image restoration, segmentation, and measurements. scikit-image users benefit from a rich scientific Python ecosystem that contains many powerful libraries for tasks such as visualization or machine learning. scikit-image combines a gentle learning curve, versatile image processing capabilities, and the scalable performance required for the high-throughput analysis of X-ray imaging data.
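
    A short example of the segmentation-and-measurement workflow described above, run on a small synthetic volume rather than real tomography data; only standard scikit-image calls are used.

    ```python
    # Segment a bright inclusion in a synthetic 3D volume and measure it,
    # mirroring a typical microtomography workflow at toy scale.
    import numpy as np
    from skimage import filters, measure

    rng = np.random.default_rng(42)
    volume = rng.normal(size=(64, 64, 64))
    volume[20:40, 20:40, 20:40] += 4.0           # embed a bright cubic "grain"

    threshold = filters.threshold_otsu(volume)   # global Otsu threshold
    binary = volume > threshold

    labels = measure.label(binary)               # connected-component labelling
    regions = measure.regionprops(labels)

    largest = max(regions, key=lambda r: r.area)
    print(f"{labels.max()} objects; largest occupies {largest.area} voxels")
    ```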

  13. The Miami Barrel: An Innovation in Forensic Firearms Identification

    ERIC Educational Resources Information Center

    Fadul, Thomas G., Jr.

    2009-01-01

    The scientific foundation in firearm and tool mark identification is that each firearm/tool produces a signature of identification (striation/impression) that is unique to that firearm/tool, and that through examining the individual striations/impressions, the signature can be positively identified to the firearm/tool that produced it. There is no set…

  14. Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Hertz, J.; Huffer, E.; Kusterer, J.

    2012-12-01

    Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, making scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many anxious to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to de-mystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure the ontology reflects the core purpose of the domain's mission and that the ontology integrates and describes the supporting data in the language of the domain - how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and Earth's Radiant Energy (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open source software General Architecture for Text Engineering, a mature framework for computational tasks involving human language.

  15. Outlining and dictating scientific manuscripts is a useful method for health researchers: A focus group interview.

    PubMed

    Andresen, Kristoffer; Laursen, Jannie; Rosenberg, Jacob

    2018-01-01

    Young researchers may experience difficulties when writing scientific articles for publication in biomedical journals. Various methods may facilitate the writing process, including outlining the paper before the actual writing and using dictation instead of writing the first draft. The aim of this study was to investigate the experiences and difficulties of young, experienced researchers when writing articles using a detailed outline and dictation of the first draft. We used qualitative focus group interviews, and the study was reported according to the COnsolidated criteria for REporting Qualitative research guideline. Participants were sampled from a group of researchers participating in a writing retreat/course. The interviews were recorded on a digital recorder and transcribed. The text was analyzed according to content analysis and coded and condensed into themes and subthemes. Groups of participants were added until data saturation was reached. A total of 14 researchers participated (9 women and 5 men). Their clinical experience was a median (range) of 6 (1-11) years since graduation from medical school. Two themes arose during the analysis of the data: "Process guidance with the outline as the map" and "Arrival at dictation." The outline was used in the preparation phase leading up to the day of dictation and was developed in collaboration with co-authors and supervisors. The participants found it to be a useful tool for preparing the manuscript and dictating their initial full first draft. Experienced young researchers found beneficial effects of using a structured outline to prepare for dictation of scientific articles. The outline was a tool that developed in close collaboration with co-authors and mentors. With dictation, a full first draft of a manuscript can be produced in a few hours. Participants positively evaluated this structured and reproducible way of producing scientific articles.

  16. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  17. User Guidelines for the Brassica Database: BRAD.

    PubMed

    Wang, Xiaobo; Cheng, Feng; Wang, Xiaowu

    2016-01-01

    The genome sequence of Brassica rapa was first released in 2011. Since then, further Brassica genomes have been sequenced or are undergoing sequencing. It is therefore necessary to develop tools that help users to mine information from genomic data efficiently. This will greatly aid scientific exploration and breeding application, especially for those with low levels of bioinformatic training. Therefore, the Brassica database (BRAD) was built to collect, integrate, illustrate, and visualize Brassica genomic datasets. BRAD provides useful searching and data mining tools, and facilitates the search of gene annotation datasets, syntenic or non-syntenic orthologs, and flanking regions of functional genomic elements. It also includes genome-analysis tools such as BLAST and GBrowse. One of the important aims of BRAD is to build a bridge between Brassica crop genomes with the genome of the model species Arabidopsis thaliana, thus transferring the bulk of A. thaliana gene study information for use with newly sequenced Brassica crops.

  18. Cause-and-effect mapping of critical events.

    PubMed

    Graves, Krisanne; Simmons, Debora; Galley, Mark D

    2010-06-01

    Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. In learning to identify and analyze errors, health care can develop some of the skills of a learning organization, including the concept of systems thinking. Modern experts in improving quality have been working in other high-risk industries since the 1920s, making structured organizational changes through various frameworks for quality methods, including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within an organization. Within these frameworks of improvement, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called CauseMapping (ThinkReliability, Houston, TX, USA), which can be used to analyze the process and the problem systematically at the same time. Copyright 2010 Elsevier Inc. All rights reserved.

  19. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the available analytical capabilities along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  20. Active Storage with Analytics Capabilities and I/O Runtime System for Petascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok

    Computational scientists must understand results from experimental, observational and computational simulation generated data to gain insights and perform knowledge discovery. As systems approach the petascale range, problems that were unimaginable a few years ago are within reach. With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive I/O, storage, acceleration of data manipulation, analysis, and mining tools. Scientists require techniques, tools and infrastructure to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis, statistical analysis and knowledge discovery. The goal of this work is to enable more effective analysis of scientific datasets through the integration of enhancements in the I/O stack, from active storage support at the file system layer to the MPI-IO and high-level I/O library layers. We propose to provide software components to accelerate data analytics, mining, I/O, and knowledge discovery for large-scale scientific applications, thereby increasing the productivity of both scientists and systems. Our approaches include: 1) designing the interfaces in high-level I/O libraries, such as parallel netCDF, for applications to activate data mining operations at the lower I/O layers; 2) enhancing MPI-IO runtime systems to incorporate the functionality developed as a part of the runtime system design; 3) developing parallel data mining programs as part of the runtime library for the server-side file system in PVFS; and 4) prototyping an active storage cluster, which will utilize multicore CPUs, GPUs, and FPGAs to carry out the data mining workload.

  1. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
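
    A minimal sketch of the Bayesian model comparison the abstract invokes (the notation is generic, not the author's): the Bayes factor between two candidate theories penalizes the one whose parameter space can accommodate too many possible datasets, which is the quantitative Occam's razor referred to above.

    ```latex
    % Bayes factor comparing theories M_1 and M_2 given data D. The marginal
    % likelihoods integrate over each theory's parameters, which is where the
    % penalty for overly flexible (hard-to-falsify) models enters.
    K \;=\; \frac{P(D \mid M_1)}{P(D \mid M_2)}
      \;=\; \frac{\int P(D \mid \theta_1, M_1)\,\pi(\theta_1 \mid M_1)\,d\theta_1}
                 {\int P(D \mid \theta_2, M_2)\,\pi(\theta_2 \mid M_2)\,d\theta_2}
    ```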

  2. Co-authorship network analysis in health research: method and potential use.

    PubMed

    Fonseca, Bruna de Paula Fonseca E; Sampaio, Ricardo Barros; Fonseca, Marcus Vinicius de Araújo; Zicker, Fabio

    2016-04-30

    Scientific collaboration networks are a hallmark of contemporary academic research. Researchers are no longer independent players, but members of teams that bring together complementary skills and multidisciplinary approaches around common goals. Social network analysis and co-authorship networks are increasingly used as powerful tools to assess collaboration trends and to identify leading scientists and organizations. The analysis reveals the social structure of the networks by identifying actors and their connections. This article reviews the method and potential applications of co-authorship network analysis in health. The basic steps for conducting co-authorship studies in health research are described and common network metrics are presented. The application of the method is exemplified by an overview of the global research network for Chikungunya virus vaccines.
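
    A minimal sketch of the basic co-authorship construction and two of the common metrics mentioned above, using the standard networkx library; the papers and author names are invented.

    ```python
    # Build a co-authorship graph (papers are author lists, every pair of
    # co-authors becomes a weighted edge) and compute common centrality metrics.
    from itertools import combinations
    import networkx as nx

    papers = [
        ["Silva", "Costa", "Oliveira"],
        ["Silva", "Pereira"],
        ["Costa", "Pereira", "Souza"],
    ]

    G = nx.Graph()
    for authors in papers:
        for a, b in combinations(authors, 2):
            # repeated collaborations accumulate edge weight
            w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
            G.add_edge(a, b, weight=w)

    degree = nx.degree_centrality(G)            # how connected each author is
    betweenness = nx.betweenness_centrality(G)  # who bridges separate groups

    print(sorted(degree, key=degree.get, reverse=True))   # most connected first
    ```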

  3. Volcanic Alert System (VAS) developed during the (2011-2013) El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    Ortiz, Ramon; Berrocoso, Manuel; Marrero, Jose Manuel; Fernandez-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Garcia, Alicia

    2014-05-01

    In volcanic areas with long repose periods (such as El Hierro), recently installed monitoring networks offer no instrumental record of past eruptions nor experience in handling a volcanic crisis. Both conditions, uncertainty and inexperience, make the communication of hazard more difficult. In fact, in the initial phases of the unrest at El Hierro, the perception of volcanic risk was somewhat distorted, as even relatively low volcanic hazards caused a high political impact. The need for a Volcanic Alert System then became evident. In general, the Volcanic Alert System comprises the monitoring network, the software tools for the analysis of the observables, the management of the Volcanic Activity Level, and the assessment of the threat. The Volcanic Alert System presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as in El Hierro, may be more destructive than an eruption itself. As part of the Volcanic Alert System, we introduce here the Volcanic Activity Level, which continuously applies a routine analysis of monitoring data (particularly seismic and deformation data) to detect data trend changes or monitoring network failures. The data trend changes are quantified according to the Failure Forecast Method (FFM). When data changes and/or malfunctions are detected by an automated watchdog, warnings are automatically issued to the Monitoring Scientific Team. Changes in the data patterns are then translated by the Monitoring Scientific Team into a simple Volcanic Activity Level that is easy to use and understand by the scientists and technicians in charge of the technical management of the unrest. The main feature of the Volcanic Activity Level is its objectivity, as it does not depend on expert opinions, which are left to the Scientific Committee, and its capability for early detection of precursors. As a consequence of the El Hierro experience, we consider the objectivity of the Volcanic Activity Level a powerful tool to focus the discussions of a Scientific Committee on the activity forecast and on the expected scenarios, rather than on the multiple explanations of the data fluctuations, which is one of the main sources of conflict in Scientific Committee discussions. Although the Volcanic Alert System was designed specifically for the unrest episodes at El Hierro, the methodologies involved may be applied to other situations of unrest.
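
    The Failure Forecast Method mentioned above is usually written as a power law in the monitored rate; a sketch of the standard form (parameter names are generic, not specific to the El Hierro implementation) is:

    ```latex
    % FFM: the acceleration of a monitored rate \dot{\Omega} (e.g. seismic
    % energy release or deformation rate) follows a power law. For the common
    % case \alpha = 2 the inverse rate decays linearly in time, so extrapolating
    % it to zero gives an estimate of the failure (onset) time.
    \ddot{\Omega} = A\,\dot{\Omega}^{\alpha}, \qquad 1 < \alpha \le 2,
    \qquad\Longrightarrow\qquad
    \frac{1}{\dot{\Omega}(t)} = \frac{1}{\dot{\Omega}(t_0)} - A\,(t - t_0)
    \quad \text{for } \alpha = 2 .
    ```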

  4. M4AST - A Tool for Asteroid Modelling

    NASA Astrophysics Data System (ADS)

    Birlan, Mirel; Popescu, Marcel; Irimiea, Lucian; Binzel, Richard

    2016-10-01

    M4AST (Modelling for Asteroids) is an online tool devoted to the analysis and interpretation of reflection spectra of asteroids in the visible and near-infrared spectral intervals. It consists of a spectral database of individual objects and a set of analysis routines which address scientific aspects such as taxonomy, curve matching with laboratory spectra, space weathering models, and mineralogical diagnosis. Spectral data were obtained using ground-based facilities; part of these data are compiled from the literature [1]. The database is composed of permanent and temporary files. Each permanent file contains a header and two or three columns (wavelength, spectral reflectance, and the error on spectral reflectance). Temporary files can be uploaded anonymously and are purged to protect the ownership of the submitted data. The computing routines are organized to accomplish several scientific objectives: visualize spectra, compute the asteroid taxonomic class, compare an asteroid spectrum with similar spectra of meteorites, and compute mineralogical parameters. A facility for using Virtual Observatory protocols was also developed. A new version of the service was released in June 2016. This new release of M4AST contains a database and facilities to model more than 6,000 spectra of asteroids. A new web interface was designed; this development adds new functionality in a user-friendly environment. A bridge for accessing and exploiting the SMASS-MIT database (http://smass.mit.edu) allows the treatment and analysis of these data within the M4AST environment. Reference: [1] M. Popescu, M. Birlan, and D.A. Nedelcu, "Modeling of asteroids: M4AST," Astronomy & Astrophysics 544, EDP Sciences, pp. A130, 2012.
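
    An illustrative sketch (not the M4AST implementation) of the curve-matching step: the asteroid spectrum is interpolated onto each laboratory wavelength grid, both spectra are normalized, and the meteorite with the smallest chi-square-like misfit is reported. The spectra below are synthetic.

    ```python
    # Toy curve matching between an asteroid spectrum and laboratory spectra.
    import numpy as np

    def best_match(ast_wl, ast_refl, lab_spectra):
        """lab_spectra: dict name -> (wavelength, reflectance) arrays."""
        scores = {}
        for name, (wl, refl) in lab_spectra.items():
            ast_on_lab = np.interp(wl, ast_wl, ast_refl)   # common grid
            a = ast_on_lab / ast_on_lab.mean()             # crude normalization
            b = refl / refl.mean()
            scores[name] = np.sum((a - b) ** 2 / b)        # chi-square-like misfit
        return min(scores, key=scores.get), scores

    wl = np.linspace(0.45, 2.45, 50)                       # micrometres
    asteroid = 1.0 + 0.10 * (wl - 0.45)                    # reddened slope
    library = {
        "ordinary_chondrite": (wl, 1.0 + 0.09 * (wl - 0.45)),
        "carbonaceous_chondrite": (wl, np.full_like(wl, 1.0)),
    }
    print(best_match(wl, asteroid, library)[0])
    ```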

  5. Creating global comparative analyses of tectonic rifts, monogenetic volcanism and inverted relief

    NASA Astrophysics Data System (ADS)

    van Wyk de Vries, Benjamin

    2016-04-01

    I have been all around the world, and to other planets, and have travelled from the present to the Archaean and back, to seek out the most significant tectonic rifts, monogenetic volcanoes and examples of inverted relief. I have done this to provide a broad foundation of comparative analysis for the Chaîne des Puys - Limagne fault nomination to UNESCO World Heritage. This would have been an impossible task, if not for the cooperation of the scientific community and for Google Earth, Google Maps and academic search engines. In preparing global comparisons of geological features, these quite recently developed tools provide a powerful way to find and describe geological features. The ability to do scientific crowd sourcing, rapidly discussing features with colleagues, allows large numbers of areas to be checked, and the open GIS tools (such as Google Earth) allow a standardised description. Search engines also allow the literature on areas to be checked and compared. I will present a comparative study of rifts of the world, monogenetic volcanic fields and inverted relief, integrated to analyse the full geological system represented by the Chaîne des Puys - Limagne fault. The analysis confirms that the site is an exceptional example of the first steps of continental drift in a mountain rift setting, and that this is necessarily seen through the combined landscape of tectonic, volcanic and geomorphic features. The analysis goes further to deepen the understanding of geological systems and stresses the need for more study of geological heritage using such a global and broad systems approach.

  6. The Astronomy Workshop: Scientific Notation and Solar System Visualizer

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2008-09-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes and by the general public. The philosophy of the site is to foster student interest in astronomy by exploiting their fascination with computers and the internet. We have expanded the "Scientific Notation" tool from simply converting decimal numbers into and out of scientific notation to adding, subtracting, multiplying, and dividing numbers expressed in scientific notation. Students practice these skills and, when confident, may complete a quiz. In addition, there are suggestions on how instructors may use the site to encourage students to practice these basic skills. The Solar System Visualizer animates orbits of planets, moons, and rings to scale. Extrasolar planetary systems are also featured. This research was sponsored by NASA EPO grant NNG06GGF99G.
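
    The arithmetic the expanded tool drills can be illustrated with a few lines of bookkeeping (this is not the Astronomy Workshop's code): rewrite both numbers to a common exponent, combine the mantissas, then renormalize so the mantissa lies in [1, 10).

    ```python
    # Adding two numbers expressed in scientific notation, keeping the
    # mantissa/exponent representation explicit throughout.
    import math

    def add_scientific(m1, e1, m2, e2):
        """(m1 x 10^e1) + (m2 x 10^e2), returned as (mantissa, exponent)."""
        e = max(e1, e2)                              # common exponent
        total = m1 * 10 ** (e1 - e) + m2 * 10 ** (e2 - e)
        if total == 0:
            return 0.0, 0
        shift = math.floor(math.log10(abs(total)))   # renormalize mantissa
        return total / 10 ** shift, e + shift

    print(add_scientific(3.2, 4, 4.5, 3))   # 3.2e4 + 4.5e3 -> (3.65, 4)
    ```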

  7. Exploring Scientific Information for Policy Making under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Forni, L.; Galaitsi, S.; Mehta, V. K.; Escobar, M.; Purkey, D. R.; Depsky, N. J.; Lima, N. A.

    2016-12-01

    Each actor evaluating potential management strategies brings her/his own distinct set of objectives to a complex decision space of system uncertainties. The diversity of these objectives requires detailed and rigorous analyses that respond to multifaceted challenges. However, the utility of this information depends on the accessibility of scientific information to decision makers. This paper demonstrates data visualization tools for presenting scientific results to decision makers in two case studies, La Paz/El Alto, Bolivia, and Yuba County, California. Visualization output from the case studies combines spatiotemporal, multivariate and multi-run/multi-scenario information to produce information corresponding to the objectives defined by key actors and stakeholders. These tools can manage complex data and distill scientific information into accessible formats. Using the visualizations, scientists and decision makers can navigate the decision space and potential objective trade-offs to facilitate discussion and consensus building. These efforts can support the identification of stable negotiated agreements between different stakeholders.

  8. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
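
    A hedged sketch of the kind of on-demand provisioning such a toolset automates, using the standard boto3 EC2 API; the AMI ID, key pair and instance count are placeholders, and a real SCC deployment would additionally configure networking, shared storage and MPI between the nodes.

    ```python
    # Provision a small virtual cluster on EC2 (placeholders throughout; this
    # is not the SCC toolset itself, just the underlying API pattern).
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    nodes = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder scientific VM image
        InstanceType="c5.xlarge",
        MinCount=4, MaxCount=4,            # a 4-node virtual cluster
        KeyName="scc-keypair",             # placeholder key pair
    )

    for node in nodes:
        node.wait_until_running()
        node.reload()                      # refresh attributes such as DNS name

    print([n.public_dns_name for n in nodes])
    ```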

  9. Developing multiple-choices test items as tools for measuring the scientific-generic skills on solar system

    NASA Astrophysics Data System (ADS)

    Bhakti, Satria Seto; Samsudin, Achmad; Chandra, Didi Teguh; Siahaan, Parsaoran

    2017-05-01

    The aim of this research is to develop multiple-choice test items as tools for measuring scientific generic skills on the solar system. To achieve this aim, the researchers used the ADDIE model, consisting of Analyzing, Design, Development, Implementation, and Evaluation, as the research method. The scientific generic skills were limited in this research to five indicators: (1) indirect observation, (2) awareness of scale, (3) logical inference, (4) causal relations, and (5) mathematical modeling. The participants were 32 students at a junior high school in Bandung. The results show that the constructed multiple-choice test items were declared valid by the expert validator, and subsequent testing shows that the developed multiple-choice test items are able to measure scientific generic skills on the solar system.

  10. [Using Twitter in oncology. Research, continuing education, and advocacy].

    PubMed

    De Fiore, Luciano; Ascierto, Paolo

    2015-01-01

    Traditional mass media coverage has been enhanced by Twitter, an interactive, real-time medium that is useful in health care, and particularly in oncology. Social media such as Twitter are gaining increasing acceptance as tools for instantaneous scientific dialogue. Professional medical societies such as ASCO and ESMO are using microblogging to expand the reach of scientific communications at and around their scientific meetings. To widen the message and maximize the potential for word-of-mouth marketing using Twitter, organizations (such as AIOM, ASCO or ESMO) and industry need a strategic communications plan to ensure ongoing social media conversations. Twitter is indeed a very powerful tool that amplifies the results of scientific meetings, and conference organisers should put strategies in place to capitalise on this. This review shows that cancer patients also increasingly share information about their disease via Twitter, including diagnosis, symptoms, and treatments. This information could prove useful to health care providers.

  11. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194

  12. Scientific, technological, and economic aspects of rapid tooling by electric arc spray forming

    NASA Astrophysics Data System (ADS)

    Grant, P. S.; Duncan, S. R.; Roche, A.; Johnson, C. F.

    2006-12-01

    For the last seven years, Oxford University and Ford Motor Company personnel have been jointly researching the development of large-scale spray forming of steel tooling capable of use in mass production, particularly for the pressing of sheet metal in automotive applications. These investigations have involved comprehensive microstructure and property studies, modeling of shape evolution and heat flow, real-time feedback control of tool temperature to eliminate tool distortion, high-speed imaging and particle image velocimetry of droplet deposition on three-dimensional (3D) shapes, testing of full-scale tools for different applications in the production environment, and detailed studies of the cost and time savings realized for different tooling applications. This paper provides an overview of the scientific and technical progress to date, presents the latest results, and describes the current state of the art. Many of the insights described have relevance and applicability across the family of thermal spray processes and applications.

  13. Visualization: A pathway to enhanced scientific productivity in the expanding missions of Space and Earth Sciences

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P.

    1995-01-01

    The movement toward the solution of problems involving large-scale system science, the ever-increasing capabilities of three-dimensional, time-dependent numerical models, and the enhanced capabilities of 'in situ' and remote sensing instruments bring a new era of scientific endeavor that requires an important change in our approach to mission planning and the task of data reduction and analysis. Visualization is at the heart of the requirements for a much-needed enhancement in scientific productivity as we face these new challenges. This article draws a perspective on the problem as it crosses discipline boundaries from solar physics to atmospheric and ocean sciences. It also attempts to introduce visualization as a new approach to scientific discovery and a tool which expedites and improves our insight into physically complex problems. A set of simple illustrations demonstrates a number of visualization techniques and the discussion emphasizes the trial-and-error and search-and-discover modes that are necessary for the techniques to reach their full potential. Further discussions also point to the importance of integrating data access, management, mathematical operations, and visualization into a single system. Some of the more recent developments in this area are reviewed.

  14. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  15. Mycobacterial biomaterials and resources for researchers.

    PubMed

    Hazbón, Manzour Hernando; Rigouts, Leen; Schito, Marco; Ezewudo, Matthew; Kudo, Takuji; Itoh, Takashi; Ohkuma, Moriya; Kiss, Katalin; Wu, Linhuan; Ma, Juncai; Hamada, Moriyuki; Strong, Michael; Salfinger, Max; Daley, Charles L; Nick, Jerry A; Lee, Jung-Sook; Rastogi, Nalin; Couvin, David; Hurtado-Ortiz, Raquel; Bizet, Chantal; Suresh, Anita; Rodwell, Timothy; Albertini, Audrey; Lacourciere, Karen A; Deheer-Graham, Ana; Alexander, Sarah; Russell, Julie E; Bradford, Rebecca; Riojas, Marco A

    2018-06-01

    There are many resources available to mycobacterial researchers, including culture collections around the world that distribute biomaterials to the general scientific community, genomic and clinical databases, and powerful bioinformatics tools. However, many of these resources may be unknown to the research community. This review article aims to summarize and publicize many of these resources, thus strengthening the quality and reproducibility of mycobacterial research by providing the scientific community access to authenticated and quality-controlled biomaterials and a wealth of information, analytical tools and research opportunities.

  16. The Units Ontology: a tool for integrating units of measurement in science

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization caters for the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies are a means of facilitating the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432

  17. Beware of Geeks Bearing Gifts - Are we Meeting the Requirements of our User Communities?

    NASA Astrophysics Data System (ADS)

    Klump, J.

    2007-12-01

    The 20th century brought about an "information revolution" that has forever altered the way we work, communicate, and live. The way science has been conducted for the past 200 years has been challenged by new media of communication and for the dissemination of data. We now have the tools at hand, commonly called cyberinfrastructure, that enable new forms of global collaboration. But are we fully realising the potential of cyberinfrastructure? Has it become an integral part of our scientific culture? Tools developed in Earth and Space Science Informatics projects suffer the same effects as informatics developments in other fields. Many of the projects fail to meet user requirements, and they do so for a number of reasons. Besides a certain reluctance on the part of scientists to adopt new tools for conducting their research, many cyberinfrastructure projects suffer from "marketing myopia" (Levitt, 1960) in the way they try to "sell" their applications. According to Levitt, the difference between selling and marketing is that the former fulfils the needs of the seller and the latter the needs of the buyer. Cyberinfrastructure projects must stop trying to sell their achievements to the scientific community, and instead market them by considering the scientists' needs right at the beginning of their endeavours. Admittedly, the requirements of scientific user communities are "moving targets", because scientific workflows are often subject to ad-hoc changes, depending on the outcome of the preceding step. Another important risk factor faced by many cyberinfrastructure projects is that the designated user community is not aware of the availability of this new resource. This is where training and outreach are essential, especially to draw in early adopters of new technology and multipliers among researchers. Only cyberinfrastructure tools that truly serve their designated user community will eventually become part of the scientific infrastructure. This presentation looks at the factors and strategies that affect adoption of cyberinfrastructures by the scientific community.

  18. The Lunar Mapping and Modeling Project

    NASA Technical Reports Server (NTRS)

    Nall, M.; French, R.; Noble, S.; Muery, K.

    2010-01-01

    The Lunar Mapping and Modeling Project (LMMP) is managing a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. Although the project was initiated primarily to serve the needs of the Constellation program, it is equally suited for supporting landing site selection and planning for a variety of robotic missions, including NASA science and/or human precursor missions and commercial missions such as those planned by the Google Lunar X-Prize participants. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities.

  19. Bioinformatics-based tools in drug discovery: the cartography from single gene to integrative biological networks.

    PubMed

    Ramharack, Pritika; Soliman, Mahmoud E S

    2018-06-01

    Originally developed for the analysis of biological sequences, bioinformatics has advanced into one of the most widely recognized domains in the scientific community. Despite this technological evolution, there is still an urgent need for nontoxic and efficient drugs. The onus now falls on the 'omics domain to meet this need by implementing bioinformatics techniques that will allow for the introduction of pioneering approaches in the rational drug design process. Here, we categorize an updated list of informatics tools and explore the capabilities of integrative bioinformatics in disease control. We believe that our review will serve as a comprehensive guide toward bioinformatics-oriented disease and drug discovery research. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Air Markets Program Data (AMPD)

    EPA Pesticide Factsheets

    The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.
