Sample records for enable scientific modeling

  1. A Model for Enabling an Effective Outcome-Oriented Communication between the Scientific and Educational Communities

    ERIC Educational Resources Information Center

    Ledley, Tamara Shapiro; Taber, Michael R.; Lynds, Susan; Domenico, Ben; Dahlman, LuAnn

    2012-01-01

    Traditionally, there has been a large gap between the scientific and educational communities in terms of communication, which hinders the transfer of new scientific knowledge to teachers and students and the understanding of each other's needs and capabilities. In this paper, we describe a workshop model we have developed to facilitate…

  2. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  3. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  4. Evaluation of Student Models on Current Socio-Scientific Topics Based on System Dynamics

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2014-01-01

    This study aims to 1) enable primary school students to develop models that will help them understand and analyze a system, through a learning process based on system dynamics approach, 2) examine and evaluate students' models related to socio-scientific issues using certain criteria. The research method used is a case study. The study sample…

  5. Visual representation of scientific information.

    PubMed

    Wong, Bang

    2011-02-15

    Great technological advances have enabled researchers to generate an enormous amount of data. Data analysis is replacing data generation as the rate-limiting step in scientific research. With this wealth of information, we have an opportunity to understand the molecular causes of human diseases. However, the unprecedented scale, resolution, and variety of data pose new analytical challenges. Visual representation of data offers insights that can lead to new understanding, whether the purpose is analysis or communication. This presentation shows how art, design, and traditional illustration can enable scientific discovery. Examples will be drawn from the Broad Institute's Data Visualization Initiative, aimed at establishing processes for creating informative visualization models.

  6. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies such as virtual reality, augmented reality, and remote collaboration techniques are being adopted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.

  7. Visualizing the Heliosphere

    NASA Technical Reports Server (NTRS)

    Bridgman, William T.; Shirah, Greg W.; Mitchell, Horace G.

    2008-01-01

    Today, scientific data and models can combine with modern animation tools to produce compelling visualizations to inform and educate. The Scientific Visualization Studio at Goddard Space Flight Center merges these techniques from the very different worlds of entertainment and science to enable scientists and the general public to 'see the unseeable' in new ways.

  8. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.
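
    A minimal sketch of the idea described above, i.e. binding a vocabulary term to both the data it applies to and the algorithm that evaluates it, so a system can compute a scientific concept directly from simulation output. The dictionary layout and all names below are invented for illustration and are not the SCV/CDAS implementation.

```python
# Tiny sketch: each vocabulary term is bound to a data field and to the
# algorithm that evaluates the concept against that field.  Invented names;
# not the SCV/CDAS implementation.
concept_vocabulary = {
    "peak_temperature": {
        "applies_to": "temperature",                    # field in the output
        "evaluate": lambda values: max(values),         # algorithm for the concept
    },
    "hotspot_present": {
        "applies_to": "temperature",
        "evaluate": lambda values: max(values) > 600.0, # a derived "perception"
    },
}

simulation_output = {"temperature": [280.0, 455.0, 612.5, 590.0]}

for name, entry in concept_vocabulary.items():
    data = simulation_output[entry["applies_to"]]
    print(name, "->", entry["evaluate"](data))
```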

  9. The Dream

    NASA Astrophysics Data System (ADS)

    Peach, D.

    2010-12-01

    Models have become a routine research output over the past decade. They are used by many scientific disciplines in order to understand and analyse the processes and conditions within their domain of interest. This has resulted in a significant increase in scientific understanding and in a multitude of discipline-specific models, modelling system software and workflows. There is now a growing realisation that to address the most pertinent questions of the age, such as climate change and the sustainable use of natural resources, we need to bring together climate, ecological, hydrological, hydrogeological, geological and socio-economic models in order to provide the necessary framework to make truly informed decisions. At the British Geological Survey our vision is to provide scientists with the data, tools, techniques and support to address trans-disciplinary environmental questions impacting on human society. We hope to achieve this by being a leading member of an open community that will share data, applications and environmental models, thus enabling collaboration and achieving sustainable solutions. The British Geological Survey has recently completed a scoping study with the aim of planning the implementation of the vision and preparing the organisation for the changes that are required to enable it to engage more effectively in trans-disciplinary modelling. This has resulted in the launch of a cross-cutting project called Data and Research for Environmental Applications and Models: The DREAM. The scoping study recognised that the investment and knowledge captured within the many existing scientific models represent a significant resource, and not one that could be easily replicated in more centralised environmental modelling software. The only viable option is a ‘linked models’ approach, which enables models to pass parameters between each other at runtime. This is perceived to be a pragmatic, achievable and cost-effective solution. This solution brings together the best and most appropriate scientific models and allows the various scientific disciplines to continue the development of their current models. The European Union has funded multi-national, multi-disciplinary research into ‘linked modelling’, using the Open Modelling Interface (OpenMI) standard. This software, used in conjunction with critical underpinning activities such as data management, semantics and ontologies, understanding of uncertainty and visualisation, offers a rapidly maturing solution with the potential to fulfil The DREAM.

  10. Scientific Inquiry Based Professional Development Models in Teacher Education

    ERIC Educational Resources Information Center

    Corlu, Mehmet Ali; Corlu, M. Sencer

    2012-01-01

    Scientific inquiry helps students develop critical thinking abilities and enables students to think and construct knowledge like a scientist. The study describes a method course implementation at a major public teachers college in Turkey. The main goal of the course was to improve research and teaching abilities of prospective physics teachers…

  11. PyMT: A Python package for model-coupling in the Earth sciences

    NASA Astrophysics Data System (ADS)

    Hutton, E.

    2016-12-01

    The current landscape of Earth-system models is not only broad in scientific scope, but also broad in type. On the one hand, the large variety of models is exciting, as it provides fertile ground for extending or linking models together in novel ways to answer new scientific questions. However, the heterogeneity in model type acts to inhibit model coupling, model development, or even model use. Existing models are written in a variety of programming languages, operate on different grids, use their own file formats (both for input and output), have different user interfaces, have their own time steps, etc. Each of these factors becomes an obstruction to scientists wanting to couple, extend - or simply run - existing models. For scientists whose main focus is not computer science, these barriers become even larger and pose significant logistical hurdles. And this is all before the scientific difficulties of coupling or running models are addressed. The CSDMS Python Modeling Toolkit (PyMT) was developed to help non-computer scientists deal with these sorts of modeling logistics. PyMT is the fundamental package the Community Surface Dynamics Modeling System uses for the coupling of models that expose the Basic Model Interface (BMI). It contains: tools necessary for coupling models of disparate time and space scales (including grid mappers); time-steppers that coordinate the sequencing of coupled models; exchange of data between BMI-enabled models; wrappers that automatically load BMI-enabled models into the PyMT framework; utilities that support open-source interfaces (UGRID, SGRID, CSDMS Standard Names, etc.); a collection of community-submitted models, written in a variety of programming languages and from a variety of process domains, but all usable from within the Python programming language; and a plug-in framework for adding additional BMI-enabled models. In this presentation we introduce the basics of PyMT and provide an example of coupling models of different domains and grid types.
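
    For readers unfamiliar with BMI, the sketch below shows the general shape of the interface that makes this kind of coupling possible: each model exposes initialize/update/get_value/set_value/finalize methods, and a framework steps the models and passes values between them. The two toy models and the coupling loop are illustrative only; they are not the PyMT or CSDMS implementations.

```python
# Minimal sketch of the Basic Model Interface (BMI) idea: the method names
# follow the BMI convention, but the models and the coupling loop are toys,
# not the actual PyMT or CSDMS code.

class HeatSource:
    """Toy 'model' whose temperature rises over time."""
    def initialize(self, config=None):
        self.time, self.dt, self.temperature = 0.0, 1.0, 10.0
    def update(self):
        self.temperature += 0.5
        self.time += self.dt
    def get_value(self, name):
        assert name == "surface_temperature"
        return self.temperature
    def finalize(self):
        pass

class Diffuser:
    """Toy 'model' that relaxes its state toward a boundary value."""
    def initialize(self, config=None):
        self.time, self.dt, self.state, self.boundary = 0.0, 1.0, 0.0, 0.0
    def set_value(self, name, value):
        assert name == "boundary_temperature"
        self.boundary = value
    def update(self):
        self.state += 0.2 * (self.boundary - self.state)
        self.time += self.dt
    def finalize(self):
        pass

source, sink = HeatSource(), Diffuser()
source.initialize()
sink.initialize()

# The coupling framework's job: step both models and pass values between them.
for _ in range(10):
    source.update()
    sink.set_value("boundary_temperature", source.get_value("surface_temperature"))
    sink.update()

print(sink.state)
source.finalize()
sink.finalize()
```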

  12. A Model for Math Modeling

    ERIC Educational Resources Information Center

    Lin, Tony; Erfan, Sasan

    2016-01-01

    Mathematical modeling is an open-ended research subject where no definite answers exist for any problem. Math modeling enables thinking outside the box to connect different fields of studies together including statistics, algebra, calculus, matrices, programming and scientific writing. As an integral part of society, it is the foundation for many…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strout, Michelle

    Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs into full programs through the use of pragmas. These smaller, more restricted programming models enable orthogonal specification of many implementation details such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions of many scientific simulations. The ability to orthogonally manipulate the implementation for such computations will significantly ease performance programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.
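
    The SAIMI work injects these small programming models into compiled codes through pragmas. As a loose, language-neutral analogue (not the SAIMI tooling), the sketch below uses a Python decorator to attach an implementation specification to a kernel without touching the kernel itself, which is the "orthogonal specification" idea in miniature; the decorator name and spec fields are invented.

```python
# Analogue only (not the SAIMI pragma mechanism): attach implementation details
# to a kernel orthogonally, so the algorithm and its mapping/scheduling choices
# can vary independently.  `implementation_spec` and its fields are illustrative.

def implementation_spec(**details):
    """Record how a kernel should be mapped/scheduled without changing it."""
    def wrap(func):
        func.spec = details          # e.g. consumed later by an autotuner
        return func
    return wrap

@implementation_spec(schedule="static", tile=64, storage="contiguous")
def stencil(u, out):
    """The algorithm itself: a 1-D three-point stencil, spec-agnostic."""
    for i in range(1, len(u) - 1):
        out[i] = 0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[i + 1]

u = [0.0, 1.0, 4.0, 9.0, 16.0, 25.0]
out = [0.0] * len(u)
stencil(u, out)
print(out, stencil.spec)   # the spec is visible to tools, separate from the code
```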

  14. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
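
    The abstract above is truncated, but the title describes exposing an environmental model behind a RESTful endpoint. The sketch below is a minimal, standard-library illustration of that pattern; the endpoint path, parameter names, and the toy decay model are invented for the example and are not the EPA services or their APIs.

```python
# Minimal "model as a service" sketch using only the Python standard library.
# The /decay endpoint, its parameters, and the model are invented for
# illustration; the services described above define their own APIs.
import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def decay_model(initial: float, rate: float, t: float) -> float:
    """Stand-in environmental model: simple first-order decay."""
    return initial * math.exp(-rate * t)

class ModelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/decay":
            self.send_error(404)
            return
        q = parse_qs(url.query)
        result = decay_model(float(q.get("initial", ["1.0"])[0]),
                             float(q.get("rate", ["0.1"])[0]),
                             float(q.get("t", ["0.0"])[0]))
        body = json.dumps({"concentration": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g.  GET http://localhost:8000/decay?initial=10&rate=0.05&t=24
    HTTPServer(("localhost", 8000), ModelHandler).serve_forever()
```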

  16. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  17. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  18. Discovering Mendeleev's Model.

    ERIC Educational Resources Information Center

    Sterling, Donna

    1996-01-01

    Presents an activity that introduces the historical developments in science that led to the discovery of the periodic table and lets students experience scientific discovery firsthand. Enables students to learn about patterns among the elements and experience how scientists analyze data to discover patterns and build models. (JRH)

  19. Nanoethics, Science Communication, and a Fourth Model for Public Engagement.

    PubMed

    Miah, Andy

    2017-01-01

    This paper develops a fourth model of public engagement with science, grounded in the principle of nurturing scientific agency through participatory bioethics. It argues that social media is an effective device through which to enable such engagement, as it has the capacity to empower users and transform audiences into co-producers of knowledge, rather than consumers of content. Social media also fosters greater engagement with the political and legal implications of science, thus promoting the value of scientific citizenship. This argument is explored by considering the case of nanoscience and nanotechnology, as an exemplar for how emerging technologies may be handled by the scientific community and science policymakers.

  20. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koziol, Quincey

    The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.

  1. Enabling NVM for Data-Intensive Scientific Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carns, Philip; Jenkins, John; Seo, Sangmin

    Specialized, transient data services are playing an increasingly prominent role in data-intensive scientific computing. These services offer flexible, on-demand pairing of applications with storage hardware using semantics that are optimized for the problem domain. Concurrent with this trend, upcoming scientific computing and big data systems will be deployed with emerging NVM technology to achieve the highest possible price/productivity ratio. Clearly, therefore, we must develop techniques to facilitate the confluence of specialized data services and NVM technology. In this work we explore how to enable the composition of NVM resources within transient distributed services while still retaining their essential performance characteristics. Our approach involves eschewing the conventional distributed file system model and instead projecting NVM devices as remote microservices that leverage user-level threads, RPC services, RMA-enabled network transports, and persistent memory libraries in order to maximize performance. We describe a prototype system that incorporates these concepts, evaluate its performance for key workloads on an exemplar system, and discuss how the system can be leveraged as a component of future data-intensive architectures.
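
    As a toy illustration of "projecting a device as a remote microservice" (and nothing more), the sketch below exposes an in-memory dictionary, standing in for an NVM region, as a remote get/put service using only the Python standard library. The real prototype described above relies on user-level threads, RMA-enabled transports, and persistent-memory libraries for performance; none of that is reflected here.

```python
# Toy sketch: a key/value store standing in for an NVM device, exposed over RPC
# via the standard library.  Illustrates only the "device as a remote
# microservice" idea, not the performance-oriented design described above.
from xmlrpc.server import SimpleXMLRPCServer

_device = {}        # stand-in for an NVM region

def put(key: str, value: str) -> bool:
    _device[key] = value
    return True

def get(key: str) -> str:
    return _device.get(key, "")

if __name__ == "__main__":
    server = SimpleXMLRPCServer(("localhost", 9090), allow_none=True)
    server.register_function(put, "put")
    server.register_function(get, "get")
    # A client would call it with:
    #   from xmlrpc.client import ServerProxy
    #   ServerProxy("http://localhost:9090").put("chunk-0", "payload")
    server.serve_forever()
```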

  2. A Scientific Software Product Line for the Bioinformatics domain.

    PubMed

    Costa, Gabriella Castro B; Braga, Regina; David, José Maria N; Campos, Fernanda

    2015-08-01

    Most specialized users (scientists) that use bioinformatics applications do not have suitable training on software development. Software Product Line (SPL) employs the concept of reuse considering that it is defined as a set of systems that are developed from a common set of base artifacts. In some contexts, such as in bioinformatics applications, it is advantageous to develop a collection of related software products, using the SPL approach. If software products are similar enough, it is possible to predict their commonalities and differences and then reuse these common features to support the development of new applications in the bioinformatics area. This paper presents the PL-Science approach, which considers the context of SPL and ontology in order to assist scientists in defining a scientific experiment and in specifying a workflow that encompasses the bioinformatics applications of a given experiment. This paper also focuses on the use of ontologies to enable the use of Software Product Line in biological domains. In the context of this paper, Scientific Software Product Line (SSPL) differs from the Software Product Line due to the fact that SSPL uses an abstract scientific workflow model. This workflow is defined according to a scientific domain, and using this abstract workflow model the products (scientific applications/algorithms) are instantiated. Through the use of ontology as a knowledge representation model, we can provide domain restrictions as well as add semantic aspects in order to facilitate the selection and organization of bioinformatics workflows in a Scientific Software Product Line. The use of ontologies enables not only the expression of formal restrictions but also inferences on these restrictions, considering that a scientific domain needs a formal specification. This paper presents the development of the PL-Science approach, encompassing a methodology and an infrastructure, and also presents an approach evaluation. This evaluation presents case studies in bioinformatics, which were conducted in two renowned research institutions in Brazil. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. An open annotation ontology for science on web 3.0

    PubMed Central

    2011-01-01

    Background: There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Methods: Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools was then developed along with a metadata model in OWL, and deployed, for feedback and additional requirements on the ontology, to users at a major pharmaceutical company and a major academic center. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. Results: This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables “stand-off” or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under an open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO’s Google Code page: http://code.google.com/p/annotation-ontology/. Conclusions: The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors. PMID:21624159

  4. An open annotation ontology for science on web 3.0.

    PubMed

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

    There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools was then developed along with a metadata model in OWL, and deployed, for feedback and additional requirements on the ontology, to users at a major pharmaceutical company and a major academic center. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under an open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/. The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors.
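
    To make the "stand-off" annotation idea concrete, the sketch below shows metadata that points into a document by position rather than being embedded in it, with minimal provenance fields. The field names and URIs are invented for illustration; AO defines its own OWL classes and properties.

```python
# Illustration of "stand-off" annotation: metadata anchored to a position in a
# document that it does not modify.  Field names and URIs are invented for the
# example; they are not the Annotation Ontology's classes or properties.
from dataclasses import dataclass

@dataclass
class StandoffAnnotation:
    document_uri: str      # the annotated web document (left untouched)
    start: int             # character offset where the selection begins
    end: int               # character offset where it ends
    exact_text: str        # copy of the selected text, for robustness
    topic_uri: str         # term from a domain ontology
    created_by: str        # provenance: who or what produced the annotation
    version: int = 1       # provenance: supports versioned annotation sets

text = "BRCA1 is a human tumor suppressor gene."
ann = StandoffAnnotation(
    document_uri="http://example.org/paper123",
    start=0, end=5,
    exact_text=text[0:5],
    topic_uri="http://example.org/vocab/gene/BRCA1",
    created_by="text-mining-pipeline-v2",
)
print(ann)
```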

  5. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as Amazon AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  6. Mining and Indexing Graph Databases

    ERIC Educational Resources Information Center

    Yuan, Dayu

    2013-01-01

    Graphs are widely used to model structures and relationships of objects in various scientific and commercial fields. Chemical molecules, proteins, malware system-call dependencies and three-dimensional mechanical parts are all modeled as graphs. In this dissertation, we propose to mine and index those graph data to enable fast and scalable search.…

  7. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  8. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.

  9. Enabling Data Fusion via a Common Data Model and Programming Interface

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2011-12-01

    Much progress has been made in scientific data interoperability, especially in the areas of metadata and discovery. However, while a data user may have improved techniques for finding data, there is often a large chasm to span when it comes to acquiring the desired subsets of various datasets and integrating them into a data processing environment. Some tools such as OPeNDAP servers and the Unidata Common Data Model (CDM) have introduced improved abstractions for accessing data via a common interface, but they alone do not go far enough to enable fusion of data from multidisciplinary sources. Although data from various scientific disciplines may represent semantically similar concepts (e.g. time series), the user may face widely varying structural representations of the data (e.g. row versus column oriented), not to mention radically different storage formats. It is not enough to convert data to a common format. The key to fusing scientific data is to represent each dataset with consistent sampling. This can best be done by using a data model that expresses the functional relationship that each dataset represents. The domain of those functions determines how the data can be combined. The Visualization for Algorithm Development (VisAD) Java API has provided a sophisticated data model for representing the functional nature of scientific datasets for well over a decade. Because VisAD is largely designed for its visualization capabilities, the data model can be cumbersome to use for numerical computation, especially for those not comfortable with Java. Although both VisAD and the implementation of the CDM are written in Java, neither defines a pure Java interface that others could implement and program to, further limiting potential for interoperability. In this talk, we will present a solution for data integration based on a simple discipline-agnostic scientific data model and programming interface that enables a dataset to be defined in terms of three variable types: Scalar (a), Tuple (a,b), and Function (a -> b). These basic building blocks can be combined and nested to represent any arbitrarily complex dataset. For example, a time series of surface temperature and pressure could be represented as: time -> ((lon,lat) -> (T,P)). Our data model is expressed in UML and can be implemented in numerous programming languages. We will demonstrate an implementation of our data model and interface using the Scala programming language. Given its functional programming constructs, sophisticated type system, and other language features, Scala enables us to construct complex data structures that can be manipulated using natural mathematical expressions while taking advantage of the language's ability to operate on collections in parallel. This API will be applied to the problem of assimilating various measurements of the solar spectrum and other proxies from multiple sources to construct a composite Lyman-alpha irradiance dataset.
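
    The authors implement these building blocks in Scala; the sketch below is a rough Python rendering of the same Scalar/Tuple/Function idea and of the example dataset time -> ((lon,lat) -> (T,P)). Class names and layout are illustrative only, not the authors' API.

```python
# Rough rendering of the Scalar / Tuple / Function building blocks described
# above.  The authors' implementation is in Scala; this Python version only
# sketches the idea and is not their API.
class Scalar:
    def __init__(self, name, value):
        self.name, self.value = name, value

class Tuple:
    def __init__(self, *variables):
        self.variables = variables

class Function:
    """A dataset as a functional relationship: domain sample -> range sample."""
    def __init__(self, domain_name, samples):
        self.domain_name = domain_name
        self.samples = samples            # dict: domain value -> range variable
    def __call__(self, x):
        return self.samples[x]

# time -> ((lon, lat) -> (T, P)): a time series of gridded temperature/pressure
def grid_at(t):
    return Function("(lon,lat)", {
        (0.0, 0.0): Tuple(Scalar("T", 280.0 + t), Scalar("P", 1000.0)),
        (1.0, 0.0): Tuple(Scalar("T", 281.0 + t), Scalar("P", 1001.0)),
    })

series = Function("time", {t: grid_at(t) for t in (0.0, 1.0, 2.0)})
print(series(1.0)((0.0, 0.0)).variables[0].value)   # T at t=1, lon=0, lat=0
```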

  10. Web Based Semi-automatic Scientific Validation of Models of the Corona and Inner Heliosphere

    NASA Astrophysics Data System (ADS)

    MacNeice, P. J.; Chulaki, A.; Taktakishvili, A.; Kuznetsova, M. M.

    2013-12-01

    Validation is a critical step in preparing models of the corona and inner heliosphere for future roles supporting either or both the scientific research community and the operational space weather forecasting community. Validation of forecasting quality tends to focus on a short list of key features in the model solutions, with an unchanging order of priority. Scientific validation exposes a much larger range of physical processes and features, and as the models evolve to better represent features of interest, the research community tends to shift its focus to other areas which are less well understood and modeled. Given the more comprehensive and dynamic nature of scientific validation, and the limited resources available to the community to pursue this, it is imperative that the community establish a semi-automated process which engages the model developers directly into an ongoing and evolving validation process. In this presentation we describe the ongoing design and development of a web-based facility to enable this type of validation of models of the corona and inner heliosphere, on the growing list of model results being generated, and on strategies we have been developing to account for model results that incorporate adaptively refined numerical grids.

  11. AceTree: a major update and case study in the long term maintenance of open-source scientific software.

    PubMed

    Katzman, Braden; Tang, Doris; Santella, Anthony; Bao, Zhirong

    2018-04-04

    AceTree, a software application first released in 2006, facilitates exploration, curation and editing of tracked C. elegans nuclei in 4-dimensional (4D) fluorescence microscopy datasets. Since its initial release, AceTree has been continuously used to interact with, edit and interpret C. elegans lineage data. In its 11 year lifetime, AceTree has been periodically updated to meet the technical and research demands of its community of users. This paper presents the newest iteration of AceTree which contains extensive updates, demonstrates the new applicability of AceTree in other developmental contexts, and presents its evolutionary software development paradigm as a viable model for maintaining scientific software. Large scale updates have been made to the user interface for an improved user experience. Tools have been grouped according to functionality and obsolete methods have been removed. Internal requirements have been changed that enable greater flexibility of use both in C. elegans contexts and in other model organisms. Additionally, the original 3-dimensional (3D) viewing window has been completely reimplemented. The new window provides a new suite of tools for data exploration. By responding to technical advancements and research demands, AceTree has remained a useful tool for scientific research for over a decade. The updates made to the codebase have extended AceTree's applicability beyond its initial use in C. elegans and enabled its usage with other model organisms. The evolution of AceTree demonstrates a viable model for maintaining scientific software over long periods of time.

  12. Methods for Specifying Scientific Data Standards and Modeling Relationships with Applications to Neuroscience

    PubMed Central

    Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer

    2016-01-01

    Neuroscience continues to experience a tremendous growth in data; in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source, easy-to-use, and provides detailed user and developer documentation and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
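
    The two concepts named above can be illustrated with plain HDF5 via h5py: a group tagged as a self-contained "managed object", and an attribute recording a relationship to another object. The attribute names and file layout below are invented for the example; BRAINformat defines its own specification.

```python
# Sketch of a "managed object" (a typed, self-contained HDF5 group) and a
# "relationship attribute" (a pointer to another object) using plain h5py.
# Attribute names and layout are illustrative; BRAINformat defines its own spec.
import h5py
import numpy as np

with h5py.File("example_recording.h5", "w") as f:
    electrodes = f.create_group("electrodes")
    electrodes.attrs["object_type"] = "ElectrodeTable"      # managed-object tag
    electrodes.create_dataset("positions", data=np.zeros((4, 3)))

    epoch = f.create_group("epoch_000")
    epoch.attrs["object_type"] = "Epoch"
    epoch.create_dataset("voltage", data=np.random.rand(4, 1000))
    # Relationship attribute: record which electrodes this data came from.
    epoch.attrs["recorded_from"] = electrodes.name           # "/electrodes"
```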

  13. A Conceptual Model of the World of Work.

    ERIC Educational Resources Information Center

    VanRooy, William H.

    The conceptual model described in this paper resulted from the need to organize a body of knowledge related to the world of work which would enable curriculum developers to prepare accurate, realistic instructional materials. The world of work is described by applying Malinowski's scientific study of the structural components of culture. It is…

  14. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework has been integrated with MATLAB and the visualization, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.

  15. Simulating Technology Processes to Foster Learning.

    ERIC Educational Resources Information Center

    Krumholtz, Nira

    1998-01-01

    Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)

  16. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  17. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  18. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  19. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk; Maynard, Robert

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.

  20. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  1. Facilitating Stewardship of scientific data through standards based workflows

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are, firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define the scientific concepts and the relationships between these concepts. All three types of standards have to be utilised by the practicing scientist so that those who ultimately have to steward the data can ensure that the data are preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observation or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to connect disaggregated data to scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to the geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data are easily discoverable, geophysical processing can be applied to them, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
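
    The sketch below illustrates, in miniature, the two kinds of record the workflow above moves between: a discovery-level metadata entry in the spirit of ISO 19115 and an O&M-style observation. Field names mirror concepts from the standards, but the standards themselves define rich XML/UML schemas, so this is only an illustration; the survey values are invented.

```python
# Sketch of a discovery-level metadata record (ISO 19115-style) and an O&M-style
# observation.  Field names mirror concepts from the standards; the real
# standards define full XML/UML schemas, and the survey values are invented.
from dataclasses import dataclass

@dataclass
class DiscoveryMetadata:            # what a catalogue search sees first
    title: str
    abstract: str
    bounding_box: tuple             # (west, south, east, north)
    keywords: tuple

@dataclass
class Observation:                  # O&M: an act of observing a property
    phenomenon_time: str            # when the observed event happened
    procedure: str                  # sensor or method that produced the result
    observed_property: str          # vocabulary term for what was measured
    feature_of_interest: str        # what the observation is about
    result: float

record = DiscoveryMetadata(
    title="Example marine magnetics survey",
    abstract="Total magnetic intensity readings from an example survey.",
    bounding_box=(130.0, -40.0, 140.0, -30.0),
    keywords=("geophysics", "magnetics"),
)
obs = Observation("2013-05-04T03:21:00Z", "towed magnetometer",
                  "total magnetic intensity", "example survey line 7", 52345.6)
print(record.title, obs.result)
```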

  2. Laboratory Astrophysics: Enabling Scientific Discovery and Understanding

    NASA Technical Reports Server (NTRS)

    Kirby, K.

    2006-01-01

    NASA's Science Strategic Roadmap for Universe Exploration lays out a series of science objectives on a grand scale and discusses the various missions, over a wide range of wavelengths, which will enable discovery. Astronomical spectroscopy is arguably the most powerful tool we have for exploring the Universe. Experimental and theoretical studies in Laboratory Astrophysics convert "hard-won data into scientific understanding". However, the development of instruments with increasingly high spectroscopic resolution demands atomic and molecular data of unprecedented accuracy and completeness. How to meet these needs, in a time of severe budgetary constraints, poses a significant challenge to NASA, the astronomical observers and model-builders, and the laboratory astrophysics community. I will discuss these issues, together with some recent examples of productive astronomy/lab astro collaborations.

  3. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE PAGES

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.; ...

    2015-06-19

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  4. The Blue LED Nobel Prize: Historical context, current scientific understanding, human benefit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Han, Jung; Haitz, Roland H.

    Here, the paths that connect scientific understanding with tools and technology are rarely linear. Sometimes scientific understanding leads and enables, sometimes tools and technologies lead and enable. But by feeding on each other, they create virtuous spirals of forward and backward innovation.

  5. Ames Culture Chamber System: Enabling Model Organism Research Aboard the international Space Station

    NASA Technical Reports Server (NTRS)

    Steele, Marianne

    2014-01-01

    Understanding the genetic, physiological, and behavioral effects of spaceflight on living organisms and elucidating the molecular mechanisms that underlie these effects are high priorities for NASA. Certain organisms, known as model organisms, are widely studied to help researchers better understand how all biological systems function. Small model organisms such as nematodes, slime mold, bacteria, green algae, yeast, and moss can be used to study the effects of micro- and reduced gravity at both the cellular and systems level over multiple generations. Many model organisms have sequenced genomes and published data sets on their transcriptomes and proteomes that enable scientific investigations of the molecular mechanisms underlying the adaptations of these organisms to space flight.

  6. Taming theory with thought experiments: Understanding and scientific progress.

    PubMed

    Stuart, Michael T

    2016-08-01

    I claim that one way thought experiments contribute to scientific progress is by increasing scientific understanding. Understanding does not have a currently accepted characterization in the philosophical literature, but I argue that we already have ways to test for it. For instance, current pedagogical practice often requires that students demonstrate being in either or both of the following two states: 1) Having grasped the meaning of some relevant theory, concept, law or model, 2) Being able to apply that theory, concept, law or model fruitfully to new instances. Three thought experiments are presented which have been important historically in helping us pass these tests, and two others that cause us to fail. Then I use this operationalization of understanding to clarify the relationships between scientific thought experiments, the understanding they produce, and the progress they enable. I conclude that while no specific instance of understanding (thus conceived) is necessary for scientific progress, understanding in general is. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Technologies Enabling Scientific Exploration of Asteroids and Moons

    NASA Astrophysics Data System (ADS)

    Shaw, A.; Fulford, P.; Chappell, L.

    2016-12-01

    Scientific exploration of moons and asteroids is enabled by several key technologies that yield topographic information, allow excavation of subsurface materials, and allow delivery of higher-mass scientific payloads to moons and asteroids. These key technologies include lidar systems, robotics, and solar-electric propulsion spacecraft buses. Many of these technologies have applications for a variety of planetary targets. Lidar systems yield high-resolution shape models of asteroids and moons. These shape models can then be combined with radio science information to yield insight into density and internal structure. Further, lidar systems allow investigation of topographic surface features, large and small, which yields information on regolith properties. Robotic arms can be used for a variety of purposes, especially to support excavation, revealing subsurface material and acquiring material from depth for either in situ analysis or sample return. Robotic arms with built-in force sensors can also be used to gauge the strength of materials as a function of depth, yielding insight into regolith physical properties. Mobility systems allow scientific exploration of multiple sites, and also yield insight into regolith physical properties due to the interaction of wheels with regolith. High-power solar electric propulsion (SEP) spacecraft bus systems allow more science instruments to be included on missions given their ability to support greater payload mass. In addition, leveraging a cost-effective commercially-built SEP spacecraft bus can significantly reduce mission cost.

  8. Design, Development and Validation of a Model of Problem Solving for Egyptian Science Classes

    ERIC Educational Resources Information Center

    Shahat, Mohamed A.; Ohle, Annika; Treagust, David F.; Fischer, Hans E.

    2013-01-01

    Educators and policymakers envision the future of education in Egypt as enabling learners to acquire scientific inquiry and problem-solving skills. In this article, we describe the validation of a model for problem solving and the design of instruments for evaluating new teaching methods in Egyptian science classes. The instruments were based on…

  9. Detangling Spaghetti: Tracking Deep Ocean Currents in the Gulf of Mexico

    ERIC Educational Resources Information Center

    Curran, Mary Carla; Bower, Amy S.; Furey, Heather H.

    2017-01-01

    Creation of physical models can help students learn science by enabling them to be more involved in the scientific process of discovery and to use multiple senses during investigations. This activity achieves these goals by having students model ocean currents in the Gulf of Mexico. In general, oceans play a key role in influencing weather…

  10. Intrinsic ethics regarding integrated assessment models for climate management.

    PubMed

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  11. A system for environmental model coupling and code reuse: The Great Rivers Project

    NASA Astrophysics Data System (ADS)

    Eckman, B.; Rice, J.; Treinish, L.; Barford, C.

    2008-12-01

    As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) to assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we see a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels: models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in, e.g., Fortran or Python, that still enables efficient simulation runs, including parallelization. Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively and to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
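
    To make the composition-rule idea concrete, the following is a minimal, hypothetical sketch of checking that one component's declared outputs satisfy another component's declared inputs before the two are chained in a workflow; the metadata fields, component names, and units are invented for illustration and are not the Great Rivers framework itself.

    ```python
    # Hypothetical sketch of a composition check between two model components;
    # the metadata fields and example components are invented for illustration
    # and do not represent the actual Great Rivers Project framework.
    from dataclasses import dataclass, field

    @dataclass
    class Component:
        name: str
        inputs: dict = field(default_factory=dict)    # variable -> unit
        outputs: dict = field(default_factory=dict)   # variable -> unit

    def can_compose(upstream: Component, downstream: Component) -> list:
        """Return a list of reasons the composition is invalid (empty if valid)."""
        problems = []
        for var, unit in downstream.inputs.items():
            if var not in upstream.outputs:
                problems.append(f"{downstream.name} needs '{var}', "
                                f"which {upstream.name} does not produce")
            elif upstream.outputs[var] != unit:
                problems.append(f"unit mismatch on '{var}': "
                                f"{upstream.outputs[var]} vs {unit}")
        return problems

    water_balance = Component("water_balance",
                              inputs={"precipitation": "mm/day"},
                              outputs={"runoff": "m3/s"})
    water_quality = Component("water_quality",
                              inputs={"runoff": "m3/s", "nitrate_load": "kg/day"},
                              outputs={"nitrate_concentration": "mg/L"})

    issues = can_compose(water_balance, water_quality)
    print("valid composition" if not issues else issues)
    ```

    A real composition engine would add richer semantics (coordinate systems, time steps, calibration state), but the principle of validating declared interfaces before execution is the same.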

  12. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  13. Synchronous international scientific mobility in the space of affiliations: evidence from Russia.

    PubMed

    Markova, Yulia V; Shmatko, Natalia A; Katchanov, Yurij L

    2016-01-01

    The article presents a survey of Russian researchers' synchronous international scientific mobility as an element of the global scientific labor market. Synchronous international scientific mobility is the simultaneous holding of scientific positions in institutions located in different countries. The study explores bibliometric data from the Web of Science Core Collection and socio-economic indicators for 56 countries. In order to examine international scientific mobility, we use a method of affiliations. The paper introduces a model of synchronous international scientific mobility. It enables a country's involvement in the international division of scientific labor to be specified. Synchronous international scientific mobility is a modern form of the international division of labor in science. It encompasses various forms of part-time, temporary and remote employment of scientists. The analysis reveals the distribution of Russian authors in the space of affiliations, and directions of upward/downward international scientific mobility. The bibliometric characteristics of mobile authors are isomorphic to those of receiver-country authors. Synchronous international scientific mobility of Russian authors is determined by differences in scientific impact between receiver and donor countries.

  14. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    To present the method used to elaborate and formalize current scientific knowledge in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized recommendations for prevention or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process involves several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  15. Blizzard 2016

    NASA Image and Video Library

    2017-12-08

    A NASA Center for Climate Simulation supercomputer model that shows the flow of #Blizzard2016 through Sunday. Learn more here: go.nasa.gov/1WBm547 NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.

  16. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    NASA Technical Reports Server (NTRS)

    Racette, Paul

    2010-01-01

    Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.

  17. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
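
    To make the test-driven workflow concrete, here is a minimal, hypothetical sketch in the pytest style: the tests are written first and fail until the implementation below them is added. The function name, growth law, and tolerances are illustrative assumptions, not the snowflake model discussed in the abstract.

    ```python
    # Hypothetical TDD illustration for a simple scientific model; the module
    # layout, function, and growth law are assumptions for this sketch only.
    import math

    import pytest

    # Tests written first: they fail until snowflake_mass() below is implemented.
    def test_snowflake_mass_grows_monotonically():
        masses = [snowflake_mass(t) for t in (0.0, 10.0, 100.0)]
        assert masses == sorted(masses), "mass should never decrease with time"

    def test_snowflake_mass_initial_condition():
        assert snowflake_mass(0.0) == pytest.approx(1.0e-12, rel=1e-9)

    # Minimal implementation added afterwards to make the tests pass.
    def snowflake_mass(t_seconds, m0=1.0e-12, rate=1.0e-3):
        """Toy exponential growth law (illustrative): m(t) = m0 * exp(rate * t)."""
        return m0 * math.exp(rate * t_seconds)
    ```

    Running pytest on this file exercises the tests; in practice the tests would also encode the physical assumptions (units, asymptotic behavior) that document the model.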

  18. [The transfer of scientific knowledge in clinical practice].

    PubMed

    Pearson, Alan

    2012-12-01

    The rapid growth of evidence-based practice has been facilitated by the development of theoretical models enabling the components of this concept to be assimilated. The Joanna Briggs Institute is an international organisation which offers a wide range of resources in order to promote this practice in the health sector.

  19. Conceptualizing the World of Work

    ERIC Educational Resources Information Center

    Van Rooy, William H.; Bailey, Larry J.

    1973-01-01

    The conceptual model described here has resulted from the need to organize a body of knowledge related to the world of work which would enable curriculum developers to prepare accurate, realistic instructional materials. The world of work is described by applying Malinowski's scientific study of the structural components of culture. (Author/DS)

  20. Improvements in GRACE Gravity Field Determination through Stochastic Observation Modeling

    NASA Astrophysics Data System (ADS)

    McCullough, C.; Bettadpur, S. V.

    2016-12-01

    Current unconstrained Release 05 GRACE gravity field solutions from the Center for Space Research (CSR RL05) assume random observation errors following an independent multivariate Gaussian distribution. This modeling of observations, a simplifying assumption, fails to account for long period, correlated errors arising from inadequacies in the background force models. Fully modeling the errors inherent in the observation equations, through the use of a full observation covariance (modeling colored noise), enables optimal combination of GPS and inter-satellite range-rate data and obviates the need for estimating kinematic empirical parameters during the solution process. Most importantly, fully modeling the observation errors drastically improves formal error estimates of the spherical harmonic coefficients, potentially enabling improved uncertainty quantification of scientific results derived from GRACE and optimizing combinations of GRACE with independent data sets and a priori constraints.
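
    As an illustration of the difference between assuming independent Gaussian errors and modeling a full observation covariance, the sketch below contrasts ordinary and generalized least-squares estimates on a synthetic linear problem with long-period correlated noise; the matrices are toy stand-ins and do not represent the actual GRACE processing.

    ```python
    # Toy contrast between white-noise (diagonal covariance) and colored-noise
    # (full covariance) least squares; a synthetic stand-in for the GRACE case.
    import numpy as np

    rng = np.random.default_rng(0)

    n_obs, n_par = 200, 3
    A = rng.normal(size=(n_obs, n_par))          # design matrix
    x_true = np.array([1.0, -2.0, 0.5])

    # Build a long-period correlated ("colored") error covariance.
    t = np.arange(n_obs)
    Sigma = 0.04 * np.exp(-np.abs(t[:, None] - t[None, :]) / 30.0)
    noise = rng.multivariate_normal(np.zeros(n_obs), Sigma)
    y = A @ x_true + noise

    # Ordinary least squares: implicitly assumes Sigma = sigma^2 * I.
    x_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Generalized least squares: x = (A^T Sigma^-1 A)^-1 A^T Sigma^-1 y.
    W = np.linalg.inv(Sigma)
    N = A.T @ W @ A
    x_gls = np.linalg.solve(N, A.T @ W @ y)
    cov_gls = np.linalg.inv(N)                    # more realistic formal errors

    print("OLS estimate:", x_ols)
    print("GLS estimate:", x_gls)
    print("GLS formal std:", np.sqrt(np.diag(cov_gls)))
    ```

    With correlated errors, the generalized solution both weights the data appropriately and yields formal errors that reflect the actual uncertainty, which is the effect the abstract describes.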

  1. High performance hybrid functional Petri net simulations of biological pathway models on CUDA.

    PubMed

    Chalkidis, Georgios; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Hybrid functional Petri nets are a widespread tool for representing and simulating biological models. Due to their potential for providing virtual drug-testing environments, biological simulations have a growing impact on pharmaceutical research. Continuing research advancements in biology and medicine lead to exponentially increasing simulation times, thus raising the demand for performance acceleration through efficient and inexpensive parallel computation solutions. Recent developments in the field of general-purpose computation on graphics processing units (GPGPU) have enabled the scientific community to port a variety of compute-intensive algorithms onto the graphics processing unit (GPU). This work presents the first scheme for mapping biological hybrid functional Petri net models, which can handle both discrete and continuous entities, onto compute unified device architecture (CUDA) enabled GPUs. GPU-accelerated simulations are observed to run up to 18 times faster than sequential implementations. Simulating the cell boundary formation by Delta-Notch signaling on a CUDA-enabled GPU results in a speedup of approximately 7x for a model containing 1,600 cells.

  2. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    DOE PAGES

    Christensen, A. J.; Srinivasan, V.; Hart, J. C.; ...

    2018-03-17

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  3. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, A. J.; Srinivasan, V.; Hart, J. C.

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  4. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security.

    PubMed

    Christensen, A J; Srinivasan, Venkatraman; Hart, John C; Marshall-Colon, Amy

    2018-05-01

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in "big data" analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. This survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  5. Some thoughts on building, evaluating and constraining hydrologic models from catchment to continental scales

    NASA Astrophysics Data System (ADS)

    Wagener, Thorsten

    2017-04-01

    We increasingly build and apply hydrologic models that simulate systems beyond the catchment scale. Such models run at regional, national or even continental scales. They therefore offer opportunities for new scientific insights, for example by enabling comparative hydrology or connectivity studies, and for water management, where we might better understand changes to water resources from larger scale activities like agriculture or from hazards such as droughts. However, these models also require us to rethink how we build and evaluate them given that some of the unsolved problems from the catchment scale have not gone away. So what role should such models play in scientific advancement in hydrology? What problems do we still have to resolve before they can fulfill their role? What opportunities for solving these problems are there, but have not yet been utilized? I will provide some thoughts on these issues in the context of the IAHS Panta Rhei initiative and the scientific challenges it has set out for hydrology (Montanari et al., 2013, Hydrological Sciences Journal; McMillan et al., 2016, Hydrological Sciences Journal).

  6. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    PubMed Central

    Christensen, A J; Srinivasan, Venkatraman; Hart, John C; Marshall-Colon, Amy

    2018-01-01

    Abstract Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. This survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields. PMID:29562368

  7. Building and Sustaining International Scientific Partnerships Through Data Sharing

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Yoksas, T.

    2008-05-01

    Understanding global environmental processes and their regional linkages has heightened the importance of strong international scientific partnerships. At the same time, the Internet and its myriad manifestations, along with innovative web services, have amply demonstrated the compounding benefits of cyberinfrastructure and the power of networked communities. The increased globalization of science, especially in solving interdisciplinary Earth system science problems, requires that science be conducted collaboratively by distributed teams of investigators, often involving the sharing of knowledge and resources like community models and other tools. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. Its understanding requires finding, collecting, integrating, and assimilating data from observations and model simulations from diverse fields and across traditional disciplinary boundaries. For the past two decades, the NSF-sponsored Unidata Program Center has been providing the data services, tools, and cyberinfrastructure leadership that advance Earth system science education and research, and has enabled opportunities for broad participation. Beginning as a collection of US-based, mostly atmospheric science departments, the Unidata community now transcends international boundaries and geoscience disciplines. Today, Unidata technologies are used in many countries on all continents in research, education and operational settings, and in many international projects (e.g., IPCC assessments, International Polar Year, and THORPEX). The program places high value on the transformational changes enabled by such international scientific partnerships and continually provides opportunities to share knowledge, data, tools and other resources to advance geoscience research and education. This talk will provide an overview of Unidata's ongoing efforts to foster international scientific partnerships toward building a globally-engaged community of educators and researchers in the geosciences. The presentation will discuss how developments in Earth and Space Science Informatics are enabling new approaches to solving geoscientific problems. The presentation will also describe how Unidata resources are being leveraged by broader initiatives in UCAR and elsewhere.

  8. 3D Printing of Biomolecular Models for Research and Pedagogy

    PubMed Central

    Da Veiga Beltrame, Eduardo; Tyrwhitt-Drake, James; Roy, Ian; Shalaby, Raed; Suckale, Jakob; Pomeranz Krummel, Daniel

    2017-01-01

    The construction of physical three-dimensional (3D) models of biomolecules can uniquely contribute to the study of the structure-function relationship. 3D structures are most often perceived using the two-dimensional and exclusively visual medium of the computer screen. Converting digital 3D molecular data into real objects enables information to be perceived through an expanded range of human senses, including direct stereoscopic vision, touch, and interaction. Such tangible models facilitate new insights, enable hypothesis testing, and serve as psychological or sensory anchors for conceptual information about the functions of biomolecules. Recent advances in consumer 3D printing technology enable, for the first time, the cost-effective fabrication of high-quality and scientifically accurate models of biomolecules in a variety of molecular representations. However, the optimization of the virtual model and its printing parameters is difficult and time consuming without detailed guidance. Here, we provide a guide on the digital design and physical fabrication of biomolecule models for research and pedagogy using open source or low-cost software and low-cost 3D printers that use fused filament fabrication technology. PMID:28362403

  9. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  10. ENABLE 2017, the First EUROPEAN PhD and Post-Doc Symposium. Session 3: In Vitro to In Vivo: Modeling Life in 3D.

    PubMed

    Di Mauro, Gianmarco; Dondi, Ambra; Giangreco, Giovanni; Hogrebe, Alexander; Louer, Elja; Magistrati, Elisa; Mullari, Meeli; Turon, Gemma; Verdurmen, Wouter; Cortada, Helena Xicoy; Zivanovic, Sanja

    2018-05-22

    The EUROPEAN ACADEMY FOR BIOMEDICAL SCIENCE (ENABLE) is an initiative funded by the European Union Horizon 2020 program involving four renowned European research institutes (Institute for Research in Biomedicine-IRB Barcelona, Spain; Radboud Institute for Molecular Life Sciences-RIMLS, the Netherlands; Novo Nordisk Foundation Center for Protein Research-NNF CPR, Denmark; European School of Molecular Medicine-SEMM, Italy) and an innovative science communication agency (Scienseed). With the aim to promote biomedical science of excellence in Europe, ENABLE organizes an annual three-day international event. This gathering includes a top-level scientific symposium bringing together leading scientists, PhD students, and post-doctoral fellows; career development activities supporting the progression of young researchers and fostering discussion about opportunities beyond the bench; outreach activities stimulating the interaction between science and society. The first European PhD and Postdoc Symposium, entitled "Breaking Down Complexity: Innovative models and techniques in biomedicine", was hosted by the vibrant city of Barcelona. The scientific program of the conference was focused on the most recent advances and applications of modern techniques and models in biomedical research and covered a wide range of topics, from synthetic biology to translational medicine. Overall, the event was a great success, with more than 200 attendees from all over Europe actively participating in the symposium by presenting their research and exchanging ideas with their peers and world-renowned scientists.

  11. An organizational model for an international Mars mission (From the 1991 International Space University (ISU) design project)

    NASA Technical Reports Server (NTRS)

    Stoffel, Wilhelm; Mendell, Wendell W.

    1991-01-01

    An international Mars mission aimed at designing a long term, multinational program for conducting scientific exploration of Mars and developing and/or validating technology enabling the eventual human settlement on the planet is discussed. Emphasis is placed on political and legal issues of the project.

  12. Ideas for a Teaching Sequence for the Concept of Energy

    ERIC Educational Resources Information Center

    Duit, Reinders; Neumann, Knut

    2014-01-01

    The energy concept is one of the most important ideas for students to understand. Looking at phenomena through the lens of energy provides powerful tools to model, analyse and predict phenomena in the scientific disciplines. The cross-disciplinary nature of the energy concept enables students to look at phenomena from different angles, helping…

  13. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.

  14. A call for virtual experiments: accelerating the scientific process.

    PubMed

    Cooper, Jonathan; Vik, Jon Olav; Waltemath, Dagmar

    2015-01-01

    Experimentation is fundamental to the scientific method, whether for exploration, description or explanation. We argue that promoting the reuse of virtual experiments (the in silico analogues of wet-lab or field experiments) would vastly improve the usefulness and relevance of computational models, encouraging critical scrutiny of models and serving as a common language between modellers and experimentalists. We review the benefits of reusable virtual experiments: in specifying, assaying, and comparing the behavioural repertoires of models; as prerequisites for reproducible research; to guide model reuse and composition; and for quality assurance in the translational application of models. A key step towards achieving this is that models and experimental protocols should be represented separately, but annotated so as to facilitate the linking of models to experiments and data. Lastly, we outline how the rigorous, streamlined confrontation between experimental datasets and candidate models would enable a "continuous integration" of biological knowledge, transforming our approach to systems biology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC

    NASA Astrophysics Data System (ADS)

    Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan

    2016-04-01

    The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne and Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.

  16. Bring NASA Scientific Data into GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.

    2016-12-01

    NASA's Earth Observing System (EOS) and many other missions produce data of huge volume and in near real time, which drives the research into and understanding of climate change. A Geographic Information System (GIS) is a technology used for the management, visualization and analysis of spatial data. Since its inception in the 1960s, GIS has been applied to many fields at the city, state, national, and world scales. People continue to use it today to analyze and visualize trends, patterns, and relationships from massive datasets of scientific data. There is great interest in both the scientific and GIS communities in improving technologies that can bring scientific data into a GIS environment, where scientific research and analysis can be shared through the GIS platform with the public. Most NASA scientific data are delivered in the Hierarchical Data Format (HDF), a format that is both flexible and powerful. However, this flexibility results in challenges when trying to develop supporting GIS software: data stored in HDF formats lack a unified standard and convention among these products. The presentation introduces an information model that enables ArcGIS software to ingest NASA scientific data and create a multidimensional raster - univariate and multivariate hypercubes - for scientific visualization and analysis. We will present the framework by which ArcGIS leverages the open-source GDAL (Geospatial Data Abstraction Library) to support its raster data access, discuss how we overcame the limitations of the GDAL drivers in handling scientific products stored in the HDF4 and HDF5 formats, and how we improved the way multidimensionality is modeled with GDAL. In addition, we will talk about the direction of ArcGIS in handling NASA products and demonstrate how the multidimensional information model can help scientists work with various data products such as MODIS, MOPITT, and SMAP, as well as many other data products, in a GIS environment.
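
    As a small, hedged illustration of how GDAL exposes HDF-based science products as subdatasets (the file name below is a placeholder and the exact subdataset strings vary by product), one might list and read them through GDAL's Python bindings like this:

    ```python
    # Sketch of listing and opening HDF subdatasets through GDAL's Python
    # bindings; 'example_granule.h5' is a placeholder file name, and the
    # subdataset chosen here is purely illustrative.
    from osgeo import gdal

    container = gdal.Open("example_granule.h5")
    if container is None:
        raise SystemExit("could not open the granule")

    # Each entry is a (subdataset_name, description) pair.
    for name, description in container.GetSubDatasets():
        print(name, "->", description)

    # Open one subdataset as a regular raster and read it into a NumPy array
    # (assumes the granule actually contains at least one subdataset).
    first_name = container.GetSubDatasets()[0][0]
    subdataset = gdal.Open(first_name)
    array = subdataset.ReadAsArray()
    print("shape:", array.shape)
    ```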

  17. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission-critical applications, particularly in the data post-processing stage. One example is that some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can be beneficial to the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before they are written to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR is able to enable efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with GEOS-5, a critical climate modeling application, the experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.

  18. Building an International Initiative to Infuse Novel Cancer Models into the Research Community | Office of Cancer Genomics

    Cancer.gov

    My name is Caitlyn Barrett and I am the Scientific Program Manager for the Human Cancer Model Initiative (HCMI) in the Office of Cancer Genomics (OCG). In my role within the HCMI, I am helping to establish communication pathways and build the foundation for collaboration that will enable the completion of the Initiative’s aim to develop as many as 1000 next-generation cancer models, established from patient tumors and accompanied by clinical and molecular data.

  19. SEEPLUS: A SIMPLE ONLINE CLIMATE MODEL

    NASA Astrophysics Data System (ADS)

    Tsutsui, Junichi

    A web application for a simple climate model - SEEPLUS (a Simple climate model to Examine Emission Pathways Leading to Updated Scenarios) - has been developed. SEEPLUS consists of carbon-cycle and climate-change modules, through which it provides the information infrastructure required to perform climate-change experiments, even on a millennial timescale. The main objective of this application is to share the latest scientific knowledge acquired from climate modeling studies among the different stakeholders involved in climate-change issues. Both the carbon-cycle and climate-change modules employ impulse response functions (IRFs) for their key processes, thereby enabling the model to integrate the outcome from an ensemble of complex climate models. The current IRF parameters and forcing manipulation are basically consistent with, or within an uncertainty range of, the understanding of certain key aspects such as the equivalent climate sensitivity and ocean CO2 uptake data documented in representative literature. The carbon-cycle module enables inverse calculation to determine the emission pathway required in order to attain a given concentration pathway, thereby providing a flexible way to compare the module with more advanced modeling studies. The module also enables analytical evaluation of its equilibrium states, thereby facilitating the long-term planning of global warming mitigation.
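
    To illustrate the impulse-response-function idea underlying such simple climate models, the sketch below convolves a toy emission pathway with a multi-exponential CO2 IRF to obtain a concentration anomaly; the coefficients and the emission scenario are illustrative assumptions and do not reproduce the SEEPLUS parameterization.

    ```python
    # Illustrative impulse-response convolution for a simple carbon-cycle
    # module; the IRF coefficients and emissions are toy values only.
    import numpy as np

    years = np.arange(0, 500)

    def co2_irf(t):
        """Toy airborne fraction remaining t years after a unit emission pulse."""
        a = [0.22, 0.28, 0.28, 0.22]            # pool weights (sum to 1)
        tau = [np.inf, 300.0, 70.0, 10.0]       # decay timescales in years
        return sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

    emissions = np.where(years < 100, 10.0, 0.0)   # GtC/yr for the first century

    # Concentration anomaly (ppm) = sum over past emissions weighted by the IRF;
    # roughly 2.12 GtC corresponds to 1 ppm of atmospheric CO2.
    anomaly = np.array([
        sum(emissions[s] * co2_irf(t - s) for s in range(t + 1)) / 2.12
        for t in years
    ])
    print("peak CO2 anomaly (ppm):", anomaly.max())
    ```

    A temperature module can be sketched the same way by convolving the resulting radiative forcing with a second, climate-response IRF, which is what makes IRF-based emulators fast enough to run interactively in a web application.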

  20. The Power of Storytelling: A Native Hawaiian Approach to Science Communication

    NASA Astrophysics Data System (ADS)

    Frank, K. L.

    2016-12-01

    Generational assimilation of observational data enabled Native Hawaiians to preserve a holistic understanding of the connectivity, structure and function - from mountain to sea - within their island ecosystems. Their intimate understandings of the geographic and temporal variability in winds, rains, and currents, and how these factors governed the extent and distribution of biodiversity were perpetuated through stories, songs and chants. Many of these oral histories - which conveyed information via anthropomorphized characters in entertaining and engaging plots - preserved the scientific integrity of traditional phenomenological observations and remain shockingly consistent with contemporary biogeochemical and geophysical observations. These indigenous methods of communicating scientific knowledge are clear models for contemporary best practices in geoscience communication. Storytelling is a tried and true mechanism that both engages and teaches diverse audiences of all ages, ethnicities and skill levels. Scientific storytelling - which can either be examinations of indigenous stories through scientific lenses, or generations of new stories based on scientific observation - enables multiple layers of meaning and levels of knowledge acquisition that bridge cultural and historical place-based knowledge with contemporary knowledge systems. Here, I will share my journey of optimizing the engagement of Native Hawaiian communities (students, land managers, stewards, practitioners, etc…) with my biogeochemical research on a Native Hawaiian coastal estuarine environment (Héeia Fishpond). I will speak about the importance and effectiveness of disseminating research in culturally accessible formats by framing research in the context of traditional knowledge to help elevate the perception of "science" in the Hawaiian community.

  1. Probabilities and Predictions: Modeling the Development of Scientific Problem-Solving Skills

    ERIC Educational Resources Information Center

    Stevens, Ron; Johnson, David F.; Soller, Amy

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative…

  2. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    ERIC Educational Resources Information Center

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…

  3. GillesPy: A Python Package for Stochastic Model Building and Simulation.

    PubMed

    Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R

    2016-09-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
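
    As a hedged usage sketch of the action-oriented interface described above, a minimal birth-death model might be built and simulated as follows; the code is written against the documented interface of the GillesPy2 successor package, which is an assumption here, since the original GillesPy release uses a similar but not identical API.

    ```python
    # Hedged sketch of building and running a birth-death model; written
    # against the GillesPy2 successor package's documented interface (an
    # assumption: the original GillesPy module name and API differ slightly).
    import numpy as np
    import gillespy2

    model = gillespy2.Model(name="birth_death")

    # Parameters and a single species.
    k_birth = gillespy2.Parameter(name="k_birth", expression=10.0)
    k_death = gillespy2.Parameter(name="k_death", expression=0.1)
    protein = gillespy2.Species(name="protein", initial_value=50)
    model.add_parameter([k_birth, k_death])
    model.add_species([protein])

    # Mass-action birth and death reactions.
    birth = gillespy2.Reaction(name="birth", reactants={}, products={protein: 1},
                               rate=k_birth)
    death = gillespy2.Reaction(name="death", reactants={protein: 1}, products={},
                               rate=k_death)
    model.add_reaction([birth, death])

    # Simulate five stochastic trajectories over 100 time units with the SSA.
    model.timespan(np.linspace(0, 100, 101))
    results = model.run(number_of_trajectories=5)
    print(results[0]["protein"][-1])   # final copy number, first trajectory
    ```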

  4. GillesPy: A Python Package for Stochastic Model Building and Simulation

    PubMed Central

    Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.

    2017-01-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888

  5. Entity Linking Leveraging the GeoDeepDive Cyberinfrastructure and Managing Uncertainty with Provenance.

    NASA Astrophysics Data System (ADS)

    Maio, R.; Arko, R. A.; Lehnert, K.; Ji, P.

    2017-12-01

    Unlocking the full, rich, network of links between the scientific literature and the real world entities to which data correspond - such as field expeditions (cruises) on oceanographic research vessels and physical samples collected during those expeditions - remains a challenge for the geoscience community. Doing so would enable data reuse and integration on a broad scale; making it possible to inspect the network and discover, for example, all rock samples reported in the scientific literature found within 10 kilometers of an undersea volcano, and associated geochemical analyses. Such a capability could facilitate new scientific discoveries. The GeoDeepDive project provides negotiated access to 4.2+ million documents from scientific publishers, enabling text and document mining via a public API and cyberinfrastructure. We mined this corpus using entity linking techniques, which are inherently uncertain, and recorded provenance information about each link. This opens the entity linking methodology to scrutiny, and enables downstream applications to make informed assessments about the suitability of an entity link for consumption. A major challenge is how to model and disseminate the provenance information. We present results from entity linking between journal articles, research vessels and cruises, and physical samples from the Petrological Database (PetDB), and incorporate Linked Data resources such as cruises in the Rolling Deck to Repository (R2R) catalog where possible. Our work demonstrates the value and potential of the GeoDeepDive cyberinfrastructure in combination with Linked Data infrastructure provided by the EarthCube GeoLink project. We present a research workflow to capture provenance information that leverages the World Wide Web Consortium (W3C) recommendation PROV Ontology.
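    As an illustration of the provenance pattern described here, the sketch below uses the rdflib package and the W3C PROV-O vocabulary to state that a hypothetical entity link was generated by a particular text-mining run; the example.org URIs and the confidence property are placeholders rather than the project's actual schema.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, XSD

        PROV = Namespace("http://www.w3.org/ns/prov#")
        EX = Namespace("http://example.org/")          # hypothetical namespace for this sketch

        g = Graph()
        g.bind("prov", PROV)
        g.bind("ex", EX)

        link = EX["link/sample-42-cruise-123"]         # the asserted sample-to-cruise link
        run = EX["activity/entity-linking-run-7"]      # the mining run that produced it

        g.add((link, RDF.type, PROV.Entity))
        g.add((run, RDF.type, PROV.Activity))
        g.add((link, PROV.wasGeneratedBy, run))
        g.add((link, EX.confidence, Literal(0.87, datatype=XSD.double)))  # placeholder score

        print(g.serialize(format="turtle"))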

  6. The NASA Solar System Exploration Virtual Institute: International Efforts in Advancing Lunar Science with Prospects for the Future

    NASA Technical Reports Server (NTRS)

    Schmidt, Gregory K.

    2014-01-01

    The NASA Solar System Exploration Research Virtual Institute (SSERVI), originally chartered in 2008 as the NASA Lunar Science Institute (NLSI), is chartered to advance both the scientific goals needed to enable human space exploration, as well as the science enabled by such exploration. NLSI and SSERVI have in succession been "institutes without walls," fostering collaboration between domestic teams (7 teams for NLSI, 9 for SSERVI) as well as between these teams and the institutes' international partners, resulting in a greater global endeavor. SSERVI teams and international partners participate in sharing ideas, information, and data arising from their respective research efforts, and contribute to the training of young scientists and bringing the scientific results and excitement of exploration to the public. The domestic teams also respond to NASA's strategic needs, providing community-based responses to NASA needs in partnership with NASA's Analysis Groups. Through the many partnerships enabled by NLSI and SSERVI, scientific results have well exceeded initial projections based on the original PI proposals, proving the validity of the virtual institute model. NLSI and SSERVI have endeavored to represent not just the selected and funded domestic teams, but rather the entire relevant scientific community; this has been done through many means such as the annual Lunar Science Forum (now re-named Exploration Science Forum), community-based grass roots Focus Groups on a wide range of topics, and groups chartered to further the careers of young scientists. Additionally, NLSI and SSERVI have co-founded international efforts such as the pan-European lunar science consortium, with an overall goal of raising the tide of lunar science (and now more broadly exploration science) across the world.

  7. The Scientific Method - Critical and Creative Thinking

    NASA Astrophysics Data System (ADS)

    Cotton, John; Scalise, Randall

    2011-10-01

    The "scientific method" is not just for scientists! Combined with critical thinking, the scientific method can enable students to distinguish credible sources of information from nonsense and become intelligent consumers of information. Professors John Cotton and Randall Scalise illustrate these principles using a series of examples and demonstrations that is enlightening, educational, and entertaining. This lecture/demonstration features highlights from their course (whose unofficial title is "debunking pseudoscience") which enables students to detect pseudoscience in its many guises: paranormal phenomena, free-energy devices, alternative medicine, and many others.

  8. Multi-year Content Analysis of User Facility Related Publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Stahl, Christopher G; Hines, Jayson

    2013-01-01

    Scientific user facilities provide resources and support that enable scientists to conduct experiments or simulations pertinent to their respective research. Consequently, it is critical to have an informed understanding of the impact and contributions that these facilities have on scientific discoveries. Leveraging insight into scientific publications that acknowledge the use of these facilities enables facility management and sponsors to make more informed decisions regarding policy, resource allocation, and the direction of science, and to understand more clearly the impact of a scientific user facility. This work discusses preliminary results of mining scientific publications that utilized resources at the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory (ORNL). These results show promise in identifying and leveraging multi-year trends and providing a higher resolution view of the impact that a scientific user facility may have on scientific discoveries.

  9. Watershed System Model: The Essentials to Model Complex Human-Nature System at the River Basin Scale

    NASA Astrophysics Data System (ADS)

    Li, Xin; Cheng, Guodong; Lin, Hui; Cai, Ximing; Fang, Miao; Ge, Yingchun; Hu, Xiaoli; Chen, Min; Li, Weiyue

    2018-03-01

    Watershed system models are urgently needed to understand complex watershed systems and to support integrated river basin management. Early watershed modeling efforts focused on the representation of hydrologic processes, while next-generation watershed models should represent the coevolution of the water-land-air-plant-human nexus in a watershed and provide decision-making support capabilities. We propose a new modeling framework and discuss how emerging knowledge can be incorporated into integrated models through data exchange interfaces. We argue that the modeling environment is a useful tool to enable effective model integration, as well as to create domain-specific models of river basin systems. The grand challenges in developing next-generation watershed system models include but are not limited to providing an overarching framework for linking natural and social sciences, building a scientifically based decision support system, quantifying and controlling uncertainties, and taking advantage of new technologies and new findings in the various disciplines of watershed science. The eventual goal is to build transdisciplinary, scientifically sound, and scale-explicit watershed system models that are to be codesigned by multidisciplinary communities.

  10. Enabling large-scale viscoelastic calculations via neural network acceleration

    NASA Astrophysics Data System (ADS)

    Robinson DeVries, P.; Thompson, T. B.; Meade, B. J.

    2017-12-01

    One of the most significant challenges in efforts to understand the effects of repeated earthquake cycle activity is the computational cost of large-scale viscoelastic earthquake cycle models. Deep artificial neural networks (ANNs) can be used to discover new, compact, and accurate computational representations of viscoelastic physics. Once found, these efficient ANN representations may replace computationally intensive viscoelastic codes and accelerate large-scale viscoelastic calculations by more than 50,000%. This magnitude of acceleration enables the modeling of geometrically complex faults over thousands of earthquake cycles across wider ranges of model parameters and at larger spatial and temporal scales than have been previously possible. Perhaps most interestingly from a scientific perspective, ANN representations of viscoelastic physics may lead to basic advances in the understanding of the underlying model phenomenology. We demonstrate the potential of artificial neural networks to illuminate fundamental physical insights with specific examples.
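    The surrogate idea can be sketched in a few lines: a cheap analytic function below stands in for the expensive viscoelastic code, and a small multilayer perceptron is fit to its outputs so that later evaluations use the network instead of the solver. scikit-learn is used only for illustration; it is not the toolchain of the study.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def expensive_solver(x):
            # Placeholder for a costly viscoelastic calculation: a response as a
            # function of two normalized inputs (e.g., time and distance).
            return np.exp(-x[:, 0]) * np.sin(np.pi * x[:, 1])

        rng = np.random.default_rng(0)
        X_train = rng.uniform(0.0, 1.0, size=(5000, 2))   # sampled model parameters
        y_train = expensive_solver(X_train)               # targets from the slow code

        surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
        surrogate.fit(X_train, y_train)

        X_query = rng.uniform(0.0, 1.0, size=(3, 2))
        print(surrogate.predict(X_query))                 # fast ANN evaluation replaces the solver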

  11. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the end-to-end application services and virtualized computing framework of HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will be made available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data, without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, and will seek feedback from the user communities.

  12. The BGC Feedbacks Scientific Focus Area 2016 Annual Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M.; Riley, William J.; Randerson, James T.

    2016-06-01

    The BGC Feedbacks Project will identify and quantify the feedbacks between biogeochemical cycles and the climate system, and quantify and reduce the uncertainties in Earth System Models (ESMs) associated with those feedbacks. The BGC Feedbacks Project will contribute to the integration of the experimental and modeling science communities, providing researchers with new tools to compare measurements and models, thereby enabling DOE to contribute more effectively to future climate assessments by the U.S. Global Change Research Program (USGCRP) and the Intergovernmental Panel on Climate Change (IPCC).

  13. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    PubMed Central

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age groups and pathological states. PMID:26444849

  14. Potential Follow on Experiments for the Zero Boil Off Tank Experiment

    NASA Technical Reports Server (NTRS)

    Chato, David; Kassemi, Mohammad

    2014-01-01

    Cryogenic storage and transfer are enabling propulsion technologies in the direct path of nearly all future human or robotic missions, and NASA has identified this as the area with the greatest potential for cost savings. This proposal aims at resolving fundamental scientific issues behind the engineering development of the storage tanks. We propose to use the ISS lab to generate and collect archival scientific data, to raise our current state-of-the-art understanding of the transport and phase change issues affecting storage tank cryogenic fluid management (CFM), and to develop and validate state-of-the-art CFD models to innovate, optimize, and advance future engineering designs.

  15. ENABLE 2017, the First EUROPEAN PhD and Post-Doc Symposium. Session 4: From Discovery to Cure: The Future of Therapeutics.

    PubMed

    Di Mauro, Gianmarco; Dondi, Ambra; Giangreco, Giovanni; Hogrebe, Alexander; Louer, Elja; Magistrati, Elisa; Mullari, Meeli; Turon, Gemma; Verdurmen, Wouter; Xicoy Cortada, Helena; Zivanovic, Sanja

    2018-05-28

    The EUROPEAN ACADEMY FOR BIOMEDICAL SCIENCE (ENABLE) is an initiative funded by the European Union Horizon 2020 program involving four renowned European Research Institutes (Institute for Research in Biomedicine-IRB Barcelona, Spain; Radboud Institute for Molecular Life Sciences-RIMLS, the Netherlands; Novo Nordisk Foundation Center for Protein Research-NNF CPR, Denmark; European School of Molecular Medicine-SEMM, Italy) and an innovative science communication agency (Scienseed). With the aim of promoting biomedical science of excellence in Europe, ENABLE organizes an annual three-day international event. This gathering includes a top-level scientific symposium bringing together leading scientists, PhD students, and post-doctoral fellows; career development activities supporting the progression of young researchers and fostering discussion about opportunities beyond the bench; and outreach activities stimulating the interaction between science and society. The first European PhD and Postdoc Symposium, entitled "Breaking Down Complexity: Innovative Models and Techniques in Biomedicine", was hosted by the vibrant city of Barcelona. The scientific program of the conference was focused on the most recent advances and applications of modern techniques and models in biomedical research and covered a wide range of topics, from synthetic biology to translational medicine. Overall, the event was a great success, with more than 200 attendees from all over Europe actively participating in the symposium by presenting their research and exchanging ideas with their peers and world-renowned scientists.

  16. Enhancing Interdisciplinary Human System Risk Research Through Modeling and Network Approaches

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark

    2015-01-01

    NASA's Human Research Program (HRP) supports research to reduce human health and performance risks inherent in future human space exploration missions. Understanding risk outcomes and contributing factors in an integrated manner allows HRP research to support development of efficient and effective mitigations from cross-disciplinary perspectives, and to enable resilient human and engineered systems for spaceflight. The purpose of this work is to support scientific collaborations and research portfolio management by utilizing modeling for analysis and visualization of current and potential future interdisciplinary efforts.

  17. The role of models and analogies in science education: implications from research

    NASA Astrophysics Data System (ADS)

    Coll, Richard K.; France, Bev; Taylor, Ian

    2005-02-01

    Models and modelling are key tools for scientists, science teachers and science learners. In this paper we argue that classroom-based research evidence demonstrates that the use of models and analogies within the pedagogy of science education may provide a route for students to gain some understanding of the nature of science. A common theme to emerge from the literature reviewed here is that in order to successfully develop conceptual understandings in science, learners need to be able to reflect on and discuss their understandings of scientific concepts as they are developing them. Pedagogies that involve various types of modelling are most effective when students are able to construct and critique their own and scientists' models. Research also suggests that group work and peer discussion are important ways of enhancing students' cognitive and metacognitive thinking skills. Further we argue that an understanding of science models and the modelling process enables students to develop a metacognitive awareness of knowledge development within the science community, as well as providing the tools to reflect on their own scientific understanding.

  18. Atmospheric Ionizing Radiation (AIR) ER-2 Preflight Analysis

    NASA Technical Reports Server (NTRS)

    Tai, Hsiang; Wilson, John W.; Maiden, D. L.

    1998-01-01

    Atmospheric ionizing radiation (AIR) produces chemically active radicals in biological tissues that alter the cell function or result in cell death. The AIR ER-2 flight measurements will enable scientists to study the radiation risk associated with the high-altitude operation of a commercial supersonic transport. The ER-2 radiation measurement flights will follow predetermined, carefully chosen courses to provide an appropriate database matrix which will enable the evaluation of predictive modeling techniques. Explicit scientific results such as dose rate, dose equivalent rate, magnetic cutoff, neutron flux, and air ionization rate associated with those flights are predicted by using the AIR model. Through these flight experiments, we will further increase our knowledge and understanding of the AIR environment and our ability to assess the risk from the associated hazard.

  19. To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne

    2012-08-01

    A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.

  20. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid/easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  1. Implementing Science-Technology-Society Approaches in Middle School Science Teaching

    ERIC Educational Resources Information Center

    Akcay, Hakan; Yager, Robert E.

    2010-01-01

    The National Science Education Standards emphasize a goal that students should achieve scientific literacy, which is defined as the knowledge and understanding of scientific concepts needed in daily living. Scientific literacy enables people to not only use scientific principles and processes in making personal decisions but also to participate in…

  2. Science literacy and academic identity formulation

    NASA Astrophysics Data System (ADS)

    Reveles, John M.; Cordova, Ralph; Kelly, Gregory J.

    2004-12-01

    The purpose of this article is to report findings from an ethnographic study that focused on the co-development of science literacy and academic identity formulation within a third-grade classroom. Our theoretical framework draws from sociocultural theory and studies of scientific literacy. Through analysis of classroom discourse, we identified opportunities afforded students to learn specific scientific knowledge and practices during a series of science investigations. The results of this study suggest that the collective practice of the scientific conversations and activities that took place within this classroom enabled students to engage in the construction of communal science knowledge through multiple textual forms. By examining the ways in which students contributed to the construction of scientific understanding, and then by examining their performances within and across events, we present evidence of the co-development of students' academic identities and scientific literacy. Students' communication and participation in science during the investigations enabled them to learn the structure of the discipline by identifying and engaging in scientific activities. The intersection of academic identities with the development of scientific literacy provides a basis for considering specific ways to achieve scientific literacy for all students.

  3. Comparison of Scientific Calipers and Computer-Enabled CT Review for the Measurement of Skull Base and Craniomaxillofacial Dimensions

    PubMed Central

    Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.

    2001-01-01

    Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. PMID:17167599
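    A comparison of this kind is typically assessed with a paired test on the same landmark-to-landmark distances measured by both methods; the sketch below shows the pattern with invented numbers and does not reproduce the study's actual measurements or statistics.

        import numpy as np
        from scipy import stats

        # Hypothetical paired distances (mm) for five landmark pairs on one skull
        caliper = np.array([34.2, 51.8, 47.5, 29.9, 62.3])
        ct_model = np.array([34.5, 51.6, 47.9, 30.1, 62.0])

        t_stat, p_value = stats.ttest_rel(caliper, ct_model)
        print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")  # a large p suggests no significant difference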

  4. A statistical shape model of the human second cervical vertebra.

    PubMed

    Clogenson, Marine; Duff, John M; Luethi, Marcel; Levivier, Marc; Meuli, Reto; Baur, Charles; Henein, Simon

    2015-07-01

    Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. The input dataset is composed of manually segmented anonymized patient computerized tomography (CT) scans. The alignment of the different datasets is done with Procrustes alignment on surface models, and then the registration is cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which includes the variability of the C2. The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (open-source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
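    The PCA step at the core of such a statistical shape model can be sketched as follows, assuming the CT-derived surfaces are already in point-to-point correspondence and aligned; the array sizes and the scikit-learn implementation are illustrative, not the authors' Statismo-based pipeline.

        import numpy as np
        from sklearn.decomposition import PCA

        n_shapes, n_points = 92, 2000
        # Each row is one aligned surface flattened to (x1, y1, z1, x2, y2, z2, ...);
        # random data stands in for the real, corresponded C2 meshes.
        shapes = np.random.default_rng(0).normal(size=(n_shapes, n_points * 3))

        pca = PCA(n_components=0.95)      # keep the modes explaining 95% of the shape variance
        pca.fit(shapes)

        # Synthesize a plausible new shape: the mean shape plus 2 standard deviations of mode 1
        b = np.zeros(pca.n_components_)
        b[0] = 2.0 * np.sqrt(pca.explained_variance_[0])
        new_shape = (pca.mean_ + pca.components_.T @ b).reshape(n_points, 3)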

  5. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    NASA Technical Reports Server (NTRS)

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models, globally distributed data sets, and complex laboratory and data collection procedures. Here we examine efforts by the scientific community and educational researchers to design new curricula and technology that close this gap and impart robust AGCC and Earth Science understanding. We find technology-based teaching shows promise in promoting robust AGCC understandings if associated curricula address mitigating factors such as time constraints in incorporating technology and the need to support teachers implementing AGCC and Earth Science inquiry. We recommend the scientific community continue to collaborate with educational researchers to focus on developing those inquiry technologies and curricula that use realistic scientific processes from AGCC research and/or the methods for determining how human society should respond to global change.

  6. Open access to biomedical engineering publications.

    PubMed

    Flexman, Jennifer A

    2008-01-01

    Scientific research is disseminated within the community and to the public in part through journals. Most scientific journals, in turn, protect the manuscript through copyright and recover their costs by charging subscription fees to individuals and institutions. This revenue stream is used to support the management of the journal and, in some cases, professional activities of the sponsoring society such as the Institute of Electrical and Electronics Engineers (IEEE). For example, the IEEE Engineering in Medicine and Biology Society (EMBS) manages seven academic publications representing the various areas of biomedical engineering. New business models have been proposed to distribute journal articles free of charge, either immediately or after a delay, to enable a greater dissemination of knowledge to both the public and the scientific community. However, publication costs must be recovered and likely at a higher cost to the manuscript authors. While there is little doubt that the foundations of scientific publication will change, the specifics and implications of an open source framework must be discussed.

  7. The Virtual Brain: a simulator of primate brain network dynamics.

    PubMed

    Sanz Leon, Paula; Knock, Stuart A; Woodman, M Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2013-01-01

    We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications.
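    A minimal scripted simulation with the standalone library looks roughly like the following; the module and class names are taken from the tvb-library scripting interface but may vary between versions, so this should be read as a sketch rather than canonical usage.

        from tvb.simulator.lab import (models, connectivity, coupling,
                                       integrators, monitors, simulator)

        sim = simulator.Simulator(
            model=models.Generic2dOscillator(),                   # neural mass model per region
            connectivity=connectivity.Connectivity.from_file(),   # bundled demo connectome
            coupling=coupling.Linear(),
            integrator=integrators.HeunDeterministic(dt=0.1),
            monitors=(monitors.TemporalAverage(period=1.0),),
        )
        sim.configure()

        (time, data), = sim.run(simulation_length=250.0)          # one (time, data) pair per monitor
        print(time.shape, data.shape)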

  8. The Virtual Brain: a simulator of primate brain network dynamics

    PubMed Central

    Sanz Leon, Paula; Knock, Stuart A.; Woodman, M. Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R.; Jirsa, Viktor

    2013-01-01

    We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications. PMID:23781198

  9. Middle School Students' Learning about Genetic Inheritance through On-Line Scaffolding Supports

    ERIC Educational Resources Information Center

    Manokore, Viola

    2010-01-01

    The main goal of school science is to enable learners to become scientifically literate through their participation in scientific discourses (McNeill & Krajcik, 2009). One of the key elements of scientific discourses is the ability to construct scientific explanations that consist of valid claims supported by appropriate evidence (e.g., McNeill &…

  10. Metabolic modelling and flux analysis of microorganisms from the Atacama Desert used in biotechnological processes.

    PubMed

    Razmilic, Valeria; Castro, Jean Franco; Marchant, Francisca; Asenjo, Juan A; Andrews, Barbara

    2018-02-02

    Metabolic modelling is a useful tool that enables the rational design of metabolic engineering experiments and the study of the unique capabilities of biotechnologically important microorganisms. The extreme abiotic conditions of the Atacama Desert have selected microbial diversity with exceptional characteristics that can be applied in the mining industry for bioleaching processes and for the production of specialised metabolites with antimicrobial, antifungal, antiviral, and antitumoral activities, among others. In this review we summarise the scientific data available on the use of metabolic modelling and flux analysis to improve the performance of Atacama Desert microorganisms in biotechnological applications.
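    Flux analysis of the kind surveyed here is commonly carried out with constraint-based tools such as COBRApy; the toy network below shows the basic flux balance analysis (FBA) pattern and is not a model of any Atacama organism.

        import cobra

        model = cobra.Model("toy_network")
        a = cobra.Metabolite("A_c", compartment="c")
        b = cobra.Metabolite("B_c", compartment="c")

        uptake = cobra.Reaction("EX_A")        # supplies A from the medium
        uptake.add_metabolites({a: 1.0})
        uptake.bounds = (0.0, 10.0)

        convert = cobra.Reaction("R_AB")       # internal conversion A -> B
        convert.add_metabolites({a: -1.0, b: 1.0})
        convert.bounds = (0.0, 1000.0)

        sink = cobra.Reaction("DM_B")          # demand reaction used as the objective
        sink.add_metabolites({b: -1.0})
        sink.bounds = (0.0, 1000.0)

        model.add_reactions([uptake, convert, sink])
        model.objective = "DM_B"

        solution = model.optimize()            # FBA: maximize flux into the demand for B
        print(solution.objective_value, solution.fluxes["R_AB"])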

  11. A Community Framework for Integrative, Coupled Modeling of Human-Earth Systems

    NASA Astrophysics Data System (ADS)

    Barton, C. M.; Nelson, G. C.; Tucker, G. E.; Lee, A.; Porter, C.; Ullah, I.; Hutton, E.; Hoogenboom, G.; Rogers, K. G.; Pritchard, C.

    2017-12-01

    We live today in a humanized world, where critical zone dynamics are driven by coupled human and biophysical processes. First generation modeling platforms have been invaluable in providing insight into dynamics of biophysical systems and social systems. But to understand today's humanized planet scientifically and to manage it sustainably, we need integrative modeling of this coupled human-Earth system. To address both scientific and policy questions, we also need modeling that can represent variable combinations of human-Earth system processes at multiple scales. Simply adding more code needed to do this to large, legacy first generation models is impractical, expensive, and will make them even more difficult to evaluate or understand. We need an approach to modeling that mirrors and benefits from the architecture of the complexly coupled systems we hope to model. Building on a series of international workshops over the past two years, we present a community framework to enable and support an ecosystem of diverse models as components that can be interconnected as needed to facilitate understanding of a range of complex human-earth systems interactions. Models are containerized in Docker to make them platform independent. A Basic Modeling Interface and Standard Names ontology (developed by the Community Surface Dynamics Modeling System) is applied to make them interoperable. They are then transformed into RESTful micro-services to allow them to be connected and run in a browser environment. This enables a flexible, multi-scale modeling environment to help address diverse issues with combinations of smaller, focused, component models that are easier to understand and evaluate. We plan to develop, deploy, and maintain this framework for integrated, coupled modeling in an open-source collaborative development environment that can democratize access to advanced technology and benefit from diverse global participation in model development. We also present an initial proof-of-concept of this framework, coupling a widely used agricultural crop model (DSSAT) with a widely used hydrology model (TopoFlow).
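    The component pattern that a Basic Model Interface enables can be sketched as a small Python class exposing BMI-style methods; the real CSDMS BMI specification defines many more functions, and the state variable and standard name below are invented for illustration.

        import numpy as np

        class ToySoilMoistureComponent:
            """Minimal component exposing BMI-style methods for coupling."""

            def initialize(self, config_file=None):
                self.time, self.dt = 0.0, 1.0
                self.moisture = np.full(10, 0.30)          # hypothetical 1-D state grid

            def update(self):
                self.moisture *= 0.99                      # placeholder process: slow drying
                self.time += self.dt

            def get_value(self, name):
                if name == "soil_water__volume_fraction":  # CSDMS-style standard name (illustrative)
                    return self.moisture.copy()
                raise KeyError(name)

            def set_value(self, name, values):
                if name == "soil_water__volume_fraction":
                    self.moisture[:] = values

            def finalize(self):
                pass

        # Another component (e.g., a crop or hydrology model) can be coupled by exchanging
        # get_value/set_value calls each time step without knowing this model's internals;
        # wrapping such a class behind a small HTTP service yields the RESTful
        # micro-service arrangement described above.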

  12. The James Webb Space Telescope Sunshield Waterfall

    NASA Image and Video Library

    2017-12-08

    This shiny silver "waterfall" is actually the five layers of the full-scale engineering model of NASA's James Webb Space Telescope sunshield being laid out by technicians at the Northrop Grumman Aerospace Systems Space Park facility in Redondo Beach, Calif. who are conducting endurance tests on them. For more information, visit: jwst.nasa.gov Credit: Northrop Grumman

  13. Swedish Delegation Visits NASA Goddard

    NASA Image and Video Library

    2017-12-08

    Swedish Delegation Visits GSFC – May 3, 2017 - Members of the Royal Swedish Academy of Engineering Sciences listen to Catherine Peddie, Wide Field Infrared Survey Telescope (WFIRST) Deputy Project Manager use a full-scale model of WFIRST to describe the features of the observatory. Photo Credit: NASA/Goddard/Rebecca Roth Read more: go.nasa.gov/2p1rP0h

  14. Mars’ Moon Phobos is Slowly Falling Apart

    NASA Image and Video Library

    2017-12-08

    New modeling indicates that the grooves on Mars’ moon Phobos could be produced by tidal forces – the mutual gravitational pull of the planet and the moon. Initially, scientists had thought the grooves were created by the massive impact that made Stickney crater (lower right). Credits: NASA/JPL-Caltech/University of Arizona Read more: go.nasa.gov/1RLCS1v

  15. New NASA 3D Animation Shows Seven Days of Simulated Earth Weather

    NASA Image and Video Library

    2014-08-11

    This visualization shows early test renderings of a global computational model of Earth's atmosphere based on data from NASA's Goddard Earth Observing System Model, Version 5 (GEOS-5). This particular run, called Nature Run 2, was run on a supercomputer, spanned 2 years of simulation time at 30 minute intervals, and produced Petabytes of output. The visualization spans a little more than 7 days of simulation time which is 354 time steps. The time period was chosen because a simulated category-4 typhoon developed off the coast of China. The 7 day period is repeated several times during the course of the visualization. Credit: NASA's Scientific Visualization Studio Read more or download here: svs.gsfc.nasa.gov/goto?4180

  16. NASA Virtual Institutes: International Bridges for Space Exploration

    NASA Technical Reports Server (NTRS)

    Schmidt, Gregory K.

    2016-01-01

    NASA created the first virtual institute, the NASA Astrobiology Institute (NAI), in 1998 with an aim toward bringing together geographically disparate and multidisciplinary teams toward the goal of answering broad questions in the then-new discipline of astrobiology. With the success of the virtual institute model, NASA then created the NASA Lunar Science Institute (NLSI) in 2008 to address questions of science and human exploration of the Moon, and then the NASA Aeronautics Research Institute (NARI) in 2012, which addresses key questions in the development of aeronautics technologies. With the broadening of NASA's human exploration targets to include Near Earth Asteroids and the moons of Mars as well as the Moon, the NLSI morphed into the Solar System Exploration Research Virtual Institute (SSERVI) in 2012. SSERVI funds domestic research teams to address broad questions at the intersection of science and human exploration, with the underlying principle that science enables human exploration, and human exploration enables science. Nine domestic teams were funded in 2014 for a five-year period to address a variety of different topics, and nine international partners (with more to come) also work with the U.S. teams on a variety of topics of mutual interest. The result is a robust and productive research infrastructure that is not only scientifically productive but can respond to strategic topics of domestic and international interest, and which develops a new generation of researchers. This is all accomplished with the aid of virtual collaboration technologies which enable scientific research at a distance. The virtual institute model is widely applicable to a range of space science and exploration problems.

  17. Semantic e-Science: From Microformats to Models

    NASA Astrophysics Data System (ADS)

    Lumb, L. I.; Freemantle, J. R.; Aldridge, K. D.

    2009-05-01

    A platform has been developed to transform semi-structured ASCII data into a representation based on the eXtensible Markup Language (XML). A subsequent transformation allows the XML-based representation to be rendered in the Resource Description Framework (RDF). Editorial metadata, expressed as external annotations (via XML Pointer Language), also survives this transformation process (e.g., Lumb et al., http://dx.doi.org/10.1016/j.cageo.2008.03.009). Because the XML-to-RDF transformation uses XSLT (eXtensible Stylesheet Language Transformations), semantic microformats ultimately encode the scientific data (Lumb & Aldridge, http://dx.doi.org/10.1109/HPCS.2006.26). In building the relationship-centric representation in RDF, a Semantic Model of the scientific data is extracted. The systematic enhancement in the expressivity and richness of the scientific data results in representations of knowledge that are readily understood and manipulated by intelligent software agents. Thus scientists are able to draw upon various resources within and beyond their discipline to use in their scientific applications. Since the resulting Semantic Models are independent conceptualizations of the science itself, the representation of scientific knowledge and interaction with the same can stimulate insight from different perspectives. Using the Global Geodynamics Project (GGP) for the purpose of illustration, the introduction of GGP microformats enables a Semantic Model for the GGP that can be semantically queried (e.g., via SPARQL, http://www.w3.org/TR/rdf-sparql-query). Although the present implementation uses the Open Source Redland RDF Libraries (http://librdf.org/), the approach is generalizable to other platforms and to projects other than the GGP (e.g., Baker et al., Informatics and the 2007-2008 Electronic Geophysical Year, Eos Trans. Am. Geophys. Un., 89(48), 485-486, 2008).
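    Once the data are in RDF, the kind of semantic query mentioned above can be issued directly; the snippet below loads a hypothetical Turtle file derived from GGP records and runs a SPARQL query with rdflib (used here for illustration in place of the Redland libraries), against an invented example.org vocabulary rather than the project's actual microformats.

        from rdflib import Graph

        g = Graph()
        g.parse("ggp_observations.ttl", format="turtle")   # hypothetical output of the XML-to-RDF step

        query = """
        PREFIX ex: <http://example.org/ggp#>
        SELECT ?station ?instrument
        WHERE {
            ?record ex:observedAt ?station ;
                    ex:recordedBy ?instrument .
        }
        """
        for row in g.query(query):
            print(row.station, row.instrument)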

  18. Water use data to enhance scientific and policy insight

    NASA Astrophysics Data System (ADS)

    Konar, M.

    2017-12-01

    We live in an era of big data. However, water use data remains sparse. There is an urgent need to enhance both the quality and resolution of water data. Metered water use information - as opposed to estimated water use, typically based on climate - would enhance the quality of existing water databases. Metered water use data would enable the research community to evaluate the "who, where, and when" of water use. Importantly, this information would enable the scientific community to better understand decision making related to water use (i.e. the "why"), providing the insight necessary to guide policies that promote water conservation. Metered water use data is needed at a sufficient resolution (i.e. spatial, temporal, and water user) to fully resolve how water is used throughout the economy and society. Improving the quality and resolution of water use data will enable scientific understanding that can inform policy.

  19. Enabling comparative modeling of closely related genomes: Example genus Brucella

    DOE PAGES

    Faria, José P.; Edirisinghe, Janaka N.; Davis, James J.; ...

    2014-03-08

    For many scientific applications, it is highly desirable to be able to compare metabolic models of closely related genomes. In this study, we attempt to raise awareness of the fact that taking annotated genomes from public repositories and using them for metabolic model reconstructions is far from trivial due to annotation inconsistencies. We propose a protocol for comparative analysis of metabolic models of closely related genomes, using fifteen strains of the genus Brucella, which contains pathogens of both humans and livestock. This study led to the identification and subsequent correction of inconsistent annotations in the SEED database, as well as the identification of 31 biochemical reactions that are common to Brucella but were not originally identified by automated metabolic reconstructions. We are currently implementing this protocol to improve automated annotations within the SEED database, and these improvements have been propagated into PATRIC, Model-SEED, KBase and RAST. This method is an enabling step toward the future creation of consistent annotation systems and high-quality model reconstructions that will support the prediction of accurate phenotypes such as pathogenicity, media requirements or type of respiration.

  20. Enabling comparative modeling of closely related genomes: Example genus Brucella

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faria, José P.; Edirisinghe, Janaka N.; Davis, James J.

    For many scientific applications, it is highly desirable to be able to compare metabolic models of closely related genomes. In this study, we attempt to raise awareness of the fact that taking annotated genomes from public repositories and using them for metabolic model reconstructions is far from trivial due to annotation inconsistencies. We propose a protocol for comparative analysis of metabolic models of closely related genomes, using fifteen strains of the genus Brucella, which contains pathogens of both humans and livestock. This study led to the identification and subsequent correction of inconsistent annotations in the SEED database, as well as the identification of 31 biochemical reactions that are common to Brucella but were not originally identified by automated metabolic reconstructions. We are currently implementing this protocol to improve automated annotations within the SEED database, and these improvements have been propagated into PATRIC, Model-SEED, KBase and RAST. This method is an enabling step toward the future creation of consistent annotation systems and high-quality model reconstructions that will support the prediction of accurate phenotypes such as pathogenicity, media requirements or type of respiration.

  1. Arctic Synthesis Collaboratory: A Virtual Organization for Transformative Research and Education on a Changing Arctic

    NASA Astrophysics Data System (ADS)

    Warnick, W. K.; Wiggins, H. V.; Hinzman, L.; Holland, M.; Murray, M. S.; Vörösmarty, C.; Loring, A. J.

    2008-12-01

    About the Arctic Synthesis Collaboratory: The Arctic Synthesis Collaboratory concept, developed through a series of NSF-funded workshops and town hall meetings, is envisioned as a cyber-enabled, technical, organizational, and social-synthesis framework to foster:
    • Interactions among interdisciplinary experts and stakeholders
    • Integrated data analysis and modeling activities
    • Training and development of the arctic science community
    • Delivery of outreach, education, and policy-relevant resources
    Scientific Rationale: The rapid rate of arctic change and our incomplete understanding of the arctic system present the arctic community with a grand scientific challenge and three related issues. First, a wealth of observations now exists as disconnected data holdings, which must be coordinated and synthesized to fully detect and assess arctic change. Second, despite great strides in the development of arctic system simulations, we still have incomplete capabilities for modeling and predicting the behavior of the system as a whole. Third, policy-makers, stakeholders, and the public are increasingly making demands of the science community for forecasts and guidance in mitigation and adaptation strategies.
    Collaboratory Components: The Arctic Synthesis Collaboratory is organized around four integrated functions that will be established virtually as a distributed set of activities, but also with the advantage of existing facilities that could sponsor some of the identified activities.
    Community Network "Meeting Grounds": The Collaboratory will link distributed individuals, organizations, and activities to enable collaboration and foster new research initiatives. Specific activities could include: an expert directory, social networking services, and virtual and face-to-face meetings.
    Data Integration, Synthesis, and Modeling Activities: The Collaboratory will utilize appropriate tools to enable the combination of data and models. Specific activities could include: a web-enabled model library, user forums, a data search and discovery system, and an online library.
    Support Scientist Professional Development: Experts at all career levels must keep pace with the newest developments in data integration and modeling, interdisciplinary science, and cyber-enabled collaboration. Specific project activities could include: web seminars, short courses, and a mentor program.
    Education, Outreach, and Policy Resources: An Arctic Virtual Outreach Center (AVOC) will provide critical education, outreach, and policy elements of the Collaboratory. Specific activities could include: public eSeminars, a virtual pressroom, K-12 classroom resources, and an eNewsletter.
    A Collaboratory Implementation Workshop is being planned for winter 2009; further details will be available soon. For more information, contact Helen V. Wiggins, Arctic Research Consortium of the U.S. (ARCUS) at: helen@arcus.org, or go to the website of the community workshop, "New Perspectives through Data Discovery and Modeling," at: http://www.arcus.org/ARCSS/2007_data/index.html.

  2. Open cyberGIS software for geospatial research and education in the big data era

    NASA Astrophysics Data System (ADS)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve the various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  3. Joined-up Planetary Information, in the Cloud and on Devices.

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Emmott, S.; Purves, D. W.; Joppa, L. N.; Lyutsarev, V.

    2014-12-01

    In scientific research and development, emphasis is placed on research over development. A significant cost is that the two-way interaction between scientific insights and societal needs does not function effectively to lead to impacts in the wider world. We simply must embrace new software and hardware approaches if we are to provide timely predictive information to address global problems, support businesses and inform governments and citizens. The Microsoft Research Computational Science Lab has been pioneering research into software and methodologies to provide useful and usable new environmental information. Our approach has been very joined-up: from accelerating data acquisition from the field with remote sensor technology, targeted data collection and citizen science, to enabling process-based modelling using multiple heterogeneous datasets in the cloud, and enabling the resulting planetary information to be accessed from any device. This talk will demonstrate some of the specific research and development we are doing to accelerate the pace at which important science has impact on the wider world, and will emphasise the important insights gained from advancing the research and development together.

  4. Making species checklists understandable to machines - a shift from relational databases to ontologies.

    PubMed

    Laurenne, Nina; Tuominen, Jouni; Saarenmaa, Hannu; Hyvönen, Eero

    2014-01-01

    The scientific names of plants and animals play a major role in Life Sciences as information is indexed, integrated, and searched using scientific names. The main problem with names is their ambiguous nature, because more than one name may point to the same taxon and multiple taxa may share the same name. In addition, scientific names change over time, which makes them open to various interpretations. Applying machine-understandable semantics to these names enables efficient processing of biological content in information systems. The first step is to use unique persistent identifiers instead of name strings when referring to taxa. The most commonly used identifiers are Life Science Identifiers (LSID), which are traditionally used in relational databases, and more recently HTTP URIs, which are applied on the Semantic Web by Linked Data applications. We introduce two models for expressing taxonomic information in the form of species checklists. First, we show how species checklists are presented in a relational database system using LSIDs. Then, in order to gain a more detailed representation of taxonomic information, we introduce the meta-ontology TaxMeOn to model the same content as Semantic Web ontologies in which taxa are identified using HTTP URIs. We also explore how changes in scientific names can be managed over time. The use of HTTP URIs is preferable for presenting the taxonomic information of species checklists. An HTTP URI identifies a taxon and operates as a web address from which additional information about the taxon can be located, unlike an LSID. This enables the integration of biological data from different sources on the web using Linked Data principles and prevents the formation of information silos. The Linked Data approach allows a user to assemble information and evaluate the complexity of taxonomic data based on conflicting views of taxonomic classifications. Using HTTP URIs and Semantic Web technologies also facilitates the representation of the semantics of biological data and, in this way, the creation of more "intelligent" biological applications and services.
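
    As a minimal sketch of the Linked Data pattern described above (not the TaxMeOn vocabulary itself), the following Python fragment publishes a taxon as an HTTP URI resource with a few illustrative properties; the namespace, identifiers, and property names are hypothetical.

      from rdflib import Graph, Literal, Namespace, RDF

      # Hypothetical namespace, identifiers, and properties, used only for illustration;
      # this is not the actual TaxMeOn vocabulary.
      EX = Namespace("http://example.org/taxon/")
      g = Graph()

      taxon = EX["Parus_major_FI2007"]          # one checklist's concept of this species
      g.add((taxon, RDF.type, EX.TaxonName))
      g.add((taxon, EX.scientificName, Literal("Parus major")))
      g.add((taxon, EX.publishedIn, EX["FinnishChecklist2007"]))
      # A later checklist can state how its own HTTP URI relates to this one,
      # which is how changing or conflicting classifications stay linked over time.
      g.add((taxon, EX.congruentWith, EX["Parus_major_FI2010"]))

      print(g.serialize(format="turtle"))

    Because each identifier is a resolvable web address, later checklists can assert relations between their own taxon URIs and earlier ones, which is what keeps the data from forming silos.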

  5. Telescope Scientist on the Advanced X-ray Astrophysics Observatory

    NASA Technical Reports Server (NTRS)

    VanSpeybroeck, L.; Smith, Carl M. (Technical Monitor)

    2002-01-01

    This period included many scientific observations made with the Chandra Observatory. The results, as is well known, are spectacular. Fortunately, the High Resolution Mirror Assembly (HRMA) performance continues to be essentially identical to that predicted from ground calibration data. The Telescope Scientist Team has improved the mirror model to provide a more accurate description to Chandra observers and enable them to reduce the systematic errors and uncertainties in their data reduction. We also have made considerable progress in improving the scattering model. There also has been progress in the scientific program. To date, 58 distant clusters of galaxies have been observed. We are performing a systematic analysis of this rather large data set for the purpose of determining absolute distances utilizing the Sunyaev-Zel'dovich effect. These observations also have been used to study the evolution of the cluster baryon mass function and the cosmological constraints which result from this evolution.

  6. Magnetic Field Lines on the Sun

    NASA Image and Video Library

    2015-01-28

    Scientists have developed a way to produce, several times each day, models of where the Sun's magnetic field lines are. Here we have created a time-lapse version of these models over four days (2-3 models each day) to give you a peek at how they change over time. The spiraling arcs of magnetic field lines emerge from active regions and connect back to areas with the opposite polarity. The field lines are more concentrated where regions are more magnetically intense. And of course, they rotate with the rotation of the Sun. Credit: NASA/Solar Dynamics Observatory NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  7. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  8. The Neuroanatomical, Neurophysiological and Psychological Basis of Memory: Current Models and Their Origins

    PubMed Central

    Camina, Eduardo; Güell, Francisco

    2017-01-01

    This review aims to classify and clarify, from a neuroanatomical, neurophysiological, and psychological perspective, different memory models that are currently widespread in the literature as well as to describe their origins. We believe it is important to consider previous developments without which one cannot adequately understand the kinds of models that are now current in the scientific literature. This article intends to provide a comprehensive and rigorous overview for understanding and ordering the latest scientific advances related to this subject. The main forms of memory presented include sensory memory, short-term memory, and long-term memory. Information from the world around us is first stored by sensory memory, thus enabling the storage and future use of such information. Short-term memory (or memory) refers to information processed in a short period of time. Long-term memory allows us to store information for long periods of time, including information that can be retrieved consciously (explicit memory) or unconsciously (implicit memory). PMID:28713278

  9. The Neuroanatomical, Neurophysiological and Psychological Basis of Memory: Current Models and Their Origins.

    PubMed

    Camina, Eduardo; Güell, Francisco

    2017-01-01

    This review aims to classify and clarify, from a neuroanatomical, neurophysiological, and psychological perspective, different memory models that are currently widespread in the literature as well as to describe their origins. We believe it is important to consider previous developments without which one cannot adequately understand the kinds of models that are now current in the scientific literature. This article intends to provide a comprehensive and rigorous overview for understanding and ordering the latest scientific advances related to this subject. The main forms of memory presented include sensory memory, short-term memory, and long-term memory. Information from the world around us is first stored by sensory memory, thus enabling the storage and future use of such information. Short-term memory (or memory) refers to information processed in a short period of time. Long-term memory allows us to store information for long periods of time, including information that can be retrieved consciously (explicit memory) or unconsciously (implicit memory).

  10. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed, and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of the ice flow and calving models by means of transitions through sequential, nested, and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, becomes more complex. From the implementation perspective, this workflow was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, debugging and validation of results became more cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of high-level scientific workflow middleware makes reproducing results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
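
    As an informal illustration of the coupling pattern described above (a sketch only, not the actual UNICORE workflow definition), the following Python fragment mimics the sequential, nested, and iterative structure of the pipeline, with placeholder functions standing in for the Elmer/Ice job, the HiDEM job, and the two format-conversion tasks.

      # Placeholder functions stand in for workflow tasks; in the real system these
      # would be HPC jobs and conversion steps managed by the workflow middleware.

      def run_elmer_ice(cycle):          # continuum ice flow step
          print(f"[cycle {cycle}] continuum ice flow (Elmer/Ice)")
          return {"geometry": f"geometry_{cycle}.msh"}

      def to_hidem_format(state):        # convert mesh output to particle input
          print(f"  converting {state['geometry']} to a particle lattice")
          return {"particles": state["geometry"].replace(".msh", ".particles")}

      def run_hidem(state):              # discrete element calving/crevassing step
          print(f"  calving model (HiDEM) on {state['particles']}")
          return {"front": state["particles"].replace(".particles", ".front")}

      def to_elmer_format(state):        # convert the calved front back for the next cycle
          print(f"  converting {state['front']} back to a continuum mesh")

      for cycle in range(3):             # iterative coupling of the two models
          state = run_elmer_ice(cycle)
          state = to_hidem_format(state)
          state = run_hidem(state)
          to_elmer_format(state)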

  11. Social and Behavioral Science: Monitoring Social Foraging Behavior in a Biological Model System

    DTIC Science & Technology

    2016-10-12

    The aim of this project was to establish instrumentation to record honey bee foraging behavior through Radio-Frequency Identification (RFID) monitoring and to train students in the use of this technology and in the science underlying honey bee behavior. This enables basic scientific advances in how honey bees adapt behaviorally to different stressors. Most notably, it will examine how early life stress and…

  12. Introducing Argonne’s Theta Supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Theta, the Argonne Leadership Computing Facility’s (ALCF) new Intel-Cray supercomputer, is officially open to the research community. Theta’s massively parallel, many-core architecture puts the ALCF on the path to Aurora, the facility’s future Intel-Cray system. Capable of nearly 10 quadrillion calculations per second, Theta enables researchers to break new ground in scientific investigations that range from modeling the inner workings of the brain to developing new materials for renewable energy applications.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  14. JWST Full-Scale Model on Display in Orlando

    NASA Image and Video Library

    2017-12-08

    JWST Full-Scale Model on Display. A full-scale model of the James Webb Space Telescope was built by the prime contractor, Northrop Grumman, to provide a better understanding of the size, scale and complexity of this satellite. The model is constructed mainly of aluminum and steel, weighs 12,000 lb., and is approximately 80 feet long, 40 feet wide and 40 feet tall. The model requires two trucks to ship it, and assembly takes a crew of 12 approximately four days. The model has traveled to several sites since 2005, and photographs were taken at some of its destinations. The model was on display at The International Society for Optical Engineering's (SPIE) week-long Astronomical Telescopes and Instrumentation conference, May 25-30, 2006. Credit: NASA/Goddard Space Flight Center/Dr Mark Clampin NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  15. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data life cycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data depends on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions on using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that exist during various stages of the data life cycle have been identified. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  16. Modeling Extra-Long Tsunami Propagation: Assessing Data, Model Accuracy and Forecast Implications

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Moore, C. W.; Rabinovich, A.

    2017-12-01

    Detecting and modeling tsunamis propagating tens of thousands of kilometers from the source is a formidable scientific challenge and seemingly satisfies only scientific curiosity. However, the results of such analyses produce valuable insight into tsunami propagation dynamics and model accuracy, and provide important implications for tsunami forecasting. The Mw = 9.3 megathrust earthquake of December 26, 2004 off the coast of Sumatra generated a tsunami that devastated Indian Ocean coastlines and spread into the Pacific and Atlantic oceans. The tsunami was recorded by a great number of coastal tide gauges, including those located 15-25 thousand kilometers from the source area. To date, it is still the farthest instrumentally detected tsunami. The data from these instruments throughout the world oceans enabled estimation of various statistical parameters and the energy decay of this event. High-resolution records of this tsunami from DARTs 32401 (offshore of northern Chile), 46405 and NeMO (both offshore of the US West Coast), combined with the mainland tide gauge measurements, enabled us to examine far-field characteristics of the 2004 tsunami in the Pacific Ocean and to compare the results of global numerical simulations with the observations. Despite their small heights (less than 2 cm at deep-ocean locations), the records demonstrated consistent spatial and temporal structure. The numerical model described well the frequency content, amplitudes and general structure of the observed waves at deep-ocean and coastal gauges. We present analysis of the measurements and comparison with model data to discuss implications for tsunami forecast accuracy. Model study at such extreme distances from the tsunami source and at extra-long times after the event is an attempt to find accuracy bounds for tsunami models and accuracy limitations on the use of models for forecasting. We discuss the results in application to tsunami model forecasting and tsunami modeling in general.

  17. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  18. Enabling technologies built on a sonochemical platform: challenges and opportunities.

    PubMed

    Cintas, Pedro; Tagliapietra, Silvia; Caporaso, Marina; Tabasso, Silvia; Cravotto, Giancarlo

    2015-07-01

    Scientific and technological progress now occurs at the interface between two or more scientific and technical disciplines, and chemistry is intertwined with almost all scientific domains. Complementary and synergistic effects have been found in the overlap between sonochemistry and other enabling technologies such as mechanochemistry, microwave chemistry and flow chemistry. Although their nature and effects are intrinsically different, these techniques share the ability to significantly activate most chemical processes and give rise to peculiar phenomena. These studies offer a comprehensive overview of sonochemistry, provide a better understanding of correlated phenomena (mechanochemical effects, hot spots, etc.), and pave the way for emerging applications that unite these techniques in hybrid reactors. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climate modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is compounded by advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data together to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm, which occurred over East Asia from 26 to 28 April 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this newly automated and integrated framework can be used to give advance, near real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
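
    A schematic sketch of such a service chain is given below; the process identifiers and inputs are invented for illustration, and a simple stub stands in for the OGC WPS Execute requests that the orchestration engine would issue in the real framework.

      # Three model services invoked in sequence, with each step's output feeding the next.
      # Process names and parameters are placeholders, not the actual WPS process identifiers.

      def wps_execute(process, inputs):
          """Stand-in for an OGC WPS Execute request; here it just echoes its inputs."""
          print(f"Execute {process} with {inputs}")
          return {f"{process}_output": f"{process}.nc"}

      def detect_and_track(region, date):
          aot = wps_execute("sbdart_aot", {"region": region, "date": date})
          met = wps_execute("wrf_meteorology", {"region": region, "date": date})
          plume = wps_execute("hysplit_transport", {**aot, **met})
          # The final product (AOT fields plus transport paths) would be encoded as
          # KML/XML for display in a virtual globe such as Google Earth.
          return plume

      detect_and_track(region="East Asia", date="2012-04-26")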

  20. Envisioning a Future of Computational Geoscience in a Data Rich Semantic World

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Elag, M.; Jiang, P.; Marini, L.

    2015-12-01

    Advances in observational systems and reductions in their cost are allowing us to explore, monitor, and digitally represent our environment in unprecedented detail and over large areas. Low-cost in situ sensors, unmanned autonomous vehicles, imaging technologies, and other new observational approaches, along with airborne and spaceborne systems, are allowing us to measure nearly everything, almost everywhere, almost all of the time. Under the aegis of observatories they are enabling an integrated view across space and time scales ranging from storms to seasons to years and, in some cases, decades. The rapid increase in the convergence of computational, communication and information systems, and their interoperability through advances in technologies such as the semantic web, can provide opportunities to further facilitate fusion and synthesis of heterogeneous measurements with knowledge systems. This integration can enable us to break disciplinary boundaries and bring sensor data directly to desktop or handheld devices. We describe cyberinfrastructure efforts being developed through projects such as EarthCube GeoSemantics (http://geosemantics.hydrocomplexity.net), SEAD (http://sead-data.net/), and Brown Dog (http://browndog.ncsa.illinois.edu/) so that data across all of Earth science can be easily shared and integrated with models. This also includes efforts to enable models to become interoperable among themselves and with data, using technologies that enable human-out-of-the-loop integration. Through such technologies our ability to use real-time information for decision-making and scientific investigations will increase multifold. Data go through a sequence of steps, often iterative, from collection to long-term preservation. Similarly, scientific investigation and its associated outcomes are composed of a number of iterative steps from problem identification to solutions. However, the integration between these two pathways is rather limited. We describe characteristics of new technologies that are needed to bring these processes together in the near future, to significantly reduce the latency between data, science, and agile and informed actions that support sustainability.

  1. Students' Positions and Considerations of Scientific Evidence about a Controversial Socioscientific Issue

    ERIC Educational Resources Information Center

    Albe, Virginie

    2008-01-01

    Efforts have been devoted to introducing into science curricula direct instruction for evaluating scientific reports on socioscientific issues. In this study, students' opinions on the SSI of mobile telephone effects were investigated before and after a classroom activity designed to enable students to assess scientific data. Aspects of the…

  2. Use of CFD modelling for analysing air parameters in auditorium halls

    NASA Astrophysics Data System (ADS)

    Cichowicz, Robert

    2017-11-01

    Modelling with the use of numerical methods is currently the most popular method of solving scientific as well as engineering problems. Thanks to the use of computer methods it is possible, for example, to comprehensively describe the conditions in a given room and to determine thermal comfort, which is a complex issue that includes the subjective sensations of the persons in the room. The article presents the results of measurements and numerical computations that enabled an assessment of environmental parameters, taking into consideration the microclimate, thermal comfort, air speeds in the zone of human presence, and dustiness in auditorium halls. For this purpose, measurements of temperature, relative humidity and dustiness were made with the use of a digital microclimate meter and a laser dust particle counter. On this basis, numerical computations were performed using the DesignBuilder application, and the obtained results enabled determination of the PMV comfort indicator in selected rooms.
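
    For reference, the PMV indicator mentioned above follows Fanger's formulation (standardized in ISO 7730); in outline, with the detailed heat-balance terms omitted, it reads:

      \mathrm{PMV} = \left(0.303\, e^{-0.036 M} + 0.028\right) L, \qquad
      \mathrm{PPD} = 100 - 95\, e^{-\left(0.03353\,\mathrm{PMV}^{4} + 0.2179\,\mathrm{PMV}^{2}\right)}

    where M is the metabolic rate (W/m²) and L is the thermal load on the body, i.e. the difference between internal heat production and the heat loss to the actual environment; PPD is the corresponding predicted percentage of dissatisfied occupants.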

  3. [Mathematical model of technical equipment of a clinical-diagnostic laboratory].

    PubMed

    Bukin, S I; Busygin, D V; Tilevich, M E

    1990-01-01

    The paper is concerned with the problems of technical equipment of standard clinical-diagnostic laboratories (CDL) in this country. The authors suggest a mathematical model that may minimize expenditures for laboratory studies. The model enables the following problems to be solved: to issue scientifically based recommendations for the technical equipment of CDL; to validate the medico-technical requirements for newly devised items; to select the optimum types of uniform items; to define optimal technical decisions at the design stage; to determine the laboratory assistant's labour productivity and the cost of individual investigations; and to compute the medical laboratory engineering requirements for treatment and prophylactic institutions of this country.

  4. Systems Biology of the Vervet Monkey

    PubMed Central

    Jasinska, Anna J.; Schmitt, Christopher A.; Service, Susan K.; Cantor, Rita M.; Dewar, Ken; Jentsch, James D.; Kaplan, Jay R.; Turner, Trudy R.; Warren, Wesley C.; Weinstock, George M.; Woods, Roger P.; Freimer, Nelson B.

    2013-01-01

    Nonhuman primates (NHP) provide crucial biomedical model systems intermediate between rodents and humans. The vervet monkey (also called the African green monkey) is a widely used NHP model that has unique value for genetic and genomic investigations of traits relevant to human diseases. This article describes the phylogeny and population history of the vervet monkey and summarizes the use of both captive and wild vervet monkeys in biomedical research. It also discusses the effort of an international collaboration to develop the vervet monkey as the most comprehensively phenotypically and genomically characterized NHP, a process that will enable the scientific community to employ this model for systems biology investigations. PMID:24174437

  5. Software-Reconfigurable Processors for Spacecraft

    NASA Technical Reports Server (NTRS)

    Farrington, Allen; Gray, Andrew; Bell, Bryan; Stanton, Valerie; Chong, Yong; Peters, Kenneth; Lee, Clement; Srinivasan, Jeffrey

    2005-01-01

    A report presents an overview of an architecture for a software-reconfigurable network data processor for a spacecraft engaged in scientific exploration. When executed on suitable electronic hardware, the software performs the functions of a physical layer (in effect, it acts as a software radio in that it performs modulation, demodulation, pulse-shaping, and error-correction coding and decoding), a data-link layer, a network layer, a transport layer, and application-layer processing of scientific data. The software-reconfigurable network processor is undergoing development to enable rapid prototyping and rapid implementation of communication, navigation, and scientific signal-processing functions; to provide a long-lived communication infrastructure; and to provide greatly improved scientific-instrumentation and scientific-data-processing functions by enabling science-driven in-flight reconfiguration of computing resources devoted to these functions. This development is an extension of terrestrial radio and network developments (e.g., in the cellular-telephone industry) implemented in software running on such hardware as field-programmable gate arrays, digital signal processors, traditional digital circuits, and mixed-signal application-specific integrated circuits (ASICs).

  6. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data that are discoverable in digital libraries and can be installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  7. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  8. Thermal conductivity model for powdered materials under vacuum based on experimental studies

    NASA Astrophysics Data System (ADS)

    Sakatani, N.; Ogawa, K.; Iijima, Y.; Arakawa, M.; Honda, R.; Tanaka, S.

    2017-01-01

    The thermal conductivity of powdered media is characteristically very low in vacuum, and is effectively dependent on many parameters of their constituent particles and packing structure. Understanding of the heat transfer mechanism within powder layers in vacuum and theoretical modeling of their thermal conductivity are of great importance for several scientific and engineering problems. In this paper, we report the results of systematic thermal conductivity measurements of powdered media of varied particle size, porosity, and temperature under vacuum using glass beads as a model material. Based on the obtained experimental data, we investigated the heat transfer mechanism in powdered media in detail, and constructed a new theoretical thermal conductivity model for the vacuum condition. This model enables an absolute thermal conductivity to be calculated for a powder with the input of a set of powder parameters including particle size, porosity, temperature, and compressional stress or gravity, and vice versa. Our model is expected to be a competent tool for several scientific and engineering fields of study related to powders, such as the thermal infrared observation of air-less planetary bodies, thermal evolution of planetesimals, and performance of thermal insulators and heat storage powders.
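
    As a rough indication of the structure of such models (an illustrative decomposition, not the specific formulation derived in the paper), the effective conductivity of a powder bed in vacuum is often written as the sum of a solid-contact term and a radiative term, with the radiative contribution growing with particle size and the cube of temperature:

      k_{\mathrm{eff}}(T) \approx k_{\mathrm{solid}} + k_{\mathrm{rad}}, \qquad k_{\mathrm{rad}} \sim 4\,\sigma\, F_{E}\, T^{3} d_{p}

    where σ is the Stefan-Boltzmann constant, F_E is a radiative exchange factor that depends on emissivity and porosity, and d_p is the particle diameter; the solid-contact term carries the dependence on compressional stress or gravity through the contact area between particles.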

  9. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.

  10. Report for the Office of Scientific and Technical Information: Population Modeling of the Emergence and Development of Scientific Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettencourt, L. M. A.; Castillo-Chavez, C.; Kaiser, D.

    2006-10-04

    The accelerated development of digital libraries and archives, in tandem with efficient search engines and the computational ability to retrieve and parse massive amounts of information, is making it possible to quantify the time evolution of scientific literatures. These data are but one piece of the tangible recorded evidence of the processes whereby scientists create and exchange information in their journeys towards the generation of knowledge. As such, these tools provide a proxy with which to study our ability to innovate. Innovation has often been linked with prosperity and growth and, consequently, trying to understand what drives scientific innovation is of extreme interest. Identifying sets of population characteristics, factors, and mechanisms that enable scientific communities to remain at the cutting edge, accelerate their growth, or increase their ability to re-organize around new themes or research topics is therefore of special significance. Yet generating a quantitative understanding of the factors that make scientific fields arise and/or become more or less productive is still in its infancy. This is precisely the type of knowledge most needed for promoting and sustaining innovation. Ideally, the efficient and strategic allocation of resources on the part of funding agencies and corporations would be driven primarily by knowledge of this type. Early steps have been taken toward such a quantitative understanding of scientific innovation. Some have focused on characterizing the broad properties of relevant time series, such as numbers of publications and authors in a given field. Others have focused on the structure and evolution of networks of coauthorship and citation. Together these types of studies provide much needed statistical analyses of the structure and evolution of scientific communities. Despite these efforts, however, crucial elements of prediction have remained elusive. Building on many of these earlier insights, we provide here a coarse-grained approach to modeling the time-evolution of scientific fields mathematically, through adaptive models of contagion. That is, our models are inspired by epidemic contact processes, but take into account the social interactions and processes whereby scientific ideas spread, with those social interactions gleaned from close empirical study of historical cases. Variations in model parameters can increase or hamper the speed at which a field develops. In this way, models for the spread of 'infectious' ideas can be used to identify pressure points in the process of innovation that may allow for the evaluation of possible interventions by those responsible for promoting innovation, such as funding agencies. This report is organized as follows: Section 2 introduces and discusses the population model used here to describe the dynamics behind the establishment of scientific fields. The approach is based on a succinct (coarse) description of contact processes between scientists, and is a simplified version of a general class of models developed in the course of this work. We selected this model based primarily on its ability to treat a wide range of data patterns efficiently, across several different scientific fields. We also describe our methods for estimating parameter values, our optimization techniques used to match the model to data, and our method of generating error estimates.
Section 3 presents brief accounts of six case studies of scientific evolution, measured by the growth in number of active authors over time, and shows the results of fitting our model to these data, including extrapolations to the near future. Section 4 discusses these results and provides some perspectives on the values and limitations of the models used. We also discuss topics for further research which should improve our ability to predict (and perhaps influence) the course of future scientific research. Section 5 provides more detail on the broad class of epidemic models developed as part of this project.
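
    As a hedged illustration of the kind of contact-process model referred to here (not the exact equations of the report), an epidemic-style formulation treats scientists who have not yet worked in a field as susceptible, active authors as infectious recruiters, and former authors as recovered:

      \frac{dS}{dt} = \Lambda - \beta \frac{S I}{N}, \qquad
      \frac{dI}{dt} = \beta \frac{S I}{N} - \gamma I, \qquad
      \frac{dR}{dt} = \gamma I, \qquad N = S + I + R

    where β is the rate at which contact with active authors recruits new ones, γ is the rate at which authors leave the field, and Λ is the influx of potential new authors. Varying such parameters speeds up or hampers the growth of the field, which is the sense in which contagion models can identify pressure points for possible interventions.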

  11. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code, addressing one of the major hurdles of computational research. In addition, this platform allows scientists to simulate and analyze the models in real time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178

  12. SWARM : a scientific workflow for supporting Bayesian approaches to improve metabolic models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, X.; Stevens, R.; Mathematics and Computer Science

    2008-01-01

    With the exponential growth of complete genome sequences, the analysis of these sequences is becoming a powerful approach to building genome-scale metabolic models. These models can be used to study individual molecular components and their relationships, and eventually to study cells as systems. However, constructing genome-scale metabolic models manually is time-consuming and labor-intensive. As a result, far fewer genome-scale metabolic models are available compared to the hundreds of genome sequences available. To tackle this problem, we designed SWARM, a scientific workflow that can be utilized to improve genome-scale metabolic models in a high-throughput fashion. SWARM deals with a range of issues including the integration of data across distributed resources, data format conversions, data update, and data provenance. Put together, SWARM streamlines the whole modeling process, which includes extracting data from various resources, deriving training datasets to train a set of predictors, applying Bayesian techniques to assemble the predictors, inferring on the ensemble of predictors to insert missing data, and eventually improving draft metabolic networks automatically. By enhancing metabolic model construction, SWARM enables scientists to generate many genome-scale metabolic models within a short period of time and with less effort.
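
    A minimal sketch of the ensemble idea (with invented numbers and predictor names, not values from SWARM itself) is to weight each predictor by how well it explained the training data and then average their predictions for a candidate gap-filling reaction:

      import math

      # Hypothetical per-predictor log-likelihoods on a training set.
      log_likelihood = {"sequence_homology": -12.1, "pathway_context": -10.4, "phylogenetic_profile": -14.8}

      # Posterior weights under a uniform prior: w_k proportional to exp(log L_k).
      m = max(log_likelihood.values())
      weights = {k: math.exp(v - m) for k, v in log_likelihood.items()}
      z = sum(weights.values())
      weights = {k: w / z for k, w in weights.items()}

      # Hypothetical probabilities, per predictor, that a missing reaction should be added.
      p_reaction = {"sequence_homology": 0.62, "pathway_context": 0.81, "phylogenetic_profile": 0.35}

      # Ensemble estimate: weighted average of the individual predictions.
      p_ensemble = sum(weights[k] * p_reaction[k] for k in weights)
      print(f"ensemble probability of adding the reaction: {p_ensemble:.2f}")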

  13. Thermo-chemical Ice Penetrator for Icy Moons

    NASA Astrophysics Data System (ADS)

    Arenberg, J. W.; Lee, G.; Harpole, G.; Zamel, J.; Sen, B.; Ross, F.; Retherford, K. D.

    2016-12-01

    The ability to place sensors or to take samples below the ice surface enables a wide variety of potential scientific investigations. Penetrating an ice cap can be accomplished via a mechanical drill, laser drill, kinetic impactor, or heated penetrator. This poster reports on the development of technology for the last of these options, namely a self-heated probe driven by an exothermic chemical reaction: a Thermo-chemical ice penetrator (TChIP). Our penetrator design employs a eutectic mix of alkali metals that produces an exothermic reaction upon contact with an icy surface. This reaction increases once the ice starts melting, so no external power is required. This technology is inspired by a classified Cold War-era program developed at Northrop Grumman for the US Navy. Terrestrial demonstration of this technology took place in the Arctic; however, this device cannot be considered high TRL for application at the icy moons of the solar system due to the environmental differences between Earth's Arctic and the icy moons. These differences demand a TChIP design specific to these cold, low-mass, airless worlds. It is expected that a model of TChIP performance will be complex, incorporating all of the forces on the penetrator, including gravity, the thermo-chemistry at the penetrator-ice interface, multi-phase heat and mass transport, and hydrodynamics. Our initial efforts are aimed at the development of a validated set of tools and simulations to predict the performance of the penetrator both for the environment found on these icy moons and for a terrestrial environment. The purpose of including the terrestrial environment is to aid in model validation. Once developed and validated, our models will allow us to design penetrators for a specific scientific application on a specific body. This poster discusses the range of scientific investigations that are enabled by TChIP. We also introduce the development plan to advance TChIP to the point where it can be considered for infusion into a program.

  14. Technology for a Thermo-chemical Ice Penetrator for Icy Moons

    NASA Astrophysics Data System (ADS)

    Arenberg, Jonathan; Harpole, George; Zamel, James; Sen, Bashwar; Lee, Greg; Ross, Floyd; Retherford, Kurt D.

    2016-10-01

    The ability to place sensors or to take samples below the ice surface enables a wide variety of potential scientific investigations. Penetrating an ice cap can be accomplished via a mechanical drill, laser drill, kinetic impactor, or heated penetrator. This poster reports on the development of technology for the last of these options, namely a self-heated probe driven by an exothermic chemical reaction: a Thermo-chemical ice penetrator (TChIP). Our penetrator design employs a eutectic mix of alkali metals that produces an exothermic reaction upon contact with an icy surface. This reaction increases once the ice starts melting, so no external power is required. This technology is inspired by a classified Cold War-era program developed at Northrop Grumman for the US Navy. Terrestrial demonstration of this technology took place in the Arctic; however, this device cannot be considered high TRL for application at the icy moons of the solar system due to the environmental differences between Earth's Arctic and the icy moons. These differences demand a TChIP design specific to these cold, low-mass, airless worlds. It is expected that a model of TChIP performance will be complex, incorporating all of the forces on the penetrator, including gravity, the thermo-chemistry at the penetrator-ice interface, multi-phase heat and mass transport, and hydrodynamics. Our initial efforts are aimed at the development of a validated set of tools and simulations to predict the performance of the penetrator both for the environment found on these icy moons and for a terrestrial environment. The purpose of including the terrestrial environment is to aid in model validation. Once developed and validated, our models will allow us to design penetrators for a specific scientific application on a specific body. This poster discusses the range of scientific investigations that are enabled by TChIP. We also introduce the development plan to advance TChIP to the point where it can be considered for infusion into a program.

  15. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
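
    As an example of the kind of data-collection cell such a notebook might contain, the fragment below pulls recent streamflow for a single USGS gauge into a pandas DataFrame; the service URL and parameter codes follow the commonly documented USGS Instantaneous Values web service and should be checked against current USGS documentation before relying on them.

      import requests
      import pandas as pd

      params = {
          "format": "json",
          "sites": "09380000",      # example gauge: Colorado River at Lees Ferry, AZ
          "parameterCd": "00060",   # discharge, cubic feet per second
          "period": "P7D",          # last seven days
      }
      resp = requests.get("https://waterservices.usgs.gov/nwis/iv/", params=params, timeout=30)
      resp.raise_for_status()

      # Flatten the nested JSON returned by the service into a tidy table.
      values = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
      df = pd.DataFrame(values)
      df["dateTime"] = pd.to_datetime(df["dateTime"])
      df["value"] = pd.to_numeric(df["value"])
      print(df.tail())

    Keeping such cells alongside the narrative text is what makes the toolchain self-documenting and straightforward to archive, share, or re-execute.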

  16. Management and assimilation of diverse, distributed watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Faybishenko, B.; Versteeg, R.; Agarwal, D.; Hubbard, S. S.; Hendrix, V.

    2016-12-01

    The U.S. Department of Energy's (DOE) Watershed Function Scientific Focus Area (SFA) seeks to determine how perturbations to mountainous watersheds (e.g., floods, drought, early snowmelt) impact the downstream delivery of water, nutrients, carbon, and metals over seasonal to decadal timescales. We are building a software platform that enables integration of diverse and disparate field, laboratory, and simulation datasets, of various types including hydrological, geological, meteorological, geophysical, geochemical, ecological and genomic datasets, across a range of spatial and temporal scales within the Rifle floodplain and the East River watershed, Colorado. We are using agile data management and assimilation approaches to enable web-based integration of heterogeneous, multi-scale data. Sensor-based observations of water level, vadose zone and groundwater temperature, water quality, and meteorology, as well as biogeochemical analyses of soil and groundwater samples, have been curated and archived in federated databases. Quality Assurance and Quality Control (QA/QC) are performed on priority datasets needed for ongoing scientific analyses and for hydrological and geochemical modeling. Automated QA/QC methods are used to identify and flag issues in the datasets. Data integration is achieved via a brokering service that dynamically integrates data from distributed databases via web services, based on user queries. The integrated results are presented to users in a portal that enables intuitive search, interactive visualization and download of integrated datasets. The concepts, approaches and codes being used are shared across the data science components of several large DOE-funded projects, such as the Watershed Function SFA, the Next Generation Ecosystem Experiment (NGEE) Tropics, Ameriflux/FLUXNET, and the Advanced Simulation Capability for Environmental Management (ASCEM), and together contribute towards DOE's cyberinfrastructure for data management and model-data integration.
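
    A small sketch of the kind of automated QA/QC flagging mentioned above is shown below; the thresholds, column names, and flag labels are invented for illustration.

      import numpy as np
      import pandas as pd

      # Hypothetical hourly water-temperature record with a gap and a spike.
      df = pd.DataFrame({
          "timestamp": pd.to_datetime([
              "2016-06-01 00:00", "2016-06-01 01:00", "2016-06-01 02:00",
              "2016-06-01 03:00", "2016-06-01 04:00", "2016-06-01 05:00",
          ]),
          "water_temp_C": [8.2, 8.4, 35.0, 8.5, np.nan, 8.6],
      })

      flags = pd.Series("ok", index=df.index)
      flags[df["water_temp_C"].isna()] = "missing"
      flags[(df["water_temp_C"] < 0) | (df["water_temp_C"] > 30)] = "out_of_range"
      # Simple spike test: a value differing from both neighbours by more than 10 degrees.
      jump_prev = df["water_temp_C"].diff().abs()
      jump_next = df["water_temp_C"].diff(-1).abs()
      flags[(jump_prev > 10) & (jump_next > 10)] = "spike"

      df["qaqc_flag"] = flags
      print(df)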

  17. The virtual mission approach: Empowering earth and space science missions

    NASA Astrophysics Data System (ADS)

    Hansen, Elaine

    1993-08-01

    Future Earth and Space Science missions will address increasingly broad and complex scientific issues. To accomplish this task, we will need to acquire and coordinate data sets from a number of different instruments, to make coordinated observations of a given phenomenon, and to coordinate the operation of the many individual instruments making these observations. These instruments will need to be used together as a single "Virtual Mission." This coordinated approach is complicated in that these scientific instruments will generally be on different platforms, in different orbits, from different control centers, at different institutions, and report to different user groups. Before this Virtual Mission approach can be implemented, techniques need to be developed to enable separate instruments to work together harmoniously, to execute observing sequences in a synchronized manner, and to be managed by the Virtual Mission authority during times of these coordinated activities. Enabling technologies include object-oriented design approaches, extended operations management concepts and distributed computing techniques. Once these technologies are developed and the Virtual Mission concept is available, we believe the concept will provide NASA's Science Program with a new, "go-as-you-pay," flexible, and resilient way of accomplishing its science observing program. The concept will foster the use of smaller and lower cost satellites. It will enable the fleet of scientific satellites to evolve in directions that best meet prevailing science needs. It will empower scientists by enabling them to mix and match various combinations of in-space, ground, and suborbital instruments, combinations which can be called up quickly in response to new events or discoveries. And, it will enable small groups such as universities, Space Grant colleges, and small businesses to participate significantly in the program by developing small components of this evolving scientific fleet.

  18. Scientific Hybrid Reality Environments (SHyRE): Bringing Field Work into the Laboratory

    NASA Astrophysics Data System (ADS)

    Miller, M. J.; Graff, T.; Young, K.; Coan, D.; Whelley, P.; Richardson, J.; Knudson, C.; Bleacher, J.; Garry, W. B.; Delgado, F.; Noyes, M.; Valle, P.; Buffington, J.; Abercromby, A.

    2018-04-01

    The SHyRE program aims to develop a scientifically-robust analog environment using a new and innovative hybrid reality setting that enables frequent operational testing and rapid protocol development for future planetary exploration.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered on developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches.

  20. Advances and Remaining Challenges in the Study of Influenza and Anthrax Infection in Lung Cell Culture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Joshua D.; Straub, Timothy M.

    For over thirty years, immortalized lung cells have enabled researchers to elucidate lung-pathogen molecular interactions. However, over the last five years, numerous commercial companies have begun providing affordable, ready-to-use primary lung cells for use in research laboratories. Despite advances in primary cell culture, studies using immortalized lung cells still dominate the recent scientific literature. In this paper, we highlight recent influenza and anthrax studies using in vitro primary lung tissue models and how these models provide better predictive outcomes when extrapolated to in vivo observations.

  1. Advances and Remaining Challenges in the Study of Influenza and Anthrax Infection in Lung Cell Culture

    DOE PAGES

    Powell, Joshua D.; Straub, Timothy M.

    2018-01-17

    For over thirty years, immortalized lung cells have enabled researchers to elucidate lung-pathogen molecular interactions. However, over the last five years, numerous commercial companies have begun providing affordable, ready-to-use primary lung cells for use in research laboratories. Despite advances in primary cell culture, studies using immortalized lung cells still dominate the recent scientific literature. In this paper, we highlight recent influenza and anthrax studies using in vitro primary lung tissue models and how these models provide better predictive outcomes when extrapolated to in vivo observations.

  2. ArcGIS Framework for Scientific Data Analysis and Serving

    NASA Astrophysics Data System (ADS)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

    ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volume. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, represents multidimensional gridded data (e.g., rasters or images) as a hypercube and extends ArcGIS's capabilities to handle large-volume and near-real-time scientific data. Built on top of a geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes. The parallel geo-processing engine makes data ingestion fast and easy. A raster function, the definition of a raster processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation with variables ingested in the MDMD: for example, aggregating monthly averages from daily data, computing total rainfall of a year, calculating a heat index for forecast data, and identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis, and the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. In addition, MODIS land cover, the NDFD weather service, and the HYCOM ocean model will be used to illustrate how the ArcGIS platform and the MDMD model can facilitate scientific data visualization and analytics and how the analysis results can be shared with a broader audience through ArcGIS Online and Portal.
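
    The on-demand aggregations mentioned above (for example, monthly averages from daily data and annual totals) can be sketched outside ArcGIS with xarray on a synthetic hypercube; the variable names and synthetic data are illustrative, and this is not the MDMD API itself.

        import numpy as np
        import pandas as pd
        import xarray as xr

        # Synthetic daily rainfall cube (time, y, x) standing in for one variable of the hypercube
        time = pd.date_range("2015-01-01", "2015-12-31", freq="D")
        rain = xr.DataArray(
            np.random.default_rng(0).gamma(2.0, 3.0, size=(time.size, 4, 5)),
            coords={"time": time, "y": np.arange(4), "x": np.arange(5)},
            dims=("time", "y", "x"),
            name="precip_mm",
        )

        monthly_mean = rain.resample(time="1MS").mean()   # monthly average from daily data
        annual_total = rain.sum("time")                   # total rainfall of the year, per cell
        print(dict(monthly_mean.sizes), float(annual_total.mean()))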

  3. Exposure to traffic pollution: comparison between measurements and a model.

    PubMed

    Alili, F; Momas, I; Callais, F; Le Moullec, Y; Sacre, C; Chiron, M; Flori, J P

    2001-01-01

    French researchers from the Building Scientific and Technical Center have produced a traffic-exposure index. To achieve this, they used an air pollution dispersion model that enabled them to calculate automobile pollutant concentrations in front of subjects' residences and places of work. Researchers used this model, which was tested at 27 Paris canyon street sites, and compared nitrogen oxide measurements obtained with passive samplers during a 6-wk period with calculations derived from the model. There was a highly significant correlation (r = .83) between the two series of values; their mean concentrations were not significantly different. The results suggested that the aforementioned model could be a useful epidemiological tool for the classification of city dwellers by present, or even cumulative, exposure to automobile air pollution.
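
    The kind of comparison reported (modeled versus measured concentrations, summarized by a correlation coefficient and a test of the mean difference) can be reproduced with a short sketch; the numbers below are synthetic, not the study's data.

        import numpy as np
        from scipy import stats

        # Synthetic paired values: modeled and measured NOx concentrations (ug/m3) at several sites
        modeled = np.array([38.0, 52.0, 61.0, 45.0, 70.0, 55.0, 48.0, 66.0])
        measured = modeled * 0.9 + np.random.default_rng(1).normal(0.0, 5.0, modeled.size)

        r, p = stats.pearsonr(modeled, measured)        # correlation between the two series
        t, p_diff = stats.ttest_rel(modeled, measured)  # paired test of the mean difference
        print(f"r = {r:.2f} (p = {p:.3f}); mean-difference p = {p_diff:.3f}")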

  4. Scientific analogs and the development of human mission architectures for the Moon, deep space and Mars

    NASA Astrophysics Data System (ADS)

    Lim, D. S. S.; Abercromby, A.; Beaton, K.; Brady, A. L.; Cardman, Z.; Chappell, S.; Cockell, C. S.; Cohen, B. A.; Cohen, T.; Deans, M.; Deliz, I.; Downs, M.; Elphic, R. C.; Hamilton, J. C.; Heldmann, J.; Hillenius, S.; Hoffman, J.; Hughes, S. S.; Kobs-Nawotniak, S. E.; Lees, D. S.; Marquez, J.; Miller, M.; Milovsoroff, C.; Payler, S.; Sehlke, A.; Squyres, S. W.

    2016-12-01

    Analogs are destinations on Earth that allow researchers to approximate operational and/or physical conditions on other planetary bodies and within deep space. Over the past decade, our team has been conducting geobiological field science studies under simulated deep space and Mars mission conditions. Each of these missions integrates scientific and operational research with the goal of identifying concepts of operations (ConOps) and capabilities that will enable and enhance scientific return during human and human-robotic missions to the Moon, into deep space, and on Mars. Working under these simulated mission conditions presents a number of unique challenges that are not encountered during typical scientific field expeditions. However, there are significant benefits to this working model from the perspective of the human space flight and scientific operations research community. Specifically, by applying human (and human-robotic) mission architectures to real field science endeavors, we create a unique operational litmus test for those ConOps and capabilities that have otherwise been vetted under circumstances that did not necessarily demand scientific data return meeting the rigors of peer-review standards. The presentation will give an overview of our team's recent analog research, with a focus on scientific operations research. The intent is to encourage collaborative dialog with a broader set of analog research community members, with an eye towards future scientific field endeavors that will have a significant impact on how we design human and human-robotic missions to the Moon, into deep space, and to Mars.

  5. Auspice: Automatic Service Planning in Cloud/Grid Environments

    NASA Astrophysics Data System (ADS)

    Chiu, David; Agrawal, Gagan

    Recent scientific advances have fostered a mounting number of services and data sets available for utilization. These resources, though scattered across disparate locations, are often loosely coupled both semantically and operationally. This loosely coupled relationship implies the possibility of linking together operations and data sets to answer queries. This task, generally known as automatic service composition, abstracts the process of complex scientific workflow planning from the user. We have been exploring a metadata-driven approach toward automatic service workflow composition, among other enabling mechanisms, in our system, Auspice: Automatic Service Planning in Cloud/Grid Environments. In this paper, we present a complete overview of our system's unique features and its outlook for future deployment as the Cloud computing paradigm becomes increasingly prominent in enabling scientific computing.
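
    A minimal sketch of the metadata-driven composition idea: each service is described by the concepts it consumes and produces, and a plan is formed by forward chaining until the requested concept can be derived. The service names and the simple search strategy are illustrative assumptions, not Auspice's implementation.

        # Each service: (input concepts it needs, output concept it produces)
        SERVICES = {
            "fetch_dem":       ({"region"}, "elevation_grid"),
            "derive_slope":    ({"elevation_grid"}, "slope_grid"),
            "fetch_rainfall":  ({"region", "date"}, "rainfall_grid"),
            "runoff_estimate": ({"slope_grid", "rainfall_grid"}, "runoff_map"),
        }

        def compose(available, goal):
            """Forward chaining: apply any service whose inputs are already
            derivable until the goal concept appears or no progress is made."""
            plan, known, progress = [], set(available), True
            while goal not in known and progress:
                progress = False
                for name, (needs, produces) in SERVICES.items():
                    if name not in plan and needs <= known:
                        plan.append(name)
                        known.add(produces)
                        progress = True
            return plan if goal in known else None

        print(compose({"region", "date"}, "runoff_map"))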

  6. University Education in the USSR.

    ERIC Educational Resources Information Center

    Smirnov, A. G.; Kleho, Yu. Ya.

    1989-01-01

    Universities in the USSR fulfill the role of leading educational, scientific, and cultural centers. Their main function is training researchers and teachers and conducting scientific research. They also offer courses enabling adults to enrich their knowledge of various fields of culture. (SK)

  7. Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Our contributions to advancing the state of the art in scientific workflows have focused on the following areas: workflow development; generic workflow components and templates; provenance collection and analysis; and workflow reliability and fault tolerance.

  8. Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.

    PubMed

    Bender, Miriam

    2018-01-01

    Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in the philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomena. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that models do the work of defining the boundary conditions - the domain(s) - of a theory. Further analysis has shown the ways models can be constructed and function independent of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing and health/care knowledge and practice. © 2017 John Wiley & Sons Ltd.

  9. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has since grown to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web form-based access and programmatic access to all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications via Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
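
    Programmatic access of the kind described can be pictured with a hypothetical sketch; the endpoint URL, operation paths, and payload fields below are placeholders invented for illustration and are not Opal's documented interface.

        import time
        import requests

        SERVICE = "https://example.org/opal-style/services/meme"  # placeholder, not the real NBCR endpoint

        def submit_and_wait(fasta_text, poll_seconds=10):
            """Submit a job to a hypothetical Opal-style wrapper and poll until it finishes."""
            job = requests.post(SERVICE + "/launchJob",
                                data={"commandline": "-dna -nmotifs 3"},
                                files={"sequences.fasta": fasta_text}).json()
            while True:
                status = requests.get(SERVICE + "/queryStatus/" + job["jobID"]).json()
                if status["status"] in ("DONE", "FAILED"):
                    return status
                time.sleep(poll_seconds)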

  10. Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Hertz, J.; Huffer, E.; Kusterer, J.

    2012-12-01

    Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, making scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many anxious to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to de-mystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure the ontology reflects the core purpose of the domain's mission and that the ontology integrates and describes the supporting data in the language of the domain - how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and Earth's Radiant Energy (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open source software General Architecture for Text Engineering, a mature framework for computational tasks involving human language.

  11. Science Autonomy in Robotic Exploration

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; DeVincenzi, Donald (Technical Monitor)

    2001-01-01

    Historical mission operations have involved: (1) return of scientific data; (2) evaluation of these data by scientists; (3) recommendations for future mission activity by scientists; (4) commands for these transmitted to the craft; and (5) the activity being undertaken. This cycle is repeated throughout the mission with command opportunities once or twice per day. For a rover, this historical cycle is not amenable to rapid long range traverses or rapid response to any novel or unexpected situations. In addition to real-time response issues, imaging and/or spectroscopic devices can produce tremendous data volumes during a traverse. However, such data volumes can rapidly exceed on-board memory capabilities prior to the ability to transmit them to Earth. Additionally, the necessary communication bandwidths are restrictive enough that only a small portion of these data can actually be returned to Earth. Such scenarios suggest enabling some science decisions to be made on board the robots. These decisions involve automating various aspects of scientific discovery instead of the electromechanical control, health, and navigation issues associated with robotic operations. The robot retains access to the full data fidelity obtained by its scientific sensors, and is in the best position to implement actions based upon these data. Such an approach would eventually enable the robot to alter observations and ensure only the highest quality data are obtained for analysis. Additionally, the robot can begin to understand what is scientifically interesting and implement alternative observing sequences, because the observed data deviate from expectations based upon current theories/models of planetary processes. Such interesting data and/or conclusions can then be prioritized and selectively transmitted to Earth, reducing memory and communications demands. Results of Ames' current work in this area will be presented.
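
    The on-board prioritization idea can be illustrated with a small sketch (not flight software): score each observation by its deviation from an expected model spectrum and downlink only the highest-scoring subset within a bandwidth budget. The model spectrum, noise levels, and budget are assumptions for the example.

        import numpy as np

        rng = np.random.default_rng(42)
        expected = np.linspace(1.0, 0.5, 64)                  # expected spectral shape from a model
        spectra = expected + rng.normal(0.0, 0.02, (200, 64))
        spectra[17] += 0.3 * np.exp(-((np.arange(64) - 30) ** 2) / 20.0)  # inject an unexpected feature

        # Novelty score: RMS deviation from the expected model spectrum
        scores = np.sqrt(((spectra - expected) ** 2).mean(axis=1))

        budget = 5                                            # only 5 spectra fit in the downlink
        selected = np.argsort(scores)[::-1][:budget]
        print("downlink candidates:", sorted(selected.tolist()))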

  12. Robotic Exploration: The Role of Science Autonomy

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; DeVincenzi, Donald (Technical Monitor)

    2001-01-01

    Historical mission operations have involved: (1) return of scientific data; (2) evaluation of these data by scientists; (3) recommendations for future mission activity by scientists; (4) commands for these transmitted to the craft; and (5) the activity being undertaken. This cycle is repeated throughout the mission with command opportunities once or twice per day. For a rover, this historical cycle is not amenable to rapid long range traverses or rapid response to any novel or unexpected situations. In addition to real-time response issues, imaging and/or spectroscopic devices can produce tremendous data volumes during a traverse. However, such data volumes can rapidly exceed on-board memory capabilities prior to the ability to transmit them to Earth. Additionally, the necessary communication bandwidths are restrictive enough that only a small portion of these data can actually be returned to Earth. Such scenarios suggest enabling some science decisions to be made on board the robots. These decisions involve automating various aspects of scientific discovery instead of the electromechanical control, health, and navigation issues associated with robotic operations. The robot retains access to the full data fidelity obtained by its scientific sensors, and is in the best position to implement actions based upon these data. Such an approach would eventually enable the robot to alter observations and ensure only the highest quality data are obtained for analysis. Additionally, the robot can begin to understand what is scientifically interesting and implement alternative observing sequences, because the observed data deviate from expectations based upon current theories/models of planetary processes. Such interesting data and/or conclusions can then be prioritized and selectively transmitted to Earth, reducing memory and communications demands. Results of Ames' current work in this area will be presented.

  13. The thrill of scientific discovery and leadership with my group

    PubMed Central

    Greco, Valentina

    2016-01-01

    My group and I feel tremendously honored to be recognized with the 2016 Early Career Life Scientist Award from the American Society for Cell Biology. In this essay I share the scientific questions that my lab has been excitedly pursuing since starting in August 2009 and the leadership behaviors we have adopted that enable our collective scientific productivity. PMID:27799490

  14. Computer networks for remote laboratories in physics and engineering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Elizandro, David; Leiner, Barry M.; Wiskerchen, Michael

    1988-01-01

    This paper addresses a relatively new approach to scientific research, telescience, which is the conduct of scientific operations in locations remote from the site of central experimental activity. A testbed based on the concepts of telescience is being developed to ultimately enable scientific researchers on Earth to conduct experiments onboard the Space Station. This system, along with background materials, is discussed.

  15. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Fleury, Laurence; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Asencio, Nicole; Favot, Florence; Roussot, Odile

    2013-04-01

    The AMMA information system aims at expediting the communication of data and scientific results inside the AMMA community and beyond. It has already been adopted as the data management system by several projects and is meant to become a reference information system about the West Africa area for the whole scientific community. The AMMA database and the associated on line tools have been developed and are managed by two French teams (IPSL Database Centre, Palaiseau and OMP Data Service, Toulouse). The complete system has been fully duplicated and is operated by AGRHYMET Regional Centre in Niamey, Niger. The AMMA database contains a wide variety of datasets: about 250 local observation datasets that cover geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health...), coming from either operational networks or scientific experiments and including historical data in West Africa from 1850; 1350 outputs of a socio-economic questionnaire; 60 operational satellite products and several research products; and 10 output sets of meteorological and ocean operational models and 15 of research simulations. Database users can access all the data using either the portal http://database.amma-international.org or http://amma.agrhymet.ne/amma-data. Different modules are available. The complete catalogue enables access to metadata (i.e., information about the datasets) that are compliant with international standards (ISO19115, INSPIRE...). Registration pages enable users to read and sign the data and publication policy, and to apply for a user database account. The data access interface enables users to easily build a data extraction request by selecting various criteria like location, time, and parameters. At present, the AMMA database counts more than 740 registered users and processes about 80 data requests every month. In order to monitor day-to-day meteorological and environmental information over West Africa, some quick look and report display websites have been developed. They met the operational needs of the observational teams during the AMMA 2006 (http://aoc.amma-international.org) and FENNEC 2011 (http://fenoc.sedoo.fr) campaigns, but they also enable scientific teams to share physical indices throughout the monsoon season (http://misva.sedoo.fr from 2011). A collaborative WIKINDX tool has been set on line in order to manage scientific publications and communications of interest to AMMA (http://biblio.amma-international.org). The bibliographic database now counts about 1200 references and is the most exhaustive document collection about the African Monsoon available to all. Every scientist is invited to make use of the different AMMA on line tools and data. Scientists or project leaders who have data management needs for existing or future datasets over West Africa are welcome to use the AMMA database framework and to contact ammaAdmin@sedoo.fr.

  16. The place of human values in the language of science: Kuhn, Saussure, and structuralism.

    PubMed

    Psaty, B M; Inui, T S

    1991-12-01

    The current paradigm in medicine generally distinguishes between genetic and environmental causes of disease. Although the word "paradigm" has become a commonplace, the theories of Thomas Kuhn have not received much attention in the journals of medicine. Kuhn's structuralist method differs radically from the daily activities of the scientific method itself. Using linguistic theory, this essay offers a structuralist reading of Thomas Kuhn's The Structure of Scientific Revolutions. Our purpose is to highlight the similarities between these structuralist models of science and language. In part, we focus on the logic that enables Kuhn to assert the priority of perception over interpretation in the history of science. To illustrate some of these issues, we refer to the distinction between environmental and genetic causes of disease. While the activity of scientific research results in the revision of concepts in science, the production of significant differences that shape our knowledge is in part a social and linguistic process.

  17. The effects of diversity and network ties on innovations: The emergence of a new scientific field.

    PubMed

    Lungeanu, Alina; Contractor, Noshir S

    2015-05-01

    This study examines the influence of different types of diversity, both observable and unobservable, on the creation of innovative ideas. Our framework draws upon theory and research on information processing, social categorization, coordination, and homophily to posit the influence of cognitive, gender, and country diversity on innovation. Our longitudinal model is based on a unique dataset of 1,354 researchers who helped create the new scientific field of Oncofertility, by collaborating on 469 publications over a four-year period. We capture the differences among researchers along cognitive, country and gender dimensions, as well as examine how the resulting diversity or homophily influences the formation of collaborative innovation networks. We find that innovation, operationalized as publishing in a new scientific discipline, benefits from both homophily and diversity. Homophily in country of residence and working with prior collaborators help reduce uncertainty in the interactions associated with innovation, while diversity in knowledge enables the recombinant knowledge required for innovation.

  18. Scientific information repository assisting reflectance spectrometry in legal medicine.

    PubMed

    Belenki, Liudmila; Sterzik, Vera; Bohnert, Michael; Zimmermann, Klaus; Liehr, Andreas W

    2012-06-01

    Reflectance spectrometry is a fast and reliable method for the characterization of human skin if the spectra are analyzed with respect to a physical model describing the optical properties of human skin. For a field study performed at the Institute of Legal Medicine and the Freiburg Materials Research Center of the University of Freiburg, a scientific information repository has been developed, which is a variant of an electronic laboratory notebook and assists in the acquisition, management, and high-throughput analysis of reflectance spectra in heterogeneous research environments. At the core of the repository is a database management system hosting the master data. It is filled with primary data via a graphical user interface (GUI) programmed in Java, which also enables the user to browse the database and access the results of data analysis. The latter is carried out via Matlab, Python, and C programs, which retrieve the primary data from the scientific information repository, perform the analysis, and store the results in the database for further usage.
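
    The retrieve-analyze-store loop described can be sketched with SQLite standing in for the repository's database; the table and column names are invented for illustration and are not the repository's actual schema.

        import sqlite3
        import numpy as np

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE spectra (id INTEGER PRIMARY KEY, wavelengths TEXT, reflectance TEXT)")
        conn.execute("CREATE TABLE results (spectrum_id INTEGER, mean_reflectance REAL)")

        # Store one synthetic reflectance spectrum as the 'primary data'
        wl = np.linspace(400, 700, 31)
        refl = 0.3 + 0.1 * np.sin(wl / 50.0)
        conn.execute("INSERT INTO spectra VALUES (1, ?, ?)",
                     (",".join(map(str, wl)), ",".join(map(str, refl))))

        # Analysis step: retrieve primary data, derive a quantity, write the result back
        for sid, _, refl_text in conn.execute("SELECT * FROM spectra"):
            values = np.array([float(v) for v in refl_text.split(",")])
            conn.execute("INSERT INTO results VALUES (?, ?)", (sid, float(values.mean())))

        print(conn.execute("SELECT * FROM results").fetchall())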

  19. Heliophysics Radio Observations Enabled by the Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Kasper, J. C.

    2018-02-01

    This presentation reviews the scientific potential of low frequency radio imaging from space, the SunRISE radio interferometer, and the scientific value of larger future arrays in deep space and how they would benefit from the Deep Space Gateway.

  20. The Scientific and Technical Revolution in the Socialist Republic of Viet Nam.

    ERIC Educational Resources Information Center

    Vien, Nguyen Khac

    1979-01-01

    Discussed are the reasons for the Socialist Republic of Viet Nam's scientific backwardness. A development project which will enable this country to become a modern, economically self-sufficient country by the year 2000 is outlined. (BT)

  1. Implementing interactive decision support: A case for combining cyberinfrastructure, data fusion, and social process to mobilize scientific knowledge in sustainability problems

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.

    2014-12-01

    Geosciences are becoming increasingly data intensive, particularly in relation to sustainability problems, which are multi-dimensional, weakly structured and characterized by high levels of uncertainty. In the case of complex resource management problems, the challenge is to extract meaningful information from data and make sense of it. Simultaneously, scientific knowledge alone is insufficient to change practice. Creating tools and group decision support processes for end users to interact with data are key challenges to transforming science-based information into actionable knowledge. The ENCOMPASS project began as a multi-year case study in the Atacama Desert of Chile to design and implement a knowledge transfer model for energy-water-mining conflicts in the region. ENCOMPASS combines the use of cyberinfrastructure (CI), automated data collection, interactive interfaces for dynamic decision support, and participatory modelling to support social learning. A pilot version of the ENCOMPASS CI uses open source systems and serves as a structure to integrate and store multiple forms of data and knowledge, such as DEM, meteorological, water quality, geomicrobiological, energy demand, and groundwater models. In the case study, informatics and data fusion needs related to scientific uncertainty around deep groundwater flowpaths and energy-water connections. Users may upload data from field sites with handheld devices or desktops. Once uploaded, data assets are accessible for a variety of uses. To address multi-attribute decision problems in the Atacama region, a standalone application with touch-enabled interfaces was created to improve real-time group interactions with datasets. The tool was used to merge datasets from the ENCOMPASS CI to support exploration among alternatives and build shared understanding among stakeholders. To date, the project has increased technical capacity among stakeholders, resulted in the creation of both for-profit and non-profit entities, enabled cross-sector collaboration with mining-indigenous stakeholders, and produced an interactive application for group decision support. ENCOMPASS leverages advances in computational tools to deliver data and models for group decision support applied to sustainability science problems.

  2. Big Data from Europe's Natural Science Collections through DiSSCo

    NASA Astrophysics Data System (ADS)

    Addink, Wouter; Koureas, Dimitris; Casino, Ana

    2017-04-01

    DiSSCo, a Distributed System of Scientific Collections, will be a Research Infrastructure delivering big data describing the history of Planet Earth. Approximately 1.5 billion biological and geological specimens, representing the last 300 years of scientific study on the natural world, reside in collections all over Europe. These span 4.5 billion years of history, from the formation of the solar system to the present day. In the European landscape of environmental Research Infrastructures, different projects and landmarks describe services that aim at aggregating, monitoring, analysing and modelling geo-diversity information. The effectiveness of these services, however, is based on the quality and availability of primary reference data that today are scattered and incomplete. DiSSCo provides the required bio-geographical, taxonomic and species trait data at the level of precision and accuracy required to enable and speed up research on the rapidly growing seven grand societal challenges that are priorities of the Europe 2020 strategy. DiSSCo enables better connections between collection data and observations in biodiversity observation networks, such as EU BON and GEOBON. This supports research areas like long-term ecological research, for which the continuity and long-term record of biological collections is a strength.

  3. Enabling Research Tools for Sustained Climate Assessment

    NASA Technical Reports Server (NTRS)

    Leidner, Allison K.; Bosilovich, Michael G.; Jasinski, Michael F.; Nemani, Ramakrishna R.; Waliser, Duane Edward; Lee, Tsengdar J.

    2016-01-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  4. Highly coherent vacuum ultraviolet radiation at the 15th harmonic with echo-enabled harmonic generation technique

    NASA Astrophysics Data System (ADS)

    Hemsing, E.; Dunning, M.; Hast, C.; Raubenheimer, T. O.; Weathersby, S.; Xiang, D.

    2014-07-01

    X-ray free-electron lasers are enabling access to new science by producing ultrafast and intense x rays that give researchers unparalleled power and precision in examining the fundamental nature of matter. In the quest for fully coherent x rays, the echo-enabled harmonic generation technique is one of the most promising methods. In this technique, coherent radiation at the high harmonic frequencies of two seed lasers is generated from the recoherence of electron beam phase space memory. Here we report on the generation of highly coherent and stable vacuum ultraviolet radiation at the 15th harmonic of an infrared seed laser with this technique. The experiment demonstrates two distinct advantages that are intrinsic to the highly nonlinear phase space gymnastics of echo-enabled harmonic generation in a new regime, i.e., high frequency up-conversion efficiency and insensitivity to electron beam phase space imperfections. Our results allow comparison and confirmation of predictive models and scaling laws, and mark a significant step towards fully coherent x-ray free-electron lasers that will open new avenues of scientific research.

  5. Community-based Participatory Research

    PubMed Central

    Holkup, Patricia A.; Tripp-Reimer, Toni; Salois, Emily Matt; Weinert, Clarann

    2009-01-01

    Community-based participatory research (CBPR), with its emphasis on joining with the community as full and equal partners in all phases of the research process, is an appealing model for research with vulnerable populations. However, the CBPR approach is not without special challenges relating to ethical, cultural, and scientific issues. In this article, we describe how we managed the challenges we encountered while conducting a CBPR project with a Native American community. We also suggest criteria that will enable evaluation of the project. PMID:15455579

  6. Object-oriented structures supporting remote sensing databases

    NASA Technical Reports Server (NTRS)

    Wichmann, Keith; Cromp, Robert F.

    1995-01-01

    Object-oriented databases show promise for modeling the complex interrelationships pervasive in scientific domains. To examine the utility of this approach, we have developed an Intelligent Information Fusion System based on this technology, and applied it to the problem of managing an active repository of remotely-sensed satellite scenes. The design and implementation of the system is compared and contrasted with conventional relational database techniques, followed by a presentation of the underlying object-oriented data structures used to enable fast indexing into the data holdings.

  7. Final Report, “Exploiting Global View for Resilience”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chien, Andrew

    2017-03-29

    Final technical report for the "Exploiting Global View for Resilience" project. The GVR project aims to create a new approach to portable, resilient applications. The GVR approach builds on a global view data model, adding versioning (multi-version), user control of timing and rate (multi-stream), and flexible cross-layer error signalling and recovery. With a versioned array as a portable abstraction, GVR enables application programmers to exploit deep scientific and application code insights to manage resilience (and its overhead) in a flexible, portable fashion.
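
    The multi-version idea can be pictured with a small sketch, assuming a simple array wrapper rather than GVR's actual API: the application chooses when to take a version and can roll back to it after an error is signalled.

        import numpy as np

        class VersionedArray:
            """Array wrapper that keeps application-requested snapshots (versions)
            so a computation can roll back to the last known-good state."""
            def __init__(self, data):
                self.data = np.asarray(data, dtype=float)
                self.versions = []

            def snapshot(self):
                self.versions.append(self.data.copy())   # user controls timing and rate of versioning
                return len(self.versions) - 1

            def restore(self, version):
                self.data = self.versions[version].copy()

        field = VersionedArray(np.zeros(8))
        v0 = field.snapshot()
        field.data += 1.0                        # some computation step
        if not np.isfinite(field.data).all():    # error signalled by the application layer
            field.restore(v0)                    # recover from the last good version
        print(field.data)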

  8. Exploring the Possibilities: Earth and Space Science Missions in the Context of Exploration

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Calabrese, Michael; Kirkpatrick, James; Malay, Jonathan T.

    2006-01-01

    According to Dr. Edward J. Weiler, Director of the Goddard Space Flight Center, "Exploration without science is tourism". At the American Astronautical Society's 43rd Annual Robert H. Goddard Memorial Symposium it was quite apparent to all that NASA's current Exploration Initiative is tightly coupled to multiple scientific initiatives: exploration will enable new science and science will enable exploration. NASA's Science Mission Directorate plans to develop priority science missions that deliver science that is vital, compelling and urgent. This paper will discuss the theme of the Goddard Memorial Symposium that science plays a key role in exploration. It will summarize the key scientific questions and some of the space and Earth science missions proposed to answer them, including the Mars and Lunar Exploration Programs, the Beyond Einstein and Navigator Programs, and the Earth-Sun System missions. It will also discuss some of the key technologies that will enable these missions, including the latest in instruments and sensors, large space optical system technologies and optical communications, and briefly discuss developments and achievements since the Symposium. Throughout history, humans have made the biggest scientific discoveries by visiting unknown territories; by going to the Moon and other planets and by seeking out habitable worlds, NASA is continuing humanity's quest for scientific knowledge.

  9. Confidence interval or p-value?: part 4 of a series on evaluation of scientific publications.

    PubMed

    du Prel, Jean-Baptist; Hommel, Gerhard; Röhrig, Bernd; Blettner, Maria

    2009-05-01

    An understanding of p-values and confidence intervals is necessary for the evaluation of scientific articles. This article will inform the reader of the meaning and interpretation of these two statistical concepts. The uses of these two statistical concepts and the differences between them are discussed on the basis of a selective literature search concerning the methods employed in scientific articles. P-values in scientific studies are used to determine whether a null hypothesis formulated before the performance of the study is to be accepted or rejected. In exploratory studies, p-values enable the recognition of any statistically noteworthy findings. Confidence intervals provide information about a range in which the true value lies with a certain degree of probability, as well as about the direction and strength of the demonstrated effect. This enables conclusions to be drawn about the statistical plausibility and clinical relevance of the study findings. It is often useful for both statistical measures to be reported in scientific articles, because they provide complementary types of information.
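
    The two quantities discussed can be computed side by side in a short sketch; the data are synthetic and the two-group design is only an example.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        treatment = rng.normal(5.4, 1.0, 40)   # synthetic outcome values
        control = rng.normal(5.0, 1.0, 40)

        t, p = stats.ttest_ind(treatment, control)   # p-value for the null hypothesis of no difference

        diff = treatment.mean() - control.mean()
        se = np.sqrt(treatment.var(ddof=1) / treatment.size + control.var(ddof=1) / control.size)
        dof = treatment.size + control.size - 2
        ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, dof) * se  # 95% CI: direction and strength of the effect
        print(f"p = {p:.3f}, difference = {diff:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")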

  10. A simple model of hysteresis behavior using spreadsheet analysis

    NASA Astrophysics Data System (ADS)

    Ehrmann, A.; Blachowicz, T.

    2015-01-01

    Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, as liquid-solid phase transitions, and in cell biology or economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus a simple macro code, can be used by students to understand how these systems work and how the parameters influence the reactions of the system to an external field. Importantly, in the step-by-step mode, each change of the system state, compared to the last step, becomes visible. The simple program can be developed further by several changes and additions, enabling the building of a tool which is capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
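
    A Python analogue of such a parameterized hysteresis model can be sketched in a few lines (the original uses a spreadsheet plus a macro); the branch function and parameter values are illustrative assumptions, not the authors' model.

        import numpy as np
        import matplotlib.pyplot as plt

        def hysteresis_loop(h_max=2.0, h_c=0.5, width=0.2, n=400):
            """Sweep the external field up and down; the response follows a different
            branch in each sweep direction, producing a loop of width about 2*h_c."""
            up = np.linspace(-h_max, h_max, n)
            down = up[::-1]
            m_up = np.tanh((up - h_c) / width)      # ascending branch
            m_down = np.tanh((down + h_c) / width)  # descending branch
            return up, m_up, down, m_down

        up, m_up, down, m_down = hysteresis_loop()
        plt.plot(up, m_up, label="field increasing")
        plt.plot(down, m_down, label="field decreasing")
        plt.xlabel("external field H (arbitrary units)")
        plt.ylabel("response M (arbitrary units)")
        plt.legend()
        plt.show()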

  11. [Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].

    PubMed

    2012-01-01

    The article covers topical problems of workers' health preservation. Complex research results enabled the evaluation and analysis of occupational risks in leading industries of Kazakhstan, for improving scientific and methodologic approaches to medical management for workers subjected to hazardous conditions.

  12. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research. Both are nowadays under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the computational modelling and data analysis centers existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithms development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of the coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, 3) continuously developing and fast-upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for distant interactive communication between the planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by the specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Providing demonstrators of the coordinated use of high performance computing facilities (super-computer networks), in cooperation with the European HPC Grid DEISA.

  13. The Heliophysics Data Environment: Open Source, Open Systems and Open Data.

    NASA Astrophysics Data System (ADS)

    King, Todd; Roberts, Aaron; Walker, Raymond; Thieman, James

    2012-07-01

    The Heliophysics Data Environment (HPDE) is a place for scientific discovery. Today the Heliophysics Data Environment is a framework of technologies, standards and services which enables the international community to collaborate more effectively in space physics research. Crafting a framework for a data environment begins with defining a model of the tasks to be performed, then defining the functional aspects and the workflow. The foundation of any data environment is an information model which defines the structure and content of the metadata necessary to perform the tasks. In the Heliophysics Data Environment the information model is the Space Physics Archive Search and Extract (SPASE) model, and available resources are described by using this model. A described resource can reside anywhere on the internet, which makes it possible for a national archive, mission, data center or individual researcher to be a provider. The generated metadata is shared, reviewed and harvested to enable services. Virtual Observatories use the metadata to provide community-based portals. Through unique identifiers and registry services, tools can quickly discover and access data available anywhere on the internet. This enables a researcher to quickly view and analyze data in a variety of settings and enhances the Heliophysics Data Environment. To illustrate the current Heliophysics Data Environment, we present the design, architecture and operation of the Heliophysics framework. We then walk through a real example of using available tools to investigate the effects of the solar wind on Earth's magnetosphere.
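
    The identifier-plus-registry pattern can be illustrated with a toy lookup; the identifier, metadata fields, and URL below are invented for the example and are not actual SPASE registry contents.

        # Toy registry: resource identifier -> a tiny subset of SPASE-style metadata
        REGISTRY = {
            "spase://Example/NumericalData/SolarWindPlasma": {
                "description": "Solar wind plasma moments (illustrative entry)",
                "access_url": "https://example.org/data/solar_wind_plasma.cdf",
                "parameters": ["proton density", "bulk speed"],
            },
        }

        def resolve(resource_id):
            """Return the data access URL registered for a resource identifier."""
            record = REGISTRY.get(resource_id)
            if record is None:
                raise KeyError(f"unknown resource: {resource_id}")
            return record["access_url"]

        print(resolve("spase://Example/NumericalData/SolarWindPlasma"))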

  14. The need for scientific software engineering in the pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  15. The need for scientific software engineering in the pharmaceutical industry.

    PubMed

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  16. Managed Entry Agreements for Pharmaceuticals in the Context of Adaptive Pathways in Europe.

    PubMed

    Bouvy, Jacoline C; Sapede, Claudine; Garner, Sarah

    2018-01-01

    As per the EMA definition, adaptive pathways is a scientific concept for the development of medicines which seeks to facilitate patient access to promising medicines addressing high unmet need through a prospectively planned approach in a sustainable way. This review reports the findings of activities undertaken by the ADAPT-SMART consortium to identify enablers and explore the suitability of managed entry agreements for adaptive pathways products in Europe. We found that during 2006-2016, outcomes-based managed entry agreements were not commonly used for products with a conditional marketing authorization or authorized under exceptional circumstances. The barriers and enablers to developing workable managed entry agreement models for adaptive pathways products were discussed through interviews and a multi-stakeholder workshop, with a number of recommendations made in this paper.

  17. A Learned Society's Perspective on Publishing.

    PubMed

    Suzuki, Kunihiko; Edelson, Alan; Iversen, Leslie L; Hausmann, Laura; Schulz, Jörg B; Turner, Anthony J

    2016-10-01

    Scientific journals that are owned by a learned society, like the Journal of Neurochemistry (JNC), which is owned by the International Society for Neurochemistry (ISN), benefit the scientific community in that a large proportion of the income is returned to support the scientific mission of the Society. The income generated by the JNC enables the ISN to organize conferences as a platform for members and non-members alike to share their research, supporting researchers particularly in developing countries by travel grants and other funds, and promoting education in student schools. These direct benefits and initiatives for ISN members and non-members distinguish a society journal from pure commerce. However, the world of scholarly publishing is changing rapidly. Open access models have challenged the business model of traditional journal subscription and hence provided free access to publicly funded scientific research. In these models, the manuscript authors pay a publication cost after peer review and acceptance of the manuscript. Over the last decade, numerous new open access journals have been launched and traditional subscription journals have started to offer open access (hybrid journals). However, open access journals follow the general scheme that, of all participating parties, the publisher receives the highest financial benefit. The income is generated by researchers whose positions and research are mostly financed by taxpayers' or funders' money, and by reviewers and editors, who frequently are not reimbursed. Last but not least, the authors pay for the publication of their work after a rigorous and sometimes painful review process. JNC itself has an open access option, at a significantly reduced cost for Society members as an additional benefit. This article provides first-hand insights from two former Editors-in-Chief, Kunihiko Suzuki and Leslie Iversen, about the history of JNC's ownership and about the difficulties and battles fought along the way to its current success and reputation. This article is part of the 60th Anniversary special issue. © 2016 International Society for Neurochemistry.

  18. Dynamic computer model for the metallogenesis and tectonics of the Circum-North Pacific

    USGS Publications Warehouse

    Scotese, Christopher R.; Nokleberg, Warren J.; Monger, James W.H.; Norton, Ian O.; Parfenov, Leonid M.; Khanchuk, Alexander I.; Bundtzen, Thomas K.; Dawson, Kenneth M.; Eremin, Roman A.; Frolov, Yuri F.; Fujita, Kazuya; Goryachev, Nikolai A.; Pozdeev, Anany I.; Ratkin, Vladimir V.; Rodinov, Sergey M.; Rozenblum, Ilya S.; Scholl, David W.; Shpikerman, Vladimir I.; Sidorov, Anatoly A.; Stone, David B.

    2001-01-01

    The digital files in this report consist of a dynamic computer model of the metallogenesis and tectonics of the Circum-North Pacific, and background articles, figures, and maps. The tectonic part of the dynamic computer model is derived from a major analysis of the tectonic evolution of the Circum-North Pacific, which is also contained in the directory tectevol. The dynamic computer model and associated materials on this CD-ROM are part of a project on the major mineral deposits, metallogenesis, and tectonics of the Russian Far East, Alaska, and the Canadian Cordillera. The project provides critical information on bedrock geology and geophysics, tectonics, major metalliferous mineral resources, metallogenic patterns, and crustal origin and evolution of mineralizing systems for this region. The major scientific goals and benefits of the project are to: (1) provide a comprehensive international database on the mineral resources of the region that is the first extensive knowledge available in English; (2) provide major new interpretations of the origin and crustal evolution of mineralizing systems and their host rocks, thereby enabling enhanced, broad-scale tectonic reconstructions and interpretations; and (3) promote trade and scientific and technical exchanges between North America and Eastern Asia.

  19. The PEcAn Project: Model-Data Ecoinformatics for the Observatory Era

    NASA Astrophysics Data System (ADS)

    Dietze, M. C.; LeBauer, D. S.; Davidson, C. D.; Desai, A. R.; Kooper, R.; McHenry, K.; Mulrooney, P.

    2011-12-01

    The fundamental questions about how terrestrial ecosystems will respond to climate change are straightforward and well known, yet a small number of important gaps separate the information we have gathered from the understanding required to inform policy and management. A critical gap is that no one data source provides a complete picture of the terrestrial biosphere, and therefore multiple data sources must be integrated in a sensible manner. Process-based models represent an ideal framework for this synthesis, but to date model-data synthesis has made use of only a subset of the available data types, and remains inaccessible to much of the scientific community, largely due to the daunting ecoinformatics challenges. The Predictive Ecosystem Analyzer (PEcAn) is an open-source scientific workflow system and ecoinformatics toolbox that manages the flow of information in and out of regional-scale terrestrial biosphere models, facilitates formal data assimilation, and enables more effective feedbacks between models and field research. PEcAn makes complex analyses transparent, repeatable, and accessible to a diverse array of researchers. PEcAn is not model specific, but rather encapsulates any ecosystem model within a set of standardized input and output modules. Herein we demonstrate PEcAn's ability to automate many of the tasks involved in modeling by gathering and processing a diverse array of data sets, initiating ensembles of model runs, visualizing output, and comparing models to observations. PEcAn employs a fully Bayesian approach to model parameterization and the estimation of ecosystem pools and fluxes that allows a straightforward propagation of uncertainties into analyses and forecasts. This approach also makes it possible to synthesize a diverse array of data types operating at different spatial and temporal scales and to easily update predictions as new information becomes available. We also demonstrate PEcAn's ability to iteratively synthesize information from literature trait databases, ground observations, and eddy-covariance towers, and to quantify the reductions in overall uncertainty as each new dataset is added. PEcAn also automates a number of model analyses, such as sensitivity analyses, ensemble prediction, and variance decomposition, which collectively allow the system to partition and ascribe uncertainties to different model parameters and processes. PEcAn provides a direct feedback to field research by further automating the estimation of sample sizes and sampling distributions required to reduce model uncertainties, enabling further measurements to be targeted and optimized. Finally, we will present the PEcAn development plan and timeline, including new features such as the synthesis of remotely sensed data, regional-scale data assimilation, and real-time forecasting. Ultimately, PEcAn aims to make ecosystem modeling and data assimilation routine tools for answering scientific questions and informing policy and management.
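
    The core idea of propagating parameter uncertainty through an ecosystem model by running ensembles, and then attributing the output variance to individual parameters, can be illustrated with a minimal sketch. The snippet below is not PEcAn's actual API; the toy productivity model, parameter names, and distributions are assumptions for illustration only.

```python
    # Minimal sketch (not PEcAn's actual API): propagate parameter uncertainty
    # through a toy ecosystem model by ensemble simulation, then do a crude
    # one-at-a-time variance decomposition.
    import numpy as np

    rng = np.random.default_rng(42)

    def toy_npp(leaf_area_index, light_use_eff, par=500.0):
        """Toy net primary productivity: NPP = LUE * PAR * (1 - exp(-0.5 * LAI))."""
        return light_use_eff * par * (1.0 - np.exp(-0.5 * leaf_area_index))

    # Hypothetical posterior samples for two parameters (e.g., from a trait meta-analysis).
    n = 1000
    lai = rng.normal(3.0, 0.5, n)        # leaf area index
    lue = rng.normal(0.02, 0.004, n)     # light use efficiency

    ensemble = toy_npp(lai, lue)
    print(f"NPP mean = {ensemble.mean():.1f}, 95% interval = "
          f"({np.percentile(ensemble, 2.5):.1f}, {np.percentile(ensemble, 97.5):.1f})")

    # Vary one parameter at a time, holding the other at its mean.
    var_total = ensemble.var()
    var_lai = toy_npp(lai, lue.mean()).var()
    var_lue = toy_npp(lai.mean(), lue).var()
    print(f"variance share from LAI ~ {var_lai / var_total:.2f}, from LUE ~ {var_lue / var_total:.2f}")
```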

  20. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
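
    The combinatorial problem behind parallel sparse matrix-vector multiplication is to partition the matrix so that the communication of vector entries between processes is minimized. The sketch below is an assumed illustration of that cost for a simple 1D row partition, not the hypergraph algorithm developed in the thesis.

```python
    # Illustrative sketch: for a 1D row partition of a sparse matrix, count the
    # x-vector entries each process must receive before computing y = A x.
    from collections import defaultdict

    # Sparse matrix as {row: [nonzero column indices]}.
    A = {0: [0, 2], 1: [1, 2, 3], 2: [0, 3], 3: [2, 3]}
    row_owner = {0: 0, 1: 0, 2: 1, 3: 1}   # process owning each row (and x[i], y[i])

    needed = defaultdict(set)               # columns each process must receive
    for row, cols in A.items():
        p = row_owner[row]
        for c in cols:
            if row_owner[c] != p:           # x[c] lives on another process
                needed[p].add(c)

    volume = sum(len(cols) for cols in needed.values())
    print("per-process receives:", dict(needed), "total communication volume:", volume)
```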

  1. Global Atmosphere Watch Workshop on Measurement-Model ...

    EPA Pesticide Factsheets

    The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim of driving high-quality and high-impact science while co-producing a new generation of products and services. In line with this vision, GAW’s Scientific Advisory Group for Total Atmospheric Deposition (SAG-TAD) has a mandate to produce global maps of wet, dry and total atmospheric deposition for important atmospheric chemicals to enable research into biogeochemical cycles and assessments of ecosystem and human health effects. The most suitable scientific approach for this activity is the emerging technique of measurement-model fusion for total atmospheric deposition. This technique requires global-scale measurements of atmospheric trace gases, particles, precipitation composition and precipitation depth, as well as predictions of the same from global/regional chemical transport models. The fusion of measurement and model results requires data assimilation and mapping techniques. The objective of the GAW Workshop on Measurement-Model Fusion for Global Total Atmospheric Deposition (MMF-GTAD), an initiative of the SAG-TAD, was to review the state-of-the-science and explore the feasibility and methodology of producing, on a routine retrospective basis, global maps of atmospheric gas and aerosol concentrations as well as wet, dry and total deposition via measurement-model fusion.

  2. Proceedings of the First International Linked Science Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouchard, Line Catherine; Kauppinnen, Tomi; Kessler, Carsten

    2011-01-01

    Scientific efforts are traditionally published only as articles, with an estimate of millions of publications worldwide per year; the growth rate of PubMed alone is now one paper per minute. The validation of scientific results requires reproducible methods, which can only be achieved if the same data, processes, and algorithms as those used in the original experiments were available. However, the problem is that although publications, methods and datasets are closely related, they are not always openly accessible and interlinked. Even where data is discoverable, accessible and assessable, significant challenges remain in the reuse of the data, in particular facilitating the necessary correlation, integration and synthesis of data across levels of theory, techniques and disciplines. In LISC 2011 (1st International Workshop on Linked Science) we will discuss and present results of new ways of publishing, sharing, linking, and analyzing such scientific resources motivated by driving scientific requirements, as well as reasoning over the data to discover interesting new links and scientific insights. Making entities identifiable and referenceable using URIs augmented by semantic, scientifically relevant annotations greatly facilitates access and retrieval of data that used to be hardly accessible. This Linked Science approach, i.e., publishing, sharing and interlinking scientific resources and data, is of particular importance for scientific research, where sharing is crucial for facilitating reproducibility and collaboration within and across disciplines. This integrated process, however, has not been established yet. Bibliographic contents are still regarded as the main scientific product, and associated data, models and software are either not published at all, or published in separate places, often with no reference to the respective paper. In the workshop we will discuss whether and how new emerging technologies (Linked Data, and semantic technologies more generally) can realize the vision of Linked Science. We see that this depends on their enabling capability throughout the research process, leading up to extended publications and data sharing environments. Our workshop aims to address challenges related to enabling the easy creation of data bundles - data, processes, tools, provenance and annotation - supporting both publication and reuse of the data. Secondly, we look for tools and methods for the easy correlation, integration and synthesis of shared data. This problem is often found in many disciplines (including astronomy, biology, geosciences, cultural heritage, earth, climate, environmental and ecological sciences and impacts, etc.), as they need to span techniques, levels of theory, scales, and disciplines. With the advent of Linked Science, it is timely and crucial to address these identified research challenges through both practical and formal approaches.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan

    We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of the architecture design space by combining hardware counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM Blue Gene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components, including the instruction pipeline, cache, and memory, and achieves a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.
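
    The general pattern of counter-based analytical modeling is to measure an application's resource demands once (instruction counts, cache misses, memory traffic) and then evaluate a simple cost model against a target architecture's assumed capabilities. The sketch below is a hedged illustration of that pattern only; the counter values, target parameters, and the overlap heuristic are invented for the example and do not reproduce Raexplore's actual model.

```python
    # Hedged sketch of counter-based analytical performance modeling.
    counters = {
        "instructions": 2.0e11,   # retired instructions (measured once)
        "l2_misses":    1.5e9,    # misses that go to memory
        "bytes_moved":  1.2e11,   # DRAM traffic in bytes
    }
    target = {
        "ipc": 2.0,               # assumed sustained instructions per cycle
        "freq_hz": 1.6e9,         # core frequency
        "mem_bw": 1.0e11,         # bytes/s of memory bandwidth
        "miss_latency_s": 100e-9, # effective latency per exposed miss
    }

    t_compute = counters["instructions"] / (target["ipc"] * target["freq_hz"])
    t_memory = counters["bytes_moved"] / target["mem_bw"]
    t_latency = counters["l2_misses"] * target["miss_latency_s"]

    # Bound-style estimate: overlap compute with bandwidth, add a fraction of exposed latency.
    t_pred = max(t_compute, t_memory) + 0.1 * t_latency
    print(f"predicted runtime on target architecture: {t_pred:.2f} s")
```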

  4. Earth Science System of the Future: Observing, Processing, and Delivering Data Products Directly to Users

    NASA Technical Reports Server (NTRS)

    Crisp, David; Komar, George (Technical Monitor)

    2001-01-01

    Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.

  5. DATS, the data tag suite to enable discoverability of datasets.

    PubMed

    Sansone, Susanna-Assunta; Gonzalez-Beltran, Alejandra; Rocca-Serra, Philippe; Alter, George; Grethe, Jeffrey S; Xu, Hua; Fore, Ian M; Lyle, Jared; Gururaj, Anupama E; Chen, Xiaoling; Kim, Hyeon-Eui; Zong, Nansu; Li, Yueling; Liu, Ruiling; Ozyurt, I Burak; Ohno-Machado, Lucila

    2017-06-06

    Today's science increasingly requires effective ways to find and access existing datasets that are distributed across a range of repositories. For researchers in the life sciences, discoverability of datasets may soon become as essential as identifying the latest publications via PubMed. Through an international collaborative effort funded by the National Institutes of Health (NIH)'s Big Data to Knowledge (BD2K) initiative, we have designed and implemented the DAta Tag Suite (DATS) model to support the DataMed data discovery index. DataMed's goal is to be for data what PubMed has been for the scientific literature. Akin to the Journal Article Tag Suite (JATS) used in PubMed, the DATS model enables submission of metadata on datasets to DataMed. DATS has a core set of elements, which are generic and applicable to any type of dataset, and an extended set that can accommodate more specialized data types. DATS is a platform-independent model also available as an annotated serialization in schema.org, which in turn is widely used by major search engines like Google, Microsoft, Yahoo and Yandex.
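
    A DATS-style description is essentially structured metadata about a dataset that a discovery index can ingest. The sketch below shows the flavor of such a record as a JSON document; the field names and values are approximations chosen for illustration, not the normative DATS schema or a real DataMed submission.

```python
    # Illustrative sketch of a dataset description in the spirit of DATS
    # (field names are assumptions, not the normative model).
    import json

    dataset = {
        "title": "Resting-state fMRI connectomes, healthy adults",
        "description": "Preprocessed connectivity matrices for 100 subjects.",
        "types": [{"information": {"value": "neuroimaging"}}],
        "creators": [{"fullName": "Jane Doe",
                      "affiliations": [{"name": "Example University"}]}],
        "distributions": [{
            "access": {"landingPage": "https://example.org/dataset/123"},
            "formats": ["NIfTI", "CSV"],
        }],
        "extraProperties": [{"category": "species",
                             "values": [{"value": "Homo sapiens"}]}],
    }
    print(json.dumps(dataset, indent=2))
```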

  6. Application of the enterprise management tools Lean Six Sigma and PMBOK in developing a program of research management.

    PubMed

    Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente

    2012-01-01

    To introduce a program for the management of scientific research in a general hospital, employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification, implementation, and subsequent analysis, based on PMBOK practices, of the solutions found. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems, and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and constitute a model that can contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.

  7. ScienceOrganizer System and Interface Summary

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Norvig, Peter (Technical Monitor)

    2001-01-01

    ScienceOrganizer is a specialized knowledge management tool designed to enhance the information storage, organization, and access capabilities of distributed NASA science teams. Users access ScienceOrganizer through an intuitive Web-based interface that enables them to upload, download, and organize project information - including data, documents, images, and scientific records associated with laboratory and field experiments. Information in ScienceOrganizer is "threaded", or interlinked, to enable users to locate, track, and organize interrelated pieces of scientific data. Linkages capture important semantic relationships among information resources in the repository, and these assist users in navigating through the information related to their projects.

  8. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.

  9. The Role of Mindfulness in Positive Reappraisal

    PubMed Central

    Garland, Eric; Gaylord, Susan; Park, Jongbae

    2009-01-01

    Mindfulness meditation is increasingly well known for therapeutic efficacy in a variety of illnesses and conditions, but its mechanism of action is still under debate in scientific circles. In this paper we propose a hypothetical causal model that argues for the role of mindfulness in positive reappraisal coping. Positive reappraisal is a critical component of meaning-based coping that enables individuals to adapt successfully to stressful life events. Mindfulness, as a metacognitive form of awareness, involves the process of decentering, a shifting of cognitive sets that enables alternate appraisals of life events. We review the concept of positive reappraisal in transactional stress and coping theory; then describe research and traditional literature related to mindfulness and cognitive reappraisal, and detail the central role of mindfulness in the reappraisal process. With this understanding, we present a causal model explicating the proposed mechanism. The discussion has implications for clinical practice, suggesting how mindfulness-based integrative medicine interventions can be designed to support adaptive coping processes. PMID:19114262

  10. Sentinel-3 for Science

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Regner, P.; Desnos, Y. L.

    2015-12-01

    The Scientific Exploitation of Operational Missions (SEOM) programme element (http://seom.esa.int/) is part of ESA's Fourth Earth Observation Envelope Programme (2013-2017). The prime objective is to federate, support and expand the international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 25 years. It aims to further strengthen the leadership of the European Earth Observation research community by enabling it to extensively exploit future European operational EO missions. SEOM is enabling the science community to address the new scientific research questions opened up by free and open access to data from operational EO missions. The Programme is based on community-wide recommendations for actions on key research issues, gathered through a series of international thematic workshops and scientific user consultation meetings, such as the Sentinel-3 for Science Workshop held in June 2015 in Venice, Italy (see http://seom.esa.int/S3forScience2015). The 2015 SEOM work plan includes the launch of new R&D studies for scientific exploitation of the Sentinels, the development of open-source multi-mission scientific toolboxes, the organization of advanced international training courses, summer schools and educational materials, as well as activities for promoting the scientific use of EO data, also via the organization of workshops. This paper will report the recommendations from the international scientific community concerning Sentinel-3 scientific exploitation, as expressed in Venice, keeping in mind that Sentinel-3 is an operational mission to provide operational services (see http://www.copernicus.eu).

  11. A core observational data model for enhancing the interoperability of ontologically annotated environmental data

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.; Bermudez, L. E.; Bowers, S.; Dibner, P. C.; Gries, C.; Jones, M. B.; McGuinness, D. L.; Cao, H.; Cox, S. J.; Kelling, S.; Lagoze, C.; Lapp, H.; Madin, J.

    2010-12-01

    Research in the environmental sciences often requires accessing diverse data, collected by numerous data providers over varying spatiotemporal scales, incorporating specialized measurements from a range of instruments. These measurements are typically documented using idiosyncratic, discipline-specific terms, and stored in management systems ranging from desktop spreadsheets to the Cloud, where the information is often further decomposed or stylized in unpredictable ways. This situation creates major informatics challenges for broadly discovering, interpreting, and merging the data necessary for integrative earth science research. A number of scientific disciplines have recognized these issues, and been developing semantically enhanced data storage frameworks, typically based on ontologies, to enable communities to better circumscribe and clarify the content of data objects within their domain of practice. There is concern, however, that cross-domain compatibility of these semantic solutions could become problematic. We describe here our efforts to address this issue by developing a core, unified Observational Data Model that should greatly facilitate interoperability among the semantic solutions growing organically within diverse scientific domains. Observational Data Models have emerged independently from several distinct scientific communities, including the biodiversity sciences, ecology, evolution, geospatial sciences, and hydrology, to name a few. Informatics projects striving for data integration within each of these domains have converged on identifying "observations" and "measurements" as fundamental abstractions that provide useful "templates" through which scientific data can be linked, at the structural, composited, or even cell-value levels, to domain terms stored in ontologies or other forms of controlled vocabularies. The Scientific Observations Network, SONet (http://sonet.ecoinformatics.org), brings together a number of these observational data efforts, and is harmonizing their models. The specific observational data models currently under consideration include the OGC's Observations and Measurements Encoding Standard, O&M; the ecological community's Extensible Observation Ontology, OBOE; the evolutionary community's Entity-Quality model, EQ; and the VSTO core classes, intended for describing atmospheric and solar-terrestrial phenomena, VSTO.OWL. These models all share high structural similarities, expressed in different languages (e.g. UML or OWL), and are intended for use with very different forms of data. The main focus of this talk will be describing these Observational Data Models, and more importantly, how harmonizing these will catalyze semantically enhanced access to large additional data resources across the earth and life sciences.
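
    The shared abstraction can be sketched very simply: observations group measurements, and each measurement points at an entity, characteristic, and unit term from a controlled vocabulary or ontology. The code below is a minimal sketch under that simplified reading; the class and field names, and the example URIs, are assumptions for illustration rather than any of the named models (O&M, OBOE, EQ, VSTO.OWL).

```python
    # Minimal sketch of an observation/measurement template linked to ontology terms.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Measurement:
        characteristic_uri: str     # e.g. ontology term for "dry mass"
        value: float
        unit_uri: str               # e.g. ontology term for "gram"

    @dataclass
    class Observation:
        entity_uri: str             # ontology term for the thing observed
        context: dict = field(default_factory=dict)   # site, time, protocol, ...
        measurements: List[Measurement] = field(default_factory=list)

    obs = Observation(
        entity_uri="http://purl.example.org/ontology/Plant",
        context={"site": "LTER-01", "date": "2010-07-15"},
        measurements=[Measurement("http://purl.example.org/ontology/DryMass", 12.4,
                                  "http://purl.example.org/units/Gram")],
    )
    print(obs)
```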

  12. Telescope Scientist on the Advanced X-ray Astrophysics Observatory

    NASA Technical Reports Server (NTRS)

    Smith, Carl M. (Technical Monitor); VanSpeybroeck, Leon; Tananbaum, Harvey D.

    2004-01-01

    In this period, the Chandra X-ray Observatory continued to perform exceptionally well, with many scientific observations and spectacular results. The HRMA performance continues to be essentially identical to that predicted from ground calibration data. The Telescope Scientist Team has improved the mirror model to provide a more accurate description to the Chandra observers, enabling them to reduce the systematic errors and uncertainties in their data reduction. There also has been good progress in the scientific program. Using the Telescope Scientist GTO time, we carried out an extensive Chandra program to observe distant clusters of galaxies. The goals of this program were to use clusters to derive cosmological constraints and to investigate the physics and evolution of clusters. A total of 71 clusters were observed with ACIS-I; the last observations were completed in December 2003.

  13. Earth from Orbit 2014

    NASA Image and Video Library

    2015-04-20

    Every day of every year, NASA satellites provide useful data about our home planet, and along the way, some beautiful images as well. This video includes satellite images of Earth in 2014 from NASA and its partners as well as photos and a time lapse video from the International Space Station. We’ve also included a range of data visualizations, model runs, and a conceptual animation that were produced in 2014 (but in some cases might have been utilizing data from earlier years.) Credit: NASA's Goddard Space Flight Center. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  14. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources, using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open-source data sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE dramatically extends the awareness and visibility of scientific results. Using the embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
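
    Platforms of this kind typically obtain rendered fields from a server with standard OGC WMS GetMap requests and then wrap the returned image as a Google Earth overlay. The sketch below shows such a request in generic form; the endpoint URL and layer name are placeholders, not the paper's actual services.

```python
    # Hedged sketch of an OGC WMS GetMap call used to fetch a rendered climate
    # field as an image for a Google Earth overlay (placeholder endpoint/layer).
    import requests

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "example_precipitation_monthly",   # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-180,-90,180,90",
        "WIDTH": 1024,
        "HEIGHT": 512,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    resp = requests.get("https://example.org/wms", params=params, timeout=30)
    resp.raise_for_status()
    with open("overlay.png", "wb") as f:
        f.write(resp.content)   # the image could then be wrapped in a KML GroundOverlay
```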

  15. Search Pathways: Modeling GeoData Search Behavior to Support Usable Application Development

    NASA Astrophysics Data System (ADS)

    Yarmey, L.; Rosati, A.; Tressel, S.

    2014-12-01

    Recent technical advances have enabled the development of new scientific data discovery systems. Metadata brokering, linked data, and other mechanisms allow users to discover scientific data of interest across growing volumes of heterogeneous content. Matching this complex content with existing discovery technologies, people looking for scientific data are presented with an ever-growing array of features to sort, filter, subset, and scan through search returns to help them find what they are looking for. This paper examines the applicability of available technologies in connecting searchers with the data of interest. What metrics can be used to track success given shifting baselines of content and technology? How well do existing technologies map to steps in user search patterns? Taking a user-driven development approach, the team behind the Arctic Data Explorer interdisciplinary data discovery application invested heavily in usability testing and user search behavior analysis. Building on earlier library community search behavior work, models were developed to better define the diverse set of thought processes and steps users took to find data of interest, here called 'search pathways'. This research builds a deeper understanding of the user community that seeks to reuse scientific data. This approach ensures that development decisions are driven by clearly articulated user needs instead of ad hoc technology trends. Initial results from this research will be presented along with lessons learned for other discovery platform development and future directions for informatics research into search pathways.

  16. Creating a Five-Minute Conversation about Cyberinfrastructure

    ERIC Educational Resources Information Center

    Jelinkova, Klara; Carvalho, Terezsa; Kerian, Dorette; Knosp, Boyd; Percival, Kent; Yagi, Stan

    2008-01-01

    Cyberinfrastructure is the IT infrastructure that enables scientific inquiry. It anticipates a scientific and scholarly world that is increasingly dependent on information technology. It has many facets, and each institution will need to review its own strengths and weaknesses to decide on areas of concentration. In higher education,…

  17. Program Supports Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Keith, Stephan

    1994-01-01

    Primary purpose of General Visualization System (GVS) computer program is to support scientific visualization of data generated by panel-method computer program PMARC_12 (inventory number ARC-13362) on Silicon Graphics Iris workstation. Enables user to view PMARC geometries and wakes as wire frames or as light shaded objects. GVS is written in C language.

  18. IMG_4301

    NASA Image and Video Library

    2015-08-14

    NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  19. Entextualized Humor in the Formation of Scientist Identities among U.S. Undergraduates

    ERIC Educational Resources Information Center

    Bucholtz, Mary; Skapoulli, Elena; Barnwell, Brendan; Lee, Jung-Eun Janie

    2011-01-01

    Studies of the socialization of novices into scientific cultures typically emphasize official knowledge-making activities. However, scientific socialization is also accomplished informally through humor. As entextualized humor, formulaic jokes enable U.S. undergraduate students in science to claim scientist identities both through a displayed…

  20. Bayesian Models for Astrophysical Data Using R, JAGS, Python, and Stan

    NASA Astrophysics Data System (ADS)

    Hilbe, Joseph M.; de Souza, Rafael S.; Ishida, Emille E. O.

    2017-05-01

    This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.
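
    The book's starting point, the normal model viewed from frequentist and Bayesian perspectives, can be illustrated with a few lines of code. The sketch below is not taken from the book; it uses the standard conjugate result for a normal mean with known error, with synthetic data and prior values chosen only for illustration.

```python
    # Small sketch contrasting a frequentist estimate of a normal mean with a
    # conjugate Bayesian posterior (known measurement error assumed).
    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 2.0                          # assumed known measurement error
    y = rng.normal(10.0, sigma, size=25)

    # Frequentist: maximum-likelihood estimate and standard error of the mean.
    mle, se = y.mean(), sigma / np.sqrt(len(y))

    # Bayesian: normal prior N(mu0, tau0^2) with known sigma gives a normal posterior.
    mu0, tau0 = 0.0, 10.0
    post_var = 1.0 / (1.0 / tau0**2 + len(y) / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + y.sum() / sigma**2)

    print(f"MLE {mle:.2f} +/- {se:.2f}; posterior N({post_mean:.2f}, sd={post_var**0.5:.2f})")
```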

  1. Human Exploration and Development of Space: Strategic Plan

    NASA Technical Reports Server (NTRS)

    Branscome, Darrell (Editor); Allen, Marc (Editor); Bihner, William (Editor); Craig, Mark (Editor); Crouch, Matthew (Editor); Crouch, Roger (Editor); Flaherty, Chris (Editor); Haynes, Norman (Editor); Horowitz, Steven (Editor)

    2000-01-01

    The five goals of the Human Exploration and Development of Space include: 1) Explore the Space Frontier; 2) Expand Scientific Knowledge; 3) Enable Humans to Live and Work Permanently in Space; 4) Enable the Commercial Development of Space; and 5) Share the Experience and Benefits of Discovery.

  2. Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.

    2013-12-01

    Discovery and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming a traditional scientific dataset, the use of a semantic repository, and querying using SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bounds or other criteria.
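
    The workflow pattern of describing a dataset as RDF triples and then discovering it with SPARQL can be sketched with rdflib. The vocabulary and URIs below are illustrative placeholders, not the proposed geo-spatial vocabulary from the abstract.

```python
    # Minimal sketch: store a dataset description as RDF and retrieve it with SPARQL.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/geo#")
    g = Graph()
    ds = URIRef("http://example.org/data/cloud_fraction_2009")
    g.add((ds, RDF.type, EX.Dataset))
    g.add((ds, EX.parameter, Literal("cloud fraction")))
    g.add((ds, EX.boundingBox, Literal("-130,20,-60,50")))   # lon/lat bounds

    results = g.query("""
        PREFIX ex: <http://example.org/geo#>
        SELECT ?d ?bbox WHERE {
            ?d a ex:Dataset ;
               ex:parameter "cloud fraction" ;
               ex:boundingBox ?bbox .
        }""")
    for row in results:
        print(row.d, row.bbox)
```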

  3. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    NASA Astrophysics Data System (ADS)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high-quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be adopted by other plate observing systems (e.g., the European Plate Observing System, EPOS), it also supports integrated approaches that include sensor networks providing complementary processes for dynamic monitoring. Moreover, the integration of such observatories into superordinate research infrastructures (federation of virtual observatories) will be enabled.

  4. Teaching Climate Social Science and Its Practices: A Two-Pronged Approach to Climate Literacy

    NASA Astrophysics Data System (ADS)

    Shwom, R.; Isenhour, C.; McCright, A.; Robinson, J.; Jordan, R.

    2014-12-01

    The Essential Principles of Climate Science Literacy states that a climate-literate individual can: "understand the essential principles of Earth's climate system, assess scientifically credible information about climate change, communicate about climate and climate change in a meaningful way, and make informed and responsible decisions with regard to actions that may affect climate." We argue that further integration of the social science dimensions of climate change will advance the climate literacy goals of communication and responsible actions. The underlying rationale is twofold: 1) teaching the habits of mind and scientific practices that have synergies across the social and natural sciences can strengthen students' ability to understand and assess science in general; and 2) understanding the empirical research on the social, political, and economic processes (including climate science itself) that are part of the climate system is an important step toward enabling effective action and communication. For example, while climate literacy has often identified the public's faulty mental models of climate processes as a partial explanation of complacency, emerging research suggests that the public's mental models of the social world are equally or more important in leading to informed and responsible climate decisions. Building students' ability to think across the social and natural sciences by understanding "how we know what we know" through the sciences and a scientific understanding of the social world allows us to achieve climate literacy goals more systematically and completely. To enable this integration we first identify the robust social science insights for the climate science literacy principles that involve social systems. We then briefly identify significant social science contributions to climate science literacy that do not clearly fit within the seven climate literacy principles but arguably could advance climate literacy goals. We conclude with suggestions on how the identified social science insights could be integrated into climate literacy efforts.

  5. Community Capacity Building as a vital mechanism for enhancing the growth and efficacy of a sustainable scientific software ecosystem: experiences running a real-time bi-coastal "Open Science for Synthesis" Training Institute for young Earth and Environmental scientists

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.; Jones, M. B.; Bolker, B.; Lenhardt, W. C.; Hampton, S. E.; Idaszak, R.; Rebich Hespanha, S.; Ahalt, S.; Christopherson, L.

    2014-12-01

    Continuing advances in computational capabilities, access to Big Data, and virtual collaboration technologies are creating exciting new opportunities for accomplishing Earth science research at finer resolutions, with much broader scope, using powerful modeling and analytical approaches that were unachievable just a few years ago. Yet, there is a perceptible lag in the abilities of the research community to capitalize on these new possibilities, due to a lack of the relevant skill sets, especially with regard to multi-disciplinary and integrative investigations that involve active collaboration. UC Santa Barbara's National Center for Ecological Analysis and Synthesis (NCEAS), and the University of North Carolina's Renaissance Computing Institute (RENCI), were recipients of NSF OCI S2I2 "Conceptualization awards", charged with helping define the needs of the research community relative to enabling science and education through "sustained software infrastructure". Over the course of our activities, a consistent request from Earth scientists was for "better training in software that enables more effective, reproducible research." This community-based feedback led to the creation of an "Open Science for Synthesis" Institute, an innovative, three-week, bi-coastal training program for early career researchers. We provided a mix of lectures, hands-on exercises, and working group experience on topics including: data discovery and preservation; code creation, management, sharing, and versioning; scientific workflow documentation and reproducibility; statistical and machine modeling techniques; virtual collaboration mechanisms; and methods for communicating scientific results. All technologies and quantitative tools presented were suitable for advancing open, collaborative, and reproducible synthesis research. In this talk, we will report on the lessons learned from running this ambitious training program, which involved coordinating classrooms at two remote sites, and included developing original synthesis research activities as part of the course. We also report on the feedback provided by participants as to the learning approaches and topical issues they found most engaging, and why.

  6. Advancing Water Science through Improved Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Koch, B. J.; Miles, B.; Rai, A.; Ahalt, S.; Band, L. E.; Minsker, B.; Palmer, M.; Williams, M. R.; Idaszak, R.; Whitton, M. C.

    2012-12-01

    Major scientific advances are needed to help address impacts of climate change and increasing human-mediated environmental modification on the water cycle at global and local scales. However, such advances within the water sciences are limited in part by inadequate information infrastructures. For example, cyberinfrastructure (CI) includes the integrated computer hardware, software, networks, sensors, data, and human capital that enable scientific workflows to be carried out within and among individual research efforts and across varied disciplines. A coordinated transformation of existing CI and development of new CI could accelerate the productivity of water science by enabling greater discovery, access, and interoperability of data and models, and by freeing scientists to do science rather than create and manage technological tools. To elucidate specific ways in which improved CI could advance water science, three challenges confronting the water science community were evaluated: 1) How does ecohydrologic patch structure affect nitrogen transport and fate in watersheds?, 2) How can human-modified environments emulate natural water and nutrient cycling to enhance both human and ecosystem well-being?, 3) How do changes in climate affect water availability to support biodiversity and human needs? We assessed the approaches used by researchers to address components of these challenges, identified barriers imposed by limitations of current CI, and interviewed leaders in various water science subdisciplines to determine the most recent CI tools employed. Our preliminary findings revealed four areas where CI improvements are likely to stimulate scientific advances: 1) sensor networks, 2) data quality assurance/quality control, 3) data and modeling standards, 4) high performance computing. In addition, the full potential of a re-envisioned water science CI cannot be realized without a substantial training component. In light of these findings, we suggest that CI industry-proven practices such as open-source community architecture, agile development methodologies, and sound software engineering methods offer a promising pathway to a transformed water science CI capable of meeting the demands of both individual scientists and community-wide research initiatives.

  7. Data Curation Education in Research Centers (DCERC)

    NASA Astrophysics Data System (ADS)

    Marlino, M. R.; Mayernik, M. S.; Kelly, K.; Allard, S.; Tenopir, C.; Palmer, C.; Varvel, V. E., Jr.

    2012-12-01

    Digital data both enable and constrain scientific research. Scientists are enabled by digital data to develop new research methods, utilize new data sources, and investigate new topics, but they also face new data collection, management, and preservation burdens. The current data workforce consists primarily of scientists who receive little formal training in data management and data managers who are typically educated through on-the-job training. The Data Curation Education in Research Centers (DCERC) program is investigating a new model for educating data professionals to contribute to scientific research. DCERC is a collaboration between the University of Illinois at Urbana-Champaign Graduate School of Library and Information Science, the University of Tennessee School of Information Sciences, and the National Center for Atmospheric Research. The program is organized around a foundations course in data curation and provides field experiences in research and data centers for both master's and doctoral students. This presentation will outline the aims and the structure of the DCERC program and discuss results and lessons learned from the first set of summer internships in 2012. Four masters students participated and worked with both data mentors and science mentors, gaining first hand experiences in the issues, methods, and challenges of scientific data curation. They engaged in a diverse set of topics, including climate model metadata, observational data management workflows, and data cleaning, documentation, and ingest processes within a data archive. The students learned current data management practices and challenges while developing expertise and conducting research. They also made important contributions to NCAR data and science teams by evaluating data management workflows and processes, preparing data sets to be archived, and developing recommendations for particular data management activities. The master's student interns will return in summer of 2013, and two Ph.D. students will conduct data curation-related dissertation fieldwork during the 2013-2014 academic year.

  8. Laser Interferometry Method as a Novel Tool in Endotoxins Research.

    PubMed

    Arabski, Michał; Wąsik, Sławomir

    2017-01-01

    Optical properties of chemical substances are widely used at present for assays thereof in a variety of scientific disciplines. One of the measurement techniques applied in the physical sciences, with potential for novel applications in biology, is laser interferometry. This method enables recording of the diffusion properties of chemical substances. Here we describe a novel application of laser interferometry to the study of chitosan interactions with lipopolysaccharide through detection of colistin diffusion. The proposed model could be used in simple measurements of polymer interactions with endotoxins and/or biologically active compounds, such as antibiotics.

  9. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.

  10. A Data Driven Framework for Integrating Regional Climate Models

    NASA Astrophysics Data System (ADS)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex, climate-sensitive issues of concern to decision-makers and policy planners at a regional level. Deciding how to allocate scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analysis of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high-resolution data that can represent the climate, geography, economy, energy supply, and demand of a region under study; an integrated data management framework that captures information about models, model couplings (workflows), observational and derived data sets, numerical experiments, and the provenance metadata connecting them; and a collaborative environment that enables scientific users to explore the datasets, register models and codes, launch workflows, retrieve provenance, and analyze results. In this presentation we address the challenges of coupling heterogeneous codes and handling large data sets. We describe our integration approach, which is based on a loosely coupled software architecture that supports experimentation and evolution of models on different datasets. We present our software prototype and show the scalability of our approach to handle a large number (> 17,000) of model runs and a significant quantity of data on the order of terabytes. The resulting environment is now used by domain scientists and has proven useful in improving productivity in the evolving development of iRESM model coupling.
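
    The loosely coupled pattern described above, where each model step consumes the previous step's output and the framework records provenance linking runs, inputs, and outputs, can be sketched generically. The code below is a hedged illustration only; the step names, toy model functions, and provenance fields are assumptions, not iRESM's actual components or schema.

```python
    # Hedged sketch of loosely coupled model steps with simple provenance capture.
    import hashlib, json, time

    provenance = []

    def run_step(name, model_fn, inputs):
        outputs = model_fn(inputs)
        provenance.append({
            "step": name,
            "time": time.time(),
            "inputs_digest": hashlib.sha256(
                json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
            "outputs": outputs,
        })
        return outputs

    # Toy stand-ins for coupled regional models.
    climate = run_step("climate", lambda _: {"mean_temp_c": 11.2, "precip_mm": 610},
                       {"scenario": "rcp45"})
    hydro = run_step("hydrology", lambda c: {"runoff_mm": 0.4 * c["precip_mm"]}, climate)
    energy = run_step("energy_demand", lambda c: {"cooling_gwh": 5.0 + 0.8 * c["mean_temp_c"]},
                      climate)

    print(json.dumps(provenance, indent=2))
```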

  11. They See a Rat, We Seek a Cure for Diseases: The Current Status of Animal Experimentation in Medical Practice

    PubMed Central

    Kehinde, Elijah O.

    2013-01-01

    The objective of this review article was to examine current and prospective developments in the scientific use of laboratory animals, and to find out whether or not there are still valid scientific benefits of and justification for animal experimentation. The PubMed and Web of Science databases were searched using the following key words: animal models, basic research, pharmaceutical research, toxicity testing, experimental surgery, surgical simulation, ethics, animal welfare, benign, malignant diseases. Important relevant reviews, original articles and references from 1970 to 2012 were reviewed for data on the use of experimental animals in the study of diseases. The use of laboratory animals in scientific research continues to generate intense public debate. Their use can be justified today in the following areas of research: basic scientific research, use of animals as models for human diseases, pharmaceutical research and development, toxicity testing and teaching of new surgical techniques. This is because there are inherent limitations in the use of alternatives such as in vitro studies, human clinical trials or computer simulation. However, there are problems of transferability of results obtained from animal research to humans. Efforts are on-going to find suitable alternatives to animal experimentation like cell and tissue culture and computer simulation. For the foreseeable future, it would appear that to enable scientists to have a more precise understanding of human disease, including its diagnosis, prognosis and therapeutic intervention, there will still be enough grounds to advocate animal experimentation. However, efforts must continue to minimize or eliminate the need for animal testing in scientific research as soon as possible. PMID:24217224

  12. They see a rat, we seek a cure for diseases: the current status of animal experimentation in medical practice.

    PubMed

    Kehinde, Elijah O

    2013-01-01

    The objective of this review article was to examine current and prospective developments in the scientific use of laboratory animals, and to find out whether or not there are still valid scientific benefits of and justification for animal experimentation. The PubMed and Web of Science databases were searched using the following key words: animal models, basic research, pharmaceutical research, toxicity testing, experimental surgery, surgical simulation, ethics, animal welfare, benign, malignant diseases. Important relevant reviews, original articles and references from 1970 to 2012 were reviewed for data on the use of experimental animals in the study of diseases. The use of laboratory animals in scientific research continues to generate intense public debate. Their use can be justified today in the following areas of research: basic scientific research, use of animals as models for human diseases, pharmaceutical research and development, toxicity testing and teaching of new surgical techniques. This is because there are inherent limitations in the use of alternatives such as in vitro studies, human clinical trials or computer simulation. However, there are problems of transferability of results obtained from animal research to humans. Efforts are on-going to find suitable alternatives to animal experimentation like cell and tissue culture and computer simulation. For the foreseeable future, it would appear that to enable scientists to have a more precise understanding of human disease, including its diagnosis, prognosis and therapeutic intervention, there will still be enough grounds to advocate animal experimentation. However, efforts must continue to minimize or eliminate the need for animal testing in scientific research as soon as possible. © 2013 S. Karger AG, Basel.

  13. Scientometric and patentometric analyses to determine the knowledge landscape in innovative technologies: The case of 3D bioprinting

    PubMed Central

    2017-01-01

    This research proposes an innovative data model to determine the landscape of emerging technologies. It is based on a competitive technology intelligence methodology that incorporates the assessment of scientific publications and patent analysis production, and is further supported by experts’ feedback. It enables the definition of the growth rate of scientific and technological output in terms of the top countries, institutions and journals producing knowledge within the field, as well as the identification of the main areas of research and development by analyzing the International Patent Classification codes, including keyword clusterization and the co-occurrence of patent assignees and patent codes. This model was applied to the evolving domain of 3D bioprinting. Scientific documents from the Scopus and Web of Science databases, along with patents from 27 authorities and 140 countries, were retrieved. In total, 4782 scientific publications and 706 patents were identified from 2000 to mid-2016. The numbers of scientific documents and patents published in the last five years showed an annual average growth of 20% and 40%, respectively. Results indicate that the most prolific nations and institutions publishing on 3D bioprinting are the USA and China, including the Massachusetts Institute of Technology (USA), Nanyang Technological University (Singapore) and Tsinghua University (China), respectively. Biomaterials and Biofabrication are the predominant journals. The most prolific patenting countries are China and the USA, while Organovo Holdings Inc. (USA) and Tsinghua University (China) are the leading institutions. International Patent Classification codes reveal that most 3D bioprinting inventions intended for medical purposes apply porous or cellular materials or biologically active materials. Knowledge clusters and expert drivers indicate that there is a research focus on tissue engineering including the fabrication of organs, bioinks and new 3D bioprinting systems. Our model offers a guide to researchers to understand the knowledge production of pioneering technologies, in this case 3D bioprinting. PMID:28662187
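
    As an illustration of the growth-rate and co-occurrence computations this kind of scientometric model rests on, the Python sketch below derives an average annual growth rate from yearly publication counts and tallies co-occurring patent classification codes. The counts and code lists are hypothetical placeholders, not data from the study.

        from collections import Counter
        from itertools import combinations

        # Hypothetical yearly publication counts (placeholders, not data from the study).
        pubs_per_year = {2011: 210, 2012: 260, 2013: 320, 2014: 390, 2015: 470, 2016: 560}

        years = sorted(pubs_per_year)
        growth = [(pubs_per_year[b] - pubs_per_year[a]) / pubs_per_year[a]
                  for a, b in zip(years, years[1:])]
        print(f"average annual growth: {sum(growth) / len(growth):.0%}")

        # Hypothetical IPC code lists, one per patent, for co-occurrence counting.
        patents = [["B33Y30/00", "A61L27/38"], ["B33Y30/00", "B29C64/112"], ["A61L27/38", "B29C64/112"]]
        co_occurrence = Counter()
        for codes in patents:
            co_occurrence.update(combinations(sorted(set(codes)), 2))
        print(co_occurrence.most_common(3))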

  14. Scientometric and patentometric analyses to determine the knowledge landscape in innovative technologies: The case of 3D bioprinting.

    PubMed

    Rodríguez-Salvador, Marisela; Rio-Belver, Rosa María; Garechana-Anacabe, Gaizka

    2017-01-01

    This research proposes an innovative data model to determine the landscape of emerging technologies. It is based on a competitive technology intelligence methodology that incorporates the assessment of scientific publications and patent analysis production, and is further supported by experts' feedback. It enables the definition of the growth rate of scientific and technological output in terms of the top countries, institutions and journals producing knowledge within the field, as well as the identification of the main areas of research and development by analyzing the International Patent Classification codes, including keyword clusterization and the co-occurrence of patent assignees and patent codes. This model was applied to the evolving domain of 3D bioprinting. Scientific documents from the Scopus and Web of Science databases, along with patents from 27 authorities and 140 countries, were retrieved. In total, 4782 scientific publications and 706 patents were identified from 2000 to mid-2016. The numbers of scientific documents and patents published in the last five years showed an annual average growth of 20% and 40%, respectively. Results indicate that the most prolific nations and institutions publishing on 3D bioprinting are the USA and China, including the Massachusetts Institute of Technology (USA), Nanyang Technological University (Singapore) and Tsinghua University (China), respectively. Biomaterials and Biofabrication are the predominant journals. The most prolific patenting countries are China and the USA, while Organovo Holdings Inc. (USA) and Tsinghua University (China) are the leading institutions. International Patent Classification codes reveal that most 3D bioprinting inventions intended for medical purposes apply porous or cellular materials or biologically active materials. Knowledge clusters and expert drivers indicate that there is a research focus on tissue engineering including the fabrication of organs, bioinks and new 3D bioprinting systems. Our model offers a guide to researchers to understand the knowledge production of pioneering technologies, in this case 3D bioprinting.

  15. Managed Entry Agreements for Pharmaceuticals in the Context of Adaptive Pathways in Europe

    PubMed Central

    Bouvy, Jacoline C.; Sapede, Claudine; Garner, Sarah

    2018-01-01

    As per the EMA definition, adaptive pathways is a scientific concept for the development of medicines which seeks to facilitate patient access to promising medicines addressing high unmet need through a prospectively planned approach in a sustainable way. This review reports the findings of activities undertaken by the ADAPT-SMART consortium to identify enablers and explore the suitability of managed entry agreements for adaptive pathways products in Europe. We found that during 2006–2016, outcomes-based managed entry agreements were not commonly used for products with a conditional marketing authorization or authorized under exceptional circumstances. The barriers and enablers to developing workable managed entry agreement models for adaptive pathways products were discussed through interviews and a multi-stakeholder workshop, with a number of recommendations made in this paper. PMID:29636692

  16. The neutron star interior composition explorer (NICER): mission definition

    NASA Astrophysics Data System (ADS)

    Arzoumanian, Z.; Gendreau, K. C.; Baker, C. L.; Cazeau, T.; Hestnes, P.; Kellogg, J. W.; Kenyon, S. J.; Kozon, R. P.; Liu, K.-C.; Manthripragada, S. S.; Markwardt, C. B.; Mitchell, A. L.; Mitchell, J. W.; Monroe, C. A.; Okajima, T.; Pollard, S. E.; Powers, D. F.; Savadkin, B. J.; Winternitz, L. B.; Chen, P. T.; Wright, M. R.; Foster, R.; Prigozhin, G.; Remillard, R.; Doty, J.

    2014-07-01

    Over a 10-month period during 2013 and early 2014, development of the Neutron star Interior Composition Explorer (NICER) mission [1] proceeded through Phase B, Mission Definition. An external attached payload on the International Space Station (ISS), NICER is scheduled to launch in 2016 for an 18-month baseline mission. Its prime scientific focus is an in-depth investigation of neutron stars—objects that compress up to two Solar masses into a volume the size of a city—accomplished through observations in 0.2-12 keV X-rays, the electromagnetic band into which the stars radiate significant fractions of their thermal, magnetic, and rotational energy stores. Additionally, NICER enables the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) demonstration of spacecraft navigation using pulsars as beacons. During Phase B, substantive refinements were made to the mission-level requirements, concept of operations, and payload and instrument design. Fabrication and testing of engineering-model components improved the fidelity of the anticipated scientific performance of NICER's X-ray Timing Instrument (XTI), as well as of the payload's pointing system, which enables tracking of science targets from the ISS platform. We briefly summarize advances in the mission's formulation that, together with strong programmatic performance in project management, culminated in NICER's confirmation by NASA into Phase C, Design and Development, in March 2014.

  17. Hybrid modeling for quality by design and PAT-benefits and challenges of applications in biopharmaceutical industry.

    PubMed

    von Stosch, Moritz; Davy, Steven; Francois, Kjell; Galvanauskas, Vytautas; Hamelink, Jan-Martijn; Luebbert, Andreas; Mayer, Martin; Oliveira, Rui; O'Kennedy, Ronan; Rice, Paul; Glassey, Jarka

    2014-06-01

    This report highlights the drivers, challenges, and enablers of hybrid modeling applications in the biopharmaceutical industry. It is a summary of an expert panel discussion of European academics and industrialists with relevant scientific and engineering backgrounds. Hybrid modeling is viewed in its broader sense, namely as the integration of different knowledge sources in the form of parametric and nonparametric models into a hybrid semi-parametric model, for instance the integration of fundamental and data-driven models. A brief description of the current state-of-the-art and industrial uptake of the methodology is provided. The report concludes with a number of recommendations to facilitate further developments and a wider industrial application of this modeling approach. These recommendations are limited to further exploiting the benefits of this methodology within process analytical technology (PAT) applications in the biopharmaceutical industry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Developing Accessible Cyberinfrastructure-Enabled Knowledge Communities in the National Disability Community: Theory, Practice, and Policy

    ERIC Educational Resources Information Center

    Myhill, William N.; Cogburn, Derrick L.; Samant, Deepti

    2008-01-01

    Since the publication of the Atkins Commission report in 2003, the national scientific community has placed significant emphasis on developing cyberinfrastructure-enabled knowledge communities, which are designed to facilitate enhanced efficiency and collaboration in geographically distributed networks of researchers. This article suggests that the new…

  19. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are demonstrated with the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  20. The research data alliance photon and neutron science interest group

    DOE PAGES

    Boehnlein, Amber; Matthews, Brian; Proffen, Thomas; ...

    2015-04-01

    Scientific research data provides unique challenges that are distinct from classic “Big Data” sources. One common element in research data is that the experiment, observations, or simulation were designed, and data were specifically acquired, to shed light on an open scientific question. The data and methods are usually “owned” by the researcher(s) and the data itself might not be viewed to have long-term scientific significance after the results have been published. Often, the data volume was relatively low, with data sometimes easier to reproduce than to catalog and store. Some data and meta-data were not collected in a digital form, or were stored on antiquated or obsolete media. Generally speaking, policies, tools, and management of digital research data have reflected an ad hoc approach that varies domain by domain and research group by research group. This model, which treats research data as disposable, is proving to be a serious limitation as the volume and complexity of research data explodes. Changes are required at every level of scientific research: within the individual groups, and across scientific domains and interdisciplinary collaborations. Enabling researchers to learn about available tools, processes, and procedures should encourage a spirit of cooperation and collaboration, allowing researchers to come together for the common good. In conclusion, these community-oriented efforts provide the potential for targeted projects with high impact.

  1. The research data alliance photon and neutron science interest group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boehnlein, Amber; Matthews, Brian; Proffen, Thomas

    Scientific research data provides unique challenges that are distinct from classic “Big Data” sources. One common element in research data is that the experiment, observations, or simulation were designed, and data were specifically acquired, to shed light on an open scientific question. The data and methods are usually “owned” by the researcher(s) and the data itself might not be viewed to have long-term scientific significance after the results have been published. Often, the data volume was relatively low, with data sometimes easier to reproduce than to catalog and store. Some data and meta-data were not collected in a digital form, or were stored on antiquated or obsolete media. Generally speaking, policies, tools, and management of digital research data have reflected an ad hoc approach that varies domain by domain and research group by research group. This model, which treats research data as disposable, is proving to be a serious limitation as the volume and complexity of research data explodes. Changes are required at every level of scientific research: within the individual groups, and across scientific domains and interdisciplinary collaborations. Enabling researchers to learn about available tools, processes, and procedures should encourage a spirit of cooperation and collaboration, allowing researchers to come together for the common good. In conclusion, these community-oriented efforts provide the potential for targeted projects with high impact.

  2. University-Level Teaching of Anthropogenic Global Climate Change (AGCC) via Student Inquiry

    NASA Technical Reports Server (NTRS)

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2017-01-01

    This paper reviews university-level efforts to improve understanding of anthropogenic global climate change (AGCC) through curricula that enable student scientific inquiry. We examined 152 refereed publications and proceedings from academic conferences and selected 26 cases of inquiry learning that overcome specific challenges to AGCC teaching. This review identifies both the strengths and weaknesses of each of these case studies. It is the first to go beyond examining the impact of specific inquiry instructional approaches to offer a synthesis of cases. We find that inquiry teaching can succeed by concretising scientific processes, providing access to global data and evidence, imparting critical and higher order thinking about AGCC science policy and contextualising learning with places and scientific facts. We recommend educational researchers and scientists collaborate to create and refine curricula that utilise geospatial technologies, climate models and communication technologies to bring students into contact with scientists, climate data and authentic AGCC research processes. Many available science education technologies and curricula also require further research to maximise trade-offs between implementation and training costs and their educational value.

  3. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.
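
    The reference to support vector machines for classification can be made concrete with a brief, hedged sketch. The scikit-learn calls below are standard, but the feature matrix and labels are synthetic stand-ins for an Earth-science classification task, not material from the interest group.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        # Synthetic features (stand-ins for, e.g., spectral bands) and a toy two-class label.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
        print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")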

  4. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.

  5. Laser Ranging to the Moon: How Evolving Technology Enables New Science

    NASA Astrophysics Data System (ADS)

    Faller, James

    2010-03-01

    Technological advances have long been the enabler of scientific progress. The invention of the laser is a prime example of this symbiotic relationship between technical progress and scientific advances. The laser, which today is omnipresent in our lives, made its first appearance while I was a graduate student in Professor Dicke's group at Princeton. A major change during that period was that technology was transforming gravitational physics from a purely theoretical subject into an experimental one, in which measurements could be made with the laboratory technologies and techniques then becoming available. During this same time, the idea for the lunar laser ranging experiment was born. The history and accomplishments of this experiment--a still-ongoing experiment that is one of the real scientific triumphs of NASA's Apollo program--will be given.

  6. Ground-based multi-station spectroscopic imaging with ALIS. - Scientific highlights, project status and future prospects

    NASA Astrophysics Data System (ADS)

    Brändström; Gustavsson, Björn; Pellinen-Wannberg, Asta; Sandahl, Ingrid; Sergienko, Tima; Steen, Ake

    2005-08-01

    The Auroral Large Imaging System (ALIS) was first proposed at the ESA-PAC meeting in Lahnstein in 1989. The first spectroscopic imaging station was operational in 1994, and since then up to six stations have been in simultaneous operation. Each station has a scientific-grade CCD detector and a filter wheel with six positions for narrow-band interference filters. The field of view is around 70°. Each imager is mounted in a positioning system, enabling imaging of a common volume from several sites. This enables triangulation and tomography. Raw data from ALIS is freely available at http://alis.irf.se and ALIS is open for scientific collaboration. ALIS made the first unambiguous observations of radio-induced optical emissions at high latitudes, and detected water in a Leonid meteor trail. Both rocket and satellite coordination are considered for future observations with ALIS.

  7. The EGS Data Collaboration Platform: Enabling Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weers, Jonathan D; Johnston, Henry; Huggins, Jay V

    Collaboration in the digital age has been stifled in recent years. Reasonable responses to legitimate security concerns have created a virtual landscape of silos and fortified castles incapable of sharing information efficiently. This trend is unfortunately opposed to the geothermal scientific community's migration toward larger, more collaborative projects. To facilitate efficient sharing of information between team members from multiple national labs, universities, and private organizations, the 'EGS Collab' team has developed a universally accessible, secure data collaboration platform and has fully integrated it with the U.S. Department of Energy's (DOE) Geothermal Data Repository (GDR) and the National Geothermal Data System (NGDS). This paper will explore some of the challenges of collaboration in the modern digital age, highlight strategies for active data management, and discuss the integration of the EGS Collab data management platform with the GDR to enable scientific discovery through the timely dissemination of information.

  8. The Co-evolution of Climate Models and the Intergovernmental Panel on Climate Change

    NASA Astrophysics Data System (ADS)

    Somerville, R. C.

    2010-12-01

    As recently as the 1950s, global climate models, or GCMs, did not exist, and the notion that man-made carbon dioxide might lead to significant climate change was not regarded as a serious possibility by most experts. Today, of course, the prospect or threat of exactly this type of climate change dominates the science and ranks among the most pressing issues confronting all mankind. Indeed, the prevailing scientific view throughout the first half of the twentieth century was that adding carbon dioxide to the atmosphere would have only a negligible effect on climate. The science of climate change caused by atmospheric carbon dioxide changes has thus undergone a genuine revolution. An extraordinarily rapid development of global climate models has also characterized this period, especially in the three decades since about 1980. In these three decades, the number of GCMs has greatly increased, and their physical and computational aspects have both markedly improved. Modeling progress has been enabled by many scientific advances, of course, but especially by a massive increase in available computer power, with supercomputer speeds increasing by roughly a factor of a million in the three decades from about 1980 to 2010. This technological advance has permitted a rapid increase in the physical comprehensiveness of GCMs as well as in spatial computational resolution. In short, GCMs have dramatically evolved over time, in exactly the same recent period as popular interest and scientific concern about anthropogenic climate change have markedly increased. In parallel, a unique international organization, the Intergovernmental Panel on Climate Change, or IPCC, has also recently come into being and also evolved rapidly. Today, the IPCC has become widely respected and globally influential. The IPCC was founded in 1988, and its history is thus even shorter than that of GCMs. Yet, its stature today is such that a series of IPCC reports assessing climate change science has already been endorsed by many leading scientific professional societies and academies of science worldwide. These reports are considered as definitive summaries of the state of the science. In 2007, in recognition of its exceptional accomplishments, the IPCC shared the Nobel Peace Prize equally with Al Gore. The present era is characterized not only by the reality and seriousness of human-caused climate change, but also by a young yet powerful science that enables us to understand much about the climate change that has occurred already and that awaits in the future. The development of GCMs is a critical part of the scientific story, and the development of the IPCC is a key factor in connecting the science to the perceptions and priorities of the global public and policymakers. GCMs and the IPCC have co-evolved and strongly influenced one another, as both scientists and the world at large have worked to confront the challenge of climate change.

  9. Dr. Robert Goddard

    NASA Image and Video Library

    2017-12-08

    Dr. Robert Goddard's rocket ready for flight. Roswell, New Mexico. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  10. Kindergarten and Primary School Children's Everyday, Synthetic, and Scientific Concepts of Clouds and Rainfall

    ERIC Educational Resources Information Center

    Malleus, Elina; Kikas, Eve; Marken, Tiivi

    2017-01-01

    The purpose of this research was to explore children's understandings of everyday, synthetic and scientific concepts to enable a description of how abstract, verbally taught material relates to previous experience-based knowledge and the consistency of understanding about cloud formation. This study examined the conceptual understandings of cloud…

  11. Do Pre-Service Science Teachers Have Understanding of the Nature of Science?: Explicit-Reflective Approach

    ERIC Educational Resources Information Center

    Örnek, Funda

    2014-01-01

    Current approaches in Science Education attempt to enable students to develop an understanding of the nature of science, develop fundamental scientific concepts, and develop the ability to structure, analyze, reason, and communicate effectively. Students pose, solve, and interpret scientific problems, and eventually set goals and regulate their…

  12. Training in Decision-Making Strategies: An Approach to Enhance Students' Competence to Deal with Socio-Scientific Issues

    ERIC Educational Resources Information Center

    Gresch, Helge; Hasselhorn, Marcus; Bögeholz, Susanne

    2013-01-01

    Dealing with socio-scientific issues in science classes enables students to participate productively in controversial discussions concerning ethical topics, such as sustainable development. In this respect, well-structured decision-making processes are essential for elaborate reasoning. To foster decision-making competence, a computer-based…

  13. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model that allows the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  14. Establishing lunar resource viability

    NASA Astrophysics Data System (ADS)

    Carpenter, J.; Fisackerly, R.; Houdou, B.

    2016-11-01

    Recent research has highlighted the potential of lunar resources as an important element of space exploration, but their viability has not been demonstrated. Establishing whether or not they can be considered in future plans is a multidisciplinary effort, requiring scientific expertise and delivering scientific results. To this end, various space agencies and private entities are looking to lunar resources, extracted and processed in situ, as a potentially game-changing element in future space architectures, with the potential to increase scale and reduce cost. However, before any decisions can be made on the inclusion of resources in exploration roadmaps or future scenarios, some big questions need to be answered about the viability of different resource deposits and the processes for extraction and utilisation. The missions and measurements that will be required to answer these questions, and which are being prepared by agencies and others, can only be performed through the engagement and support of the science community. In answering questions about resources, data and knowledge will be generated that are of fundamental scientific importance. In supporting resource prospecting missions, the science community will de facto generate new scientific knowledge. Science enables exploration and exploration enables science.

  15. THE VIRTUAL INSTRUMENT: SUPPORT FOR GRID-ENABLED MCELL SIMULATIONS

    PubMed Central

    Casanova, Henri; Berman, Francine; Bartol, Thomas; Gokcay, Erhan; Sejnowski, Terry; Birnbaum, Adam; Dongarra, Jack; Miller, Michelle; Ellisman, Mark; Faerman, Marcio; Obertelli, Graziano; Wolski, Rich; Pomerantz, Stuart; Stiles, Joel

    2010-01-01

    Ensembles of widely distributed, heterogeneous resources, or Grids, have emerged as popular platforms for large-scale scientific applications. In this paper we present the Virtual Instrument project, which provides an integrated application execution environment that enables end-users to run and interact with running scientific simulations on Grids. This work is performed in the specific context of MCell, a computational biology application. While MCell provides the basis for running simulations, its capabilities are currently limited in terms of scale, ease-of-use, and interactivity. These limitations preclude usage scenarios that are critical for scientific advances. Our goal is to create a scientific “Virtual Instrument” from MCell by allowing its users to transparently access Grid resources while being able to steer running simulations. In this paper, we motivate the Virtual Instrument project and discuss a number of relevant issues and accomplishments in the area of Grid software development and application scheduling. We then describe our software design and report on the current implementation. We verify and evaluate our design via experiments with MCell on a real-world Grid testbed. PMID:20689618

  16. Scientific workflows as productivity tools for drug discovery.

    PubMed

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  17. RaD-X Payload Ready For Flight

    NASA Image and Video Library

    2017-12-08

    Matthew Mullin and Bobby Meazell, Orbital ATK/Columbia Scientific Balloon Facility technicians, conduct compatibility testing on NASA Langley Research Center’s Radiation Dosimetry Experiment payload Wednesday, Sept. 9, at Fort Sumner, N.M. The successful compatibility test was a key milestone in ensuring the flight readiness of RaD-X, which is scheduled to launch on an 11-million-cubic-foot NASA scientific balloon no earlier than Friday, Sept. 11, from the agency’s balloon launching facility in Fort Sumner. RaD-X will measure cosmic ray energy at two separate altitude regions in the stratosphere—above 110,000 feet and between 69,000 and 88,500 feet. The data is key to confirming Langley’s Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model, which is a physics-based model that determines solar radiation and galactic cosmic ray exposure globally in real-time. The NAIRAS modeling tool will be used to help enhance aircraft safety as well as safety procedures for the International Space Station. In addition to the primary payload, 100 small student experiments will fly on the RaD-X mission as part of the Cubes in Space program. The program provides 11- to 18-year-old middle and high school students a no-cost opportunity to design and compete to launch an experiment into space or into the near-space environment. The cubes measure just 4 centimeters by 4 centimeters. NASA’s scientific balloons offer low-cost, near-space access for scientific payloads weighing up to 8,000 pounds for conducting scientific investigations in fields such as astrophysics, heliophysics and atmospheric research. NASA’s Wallops Flight Facility in Virginia manages the agency’s scientific balloon program with 10 to 15 flights each year from launch sites worldwide. Orbital ATK provides program management, mission planning, engineering services and field operations for NASA’s scientific balloon program. The program is executed from the Columbia Scientific Balloon Facility in Palestine, Texas. The Columbia team has launched more than 1,700 scientific balloons in over 35 years of operation. Anyone may track the progress of the Fort Sumner flights, which includes a map showing the balloon’s real-time location, at: towerfts.csbf.nasa.gov/ For more information on the balloon program, see: www.nasa.gov/scientificballoons NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  18. Modeling Emergence in Neuroprotective Regulatory Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Haack, Jereme N.; McDermott, Jason E.

    2013-01-05

    The use of predictive modeling in the analysis of gene expression data can greatly accelerate the pace of scientific discovery in biomedical research by enabling in silico experimentation to test disease triggers and potential drug therapies. Techniques that focus on modeling emergence, such as agent-based modeling and multi-agent simulations, are of particular interest as they support the discovery of pathways that may have never been observed in the past. Thus far, these techniques have been primarily applied at the multi-cellular level, or have focused on signaling and metabolic networks. We present an approach where emergence modeling is extended to regulatory networks and demonstrate its application to the discovery of neuroprotective pathways. An initial evaluation of the approach indicates that emergence modeling provides novel insights for the analysis of regulatory networks that can advance the discovery of acute treatments for stroke and other diseases.
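
    As a hedged illustration of what modeling emergence in a regulatory network can look like, the sketch below iterates a tiny synchronous Boolean network until its state repeats. The three-gene network and its update rules are hypothetical and are not taken from the paper.

        # Hypothetical three-gene Boolean regulatory network, updated synchronously.
        rules = {
            "a": lambda s: s["c"],                  # a is activated by c
            "b": lambda s: s["a"] and not s["c"],   # b needs a and is repressed by c
            "c": lambda s: not s["b"],              # c is repressed by b
        }

        state = {"a": True, "b": False, "c": False}
        seen = []
        while state not in seen:                    # stop at a fixed point or cycle
            seen.append(dict(state))
            state = {gene: rule(seen[-1]) for gene, rule in rules.items()}
        print("trajectory:", seen, "-> repeats at", state)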

  19. Graphics Processing Unit (GPU) Acceleration of the Goddard Earth Observing System Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Putnam, William

    2011-01-01

    The Goddard Earth Observing System 5 (GEOS-5) is the atmospheric model used by the Global Modeling and Assimilation Office (GMAO) for a variety of applications, from long-term climate prediction at relatively coarse resolution, to data assimilation and numerical weather prediction, to very high-resolution cloud-resolving simulations. GEOS-5 is being ported to a graphics processing unit (GPU) cluster at the NASA Center for Climate Simulation (NCCS). By utilizing GPU co-processor technology, we expect to increase the throughput of GEOS-5 by at least an order of magnitude, and accelerate the process of scientific exploration across all scales of global modeling, including: the large-scale, high-end application of non-hydrostatic, global, cloud-resolving modeling at 10- to 1-kilometer (km) global resolutions; intermediate-resolution seasonal climate and weather prediction at 50- to 25-km resolution on small clusters of GPUs; and long-range, coarse-resolution climate modeling, enabled on a small box of GPUs for the individual researcher. After being ported to the GPU cluster, the primary physics components and the dynamical core of GEOS-5 have demonstrated a potential speedup of 15-40 times over conventional processor cores. Performance improvements of this magnitude reduce the required scalability of 1-km, global, cloud-resolving models from an unfathomable 6 million cores to an attainable 200,000 GPU-enabled cores.
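
    The closing scalability figure can be checked with one line of arithmetic: at the midpoint of the reported 15-40x speedup, a roughly 30x factor brings the 6-million-core requirement down to about 200,000 GPU-enabled cores. A minimal sketch of that back-of-the-envelope calculation, with the speedup factor as an explicit assumption:

        # Back-of-the-envelope check of the core-count reduction quoted above.
        cpu_cores_needed = 6_000_000   # cores needed for 1-km global cloud-resolving runs on CPUs
        gpu_speedup = 30               # assumed midpoint of the reported 15-40x speedup

        gpu_cores_needed = cpu_cores_needed / gpu_speedup
        print(f"~{gpu_cores_needed:,.0f} GPU-enabled cores")   # ~200,000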

  20. Big Data and Dementia: Charting the Route Ahead for Research, Ethics, and Policy

    PubMed Central

    Ienca, Marcello; Vayena, Effy; Blasimme, Alessandro

    2018-01-01

    Emerging trends in pervasive computing and medical informatics are creating the possibility for large-scale collection, sharing, aggregation and analysis of unprecedented volumes of data, a phenomenon commonly known as big data. In this contribution, we review the existing scientific literature on big data approaches to dementia, as well as commercially available mobile-based applications in this domain. Our analysis suggests that big data approaches to dementia research and care hold promise for improving current preventive and predictive models, casting light on the etiology of the disease, enabling earlier diagnosis, optimizing resource allocation, and delivering more tailored treatments to patients with specific disease trajectories. Such promissory outlook, however, has not materialized yet, and raises a number of technical, scientific, ethical, and regulatory challenges. This paper provides an assessment of these challenges and charts the route ahead for research, ethics, and policy. PMID:29468161

  1. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE Labs (Sandia, Berkeley), as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.

  2. Cretaceous Footprints Found on Goddard Campus

    NASA Image and Video Library

    2012-08-20

    About 110 million light years away, the bright, barred spiral galaxy NGC3259 was just forming stars in dark bands of dust and gas. On Earth, a plant-eating dinosaur left footprints in the Cretaceous mud of what would later become the grounds of NASA’s Goddard Space Flight Center in Greenbelt, Md. A model of a Nodosaur dinosaur sits inside what is believed to be the fossil of a Nodosaur footprint. The footprint was found by Ray Stanford, a local dinosaur hunter. To read more go to: www.nasa.gov/centers/goddard/news/features/2012/nodosaur.... Credit: NASA/Goddard/Rebecca Roth NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  3. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  4. Big Data and Dementia: Charting the Route Ahead for Research, Ethics, and Policy.

    PubMed

    Ienca, Marcello; Vayena, Effy; Blasimme, Alessandro

    2018-01-01

    Emerging trends in pervasive computing and medical informatics are creating the possibility for large-scale collection, sharing, aggregation and analysis of unprecedented volumes of data, a phenomenon commonly known as big data. In this contribution, we review the existing scientific literature on big data approaches to dementia, as well as commercially available mobile-based applications in this domain. Our analysis suggests that big data approaches to dementia research and care hold promise for improving current preventive and predictive models, casting light on the etiology of the disease, enabling earlier diagnosis, optimizing resource allocation, and delivering more tailored treatments to patients with specific disease trajectories. Such promissory outlook, however, has not materialized yet, and raises a number of technical, scientific, ethical, and regulatory challenges. This paper provides an assessment of these challenges and charts the route ahead for research, ethics, and policy.

  5. Adapting California’s ecosystems to a changing climate

    USGS Publications Warehouse

    Elizabeth Chornesky,; David Ackerly,; Paul Beier,; Frank Davis,; Flint, Lorraine E.; Lawler, Joshua J.; Moyle, Peter B.; Moritz, Max A.; Scoonover, Mary; Byrd, Kristin B.; Alvarez, Pelayo; Heller, Nicole E.; Micheli, Elisabeth; Weiss, Stuart

    2017-01-01

    Significant efforts are underway to translate improved understanding of how climate change is altering ecosystems into practical actions for sustaining ecosystem functions and benefits. We explore this transition in California, where adaptation and mitigation are advancing relatively rapidly, through four case studies that span large spatial domains and encompass diverse ecological systems, institutions, ownerships, and policies. The case studies demonstrate the context specificity of societal efforts to adapt ecosystems to climate change and involve applications of diverse scientific tools (e.g., scenario analyses, downscaled climate projections, ecological and connectivity models) tailored to specific planning and management situations (alternative energy siting, wetland management, rangeland management, open space planning). They illustrate how existing institutional and policy frameworks provide numerous opportunities to advance adaptation related to ecosystems and suggest that progress is likely to be greatest when scientific knowledge is integrated into collective planning and when supportive policies and financing enable action.

  6. From Science To Design: Systems Engineering For The Lsst

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration

    2009-01-01

    The LSST is a universal-purpose survey telescope that will address scores of scientific missions. To help the technical teams converge on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal science missions: 1) Constraining Dark Matter and Dark Energy; 2) Taking an Inventory of the Solar System; 3) Exploring the Transient Optical Sky; and 4) Mapping the Milky Way. From these four missions, the SRD specifies the requirements for single images and the full 10-year survey that enable a wide range of science beyond the four principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model-Based Systems Engineering approach with SysML is used to manage the flow-down of requirements from science to system function to sub-system. The rigor of requirements flow and management assists the LSST in keeping the overall scope, and hence budget and schedule, under control.

  7. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best-fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but also, more importantly, the scientific representation of Risk Estimating Relationships.
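
    To make the regression idea concrete, the hedged sketch below fits an ordinary least-squares relationship between hypothetical mission characteristics and a risk-likelihood score, then bins the prediction onto a 1-5 likelihood scale for the 5x5 L-C chart. The data, column meanings, and binning are illustrative assumptions, not the general error regression models used in the paper.

        import numpy as np

        # Hypothetical missions: [mass_kg, development_months] -> likelihood score on a 1-5 scale.
        X = np.array([[1.0, 12], [1.3, 18], [2.0, 10], [3.0, 24], [4.0, 30], [1.5, 15]])
        y = np.array([4.2, 3.5, 4.6, 2.8, 2.1, 3.9])

        # Ordinary least squares with an intercept column.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        new_mission = np.array([1.0, 2.0, 16])      # intercept term, mass_kg, development_months
        likelihood = float(new_mission @ coef)
        row = int(np.clip(round(likelihood), 1, 5))
        print(f"predicted likelihood {likelihood:.1f} -> L-C chart row {row}")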

  8. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way of enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.
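
    The InSAR/GPS pipeline examples can be pictured as plain workflows of chained processing stages. The sketch below strings three placeholder stages over a synthetic GPS displacement series purely to illustrate what a workflow means in this context; it does not use the Apache Airavata API.

        import numpy as np

        # Synthetic daily GPS displacement series (placeholder for real station data).
        days = np.arange(365)
        series = 0.01 * days + np.random.default_rng(1).normal(scale=0.5, size=days.size)

        def detrend(y):                  # stage 1: remove a linear trend
            fit = np.polyfit(np.arange(y.size), y, 1)
            return y - np.polyval(fit, np.arange(y.size))

        def smooth(y, window=7):         # stage 2: running mean
            return np.convolve(y, np.ones(window) / window, mode="same")

        def flag_outliers(y, k=3.0):     # stage 3: simple sigma clipping
            return np.abs(y - y.mean()) > k * y.std()

        # The "workflow" is just the composition of the stages.
        flags = flag_outliers(smooth(detrend(series)))
        print(f"{flags.sum()} flagged samples out of {flags.size}")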

  9. A Drupal-Based Collaborative Framework for Science Workflows

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure combines technical infrastructure with organizational practices and social norms to support scientific teams that work together or depend on each other to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these workflow specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow, and even if the person can see the workflow, they may not know specifically to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between scientists about artifacts used or created through scientific processes; and to leverage the knowledge collected within the artifacts and scientific collaborations to support scientific discoveries.

  10. π Scope: python based scientific workbench with visualization tool for MDSplus data

    NASA Astrophysics Data System (ADS)

    Shiraiwa, S.

    2014-10-01

    πScope is a Python-based scientific data analysis and visualization tool constructed on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivation for developing the new software is 1) to provide an updated tool to browse MDSplus data, with functionalities beyond dwscope and jScope, and 2) to provide a universal foundation to construct interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features to visualize MDSplus data during tokamak experiments, including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using Python scripts, and publication-quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand, while the open source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and a script, enabling relatively complex data analysis workflows to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported to πScope, thus allowing LHCD simulation to be run between shots using C-Mod experimental profiles. This workflow is being used to generate a large database to develop a LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
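
    For readers unfamiliar with MDSplus access from Python, the sketch below shows the kind of fetch-and-plot step a tool like πScope wraps in its GUI. The server address, tree name, shot number, and node path are placeholders, and the thin-client Connection interface is assumed to be available from the MDSplus Python package.

        import matplotlib.pyplot as plt
        from MDSplus import Connection   # assumes the MDSplus Python package is installed

        # Placeholder server, tree, shot, and signal path -- substitute real values.
        conn = Connection("mds-server.example.org")
        conn.openTree("experiment_tree", 12345)
        signal = conn.get(r"\top.diagnostics:signal_node")
        time = conn.get(r"dim_of(\top.diagnostics:signal_node)")

        plt.plot(time.data(), signal.data())
        plt.xlabel("time (s)")
        plt.ylabel("signal (a.u.)")
        plt.show()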

  11. Enabling responsible public genomics.

    PubMed

    Conley, John M; Doerr, Adam K; Vorhaus, Daniel B

    2010-01-01

    As scientific understandings of genetics advance, researchers require increasingly rich datasets that combine genomic data from large numbers of individuals with medical and other personal information. Linking individuals' genetic data and personal information precludes anonymity and produces medically significant information--a result not contemplated by the established legal and ethical conventions governing human genomic research. To pursue the next generation of human genomic research and commerce in a responsible fashion, scientists, lawyers, and regulators must address substantial new issues, including researchers' duties with respect to clinically significant data, the challenges to privacy presented by genomic data, the boundary between genomic research and commerce, and the practice of medicine. This Article presents a new model for understanding and addressing these new challenges--a "public genomics" premised on the idea that ethically, legally, and socially responsible genomics research requires openness, not privacy, as its organizing principle. Responsible public genomics combines the data contributed by informed and fully consenting information altruists and the research potential of rich datasets in a genomic commons that is freely and globally available. This Article examines the risks and benefits of this public genomics model in the context of an ambitious genetic research project currently under way--the Personal Genome Project. This Article also (i) demonstrates that large-scale genomic projects are desirable, (ii) evaluates the risks and challenges presented by public genomics research, and (iii) determines that the current legal and regulatory regimes restrict beneficial and responsible scientific inquiry while failing to adequately protect participants. The Article concludes by proposing a modified normative and legal framework that embraces and enables a future of responsible public genomics.

  12. Multidrug-Resistant TB: Implementing the Right to Health through the Right to Enjoy the Benefits of Scientific Progress.

    PubMed

    London, Leslie; Cox, Helen; Coomans, Fons

    2016-06-01

    The right to enjoy the benefits of scientific progress (REBSP) is a little-known but potentially valuable right that can contribute to rights-based approaches to addressing multidrug-resistant TB (MDR-TB). We argue that better understanding of the REBSP may help to advance legal and civil society action for health rights. While the REBSP does not provide an individual entitlement to have a new drug developed for MDR-TB, it sets up entitlements to expect a state to establish a legislative and policy framework aimed at developing scientific capacity to address the most important health issues and at disseminating the outcomes of scientific research. By making scientific findings available and accessible, people can be enabled to claim the use of science for social benefits. Inasmuch as the market fails to address neglected diseases such as MDR-TB, the REBSP provides a potential counterbalance to frame a positive obligation on states to both marshal their own resources and to coordinate the actions of multiple other actors towards this goal, including non-state actors. While the latter do not hold the same level of accountability as states, the REBSP can still enable the recognition of obligations at a level of "soft law" responsibilities.

  13. Procedural apprenticeship in school science: Constructivist enabling of connoisseurship

    NASA Astrophysics Data System (ADS)

    Bencze, J. Lawrence

    2000-11-01

    In many parts of the world, school science, especially at the secondary school level, is a sort of selection and training camp for future scientists and engineers. For most students, their general lack of cultural capital (Apple, 1990) minimizes their opportunities to survive the rapid coverage of large volumes of abstract, decontextualized laws, theories, and inventions so typical of school science. Most graduates and drop-outs become relatively scientifically and technologically illiterate. They either have forgotten or have confused conceptions of scientific and technological knowledge; often view science and technology as relatively certain, unbiased, and benign with respect to effects on society and the environment; and lack resources necessary to effectively judge products and processes of science and technology or, crucially, to create their own explanations for and changes to phenomena. Citizens with illiteracy to this extent may have little control over their own thoughts and actions and be prey to whims of those who control knowledge, its production and dissemination. Curriculum frameworks are required that enable all students to achieve their maximum potential literacy and, as well, to create their own knowledge, to develop in directions unique to their needs, interests, abilities, and perspectives; that is, to become self-actualized. This latter goal can, in part, be achieved through apprenticeship education in schools, such that students acquire a measure of scientific and technological connoisseurship - expertise enabling them to conduct open-ended scientific investigations and invention projects of their design. In collaboration with five teachers of secondary school science, such a framework was indeed developed and field-tested. Evidence suggests that, through a spiraling, cyclical process involving synchronous reconstruction of conceptual and procedural understandings, students were able to carry out experiments, studies, and tests of their inventions with minimal teacher involvement. Furthermore, they appeared to accommodate more realistic conceptions of scientific and technological work. Moreover, many seemed to have made progress toward intellectual independence, becoming able to judge knowledge claims independently of authorities. It is hoped that with more schools, systems, and teachers enabling development of such connoisseurship, all students will be better served by school science and, as well, the larger society will be more diverse, adaptable, and free.

  14. Soft computing methods for geoidal height transformation

    NASA Astrophysics Data System (ADS)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
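
    As a rough illustration of the ANN approach (not the authors' ANFIS or modified-ANN models), the sketch below fits a small multilayer perceptron to synthetic (latitude, longitude) to geoid-height samples with scikit-learn; a real application would use GPS/levelling benchmark data instead of the fabricated values.

```python
# Minimal sketch of ANN-based geoid height approximation (not the authors' model).
# Synthetic (lat, lon) -> geoid-height samples stand in for real GPS/levelling data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
lat = rng.uniform(36.0, 42.0, 500)            # hypothetical region
lon = rng.uniform(26.0, 45.0, 500)
N = 35.0 + 0.8 * (lat - 36) - 0.5 * (lon - 26) + rng.normal(0, 0.05, 500)  # fake undulation [m]

X = np.column_stack([lat, lon])
X_train, X_test, y_train, y_test = train_test_split(X, N, test_size=0.2, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print("test RMSE [m]:", np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2)))
```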

  15. Garbage Patch Visualization Experiment

    NASA Image and Video Library

    2015-08-20

    Goddard visualizers show us how five garbage patches formed in the world's oceans using 35 years of data. Read more: 1.usa.gov/1Lnj7xV Credit: NASA's Scientific Visualization Studio. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  16. Enabling the SMART Wind Power Plant of the Future Through Science-Based Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykes, Katherine L.; Hand, M. M.; Lantz, Eric J.

    This report describes the scientific challenges facing wind energy today and the recent scientific advancements that position the research community to tackle those challenges, as well as the new U.S. Department of Energy applied research program Atmosphere to Electrons that takes an integrated approach to addressing those challenges. It also ties these resulting scientific accomplishments to future technological innovation and quantifies the impact of that collection of innovations on 2030 wind power cost of energy.

  17. Chemical datuments as scientific enablers.

    PubMed

    Rzepa, Henry S

    2013-01-23

    This article is an attempt to construct a chemical datument as a means of presenting insights into chemical phenomena in a scientific journal. An exploration of the interactions present in a small fragment of duplex Z-DNA and the nature of the catalytic centre of a carbon-dioxide/alkene epoxide alternating co-polymerisation is presented in this datument, with examples of the use of three software tools, one based on Java, the other two using Javascript and HTML5 technologies. The implications for the evolution of scientific journals are discussed.

  18. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Robert Goddard with a rocket in his workshop at Roswell, NM. October 1935. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  19. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Goddard with a rocket in his workshop at Roswell, NM. October 1935. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  20. Silly Science: Strange and Startling Projects To Amaze Your Family and Friends.

    ERIC Educational Resources Information Center

    Levine, Shar; Johnstone, Leslie

    This book is a collection of 28 experiments that are not meant to have any practical purpose. Each experiment, however, illustrates a scientific principle and enables students to discover how scientific facts and theories apply to seemingly useless experiments. Each experiment includes a list of materials, a series of steps, an explanation of the…

  1. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Goddard's rocket nose cone, parachute, and release device, April 19, 1935. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  2. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Goddard with batteries and relay at the launch tower, May 19, 1937. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  3. An STS Approach to Organizing a Secondary Science Methods Course: Preliminary Findings.

    ERIC Educational Resources Information Center

    Dass, Pradeep M.

    The current agenda in science education calls for science instruction that enhances student understanding of the nature of scientific enterprise, enables students to critically analyze scientific information as well as apply it in real-life situations, and sets them on a path of lifelong learning in science. In order to prepare teachers who can…

  4. Synergy and Students' Explanations: Exploring the Role of Generic and Content-Specific Scaffolds

    ERIC Educational Resources Information Center

    Delen, Ibrahim; Krajcik, Joseph

    2018-01-01

    In this study, we explored how a teacher used a new mobile application that enables students to collect data inside and outside the classroom, and then use the data to create scientific explanations by using claim-evidence-reasoning framework. Previous technologies designed to support scientific explanations focused on how these programs improve…

  5. Internet Activities Using Scientific Data. A Self-Guided Exploration.

    ERIC Educational Resources Information Center

    Froseth, Stan; Poppe, Barbara

    This guide is intended for the secondary school teacher (especially math or science) or the student who wants to access and learn about scientific data on the Internet. It is organized as a self-guided exploration. Nine exercises enable the user to access and analyze on-line information from the National Oceanic and Atmospheric Administration…

  6. Bangladeshi Science Teachers' Perspectives of Scientific Literacy and Teaching Practices

    ERIC Educational Resources Information Center

    Sarkar, Mahbub; Corrigan, Deborah

    2014-01-01

    In line with a current global trend, junior secondary science education in Bangladesh aims to provide science education for all students to enable them to use their science learning in everyday life. This aim is consistent with the call for scientific literacy, which argues for engaging students with science in everyday life. This paper…

  7. CILogon: An Integrated Identity and Access Management Platform for Science

    NASA Astrophysics Data System (ADS)

    Basney, J.

    2016-12-01

    When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.
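
    As an illustration of how a science gateway might consume federated identities, the sketch below performs a generic OpenID Connect authorization-code token exchange with the requests library. The endpoint URL, client credentials, and redirect URI are placeholders; CILogon's actual endpoints and client registration process are documented by the project and are not reproduced here.

```python
# Generic OpenID Connect token exchange (illustrative only; the endpoint URL,
# client ID/secret, and redirect URI are placeholders -- consult the CILogon
# documentation for its actual OIDC endpoints and registration process).
import requests

TOKEN_ENDPOINT = "https://idp.example.org/oauth2/token"   # placeholder
payload = {
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_BROWSER_REDIRECT",            # obtained after the user logs in
    "redirect_uri": "https://myportal.example.org/callback",
    "client_id": "my-science-gateway",
    "client_secret": "REPLACE_ME",
}
resp = requests.post(TOKEN_ENDPOINT, data=payload, timeout=30)
resp.raise_for_status()
tokens = resp.json()
print("granted scopes:", tokens.get("scope"))
# The access/ID tokens can then be presented to collaboration services
# (wikis, data portals, compute gateways) that trust the identity provider.
```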

  8. The Virtual Mission - A step-wise approach to large space missions

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine; Jones, Morgan; Hooke, Adrian; Pomphrey, Richard

    1992-01-01

    Attention is given to the Virtual Mission (VM) concept, wherein multiple scientific instruments will be on different platforms, in different orbits, operated from different control centers, at different institutions, and reporting to different user groups. The VM concept enables NASA's science and application users to accomplish their broad science goals with a fleet made up of smaller, more focused spacecraft and to alleviate the difficulties involved with single, large, complex spacecraft. The concept makes possible the stepwise 'go-as-you-pay' extensible approach recommended by Augustine (1990). It enables scientists to mix and match the use of many smaller satellites in novel ways to respond to new scientific ideas and needs.

  9. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  10. The influence of authentic scientific research experiences on teachers' conceptions of the nature of science (NOS) and their NOS teaching practices

    NASA Astrophysics Data System (ADS)

    Moriarty, Meghan A.

    This study explored the influence of teachers' authentic scientific research experiences (ASREs) on teachers' conceptions of the nature of science (NOS) and teachers' NOS instruction. Twelve high school biology teachers participated in this study. Six of the participants had authentic scientific research experience (ASRE) and six had not participated in authentic scientific research. Data included background surveys, modified Views of the Nature of Science (VNOS) questionnaires, interviews, and teaching observations. Data were coded based on the eight NOS understandings outlined in 2013 in the Next Generation Science Standards (NGSS). Evidence from this study indicates that participating in authentic scientific research as a member of a scientific community has the dual benefits of equipping high school science teachers with informed understandings of the NOS and positioning them to teach with the NOS. However, these benefits do not always result from an ASRE. If the nature of the ASRE is limited, then it may limit teachers' NOS understandings and their NOS teaching practices. The results of this study suggest that participation in ASREs may be one way to improve teachers' NOS understandings and teaching practices if the experiences themselves offer a comprehensive view of the NOS. Because ASREs and other science learning experiences do not always offer such experiences, pre-service teacher education and professional development opportunities may engage science teachers in two ways: (1) becoming part of a scientific community may enable them to teach with NOS and (2) being reflective about what being a scientist means may improve teachers' NOS understandings and better position them to teach about NOS. Keywords: nature of science, authentic scientific research experiences, Next Generation Science Standards, teaching about NOS, teaching with NOS.

  11. High performance computing and communications: Advancing the frontiers of information technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  12. Enabling Linked Science in Global Climate Uncertainty Quantification (UQ) Research

    NASA Astrophysics Data System (ADS)

    Elsethagen, T.; Stephan, E.; Lin, G.; Williams, D.; Banks, E.

    2012-12-01

    This paper shares a real-world global climate UQ science use case and illustrates how a linked science application called Provenance Environment (ProvEn), currently being developed, enables scientific teams to publish, share, link, and discover new links over their UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. This research claims that scientists using this linked science approach will not only benefit from understanding a particular dataset within a knowledge context, but will also benefit from the cross-referencing of knowledge among the numerous UQ studies stored in ESGF. ProvEn collects native forms of data provenance resources as the UQ study is carried out. The native data provenance resources can be collected from a variety of sources such as scripts, a workflow engine log, simulation log files, scientific team members, etc. Schema alignment is used to translate the native forms of provenance into a set of W3C PROV-O semantic statements used as a common interchange format, which also contains URI references back to resources in the UQ study dataset for querying and cross referencing. ProvEn leverages Fedora Commons' digital object model in a Resource Oriented Architecture (ROA) (i.e. a RESTful framework) to logically organize and partition native and translated provenance resources by UQ study. The ROA also provides scientists the means to search both native and translated forms of provenance.
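
    The translation into W3C PROV-O statements can be pictured with a small rdflib sketch that records which activity generated a dataset, which script it used, and who ran it. The URIs and namespace below are invented for illustration and do not reflect ProvEn's actual identifier scheme.

```python
# Minimal sketch of expressing dataset lineage as W3C PROV-O statements with
# rdflib (illustrative; the URIs below are placeholders, not ProvEn's scheme).
from rdflib import Graph, Namespace, URIRef, RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://uq-study.example.org/")   # hypothetical UQ-study namespace

g = Graph()
g.bind("prov", PROV)

dataset = URIRef(EX["dataset/run-042-output"])
run = URIRef(EX["activity/ensemble-run-042"])
script = URIRef(EX["entity/perturb_params.py"])
scientist = URIRef(EX["agent/j.doe"])

g.add((dataset, RDF.type, PROV.Entity))
g.add((run, RDF.type, PROV.Activity))
g.add((script, RDF.type, PROV.Entity))
g.add((scientist, RDF.type, PROV.Agent))
g.add((dataset, PROV.wasGeneratedBy, run))       # lineage: output produced by the run
g.add((run, PROV.used, script))                  # the run used this input script
g.add((run, PROV.wasAssociatedWith, scientist))  # who carried out the run

print(g.serialize(format="turtle"))
```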

  13. Modeling non-point source pollutants in the vadose zone: Back to the basics

    NASA Astrophysics Data System (ADS)

    Corwin, Dennis L.; Letey, John, Jr.; Carrillo, Marcia L. K.

    More than ever before in the history of scientific investigation, modeling is viewed as a fundamental component of the scientific method because of the relatively recent development of the computer. No longer must the scientific investigator be confined to artificially isolated studies of individual processes that can lead to oversimplified and sometimes erroneous conceptions of larger phenomena. Computer models now enable scientists to attack problems related to open systems such as climatic change, and the assessment of environmental impacts, where the whole of the interactive processes is greater than the sum of their isolated components. Environmental assessment involves the determination of change of some constituent over time. This change can be measured in real time or predicted with a model. The advantage of prediction, like preventative medicine, is that it can be used to alter the occurrence of potentially detrimental conditions before they are manifest. The much greater efficiency of preventative, rather than remedial, efforts strongly justifies the need for an ability to accurately model environmental contaminants such as non-point source (NPS) pollutants. However, the environmental modeling advances that have accompanied computer technological development are a mixed blessing. Where once we had a plethora of discordant data without a holistic theory, now the pendulum has swung so that we suffer from a growing stockpile of models, a significant number of which have never been confirmed, or even had attempts made to confirm them. Modeling has become an end in itself rather than a means because of limited research funding, the high cost of field studies, limitations in time and patience, difficulty in cooperative research and pressure to publish papers as quickly as possible. Modeling and experimentation should be ongoing processes that reciprocally enhance one another, with sound, comprehensive experiments serving as the building blocks of models and models serving to economize experimental designs and direct objectives. The responsibility lies in the hands of modelers to adhere to the modeling process and to seek out experimentalists who can evaluate their models. Even though this warning is nothing new, the effort by modelers to heed it is still as much the exception as the rule.

  14. Applications of Digital Micromirror Devices to Astronomical Instrumentation

    NASA Astrophysics Data System (ADS)

    Robberto, M.

    MEMS devices are among the major technological breakthroughs of the last two decades. Besides finding widespread use in high-tech and consumer market electronics, MEMS enable new types of astronomical instruments. I concentrate on Digital Micromirror Devices, which have already been adopted in astronomy and can enable scientific investigations that would otherwise remain beyond our technical capabilities.

  15. Climate tools in mainstream Linux distributions

    NASA Astrophysics Data System (ADS)

    McKinstry, Alastair

    2015-04-01

    Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that could normally be ignored. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, integrating libraries and components (e.g. Python modules) requires planning by their writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
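
    One way such build variation is handled, as described above, is to resolve compile and link flags at build time rather than hard-coding them for one particular library build. The sketch below shells out to pkg-config from Python; it assumes pkg-config and a 'netcdf' .pc file are installed, which is typical on a Debian system but not guaranteed elsewhere.

```python
# Minimal sketch: query pkg-config for compile/link flags of a library so a
# build script does not hard-code one of several possible NetCDF/HDF5 builds.
# Assumes pkg-config and a 'netcdf' .pc file are installed (typical on Debian).
import shlex
import subprocess

def pkg_flags(package):
    cflags = subprocess.run(["pkg-config", "--cflags", package],
                            capture_output=True, text=True, check=True).stdout
    libs = subprocess.run(["pkg-config", "--libs", package],
                          capture_output=True, text=True, check=True).stdout
    return shlex.split(cflags) + shlex.split(libs)

print(pkg_flags("netcdf"))
```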

  16. Mapping benefits from updated ifsar data in Alaska: improved source data enables better maps

    USGS Publications Warehouse

    Craun, Kari J.

    2015-08-06

    The U.S. Geological Survey (USGS) and partners in other Federal and State agencies are working collaboratively toward Statewide coverage of interferometric synthetic aperture radar (ifsar) elevation data in Alaska. These data will provide many benefits to a wide range of stakeholders and users. Some applications include development of more accurate and highly detailed topographic maps; improvement of surface water information included in the National Hydrography (NHD) and Watershed Boundary Datasets (WBDs); and use in scientific modeling applications such as calculating glacier surface elevation differences over time and estimating tsunami inundation areas.

  17. Boxes of Model Building and Visualization.

    PubMed

    Turk, Dušan

    2017-01-01

    Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now better than ever determine an average single structure. The tools work better, requirements for engagement of human brain are lowered, and the frontier of intellectual and scientific challenges has moved on. The quest for resolution of new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.

  18. OOI CyberInfrastructure - Next Generation Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.

    2008-12-01

    Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future scientific collaborative research, as they open up new ways of fusing data from different sources and across various domains, and analysis on wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component addresses this challenge by providing broad access from sensor networks for data acquisition up to computational grids for massive computations and binding infrastructure facilitating policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort, namely, a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport based on a messaging infrastructure over the AMQP protocol, and the preservation based on a distributed file system through SDSC iRODS.
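
    The messaging layer mentioned above can be illustrated with a minimal AMQP publish using the pika client. The broker host, queue name, and message fields are placeholders and do not reflect the OOI CyberInfrastructure's actual exchanges or message schema.

```python
# Illustrative AMQP publish using the pika client (not OOI's actual message
# schema); the broker host and queue name are placeholders.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("amqp.example.org"))
channel = connection.channel()
channel.queue_declare(queue="ingest.ctd")            # hypothetical ingestion queue

sample = {"instrument": "CTD-01", "t": "2008-12-01T00:00:00Z", "salinity": 34.7}
channel.basic_publish(exchange="", routing_key="ingest.ctd", body=json.dumps(sample))
connection.close()
```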

  19. Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Graves, S. J.; Vernon, F.; Kerkez, B.; Chandra, C. V.; Keiser, K.; Martin, C.

    2014-12-01

    Access, utilization and management of real-time data continue to be challenging for decision makers, as well as researchers in several scientific fields. This presentation will highlight infrastructure aimed at addressing some of the gaps in handling real-time data, particularly in increasing accessibility of these data to the scientific community through cloud services. The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) system addresses the ever-increasing importance of real-time scientific data, particularly in mission critical scenarios, where informed decisions must be made rapidly. Advances in the distribution of real-time data are leading many new transient phenomena in space-time to be observed; however, real-time decision-making is infeasible in many cases that require streaming scientific data, as these data are locked down and sent only to proprietary in-house tools or displays. This lack of accessibility to the broader scientific community prohibits algorithm development and workflows initiated by these data streams. As part of NSF's EarthCube initiative, CHORDS proposes to make real-time data available to the academic community via cloud services. The CHORDS infrastructure will enhance the role of real-time data within the geosciences, specifically expanding the potential of streaming data sources in enabling adaptive experimentation and real-time hypothesis testing. Adherence to community data and metadata standards will promote the integration of CHORDS real-time data with existing standards-compliant analysis, visualization and modeling tools.

  20. NASA's Solar System Exploration Research Virtual Institute: Science and Technology for Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Schmidt, Greg; Bailey, Brad; Gibbs, Kristina

    2015-01-01

    The NASA Solar System Exploration Research Virtual Institute (SSERVI) is a virtual institute focused on research at the intersection of science and exploration, training the next generation of lunar scientists, and development and support of the international community. As part of its mission, SSERVI acts as a hub for opportunities that engage the larger scientific and exploration communities in order to form new interdisciplinary, research-focused collaborations. The nine domestic SSERVI teams that comprise the U.S. complement of the Institute engage with the international science and exploration communities through workshops, conferences, online seminars and classes, student exchange programs and internships. SSERVI represents a close collaboration between science, technology and exploration, enabling a deeper, integrated understanding of the Moon and other airless bodies as human exploration moves beyond low Earth orbit. SSERVI centers on the scientific aspects of exploration as they pertain to the Moon, Near Earth Asteroids (NEAs) and the moons of Mars, with additional aspects of related technology development, including a major focus on human exploration-enabling efforts such as resolving Strategic Knowledge Gaps (SKGs). The Institute focuses on interdisciplinary, exploration-related science focused on airless bodies targeted as potential human destinations. Areas of study represent the broad spectrum of lunar, NEA, and Martian moon sciences encompassing investigations of the surface, interior, exosphere, and near-space environments as well as science uniquely enabled from these bodies. This research profile integrates investigations of plasma physics, geology/geochemistry, technology integration, solar system origins/evolution, regolith geotechnical properties, analogues, volatiles, ISRU and exploration potential of the target bodies. New opportunities for both domestic and international partnerships are continually generated through these research and community development efforts, and SSERVI can further serve as a model for joint international scientific efforts through its creation of bridges across disciplines and between countries. Since the inception of the NASA Lunar Science Institute (SSERVI's predecessor), it has contributed, and will continue to contribute, in many ways toward the advancement of lunar science and the eventual human exploration of the Moon.

  1. Selection of relevant input variables in storm water quality modeling by multiobjective evolutionary polynomial regression paradigm

    NASA Astrophysics Data System (ADS)

    Creaco, E.; Berardi, L.; Sun, Siao; Giustolisi, O.; Savic, D.

    2016-04-01

    The growing availability of field data, from information and communication technologies (ICTs) in "smart" urban infrastructures, allows data modeling to understand complex phenomena and to support management decisions. Among the analyzed phenomena, those related to storm water quality modeling have recently been gaining interest in the scientific literature. Nonetheless, the large amount of available data poses the problem of selecting relevant variables to describe a phenomenon and enable robust data modeling. This paper presents a procedure for the selection of relevant input variables using the multiobjective evolutionary polynomial regression (EPR-MOGA) paradigm. The procedure is based on scrutinizing the explanatory variables that appear inside the set of EPR-MOGA symbolic model expressions of increasing complexity and goodness of fit to target output. The strategy also enables the selection to be validated by engineering judgement. In such context, the multiple case study extension of EPR-MOGA, called MCS-EPR-MOGA, is adopted. The application of the proposed procedure to modeling storm water quality parameters in two French catchments shows that it was able to significantly reduce the number of explanatory variables for successive analyses. Finally, the EPR-MOGA models obtained after the input selection are compared with those obtained by using the same technique without benefitting from input selection and with those obtained in previous works where other data-modeling techniques were used on the same data. The comparison highlights the effectiveness of both EPR-MOGA and the input selection procedure.
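
    A much-simplified sketch of the underlying idea (not the EPR-MOGA algorithm itself) is to fit candidate models of increasing complexity and record how often each explanatory variable appears among the best-fitting ones; variables that appear consistently are retained for subsequent modeling. The synthetic data below stand in for real storm water quality observations.

```python
# Simplified illustration of input-variable selection (not EPR-MOGA): fit small
# linear models of increasing size and count how often each candidate input
# appears among the best-fitting ones.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 5))                                  # 5 candidate explanatory variables
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 0.1, n)    # only x0 and x2 actually matter

def fit_rss(cols):
    """Residual sum of squares of a least-squares fit using the given columns."""
    A = np.column_stack([X[:, list(cols)], np.ones(n)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sum((A @ coef - y) ** 2)

appearances = np.zeros(5)
for k in (1, 2, 3):                                          # model "complexity" = number of inputs
    best = min(itertools.combinations(range(5), k), key=fit_rss)
    appearances[list(best)] += 1

print("selection frequency per input:", appearances)         # x0 and x2 should dominate
```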

  2. Augmenting Research, Education, and Outreach with Client-Side Web Programming.

    PubMed

    Abriata, Luciano A; Rodrigues, João P G L M; Salathé, Marcel; Patiny, Luc

    2018-05-01

    The evolution of computing and web technologies over the past decade has enabled the development of fully fledged scientific applications that run directly on web browsers. Powered by JavaScript, the lingua franca of web programming, these 'web apps' are starting to revolutionize and democratize scientific research, education, and outreach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. "To Be a Scientist Sometimes You Have to Break Down Stuff about Animals": Examining the Normative Scientific Practices of a Summer Herpetological Program for Children

    ERIC Educational Resources Information Center

    Scott, Catherine Marie

    2016-01-01

    When studying informal science programs, researchers often overlook the opportunities enabled and constrained in each program and the practices reinforced for participants. In this case study, I examined the normative scientific practices reinforced in a one-week-long "Herpetology" (the study of reptiles and amphibians) program for…

  4. The NASA Scientific and Technical Information (STI) Program's Implementation of Open Archives Initiative (OAI) for Data Interoperability and Data Exchange.

    ERIC Educational Resources Information Center

    Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.

    Interoperability and data-exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through Web-enabled technologies. Programs such as the NASA's Scientific and Technical Information (STI)…

  5. The Global Ethics Corner: Foundations, Beliefs, and the Teaching of Biomedical and Scientific Ethics around the World

    ERIC Educational Resources Information Center

    Jakubowski, Henry; Xie, Jianping; Kumar Mitra, Arup; Ghooi, Ravindra; Hosseinkhani, Saman; Alipour, Mohsen; Hajipour, Behnam; Obiero, George

    2017-01-01

    The profound advances in the biomolecular sciences over the last decades have enabled similar advances in biomedicine. These advances have increasingly challenged our abilities to deploy them in an equitable and ethically acceptable manner. As such, it has become necessary and important to teach biomedical and scientific ethics to our students who…

  6. Surveying Geology Concepts in Education Standards for a Rapidly Changing Global Context

    ERIC Educational Resources Information Center

    Guffey, Sarah K.; Slater, Stephanie J.; Schleigh, Sharon P.; Slater, Timothy F.; Heyer, Inge

    2016-01-01

    Internationally, much attention is being paid to which of a seemingly endless list of scientific concepts should be taught to schoolchildren to enable them to best participate in the global economy of the 21st Century. In regard to science education, the concepts framing the subject of geology hold exalted status as core scientific principles in…

  7. Multidrug-Resistant TB

    PubMed Central

    Cox, Helen; Coomans, Fons

    2016-01-01

    Abstract The right to enjoy the benefits of scientific progress (REBSP) is a little-known but potentially valuable right that can contribute to rights-based approaches to addressing multidrug-resistant TB (MDR-TB). We argue that better understanding of the REBSP may help to advance legal and civil society action for health rights. While the REBSP does not provide an individual entitlement to have a new drug developed for MDR-TB, it sets up entitlements to expect a state to establish a legislative and policy framework aimed at developing scientific capacity to address the most important health issues and at disseminating the outcomes of scientific research. By making scientific findings available and accessible, people can be enabled to claim the use of science for social benefits. Inasmuch as the market fails to address neglected diseases such as MDR-TB, the REBSP provides a potential counterbalance to frame a positive obligation on states to both marshal their own resources and to coordinate the actions of multiple other actors towards this goal, including non-state actors. While the latter do not hold the same level of accountability as states, the REBSP can still enable the recognition of obligations at a level of “soft law” responsibilities. PMID:27780997

  8. Virtual Observatories, Data Mining, and Astroinformatics

    NASA Astrophysics Data System (ADS)

    Borne, Kirk

    The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best of breed methodologies from multiple disciplines. In the era of large sky surveys and numerous large telescopes, the potential for astronomical discovery is equally large, and so the data-oriented research methods, algorithms, and techniques that are presented here will enable the greatest discovery potential from the ever-growing data and information resources in astronomy.
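
    A typical machine-learning step in such a workflow can be sketched with scikit-learn: classify objects from catalog-style features. The star/galaxy features below are synthetic placeholders; a real application would use survey photometry and morphology measurements.

```python
# Illustrative machine-learning step typical of astroinformatics workflows:
# classify synthetic "star vs. galaxy" feature vectors with a random forest.
# Real surveys would substitute catalog photometry for this fabricated data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 1000
labels = rng.integers(0, 2, n)                      # 0 = star, 1 = galaxy
mag = rng.normal(20, 1.5, n)
concentration = np.where(labels == 1,
                         rng.normal(3.0, 0.4, n),   # galaxies: extended light profiles
                         rng.normal(2.2, 0.3, n))   # stars: point-like profiles
features = np.column_stack([mag, concentration])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, features, labels, cv=5).mean())
```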

  9. Towards writing the encyclopaedia of life: an introduction to DNA barcoding

    PubMed Central

    Savolainen, Vincent; Cowan, Robyn S; Vogler, Alfried P; Roderick, George K; Lane, Richard

    2005-01-01

    An international consortium of major natural history museums, herbaria and other organizations has launched an ambitious project, the ‘Barcode of Life Initiative’, to promote a process enabling the rapid and inexpensive identification of the estimated 10 million species on Earth. DNA barcoding is a diagnostic technique in which short DNA sequence(s) can be used for species identification. The first international scientific conference on Barcoding of Life was held at the Natural History Museum in London in February 2005, and here we review the scientific challenges discussed during this conference and in previous publications. Although still controversial, the scientific benefits of DNA barcoding include: (i) enabling species identification, including any life stage or fragment, (ii) facilitating species discoveries based on cluster analyses of gene sequences (e.g. cox1=CO1, in animals), (iii) promoting development of handheld DNA sequencing technology that can be applied in the field for biodiversity inventories and (iv) providing insight into the diversity of life. PMID:16214739
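
    The identification step can be caricatured with a toy nearest-reference match: compare a query fragment against a small reference library by shared k-mer fraction. Real barcoding pipelines use ~650 bp cox1 sequences, curated reference databases, and proper alignment or tree-based methods; the sequences below are invented and far too short.

```python
# Toy illustration of barcode-style identification: match a query fragment
# against a small reference library by shared k-mer (Jaccard) similarity.
def kmers(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

reference = {
    "Species A": "ATGGCATTAGCCGGTACCGATTACGGTAC",   # invented placeholder sequences
    "Species B": "ATGGCGTTGGCAGGAACGGACTATGGAAC",
    "Species C": "ATGACTTTAGGAGGGACCGACTACGGCAC",
}

query = "ATGGCATTAGCCGGTACCGATTACGGAAC"

def similarity(a, b):
    ka, kb = kmers(a), kmers(b)
    return len(ka & kb) / len(ka | kb)

best = max(reference, key=lambda sp: similarity(query, reference[sp]))
print("best match:", best, round(similarity(query, reference[best]), 2))
```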

  10. Contributions of treatment theory and enablement theory to rehabilitation research and practice.

    PubMed

    Whyte, John

    2014-01-01

    Scientific theory is crucial to the advancement of clinical research. The breadth of rehabilitation treatment requires that many different theoretical perspectives be incorporated into the design and testing of treatment interventions. In this article, the 2 broad classes of theory relevant to rehabilitation research and practice are defined, and their distinct but complementary contributions to research and clinical practice are explored. These theory classes are referred to as treatment theories (theories about how to effect change in clinical targets) and enablement theories (theories about how changes in a proximal clinical target will influence distal clinical aims). Treatment theories provide the tools for inducing clinical change but do not specify how far reaching the ultimate impact of the change will be. Enablement theories model the impact of changes on other areas of function but provide no insight as to how treatment can create functional change. Treatment theories are more critical in the early stages of treatment development, whereas enablement theories become increasingly relevant in specifying the clinical significance and practical effectiveness of more mature treatments. Understanding the differences in the questions these theory classes address and how to combine their insights is crucial for effective research development and clinical practice. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  11. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of biomedical domain and lack of appropriate natural language processing (NLP) techniques. High quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe). The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.
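
    For orientation only, the sketch below pulls a few provenance-style facts (sample size, instrument, time period) out of a methods sentence with regular expressions. The ProvCaRe-NLP pipeline itself relies on ontology-enabled cTAKES components and is far more sophisticated; the sentence and patterns here are illustrative inventions.

```python
# Deliberately simple illustration of provenance-metadata extraction from a
# methods sentence using regular expressions (not the ProvCaRe-NLP pipeline).
import re

sentence = ("Polysomnography data were collected from 120 participants "
            "using a Compumedics recorder between 2010 and 2012.")

patterns = {
    "sample_size": r"from\s+(\d+)\s+participants",
    "instrument": r"using\s+an?\s+([A-Z][\w-]*(?:\s+\w+)?)",
    "time_period": r"between\s+(\d{4})\s+and\s+(\d{4})",
}

provenance = {key: re.search(pat, sentence).groups()
              for key, pat in patterns.items()
              if re.search(pat, sentence)}
print(provenance)
```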

  12. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
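
    The wrapper idea described above, converting between each module's native file formats and a platform-wide convention, can be sketched as follows. The file names, formats, and the 'legacy_rupture_generator' executable are invented for illustration and are not part of the Broadband Platform.

```python
# Minimal sketch of a module wrapper: adapt a scientific module's native
# input/output formats to a platform-wide convention. The file names, formats,
# and the 'legacy_rupture_generator' command are invented for illustration.
import subprocess
from pathlib import Path

def run_rupture_module(platform_input: dict, workdir: Path) -> dict:
    workdir.mkdir(exist_ok=True)

    # 1. Convert the platform's common input format to the module's native format.
    native_in = workdir / "rupture.in"
    native_in.write_text(f"{platform_input['magnitude']} {platform_input['depth_km']}\n")

    # 2. Run the (hypothetical) legacy executable in its own working directory.
    subprocess.run(["legacy_rupture_generator", native_in.name], cwd=workdir, check=True)

    # 3. Convert the module's native output back to the common format.
    native_out = (workdir / "rupture.out").read_text().split()
    return {"slip_m": float(native_out[0]), "rise_time_s": float(native_out[1])}

# Example usage (would require the hypothetical executable to exist):
# result = run_rupture_module({"magnitude": 6.7, "depth_km": 12.0}, Path("run01"))
```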

  13. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud, and Petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should include information about licensing and the hardware environments the code can run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate Journals (e.g. Geoscientific Model Development Journal) to help users know which codes to trust. 3) Referencing will be accomplished by linking the Software Framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review and relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing Provenance Workflow engines that will automatically capture information that relates to a particular run of that software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could enable scaling out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including Supercomputer centres, cloud facilities, and local computers.
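
    As a sketch of what the Register component might capture for each code, the snippet below defines an illustrative registration record as a Python dataclass; the field names and example values are assumptions, not a published schema.

```python
# A sketch of the kind of metadata the proposed Register component might hold
# for each code; field names and example values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class SoftwareRegistration:
    name: str
    repository_url: str
    license: str
    hardware_environments: list = field(default_factory=list)   # e.g. ["x86_64 cluster", "cloud VM"]
    critical_dependencies: dict = field(default_factory=dict)    # package -> version constraint
    validation_procedure: str = ""                                # how to run the benchmark suite

entry = SoftwareRegistration(
    name="example-solver",
    repository_url="https://github.com/example/example-solver",  # placeholder URL
    license="Apache-2.0",
    hardware_environments=["x86_64 cluster", "cloud VM"],
    critical_dependencies={"mpi": ">=3.0", "python": ">=3.8"},
    validation_procedure="run the benchmark suite and compare against reference outputs",
)
print(entry)
```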

  14. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  15. The Conservation Effects Assessment Project (CEAP): a national scale natural resources and conservation needs assessment and decision support tool

    NASA Astrophysics Data System (ADS)

    Johnson, M.-V. V.; Norfleet, M. L.; Atwood, J. D.; Behrman, K. D.; Kiniry, J. R.; Arnold, J. G.; White, M. J.; Williams, J.

    2015-07-01

    The Conservation Effects Assessment Project (CEAP) was initiated to quantify the impacts of agricultural conservation practices at the watershed, regional, and national scales across the United States. Representative cropland acres in all major U.S. watersheds were surveyed in 2003-2006 as part of the seminal CEAP Cropland National Assessment. Two process-based models, the Agricultural Policy Environmental eXtender (APEX) and the Soil and Water Assessment Tool (SWAT), were applied to the survey data to provide a quantitative assessment of current conservation practice impacts, establish a benchmark against which future conservation trends and efforts could be measured, and identify outstanding conservation concerns. The flexibility of these models and the unprecedented amount of data on current conservation practices across the country enabled Cropland CEAP to meet its Congressional mandate of quantifying the value of current conservation practices. It also enabled scientifically grounded exploration of a variety of conservation scenarios, empowering CEAP not only to report on past successes and additional needs, but also to provide a decision support tool to help guide future policy development and conservation practice decision making. The CEAP effort will repeat the national survey in 2015-2016, enabling CEAP to provide analyses of emergent conservation trends, outstanding needs, and potential costs and benefits of pursuing various treatment scenarios for all agricultural watersheds across the United States.

  16. How to Receive More Funding for Your Research? Get Connected to the Right People!

    PubMed

    Ebadi, Ashkan; Schiffauerova, Andrea

    2015-01-01

    Funding has been viewed in the literature as one of the main determinants of scientific activities. Also, at an individual level, securing funding is one of the most important factors for a researcher, enabling him/her to carry out research projects. However, not everyone is successful in obtaining the necessary funds. The main objective of this work is to measure the effect of several important factors, such as past productivity, scientific collaboration, or career age of researchers, on the amount of funding that is allocated to them. For this purpose, the paper estimates a temporal non-linear multiple regression model. According to the results, although past productivity of researchers positively affects the funding level, our findings highlight the significant role of networking and collaboration. It was observed that being a member of large scientific teams and getting connected to productive researchers who also have good control over the collaboration network and the flow of information can increase the chances of securing more money. In fact, our results show that in the quest for research money it is more important how researchers build their collaboration network than what publications they produce and whether they are cited.
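
    The abstract does not give the model specification; as a purely synthetic illustration of regressing funding on productivity and networking measures, here is a hedged sketch with invented variables, an assumed log-linear functional form, and made-up coefficients, using statsmodels for ordinary least squares.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Synthetic illustration only: variables, functional form, and coefficients are
    # assumptions, not the specification estimated in the paper.
    rng = np.random.default_rng(0)
    n = 500
    past_pubs  = rng.poisson(5, n)            # past productivity
    team_size  = rng.integers(1, 30, n)       # size of collaborative teams
    centrality = rng.random(n)                # control over the collaboration network
    career_age = rng.integers(1, 40, n)

    log_funding = (0.10 * np.log1p(past_pubs) + 0.40 * np.log1p(team_size)
                   + 0.80 * centrality + 0.02 * career_age + rng.normal(0, 0.5, n))

    X = sm.add_constant(np.column_stack([np.log1p(past_pubs), np.log1p(team_size),
                                         centrality, career_age]))
    fit = sm.OLS(log_funding, X).fit()
    print(fit.params)   # compare coefficients on productivity vs. networking measures
    ```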

  17. How to Receive More Funding for Your Research? Get Connected to the Right People!

    PubMed Central

    Ebadi, Ashkan; Schiffauerova, Andrea

    2015-01-01

    Funding has been viewed in the literature as one of the main determinants of scientific activities. Also, at an individual level, securing funding is one of the most important factors for a researcher, enabling him/her to carry out research projects. However, not everyone is successful in obtaining the necessary funds. The main objective of this work is to measure the effect of several important factors, such as past productivity, scientific collaboration, or career age of researchers, on the amount of funding that is allocated to them. For this purpose, the paper estimates a temporal non-linear multiple regression model. According to the results, although past productivity of researchers positively affects the funding level, our findings highlight the significant role of networking and collaboration. It was observed that being a member of large scientific teams and getting connected to productive researchers who also have good control over the collaboration network and the flow of information can increase the chances of securing more money. In fact, our results show that in the quest for research money it is more important how researchers build their collaboration network than what publications they produce and whether they are cited. PMID:26222598

  18. pFlogger: The Parallel Fortran Logging Utility

    NASA Technical Reports Server (NTRS)

    Clune, Tom; Cruz, Carlos A.

    2017-01-01

    In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable, as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger, a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
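
    pFlogger itself is a Fortran framework; purely as a hedged analogy for the kind of facility the abstract describes (named loggers, severity levels, rank-aware output in a parallel run), the sketch below uses Python's standard logging module. The RANK environment variable and the logged metrics are illustrative assumptions, not part of pFlogger.

    ```python
    import logging
    import os

    # Analogy only: illustrates named loggers, severity levels, and rank-aware
    # formatting; the environment variable and metrics are invented for the example.
    rank = int(os.environ.get("RANK", "0"))   # assumed to be set by the parallel launcher

    logging.basicConfig(
        level=logging.INFO,
        format=f"%(asctime)s rank={rank} %(name)s %(levelname)s: %(message)s",
    )
    log = logging.getLogger("model.solver")

    log.info("dt = %g s, n_cells = %d", 30.0, 1_000_000)        # configuration echo
    if rank == 0:
        log.info("mass conservation error = %.2e", 1.3e-12)     # simple metric, root rank only
    log.debug("per-step diagnostics suppressed at INFO level")  # filtered out by the level setting
    ```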

  19. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  20. Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; Bell, J; Estep, D

    2008-02-15

    Over the past half-century, the Applied Mathematics program in the U.S. Department of Energy's Office of Advanced Scientific Computing Research has made significant, enduring advances in applied mathematics that have been essential enablers of modern computational science. Motivated by the scientific needs of the Department of Energy and its predecessors, advances have been made in mathematical modeling, numerical analysis of differential equations, optimization theory, mesh generation for complex geometries, adaptive algorithms and other important mathematical areas. High-performance mathematical software libraries developed through this program have contributed as much or more to the performance of modern scientific computer codes as the high-performance computers on which these codes run. The combination of these mathematical advances and the resulting software has enabled high-performance computers to be used for scientific discovery in ways that could only be imagined at the program's inception. Our nation, and indeed our world, face great challenges that must be addressed in coming years, and many of these will be addressed through the development of scientific understanding and engineering advances yet to be discovered. The U.S. Department of Energy (DOE) will play an essential role in providing science-based solutions to many of these problems, particularly those that involve the energy, environmental and national security needs of the country. As the capability of high-performance computers continues to increase, the types of questions that can be answered by applying this huge computational power become more varied and more complex. It will be essential that we find new ways to develop and apply the mathematics necessary to enable the new scientific and engineering discoveries that are needed. In August 2007, a panel of experts in applied, computational and statistical mathematics met for a day and a half in Berkeley, California to understand the mathematical developments required to meet the future science and engineering needs of the DOE. It is important to emphasize that the panelists were not asked to speculate only on advances that might be made in their own research specialties. Instead, the guidance this panel was given was to consider the broad science and engineering challenges that the DOE faces and identify the corresponding advances that must occur across the field of mathematics for these challenges to be successfully addressed. As preparation for the meeting, each panelist was asked to review strategic planning and other informational documents available for one or more of the DOE Program Offices, including the Offices of Science, Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency & Renewable Energy, Electricity Delivery & Energy Reliability and Civilian Radioactive Waste Management as well as the National Nuclear Security Administration. The panelists reported on science and engineering needs for each of these offices, and then discussed and identified mathematical advances that will be required if these challenges are to be met. A review of DOE challenges in energy, the environment and national security brings to light a broad and varied array of questions that the DOE must answer in the coming years. A representative subset of such questions includes: (1) Can we predict the operating characteristics of a clean coal power plant? (2) How stable is the plasma containment in a tokamak? 
(3) How quickly is climate change occurring and what are the uncertainties in the predicted time scales? (4) How quickly can an introduced bio-weapon contaminate the agricultural environment in the US? (5) How do we modify models of the atmosphere and clouds to incorporate newly collected data, possibly of new types? (6) How quickly could the United States recover if part of the power grid became inoperable? (7) What are optimal locations and communication protocols for sensing devices in a remote-sensing network? (8) How can new materials be designed with a specified desirable set of properties? In comparing and contrasting these and other questions of importance to DOE, the panel found that while the scientific breadth of the requirements is enormous, a central theme emerges: scientists are being asked to identify or provide technology, or to give expert analysis to inform policy-makers, all of which require scientific understanding of increasingly complex physical and engineered systems. In addition, as the complexity of the systems of interest increases, neither experimental observation nor mathematical and computational modeling alone can access all components of the system over the entire range of scales or conditions needed to provide the required scientific understanding.

  1. Parallelization and Visual Analysis of Multidimensional Fields: Application to Ozone Production, Destruction, and Transport in Three Dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten

    1997-01-01

    This final report has four sections. We first describe the actual scientific results attained by our research team, followed by a description of the high performance computing research enhancing those results and prompted by the scientific tasks being undertaken. Next, we describe our research in data and program visualization motivated by the scientific research and also enabling it. Last, we comment on the indirect effects this research effort has had on our work, in terms of follow up or additional funding, student training, etc.

  2. Perpetual Ocean - Gulf Stream

    NASA Image and Video Library

    2017-12-08

    This image shows ocean surface currents around the world during the period from June 2005 through December 2007. Go here to view a video of this data: www.flickr.com/photos/gsfc/7009056027/ NASA/Goddard Space Flight Center Scientific Visualization Studio. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  3. Chemical datuments as scientific enablers

    PubMed Central

    2013-01-01

    This article is an attempt to construct a chemical datument as a means of presenting insights into chemical phenomena in a scientific journal. An exploration of the interactions present in a small fragment of duplex Z-DNA and the nature of the catalytic centre of a carbon-dioxide/alkene epoxide alternating co-polymerisation is presented in this datument, with examples of the use of three software tools, one based on Java, the other two using Javascript and HTML5 technologies. The implications for the evolution of scientific journals are discussed. PMID:23343381

  4. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    NASA Astrophysics Data System (ADS)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  5. Water Planning in Phoenix: Managing Risk in the Face of Climatic Uncertainty

    NASA Astrophysics Data System (ADS)

    Gober, P.

    2009-12-01

    The Decision Center for a Desert City (DCDC) was founded in 2004 to develop scientifically credible support tools to improve water management decisions in the face of growing climatic uncertainty and rapid urbanization in metropolitan Phoenix. At the center of DCDC's effort is WaterSim, a model that integrates information about water supply from groundwater, the Colorado River, and upstream watersheds with information about water demand from land-use change and population growth. Decision levers enable users to manipulate model outcomes in response to climate change scenarios, drought conditions, population growth rates, technology innovations, lifestyle changes, and policy decisions. WaterSim allows users to examine the risks of water shortage from global climate change, the tradeoffs between groundwater sustainability and lifestyle choices, the effects of various policy decisions, and the consequences of delaying policy for the exposure to risk. WaterSim is an important point of contact for DCDC’s relationships with local decision makers. Knowledge, tools, and visualizations are co-produced by scientists and policy makers, and the Center’s social scientists mine this co-production process for new insights about model development and application. WaterSim is less a static scientific product and more a dynamic process of engagement between decision makers and scientists.
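
    WaterSim itself is not described in code in this abstract; as a hedged toy sketch of the "decision lever" idea (user-adjustable growth, conservation, and climate-scenario parameters driving a supply/demand balance), the following illustration uses invented parameter names, units, and values that are not taken from WaterSim.

    ```python
    def toy_water_balance(years=30, population=4.2e6, growth_rate=0.02,
                          gpcd=180.0, conservation_rate=0.01,
                          surface_supply=9.0e5, climate_factor=1.0):
        """Toy illustration of decision levers, not the actual WaterSim model.

        Units are acre-feet/year for supply and gallons per capita per day (gpcd)
        for demand, chosen only for illustration.
        """
        AF_PER_GALLON = 1 / 325851.0
        shortfall_years = []
        for year in range(years):
            demand = population * gpcd * 365 * AF_PER_GALLON
            supply = surface_supply * climate_factor
            if demand > supply:
                shortfall_years.append(year)
            population *= 1 + growth_rate       # lever: growth rate
            gpcd *= 1 - conservation_rate       # lever: conservation
        return shortfall_years

    # Compare a dry-climate scenario with and without stronger conservation
    print(len(toy_water_balance(climate_factor=0.8)))
    print(len(toy_water_balance(climate_factor=0.8, conservation_rate=0.03)))
    ```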

  6. Global Climate Models for the Classroom: The Educational Impact of Student Work with a Key Tool of Climate Scientists

    NASA Astrophysics Data System (ADS)

    Bush, D. F.; Sieber, R.; Seiler, G.; Chandler, M. A.; Chmura, G. L.

    2017-12-01

    Efforts to address climate change require public understanding of Earth and climate science. To meet this need, educators require instructional approaches and scientific technologies that overcome cultural barriers to impart conceptual understanding of the work of climate scientists. We compared student inquiry learning using the now-ubiquitous toy models, data, and tools of climate education against inquiry learning using a computational global climate model (GCM) from the National Aeronautics and Space Administration (NASA). Our study at McGill University and John Abbott College in Montreal, QC, sheds light on how best to teach the research processes that are important to Earth and climate scientists studying atmospheric and Earth system processes but ill understood by those outside the scientific community. We followed a pre/post, control/treatment experimental design that enabled detailed analysis and statistically significant results. Our research found more students succeed at understanding climate change when exposed to actual climate research processes and instruments. Inquiry-based education with a GCM resulted in significantly higher scores from pre to post on diagnostic exams (quantitatively) and more complete conceptual understandings (qualitatively). We recognize the difficulty in planning and teaching inquiry with complex technology, and we also found evidence that lectures support learning geared toward assessment exams.

  7. Workshop Report: The Medaka Model for Comparative Assessment of Human Disease Mechanisms

    PubMed Central

    Obara, Tomoko

    2015-01-01

    Results of recent studies showing the utility of medaka as a model of various human disease states were presented at the 7th Aquatic Models of Human Disease Conference (December 13–18, 2014, Austin, TX). This conference brought together many of the most highly regarded national and international scientists who employ the medaka model in their investigations. To take advantage of this opportunity, a cohort of established medaka researchers was asked to stay an extra day and represent the medaka scientific community in a workshop entitled “The Medaka Model for Comparative Assessment of Human Disease Mechanisms”. The central purpose of this medaka workshop was to assess current use and project the future resource needs of the American medaka research community. The workshop sought to spur discussions of issues that would promote more informative comparative disease model studies. Finally, workshop attendees met together to propose, discuss, and agree on recommendations regarding the research resources most needed to enable US scientists to perform experiments leading to impactful results that translate directly to human disease. Consistent with this central purpose, the workshop was divided into two sessions of invited speakers having expertise and experience in the session topics. The workshop hosted 20 scientific participants (Appendices 1 and 2) and of these, nine scientists presented formal talks. Here, we present a summary report stemming from workshop presentations and subsequent round table discussions, and forward recommendations from this group that we believe represent views of the overall medaka research community. PMID:26099189

  8. PIMMS tools for capturing metadata about simulations

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Devine, Gerard; Tourte, Gregory; Pascoe, Stephen; Lawrence, Bryan; Barjat, Hannah

    2013-04-01

    PIMMS (Portable Infrastructure for the Metafor Metadata System) provides a method for consistent and comprehensive documentation of modelling activities that enables the sharing of simulation data and model configuration information. The aim of PIMMS is to package the metadata infrastructure developed by Metafor for CMIP5 so that it can be used by climate modelling groups in UK Universities. PIMMS tools capture information about simulations from the design of experiments to the implementation of experiments via simulations that run models. PIMMS uses the Metafor methodology, which consists of a Common Information Model (CIM), Controlled Vocabularies (CV) and software tools. PIMMS software tools provide for the creation and consumption of CIM content via a web services infrastructure and portal developed by the ES-DOC community. PIMMS metadata integrates with the ESGF data infrastructure via the mapping of vocabularies onto ESGF facets. There are three paradigms of PIMMS metadata collection: Model Intercomparison Projects (MIPs), where a standard set of questions is asked of all models that perform standard sets of experiments; disciplinary-level metadata collection, where a standard set of questions is asked of all models but experiments are specified by users; and bespoke metadata creation, where users define questions about both models and experiments. Examples will be shown of how PIMMS has been configured to suit each of these three paradigms. In each case PIMMS allows users to provide additional metadata beyond that which is asked for in an initial deployment. The primary target for PIMMS is the UK climate modelling community, where it is common practice to reuse model configurations from other researchers. This culture of collaboration exists in part because climate models are very complex with many variables that can be modified. Therefore it has become common practice to begin a series of experiments by using another climate model configuration as a starting point. Usually this other configuration is provided by a researcher in the same research group or by a previous collaborator with whom there is an existing scientific relationship. Some efforts have been made at the university department level to create documentation, but there is a wide diversity in the scope and purpose of this information. The consistent and comprehensive documentation enabled by PIMMS will enable the wider sharing of climate model data and configuration information. The PIMMS methodology assumes an initial effort to document standard model configurations. Once these descriptions have been created, users need only describe the specific way in which their model configuration is different from the standard. Thus the documentation burden on the user is specific to the experiment they are performing and fits easily into the workflow of doing their science. PIMMS metadata is independent of data and as such is ideally suited for documenting model development. PIMMS provides a framework for sharing information about failed model configurations for which data are not kept, the negative results that don't appear in scientific literature. PIMMS is a UK project funded by JISC, The University of Reading, The University of Bristol and STFC.

  9. Inexpensive, Low Power, Open-Source Data Logging hardware development

    NASA Astrophysics Data System (ADS)

    Sandell, C. T.; Schulz, B.; Wickert, A. D.

    2017-12-01

    Over the past six years, we have developed a suite of open-source, low-cost, and lightweight data loggers for scientific research. These loggers employ the popular and easy-to-use Arduino programming environment, but consist of custom hardware optimized for field research. They may be connected to a broad and expanding range of off-the-shelf sensors, with software support built directly into the "ALog" library. Three main models exist: the ALog (for Autonomous or Arduino Logger) is the extreme low-power model for years-long deployments with only primary AA or D batteries. The ALog shield is a stripped-down ALog that nests with a standard Arduino board for prototyping or education. The TLog (for Telemetering Logger) contains an embedded radio with 500 m range and a GPS for communications and precision timekeeping. This enables meshed networks of loggers that can send their data back to an internet-connected "home base" logger for near-real-time field data retrieval. All boards feature a high-precision clock, a full-size SD card slot for high-volume data storage, large screw terminals to connect sensors, interrupts, SPI and I2C communication capability, and 3.3V/5V power outputs. The ALog and TLog have fourteen 16-bit analog inputs with a precision voltage reference for precise analog measurements. Their components are rated -40 to +85 degrees C, and they have been tested in harsh field conditions. These low-cost and open-source data loggers have enabled our research group to collect field data across North and South America on a limited budget, support student projects, and build toward better future scientific data systems.

  10. HydroShare for iUTAH: Collaborative Publication, Interoperability, and Reuse of Hydrologic Data and Models for a Large, Interdisciplinary Water Research Project

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Jones, A. S.

    2016-12-01

    Data and models used within the hydrologic science community are diverse. New research data and model repositories have succeeded in making data and models more accessible, but have been, in most cases, limited to particular types or classes of data or models, and they also lack the collaborative and iterative functionality needed to enable shared data collection and modeling workflows. File sharing systems currently used within many scientific communities for private sharing of preliminary and intermediate data and modeling products do not support collaborative data capture, description, visualization, and annotation. More recently, hydrologic datasets and models have been cast as "social objects" that can be published, collaborated around, annotated, discovered, and accessed. Yet it can be difficult using existing software tools to achieve the kind of collaborative workflows and data/model reuse that many envision. HydroShare is a new, web-based system for sharing hydrologic data and models with specific functionality aimed at making collaboration easier and achieving new levels of interactive functionality and interoperability. Within HydroShare, we have developed new functionality for creating datasets, describing them with metadata, and sharing them with collaborators. HydroShare is enabled by a generic data model and content packaging scheme that supports describing and sharing diverse hydrologic datasets and models. Interoperability among the diverse types of data and models used by hydrologic scientists is achieved through the use of consistent storage, management, sharing, publication, and annotation within HydroShare. In this presentation, we highlight and demonstrate how the flexibility of HydroShare's data model and packaging scheme, HydroShare's access control and sharing functionality, and versioning and publication capabilities have enabled the sharing and publication of research datasets for a large, interdisciplinary water research project called iUTAH (innovative Urban Transitions and Aridregion Hydro-sustainability). We discuss the experiences of iUTAH researchers now using HydroShare to collaboratively create, curate, and publish datasets and models in a way that encourages collaboration, promotes reuse, and meets funding agency requirements.

  11. Concepts of scientific integrative medicine applied to the physiology and pathophysiology of catecholamine systems.

    PubMed

    Goldstein, David S

    2013-10-01

    This review presents concepts of scientific integrative medicine and relates them to the physiology of catecholamine systems and to the pathophysiology of catecholamine-related disorders. The applications to catecholamine systems exemplify how scientific integrative medicine links systems biology with integrative physiology. Concepts of scientific integrative medicine include (i) negative feedback regulation, maintaining stability of the body's monitored variables; (ii) homeostats, which compare information about monitored variables with algorithms for responding; (iii) multiple effectors, enabling compensatory activation of alternative effectors and primitive specificity of stress response patterns; (iv) effector sharing, accounting for interactions among homeostats and phenomena such as hyperglycemia attending gastrointestinal bleeding and hyponatremia attending congestive heart failure; (v) stress, applying a definition as a state rather than as an environmental stimulus or stereotyped response; (vi) distress, using a noncircular definition that does not presume pathology; (vii) allostasis, corresponding to adaptive plasticity of feedback-regulated systems; and (viii) allostatic load, explaining chronic degenerative diseases in terms of effects of cumulative wear and tear. From computer models one can predict mathematically the effects of stress and allostatic load on the transition from wellness to symptomatic disease. The review describes acute and chronic clinical disorders involving catecholamine systems-especially Parkinson disease-and how these concepts relate to pathophysiology, early detection, and treatment and prevention strategies in the post-genome era. Published 2013. Compr Physiol 3:1569-1610, 2013.

  12. Concepts of Scientific Integrative Medicine Applied to the Physiology and Pathophysiology of Catecholamine Systems

    PubMed Central

    Goldstein, David S.

    2016-01-01

    This review presents concepts of scientific integrative medicine and relates them to the physiology of catecholamine systems and to the pathophysiology of catecholamine-related disorders. The applications to catecholamine systems exemplify how scientific integrative medicine links systems biology with integrative physiology. Concepts of scientific integrative medicine include (i) negative feedback regulation, maintaining stability of the body’s monitored variables; (ii) homeostats, which compare information about monitored variables with algorithms for responding; (iii) multiple effectors, enabling compensatory activation of alternative effectors and primitive specificity of stress response patterns; (iv) effector sharing, accounting for interactions among homeostats and phenomena such as hyperglycemia attending gastrointestinal bleeding and hyponatremia attending congestive heart failure; (v) stress, applying a definition as a state rather than as an environmental stimulus or stereotyped response; (vi) distress, using a noncircular definition that does not presume pathology; (vii) allostasis, corresponding to adaptive plasticity of feedback-regulated systems; and (viii) allostatic load, explaining chronic degenerative diseases in terms of effects of cumulative wear and tear. From computer models one can predict mathematically the effects of stress and allostatic load on the transition from wellness to symptomatic disease. The review describes acute and chronic clinical disorders involving catecholamine systems—especially Parkinson disease—and how these concepts relate to pathophysiology, early detection, and treatment and prevention strategies in the post-genome era. PMID:24265239

  13. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Goddard's tower for "static" test near the shop at Roswell, New Mexico, 1930. The observation shelter (left foreground) is visible. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  14. Socio-Scientific Issues and the Quality of Exploratory Talk--What can be Learned from Schools Involved in a "Collapsed Day" Project?

    ERIC Educational Resources Information Center

    Harris, Richard; Ratcliffe, Mary

    2005-01-01

    This project was designed to examine the feasibility of using a "collapsed day" to explore socio-scientific issues relating to genes and genetic engineering in secondary schools by enabling science and humanities staff to collaborate. It was believed that science staff would have expertise in promoting understanding of genetics and humanities…

  15. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Tail piece, with fixed movable air vanes, and vanes movable into the blast, of Dr. Robert Goddard's rocket, May 19, 1937. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  16. Smart article: application of intelligent platforms in next generation biomedical publications.

    PubMed

    Mohammadi, Babak; Saeedi, Marjan; Haghpanah, Vahid

    2017-01-01

    Production of scientific data has accelerated exponentially, though ease of access to the required knowledge remains challenging. Hence, the emergence of new frameworks that allow more efficient storage of information would be beneficial. Intelligent platforms would enable the smart article to serve as a forum for exchanging ideas among experts of academic disciplines, supporting rapid and efficient scientific discourse.

  17. Adults' decision-making about the electronic waste issue: The role of the nature of science conceptualizations and moral concerns in socio-scientific decision-making

    NASA Astrophysics Data System (ADS)

    Yu, Yuqing

    Socio-scientific issues have become increasingly important in Science-Technology-Society (STS) education as a means to make science learning more relevant to students' lives. This study used the e-waste issue as a context to investigate two aspects of socio-scientific decision-making: (1) the relationship between the nature of science (NOS) conceptualizations and decision-making; and (2) moral concerns involved in the process of decision-making. This study contributes to the field of socio-scientific issue research and STS education in the following ways. First, it is the first study that performed meta-analysis to seek the relationship between the NOS understanding and decision-making. This study concludes that valuable NOS conceptualizations that are highly related to the socio-scientific issue under investigation, rather than general NOS understanding, exert statistically significant influences on decision-making. Second, this study empirically examined the Multiple Responses Model (MRM), which enables the transfer of qualitative NOS responses into quantitative data, and hence, inferential statistics. The current study justifies the significance of unidimensionality to the application of the MRM. It addresses the limitations associated with the MRM and provides implications for future use of the MRM in other contexts. Finally, the study explores the role of moral concerns in socio-scientific decision-making. Eight participants engaged in interviews that were designed to elicit their reactions and feelings regarding the issue of exporting e-waste to poor countries. Qualitative analyses demonstrated that moral considerations were significant influences on decision-making. In addition, participants' action responses revealed that they were motivated to take action to help the environment. The study has implications for socio-scientific issue studies in other contexts and for teacher education programs that use socio-scientific issues to advance teachers' reasoning and discourse skills.

  18. Constructed vs. received graphical representations for learning about scientific controversy: Implications for learning and coaching

    NASA Astrophysics Data System (ADS)

    Cavalli-Sforza, Violetta Laura Maria

    Students in science classes hardly ever study scientific controversy, especially in terms of the different types of arguments used to support and criticize theories and hypotheses. Yet, learning the reasons for scientific debate and scientific change is an important part of appreciating the nature of the scientific enterprise and communicating it to the non-scientific world. This dissertation explores the usefulness of graphical representations in teaching students about scientific arguments. Subjects participating in an extended experiment studied instructional materials and used the Belvedere graphical interface to analyze texts drawn from an actual scientific debate. In one experimental condition, subjects used a box-and-arrow representation whose primitive graphical elements had preassigned meanings tailored to the domain of instruction. In the other experimental condition, subjects could use the graphical elements as they wished, thereby creating their own representation. The development of a representation, by forcing a deeper analysis, can potentially yield a greater understanding of the domain under study. The results of the research suggest two conclusions. From the perspective of learning target concepts, asking subjects to develop their own representation may not hurt those subjects who gain a sufficient understanding of the possibilities of abstract representation. The risks are much greater for less able subjects because, if they develop a representation that is inadequate for expressing the target concepts, they will use those concepts less or not at all. From the perspective of coaching subjects as they diagram their analysis of texts, a predefined representation has significant advantages. If it is appropriately expressive for the task, it provides a common language and clearer shared meaning between the subject and the coach. It also enables the coach to understand subjects' analysis more easily, and to evaluate it more effectively against the coach's own model of the ideal analysis.

  19. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huerta, Gabriel

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While no bias is desirable, only those biases that affect feedbacks affect the scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, a set of Python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab and Karki) worked on the project.
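
    The report describes Bayesian, data-driven calibration in general terms; as a hedged, minimal sketch of the underlying idea (sampling a model parameter against observations under a Gaussian skill metric with a random-walk Metropolis sampler), the toy model and data below are invented and are not the CAM3.1 ensembles or the MECS workflow.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def toy_model(theta, x):
        return np.sin(theta * x)              # stand-in for a model output field

    x_obs = np.linspace(0, 3, 40)
    y_obs = toy_model(1.7, x_obs) + rng.normal(0, 0.1, x_obs.size)   # synthetic "observations"

    def log_posterior(theta, sigma=0.1):
        if not 0.0 < theta < 5.0:             # flat prior on (0, 5)
            return -np.inf
        resid = y_obs - toy_model(theta, x_obs)
        return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian likelihood as the skill measure

    samples, theta = [], 1.0
    lp = log_posterior(theta)
    for _ in range(5000):
        prop = theta + rng.normal(0, 0.05)
        lp_prop = log_posterior(prop)
        if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)

    print(np.mean(samples[1000:]), np.std(samples[1000:]))   # posterior mean and spread
    ```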

  20. The Virtual Geophysics Laboratory (VGL): Scientific Workflows Operating Across Organizations and Across Infrastructures

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.

    2012-12-01

    The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected, the user is not required to download it. VGL collates the service query information for later in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with access to an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both the research communities and commercial suppliers (e.g. Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used, for example, for natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.

  1. The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,

    2005-01-01

    The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability, enabling modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation object-oriented reservoir and river-system modeling framework, RiverWare, to develop models to evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses. Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.

  2. Tackling childhood obesity: the importance of understanding the context.

    PubMed

    Knai, Cécile; McKee, Martin

    2010-12-01

    Recommendations to tackle major health problems such as childhood obesity may not be appropriate if they fail to take account of the prevailing socio-political, cultural and economic context. We describe the development and application of a qualitative risk analysis approach to identify non-scientific considerations framing the policy response to obesity in Denmark and Latvia. Interviews were conducted with key stakeholders in Denmark and Latvia, following a review of relevant literature on obesity and national policies. A qualitative risk analysis model was developed to help explain the findings in the light of national context. Non-scientific considerations that appeared to influence the response to obesity include the perceived relative importance of childhood obesity; the nature of stakeholder relations and its impact on decision-making; the place of obesity on the policy agenda; the legitimacy of the state to act for population health; and views on alliances between public and private sectors. Better recognition of the exogenous factors affecting policy-making may lead to a more adequate policy response. The development and use of a qualitative risk analysis model enabled a better understanding of the contextual factors and processes influencing the response to childhood obesity in each country.

  3. The role of fluctuations and interactions in pedestrian dynamics

    NASA Astrophysics Data System (ADS)

    Corbetta, Alessandro; Meeusen, Jasper; Benzi, Roberto; Lee, Chung-Min; Toschi, Federico

    Understanding quantitatively the statistical behaviour of pedestrians walking in crowds is a major scientific challenge of paramount societal relevance. Walking humans exhibit a rich (stochastic) dynamics whose small and large deviations are driven, among other factors, by individual intent as well as by environmental conditions. Via 24/7 automatic pedestrian tracking from multiple overhead Microsoft Kinect depth sensors, we collected large ensembles of pedestrian trajectories (on the order of tens of millions) in different real-life scenarios. These scenarios include both narrow corridors and large urban hallways, enabling us to cover and compare a wide spectrum of typical pedestrian dynamics. We investigate the pedestrian motion by measuring PDFs, e.g. those of position, velocity and acceleration, at unprecedentedly high statistical resolution. We consider the dependence of the PDFs on flow conditions, focusing on diluted dynamics and pair-wise interactions (''collisions'') for mutual avoidance. We describe the measured data by means of Langevin-like models, including typical fluctuations and rare events. This work is part of the JSTP research programme ``Vision driven visitor behaviour analysis and crowd management'' with Project Number 341-10-001, which is financed by the Netherlands Organisation for Scientific Research (NWO).
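
    As a hedged illustration of what a Langevin-like description of pedestrian motion can look like, the sketch below integrates a one-dimensional velocity that relaxes toward a preferred walking speed and is perturbed by white noise (Euler-Maruyama scheme). The functional form and parameter values are assumptions for illustration, not the models fitted in this study.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    v_pref = 1.3      # preferred walking speed (m/s), assumed
    tau    = 0.5      # relaxation time (s), assumed
    sigma  = 0.25     # noise amplitude, assumed
    dt, steps = 0.01, 10_000

    x, v = 0.0, v_pref
    velocities = []
    for _ in range(steps):
        # dv = -(v - v_pref)/tau * dt + sigma * dW   (Euler-Maruyama step)
        v += (-(v - v_pref) / tau) * dt + sigma * np.sqrt(dt) * rng.normal()
        x += v * dt
        velocities.append(v)

    # Velocity PDF from the simulated trajectory, comparable in spirit to the
    # measured PDFs discussed above.
    hist, edges = np.histogram(velocities, bins=50, density=True)
    print(np.mean(velocities), np.std(velocities))
    ```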

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagberg, Aric; Swart, Pieter; S Chult, Daniel

    NetworkX is a Python language package for exploration and analysis of networks and network algorithms. The core package provides data structures for representing many types of networks, or graphs, including simple graphs, directed graphs, and graphs with parallel edges and self loops. The nodes in NetworkX graphs can be any (hashable) Python object and edges can contain arbitrary data; this flexibility makes NetworkX ideal for representing networks found in many different scientific fields. In addition to the basic data structures, many graph algorithms are implemented for calculating network properties and structure measures: shortest paths, betweenness centrality, clustering, degree distributions, and many more. NetworkX can read and write various graph formats for easy exchange with existing data, and generators for many classic graphs and popular graph models, such as the Erdős–Rényi, small-world, and Barabási–Albert models, are included. The ease of use and flexibility of the Python programming language, together with its connection to the SciPy tools, make NetworkX a powerful tool for scientific computations. We discuss some of our recent work studying synchronization of coupled oscillators to demonstrate how NetworkX enables research in the field of computational networks.
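
    The following short example illustrates the kind of usage the description refers to: building a random graph with one of the included generators and computing a few of the structure measures mentioned above. The graph size, probability, and node labels are arbitrary choices for the example.

    ```python
    import networkx as nx

    # Build a small Erdos-Renyi random graph and compute a few structure measures.
    G = nx.erdos_renyi_graph(n=100, p=0.05, seed=42)

    print(nx.number_connected_components(G))
    print(nx.average_clustering(G))

    bc = nx.betweenness_centrality(G)
    hub = max(bc, key=bc.get)                    # most central node
    print(hub, bc[hub])

    degrees = [d for _, d in G.degree()]         # degree distribution
    print(sum(degrees) / len(degrees))

    # Shortest path between two nodes, if they are connected
    if nx.has_path(G, 0, 1):
        print(nx.shortest_path(G, 0, 1))

    # Nodes can be arbitrary hashable objects and edges can carry data
    H = nx.Graph()
    H.add_edge("oscillator-A", "oscillator-B", coupling=0.8)
    print(H["oscillator-A"]["oscillator-B"]["coupling"])
    ```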

  5. Picturing the Sun’s Magnetic Field

    NASA Image and Video Library

    2017-12-08

    This illustration lays a depiction of the sun's magnetic fields over an image captured by NASA’s Solar Dynamics Observatory on March 12, 2016. The complex overlay of lines can teach scientists about the ways the sun's magnetism changes in response to the constant movement on and inside the sun. Note how the magnetic fields are densest near the bright spots visible on the sun – which are magnetically strong active regions – and many of the field lines link one active region to another. This magnetic map was created using the PFSS – Potential Field Source Surface – model, a model of the magnetic field in the sun’s atmosphere based on magnetic measurements of the solar surface. The underlying image was taken in extreme ultraviolet wavelengths of 171 angstroms. This type of light is invisible to our eyes, but is colorized here in gold. Credits: NASA/SDO/AIA/LMSAL. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  6. Plasma plume diagnostics of low power stationary plasma thruster (SPT-20M8) with collisional radiative model

    NASA Astrophysics Data System (ADS)

    Uttamsing Rajput, Rajendrasing; Alona, Khaustova; Loyan, Andriy V.

    2017-03-01

    Electric propulsion offers higher specific impulse than chemical propulsion systems. It reduces the overall propellant mass and enables long operational lifetimes. The Scientific Technological Center of Space Power and Energy (STC SPE), KhAI, is involved in designing, manufacturing and testing stationary plasma thrusters (SPT). Plasma diagnostics are performed with both corona and collisional-radiative (C-R) models; as expected, the corona model falls short below 4 eV because it neglects heavy-particle collisions, whereas the applicability of the C-R model is confirmed. Several tests are performed to analyze the electron temperature at various thruster operating parameters, such as discharge voltage and mass flow rate. Far- and near-field plume diagnostics of the SPT-20M8 are performed. The feasibility of the C-R model for determining electron temperature from optical emission spectroscopy (OES) is validated against probe measurements to within 10% discrepancy.

  7. Opportunities and challenges in the wider adoption of liver and interconnected microphysiological systems

    PubMed Central

    Kostrzewski, Tomasz; Sceats, Emma L

    2017-01-01

    Liver disease represents a growing global health burden. The development of in vitro liver models which allow the study of disease and the prediction of metabolism and drug-induced liver injury in humans remains a challenge. The maintenance of functional cultures of primary hepatocytes, the parenchymal cells of the liver, has historically been difficult, with dedifferentiation and the consequent loss of hepatic function limiting utility. The desire for longer term functional liver cultures sparked the development of numerous systems, including collagen sandwiches, spheroids, micropatterned co-cultures and liver microphysiological systems. This review will focus on liver microphysiological systems, often referred to as liver-on-a-chip, and broaden to include platforms with interconnected microphysiological systems or multi-organ-chips. The interconnection of microphysiological systems presents the opportunity to explore system level effects, investigate organ cross talk, and address questions which were previously the preserve of animal experimentation. As a field, microphysiological systems have reached a level of maturity suitable for commercialization and consequent evaluation by a wider community of users, in academia and the pharmaceutical industry. Here scientific, operational, and organizational considerations relevant to the wider adoption of microphysiological systems will be discussed. Applications in which microphysiological systems might offer unique scientific insights or enable studies currently feasible only with animal models are described, and challenges which might be addressed to enable wider adoption of the technologies are highlighted. A path forward which envisions the development of microphysiological systems in partnerships between academia, vendors and industry is proposed. Impact statement: Microphysiological systems are in vitro models of human tissues and organs. These systems have advanced rapidly in recent years and are now being commercialized. To achieve wide adoption in the biological and pharmaceutical research communities, microphysiological systems must provide unique insights which translate to humans. This will be achieved by identifying key applications and making microphysiological systems intuitive to use. PMID:28504617

  8. Opportunities and challenges in the wider adoption of liver and interconnected microphysiological systems.

    PubMed

    Hughes, David J; Kostrzewski, Tomasz; Sceats, Emma L

    2017-10-01

    Liver disease represents a growing global health burden. The development of in vitro liver models which allow the study of disease and the prediction of metabolism and drug-induced liver injury in humans remains a challenge. The maintenance of functional cultures of primary hepatocytes, the parenchymal cells of the liver, has historically been difficult, with dedifferentiation and the consequent loss of hepatic function limiting utility. The desire for longer-term functional liver cultures sparked the development of numerous systems, including collagen sandwiches, spheroids, micropatterned co-cultures and liver microphysiological systems. This review will focus on liver microphysiological systems, often referred to as liver-on-a-chip, and broaden to include platforms with interconnected microphysiological systems or multi-organ-chips. The interconnection of microphysiological systems presents the opportunity to explore system-level effects, investigate organ cross talk, and address questions which were previously the preserve of animal experimentation. As a field, microphysiological systems have reached a level of maturity suitable for commercialization and consequent evaluation by a wider community of users, in academia and the pharmaceutical industry. Here scientific, operational, and organizational considerations relevant to the wider adoption of microphysiological systems will be discussed. Applications in which microphysiological systems might offer unique scientific insights or enable studies currently feasible only with animal models are described, and challenges which might be addressed to enable wider adoption of the technologies are highlighted. A path forward, which envisions the development of microphysiological systems in partnerships between academia, vendors and industry, is proposed. Impact statement Microphysiological systems are in vitro models of human tissues and organs. These systems have advanced rapidly in recent years and are now being commercialized. To achieve wide adoption in the biological and pharmaceutical research communities, microphysiological systems must provide unique insights which translate to humans. This will be achieved by identifying key applications and making microphysiological systems intuitive to use.

  9. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher-level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time spans and large spatial scales of these new data than current approaches do. At the core of ISCE are both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package and a new InSAR processing package containing more efficient and more accurate processing algorithms being developed at Stanford for this project, based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.
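
    To make the componentization idea concrete, here is a minimal sketch of an object-oriented wrapper around a legacy processing step with declared, validated inputs; the class and parameter names are invented for illustration and do not reflect the actual ISCE API.

    ```python
    class Component:
        """Hypothetical base class: a processing step with declared, validated inputs."""
        required = ()                      # parameter names a concrete step must receive

        def __init__(self):
            self.parameters = {}

        def configure(self, **kwargs):
            self.parameters.update(kwargs)
            missing = [name for name in self.required if name not in self.parameters]
            if missing:
                raise ValueError(f"missing parameters: {missing}")
            return self


    class FormSLC(Component):
        """Stand-in wrapper around a legacy raw-to-SLC focusing routine."""
        required = ("raw_file", "output_file")

        def run(self):
            # A real wrapper would marshal self.parameters into the legacy
            # processor's expected inputs and invoke the compiled code here.
            print(f"focusing {self.parameters['raw_file']} -> {self.parameters['output_file']}")


    FormSLC().configure(raw_file="scene.raw", output_file="scene.slc").run()
    ```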

  10. Publishing and Editing of Semantically-Enabled Scientific Metadata Across Multiple Web Platforms: Challenges and Experiences

    NASA Astrophysics Data System (ADS)

    Patton, E. W.; West, P.; Greer, R.; Jin, B.

    2011-12-01

    Following on work presented at the 2010 AGU Fall Meeting, we present a number of real-world collections of semantically-enabled scientific metadata ingested into the Tetherless World RDF2HTML system as structured data and presented and edited using that system. Two separate datasets from two different domains (oceanography and solar sciences) are made available using existing web standards and services, e.g. encoded using ontologies represented with the Web Ontology Language (OWL) and stored in a SPARQL endpoint for querying. These datasets are deployed for use in three different web environments, i.e. Drupal, MediaWiki, and a custom web portal written in Java, to highlight the cross-platform nature of the data presentation. Stylesheets used to transform concepts in each domain as well as shared terms into HTML will be presented to show the power of using common ontologies to publish data and support reuse of existing terminologies. In addition, a single domain dataset is shared between two separate portal instances to demonstrate the ability for this system to offer distributed access and modification of content across the Internet. Lastly, we will highlight challenges that arose in the software engineering process, outline the design choices we made in solving those issues, and discuss how future improvements to this and other systems will enable the evolution of distributed, decentralized collaborations for scientific data sharing across multiple research groups.
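
    The metadata described above are stored in a SPARQL endpoint; a minimal sketch of how such an endpoint might be queried for dataset titles is shown below. The endpoint URL and the Dublin Core predicate are assumptions for illustration, not the project's actual vocabulary or service address.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

    # Hypothetical endpoint and vocabulary; the ontologies used by the
    # RDF2HTML system may differ.
    endpoint = SPARQLWrapper("https://example.org/sparql")
    endpoint.setQuery("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?dataset ?title WHERE {
            ?dataset dcterms:title ?title .
        } LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)

    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["dataset"]["value"], "-", row["title"]["value"])
    ```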

  11. Enabling Grid Computing resources within the KM3NeT computing model

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers, with several computing centres providing specific sets of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method by which KM3NeT users can utilize the EGI computing resources in a simulation-driven use case.
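
    As an illustration of how a simulation job might be described for submission to EGI grid resources, the sketch below writes a gLite/EGI-style JDL job description; the executable, sandbox files and file names are hypothetical placeholders, not the KM3NeT production setup.

    ```python
    # Minimal sketch: emit a JDL-style job description for an EGI grid site.
    # Executable and sandbox contents are hypothetical placeholders.
    job = {
        "Executable": "run_km3_sim.sh",
        "Arguments": "geometry.detx run_0001.conf",
        "StdOutput": "sim.out",
        "StdError": "sim.err",
        "InputSandbox": ["run_km3_sim.sh", "geometry.detx", "run_0001.conf"],
        "OutputSandbox": ["sim.out", "sim.err"],
    }

    def to_jdl(spec: dict) -> str:
        lines = []
        for key, value in spec.items():
            if isinstance(value, list):
                items = ", ".join(f'"{v}"' for v in value)
                lines.append(f"{key} = {{{items}}};")
            else:
                lines.append(f'{key} = "{value}";')
        return "[\n  " + "\n  ".join(lines) + "\n]"

    with open("km3net_sim.jdl", "w") as fh:
        fh.write(to_jdl(job))
    print(to_jdl(job))
    ```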

  12. NCA-LDAS: A Terrestrial Water Analysis System Enabling Sustained Assessment and Dissemination of National Climate Indicators

    NASA Astrophysics Data System (ADS)

    Jasinski, M. F.; Kumar, S.; Peters-Lidard, C. D.; Arsenault, K. R.; Beaudoing, H. K.; Bolten, J. D.; Borak, J.; Kempler, S.; Li, B.; Mocko, D. M.; Rodell, M.; Rui, H.; Silberstein, D. S.; Teng, W. L.; Vollmer, B.

    2016-12-01

    The National Climate Assessment - Land Data Assimilation System, or NCA-LDAS, is an integrated terrestrial water analysis system created as an end-to-end enabling tool for sustained assessment and dissemination of terrestrial hydrologic indicators in support of the NCA. The primary features are i) gridded, daily time series of over forty hydrologic variables including terrestrial water and energy balance stores, states and fluxes over the continental U.S. derived from land surface modeling with multivariate satellite data record assimilation (1979-2015), ii) estimated trends of the principal water balance components over a wide range of scales and locations, and iii) public dissemination of all NCA-LDAS model forcings, and input and output data products through dedicated NCA-LDAS and NASA GES-DISC websites. NCA-LDAS supports sustained assessment of our national terrestrial hydrologic climate for improved scientific understanding, and the adaptation and management of water resources and related energy sectors. This presentation provides an overview of the NCA-LDAS system together with an evaluation of the initial release of NCA-LDAS data products and trends using two land surface models, Noah Ver. 3.3 and Catchment Ver. Fortuna 2.5, and a listing of several available pathways for public access and visualization of NCA-LDAS background information and data products.
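
    A minimal sketch of how a linear trend in one of the daily water-balance variables might be estimated at every grid cell; the synthetic array below stands in for an NCA-LDAS field that would in practice be read from the distributed NetCDF products.

    ```python
    import numpy as np

    # Illustrative daily anomalies on a (time, lat, lon) grid with a small
    # imposed trend; real NCA-LDAS fields would be loaded from NetCDF files.
    rng = np.random.default_rng(0)
    ntime, nlat, nlon = 365 * 10, 4, 5
    data = rng.normal(size=(ntime, nlat, nlon)) + 0.0001 * np.arange(ntime)[:, None, None]

    # Least-squares slope per grid cell (units per day), via the closed form
    # slope = cov(t, y) / var(t), vectorized over all cells at once.
    t = np.arange(ntime)
    t_anom = t - t.mean()
    slope = np.tensordot(t_anom, data - data.mean(axis=0), axes=(0, 0)) / (t_anom ** 2).sum()
    print(slope.shape)    # (4, 5): one trend estimate per grid cell
    print(slope.mean())   # should be near the imposed 1e-4 per-day trend
    ```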

  13. Cosmology Without Finality

    NASA Astrophysics Data System (ADS)

    Mahootian, F.

    2009-12-01

    The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive" framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries, can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness" of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything." But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final" theory.

  14. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-As-A-Service (IAAS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
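
    The bursting idea can be illustrated with a simplified provisioning policy: request grid pilots up to the available grid capacity and route any overflow demand to cloud glideins. This is a hedged sketch of the general logic only, with made-up thresholds, and not the actual GlideinWMS frontend code.

    ```python
    # Simplified cloud-bursting policy sketch (not GlideinWMS frontend code):
    # when idle demand exceeds what grid pilots can absorb, request cloud glideins.

    GRID_PILOT_LIMIT = 500      # assumed cap on grid-provisioned pilots
    CLOUD_BURST_CHUNK = 50      # assumed number of cloud VMs requested per cycle

    def plan_provisioning(idle_jobs: int, running_pilots: int) -> dict:
        """Decide how many grid pilots and cloud glideins to request this cycle."""
        grid_request = min(idle_jobs, max(0, GRID_PILOT_LIMIT - running_pilots))
        overflow = max(0, idle_jobs - grid_request)
        cloud_request = min(overflow, CLOUD_BURST_CHUNK)
        return {"grid": grid_request, "cloud": cloud_request}

    print(plan_provisioning(idle_jobs=700, running_pilots=480))
    # -> {'grid': 20, 'cloud': 50}: overflow beyond grid capacity bursts to the cloud
    ```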

  15. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-As-A-Service (IAAS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  16. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand in the way of harnessing the full benefits of cloud computing and prevent clients from

  17. Technology Needs of Future Space Infrastructures Supporting Human Exploration and Development of Space

    NASA Technical Reports Server (NTRS)

    Carrington, Connie; Howell, Joe

    2001-01-01

    The path to human presence beyond near-Earth space will be paved by the development of infrastructure. A fundamental technology in this infrastructure is energy, which enables not only the basic function of providing shelter for man and machine, but also transportation, scientific endeavors, and exploration. This paper discusses the near-term technology needs for developing the infrastructure for HEDS.

  18. Analyzing Ocean Tracks: A model for student engagement in authentic scientific practices using data

    NASA Astrophysics Data System (ADS)

    Krumhansl, K.; Krumhansl, R.; Brown, C.; DeLisi, J.; Kochevar, R.; Sickler, J.; Busey, A.; Mueller-Northcott, J.; Block, B.

    2013-12-01

    The collection of large quantities of scientific data has not only transformed science, but holds the potential to transform teaching and learning by engaging students in authentic scientific work. Furthermore, it has become imperative in a data-rich world that students gain competency in working with and interpreting data. The Next Generation Science Standards reflect both the opportunity and need for greater integration of data in science education, and emphasize that both scientific knowledge and practice are essential elements of science learning. The process of enabling access by novice learners to data collected and used by experts poses significant challenges; however, recent research has demonstrated that barriers to student learning with data can be overcome by the careful design of data access and analysis tools that are specifically tailored to students. A group of educators at Education Development Center, Inc. (EDC) and scientists at Stanford University's Hopkins Marine Station are collaborating to develop and test a model for student engagement with scientific data using a web-based platform. This model, called Ocean Tracks: Investigating Marine Migrations in a Changing Ocean, provides students with the ability to plot and analyze tracks of migrating marine animals collected through the Tagging of Pacific Predators program. The interface and associated curriculum support students in identifying relationships between animal behavior and physical oceanographic variables (e.g. SST, chlorophyll, currents), making linkages between the living world and climate. Students are also supported in investigating possible sources of human impact on important biodiversity hotspots in the Pacific Ocean. The first round of classroom testing revealed that students were able to easily access and display data on the interface, and collect measurements from the animal tracks and oceanographic data layers. They were able to link multiple types of data to draw powerful inferences about how marine animal behavior is influenced by the ocean environment, and propose strategies to protect marine animals in the context of a changing ocean. Classroom testing also revealed the importance of providing students with real-world context to their learning, and the opportunity to directly compare their scientific investigations of data with those of scientists in the field. Our results also identified that student engagement was enhanced when they developed a direct personal connection to their scientific investigations by linking human activities to changes occurring in the natural world, and visualizing these changes using authentic data. This presentation will review the design elements of the Ocean Tracks interface and associated curriculum, our successes and challenges in supporting students in data-based learning, and discuss specific linkages to the NGSS.

  19. An Ontology for Representing Geoscience Theories and Related Knowledge

    NASA Astrophysics Data System (ADS)

    Brodaric, B.

    2009-12-01

    Online scientific research, or e-science, is increasingly reliant on machine-readable representations of scientific data and knowledge. At present, much of the knowledge is represented in ontologies, which typically contain geoscience categories such as ‘water body’, ‘aquifer’, ‘granite’, ‘temperature’, ‘density’, ‘CO2’. While extremely useful for many e-science activities, such categorical representations constitute only a fragment of geoscience knowledge. Also needed are online representations of elements such as geoscience theories, to enable geoscientists to pose and evaluate hypotheses online. To address this need, the Science Knowledge Infrastructure ontology (SKIo) specializes the DOLCE foundational ontology with basic science knowledge primitives such as theory, model, observation, and prediction. Discussed will be SKIo as well as its implementation in the geosciences, including case studies from marine science, environmental science, and geologic mapping. These case studies demonstrate SKIo’s ability to represent a wide spectrum of geoscience knowledge types, to help fuel next-generation e-science.
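
    A hedged sketch of how theory, observation and prediction primitives might be expressed as RDF triples with rdflib; the namespace, class and property names below are invented for illustration and are not the actual SKIo terms.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF, RDFS  # pip install rdflib

    # Hypothetical namespace standing in for SKIo-style science-knowledge primitives.
    SKI = Namespace("https://example.org/skio#")
    EX = Namespace("https://example.org/geo#")

    g = Graph()
    g.bind("ski", SKI)

    # Declare basic primitives: a theory, an observation, and a prediction linked to both.
    g.add((EX.DarcyFlowTheory, RDF.type, SKI.Theory))
    g.add((EX.WellDrawdownObs, RDF.type, SKI.Observation))
    g.add((EX.DrawdownPrediction, RDF.type, SKI.Prediction))
    g.add((EX.DrawdownPrediction, SKI.derivedFrom, EX.DarcyFlowTheory))
    g.add((EX.DrawdownPrediction, SKI.evaluatedAgainst, EX.WellDrawdownObs))
    g.add((EX.DarcyFlowTheory, RDFS.label, Literal("Darcy flow theory (illustrative)")))

    print(g.serialize(format="turtle"))
    ```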

  20. CAWSES Related Projects in Japan: Grant-in-Aid for Creative Scientific Research "Basic Study of Space Weather Prediction" and CHAIN (Continuous H Alpha Imaging Network)

    NASA Astrophysics Data System (ADS)

    Shibata, K.; Kurokawa, H.

    The Grant-in-Aid for Creative Scientific Research of the Ministry of Education, Culture, Sports, Science and Technology of Japan, "The Basic Study of Space Weather Prediction" (PI: K. Shibata, Kyoto Univ.), started in 2005 as a five-year project with a total budget of 446 M yen. The purpose of this project is to develop a physical model of solar-terrestrial phenomena and space storms as a basis for space weather prediction, by resolving the fundamental physics of key phenomena from solar flares and coronal mass ejections to magnetospheric storms, under the international cooperation program CAWSES (Climate and Weather of the Sun-Earth System). The Continuous H Alpha Imaging Network (CHAIN) project, led by H. Kurokawa, is a key project in this space weather study, enabling continuous H-alpha full-Sun observations by connecting solar telescopes in many countries through the internet, which provides the basis for the study of space weather prediction.

  1. Technical note: The Linked Paleo Data framework - a common tongue for paleoclimatology

    NASA Astrophysics Data System (ADS)

    McKay, Nicholas P.; Emile-Geay, Julien

    2016-04-01

    Paleoclimatology is a highly collaborative scientific endeavor, increasingly reliant on online databases for data sharing. Yet there is currently no universal way to describe, store and share paleoclimate data: in other words, no standard. Data standards are often regarded by scientists as mere technicalities, though they underlie much scientific and technological innovation, as well as facilitating collaborations between research groups. In this article, we propose a preliminary data standard for paleoclimate data, general enough to accommodate all the archive and measurement types encountered in a large international collaboration (PAGES 2k). We also introduce a vehicle for such structured data (Linked Paleo Data, or LiPD), leveraging recent advances in knowledge representation (Linked Open Data). The LiPD framework enables quick querying and extraction, and we expect that it will facilitate the writing of open-source community codes to access, analyze, model and visualize paleoclimate observations. We welcome community feedback on this standard, and encourage paleoclimatologists to experiment with the format for their own purposes.
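
    A sketch of what a minimal LiPD-style record might contain and how structured storage makes programmatic extraction straightforward; the field names follow the general spirit of the framework but are simplified assumptions, not the authoritative schema.

    ```python
    import json

    # Simplified, illustrative LiPD-style record: archive metadata plus one
    # measurement table. Field names are assumptions, not the official schema.
    record = {
        "dataSetName": "ExampleLake.Smith.2015",
        "archiveType": "lake sediment",
        "geo": {"latitude": 60.1, "longitude": -149.4, "elevation": 420},
        "paleoData": [{
            "measurementTable": {
                "columns": [
                    {"variableName": "age", "units": "yr BP", "values": [100, 200, 300]},
                    {"variableName": "d18O", "units": "permil", "values": [-3.1, -2.8, -3.4]},
                ]
            }
        }],
    }

    def get_column(rec, name):
        """Pull a named variable out of the first measurement table."""
        for col in rec["paleoData"][0]["measurementTable"]["columns"]:
            if col["variableName"] == name:
                return col["values"]

    print(json.dumps(record["geo"]))
    print(get_column(record, "d18O"))
    ```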

  2. dREL: a relational expression language for dictionary methods.

    PubMed

    Spadaccini, Nick; Castleden, Ian R; du Boulay, Doug; Hall, Sydney R

    2012-08-27

    The provision of precise metadata is an important but largely underrated challenge for modern science [Nature 2009, 461, 145]. We describe here a dictionary methods language, dREL, that has been designed to enable complex data relationships to be expressed as formulaic scripts in data dictionaries written in DDLm [Spadaccini and Hall, J. Chem. Inf. Model. 2012, doi:10.1021/ci300075z]. dREL describes data relationships in a simple but powerful canonical form that is easy to read and understand and can be executed computationally to evaluate or validate data. The execution of dREL expressions is not a substitute for traditional scientific computation; it is to provide precise data dependency information to domain-specific definitions and a means for cross-validating data. Some scientific fields apply conventional programming languages to methods scripts but these tend to inhibit both dictionary development and accessibility. dREL removes the programming barrier and encourages the production of the metadata needed for seamless data archiving and exchange in science.
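
    As an analogue of the kind of derived-item relationship a dREL method encodes, here is a Python sketch (not dREL syntax) that evaluates a unit-cell volume from cell lengths and angles and uses it to cross-validate a reported value; the reported number is a made-up placeholder.

    ```python
    import math

    def cell_volume(a, b, c, alpha, beta, gamma):
        """Unit-cell volume from lengths (angstrom) and angles (degrees): the kind of
        derived-item relationship a dictionary method would attach to a definition."""
        ca, cb, cg = (math.cos(math.radians(x)) for x in (alpha, beta, gamma))
        return a * b * c * math.sqrt(1 - ca**2 - cb**2 - cg**2 + 2 * ca * cb * cg)

    # Cross-validation use: recompute the derived item and compare with the reported value.
    reported = 1270.0   # hypothetical cell volume taken from a data file
    computed = cell_volume(10.0, 11.0, 12.0, 90.0, 95.0, 105.0)
    print(computed, abs(computed - reported) < 0.5)
    ```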

  3. Attack of the Killer Fungus: A Hypothesis-Driven Lab Module †

    PubMed Central

    Sato, Brian K.

    2013-01-01

    Discovery-driven experiments in undergraduate laboratory courses have been shown to increase student learning and critical thinking abilities. To this end, a lab module involving worm capture by a nematophagous fungus was developed. The goals of this module are to enhance scientific understanding of the regulation of worm capture by soil-dwelling fungi and for students to attain a set of established learning goals, including the ability to develop a testable hypothesis and search for primary literature for data analysis, among others. Students in a ten-week majors lab course completed the lab module and generated novel data as well as data that agrees with the published literature. In addition, learning gains were achieved as seen through a pre-module and post-module test, student self-assessment, class exam, and lab report. Overall, this lab module enables students to become active participants in the scientific method while contributing to the understanding of an ecologically relevant model organism. PMID:24358387

  4. Exploring students' epistemological knowledge of models and modelling in science: results from a teaching/learning experience on climate change

    NASA Astrophysics Data System (ADS)

    Tasquier, Giulia; Levrini, Olivia; Dillon, Justin

    2016-03-01

    The scientific community has been debating climate change for over two decades. In the light of certain arguments put forward by the aforesaid community, the EU has recommended a set of innovative reforms to science teaching such as incorporating environmental issues into the scientific curriculum, thereby helping to make schools a place of civic education. However, despite these European recommendations, relatively little emphasis is still given to climate change within science curricula. Climate change, although potentially engaging for students, is a complex topic that poses conceptual difficulties and emotional barriers, as well as epistemological challenges. Whilst the conceptual and emotional barriers have already been the object of several studies, students' reactions to the epistemological issues raised by climate change have so far been rarely explored in science education research and thus are the main focus of this paper. This paper describes a study concerning the implementation of teaching materials designed to focus on the epistemological role of 'models and the game of modelling' in science and particularly when dealing with climate change. The materials were implemented in a course of 15 hours (five 3-hour lessons) for a class of Italian secondary-school students (grade 11; 16-17 years old). The purpose of the study is to investigate students' reactions to the epistemological dimension of the materials, and to explore if and how the material enabled them to develop their epistemological knowledge on models.

  5. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.

    PubMed

    Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-12-10

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III-V-based detector chips with narrow bandgap thin-films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with a tunable absorption peak. A photodiode with a 150-nm-thick absorber in an inverted architecture shows a dark current of 10⁻⁶ A/cm² at -2 V reverse bias and an EQE above 20% at 1440 nm wavelength. Optical modeling for a top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling order-of-magnitude cost reduction for infrared sensors.
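
    A short worked example converting the quoted external quantum efficiency into spectral responsivity at the stated wavelength, using the standard relation R = EQE·qλ/(hc); the 20% EQE and 1440 nm figures come from the abstract, and the result is only a back-of-the-envelope illustration.

    ```python
    # Responsivity from external quantum efficiency: R = EQE * q * lambda / (h * c)
    q = 1.602176634e-19      # elementary charge, C
    h = 6.62607015e-34       # Planck constant, J*s
    c = 2.99792458e8         # speed of light, m/s

    eqe = 0.20               # 20% EQE at 1440 nm, as quoted in the abstract
    wavelength = 1440e-9     # m

    responsivity = eqe * q * wavelength / (h * c)   # A/W
    print(f"{responsivity:.3f} A/W")                # roughly 0.23 A/W
    ```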

  6. Dr. Robert Goddard

    NASA Image and Video Library

    2017-12-08

    1930--Dr. Robert Goddard built this 30 by 60 ft. workshop for rocket construction at the Mescalero Ranch, 3 miles northeast of Roswell, New Mexico. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  7. Scientific Data Collection/Analysis: 1994-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes technologies for lightweight, temperature-tolerant, radiation-hard sensors. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  8. Dr. Robert Goddard

    NASA Image and Video Library

    2017-12-08

    The family home and birthplace of Dr. Robert Goddard in Worcester, Mass. was called Maple Hill and situated at Gates Lane, now called Tollawanda Drive. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  9. Edison Demonstration of Smallsat Networks

    NASA Technical Reports Server (NTRS)

    Westley, Deborah; Martinez, Andres; Petro, Andrew

    2015-01-01

    The goal of NASA's Edison Demonstration of Smallsat Networks (EDSN) mission is to demonstrate interactive satellite swarms capable of collecting, exchanging and transmitting multi-point scientific measurements. Satellite swarms enable a wide array of scientific, commercial and academic research not achievable with a single satellite. The EDSN satellites are scheduled to be launched into space as secondary payloads on the first flight of the Super Strypi launch vehicle no earlier than Oct. 29, 2015.

  10. Tropical Storm Toraji Approaching Japan

    NASA Image and Video Library

    2017-12-08

    Tropical Storm Toraji Approaching Japan, 09/03/2013 at 02:10 UTC. Terra/MODIS. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  11. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
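
    For contrast with the server-side paradigm described above, a naive client-side version of a multi-model computation might look like the sketch below, opening remotely served per-model datasets over OPeNDAP and averaging them; the URLs and variable name are placeholders, not real ESGF endpoints, and the data movement this implies is exactly what the server-side approach is designed to limit.

    ```python
    import xarray as xr  # pip install xarray netCDF4

    # Placeholder OPeNDAP URLs standing in for per-model CMIP5 datasets
    # published through ESGF.
    model_urls = {
        "model_a": "https://example.org/thredds/dodsC/cmip5/model_a/tas.nc",
        "model_b": "https://example.org/thredds/dodsC/cmip5/model_b/tas.nc",
    }

    def multimodel_annual_mean(urls, var="tas"):
        """Naive client-side multi-model mean of annually averaged temperature."""
        per_model = []
        for name, url in urls.items():
            ds = xr.open_dataset(url)                          # lazy, network-backed access
            per_model.append(ds[var].groupby("time.year").mean("time"))
        return xr.concat(per_model, dim="model").mean("model")

    # ensemble = multimodel_annual_mean(model_urls)   # requires reachable endpoints
    ```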

  12. Modeling the Skills and Practices of Scientists through an “All-Inclusive” Comparative Planetology Student Research Investigation

    NASA Technical Reports Server (NTRS)

    Graff, Paige; Bandfield, J.; Stefanov, W.; Vanderbloemen, L.; Willis, K.; Runco, S.

    2013-01-01

    To effectively prepare the nation's future Science, Technology, Engineering, and Mathematics (STEM) workforce, students in today's classrooms need opportunities to engage in authentic experiences that model skills and practices used by STEM professionals. Relevant, real-world authentic research experiences allow students to behave as scientists as they model the process of science. This enables students to get a true sense of STEM-related professions and also allows them to develop the requisite knowledge, skills, curiosity, and creativity necessary for success in STEM careers. Providing professional development and opportunities to help teachers infuse research in the classroom is one of the primary goals of the Expedition Earth and Beyond (EEAB) program. EEAB, facilitated by the Astromaterials Research and Exploration Science (ARES) Directorate at the NASA Johnson Space Center, is an Earth and planetary science education program designed to inspire, engage, and educate teachers and students in grades 5-12 by getting them actively involved with exploration, discovery, and the process of science. The program combines the expertise of scientists and educators to ensure the professional development provided to classroom teachers is scientifically valid and also recognizes classroom constraints. For many teachers, facilitating research in the classroom can be challenging. In addition to addressing required academic standards and dealing with time constraints, challenges include structuring a research investigation the entire class can successfully complete. To build educator confidence, foster positive classroom research experiences, and enable teachers to help students model the skills and practices of scientists, EEAB has created an "all-inclusive" comparative planetology research investigation activity. This activity addresses academic standards while recognizing students (and teachers) potentially lack experience with scientific practices involved in conducting research. Designed as an entry level research engagement investigation, the activity introduces, illustrates, and teaches the skills involved in each step of the research process. Students use astronaut photos, provided through the ARES Crew Earth Observations (CEO) payload on the International Space Station (ISS) as well as remote sensing imagery of other planetary worlds. By including all the necessary tools to complete the investigation, students can focus on gaining experience in the process of science. Additionally, students are able to extend their experience of modeling the skills and practices of scientists through the opportunity to request new data of Earth from the ISS. Professional development offered through in-person and webinar trainings, along with the resources provided, enable educators to gain first-hand experience implementing a structured research investigation in the classroom. Through data and feedback collected from teachers, this type of "all-inclusive" investigation activity aims to become a model that can be utilized for other research topics and STEM disciplines.

  13. Modeling the Skills and Practices of Scientists through an 'All-Inclusive' Comparative Planetology Student Research Investigation

    NASA Astrophysics Data System (ADS)

    Graff, P. V.; Bandfield, J. L.; Stefanov, W. L.; Vanderbloemen, L.; Willis, K. J.; Runco, S.

    2013-12-01

    To effectively prepare the nation's future Science, Technology, Engineering, and Mathematics (STEM) workforce, students in today's classrooms need opportunities to engage in authentic experiences that model skills and practices used by STEM professionals. Relevant, real-world authentic research experiences allow students to behave as scientists as they model the process of science. This enables students to get a true sense of STEM-related professions and also allows them to develop the requisite knowledge, skills, curiosity, and creativity necessary for success in STEM careers. Providing professional development and opportunities to help teachers infuse research in the classroom is one of the primary goals of the Expedition Earth and Beyond (EEAB) program. EEAB, facilitated by the Astromaterials Research and Exploration Science (ARES) Directorate at the NASA Johnson Space Center, is an Earth and planetary science education program designed to inspire, engage, and educate teachers and students in grades 5-12 by getting them actively involved with exploration, discovery, and the process of science. The program combines the expertise of scientists and educators to ensure the professional development provided to classroom teachers is scientifically valid and also recognizes classroom constraints. For many teachers, facilitating research in the classroom can be challenging. In addition to addressing required academic standards and dealing with time constraints, challenges include structuring a research investigation the entire class can successfully complete. To build educator confidence, foster positive classroom research experiences, and enable teachers to help students model the skills and practices of scientists, EEAB has created an 'all-inclusive' comparative planetology research investigation activity. This activity addresses academic standards while recognizing students (and teachers) potentially lack experience with scientific practices involved in conducting research. Designed as an entry level research engagement investigation, the activity introduces, illustrates, and teaches the skills involved in each step of the research process. Students use astronaut photos, provided through the ARES Crew Earth Observations (CEO) payload on the International Space Station (ISS) as well as remote sensing imagery of other planetary worlds. By including all the necessary tools to complete the investigation, students can focus on gaining experience in the process of science. Additionally, students are able to extend their experience of modeling the skills and practices of scientists through the opportunity to request new data of Earth from the ISS. Professional development offered through in-person and webinar trainings, along with the resources provided, enable educators to gain first-hand experience implementing a structured research investigation in the classroom. Through data and feedback collected from teachers, this type of 'all-inclusive' investigation activity aims to become a model that can be utilized for other research topics and STEM disciplines.

  14. Lost in space: design of experiments and scientific exploration in a Hogarth Universe.

    PubMed

    Lendrem, Dennis W; Lendrem, B Clare; Woods, David; Rowland-Jones, Ruth; Burke, Matthew; Chatfield, Marion; Isaacs, John D; Owen, Martin R

    2015-11-01

    A Hogarth, or 'wicked', universe is an irregular environment generating data to support erroneous beliefs. Here, we argue that development scientists often work in such a universe. We demonstrate that exploring these multidimensional spaces using small experiments guided by scientific intuition alone, gives rise to an illusion of validity and a misplaced confidence in that scientific intuition. By contrast, design of experiments (DOE) permits the efficient mapping of such complex, multidimensional spaces. We describe simulation tools that enable research scientists to explore these spaces in relative safety. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Cretaceous Footprints Found on Goddard Campus

    NASA Image and Video Library

    2017-12-08

    In December, 2012, Goddard scientists using ground penetrating radar showed that the sedimentary rock layer bearing these prints was preserved in its original location, but that investigation found no additional indications of locations of dinosaur track specimens of scientific value. Credit: NASA/Goddard/Michelle Handleman. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  16. Creating a FIESTA (Framework for Integrated Earth Science and Technology Applications) with MagIC

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.

    2017-12-01

    The Magnetics Information Consortium (https://earthref.org/MagIC) has recently developed a containerized web application to considerably reduce the friction in contributing, exploring and combining valuable and complex datasets for the paleo-, geo- and rock magnetic scientific community. The data produced in this scientific domain are inherently hierarchical and the community's evolving approaches to this scientific workflow, from sampling to taking measurements to multiple levels of interpretation, require a large and flexible data model to adequately annotate the results and ensure reproducibility. Historically, contributing such detail in a consistent format has been prohibitively time consuming and often resulted in only publishing the highly derived interpretations. The new open-source (https://github.com/earthref/MagIC) application provides a flexible upload tool integrated with the data model to easily create a validated contribution and a powerful search interface for discovering datasets and combining them to enable transformative science. MagIC is hosted at EarthRef.org along with several interdisciplinary geoscience databases. A FIESTA (Framework for Integrated Earth Science and Technology Applications) is being created by generalizing MagIC's web application for reuse in other domains. The application relies on a single configuration document that describes the routing, data model, component settings and external services integrations. The container hosts an isomorphic Meteor JavaScript application, MongoDB database and ElasticSearch search engine. Multiple containers can be configured as microservices to serve portions of the application or rely on externally hosted MongoDB, ElasticSearch, or third-party services to efficiently scale computational demands. FIESTA is particularly well suited for many Earth Science disciplines with its flexible data model, mapping, account management, upload tool to private workspaces, reference metadata, image galleries, full text searches and detailed filters. EarthRef's Seamount Catalog of bathymetry and morphology data, EarthRef's Geochemical Earth Reference Model (GERM) databases, and Oregon State University's Marine and Geology Repository (http://osu-mgr.org) will benefit from custom adaptations of FIESTA.
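
    Since the abstract notes that the whole application is driven by a single configuration document (routing, data model, component settings, external services), a hedged sketch of what such a document might look like is given below; the keys and values are invented for illustration and are not the actual FIESTA schema.

    ```python
    # Hypothetical single-configuration-document sketch in the spirit of the
    # FIESTA description; key names are illustrative, not the actual schema.
    config = {
        "portal": {"name": "MagIC", "baseURL": "https://earthref.org/MagIC"},
        "routes": {"/search": "SearchInterface", "/upload": "UploadTool"},
        "dataModel": {"version": "3.0", "tables": ["contribution", "sites", "samples", "measurements"]},
        "services": {
            "database": {"type": "MongoDB", "uri": "mongodb://localhost:27017/magic"},
            "search": {"type": "Elasticsearch", "uri": "http://localhost:9200"},
        },
    }

    # Pointing a container at a different document would let the same framework
    # serve another domain (e.g. a bathymetry catalog) without code changes.
    print(config["services"]["search"]["type"])
    ```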

  17. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow abstracts away low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.
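
    A minimal sketch of the two validation checks named above (data-source consistency and parameter matching), using a small Python dictionary as a stand-in for the ontology-encoded domain knowledge; the model name, accepted sources and required parameters are hypothetical.

    ```python
    # Stand-in for ontology-encoded domain knowledge: which data sources and
    # parameters a retrieval model accepts (names are hypothetical).
    DOMAIN_KNOWLEDGE = {
        "aerosol_optical_depth": {
            "accepted_sources": {"MODIS_L1B", "MERSI_L1B"},
            "required_parameters": {"solar_zenith", "view_zenith", "relative_azimuth"},
        }
    }

    def validate_step(model, source, parameters):
        """Return a list of workflow-validation errors for one processing step."""
        spec = DOMAIN_KNOWLEDGE[model]
        errors = []
        if source not in spec["accepted_sources"]:
            errors.append(f"data source consistency error: {source} not usable by {model}")
        missing = spec["required_parameters"] - set(parameters)
        if missing:
            errors.append(f"parameter matching error: missing {sorted(missing)}")
        return errors

    print(validate_step("aerosol_optical_depth", "AVHRR_L1B", {"solar_zenith"}))
    ```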

  18. Transferable Reactive Force Fields: Extensions of ReaxFF-lg to Nitromethane.

    PubMed

    Larentzos, James P; Rice, Betsy M

    2017-03-09

    Transferable ReaxFF-lg models of nitromethane that predict a variety of material properties over a wide range of thermodynamic states are obtained by screening a library of ∼6600 potentials that were previously optimized through the Multiple Objective Evolutionary Strategies (MOES) approach using a training set that included information for other energetic materials composed of carbon, hydrogen, nitrogen, and oxygen. Models that best match experimental nitromethane lattice constants at 4.2 K and 1 atm are evaluated for transferability to high-pressure states at room temperature and are shown to better predict various liquid- and solid-phase structural, thermodynamic, and transport properties as compared to the existing ReaxFF and ReaxFF-lg parametrizations. Although demonstrated for an energetic material, the library of ReaxFF-lg models is supplied to the scientific community to enable new research explorations of complex reactive phenomena in a variety of materials research applications.
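
    A hedged sketch of the screening step described above, ranking candidate force-field parameterizations by their error against experimental lattice constants; the reference and candidate values below are rough placeholders, not the actual library or training data.

    ```python
    # Illustrative screening of candidate parameterizations against experimental
    # lattice constants (values are placeholders loosely in the nitromethane range).
    EXPERIMENT = {"a": 5.18, "b": 6.24, "c": 8.52}   # angstrom (approximate, assumed)

    candidates = {
        "pot_0001": {"a": 5.20, "b": 6.30, "c": 8.60},
        "pot_0002": {"a": 5.05, "b": 6.10, "c": 8.30},
        "pot_0003": {"a": 5.19, "b": 6.25, "c": 8.53},
    }

    def rms_percent_error(predicted, reference):
        errs = [100.0 * (predicted[k] - reference[k]) / reference[k] for k in reference]
        return (sum(e * e for e in errs) / len(errs)) ** 0.5

    ranked = sorted(candidates, key=lambda name: rms_percent_error(candidates[name], EXPERIMENT))
    for name in ranked:
        print(name, round(rms_percent_error(candidates[name], EXPERIMENT), 2), "% RMS error")
    ```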

  19. On Crowd-verification of Biological Networks

    PubMed Central

    Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O’Neel, Bruce; Peitsch, Manuel C.; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K.; Stolovitzky, Gustavo; Talikka, Marja

    2013-01-01

    Biological networks with a structured syntax are a powerful way of representing biological information generated from high density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423
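
    To illustrate the kind of structured relationship BEL encodes, here is a toy parser for one subject-relation-object statement; the regular expression covers only a tiny invented subset of the language, and the statement itself is a generic example rather than content from the curated networks.

    ```python
    import re

    # Toy parser for a small subset of BEL-like statements of the form
    # "subject relation object"; real BEL is far richer than this sketch.
    STATEMENT = "p(HGNC:TNF) increases p(HGNC:IL6)"
    PATTERN = re.compile(r"^(?P<subject>\S+\(.+?\))\s+(?P<relation>\w+)\s+(?P<object>\S+\(.+?\))$")

    match = PATTERN.match(STATEMENT)
    if match:
        edge = {
            "subject": match.group("subject"),
            "relation": match.group("relation"),
            "object": match.group("object"),
            "evidence": [],   # literature evidence would be attached by annotators
        }
        print(edge)
    ```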

  20. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  1. Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys

    USGS Publications Warehouse

    Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya

    2011-01-01

    Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.

  2. Teaching concepts of clinical measurement variation to medical students.

    PubMed

    Hodder, R A; Longfield, J N; Cruess, D F; Horton, J A

    1982-09-01

    An exercise in clinical epidemiology was developed for medical students to demonstrate the process and limitations of scientific measurement using models that simulate common clinical experiences. All scales of measurement (nominal, ordinal and interval) were used to illustrate concepts of intra- and interobserver variation, systematic error, recording error, and procedural error. In a laboratory, students a) determined blood pressures on six videotaped subjects, b) graded sugar content of unknown solutions from 0 to 4+ using Clinitest tablets, c) measured papules that simulated PPD reactions, d) measured heart and kidney size on X-rays and, e) described a model skin lesion (melanoma). Traditionally, measurement variation is taught in biostatistics or epidemiology courses using previously collected data. Use of these models enables students to produce their own data using measurements commonly employed by the clinician. The exercise provided material for a meaningful discussion of the implications of measurement error in clinical decision-making.

  3. Metabolic Engineering for the Production of Natural Products

    PubMed Central

    Pickens, Lauren B.; Tang, Yi; Chooi, Yit-Heng

    2014-01-01

    Natural products and natural product derived compounds play an important role in modern healthcare as frontline treatments for many diseases and as inspiration for chemically synthesized therapeutics. With advances in sequencing and recombinant DNA technology, many of the biosynthetic pathways responsible for the production of these chemically complex and pharmaceutically valuable compounds have been elucidated. With an ever expanding toolkit of biosynthetic components, metabolic engineering is an increasingly powerful method to improve natural product titers and generate novel compounds. Heterologous production platforms have enabled access to pathways from difficult to culture strains; systems biology and metabolic modeling tools have resulted in increasing predictive and analytic capabilities; advances in expression systems and regulation have enabled the fine-tuning of pathways for increased efficiency, and characterization of individual pathway components has facilitated the construction of hybrid pathways for the production of new compounds. These advances in the many aspects of metabolic engineering have not only yielded fascinating scientific discoveries but also make it an increasingly viable approach for the optimization of natural product biosynthesis. PMID:22432617
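
    To illustrate the metabolic modeling tools mentioned above, here is a toy flux-balance calculation on an invented two-metabolite network, solved as a plain linear program; real genome-scale models use dedicated packages and far larger stoichiometric matrices.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy flux-balance analysis: metabolites A and B, reactions
    # v1: uptake -> A, v2: A -> B, v3: B -> product (excreted), v4: A -> byproduct.
    # Stoichiometric matrix S (rows: A, B; columns: v1..v4); steady state S @ v = 0.
    S = np.array([
        [1, -1,  0, -1],   # A
        [0,  1, -1,  0],   # B
    ])
    bounds = [(0, 10), (0, None), (0, None), (0, 2)]   # uptake capped at 10 (assumed units)

    # Maximize product flux v3 (linprog minimizes, so negate the objective).
    c = np.array([0, 0, -1, 0])
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print(res.x)     # optimal flux distribution
    print(-res.fun)  # maximum product flux (here 10: all uptake routed to product)
    ```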

  4. Working memory benefits creative insight, musical improvisation, and original ideation through maintained task-focused attention.

    PubMed

    De Dreu, Carsten K W; Nijstad, Bernard A; Baas, Matthijs; Wolsink, Inge; Roskes, Marieke

    2012-05-01

    Anecdotes from creative eminences suggest that executive control plays an important role in creativity, but scientific evidence is sparse. Invoking the Dual Pathway to Creativity Model, the authors hypothesize that working memory capacity (WMC) relates to creative performance because it enables persistent, focused, and systematic combining of elements and possibilities (persistence). Study 1 indeed showed that under cognitive load, participants performed worse on a creative insight task. Study 2 revealed positive associations between time-on-task and creativity among individuals high but not low in WMC, even after controlling for general intelligence. Study 3 revealed that across trials, semiprofessional cellists performed increasingly more creative improvisations when they had high rather than low WMC. Study 4 showed that WMC predicts original ideation because it allows persistent (rather than flexible) processing. The authors conclude that WMC benefits creativity because it enables the individual to maintain attention focused on the task and prevents undesirable mind wandering.

  5. Structure–property relationships in atomic-scale junctions: Histograms and beyond

    DOE PAGES

    Mark S. Hybertsen; Venkataraman, Latha

    2016-03-03

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure–function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions, allowing a statistical analysis to readily distinguish reproducible characteristics. Furthermore, harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics.
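    To illustrate the histogram analysis described above, the hedged sketch below builds a one-dimensional conductance histogram from synthetic trace data in Python; the distributions and the 0.01 G0 plateau value are invented stand-ins, not measured values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for STM-BJ data: each "trace" contributes conductance
    # samples (in units of G0) scattered around a molecular plateau near 0.01 G0,
    # plus a broad tunneling background.  All values are illustrative only.
    n_traces = 2000
    plateau = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n_traces)
    background = rng.uniform(1e-4, 1e-1, size=n_traces)
    samples = np.concatenate([plateau, background])

    # One-dimensional conductance histogram on a logarithmic axis; the peak
    # position estimates the most probable single-molecule conductance.
    bins = np.logspace(-4, 0, 200)
    counts, edges = np.histogram(samples, bins=bins)
    peak_g = edges[np.argmax(counts)]
    print(f"most probable conductance ~ {peak_g:.3g} G0")
    ```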

  6. Structure–property relationships in atomic-scale junctions: Histograms and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mark S. Hybertsen; Venkataraman, Latha

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure–function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions, allowing a statistical analysis to readily distinguish reproducible characteristics. Furthermore, harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics.

  7. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
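    As a worked illustration of the uni-/multivariable models discussed, the sketch below fits a two-predictor linear regression by ordinary least squares in Python; the age, BMI, and blood-pressure values are invented for illustration.

    ```python
    import numpy as np

    # Invented example: systolic blood pressure (mmHg) modeled from age (years)
    # and body-mass index (kg/m^2) by ordinary least squares.
    age = np.array([34, 45, 52, 61, 47, 58, 39, 66], dtype=float)
    bmi = np.array([22, 27, 31, 29, 25, 33, 24, 30], dtype=float)
    sbp = np.array([118, 128, 140, 145, 126, 150, 121, 148], dtype=float)

    # Design matrix with an intercept column.
    X = np.column_stack([np.ones_like(age), age, bmi])
    coef, *_ = np.linalg.lstsq(X, sbp, rcond=None)
    intercept, b_age, b_bmi = coef

    # Each coefficient is read as the expected change in the outcome per unit
    # change in that predictor, holding the other predictor fixed.
    print(f"SBP ~ {intercept:.1f} + {b_age:.2f}*age + {b_bmi:.2f}*BMI")
    ```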

  8. The Sunrise project: An R&D project for a national information infrastructure prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Juhnyoung

    1995-02-01

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; and (3) To define a new way of collaboration between computer science and industrially relevant research.

  9. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  10. Handling the Diversity in the Coming Flood of InSAR Data with the InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G. F.; Agram, P. S.; Lavalle, M.; Zebker, H. A.

    2014-12-01

    The NASA ESTO-developed InSAR Scientific Computing Environment (ISCE) provides a computing framework for geodetic image processing for InSAR sensors that is modular, flexible, and extensible, enabling scientists to reduce measurements directly from a diverse array of radar satellites and aircraft to new geophysical products. ISCE can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. This is accomplished through rigorous componentization of processing codes, abstraction and generalization of data models, and an XML-based input interface with multi-level prioritized control of the component configurations depending on the science processing context. The proposed NASA-ISRO SAR (NISAR) Mission would deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned to become a key element in processing projected NISAR data into higher-level data products, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches. NISAR would be but one mission in a constellation of radar satellites in the future delivering such data. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity in addressing larger data processing problems in a "production" context and its ability to be controlled by individual science users on the cloud for large data problems.

  11. Molecular Neuroanatomy: A Generation of Progress

    PubMed Central

    Pollock, Jonathan D.; Wu, Da-Yu; Satterlee, John

    2014-01-01

    The neuroscience research landscape has changed dramatically over the past decade. An impressive array of neuroscience tools and technologies have been generated, including brain gene expression atlases, genetically encoded proteins to monitor and manipulate neuronal activity and function, cost effective genome sequencing, new technologies enabling genome manipulation, new imaging methods and new tools for mapping neuronal circuits. However, despite these technological advances, several significant scientific challenges must be overcome in the coming decade to enable a better understanding of brain function and to develop next generation cell type-targeted therapeutics to treat brain disorders. For example, we do not have an inventory of the different types of cells that exist in the brain, nor do we know how to molecularly phenotype them. We also lack robust technologies to map connections between cells. This review will provide an overview of some of the tools and technologies neuroscientists are currently using to move the field of molecular neuroanatomy forward and also discuss emerging technologies that may enable neuroscientists to address these critical scientific challenges over the coming decade. PMID:24388609

  12. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computation of 2D and 3D seismic waveforms for a variety of research purposes, especially to help analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, which includes geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, number of time steps to compute, and number of stations. By enabling portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment that helps scientists as well as educators in their daily activities and speeds up the scientific discovery process.
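    The abstract describes XML run descriptions containing structural geometries, source parameters, velocities, densities, attenuation values, time steps, and stations. The sketch below assembles such a description in Python; the element and attribute names are hypothetical illustrations, not the actual SYNSEIS/E3D schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical sketch of the kind of XML run description mentioned in the
    # abstract; element and attribute names here are invented for illustration.
    run = ET.Element("e3d_run")
    ET.SubElement(run, "source", type="double_couple", lat="36.1", lon="-117.8",
                  depth_km="8.0", magnitude="5.4")
    model = ET.SubElement(run, "velocity_model")
    ET.SubElement(model, "layer", thickness_km="5", vp="5.5", vs="3.2",
                  density="2600", qp="300", qs="150")
    ET.SubElement(model, "layer", thickness_km="30", vp="6.3", vs="3.6",
                  density="2800", qp="600", qs="300")
    ET.SubElement(run, "grid", nt="12000", dt="0.01")
    stations = ET.SubElement(run, "stations")
    ET.SubElement(stations, "station", code="STA1", lat="36.5", lon="-117.5")

    print(ET.tostring(run, encoding="unicode"))
    ```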

  13. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Goddard and colleagues at Roswell, New Mexico. Successful test of May 19, 1937. Dr. Robert Goddard is holding the cap and pilot parachute, parts of the successful operation. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  14. Utilizing plasma physics to create biomolecular movies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hau-Riege, S

    In the spring of 2000, the LCLS Scientific Advisory Committee selected the top scientific experiments for LCLS. One of the proposed flagship experiments is atomic-resolution three-dimensional structure determination of isolated biological macromolecules and particles, with the ultimate goal of obtaining molecular (snapshot) movies. The key enabling insight was that radiation damage may be overcome by using x-ray pulses that are shorter than the time it takes for damage to manifest itself.

  15. March 2015 Solar Eclipse

    NASA Image and Video Library

    2017-12-08

    Within the penumbra, the eclipse is partial (left), but within the umbra, the Moon completely covers the Sun (right). NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  16. A scientific role for Space Station Freedom: Research at the cellular level

    NASA Technical Reports Server (NTRS)

    Johnson, Terry C.; Brady, John N.

    1993-01-01

    The scientific importance of Space Station Freedom is discussed in light of the valuable information that can be gained in cellular and developmental biology with regard to the microgravity environment on the cellular cytoskeleton, cellular responses to extracellular signal molecules, morphology, events associated with cell division, and cellular physiology. Examples of studies in basic cell biology, as well as their potential importance to concerns for future enabling strategies, are presented.

  17. Novel in vitro and mathematical models for the prediction of chemical toxicity.

    PubMed

    Williams, Dominic P; Shipley, Rebecca; Ellis, Marianne J; Webb, Steve; Ward, John; Gardner, Iain; Creton, Stuart

    2013-01-01

    The focus of much scientific and medical research is directed towards understanding the disease process and defining therapeutic intervention strategies. The scientific basis of drug safety is very complex and currently remains poorly understood, despite the fact that adverse drug reactions (ADRs) are a major health concern and a serious impediment to development of new medicines. Toxicity issues account for ∼21% of drug attrition during drug development, and safety testing strategies require considerable animal use. Mechanistic relationships between drug plasma levels and molecular/cellular events that culminate in whole organ toxicity underpin the development of novel safety assessment strategies. Current in vitro test systems are poorly predictive of toxicity of chemicals entering the systemic circulation, particularly to the liver. Such systems fall short because of (1) the physiological gap between cells currently used and human hepatocytes existing in their native state, (2) the lack of physiological integration with other cells/systems within organs, required to amplify the initial toxicological lesion into overt toxicity, (3) the inability to assess how low level cell damage induced by chemicals may develop into overt organ toxicity in a minority of patients, (4) lack of consideration of systemic effects. Reproduction of centrilobular and periportal hepatocyte phenotypes in in vitro culture is crucial for sensitive detection of cellular stress. Hepatocyte metabolism/phenotype is dependent on cell position along the liver lobule, with corresponding differences in exposure to substrate, oxygen and hormone gradients. Application of bioartificial liver (BAL) technology can encompass in vitro predictive toxicity testing with enhanced sensitivity and improved mechanistic understanding. Combining this technology with mechanistic mathematical models describing intracellular metabolism, fluid-flow, substrate, hormone and nutrient distribution provides the opportunity to design the BAL specifically to mimic the in vivo scenario. Such mathematical models enable theoretical hypothesis testing, will inform the design of in vitro experiments, and will enable both refinement and reduction of in vivo animal trials. In this way, development of novel mathematical modelling tools will help to focus and direct in vitro and in vivo research, and can be used as a framework for other areas of drug safety science.
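    A minimal sketch of the kind of mathematical model described, assuming plug flow along the sinusoid and Michaelis-Menten uptake by hepatocytes, is given below; all parameter values are invented for illustration and the geometry is reduced to one dimension.

    ```python
    import numpy as np

    # Minimal lobule-scale gradient sketch: plug flow along the sinusoid with
    # Michaelis-Menten uptake by hepatocytes.  All parameters are invented for
    # illustration, not fitted values.
    n = 100                 # grid points from periportal (0) to centrilobular (n-1)
    length = 1.0            # normalized sinusoid length
    v = 1.0                 # normalized flow velocity
    vmax, km = 2.0, 0.5     # uptake kinetics (arbitrary units)
    c_in = 1.0              # inlet substrate concentration

    dx = length / n
    c = np.empty(n)
    c[0] = c_in
    # Steady state: v * dc/dx = -vmax * c / (km + c), integrated by forward Euler.
    for i in range(1, n):
        uptake = vmax * c[i - 1] / (km + c[i - 1])
        c[i] = max(c[i - 1] - dx * uptake / v, 0.0)

    print(f"periportal concentration:    {c[0]:.2f}")
    print(f"centrilobular concentration: {c[-1]:.2f}")
    ```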

  18. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities, the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE, the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict the demand for the complete century. The initial study raised their data demands from a few GB to 400 GB for the 3-year study, with tens of TB expected for the full century.

  19. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and for image storing and sharing. In this model, all required software (workflow systems and execution engines) for running scientific applications is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters respectively. Although both phases could be executed within the same environment, this setup demonstrates the flexibility of DIaaS to run applications across e-Infrastructures. In summary, DIaaS delivers specialized software to execute data-intensive applications in a scalable, efficient, and robust manner, reducing engineering time and computational cost.

  20. Novel in vitro and mathematical models for the prediction of chemical toxicity

    PubMed Central

    Shipley, Rebecca; Ellis, Marianne J.; Webb, Steve; Ward, John; Gardner, Iain; Creton, Stuart

    2013-01-01

    The focus of much scientific and medical research is directed towards understanding the disease process and defining therapeutic intervention strategies. The scientific basis of drug safety is very complex and currently remains poorly understood, despite the fact that adverse drug reactions (ADRs) are a major health concern and a serious impediment to development of new medicines. Toxicity issues account for ∼21% of drug attrition during drug development, and safety testing strategies require considerable animal use. Mechanistic relationships between drug plasma levels and molecular/cellular events that culminate in whole organ toxicity underpin the development of novel safety assessment strategies. Current in vitro test systems are poorly predictive of toxicity of chemicals entering the systemic circulation, particularly to the liver. Such systems fall short because of (1) the physiological gap between cells currently used and human hepatocytes existing in their native state, (2) the lack of physiological integration with other cells/systems within organs, required to amplify the initial toxicological lesion into overt toxicity, (3) the inability to assess how low level cell damage induced by chemicals may develop into overt organ toxicity in a minority of patients, (4) lack of consideration of systemic effects. Reproduction of centrilobular and periportal hepatocyte phenotypes in in vitro culture is crucial for sensitive detection of cellular stress. Hepatocyte metabolism/phenotype is dependent on cell position along the liver lobule, with corresponding differences in exposure to substrate, oxygen and hormone gradients. Application of bioartificial liver (BAL) technology can encompass in vitro predictive toxicity testing with enhanced sensitivity and improved mechanistic understanding. Combining this technology with mechanistic mathematical models describing intracellular metabolism, fluid-flow, substrate, hormone and nutrient distribution provides the opportunity to design the BAL specifically to mimic the in vivo scenario. Such mathematical models enable theoretical hypothesis testing, will inform the design of in vitro experiments, and will enable both refinement and reduction of in vivo animal trials. In this way, development of novel mathematical modelling tools will help to focus and direct in vitro and in vivo research, and can be used as a framework for other areas of drug safety science. PMID:26966512

  1. Sismos a l'Ecole : a Seismic Educational Network (FRANCE) linked with Research

    NASA Astrophysics Data System (ADS)

    Berenguer, J.; Le Puth, J.; Courboulex, F.; Zodmi, B.; Boneff, M.

    2007-12-01

    Given the rapid evolution of our society, in which scientific information must be accurately understood by a broad majority, promoting responsible behaviour among educated and trained citizens has become a priority. One of the roles of school is to enable children to understand the sciences that were once the prerogative of research laboratories. The educational network Sismos à l'École is an example of a project built around knowledge of seismic risk through a scientific and technological approach; it develops a teaching method for approaching the study of natural disasters. The original and innovative feature of this educational network is that it enables students to set up a seismograph in their own school. The recorded signals, coming from regional or worldwide seismic activity, feed an online database that serves both as a genuine research archive of seismic resources and as a starting point for educational and scientific activities. The network, which numbers about thirty stations installed in France, in its overseas departments and territories, and in a few French schools abroad, builds on an experience initiated on the French Riviera roughly ten years ago. The program has since gone beyond simply relaying seismic data that research and monitoring centres could have recorded themselves. By making scientific measurements, students become involved and engage with complex notions of geophysics and the geosciences. Developing simple tools and setting up concrete experiments, combined with investigative reasoning, makes it easier to build a solid scientific culture and to educate citizens about risk.

  2. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  3. A Continual Engagement Approach through GIS-MCDA Conflict Resolution of Loggerhead Sea Turtle Bycatch in Mexico

    NASA Astrophysics Data System (ADS)

    Bojórquez-Tapia, L. A.

    2015-12-01

    Continual engagement is an approach that emphasizes uninterrupted interaction with stakeholders, with the purpose of fully integrating their knowledge into the policymaking process. It focuses on the creation of hybrid scientific-local knowledge highly relevant to community and policymakers' needs, while balancing the power asymmetries among stakeholders. Hence, it presupposes a capacity for continuous revision and adjustment of the analyses that support the policymaking process. While continual engagement implies a capacity for enabling effective communication, translation and mediation of knowledge among the diverse stakeholders, experts and policymakers, it also means keeping a close eye on how knowledge evolves and how new data and information are introduced during the policymaking process. Through a case study, the loggerhead sea turtle (Caretta caretta) fishing bycatch in Mexico, a geographic information system-multicriteria decision analysis (GIS-MCDA) approach is presented to address the challenges of implementing continual engagement in conflict resolution processes. The GIS-MCDA approach combined the analytic hierarchy process (AHP) and compromise programming (CP) to generate consensus regarding the spatial pattern of conflicts. The AHP was fundamental for synthesizing the different sources of knowledge into a geospatial model. In particular, the AHP enabled assessment of the salience, legitimacy, and credibility of the information produced for all involved. The results enabled the development of specific policies based upon an assessment of the risk to the loggerhead population from different levels of fishing bycatch, and the needs of the fishing communities in the region.
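    A minimal sketch of the AHP step is given below: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The three criteria and the judgments in the matrix are invented for illustration, not values from the case study.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons among three criteria (bycatch risk,
    # fishing-community needs, data credibility); values invented for illustration.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Weights from the principal eigenvector, normalized to sum to one.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency ratio (random index 0.58 for a 3x3 matrix).
    ci = (eigvals[k].real - len(A)) / (len(A) - 1)
    cr = ci / 0.58
    print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
    ```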

  4. Framework for Informed Policy Making Using Data from National Environmental Observatories

    NASA Astrophysics Data System (ADS)

    Wee, B.; Taylor, J. R.; Poinsatte, J.

    2012-12-01

    Large-scale environmental changes pose challenges that straddle environmental, economic, and social boundaries. As we design and implement climate adaptation strategies at the Federal, state, local, and tribal levels, accessible and usable data are essential for implementing actions that are informed by the best available information. Data-intensive science has been heralded as an enabler for scientific breakthroughs powered by advanced computing capabilities and interoperable data systems. Those same capabilities can be applied to data and information systems that facilitate the transformation of data into highly processed products. At the interface of scientifically informed public policy and data-intensive science lies the potential for producers of credible, integrated, multi-scalar environmental data, such as the National Ecological Observatory Network (NEON) and its partners, to capitalize on data and informatics interoperability initiatives that enable the integration of environmental data from across credible data sources. NSF's large-scale environmental observatories such as NEON and the Ocean Observatories Initiative (OOI) are designed to provide high-quality, long-term environmental data for research. These data are also meant to be repurposed for operational needs such as risk management, vulnerability assessments, and resource management. The proposed USDA Agricultural Research Service (ARS) Long Term Agro-ecosystem Research (LTAR) network is another example of such an environmental observatory that will produce credible data for environmental and agricultural forecasting and for informing policy. To facilitate data fusion across observatories, there is a growing call for observation systems to more closely coordinate and standardize how variables are measured. Together with observation standards, cyberinfrastructure standards enable the proliferation of an ecosystem of applications that utilize diverse, high-quality, credible data. Interoperability facilitates the integration of data from multiple credible sources, and enables the repurposing of data for use at different geographical scales. Metadata that captures the transformation of data into value-added products ("provenance") lends reproducibility and transparency to the entire process. This way, the datasets and model code used to create any product can be examined by other parties. This talk outlines a pathway for transforming environmental data into value-added products by various stakeholders to better inform sustainable agriculture, using data from environmental observatories including NEON and LTAR.
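    A hedged sketch of the kind of provenance metadata discussed, written as a small JSON record in Python, is shown below; the field names, product, and processing steps are illustrative placeholders rather than an adopted community standard.

    ```python
    import hashlib
    import json

    # Illustrative provenance record for a derived data product: it names the
    # inputs (with a checksum) and the processing steps that produced the output.
    source_bytes = b"raw observation file (placeholder content)"
    record = {
        "product": "gridded_evapotranspiration_v1",        # hypothetical product
        "inputs": [{
            "name": "site_flux_observations.csv",           # hypothetical input
            "sha256": hashlib.sha256(source_bytes).hexdigest(),
            "provider": "hypothetical observatory archive",
        }],
        "processing": [
            {"step": "quality_control", "code_version": "qc-tool 1.2 (illustrative)"},
            {"step": "spatial_interpolation", "parameters": {"method": "kriging"}},
        ],
    }
    print(json.dumps(record, indent=2))
    ```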

  5. An open and extensible framework for spatially explicit land use change modelling: the lulcc R package

    NASA Astrophysics Data System (ADS)

    Moulds, S.; Buytaert, W.; Mijic, A.

    2015-10-01

    We present the lulcc software package, an object-oriented framework for land use change modelling written in the R programming language. The contribution of the work is to resolve the following limitations associated with the current land use change modelling paradigm: (1) the source code for model implementations is frequently unavailable, severely compromising the reproducibility of scientific results and making it impossible for members of the community to improve or adapt models for their own purposes; (2) ensemble experiments to capture model structural uncertainty are difficult because of fundamental differences between implementations of alternative models; and (3) additional software is required because existing applications frequently perform only the spatial allocation of change. The package includes a stochastic ordered allocation procedure as well as an implementation of the CLUE-S algorithm. We demonstrate its functionality by simulating land use change at the Plum Island Ecosystems site, using a data set included with the package. It is envisaged that lulcc will enable future model development and comparison within an open environment.

  6. Position Paper - pFLogger: The Parallel Fortran Logging framework for HPC Applications

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Cruz, Carlos A.

    2017-01-01

    In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or logger) similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger, a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
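    The abstract does not show pFlogger's Fortran API, so the sketch below uses Python's standard logging module only as an analogy for the kind of facility described: named loggers, severity filtering, and per-process context (an MPI rank stand-in) attached to each diagnostic message.

    ```python
    import logging
    import os

    # Analogy only: the paper describes a Fortran framework, but the pattern
    # (named loggers, severity levels, per-process context) is the same one
    # offered by Python's standard logging module.
    rank = int(os.environ.get("RANK", "0"))  # stand-in for an MPI rank

    logging.basicConfig(
        level=logging.INFO,
        format=f"%(asctime)s rank={rank} %(name)s %(levelname)s: %(message)s",
    )
    log = logging.getLogger("solver.pressure")

    log.info("configuration: dt=%g s, grid=%dx%d", 0.01, 512, 512)
    log.info("iteration %d: residual=%.3e", 10, 4.2e-7)
    if rank == 0:
        log.warning("mass conservation drift above tolerance")
    ```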

  7. POSITION PAPER - pFLogger: The Parallel Fortran Logging Framework for HPC Applications

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Cruz, Carlos A.

    2017-01-01

    In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.

  8. Reducing Time to Science: Unidata and JupyterHub Technology Using the Jetstream Cloud

    NASA Astrophysics Data System (ADS)

    Chastang, J.; Signell, R. P.; Fischer, J. L.

    2017-12-01

    Cloud computing can accelerate scientific workflows, discovery, and collaborations by reducing research and data friction. We describe the deployment of Unidata and JupyterHub technologies on the NSF-funded XSEDE Jetstream cloud. With the aid of virtual machines and Docker technology, we deploy a Unidata JupyterHub server co-located with a Local Data Manager (LDM), THREDDS data server (TDS), and RAMADDA geoscience content management system. We provide Jupyter Notebooks and the pre-built Python environments needed to run them. The notebooks can be used for instruction and as templates for scientific experimentation and discovery. We also supply a large quantity of NCEP forecast model results to allow data-proximate analysis and visualization. In addition, users can transfer data using Globus command line tools, and perform their own data-proximate analysis and visualization with Notebook technology. These data can be shared with others via a dedicated TDS server for scientific distribution and collaboration. There are many benefits of this approach. Not only is the cloud computing environment fast, reliable and scalable, but scientists can analyze, visualize, and share data using only their web browser. No local specialized desktop software or a fast internet connection is required. This environment will enable scientists to spend less time managing their software and more time doing science.
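    A hedged sketch of the data-proximate access pattern described, using xarray against a THREDDS/OPeNDAP endpoint, is shown below; the URL and variable name are placeholders rather than actual Unidata addresses, and a netCDF/OPeNDAP backend is assumed to be installed.

    ```python
    import xarray as xr

    # Placeholder OPeNDAP endpoint on a THREDDS data server (not a real address).
    url = "https://thredds.example.edu/thredds/dodsC/forecast/gfs_latest.nc"

    # Opening the dataset reads only metadata; the data stay on the server.
    ds = xr.open_dataset(url)

    # Hypothetical variable name; only the requested slice is transferred.
    t2m = ds["temperature_2m"].isel(time=-1)
    print(float(t2m.mean()))
    ```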

  9. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    USGS Publications Warehouse

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  10. Toward the Next Generation of Air Quality Monitoring Indicators

    NASA Technical Reports Server (NTRS)

    Hsu, Angel; Reuben, Aaron; Shindell, Drew; deSherbinin, Alex; Levy, Marc

    2013-01-01

    This paper introduces an initiative to bridge the state of scientific knowledge on air pollution with the needs of policymakers and stakeholders to design the "next generation" of air quality indicators. As a first step, this initiative assesses current monitoring and modeling associated with a number of important pollutants with an eye toward identifying knowledge gaps and scientific needs that are a barrier to reducing air pollution impacts on human and ecosystem health across the globe. Four outdoor air pollutants were considered – particulate matter, ozone, mercury, and Persistent Organic Pollutants (POPs) – because of their clear adverse impacts on human and ecosystem health and because of the availability of baseline data for assessment for each. While other papers appearing in this issue will address each pollutant separately, this paper serves as a summary of the initiative and presents recommendations for needed investments to provide improved measurement, monitoring, and modeling data for policy-relevant indicators. The ultimate goal of this effort is to enable enhanced public policy responses to air pollution by linking improved data and measurement methods to decision-making through the development of indicators that can allow policymakers to better understand the impacts of air pollution and, along with source attribution based on modeling and measurements, facilitate improved policies to solve it. The development of indicators represents a crucial next step in this process.

  11. The Globus Galaxies Platform. Delivering Science Gateways as a Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Ravi; Chard, Kyle; Chard, Ryan

    The use of public cloud computers to host sophisticated scientific data and software can transform scientific practice by enabling broad access to capabilities previously available only to the few. The primary obstacle to more widespread use of public clouds to host scientific software ('cloud-based science gateways') has thus far been the considerable gap between the specialized needs of science applications and the capabilities provided by cloud infrastructures. We describe here a domain-independent, cloud-based science gateway platform, the Globus Galaxies platform, which overcomes this gap by providing a set of hosted services that directly address the needs of science gateway developers. The design and implementation of this platform leverages our several years of experience with Globus Genomics, a cloud-based science gateway that has served more than 200 genomics researchers across 30 institutions. Building on that foundation, we have also implemented a platform that leverages the popular Galaxy system for application hosting and workflow execution; Globus services for data transfer, user and group management, and authentication; and a cost-aware elastic provisioning model specialized for public cloud resources. We describe here the capabilities and architecture of this platform, present six scientific domains in which we have successfully applied it, report on user experiences, and analyze the economics of our deployments. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  12. Low Density Materials

    DTIC Science & Technology

    2013-03-07

    [Fragmentary slide excerpt] Recoverable content: strength and toughness properties; organic and inorganic components spanning molecular to macro length scales enabling mechanically robust materials; nanostructured carbon (0D fullerenes to 3D forms); and overarching scientific challenges. Distribution Statement A: unclassified, unlimited distribution.

  13. The future of poultry science research: things I think I think.

    PubMed

    Taylor, R L

    2009-06-01

    Much poultry research progress has occurred over the first century of the Poultry Science Association. During that time, specific problems have been solved and much basic biological knowledge has been gained. Scientific discovery has exceeded its integration into foundation concepts. Researchers need to be involved in the public's development of critical thinking skills to enable discernment of fact versus fiction. Academic, government, and private institutions need to hire the best people. Issues of insufficient research funding will be remedied by a combination of strategies rather than by a single cure. Scientific advocacy for poultry-related issues is critical to success. Two other keys to the future are funding for higher-risk projects, whose outcome is truly unknown, and specific allocations for new investigators. Diligent, ongoing efforts by poultry scientists will enable progress beyond the challenges.

  14. Neurological Effects of Blast Injury

    PubMed Central

    Hicks, Ramona R.; Fertig, Stephanie J.; Desrocher, Rebecca E.; Koroshetz, Walter J.; Pancrazio, Joseph J.

    2010-01-01

    Over the last few years, thousands of soldiers and an even greater number of civilians have suffered traumatic injuries due to blast exposure, largely attributed to improvised explosive devices in terrorist and insurgent activities. The use of body armor is allowing soldiers to survive blasts that would otherwise be fatal due to systemic damage. Emerging evidence suggests that exposure to a blast can produce neurological consequences in the brain, but much remains unknown. To elucidate the current scientific basis for understanding blast-induced traumatic brain injury (bTBI), the NIH convened a workshop in April, 2008. A multidisciplinary group of neuroscientists, engineers, and clinicians were invited to share insights on bTBI, specifically pertaining to: physics of blast explosions, acute clinical observations and treatments, preclinical and computational models, and lessons from the international community on civilian exposures. This report provides an overview of the state of scientific knowledge of bTBI, drawing from the published literature, as well as presentations, discussions, and recommendations from the workshop. One of the major recommendations from the workshop was the need to characterize the effects of blast exposure on clinical neuropathology. Clearer understanding of the human neuropathology would enable validation of preclinical and computational models, which are attempting to simulate blast wave interactions with the central nervous system. Furthermore, the civilian experience with bTBI suggests that polytrauma models incorporating both brain and lung injuries may be more relevant to the study of civilian countermeasures than considering models with a neurological focus alone. PMID:20453776

  15. Interactive terrain visualization enables virtual field work during rapid scientific response to the 2010 Haiti earthquake

    USGS Publications Warehouse

    Cowgill, Eric; Bernardin, Tony S.; Oskin, Michael E.; Bowles, Christopher; Yikilmaz, M. Burak; Kreylos, Oliver; Elliott, Austin J.; Bishop, Scott; Gold, Ryan D.; Morelan, Alexander; Bawden, Gerald W.; Hamann, Bernd; Kellogg, Louise

    2012-01-01

    The moment magnitude (Mw) 7.0 12 January 2010 Haiti earthquake is the first major earthquake for which a large-footprint LiDAR (light detection and ranging) survey was acquired within several weeks of the event. Here, we describe the use of virtual reality data visualization to analyze massive amounts (67 GB on disk) of multiresolution terrain data during the rapid scientific response to a major natural disaster. In particular, we describe a method for conducting virtual field work using both desktop computers and a 4-sided, 22 m3 CAVE immersive virtual reality environment, along with KeckCAVES (Keck Center for Active Visualization in the Earth Sciences) software tools LiDAR Viewer, to analyze LiDAR point-cloud data, and Crusta, for 2.5 dimensional surficial geologic mapping on a bare-earth digital elevation model. This system enabled virtual field work that yielded remote observations of the topographic expression of active faulting within an ∼75-km-long section of the eastern Enriquillo–Plantain Garden fault spanning the 2010 epicenter. Virtual field observations indicated that the geomorphic evidence of active faulting and ancient surface rupture varies along strike. Landform offsets of 6–50 m along the Enriquillo–Plantain Garden fault east of the 2010 epicenter and closest to Port-au-Prince attest to repeated recent surface-rupturing earthquakes there. In the west, the fault trace is well defined by displaced landforms, but it is not as clear as in the east. The 2010 epicenter is within a transition zone between these sections that extends from Grand Goâve in the west to Fayette in the east. Within this transition, between L'Acul (lat 72°40′W) and the Rouillone River (lat 72°35′W), the Enriquillo–Plantain Garden fault is undefined along an embayed low-relief range front, with little evidence of recent surface rupture. Based on the geometry of the eastern and western faults that show evidence of recent surface rupture, we propose that the 2010 event occurred within a stepover that appears to have served as a long-lived boundary between rupture segments, explaining the lack of 2010 surface rupture. This study demonstrates how virtual reality–based data visualization has the potential to transform rapid scientific response by enabling virtual field studies and real-time interactive analysis of massive terrain data sets.

  16. Bay in Flux: Marine Climate Impacts, Art and Tablet App Design

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2012-12-01

    Bay in Flux is a year-long experimental effort to design and develop interactive tablet computer apps exploring the marine impacts of climate change. The goal is to convey, visualize and enliven scientific ideas around this topic, while engaging a broad audience through the design of interactive content. Pioneering new models of scientist-artist collaboration is a central part of the effort as well. The project begins with an innovative studio class at the Rhode Island School of Design (RISD) called Bay in Flux, taught in the Fall 2012 semester. Its three-person instructor team includes two artist-designers and one science reporter, with active collaboration from affiliated marine scientists. The subject matter focus is Narragansett Bay, which has shown physical, chemical and ecological impacts of climate change, along with the ongoing efforts of researchers to explain and characterize these impacts. In exploring this rich story, we intend to pioneer new means of handling narrative material in interactive e-books, enable data collection by citizen scientists, and devise game-like simulations that let audiences explore and understand complex natural systems. The lessons we seek to learn in this project include: how to effectively encourage collaborations between scientists and designers around digital design; how to pioneer new and compelling ways to tell science-based nonfiction stories on tablets; and how art and design students with no scientific training can engage with complex scientific content effectively. The project will also challenge us to think about the tablet computer not only as a data output device -- in which the user reads, watches, or interacts with provided content -- but also as a dynamic and ideal tool for mobile data input, enabling citizen science projects and novel connections between working researchers and the public. The intended audience could include high school students or older audiences who currently eschew science journalism. HTML5 is the likely language of choice, with the iPad being the initial intended platform. Following the fall class, a spring 2013 effort will involve developing a prototype app. Partners in the Bay in Flux project are the Knight Science Journalism program at MIT, RISD and the National Science Foundation's Rhode Island Experimental Program to Stimulate Competitive Research. Ultimately, the goal is to foster new ways for artists and designers to collaborate with scientists in the environmental field while reaching broad audiences.

  17. Novel residual-based large eddy simulation turbulence models for incompressible magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Sondak, David

    The goal of this work was to develop, introduce, and test a promising computational paradigm for the development of turbulence models for incompressible magnetohydrodynamics (MHD). MHD governs the behavior of an electrically conducting fluid in the presence of an external electromagnetic (EM) field. The incompressible MHD model is used in many engineering and scientific disciplines from the development of nuclear fusion as a sustainable energy source to the study of space weather and solar physics. Many interesting MHD systems exhibit the phenomenon of turbulence which remains an elusive problem from all scientific perspectives. This work focuses on the computational perspective and proposes techniques that enable the study of systems involving MHD turbulence. Direct numerical simulation (DNS) is not a feasible approach for studying MHD turbulence. In this work, turbulence models for incompressible MHD were developed from the variational multiscale (VMS) formulation wherein the solution fields were decomposed into resolved and unresolved components. The unresolved components were modeled with a term that is proportional to the residual of the resolved scales. Two additional MHD models were developed based on the VMS formulation: a residual-based eddy viscosity (RBEV) model and a mixed model that partners the VMS formulation with the RBEV model. These models are endowed with several special numerical and physics features. Included in the numerical features is the internal numerical consistency of each of the models. Physically, the new models are able to capture desirable MHD physics such as the inverse cascade of magnetic energy and the subgrid dynamo effect. The models were tested with a Fourier-spectral numerical method and the finite element method (FEM). The primary test problem was the Taylor-Green vortex. Results comparing the performance of the new models to DNS were obtained. The performance of the new models was compared to classic and cutting-edge dynamic Smagorinsky eddy viscosity (DSEV) models. The new models typically outperform the classical models.
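    A schematic of the scale decomposition underlying the residual-based VMS approach described above is sketched in LaTeX below; the notation is standard for incompressible MHD (velocity u, magnetic field b in Alfvén units, total pressure p, viscosity nu, resistivity eta) and is not reproduced verbatim from the dissertation.

    ```latex
    % Variational multiscale decomposition for incompressible MHD (schematic).
    % Overbars denote resolved scales; primes denote unresolved scales.
    \begin{align*}
      \mathbf{u} &= \bar{\mathbf{u}} + \mathbf{u}', \qquad
      \mathbf{b} = \bar{\mathbf{b}} + \mathbf{b}', \qquad
      p = \bar{p} + p', \\[4pt]
      % Residual-based closure: unresolved scales approximated by the residuals
      % of the resolved-scale momentum and induction equations.
      \mathbf{u}' &\approx -\tau_u\, \mathbf{R}_u(\bar{\mathbf{u}},\bar{\mathbf{b}},\bar{p}), \qquad
      \mathbf{b}' \approx -\tau_b\, \mathbf{R}_b(\bar{\mathbf{u}},\bar{\mathbf{b}}), \\[4pt]
      \mathbf{R}_u &= \partial_t \bar{\mathbf{u}}
         + (\bar{\mathbf{u}}\cdot\nabla)\bar{\mathbf{u}}
         - (\bar{\mathbf{b}}\cdot\nabla)\bar{\mathbf{b}}
         + \nabla \bar{p} - \nu \nabla^2 \bar{\mathbf{u}} - \mathbf{f}, \\
      \mathbf{R}_b &= \partial_t \bar{\mathbf{b}}
         + (\bar{\mathbf{u}}\cdot\nabla)\bar{\mathbf{b}}
         - (\bar{\mathbf{b}}\cdot\nabla)\bar{\mathbf{u}}
         - \eta \nabla^2 \bar{\mathbf{b}}.
    \end{align*}
    ```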

  18. Using Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) in a range of geoscience applications

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Kerkez, B.; Chandrasekar, V.; Graves, S. J.; Stamps, D. S.; Dye, M. J.; Keiser, K.; Martin, C. L.; Gooch, S. R.

    2016-12-01

    Cloud-Hosted Real-time Data Services for the Geosciences, or CHORDS, addresses the ever-increasing importance of real-time scientific data, particularly in mission-critical scenarios where informed decisions must be made rapidly. Part of the broader EarthCube initiative, CHORDS seeks to investigate the role of real-time data in the geosciences. Many of the phenomena occurring within the geosciences, ranging from hurricanes and severe weather to earthquakes, volcanoes and floods, can benefit from better handling of real-time data. The National Science Foundation funds many small teams of researchers at universities whose currently inaccessible measurements could contribute to a better understanding of these phenomena and ultimately improve forecasts and predictions. This lack of easy accessibility hinders advanced algorithm and workflow development that could be initiated or enhanced by these data streams. Developing tools for the broad dissemination of their valuable real-time data is often a large IT overhead from a purely scientific perspective and could benefit from an easy-to-use, scalable, cloud-based solution that facilitates access. CHORDS proposes to make a very diverse suite of real-time data available to the broader geosciences community in order to allow innovative new science in these areas to thrive. We highlight the recently developed CHORDS portal tools and processing systems aimed at addressing some of the gaps in handling real-time data, particularly in the provisioning of data from the "long-tail" scientific community through a simple interface deployed in the cloud. Examples shown include hydrology, atmosphere and solid earth sensors. Broad use of the CHORDS framework will expand the role of real-time data within the geosciences, and enhance the potential of streaming data sources to enable adaptive experimentation and real-time hypothesis testing. CHORDS enables real-time data to be discovered and accessed using existing standards for straightforward integration into analysis, visualization and modeling tools.
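
    As a rough illustration of how a field team might stream a measurement to a cloud-hosted real-time portal of this kind, the Python sketch below issues a single HTTP request. The host name, path, and field names are placeholders modeled loosely on the style of such portals, not the actual CHORDS API; consult the CHORDS documentation for the real interface.

```python
import datetime
import requests

# Hypothetical example of pushing one sensor reading to a cloud-hosted
# real-time portal. PORTAL, the URL path, and the parameter names are
# placeholders, not the documented CHORDS interface.
PORTAL = "http://example-chords-portal.org"

def post_measurement(instrument_id, variable, value, api_key):
    """Send a single timestamped reading for one instrument."""
    params = {
        "instrument_id": instrument_id,
        variable: value,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "key": api_key,
    }
    resp = requests.get(f"{PORTAL}/measurements/url_create", params=params, timeout=10)
    resp.raise_for_status()
    return resp.status_code

# Example: a stream gauge pushing one water-level reading (illustrative values)
post_measurement(instrument_id=1, variable="stage_m", value=2.37, api_key="DEMO_KEY")
```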

  19. Study design in medical research: part 2 of a series on the evaluation of scientific publications.

    PubMed

    Röhrig, Bernd; du Prel, Jean-Baptist; Blettner, Maria

    2009-03-01

    The scientific value and informativeness of a medical study are determined to a major extent by the study design. Errors in study design cannot be corrected afterwards. Various aspects of study design are discussed in this article. Six essential considerations in the planning and evaluation of medical research studies are presented and discussed in the light of selected scientific articles from the international literature as well as the authors' own scientific expertise with regard to study design. The six main considerations for study design are the question to be answered, the study population, the unit of analysis, the type of study, the measuring technique, and the calculation of sample size. This article is intended to give the reader guidance in evaluating the design of studies in medical research. This should enable the reader to categorize medical studies better and to assess their scientific quality more accurately.

  20. Science Partnerships Enabling Rapid Response: Designing a Strategy for Improving Scientific Collaboration during Crisis Response

    NASA Astrophysics Data System (ADS)

    Mease, L.; Gibbs, T.; Adiseshan, T.

    2014-12-01

    The 2010 Deepwater Horizon disaster required unprecedented engagement and collaboration with scientists from multiple disciplines across government, academia, and industry. Although this spurred the rapid advancement of valuable new scientific knowledge and tools, it also exposed weaknesses in the system of information dissemination and exchange among the scientists from those three sectors. Limited government communication with the broader scientific community complicated the rapid mobilization of the scientific community to assist with spill response, evaluation of impact, and public perceptions of the crisis. The lessons and new laws produced from prior spills such as Exxon Valdez were helpful, but ultimately did not lead to the actions necessary to prepare a suitable infrastructure that would support collaboration with non-governmental scientists. As oil demand pushes drilling into increasingly extreme environments, addressing the challenge of effective, science-based disaster response is an imperative. Our study employs a user-centered design process to 1) understand the obstacles to and opportunity spaces for effective scientific collaboration during environmental crises such as large oil spills, 2) identify possible tools and strategies to enable rapid information exchange between government responders and non-governmental scientists from multiple relevant disciplines, and 3) build a network of key influencers to secure sufficient buy-in for scaled implementation of appropriate tools and strategies. Our methods include user ethnography, complex system mapping, individual and system behavioral analysis, and large-scale system design to identify and prototype a solution to this crisis collaboration challenge. In this talk, we will present our insights gleaned from existing analogs of successful scientific collaboration during crises and our initial findings from the 60 targeted interviews we conducted that highlight key collaboration challenges that government agencies, academic research institutions, and industry scientists face during oil spill crises. We will also present a synthesis of leverage points in the system that may amplify the impact of an improved collaboration strategy among scientific stakeholders.

  1. [In silico, in vitro, in omic experimental models and drug safety evaluation].

    PubMed

    Claude, Nancy; Goldfain-Blanc, Françoise; Guillouzo, André

    2009-01-01

    Over the last few decades, toxicology has benefited from scientific, technical, and bioinformatic developments relating to patient safety assessment during clinical and drug marketing studies. Based on this knowledge, new in silico, in vitro, and "omic" experimental models are emerging. Although these models cannot currently replace classic safety evaluations performed on laboratory animals, they allow compounds with unacceptable toxicity to be rejected in the early stages of drug development, thereby reducing the number of laboratory animals needed. In addition, because these models are particularly adapted to mechanistic studies, they can help to improve the relevance of the data obtained, thus enabling better prevention and screening of the adverse effects that may occur in humans. Much progress remains to be made, especially in the field of validation. Nevertheless, current efforts by industrial and academic laboratories and regulatory agencies should, in coming years, significantly improve preclinical drug safety evaluation thanks to the integration of these new methods into the drug research and development process.

  2. Improvements in the Scalability of the NASA Goddard Multiscale Modeling Framework for Hurricane Climate Studies

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar

    2007-01-01

    Improving our understanding of hurricane inter-annual variability and the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capabilities in simulating these interactions. The newly developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power provided by the NASA Columbia supercomputer show promise for pursuing such studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables a 12-fold performance improvement with 364 CPUs, thereby making it more feasible to study hurricane climate.

  3. A Percolation Model for Fracking

    NASA Astrophysics Data System (ADS)

    Norris, J. Q.; Turcotte, D. L.; Rundle, J. B.

    2014-12-01

    Developments in fracking technology have enabled the recovery of vast reserves of oil and gas; yet, there is very little publicly available scientific research on fracking. Traditional reservoir simulator models for fracking are computationally expensive, and require many hours on a supercomputer to simulate a single fracking treatment. We have developed a computationally inexpensive percolation model for fracking that can be used to understand the processes and risks associated with fracking. In our model, a fluid is injected from a single site and a network of fractures grows from that site. The fracture network grows in bursts: the failure of a relatively strong bond followed by the failure of a series of relatively weak bonds. These bursts display similarities to microseismic events observed during a fracking treatment. The bursts follow a power-law (Gutenberg-Richter) frequency-size distribution and have growth rates similar to observed earthquake moment rates. These are quantifiable features that can be compared to observed microseismicity to help understand the relationship between observed microseismicity and the underlying fracture network.
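
    A minimal invasion-percolation sketch in Python conveys the burst mechanism described above: bonds fail weakest-first from a single injection site, and a burst is the run of relatively weak bonds that follows a relatively strong one. The lattice size, random bond strengths, and stopping criterion are illustrative and do not reproduce the authors' model.

```python
import heapq
import random

# Invasion percolation from a single injection site: always break the weakest
# available bond; a "burst" is the run of bonds weaker than the relatively
# strong bond that opened it. All sizes and strengths are illustrative.
L = 64
random.seed(0)
injection = (L // 2, L // 2)
invaded = {injection}
frontier = []                       # min-heap of (bond strength, candidate site)

def neighbors(site):
    x, y = site
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < L and 0 <= y + dy < L:
            yield (x + dx, y + dy)

def push_bonds(site):
    for nb in neighbors(site):
        if nb not in invaded:
            heapq.heappush(frontier, (random.random(), nb))

push_bonds(injection)
burst_sizes, current_burst, threshold = [], 0, None
while len(invaded) < 1500 and frontier:
    s, site = heapq.heappop(frontier)
    if site in invaded:
        continue                    # this bond leads to an already fractured site
    invaded.add(site)
    push_bonds(site)
    if threshold is None or s > threshold:
        if current_burst:
            burst_sizes.append(current_burst)
        threshold, current_burst = s, 1     # a strong bond starts a new burst
    else:
        current_burst += 1                  # weaker bonds extend the burst
burst_sizes.append(current_burst)
print("bursts:", len(burst_sizes), "largest:", sorted(burst_sizes)[-5:])
```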

  4. Modeling microbial community structure and functional diversity across time and space.

    PubMed

    Larsen, Peter E; Gibbons, Sean M; Gilbert, Jack A

    2012-07-01

    Microbial communities exhibit exquisitely complex structure. Many aspects of this complexity, from the number of species to the total number of interactions, are currently very difficult to examine directly. However, extraordinary efforts are being made to make these systems accessible to scientific investigation. While recent advances in high-throughput sequencing technologies have improved accessibility to the taxonomic and functional diversity of complex communities, monitoring the dynamics of these systems over time and space - using appropriate experimental design - is still expensive. Fortunately, modeling can be used as a lens to focus low-resolution observations of community dynamics to enable mathematical abstractions of functional and taxonomic dynamics across space and time. Here, we review the approaches for modeling bacterial diversity at both the very large and the very small scales at which microbial systems interact with their environments. We show that modeling can help to connect biogeochemical processes to specific microbial metabolic pathways. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
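
    One of the simplest abstractions used in this kind of work is a generalized Lotka-Volterra system for taxon abundances. The Python sketch below is a hypothetical illustration of that idea, with arbitrary growth rates and interaction coefficients; it is not a model taken from the review itself.

```python
import numpy as np

# Generalized Lotka-Volterra community dynamics: dx_i/dt = x_i (r_i + sum_j A_ij x_j).
# Growth rates r and interaction matrix A are arbitrary illustrative values.
rng = np.random.default_rng(1)
n_taxa = 5
r = rng.uniform(0.1, 1.0, n_taxa)                                   # intrinsic growth rates
A = -0.5 * np.eye(n_taxa) + 0.05 * rng.normal(size=(n_taxa, n_taxa))  # interactions
x = np.full(n_taxa, 0.1)                                            # initial abundances

dt = 0.01
for _ in range(5000):
    x = x + dt * x * (r + A @ x)       # forward-Euler step of the gLV equations
    x = np.clip(x, 0.0, None)          # abundances cannot go negative
print("steady-state abundances:", np.round(x, 3))
```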

  5. Artificial intelligence support for scientific model-building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  6. Dr. Robert Goddard

    NASA Image and Video Library

    2017-12-08

    Dr. Robert Goddard's 22-foot rocket in its launching tower, 1940, near Roswell, New Mexico. N.T. Ljungquist on the ground, A.W. Kisk working on the rocket and C. Mansur at top of tower. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  7. Cretaceous Footprints Found on Goddard Campus

    NASA Image and Video Library

    2017-12-08

    Michael Godfrey beginning the process of quarrying down around the footprint bearing layer. Photo taken December 31, 2012. Image courtesy Stephen Godfrey. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  8. Dr. Robert Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Goddard on the campus of Clark University, Worcester, Mass., mounting a rocket chamber for the 1915-1916 experiments. Dr. Goddard earned his doctorate at Clark and also taught physics there. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  9. Big, Deep, and Smart Data in Scanning Probe Microscopy.

    PubMed

    Kalinin, Sergei V; Strelcov, Evgheni; Belianinov, Alex; Somnath, Suhas; Vasudevan, Rama K; Lingerfelt, Eric J; Archibald, Richard K; Chen, Chaomei; Proksch, Roger; Laanait, Nouamane; Jesse, Stephen

    2016-09-27

    Scanning probe microscopy (SPM) techniques have opened the door to nanoscience and nanotechnology by enabling imaging and manipulation of the structure and functionality of matter at nanometer and atomic scales. Here, we analyze the scientific discovery process in SPM by following the information flow from the tip-surface junction, to knowledge adoption by the wider scientific community. We further discuss the challenges and opportunities offered by merging SPM with advanced data mining, visual analytics, and knowledge discovery technologies.

  10. Achievements of ATS-6 beacon experiment over Indian sub-continent

    NASA Technical Reports Server (NTRS)

    Deshpande, M. R.; Rastogi, R. G.; Vats, H. O.; Sethia, G.; Chandra, H.; Davies, K.; Grubb, R. N.; Jones, J. E.

    1978-01-01

    The repositioning of the ATS-6 satellite at 34 deg E enabled the scientific community of India to use the satellite's radio beacon for ionospheric studies. Two scientific projects were undertaken. The objective of the first project was to map ionospheric electron content, range rate errors, traveling ionospheric phenomena, solar flare effect, and magnetic phenomena. The second project was aimed at studying geophysical phenomena associated with the equatorial electrojet. The principal results of these studies are described.

  11. Aeras: A next generation global atmosphere model

    DOE PAGES

    Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...

    2015-06-01

    Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.
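
    For readers unfamiliar with the test problem mentioned above, the Python sketch below integrates a linearized 1D shallow-water system with a simple finite-difference scheme. It is a toy surrogate for illustration only and bears no relation to the Aeras/Albany spectral-element implementation; all parameter values are illustrative.

```python
import numpy as np

# Linearized 1D shallow-water equations on a periodic domain:
#   dh/dt = -H du/dx,   du/dt = -g dh/dx
# advanced with a semi-implicit (symplectic) Euler step.
g, H = 9.81, 1.0                               # gravity, mean depth
N = 200
dx = 1.0 / N
dt = 0.2 * dx / np.sqrt(g * H)                 # CFL-limited time step
x = np.arange(N) * dx
h = 1e-3 * np.exp(-((x - 0.5) / 0.05) ** 2)    # small initial height bump
u = np.zeros(N)

def ddx(f):
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

for _ in range(500):
    h = h - dt * H * ddx(u)
    u = u - dt * g * ddx(h)                    # uses the updated h (stable for waves)
print("max surface displacement:", h.max())
```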

  12. Earth System Science at NASA: Teleconnections Between Sea Surface Temperature and Epidemics in Africa

    NASA Technical Reports Server (NTRS)

    Meeson, Blanche W.

    2000-01-01

    The research carried out in the Earth sciences at NASA and at NASA's Goddard Space Flight Center will be the focus of the presentations. In addition, one research project that links sea surface temperature to epidemics in Africa will be highlighted. At GSFC, research interests span the full breadth of disciplines in Earth Science. Branches and research groups focus on areas as diverse as planetary geomagnetics and atmospheric chemistry. These organizations focus on atmospheric sciences (atmospheric chemistry, climate and radiation, regional processes, atmospheric modeling), hydrological sciences (snow, ice, oceans, and seasonal-to-interannual prediction), terrestrial physics (geology, terrestrial biology, land-atmosphere interactions, geophysics), climate modeling (global warming, greenhouse gases, climate change), on sensor development especially using lidar and microwave technologies, and on information technologies that enable support of scientific and technical research.

  13. The Internet and science communication: blurring the boundaries

    PubMed Central

    Warden, R

    2010-01-01

    Scientific research is heavily dependent on communication and collaboration. Research does not exist in a bubble; scientific work must be communicated in order to add it to the body of knowledge within a scientific community, so that its members may ‘stand on the shoulders of giants’ and benefit from all that has come before. The effectiveness of scientific communication is crucial to the pace of scientific progress: in all its forms it enables ideas to be formulated, results to be compared, and replications and improvements to be made. The sharing of science is a foundational aspect of the scientific method. This paper, part of the policy research within the FP7 EUROCANCERCOMS project, discusses how the Internet has changed communication by cancer researchers and how it has the potential to change it still more in the future. It will detail two broad types of communication: formal and informal, and how these are changing with the use of new web tools and technologies. PMID:22276045

  14. Advances in Cross-Cutting Ideas for Computational Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter

    This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the type this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them with experience, training, and networking beyond their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.

  15. Advances in Cross-Cutting Ideas for Computational Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, E.; Evans, K.; Caldwell, P.

    This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the type this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them with experience, training, and networking beyond their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.

  16. Developing and validating advanced divertor solutions on DIII-D for next-step fusion devices

    NASA Astrophysics Data System (ADS)

    Guo, H. Y.; Hill, D. N.; Leonard, A. W.; Allen, S. L.; Stangeby, P. C.; Thomas, D.; Unterberg, E. A.; Abrams, T.; Boedo, J.; Briesemeister, A. R.; Buchenauer, D.; Bykov, I.; Canik, J. M.; Chrobak, C.; Covele, B.; Ding, R.; Doerner, R.; Donovan, D.; Du, H.; Elder, D.; Eldon, D.; Lasa, A.; Groth, M.; Guterl, J.; Jarvinen, A.; Hinson, E.; Kolemen, E.; Lasnier, C. J.; Lore, J.; Makowski, M. A.; McLean, A.; Meyer, B.; Moser, A. L.; Nygren, R.; Owen, L.; Petrie, T. W.; Porter, G. D.; Rognlien, T. D.; Rudakov, D.; Sang, C. F.; Samuell, C.; Si, H.; Schmitz, O.; Sontag, A.; Soukhanovskii, V.; Wampler, W.; Wang, H.; Watkins, J. G.

    2016-12-01

    A major challenge facing the design and operation of next-step high-power steady-state fusion devices is to develop a viable divertor solution with order-of-magnitude increases in power handling capability relative to present experience, while having acceptable divertor target plate erosion and being compatible with maintaining good core plasma confinement. A new initiative has been launched on DIII-D to develop the scientific basis for design, installation, and operation of an advanced divertor to evaluate boundary plasma solutions applicable to next-step fusion experiments beyond ITER. Developing the scientific basis for fusion reactor divertor solutions must necessarily follow three lines of research, which we plan to pursue in DIII-D: (1) Advance scientific understanding and predictive capability through development and comparison between state-of-the-art computational models and enhanced measurements using targeted parametric scans; (2) Develop and validate key divertor design concepts and codes through innovative variations in physical structure and magnetic geometry; (3) Assess candidate materials, determining the implications for core plasma operation and control, and develop mitigation techniques for any deleterious effects, incorporating development of plasma-material interaction models. These efforts will lead to design, installation, and evaluation of an advanced divertor for DIII-D to enable highly dissipative divertor operation at core density (n_e/n_GW), neutral fueling and impurity influx most compatible with high-performance plasma scenarios and reactor-relevant plasma-facing components (PFCs). This paper highlights the current progress and near-term strategies of boundary/PMI research on DIII-D.

  17. Developing and validating advanced divertor solutions on DIII-D for next-step fusion devices

    DOE PAGES

    Guo, H. Y.; Hill, D. N.; Leonard, A. W.; ...

    2016-09-14

    A major challenge facing the design and operation of next-step high-power steady-state fusion devices is to develop a viable divertor solution with order-of-magnitude increases in power handling capability relative to present experience, while having acceptable divertor target plate erosion and being compatible with maintaining good core plasma confinement. A new initiative has been launched on DIII-D to develop the scientific basis for design, installation, and operation of an advanced divertor to evaluate boundary plasma solutions applicable to next-step fusion experiments beyond ITER. Developing the scientific basis for fusion reactor divertor solutions must necessarily follow three lines of research, which we plan to pursue in DIII-D: (1) Advance scientific understanding and predictive capability through development and comparison between state-of-the-art computational models and enhanced measurements using targeted parametric scans; (2) Develop and validate key divertor design concepts and codes through innovative variations in physical structure and magnetic geometry; (3) Assess candidate materials, determining the implications for core plasma operation and control, and develop mitigation techniques for any deleterious effects, incorporating development of plasma-material interaction models. These efforts will lead to design, installation, and evaluation of an advanced divertor for DIII-D to enable highly dissipative divertor operation at core density (n_e/n_GW), neutral fueling and impurity influx most compatible with high-performance plasma scenarios and reactor-relevant plasma-facing components (PFCs). In conclusion, this paper highlights the current progress and near-term strategies of boundary/PMI research on DIII-D.

  18. Enabling a Scientific Cloud Marketplace: VGL (Invited)

    NASA Astrophysics Data System (ADS)

    Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.

    2013-12-01

    The Virtual Geophysics Laboratory (VGL) provides a flexible, web based environment where researchers can browse data and use a variety of scientific software packaged into toolkits that run in the Cloud. Both data and toolkits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of Discovery and Access to the disparate data sources and a Library for toolkits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grained workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org

  19. "Smart" Vehicle Management System: A Necessity for Future Endeavors

    NASA Astrophysics Data System (ADS)

    Haddock, A. T.; Olden, G. W.; Barnes, P. K.

    2018-02-01

    The "Smart" Vehicle Management System (VMS) will give an overview of how a robust VMS would enable experiments to be conducted on the spacecraft in both manned and unmanned states, increasing the scientific benefits.

  20. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516

  1. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.

  2. Audit of a Scientific Data Center for Certification as a Trustworthy Digital Repository: A Case Study

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2011-12-01

    Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.

  3. Alzforum and SWAN: the present and future of scientific web communities.

    PubMed

    Clark, Tim; Kinoshita, June

    2007-05-01

    Scientists drove the early development of the World Wide Web, primarily as a means for rapid communication, document sharing and data access. They have been far slower to adopt the web as a medium for building research communities. Yet, web-based communities hold great potential for accelerating the pace of scientific research. In this article, we will describe the 10-year experience of the Alzheimer Research Forum ('Alzforum'), a unique example of a thriving scientific web community, and explain the features that contributed to its success. We will then outline the SWAN (Semantic Web Applications in Neuromedicine) project, in which Alzforum curators are collaborating with informatics researchers to develop novel approaches that will enable communities to share richly contextualized information about scientific data, claims and hypotheses.

  4. Idle waves in high-performance computing

    NASA Astrophysics Data System (ADS)

    Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre

    2015-01-01

    The vast majority of parallel scientific applications distributes computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through the processes of scientific applications that rely on local information exchange between neighboring processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work is also a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
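
    The propagation mechanism can be reproduced with a few lines of Python: ranks compute for a fixed busy time, then block on a nearest-neighbor exchange; a single delay injected at rank 0 travels outward roughly one rank per iteration, consistent with a phase velocity inversely proportional to the busy time. All times and sizes below are arbitrary illustrative units, not measurements from the paper.

```python
import numpy as np

# P ranks each do a compute phase of length t_busy, then block on a
# nearest-neighbor exchange; rank 0 is delayed once, and the induced idle
# period propagates outward one rank per iteration.
P, iters, t_busy = 16, 30, 1.0
finish = np.zeros(P)                 # time each rank finished the previous iteration
first_idle = np.full(P, -1)          # iteration at which each rank first has to wait

for it in range(iters):
    busy = np.full(P, t_busy)
    if it == 0:
        busy[0] += 5.0               # one-off perturbation injected at rank 0
    start = np.empty(P)
    for p in range(P):
        nbrs = (p, max(p - 1, 0), min(p + 1, P - 1))
        start[p] = max(finish[q] for q in nbrs)   # blocking exchange: local sync
        if start[p] > finish[p] and first_idle[p] < 0:
            first_idle[p] = it                    # rank p waits for the first time
    finish = start + busy

# Rank 0 never idles (-1); rank p first idles at about iteration p: the idle wave.
print("iteration at which each rank first idles:", first_idle)
```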

  5. Big Data Ecosystems Enable Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, Terence J.; Kleese van Dam, Kerstin

    Over the past 5 years, advances in experimental, sensor and computational technologies have driven the exponential growth in the volumes, acquisition rates, variety and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology - data-intensive science. Data-intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized and visualized results, allowing them to carry out 'experiments' in data.

  6. Software Reuse Methods to Improve Technological Infrastructure for e-Science

    NASA Technical Reports Server (NTRS)

    Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.

    2011-01-01

    Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.

  7. Modelling the interaction between flooding events and economic growth

    NASA Astrophysics Data System (ADS)

    Grames, Johanna; Fürnkranz-Prskawetz, Alexia; Grass, Dieter; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Recently, socio-hydrology models have been proposed to analyze the interplay of community risk-coping culture, flooding damage and economic growth. These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. Complementary to these descriptive models, we develop a dynamic optimization model, where the inter-temporal decision of an economic agent interacts with the hydrological system. This interdisciplinary approach matches the goals of Panta Rhei, i.e. to understand feedbacks between hydrology and society. It enables new perspectives but also shows limitations of each discipline. Young scientists need mentors from various scientific backgrounds to learn their different research approaches and how to best combine them such that interdisciplinary scientific work is also accepted by different science communities. In our socio-hydrology model, we apply a macro-economic decision framework to a long-term flood scenario. We assume a standard macro-economic growth model where agents derive utility from consumption and output depends on physical capital that can be accumulated through investment. To this framework we add the occurrence of flooding events, which destroy part of the capital. We identify two specific periodic long-term solutions and denote them rich and poor economies. Whereas rich economies can afford to invest in flood defense and therefore avoid flood damage and develop high living standards, poor economies prefer consumption instead of investing in flood defense capital and end up facing flood damages every time the water level rises. Nevertheless, they manage to sustain at least a low level of physical capital. We identify optimal investment strategies and compare simulations with more frequent and more intense high water level events.
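
    The structure of such a model can be sketched in a few lines of Python: output is produced from capital, a fraction of output is invested, and periodic high-water events destroy capital unless enough flood-defense capital has been accumulated. The simulation below is only a forward-run caricature of that structure; all parameter values are illustrative placeholders, not the calibrated values of the optimization model described above.

```python
import numpy as np

# Capital accumulation with periodic floods: a share of investment can be
# diverted into flood-defense capital, which prevents damage once it exceeds
# a threshold. Parameters and thresholds are illustrative only.
def simulate(defense_share, years=200, flood_every=20):
    k, d = 1.0, 0.0                          # productive capital, defense capital
    alpha, save, depr, damage = 0.3, 0.25, 0.05, 0.5
    for t in range(years):
        y = k ** alpha                       # Cobb-Douglas style output
        invest = save * y
        d += defense_share * invest - depr * d
        k += (1.0 - defense_share) * invest - depr * k
        if t % flood_every == 0 and d < 0.5:  # insufficient defense: flood damage
            k *= (1.0 - damage)
    return k, d

print("poor economy (no defense):", np.round(simulate(0.0), 2))
print("rich economy (10% of investment in defense):", np.round(simulate(0.1), 2))
```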

  8. Tracking a Superstorm

    NASA Image and Video Library

    2017-12-08

    Oct. 29, 2012 – A day before landfall, Sandy intensified into a Category 2 superstorm nearly 1,000 miles wide. Credit: NASA's Goddard Space Flight Center and NASA Center for Climate Simulation Video and images courtesy of NASA/GSFC/William Putman -- A NASA computer model simulates the astonishing track and forceful winds of Hurricane Sandy. Hurricane Sandy pummeled the East Coast late in 2012’s Atlantic hurricane season, causing 159 deaths and $70 billion in damages. Days before landfall, forecasts of its trajectory were still being made. Some computer models showed that a trough in the jet stream would kick the monster storm away from land and out to sea. Among the earliest to predict its true course was NASA’s GEOS-5 global atmosphere model. The model works by dividing Earth’s atmosphere into a virtual grid of stacked boxes. A supercomputer then solves mathematical equations inside each box to create a weather forecast predicting Sandy’s structure, path and other traits. The NASA model not only produced an accurate track of Sandy, but also captured fine-scale details of the storm’s changing intensity and winds. Watch the video to see it for yourself. For more information, please visit: gmao.gsfc.nasa.gov/research/atmosphericassim/tracking_hur... NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  9. Does attainment of Piaget's formal operational level of cognitive development predict student understanding of scientific models?

    NASA Astrophysics Data System (ADS)

    Lahti, Richard Dennis, II

    Knowledge of scientific models and their uses is a concept that has become a key benchmark in many of the science standards of the past 30 years, including the proposed Next Generation Science Standards. Knowledge of models is linked to other important nature of science concepts such as theory change, which are also rising in prominence in newer standards. Effective methods of instruction will need to be developed to enable students to achieve these standards. The literature reveals an inconsistent history of success with modeling education. These same studies point to a possible cognitive development component which might explain why some students succeeded and others failed. An environmental science course, rich in modeling experiences, was used to test both the extent to which knowledge of models and modeling could be improved over the course of one semester, and more importantly, to identify if cognitive ability was related to this improvement. In addition, nature of science knowledge, particularly related to theories and theory change, was also examined. Pretest and posttest results on modeling (SUMS) and nature of science (SUSSI), as well as data from the modeling activities themselves, were collected. Cognitive ability was measured (CTSR) as a covariate. Students' gain in six of seven categories of modeling knowledge was at least medium (Cohen's d > .5) and moderately correlated with CTSR for two of seven categories. Nature of science gains were smaller, although more strongly correlated with CTSR. Student success at creating a model was related to CTSR, significantly so in three of five sub-categories. These results suggest that explicit, reflective experience with models can increase student knowledge of models and modeling (although students with higher cognitive ability may have more success), but successfully creating models may depend more heavily on cognitive ability. This finding in particular has implications for the grade placement of modeling standards and for the curriculum chosen to help these students, particularly those with low cognitive ability, meet the standards.
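
    For readers unfamiliar with the statistics reported above, the Python sketch below computes Cohen's d for a pre/post gain and the Pearson correlation between gain and a cognitive-ability score. The data arrays are fabricated placeholders, not the study's data.

```python
import numpy as np

# Effect size (Cohen's d) for pre/post scores and correlation of gain with a
# covariate; all data below are simulated placeholders for illustration.
def cohens_d(pre, post):
    diff = post.mean() - pre.mean()
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2.0)
    return diff / pooled_sd

rng = np.random.default_rng(0)
ctsr = rng.uniform(0, 15, 60)                        # hypothetical cognitive-ability scores
pre = rng.normal(2.5, 0.6, 60)                       # hypothetical pretest modeling scores
post = pre + 0.4 + 0.03 * ctsr + rng.normal(0, 0.3, 60)

gain = post - pre
print("Cohen's d:", round(cohens_d(pre, post), 2))
print("r(gain, CTSR):", round(np.corrcoef(gain, ctsr)[0, 1], 2))
```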

  10. High-Throughput Study of Diffusion and Phase Transformation Kinetics of Magnesium-Based Systems for Automotive Cast Magnesium Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Alan A; Zhao, Ji-Cheng; Riggi, Adrienne

    The objective of the proposed study is to establish a scientific foundation for kinetic modeling of diffusion, phase precipitation, and casting/solidification, in order to accelerate the design and optimization of cast magnesium (Mg) alloys for weight reduction of the U.S. automotive fleet. The team has performed the following tasks: 1) study diffusion kinetics of various Mg-containing binary systems using high-throughput diffusion multiples to establish reliable diffusivity and mobility databases for the Mg-aluminum (Al)-zinc (Zn)-tin (Sn)-calcium (Ca)-strontium (Sr)-manganese (Mn) systems; 2) study the precipitation kinetics (nucleation, growth and coarsening) using both innovative dual-anneal diffusion multiples and cast model alloys to provide large amounts of kinetic data (including interfacial energy) and microstructure atlases to enable implementation of the Kampmann-Wagner numerical model to simulate phase transformation kinetics of non-spherical/non-cuboidal precipitates in Mg alloys; 3) implement a micromodel to take into account back diffusion in the solid phase in order to predict microstructure and microsegregation in multicomponent Mg alloys during dendritic solidification especially under high pressure die-casting (HPDC) conditions; and 4) widely disseminate the data, knowledge and information using the Materials Genome Initiative infrastructure (http://www.mgidata.org) as well as publications and digital data sharing to enable researchers to identify new pathways/routes to better cast Mg alloys.
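
    The Kampmann-Wagner (KWN) bookkeeping referenced above can be caricatured in a few lines of Python: each time step nucleates a new precipitate size class just above the critical radius and grows all existing classes by solute diffusion. Every material parameter below is a placeholder, and solute depletion and coarsening are omitted; this is only an illustration of the scheme's structure, not the calibrated model of the study.

```python
import numpy as np

# Highly simplified KWN-style nucleation-and-growth loop. Parameters are
# illustrative placeholders; solute depletion and coarsening are omitted.
kB, T = 1.380649e-23, 450.0            # Boltzmann constant (J/K), temperature (K)
gamma, Vm, D = 0.2, 1.0e-29, 1.0e-18   # interfacial energy, atomic volume, diffusivity
c, c_eq, N0, dt = 0.03, 0.01, 1.0e30, 5.0

radii, counts = [], []
for _ in range(2000):
    lnS = np.log(c / c_eq)                                    # supersaturation
    r_star = 2.0 * gamma * Vm / (kB * T * lnS)                # critical radius
    dG_star = (16.0 * np.pi / 3.0) * gamma**3 * Vm**2 / (kB * T * lnS)**2
    J = N0 * np.exp(-dG_star / (kB * T))                      # nucleation rate
    radii.append(1.05 * r_star)                               # new size class
    counts.append(J * dt)
    radii = [r + dt * D * (c - c_eq) / r for r in radii]      # diffusional growth

print(f"classes: {len(radii)}, mean radius: {np.mean(radii):.2e} m, "
      f"number density: {sum(counts):.2e} m^-3")
```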

  11. Enabling joined-up decision making with geotemporal information

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Ahmed, S. E.; Purves, D. W.; Emmott, S.; Joppa, L. N.; Caldararu, S.; Visconti, P.; Newbold, T.; Formica, A. F.

    2015-12-01

    While the use of geospatial data to assist in decision making is becoming increasingly common, the use of geotemporal information, i.e. information that can be indexed by both geographical space and time, is much rarer. I will describe our scientific research and software development efforts intended to advance the availability and use of geotemporal information in general. I will show two recent examples of "stacking" geotemporal information to support land use decision making in the Brazilian Amazon and Kenya, involving data-constrained predictive models and empirically derived datasets of road development, deforestation, carbon, agricultural yields, water purification and poverty alleviation services, and will show how we use trade-off analyses and constraint reasoning algorithms to explore the costs and benefits of different decisions. For the Brazilian Amazon we explore tradeoffs involved in different deforestation scenarios, while for Kenya we explore the impacts of conserving forest to support international carbon conservation initiatives (REDD+). I will also illustrate the cloud-based software tools we have developed to enable anyone to access geotemporal information, gridded (e.g. climate) or non-gridded (e.g. protected areas), for the past, present or future and incorporate such information into their analyses (e.g. www.fetchclimate.org), including how we fit new predictive models to such data using Bayesian techniques; on this latter point I will show how we combine satellite and ground-measured data with predictive models to forecast how crops might respond to climate change.
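
    As a toy illustration of the trade-off analyses mentioned above, the Python sketch below screens a handful of hypothetical land-use scenarios and keeps only those not dominated on both objectives. Scenario names and scores are invented for illustration and do not come from the Brazilian Amazon or Kenya studies.

```python
# Pareto screening of candidate scenarios scored on two objectives
# (higher is better for both). All names and scores are made up.
scenarios = {
    "protect_forest": {"carbon": 9.0, "agric_yield": 3.0},
    "expand_cropland": {"carbon": 2.0, "agric_yield": 9.0},
    "mixed_use": {"carbon": 6.0, "agric_yield": 6.0},
    "roads_everywhere": {"carbon": 1.0, "agric_yield": 5.0},
}

def dominated(a, b):
    """True if scenario a is no better than b on every objective and worse on one."""
    return all(a[k] <= b[k] for k in a) and any(a[k] < b[k] for k in a)

pareto = [name for name, s in scenarios.items()
          if not any(dominated(s, t) for other, t in scenarios.items() if other != name)]
print("non-dominated scenarios:", pareto)
```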

  12. Making US Soil Taxonomy more scientifically applicable to environmental and food security issues.

    NASA Astrophysics Data System (ADS)

    Monger, Curtis; Lindbo, David L.; Wysocki, Doug; Schoeneberger, Phil; Libohova, Zamir

    2017-04-01

    US Department of Agriculture began mapping soils in the 1890s on a county-by-county basis until most of the conterminous United States was mapped by the late 1930s. This first-generation mapping was followed by a second-generation that re-mapped the US beginning in the 1940s. Soil classification during these periods evolved into the current system of Soil Taxonomy which is based on (1) soil features as natural phenomena and on (2) soil properties important for agriculture and other land uses. While this system has enabled communication among soil surveyors, the scientific applicability of Soil Taxonomy to address environmental and food security issues has been under-utilized. In particular, little effort has been exerted to understand how soil taxa interact and function together as larger units—as soil systems. Thus, much soil-geomorphic understanding that could be applied to process-based modeling remains unexploited. The challenge for soil taxonomists in the United States and elsewhere is to expand their expertise and work with modelers to explore how soil taxa are linked to each other, how they influence water, nutrient, and pollutant flow through the landscape, how they interact with ecology, and how they change with human land use.

  13. Bayesian modeling to assess populated areas impacted by radiation from Fukushima

    NASA Astrophysics Data System (ADS)

    Hultquist, C.; Cervone, G.

    2017-12-01

    Citizen-led movements producing spatio-temporal big data are increasingly important sources of information about populations that are impacted by natural disasters. Citizen science can be used to fill gaps in disaster monitoring data, in addition to inferring human exposure and vulnerability to extreme environmental impacts. In response to the 2011 release of radiation from Fukushima, Japan, the Safecast project began collecting open radiation data, which has grown into a global dataset of over 70 million measurements to date. This dataset is spatially distributed primarily where humans are located and demonstrates abnormal patterns of population movements as a result of the disaster. Previous work has demonstrated that Safecast measurements are highly correlated with government radiation observations. However, there is still a scientific need to understand the geostatistical variability of Safecast data and to assess how reliable the data are over space and time. The Bayesian hierarchical approach can be used to model the spatial distribution of datasets and flexibly integrate new flows of data without losing previous information. This enables an understanding of uncertainty in the spatio-temporal data, informing decision makers about areas of high radiation where populations are located. Citizen science data can be scientifically evaluated and used as a critical source of information about populations that are impacted by a disaster.
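
    One building block of such an approach is the sequential (conjugate) update of a grid cell's radiation estimate as new batches of measurements arrive, which preserves earlier information in the prior. The Python sketch below shows that update with illustrative numbers; it is not the full spatial hierarchical model, and the values are not Safecast data.

```python
import numpy as np

# Sequential conjugate Normal-Normal update for one grid cell: each incoming
# batch of measurements refines the posterior without discarding earlier data.
# Prior, observation noise, and measurements are illustrative placeholders.
def update_cell(mu_prior, var_prior, measurements, var_obs):
    n = len(measurements)
    var_post = 1.0 / (1.0 / var_prior + n / var_obs)
    mu_post = var_post * (mu_prior / var_prior + np.sum(measurements) / var_obs)
    return mu_post, var_post

mu, var = 0.1, 1.0                                          # vague prior (uSv/h)
for batch in ([0.32, 0.35, 0.30], [0.28, 0.33], [0.31]):    # arriving data streams
    mu, var = update_cell(mu, var, np.asarray(batch), var_obs=0.05**2)
    print(f"posterior mean {mu:.3f} uSv/h, sd {np.sqrt(var):.3f}")
```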

  14. Using our Heads and HARTSS*: Developing Perspective-Taking Skills for Socioscientific Reasoning (*Humanities, ARTs, and Social Sciences)

    NASA Astrophysics Data System (ADS)

    Kahn, Sami; Zeidler, Dana L.

    2016-04-01

    Functional scientific literacy demands an informed citizenry capable of negotiating controversial socioscientific issues (SSI). Perspective taking is critical to SSI implementation as it enables understanding of the diverse cognitive and emotional perspectives of others. Science teacher educators must therefore facilitate teachers' promotion of classroom environments that value diverse perspectives. The purpose of this theoretical paper is to propose the HARTSS model through which successful practices that promote perspective taking in the humanities, arts, and social sciences are identified and translated into socioscientific contexts, thereby developing an array of promising interventions designed for science teacher educators to foster perspective taking in current and future science teachers and their students.

  15. New working paradigms in research laboratories.

    PubMed

    Keighley, Wilma; Sewing, Andreas

    2009-07-01

    Work in research laboratories, especially within centralised functions in larger organisations, is changing fast. With easier access to external providers and Contract Research Organisations, and a focus on budgets and benchmarking, scientific expertise has to be complemented with operational excellence. New concepts, globally shared projects and restricted resources highlight the constraints of traditional operating models working from Monday to Friday and nine to five. Whilst many of our scientists welcome this new challenge, organisations have to enable and foster a more business-like mindset. Organisational structures, remuneration, as well as systems in finance need to be adapted to build operations that are best-in-class rather than merely minimising negative impacts of current organisational structures.

  16. A Curriculum for the New Dental Practitioner: Preparing Dentists for a Prospective Oral Health Care Environment

    PubMed Central

    Polverini, Peter J.

    2012-01-01

    The emerging concept of prospective health care would shift the focus of health care from disease management to disease prevention and health management. Dentistry has a unique opportunity to embrace this model of prospective and collaborative care and focus on the management of oral health. Academic dentistry must better prepare future dentists to succeed in this new health care environment by providing them with the scientific and technical knowledge required to understand and assess risk and practice disease prevention. Dental schools must consider creating career pathways for enabling future graduates to assume important leadership roles that will advance a prospective oral health care system. PMID:22390456

  17. The Living with a Star Program: NASA's Role in Assuring Performance in Space and Atmospheric Environments

    NASA Technical Reports Server (NTRS)

    Barth, Janet L.; LaBel, Kenneth; Brewer, Dana; Withbroe, George; Kauffman, Billy

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding to address the aspects of the Connected Sun-Earth system that affect life and society. A goal of the program is to bridge the gap between science, engineering, and user application communities. This will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. A pre-formulation study determined the optimum combination of science missions, modeling, and technology infusion elements to accomplish this goal. The results of the study are described.

  18. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains, such as climate science, data are often n-dimensional and require tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, which provides access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims at addressing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support to define processing chains and workflows with tens to hundreds of data analytics operators for building real scientific use cases. With regard to interoperability aspects, the talk will present the contributions provided both to the RDA Working Group on Array Databases and to the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large-scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as a server-side, parallel analytics engine; and (4) applied to real CMIP5 data sets available through ESGF.
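
    As an illustrative sketch (not the Ophidia API), the snippet below shows the kind of "datacube" reduction such a framework exposes as declarative, server-side operators; the array shape and variable are hypothetical, and a real deployment would run the reduction in parallel next to the archive rather than on a client laptop.

        import numpy as np

        rng = np.random.default_rng(2)
        # A small (time, lat, lon) cube of monthly near-surface temperature (hypothetical values).
        cube = 15 + 10 * rng.standard_normal((120, 18, 36))

        monthly_clim = cube.reshape(10, 12, 18, 36).mean(axis=0)   # 12-month climatology
        anomaly = cube - np.tile(monthly_clim, (10, 1, 1))         # subtract the climatology
        global_mean_series = anomaly.mean(axis=(1, 2))             # reduce over the spatial dimensions

        print("anomaly time series length:", global_mean_series.shape[0])
        print("first-year mean anomaly:", round(float(global_mean_series[:12].mean()), 3))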

  19. Solar Sail Propulsion: Enabling New Capabilities for Heliophysics

    NASA Technical Reports Server (NTRS)

    Johnson, L.; Young, R.; Alhorn, D.; Heaton, A.; Vansant, T.; Campbell, B.; Pappa, R.; Keats, W.; Liewer, P. C.; Alexander, D.; hide

    2010-01-01

    Solar sails can play a critical role in enabling solar and heliophysics missions. Solar sail technology within NASA is currently at 80% of TRL-6, suitable for an in-flight technology demonstration. It is conceivable that an initial demonstration could carry scientific payloads that, depending on the type of mission, are commensurate with the goals of the three study panels of the 2010 Heliophysics Survey. Follow-on solar sail missions, leveraging advances in solar sail technology to support Heliophysics Survey goals, would then be feasible. This white paper reports on a sampling of missions enabled by solar sails, the current state of the technology, and what funding is required to advance the current state of the technology such that solar sails can enable these missions.

  20. The (Surplus) Value of Scientific Communication.

    ERIC Educational Resources Information Center

    Frohlich, Gerhard

    1996-01-01

    Discusses research on scientific communication. Topics include theory-less and formal technical/natural scientific models of scientific communication; social-scientific, power-sensitive models; the sociology of scientific communication; sciences as fields of competition; fraud and deception; potential surplus value across subject information…

  1. James Webb Space Telescope Core 2 Test - Cryogenic Thermal Balance Test of the Observatory's Core Area Thermal Control Hardware

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul; Parrish, Keith; Thomson, Shaun; Marsh, James; Comber, Brian

    2016-01-01

    The James Webb Space Telescope (JWST), successor to the Hubble Space Telescope, will be the largest astronomical telescope ever sent into space. To observe the very first light of the early universe, JWST requires a large deployed 6.5-meter primary mirror cryogenically cooled to less than 50 Kelvin. Three scientific instruments are further cooled via a large radiator system to less than 40 Kelvin. A fourth scientific instrument is cooled to less than 7 Kelvin using a combination pulse-tube/Joule-Thomson mechanical cooler. Passive cryogenic cooling enables the large scale of the telescope, which must be highly folded for launch on an Ariane 5 launch vehicle and deployed once on orbit during its journey to the second Earth-Sun Lagrange point. Passive cooling of the observatory is enabled by the deployment of a large, tennis-court-sized, five-layer sunshield combined with a network of high-efficiency radiators. A high-purity aluminum heat strap system connects the three instruments' detector systems to the radiator systems to dissipate less than a single watt of parasitic and instrument-dissipated heat. JWST's large-scale features, while enabling passive cooling, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. This paper describes the JWST Core 2 Test, a cryogenic thermal balance test of a full-size, high-fidelity engineering model of the Observatory's 'Core' area thermal control hardware. The 'Core' area is the key mechanical and cryogenic interface area between all Observatory elements. The 'Core' area thermal control hardware allows for a temperature transition from 300 K to approximately 50 K by attenuating heat from the room-temperature IEC (instrument electronics) and the Spacecraft Bus. Since the flight hardware is not available for test, the Core 2 test uses high-fidelity, flight-like reproductions.

  2. The GEOSS User Requirement Registry (URR): A Cross-Cutting Service-Oriented Infrastructure Linking Science, Society and GEOSS

    NASA Astrophysics Data System (ADS)

    Plag, H.-P.; Foley, G.; Jules-Plag, S.; Ondich, G.; Kaufman, J.

    2012-04-01

    The Group on Earth Observations (GEO) is implementing the Global Earth Observation System of Systems (GEOSS) as a user-driven service infrastructure responding to the needs of users in nine interdependent Societal Benefit Areas (SBAs) of Earth observations (EOs). GEOSS applies an interdisciplinary scientific approach integrating observations, research, and knowledge in these SBAs in order to enable scientific interpretation of the collected observations and the extraction of actionable information. Using EOs to actually produce these societal benefits means getting the data and information to users, i.e., decision-makers. Thus, GEO needs to know what the users need and how they would use the information. The GEOSS User Requirements Registry (URR) is developed as a service-oriented infrastructure enabling a wide range of users, including science and technology (S&T) users, to express their needs in terms of EOs and to understand the benefits of GEOSS for their fields. S&T communities need to be involved in both the development and the use of GEOSS, and the development of the URR accounts for the special needs of these communities. The GEOSS Common Infrastructure (GCI) at the core of GEOSS includes system-oriented registries enabling users to discover, access, and use EOs and derived products and services available through GEOSS. In addition, the user-oriented URR is a place for the collection, sharing, and analysis of user needs and EO requirements, and it provides means for an efficient dialog between users and providers. The URR is a community-based infrastructure for the publishing, viewing, and analyzing of user-need related information. The data model of the URR has a core of seven relations for User Types, Applications, Requirements, Research Needs, Infrastructure Needs, Technology Needs, and Capacity Building Needs. The URR also includes a Lexicon, a number of controlled vocabularies, and
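
    As a hedged sketch only (not the URR implementation), the snippet below encodes a few of the seven core relations named above as minimal Python records, with hypothetical fields and links so that a user need can be traced to an Earth-observation requirement.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class UserType:
            name: str

        @dataclass
        class Application:
            title: str
            sba: str                                # Societal Benefit Area
            user_types: List[str] = field(default_factory=list)

        @dataclass
        class Requirement:
            parameter: str                          # Earth observation needed
            spatial_resolution_km: float
            applications: List[str] = field(default_factory=list)

        # Research, Infrastructure, Technology, and Capacity Building Needs would
        # follow the same pattern, each linking back to Applications or Requirements.

        farmer = UserType("smallholder farmer")
        drought = Application("seasonal drought monitoring", sba="Agriculture",
                              user_types=[farmer.name])
        soil_moisture = Requirement("surface soil moisture", 10.0,
                                    applications=[drought.title])
        print(soil_moisture)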

  3. MCSDSS: A Multi-Criteria Decision Support System for Merging Geoscience Information with Natural User Interfaces, Preference Ranking, and Interactive Data Utilities

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Gentle, J.

    2015-12-01

    The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single page application (SPA) running atop a web- and cloud-based architecture built on open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture-enabled interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads ranging from simple data display to complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure, while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices to develop, share, and document the codebase and application. Code is documented and shared via an online repository with the option for programmers to see, contribute to, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers, and the metadata about the application has been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and the inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources. In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics with societal relevance.
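
    As an illustrative sketch (not the MCSDSS conversion utility itself), the snippet below converts tabular site records into a GeoJSON FeatureCollection of the kind a D3-based front end can render; the records and properties are hypothetical.

        import json

        records = [
            {"site": "well-01", "lon": -97.74, "lat": 30.27, "score": 0.62},
            {"site": "well-02", "lon": -97.70, "lat": 30.31, "score": 0.41},
        ]

        features = [
            {
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
                "properties": {"site": r["site"], "score": r["score"]},
            }
            for r in records
        ]
        feature_collection = {"type": "FeatureCollection", "features": features}

        print(json.dumps(feature_collection, indent=2))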

  4. Advancing an Information Model for Environmental Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Hooper, R. P.; Lehnert, K. A.; Schreuders, K.; Tarboton, D. G.; Valentine, D. W.; Zaslavsky, I.

    2011-12-01

    Observational data are fundamental to hydrology and water resources, and the way they are organized, described, and shared either enables or inhibits the analyses that can be performed using the data. The CUAHSI Hydrologic Information System (HIS) project is developing cyberinfrastructure to support hydrologic science by enabling better access to hydrologic data. HIS is composed of three major components. HydroServer is a software stack for publishing time series of hydrologic observations on the Internet as well as geospatial data using standards-based web feature, map, and coverage services. HydroCatalog is a centralized facility that catalogs the data contents of individual HydroServers and enables search across them. HydroDesktop is a client application that interacts with both HydroServer and HydroCatalog to discover, download, visualize, and analyze hydrologic observations published on one or more HydroServers. All three components of HIS are founded upon an information model for hydrologic observations at stationary points that specifies the entities, relationships, constraints, rules, and semantics of the observational data and that supports its data services. Within this information model, observations are described with ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and used, and to provide traceable heritage from raw measurements to useable information. Physical implementations of this information model include the Observations Data Model (ODM) for storing hydrologic observations, Water Markup Language (WaterML) for encoding observations for transmittal over the Internet, the HydroCatalog metadata catalog database, and the HydroDesktop data cache database. The CUAHSI HIS and this information model have now been in use for several years, and have been deployed across many different academic institutions as well as across several national agency data repositories. Additionally, components of the HIS have been modified to support data management for the Critical Zone Observatories (CZOs). This paper will present limitations of the existing information model used by the CUAHSI HIS that have been uncovered through its deployment and use, as well as new advances to the information model, including: better representation of both in situ observations from field sensors and observations derived from environmental samples, extensibility in attributes used to describe observations, and observation provenance. These advances have been developed by the HIS team and the broader scientific community and will enable the information model to accommodate and better describe wider classes of environmental observations and to better meet the needs of the hydrologic science and CZO communities.
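
    As a hedged illustration (the field names below are hypothetical and are not ODM or WaterML themselves), the snippet shows the kind of metadata an observations information model attaches to a single point measurement so that it can be interpreted unambiguously and traced back to its raw source.

        observation = {
            "value": 2.31,
            "variable": {"name": "discharge", "unit": "m^3/s", "sample_medium": "surface water"},
            "site": {"code": "EX-0001", "lat": 41.745, "lon": -111.813},
            "datetime_utc": "2011-07-04T15:00:00Z",
            "method": {"description": "derived from 15-minute stage record via rating curve"},
            "quality_control_level": "approved",
            "provenance": {"raw_source": "stage sensor EX-0001-S1", "processed_by": "example-toolchain v0.3"},
        }

        def describe(obs):
            """Render one observation with enough metadata to interpret it unambiguously."""
            v = obs["variable"]
            return (f'{v["name"]} = {obs["value"]} {v["unit"]} at site {obs["site"]["code"]} '
                    f'({obs["datetime_utc"]}), QC level: {obs["quality_control_level"]}')

        print(describe(observation))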

  5. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus, whether explanations were model-based or knowledge-based, and the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  6. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  7. Accelerating scientific discovery : 2007 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and the Applications Performance Engineering and Data Analytics (APEDA) teams support users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale, as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.

  8. Development of an Empirically Based Learning Performances Framework for Third-Grade Students' Model-Based Explanations about Plant Processes

    ERIC Educational Resources Information Center

    Zangori, Laura; Forbes, Cory T.

    2016-01-01

    To develop scientific literacy, elementary students should engage in knowledge building of core concepts through scientific practice (Duschl, Schweingruber, & Schouse, 2007). A core scientific practice is engagement in scientific modeling to build conceptual understanding about discipline-specific concepts. Yet scientific modeling remains…

  9. Mathematical Methods of Subjective Modeling in Scientific Research: I. The Mathematical and Empirical Basis

    NASA Astrophysics Data System (ADS)

    Pyt'ev, Yu. P.

    2018-01-01

    A mathematical formalism for subjective modeling is presented, based on modeling of uncertainty that reflects the unreliability of subjective information and the fuzziness common to its content. The model of subjective judgments on values of an unknown parameter $x \in X$ of the model $M(x)$ of a research object is defined by the researcher-modeler as a space $(X, \mathcal{P}(X), \mathrm{Pl}^{\tilde{x}}, \mathrm{Bel}^{\tilde{x}})$ with plausibility measure $\mathrm{Pl}^{\tilde{x}}$ and believability measure $\mathrm{Bel}^{\tilde{x}}$, where $\tilde{x}$ is an uncertain element taking values in $X$ that models the researcher-modeler's uncertain propositions about the unknown $x \in X$. The measures $\mathrm{Pl}^{\tilde{x}}$ and $\mathrm{Bel}^{\tilde{x}}$ model the modalities of the researcher-modeler's subjective judgments on the validity of each $x \in X$: the value of $\mathrm{Pl}^{\tilde{x}}(\tilde{x} = x)$ determines how relatively plausible, in his opinion, the equality $\tilde{x} = x$ is, while the value of $\mathrm{Bel}^{\tilde{x}}(\tilde{x} = x)$ determines how much the equality $\tilde{x} = x$ should be relatively believed in. Versions of plausibility (Pl) and believability (Bel) measures, and of pl- and bel-integrals, that inherit some traits of probabilities and psychophysics and take into account the interests of researcher-modeler groups are considered. It is shown that the mathematical formalism of subjective modeling, unlike "standard" mathematical modeling, enables a researcher-modeler to model both precise formalized knowledge and non-formalized unreliable knowledge, from complete ignorance to precise knowledge of the model of a research object, and to calculate relative plausibilities and believabilities of any features of a research object that are specified by its subjective model $M(\tilde{x})$; if data on observations of the research object are available, it further enables him to estimate the adequacy of the subjective model to the research objective, to correct it by combining subjective ideas and the observation data after testing their consistency, and, finally, to empirically recover the model of the research object.

  10. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    NASA Astrophysics Data System (ADS)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use, both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository for the master and main development branches. The usage of the CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of installation and allows us to enhance portability across different compilers and operating system platforms. The package is also complemented by several software tools which provide web-based visualization of results based on R packages, in particular the "shiny" (Chang et al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.

  11. Modelling Chemical Reasoning to Predict and Invent Reactions.

    PubMed

    Segler, Marwin H S; Waller, Mark P

    2017-05-02

    The ability to reason beyond established knowledge allows organic chemists to solve synthetic problems and invent novel transformations. Herein, we propose a model that mimics chemical reasoning and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outperforms a rule-based expert system in the reaction prediction task for 180 000 randomly selected binary reactions. The data-driven model generalises even beyond known reaction types, and is thus capable of effectively (re-)discovering novel transformations (even including transition metal-catalysed reactions). Our model enables computers to infer hypotheses about reactivity and reactions by considering only the intrinsic local structure of the graph; because each single reaction prediction is typically achieved in a sub-second time frame, the model can be used as a high-throughput generator of reaction hypotheses for reaction discovery. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
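
    As a heavily simplified, hedged illustration (this is not the authors' model), the sketch below scores candidate links in a toy reaction knowledge graph using only local structure, here the number of shared neighbours between two molecule nodes; the molecules and reactions are hypothetical placeholders.

        from itertools import combinations

        # Undirected edges: molecule -- reaction it participates in (toy data).
        edges = [
            ("benzaldehyde", "r1"), ("aniline", "r1"), ("imine_A", "r1"),
            ("benzaldehyde", "r2"), ("methylamine", "r2"), ("imine_B", "r2"),
            ("acetophenone", "r3"), ("aniline", "r3"), ("imine_C", "r3"),
        ]

        neighbours = {}
        for a, b in edges:
            neighbours.setdefault(a, set()).add(b)
            neighbours.setdefault(b, set()).add(a)

        def shared_neighbour_score(u, v):
            """Crude link score: how many graph nodes u and v are both connected to."""
            return len(neighbours.get(u, set()) & neighbours.get(v, set()))

        # Rank molecule pairs (none are directly connected in this bipartite toy graph).
        molecules = [n for n in neighbours if n not in {"r1", "r2", "r3"}]
        candidates = sorted(
            ((shared_neighbour_score(u, v), u, v) for u, v in combinations(molecules, 2)),
            reverse=True,
        )
        for score, u, v in candidates[:3]:
            print(f"{u} -- {v}: shared neighbours = {score}")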

  12. On determining firing delay time of transitions for Petri net based signaling pathways by introducing stochastic decision rules.

    PubMed

    Miwa, Yoshimasa; Li, Chen; Ge, Qi-Wei; Matsuno, Hiroshi; Miyano, Satoru

    2010-01-01

    Parameter determination is important in modeling and simulating biological pathways, including signaling pathways. Parameters are determined according to biological facts obtained from biological experiments and scientific publications. However, such reliable data describing detailed reactions are not reported in most cases. This prompted us to develop a general methodology for determining the parameters of a model in the case that no information on the underlying biological facts is provided. In this study, we use the Petri net approach for modeling signaling pathways, and propose a method to determine firing delay times of transitions for Petri net models of signaling pathways by introducing stochastic decision rules. Petri net technology provides a powerful approach to modeling and simulating various concurrent systems, and has recently been widely accepted as a description method for biological pathways. Our method enables determination of the range of firing delay times that realizes smooth token flows in the Petri net model of a signaling pathway. The availability of this method has been confirmed by the results of an application to the interleukin-1 induced signaling pathway.
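
    As an illustrative sketch (not the authors' algorithm), the snippet below simulates a tiny timed Petri net in which each transition's firing delay is drawn from a candidate range, so one can observe whether tokens flow smoothly or accumulate; the net topology and delay ranges are hypothetical.

        import random

        random.seed(0)

        places = {"ligand": 20, "receptor_bound": 0, "signal": 0}
        transitions = [
            # (name, input places, output places, (min_delay, max_delay))
            ("bind",     {"ligand": 1},         {"receptor_bound": 1}, (0.5, 1.5)),
            ("activate", {"receptor_bound": 1}, {"signal": 1},         (1.0, 3.0)),
        ]

        t, horizon = 0.0, 30.0
        while t < horizon:
            enabled = [tr for tr in transitions
                       if all(places[p] >= n for p, n in tr[1].items())]
            if not enabled:
                break
            _, inputs, outputs, (lo, hi) = random.choice(enabled)
            t += random.uniform(lo, hi)          # stochastic firing delay
            for p, n in inputs.items():          # consume input tokens
                places[p] -= n
            for p, n in outputs.items():         # produce output tokens
                places[p] += n

        print(f"marking at t={t:.1f}:", places)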

  13. On determining firing delay time of transitions for petri net based signaling pathways by introducing stochastic decision rules.

    PubMed

    Miwa, Yoshimasa; Li, Chen; Ge, Qi-Wei; Matsuno, Hiroshi; Miyano, Satoru

    2011-01-01

    Parameter determination is important in modeling and simulating biological pathways, including signaling pathways. Parameters are determined according to biological facts obtained from biological experiments and scientific publications. However, such reliable data describing detailed reactions are not reported in most cases. This prompted us to develop a general methodology for determining the parameters of a model in the case that no information on the underlying biological facts is provided. In this study, we use the Petri net approach for modeling signaling pathways, and propose a method to determine firing delay times of transitions for Petri net models of signaling pathways by introducing stochastic decision rules. Petri net technology provides a powerful approach to modeling and simulating various concurrent systems, and has recently been widely accepted as a description method for biological pathways. Our method enables determination of the range of firing delay times that realizes smooth token flows in the Petri net model of a signaling pathway. The availability of this method has been confirmed by the results of an application to the interleukin-1 induced signaling pathway.

  14. Architectural frameworks: defining the structures for implementing learning health systems.

    PubMed

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions-goals, scientific, social, technical, and ethical-commonly found in the LHS literature. The proposed architectural framework is comprised of six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making. In this paper, we outline a high-level architectural framework grounded in conceptual and empirical LHS literature. Applying this architectural framework can guide the development and implementation of new LHSs and the evolution of existing ones, as it allows for clear and critical understanding of the types of decisions that underlie LHS operations. Further research is required to assess and refine its generalizability and methods.

  15. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

    Data are not only the lifeblood of the geosciences but have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies — along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system — are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change, and NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "Reproducibility or Replicability in Science", which has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, data providers and the scientific communities will need to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers also need to give scientists an ecosystem that includes data, tools, workflows, and other services needed to perform analytics, integration, interpretation, and synthesis - all in the same environment or platform. Instead of moving data to processing systems near users, as is the tradition, the cloud permits one to bring processing, computing, analysis, and visualization to the data - so-called data-proximate workbench capabilities, also known as server-side processing. In this talk, I will present the ongoing work at Unidata to facilitate a new paradigm for doing science by offering a suite of tools, resources, and platforms that leverage cloud technologies for addressing both big data and Open Science/reproducibility challenges. That work includes the development and deployment of new protocols for data access and server-side operations, Docker container images of key applications, JupyterHub Python notebook tools, and cloud-based analysis and visualization capability via the CloudIDV tool to enable reproducible workflows and effectively use the accessed data.

  16. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them to the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly model output analysis library of routines that can be called from any C-supporting language. The CCMC is developing data interpolation tools that enable model output to be presented in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
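
    As an illustrative sketch (not a CCMC tool), the snippet below interpolates gridded model output onto the times and locations of point observations so that the two can be compared in the same format; the grid, the model field, and the "spacecraft" track points are hypothetical.

        import numpy as np

        # Model output: a field on a (time, x) grid (hypothetical values).
        times = np.linspace(0.0, 10.0, 11)
        xgrid = np.linspace(0.0, 5.0, 51)
        field = np.sin(times[:, None]) + 0.1 * xgrid[None, :]

        def sample_model(t_obs, x_obs):
            """Bilinear interpolation of the gridded model field at one observation point."""
            it = np.clip(np.searchsorted(times, t_obs) - 1, 0, len(times) - 2)
            ix = np.clip(np.searchsorted(xgrid, x_obs) - 1, 0, len(xgrid) - 2)
            wt = (t_obs - times[it]) / (times[it + 1] - times[it])
            wx = (x_obs - xgrid[ix]) / (xgrid[ix + 1] - xgrid[ix])
            c = field[it:it + 2, ix:ix + 2]
            return (c[0, 0] * (1 - wt) * (1 - wx) + c[1, 0] * wt * (1 - wx)
                    + c[0, 1] * (1 - wt) * wx + c[1, 1] * wt * wx)

        # Observation points along a hypothetical spacecraft track:
        for t_obs, x_obs in [(1.3, 0.7), (4.8, 2.2), (9.1, 4.9)]:
            print(f"t={t_obs}, x={x_obs}: interpolated model value {sample_model(t_obs, x_obs):+.3f}")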

  17. Study of a lattice Boltzmann model for the computer-assisted simulation of fluids with several immiscible phases [Etude d'un modele de Boltzmann sur reseau pour la simulation assistee par ordinateur des fluides a plusieurs phases immiscibles]

    NASA Astrophysics Data System (ADS)

    Leclaire, Sebastien

    The computer-assisted simulation of the dynamics of fluid flow has been a highly rewarding topic of research for several decades now, in terms of the number of scientific problems that have been solved as a result, both in the academic world and in industry. In the fluid dynamics field, simulating multiphase immiscible fluid flow remains a challenge because of the complexity of the interactions at the flow phase interfaces. Various numerical methods are available to study these phenomena, and the lattice Boltzmann method has been shown in recent years to be well adapted to solving this type of complex flow. In this thesis, a lattice Boltzmann model for the simulation of two-phase immiscible flows is studied. The main objective of the thesis is to develop this promising method further, with a view to enhancing its validity. To achieve this objective, the research is divided into five distinct themes. The first two focus on correcting some of the deficiencies of the original model. The third generalizes the model to support the simulation of N-phase immiscible fluid flows. The fourth is aimed at modifying the model itself, to enable the simulation of immiscible fluid flows in which the density of the phases varies. With the lattice Boltzmann class of models studied here, this density variation has been inadequately modeled, and, after 20 years, the issue still has not been resolved. The fifth, which complements this thesis, is connected with the lattice Boltzmann method in that it generalizes the theory of 2D and 3D isotropic gradients to a high order of spatial precision. These themes have each been the subject of a scientific article, as listed in the appendix to this thesis, and together they constitute a synthesis that explains the links between the articles, as well as their scientific contributions, and satisfies the main objective of this research. Globally, a number of qualitative and quantitative test cases based on the theory of multiphase fluid flows have highlighted issues plaguing the simulation model. These test cases have resulted in various modifications to the model, which have reduced or eliminated some numerical artifacts that were problematic. They also allowed us to validate the extensions that were applied to the original model.
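
    As a hedged, single-phase illustration only (it is not the thesis's multiphase colour-gradient model), the sketch below implements a minimal D2Q9 lattice Boltzmann BGK step, showing the collide-and-stream update that multiphase variants build on; the grid size, relaxation time, and initial shear-wave velocity field are arbitrary choices.

        import numpy as np

        nx, ny, tau = 64, 64, 0.8                     # grid size and BGK relaxation time
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])   # D2Q9 lattice velocities
        w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)         # D2Q9 weights

        def equilibrium(rho, ux, uy):
            cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
            usq = ux ** 2 + uy ** 2
            return rho * w[:, None, None] * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)

        # Initial state: uniform density with a small sinusoidal shear wave in ux.
        y = np.arange(ny)
        rho = np.ones((nx, ny))
        ux = 0.01 * np.sin(2 * np.pi * y / ny)[None, :] * np.ones((nx, 1))
        uy = np.zeros((nx, ny))
        f = equilibrium(rho, ux, uy)

        for step in range(200):
            rho = f.sum(axis=0)                                   # macroscopic density
            ux = (f * c[:, 0, None, None]).sum(axis=0) / rho      # macroscopic velocity
            uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
            f += (equilibrium(rho, ux, uy) - f) / tau             # BGK collision
            for i in range(9):                                    # streaming with periodic boundaries
                f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

        print("max |ux| after 200 steps:", float(np.abs(ux).max()))  # the shear wave decays viscously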

  18. IN13B-1660: Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Technical Reports Server (NTRS)

    Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris

    2016-01-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines for both the production process and the data products, and that enable sharing results with the community. Our project is being developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25PB system.

  19. Analytics and Visualization Pipelines for Big ­Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.

    2016-12-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines for both the production process and the data products, and that enable sharing results with the community. Our project is being developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25PB system.

  20. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors †

    PubMed Central

    Georgitzikis, Epimitheas; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-01-01

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by the high manufacturing cost and low resolution associated with image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III–V-based detector chips with narrow-bandgap thin films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with a tunable absorption peak. A photodiode with a 150-nm-thick absorber in an inverted architecture shows a dark current of 10⁻⁶ A/cm² at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for a top-illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling an order-of-magnitude cost reduction for infrared sensors. PMID:29232871
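
    As a back-of-the-envelope illustration (the EQE, wavelength, and dark current figures come from the abstract; the irradiance is a hypothetical scene value and the conversion uses the standard responsivity formula), the snippet below checks what the quoted quantum efficiency implies for photocurrent relative to the dark current.

        h, c, q = 6.626e-34, 2.998e8, 1.602e-19        # Planck constant, speed of light, electron charge (SI)
        wavelength = 1440e-9                           # m, from the abstract
        eqe = 0.20                                     # external quantum efficiency, from the abstract

        responsivity = eqe * q * wavelength / (h * c)  # A/W
        print(f"responsivity ~ {responsivity:.2f} A/W at 1440 nm")

        irradiance = 1e-3                              # W/cm^2, hypothetical scene irradiance
        photocurrent_density = responsivity * irradiance
        print(f"photocurrent ~ {photocurrent_density:.1e} A/cm^2 vs quoted dark current 1e-6 A/cm^2")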

  1. OntoSoft: A Software Registry for Geosciences

    NASA Astrophysics Data System (ADS)

    Garijo, D.; Gil, Y.

    2017-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
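
    As a hedged sketch (the fields below are hypothetical and do not reproduce the actual OntoSoft ontology), the snippet shows the sort of structured metadata a software registry can crowdsource so that scientific software is findable, citable, and comparable across entries.

        import json

        entry = {
            "name": "ExampleHydroModel",                          # hypothetical software entry
            "description": "Toy distributed rainfall-runoff model for teaching.",
            "version": "1.2.0",
            "license": "Apache-2.0",
            "programming_language": "Python",
            "repository": "https://example.org/examplehydro",     # hypothetical URL
            "inputs": ["gridded precipitation", "digital elevation model"],
            "outputs": ["streamflow time series"],
            "preferred_citation": "Doe, J. (2017). ExampleHydroModel v1.2.",
        }

        print(json.dumps(entry, indent=2))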

  2. Planetary In Situ Resource Utilization: 2000-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes technologies for ultimately enabling us to "cut the cord" with Earth for space logistics. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  3. Final Scientific Report - Wireless and Sensing Solutions Advancing Industrial Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budampati, Rama; McBrady, Adam; Nusseibeh, Fouad

    2009-09-28

    The project team's goal for the Wireless and Sensing Solution Advancing Industrial Efficiency award (DE-FC36-04GO14002) was to develop, demonstrate, and test a number of leading edge technologies that could enable the emergence of wireless sensor and sampling systems for the industrial market space. This effort combined initiatives in advanced sensor development, configurable sampling and deployment platforms, and robust wireless communications to address critical obstacles in enabling enhanced industrial efficiency.

  4. The International Space Station: Systems and Science

    NASA Technical Reports Server (NTRS)

    Giblin, Timothy W.

    2010-01-01

    ISS Program Mission: Safely build, operate, and utilize a permanent human outpost in space through an international partnership of government, industry, and academia to advance exploration of the solar system, conduct scientific research, and enable commerce in space.

  5. APPROACHES IN PROTEOMICS AND GENOMICS FOR ECO-TOXICOLOGY

    EPA Science Inventory

    A new area of scientific investigation, coined toxicogenomics, enables researchers to understand and study the interaction between the environment and inherited genetic characteristics. This understanding will be critical to fully appreciate the response of organisms to environm...

  6. Technology Needs to Support Future Mars Exploration

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.; Baker, John; Lillard, Randolph P.

    2013-01-01

    The Mars Program Planning Group (MPPG), under the direction of Dr. Orlando Figueroa, was chartered to develop options for a program-level architecture for robotic exploration of Mars consistent with the objective to send humans to Mars in the 2030's. Scientific pathways were defined for future exploration, and multiple architectural options were developed that meet current science goals and support the future human exploration objectives. Integral to the process was the identification of critical technologies which enable the future scientific and human exploration goals. This paper describes the process for technology capabilities identification and examines the critical capability needs identified in the MPPG process. Several critical enabling technologies have been identified that support the robotic exploration goals and have potential feed-forward application to human exploration goals. Potential roadmaps for the development and validation of these technologies are discussed, including options for subscale technology demonstrations of future human exploration technologies on robotic missions.

  7. System concepts and enabling technologies for an ESA low-cost mission to Jupiter / Europa

    NASA Astrophysics Data System (ADS)

    Renard, P.; Koeck, C.; Kemble, Steve; Atzei, Alessandro; Falkner, Peter

    2004-11-01

    The European Space Agency is currently studying the Jovian Minisat Explorer (JME) as part of its Technology Reference Studies (TRS), which are used in its development plan for technologies enabling future scientific missions. The JME focuses on the exploration of the Jovian system and particularly of Europa. The Jupiter Minisat Orbiter (JMO) study concerns the first mission phase of JME, which comprises up to three missions using pairs of minisats. The scientific objectives are the investigation of Europa's global topography, the composition of its (sub)surface, and the demonstration of the existence of a subsurface ocean below its icy crust. The present paper describes the candidate JMO system concept, based on a Europa Orbiter (JEO) supported by a communications relay satellite (JRS), and its associated technology development plan. It summarizes an analysis performed in 2004 jointly by ESA and the EADS-Astrium Company in the framework of an industrial technical assistance to ESA.

  8. PixelLearn

    NASA Technical Reports Server (NTRS)

    Mazzoni, Dominic; Wagstaff, Kiri; Bornstein, Benjamin; Tang, Nghia; Roden, Joseph

    2006-01-01

    PixelLearn is an integrated user-interface computer program for classifying pixels in scientific images. Heretofore, training a machine-learning algorithm to classify pixels in images has been tedious and difficult. PixelLearn provides a graphical user interface that makes the process faster and more intuitive, leading to more interactive exploration of image data sets. PixelLearn also provides image-enhancement controls to make it easier to see subtle details in images. PixelLearn opens images or sets of images in a variety of common scientific file formats and enables the user to interact with several supervised or unsupervised machine-learning pixel-classifying algorithms while the user continues to browse through the images. The machine-learning algorithms in PixelLearn use advanced clustering and classification methods that enable accuracy much higher than is achievable by most other software previously available for this purpose. PixelLearn is written in portable C++ and runs natively on computers running Linux, Windows, or Mac OS X.
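
    As an illustrative sketch (it does not reproduce PixelLearn's own algorithms), the snippet below clusters the pixels of a synthetic two-band image with k-means, the general kind of unsupervised pixel classification an interactive tool can expose alongside image browsing; the image and cluster count are arbitrary.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        h, w = 64, 64
        band1 = np.where(np.arange(w)[None, :] < w // 2, 0.2, 0.8) + 0.05 * rng.standard_normal((h, w))
        band2 = np.where(np.arange(h)[:, None] < h // 2, 0.7, 0.3) + 0.05 * rng.standard_normal((h, w))

        pixels = np.stack([band1.ravel(), band2.ravel()], axis=1)   # shape (n_pixels, n_bands)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(pixels)
        class_map = labels.reshape(h, w)                            # per-pixel class image

        print("pixels per class:", np.bincount(labels))
        print("class map shape:", class_map.shape)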

  9. Enabling the Use of Authentic Scientific Data in the Classroom--Lessons Learned from the AccessData and Data Services Workshops

    NASA Astrophysics Data System (ADS)

    Lynds, S. E.; Buhr, S. M.; Ledley, T. S.

    2007-12-01

    Since 2004, the annual AccessData and DLESE Data Services workshops have gathered scientists, data managers, technology specialists, teachers, and curriculum developers to work together creating classroom-ready scientific data modules. Teams of five (one participant from each of the five professions) develop topic-specific online educational units of the Earth Exploration Toolbook (serc.carleton.edu/eet/). Educators from middle schools through undergraduate colleges have been represented, as have scientific data professionals from many organizations across the United States. Extensive evaluation has been included in the design of each workshop. The evaluation results have been used each year to improve subsequent workshops. In addition to refining the format and process of the workshop itself, evaluation data collected reveal attendees' experiences using scientific data for educational purposes. Workshop attendees greatly value the opportunity to network with those of other professional roles in developing a real-world education project using scientific data. Educators appreciate the opportunity to work directly with scientists and technology specialists, while researchers and those in technical fields value the classroom expertise of the educators. Attendees' data use experiences are explored every year. Although bandwidth and connectivity were problems for data use in 2004, that has become much less common over time. The most common barriers to data use cited now are discoverability, data format problems, incomplete data sets, and poor documentation. Most attendees agree that the most useful types of online documentation and user support for scientific data are step-by-step instructions, examples, tutorials, and reference manuals. Satellite imagery and weather data were the most commonly used types of data, and these were often modified for use in the classroom. This presentation will discuss supports and barriers to the use of scientific data in the classroom, as well as the benefits and challenges of using collaborations between technical and educational professionals to develop resources for the classroom.

  10. A collection of micrographs: where science and art meet

    PubMed Central

    Uskoković, Vuk

    2013-01-01

    Micrographs obtained using different instrumental techniques are presented with the purpose of demonstrating their artistic qualities. The quality of uniformity currently dominates the aesthetic assessment in scientific practice and is discussed in relation to the classical appreciation of the interplay between symmetry and asymmetry in the arts. It is argued that scientific and artistic qualities have converged and inspired each other throughout millennia. With scientific discoveries and inventions enriching the world of communication, broadening the space for artistic creativity, and making artistic products more accessible than ever, science inevitably influences artistic creativity. On the other hand, the importance of aesthetic principles in guiding scientific conduct has been appreciated by some of the most creative scientific minds. Science and the arts can thus be considered parallel rails of a single railroad track; only when the two are precisely coordinated can the train of human knowledge pass. The presented micrographs, occupying the central part of this discourse, are displayed with the purpose of showing the rich aesthetic character of even the most ordinary scientific images. The inherent aesthetic nature of scientific imagery and the artistic nature of scientific conduct are thus offered as the conclusion. PMID:24465169

  11. Mid June in the North Atlantic [crop

    NASA Image and Video Library

    2015-06-18

    Phytoplankton communities and sea ice limn the turbulent flow field around Iceland in this Suomi-NPP/VIIRS scene collected on June 14, 2015. Credit: NASA/Goddard/Suomi NPP/VIIRS NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  12. Mid June in the North Atlantic

    NASA Image and Video Library

    2015-06-18

    Phytoplankton communities and sea ice limn the turbulent flow field around Iceland in this Suomi-NPP/VIIRS scene collected on June 14, 2015. Credit: NASA/Goddard/Suomi NPP/VIIRS NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  13. JWST Flight Mirrors

    NASA Image and Video Library

    2011-05-25

    Project scientist Mark Clampin is reflected in the flight mirrors of the Webb Space Telescope at Marshall Space Flight Center. Portions of the Webb telescope are being built at NASA Goddard. Credit: Ball Aerospace/NASA NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  14. Airborne Cloud Computing Environment (ACCE)

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment develops a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality and cost effectiveness of airborne data systems and to enable new scientific discovery and research in Earth observation.

  15. NASA's 3D view shows Hurricane Matthew's intensity

    NASA Image and Video Library

    2017-12-08

    Scientists use satellite data to peer into the massive storm – learning how and why it changed throughout its course. More info: www.nasa.gov/matthew NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  16. Tracking-Data-Conversion Tool

    NASA Technical Reports Server (NTRS)

    Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  17. Tropical Cyclone Madi Approaching India

    NASA Image and Video Library

    2013-12-09

    Tropical Cyclone Madi approaching India. Acquired by Aqua/MODIS on 12/07/2013 at 07:55 UTC. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  18. Microsensors and Microinstruments for Space Science and Exploration

    NASA Technical Reports Server (NTRS)

    Kukkonen, C. A.; Venneri, S.

    1997-01-01

    Most future NASA spacecraft will be small, low-cost, highly integrated vehicles using advanced technology. This will also be true of planetary rovers. In order to maintain the high scientific value of these missions, the instruments, sensors, and subsystems must be dramatically miniaturized without compromising their measurement capabilities. A rover must be designed to deliver its science package. In fact, the rover should be considered the arms, legs, and/or wheels needed to enable a mobile integrated scientific payload.

  19. Software Framework for Peer Data-Management Services

    NASA Technical Reports Server (NTRS)

    Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
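
    As a rough illustration of this peer-to-peer pattern, and not of OODT's actual component interfaces, the Python sketch below federates two hypothetical peers, each managing its own product catalog, so that a keyword search started at one peer also returns products managed by the other. The peer names, product IDs, and metadata strings are invented for the example.

    # Generic sketch of distributed peer search; NOT the Apache OODT API.
    from dataclasses import dataclass, field

    @dataclass
    class Peer:
        name: str
        catalog: dict = field(default_factory=dict)    # product id -> metadata string
        neighbors: list = field(default_factory=list)  # other Peer instances

        def search(self, keyword, _seen=None):
            """Return (peer, product id, metadata) hits from this peer and its neighbors."""
            _seen = set() if _seen is None else _seen
            if self.name in _seen:                     # avoid revisiting a peer
                return []
            _seen.add(self.name)
            hits = [(self.name, pid, meta)
                    for pid, meta in self.catalog.items()
                    if keyword.lower() in meta.lower()]
            for peer in self.neighbors:                # fan the query out to known peers
                hits.extend(peer.search(keyword, _seen))
            return hits

    # Two peers managing their own data behave as one searchable system.
    site_a = Peer("site_a", {"P001": "MODIS ocean color granule"})
    site_b = Peer("site_b", {"P002": "GOES infrared imagery"})
    site_a.neighbors.append(site_b)
    site_b.neighbors.append(site_a)
    print(site_a.search("imagery"))                    # hit is managed by the other peer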

  20. Historic Hurricane Patricia Bears Down on Mexico's Pacific Coast

    NASA Image and Video Library

    2017-12-08

    This image was taken by GOES East at 1445Z on October 23, 2015. Credit: NASA/NOAA via NOAA Environmental Visualization Laboratory NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
