75 FR 34095 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-16
...: University of Minnesota (Dept. of Chemical Engineering and Materials Science), 151 Amundson Hall, 421... Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... coatings, of very high crystalline quality materials known as complex oxides. A pertinent characteristic of...
Managing Scientific Software Complexity with Bocca and CCA
Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...
2008-01-01
In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.
Sight Application Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G.
2014-09-17
The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractors staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep appraised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
ERIC Educational Resources Information Center
Kim, Sangsoo; Park, Jongwon
2018-01-01
Observing scientific events or objects is a complex process that occurs through the interaction between the observer's knowledge or expectations, the surrounding context, physiological features of the human senses, scientific inquiry processes, and the use of observational instruments. Scientific observation has various features specific to this…
[The organization of scientific innovative laboratory complex of modern technologies].
Totskaia, E G; Rozhnova, O M; Mamonova, E V
2013-01-01
The article discusses the actual issues of scientific innovative activity during the realization of principles of private-public partnership. The experience of development of model of scientific innovative complex is presented The possibilities to implement research achievements and their application in the area of cell technologies, technologies of regenerative medicine, biochip technologies are demonstrated. The opportunities to provide high level of diagnostic and treatment in practical health care increase of accessibility and quality of medical care and population health promotion are discussed.
Stehr, N; Grundmann, R
2001-06-01
The assertion about the unique 'complexity' or the peculiarly intricate character of social phenomena has, at least within sociology, a long, venerable and virtually uncontested tradition. At the turn of the last century, classical social theorists, for example, Georg Simmel and Emile Durkheim, made prominent and repeated reference to this attribute of the subject matter of sociology and the degree to which it complicates, even inhibits, the development and application of social scientific knowledge. Our paper explores the origins, the basis and the consequences of this assertion and asks in particular whether the classic complexity assertion still deserves to be invoked in analyses that ask about the production and the utilization of social scientific knowledge in modern society. We present John Maynard Keynes' economic theory and its practical applications as an illustration. We conclude that the practical value of social scientific knowledge is not dependent on a faithful, in the sense of complete, representation of social reality. Instead, social scientific knowledge that wants to optimize its practicality has to attend and attach itself to elements of social situations that can be altered or are actionable.
Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing
Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong
2014-01-01
This paper further investigates how to facilitate large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm is used as a real example of large-scale scientific computing to evaluate the issues presented above. The results show that the high-level program interface makes it easier to build complex scientific applications on a large-scale scientific platform, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform for scientific applications that process big data. PMID:24574931
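The record above uses a block-based Gauss-Jordan algorithm as its benchmark workload. As a point of reference only, here is a minimal single-node NumPy sketch of Gauss-Jordan matrix inversion with partial pivoting; the paper's block-partitioned, grid-distributed variant differs in how the matrix is decomposed and scheduled.

```python
import numpy as np

def gauss_jordan_inverse(a):
    """Invert a square matrix by Gauss-Jordan elimination with partial pivoting.

    Illustrative single-node version; the paper evaluates a block-partitioned
    variant executed over grid/desktop-grid resources.
    """
    a = np.asarray(a, dtype=float)
    n = a.shape[0]
    aug = np.hstack([a, np.eye(n)])                      # augmented matrix [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(aug[col:, col]))  # partial pivoting
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        aug[[col, pivot]] = aug[[pivot, col]]            # swap pivot row into place
        aug[col] /= aug[col, col]                        # normalize pivot row
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]     # eliminate this column elsewhere
    return aug[:, n:]                                    # right half now holds A^{-1}

if __name__ == "__main__":
    a = np.array([[4.0, 1.0], [2.0, 3.0]])
    assert np.allclose(gauss_jordan_inverse(a) @ a, np.eye(2))
```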
The application of interactive graphics to large time-dependent hydrodynamics problems
NASA Technical Reports Server (NTRS)
Gama-Lobo, F.; Maas, L. D.
1975-01-01
A written companion of a movie entitled "Interactive Graphics at Los Alamos Scientific Laboratory" was presented. While the movie presents the actual graphics terminal and the functions performed on it, the paper attempts to put in perspective the complexity of the application code and the complexity of the interaction that is possible.
The Importance of Why: An Intelligence Approach for a Multi-Polar World
2016-04-04
December 27, 2015). 12. 2 Jupiter Scientific, “Definitions of Important Terms in Chaos Theory,” Jupiter Scientific website, http...Important Terms in Chaos Theory.” Linearizing a system is approximating a nonlinear system through the application of a linear system model. 25...Complexity Theory to Anticipate Strategic Surprise,” 24. 16 M. Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (New
Scientific Programming Using Java: A Remote Sensing Example
NASA Technical Reports Server (NTRS)
Prados, Don; Mohamed, Mohamed A.; Johnson, Michael; Cao, Changyong; Gasser, Jerry
1999-01-01
This paper presents results of a project to port remote sensing code from the C programming language to Java. The advantages and disadvantages of using Java versus C as a scientific programming language in remote sensing applications are discussed. Remote sensing applications deal with voluminous data that require effective memory management, such as buffering operations, when processed. Some of these applications also implement complex computational algorithms, such as Fast Fourier Transform analysis, that are very performance intensive. Factors considered include performance, precision, complexity, rapidity of development, ease of code reuse, ease of maintenance, memory management, and platform independence. Performance results are also presented for radiometric calibration code that uses Java for the graphical user interface and C for the domain model.
Opal web services for biomedical applications.
Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W
2010-07-01
Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
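The abstract above emphasizes programmatic access to applications wrapped as SOAP web services. The sketch below shows that general pattern with the generic zeep SOAP client; the WSDL URL and the operation/argument names are placeholders, not Opal's actual interface, so consult the service's published WSDL for the real signatures.

```python
# Minimal sketch of programmatic access to a SOAP-wrapped scientific application,
# in the spirit of the service model described above. The WSDL URL and the
# operation/argument names are hypothetical placeholders, not Opal's real API.
from zeep import Client  # generic SOAP client (pip install zeep)

client = Client("http://example.org/opal/MyAppServicePort?wsdl")          # hypothetical WSDL
job = client.service.launchJob(argList="-in query.fasta -out hits.txt")   # hypothetical operation
print(job)  # a real service typically returns a job handle/status to poll for completion
```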
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
Multicore Architecture-aware Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinivasa, Avinash
Modern high performance systems are becoming increasingly complex and powerful due to advancements in processor and memory architecture. In order to keep up with this increasing complexity, applications have to be augmented with certain capabilities to fully exploit such systems. These may be at the application level, such as static or dynamic adaptations, or at the system level, like having strategies in place to override some of the default operating system policies, the main objective being to improve the computational performance of the application. The current work proposes two such capabilities with respect to multi-threaded scientific applications, in particular a large-scale physics application computing ab-initio nuclear structure. The first involves using a middleware tool to invoke dynamic adaptations in the application, so as to be able to adjust to the changing computational resource availability at run-time. The second involves a strategy for effective placement of data in main memory, to optimize memory access latencies and bandwidth. These capabilities, when included, were found to have a significant impact on the application performance, resulting in average speedups of as much as two to four times.
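As a loose analogy to the first capability described above (adapting to the computational resources available at run time), the sketch below sizes a worker pool from the CPU set currently granted to the process. It uses only standard-library calls and is not the middleware tool used in the report.

```python
# Illustrative analogy only (not the middleware used in the report): poll the
# CPU set currently available to this process and size a thread pool to match,
# so the run adapts when the scheduler/affinity mask changes between invocations.
import os
from concurrent.futures import ThreadPoolExecutor

def available_cores():
    # sched_getaffinity reports the cores this process may run on (Linux).
    try:
        return len(os.sched_getaffinity(0))
    except AttributeError:              # platforms without sched_getaffinity
        return os.cpu_count() or 1

def run_adaptively(tasks):
    workers = available_cores()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda f: f(), tasks))

if __name__ == "__main__":
    results = run_adaptively([lambda: sum(range(100_000)) for _ in range(8)])
    print(len(results), "tasks finished on", available_cores(), "cores")
```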
pFlogger: The Parallel Fortran Logging Utility
NASA Technical Reports Server (NTRS)
Clune, Tom; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
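For comparison only, the snippet below shows the kind of severity-levelled, configurable diagnostics the paper advocates, using Python's standard logging module. pFlogger itself is a Fortran framework with parallelism-aware extensions, so its API differs; this is just the familiar "logger" pattern the abstract refers to.

```python
# The general "logger" pattern referenced above, shown with Python's standard
# logging module (not pFlogger's Fortran API): configuration echo, progress,
# simple metric checks, all routed through one configurable facility.
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("solver")

log.info("configuration: dt=%g s, nsteps=%d", 300.0, 1000)   # reiterate config parameters
for step in range(1, 4):
    residual = 10.0 ** -step
    log.debug("step %d residual %.3e", step, residual)        # suppressed at INFO level
    if residual < 1e-2:
        log.info("solver converged at step %d", step)
        break
log.warning("mass conservation drift above tolerance")        # simple metric check
```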
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinlan, D.; Yi, Q.; Buduc, R.
2005-02-17
ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is a part of current research on telescoping languages, which provides optimizations of the use of libraries in scientific applications. ROSE defines approaches to extend the optimization techniques, common in well defined languages, to the optimization of scientific applications using well defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of application codes. We currently support full C and C++ (including template instantiation etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full-scale DOE application codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on DOE full-scale applications.
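ROSE's central idea is a user-written translation pass over source code. As a loose analogy in a different setting (not ROSE's C++ infrastructure), the same idea can be expressed with Python's standard ast module: parse, transform a chosen pattern, and emit source again.

```python
# Loose analogy only, not ROSE's API: a custom source-to-source pass expressed
# with Python's ast module, rewriting calls to math.pow(x, 2) into x * x.
import ast

class PowToMul(ast.NodeTransformer):
    def visit_Call(self, node):
        self.generic_visit(node)
        if (isinstance(node.func, ast.Attribute) and node.func.attr == "pow"
                and isinstance(node.func.value, ast.Name) and node.func.value.id == "math"
                and len(node.args) == 2
                and isinstance(node.args[1], ast.Constant) and node.args[1].value == 2):
            # Replace the call node with an equivalent multiplication.
            return ast.BinOp(left=node.args[0], op=ast.Mult(), right=node.args[0])
        return node

src = "import math\ny = math.pow(x, 2) + math.pow(x, 3)\n"
tree = ast.fix_missing_locations(PowToMul().visit(ast.parse(src)))
print(ast.unparse(tree))   # y = x * x + math.pow(x, 3)
```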
Proceedings of the Sierra Nevada Science Symposium
Dennis D. Murphy; Peter A. Stine
2004-01-01
Land and resource management issues in the Sierra Nevada are becoming increasingly complex and controversial. The objective of the Sierra Nevada Science Symposium was to provide a synoptic overview of the current state of scientific knowledge related to key management issues. Attempts were made to tie recent scientific findings to applications in land management and...
Position Paper - pFLogger: The Parallel Fortran Logging framework for HPC Applications
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or logger) similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger, a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
POSITION PAPER - pFLogger: The Parallel Fortran Logging Framework for HPC Applications
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
Bim Automation: Advanced Modeling Generative Process for Complex Structures
NASA Astrophysics Data System (ADS)
Banfi, F.; Fai, S.; Brumana, R.
2017-08-01
The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms, morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) and multiple levels of detail (Mixed and ReverseLoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.
Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Koo, Michelle; Cao, Yu
Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
An adaptable XML based approach for scientific data management and integration
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo
2008-03-01
Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important and relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to model complex scientific data by representing them as XML documents. The documents capture not only hierarchical structured data, but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make it convenient for quick browsing. To provide generalization, schemas and hierarchies are customizable with XML-based definitions, so it is possible to quickly adapt the system to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort provides a flexible and powerful platform for sharing scientific data among scientific research communities, and has been successfully used in both biomedical research and clinical trials.
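The abstract describes documents that mix hierarchical metadata with references to images and raw files. The sketch below builds a document of that general shape with Python's standard library; the element and attribute names are made up for illustration and are not SciPort's actual schema.

```python
# Illustrative only: an XML document of the general shape described above --
# hierarchical structured data plus references to images and raw files.
# Element/attribute names are invented, not SciPort's actual schema.
import xml.etree.ElementTree as ET

doc = ET.Element("experiment", id="EXP-0042", site="local-server-A")
meta = ET.SubElement(doc, "metadata")
ET.SubElement(meta, "investigator").text = "J. Doe"
ET.SubElement(meta, "protocol", version="1.3").text = "cell staining"

results = ET.SubElement(doc, "results")
ET.SubElement(results, "measurement", name="viability", units="percent").text = "87.5"
ET.SubElement(results, "image", href="images/plate_07.tiff")   # reference, not inline data
ET.SubElement(results, "rawData", href="raw/run_0042.dat")

ET.indent(doc)                                    # Python 3.9+ pretty-printing
print(ET.tostring(doc, encoding="unicode"))
```

A central server holding many such documents in a native XML database can then serve structured queries (the abstract mentions XQuery) over both the metadata and the references.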
An Adaptable XML Based Approach for Scientific Data Management and Integration.
Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo
2008-02-20
Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important and relies on integrating and sharing data from distributed institutions. We develop SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to model complex scientific data by representing them as XML documents. The documents capture not only hierarchical structured data, but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make it convenient for quick browsing. To provide generalization, schemas and hierarchies are customizable with XML-based definitions, so it is possible to quickly adapt the system to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort provides a flexible and powerful platform for sharing scientific data among scientific research communities, and has been successfully used in both biomedical research and clinical trials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver
2009-11-20
Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The MATLAB-based analysis framework and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
Lattice Boltzmann Modeling of Complex Flows for Engineering Applications
NASA Astrophysics Data System (ADS)
Montessori, Andrea; Falcucci, Giacomo
2018-01-01
Nature continuously presents a huge number of complex and multiscale phenomena, which in many cases, involve the presence of one or more fluids flowing, merging and evolving around us. Since the very first years of the third millennium, the Lattice Boltzmann method (LB) has seen an exponential growth of applications, especially in the fields connected with the simulation of complex and soft matter flows. LB, in fact, has shown a remarkable versatility in different fields of applications from nanoactive materials, free surface flows, and multiphase and reactive flows to the simulation of the processes inside engines and fluid machinery. In this book, the authors present the most recent advances of the application of the LB to complex flow phenomena of scientific and technical interest with focus on the multiscale modeling of heterogeneous catalysis within nano-porous media and multiphase, multicomponent flows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevrekidis, Ioannis G.
The work explored the linking of modern developing machine learning techniques (manifold learning and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments - in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework
Ghoshal, Devarshi; Hendrix, Valerie; Fox, William; ...
2017-02-01
Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance and cost trade-offs, complex application choices, and the complexity associated with elasticity and failure rates in these environments. The different data access patterns of data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage the storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real time, or to predetermine the data partitions prior to application execution.
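The control-plane/execution-plane split described above can be sketched in a few lines: one function decides how data is partitioned, another farms the partitions out to workers. The sketch below is a hypothetical, minimal rendering of that split, not FRIEDA's actual interfaces.

```python
# Hypothetical, minimal rendering of the control-plane / execution-plane split
# described above: the control plane chooses a (here, predetermined round-robin)
# partitioning strategy; the execution plane runs partitions via master-worker.
from concurrent.futures import ProcessPoolExecutor

def partition(files, n_workers):
    """Control plane: a predetermined, round-robin partitioning strategy."""
    return [files[i::n_workers] for i in range(n_workers)]

def analyze(chunk):
    """Worker task: stand-in for the real per-partition analysis."""
    return sum(len(name) for name in chunk)

def execute(files, n_workers=4):
    """Execution plane: the master hands each partition to a worker."""
    chunks = partition(files, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(analyze, chunks))

if __name__ == "__main__":
    print(execute([f"obs_{i}.h5" for i in range(10)]))
```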
FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghoshal, Devarshi; Hendrix, Valerie; Fox, William
Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance and cost trade-offs, complex application choices, and the complexity associated with elasticity and failure rates in these environments. The different data access patterns of data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage the storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real time, or to predetermine the data partitions prior to application execution.
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David
1987-01-01
A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
Nektar++: An open-source spectral/hp element framework
NASA Astrophysics Data System (ADS)
Cantwell, C. D.; Moxey, D.; Comerford, A.; Bolis, A.; Rocco, G.; Mengaldo, G.; De Grazia, D.; Yakovlev, S.; Lombard, J.-E.; Ekelschot, D.; Jordi, B.; Xu, H.; Mohamied, Y.; Eskilsson, C.; Nelson, B.; Vos, P.; Biotto, C.; Kirby, R. M.; Sherwin, S. J.
2015-07-01
Nektar++ is an open-source software framework designed to support the development of high-performance scalable solvers for partial differential equations using the spectral/hp element method. High-order methods are gaining prominence in several engineering and biomedical applications due to their improved accuracy over low-order techniques at reduced computational cost for a given number of degrees of freedom. However, their proliferation is often limited by their complexity, which makes these methods challenging to implement and use. Nektar++ is an initiative to overcome this limitation by encapsulating the mathematical complexities of the underlying method within an efficient C++ framework, making the techniques more accessible to the broader scientific and industrial communities. The software supports a variety of discretisation techniques and implementation strategies, supporting methods research as well as application-focused computation, and the multi-layered structure of the framework allows the user to embrace as much or as little of the complexity as they need. The libraries capture the mathematical constructs of spectral/hp element methods, while the associated collection of pre-written PDE solvers provides out-of-the-box application-level functionality and a template for users who wish to develop solutions for addressing questions in their own scientific domains.
Science, technology and the future of small autonomous drones.
Floreano, Dario; Wood, Robert J
2015-05-28
We are witnessing the advent of a new era of robots - drones - that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
A characterization of workflow management systems for extreme-scale applications
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...
2017-02-16
The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
Ultrafast fiber lasers: practical applications
NASA Astrophysics Data System (ADS)
Pastirk, Igor; Sell, Alexander; Herda, Robert; Brodschelm, Andreas; Zach, Armin
2015-05-01
Over the past three decades ultrafast lasers have come a long way from bulky, demanding and very sensitive scientific research projects to widely available commercial products. For the majority of this period the titanium-sapphire-based ultrafast systems were the workhorse for scientific and emerging industrial and biomedical applications. However, the complexity and intrinsic bulkiness of solid state lasers have prevented even larger penetration into a wider array of practical applications. With the emergence of femtosecond fiber lasers, based primarily on Er-doped and Yb-doped fibers, that provide compact, inexpensive and dependable fs and ps pulses, new practical applications have become a reality. An overview of current state-of-the-art ultrafast fiber sources, their basic principles and most prominent applications will be presented, including micromachining and biomedical implementations (ophthalmology) on one end of the pulse energy spectrum and 3D lithography and THz applications on the other.
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
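The core idea above (reconstructing the lineage of a scientific result) can be illustrated conceptually as a backwards walk over a provenance graph. The sketch below is illustrative only: the data structure and names are invented and do not reflect the Parasite Experiment ontology, the query operators, or the materialized provenance views of the T.cruzi PMS.

```python
# Conceptual illustration only (not the T.cruzi PMS ontology or query engine):
# provenance as a mapping from each derived artifact to the step and inputs that
# produced it, with a query that reconstructs a result's full lineage.
def lineage(provenance, artifact):
    """Walk backwards from an artifact to everything it was derived from."""
    record = provenance.get(artifact)
    if record is None:                       # raw input: no upstream history
        return {artifact}
    derived = {artifact}
    for source in record["inputs"]:
        derived |= lineage(provenance, source)
    return derived

provenance = {
    "gene_knockout_result": {"step": "strain_creation", "inputs": ["plasmid_A", "parasite_sample"]},
    "plasmid_A":            {"step": "cloning",         "inputs": ["target_gene_seq"]},
}
print(lineage(provenance, "gene_knockout_result"))
# contains: gene_knockout_result, plasmid_A, parasite_sample, target_gene_seq
```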
Application of the Institution of Exclusive Rights in the Field of Science
NASA Astrophysics Data System (ADS)
Yakovlev, D.; Yushkov, E.; Zanardo, A.; Bogatyreova, M.
2017-01-01
The problem of legal protection of scientific research results is of growing interest nowadays. However, none of the three hitherto existing rights (the right for trade secrets, patent and copyright) is able to fully take into account the characteristics of scientific activities. In Russia, the problem of legal protection of scientific research results has been developed actively since the 1950s, in connection with the introduction of the system of state registration of scientific discoveries. A further concept allowed for the registration not only of discoveries, but of the entire array of scientific results. However, the theoretical applicability of exclusive rights institutions in the sphere of science remained unstudied. The article describes a new system, which is not fixed in legislation and remains unnoticed by the vast majority of researchers. That is the institution of scientific and positional rights, focused on the recognition procedure of authorship, priority, and other characteristics of the value of intellectual scientific results. In the case of complex intellectual results comprising scientific results, the recognition of result-oriented exclusive rights proves to be unsustainable. This circumstance urges us to foreground the institution of scientific and positional exclusive rights. Its scope is budget science, where non-fee published scientific results are generated. Any exclusive right to use open scientific results is out of the question. The sphere of open (budget) science is dominated by scientific and positional exclusive rights, sanctioned by the state (S-sanctioned), by the bodies of the scientific community (BSC-sanctioned), and by the scientific community itself (SC-sanctioned).
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
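The design idea above is that components conform to abstract interfaces for energy and gradient evaluation, so optimization and chemistry packages can be combined interchangeably. The sketch below is a schematic Python rendering of that idea under assumed, simplified interfaces; it is not the actual CCA/SIDL interface used in the paper.

```python
# Schematic rendering of the design idea above: an abstract energy/gradient
# interface lets any optimizer drive any chemistry component. Names and
# interfaces are assumptions for illustration, not the paper's CCA/SIDL API.
from abc import ABC, abstractmethod
import numpy as np

class ModelEvaluator(ABC):
    """Abstract interface: a 'chemistry' component provides energy + gradient."""
    @abstractmethod
    def energy(self, x: np.ndarray) -> float: ...
    @abstractmethod
    def gradient(self, x: np.ndarray) -> np.ndarray: ...

class QuadraticToyModel(ModelEvaluator):
    """Stand-in 'chemistry' component: a simple quadratic energy surface."""
    def energy(self, x):
        return float(np.sum((x - 1.0) ** 2))
    def gradient(self, x):
        return 2.0 * (x - 1.0)

def steepest_descent(model: ModelEvaluator, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Stand-in 'optimization' component: works with any ModelEvaluator."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = model.gradient(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x, model.energy(x)

print(steepest_descent(QuadraticToyModel(), [0.0, 3.0]))   # converges near (1, 1)
```

Because the optimizer depends only on the abstract interface, swapping in a different model (or a different optimizer) requires no change to the other side, which is the interchangeability the abstract interfaces are meant to provide.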
Drawing the PDB: Protein-Ligand Complexes in Two Dimensions.
Stierand, Katrin; Rarey, Matthias
2010-12-09
The two-dimensional representation of molecules is a popular communication medium in chemistry and the associated scientific fields. Computational methods for drawing small molecules with and without manual investigation are well-established and widely spread in terms of numerous software tools. Concerning the planar depiction of molecular complexes, there is considerably less choice. We developed the software PoseView, which automatically generates two-dimensional diagrams of macromolecular complexes, showing the ligand, the interactions, and the interacting residues. All depicted molecules are drawn on an atomic level as structure diagrams; thus, the output plots are clearly structured and easily readable for the scientist. We tested the performance of PoseView in a large-scale application on nearly all druglike complexes of the PDB (approximately 200,000 complexes); for more than 92% of the complexes considered for drawing, a layout could be computed. In the following, we will present the results of this application study.
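PoseView itself is not shown here, but the basic task it automates, computing a 2D structure diagram, can be illustrated with the open-source RDKit toolkit for a lone small molecule. This is a swapped-in tool for illustration only; PoseView additionally lays out the interacting residues and interaction lines of a full protein-ligand complex.

```python
# Not PoseView: an analogous 2D depiction task shown with the open-source RDKit
# toolkit, for a single small molecule rather than a protein-ligand complex.
from rdkit import Chem
from rdkit.Chem import Draw

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")        # aspirin
Draw.MolToFile(mol, "aspirin_2d.png", size=(350, 350))   # writes a 2D structure diagram
```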
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"
Examining the "Whole Child" to Generate Usable Knowledge
ERIC Educational Resources Information Center
Rappolt-Schlichtmann, Gabrielle; Ayoub, Catherine C.; Gravel, Jenna W.
2009-01-01
Despite the promise of scientific knowledge contributing to issues facing vulnerable children, families, and communities, typical approaches to research have made applications challenging. While contemporary theories of human development offer appropriate complexity, research has mostly failed to address dynamic developmental processes. Research…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
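The abstract above frames workflows as ranging from linear pipelines to directed acyclic graphs and names minimum end-to-end delay as one optimization goal. As a rough illustration of that idea only, not of the SWAMP implementation, the following Python sketch models a hypothetical workflow as a DAG with invented per-task compute costs and per-edge transfer costs and computes the critical-path end-to-end delay.

```python
import networkx as nx

# Hypothetical workflow: task names, per-task compute cost, per-edge transfer cost.
wf = nx.DiGraph()
wf.add_nodes_from([("acquire", {"cost": 5.0}), ("filter", {"cost": 2.0}),
                   ("simulate", {"cost": 40.0}), ("visualize", {"cost": 3.0})])
wf.add_edges_from([("acquire", "filter", {"xfer": 1.0}),
                   ("filter", "simulate", {"xfer": 4.0}),
                   ("acquire", "simulate", {"xfer": 2.0}),
                   ("simulate", "visualize", {"xfer": 6.0})])

def end_to_end_delay(g):
    """Length of the critical path: the minimum achievable end-to-end delay
    when every task starts as soon as all of its inputs have arrived."""
    finish = {}
    for node in nx.topological_sort(g):
        ready = max((finish[p] + g.edges[p, node]["xfer"] for p in g.predecessors(node)),
                    default=0.0)
        finish[node] = ready + g.nodes[node]["cost"]
    return max(finish.values())

print(end_to_end_delay(wf))  # 61.0 for the toy numbers above
```

A scheduler in the spirit of the project would then search over mappings of tasks to resources to minimize this quantity, subject to frame-rate and reliability constraints.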
Jost, Nils; Schüssler-Lenz, Martina; Ziegele, Bettina; Reinhardt, Jens
2015-11-01
The aim of scientific advice is to support pharmaceutical developers in regulatory and scientific questions, thus facilitating the development of safe and efficacious new medicinal products. Recent years have shown that the development of advanced therapy medicinal products (ATMPs) in particular needs a high degree of regulatory support. On the one hand, this is related to the complexity and heterogeneity of this group of medicinal products; on the other hand, it is due to the fact that mainly academic research institutions and small- and medium-sized enterprises (SMEs) are developing ATMPs. These often have limited regulatory experience and resources. In 2009 the Paul-Ehrlich-Institut (PEI) initiated the Innovation Office as a contact point for applicants developing ATMPs. The mandate of the Innovation Office is to provide support on regulatory questions and to coordinate national scientific advice meetings concerning ATMPs for every phase in drug development, especially with a view to the preparation of clinical trial applications. On the European level, the Scientific Advice Working Party (SAWP) of the Committee for Medicinal Products for Human Use (CHMP) of the European Medicines Agency (EMA) offers scientific advice. This article describes the concepts of national and EMA scientific advice concerning ATMPs and summarizes the experience of the last six years.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
New particle formation and growth in CMAQ via application of comprehensive modal methods
The formation and growth of new atmospheric ultrafine particles are exceedingly complex processes and recent scientific efforts have grown our understanding of them tremendously. This presentation describes the effort to apply this new knowledge to the CMAQ chemical transport mod...
Smartfiles: An OO approach to data file interoperability
NASA Technical Reports Server (NTRS)
Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John
1995-01-01
Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes in the file structure or sharing files between programs (interoperability) can only be done after careful examination of the data file and the I/O statements of the programs interacting with this file. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
Scientific visualization of volumetric radar cross section data
NASA Astrophysics Data System (ADS)
Wojszynski, Thomas G.
1992-12-01
For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statistically analyzed to evaluate a d profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool for interpreting these data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing its statistical properties. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low-observable air vehicles.
End-User Applications of Real-Time Earthquake Information in Europe
NASA Astrophysics Data System (ADS)
Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team
2011-12-01
The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational, real-world problems.
[Advances in studies on bear bile powder].
Zhou, Chao-fan; Gao, Guo-jian; Liu, Ying
2015-04-01
In this paper, a detailed analysis was made on relevant literatures about bear bile powder in terms of chemical component, pharmacological effect and clinical efficacy, indicating bear bile powder's significant pharmacological effects and clinical application in treating various diseases. Due to the complex composition, bear bile powder is relatively toxic. Therefore, efforts shall be made to study bear bile powder's pharmacological effects, clinical application, chemical composition and toxic side-effects, with the aim to provide a scientific basis for widespread reasonable clinical application of bear bile powder.
Data Provenance Hybridization Supporting Extreme-Scale Scientific Workflow Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elsethagen, Todd O.; Stephan, Eric G.; Raju, Bibi
As high performance computing (HPC) infrastructures continue to grow in capability and complexity, so do the applications that they serve. HPC and distributed-area computing (DAC) (e.g. grid and cloud) users are looking increasingly toward workflow solutions to orchestrate their complex application coupling and pre- and post-processing needs. To gain insight and a more quantitative understanding of a workflow’s performance, our method includes not only the capture of traditional provenance information, but also the capture and integration of system environment metrics, helping to give context and explanation for a workflow’s execution. In this paper, we describe IPPD’s provenance management solution (ProvEn) and its hybrid data store combining both of these data provenance perspectives.
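As a loose illustration of the hybrid idea described above, and not of the actual ProvEn data store, the sketch below pairs each provenance event with the system-environment sample nearest to it in time; all record fields and values are hypothetical.

```python
import bisect

# Hypothetical records, invented for illustration: traditional provenance events
# and periodically sampled system metrics as (time, cpu %, resident memory MB).
provenance = [
    {"t": 10.0, "task": "preprocess", "event": "started"},
    {"t": 42.5, "task": "preprocess", "event": "completed"},
    {"t": 43.0, "task": "solver",     "event": "started"},
]
metrics = [(10.0, 35.0, 900), (20.0, 97.0, 1400), (30.0, 96.0, 2100), (43.0, 60.0, 2300)]

def hybridize(prov_events, samples):
    """Attach the nearest-in-time system sample to each provenance event,
    giving every event the environmental context needed to explain performance."""
    times = [s[0] for s in samples]
    combined = []
    for ev in prov_events:
        i = bisect.bisect_left(times, ev["t"])
        # Pick whichever neighbouring sample is closer in time.
        if i == len(times) or (i > 0 and ev["t"] - times[i - 1] <= times[i] - ev["t"]):
            i -= 1
        cpu, mem = samples[i][1], samples[i][2]
        combined.append({**ev, "cpu_percent": cpu, "memory_mb": mem})
    return combined

for record in hybridize(provenance, metrics):
    print(record)
```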
Orchestrating Distributed Resource Ensembles for Petascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldin, Ilya; Mandal, Anirban; Ruth, Paul
2014-04-24
Distributed, data-intensive computational science applications of interest to DOE scientific communities move large amounts of data for experiment data management, distributed analysis steps, remote visualization, and accessing scientific instruments. These applications need to orchestrate ensembles of resources from multiple resource pools and interconnect them with high-capacity multi-layered networks across multiple domains. It is highly desirable that mechanisms are designed that provide this type of resource provisioning capability to a broad class of applications. It is also important to have coherent monitoring capabilities for such complex distributed environments. In this project, we addressed these problems by designing an abstract API, enabled by novel semantic resource descriptions, for provisioning complex and heterogeneous resources from multiple providers using their native provisioning mechanisms and control planes: computational, storage, and multi-layered high-speed network domains. We used an extensible resource representation based on semantic web technologies to afford maximum flexibility to applications in specifying their needs. We evaluated the effectiveness of provisioning using representative data-intensive applications. We also developed mechanisms for providing feedback about resource performance to the application, to enable closed-loop feedback control and dynamic adjustments to resource allocations (elasticity). This was enabled through development of a novel persistent query framework that consumes disparate sources of monitoring data, including perfSONAR, and provides scalable distribution of asynchronous notifications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology, such as ontology-based homology and multiple whole-genome comparisons, which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
Accelerating scientific discovery : 2007 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Dave, P.; Drugan, C.
2008-11-14
As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.
Strategies towards controlling strain-induced mesoscopic phase separation in manganite thin films
NASA Astrophysics Data System (ADS)
Habermeier, H.-U.
2008-10-01
Complex oxides represent a class of materials with a plethora of fascinating intrinsic physical functionalities. The intriguing interplay of charge, spin and orbital ordering in these systems, superimposed by lattice effects, opens a scientifically rewarding playground for both fundamental and application-oriented research. The existence of nanoscale electronic phase separation in correlated complex oxides is one of the areas in this field whose impact on the current understanding of their physics and potential applications is not yet clear. In this paper this issue is treated from the point of view of complex oxide thin film technology. Commenting on aspects of complex oxide thin film growth gives an insight into the complexity of a reliable thin film technology for these materials. Exploring fundamentals of interfacial strain generation and strain accommodation paves the way to intentionally manipulating thin film properties. Furthermore, examples are given for an extrinsic continuous tuning of intrinsic electronic inhomogeneities in perovskite-type complex oxide thin films.
A Performance Evaluation of the Cray X1 for Scientific Applications
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak; Borrill, Julian; Canning, Andrew; Carter, Jonathan; Djomehri, M. Jahed; Shan, Hongzhang; Skinner, David
2004-01-01
The last decade has witnessed a rapid proliferation of superscalar cache-based microprocessors to build high-end capability and capacity computers because of their generality, scalability, and cost effectiveness. However, the recent development of massively parallel vector systems is having a significant effect on the supercomputing landscape. In this paper, we compare the performance of the recently released Cray X1 vector system with that of the cacheless NEC SX-6 vector machine, and the superscalar cache-based IBM Power3 and Power4 architectures for scientific applications. Overall results demonstrate that the X1 is quite promising, but performance improvements are expected as the hardware, systems software, and numerical libraries mature. Code reengineering to effectively utilize the complex architecture may also lead to significant efficiency enhancements.
Basic mathematical function libraries for scientific computation
NASA Technical Reports Server (NTRS)
Galant, David C.
1989-01-01
Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.
Measuring the Level of Complexity of Scientific Inquiries: The LCSI Index
ERIC Educational Resources Information Center
Eilam, Efrat
2015-01-01
The study developed and applied an index for measuring the level of complexity of full authentic scientific inquiry. Complexity is a fundamental attribute of real life scientific research. The level of complexity is an overall reflection of complex cognitive and metacognitive processes which are required for navigating the authentic inquiry…
Wavelet-Smoothed Interpolation of Masked Scientific Data for JPEG 2000 Compression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, Christopher M.
2012-08-13
How should we manage scientific data with 'holes'? Some applications, like JPEG 2000, expect logically rectangular data, but some sources, like the Parallel Ocean Program (POP), generate data that isn't defined on certain subsets. We refer to grid points that lack well-defined, scientifically meaningful sample values as 'masked' samples. Wavelet-smoothing is a highly scalable interpolation scheme for regions with complex boundaries on logically rectangular grids. Computation is based on forward/inverse discrete wavelet transforms, so runtime complexity and memory scale linearly with respect to sample count. Efficient state-of-the-art minimal realizations yield small constants (O(10)) for arithmetic complexity scaling, and in-situ implementation techniques make optimal use of memory. Implementation in two dimensions using tensor product filter banks is straightforward and should generalize routinely to higher dimensions. No hand-tuning is required when the interpolation mask changes, making the method attractive for problems with time-varying masks. It is well suited for interpolating undefined samples prior to JPEG 2000 encoding. The method outperforms global mean interpolation, as judged by both SNR rate-distortion performance and low-rate artifact mitigation, for data distributions whose histograms do not take the form of sharply peaked, symmetric, unimodal probability density functions. These performance advantages can hold even for data whose distribution differs only moderately from the peaked unimodal case, as demonstrated by POP salinity data. The interpolation method is very general, is not tied to any particular class of applications, and could be used for more generic smooth interpolation.
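A minimal sketch of the general approach described above (forward transform, suppression of detail coefficients, inverse transform, with smoothed values adopted only at masked samples), written with the PyWavelets package. This is an assumption-laden illustration rather than the authors' implementation; the wavelet choice, decomposition level, and iteration count are arbitrary.

```python
import numpy as np
import pywt

def wavelet_smooth_fill(data, mask, wavelet="db2", level=2, n_iter=10):
    """Iteratively replace masked samples with wavelet-smoothed estimates.

    data : 2-D array of samples; values under `mask` are undefined.
    mask : boolean array of the same shape, True where a sample is masked.
    """
    filled = np.asarray(data, dtype=float).copy()
    # Crude initial guess: the mean of the valid (unmasked) samples.
    filled[mask] = filled[~mask].mean()
    for _ in range(n_iter):
        # Forward 2-D discrete wavelet transform.
        coeffs = pywt.wavedec2(filled, wavelet, level=level)
        # Keep the coarse approximation, zero the detail coefficients.
        smooth = [coeffs[0]] + [tuple(np.zeros_like(d) for d in detail)
                                for detail in coeffs[1:]]
        approx = pywt.waverec2(smooth, wavelet)[:filled.shape[0], :filled.shape[1]]
        # Valid samples keep their measured values; only masked ones are updated.
        filled[mask] = approx[mask]
    return filled

# Example: a smooth field with a rectangular "hole" of undefined samples.
y, x = np.mgrid[0:64, 0:64]
field = np.sin(x / 8.0) + np.cos(y / 11.0)
hole = np.zeros_like(field, dtype=bool)
hole[20:35, 25:45] = True
print(np.abs(wavelet_smooth_fill(field, hole)[hole] - field[hole]).max())
```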
Integrating multiple scientific computing needs via a Private Cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Brunetti, R.; Lusso, S.; Vallero, S.
2014-06-01
In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It allows resources to be dynamically and efficiently allocated to any application and the virtual machines to be tailored to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily and with minimal downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 site, a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC, and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.
Parallel, distributed and GPU computing technologies in single-particle electron microscopy
Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger
2009-01-01
Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686
Securing Information with Complex Optical Encryption Networks
2015-08-11
Network Security, Network Vulnerability, Multi-dimensional Processing, optoelectronic devices... optoelectronic devices and systems should be analyzed before the retrieval, any hostile hacker will need to possess multi-disciplinary scientific...sophisticated optoelectronic principles and systems where he/she needs to process the information. However, in the military applications, most military
Childhood Trauma Remembered: A Report on the Current Scientific Knowledge Base and Its Applications.
ERIC Educational Resources Information Center
Roth, Susan, Ed.; Friedman, Matthew J., Ed.
1998-01-01
Complex issues are involved in the controversy about memories of childhood sexual abuse. Questions of childhood trauma, traumatic memory, the memory process, clinical issues, and forensic implications are reviewed. This article is condensed and modified from a more comprehensive document prepared by and available from the International Society for…
Science Education in Primary Schools: Is an Animation Worth a Thousand Pictures?
ERIC Educational Resources Information Center
Barak, Miri; Dori, Yehudit J.
2011-01-01
Science teaching deals with abstract concepts and processes that very often cannot be seen or touched. The development of Java, Flash, and other web-based applications allows teachers and educators to present complex animations that attractively illustrate scientific phenomena. Our study evaluated the integration of web-based animated movies into…
Reviewing model application to support animal health decision making.
Singer, Alexander; Salman, Mo; Thulke, Hans-Hermann
2011-04-01
Animal health is of societal importance as it affects human welfare, and anthropogenic interests shape decision making to assure animal health. Scientific advice to support decision making is manifold. Modelling, as one piece of the scientific toolbox, is appreciated for its ability to describe and structure data, to give insight into complex processes and to predict future outcomes. In this paper we study the application of scientific modelling to support practical animal health decisions. We reviewed the 35 animal health related scientific opinions adopted by the Animal Health and Animal Welfare Panel of the European Food Safety Authority (EFSA). Thirteen of these documents were based on the application of models. The review took two viewpoints, the decision maker's need and the modeller's approach. In the reviewed material three types of modelling questions were addressed by four specific model types. The correspondence between tasks and models underpinned the importance of the modelling question in triggering the modelling approach. End point quantifications were the dominating request from decision makers, implying that prediction of risk is a major need. However, due to knowledge gaps, the corresponding modelling studies often shied away from providing exact numbers. Instead, comparative scenario analyses were performed, furthering the understanding of the decision problem and the effects of alternative management options. In conclusion, the most adequate scientific support for decision making - including available modelling capacity - might be expected if the required advice is clearly stated. Copyright © 2011 Elsevier B.V. All rights reserved.
Constructing complex graphics applications with CLIPS and the X window system
NASA Technical Reports Server (NTRS)
Faul, Ben M.
1990-01-01
This article demonstrates how the artificial intelligence concepts in CLIPS can be used to solve problems encountered in the design and implementation of graphics applications within the UNIX-X Window System environment. The design of an extended version of CLIPS, called XCLIPS, is presented to show how X Window System graphics can be incorporated without losing DOS compatibility. Using XCLIPS, a sample scientific application is built that applies problem-solving capabilities to both two- and three-dimensional graphics presentations in conjunction with the standard CLIPS features.
Low-dimensional materials for organic electronic applications
NASA Astrophysics Data System (ADS)
Beniwal, Sumit
This thesis explores the self-assembly, surface interactions and electronic properties of functional molecules that have potential applications in electronics. Three classes of molecules - organic ferroelectrics, spin-crossover complexes, and molecules that assemble into a 2D semiconductor - have been studied through scanning tunneling microscopy and surface-sensitive spectroscopic methods. The scientific goal of this thesis is to understand the self-assembly of these molecules in low-dimensional (2D) configurations and the influence of the substrate on their properties.
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
Fourier transform spectrometer controller for partitioned architectures
NASA Astrophysics Data System (ADS)
Tamas-Selicean, D.; Keymeulen, D.; Berisford, D.; Carlson, R.; Hand, K.; Pop, P.; Wadsworth, W.; Levy, R.
The current trend in spacecraft computing is to integrate applications of different criticality levels on the same platform using no separation. This approach increases the complexity of the development, verification and integration processes, with an impact on the whole system life cycle. Researchers at ESA and NASA advocated for the use of partitioned architecture to reduce this complexity. Partitioned architectures rely on platform mechanisms to provide robust temporal and spatial separation between applications. Such architectures have been successfully implemented in several industries, such as avionics and automotive. In this paper we investigate the challenges of developing and the benefits of integrating a scientific instrument, namely a Fourier Transform Spectrometer, in such a partitioned architecture.
An Open Simulation System Model for Scientific Applications
NASA Technical Reports Server (NTRS)
Williams, Anthony D.
1995-01-01
A model for a generic and open environment for running multi-code or multi-application simulations - called the Open Simulation System Model (OSSM) - is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified that may aid in the design and implementation of the system.
A Performance Evaluation of the Cray X1 for Scientific Applications
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak; Borrill, Julian; Canning, Andrew; Carter, Jonathan; Djomehri, M. Jahed; Shan, Hongzhang; Skinner, David
2003-01-01
The last decade has witnessed a rapid proliferation of superscalar cache-based microprocessors to build high-end capability and capacity computers because of their generality, scalability, and cost effectiveness. However, the recent development of massively parallel vector systems is having a significant effect on the supercomputing landscape. In this paper, we compare the performance of the recently-released Cray X1 vector system with that of the cacheless NEC SX-6 vector machine, and the superscalar cache-based IBM Power3 and Power4 architectures for scientific applications. Overall results demonstrate that the X1 is quite promising, but performance improvements are expected as the hardware, systems software, and numerical libraries mature. Code reengineering to effectively utilize the complex architecture may also lead to significant efficiency enhancements.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
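To make the parameter-space view concrete, here is a toy Python sketch, not the framework from the paper, that exhaustively searches a small hypothetical parameter space, trades output quality against runtime through an invented cost model, and keeps the fastest configuration meeting a quality constraint.

```python
import itertools

# Hypothetical tunable parameters for a coarse-grain analysis workflow.
param_space = {
    "chunk_size":     [64, 128, 256],      # granularity of data decomposition
    "num_workers":    [4, 8, 16],          # components mapped per machine
    "output_quality": [0.5, 0.75, 1.0],    # quality/performance trade-off
}

def run_workflow(chunk_size, num_workers, output_quality):
    """Stand-in for executing and measuring the workflow; returns
    (runtime_seconds, quality). A real framework would launch the jobs."""
    runtime = 1000.0 / (num_workers * output_quality) + chunk_size * 0.1
    return runtime, output_quality

best = None
for combo in itertools.product(*param_space.values()):
    params = dict(zip(param_space.keys(), combo))
    runtime, quality = run_workflow(**params)
    # Keep only configurations meeting a minimum quality constraint,
    # then minimize runtime over the remaining candidates.
    if quality >= 0.75 and (best is None or runtime < best[0]):
        best = (runtime, params)

print(best)
```

Real frameworks replace the exhaustive loop with smarter search and replace the invented cost model with measurements, but the shape of the problem is the same.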
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janjusic, Tommy; Kartsaklis, Christos
Application analysis is facilitated through a number of program profiling tools. The tools vary in their complexity, ease of deployment, design, and profiling detail. Specifically, understanding, analyzing, and optimizing is of particular importance for scientific applications, where minor changes in code paths and data-structure layout can have profound effects. Understanding how intricate data-structures are accessed and how a given memory system responds is a complex task. In this paper we describe a trace profiling tool, Glprof, specifically aimed at lessening the burden on the programmer of pinpointing heavily involved data-structures during an application's run-time and understanding data-structure run-time usage. Moreover, we showcase the tool's modularity using additional cache simulation components. We elaborate on the tool's design and features. Finally we demonstrate the application of our tool in the context of Spec benchmarks using the Glprof profiler and two concurrently running cache simulators, PPC440 and AMD Interlagos.
NASA Astrophysics Data System (ADS)
Hullo, J.-F.; Thibault, G.; Boucheny, C.
2015-02-01
In a context of increased maintenance operations and generational renewal of the workforce, a nuclear owner and operator like Electricité de France (EDF) is interested in the scaling up of tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition to processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramics, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals, unfamiliar with the manipulation of such datasets, to take into account spatial constraints induced by the building complexity while preparing maintenance operations. Finally, we discuss the main feedback from this large experiment, the remaining issues for the generalization of such large-scale surveys, and the future technical and scientific challenges in the field of industrial "virtual reality".
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhenhuan; Boyuka, David; Zou, X
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induces heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques to improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework, to achieve multi-level data layout optimization for scientific applications at run-time before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, light-weight layout optimization for scientific datasets. It offers simple XML-based configuration for users to achieve flexible layout optimization without the need to modify or recompile application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its underlying run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while limiting the performance impact on running applications to a reasonable level.
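Why matching the stored layout to the access pattern pays off can be shown with a generic example, which is not PARLO itself: the same 2-D dataset in two layouts, read with a column-oriented pattern such as extracting one variable's series from a (time x variable) array. The sizes, stride, and timings are illustrative and machine-dependent.

```python
import time
import numpy as np

# The same dataset stored row-major (C order) and column-major (Fortran order).
n = 2000
row_major = np.arange(n * n, dtype=np.float64).reshape(n, n)
col_major = np.asfortranarray(row_major)

def column_scan(a, stride=25):
    """Sum every `stride`-th column, touching memory column by column."""
    total = 0.0
    for j in range(0, a.shape[1], stride):
        total += a[:, j].sum()
    return total

for name, layout in (("row-major", row_major), ("column-major", col_major)):
    start = time.perf_counter()
    column_scan(layout)
    print(f"{name:>12}: {time.perf_counter() - start:.4f} s")
```

A run-time layout optimizer generalizes this idea: it reorganizes data into the layout (or several layouts) that best serves the expected mix of access patterns before the data ever reaches storage.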
ERIC Educational Resources Information Center
Zelnio, Ryan J.
2013-01-01
This dissertation seeks to contribute to a fuller understanding of how international scientific collaboration has affected national scientific systems. It does this by developing three methodological approaches grounded in social complexity theory and applying them to the evaluation of national scientific systems. The first methodology identifies…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
A Workshop for Developing Learning Modules for Science Classes Based on Biogeochemical Research
ERIC Educational Resources Information Center
Harrington, James M.; Gardner, Terrence G.; Amoozegar, Aziz; Andrews, Megan Y.; Rivera, Nelson A.; Duckworth, Owen W.
2013-01-01
A challenging aspect of educating secondary students is integrating complex scientific concepts related to modern research topics into lesson plans that students can relate to and understand at a basic level. One method of encouraging the achievement of learning outcomes is to use real-world applications and current research to fuel student…
ERIC Educational Resources Information Center
Strobl, Carolin; Malley, James; Tutz, Gerhard
2009-01-01
Recursive partitioning methods have become popular and widely used tools for nonparametric regression and classification in many scientific fields. Especially random forests, which can deal with large numbers of predictor variables even in the presence of complex interactions, have been applied successfully in genetics, clinical medicine, and…
ERIC Educational Resources Information Center
Stredney, Donald Larry
An overview of computer animation and the techniques involved in its creation is provided in the introduction to this master's thesis, which focuses on the problems encountered by students in learning the forms and functions of complex anatomical structures and ways in which computer animation can address these problems. The objectives for,…
ERIC Educational Resources Information Center
Yoon, Susan
2008-01-01
This study investigated seventh grade learners' decision making about genetic engineering concepts and applications. A social network analyses supported by technology tracked changes in student understanding with a focus on social and conceptual influences. Results indicated that several social and conceptual mechanisms potentially affected how…
Distinguishing Provenance Equivalence of Earth Science Data
NASA Technical Reports Server (NTRS)
Tilmes, Curt; Yesha, Ye; Halem, M.
2010-01-01
Reproducibility of scientific research relies on accurate and precise citation of data and the provenance of that data. Earth science data are often the result of applying complex data transformation and analysis workflows to vast quantities of data. Provenance information of data processing is used for a variety of purposes, including understanding the process and auditing as well as reproducibility. Certain provenance information is essential for producing scientifically equivalent data. Capturing and representing that provenance information and assigning identifiers suitable for precisely distinguishing data granules and datasets is needed for accurate comparisons. This paper discusses scientific equivalence and essential provenance for scientific reproducibility. We use the example of an operational earth science data processing system to illustrate the application of the technique of cascading digital signatures, or hash chains, to precisely identify sets of granules and to serve as provenance equivalence identifiers that distinguish data made in an equivalent manner.
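The cascading-signature idea can be illustrated with ordinary cryptographic hashes. The sketch below is only a schematic reading of the technique, not the operational system's code: it chains each granule identifier with the identifiers of its provenance inputs and then cascades the per-granule digests into a single identifier for the whole set. The granule and input names are made up.

```python
import hashlib

def granule_digest(granule_id, provenance_inputs):
    """Digest of one granule: its identifier chained with its provenance inputs."""
    h = hashlib.sha256()
    h.update(granule_id.encode("utf-8"))
    for parent in sorted(provenance_inputs):   # sort so ordering does not matter
        h.update(parent.encode("utf-8"))
    return h.hexdigest()

def dataset_equivalence_id(granule_digests):
    """Cascade the per-granule digests into one identifier for the dataset."""
    h = hashlib.sha256()
    for d in sorted(granule_digests):
        h.update(d.encode("utf-8"))
    return h.hexdigest()

# Two hypothetical granules produced from hypothetical inputs; datasets built by
# equivalent processing of the same inputs yield the same equivalence identifier.
g1 = granule_digest("granule-2010-001a", ["raw-2010-001a", "calibration-v6"])
g2 = granule_digest("granule-2010-001b", ["raw-2010-001b", "calibration-v6"])
print(dataset_equivalence_id([g1, g2]))
```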
Vollmar, Horst Christian; Kramer, Ursula; Müller, Hardy; Griemmert, Maria; Noelle, Guido; Schrappe, Matthias
2017-12-01
The term "digital health" is currently the most comprehensive term that includes all information and communication technologies in healthcare, including e-health, mobile health, telemedicine, big data, health apps and others. Digital health can be seen as a good example of the use of the concept and methodology of health services research in the interaction between complex interventions and complex contexts. The position paper deals with 1) digital health as the subject of health services research; 2) digital health as a methodological and ethical challenge for health services research. The often-postulated benefits of digital health interventions should be demonstrated with good studies. First systematic evaluations of apps for "treatment support" show that risks are higher than benefits. The need for a rigorous proof applies even more to big data-assisted interventions that support decision-making in the treatment process with the support of artificial intelligence. Of course, from the point of view of health services research, it is worth participating as much as possible in data access available through digital health and "big data". However, there is the risk that a noncritical application of digital health and big data will lead to a return to a linear understanding of biomedical research, which, at best, accepts complex conditions assuming multivariate models but does not take complex facts into account. It is not just a matter of scientific ethical requirements in health services care research, for instance, better research instead of unnecessary research ("reducing waste"), but it is primarily a matter of anticipating the social consequences (system level) of scientific analysis and evaluation. This is both a challenge and an attractive option for health services research to present itself as a mature and responsible scientific discipline. © Georg Thieme Verlag KG Stuttgart · New York.
On (scientific) integrity: conceptual clarification.
Patrão Neves, Maria do Céu
2018-06-01
The notion of "integrity" is currently quite common and broadly recognized as complex, mostly due to its recurring and diverse application in various distinct domains such as the physical, psychic or moral, the personal or professional, that of the human being or of the totality of beings. Nevertheless, qualifying it imparts a specific meaning, as happens in the case of "scientific integrity". This concept has been defined mostly via negativa, by pointing out what goes against integrity, that is, through the identification of its infringements, which has not facilitated the elaboration of an overarching and consensual code of scientific integrity. In this context, it is deemed necessary to clarify the notion of "integrity", first etymologically, recovering the original meaning of the term, and then conceptually, through the identification of the various meanings with which the term can legitimately be used, particularly in the domain of scientific research and innovation. These two steps are fundamental and indispensable for a forthcoming attempt at systematizing the requirements of "scientific integrity".
A virtual data language and system for scientific workflow management in data grid environments
NASA Astrophysics Data System (ADS)
Zhao, Yong
With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high-performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy and cognitive neuroscience.
Microbial Cellulases and Their Industrial Applications
Kuhad, Ramesh Chander; Gupta, Rishi; Singh, Ajay
2011-01-01
Microbial cellulases have shown their potential application in various industries including pulp and paper, textile, laundry, biofuel production, food and feed, brewing, and agriculture. Due to the complexity of the enzyme system and its immense industrial potential, cellulases have been a potential candidate for research by both academic and industrial research groups. Nowadays, significant attention has been devoted to the current knowledge of cellulase production and the challenges in cellulase research, especially in the direction of improving the process economics of various industries. Scientific and technological developments and the future prospects for application of cellulases in different industries are discussed in this paper. PMID:21912738
Sherrington, David
2010-03-13
This paper is concerned with complex macroscopic behaviour arising in many-body systems through the combinations of competitive interactions and disorder, even with simple ingredients at the microscopic level. It attempts to indicate and illustrate the richness that has arisen, in conceptual understanding, in methodology and in application, across a large range of scientific disciplines, together with a hint of some of the further opportunities that remain to be tapped. In doing so, it takes the perspective of physics and tries to show, albeit rather briefly, how physics has contributed and been stimulated.
U.S. Geological Survey Groundwater Modeling Software: Making Sense of a Complex Natural Resource
Provost, Alden M.; Reilly, Thomas E.; Harbaugh, Arlen W.; Pollock, David W.
2009-01-01
Computer models of groundwater systems simulate the flow of groundwater, including water levels, and the transport of chemical constituents and thermal energy. Groundwater models afford hydrologists a framework on which to organize their knowledge and understanding of groundwater systems, and they provide insights water-resources managers need to plan effectively for future water demands. Building on decades of experience, the U.S. Geological Survey (USGS) continues to lead in the development and application of computer software that allows groundwater models to address scientific and management questions of increasing complexity.
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However, with the increasing complexity of materials, the scientific ability to design materials rationally becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in the discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by a detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
Critical issues in NASA information systems
NASA Technical Reports Server (NTRS)
1987-01-01
The National Aeronautics and Space Administration has developed a globally distributed complex of earth resources data bases since LANDSAT 1 was launched in 1972. NASA envisages considerable growth in the number, extent, and complexity of such data bases, due to the improvements expected in its remote sensing data rates and the increasingly multidisciplinary nature of its scientific investigations. Work already has begun on information systems to support multidisciplinary research activities based on data acquired by the space station complex and other space-based and terrestrial sources. In response to a request from NASA's former Associate Administrator for Space Science and Applications, the National Research Council convened a committee in June 1985 to identify the critical issues involving information systems support to space science and applications. The committee has suggested that OSSA address four major information systems issues: centralization of management functions, interoperability, user involvement in the planning and implementation of its programs, and technology.
Java Performance for Scientific Applications on LLNL Computer Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapfer, C; Wissink, A
2002-05-10
Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance, it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.
NASA Technical Reports Server (NTRS)
Ross, M. D.; Montgomery, K.; Linton, S.; Cheng, R.; Smith, J.
1998-01-01
This report describes the three-dimensional imaging and virtual environment technologies developed in NASA's Biocomputation Center for scientific purposes that have now led to applications in the field of medicine. A major goal is to develop a virtual environment surgery workbench for planning complex craniofacial and breast reconstructive surgery, and for training surgeons.
A main path domain map as digital library interface
NASA Astrophysics Data System (ADS)
Demaine, Jeffrey
2009-01-01
The shift to electronic publishing of scientific journals is an opportunity for the digital library to provide non-traditional ways of accessing the literature. One method is to use citation metadata drawn from a collection of electronic journals to generate maps of science. These maps visualize the communication patterns in the collection, giving the user an easy-to-grasp view of the semantic structure underlying the scientific literature. For this visualization to be understandable, the complexity of the citation network must be reduced through an algorithm. This paper describes the Citation Pathfinder application and its integration into a prototype digital library. This application generates small-scale citation networks that expand upon the search results of the digital library. These domain maps are linked to the collection, creating an interface that is based on the communication patterns in science. The Main Path Analysis technique is employed to simplify these networks into linear, sequential structures. By identifying patterns that characterize the evolution of the research field, Citation Pathfinder uses citations to give users a deeper understanding of the scientific literature.
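As an illustration of the general idea behind Main Path Analysis (not the Citation Pathfinder implementation itself), the sketch below weights each edge of a toy citation network by its search path count and then greedily follows the heaviest edges. The graph, node names and the use of networkx are assumptions made for the example.

```python
import networkx as nx

# Citation DAG: an edge A -> B means paper A is cited by the later paper B.
G = nx.DiGraph()
G.add_edges_from([
    ("p1", "p2"), ("p1", "p3"), ("p2", "p4"),
    ("p3", "p4"), ("p4", "p5"), ("p3", "p5"),
])

order = list(nx.topological_sort(G))
sources = [n for n in G if G.in_degree(n) == 0]
sinks = [n for n in G if G.out_degree(n) == 0]

# Count source->node paths (forward pass) and node->sink paths (backward pass).
paths_from_src = {n: (1 if n in sources else 0) for n in G}
for n in order:
    for succ in G.successors(n):
        paths_from_src[succ] += paths_from_src[n]

paths_to_sink = {n: (1 if n in sinks else 0) for n in G}
for n in reversed(order):
    for pred in G.predecessors(n):
        paths_to_sink[pred] += paths_to_sink[n]

# Search path count (SPC) weight for each edge, then greedily follow the heaviest edges.
spc = {(u, v): paths_from_src[u] * paths_to_sink[v] for u, v in G.edges}
node = max(sources, key=lambda s: paths_to_sink[s])
main_path = [node]
while G.out_degree(node) > 0:
    node = max(G.successors(node), key=lambda v: spc[(main_path[-1], v)])
    main_path.append(node)

print("edge SPC weights:", spc)
print("main path:", " -> ".join(main_path))
```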
MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, C. M.; Boyle, K. L.; Reagan, M.
2013-09-30
Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
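The core idea of Voronoi tessellation-based meshing can be illustrated in a few lines of Python. The 2D sketch below (using SciPy rather than the Voro library, with made-up seed points) partitions a domain into cells around seed points and computes each bounded cell's area, loosely analogous to grid-block volumes; it is not MeshVoro and produces no TOUGH input.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
seeds = rng.uniform(0.0, 100.0, size=(20, 2))   # hypothetical grid-block centers

vor = Voronoi(seeds)

for i, region_index in enumerate(vor.point_region):
    region = vor.regions[region_index]
    if -1 in region or not region:
        continue  # skip unbounded cells on the domain boundary
    polygon = vor.vertices[region]
    # Shoelace formula for the cell area (a stand-in for a block volume).
    x, y = polygon[:, 0], polygon[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    print(f"cell {i}: {len(region)} vertices, area = {area:.2f}")
```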
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaper, H. G.
1998-01-05
An interdisciplinary project encompassing sound synthesis, music composition, sonification, and visualization of music is facilitated by the high-performance computing capabilities and the virtual-reality environments available at Argonne National Laboratory. The paper describes the main features of the project's centerpiece, DIASS (Digital Instrument for Additive Sound Synthesis); "A.N.L.-folds", an equivalence class of compositions produced with DIASS; and application of DIASS in two experiments in the sonification of complex scientific data. Some of the larger issues connected with this project, such as the changing ways in which both scientists and composers perform their tasks, are briefly discussed.
Grid computing technology for hydrological applications
NASA Astrophysics Data System (ADS)
Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.
2011-06-01
Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, either oriented to tackle scientific challenges or complex problem solving in engineering, are expected to converge together into the so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large, relating to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access right, and standardization.
Search Pathways: Modeling GeoData Search Behavior to Support Usable Application Development
NASA Astrophysics Data System (ADS)
Yarmey, L.; Rosati, A.; Tressel, S.
2014-12-01
Recent technical advances have enabled development of new scientific data discovery systems. Metadata brokering, linked data, and other mechanisms allow users to discover scientific data of interest across growing volumes of heterogeneous content. To match this complex content with existing discovery technologies, people looking for scientific data are presented with an ever-growing array of features to sort, filter, subset, and scan through search returns to help them find what they are looking for. This paper examines the applicability of available technologies in connecting searchers with the data of interest. What metrics can be used to track success given shifting baselines of content and technology? How well do existing technologies map to steps in user search patterns? Taking a user-driven development approach, the team behind the Arctic Data Explorer interdisciplinary data discovery application invested heavily in usability testing and user search behavior analysis. Building on earlier library community search behavior work, models were developed to better define the diverse set of thought processes and steps users took to find data of interest, here called 'search pathways'. This research builds a deeper understanding of the user community that seeks to reuse scientific data. This approach ensures that development decisions are driven by clearly articulated user needs instead of ad hoc technology trends. Initial results from this research will be presented along with lessons learned for other discovery platform development and future directions for informatics research into search pathways.
The design of nonlinear observers for wind turbine dynamic state and parameter estimation
NASA Astrophysics Data System (ADS)
Ritter, B.; Schild, A.; Feldt, M.; Konigorski, U.
2016-09-01
This contribution addresses the dynamic state and parameter estimation problem which arises with more advanced wind turbine controllers. These control devices need precise information about the system's current state to outperform conventional industrial controllers effectively. First, the necessity of a profound scientific treatment of nonlinear observers for wind turbine applications is highlighted. Secondly, the full estimation problem is introduced and the variety of nonlinear filters is discussed. Finally, a tailored observer architecture is proposed and estimation results of an illustrative application example from a complex simulation set-up are presented.
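As a hedged illustration of the kind of nonlinear observer surveyed here, the sketch below runs an extended Kalman filter that jointly estimates a rotational speed and an unknown damping parameter from noisy speed measurements. The one-inertia model, noise levels and all numerical values are invented for the example and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, J, tau_drive = 0.01, 2.0, 5.0      # step, rotor inertia, applied torque (all assumed)
b_true, meas_std = 0.8, 0.05           # "true" damping and measurement noise for the toy run

def f(x):
    """Process model: speed follows a torque balance; damping b is a random walk."""
    omega, b = x
    return np.array([omega + dt * (tau_drive - b * omega) / J, b])

def F(x):
    """Jacobian of f; needed because the b*omega term makes the model nonlinear."""
    omega, b = x
    return np.array([[1.0 - dt * b / J, -dt * omega / J],
                     [0.0, 1.0]])

H = np.array([[1.0, 0.0]])             # only the speed is measured
Q = np.diag([1e-6, 1e-6])              # process noise covariance (tuning assumption)
R = np.array([[meas_std ** 2]])

x_true = np.array([0.0, b_true])
x_est = np.array([0.0, 0.2])           # deliberately poor initial damping estimate
P = np.eye(2)

for _ in range(3000):
    x_true = f(x_true)
    y = x_true[0] + rng.normal(0.0, meas_std)
    # Predict
    Fk = F(x_est)
    x_est = f(x_est)
    P = Fk @ P @ Fk.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (np.array([y]) - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print(f"estimated damping: {x_est[1]:.3f}  (true value used in the simulation: {b_true})")
```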
NASA Astrophysics Data System (ADS)
Zhou, Xingjiang; He, Shaolong; Liu, Guodong; Zhao, Lin; Yu, Li; Zhang, Wentao
2018-06-01
The significant progress in angle-resolved photoemission spectroscopy (ARPES) in the last three decades has elevated it from a traditional band mapping tool to a precise probe of many-body interactions and dynamics of quasiparticles in complex quantum systems. The recent developments of deep ultraviolet (DUV, including ultraviolet and vacuum ultraviolet) laser-based ARPES have further pushed this technique to a new level. In this paper, we review some of the latest developments in DUV laser-based photoemission systems, including the super-high energy and momentum resolution ARPES, the spin-resolved ARPES, the time-of-flight ARPES, and the time-resolved ARPES. We also highlight some scientific applications in the study of electronic structure in unconventional superconductors and topological materials using these state-of-the-art DUV laser-based ARPES systems. Finally, we provide our perspectives on the future directions in the development of laser-based photoemission systems.
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Modeling and simulation of a direct ethanol fuel cell: An overview
NASA Astrophysics Data System (ADS)
Abdullah, S.; Kamarudin, S. K.; Hasran, U. A.; Masdar, M. S.; Daud, W. R. W.
2014-09-01
The commercialization of Direct Ethanol Fuel Cells (DEFCs) is still hindered by economic and technical factors. Fundamental scientific research is required to more completely understand the complex electrochemical behavior and engineering technology of DEFCs. To use the DEFC system in real-world applications, fast, reliable, and cost-effective methods are needed to explore this complex phenomenon and to predict the performance of different system designs. Thus, modeling and simulation play an important role in examining the DEFC system as well as in designing an optimized DEFC system. The current DEFC literature shows that modeling studies on DEFCs are still in their early stages and are not able to describe the DEFC system as a whole. Potential DEFC applications and their current status are also presented.
Remote Sensing of Soils for Environmental Assessment and Management.
NASA Technical Reports Server (NTRS)
DeGloria, Stephen D.; Irons, James R.; West, Larry T.
2014-01-01
The next generation of imaging systems integrated with complex analytical methods will revolutionize the way we inventory and manage soil resources across a wide range of scientific disciplines and application domains. This special issue highlights those systems and methods for the direct benefit of environmental professionals and students who employ imaging and geospatial information for improved understanding, management, and monitoring of soil resources.
Molecular Imprinting: From Fundamentals to Applications
NASA Astrophysics Data System (ADS)
Komiyama, Makoto; Takeuchi, Toshifumi; Mukawa, Takashi; Asanuma, Hiroyuki
2003-03-01
Molecular imprinting, the polymerization of monomers in the presence of a template molecule which imprints structural information into the resulting polymers, is a scientific field which is rapidly gaining significance for a widening range of applications in biotechnology, biochemistry and pharmaceutical research. The methods and tools needed to distinguish target molecules from others by means of tailor-made receptors are constantly growing in importance and complexity. This book gives a concise and highly up-to-date overview of the remarkable progress made in this field in the last five years. The material is comprehensively presented by the authors, giving a thorough insight into fundamentals and applications for researchers in both industry and academia.
NASA Astrophysics Data System (ADS)
Alameda, J. C.
2011-12-01
Development and optimization of computational science models, particularly on high performance computers and, with the advent of ubiquitous multicore processor systems, on practically every system, has been accomplished with basic software tools: typically command-line-based compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as openMP and MPI) to be able to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC) seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project to improve Eclipse PTP takes an application-centric view to improve PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to the scientific applications themselves and to understand shortcomings in Eclipse PTP from an application developer perspective, which drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher-quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse community development. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage use of modern software engineering practices, as enabled through the Eclipse IDE, with computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations as well as ever-changing software stacks and configurations on HPC systems, as well as ultimately encouraging development and maintenance of testing suites -- things that have become commonplace in many software endeavors, but have lagged in the development of science applications. We believe that the increased complexity of both HPC systems and science applications demands the use of better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.
Featured Article: Genotation: Actionable knowledge for the scientific reader
Willis, Ethan; Sakauye, Mark; Jose, Rony; Chen, Hao; Davis, Robert L
2016-01-01
We present an article viewer application that allows a scientific reader to easily discover and share knowledge by linking genomics-related concepts to knowledge of disparate biomedical databases. High-throughput data streams generated by technical advancements have contributed to scientific knowledge discovery at an unprecedented rate. Biomedical Informaticists have created a diverse set of databases to store and retrieve the discovered knowledge. The diversity and abundance of such resources present biomedical researchers with a challenge in knowledge discovery. These challenges highlight a need for a better informatics solution. We use a text mining algorithm, Genomine, to identify gene symbols from the text of a journal article. The identified symbols are supplemented with information from the GenoDB knowledgebase. Self-updating GenoDB contains information from NCBI Gene, Clinvar, Medgen, dbSNP, KEGG, PharmGKB, Uniprot, and Hugo Gene databases. The journal viewer is a web application accessible via a web browser. The features described herein are accessible on www.genotation.org. The Genomine algorithm identifies gene symbols with an F-score of 0.65. GenoDB currently contains information regarding 59,905 gene symbols, 5633 drug–gene relationships, 5981 gene–disease relationships, and 713 pathways. This application provides scientific readers with actionable knowledge related to concepts of a manuscript. The reader will be able to save and share supplements to be visualized in a graphical manner. This provides convenient access to details of complex biological phenomena, enabling biomedical researchers to generate novel hypotheses to further our knowledge in human health. This manuscript presents a novel application that integrates genomic, proteomic, and pharmacogenomic information to supplement content of a biomedical manuscript and enable readers to automatically discover actionable knowledge. PMID:26900164
Featured Article: Genotation: Actionable knowledge for the scientific reader.
Nagahawatte, Panduka; Willis, Ethan; Sakauye, Mark; Jose, Rony; Chen, Hao; Davis, Robert L
2016-06-01
We present an article viewer application that allows a scientific reader to easily discover and share knowledge by linking genomics-related concepts to knowledge of disparate biomedical databases. High-throughput data streams generated by technical advancements have contributed to scientific knowledge discovery at an unprecedented rate. Biomedical Informaticists have created a diverse set of databases to store and retrieve the discovered knowledge. The diversity and abundance of such resources present biomedical researchers with a challenge in knowledge discovery. These challenges highlight a need for a better informatics solution. We use a text mining algorithm, Genomine, to identify gene symbols from the text of a journal article. The identified symbols are supplemented with information from the GenoDB knowledgebase. Self-updating GenoDB contains information from NCBI Gene, Clinvar, Medgen, dbSNP, KEGG, PharmGKB, Uniprot, and Hugo Gene databases. The journal viewer is a web application accessible via a web browser. The features described herein are accessible on www.genotation.org. The Genomine algorithm identifies gene symbols with an F-score of 0.65. GenoDB currently contains information regarding 59,905 gene symbols, 5633 drug-gene relationships, 5981 gene-disease relationships, and 713 pathways. This application provides scientific readers with actionable knowledge related to concepts of a manuscript. The reader will be able to save and share supplements to be visualized in a graphical manner. This provides convenient access to details of complex biological phenomena, enabling biomedical researchers to generate novel hypotheses to further our knowledge in human health. This manuscript presents a novel application that integrates genomic, proteomic, and pharmacogenomic information to supplement content of a biomedical manuscript and enable readers to automatically discover actionable knowledge. © 2016 by the Society for Experimental Biology and Medicine.
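For readers unfamiliar with this kind of gene-symbol tagging, the toy sketch below shows the dictionary-lookup flavour of the task: scan text for candidate symbols and attach records from a miniature stand-in knowledgebase. It does not reproduce the Genomine algorithm or the real GenoDB contents; the pattern, entries and example text are invented.

```python
import re

GENODB = {  # hypothetical miniature knowledgebase standing in for GenoDB
    "BRCA1": {"full_name": "BRCA1 DNA repair associated", "pathways": ["homologous recombination"]},
    "TP53":  {"full_name": "tumor protein p53", "pathways": ["p53 signaling"]},
}

def annotate(text):
    """Return knowledgebase records for known gene symbols mentioned in the text."""
    tokens = set(re.findall(r"\b[A-Z][A-Z0-9]{1,5}\b", text))  # crude symbol pattern
    return {sym: GENODB[sym] for sym in tokens if sym in GENODB}

paragraph = ("Loss of TP53 function, together with BRCA1 mutations, "
             "alters the DNA damage response in these tumours.")
for symbol, record in annotate(paragraph).items():
    print(f"{symbol}: {record['full_name']} ({', '.join(record['pathways'])})")
```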
On the impact of communication complexity in the design of parallel numerical algorithms
NASA Technical Reports Server (NTRS)
Gannon, D.; Vanrosendale, J.
1984-01-01
This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In the second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm independent upper bounds on system performance are derived for several problems that are important to scientific computation.
Ren, Guomin; Krawetz, Roman
2015-01-01
The data explosion in the last decade is revolutionizing diagnostics research and the healthcare industry, offering both opportunities and challenges. These high-throughput "omics" techniques have generated more scientific data in the last few years than in the entire history of mankind. Here we present a brief summary of how "big data" have influenced early diagnosis of complex diseases. We will also review some of the most commonly used "omics" techniques and their applications in diagnostics. Finally, we will discuss the issues brought by these new techniques when translating laboratory discoveries to clinical practice.
On the impact of communication complexity on the design of parallel numerical algorithms
NASA Technical Reports Server (NTRS)
Gannon, D. B.; Van Rosendale, J.
1984-01-01
This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In this second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm-independent upper bounds on system performance are derived for several problems that are important to scientific computation.
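The Hockney-style model that the first approach generalizes can be written in a few lines: transfer time is a startup latency plus a streaming term, so achieved bandwidth saturates at an asymptotic rate r_inf and reaches half of it at the message length n_1/2 = latency x r_inf. The sketch below uses made-up parameter values purely for illustration.

```python
def transfer_time(n_bytes, latency_s=1e-6, r_inf_bytes_per_s=1e9):
    """Time to move an n-byte message: startup latency plus streaming time."""
    return latency_s + n_bytes / r_inf_bytes_per_s

def effective_rate(n_bytes, latency_s=1e-6, r_inf_bytes_per_s=1e9):
    """Achieved bandwidth; equals r_inf/2 at the half-performance length n_1/2."""
    return n_bytes / transfer_time(n_bytes, latency_s, r_inf_bytes_per_s)

n_half = 1e-6 * 1e9  # n_1/2 = latency * r_inf = 1000 bytes for these assumed parameters
for n in (100, n_half, 10_000, 1_000_000):
    print(f"{int(n):>8} bytes -> {effective_rate(n) / 1e9:.3f} GB/s")
```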
Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state of the art in scientific workflows have focused on the following areas; progress in each is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e., workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault tolerance of workflow environments.
MEMOPS: data modelling and automatic code generation.
Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D
2010-03-25
In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
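A toy sketch of the model-driven idea, far simpler than Memops itself (which works from a full UML description and also emits C and Java APIs), is given below: a small abstract model is turned into Python classes whose generated constructors perform validity checking. The model contents and class names are invented for the example.

```python
MODEL = {  # hypothetical abstract metadata model
    "Spectrum": {"name": str, "dimension_count": int},
    "Peak": {"position": float, "height": float},
}

def generate_class(class_name, attributes):
    lines = [f"class {class_name}:"]
    args = ", ".join(f"{attr}: {typ.__name__}" for attr, typ in attributes.items())
    lines.append(f"    def __init__(self, {args}):")
    for attr, typ in attributes.items():
        # Generated validity check: reject values of the wrong type at load time.
        lines.append(f"        if not isinstance({attr}, {typ.__name__}):")
        lines.append(f"            raise TypeError('{class_name}.{attr} must be {typ.__name__}')")
        lines.append(f"        self.{attr} = {attr}")
    return "\n".join(lines)

generated = "\n\n".join(generate_class(c, a) for c, a in MODEL.items())
namespace = {}
exec(generated, namespace)   # a real framework would write the code to library files instead
peak = namespace["Peak"](position=7.2, height=1.5e6)
print("generated classes:", ", ".join(MODEL), "| peak height =", peak.height)
```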
NASA Astrophysics Data System (ADS)
Bezruchko, Konstantin; Davidov, Albert
2009-01-01
This article describes the scientific and technical complex for modeling, researching and testing rocket-space vehicles' power installations that was created in the Power Source Laboratory of the National Aerospace University "KhAI". This scientific and technical complex makes it possible to replace full-scale tests with model tests and to reduce the financial and time costs of modeling, researching and testing rocket-space vehicles' power installations. Using the complex, it is possible to solve the problems of designing and researching rocket-space vehicles' power installations efficiently, and also to carry out experimental research on physical processes and tests of solar and chemical batteries of rocket-space complexes and space vehicles. The scientific and technical complex also allows accelerated testing, diagnostics, life-time control and restoration of chemical accumulators for rocket-space vehicles' power supply systems.
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.
2016-12-01
As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resource and other subsystems, such as mineral, food, energy, or environmental systems to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study highlighting multi-criteria spatial decision making and uncertainty comparisons.
Web-Accessible Scientific Workflow System for Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roelof Versteeg; Roelof Versteeg; Trevor Rowe
2006-03-01
We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
The Virtual Mission - A step-wise approach to large space missions
NASA Technical Reports Server (NTRS)
Hansen, Elaine; Jones, Morgan; Hooke, Adrian; Pomphrey, Richard
1992-01-01
Attention is given to the Virtual Mission (VM) concept, wherein multiple scientific instruments will be on different platforms, in different orbits, operated from different control centers, at different institutions, and reporting to different user groups. The VM concept enables NASA's science and application users to accomplish their broad science goals with a fleet made up of smaller, more focused spacecraft and to alleviate the difficulties involved with single, large, complex spacecraft. The concept makes possible the stepwise 'go-as-you-pay' extensible approach recommended by Augustine (1990). It enables scientists to mix and match the use of many smaller satellites in novel ways to respond to new scientific ideas and needs.
Development of high resolution NMR spectroscopy as a structural tool
NASA Astrophysics Data System (ADS)
Feeney, James
1992-06-01
The discovery of the nuclear magnetic resonance (NMR) phenomenon and its development and exploitation as a scientific tool provide an excellent basis for a case-study for examining the factors which control the evolution of scientific techniques. Since the detection of the NMR phenomenon and the subsequent rapid discovery of all the important NMR spectral parameters in the late 1940s, the method has emerged as one of the most powerful techniques for determining structures of molecules in solution and for analysis of complex mixtures. The method has made a dramatic impact on the development of structural chemistry over the last 30 years and is now one of the key techniques in this area. Support for NMR instrumentation attracts a dominant slice of public funding in most scientifically developed countries. The technique is an excellent example of how instrumentation and technology have revolutionised structural chemistry and it is worth exploring how it has been developed so successfully. Clearly its wide range of application and the relatively direct connection between the NMR data and molecular structure has created a major market for the instrumentation. This has provided several competing manufacturers with the incentive to develop better and better instruments. Understanding the complexity of the basics of NMR spectroscopy has been an ongoing challenge attracting the attention of physicists. The well-organised specialist NMR literature and regular scientific meetings have ensured rapid exploitation of any theoretical advances that have a practical relevance. In parallel, the commercial development of the technology has allowed the fruits of such theoretical advances to be enjoyed by the wider scientific community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Y.; Cameron, K.W.
1998-11-24
Workload characterization has been proven an essential tool to architecture design and performance evaluation in both scientific and commercial computing areas. Traditional workload characterization techniques include FLOPS rate, cache miss ratios, CPI (cycles per instruction or IPC, instructions per cycle) etc. With the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of any architectural or functional improvement in a new processor design. To solve these problems, many people rely on simulators, which have substantial constraints especially on large-scale scientific computing applications. This paper presents a new technique of characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs virtually without overhead or slowdown. A variety of instruction counts can be utilized to calculate some average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results can provide some insight to the problem that only a small percentage of processor peak performance can be achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
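The flavour of the derived, abstract parameters described above can be illustrated with a short calculation from raw counter values. The numbers and machine parameters below are hypothetical, and real counts would come from an interface such as PAPI or perf rather than being typed in.

```python
counters = {  # hypothetical counter readings for one application run
    "cycles":        2_400_000_000,
    "instructions":  1_800_000_000,
    "fp_ops":          450_000_000,
    "loads_stores":     600_000_000,
    "l1_misses":         30_000_000,
    "l2_misses":          4_000_000,
}

machine = {"issue_width": 4, "peak_flops_per_cycle": 2}   # assumed core parameters

cpi = counters["cycles"] / counters["instructions"]
ipc = 1.0 / cpi
issue_utilization = ipc / machine["issue_width"]
flops_per_cycle = counters["fp_ops"] / counters["cycles"]
fp_peak_fraction = flops_per_cycle / machine["peak_flops_per_cycle"]
l1_miss_rate = counters["l1_misses"] / counters["loads_stores"]
l2_miss_rate = counters["l2_misses"] / counters["l1_misses"]

print(f"CPI = {cpi:.2f}, IPC = {ipc:.2f} ({issue_utilization:.0%} of issue width)")
print(f"FP ops/cycle = {flops_per_cycle:.2f} ({fp_peak_fraction:.0%} of FP peak)")
print(f"L1 miss rate = {l1_miss_rate:.1%}, L2 (of L1 misses) = {l2_miss_rate:.1%}")
```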
What do you mean, 'resilient geomorphic systems'?
NASA Astrophysics Data System (ADS)
Thoms, M. C.; Piégay, H.; Parsons, M.
2018-03-01
Resilience thinking has many parallels in the study of geomorphology. Similarities and intersections exist between the scientific discipline of geomorphology and the scientific concept of resilience. Many of the core themes fundamental to geomorphology are closely related to the key themes of resilience. Applications of resilience thinking in the study of natural and human systems have expanded, based on the fundamental premise that ecosystems, economies, and societies must be managed as linked social-ecological systems. Although geomorphology and resilience share core themes, many in the field of resilience have a limited appreciation of the history and development of geomorphology as a field of scientific endeavor, as well as a limited awareness of the foundations of the former in the more recent emergence of resilience. This potentially limits applications of resilience concepts to the study of geomorphology. In this manuscript we provide a collective examination of geomorphology and resilience as a means to conceptually advance both areas of study, as well as to further cement the relevance and importance not only of understanding the complexities of geomorphic systems in an emerging world of interdisciplinary challenges but also of viewing humans as an intrinsic component of geomorphic systems rather than just an external driver. The application of the concepts of hierarchy and scale, fundamental tenets of the study of geomorphic systems, provides a means to overcome contemporary scale-limited approaches within resilience studies. Resilience offers a framework for geomorphology to expand its application into the broader social-ecological domain.
The navigation of biological hyperspace
NASA Astrophysics Data System (ADS)
Conway Morris, Simon
2003-04-01
A recurrent argument against the reality of biological evolution is the claim that there is insufficient time for the emergence of biological complexity. Such a view is a staple of creation "scientists", but even cosmologists and biochemists have been overheard murmuring similar sentiments. Certainly the stock response, that the scientific evidence for evolution is overwhelming, must be made. However, it is also the case that whilst the efficacy of natural selection is not in dispute, it is context-free and fails to explain the specificities of life. This observation is usually greeted with a Gallic shrug: "Yes, the biosphere is very rich, but so what?" Indeed, the standard scientific response is that evolution is dogged by contingent happenstance, with the implication that a given complexity, say intelligence, is an evolutionary fluke. This, however, is inconsistent with the ubiquity of evolutionary convergence. Here I outline the argument for such convergence, providing a "road-map" of possibilities that arguably has universal applications and, as importantly, points to a much deeper structure to life.
Elements of complexity in subsurface modeling, exemplified with three case studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark
2017-04-03
There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this paper, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: 1) modeling approach, 2) description of process, and 3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil vapor extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.
Elements of complexity in subsurface modeling, exemplified with three case studies
NASA Astrophysics Data System (ADS)
Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark L.; Bacon, Diana H.; Freshley, Mark D.; Wellman, Dawn M.
2017-09-01
There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this report, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: (1) modeling approach, (2) description of process, and (3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil-vapor-extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.
Science communication as political communication
Scheufele, Dietram A.
2014-01-01
Scientific debates in modern societies often blur the lines between the science that is being debated and the political, moral, and legal implications that come with its societal applications. This manuscript traces the origins of this phenomenon to professional norms within the scientific discipline and to the nature and complexities of modern science and offers an expanded model of science communication that takes into account the political contexts in which science communication takes place. In a second step, it explores what we know from empirical work in political communication, public opinion research, and communication research about the dynamics that determine how issues are debated and attitudes are formed in political environments. Finally, it discusses how and why it will be increasingly important for science communicators to draw from these different literatures to ensure that the voice of the scientific community is heard in the broader societal debates surrounding science. PMID:25225389
NASA Astrophysics Data System (ADS)
Giordan, Daniele; Hayakawa, Yuichi; Nex, Francesco; Remondino, Fabio; Tarolli, Paolo
2018-04-01
The number of scientific studies that consider possible applications of remotely piloted aircraft systems (RPASs) for managing the effects of natural hazards and identifying the resulting damage has increased strongly in the last decade. Nowadays, in the scientific community, the use of these systems is not a novelty, but a deeper analysis of the literature shows a lack of codified complex methodologies that can be used not only for scientific experiments but also for normal codified emergency operations. RPASs can acquire on-demand ultra-high-resolution images that can be used for the identification of active processes such as landslides or volcanic activities but can also define the effects of earthquakes, wildfires and floods. In this paper, we present a review of published literature that describes experimental methodologies developed for the study and monitoring of natural hazards.
Science communication as political communication.
Scheufele, Dietram A
2014-09-16
Scientific debates in modern societies often blur the lines between the science that is being debated and the political, moral, and legal implications that come with its societal applications. This manuscript traces the origins of this phenomenon to professional norms within the scientific discipline and to the nature and complexities of modern science and offers an expanded model of science communication that takes into account the political contexts in which science communication takes place. In a second step, it explores what we know from empirical work in political communication, public opinion research, and communication research about the dynamics that determine how issues are debated and attitudes are formed in political environments. Finally, it discusses how and why it will be increasingly important for science communicators to draw from these different literatures to ensure that the voice of the scientific community is heard in the broader societal debates surrounding science.
Lattice Boltzmann method for rain-induced overland flow
NASA Astrophysics Data System (ADS)
Ding, Yu; Liu, Haifei; Peng, Yong; Xing, Liming
2018-07-01
Complex rainfall situations can generate overland flow with complex hydrodynamic characteristics, affecting the surface configuration (i.e. sheet erosion) and environment to varying degrees. Reliable numerical simulations can provide a scientific method for the optimization of environmental management. A mesoscopic numerical method, the lattice Boltzmann method, was employed to simulate overland flows. To deal with complex rainfall, two schemes were introduced to improve the lattice Boltzmann equation and the local equilibrium function, respectively. Four typical cases with differences in rainfall, bed roughness, and slopes were selected to test the accuracy and applicability of the proposed schemes. It was found that the simulated results were in good agreement with the experimental data, analytical values, and the results produced by other models.
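For readers new to the method, the sketch below shows the basic lattice Boltzmann collide-and-stream cycle on a D1Q3 lattice, with water depth as the conserved field and a uniform rainfall source added each step. It only illustrates the structure of the algorithm; the paper's improved lattice Boltzmann equation and equilibrium function for complex rainfall are not reproduced, and all parameter values are made up.

```python
import numpy as np

nodes, steps = 200, 1000
tau = 0.9                         # BGK relaxation time (assumed)
rain = 1e-4                       # depth added per node per step (assumed)
w = np.array([2/3, 1/6, 1/6])     # D1Q3 weights: rest, +x, -x populations

h = np.zeros(nodes)
h[90:110] = 1.0                   # made-up initial mound of water
initial_total = h.sum()
f = w[:, None] * h                # start from the local equilibrium

for _ in range(steps):
    h = f.sum(axis=0)             # recover depth (zeroth moment of the distributions)
    feq = w[:, None] * h          # local equilibrium distributions
    f -= (f - feq) / tau          # BGK collision step
    f += w[:, None] * rain        # uniform rainfall source term
    f[1] = np.roll(f[1], 1)       # stream the +x population (periodic domain)
    f[2] = np.roll(f[2], -1)      # stream the -x population

total = f.sum()
print(f"total depth {total:.2f} = initial {initial_total:.2f} + rainfall {nodes * steps * rain:.2f}")
```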
Reengineering observatory operations for the time domain
NASA Astrophysics Data System (ADS)
Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.
2014-07-01
Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change increasingly characterized by large time domain surveys, robotic telescopes and automated infrastructures, and - most significantly - by operating modes and scientific consortia that span our individual facilities, joining them into complex network entities. Observatories must adapt, and numerous initiatives are in progress that focus on redesigning individual components out of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols and astroinformatics data-mining techniques layered on these are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics face the national and international astronomical communities, it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.
Scientific Knowledge Discovery in Complex Semantic Networks of Geophysical Systems
NASA Astrophysics Data System (ADS)
Fox, P.
2012-04-01
The vast majority of explorations of the Earth's systems are limited in their ability to effectively explore the most important (often most difficult) problems because they are forced to interconnect at the data-element, or syntactic, level rather than at a higher scientific, or semantic, level. Recent successes in the application of complex network theory and algorithms to climate data raise expectations that more general graph-based approaches offer the opportunity for new discoveries. In the past ~5 years there has been substantial progress in the natural sciences in providing both specialists and non-specialists the ability to describe, in machine-readable form, geophysical quantities and relations among them in meaningful and natural ways, effectively breaking the prior syntax barrier. The corresponding open-world semantics and reasoning provide higher-level interconnections. That is, semantics are provided around the data structures, using semantically equipped tools and semantically aware interfaces between science application components, allowing for discovery at the knowledge level. More recently, formal semantic approaches to continuous and aggregate physical processes are beginning to show promise and are soon likely to be ready to apply to geoscientific systems. To illustrate these opportunities, this presentation describes two application examples featuring domain vocabulary (ontology) and property relations (named and typed edges in the graphs). First, a climate knowledge discovery pilot encoding and exploration of CMIP5 catalog information with the eventual goal to encode and explore CMIP5 data. Second, a multi-stakeholder knowledge network for integrated assessments in marine ecosystems, where the data is highly inter-disciplinary.
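A minimal sketch of the kind of semantic graph described above is given below: concepts and datasets become nodes, named and typed relations become edges, and a simple traversal answers a knowledge-level query (which datasets bear on a quantity or anything it influences). The vocabulary, dataset names and the use of networkx are assumptions for illustration; a production system would build on formal ontologies.

```python
import networkx as nx

kg = nx.MultiDiGraph()
triples = [  # invented (subject, relation, object) statements
    ("SeaSurfaceTemperature", "influences", "CoralBleaching"),
    ("CoralBleaching",        "affects",    "ReefFishBiomass"),
    ("CMIP5_tos_rcp85",       "measures",   "SeaSurfaceTemperature"),
    ("ReefSurvey2011",        "observes",   "ReefFishBiomass"),
    ("SeaSurfaceTemperature", "subClassOf", "OceanTemperature"),
]
for subj, relation, obj in triples:
    kg.add_edge(subj, obj, relation=relation)

def related_datasets(concept):
    """Datasets that measure/observe the concept or anything it influences."""
    reachable = {concept} | nx.descendants(kg, concept)
    return sorted(
        subj
        for subj, obj, data in kg.edges(data=True)
        if data["relation"] in ("measures", "observes") and obj in reachable
    )

print(related_datasets("SeaSurfaceTemperature"))
# -> ['CMIP5_tos_rcp85', 'ReefSurvey2011']
```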
Anticoagulant treatment of medical patients with complex clinical conditions.
Ruiz-Ruiz, F; Medrano, F J; Santos-Lozano, J M; Rodríguez-Torres, P; Navarro-Puerto, A; Calderón, E J
2018-06-12
There is little available information on treatment or prophylaxis with anticoagulant drugs for outpatients with medical diseases and complex clinical conditions. There are no clinical practice guidelines and/or specific recommendations for this patient subgroup, which is frequently treated by internists. Complex clinical conditions are those in which, due to comorbidity, age, vital prognosis or multiple treatment with drugs, a clinical situation arises of disease-disease, disease-drug or drug-drug interactions that is not included within the scenarios that commonly generate the scientific evidence. The objective of this narrative review is to collect the recommendations of clinical practice guidelines and systematic reviews and adapt them to complex clinical conditions, in which the direct application of recommendations based on studies that do not include patients with this complexity and comorbidity could be problematic. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
NASA Astrophysics Data System (ADS)
Wolfschmidt, Gudrun
2015-08-01
Observatories offer a good possibility for serial transnational applications. A well-known example of a thematic programme is the Struve arc, already recognized as World Heritage. I will discuss what has been achieved and show examples, like the route of astronomical observatories or the transition from classical astronomy to modern astrophysics (La Plata, Hamburg, Nice, etc.), visible in the architecture, the choice of instruments, and the arrangement of the observatory buildings in an astronomy park. This corresponds to the main categories according to which the "outstanding universal value" (UNESCO criteria ii, iv and vi) of the observatories has been evaluated: historic, scientific, and aesthetic. This proposal is based on the criterion of comparability of the observatories in terms of the urbanistic complex and the architecture, the scientific orientation, equipment of instruments, authenticity and integrity of the preserved state, as well as in terms of historic scientific relations and scientific contributions. Apart from these serial transnational applications one can also choose other groups like baroque or neo-classical observatories, solar physics observatories or a group of observatories equipped with the same kind of instruments and made by the same famous firm. I will also discuss why the implementation of the Astronomy and World Heritage Initiative is difficult and why there are problems in nominating observatories for inclusion in the national Tentative Lists.
JPRS Report, Science & Technology, USSR: Science and Technology Policy.
1988-03-03
accordance with the Kazakhstan Regional Scientific Research Program, which is called upon to unite scientific development of a basic and applied nature...Resources for 1986-1990 and the Period to 2000." The institute is a part of the union Avtogennyye protsessy Scientific Technical Complex and the...republic Tsvetnaya metallurgiya Scientific Technical Complex and is participating in the work of the creative youth collective for the automation of
Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research focuses on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
Preface to the Focus Issue: chaos detection methods and predictability.
Gottwald, Georg A; Skokos, Charalampos
2014-06-01
This Focus Issue presents a collection of papers originating from the workshop Methods of Chaos Detection and Predictability: Theory and Applications held at the Max Planck Institute for the Physics of Complex Systems in Dresden, June 17-21, 2013. The main aim of this interdisciplinary workshop was to review comprehensively the theory and numerical implementation of the existing methods of chaos detection and predictability, as well as to report recent applications of these techniques to different scientific fields. The collection of twelve papers in this Focus Issue represents the wide range of applications, spanning mathematics, physics, astronomy, particle accelerator physics, meteorology and medical research. This Preface surveys the papers of this Issue.
Li, Y; Nielsen, P V
2011-12-01
There has been a rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in the research of ventilation and indoor air science. With a 1,000-10,000 times increase in computer hardware capability in the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research; rather, it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years. © 2011 John Wiley & Sons A/S.
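The review's call for verification and validation lends itself to a small worked example. The sketch below is not from the paper; it is a generic Python illustration of one common verification step, estimating the observed order of accuracy and a Richardson-extrapolated value from solutions on three systematically refined grids. The refinement ratio and sample solution values are invented for illustration.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Estimate the observed order of accuracy p from three solutions on
    grids refined by a constant ratio r (coarse -> medium -> fine)."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson-extrapolated estimate of the grid-independent value."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Illustrative ventilation quantity (e.g., a mean age of air in seconds) on
# three grids; the numbers are invented purely to show the procedure.
f_coarse, f_medium, f_fine = 412.0, 405.0, 402.0
r = 2.0                                   # constant refinement ratio

p = observed_order(f_coarse, f_medium, f_fine, r)
f_est = richardson_extrapolate(f_medium, f_fine, r, p)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_est:.1f}")
```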
Prescriptive scientific narratives for communicating usable science.
Downs, Julie S
2014-09-16
In this paper I describe how a narrative approach to science communication may help audiences to more fully understand how science is relevant to their own lives and behaviors. The use of prescriptive scientific narrative can help to overcome challenges specific to scientific concepts, especially the need to reconsider long-held beliefs in the face of new empirical findings. Narrative can captivate the audience, driving anticipation for plot resolution, thus becoming a self-motivating vehicle for information delivery. This quality gives narrative considerable power to explain complex phenomena and causal processes, and to create and reinforce memory traces for better recall and application over time. Because of the inherent properties of narrative communication, their creators have a special responsibility to ensure even-handedness in selection and presentation of the scientific evidence. The recent transformation in communication and information technology has brought about new platforms for delivering content, particularly through interactivity, which can use structured self-tailoring to help individuals most efficiently get exactly the content that they need. As with all educational efforts, prescriptive scientific narratives must be evaluated systematically to determine whether they have the desired effects in improving understanding and changing behavior.
Scientific Data Stewardship in the 21st Century
NASA Astrophysics Data System (ADS)
Mabie, J. J.; Redmon, R.; Bullett, T.; Kihn, E. A.; Conkright, R.; Manley, J.; Horan, K.
2008-12-01
The Ionosonde Program at the National Geophysical Data Center (NGDC) serves as a case study for how to approach data stewardship in the 21st century. As the number and sophistication of scientific instruments increase, along with the volumes and complexity of data that need to be preserved for future generations, the old approach of simply storing data in a library, physical or electronic, is no longer sufficient to ensure the long-term preservation of our important environmental data. To ensure the data can be accessed, understood, and used by future generations, the data stewards must be familiar with the observation process before the data reach the archive and with the scientific applications the data may be called upon to serve. This familiarity is best obtained by active participation. In the NGDC Ionosonde Program team, we strive to have activity and expertise in ionosonde field operations and scientific data analysis in addition to our core mission of preservation and distribution of data and metadata. We believe this approach produces superior data quality, proper documentation, and evaluation tools for data customers as part of the archive process. We present the Ionosonde Program as an example of modern scientific data stewardship.
Climate change and public health policy: translating the science.
Braks, Marieta; van Ginkel, Rijk; Wint, William; Sedda, Luigi; Sprong, Hein
2013-12-19
Public health authorities are required to prepare for future threats and need predictions of the likely impact of climate change on public health risks. They may get overwhelmed by the volume of heterogeneous information in scientific articles and risk relying purely on public opinion articles, which focus mainly on global warming trends and leave out many other relevant factors. In the current paper, we discuss various scientific approaches investigating climate change and its possible impact on public health and discuss their different roles and functions in unraveling the complexity of the subject. It is not our objective to review the available literature or to make predictions for certain diseases or countries, but rather to evaluate the applicability of scientific research articles on climate change to evidence-based public health decisions. In the context of mosquito-borne diseases, we identify common pitfalls to watch out for when assessing scientific research on the impact of climate change on human health. We aim to provide guidance through the plethora of scientific papers and views on the impact of climate change on human health to those new to the subject, as well as to remind public health experts of its multifactorial and multidisciplinary character.
Precision pointing of scientific instruments on space station: The LFGGREC perspective
NASA Technical Reports Server (NTRS)
Blackwell, C. C.; Sirlin, S. W.; Laskin, R. A.
1988-01-01
An application of Lyapunov function-gradient-generated robustness-enhancing control (LFGGREC) is explored. The attention is directed to a reduced-complexity representation of the pointing problem presented by the system composed of the Space Infrared Telescope Facility gimbaled to a space station configuration. Uncertainties include disturbance forces applied in the crew compartment area and control moments applied to adjacent scientific payloads (modeled as disturbance moments). Also included are uncertainties in gimbal friction and in the structural component of the system, as reflected in the inertia matrix, the damping matrix, and the stiffness matrix, and the effect of the ignored vibrational dynamics of the structure. The emphasis is on the adaptation of LFGGREC to this particular configuration and on the robustness analysis.
Creating a FIESTA (Framework for Integrated Earth Science and Technology Applications) with MagIC
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.
2017-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC) has recently developed a containerized web application to considerably reduce the friction in contributing, exploring and combining valuable and complex datasets for the paleo-, geo- and rock magnetic scientific community. The data produced in this scientific domain are inherently hierarchical, and the community's evolving approaches to this scientific workflow, from sampling to taking measurements to multiple levels of interpretations, require a large and flexible data model to adequately annotate the results and ensure reproducibility. Historically, contributing such detail in a consistent format has been prohibitively time-consuming and often resulted in only publishing the highly derived interpretations. The new open-source (https://github.com/earthref/MagIC) application provides a flexible upload tool integrated with the data model to easily create a validated contribution and a powerful search interface for discovering datasets and combining them to enable transformative science. MagIC is hosted at EarthRef.org along with several interdisciplinary geoscience databases. A FIESTA (Framework for Integrated Earth Science and Technology Applications) is being created by generalizing MagIC's web application for reuse in other domains. The application relies on a single configuration document that describes the routing, data model, component settings and external services integrations. The container hosts an isomorphic Meteor JavaScript application, MongoDB database and ElasticSearch search engine. Multiple containers can be configured as microservices to serve portions of the application or rely on externally hosted MongoDB, ElasticSearch, or third-party services to efficiently scale computational demands. FIESTA is particularly well suited for many Earth Science disciplines with its flexible data model, mapping, account management, upload tool to private workspaces, reference metadata, image galleries, full text searches and detailed filters. EarthRef's Seamount Catalog of bathymetry and morphology data, EarthRef's Geochemical Earth Reference Model (GERM) databases, and Oregon State University's Marine and Geology Repository (http://osu-mgr.org) will benefit from custom adaptations of FIESTA.
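The abstract notes that the framework is driven by a single configuration document covering routing, data model, component settings and external-service integrations. The fragment below is a purely hypothetical Python sketch of what such a document might contain; none of the keys, paths or component names are taken from the actual FIESTA or MagIC code.

```python
# Hypothetical single configuration document in the spirit of the framework
# described above; every key and value here is illustrative, not FIESTA's.
site_config = {
    "routes": {
        "/": "HomeComponent",
        "/search": "SearchComponent",
        "/upload": "UploadComponent",
    },
    "data_model": {
        "tables": ["locations", "sites", "samples", "specimens", "measurements"],
        "validation": "data_model/v3.json",   # hypothetical path
    },
    "components": {
        "map": {"enabled": True, "default_projection": "mercator"},
        "image_gallery": {"enabled": True},
    },
    "services": {
        "database": {"kind": "mongodb", "uri": "mongodb://localhost:27017/app"},
        "search": {"kind": "elasticsearch", "uri": "http://localhost:9200"},
    },
}

def resolve_route(path: str) -> str:
    """Look up which UI component serves a given path (illustrative only)."""
    return site_config["routes"].get(path, "NotFoundComponent")

print(resolve_route("/search"))
```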
NASA Astrophysics Data System (ADS)
Castano, Carolina
2008-11-01
This article reports on a qualitative and quantitative study that explored whether a constructivist Science learning environment, in which 9- to 10-year-old Colombian girls had the opportunity to discuss scientific concepts and socio-scientific dilemmas in groups, improved their understanding of the concepts and the complex relations that exist between species and the environment. Data were collected from two fourth-grade groups in a private bilingual school, a treatment and a comparison group. Pre- and post-tests on the understanding of scientific concepts and the possible consequences of human action on living things, transcriptions of the discussions of dilemmas, and pre- and post-tests of empathy showed that students who had the opportunity to discuss socio-scientific dilemmas gave better definitions for scientific concepts and made better connections between them, their lives, and Nature than students who did not. It is argued that Science learning should occur in constructivist learning environments and go beyond the construction of scientific concepts, to discussions and decision-making related to the social and moral implications of the application of Science in the real world. It is also argued that this type of pedagogical intervention, and research on it, should be carried out in different sociocultural contexts to confirm its impact on Science learning in diverse conditions.
An Easy & Fun Way to Teach about How Science "Works": Popularizing Haack's Crossword-Puzzle Analogy
ERIC Educational Resources Information Center
Pavlova, Iglika V.; Lewis, Kayla C.
2013-01-01
Science is a complex process, and we must not teach our students overly simplified versions of "the" scientific method. We propose that students can uncover the complex realities of scientific thinking by exploring the similarities and differences between solving the familiar crossword puzzles and scientific "puzzles."…
Echoes That Never Were: American Mobile Intercontinental Ballistic Missiles, 1956-1983
2006-05-11
research, develop, operate, maintain, and sustain complex technological systems, ICBMs were--and remain--a system blending technical matters, scientific laws, economic...technological system that blended scientific laws, economic realities, political forces, and social concerns that included environmentalism and
An e-learning application on electrochemotherapy
Corovic, Selma; Bester, Janez; Miklavcic, Damijan
2009-01-01
Background: Electrochemotherapy is an effective approach in local tumour treatment employing locally applied high-voltage electric pulses in combination with chemotherapeutic drugs. In planning and performing electrochemotherapy a multidisciplinary expertise is required and collaboration, knowledge and experience exchange among the experts from different scientific fields such as medicine, biology and biomedical engineering is needed. The objective of this study was to develop an e-learning application in order to provide the educational content on electrochemotherapy and its underlying principles and to support collaboration, knowledge and experience exchange among the experts involved in the research and clinics. Methods: The educational content on electrochemotherapy and cell and tissue electroporation was based on previously published studies from molecular dynamics, lipid bilayers, single cell level and simplified tissue models to complex biological tissues and research and clinical results of electrochemotherapy treatment. We used computer graphics such as model-based visualization (i.e. 3D numerical modelling using the finite element method) and 3D computer animations and graphical illustrations to facilitate the representation of complex biological and physical aspects in electrochemotherapy. The e-learning application is integrated into an interactive e-learning environment developed at our institution, enabling collaboration and knowledge exchange among the users. We evaluated the designed e-learning application at the International Scientific workshop and postgraduate course (Electroporation Based Technologies and Treatments). The evaluation was carried out by testing the pedagogical efficiency of the presented educational content and by performing a usability study of the application. Results: The e-learning content presents three different levels of knowledge on cell and tissue electroporation. In the first part of the e-learning application we explain basic principles of the electroporation process. The second part provides educational content about the importance of modelling and visualization of the local electric field in electroporation-based treatments. In the third part we developed an interactive module for visualization of the local electric field distribution in 3D tissue models of cutaneous tumors for different parameters such as voltage applied, distance between electrodes, electrode dimension and shape, tissue geometry and electric conductivity. The pedagogical efficiency assessment showed that the participants improved their level of knowledge. The results of the usability evaluation revealed that participants found the application simple to learn, use and navigate. The participants also found the information provided by the application easy to understand. Conclusion: The e-learning application we present in this article provides educational material on electrochemotherapy and its underlying principles such as cell and tissue electroporation. The e-learning application is developed to provide interactive educational content in order to simulate a "hands-on" learning approach about the parameters important for successful therapy. The e-learning application together with the interactive e-learning environment is available to the users to provide collaborative and flexible learning in order to facilitate knowledge exchange among the experts from different scientific fields that are involved in electrochemotherapy.
The modular structure of the application allows for upgrade with new educational content collected from the clinics and research, and can be easily adapted to serve as a collaborative e-learning tool also in other electroporation-based treatments such as gene electrotransfer, gene vaccination, irreversible tissue ablation and transdermal gene and drug delivery. The presented e-learning application provides an easy and rapid approach for information, knowledge and experience exchange among the experts from different scientific fields, which can facilitate development and optimisation of electroporation-based treatments. PMID:19843322
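As a rough illustration of the quantity the interactive module visualises (not the authors' model, which uses 3D finite-element calculations of tissue), the sketch below solves a 2D Laplace problem for the potential between two needle electrodes by Jacobi iteration and reports the field magnitude midway between them. The geometry, applied voltage and grid resolution are invented for illustration.

```python
import numpy as np

# Minimal 2D Laplace solve for the potential between two needle electrodes,
# purely illustrative; the cited work uses full 3D finite-element models.
n, h = 101, 1e-4                 # grid points per side, spacing (m): a 10 mm box
U = 1000.0                       # applied voltage (V), illustrative
phi = np.zeros((n, n))
left, right = 30, 70             # electrode columns (~4 mm apart)
rows = slice(40, 61)             # electrode extent in y

for _ in range(5000):            # Jacobi iterations
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                              + phi[1:-1, :-2] + phi[1:-1, 2:])
    phi[rows, left], phi[rows, right] = +U / 2, -U / 2   # re-impose electrodes

Ey, Ex = np.gradient(-phi, h)    # E = -grad(phi)
E_mag = np.hypot(Ex, Ey)
print(f"|E| midway between electrodes ~ {E_mag[50, 50]:.0f} V/m")
```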
What is the role of induction and deduction in reasoning and scientific inquiry?
NASA Astrophysics Data System (ADS)
Lawson, Anton E.
2005-08-01
A long-standing and continuing controversy exists regarding the role of induction and deduction in reasoning and in scientific inquiry. Given the inherent difficulty in reconstructing reasoning patterns based on personal and historical accounts, evidence about the nature of human reasoning in scientific inquiry has been sought from a controlled experiment designed to identify the role played by enumerative induction and deduction in cognition as well as from the relatively new field of neural modeling. Both experimental results and the neurological models imply that induction across a limited set of observations plays no role in task performance and in reasoning. Therefore, support has been obtained for Popper's hypothesis that enumerative induction does not exist as a psychological process. Instead, people appear to process information in terms of increasingly abstract cycles of hypothetico-deductive reasoning. Consequently, science instruction should provide students with opportunities to generate and test increasingly complex and abstract hypotheses and theories in a hypothetico-deductive manner. In this way students can be expected to become increasingly conscious of their underlying hypothetico-deductive thought processes, increasingly skilled in their application, and hence increasingly scientifically literate.
Jakimowicz, Aleksander
2009-10-01
The 7-fold interdisciplinary matrix is introduced. This integrated methodological point of view is original, although it is based on ideas of others in various ways. The name for this new approach draws on the Kuhnian notion of a disciplinary matrix. There are four components of the Kuhnian matrix on which the existence of scientific communities hinges: symbolic generalizations, models, values, and exemplars. In this context the term "paradigm" should refer to exemplars. The interdisciplinary matrix is composed of seven elements: cybernetics, catastrophe theory, fractal geometry, deterministic chaos, artificial intelligence, theory of complexity, and humanistic values. Scientific developments have recently brought substantial changes in the structure of scientific communities. Transferability of ideas and thoughts contributed to the creation of scientific communities, which unite representatives of various professions. When researching certain phenomena we no longer need to develop theories for them from scratch, as we can draw on the achievements in other disciplines. Two examples of the employment of the interdisciplinary matrix in macroeconomics are elaborated here: the investment cycle model in socialist economy, and the model of economic transformation based on chaotic hysteresis.
Fulcher, Ben D; Jones, Nick S
2017-11-22
Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
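hctsa itself is a MATLAB package, so the sketch below does not use its API; it is only a minimal Python illustration of the highly comparative idea: compute a set of interpretable features for every time series and rank them by how well they separate two labelled groups. The features and the synthetic two-class data are illustrative stand-ins.

```python
import numpy as np

# A few simple, interpretable time-series features (a tiny stand-in for the
# ~7,700 features computed by hctsa, which is a MATLAB package).
FEATURES = {
    "mean": np.mean,
    "std": np.std,
    "lag1_autocorr": lambda x: np.corrcoef(x[:-1], x[1:])[0, 1],
    "mean_abs_diff": lambda x: np.mean(np.abs(np.diff(x))),
}

def feature_matrix(series):
    """Rows = time series, columns = features."""
    return np.array([[f(x) for f in FEATURES.values()] for x in series])

# Synthetic two-class phenotyping example: drifting vs. noisy dynamics.
rng = np.random.default_rng(0)
class_a = [np.cumsum(rng.normal(size=500)) for _ in range(20)]   # random walks
class_b = [rng.normal(size=500) for _ in range(20)]              # white noise
X = feature_matrix(class_a + class_b)
y = np.array([0] * 20 + [1] * 20)

# Rank features by a simple separation score: absolute difference of class
# means in units of the feature's overall standard deviation.
scores = np.abs(X[y == 0].mean(0) - X[y == 1].mean(0)) / (X.std(0) + 1e-12)
for name, s in sorted(zip(FEATURES, scores), key=lambda t: -t[1]):
    print(f"{name:>15s}: separation {s:.2f}")
```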
Optimizing Irregular Applications for Energy and Performance on the Tilera Many-core Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Panyala, Ajay R.; Halappanavar, Mahantesh
Optimizing applications simultaneously for energy and performance is a complex problem. High performance, parallel, irregular applications are notoriously hard to optimize due to their data-dependent memory accesses, lack of structured locality and complex data structures and code patterns. Irregular kernels are growing in importance in applications such as machine learning, graph analytics and combinatorial scientific computing. Performance- and energy-efficient implementation of these kernels on modern, energy efficient, multicore and many-core platforms is therefore an important and challenging problem. We present results from optimizing two irregular applications, the Louvain method for community detection (Grappolo) and high-performance conjugate gradient (HPCCG), on the Tilera many-core system. We have significantly extended MIT's OpenTuner auto-tuning framework to conduct a detailed study of platform-independent and platform-specific optimizations to improve performance as well as reduce total energy consumption. We explore the optimization design space along three dimensions: memory layout schemes, compiler-based code transformations, and optimization of parallel loop schedules. Using auto-tuning, we demonstrate whole node energy savings of up to 41% relative to a baseline instantiation, and up to 31% relative to manually optimized variants.
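The study extends MIT's OpenTuner; the sketch below does not reproduce OpenTuner's API but illustrates the basic auto-tuning loop assumed here: sample configurations from a small space of layout, schedule and chunk-size choices and keep the one with the best weighted time-energy score. The measurement function is a placeholder that a real tuner would replace with actual application runs and hardware energy counters.

```python
import itertools
import random
import time

# Illustrative configuration space (not the paper's actual tuning parameters).
SPACE = {
    "layout": ["array_of_structs", "struct_of_arrays"],
    "schedule": ["static", "dynamic", "guided"],
    "chunk_size": [16, 64, 256, 1024],
}

def measure(cfg):
    """Placeholder: run the kernel with `cfg` and return (seconds, joules).
    A real tuner would launch the application and read hardware counters."""
    t0 = time.perf_counter()
    sum(i * i for i in range(50_000))              # stand-in workload
    elapsed = time.perf_counter() - t0
    energy = elapsed * (20 + 5 * random.random())  # fake power draw in watts
    return elapsed, energy

def score(cfg, w_time=0.5, w_energy=0.5):
    elapsed, energy = measure(cfg)
    return w_time * elapsed + w_energy * energy

random.seed(1)
candidates = [dict(zip(SPACE, values))
              for values in itertools.product(*SPACE.values())]
best = min(random.sample(candidates, k=12), key=score)   # simple random search
print("best configuration found:", best)
```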
Miniaturization as a key factor to the development and application of advanced metrology systems
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Dobrev, Ivo; Harrington, Ellery; Hefti, Peter; Khaleghi, Morteza
2012-10-01
Recent technological advances of miniaturization engineering are enabling the realization of components and systems with unprecedented capabilities. Such capabilities, which are significantly beneficial to scientific and engineering applications, are impacting the development and the application of optical metrology systems for investigations under complex boundary, loading, and operating conditions. In this paper, an overview of metrology systems that we are developing is presented. Systems are being developed and applied to high-speed and high-resolution measurements of shape and deformations under actual operating conditions for such applications as sustainability, health, medical diagnosis, security, and urban infrastructure. Systems take advantage of recent developments in light sources and modulators, detectors, microelectromechanical (MEMS) sensors and actuators, kinematic positioners, rapid prototyping fabrication technologies, as well as software engineering.
Information driving force and its application in agent-based modeling
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2018-04-01
Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric in the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. In particular, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.
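As a toy illustration of an agent-based market driven by an external information signal (not the authors' model, whose parameters are determined empirically), the sketch below lets each agent buy or sell with a probability biased by the information driving force, with a stronger response assumed in the bear state than in the bull state to mimic the reported asymmetry.

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, n_steps = 1000, 250

# External information driving force: positive values = good news.
info = rng.normal(0.0, 1.0, size=n_steps)

price = 100.0
prices = [price]
for t in range(n_steps):
    bull = price >= prices[0]              # crude bull/bear state indicator
    sensitivity = 0.05 if bull else 0.10   # asymmetric response (illustrative)
    p_buy = 1.0 / (1.0 + np.exp(-10.0 * sensitivity * info[t]))
    decisions = rng.random(n_agents) < p_buy          # True = buy, False = sell
    excess_demand = (2 * decisions.sum() - n_agents) / n_agents
    price *= np.exp(0.01 * excess_demand)             # simple price impact
    prices.append(price)

print(f"final price: {prices[-1]:.2f}, return: {prices[-1] / prices[0] - 1:+.2%}")
```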
Pectin-modifying enzymes and pectin-derived materials: applications and impacts.
Bonnin, Estelle; Garnier, Catherine; Ralet, Marie-Christine
2014-01-01
Pectins are complex branched polysaccharides present in primary cell walls. As a distinctive feature, they contain a high amount of partly methyl-esterified galacturonic acid and a low amount of rhamnose, and carry arabinose and galactose as the major neutral sugars. Due to their structural complexity, they can be modified by many different enzymes, including hydrolases, lyases, and esterases. Their peculiar structure is the origin of their physicochemical properties. Among others, their remarkable gelling properties make them a key additive for food industries. Pectin-degrading and pectin-modifying enzymes may be used in a wide variety of applications to modulate pectin properties or to produce pectin derivatives and oligosaccharides of functional as well as nutritional interest. This paper reviews the scientific information available on pectin structure, pectin-modifying enzymes, and the use of enzymes to produce pectins with controlled structure or pectin-derived oligosaccharides with functionally or nutritionally interesting properties.
77 FR 61739 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... DEPARTMENT OF COMMERCE International Trade Administration Application(s) for Duty-Free Entry of Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... combustion, such as hydroxyl (OH) radicals...
Althoff, Marc André; Bertsch, Andreas; Metzulat, Manfred; Klapötke, Thomas M; Karaghiosoff, Konstantin L
2017-11-01
The successful application of headspace (HS) and direct immersion (DI) solid phase microextraction (SPME) for the unambiguous identification and characterization of a series of toxic thiophosphate esters, such as Amiton (I), from aqueous phases and complex matrices (e.g. grass and foliage) has been demonstrated. A Thermo Scientific gas chromatograph (GC)-tandem mass spectrometer (MS/MS) system with a TriPlus RSH® autosampler and a SPME tool was used to investigate the effect of different parameters that influence the extraction efficiency, e.g. pH of the sample matrix and extraction temperature. The developed methods were employed for the detection of several Amiton derivatives (Schedule II of the CWC) that are structurally closely related to each other, some of which are new and have not been reported in the literature previously. In addition, a novel DI SPME method from complex matrices for the analysis of organophosphates related to the CWC was developed. The studies clearly show that DI SPME for complex matrices is superior to HS extraction and can potentially be applied to other related compounds controlled under the CWC. Copyright © 2017. Published by Elsevier B.V.
[Surgical treatment of chronic pancreatitis based on classification of M. Buchler and coworkers].
Krivoruchko, I A; Boĭko, V V; Goncharova, N N; Andreeshchev, S A
2011-08-01
The results of surgical treatment of 452 patients suffering from chronic pancreatitis (CHP) were analyzed. The CHP classification elaborated by M. Buchler and coworkers (2009), based on clinical signs, morphological peculiarities and analysis of pancreatic function, contains scientifically substantiated recommendations for the choice of diagnostic methods and complex treatment of the disease. The proposed classification is simple to apply and constitutes an instrument for studying and comparing the severity of the CHP course, patient prognosis, and treatment.
Axial Halbach Magnetic Bearings
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.; Gallo, Christopher A.; Thompson, William K.
2008-01-01
Axial Halbach magnetic bearings have been investigated as part of an effort to develop increasingly reliable noncontact bearings for future high-speed rotary machines that may be used in such applications as aircraft, industrial, and land-vehicle power systems and in some medical and scientific instrumentation systems. Axial Halbach magnetic bearings are passive in the sense that, unlike most other magnetic bearings developed in recent years, they effect stable magnetic levitation without the need for complex active control.
Planetary Data Workshop, Part 2
NASA Technical Reports Server (NTRS)
1984-01-01
Technical aspects of the Planetary Data System (PDS) are addressed. Methods and tools for maintaining and accessing large, complex sets of data are discussed. The specific software and applications needed for processing imaging and non-imaging science data are reviewed. The need for specific software that provides users with information on the location and geometry of scientific observations is discussed. Computer networks and user interfaces to the PDS are covered, along with the computer hardware available to this data system.
NASA Astrophysics Data System (ADS)
Strassmann, Kuno M.; Joos, Fortunat
2018-05-01
The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
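The IRF substitution that underlies such reduced-form models can be illustrated as a convolution of the emission history with a response function. The sketch below is a minimal Python illustration using a constant plus three exponential terms whose coefficients and timescales are placeholders, not BernSCM's calibrated values.

```python
import numpy as np

# Illustrative impulse response function: the fraction of a CO2 pulse that
# remains airborne after t years. The coefficients and timescales below are
# placeholders, not the calibrated values used in BernSCM.
def irf(t):
    return (0.2 + 0.3 * np.exp(-t / 400.0)
                + 0.3 * np.exp(-t / 40.0)
                + 0.2 * np.exp(-t / 4.0))

dt = 1.0                                   # yearly time step
years = np.arange(0.0, 200.0, dt)
emissions = np.full_like(years, 10.0)      # constant emissions, GtC per year

# Atmospheric carbon perturbation as a discrete convolution of the emission
# history with the impulse response function.
delta_c = np.array([
    np.sum(emissions[: i + 1] * irf(years[i] - years[: i + 1])) * dt
    for i in range(len(years))
])
print(f"perturbation after 100 years: {delta_c[100]:.0f} GtC")
```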
Using Java for distributed computing in the Gaia satellite data processing
NASA Astrophysics Data System (ADS)
O'Mullane, William; Luri, Xavier; Parsons, Paul; Lammers, Uwe; Hoar, John; Hernandez, Jose
2011-10-01
In recent years Java has matured to a stable easy-to-use language with the flexibility of an interpreter (for reflection etc.) but the performance and type checking of a compiled language. When we started using Java for astronomical applications around 1999 they were the first of their kind in astronomy. Now a great deal of astronomy software is written in Java as are many business applications. We discuss the current environment and trends concerning the language and present an actual example of scientific use of Java for high-performance distributed computing: ESA's mission Gaia. The Gaia scanning satellite will perform a galactic census of about 1,000 million objects in our galaxy. The Gaia community has chosen to write its processing software in Java. We explore the manifold reasons for choosing Java for this large science collaboration. Gaia processing is numerically complex but highly distributable, some parts being embarrassingly parallel. We describe the Gaia processing architecture and its realisation in Java. We delve into the astrometric solution which is the most advanced and most complex part of the processing. The Gaia simulator is also written in Java and is the most mature code in the system. This has been successfully running since about 2005 on the supercomputer "Marenostrum" in Barcelona. We relate experiences of using Java on a large shared machine. Finally we discuss Java, including some of its problems, for scientific computing.
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
15 CFR 301.3 - Application for duty-free entry of scientific instruments.
Code of Federal Regulations, 2014 CFR
2014-01-01
... scientific instruments. 301.3 Section 301.3 Commerce and Foreign Trade Regulations Relating to Commerce and... REGULATIONS INSTRUMENTS AND APPARATUS FOR EDUCATIONAL AND SCIENTIFIC INSTITUTIONS § 301.3 Application for duty-free entry of scientific instruments. (a) Who may apply. An applicant for duty-free entry of an...
15 CFR 301.3 - Application for duty-free entry of scientific instruments.
Code of Federal Regulations, 2012 CFR
2012-01-01
... scientific instruments. 301.3 Section 301.3 Commerce and Foreign Trade Regulations Relating to Commerce and... REGULATIONS INSTRUMENTS AND APPARATUS FOR EDUCATIONAL AND SCIENTIFIC INSTITUTIONS § 301.3 Application for duty-free entry of scientific instruments. (a) Who may apply. An applicant for duty-free entry of an...
15 CFR 301.3 - Application for duty-free entry of scientific instruments.
Code of Federal Regulations, 2010 CFR
2010-01-01
... scientific instruments. 301.3 Section 301.3 Commerce and Foreign Trade Regulations Relating to Commerce and... REGULATIONS INSTRUMENTS AND APPARATUS FOR EDUCATIONAL AND SCIENTIFIC INSTITUTIONS § 301.3 Application for duty-free entry of scientific instruments. (a) Who may apply. An applicant for duty-free entry of an...
15 CFR 301.3 - Application for duty-free entry of scientific instruments.
Code of Federal Regulations, 2013 CFR
2013-01-01
... scientific instruments. 301.3 Section 301.3 Commerce and Foreign Trade Regulations Relating to Commerce and... REGULATIONS INSTRUMENTS AND APPARATUS FOR EDUCATIONAL AND SCIENTIFIC INSTITUTIONS § 301.3 Application for duty-free entry of scientific instruments. (a) Who may apply. An applicant for duty-free entry of an...
15 CFR 301.3 - Application for duty-free entry of scientific instruments.
Code of Federal Regulations, 2011 CFR
2011-01-01
... scientific instruments. 301.3 Section 301.3 Commerce and Foreign Trade Regulations Relating to Commerce and... REGULATIONS INSTRUMENTS AND APPARATUS FOR EDUCATIONAL AND SCIENTIFIC INSTITUTIONS § 301.3 Application for duty-free entry of scientific instruments. (a) Who may apply. An applicant for duty-free entry of an...
75 FR 3895 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
..., materials science and nanotechnology. Justification for Duty-Free Entry: There are no domestic manufacturers... DEPARTMENT OF COMMERCE International Trade Administration Application(s) for Duty-Free Entry of Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials...
Computational Aspects of Data Assimilation and the ESMF
NASA Technical Reports Server (NTRS)
daSilva, A.
2003-01-01
Developing advanced data assimilation applications is a daunting scientific challenge. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computer (HPC) platforms required by numerically intensive Earth system applications are complex, varied, rapidly evolving and multi-part systems themselves. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical, and to some extent the cultural, aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focussing on those capabilities useful for developing advanced data assimilation applications.
The role of innovative global institutions in linking knowledge and action.
van Kerkhoff, Lorrae; Szlezák, Nicole A
2016-04-26
It is becoming increasingly recognized that our collective ability to tackle complex problems will require the development of new, adaptive, and innovative institutional arrangements that can deal with rapidly changing knowledge and have effective learning capabilities. In this paper, we applied a knowledge-systems perspective to examine how institutional innovations can affect the generation, sharing, and application of scientific and technical knowledge. We report on a case study that examined the effects that one large innovative organization, The Global Fund to Fight AIDS, Tuberculosis, and Malaria, is having on the knowledge dimensions of decision-making in global health. The case study shows that the organization created demand for new knowledge from a range of actors, but it did not incorporate strategies for meeting this demand into their own rules, incentives, or procedures. This made it difficult for some applicants to meet the organization's dual aims of scientific soundness and national ownership of projects. It also highlighted that scientific knowledge needed to be integrated with managerial and situational knowledge for success. More generally, the study illustrates that institutional change targeting implementation can also significantly affect the dynamics of knowledge creation (learning), access, distribution, and use. Recognizing how action-oriented institutions can affect these dynamics across their knowledge system can help institutional designers build more efficient and effective institutions for sustainable development.
Scientific Data Management Center for Enabling Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vouk, Mladen A.
Managing scientific data has been identified by the scientific community as one of the most important emerging needs because of the sheer volume and increasing complexity of data being collected. Effectively generating, managing, and analyzing this information requires a comprehensive, end-to-end approach to data management that encompasses all of the stages from the initial data acquisition to the final analysis of the data. Fortunately, the data management problems encountered by most scientific domains are common enough to be addressed through shared technology solutions. Based on community input, we have identified three significant requirements. First, more efficient access to storage systems is needed. In particular, parallel file system and I/O system improvements are needed to write and read large volumes of data without slowing a simulation, analysis, or visualization engine. These processes are complicated by the fact that scientific data are structured differently for specific application domains, and are stored in specialized file formats. Second, scientists require technologies to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis and searches over extremely large data sets. Specialized feature discovery and statistical analysis techniques are needed before the data can be understood or visualized. Furthermore, interactive analysis requires techniques for efficiently selecting subsets of the data. Finally, generating the data, collecting and storing the results, keeping track of data provenance, data post-processing, and analysis of results is a tedious, fragmented process. Tools for automation of this process in a robust, tractable, and recoverable fashion are required to enhance scientific exploration. The SDM center was established under the SciDAC program to address these issues. The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Equally important, it established collaborations with these scientists to better understand their science as well as their forthcoming data management and data analytics challenges. Building on our early successes, we have greatly enhanced, robustified, and deployed our technology to these communities. In some cases, we identified new needs that have been addressed in order to simplify the use of our technology by scientists. This report summarizes our work so far in SciDAC-2. Our approach is to employ an evolutionary development and deployment process: from research through prototypes to deployment and infrastructure. Accordingly, we have organized our activities in three layers that abstract the end-to-end data flow described above. We labeled the layers (from bottom to top): a) Storage Efficient Access (SEA), b) Data Mining and Analysis (DMA), c) Scientific Process Automation (SPA). The SEA layer is immediately on top of hardware, operating systems, file systems, and mass storage systems, and provides parallel data access technology, and transparent access to archival storage. The DMA layer, which builds on the functionality of the SEA layer, consists of indexing, feature identification, and parallel statistical analysis technology. The SPA layer, which is on top of the DMA layer, provides the ability to compose scientific workflows from the components in the DMA layer as well as application specific modules.
NCSU work performed under this contract was primarily at the SPA layer.
Robonaut 2 and You: Specifying and Executing Complex Operations
NASA Technical Reports Server (NTRS)
Baker, William; Kingston, Zachary; Moll, Mark; Badger, Julia; Kavraki, Lydia
2017-01-01
Crew time is a precious resource due to the expense of trained human operators in space. Efficient caretaker robots could lessen the manual labor load required by frequent vehicular and life support maintenance tasks, freeing astronaut time for scientific mission objectives. Humanoid robots can fluidly exist alongside human counterparts due to their form, but they are complex and high-dimensional platforms. This paper describes a system that human operators can use to maneuver Robonaut 2 (R2), a dexterous humanoid robot developed by NASA to research co-robotic applications. The system includes a specification of constraints used to describe operations, and the supporting planning framework that solves constrained problems on R2 at interactive speeds. The paper is developed in reference to an illustrative, typical example of an operation R2 performs to highlight the challenges inherent to the problems R2 must face. Finally, the interface and planner are validated through a case study using the guiding example on the physical robot in a simulated microgravity environment. This work reveals the complexity of employing humanoid caretaker robots and suggests solutions that are broadly applicable.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
The dynamics of big data and human rights: the case of scientific research.
Vayena, Effy; Tasioulas, John
2016-12-28
In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities, the new capabilities and risks, of the rapidly evolving digital environment. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).
Challenge '95 - The Ariane 5 Development Programme
NASA Astrophysics Data System (ADS)
Vedrenne, M.; van Gaver, M.
1987-10-01
The Ariane-5 launcher has been assigned to the following types of missions: (1) launching geostationary and sun-synchronous commercial satellites, and scientific and trial applications satellites; (2) launching the Hermes spaceplane, and (3) launching elements of the Columbus system such as the man-tended free-flyer module, and the polar platform. A new launch complex, the ELA-3, is being built for the Ariane-5 launcher close to ESA's ELA-1 and ELA-2 launch complexes at Kourou. After two qualification flights in the automatic version in 1995 (501 and 502), it is expected that Ariane-5 will be declared operational with its first commercial flight planned for early 1996 to put an automatic payload into orbit.
Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds
NASA Astrophysics Data System (ADS)
Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.
In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures currently are undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.
Implementation of a multi-threaded framework for large-scale scientific applications
Sexton-Kennedy, E.; Gartung, Patrick; Jones, C. D.; ...
2015-05-22
The CMS experiment has recently completed the development of a multi-threaded capable application framework. In this paper, we will discuss the design, implementation and application of this framework to production applications in CMS. For the 2015 LHC run, this functionality is particularly critical for both our online and offline production applications, which depend on faster turn-around times and a reduced memory footprint relative to before. These applications are complex codes, each including a large number of physics-driven algorithms. While the framework is capable of running a mix of thread-safe and 'legacy' modules, algorithms running in our production applications need to be thread-safe for optimal use of this multi-threaded framework at a large scale. Towards this end, we discuss the types of changes that were necessary for our algorithms to achieve good performance of our multi-threaded applications at full scale. Lastly, performance numbers for what has been achieved for the 2015 run are presented.
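The CMS framework itself is written in C++, but the thread-safety requirement described above can be illustrated in a language-neutral way. The sketch below (Python, with invented module names) contrasts a 'legacy'-style module that mutates shared state freely with one that keeps per-event work local and guards the shared total with a lock; it illustrates the pattern only and is not the CMS framework API.

import threading

class LegacyModule:
    # 'Legacy' style: mutable member state updated without protection.
    # Under concurrent event processing, the read-modify-write below can race.
    def __init__(self):
        self.total = 0.0
    def process(self, event_energy):
        self.total += event_energy

class ThreadSafeModule:
    # Thread-safe style: per-event work stays in local variables and the
    # only shared update happens inside a short critical section.
    def __init__(self):
        self.total = 0.0
        self._lock = threading.Lock()
    def process(self, event_energy):
        local_result = 2.0 * event_energy      # stand-in for per-event computation
        with self._lock:
            self.total += local_result

module = ThreadSafeModule()
threads = [threading.Thread(target=module.process, args=(1.0,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(module.total)   # 200.0, independent of thread interleaving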
NASA Astrophysics Data System (ADS)
Sutter, A. McKinzie; Dauer, Jenny M.; Forbes, Cory T.
2018-06-01
One aim of science education is to develop scientific literacy for decision-making in daily life. Socio-scientific issues (SSI) and structured decision-making frameworks can help students reach these objectives. This research uses value-belief-norm (VBN) theory and construal level theory (CLT) to explore students' use of personal values in their decision-making processes and the relationship between abstract and concrete problematization and their decision-making. Using mixed methods, we conclude that the level of abstraction with which students problematize a prairie dog agricultural production and ecosystem preservation issue has a significant relationship to the values students used in the decision-making process. However, neither abstraction of the problem statement nor students' surveyed value orientations were significantly related to students' final decisions. These results may help inform teachers' understanding of students and their use of a structured decision-making tool in a classroom, and aid researchers in understanding whether these tools help students remain objective in their analyses of complex SSIs.
Machine learning based job status prediction in scientific clusters
Yoo, Wucherl; Sim, Alex; Wu, Kesheng
2016-09-01
Large high-performance computing systems are built with an increasing number of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, they are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict the unsuccessful job statuses from the monitored ongoing job executions in 99.8% of the cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy can be sufficiently high that it can be used to trigger mitigation procedures for predicted failures.
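As a rough illustration of the classification step, the sketch below trains a Random Forest on a synthetic job table with scikit-learn (assumed available); the feature names and labels are invented for the example and are not the monitoring data used in the paper.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

# Hypothetical per-job features: requested cores, requested walltime (h),
# mean CPU load fraction, peak memory use (GB).
rng = np.random.default_rng(0)
X = rng.random((2000, 4)) * np.array([64, 48, 1.0, 128])
# Synthetic label: 1 = unsuccessful job, crudely tied to resource pressure.
y = ((X[:, 2] > 0.9) | (X[:, 3] > 120)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("precision:", precision_score(y_te, pred), "recall:", recall_score(y_te, pred))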
Machine learning based job status prediction in scientific clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex; Wu, Kesheng
Large high-performance computing systems are built with an increasing number of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, they are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict the unsuccessful job statuses from the monitored ongoing job executions in 99.8% of the cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy can be sufficiently high that it can be used to trigger mitigation procedures for predicted failures.
Enabling scientific workflows in virtual reality
Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.
2006-01-01
To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data ranges. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A.; Jarboe, N.; Tauxe, L.; Constable, C.; Jonestrask, L.
2017-12-01
Challenges are faced by both new and experienced users interested in contributing their data to community repositories, in data discovery, or engaged in potentially transformative science. The Magnetics Information Consortium (https://earthref.org/MagIC) has recently simplified its data model and developed a new containerized web application to reduce the friction in contributing, exploring, and combining valuable and complex datasets for the paleo-, geo-, and rock magnetic scientific community. The new data model more closely reflects the hierarchical workflow in paleomagnetic experiments to enable adequate annotation of scientific results and ensure reproducibility. The new open-source (https://github.com/earthref/MagIC) application includes an upload tool that is integrated with the data model to provide early data validation feedback and ease the friction of contributing and updating datasets. The search interface provides a powerful full text search of contributions indexed by ElasticSearch and a wide array of filters, including specific geographic and geological timescale filtering, to support both novice users exploring the database and experts interested in compiling new datasets with specific criteria across thousands of studies and millions of measurements. The datasets are not large, but they are complex, with many results from evolving experimental and analytical approaches. These data are also extremely valuable due to the cost in collecting or creating physical samples and the, often, destructive nature of the experiments. MagIC is heavily invested in encouraging young scientists as well as established labs to cultivate workflows that facilitate contributing their data in a consistent format. This eLightning presentation includes a live demonstration of the MagIC web application, developed as a configurable container hosting an isomorphic Meteor JavaScript application, MongoDB database, and ElasticSearch search engine. Visitors can explore the MagIC Database through maps and image or plot galleries or search and filter the raw measurements and their derived hierarchy of analytical interpretations.
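A sense of the kind of filtered full-text query such an interface issues is given below. The sketch uses the Python Elasticsearch client (an 8.x-style API is assumed) with an invented index name and field layout; it does not reflect MagIC's actual schema or endpoints.

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # hypothetical endpoint

query = {
    "bool": {
        "must": [{"match": {"description": "basalt paleointensity"}}],
        "filter": [
            {"geo_bounding_box": {                    # geographic filter
                "site_location": {
                    "top_left": {"lat": 70.0, "lon": -30.0},
                    "bottom_right": {"lat": 50.0, "lon": 10.0},
                }
            }},
            {"range": {"age_ma": {"gte": 0, "lte": 5}}}   # geological-timescale filter (Ma)
        ],
    }
}

response = es.search(index="contributions", query=query, size=10)
for hit in response["hits"]["hits"]:
    print(hit["_id"], hit["_source"].get("citation"))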
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Comprehensible Presentation of Topological Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Gunther H.; Beketayev, Kenes; Bremer, Peer-Timo
2012-03-05
Topological information has proven very valuable in the analysis of scientific data. An important challenge that remains is presenting this highly abstract information in a way that is comprehensible even to those who do not have an in-depth background in topology. Furthermore, it is often desirable to combine the structural insight gained by topological analysis with complementary information, such as geometric information. We present an overview of methods that use metaphors to make topological information more accessible to non-expert users, and we demonstrate their applicability to a range of scientific data sets. With the increasingly complex output of exascale simulations, the importance of having effective means of providing a comprehensible, abstract overview of data will grow. The techniques that we present will serve as an important foundation for this purpose.
Knopman, David; Alford, Eli; Tate, Kaitlin; Long, Mark; Khachaturian, Ara S
2017-08-01
For nearly 50 years, institutional review boards (IRB) and independent ethics committees have featured local oversight as a core function of research ethics reviews. However, growing complexity in Alzheimer's clinical research suggests that current approaches to research volunteer safety are hampering the development of new therapeutics. As a partial response to this challenge, the NIH has mandated that all NIH-funded multi-site studies will use a single Institutional Review Board. This perspective describes a joint program to provide a single IRB of record (sIRB) for phases of multi-site studies. The approach follows two steps. First, an expert Scientific Review Committee (SRC) of senior researchers in the field will conduct a review principally of scientific merit, significance, feasibility, and the likelihood of meaningful results. The second step will be the IRB's regulatory and ethics review. The IRB will apply appropriate regulatory criteria for approval, including minimization of risks to subjects and risks reasonable in relation to anticipated benefits, equitable subject selection, informed consent, protections for vulnerable populations, and application of local context considerations, among others. There is a steady demand for scientific, ethical and regulatory review of planned Alzheimer's studies. As of January 15, 2017, there are nearly 400 open Phase II and III studies, industry and NIH sponsored, on disease indications affecting memory, movement and mood in the US. The effort will initially accept protocols for studies of Alzheimer's disease, dementia, and related disorders affecting memory, movement and mood. Future aims will be to provide scientific review and, where applicable, regulatory and ethical review in an international context outside North America, with sites possibly in Asia, Europe and Australia. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
Integrating entertainment and scientific rigor to facilitate a co-creation of knowledge
NASA Astrophysics Data System (ADS)
Hezel, Bernd; Broschkowski, Ephraim; Kropp, Jürgen
2013-04-01
The advancing research on the changing climate system and on its impacts has uncovered the magnitude of the expectable societal implications. It therefore created substantial awareness of the problem with stakeholders and the general public. But despite this awareness, unsustainable trends have continued untamed. For a transition towards a sustainable world it is, apparently, not enough to disseminate the "scientific truth" and wait for the people to "understand". In order to remedy this problem it is rather necessary to develop new entertaining formats to communicate the complex topic in an integrated and comprehensive way. Beyond that, it could be helpful to acknowledge that science can only generate part of the knowledge that is necessary for the transformation. The nature of the problem and its deep societal implications call for a co-creation of knowledge by science and society in order to enable change. In this spirit the RAMSES project (Reconciling Adaptation, Mitigation and Sustainable Development for Cities) follows a dialogic communication approach allowing for a co-formulation of research questions by stakeholders. A web-based audio-visual guidance application presents embedded scientific information in an entertaining and intuitive way on the basis of a "complexity on demand" approach. It aims at enabling decision making despite uncertainty and it entails a reframing of the project's research according to applied and local knowledge.
Detector Array Performance Estimates for Nuclear Resonance Fluorescence Applications
NASA Astrophysics Data System (ADS)
Johnson, Micah; Hall, J. M.; McNabb, D. P.
2012-10-01
There are a myriad of explorative efforts underway at several institutions to determine the feasibility of using photonuclear reactions to detect and assay materials of varying complexity and compositions. One photonuclear process that is being explored for several applications is nuclear resonance fluorescence (NRF). NRF is interesting because the resonant lines are unique to each isotope, the widths are sufficiently narrow, and the level densities are sufficiently low so as not to cause interference. Therefore, NRF provides a means to isotopically map containers and materials. The choice of detector array is determined by the application and the source. We will present results from a variety of application studies of an assortment of detector arrays that may be useful. Our results stem from simulation and modeling exercises and benchmarking measurements. We will discuss the data requirements from basic scientific research that enables these application studies. We will discuss our results and the future outlook of this technology.
A knotted complex scalar field for any knot
NASA Astrophysics Data System (ADS)
Bode, Benjamin; Dennis, Mark
Three-dimensional field configurations where a privileged defect line is knotted or linked have experienced an upsurge in interest, with examples including fluid mechanics, quantum wavefunctions, optics, liquid crystals and skyrmions. We describe a constructive algorithm to write down complex scalar functions of three-dimensional real space with knotted nodal lines, using trigonometric parametrizations of braids. The construction is most natural for the family of lemniscate knots, which generalizes the torus knots and the figure-8 knot, but it extends to any knot or link. The specific forms of these functions allow various topological quantities associated with the field to be chosen, such as the helicity of a knotted flow field. We will describe some applications to physical systems such as those listed above. This work was supported by the Leverhulme Trust Programme Grant "Scientific Properties of Complex Knots".
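For a flavour of what such a field looks like, the sketch below uses the simpler, well-known Milnor-type polynomial for the (2,3) torus knot rather than the braid-based construction of the abstract: points of R^3 are mapped to the unit 3-sphere by inverse stereographic projection and fed into f(u, v) = u^2 - v^3, whose zero set is a trefoil.

import numpy as np

def trefoil_field(x, y, z):
    # Complex scalar field on R^3 whose nodal (zero) line is a trefoil knot.
    r2 = x**2 + y**2 + z**2
    u = (r2 - 1 + 2j * z) / (r2 + 1)     # (u, v) satisfy |u|^2 + |v|^2 = 1
    v = 2 * (x + 1j * y) / (r2 + 1)
    return u**2 - v**3                   # (2,3) torus-knot polynomial

# Sample on a grid; the zeros of |field| trace out the knotted nodal line.
g = np.linspace(-2.0, 2.0, 64)
X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
field = trefoil_field(X, Y, Z)
print("minimum |field| on the grid:", np.abs(field).min())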
2006-10-18
Sponsoring/monitoring agency: Air Force Office of Scientific Research. Cited references include National Bureau of Standards Research Paper RP2388, pp. 51-62, and Tennekes H and Lumley JL (1972) A First Course in Turbulence, The MIT Press. Abstract fragment: ...complex and difficult to predict, even for the most basic situations. Fundamental turbulence research continues to be necessary in order to advance our...
Rapid Cost Assessment of Space Mission Concepts Through Application of Complexity-Based Cost Indices
NASA Technical Reports Server (NTRS)
Peterson, Craig E.; Cutts, James; Balint, Tibor; Hall, James B.
2008-01-01
This slide presentation reviews the development of rapid cost assessment models for the evaluation of exploration missions through the application of complexity-based cost indices. In the fall of 2004, NASA began developing 13 documents, known as "strategic roadmaps," intended to outline a strategy for space exploration over the next 30 years. The Third Strategic Roadmap, the Strategic Roadmap for Solar System Exploration, focused on strategy for robotic exploration of the Solar System. Development of the Strategic Roadmap for Solar System Exploration led to the investigation of a large variety of missions. However, the necessity of planning around scientific inquiry and budgetary constraints made it necessary for the roadmap development team to evaluate potential missions not only for scientific return but also for cost. Performing detailed cost studies for each of the large number of missions was impractical given the time constraints involved and the lack of detailed mission studies, so we developed a method of rapid cost assessment to allow preliminary analysis. It has been noted that there is a strong correlation between complexity and the cost and schedule of planetary missions. While these correlations were made after missions had been built and flown (successfully or otherwise), it seemed likely that a similar approach could provide at least some relative cost ranking. Cost estimation relationships (CERs) have been developed based on subsystem design choices. These CERs required more detailed information than was available, forcing the team to adopt a higher-level approach. Costing by analogy has been developed for small satellites; however, planetary exploration missions impose such varying spacecraft requirements that there is a lack of adequately comparable missions to use for analogy.
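To make the idea of a complexity-based relative ranking concrete, the sketch below computes a weighted complexity index from subsystem scores; the subsystems, scores, and weights are entirely hypothetical and are not the CERs or indices developed by the roadmap team.

# Hypothetical subsystem weights and 1-5 complexity scores.
weights = {"propulsion": 0.25, "power": 0.15, "thermal": 0.10,
           "avionics": 0.20, "payload": 0.30}

def complexity_index(scores):
    # Weighted mean of subsystem complexity, normalized to the range [0, 1].
    return sum(weights[s] * scores[s] for s in weights) / 5.0

missions = {
    "orbiter": {"propulsion": 3, "power": 2, "thermal": 2, "avionics": 3, "payload": 3},
    "lander":  {"propulsion": 4, "power": 3, "thermal": 4, "avionics": 4, "payload": 5},
}
for name in sorted(missions, key=lambda m: complexity_index(missions[m])):
    print(f"{name}: complexity index {complexity_index(missions[name]):.2f}")

A ranking of this kind only orders concepts relative to one another; turning it into an absolute cost estimate is exactly what the historical cost and schedule correlations mentioned above are needed for.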
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue. Control systems that operate in networks are especially affected by this issue. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on an example: the parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of the systems in a parallel mode, with various degrees of detail in its elaboration.
NASA Astrophysics Data System (ADS)
Erez, Mattan; Dally, William J.
Stream processors, like other multi-core architectures, partition their functional units and storage into multiple processing elements. In contrast to typical architectures, which contain symmetric general-purpose cores and a cache hierarchy, stream processors have a significantly leaner design. Stream processors are specifically designed for the stream execution model, in which applications have large amounts of explicit parallel computation, structured and predictable control, and memory accesses that can be performed at a coarse granularity. Applications in the streaming model are expressed in a gather-compute-scatter form, yielding programs with explicit control over transferring data to and from on-chip memory. Relying on these characteristics, which are common to many media processing and scientific computing applications, stream architectures redefine the boundary between software and hardware responsibilities, with software bearing much of the complexity required to manage concurrency, locality, and latency tolerance. Thus, stream processors have minimal control logic, consisting of fetching medium- and coarse-grained instructions and executing them directly on the many ALUs. Moreover, the on-chip storage hierarchy of stream processors is under explicit software control, as is all communication, eliminating the need for complex reactive hardware mechanisms.
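The gather-compute-scatter form mentioned above can be sketched in a few lines of NumPy; this is only an illustration of the programming pattern, not of stream-processor hardware.

import numpy as np

state = np.random.rand(1_000_000)                 # global ("off-chip") data
indices = np.random.randint(0, state.size, 4096)  # working set chosen in advance

stream = state[indices]                  # gather: bulk transfer into local storage
stream = np.sqrt(stream) * 0.5 + 0.25    # compute: independent work on each element
np.put(state, indices, stream)           # scatter: bulk transfer of results back

Because the data movement is explicit and coarse-grained, a compiler or runtime can schedule it around the computation instead of relying on reactive caching.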
Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Liang, Ke; Hong, Yang
2017-10-01
The shuffled complex evolution optimization method developed at the University of Arizona (SCE-UA) has been successfully applied in various kinds of scientific and engineering optimization applications, such as hydrological model parameter calibration, for many years. The algorithm possesses good global optimality, convergence stability and robustness. However, benchmark and real-world applications reveal the poor computational efficiency of the SCE-UA. This research aims at the parallelization and acceleration of the SCE-UA method based on powerful heterogeneous computing technology. The parallel SCE-UA is implemented on an Intel Xeon multi-core CPU (by using OpenMP and OpenCL) and an NVIDIA Tesla many-core GPU (by using OpenCL, CUDA, and OpenACC). The serial and parallel SCE-UA were tested on the Griewank benchmark function. Comparison results indicate that the parallel SCE-UA significantly improves computational efficiency compared with the original serial version. The OpenCL implementation obtains the best overall acceleration results, albeit with the most complex source code. The parallel SCE-UA has bright prospects for application in real-world problems.
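The benchmark and the coarse-grained parallelism are easy to reproduce in outline. The sketch below defines the Griewank function and evaluates a population of candidate points in parallel with Python's multiprocessing pool; it is not the authors' OpenMP/OpenCL/CUDA SCE-UA implementation, only the shape of the workload.

import numpy as np
from multiprocessing import Pool

def griewank(x):
    # Griewank benchmark: f(x) = 1 + sum(x_i^2)/4000 - prod(cos(x_i / sqrt(i))).
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    population = rng.uniform(-600, 600, size=(1000, 10))   # candidate points
    with Pool() as pool:                                    # parallel objective evaluations
        fitness = pool.map(griewank, population)
    print("best objective value:", min(fitness))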
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
Enabling the transition towards Earth Observation Science 2.0
NASA Astrophysics Data System (ADS)
Mathieu, Pierre-Philippe; Desnos, Yves-Louis
2015-04-01
Science 2.0 refers to the rapid and systematic changes in doing Research and organising Science driven by the rapid advances in ICT and digital technologies combined with a growing demand to do Science for Society (actionable research) and in Society (co-design of knowledge). Nowadays, teams of researchers around the world can easily access a wide range of open data across disciplines and remotely process them on the Cloud, combining them with their own data to generate knowledge, develop information products for societal applications, and tackle complex integrative problems that could not be addressed a few years ago. Such rapid exchange of digital data is fostering a new world of data-intensive research, characterized by openness, transparency, scrutiny and traceability of results, access to large volumes of complex data, availability of community open tools, unprecedented levels of computing power, and new collaboration among researchers and new actors such as citizen scientists. The EO scientific community is now facing the challenge of responding to this new paradigm of Science 2.0 in order to make the most of the large volume of complex and diverse data delivered by the new generation of EO missions, and in particular the Sentinels. In this context, ESA - in particular within the framework of the Scientific Exploitation of Operational Missions (SEOM) element - is supporting a variety of activities in partnership with research communities to ease the transition and make the most of the data. These include the generation of new open tools and exploitation platforms, exploring new ways to exploit data on cloud-based platforms, disseminating data, building new partnerships with citizen scientists, and training the new generation of data scientists. The paper will give a brief overview of some of ESA's activities aiming to facilitate the exploitation of large amounts of data from EO missions in a collaborative, cross-disciplinary, and open way, from science to applications and education.
Complex adaptive systems: concept analysis.
Holden, Lela M
2005-12-01
The aim of this paper is to explicate the concept of complex adaptive systems through an analysis that provides a description, antecedents, consequences, and a model case from the nursing and health care literature. Life is more than atoms and molecules--it is patterns of organization. Complexity science is the latest generation of systems thinking that investigates patterns and has emerged from the exploration of the subatomic world and quantum physics. A key component of complexity science is the concept of complex adaptive systems, and active research is found in many disciplines--from biology to economics to health care. However, the research and literature related to these appealing topics have generated confusion. A thorough explication of complex adaptive systems is needed. A modified application of the methods recommended by Walker and Avant for concept analysis was used. A complex adaptive system is a collection of individual agents with freedom to act in ways that are not always totally predictable and whose actions are interconnected. Examples include a colony of termites, the financial market, and a surgical team. It is often referred to as chaos theory, but the two are not the same. Chaos theory is actually a subset of complexity science. Complexity science offers a powerful new approach--beyond merely looking at clinical processes and the skills of healthcare professionals. The use of complex adaptive systems as a framework is increasing for a wide range of scientific applications, including nursing and healthcare management research. When nursing and other healthcare managers focus on increasing connections, diversity, and interactions they increase information flow and promote creative adaptation referred to as self-organization. Complexity science builds on the rich tradition in nursing that views patients and nursing care from a systems perspective.
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
NASA Astrophysics Data System (ADS)
Pallant, Amy; Lee, Hee-Sun
2015-04-01
Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert
2013-04-20
Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it still does today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort, working directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.
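Empirical autotuning of the kind PERI pursued can be caricatured in a few lines: time a kernel over a set of candidate parameter values and keep the fastest. The kernel and parameter space below are hypothetical stand-ins, not PERI's tools.

import time
import numpy as np

def blocked_matmul(A, B, block):
    # Blocked matrix multiply; 'block' is the tunable parameter.
    n = A.shape[0]
    C = np.zeros_like(A)
    for i in range(0, n, block):
        for k in range(0, n, block):
            C[i:i+block] += A[i:i+block, k:k+block] @ B[k:k+block]
    return C

def autotune(candidates, n=512, repeats=3):
    # Time each candidate block size empirically and return the fastest.
    A, B = np.random.rand(n, n), np.random.rand(n, n)
    timings = {}
    for b in candidates:
        t0 = time.perf_counter()
        for _ in range(repeats):
            blocked_matmul(A, B, b)
        timings[b] = (time.perf_counter() - t0) / repeats
    return min(timings, key=timings.get), timings

best, timings = autotune([16, 32, 64, 128, 256])
print("fastest block size:", best)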
ERIC Educational Resources Information Center
Dinsmore, Daniel L.; Zoellner, Brian P.; Parkinson, Meghan M.; Rossi, Anthony M.; Monk, Mary J.; Vinnachi, Jenelle
2017-01-01
View change about socio-scientific issues has been well studied in the literature, but the change in the complexity of those views has not. In the current study, the change in the complexity of views about a specific scientific topic (i.e. genetically modified organisms; GMOs) and use of evidence in explaining those views was examined in relation…
Mobile applications and Virtual Observatory
NASA Astrophysics Data System (ADS)
Schaaff, A.; Jagade, S.
2015-06-01
Within a few years, smartphones and Internet tablets have become the devices of choice for accessing Web or standalone applications from everywhere, helped by the rapid growth in the bandwidth of mobile networks (e.g. 4G). Internet tablets are used to take notes during meetings or conferences, to read scientific papers in public transportation, etc. A smartphone is, for example, a way to have your data in your pocket or to control, from everywhere, the progress of a heavy workflow process. These mobile devices have sufficiently powerful hardware to run more and more complex applications for many use cases. In the field of astronomy it is possible to use these tools to access data via a simple browser, but also to develop native applications reusing libraries (written in Java for Android or Objective-C/Swift for iOS) developed for desktops/laptops. We describe the experiments conducted in this domain at CDS and IUCAA, considering mobile applications both as native applications and as Web applications.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
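At its core, a scientific workflow is a DAG of dependent tasks. The sketch below shows a minimal dependency-ordered execution loop in plain Python; the task names are invented and this is not the Pegasus API, only the structure that systems like Pegasus manage (with data staging, monitoring, and performance modeling layered on top).

# Minimal sketch of a scientific-workflow DAG and its topological execution.
workflow = {
    "preprocess": [],                      # task: list of dependencies
    "simulate":   ["preprocess"],
    "analyze":    ["simulate"],
    "visualize":  ["analyze"],
}

def run(task):
    print(f"running {task} ...")           # placeholder for the real command

done = set()
while len(done) < len(workflow):
    ready = [t for t, deps in workflow.items()
             if t not in done and all(d in done for d in deps)]
    for t in ready:                         # tasks whose dependencies are satisfied
        run(t)
        done.add(t)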
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...
2015-07-14
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
Reach, Gérard
2016-01-01
According to the concept developed by Thomas Kuhn, a scientific revolution occurs when scientists encounter a crisis due to the observation of anomalies that cannot be explained by the generally accepted paradigm within which scientific progress has thereto been made: a scientific revolution can therefore be described as a change in paradigm aimed at solving a crisis. Described herein is an application of this concept to the medical realm, starting from the reflection that during the past decades, the medical community has encountered two anomalies that, by their frequency and consequences, represent a crisis in the system, as they deeply jeopardize the efficiency of care: nonadherence of patients who do not follow the prescriptions of their doctors, and clinical inertia of doctors who do not comply with good practice guidelines. It is proposed that these phenomena are caused by a contrast between, on the one hand, the complex thought of patients and doctors that sometimes escapes rationalization, and on the other hand, the simplification imposed by the current paradigm of medicine dominated by the technical rationality of evidence-based medicine. It is suggested therefore that this crisis must provoke a change in paradigm, inventing a new model of care defined by an ability to take again into account, on an individual basis, the complex thought of patients and doctors. If this overall analysis is correct, such a person-centered care model should represent a solution to the two problems of patients’ nonadherence and doctors’ clinical inertia, as it tackles their cause. These considerations may have important implications for the teaching and the practice of medicine. PMID:27103790
Samuels, Anthony; O'Driscoll, Colmán; Allnutt, Stephen
2007-12-01
This paper describes psychiatric and psychological defences to murder where the defence of insanity is not applicable. The charges of murder and manslaughter are outlined. Self-defence, sane and insane automatism, provocation, diminished responsibility, duress, necessity and novel defences are discussed. The complexities of psychological and psychiatric expert evidence are highlighted as well as the fact that legal decisions are not always consistent with medical or scientific theory. It is concluded that this is a controversial and evolving area of mental health law and mental health professionals have an educative role and a responsibility to provide testimony that is supported by the best possible evidence.
Antibacterial Nanoparticles in Endodontics: A Review.
Shrestha, Annie; Kishen, Anil
2016-10-01
A major challenge in root canal treatment is the inability of the current cleaning and shaping procedures to eliminate bacterial biofilms surviving within the anatomic complexities and uninstrumented portions of the root canal system. Nanoparticles with their enhanced and unique physicochemical properties, such as ultrasmall sizes, large surface area/mass ratio, and increased chemical reactivity, have led research toward new prospects of treating and preventing dental infections. This article presents a comprehensive review on the scientific knowledge that is available on the application of antibacterial nanoparticles in endodontics. The application of nanoparticles in the form of solutions for irrigation, medication, and as an additive within sealers/restorative materials has been evaluated to primarily improve the antibiofilm efficacy in root canal and restorative treatments. In addition, antibiotic or photosensitizer functionalized nanoparticles have been proposed recently to provide more potent antibacterial efficacy. The increasing interest in this field warrants sound research based on scientific and clinical collaborations to emphasize the near future potential of nanoparticles in clinical endodontics. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Constructing Scientific Applications from Heterogeneous Resources
NASA Technical Reports Server (NTRS)
Schichting, Richard D.
1995-01-01
A new model for high-performance scientific applications in which such applications are implemented as heterogeneous distributed programs or, equivalently, meta-computations, is investigated. The specific focus of this grant was a collaborative effort with researchers at NASA and the University of Toledo to test and improve Schooner, a software interconnection system, and to explore the benefits of increased user interaction with existing scientific applications.
NASA Astrophysics Data System (ADS)
Abdi, A.
2012-12-01
Science and science education benefit from easy access to data, yet geophysical data sets are often large, complex and difficult to share. The difficulty of sharing data and imagery inhibits both collaboration and the use of real data in educational applications. The dissemination of data products through web maps is a very efficient and user-friendly method for students, the public and the science community to gain insight and understanding from data. Few research groups provide direct access to their data, let alone map-based visualizations. By building upon current GIS infrastructure with web mapping technologies, like ArcGIS Server, scientific groups, institutions and agencies can enhance the value of their GIS investments. The advantages of web maps for serving data products are many; existing web-mapping technology allows complex GIS analysis to be shared across the Internet, and can be easily scaled from a few users to millions. This poster highlights the features of an interactive web map developed at the Polar Geophysics Group at the Lamont-Doherty Earth Observatory of Columbia University that provides a visual representation of, and access to, data products that resulted from the group's recently concluded AGAP project (http://pgg.ldeo.columbia.edu). The AGAP project collected more than 120,000 line km of new aerogeophysical data using two Twin Otter aircraft. Data included ice penetrating radar, magnetometer, gravimeter and laser altimeter measurements. The web map is based upon ArcGIS Viewer for Flex, which is a configurable client application built on the ArcGIS API for Flex that works seamlessly with ArcGIS Server 10. The application can serve a variety of raster and vector file formats through the Data Interoperability for Server, which eliminates data sharing barriers across numerous file formats. The ability of the application to serve large datasets is limited only by the availability of appropriate hardware. ArcGIS is a proprietary product, but there are a few data portals in the earth sciences that have a map interface using open-source products such as MapServer and OpenLayers, the most notable being the NASA IceBridge Data Portal. Indeed, with the widespread availability of web mapping technology, the scientific community should move in this direction when disseminating its data.
Translational research in infectious disease: current paradigms and challenges ahead
Fontana, Judith M.; Alexander, Elizabeth; Salvatore, Mirella
2012-01-01
In recent years, the biomedical community has witnessed a rapid scientific and technological evolution following the development and refinement of high-throughput methodologies. Concurrently and consequentially, the scientific perspective has changed from the reductionist approach of meticulously analyzing the fine details of a single component of biology, to the “holistic” approach of broadmindedly examining the globally interacting elements of biological systems. The emergence of this new way of thinking has brought about a scientific revolution in which genomics, proteomics, metabolomics and other “omics” have become the predominant tools by which large amounts of data are amassed, analyzed and applied to complex questions of biology that were previously unsolvable. This enormous transformation of basic science research and the ensuing plethora of promising data, especially in the realm of human health and disease, have unfortunately not been followed by a parallel increase in the clinical application of this information. On the contrary, the number of new potential drugs in development has been steadily decreasing, suggesting the existence of roadblocks that prevent the translation of promising research into medically relevant therapeutic or diagnostic application. In this paper we will review, in a non-inclusive fashion, several recent scientific advancements in the field of translational research, with a specific focus on how they relate to infectious disease. We will also present a current picture of the limitations and challenges that exist for translational research, as well as ways that have been proposed by the National Institutes of Health to improve the state of this field. PMID:22633095
Code of Federal Regulations, 2012 CFR
2012-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Application by scientific... DISTILLED SPIRITS PLANTS Administrative and Miscellaneous Provisions Alternate Methods Or Procedures and Experimental Operations § 19.35 Application by scientific institutions and colleges of learning for...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Application by scientific... DISTILLED SPIRITS PLANTS Administrative and Miscellaneous Provisions Alternate Methods Or Procedures and Experimental Operations § 19.35 Application by scientific institutions and colleges of learning for...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Application by scientific... DISTILLED SPIRITS PLANTS Administrative and Miscellaneous Provisions Alternate Methods Or Procedures and Experimental Operations § 19.35 Application by scientific institutions and colleges of learning for...
AstroGrid-D: Grid technology for astronomical science
NASA Astrophysics Data System (ADS)
Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve
2011-02-01
We present the status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of computing and storage facilities and makes it possible to schedule or monitor compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). We show from these examples how grid execution improves, for example, the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 is focused on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.
NASA Technical Reports Server (NTRS)
White, D. R.
1976-01-01
A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, contributed by LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
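One of the listed problems, outlier/anomaly detection in multivariate catalogs, can be sketched with an off-the-shelf method. The example below runs scikit-learn's IsolationForest on a synthetic catalog; the columns and injected anomalies are invented for illustration and this is not an LSST pipeline.

import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic multivariate "catalog": columns might stand for magnitudes/colors.
rng = np.random.default_rng(1)
catalog = rng.normal(size=(10_000, 5))
catalog[:20] += 8.0                        # inject a handful of anomalous sources

clf = IsolationForest(contamination=0.005, random_state=1).fit(catalog)
flags = clf.predict(catalog)               # -1 marks candidate outliers
print("flagged objects:", np.where(flags == -1)[0][:10])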
Basics of Compounding: Clinical Pharmaceutics, Part 2.
Allen, Loyd V
2016-01-01
This article represents part 2 of a 2-part article on the topic of clinical pharmaceutics. Pharmaceutics is relevant far beyond the pharmaceutical industry, compounding, and the laboratory. Pharmaceutics can be used to solve many clinical problems in medication therapy. A pharmacist's knowledge of the physicochemical aspects of drugs and drug products should help the patient, physician, and healthcare professionals resolve issues in the increasingly complex world of modern medicine. Part 1 of this series of articles discussed incompatibilities that can directly affect a clinical outcome and utilized pharmaceutics case examples of the application and importance of clinical pharmaceutics covering different characteristics. Part 2 continues to illustrate the scientific principles and clinical effects involved in clinical pharmaceutics. Also covered in this article are many of the scientific principles typical to patient care. Copyright© by International Journal of Pharmaceutical Compounding, Inc.
76 FR 52314 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... invite comments on the question of whether instruments of equivalent scientific value, for the purposes for which the instruments shown below are intended to be used, are being manufactured in the United...
76 FR 15945 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... invite comments on the question of whether instruments of equivalent scientific value, for the purposes for which the instruments shown below are intended to be used, are being manufactured in the United...
NASA Astrophysics Data System (ADS)
Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats
2014-06-01
Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on a cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
Science Classroom Inquiry (SCI) Simulations: A Novel Method to Scaffold Science Learning
Peffer, Melanie E.; Beckler, Matthew L.; Schunn, Christian; Renken, Maggie; Revak, Amanda
2015-01-01
Science education is progressively more focused on employing inquiry-based learning methods in the classroom and increasing scientific literacy among students. However, due to time and resource constraints, many classroom science activities and laboratory experiments focus on simple inquiry, with a step-by-step approach to reach predetermined outcomes. The science classroom inquiry (SCI) simulations were designed to give students real life, authentic science experiences within the confines of a typical classroom. The SCI simulations allow students to engage with a science problem in a meaningful, inquiry-based manner. Three discrete SCI simulations were created as website applications for use with middle school and high school students. For each simulation, students were tasked with solving a scientific problem through investigation and hypothesis testing. After completion of the simulation, 67% of students reported a change in how they perceived authentic science practices, specifically related to the complex and dynamic nature of scientific research and how scientists approach problems. Moreover, 80% of the students who did not report a change in how they viewed the practice of science indicated that the simulation confirmed or strengthened their prior understanding. Additionally, we found a statistically significant positive correlation between students’ self-reported changes in understanding of authentic science practices and the degree to which each simulation benefitted learning. Since SCI simulations were effective in promoting both student learning and student understanding of authentic science practices with both middle and high school students, we propose that SCI simulations are a valuable and versatile technology that can be used to educate and inspire a wide range of science students on the real-world complexities inherent in scientific study. PMID:25786245
Intelligent Design and the Creationism/Evolution Controversy
NASA Astrophysics Data System (ADS)
Scott, E. C.
2004-12-01
"Intelligent Design" (ID) is a new form of creationism that emerged after legal decisions in the 1980s hampered the inclusion of "creation science" in the public school curriculum. To avoid legal challenge, proponents claim agnosticism regarding the identity of the intelligent agent, which could be material (such as highly intelligent terrestrials) or transcendental (God). ID consists of a scientific/scholarly effort, and a politico-religious movement of "cultural renewal." Intelligent design is supposedly detectable through the application of Michael Behe's "irreducible complexity" concept and/or William Dembski's concept of "complex specified information". ID's claims amount to, first, that "Darwinism" (vaguely defined) is incapable of providing an adequate mechanism for evolution, and second (subsequently), that evolution did not occur. Although scientific ideas not infrequently are slow to be accepted, in the 20 years since ID appeared, there is no evidence of it being used to solve problems in biology. Even if the scientific/scholarly part of ID has been a failure, the "cultural renewal" part of ID has been a success. This social and political aspect of ID seeks "restoration" of a theistic sensibility in American culture to replace what supporters consider an overemphasis on secularism. In the last few years, in several states, legislators have introduced legislation promoting ID (to date, unsuccessfully) and an addendum to the 2001 federal education bill conference committee report (the "Santorum amendment") is being used to promote the teaching of ID in public schools. Perhaps because ID has no actual content other than antievolutionism, ID proponents contend that pre-college teachers should teach wweaknesses of evolutionw or "evidence against evolutionw - largely warmed-over arguments from creation science - even though professional scientists do not recognize these as valid scientific claims.
Science classroom inquiry (SCI) simulations: a novel method to scaffold science learning.
Peffer, Melanie E; Beckler, Matthew L; Schunn, Christian; Renken, Maggie; Revak, Amanda
2015-01-01
Science education is progressively more focused on employing inquiry-based learning methods in the classroom and increasing scientific literacy among students. However, due to time and resource constraints, many classroom science activities and laboratory experiments focus on simple inquiry, with a step-by-step approach to reach predetermined outcomes. The science classroom inquiry (SCI) simulations were designed to give students real life, authentic science experiences within the confines of a typical classroom. The SCI simulations allow students to engage with a science problem in a meaningful, inquiry-based manner. Three discrete SCI simulations were created as website applications for use with middle school and high school students. For each simulation, students were tasked with solving a scientific problem through investigation and hypothesis testing. After completion of the simulation, 67% of students reported a change in how they perceived authentic science practices, specifically related to the complex and dynamic nature of scientific research and how scientists approach problems. Moreover, 80% of the students who did not report a change in how they viewed the practice of science indicated that the simulation confirmed or strengthened their prior understanding. Additionally, we found a statistically significant positive correlation between students' self-reported changes in understanding of authentic science practices and the degree to which each simulation benefitted learning. Since SCI simulations were effective in promoting both student learning and student understanding of authentic science practices with both middle and high school students, we propose that SCI simulations are a valuable and versatile technology that can be used to educate and inspire a wide range of science students on the real-world complexities inherent in scientific study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt
Scientific communities have been at the forefront of adopting new computing technologies and methodologies. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
Analysis of Photonic Networks for a Chip Multiprocessor Using Scientific Applications
2009-05-01
Analysis of Photonic Networks for a Chip Multiprocessor Using Scientific Applications. Gilbert Hendry, Shoaib Kamil, Aleksandr Biberman, Johnnie... electronic networks-on-chip warrants investigating real application traces on functionally comparable photonic and electronic network designs. We... network can achieve 75× improvement in energy efficiency for synthetic benchmarks and up to 37× improvement for real scientific applications
2017-09-29
Report: The Military-Industrial-Scientific Complex and the Rise of New Powers: Conceptual, Theoretical and Methodological Contributions and the Brazilian Case. Report Term: 0-Other. Email: aminvielle@ucsd.edu. Distribution Statement: 1-Approved for public...
Complexity of Secondary Scientific Data Sources and Students' Argumentative Discourse
ERIC Educational Resources Information Center
Kerlin, Steven C.; McDonald, Scott P.; Kelly, Gregory J.
2010-01-01
This study examined the learning opportunities provided to students through the use of complex geological data supporting scientific inquiry. Through analysis of argumentative discourse in a high school Earth science classroom, uses of US Geological Survey (USGS) data were contrasted with uses of geoscience textbook data. To examine these…
Numerical methods for engine-airframe integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murthy, S.N.B.; Paynter, G.C.
1986-01-01
Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel method to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic, supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.
Generalized friendship paradox in complex networks: The case of scientific collaboration
NASA Astrophysics Data System (ADS)
Eom, Young-Ho; Jo, Hang-Hyun
2014-04-01
The friendship paradox states that your friends have on average more friends than you have. Does the paradox "hold" for other individual characteristics like income or happiness? To address this question, we generalize the friendship paradox for arbitrary node characteristics in complex networks. By analyzing two coauthorship networks of Physical Review journals and Google Scholar profiles, we find that the generalized friendship paradox (GFP) holds at the individual and network levels for various characteristics, including the number of coauthors, the number of citations, and the number of publications. The origin of the GFP is shown to be rooted in positive correlations between degree and characteristics. As a fruitful application of the GFP, we suggest effective and efficient sampling methods for identifying high characteristic nodes in large-scale networks. Our study on the GFP can shed light on understanding the interplay between network structure and node characteristics in complex networks.
Generalized friendship paradox in complex networks: The case of scientific collaboration
Eom, Young-Ho; Jo, Hang-Hyun
2014-01-01
The friendship paradox states that your friends have on average more friends than you have. Does the paradox “hold” for other individual characteristics like income or happiness? To address this question, we generalize the friendship paradox for arbitrary node characteristics in complex networks. By analyzing two coauthorship networks of Physical Review journals and Google Scholar profiles, we find that the generalized friendship paradox (GFP) holds at the individual and network levels for various characteristics, including the number of coauthors, the number of citations, and the number of publications. The origin of the GFP is shown to be rooted in positive correlations between degree and characteristics. As a fruitful application of the GFP, we suggest effective and efficient sampling methods for identifying high characteristic nodes in large-scale networks. Our study on the GFP can shed light on understanding the interplay between network structure and node characteristics in complex networks. PMID:24714092
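The node-level version of the generalized friendship paradox lends itself to a compact check. The sketch below uses networkx and a synthetic preferential-attachment graph with a made-up "citations" attribute correlated with degree; it is an illustration of the paradox, not the coauthorship data analysed in the paper.

```python
# Minimal check of the generalized friendship paradox (GFP) on toy data.
import networkx as nx
import random

random.seed(0)
G = nx.barabasi_albert_graph(n=200, m=3)  # stand-in for a coauthorship network

# Assign a characteristic (e.g. citation count) correlated with degree,
# since the paper attributes the GFP to degree-characteristic correlations.
citations = {v: G.degree(v) * 10 + random.randint(0, 20) for v in G.nodes}
nx.set_node_attributes(G, citations, "citations")

def holds_gfp(G, v, attr):
    """True if the mean characteristic of v's neighbours exceeds v's own."""
    nbrs = list(G.neighbors(v))
    if not nbrs:
        return False
    mean_nbr = sum(G.nodes[u][attr] for u in nbrs) / len(nbrs)
    return mean_nbr > G.nodes[v][attr]

frac = sum(holds_gfp(G, v, "citations") for v in G.nodes) / G.number_of_nodes()
print(f"Fraction of nodes whose coauthors have more citations on average: {frac:.2f}")
```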
Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Ye; Ma, Xiaosong; Liu, Qing Gary
2015-01-01
Parallel application benchmarks are indispensable for evaluating/optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, reconfigure, and often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks. They retain the original applications' performance characteristics, in particular the relative performance across platforms.
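The two-step idea named in the abstract, phase identification followed by statistical regeneration of event parameters, can be illustrated on a toy trace. The sketch below is not the APPRIME tool: the trace format, the windowed change-point rule, and the normal fit per phase are all assumptions chosen only to make the workflow concrete.

```python
# Toy illustration: split a trace into phases, then regenerate synthetic events.
import random
import statistics

random.seed(1)

# A fake communication trace: message sizes in bytes, with two distinct phases.
trace = [random.gauss(1_000, 50) for _ in range(200)] + \
        [random.gauss(50_000, 2_000) for _ in range(200)]

def identify_phases(events, window=20, jump_factor=5.0):
    """Split the trace where the windowed mean jumps by more than jump_factor."""
    boundaries, start = [], 0
    for i in range(window, len(events) - window, window):
        before = statistics.mean(events[i - window:i])
        after = statistics.mean(events[i:i + window])
        if after > jump_factor * before or before > jump_factor * after:
            boundaries.append((start, i))
            start = i
    boundaries.append((start, len(events)))
    return boundaries

def regenerate(events, phases, length_per_phase=100):
    """Sample synthetic events from a per-phase normal fit (the 'benchmark')."""
    synthetic = []
    for lo, hi in phases:
        mu = statistics.mean(events[lo:hi])
        sigma = statistics.stdev(events[lo:hi])
        synthetic += [random.gauss(mu, sigma) for _ in range(length_per_phase)]
    return synthetic

phases = identify_phases(trace)
benchmark_trace = regenerate(trace, phases)
print(f"Identified {len(phases)} phases; generated {len(benchmark_trace)} synthetic events")
```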
Code of Federal Regulations, 2011 CFR
2011-04-01
... experimental or research operations. (a) Application requirements. A university, college, or scientific... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Application by scientific institutions and colleges of learning for experimental or research operations. 19.35 Section 19.35 Alcohol...
Lessons Over a Decade of Writing About Scientific Data
NASA Astrophysics Data System (ADS)
Beitler, J.; Collins, S. R.; Naranjo, L.
2006-12-01
For eleven years, the NASA Distributed Active Archive Centers (DAACs) have sponsored writing about research and applications using NASA remote sensing data. The publication, NASA: Supporting Earth System Science, is premised on stimulating scientific curiosity and leading a broad audience carefully into the challenging puzzles that researchers address with the help of remote-sensing data. The National Snow and Ice Data Center, one of the NASA DAACs, has handled the challenge of telling these stories across multiple science disciplines, researching and writing ten to twelve articles each year. Our approach centers on quality science. We preserve its complexity, and attract and stimulate audience interest by placing scientific endeavor at center stage. We propose to share our experiences, successes, and strategies with others who are interested in telling stories that highlight the essential nature of data in the scientific enterprise. We have learned how to write engagingly about abstract, long-term research projects involving a lot of math and physics, in ways that appeal to both scientific and lay readers. We will also talk about the skills and resources that we consider necessary to write informative data stories. We welcome leads on scientific research topics that use NASA remote sensing data. Talk to us at the conference, or write us at nasadaacs@nsidc.org. View our eleventh annual publication as well as past stories online at http://nasadaacs.eos.nasa.gov/articles/index.html, or stop by the NASA booth to pick up a color copy.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offer different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Collow, T. W.; Rose, B.
2016-12-01
Scientific datasets are generated from various sources and platforms but they are typically produced either by earth observation systems or by modelling systems. These are widely used for monitoring, simulating, or analyzing measurements that are associated with physical, chemical, and biological phenomena over the ocean, atmosphere, or land. A significant subset of scientific datasets stores values directly as rasters or in a form that can be rasterized, that is, with a value at every cell of a regular grid spanning the spatial extent of the dataset. Government agencies like NOAA, NASA, EPA, and USGS produce large volumes of near real-time, forecast, and historical data that drive climatological and meteorological studies, and underpin operations ranging from weather prediction to monitoring sea ice loss. Modern science is computationally intensive because of the availability of an enormous amount of scientific data, the adoption of data-driven analysis, and the need to share these datasets and research results with the public. ArcGIS as a platform is sophisticated and capable of handling such a complex domain. We'll discuss constructs and capabilities applicable to multidimensional gridded data that can be conceptualized as a multivariate space-time cube. Building on the concept of a two-dimensional raster, a typical multidimensional raster dataset could contain several "slices" within the same spatial extent. We will share a case from the NOAA Climate Forecast System Reanalysis (CFSR) multidimensional data as an example of how large collections of rasters can be efficiently organized and managed through a data model within a geodatabase called a "Mosaic dataset" and dynamically transformed and analyzed using raster functions. A raster function is a lightweight, raster-valued transformation defined over a mixed set of raster and scalar inputs. That means, just like any tool, you can provide a raster function with input parameters. It enables dynamic processing of only the data that's being displayed on the screen or requested by an application. We will present the dynamic processing and analysis of CFSR data using chains of raster functions and share it as a dynamic multidimensional image service. This workflow and these capabilities can be easily applied to any scientific data formats that are supported in a mosaic dataset.
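The notion of a chain of raster functions that processes only the window of data actually requested can be sketched without any GIS software. The Python fragment below is a plain numpy illustration of that lazy, chained evaluation; it does not use the ArcGIS mosaic dataset or raster function API, and the function names and grid are invented for the example.

```python
# Conceptual sketch of lazily chained "raster functions"; not the ArcGIS API.
import numpy as np

def kelvin_to_celsius(block: np.ndarray) -> np.ndarray:
    return block - 273.15

def anomaly(block: np.ndarray, climatology: float) -> np.ndarray:
    return block - climatology

class RasterFunctionChain:
    """Holds a list of transformations and applies them lazily per request."""
    def __init__(self, source: np.ndarray):
        self.source = source          # e.g. one temperature "slice"
        self.functions = []           # (callable, kwargs) pairs

    def add(self, func, **kwargs):
        self.functions.append((func, kwargs))
        return self

    def read(self, row_slice, col_slice) -> np.ndarray:
        """Process only the requested window (what a map display would ask for)."""
        block = self.source[row_slice, col_slice]
        for func, kwargs in self.functions:
            block = func(block, **kwargs)
        return block

# A fake 720x1440 global temperature grid in Kelvin.
grid = 273.15 + 15 + 10 * np.random.rand(720, 1440)

chain = RasterFunctionChain(grid).add(kelvin_to_celsius).add(anomaly, climatology=15.0)
window = chain.read(slice(100, 110), slice(200, 210))   # only 10x10 cells processed
print(window.shape, float(window.mean()))
```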
From Invention to Innovation: Risk Analysis to Integrate One Health Technology in the Dairy Farm
Lombardo, Andrea; Boselli, Carlo; Amatiste, Simonetta; Ninci, Simone; Frazzoli, Chiara; Dragone, Roberto; De Rossi, Alberto; Grasso, Gerardo; Mantovani, Alberto; Brajon, Giovanni
2017-01-01
Current Hazard Analysis Critical Control Points (HACCP) approaches mainly fit for food industry, while their application in primary food production is still rudimentary. The European food safety framework calls for science-based support to the primary producers’ mandate for legal, scientific, and ethical responsibility in food supply. The multidisciplinary and interdisciplinary project ALERT pivots on the development of the technological invention (BEST platform) and application of its measurable (bio)markers—as well as scientific advances in risk analysis—at strategic points of the milk chain for time and cost-effective early identification of unwanted and/or unexpected events of both microbiological and toxicological nature. Health-oriented innovation is complex and subject to multiple variables. Through field activities in a dairy farm in central Italy, we explored individual components of the dairy farm system to overcome concrete challenges for the application of translational science in real life and (veterinary) public health. Based on an HACCP-like approach in animal production, the farm characterization focused on points of particular attention (POPAs) and critical control points to draw a farm management decision tree under the One Health view (environment, animal health, food safety). The analysis was based on the integrated use of checklists (environment; agricultural and zootechnical practices; animal health and welfare) and laboratory analyses of well water, feed and silage, individual fecal samples, and bulk milk. The understanding of complex systems is a condition to accomplish true innovation through new technologies. BEST is a detection and monitoring system in support of production security, quality and safety: a grid of its (bio)markers can find direct application in critical points for early identification of potential hazards or anomalies. The HACCP-like self-monitoring in primary production is feasible, as well as the biomonitoring of live food producing animals as sentinel population for One Health. PMID:29218304
From Invention to Innovation: Risk Analysis to Integrate One Health Technology in the Dairy Farm.
Lombardo, Andrea; Boselli, Carlo; Amatiste, Simonetta; Ninci, Simone; Frazzoli, Chiara; Dragone, Roberto; De Rossi, Alberto; Grasso, Gerardo; Mantovani, Alberto; Brajon, Giovanni
2017-01-01
Current Hazard Analysis Critical Control Points (HACCP) approaches mainly fit for food industry, while their application in primary food production is still rudimentary. The European food safety framework calls for science-based support to the primary producers' mandate for legal, scientific, and ethical responsibility in food supply. The multidisciplinary and interdisciplinary project ALERT pivots on the development of the technological invention (BEST platform) and application of its measurable (bio)markers-as well as scientific advances in risk analysis-at strategic points of the milk chain for time and cost-effective early identification of unwanted and/or unexpected events of both microbiological and toxicological nature. Health-oriented innovation is complex and subject to multiple variables. Through field activities in a dairy farm in central Italy, we explored individual components of the dairy farm system to overcome concrete challenges for the application of translational science in real life and (veterinary) public health. Based on an HACCP-like approach in animal production, the farm characterization focused on points of particular attention (POPAs) and critical control points to draw a farm management decision tree under the One Health view (environment, animal health, food safety). The analysis was based on the integrated use of checklists (environment; agricultural and zootechnical practices; animal health and welfare) and laboratory analyses of well water, feed and silage, individual fecal samples, and bulk milk. The understanding of complex systems is a condition to accomplish true innovation through new technologies. BEST is a detection and monitoring system in support of production security, quality and safety: a grid of its (bio)markers can find direct application in critical points for early identification of potential hazards or anomalies. The HACCP-like self-monitoring in primary production is feasible, as well as the biomonitoring of live food producing animals as sentinel population for One Health.
Three-Dimensional Printing Articular Cartilage: Recapitulating the Complexity of Native Tissue.
Guo, Ting; Lembong, Josephine; Zhang, Lijie Grace; Fisher, John P
2017-06-01
In the past few decades, the field of tissue engineering combined with rapid prototyping (RP) techniques has been successful in creating biological substitutes that mimic tissues. Its applications in regenerative medicine have drawn research efforts from various scientific fields, diagnostics, and clinical translation to therapies. While some areas of therapeutics are well developed, such as skin replacement, many others such as cartilage repair can still greatly benefit from tissue engineering and RP due to the low success and/or inefficiency of currently existing, often surgical, treatments. Through fabrication of complex scaffolds and development of advanced materials, RP provides a new avenue for cartilage repair. Computer-aided design and three-dimensional (3D) printing allow the fabrication of modeled cartilage scaffolds for repair and regeneration of damaged cartilage tissues. Specifically, the various processes of 3D printing, both cellular and acellular techniques, will be discussed in detail, covering the different materials, geometries, and operational printing conditions for the development of tissue-engineered articular cartilage. Finally, we conclude with some insights on future applications and challenges related to this technology, especially using 3D printing techniques to recapitulate the complexity of native structure for advanced cartilage regeneration.
Exploiting the Use of Social Networking to Facilitate Collaboration in the Scientific Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppock, Edrick G.
The goal of this project was to exploit social networking to facilitate scientific collaboration. The project objective was to research and identify scientific collaboration styles that are best served by social networking applications and to model the most effective social networking applications to substantiate how social networking can support scientific collaboration. To achieve this goal and objective, the project was to develop an understanding of the types of collaborations conducted by scientific researchers, through classification, data analysis and identification of unique collaboration requirements. Another technical objective in support of this goal was to understand the current state of technology in collaboration tools. In order to test hypotheses about which social networking applications effectively support scientific collaboration the project was to create a prototype scientific collaboration system. The ultimate goal for testing the hypotheses and research of the project was to refine the prototype into a functional application that could effectively facilitate and grow collaboration within the U.S. Department of Energy (DOE) research community.
Assessing what to address in science communication.
Bruine de Bruin, Wändi; Bostrom, Ann
2013-08-20
As members of a democratic society, individuals face complex decisions about whether to support climate change mitigation, vaccinations, genetically modified food, nanotechnology, geoengineering, and so on. To inform people's decisions and public debate, scientific experts at government agencies, nongovernmental organizations, and other organizations aim to provide understandable and scientifically accurate communication materials. Such communications aim to improve people's understanding of the decision-relevant issues, and if needed, promote behavior change. Unfortunately, existing communications sometimes fail when scientific experts lack information about what people need to know to make more informed decisions or what wording people use to describe relevant concepts. We provide an introduction for scientific experts about how to use mental models research with intended audience members to inform their communication efforts. Specifically, we describe how to conduct interviews to characterize people's decision-relevant beliefs or mental models of the topic under consideration, identify gaps and misconceptions in their knowledge, and reveal their preferred wording. We also describe methods for designing follow-up surveys with larger samples to examine the prevalence of beliefs as well as the relationships of beliefs with behaviors. Finally, we discuss how findings from these interviews and surveys can be used to design communications that effectively address gaps and misconceptions in people's mental models in wording that they understand. We present applications to different scientific domains, showing that this approach leads to communications that improve recipients' understanding and ability to make informed decisions.
Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welch, Gregory Francis; Zhang, Jinghe
2014-06-10
Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today’s tools for real-time complex system operations are mostly based on steady state models, unable to capture the dynamic nature and too slow to prevent system failures. We developed advanced Kalman filtering techniques and formulated dynamic state estimation using Kalman filtering to capture complex system dynamics in aid of real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between Computer Science and Power Systems Engineering, by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
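For readers unfamiliar with the underlying machinery, a minimal linear Kalman filter (predict and update steps) is sketched below. The report itself concerns advanced variants for nonlinear, discontinuous, large-scale systems; the constant-velocity model, noise covariances, and measurements here are toy assumptions used only to show the basic recursion.

```python
# Minimal linear Kalman filter sketch (predict/update) on a toy tracking problem.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (position, velocity)
H = np.array([[1.0, 0.0]])               # we only measure position
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.5]])                    # measurement noise covariance

x = np.array([[0.0], [1.0]])             # initial state estimate
P = np.eye(2)                            # initial estimate covariance

def kalman_step(x, P, z):
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
for t in range(1, 11):
    z = np.array([[t * 1.0 + rng.normal(0, 0.5)]])   # noisy position measurement
    x, P = kalman_step(x, P, z)

print("estimated position, velocity:", x.ravel())
```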
Regulatory Agencies and Food Safety
Chapman, R. A.; Morrison, A. B.
1966-01-01
Prior to Confederation, food control legislation in Canada consisted of only a few simple laws governing the quality, grading, packing and inspection of certain staple foods. The Inland Revenue Act of 1875 provided the first real control in Canada over adulteration of liquor, foods and drugs. Since then, food legislation has evolved in scope and complexity as the industries involved have developed, as consumers have become better informed, and as scientific advances have provided a sound basis for regulations. Present regulations under the Food and Drugs Act are intended to give consumers broad protection against health hazards and fraud in the production, manufacture, labelling, packaging, advertising, and sale of foods. This principle is well illustrated by present requirements for the control of pesticide residues, chemical additives, and the addition of vitamins to foods. In today's era of rapid technological change, application of current scientific knowledge to the food industry obviously involves the possibility of hazards to health. Regulatory agencies with responsibility for food safety must, therefore, fully utilize scientific knowledge in order to reduce the risks involved to a minimum. PMID:5905951
Thematic Continuities: Talking and Thinking about Adaptation in a Socially Complex Classroom
ERIC Educational Resources Information Center
Ash, Doris
2008-01-01
In this study I rely on sociocultural views of learning and teaching to describe how fifth- sixth-grade students in a Fostering a Community of Learners (FCL) classroom gradually adopted scientific ideas and language in a socially complex classroom. Students practiced talking science together, using everyday, scientific, and hybrid discourses as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bales, Benjamin B; Barrett, Richard F
In almost all modern scientific applications, developers achieve the greatest performance gains by tuning algorithms, communication systems, and memory access patterns, while leaving low level instruction optimizations to the compiler. Given the increasingly varied and complicated x86 architectures, the value of these optimizations is unclear, and, due to time and complexity constraints, it is difficult for many programmers to experiment with them. In this report we explore the potential gains of these 'last mile' optimization efforts on an AMD Barcelona processor, providing readers with relevant information so that they can decide whether investment in the presented optimizations is worthwhile.
Hypertext and hypermedia systems in information retrieval
NASA Technical Reports Server (NTRS)
Kaye, K. M.; Kuhn, A. D.
1992-01-01
This paper opens with a brief history of hypertext and hypermedia in the context of information management during the 'information age.' Relevant terms are defined and the approach of the paper is explained. Linear and hypermedia information access methods are contrasted. A discussion of hyperprogramming in the handling of complex scientific and technical information follows. A selection of innovative hypermedia systems is discussed. An analysis of the Clinical Practice Library of Medicine NASA STI Program hypermedia application is presented. The paper concludes with a discussion of the NASA STI Program's future hypermedia project plans.
Policy-Aware Content Reuse on the Web
NASA Astrophysics Data System (ADS)
Seneviratne, Oshani; Kagal, Lalana; Berners-Lee, Tim
The Web allows users to share their work very effectively, leading to the rapid re-use and remixing of content on the Web including text, images, and videos. Scientific research data, social networks, blogs, photo-sharing sites, and other such applications, known collectively as the Social Web, contain large amounts of increasingly complex information. Such information from several Web pages can be very easily aggregated, mashed up and presented in other Web pages. Content generation of this nature inevitably leads to many copyright and license violations, motivating research into effective methods to detect and prevent such violations.
1981-09-01
organized the paperwork system, including finances, travel, filing, and programs in a highly independent and responsible fashion. Thanks are also due... three-dimensional transformation procedure for arbitrary non-orthogonal coordinate systems, for the purpose of the three-dimensional turbulent... transformation procedure for arbitrary non-orthogonal coordinate systems so as to acquire the generality in the application for elliptic flows (for the square
SpaceX CRS-14 What's On Board Science Briefing
2018-04-01
Craig Kundrot, director, NASA's Space Life and Physical Science Research and Applications, speaks to members of the media in the Kennedy Space Center Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 4:30 p.m. EST, on April 2, 2018. The SpaceX Falcon 9 rocket will launch the company's 14th Commercial Resupply Services mission to the space station.
NASA Technical Reports Server (NTRS)
1972-01-01
The space shuttle fact sheet is presented. Four important reasons for the program are considered to be: (1) It is the only meaningful new manned space program which can be accomplished on a modest budget. (2) It is needed to make space operations less complex and costly. (3) It is required for scientific applications in civilian and military activities. (4) It will encourage greater international participation in space flight. The space shuttle and orbiter configurations are discussed along with the missions. The scope of the study and the costs of each contract for the major contractor are listed.
Current status and future directions for in situ transmission electron microscopy
Taheri, Mitra L.; Stach, Eric A.; Arslan, Ilke; Crozier, P.A.; Kabius, Bernd C.; LaGrange, Thomas; Minor, Andrew M.; Takeda, Seiji; Tanase, Mihaela; Wagner, Jakob B.; Sharma, Renu
2016-01-01
This review article discusses the current and future possibilities for the application of in situ transmission electron microscopy to reveal synthesis pathways and functional mechanisms in complex and nanoscale materials. The findings of a group of scientists, representing academia, government labs and private sector entities (predominantly commercial vendors) during a workshop, held at the Center for Nanoscale Science and Technology, National Institute of Standards and Technology (CNST-NIST), are discussed. We provide a comprehensive review of the scientific needs and future instrument and technique developments required to meet them. PMID:27566048
Network Theory: A Primer and Questions for Air Transportation Systems Applications
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.
2004-01-01
A new understanding (with potential applications to air transportation systems) has emerged in the past five years in the scientific field of networks. This development emerges in large part because we now have a new laboratory for developing theories about complex networks: The Internet. The premise of this new understanding is that most complex networks of interest, both of nature and of human contrivance, exhibit a fundamentally different behavior than thought for over two hundred years under classical graph theory. Classical theory held that networks exhibited random behavior, characterized by normal (e.g., Gaussian or Poisson) degree distributions of the connectivity between nodes by links. The new understanding turns this idea on its head: networks of interest exhibit scale-free (or small world) degree distributions of connectivity, characterized by power law distributions. The implications of scale-free behavior for air transportation systems include the potential that some behaviors of complex system architectures might be analyzed through relatively simple approximations of local elements of the system. For air transportation applications, this presentation proposes a framework for constructing topologies (architectures) that represent the relationships between mobility, flight operations, aircraft requirements, and airspace capacity, and the related externalities in airspace procedures and architectures. The proposed architectures or topologies may serve as a framework for posing comparative and combinative analyses of performance, cost, security, environmental, and related metrics.
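The contrast between the classical random-network picture and scale-free degree distributions can be made concrete with standard graph generators. The sketch below uses networkx; the node counts, edge parameters, and the degree threshold for "hubs" are arbitrary choices for illustration.

```python
# Contrast Poisson-like and power-law (scale-free) degree distributions.
import networkx as nx

n = 5000
random_net = nx.gnp_random_graph(n, p=6 / n, seed=1)       # Poisson-like degrees
scale_free_net = nx.barabasi_albert_graph(n, m=3, seed=1)  # power-law tail

def degree_tail(G, threshold):
    """Fraction of nodes whose degree exceeds a threshold (the 'hubs')."""
    return sum(1 for _, d in G.degree() if d > threshold) / G.number_of_nodes()

for name, G in [("random", random_net), ("scale-free", scale_free_net)]:
    degrees = [d for _, d in G.degree()]
    print(f"{name:>10}: mean degree {sum(degrees)/n:.1f}, "
          f"max degree {max(degrees)}, "
          f"fraction with degree > 30: {degree_tail(G, 30):.4f}")
# The scale-free network shows a few very high-degree hubs that the random
# network essentially never produces -- the signature of a power-law tail.
```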
The structure of control and data transfer management system for the GAMMA-400 scientific complex
NASA Astrophysics Data System (ADS)
Arkhangelskiy, A. I.; Bobkov, S. G.; Serdin, O. V.; Gorbunov, M. S.; Topchiev, N. P.
2016-02-01
A description of the control and data transfer management system for scientific instrumentation involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment that supports the functioning of the scientific instrumentation and the satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands, as well as program commands in the form of 16-bit code words, which are transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, engineering solutions, and electronic component selection has been verified by experimental testing of the prototype of the GAMMA-400 scientific complex under laboratory conditions.
Lay Americans' views of why scientists disagree with each other.
Johnson, Branden B; Dieckmann, Nathan F
2017-10-01
A survey experiment assessed response to five explanations of scientific disputes: problem complexity, self-interest, values, competence, and process choices (e.g. theories and methods). A US lay sample (n = 453) did not distinguish interests from values, nor competence from process, as explanations of disputes. Process/competence was rated most likely and interests/values least; all, on average, were deemed likely to explain scientific disputes. Latent class analysis revealed distinct subgroups varying in their explanation preferences, with a more complex latent class structure for participants who had heard of scientific disputes in the past. Scientific positivism and judgments of science's credibility were the strongest predictors of latent class membership, controlling for scientific reasoning, political ideology, confidence in choice, scenario, education, gender, age, and ethnicity. The lack of distinction observed overall between different explanations, as well as within classes, raises challenges for further research on explanations of scientific disputes people find credible and why.
What can we learn from PISA?: Investigating PISA's approach to scientific literacy
NASA Astrophysics Data System (ADS)
Schwab, Cheryl Jean
This dissertation is an investigation of the relationship between the multidimensional conception of scientific literacy and its assessment. The Programme for International Student Assessment (PISA), developed under the auspices of the Organization for Economic Cooperation and Development (OECD), offers a unique opportunity to evaluate the assessment of scientific literacy. PISA developed a continuum of performance for scientific literacy across three competencies (i.e., process, content, and situation). Foundational to the interpretation of PISA science assessment is PISA's definition of scientific literacy, which I argue incorporates three themes drawn from history: (a) scientific way of thinking, (b) everyday relevance of science, and (c) scientific literacy for all students. Three coordinated studies were conducted to investigate the validity of PISA science assessment and offer insight into the development of items to assess scientific literacy. Multidimensional models of the internal structure of the PISA 2003 science items were found not to reflect the complex character of PISA's definition of scientific literacy. Although the multidimensional models across the three competencies significantly decreased the G2 statistic from the unidimensional model, high correlations between the dimensions suggest that the dimensions are similar. A cognitive analysis of student verbal responses to PISA science items revealed that students were using competencies of scientific literacy, but the competencies were not elicited by the PISA science items at the depth required by PISA's definition of scientific literacy. Although student responses contained only knowledge of scientific facts and simple scientific concepts, students were using more complex skills to interpret and communicate their responses. Finally, the investigation of different scoring approaches and item response models illustrated different ways to interpret student responses to assessment items. These analyses highlighted the complexities of students' responses to the PISA science items and the use of the ordered partition model to accommodate different but equal item responses. The results of the three investigations are used to discuss ways to improve the development and interpretation of PISA's science items.
75 FR 44940 - Withdrawal of Application for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... Entry of Scientific Instruments Applications may be examined between 8:30 A.M. and 5:00 P.M. in Room... determine, inter alia, whether instruments of equivalent scientific value, for the purposes for which the... scientific instrument. They noted that the instrument will be used at a show/demonstration. As noted in the...
76 FR 52936 - Withdrawal of Application for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... Entry of Scientific Instruments Applications may be examined between 8:30 a.m. and 5 p.m. in Room 3720... determine, inter alia, whether instruments of equivalent scientific value, for the purposes for which the... scientific instrument. They noted that the instrument will be cleared through Customs with duty paid by the...
Injectable, cellular-scale optoelectronics with applications for wireless optogenetics.
Kim, Tae-il; McCall, Jordan G; Jung, Yei Hwan; Huang, Xian; Siuda, Edward R; Li, Yuhang; Song, Jizhou; Song, Young Min; Pao, Hsuan An; Kim, Rak-Hwan; Lu, Chaofeng; Lee, Sung Dan; Song, Il-Sun; Shin, Gunchul; Al-Hasani, Ream; Kim, Stanley; Tan, Meng Peun; Huang, Yonggang; Omenetto, Fiorenzo G; Rogers, John A; Bruchas, Michael R
2013-04-12
Successful integration of advanced semiconductor devices with biological systems will accelerate basic scientific discoveries and their translation into clinical technologies. In neuroscience generally, and in optogenetics in particular, the ability to insert light sources, detectors, sensors, and other components into precise locations of the deep brain yields versatile and important capabilities. Here, we introduce an injectable class of cellular-scale optoelectronics that offers such features, with examples of unmatched operational modes in optogenetics, including completely wireless and programmed complex behavioral control over freely moving animals. The ability of these ultrathin, mechanically compliant, biocompatible devices to afford minimally invasive operation in the soft tissues of the mammalian brain foreshadows applications in other organ systems, with potential for broad utility in biomedical science and engineering.
NASA Astrophysics Data System (ADS)
Varlamova, Larisa; Abramov, Dmitrii; Golovin, Arsenii; Seledkina, Ekaterina
2017-05-01
The fluorescence method is now considered one of the promising methods for early diagnosis of malignant diseases of the respiratory organs and the gastrointestinal tract (GIT). Applying the autofluorescence phenomenon in endoscopy makes it possible to obtain a fluorescent image of the mucosa, which shows the difference in the intensity of the autofluorescence of healthy and affected tissue in the green and red regions of the spectrum. The result of this work is to determine, on the basis of scientific research and prototyping, the feasibility of creating a fluorescence video endoscope and to develop a fluorescent light source (FLU illuminator) for a video endoscopy complex. The solution of this problem is based on the method of studying biological objects in the living state.
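One common way to make such a spectral shift visible is a per-pixel red-to-green intensity ratio map. The sketch below is a toy illustration of that general idea with synthetic images and an assumed threshold; it is not the processing pipeline of the FLU illuminator or video endoscope described above.

```python
# Toy red/green autofluorescence ratio map with synthetic channel images.
import numpy as np

rng = np.random.default_rng(42)
h, w = 256, 256
green = rng.uniform(0.6, 1.0, (h, w))          # healthy mucosa: strong green AF
red = rng.uniform(0.1, 0.3, (h, w))

# Simulate a lesion: green autofluorescence drops, red stays roughly constant.
green[100:150, 100:150] *= 0.3

ratio = red / np.clip(green, 1e-6, None)        # red-to-green ratio per pixel
suspicious = ratio > 0.8                        # assumed decision threshold

print("suspicious pixels:", int(suspicious.sum()), "of", h * w)
```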
Spectrophotometric determination of substrate-borne polyacrylamide.
Lu, Jianhang; Wu, Laosheng
2002-08-28
Polyacrylamides (PAMs) have wide application in many industries and in agriculture. Scientific research and industrial applications manifested a need for a method that can quantify substrate-borne PAM. The N-bromination method (a PAM analytical technique based on N-bromination of amide groups and spectrophotometric determination of the formed starch-triiodide complex), which was originally developed for determining PAM in aqueous solutions, was modified to quantify substrate-borne PAM. In the modified method, the quantity of substrate-borne PAM was converted to a concentration of starch-triiodide complex in aqueous solution that was then measured by spectrophotometry. The method sensitivity varied with substrates due to sorption of reagents and reaction intermediates on the substrates. Therefore, separate calibration for each substrate was required. Results from PAM samples in sand, cellulose, organic matter burnt soils, and clay minerals showed that this method had good accuracy and reproducibility. The PAM recoveries ranged from 95.8% to 103.7%, and the relative standard deviations (n = 4) were <7.5% in all cases. The optimum range of PAM in each sample is 10-80 microg. The technique can serve as an effective tool in improving PAM application and facilitating PAM-related research.
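The per-substrate calibration step implied above can be sketched as a simple linear fit of absorbance against known PAM mass, inverted to convert a sample reading into micrograms. All numbers below are invented for illustration and are not data from the paper.

```python
# Per-substrate calibration: absorbance vs PAM mass, then invert for a sample.
import numpy as np

# Calibration standards for one substrate (e.g. sand): PAM mass (ug) vs absorbance.
pam_ug = np.array([10, 20, 40, 60, 80], dtype=float)
absorbance = np.array([0.12, 0.23, 0.46, 0.68, 0.91])

slope, intercept = np.polyfit(pam_ug, absorbance, deg=1)   # linear calibration

def pam_from_absorbance(a: float) -> float:
    """Invert the calibration line; valid only inside the 10-80 ug range."""
    return (a - intercept) / slope

sample_abs = 0.55
print(f"Estimated PAM in sample: {pam_from_absorbance(sample_abs):.1f} ug")
```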
Design of pressure-driven microfluidic networks using electric circuit analogy.
Oh, Kwang W; Lee, Kangsun; Ahn, Byungwook; Furlani, Edward P
2012-02-07
This article reviews the application of electric circuit methods for the analysis of pressure-driven microfluidic networks with an emphasis on concentration- and flow-dependent systems. The application of circuit methods to microfluidics is based on the analogous behaviour of hydraulic and electric circuits with correlations of pressure to voltage, volumetric flow rate to current, and hydraulic to electric resistance. Circuit analysis enables rapid predictions of pressure-driven laminar flow in microchannels and is very useful for designing complex microfluidic networks in advance of fabrication. This article provides a comprehensive overview of the physics of pressure-driven laminar flow, the formal analogy between electric and hydraulic circuits, applications of circuit theory to microfluidic network-based devices, recent development and applications of concentration- and flow-dependent microfluidic networks, and promising future applications. The lab-on-a-chip (LOC) and microfluidics community will gain insightful ideas and practical design strategies for developing unique microfluidic network-based devices to address a broad range of biological, chemical, pharmaceutical, and other scientific and technical challenges.
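The hydraulic-electric analogy can be shown with a few lines of code: each channel becomes a resistor, pressure drop plays the role of voltage, and flow rate plays the role of current. The resistance formula for a rectangular channel below is a commonly used approximation (valid when the channel height is smaller than its width); the geometry, viscosity, and applied pressure are arbitrary illustration values.

```python
# Hydraulic circuit analogy: channels as resistors, dP = Q * R_hydraulic.
MU_WATER = 1.0e-3  # dynamic viscosity, Pa.s

def rect_channel_resistance(L, w, h, mu=MU_WATER):
    """Approximate hydraulic resistance of a rectangular microchannel (h < w)."""
    return 12.0 * mu * L / (w * h**3 * (1.0 - 0.63 * h / w))

def parallel(*resistances):
    return 1.0 / sum(1.0 / r for r in resistances)

# A source channel feeding two parallel branches (all dimensions in metres).
r_inlet = rect_channel_resistance(L=10e-3, w=200e-6, h=50e-6)
r_a = rect_channel_resistance(L=20e-3, w=100e-6, h=50e-6)
r_b = rect_channel_resistance(L=20e-3, w=200e-6, h=50e-6)

r_total = r_inlet + parallel(r_a, r_b)      # series + parallel, as in a circuit
delta_p = 10e3                              # 10 kPa applied across the network
q_total = delta_p / r_total                 # "Ohm's law": Q = dP / R
# The flow splits inversely to branch resistance, like current in resistors.
q_a = q_total * (r_b / (r_a + r_b))
q_b = q_total * (r_a / (r_a + r_b))

print(f"Total flow {q_total*1e9:.2f} uL/s, branch A {q_a*1e9:.2f}, branch B {q_b*1e9:.2f}")
```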
Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model
NASA Astrophysics Data System (ADS)
Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi
The Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to concrete data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.
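The core idea, tying each semantic term to concrete data fields and to an algorithm that evaluates it on simulation output, can be sketched with a small data structure. The class layout and the example "hot spot" concept below are assumptions made for illustration; they are not the actual SCV data structures.

```python
# Rough sketch of a vocabulary entry linking a concept to data and an algorithm.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Concept:
    name: str                                     # e.g. "hot spot"
    data_fields: List[str]                        # simulation variables it refers to
    evaluate: Callable[[Dict[str, float]], bool]  # algorithm giving it meaning


# A simple concept: a cell is a "hot spot" if its temperature exceeds a limit.
hot_spot = Concept(
    name="hot spot",
    data_fields=["temperature"],
    evaluate=lambda cell: cell["temperature"] > 900.0,
)

simulation_output = [
    {"cell_id": 1, "temperature": 450.0},
    {"cell_id": 2, "temperature": 950.0},
    {"cell_id": 3, "temperature": 1020.0},
]

flagged = [c["cell_id"] for c in simulation_output if hot_spot.evaluate(c)]
print("Cells matching the 'hot spot' concept:", flagged)
```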
2012-01-01
Background: Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Methods: Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. Results: After modification by dropping two indicators that showed poor measures in the measurement models’ quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of ‘transparency’, ‘participation’, ‘scientific rigour’ and ‘reasonableness’. Conclusions: The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies. PMID:22856325
Fischer, Katharina E
2012-08-02
Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. After modification by dropping two indicators that showed poor measures in the measurement models' quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of 'transparency', 'participation', 'scientific rigour' and 'reasonableness'. The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies.
Health behavior-related indicator of lifestyle: application in the ELSA-Brasil study.
Patrão, Ana Luísa; Almeida, Maria-da-Conceição C; Alvim, Sheila; Chor, Dora; Aquino, Estela M L
2018-05-01
Various behaviors are considered health enhancing. Nevertheless, according to the current scientific literature, four health behaviors are considered particularly risky in view of their association with a group of chronic diseases: 1) smoking; 2) excessive alcohol consumption; 3) poor diet; and 4) lack of physical activity. Theoretically, it should be possible to make improvements to one's health by maximizing the number of healthy behaviors and minimizing the unhealthy ones. However, in reality, the different behaviors interconnect to create more complex lifestyles. Therefore, the objective of this paper is to present the construction of a lifestyle indicator based on health behaviors selected in the ELSA-Brasil study. This indicator revealed two lifestyles: less healthy and healthier lifestyles. The model proved adequate and was confirmed using latent class analysis (LCA). Agreement between the indicator and the LCA results was 83.2%, with a kappa coefficient of 0.65. Women were more likely to have a healthier lifestyle than men, reinforcing the scientific consistency of the indicator, since this finding is in agreement with data from the scientific literature. The indicator created to define lifestyle was found to have scientific consistency and validity; therefore, its use can be recommended for future population-based studies concerning the promotion of health and healthy lifestyles.
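The agreement statistics quoted above can be reproduced for any pair of classifications with a few lines of code. The sketch below computes percent agreement and Cohen's kappa for two hypothetical binary lifestyle labels; it illustrates the arithmetic only and is not the ELSA-Brasil analysis itself.

```python
import numpy as np

def agreement_and_kappa(a, b):
    """Percent agreement and Cohen's kappa for two binary classifications."""
    a, b = np.asarray(a), np.asarray(b)
    p_o = np.mean(a == b)                      # observed agreement
    p_yes = np.mean(a == 1) * np.mean(b == 1)  # chance agreement, class 1
    p_no = np.mean(a == 0) * np.mean(b == 0)   # chance agreement, class 0
    p_e = p_yes + p_no
    kappa = (p_o - p_e) / (1 - p_e)
    return 100 * p_o, kappa

# Hypothetical labels: 1 = healthier lifestyle, 0 = less healthy.
indicator = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
lca_class = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(agreement_and_kappa(indicator, lca_class))
```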
The State of the NIH BRAIN Initiative.
Koroshetz, Walter; Gordon, Joshua; Adams, Amy; Beckel-Mitchener, Andrea; Churchill, James; Farber, Gregory; Freund, Michelle; Gnadt, Jim; Hsu, Nina; Langhals, Nicholas; Lisanby, Sarah; Liu, Guoying; Peng, Grace; Ramos, Khara; Steinmetz, Michael; Talley, Edmund; White, Samantha
2018-06-19
The BRAIN Initiative® arose from a grand challenge to "accelerate the development and application of new technologies that will enable researchers to produce dynamic pictures of the brain that show how individual brain cells and complex neural circuits interact at the speed of thought." The BRAIN Initiative is a public-private effort focused on the development and use of powerful tools for acquiring fundamental insights about how information processing occurs in the central nervous system. As the Initiative enters its fifth year, NIH has supported over 500 principal investigators, who have answered the Initiative's challenge via hundreds of publications describing novel tools, methods, and discoveries that address the Initiative's seven scientific priorities. We describe scientific advances produced by individual labs, multi-investigator teams, and entire consortia that, over the coming decades, will produce more comprehensive and dynamic maps of the brain, deepen our understanding of how circuit activity can produce a rich tapestry of behaviors, and lay the foundation for understanding how its circuitry is disrupted in brain disorders. Much more work remains to bring this vision to fruition, and NIH continues to look to the diverse scientific community, from mathematics, to physics, chemistry, engineering, neuroethics, and neuroscience, to ensure that the greatest scientific benefit arises from this unique research Initiative. Copyright © 2018 the authors.
Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A; Calhoun, Vince D
2014-01-01
Neuroscientists increasingly need to work with big data in order to derive meaningful results in their field. Collecting, organizing and analyzing this data can be a major hurdle on the road to scientific discovery. This hurdle can be lowered using the same technologies that are currently revolutionizing the way that cultural and social media sites represent and share information with their users. Web application technologies and standards such as RESTful webservices, HTML5 and high-performance in-browser JavaScript engines are being utilized to vastly improve the way that the world accesses and shares information. The neuroscience community can also benefit tremendously from these technologies. We present here a web application that allows users to explore and request the complex datasets that need to be shared among the neuroimaging community. The COINS (Collaborative Informatics and Neuroimaging Suite) Data Exchange uses web application technologies to facilitate data sharing in three phases: Exploration, Request/Communication, and Download. This paper will focus on the first phase, and how intuitive exploration of large and complex datasets is achieved using a framework that centers around asynchronous client-server communication (AJAX) and also exposes a powerful API that can be utilized by other applications to explore available data. First opened to the neuroscience community in August 2012, the Data Exchange has already provided researchers with over 2500 GB of data.
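The exploration phase described above is essentially paginated, filtered querying of a web service. The sketch below shows that pattern with the Python requests library (assumed available); the endpoint, parameters and field names are hypothetical and are not the actual COINS Data Exchange API.

```python
# Client-side exploration of a paginated REST endpoint, in the spirit of the
# Data Exchange's exploration phase. URL and parameters are hypothetical.
import requests

def explore(base_url, filters, page_size=100):
    offset = 0
    while True:
        resp = requests.get(base_url,
                            params={**filters, "limit": page_size,
                                    "offset": offset},
                            timeout=30)
        resp.raise_for_status()
        page = resp.json()
        if not page:
            break
        yield from page
        offset += page_size

# Example usage (hypothetical service and filter names):
# for record in explore("https://example.org/datasets",
#                       {"modality": "MRI", "min_subjects": 50}):
#     print(record["name"], record["size_gb"])
```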
Rubel, Oliver; Bowen, Benjamin P
2018-01-01
Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial-innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.
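One of the ideas described above, keeping derived products together with the provenance of how they were computed, can be illustrated with plain HDF5. The sketch below uses h5py (assumed available) and is not the BASTet API; the group names, attributes and toy analysis are illustrative only.

```python
# Minimal sketch of storing a derived analysis together with its provenance in
# HDF5, in the spirit of integrated data storage and data provenance.
import json
import h5py
import numpy as np

msi_cube = np.random.rand(64, 64, 1000)       # stand-in for an MSI data cube
tic_image = msi_cube.sum(axis=2)              # a simple derived product

with h5py.File("analysis_store.h5", "w") as f:
    f.create_dataset("raw/msi_cube", data=msi_cube, compression="gzip")
    d = f.create_dataset("derived/tic_image", data=tic_image)
    # Record how the derived product was produced.
    d.attrs["analysis"] = "total_ion_current"
    d.attrs["inputs"] = "/raw/msi_cube"
    d.attrs["parameters"] = json.dumps({"axis": 2, "operation": "sum"})
```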
Developing science gateways for drug discovery in a grid environment.
Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra
2016-01-01
Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, there is a growing need to provide the life-sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.
Quantum-limited Terahertz detection without liquid cryogens
NASA Technical Reports Server (NTRS)
2005-01-01
Under this contract, we have successfully designed, fabricated and tested a revolutionary new type of detector for Terahertz (THz) radiation, the tunable antenna-coupled intersubband Terahertz (TACIT) detector. The lowest-noise THz detectors used in the astrophysics community require cooling to temperatures below 4 K. This deep cryogenic requirement forces satellites launched for THz-observing missions to include either large volumes of liquid Helium, complex cryocoolers, or both. Cryogenic requirements thus add significantly to the cost, complexity and mass of satellites and limit the duration of their missions. It is hence desirable to develop new detector technologies with less stringent cryogenic requirements. Such detectors will not only be important in space-based astrophysics, but will also respond to a growing demand for THz technology for earth-based scientific and commercial applications.
[M.S. Gilyarov's Scientific School of Soil Zoology].
Chesnova, L V
2005-01-01
The role of M.S. Gilyarov's scientific school in developing the subject and methodology of soil zoology, a new complex discipline formed in the mid-20th century, is considered. The establishment and evolution of the school are divided into historical periods. Historical analysis of the science demonstrates the creative continuity and further development of the basic laws and technical approaches included in the teacher's scientific program.
ERIC Educational Resources Information Center
Chiarello, Fabio; Castellano, Maria Gabriella
2016-01-01
In this paper the authors report different experiences in the use of board games as learning tools for complex and abstract scientific concepts such as Quantum Mechanics, Relativity or nano-biotechnologies. In particular we describe "Quantum Race," designed for the introduction of Quantum Mechanical principles, "Lab on a chip,"…
Protection from Induced Space Environments Effects on the International Space Station
NASA Technical Reports Server (NTRS)
Soares, Carlos; Mikatarian, Ron; Stegall, Courtney; Schmidl, Danny; Huang, Alvin; Olsen, Randy; Koontz, Steven
2010-01-01
The International Space Station (ISS) is one of the largest, most complex multinational scientific projects in history, and protection from induced space environments effects is critical to its long-duration mission as well as to the health of the vehicle and the safety of on-orbit operations. This paper discusses some of the unique challenges that were encountered during the design, assembly and operation of the ISS and how they were resolved. Examples are provided to illustrate the issues and the risk mitigation strategies that were developed to resolve them. Of particular importance are issues related to the interaction of multiple spacecraft, as in the case of the ISS and Visiting Vehicles transporting crew, hardware elements, cargo and scientific payloads. These strategies are applicable to the development of future long-duration space systems, not only during design, but also during assembly and operation of these systems.
ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it
NASA Astrophysics Data System (ADS)
Lecocq, Thomas; Megies, Tobias; Krischer, Lion; Sales de Andrade, Elliott; Barsch, Robert; Beyreuther, Moritz
2016-04-01
ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides * read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, * a comprehensive signal processing toolbox tuned to the needs of seismologists, * integrated access to all large data centers, web services and databases, and * convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology where research code often has to be translated to stable and production ready environments. It furthermore offers many freely available high quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions to name a few examples. Additionally it sparked the development of several more specialized packages slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
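A typical use of the interfaces summarized above might look like the sketch below: fetch waveforms from an FDSN data center, apply the signal processing toolbox, and write the result in a standard format. The station, channel and time window are arbitrary examples, and a network connection is assumed.

```python
# Typical ObsPy usage sketch: fetch waveforms from an FDSN web service,
# preprocess them, and write them out in a standard format.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2015-09-16T22:54:32")   # example start time
st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

st.detrend("demean")
st.filter("bandpass", freqmin=0.05, freqmax=1.0)  # signal processing toolbox
st.write("anmo_bhz.mseed", format="MSEED")        # unified format support
```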
ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it
NASA Astrophysics Data System (ADS)
Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; Beyreuther, M.
2015-12-01
ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, a comprehensive signal processing toolbox tuned to the needs of seismologists, integrated access to all large data centers, web services and databases, and convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often has to be translated to stable and production-ready environments. It furthermore offers many freely available, high-quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
Learning Physics-based Models in Hydrology under the Framework of Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
Karpatne, A.; Kumar, V.
2017-12-01
Generative adversarial networks (GANs), which have been highly successful in a number of applications involving large volumes of labeled and unlabeled data such as computer vision, offer huge potential for modeling the dynamics of physical processes that have been traditionally studied using simulations of physics-based models. While conventional physics-based models use labeled samples of input/output variables for model calibration (estimating the right parametric forms of relationships between variables) or data assimilation (identifying the most likely sequence of system states in dynamical systems), there is a greater opportunity to explore the full power of machine learning (ML) methods (e.g., GANs) for studying physical processes currently suffering from large knowledge gaps, e.g. ground-water flow. However, success in this endeavor requires a principled way of combining the strengths of ML methods with physics-based numerical models that are founded on a wealth of scientific knowledge. This is especially important in scientific domains like hydrology where the number of data samples is small (relative to Internet-scale applications such as image recognition, where machine learning methods have found great success), and the physical relationships are complex (high-dimensional) and non-stationary. We will present a series of methods for guiding the learning of GANs using physics-based models, e.g., by using the outputs of physics-based models as input data to the generator-learner framework, and by using physics-based models as generators trained using validation data in the adversarial learning framework. These methods are being developed under the broad paradigm of theory-guided data science that we are developing to integrate scientific knowledge with data science methods for accelerating scientific discovery.
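The flavour of such theory-guided learning can be illustrated by a loss function that combines a data-misfit term with a penalty on the residual of a simple physical constraint. The toy mass-balance example below is only a conceptual sketch in plain NumPy; it is not the authors' GAN formulation, and the bucket-model quantities are made up.

```python
# Illustrative theory-guided loss: data misfit plus a penalty for violating a
# simple physical constraint (mass balance in a toy groundwater bucket model).
import numpy as np

def physics_residual(storage, recharge, discharge, dt=1.0):
    # Mass balance: dS/dt should equal recharge - discharge.
    ds_dt = np.diff(storage) / dt
    return ds_dt - (recharge[:-1] - discharge[:-1])

def theory_guided_loss(pred_storage, obs_storage, recharge, discharge,
                       lam=10.0):
    data_term = np.mean((pred_storage - obs_storage) ** 2)
    phys_term = np.mean(physics_residual(pred_storage, recharge,
                                         discharge) ** 2)
    return data_term + lam * phys_term

# Toy usage with synthetic series:
t = np.arange(50)
recharge = 1.0 + 0.1 * np.sin(0.3 * t)
discharge = np.full_like(recharge, 0.9)
obs = np.cumsum(recharge - discharge)            # physically consistent "observations"
pred = obs + np.random.normal(0, 0.2, obs.size)  # noisy model output
print(theory_guided_loss(pred, obs, recharge, discharge))
```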
Understanding the Performance and Potential of Cloud Computing for Scientific Applications
Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...
2015-02-19
Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud Computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in the context of price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.
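At its simplest, the kind of raw-performance comparison described above boils down to timing representative kernels on each machine type. The sketch below is only a stand-in for the dedicated tools used in the study (CacheBench, iperf, HPL); the matrix and array sizes are arbitrary.

```python
# Small micro-benchmark sketch for comparing raw compute and memory
# throughput across machine types (e.g., different cloud instances).
import time
import numpy as np

def time_it(fn, repeats=3):
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

n = 2048
a, b = np.random.rand(n, n), np.random.rand(n, n)
t_gemm = time_it(lambda: a @ b)
print(f"matmul: {2 * n**3 / t_gemm / 1e9:.1f} GFLOP/s")

x = np.random.rand(50_000_000)           # ~0.4 GB working set
t_copy = time_it(lambda: x.copy())
print(f"memory copy: {2 * x.nbytes / t_copy / 1e9:.1f} GB/s")  # read + write
```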
Oak Regeneration: A Knowledge Synthesis
H. Michael Rauscher; David L. Loftis; Charles E. McGee; Christopher V. Worth
1997-01-01
This scientific literature is presented in hypertext software; to view it, you must download and install the hypertext software. Abstract: The scientific literature concerning oak regeneration problems is lengthy, complex, paradoxical, and often perplexing. Despite a large scientific literature and numerous conference...
An Array Library for Microsoft SQL Server with Astrophysical Applications
NASA Astrophysics Data System (ADS)
Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.
2012-09-01
Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out of the box commercial database systems perform very well in scientific applications from the prospective of data storage optimization, data retrieval, and memory management but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fix-sized arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to be able to be seamlessly integrated with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: The Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
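The underlying idea, moving array data into the database so that computation can happen next to it, can be illustrated with a much simpler stand-in: serializing a NumPy array into a binary column and reading it back. The sketch below uses SQLite from the Python standard library and is not the SQL Server Array Library itself.

```python
# Storing a numeric array in a relational table as a binary blob and
# restoring it on read. SQLite stands in for SQL Server here.
import sqlite3
import numpy as np

def to_blob(arr):
    return arr.astype(np.float64).tobytes()

def from_blob(blob, shape):
    return np.frombuffer(blob, dtype=np.float64).reshape(shape)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE density (id INTEGER PRIMARY KEY, nx INT, ny INT, "
             "data BLOB)")

grid = np.random.rand(128, 128)               # e.g. a density slice
conn.execute("INSERT INTO density (nx, ny, data) VALUES (?, ?, ?)",
             (grid.shape[0], grid.shape[1], to_blob(grid)))

nx, ny, blob = conn.execute("SELECT nx, ny, data FROM density").fetchone()
restored = from_blob(blob, (nx, ny))
print(np.allclose(grid, restored))
```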
Westmoreland, Carl; Carmichael, Paul; Dent, Matt; Fentem, Julia; MacKay, Cameron; Maxwell, Gavin; Pease, Camilla; Reynolds, Fiona
2010-01-01
Assuring consumer safety without the generation of new animal data is currently a considerable challenge. However, through the application of new technologies and the further development of risk-based approaches for safety assessment, we remain confident it is ultimately achievable. For many complex, multi-organ consumer safety endpoints, the development, evaluation and application of new, non-animal approaches is hampered by a lack of biological understanding of the underlying mechanistic processes involved. The enormity of this scientific challenge should not be underestimated. To tackle this challenge a substantial research programme was initiated by Unilever in 2004 to critically evaluate the feasibility of a new conceptual approach based upon the following key components: 1.Developing new, exposure-driven risk assessment approaches. 2.Developing new biological (in vitro) and computer-based (in silico) predictive models. 3.Evaluating the applicability of new technologies for generating data (e.g. "omics", informatics) and for integrating new types of data (e.g. systems approaches) for risk-based safety assessment. Our research efforts are focussed in the priority areas of skin allergy, cancer and general toxicity (including inhaled toxicity). In all of these areas, a long-term investment is essential to increase the scientific understanding of the underlying biology and molecular mechanisms that we believe will ultimately form a sound basis for novel risk assessment approaches. Our research programme in these priority areas consists of in-house research as well as Unilever-sponsored academic research, involvement in EU-funded projects (e.g. Sens-it-iv, Carcinogenomics), participation in cross-industry collaborative research (e.g. Colipa, EPAA) and ongoing involvement with other scientific initiatives on non-animal approaches to risk assessment (e.g. UK NC3Rs, US "Human Toxicology Project" consortium).
Efficient Use of Distributed Systems for Scientific Applications
NASA Technical Reports Server (NTRS)
Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques
2000-01-01
Distributed computing has been regarded as the future of high performance computing. Nationwide high-speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency of up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes, with the number of elements ranging from 30,269 for the Barth5 mesh to 11,451 for the Barth4 mesh. Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamics simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom. This results from the complexity of the various components of the airfoils, which require fine-grained meshing for accuracy. Additional information is contained in the original.
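The load-balancing goal behind heterogeneity-aware partitioning is that faster processors should receive proportionally larger subdomains. The sketch below illustrates only that proportional-assignment idea; it is not PART's parallel simulated annealing, and the processor speeds are made up.

```python
# Assign mesh elements to processors in proportion to their relative speeds,
# so faster processors receive larger subdomains.
def proportional_partition(num_elements, processor_speeds):
    total_speed = sum(processor_speeds)
    shares = [num_elements * s / total_speed for s in processor_speeds]
    counts = [int(x) for x in shares]
    # Hand out the leftover elements to the largest fractional remainders.
    remainder = num_elements - sum(counts)
    order = sorted(range(len(shares)), key=lambda i: shares[i] - counts[i],
                   reverse=True)
    for i in order[:remainder]:
        counts[i] += 1
    return counts

# Example: two 4-processor machines, one twice as fast per node.
speeds = [2.0] * 4 + [1.0] * 4
print(proportional_partition(30269, speeds))  # Barth5-sized mesh
```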
Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.
Aji, Ablimit; Wang, Fusheng; Saltz, Joel H
2012-11-06
Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image-based computer-aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the "big data" challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
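The partition-merge strategy mentioned above can be illustrated in miniature: hash spatial objects into grid tiles, then generate candidate join pairs within each tile and its neighbours. The sketch below is a plain-Python stand-in for a MapReduce pipeline; the coordinates, tile size and distance threshold are illustrative.

```python
# Grid-partitioned spatial join: bucket one point set by tile ("map"/partition
# step), then probe each query point against nearby tiles ("merge"/join step).
from collections import defaultdict
from math import floor, hypot

def tile_of(x, y, size):
    return (floor(x / size), floor(y / size))

def grid_join(points_a, points_b, radius, tile=1.0):
    buckets = defaultdict(list)
    for j, (x, y) in enumerate(points_b):
        buckets[tile_of(x, y, tile)].append((j, x, y))
    pairs = []
    for i, (x, y) in enumerate(points_a):
        tx, ty = tile_of(x, y, tile)
        # Check the 3x3 tile neighbourhood so no near pair is missed
        # (valid as long as radius <= tile size).
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j, bx, by in buckets.get((tx + dx, ty + dy), []):
                    if hypot(x - bx, y - by) <= radius:
                        pairs.append((i, j))
    return pairs

nuclei = [(0.2, 0.3), (5.1, 5.2), (5.3, 5.25)]
vessels = [(0.25, 0.33), (5.2, 5.3)]
print(grid_join(nuclei, vessels, radius=0.2))
```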
75 FR 4830 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... for Scientific Review Special Emphasis Panel, Applications in Mechanisms of Emotion, Stress and Health... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Telephone...
Wavelet-Based Interpolation and Representation of Non-Uniformly Sampled Spacecraft Mission Data
NASA Technical Reports Server (NTRS)
Bose, Tamal
2000-01-01
A well-documented problem in the analysis of data collected by spacecraft instruments is the need for an accurate, efficient representation of the data set. The data may suffer from several problems, including additive noise, data dropouts, an irregularly-spaced sampling grid, and time-delayed sampling. These data irregularities render most traditional signal processing techniques unusable, and thus the data must be interpolated onto an even grid before scientific analysis techniques can be applied. In addition, the extremely large volume of data collected by scientific instrumentation presents many challenging problems in the area of compression, visualization, and analysis. Therefore, a representation of the data is needed which provides a structure which is conducive to these applications. Wavelet representations of data have already been shown to possess excellent characteristics for compression, data analysis, and imaging. The main goal of this project is to develop a new adaptive filtering algorithm for image restoration and compression. The algorithm should have low computational complexity and a fast convergence rate. This will make the algorithm suitable for real-time applications. The algorithm should be able to remove additive noise and reconstruct lost data samples from images.
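The preprocessing step described above, interpolating irregular samples onto an even grid before analysis, can be sketched with NumPy and PyWavelets (assumed available). This only illustrates the regularize-then-transform idea; it is not the adaptive filtering algorithm proposed in the project.

```python
# Interpolate irregularly sampled instrument data onto a uniform grid, then
# take a multi-level wavelet decomposition of the regularized signal.
import numpy as np
import pywt

rng = np.random.default_rng(1)
t_irregular = np.sort(rng.uniform(0.0, 10.0, 300))       # uneven sample times
signal = np.sin(2 * np.pi * 0.5 * t_irregular) + 0.1 * rng.normal(size=300)

# Resample onto an even grid before standard analysis techniques are applied.
t_uniform = np.linspace(0.0, 10.0, 512)
resampled = np.interp(t_uniform, t_irregular, signal)

# Discrete wavelet decomposition of the regularized data.
coeffs = pywt.wavedec(resampled, "db4", level=4)
print([c.shape for c in coeffs])
```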
Torregrosa, Alicia; Casazza, Michael L.; Caldwell, Margaret R.; Mathiasmeier, Teresa A.; Morgan, Peter M.; Overton, Cory T.
2010-01-01
Integration of scientific data and adaptive management techniques is critical to the success of species conservation; however, there are uncertainties about effective methods of knowledge exchange between scientists and decisionmakers. The conservation planning and implementation process for Greater Sage-grouse (Centrocercus urophasianus) in the Mono Basin, Calif. region was used as a case study to observe the exchange of scientific information among stakeholders with differing perspectives: resource manager, scientist, public official, rancher, and others. The collaborative development of a risk-simulation model was explored as a tool to transfer knowledge between stakeholders and inform conservation planning and management decisions. Observations compiled using a transdisciplinary approach were used to compare the exchange of information during the collaborative model development and more traditional interactions such as scientist-led presentations at stakeholder meetings. Lack of congruence around knowledge needs and prioritization led to insufficient commitment to completely implement the risk-simulation model. Ethnographic analysis of the case study suggests that further application of epistemic community theory, which posits a strong boundary condition on knowledge transfer, could help support application of risk-simulation models in conservation-planning efforts within similarly complex social and bureaucratic landscapes.
A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science.
Faghmous, James H; Kumar, Vipin
2014-09-01
Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency, data science has had little impact on furthering our understanding of our planet in spite of the abundance of climate data. This is a stark contrast from other fields such as advertising or electronic commerce where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities to mine large climate datasets, with an emphasis on the nuanced difference between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill their promise with regard to climate science applications. More importantly, we highlight research showing that solely relying on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques as well as the results-interpretation process to extract accurate insight from large climate data .
Chang'E-3 data pre-processing system based on scientific workflow
NASA Astrophysics Data System (ADS)
tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai
2016-04-01
The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed to make scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure, with the following capabilities: (1) describing a data processing task, including defining input and output data, the data relationships, the sequence of tasks, the communication between tasks, mathematical formulas, and the relationship between tasks and data; and (2) automatic processing of tasks. Accordingly, how a task is described is the key to how flexible the system is. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: (1) the data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge and Fork, can be composed with one another; and (3) to reduce the modelling complexity of mathematical formulas in the DAG, semantic modelling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
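The task-description idea, tasks declaring their inputs and predecessors and being executed in dependency order over a DAG, can be sketched with the Python standard library (graphlib, Python 3.9+). The example below is illustrative only; the task names and functions are hypothetical and it is not the CEDPS implementation.

```python
# Minimal DAG-based processing description: tasks declare their upstream
# dependencies, and the runner executes them in topological order.
from graphlib import TopologicalSorter

def radiometric_correction(raw):
    return f"corrected({raw})"

def geometric_correction(corrected):
    return f"geolocated({corrected})"

# Task name -> (function, list of upstream tasks)
tasks = {
    "level0": (lambda: "raw_frame", []),
    "radiometric": (radiometric_correction, ["level0"]),
    "geometric": (geometric_correction, ["radiometric"]),
}

results = {}
order = TopologicalSorter({name: deps for name, (_, deps) in tasks.items()})
for name in order.static_order():
    func, deps = tasks[name]
    results[name] = func(*(results[d] for d in deps))
print(results["geometric"])
```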
ERIC Educational Resources Information Center
Develaki, Maria
2008-01-01
In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the…
ERIC Educational Resources Information Center
Christenson, Nina; Chang Rundgren, Shu-Nu
2015-01-01
Socio-scientific issues (SSI) have proven to be suitable contexts for students to actively reflect on and argue about complex social issues related to science. Research has indicated that explicitly teaching SSI argumentation is a good way to help students develop their argumentation skills and make them aware of the complexity of SSI. However,…
An open source workflow for 3D printouts of scientific data volumes
NASA Astrophysics Data System (ADS)
Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.
2013-12-01
As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide the tactile feedback and sensory feedback on spatial orientation that tangible objects provide. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real-world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: the process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set, which is turned into a volume representation, which in turn is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the free and open-source geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SaaS), thematic generalisation of information content and long-term curation. [1] http://www.arcscience.com/systemDetails/omniTechnology.html [2] http://video.esri.com/watch/53/landscape-design-with-tangible-gis [3] Löwe et al. (2013), Geophysical Research Abstracts, Vol. 15, EGU2013-1544-1.
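One step in such a workflow, turning a gridded data volume (here a small elevation model) into geometry a printer toolchain can ingest, can be sketched as an ASCII STL writer. Only the top surface is written; a real printable model also needs side walls and a base, which the GRASS GIS/Paraview workflow described above takes care of. The grid values and scaling below are illustrative.

```python
# Write the top surface of an elevation grid as an ASCII STL file.
import numpy as np

def write_stl_surface(height, filename, cell=1.0, z_scale=1.0):
    ny, nx = height.shape
    with open(filename, "w") as f:
        f.write("solid elevation\n")
        for j in range(ny - 1):
            for i in range(nx - 1):
                # Two triangles per grid cell.
                p = lambda a, b: (a * cell, b * cell, z_scale * height[b, a])
                quad = [p(i, j), p(i + 1, j), p(i + 1, j + 1), p(i, j + 1)]
                for tri in (quad[:3], [quad[0], quad[2], quad[3]]):
                    f.write("  facet normal 0 0 1\n    outer loop\n")
                    for x, y, z in tri:
                        f.write(f"      vertex {x} {y} {z}\n")
                    f.write("    endloop\n  endfacet\n")
        f.write("endsolid elevation\n")

dem = np.random.rand(20, 20) * 5.0   # stand-in for a real elevation model
write_stl_surface(dem, "dem_surface.stl")
```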
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svetlana Shasharina
The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying the tools to be more usable.
ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows
NASA Technical Reports Server (NTRS)
McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush
2004-01-01
With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what scientists need is an environment that will allow them to specify their application runs at a high organizational level and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.
Three-dimensional electron microscopy simulation with the CASINO Monte Carlo software.
Demers, Hendrix; Poirier-Demers, Nicolas; Couture, Alexandre Réal; Joly, Dany; Guilmain, Marc; de Jonge, Niels; Drouin, Dominique
2011-01-01
Monte Carlo software packages are widely used to understand the capabilities of electron microscopes. To study more realistic applications with complex samples, 3D Monte Carlo software is needed. In this article, the development of the 3D version of CASINO is presented. The software features a graphical user interface, an efficient (in relation to simulation time and memory use) 3D simulation model, and accurate physics models for electron microscopy applications, and it is available freely to the scientific community at this website: www.gel.usherbrooke.ca/casino/index.html. It can be used to model backscattered, secondary, and transmitted electron signals as well as absorbed energy. Software features like scan points and shot noise allow the simulation and study of realistic experimental conditions. This software has an improved energy range for scanning electron microscopy and scanning transmission electron microscopy applications. Copyright © 2011 Wiley Periodicals, Inc.
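The general shape of such a simulation can be conveyed by a toy electron-trajectory loop: exponential free paths, random scattering directions and a crude continuous energy loss. The parameters and the isotropic scattering below are illustrative only and do not reproduce CASINO's physics models.

```python
# Toy Monte Carlo electron trajectory in a semi-infinite sample (z <= 0).
import numpy as np

rng = np.random.default_rng(42)

def trajectory(e0_kev=20.0, mean_free_path_nm=10.0, loss_kev_per_nm=0.05):
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, -1.0])   # electron entering the sample
    energy = e0_kev
    path = [pos.copy()]
    while energy > 0.5 and pos[2] <= 0.0:    # stop at low energy or escape
        step = rng.exponential(mean_free_path_nm)
        pos = pos + step * direction
        energy -= loss_kev_per_nm * step
        # Isotropic scattering for simplicity (real models use screened
        # Rutherford or Mott cross sections).
        v = rng.normal(size=3)
        direction = v / np.linalg.norm(v)
        path.append(pos.copy())
    return np.array(path), energy

path, e_final = trajectory()
print(len(path), "segments, final energy", round(e_final, 2), "keV")
```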
FBCOT: a fast block coding option for JPEG 2000
NASA Astrophysics Data System (ADS)
Taubman, David; Naman, Aous; Mathew, Reji
2017-09-01
Based on the EBCOT algorithm, JPEG 2000 finds application in many fields, including high performance scientific, geospatial and video coding applications. Beyond digital cinema, JPEG 2000 is also attractive for low-latency video communications. The main obstacle for some of these applications is the relatively high computational complexity of the block coder, especially at high bit-rates. This paper proposes a drop-in replacement for the JPEG 2000 block coding algorithm, achieving much higher encoding and decoding throughputs, with only modest loss in coding efficiency (typically < 0.5dB). The algorithm provides only limited quality/SNR scalability, but offers truly reversible transcoding to/from any standard JPEG 2000 block bit-stream. The proposed FAST block coder can be used with EBCOT's post-compression RD-optimization methodology, allowing a target compressed bit-rate to be achieved even at low latencies, leading to the name FBCOT (Fast Block Coding with Optimized Truncation).
Sea-level rise and shoreline retreat: time to abandon the Bruun Rule
NASA Astrophysics Data System (ADS)
Cooper, J. Andrew G.; Pilkey, Orrin H.
2004-11-01
In the face of a global rise in sea level, understanding the response of the shoreline to changes in sea level is a critical scientific goal to inform policy makers and managers. A body of scientific information exists that illustrates both the complexity of the linkages between sea-level rise and shoreline response, and the comparative lack of understanding of these linkages. In spite of the lack of understanding, many appraisals have been undertaken that employ a concept known as the "Bruun Rule". This is a simple two-dimensional model of shoreline response to rising sea level. The model has seen near global application since its original formulation in 1954. The concept provided an advance in understanding of the coastal system at the time of its first publication. It has, however, been superseded by numerous subsequent findings and is now invalid. Several assumptions behind the Bruun Rule are known to be false and nowhere has the Bruun Rule been adequately proven; on the contrary several studies disprove it in the field. No universally applicable model of shoreline retreat under sea-level rise has yet been developed. Despite this, the Bruun Rule is in widespread contemporary use at a global scale both as a management tool and as a scientific concept. The persistence of this concept beyond its original assumption base is attributed to the following factors: Appeal of a simple, easy to use analytical model that is in widespread use. Difficulty of determining the relative validity of 'proofs' and 'disproofs'. Ease of application. Positive advocacy by some scientists. Application by other scientists without critical appraisal. The simple numerical expression of the model. Lack of easy alternatives. The Bruun Rule has no power for predicting shoreline behaviour under rising sea level and should be abandoned. It is a concept whose time has passed. The belief by policy makers that it offers a prediction of future shoreline position may well have stifled much-needed research into the coastal response to sea-level rise.
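For reference, the concept being critiqued is usually written as R = S * L / (B + h): predicted retreat R scales sea-level rise S by the cross-shore width L of the active profile divided by the sum of berm height B and closure depth h. The sketch below encodes this commonly quoted form with illustrative numbers; as the abstract argues, it should not be read as a valid predictor of shoreline behaviour.

```python
# The Bruun Rule in its commonly quoted two-dimensional form.
def bruun_retreat(sea_level_rise_m, active_width_m, berm_height_m,
                  closure_depth_m):
    return sea_level_rise_m * active_width_m / (berm_height_m + closure_depth_m)

# Example: 0.5 m of rise over a 1000 m wide active profile.
print(bruun_retreat(0.5, 1000.0, 2.0, 8.0), "m of predicted retreat")  # 50 m
```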
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... appropriate collection of data regarding the use, consumer perception, and health risks of an MRTP? III...] Scientific Evaluation of Modified Risk Tobacco Product Applications; Public Workshop; Request for Comments... obtain input on specific issues associated with the scientific evaluation of modified risk tobacco...
Community intelligence in knowledge curation: an application to managing scientific nomenclature.
Dai, Lin; Xu, Chao; Tian, Ming; Sang, Jian; Zou, Dong; Li, Ang; Liu, Guocheng; Chen, Fei; Wu, Jiayan; Xiao, Jingfa; Wang, Xumin; Yu, Jun; Zhang, Zhang
2013-01-01
Harnessing community intelligence in knowledge curation bears significant promise in dealing with communication and education in the flood of scientific knowledge. As knowledge is accumulated at ever-faster rates, scientific nomenclature, a particular kind of knowledge, is concurrently generated in all kinds of fields. Since nomenclature is a system of terms used to name things in a particular discipline, accurate translation of scientific nomenclature into different languages is of critical importance, not only for communications and collaborations with English-speaking people, but also for knowledge dissemination among people in the non-English-speaking world, particularly young students and researchers. However, translations of scientific nomenclature from English into other languages often lack accuracy and standardization, especially for languages that do not belong to the same language family as English. To address this issue, here we propose for the first time the application of community intelligence to scientific nomenclature management, namely, harnessing collective intelligence for translation of scientific nomenclature from English into other languages. As community intelligence applied to knowledge curation is primarily aided by wikis, and Chinese is the native language for about one-fifth of the world's population, we put the proposed application into practice by developing a wiki-based English-to-Chinese Scientific Nomenclature Dictionary (ESND; http://esnd.big.ac.cn). ESND is a wiki-based, publicly editable and open-content platform, exploiting the whole power of the scientific community in collectively and collaboratively managing scientific nomenclature. Based on community curation, ESND is capable of achieving accurate, standard, and comprehensive scientific nomenclature, demonstrating a valuable application of community intelligence in knowledge curation.
Complexity and Innovation: Army Transformation and the Reality of War
2004-05-26
necessary to instill confidence among all members of the interested community that the causal relationships ... continues to gain momentum and general acceptance within the scientific community. The topic is addressed in numerous books, studies and scientific journals ... scientific community has steadily grown. Since the time of Galileo and Newton, scientific endeavor has been characterized by reductionism (the process
Space Archaeology: Attribute, Object, Task and Method
NASA Astrophysics Data System (ADS)
Wang, Xinyuan; Guo, Huadong; Luo, Lei; Liu, Chuansheng
2017-04-01
Archaeology takes the material remains of human activity as its research object and uses those fragmentary remains to reconstruct the humanistic and natural environment in different historical periods. Space archaeology is a new branch of archaeology. Its study object is the humanistic-natural complex, including the remains of human activities and living environments on the Earth's surface. Its research method, the application of space information technologies to this complex, is an innovative process concerning archaeological information acquisition, interpretation and reconstruction, aiming at 3-D dynamic reconstruction of cultural heritage by constructing the digital cultural-heritage sphere. Space archaeology is highly interdisciplinary, linking several areas of the natural sciences, the social sciences and the humanities. Its task is to reveal the history, characteristics and patterns of human activities in the past, as well as to understand the evolutionary processes guiding the relationship between humans and their environment. This paper summarizes six important aspects of space archaeology and five crucial recommendations for the establishment and development of this new discipline. The six important aspects are: (1) technologies and methods for non-destructive detection of archaeological sites; (2) space technologies for the protection and monitoring of cultural heritage; (3) digital environmental reconstruction of archaeological sites; (4) spatial data storage and data mining of cultural heritage; (5) virtual archaeology, digital reproduction, and public information and presentation systems; and (6) the construction of a scientific platform for the digital cultural-heritage sphere. The five key recommendations for establishing the discipline of space archaeology are: (1) encouraging the full integration of the strengths of archaeology and museology with space technology, to promote the application of space technologies to cultural heritage; (2) developing a new disciplinary framework to guide current research on space technologies for cultural heritage; (3) addressing, for large cultural heritage sites, the key research problems of integrating theory, technology and application, in order to obtain an essential and comprehensive scientific understanding of these heritages; (4) focusing the planning and implementation of major scientific programmes on Earth observation for cultural heritage, including those relevant to the development of theory and methods, technology combination and applicability, impact assessments and virtual reconstruction; and (5) taking full advantage of cultural heritage and Earth observation sciences to strengthen space archaeology, improving and refining both disciplinary practice and theoretical development. Several case studies along the ancient Silk Road are given to demonstrate the potential benefits of space archaeology.
Stevens, Jean-Luc R.; Elver, Marco; Bednar, James A.
2013-01-01
Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change. PMID:24416014
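The launch-and-collate pattern described above can be sketched in plain Python. This is not Lancet's actual API: the parameter names, the stand-in one-line "simulation", and the JSON collation below are hypothetical, and serve only to illustrate declaring a parameter space, launching one process per combination, and gathering the output files afterwards.

```python
import itertools, json, subprocess, sys, tempfile
from pathlib import Path

# Hypothetical parameter space: every combination is launched as its own process.
param_space = {"learning_rate": [0.01, 0.1], "seed": [0, 1, 2]}

outdir = Path(tempfile.mkdtemp(prefix="batch_"))
runs = [dict(zip(param_space, values))
        for values in itertools.product(*param_space.values())]

for i, params in enumerate(runs):
    out_file = outdir / f"run_{i}.json"
    # Stand-in "simulation": a one-line Python program that writes its result to a file.
    code = (f"import json; json.dump({{'params': {params!r}, "
            f"'result': {params['learning_rate']} * ({params['seed']} + 1)}}, "
            f"open({str(out_file)!r}, 'w'))")
    subprocess.run([sys.executable, "-c", code], check=True)

# Collation step: gather every output file back into a single set of records.
records = [json.loads(p.read_text()) for p in sorted(outdir.glob("run_*.json"))]
for rec in records:
    print(rec["params"], "->", rec["result"])
```

In a notebook-driven workflow, the collation step at the end is what feeds figures and tables, so the chain from parameters to results stays replayable.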
NASA Astrophysics Data System (ADS)
Otto, Thomas; Saupe, Ray; Bruch, Reinhard F.; Fritzsch, Uwe; Stock, Volker; Gessner, Thomas; Afanasyeva, Natalia I.
2001-11-01
The field of microtechnology is an important industrial and scientific resource for the 21st century. There is a great interest in spectroscopic sensors in the near and middle infrared (NIR-MIR) wavelength regions (1-2.5 micrometers; 2.5-4.5 micrometers; 4-6 micrometers). The potential for cheap and small devices for nondestructive, remote sensing techniques at a molecular level has stimulated the design and development of more compact analyzer systems. Therefore we will try to build analyzers using micro optical components such as micromirrors and embossed micro gratings optimized for the above-mentioned spectral ranges. Potentially, infrared sensors can be used for rapid nondestructive diagnostics of surfaces, liquids, gases, polymers and complex biological systems including proteins, blood, cells and cellular debris as well as body tissue. Furthermore, NIR-MIR microsensing spectroscopy will be utilized to monitor the chemical composition of petrochemical products like gasoline and diesel. In addition, miniature analyzers will be used for rapid measuring of food, in particular oil, starch and meat. In this paper we will present an overview of several new approaches for subsurface and surface sensing technologies based on the integration of optical micro devices, the most promising sensors for biomedical, environmental and industrial applications, data processing and evaluation algorithms for classification of the results. Both scientific and industrial applications will be discussed.
76 FR 18239 - Endangered Species; Marine Mammals; Receipt of Applications for Permit
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... enhancement of the survival of the species. Applicant: Daniel Fore, Austin, TX; PRT-37816A; Applicant: Robert... Argentina for the purpose of scientific research. Applicant: Museum of Zoology and Herbarium, University of... legally accessioned into the permittee's collection for scientific research. This notification covers...
Resilient workflows for computational mechanics platforms
NASA Astrophysics Data System (ADS)
Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine
2010-06-01
Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come [28]. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future [23, 24, 29].
Complex systems and the technology of variability analysis
Seely, Andrew JE; Macklem, Peter T
2004-01-01
Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
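As a concrete illustration of one of the regularity measures named above, the following is a minimal NumPy sketch of approximate entropy for a univariate series; the embedding dimension m, the tolerance r, and the toy signals are illustrative choices rather than anything prescribed by the review.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy of a 1-D series (Pincus-style definition)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)          # common default: 20% of the signal's SD

    def phi(m):
        # All overlapping length-m templates, compared with the Chebyshev distance.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = np.mean(dist <= r, axis=1)   # fraction of templates within tolerance
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
regular = np.sin(t)                           # highly regular rhythm
irregular = rng.standard_normal(400)          # uncorrelated noise
print("ApEn(sine)  =", approximate_entropy(regular))
print("ApEn(noise) =", approximate_entropy(irregular))
```

A regular rhythm yields a low value and uncorrelated noise a high one, which is the intuition behind using reduced variability and altered regularity as markers of illness.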
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
NASA Technical Reports Server (NTRS)
Wright, Jeffrey; Thakur, Siddharth
2006-01-01
Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.
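The kind of linear-equation solve that the PETSc interface provides can be sketched with petsc4py, PETSc's Python bindings (used here for brevity; Loci-STREAM itself calls PETSc from compiled code, so this is only an analogy). The tridiagonal test system, solver type, and preconditioner below are illustrative choices, and the sketch assumes a single-process run.

```python
from petsc4py import PETSc

n = 100
# Assemble a small tridiagonal (1-D Laplacian-like) sparse matrix.
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = PETSc.Vec().createSeq(n)   # right-hand side (single process assumed)
x = PETSc.Vec().createSeq(n)   # solution vector
b.set(1.0)

# Krylov solver (GMRES) with a simple Jacobi preconditioner.
ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('gmres')
ksp.getPC().setType('jacobi')
ksp.setFromOptions()           # runtime -ksp_* / -pc_* options can override these choices
ksp.solve(b, x)
print("iterations:", ksp.getIterationNumber(), "residual norm:", ksp.getResidualNorm())
```

Swapping solver and preconditioner types at run time without touching application code is the kind of flexibility that makes such library interfaces attractive for accelerating convergence.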
Concept Maps for Improved Science Reasoning and Writing: Complexity Isn’t Everything
Dowd, Jason E.; Duncan, Tanya; Reynolds, Julie A.
2015-01-01
A pervasive notion in the literature is that complex concept maps reflect greater knowledge and/or more expert-like thinking than less complex concept maps. We show that concept maps used to structure scientific writing and clarify scientific reasoning do not adhere to this notion. In an undergraduate course for thesis writers, students use concept maps instead of traditional outlines to define the boundaries and scope of their research and to construct an argument for the significance of their research. Students generate maps at the beginning of the semester, revise after peer review, and revise once more at the end of the semester. Although some students revised their maps to make them more complex, a significant proportion of students simplified their maps. We found no correlation between increased complexity and improved scientific reasoning and writing skills, suggesting that sometimes students simplify their understanding as they develop more expert-like thinking. These results suggest that concept maps, when used as an intervention, can meet the varying needs of a diverse population of student writers. PMID:26538388
Misreading Science in the Twentieth Century.
ERIC Educational Resources Information Center
Budd, John M.
2001-01-01
Considers textual aspects of scientific communication and problems for reception presented by the complex dynamics of communicating scientific work. Discusses scientific work based on fraud or misconduct and disputes about the nature of science, and applies reception theory and reader-response criticism to understand variations in readings of the…
1993 Annual report on scientific programs: A broad research program on the sciences of complexity
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
This report provides a summary of many of the research projects completed by the Santa Fe Institute (SFI) during 1993. These research efforts continue to focus on two general areas: the study of, and search for, underlying scientific principles governing complex adaptive systems, and the exploration of new theories of computation that incorporate natural mechanisms of adaptation (mutation, genetics, evolution).
Spectral imaging: principles and applications.
Garini, Yuval; Young, Ian T; McNamara, George
2006-08-01
Spectral imaging extends the capabilities of biological and clinical studies to simultaneously study multiple features such as organelles and proteins qualitatively and quantitatively. Spectral imaging combines two well-known scientific methodologies, namely spectroscopy and imaging, to provide a new advantageous tool. The need to measure the spectrum at each point of the image requires combining dispersive optics with the more common imaging equipment, and introduces constraints as well. The principles of spectral imaging and a few representative applications are described. Spectral imaging analysis is necessary because the complex data structure cannot be analyzed visually. A few of the algorithms are discussed with emphasis on the usage for different experimental modes (fluorescence and bright field). Finally, spectral imaging, like any method, should be evaluated in light of its advantages to specific applications, a selection of which is described. Spectral imaging is a relatively new technique and its full potential is yet to be exploited. Nevertheless, several applications have already shown its potential. (c) 2006 International Society for Analytical Cytology.
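One representative analysis step of the kind discussed above is linear unmixing, in which each pixel's measured spectrum is decomposed into contributions from known reference spectra. The sketch below uses synthetic Gaussian "fluorophore" spectra and an ordinary least-squares fit purely for illustration; practical pipelines often add non-negativity constraints.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands = 32

# Reference ("pure") emission spectra for two fluorophores, as columns of the mixing matrix.
wavelengths = np.linspace(500, 700, n_bands)
ref_a = np.exp(-((wavelengths - 540) / 20.0) ** 2)
ref_b = np.exp(-((wavelengths - 620) / 25.0) ** 2)
S = np.column_stack([ref_a, ref_b])            # shape (n_bands, 2)

# A measured pixel spectrum: an unknown mixture of the references plus noise.
true_abundances = np.array([0.7, 0.3])
pixel = S @ true_abundances + 0.01 * rng.standard_normal(n_bands)

# Linear unmixing: least-squares estimate of the per-fluorophore abundances.
abundances, *_ = np.linalg.lstsq(S, pixel, rcond=None)
print("estimated abundances:", abundances)     # should be close to [0.7, 0.3]
```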
NASA Astrophysics Data System (ADS)
McQuaide, Glenn G.
2006-12-01
Without adequate understanding of science, we cannot make responsible personal, regional, national, or global decisions about any aspect of life dealing with science. Better understanding how we learn about science can contribute to improving the quality of our educational experiences. Promoting pathways leading to life-long learning and deep understanding in our world should be a goal for all educators. This dissertation project was a phenomenological investigation into undergraduate understanding and acceptance of scientific theories, including biological evolution. Specifically, student descriptions of conceptual change while learning science theory were recorded and analyzed. These qualitative investigations were preceded by a survey that provided a means of selecting students who had a firmer understanding of science theory. Background information and survey data were collected in an undergraduate biology class at a small, Southern Baptist-affiliated liberal arts school located in south central Kentucky. Responses to questions on the MATE (Rutledge and Warden, 1999) instrument were used to screen students for interviews, which investigated the way by which students came to understand and accept scientific theories. This study identifies some ways by which individuals learn complex science theories, including biological evolution. Initial understanding and acceptance often occur by the conceptual change method described by Posner et al. (1982). Three principal ways by which an individual may reach a level of understanding and acceptance of science theory were documented in this study. They were conceptual change through application of logic and reasoning; conceptual change through modification of religious views; and conceptual change through acceptance of authoritative knowledge. Development of a deeper, richer understanding and acceptance of complex, multi-faceted concepts such as biological evolution occurs in some individuals by means of conceptual enrichment. Conceptual enrichment occurs through the addition of new knowledge, followed by examination of prior knowledge through the perspective of this new knowledge. In the field of science, enrichment reinforces complex concepts when multiple, convergent lines of supporting evidence point to the same rational scientific conclusion.
[Scientific research results commercialization as an opportunity for the physiotherapy development].
Pietras, Piotr; Łyp, Marek; Nowicka, Katarzyna; Soliwoda, Marcin; Kruszyński, Mateusz; Malczewski, Daniel
Physiotherapy is undergoing very intensive development. Research carried out around the world results in the implementation of new forms of therapy. For several years, higher education institutions have been trying to support scientists in their attempts to commercialize research results, although the process is complex. Practice around the world shows that cooperation between science and business is possible and results in the implementation of modern solutions as real applications. It is important that scientists, and people planning a career in science, know the rules and limitations of this process.
On applying cognitive psychology.
Baddeley, Alan
2013-11-01
Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study, to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation make the attempt to combine basic and applied research very rewarding. © 2013 The British Psychological Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chilkoti, Ashutosk
2012-06-29
The emerging, interdisciplinary field of Bioinspired Materials focuses on developing a fundamental understanding of the synthesis, directed self-assembly and hierarchical organization of naturally occurring materials, and uses this understanding to engineer new bioinspired artificial materials for diverse applications. The inaugural 2012 Gordon Conference on Bioinspired Materials seeks to capture the excitement of this burgeoning field with a cutting-edge scientific program and a roster of distinguished invited speakers and discussion leaders who will address the key issues in the field. The Conference will feature a wide range of topics, such as materials and devices from DNA, reprogramming the genetic code for design of new materials, peptide, protein and carbohydrate based materials, biomimetic systems, complexity in self-assembly, and biomedical applications of bioinspired materials.
Scientific and Regulatory Considerations for Generic Complex Drug Products Containing Nanomaterials.
Zheng, Nan; Sun, Dajun D; Zou, Peng; Jiang, Wenlei
2017-05-01
In the past few decades, the development of medicine at the nanoscale has been applied to oral and parenteral dosage forms in a wide range of therapeutic areas to enhance drug delivery and reduce toxicity. An obvious response to these benefits is reflected in higher market shares of complex drug products containing nanomaterials than that of conventional formulations containing the same active ingredient. The surging market interest has encouraged the pharmaceutical industry to develop cost-effective generic versions of complex drug products based on nanotechnology when the associated patent and exclusivity on the reference products have expired. Due to their complex nature, nanotechnology-based drugs present unique challenges in determining equivalence standards between generic and innovator products. This manuscript attempts to provide the scientific rationales and regulatory considerations of key equivalence standards (e.g., in vivo studies and in vitro physicochemical characterization) for oral drugs containing nanomaterials, iron-carbohydrate complexes, liposomes, protein-bound drugs, nanotube-forming drugs, and nano emulsions. It also presents active research studies in bridging regulatory and scientific gaps for establishing equivalence of complex products containing nanomaterials. We hope that open communication among industry, academia, and regulatory agencies will accelerate the development and approval processes of generic complex products based on nanotechnology.
Big Data in Plant Science: Resources and Data Mining Tools for Plant Genomics and Proteomics.
Popescu, George V; Noutsos, Christos; Popescu, Sorina C
2016-01-01
In modern plant biology, progress is increasingly defined by the scientists' ability to gather and analyze data sets of high volume and complexity, otherwise known as "big data". Arguably, the largest increase in the volume of plant data sets over the last decade is a consequence of the application of the next-generation sequencing and mass-spectrometry technologies to the study of experimental model and crop plants. The increase in quantity and complexity of biological data brings challenges, mostly associated with data acquisition, processing, and sharing within the scientific community. Nonetheless, big data in plant science create unique opportunities in advancing our understanding of complex biological processes at a level of accuracy without precedence, and establish a base for the plant systems biology. In this chapter, we summarize the major drivers of big data in plant science and big data initiatives in life sciences with a focus on the scope and impact of iPlant, a representative cyberinfrastructure platform for plant science.
Protein-protein interaction networks (PPI) and complex diseases
Safari-Alighiarloo, Nahid; Taghizadeh, Mohammad; Rezaei-Tavirani, Mostafa; Goliaei, Bahram
2014-01-01
The physical interactions of proteins, which assemble them into large, densely connected networks, are a notable subject of investigation. Protein interaction networks are useful because they provide a basic scientific abstraction and improve biological and biomedical applications. Given the principal roles of proteins in biological function, their interactions determine the molecular and cellular mechanisms that control healthy and diseased states in organisms. Therefore, such networks facilitate the understanding of the pathogenic (and physiologic) mechanisms that trigger the onset and progression of diseases. Consequently, this knowledge can be translated into effective diagnostic and therapeutic strategies. Furthermore, the results of several studies have shown that the structure and dynamics of protein networks are disturbed in complex diseases such as cancer and autoimmune disorders. Based on this relationship, a novel paradigm is suggested in which protein interaction networks, rather than individual molecules considered in isolation from the network, can be the target of therapy for the treatment of complex multi-genic diseases. PMID:25436094
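A minimal sketch of this style of network reasoning, using NetworkX on a tiny, made-up interaction list (the protein names and edges are hypothetical): nodes are ranked by centrality, which is one common way candidate targets are prioritized in protein interaction network analyses.

```python
import networkx as nx

# Toy protein-protein interaction list (hypothetical proteins and edges).
interactions = [
    ("P1", "P2"), ("P1", "P3"), ("P1", "P4"), ("P1", "P5"),
    ("P2", "P3"), ("P4", "P6"), ("P5", "P6"), ("P6", "P7"),
]
G = nx.Graph(interactions)

# Rank proteins by degree and betweenness centrality; highly central nodes
# are often examined as candidate intervention points in disease networks.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
ranking = sorted(G.nodes, key=lambda p: (degree[p], betweenness[p]), reverse=True)
for protein in ranking:
    print(f"{protein}: degree={degree[protein]:.2f}, betweenness={betweenness[protein]:.2f}")
```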
Philosophy and Sociology of Science Evolution and History
NASA Astrophysics Data System (ADS)
Rosen, Joe
The following sections are included: * Concrete Versus Abstract Theoretical Models * Introduction: concrete and abstract in kepler's contribution * Einstein's theory of gravitation and mach's principle * Unitary symmetry and the structure of hadrons * Conclusion * Dedication * Symmetry, Entropy and Complexity * Introduction * Symmetry Implies Abstraction and Loss of Information * Broken Symmetries - Imposed or Spontaneous * Symmetry, Order and Information * References * Cosmological Surrealism: More Than "Eternal Reality" Is Needed * Pythagoreanism in atomic, nuclear and particle physics * Introduction: Pythagoreanism as part of the Greek scientific world view — and the three questions I will tackle * Point 1: the impact of Gersonides and Crescas, two scientific anti-Aristotelian rebels * Point 2: Kepler's spheres to Bohr's orbits — Pythagoreanisms at last! * Point 3: Aristotle to Maupertuis, Emmy Noether, Schwinger * References * Paradigm Completion For Generalized Evolutionary Theory With Application To Epistemology * Evolution Fully Generalized * Entropy: Gravity as Model * Evolution and Entropy: Measures of Complexity * Extinctions and a Balanced Evolutionary Paradigm * The Evolution of Human Society - the Age of Information as example * High-Energy Physics and the World Wide Web * Twentieth Century Epistemology has Strong (de facto) Evolutionary Elements * The discoveries towards the beginning of the XXth Century * Summary and Conclusions * References * Evolutionary Epistemology and Invalidation * Introduction * Extinctions and A New Evolutionary Paradigm * Evolutionary Epistemology - Active Mutations * Evolutionary Epistemology: Invalidation as An Extinction * References
Maron, Bradley A; Leopold, Jane A
2016-09-30
Reductionist theory proposes that analyzing complex systems according to their most fundamental components is required for problem resolution, and has served as the cornerstone of scientific methodology for more than four centuries. However, technological gains in the current scientific era now allow for the generation of large datasets that profile the proteomic, genomic, and metabolomic signatures of biological systems across a range of conditions. The accessibility of data on such a vast scale has, in turn, highlighted the limitations of reductionism, which is not conducive to analyses that consider multiple and contemporaneous interactions between intermediates within a pathway or across constructs. Systems biology has emerged as an alternative approach to analyze complex biological systems. This methodology is based on the generation of scale-free networks and, thus, provides a quantitative assessment of relationships between multiple intermediates, such as protein-protein interactions, within and between pathways of interest. In this way, systems biology is well positioned to identify novel targets implicated in the pathogenesis or treatment of diseases. In this review, the historical root and fundamental basis of systems biology, as well as the potential applications of this methodology are discussed with particular emphasis on integration of these concepts to further understanding of cardiovascular disorders such as coronary artery disease and pulmonary hypertension.
Application of Logic Models in a Large Scientific Research Program
ERIC Educational Resources Information Center
O'Keefe, Christine M.; Head, Richard J.
2011-01-01
It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…
A Parallel Rendering Algorithm for MIMD Architectures
NASA Technical Reports Server (NTRS)
Crockett, Thomas W.; Orloff, Tobias
1991-01-01
Applications such as animation and scientific visualization demand high performance rendering of complex three dimensional scenes. To deliver the necessary rendering rates, highly parallel hardware architectures are required. The challenge is then to design algorithms and software which effectively use the hardware parallelism. A rendering algorithm targeted to distributed memory MIMD architectures is described. For maximum performance, the algorithm exploits both object-level and pixel-level parallelism. The behavior of the algorithm is examined both analytically and experimentally. Its performance for large numbers of processors is found to be limited primarily by communication overheads. An experimental implementation for the Intel iPSC/860 shows increasing performance from 1 to 128 processors across a wide range of scene complexities. It is shown that minimal modifications to the algorithm will adapt it for use on shared memory architectures as well.
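The combination of object-level parallelism and pixel compositing described above can be sketched with mpi4py and NumPy. This toy is not the paper's iPSC/860 implementation: the "objects" are random discs, shading is by depth only, and the root rank composites gathered partial buffers by nearest depth.

```python
# Run with e.g.: mpiexec -n 4 python render_sketch.py   (hypothetical file name)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

W = H = 128
rng = np.random.default_rng(42)                  # same seed: same scene on every rank
objects = rng.uniform(size=(64, 4))              # (x, y, radius factor, depth) per object

# Object-level parallelism: each rank rasterizes an interleaved subset of objects.
color = np.zeros((H, W))
depth = np.full((H, W), np.inf)
ys, xs = np.mgrid[0:H, 0:W]
for cx, cy, r, z in objects[rank::size]:
    mask = (xs / W - cx) ** 2 + (ys / H - cy) ** 2 < (0.05 + 0.1 * r) ** 2
    nearer = mask & (z < depth)
    depth[nearer] = z
    color[nearer] = z                            # shade simply by depth

# Pixel compositing: gather partial buffers and keep the nearest sample per pixel.
all_color = comm.gather(color, root=0)
all_depth = comm.gather(depth, root=0)
if rank == 0:
    all_color, all_depth = np.stack(all_color), np.stack(all_depth)
    nearest = np.argmin(all_depth, axis=0)
    final = np.take_along_axis(all_color, nearest[None, ...], axis=0)[0]
    print("composited image:", final.shape, "non-empty pixels:", int((final > 0).sum()))
```

Gathering full-frame buffers at one rank is exactly the kind of communication cost that dominates at large processor counts, which is consistent with the paper's observation that performance becomes communication-limited.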
Reflections concerning triply-periodic minimal surfaces.
Schoen, Alan H
2012-10-06
In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau-Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346-362).
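As an example of defining one of these surfaces with numerical tools, the sketch below samples the standard nodal (level-set) approximation of Schoen's gyroid and triangulates its zero level set with scikit-image's marching cubes. The nodal form is only an approximation to the true minimal surface, and the grid resolution chosen here is arbitrary.

```python
import numpy as np
from skimage import measure

# Sample the gyroid's nodal approximation  sin x cos y + sin y cos z + sin z cos x
# over one periodic cell [0, 2*pi)^3.
n = 96
coords = np.linspace(0, 2 * np.pi, n, endpoint=False)
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
f = np.sin(x) * np.cos(y) + np.sin(y) * np.cos(z) + np.sin(z) * np.cos(x)

# The zero level set of f approximates the gyroid; marching cubes triangulates it.
verts, faces, normals, values = measure.marching_cubes(f, level=0.0)
print(f"{len(verts)} vertices, {len(faces)} triangles in one unit cell")
```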
One-step assembly of coordination complexes for versatile film and particle engineering.
Ejima, Hirotaka; Richardson, Joseph J; Liang, Kang; Best, James P; van Koeverden, Martin P; Such, Georgina K; Cui, Jiwei; Caruso, Frank
2013-07-12
The development of facile and versatile strategies for thin-film and particle engineering is of immense scientific interest. However, few methods can conformally coat substrates of different composition, size, shape, and structure. We report the one-step coating of various interfaces using coordination complexes of natural polyphenols and Fe(III) ions. Film formation is initiated by the adsorption of the polyphenol and directed by pH-dependent, multivalent coordination bonding. Aqueous deposition is performed on a range of planar as well as inorganic, organic, and biological particle templates, demonstrating an extremely rapid technique for producing structurally diverse, thin films and capsules that can disassemble. The ease, low cost, and scalability of the assembly process, combined with pH responsiveness and negligible cytotoxicity, makes these films potential candidates for biomedical and environmental applications.
Improving the trust in results of numerical simulations and scientific data analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappello, Franck; Constantinescu, Emil; Hovland, Paul
This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results’ integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general approaches to address it. This paper does not focus on the trust that the execution will actually complete. The product of simulation or of data analytic executions is the final element of a potentially long chain of transformations, where each stage has the potential to introduce harmful corruptions. These corruptions may produce results that deviate from the user-expected accuracy without notifying the user of this deviation. There are many potential sources of corruption before and during the execution; consequently, in this white paper we do not focus on the protection of the end result after the execution.
Ecosystem services provided by a complex coastal region: challenges of classification and mapping.
Sousa, Lisa P; Sousa, Ana I; Alves, Fátima L; Lillebø, Ana I
2016-03-11
A variety of ecosystem services classification systems and mapping approaches are available in the scientific and technical literature, which need to be selected and adapted when applied to complex territories (e.g. in the interface between water and land, estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that will serve as the basis to map the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as a case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness, but also the challenges when applying CICES at regional scale; and the challenges regarding ecosystem services mapping.
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1993-01-01
This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.
Developing Scientific Literacy in a Primary School
ERIC Educational Resources Information Center
Smith, Kathleen Veronica; Loughran, John; Berry, Amanda; Dimitrakopoulos, Cathy
2012-01-01
The science education literature demonstrates that scientific literacy is generally valued and acknowledged among educators as a desirable student learning outcome. However, what scientific literacy really means in terms of classroom practice and student learning is debatable due to the inherent complexity of the term and varying expectations of…
Science in Writing: Learning Scientific Argument in Principle and Practice
ERIC Educational Resources Information Center
Cope, Bill; Kalantzis, Mary; Abd-El-Khalick, Fouad; Bagley, Elizabeth
2013-01-01
This article explores the processes of writing in science and in particular the "complex performance" of writing a scientific argument. The article explores in general terms the nature of scientific argumentation in which the author-scientist makes claims, provides evidence to support these claims, and develops chains of scientific…
A Systematic Approach for Obtaining Performance on Matrix-Like Operations
NASA Astrophysics Data System (ADS)
Veras, Richard Michael
Scientific Computation plays a critical role in the scientific process because it allows us to ask complex queries and test predictions that would otherwise be infeasible to perform experimentally. Because of its power, Scientific Computing has helped drive advances in many fields ranging from Engineering and Physics to Biology and Sociology to Economics and Drug Development and even to Machine Learning and Artificial Intelligence. Common among these domains is the desire for timely computational results; thus, a considerable amount of human expert effort is spent towards obtaining performance for these scientific codes. However, this is no easy task because each of these domains presents its own unique set of challenges to software developers, such as domain-specific operations, structurally complex data and ever-growing datasets. Compounding these problems are the myriad of constantly changing, complex and unique hardware platforms that an expert must target. Unfortunately, an expert is typically forced to reproduce their effort across multiple problem domains and hardware platforms. In this thesis, we demonstrate the automatic generation of expert-level high-performance scientific codes for Dense Linear Algebra (DLA), Structured Mesh (Stencil), Sparse Linear Algebra and Graph Analytics. In particular, this thesis seeks to address the issue of obtaining performance on many complex platforms for a certain class of matrix-like operations that span across many scientific, engineering and social fields. We do this by automating a method used for obtaining high performance in DLA and extending it to structured, sparse and scale-free domains. We argue that it is the use of the underlying structure found in the data from these domains that enables this process. Thus, obtaining performance for most operations does not occur in isolation from the data being operated on, but instead depends significantly on the structure of the data.
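The empirical search at the heart of such code generation can be illustrated by timing a tiled matrix multiplication over a few candidate tile sizes and keeping the fastest. Real generators of the kind discussed in this thesis search far larger spaces of structural and platform parameters; the NumPy sketch below only shows the shape of the idea.

```python
import time
import numpy as np

def blocked_matmul(A, B, tile):
    """Tiled matrix multiply; the tile size is the tunable parameter."""
    n = A.shape[0]
    C = np.zeros_like(A)
    for i in range(0, n, tile):
        for j in range(0, n, tile):
            for k in range(0, n, tile):
                C[i:i+tile, j:j+tile] += A[i:i+tile, k:k+tile] @ B[k:k+tile, j:j+tile]
    return C

n = 512
rng = np.random.default_rng(0)
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))

best = None
for tile in (32, 64, 128, 256, 512):          # candidate tile sizes to search over
    start = time.perf_counter()
    C = blocked_matmul(A, B, tile)
    elapsed = time.perf_counter() - start
    assert np.allclose(C, A @ B)              # every candidate must stay correct
    print(f"tile={tile:4d}: {elapsed:.3f} s")
    if best is None or elapsed < best[1]:
        best = (tile, elapsed)
print("selected tile size:", best[0])
```

The winning tile size depends on the cache hierarchy of the machine running the search, which is precisely why the selection is done empirically rather than fixed ahead of time.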
Bonded repair of composite aircraft structures: A review of scientific challenges and opportunities
NASA Astrophysics Data System (ADS)
Katnam, K. B.; Da Silva, L. F. M.; Young, T. M.
2013-08-01
Advanced composite materials have gained popularity in high-performance structural designs such as aerospace applications that require lightweight components with superior mechanical properties in order to perform in demanding service conditions as well as provide energy efficiency. However, one of the major challenges that the aerospace industry faces with advanced composites - because of their inherent complex damage behaviour - is structural repair. Composite materials are primarily damaged by mechanical loads and/or environmental conditions. If material damage is not extensive, structural repair is the only feasible solution as replacing the entire component is not cost-effective in many cases. Bonded composite repairs (e.g. scarf patches) are generally preferred as they provide enhanced stress transfer mechanisms, joint efficiencies and aerodynamic performance. With an increased usage of advanced composites in primary and secondary aerospace structural components, it is thus essential to have robust, reliable and repeatable structural bonded repair procedures to restore damaged composite components. But structural bonded repairs, especially with primary structures, pose several scientific challenges with the current existing repair technologies. In this regard, the area of structural bonded repair of composites is broadly reviewed - starting from damage assessment to automation - to identify current scientific challenges and future opportunities.
Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.
Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web-browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.
Adaptive optics for in-vivo exploration of human retinal structures
NASA Astrophysics Data System (ADS)
Paques, Michel; Meimon, Serge; Grieve, Kate; Rossant, Florence
2017-06-01
Adaptive optics (AO)-enhanced imaging of the retina is now reaching a level of technical maturity that fosters its expanding use in research and clinical centers around the world. By achieving wavelength-limited resolution, it has not only allowed better observation of retinal substructures already visible by other means, it has also broken anatomical frontiers such as individual photoreceptors or vessel walls. The clinical adoption of AO-enhanced imaging has been slower than that of optical coherence tomography because of the combination of technical complexity, costs and the paucity of interpretative schemes for complex data. In several diseases, AO-enhanced imaging has already proven to provide added clinical value and quantitative biomarkers. Here, we review some of the clinical applications of AO-enhanced en face imaging and outline perspectives for improving its clinical pertinence in these applications. An interesting perspective is to document cell motion through time-lapse imaging, for instance during age-related macular degeneration. In arterial hypertension, the possibility of measuring parietal thickness and performing fine morphometric analysis is of interest for monitoring patients. In the near future, implementation of novel approaches and multimodal imaging, including in particular optical coherence tomography, will undoubtedly expand our imaging capabilities. Tackling the technical, scientific and medical challenges offered by high-resolution imaging is likely to contribute to our rethinking of many retinal diseases, and, most importantly, may find applications in other areas of medicine.
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
Compiled MPI: Cost-Effective Exascale Applications Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Quinlan, D; Lumsdaine, A
2012-04-10
The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) a new set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) a compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) a novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Mary
2014-09-19
Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.
[Development and application of component-based Chinese medicine theory].
Zhang, Jun-Hua; Fan, Guan-Wei; Zhang, Han; Fan, Xiao-Hui; Wang, Yi; Liu, Li-Mei; Li, Chuan; Gao, Yue; Gao, Xiu-Mei; Zhang, Bo-Li
2017-11-01
Traditional Chinese medicine (TCM) prescriptions are the main therapy for disease prevention and treatment in Chinese medicine. Developing drugs by composing prescriptions of TCM materials and herbal pieces under the guidance of TCM theory is the traditional mode of applying TCM, and it is still widely used in the clinic. TCM prescriptions have theoretical advantages and rich clinical application experience in dealing with multi-factor complex diseases, but the supporting scientific research is relatively weak. The lack of scientific understanding of the effective substances and mechanisms of Chinese medicine leads to insufficient understanding of the regularities of efficacy, which affects the stability of effects and hinders improvement of the quality of Chinese medicinal products. Component-based Chinese medicine (CCM) is an innovation based on inheritance: it breaks through the tradition of experience-based prescription and realizes the transformation of compatibility from herbal pieces to components, and it is an important achievement of the modernization of Chinese medicine. With the support of three national "973" projects, and in order to reveal the scientific connotation of prescription compatibility theory and develop innovative Chinese drugs, we have launched theoretical and technological innovation around the "two relatively clear" goals and opened up the research field of CCM. Over the past 10 years and more, with the deepening of research and the expansion of application, the theory and methods of CCM and efficacy-oriented compatibility have been continuously improved. The value of CCM lies not only in developing new drugs; more importantly, it builds a communication bridge between traditional Chinese medicine and modern science and constructs a system of key technologies that meets the needs of TCM innovation and development. This paper focuses on the research progress, related concepts and technology development of CCM, as well as its application prospects in theoretical research on Chinese medicine, the development of innovative Chinese drugs, the secondary development of Chinese patent medicines and the upgrading of pharmaceutical technology. Copyright© by the Chinese Pharmaceutical Association.
Idle waves in high-performance computing
NASA Astrophysics Data System (ADS)
Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre
2015-01-01
The vast majority of parallel scientific applications distributes computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through the processes of scientific applications with local information exchange between neighboring processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work is also a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
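A toy model, built only from the description above rather than from the authors' code, makes the claimed scaling concrete: ranks in a ring synchronize with their nearest neighbors each step, one rank is delayed once, and the time for the perturbation to reach a distant rank grows with the per-step busy time, so the wave speed in ranks per unit time scales as the inverse of the busy time.

```python
import numpy as np

def arrival_time(busy, n_procs=65, injected_delay=30.0, n_steps=200):
    """Time at which a one-off delay at rank 0 first perturbs the middle rank."""
    target = n_procs // 2
    finish = np.full(n_procs, busy)            # finish times after step 0
    finish[0] += injected_delay                # one rank is held up once
    baseline = busy                            # unperturbed finish time per rank
    for step in range(1, n_steps):
        # Remote data dependency: start only when you and both ring neighbors are done.
        ready = np.maximum(finish, np.maximum(np.roll(finish, 1), np.roll(finish, -1)))
        finish = ready + busy
        baseline += busy
        if finish[target] > baseline + 1e-12:  # perturbation has reached the target rank
            return step * busy                 # elapsed unperturbed time before arrival
    return np.inf

for busy in (0.5, 1.0, 2.0, 4.0):
    t = arrival_time(busy)
    print(f"busy={busy:3.1f}: wave reaches rank 32 after t={t:6.1f} "
          f"-> speed ~ {32 / t:.2f} ranks per unit time")
```

Doubling the busy time halves the printed speed, which matches the inverse-proportionality stated in the abstract.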
Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang
2009-10-01
The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.
Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...
2018-03-22
The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
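To make the trade-off concrete, here is a small back-of-the-envelope sketch comparing runtime, energy-to-solution and energy-delay product across power caps for a compute-bound versus a memory-bound kernel. The numbers are invented for illustration only and are not measurements from the paper:

# Illustrative only: compare runtime, energy-to-solution, and energy-delay
# product across hypothetical power caps for two kernel types. The numbers
# below are invented for the sketch, not measurements from the paper.
measurements = {
    # power cap (W): (runtime_s_compute_bound, runtime_s_memory_bound)
    120: (100.0, 150.0),
    100: (115.0, 152.0),   # compute-bound kernels slow down under the cap
     80: (145.0, 155.0),   # memory-bound kernels are far less sensitive
}

for cap, (t_cb, t_mb) in sorted(measurements.items(), reverse=True):
    for name, t in (("compute-bound", t_cb), ("memory-bound", t_mb)):
        energy = cap * t            # J, assuming the cap is fully used
        edp = energy * t            # energy-delay product
        print(f"{cap:>4} W  {name:13s}  time={t:6.1f} s  "
              f"energy={energy/1000:6.1f} kJ  EDP={edp/1e6:7.2f} MJ*s")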
ERIC Educational Resources Information Center
Fensham, Peter J.
2014-01-01
In this response to Tom G. K. Bryce and Stephen P. Day's ("Cult Stud Sci Educ." doi:10.1007/s11422-013-9500-0, 2013) original article, I share with them their interest in the teaching of climate change in school science, but I widen it to include other contemporary complex socio-scientific issues that also need to be discussed. I…
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 April 2012, was used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this newly automated and integrated framework can be used to give advance near real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
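For readers unfamiliar with WPS, the sketch below shows how a client might issue an OGC WPS 1.0.0 key-value-pair Execute request. The endpoint URL, process identifier and input names are placeholders, not the services actually deployed by the authors; a real client would parse the XML ExecuteResponse that comes back:

# Minimal sketch of an OGC WPS 1.0.0 key-value-pair Execute request.
# Endpoint, process identifier and inputs are placeholders, not the
# services deployed by the authors.
from urllib.parse import urlencode

WPS_ENDPOINT = "http://example.org/wps"          # placeholder endpoint
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "DustStormDetection",          # hypothetical process name
    # DataInputs are encoded as semicolon-separated key=value pairs
    "datainputs": "region=EastAsia;start=2012-04-26;end=2012-04-28",
}
url = WPS_ENDPOINT + "?" + urlencode(params)
print("Execute request:", url)
# A deployed service would return an XML ExecuteResponse document, e.g.:
#   from urllib.request import urlopen
#   with urlopen(url) as resp:
#       print(resp.read()[:500])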
NASA Astrophysics Data System (ADS)
Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut
2017-04-01
Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves executing large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks that are coupled to each other. From this domain, we present an e-Science use case, a workflow that requires the execution of a continuum ice flow model and a discrete-element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the coupling of ice flow and calving through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks of the workflow, including data management and transfer, become more complex. From the implementation perspective, this workflow was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, debugging and validation of results became more cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all of the above processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of high-level scientific workflow middleware makes the reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
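To show the shape of the iterative coupling being orchestrated, here is a schematic script-level sketch of the ice-flow/conversion/calving cycle. Executable names, file names and the fixed cycle count are placeholders; in the work described above the orchestration is handled by the UNICORE workflow engine rather than a local script like this:

# Schematic sketch of an iterative coupling loop between a continuum ice
# flow step and a discrete-element calving step. Executable and file names
# are placeholders; the actual workflow runs under UNICORE, not this script.
import subprocess

N_CYCLES = 5

def run(cmd):
    # Run one processing task and fail loudly if it does not succeed.
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

for cycle in range(N_CYCLES):
    # 1) continuum ice flow step (e.g. an Elmer/Ice case) -- placeholder call
    run(["./run_ice_flow.sh", f"cycle_{cycle}.sif"])
    # 2) convert the ice-flow output into the calving model's input format
    run(["./flow_to_calving.py", f"flow_{cycle}.out", f"calving_{cycle}.in"])
    # 3) discrete-element calving/crevassing step (e.g. HiDEM) -- placeholder
    run(["./run_calving.sh", f"calving_{cycle}.in"])
    # 4) convert the updated geometry back for the next ice-flow step
    run(["./calving_to_flow.py", f"calving_{cycle}.out", f"cycle_{cycle+1}.sif"])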
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S.; Yoginath, Srikanth B.
Problems such as fault tolerance and scalable synchronization can be efficiently solved using reversibility of applications. Making applications reversible by relying on computation rather than on memory is ideal for large scale parallel computing, especially for the next generation of supercomputers in which memory is expensive in terms of latency, energy, and price. In this direction, a case study is presented here in reversing a computational core, namely, Basic Linear Algebra Subprograms, which is widely used in scientific applications. A new Reversible BLAS (RBLAS) library interface has been designed, and a prototype has been implemented with two modes: (1) a memory-mode in which reversibility is obtained by checkpointing to memory in forward and restoring from memory in reverse, and (2) a computational-mode in which nothing is saved in the forward, but restoration is done entirely via inverse computation in reverse. The article is focused on detailed performance benchmarking to evaluate the runtime dynamics and performance effects, comparing reversible computation with checkpointing on both traditional CPU platforms and recent GPU accelerator platforms. For BLAS Level-1 subprograms, data indicates over an order of magnitude better speed of reversible computation compared to checkpointing. For BLAS Level-2 and Level-3, a more complex tradeoff is observed between reversible computation and checkpointing, depending on computational and memory complexities of the subprograms.
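As a toy illustration of the two reversal strategies for a Level-1 operation, the sketch below reverses an AXPY update (y <- a*x + y) once by checkpointing and once by inverse computation. This is not the RBLAS interface, only the idea behind it:

# Toy illustration of the two reversal strategies for a BLAS Level-1 AXPY.
# This is not the RBLAS interface, only the idea behind it.
import numpy as np

def axpy_forward_checkpoint(a, x, y, tape):
    tape.append(y.copy())          # memory mode: checkpoint y before update
    y += a * x
    return y

def axpy_reverse_checkpoint(y, tape):
    y[:] = tape.pop()              # restore from the checkpoint
    return y

def axpy_forward_compute(a, x, y):
    y += a * x                     # computational mode: save nothing
    return y

def axpy_reverse_compute(a, x, y):
    y -= a * x                     # restore entirely by inverse computation
    return y

rng = np.random.default_rng(0)
a, x = 2.5, rng.standard_normal(1_000_000)
y0 = rng.standard_normal(1_000_000)

# memory mode
tape, y = [], y0.copy()
axpy_forward_checkpoint(a, x, y, tape)
axpy_reverse_checkpoint(y, tape)
assert np.allclose(y, y0)

# computational mode (no extra memory, but floating-point round-off applies)
y = y0.copy()
axpy_forward_compute(a, x, y)
axpy_reverse_compute(a, x, y)
print("max restore error:", np.max(np.abs(y - y0)))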
NASA Astrophysics Data System (ADS)
Lucido, J. M.
2013-12-01
Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards Organization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal and Water Quality Portal are three examples of best practices when implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
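The distribute-the-tiles pattern behind this kind of work can be shown with a small single-node sketch using Python's multiprocessing; on a Beowulf cluster the same decomposition would be spread across nodes (for example with MPI), and the per-tile "work" below is only a stand-in for a processor-intensive USGS computation:

# Small single-node sketch of splitting a raster into tiles and processing
# them in parallel. The per-tile work is a stand-in for a real computation.
import numpy as np
from multiprocessing import Pool

TILE = 256

def process_tile(args):
    (r, c), tile = args
    # stand-in for a processor-intensive per-tile computation
    return (r, c), float(np.mean(tile ** 2))

def split_tiles(raster, tile=TILE):
    rows, cols = raster.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            yield (r, c), raster[r:r + tile, c:c + tile]

if __name__ == "__main__":
    raster = np.random.default_rng(1).random((2048, 2048))
    with Pool() as pool:
        results = pool.map(process_tile, list(split_tiles(raster)))
    print(len(results), "tiles processed; first:", results[0])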
MyGeoHub: A Collaborative Geospatial Research and Education Platform
NASA Astrophysics Data System (ADS)
Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.
2017-12-01
Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.
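The GABBs/iData API itself is not shown here; as a rough stand-in, the sketch below pulls the kind of basic metadata such a component extracts from a raster file, using GDAL's Python bindings (assumed to be installed). The file path is a placeholder:

# Sketch of the kind of geospatial metadata extraction a component like
# iData performs. This is not the GABBs API; it uses GDAL's Python bindings.
from osgeo import gdal

def extract_metadata(path):
    ds = gdal.Open(path)
    if ds is None:
        raise IOError(f"could not open {path}")
    gt = ds.GetGeoTransform()      # origin and pixel size
    return {
        "driver": ds.GetDriver().ShortName,
        "size": (ds.RasterXSize, ds.RasterYSize),
        "bands": ds.RasterCount,
        "origin": (gt[0], gt[3]),
        "pixel_size": (gt[1], gt[5]),
        "projection": ds.GetProjection()[:60],   # truncated for display
    }

# Example (the path is a placeholder):
# print(extract_metadata("example.tif"))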
Graeden, Ellie; Kerr, Justin; Sorrell, Erin M.; Katz, Rebecca
2018-01-01
Managing infectious disease requires rapid and effective response to support decision making. The decisions are complex and require understanding of the diseases, disease intervention and control measures, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions, the complexity of current models presents a significant barrier to community-level decision makers in using the outputs of the most scientifically robust methods to support pragmatic decisions about implementing a public health response effort, even for endemic diseases with which they are already familiar. Here, we describe the development of an application available on the internet, including from mobile devices, with a simple user interface, to support on-the-ground decision-making for integrating disease control programs, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap, and which result in significant morbidity and mortality in affected regions. Working with data from countries across sub-Saharan Africa and the Middle East, we present a proof-of-principle method and corresponding prototype tool to provide guidance on how to optimize integration of vertical disease control programs. This method and tool demonstrate significant progress in effectively translating the best available scientific models to support practical decision making on the ground with the potential to significantly increase the efficacy and cost-effectiveness of disease control. Author summary Designing and implementing effective programs for infectious disease control requires complex decision-making, informed by an understanding of the diseases, the types of disease interventions and control measures available, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions and support decision-making, the complexity of current models presents a significant barrier to on-the-ground end users. The picture is further complicated when considering approaches for integration of different disease control programs, where co-infection dynamics, treatment interactions, and other variables must also be taken into account. Here, we describe the development of an application available on the internet with a simple user interface, to support on-the-ground decision-making for integrating disease control, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap. This proof-of-concept method and tool demonstrate significant progress in effectively translating the best available scientific models to support pragmatic decision-making on the ground, with the potential to significantly increase the impact and cost-effectiveness of disease control. PMID:29649260
Data based identification and prediction of nonlinear and complex dynamical systems
NASA Astrophysics Data System (ADS)
Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso
2016-07-01
The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear dynamical systems theories with tools from statistical physics, optimization, engineering control, applied mathematics, and scientific computing enables the development of a number of paradigms to address the problem of nonlinear and complex systems reconstruction. In this Review, we describe the recent advances in this forefront and rapidly evolving field, with a focus on compressive sensing based methods. In particular, compressive sensing is a paradigm developed in recent years in applied mathematics, electrical engineering, and nonlinear physics to reconstruct sparse signals using only limited data. It has broad applications ranging from image compression/reconstruction to the analysis of large-scale sensor networks, and it has become a powerful technique to obtain high-fidelity signals for applications where sufficient observations are not available. We will describe in detail how compressive sensing can be exploited to address a diverse array of problems in data based reconstruction of nonlinear and complex networked systems. The problems include identification of chaotic systems and prediction of catastrophic bifurcations, forecasting future attractors of time-varying nonlinear systems, reconstruction of complex networks with oscillatory and evolutionary game dynamics, detection of hidden nodes, identification of chaotic elements in neuronal networks, reconstruction of complex geospatial networks and nodal positioning, and reconstruction of complex spreading networks with binary data. A number of alternative methods, such as those based on system response to external driving, synchronization, and noise-induced dynamical correlation, will also be discussed. Due to the high relevance of network reconstruction to biological sciences, a special section is devoted to a brief survey of the current methods to infer biological networks. Finally, a number of open problems including control and controllability of complex nonlinear dynamical networks are discussed. The methods outlined in this Review are principled on various concepts in complexity science and engineering such as phase transitions, bifurcations, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks toward the ultimate goal of controlling such systems.
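The sparse-recovery step at the heart of compressive-sensing reconstruction can be illustrated with a compact example: recover a sparse coefficient vector from a small number of linear measurements using orthogonal matching pursuit. This is purely illustrative and not any specific algorithm from the Review:

# Compact sketch of sparse recovery from limited linear measurements via
# orthogonal matching pursuit (OMP). Purely illustrative.
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x from y = A @ x using greedy OMP."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(42)
n, m, k = 200, 60, 5                      # unknowns, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

x_hat = omp(A, y, k)
print("recovery error:", np.linalg.norm(x_hat - x_true))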
77 FR 25487 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-30
... (Virtual Meeting). Contact Person: Ai-Ping Zou, MD, Ph.D., Scientific Review Officer, Center for Scientific... Scientific Review Special Emphasis Panel; Cancer Therapeutics AREA Grant Applications. Date: May 24, 2012...
Scientific workflows as productivity tools for drug discovery.
Shon, John; Ohkawa, Hitomi; Hammer, Juergen
2008-05-01
Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.
PDB-wide collection of binding data: current status of the PDBbind database.
Liu, Zhihai; Li, Yan; Han, Li; Li, Jie; Liu, Jie; Zhao, Zhixiong; Nie, Wei; Liu, Yuchen; Wang, Renxiao
2015-02-01
Molecular recognition between biological macromolecules and organic small molecules plays an important role in various life processes. Both structural information and binding data of biomolecular complexes are indispensable for depicting the underlying mechanism in such an event. The PDBbind database was created to collect experimentally measured binding data for the biomolecular complexes throughout the Protein Data Bank (PDB). It thus provides the linkage between structural information and energetic properties of biomolecular complexes, which is especially desirable for computational studies or statistical analyses. Since its first public release in 2004, the PDBbind database has been updated on an annual basis. The latest release (version 2013) provides experimental binding affinity data for 10,776 biomolecular complexes in PDB, including 8302 protein-ligand complexes and 2474 other types of complexes. In this article, we will describe the current methods used for compiling PDBbind and the updated status of this database. We will also review some typical applications of PDBbind published in the scientific literature. All contents of this database are freely accessible at the PDBbind-CN Web server at http://www.pdbbind-cn.org/. wangrx@mail.sioc.ac.cn. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Methodological Problems of Nanotechnoscience
NASA Astrophysics Data System (ADS)
Gorokhov, V. G.
Recently, we have reported on the definition of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline, but it also evolves in a "nonclassical" way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods used to solve scientific and engineering problems. This makes it possible to move from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to systems engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nanosystems engineering, like macrosystems engineering, includes not only systems design but also complex research. The design orientation influences the shift of priorities in complex research and the relation to knowledge: not only "knowledge about something", but also knowledge as a means of activity, since from the outset the control and restructuring of matter at the nanoscale is a necessary element of nanoscience.
Electronic Noses for Environmental Monitoring Applications
Capelli, Laura; Sironi, Selena; Rosso, Renato Del
2014-01-01
Electronic nose applications in environmental monitoring are nowadays of great interest, because of the instruments' proven capability of recognizing and discriminating between a variety of different gases and odors using just a small number of sensors. Such applications in the environmental field include analysis of parameters relating to environmental quality, process control, and verification of efficiency of odor control systems. This article reviews the findings of recent scientific studies in this field, with particular focus on the abovementioned applications. In general, these studies prove that electronic noses are mostly suitable for the different applications reported, especially if the instruments are specifically developed and fine-tuned. As a general rule, literature studies also discuss the critical aspects connected with the different possible uses, as well as research regarding the development of effective solutions. However, currently the main limit to the diffusion of electronic noses as environmental monitoring tools is their complexity and the lack of specific regulation for their standardization, as their use entails a large number of degrees of freedom, regarding for instance the training and the data processing procedures. PMID:25347583
Biomolecular logic systems: applications to biosensors and bioactuators
NASA Astrophysics Data System (ADS)
Katz, Evgeny
2014-05-01
The paper presents an overview of recent advances in biosensors and bioactuators based on the biocomputing concept. Novel biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multi-analyte biosensing, which is particularly beneficial for biomedical applications. Multi-signal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection and alert to medical emergencies, along with an immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly exemplified for liver injury. Wide-ranging applications of multi-analyte digital biosensors in medicine, environmental monitoring and homeland security are anticipated. "Smart" bioactuators, for example for signal-triggered drug release, were designed by interfacing switchable electrodes and biocomputing systems. Integration of novel biosensing and bioactuating systems with biomolecular information processing systems holds promise for further scientific advances and numerous practical applications.
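A toy in-silico illustration of the digital multi-analyte idea: two biomarker concentrations are thresholded into logic values and combined through a Boolean gate into a YES/NO output. The marker names and thresholds below are invented for the sketch, not values from the paper:

# Toy illustration of digital multi-analyte biosensing: biomarker readings
# are thresholded to logic values and combined through Boolean gates into a
# YES/NO output. Marker names and thresholds are invented.
THRESHOLDS = {"marker_A": 0.8, "marker_B": 1.5}   # arbitrary units

def to_bit(name, concentration):
    return concentration > THRESHOLDS[name]

def injury_alert(conc_a, conc_b):
    a = to_bit("marker_A", conc_a)
    b = to_bit("marker_B", conc_b)
    # AND gate: alert only when both markers exceed their thresholds
    return "YES" if (a and b) else "NO"

for sample in [(0.5, 2.0), (1.2, 0.9), (1.1, 1.9)]:
    print(sample, "->", injury_alert(*sample))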
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Gentle, J.
2015-12-01
The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single page application (SPA) running atop a web and cloud based architecture utilizing open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture enabled interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices for developing, sharing, and documenting the codebase and application. Code is documented and shared via an online repository with the option for programmers to see, contribute to, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers. Metadata about the application has been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources. In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics with societal relevance.
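The project's actual geojson conversion utility is not shown here; as a minimal sketch of the kind of tabular-to-GeoJSON conversion such a utility might perform, consider the following, where the records and field names are invented:

# Minimal sketch of a tabular-to-GeoJSON conversion of the kind such a
# utility might perform (not the project's actual converter).
import json

def records_to_geojson(records):
    features = []
    for rec in records:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [rec["lon"], rec["lat"]]},
            "properties": {k: v for k, v in rec.items()
                           if k not in ("lon", "lat")},
        })
    return {"type": "FeatureCollection", "features": features}

records = [
    {"site": "A", "lon": -97.74, "lat": 30.27, "score": 0.82},
    {"site": "B", "lon": -97.71, "lat": 30.29, "score": 0.41},
]
print(json.dumps(records_to_geojson(records), indent=2))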
ERIC Educational Resources Information Center
Zeineddin, Ava; Abd-El-Khalick, Fouad
2010-01-01
Reasoning skills are major contributors to academic and everyday life success. Epistemological commitments (ECs) are believed to underlie reasoning processes and, when considered, could do much in delineating the complex nature of scientific reasoning. This study examined the relationship between ECs and scientific reasoning among college science…
Improving Scientific Voice in the Science Communication Center at UT Knoxville
ERIC Educational Resources Information Center
Hirst, Russel
2013-01-01
Many science students believe that scientific writing is most impressive (and most professionally acceptable) when impersonal, dense, complex, and packed with jargon. In particular, they have the idea that legitimate scientific writing must suppress the subjectivity of the human voice. But science students can mature into excellent writers whose…
The Media as an Invaluable Tool for Informal Earth System Science Education
NASA Astrophysics Data System (ADS)
James, E.; Gautier, C.
2001-12-01
One of the most widely utilized avenues for educating the general public about the Earth's environment is the media, be it print, radio or broadcast. Accurate and effective communication of issues in Earth System Science (ESS), however, is significantly hindered by the public's relative scientific illiteracy. Discussion of ESS concepts requires the laying down of a foundation of complex scientific information, which must first be conveyed to an incognizant audience before any strata of sophisticated social context can be appropriately considered. Despite such a substantial obstacle to be negotiated, the environmental journalist is afforded the unique opportunity of providing a broad-reaching informal scientific education to a largely scientifically uninformed population base. This paper will review the tools used by various environmental journalists to address ESS issues and consider how successful each of these approaches has been at conveying complex scientific messages to a general audience lacking sufficient scientific sophistication. Different kinds of media materials used to this effect will be analyzed for their ideas and concepts conveyed, as well as their effectiveness in reaching the public at large.
Achievements of agrophysics: review of new scientific divisions
USDA-ARS?s Scientific Manuscript database
This work is devoted to review the new scientific divisions that emerged in agrophysics in the last 10-15 years. Among them are the following: 1) application of Geographic Information Systems, 2) development and application of fuzzy multi attributive comparison of alternatives, 3) application of Ad...
Active Storage with Analytics Capabilities and I/O Runtime System for Petascale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok
Computational scientists must understand results from experimental, observational and computational simulation generated data to gain insights and perform knowledge discovery. As systems approach the petascale range, problems that were unimaginable a few years ago are within reach. With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive I/O, storage, acceleration of data manipulation, analysis, and mining tools. Scientists require techniques, tools and infrastructure to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis, statistical analysis and knowledge discovery. The goal of this work is to enable more effective analysis of scientific datasets through the integration of enhancements in the I/O stack, from active storage support at the file system layer to MPI-IO and high-level I/O library layers. We propose to provide software components to accelerate data analytics, mining, I/O, and knowledge discovery for large-scale scientific applications, thereby increasing productivity of both scientists and the systems. Our approaches include 1) design the interfaces in high-level I/O libraries, such as parallel netCDF, for applications to activate data mining operations at the lower I/O layers; 2) enhance MPI-IO runtime systems to incorporate the functionality developed as a part of the runtime system design; 3) develop parallel data mining programs as part of the runtime library for the server-side file system in the PVFS file system; and 4) prototype an active storage cluster, which will utilize multicore CPUs, GPUs, and FPGAs to carry out the data mining workload.
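The "push analytics toward the data" idea can be illustrated on a single node: compute running statistics while streaming a large array from disk in chunks, rather than reading everything back for separate post-processing. This sketch is not the project's PnetCDF/MPI-IO/PVFS implementation, only a simplified analogue:

# Single-node sketch of computing statistics while streaming data in chunks,
# i.e. moving simple analytics into the I/O path. Not the project's code.
import numpy as np

def streaming_stats(path, dtype=np.float64, chunk_elems=1_000_000):
    total, total_sq, count = 0.0, 0.0, 0
    vmin, vmax = np.inf, -np.inf
    data = np.memmap(path, dtype=dtype, mode="r")
    for start in range(0, data.size, chunk_elems):
        chunk = np.asarray(data[start:start + chunk_elems])
        total += chunk.sum()
        total_sq += np.square(chunk).sum()
        count += chunk.size
        vmin, vmax = min(vmin, chunk.min()), max(vmax, chunk.max())
    mean = total / count
    std = np.sqrt(total_sq / count - mean ** 2)
    return {"count": count, "min": vmin, "max": vmax, "mean": mean, "std": std}

# Example with a synthetic file (the file name is illustrative):
arr = np.random.default_rng(3).standard_normal(5_000_000)
arr.tofile("field.bin")
print(streaming_stats("field.bin"))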
Riding the Hype Wave: Evaluating new AI Techniques for their Applicability in Earth Science
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Zhang, J.; Maskey, M.; Lee, T. J.
2016-12-01
Every few years a new technology rides the hype wave generated by the computer science community. Converts to this new technology, who surface from both the science community and the informatics community, promulgate that it can radically improve or even change the existing scientific process. Recent examples of new technology following in the footsteps of "big data" include deep learning algorithms and knowledge graphs. Deep learning algorithms mimic the human brain and process information through multiple stages of transformation and representation. These algorithms are able to learn complex functions that map pixels directly to outputs without relying on human-crafted features and to solve some of the complex classification problems that exist in science. Similarly, knowledge graphs aggregate information around defined topics so that users can resolve a query without having to navigate and assemble information manually. Knowledge graphs could potentially be used in scientific research to assist in hypothesis formulation, testing, and review. The challenge for the Earth science research community is to evaluate these new technologies by asking the right questions and considering what-if scenarios. What is this new technology enabling or providing that is innovative and different? Can one justify the adoption costs with respect to the research returns? Since nothing comes for free, utilizing a new technology entails adoption costs that may outweigh the benefits. Furthermore, these technologies may require significant computing infrastructure in order to be utilized effectively. Results from two different projects will be presented along with lessons learned from testing these technologies. The first project primarily evaluates deep learning techniques for different applications of image retrieval within Earth science, while the second project builds a prototype knowledge graph constructed for hurricane science.
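For readers unfamiliar with the knowledge-graph side, a minimal sketch: facts are stored as subject-predicate-object triples and a query can be answered by chaining them, without manual assembly. The entities and facts below are invented for illustration and are not from the prototype described above:

# Minimal sketch of the knowledge-graph idea: facts as triples, queried
# directly. The entities and facts are invented for illustration.
TRIPLES = [
    ("Hurricane_Katrina", "category", "5"),
    ("Hurricane_Katrina", "year", "2005"),
    ("Hurricane_Katrina", "observed_by", "MODIS"),
    ("MODIS", "flies_on", "Aqua"),
    ("MODIS", "flies_on", "Terra"),
]

def query(subject=None, predicate=None, obj=None):
    # return all triples matching the non-None fields
    return [t for t in TRIPLES
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "Which instruments observed Hurricane Katrina, and what do they fly on?"
for _, _, instrument in query("Hurricane_Katrina", "observed_by"):
    platforms = [o for _, _, o in query(instrument, "flies_on")]
    print(instrument, "->", platforms)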
HackAttack: Game-Theoretic Analysis of Realistic Cyber Conflicts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M; Brady, Andrew C; Brady, Ethan J
Game theory is appropriate for studying cyber conflict because it allows for an intelligent and goal-driven adversary. Applications of game theory have led to a number of results regarding optimal attack and defense strategies. However, the overwhelming majority of applications explore overly simplistic games, often ones in which each participant's actions are visible to every other participant. These simplifications strip away the fundamental properties of real cyber conflicts: probabilistic alerting, hidden actions, unknown opponent capabilities. In this paper, we demonstrate that it is possible to analyze a more realistic game, one in which different resources have different weaknesses, players have different exploits, and moves occur in secrecy, but they can be detected. Certainly, more advanced and complex games are possible, but the game presented here is more realistic than any other game we know of in the scientific literature. While optimal strategies can be found for simpler games using calculus, case-by-case analysis, or, for stochastic games, Q-learning, our more complex game is more naturally analyzed using the same methods used to study other complex games, such as checkers and chess. We define a simple evaluation function and employ multi-step searches to create strategies. We show that such scenarios can be analyzed, and find that in cases of extreme uncertainty, it is often better to ignore one's opponent's possible moves. Furthermore, we show that a simple evaluation function in a complex game can lead to interesting and nuanced strategies.
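The evaluate-and-search pattern mentioned above (a simple evaluation function plus multi-step look-ahead) can be sketched with a fixed-depth minimax on an abstract toy game. The state variables, moves and scoring below are invented stand-ins, not the HackAttack model:

# Compact sketch of a simple evaluation function plus fixed-depth minimax
# look-ahead. The toy "game" is an abstract stand-in, not HackAttack.
def evaluate(state):
    # simple evaluation: attacker's compromised value minus detection risk
    return state["compromised"] - 2 * state["alerts"]

def moves(state, player):
    # hypothetical move generator: attacker exploits/probes, defender reacts
    if player == "attacker":
        return [("exploit", 3, 1), ("probe", 1, 0)]
    return [("patch", -2, 0), ("monitor", 0, 1)]

def apply(state, move):
    _, d_comp, d_alerts = move
    return {"compromised": state["compromised"] + d_comp,
            "alerts": max(0, state["alerts"] + d_alerts)}

def minimax(state, depth, player):
    if depth == 0:
        return evaluate(state), None
    maximizing = (player == "attacker")
    best_value, best_move = None, None
    for m in moves(state, player):
        value, _ = minimax(apply(state, m), depth - 1,
                           "defender" if maximizing else "attacker")
        if (best_value is None
                or (maximizing and value > best_value)
                or (not maximizing and value < best_value)):
            best_value, best_move = value, m
    return best_value, best_move

print(minimax({"compromised": 0, "alerts": 0}, depth=4, player="attacker"))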
Grady, Julia
2010-01-01
Teaching by inquiry is touted for its potential to encourage students to reason scientifically. Yet, even when inquiry teaching is practiced, complexity of students' reasoning may be limited or unbalanced. We describe an analytic tool for recognizing when students are engaged in complex reasoning during inquiry teaching. Using classrooms that represented “best case scenarios” for inquiry teaching, we adapted and applied a matrix to categorize the complexity of students' reasoning. Our results revealed points when students' reasoning was quite complex and occasions when their reasoning was limited by the curriculum, instructional choices, or students' unprompted prescription. We propose that teachers use the matrix as a springboard for reflection and discussion that takes a sustained, critical view of inquiry teaching practice. PMID:21113314
TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics
Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...
2015-04-16
Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators—parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
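A tiny illustration of the application-simulator idea: the compute-bound stage of each task is replaced by advancing simulated time in a discrete-event loop, so long runs can be characterized cheaply. Stage names and timings below are invented; this is not TADSim itself:

# Tiny illustration: compute-bound stages replaced by the passage of
# simulated time in a discrete-event loop. Stage names/timings are invented.
import heapq
import random

STAGE_TIME = {"md_block": 4.0, "nudge": 0.5, "accept_check": 0.1}

def simulate(n_events=10, seed=0):
    random.seed(seed)
    clock, events = 0.0, []
    # seed the event queue with the first compute block
    heapq.heappush(events, (STAGE_TIME["md_block"], "md_block", 0))
    completed = 0
    while events and completed < n_events:
        clock, stage, eid = heapq.heappop(events)
        if stage == "md_block":
            # instead of running the expensive computation, just advance
            # time and schedule the follow-on stage
            heapq.heappush(events, (clock + STAGE_TIME["nudge"], "nudge", eid))
        elif stage == "nudge":
            heapq.heappush(events,
                           (clock + STAGE_TIME["accept_check"], "accept_check", eid))
        else:  # accept_check
            completed += 1
            jitter = random.uniform(0.8, 1.2)
            heapq.heappush(events,
                           (clock + STAGE_TIME["md_block"] * jitter,
                            "md_block", eid + 1))
    return clock, completed

print("simulated wall clock: %.2f s for %d accepted events" % simulate())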
Burlamaque-Neto, A C; Santos, G R; Lisbôa, L M; Goldim, J R; Machado, C L B; Matte, U; Giugliani, R
2012-02-01
In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for the teaching/learning of the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by students of health sciences who participate in scientific projects in an academic research laboratory. An observational descriptive cross-sectional study was conducted using Edgar Morin complexity as theoretical reference. In a semi-structured interview, students were asked to solve an abstract logical puzzle - TanGram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. The solving of the TanGram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected on the students' opinions about the characteristics of a successful researcher. Students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.
Scientific Ethics: A New Approach.
Menapace, Marcello
2018-06-04
Science is an activity of the human intellect and as such has ethical implications that should be reviewed and taken into account. Although science and ethics have conventionally been considered different, it is herewith proposed that they are essentially similar. The proposal set forth here is to create a new ethics rooted in science: scientific ethics. Science has firm axiological foundations and searches for truth (as a value, axiology) and knowledge (epistemology). Hence, science cannot be value-neutral. Looking at standard scientific principles, it is possible to construct a scientific ethic (that is, an ethical framework based on scientific methods and rules), which can be applied to all sciences. These intellectual standards include the search for truth (honesty and its derivatives), human dignity (and by reflection the dignity of all animals) and respect for life. From these it is then possible to draft the foundation of an ethics based purely on science and applicable beyond the confines of science. A few applications of these principles will be presented. Scientific ethics can have vast applications in other fields, even non-scientific ones.
Cost-effective (gaming) motion and balance devices for functional assessment: Need or hype?
Bonnechère, B; Jansen, B; Van Sint Jan, S
2016-09-06
In the last decade, technological advances in the gaming industry have allowed the marketing of hardware for motion and balance control that is based on technological concepts similar to scientific and clinical equipment. Such hardware is attractive to researchers and clinicians for specific applications. However, some questions concerning their scientific value and the range of future potential applications have yet to be answered. This article attempts to present an objective analysis about the pros and cons of using such hardware for scientific and clinical purposes and calls for a constructive discussion based on scientific facts and practical clinical requests that are emerging from application fields. Copyright © 2016 Elsevier Ltd. All rights reserved.
Avengers Assemble! Using pop-culture icons to communicate science
2014-01-01
Engaging communication of complex scientific concepts with the general public requires more than simplification. Compelling, relevant, and timely points of linkage between scientific concepts and the experiences and interests of the general public are needed. Pop-culture icons such as superheroes can represent excellent opportunities for exploring scientific concepts in a mental “landscape” that is comfortable and familiar. Using an established icon as a familiar frame of reference, complex scientific concepts can then be discussed in a more accessible manner. In this framework, scientists and the general public use the cultural icon to occupy a commonly known performance characteristic. For example, Batman represents a globally recognized icon who represents the ultimate response to exercise and training. The physiology that underlies Batman's abilities can then be discussed and explored using real scientific examples that highlight truths and fallacies contained in the presentation of pop-culture icons. Critically, it is not important whether the popular representation of the icon shows correct science because the real science can be revealed in discussing the character through this lens. Scientists and educators can then use these icons as foils for exploring complex ideas in a context that is less threatening and more comfortable for the target audience. A “middle-ground hypothesis” for science communication is proposed in which pop-culture icons are used to explore scientific concepts in a bridging mental landscape that is comfortable and familiar. This approach is encouraged for communication with all nonscientists regardless of age. PMID:25039082
Biological network extraction from scientific literature: state of the art and challenges.
Li, Chen; Liakata, Maria; Rebholz-Schuhmann, Dietrich
2014-09-01
Networks of molecular interactions explain complex biological processes, and all known information on molecular events is contained in a number of public repositories including the scientific literature. Metabolic and signalling pathways are often viewed separately, even though both types are composed of interactions involving proteins and other chemical entities. It is necessary to be able to combine data from all available resources to judge the functionality, complexity and completeness of any given network overall, but especially the full integration of relevant information from the scientific literature is still an ongoing and complex task. Currently, the text-mining research community is steadily moving towards processing the full body of the scientific literature by making use of rich linguistic features such as full text parsing, to extract biological interactions. The next step will be to combine these with information from scientific databases to support hypothesis generation for the discovery of new knowledge and the extension of biological networks. The generation of comprehensive networks requires technologies such as entity grounding, coordination resolution and co-reference resolution, which are not fully solved and are required to further improve the quality of results. Here, we analyse the state of the art for the extraction of network information from the scientific literature and the evaluation of extraction methods against reference corpora, discuss challenges involved and identify directions for future research. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
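A deliberately minimal sketch of pattern-based interaction extraction from sentences is shown below; it falls far short of the full-text parsing, entity grounding and coreference resolution discussed in the review, and both the example sentences and the pattern are invented for illustration:

# Deliberately minimal sketch of pattern-based extraction of pairwise
# interactions from sentences. Example sentences and pattern are invented.
import re

SENTENCES = [
    "MEK1 phosphorylates ERK2 in the MAPK cascade.",
    "p53 activates MDM2 transcription.",
    "CytC binds Apaf-1 to form the apoptosome.",
]

# crude pattern: <Entity> <interaction-verb> <Entity>
PATTERN = re.compile(
    r"\b([A-Za-z0-9-]+)\s+(phosphorylates|activates|inhibits|binds)\s+([A-Za-z0-9-]+)")

edges = []
for sentence in SENTENCES:
    for subj, verb, obj in PATTERN.findall(sentence):
        edges.append((subj, verb, obj))

for subj, verb, obj in edges:
    print(f"{subj} -[{verb}]-> {obj}")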
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... Decision on Application for Duty-Free Entry of Scientific Instruments This is a decision pursuant to Section 6(c) of the Educational, Scientific, and Cultural Materials Importation Act of 1966 (Pub. L. 89- 651, as amended by Pub. L. 106-36; 80 Stat. 897; 15 CFR part 301). Related records can be viewed...
75 FR 53271 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... invite comments on the question of whether instruments of equivalent scientific value, for the purposes... structure of biological macromolecules, which will be observed under cryogenic conditions. Justification for...
77 FR 25960 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... invite comments on the question of whether instruments of equivalent scientific value, for the purposes... imaging biological and other materials samples. Justification for Duty-Free Entry: There are no...
77 FR 39682 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... oxides, metal chalcogenides, DNA, quantum dots, and carbon nanomaterials to determine their size, shape... Number: 12-031. Applicant: Penn State College of Medicine, 500 University Dr., Hershey, PA 17033... to further advance the body of research of the College of Medicine and the greater scientific...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-15
... reference product for the purpose of submitting a marketing application through an abbreviated licensure... submitting a marketing application through the abbreviated licensure pathway under section 351(k) of the... FDA will use to review applications for biosimilar products; and General scientific principles in...
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and in multiple formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and an SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics
NASA Technical Reports Server (NTRS)
Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan
2013-01-01
In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can benefit the efficiency of data post-processing, yet it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before they are written to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.
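The reorganization idea can be illustrated with a small, hypothetical sketch (this is not the STAR implementation; the record format, chunk sizes and array shapes are illustrative assumptions): records carrying explicit temporal and spatial indices are grouped into dense multi-dimensional blocks before being written, so that later queries along the time dimension touch contiguous chunks rather than scattered small elements.

```python
import numpy as np
from collections import defaultdict

def aggregate_spatiotemporal(records, t_chunk=8, s_chunk=64):
    """Group (time, space, value) records into dense chunked arrays.

    records : iterable of (t_index, s_index, value) tuples.
    Returns a dict mapping (t_block, s_block) -> dense numpy array, so that
    reads along the time dimension hit contiguous chunks. This is an
    illustrative reorganization, not the STAR code.
    """
    chunks = defaultdict(dict)
    for t, s, v in records:
        chunks[(t // t_chunk, s // s_chunk)][(t % t_chunk, s % s_chunk)] = v

    dense = {}
    for key, elems in chunks.items():
        block = np.full((t_chunk, s_chunk), np.nan)  # missing elements stay NaN
        for (ti, si), v in elems.items():
            block[ti, si] = v
        dense[key] = block  # each block would be written contiguously to storage
    return dense

# Example: 1000 scattered small elements collapse into a handful of dense blocks.
rng = np.random.default_rng(0)
recs = [(int(t), int(s), float(v))
        for t, s, v in zip(rng.integers(0, 32, 1000),
                           rng.integers(0, 256, 1000),
                           rng.random(1000))]
blocks = aggregate_spatiotemporal(recs)
print(len(blocks), "chunks")
```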
NASA Astrophysics Data System (ADS)
Martinez, Santa; Besse, Sebastien; Heather, Dave; Barbarisi, Isa; Arviset, Christophe; De Marchi, Guido; Barthelemy, Maud; Docasal, Ruben; Fraga, Diego; Grotheer, Emmanuel; Lim, Tanya; Macfarlane, Alan; Rios, Carlos; Vallejo, Fran; Saiz, Jaime; ESDC (European Space Data Centre) Team
2016-10-01
The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific datasets through various interfaces at http://archives.esac.esa.int/psa. All datasets are scientifically peer-reviewed by independent scientists, and are compliant with the Planetary Data System (PDS) standards. The PSA is currently implementing a number of significant improvements, mostly driven by the evolution of the PDS standard and the growing need for better interfaces and advanced applications to support science exploitation. The newly designed PSA will enhance the user experience and will significantly reduce the complexity for users to find their data, promoting one-click access to the scientific datasets with more specialised views when needed. This includes better integration with Planetary GIS analysis tools and Planetary interoperability services (search and retrieve data, supporting e.g. PDAP, EPN-TAP). It will also be kept up to date with versions 3 and 4 of the PDS standards, as PDS4 will be used for ESA's ExoMars and upcoming BepiColombo missions. Users will have direct access to documentation, information and tools that are relevant to the scientific use of the dataset, including ancillary datasets, Software Interface Specification (SIS) documents, and any tools/help that the PSA team can provide. A login mechanism will provide additional functionalities to the users to aid/ease their searches (e.g. saving queries, managing default views). This contribution will introduce the new PSA, its key features and access interfaces.
Sochat, Vanessa
2018-05-01
Here, we present the Scientific Filesystem (SCIF), an organizational format that supports exposure of executables and metadata for discoverability of scientific applications. The format includes a known filesystem structure, a definition for a set of environment variables describing it, and functions for generation of the variables and interaction with the libraries, metadata, and executables located within. SCIF makes it easy to expose metadata, multiple environments, installation steps, files, and entry points to render scientific applications consistent, modular, and discoverable. A SCIF can be installed on a traditional host or in a container technology such as Docker or Singularity. We start by reviewing the background and rationale for the SCIF, followed by an overview of the specification and the different levels of internal modules ("apps") that the organizational format affords. Finally, we demonstrate that SCIF is useful by implementing and discussing several use cases that improve user interaction and understanding of scientific applications. SCIF is released along with a client and integration in the Singularity 2.4 software to quickly install and interact with SCIF. When used inside of a reproducible container, a SCIF is a recipe for reproducibility and introspection of the functions and users that it serves. We use SCIF to evaluate container software, provide metrics, serve scientific workflows, and execute a primary function under different contexts. To encourage collaboration and sharing of applications, we developed tools along with an open source, version-controlled, tested, and programmatically accessible web infrastructure. SCIF and associated resources are available at https://sci-f.github.io. The ease of using SCIF, especially in the context of containers, offers promise for scientists' work to be self-documenting and programmatically parseable for maximum reproducibility. SCIF provides an abstraction from underlying programming languages and packaging logic for working with scientific applications, creating new opportunities for scientific software development.
Grant Application Development, Submission, Review, & Award
This infographic shows the National Cancer Institute's general timeline progression through grant application development, submission, review, and award. In the first month, the applicant prepares and submits a grant application to Grants.gov in response to an FOA. In month two, the Center for Scientific Review (CSR) assigns applications that fall under the category of R01s, etc., to a Scientific Review Group (SRG), or the CSR assigns applications that fall under the category of Program Projects and Center Grants to the NCI Division of Extramural Activities (DEA). In months four through five, first-level review by a Scientific Review Group (SRG) for scientific merit takes place: the SRG assigns impact scores. In month five, summary statements are prepared and made available to NCI program staff and applicants. In month six, second-level review by the National Cancer Advisory Board (NCAB) for NCI funding determination begins: the NCAB makes recommendations to the NCI Director, NCI develops a funding plan, applications are selected for funding, and “paylists” are forwarded to the Office of Grant Administration (OGA). In month ten, award negotiations and issuance take place: the award is issued, the award is received by the institution, and the investigator begins work. www.cancer.gov Icons made by Freepik from http://www.flaticon.com are licensed under CC BY 3.0.
50 CFR 15.22 - Permits for scientific research.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 1 2014-10-01 2014-10-01 false Permits for scientific research. 15.22... for scientific research. (a) Application requirements for permits for scientific research. Each... description of the scientific research to be conducted on the exotic bird requested, including: (i) Formal...
Constraint-based stoichiometric modelling from single organisms to microbial communities
Olivier, Brett G.; Bruggeman, Frank J.; Teusink, Bas
2016-01-01
Microbial communities are ubiquitously found in Nature and have direct implications for the environment, human health and biotechnology. The species composition and overall function of microbial communities are largely shaped by metabolic interactions such as competition for resources and cross-feeding. Although considerable scientific progress has been made towards mapping and modelling species-level metabolism, elucidating the metabolic exchanges between microorganisms and steering the community dynamics remain an enormous scientific challenge. In view of the complexity, computational models of microbial communities are essential to obtain systems-level understanding of ecosystem functioning. This review discusses the applications and limitations of constraint-based stoichiometric modelling tools, and in particular flux balance analysis (FBA). We explain this approach from first principles and identify the challenges one faces when extending it to communities, and discuss the approaches used in the field in view of these challenges. We distinguish between steady-state and dynamic FBA approaches extended to communities. We conclude that much progress has been made, but many of the challenges are still open. PMID:28334697
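For readers unfamiliar with the method, the core optimization problem that flux balance analysis solves can be stated in its standard textbook form (generic notation, not specific to the community-level extensions discussed in the review):

```latex
\max_{v} \; c^{\top} v
\quad \text{subject to} \quad
S\,v = 0, \qquad
v^{\min} \le v \le v^{\max},
```

where S is the stoichiometric matrix of the metabolic network, v is the vector of steady-state reaction fluxes, the bounds encode thermodynamic and capacity constraints, and c selects the objective (e.g., biomass yield). Community-level formulations couple several such models through shared exchange fluxes, which is where many of the open challenges discussed in the review arise.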
Building translational ecology communities of practice: insights from the field
Lawson, Dawn M.; Hall, Kimberly R.; Yung, Laurie; Enquist, Carolyn A. F.
2017-01-01
Translational ecology (TE) prioritizes the understanding of social systems and decision contexts in order to address complex natural resource management issues. Although many practitioners in applied fields employ translational tactics, the body of literature addressing such approaches is limited. We present several case studies illustrating the principles of TE and the diversity of its applications. We anticipate that these examples will help others develop scientific products that decision makers can use “off the shelf” when solving critical ecological and social challenges. Our collective experience suggests that research of such immediate utility is rare. Long‐term commitment to working directly with partners to develop and reach shared goals is central to successful translation. The examples discussed here highlight the benefits of translational processes, including actionable scientific results, more informed policy making, increased investment in science‐driven solutions, and inspiration for partnerships. We aim to facilitate future TE‐based projects and build momentum for growing this community of practice.
Parallel Computation of the Regional Ocean Modeling System (ROMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, P; Song, Y T; Chao, Y
2005-04-05
The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.
Complexity, information loss, and model building: from neuro- to cognitive dynamics
NASA Astrophysics Data System (ADS)
Arecchi, F. Tito
2007-06-01
A scientific problem described within a given code is mapped onto a corresponding computational problem. We call (algorithmic) complexity the bit length of the shortest instruction which solves the problem. Deterministic chaos in general affects a dynamical system, making the corresponding problem experimentally and computationally heavy, since one must reset the initial conditions at a rate higher than that of information loss (Kolmogorov entropy). One can control chaos by adding to the system new degrees of freedom (information swapping: information lost by chaos is replaced by that arising from the new degrees of freedom). This implies a change of code, or a new augmented model. Within a single code, changing hypotheses is equivalent to fixing different sets of control parameters, each with a different a-priori probability, to be then confirmed and transformed into an a-posteriori probability via Bayes' theorem. Sequential application of Bayes' rule is nothing else than the Darwinian strategy in evolutionary biology. The sequence is a steepest-ascent algorithm, which stops once maximum probability has been reached. At this point the hypothesis exploration stops. By changing code (and hence the set of relevant variables) one can start again to formulate new classes of hypotheses. We call semantic complexity the number of accessible scientific codes, or models, that describe a situation. It is however a fuzzy concept, in so far as this number changes due to interaction of the operator with the system under investigation. These considerations are illustrated with reference to a cognitive task, starting from synchronization of neuron arrays in a perceptual area and tracing the putative path toward model building.
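The sequential application of Bayes' rule referred to above can be written compactly in its standard form (generic notation, not the paper's):

```latex
P(h \mid d_1, \ldots, d_n) \;\propto\; P(d_n \mid h)\, P(h \mid d_1, \ldots, d_{n-1}),
```

so each new datum d_n updates the posterior probability of hypothesis h obtained from the earlier data; iterating the update and retaining the most probable hypothesis is the steepest-ascent search that the author likens to a Darwinian selection strategy.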
Barnfield, Sarah; Pitts, Alison Clara; Kalaria, Raj; Allan, Louise; Tullo, Ellen
2017-01-01
Why did we do this study? It can be difficult for scientists to communicate their research findings to the public. This is partly due to the complexity of translating scientific language into words that the public understand. Further, it may be hard for the public to find out about and locate information about research studies. We aimed to adapt some scientific articles about the links between dementia and stroke into lay summaries to be displayed online for the general public. How did we do it? We collaborated with five people from a volunteer organisation, VOICENorth. They took part in two group discussions about studies reporting on the link between dementia and stroke, and selected four studies to translate into lay summaries and display on a website. We discussed the layout and language of the summaries and made adaptations to make them more understandable to the general public. What did we find? We were able to work with members of the public to translate research findings into lay summaries suitable for a general audience. We made changes to language and layout including the use of 'question and answer' style layouts, the addition of a reference list of scientific terms, and removing certain words. What does this mean? Working with members of the public is a realistic way to create resources that improve the accessibility of research findings to the wider public. Background Scientific research is often poorly understood by the general public and difficult for them to access. This presents a major barrier to disseminating and translating research findings. Stroke and dementia are both major public health issues, and research has shown lifestyle measures help to prevent them. This project aimed to select a series of studies from the Newcastle Cognitive Function after Stroke cohort (COGFAST) and create lay summaries comprehensible and accessible to the public. Methods We used a focus group format to collaborate with five members of the public to review COGFAST studies, prioritise those of most interest to the wider public, and modify the language and layout of the selected lay summaries. Focus groups were audio-taped and the team used the data to make iterative amendments, as suggested by members of the public, to the summaries and to a research website. We calculated the Flesch reading ease and Flesch-Kincaid grade level for each summary before and after the changes were made. Results In total, we worked with five members of the public in two focus groups to examine draft lay summaries, created by researchers, relating to eight COGFAST studies. Members of the public prioritised four COGFAST lay summaries according to the importance of the topic to the general public. We made a series of revisions to the summaries including the use of 'question and answer' style layouts, the addition of a glossary, and the exclusion of scientific jargon. Group discussion highlighted that lay summaries should be engaging, concise and comprehensible. We incorporated suggestions from members of the public into the design of a study website to display the summaries. The application of existing quantitative tools to estimate readability resulted in an apparently paradoxical increase in complexity of the lay summaries following the changes made. Conclusion This study supports previous literature demonstrating challenges in creating generic guidelines for researchers to create lay summaries. Existing quantitative metrics to assess readability may be inappropriate for assessing scientific lay summaries. 
We have shown it is feasible and successful to involve members of the public to create lay summaries to communicate the findings of complex scientific research. Trial registration Not applicable to the lay summary project.
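For reference, the two readability metrics applied in the study above are standard formulas that depend only on word, sentence and syllable counts:

```latex
\text{Flesch Reading Ease} = 206.835 - 1.015\,\frac{N_{\text{words}}}{N_{\text{sentences}}} - 84.6\,\frac{N_{\text{syllables}}}{N_{\text{words}}},
\qquad
\text{Flesch--Kincaid Grade} = 0.39\,\frac{N_{\text{words}}}{N_{\text{sentences}}} + 11.8\,\frac{N_{\text{syllables}}}{N_{\text{words}}} - 15.59.
```

Because both scores track only sentence length and word length, additions such as a glossary or question-style headings can worsen the computed readability even when lay comprehension improves, which may account for the apparently paradoxical result reported above.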
A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science
Kumar, Vipin
2014-01-01
Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency and the abundance of climate data, data science has had little impact on furthering our understanding of our planet. This is a stark contrast with other fields such as advertising or electronic commerce, where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities of mining large climate datasets, with an emphasis on the nuanced differences between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill its promise with regard to climate science applications. More importantly, we highlight research showing that relying solely on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques and the results-interpretation process to extract accurate insight from large climate data. PMID:25276499
UMAMI: A Recipe for Generating Meaningful Metrics through Holistic I/O Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lockwood, Glenn K.; Yoo, Wucherl; Byna, Suren
I/O efficiency is essential to productivity in scientific computing, especially as many scientific domains become more data-intensive. Many characterization tools have been used to elucidate specific aspects of parallel I/O performance, but analyzing components of complex I/O subsystems in isolation fails to provide insight into critical questions: how do the I/O components interact, what are reasonable expectations for application performance, and what are the underlying causes of I/O performance problems? To address these questions while capitalizing on existing component-level characterization tools, we propose an approach that combines on-demand, modular synthesis of I/O characterization data into a unified monitoring and metrics interface (UMAMI) to provide a normalized, holistic view of I/O behavior. We evaluate the feasibility of this approach by applying it to a month-long benchmarking study on two distinct large-scale computing platforms. We present three case studies that highlight the importance of analyzing application I/O performance in context with both contemporaneous and historical component metrics, and we provide new insights into the factors affecting I/O performance. By demonstrating the generality of our approach, we lay the groundwork for a production-grade framework for holistic I/O analysis.
Convolution- and Fourier-transform-based reconstructors for pyramid wavefront sensor.
Shatokhina, Iuliia; Ramlau, Ronny
2017-08-01
In this paper, we present two novel algorithms for wavefront reconstruction from pyramid-type wavefront sensor data. An overview of the current state of the art in the application of pyramid-type wavefront sensors shows that the novel algorithms can be applied in various scientific fields such as astronomy, ophthalmology, and microscopy. Assuming a computationally very challenging setting corresponding to the extreme adaptive optics (XAO) on the European Extremely Large Telescope, we present the results of the performed end-to-end simulations and compare the achieved AO correction quality (in terms of the long-exposure Strehl ratio) to other methods, such as matrix-vector multiplication and the preprocessed cumulative reconstructor with domain decomposition. We also compare our novel algorithms to other existing methods for this type of sensor in terms of applicability, computational complexity and closed-loop performance.
Statistical processing of large image sequences.
Khellah, F; Fieguth, P; Murray, M J; Allen, M
2005-01-01
The dynamic estimation of large-scale stochastic image sequences, as frequently encountered in remote sensing, is important in a variety of scientific applications. However, the size of such images makes conventional dynamic estimation methods, for example the Kalman and related filters, impractical. In this paper, we present an approach that emulates the Kalman filter, but with considerably reduced computational and storage requirements. Our approach is illustrated in the context of a 512 x 512 image sequence of ocean surface temperature. The static estimation step, the primary contribution here, uses a mixture of stationary models to accurately mimic the effect of a nonstationary prior, reducing both computational complexity and modeling effort. Our approach provides an efficient, stable, positive-definite model which is consistent with the given correlation structure. Thus, the methods of this paper may find application in modeling and single-frame estimation.
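For context, the recursion that the authors emulate is the standard Kalman filter (textbook notation, not the paper's reduced-order scheme), with state estimate x̂, covariance P, dynamics A, observation matrix H, and noise covariances Q and R:

```latex
\hat{x}_{k|k-1} = A\,\hat{x}_{k-1|k-1}, \qquad
P_{k|k-1} = A\,P_{k-1|k-1}\,A^{\top} + Q, \\
K_k = P_{k|k-1}\,H^{\top}\bigl(H\,P_{k|k-1}\,H^{\top} + R\bigr)^{-1}, \\
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\bigl(y_k - H\,\hat{x}_{k|k-1}\bigr), \qquad
P_{k|k} = (I - K_k H)\,P_{k|k-1}.
```

For a 512 x 512 field the state has about 2.6 x 10^5 elements, so the full covariance P would hold close to 7 x 10^10 entries; this storage and computation burden is what the mixture-of-stationary-models approximation avoids.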
30 CFR 251.3 - Authority and applicability of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicability of this part. MMS authorizes you to conduct exploration or scientific research activities under... agencies are exempt from the regulations in this part. (c) G&G exploration or G&G scientific research...
NASA Astrophysics Data System (ADS)
Ehlmann, Bryon K.
Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.
78 FR 15729 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... Emphasis Panel; RFA Panel: Molecular and Cellular Substrates of Complex Brain Disorders. Date: March 29... Scientific Review Special Emphasis Panel; Member Conflict: Genetics of Disease. Date: March 29, 2013. Time: 1...
Brouard, Benoit; Bardo, Pascale; Bonnet, Clément; Mounier, Nicolas; Vignot, Marina; Vignot, Stéphane
2016-11-01
Mobile applications represent promising tools in the management of chronic diseases, both for patients and healthcare professionals, and especially in oncology. Among the large number of mobile health (mhealth) applications available in mobile stores, it can be difficult for users to identify the most relevant ones. This study evaluated the business model and the scientific validation for mobile applications related to oncology. A systematic review was performed over the two major marketplaces. Purpose, scientific validation, and source of funding were evaluated according to the description of applications in stores. Results were stratified according to the targeted audience (general population/patients/healthcare professionals). Five hundred and thirty-nine applications related to oncology were identified: 46.8% dedicated to healthcare professionals, 31.5% to the general population, and 21.7% to patients. A lack of information about healthcare professionals' involvement in the development process was noted, since only 36.5% of applications mentioned an obvious scientific validation. Most apps were free (72.2%) and without explicit support by industry (94.2%). There is a need to enforce independent review of mhealth applications in oncology. The economic model could be questioned and the source of funding should be clarified. Meanwhile, patients and healthcare professionals should remain cautious about applications' contents. Key messages: A systematic review was performed to describe the mobile applications related to oncology and it revealed a lack of information on scientific validation and funding. Independent scientific review and the reporting of conflicts of interest should be encouraged. Users, and all health professionals, should be aware that health applications, whatever the quality of their content, do not actually embrace such an approach.
Computational modelling of oxygenation processes in enzymes and biomimetic model complexes.
de Visser, Sam P; Quesne, Matthew G; Martin, Bodo; Comba, Peter; Ryde, Ulf
2014-01-11
With computational resources becoming more efficient and more powerful and at the same time cheaper, computational methods have become more and more popular for studies on biochemical and biomimetic systems. Although large efforts from the scientific community have gone into exploring the possibilities of computational methods for studies on large biochemical systems, such studies are not without pitfalls and often cannot be routinely done but require expert execution. In this review we summarize and highlight advances in computational methodology and its application to enzymatic and biomimetic model complexes. In particular, we emphasize on topical and state-of-the-art methodologies that are able to either reproduce experimental findings, e.g., spectroscopic parameters and rate constants, accurately or make predictions of short-lived intermediates and fast reaction processes in nature. Moreover, we give examples of processes where certain computational methods dramatically fail.
NASA Astrophysics Data System (ADS)
Vaidyanathan, S.; Akgul, A.; Kaçar, S.; Çavuşoğlu, U.
2018-02-01
Hyperjerk systems have received significant interest in the literature because of their simple structure and complex dynamical properties. This work presents a new chaotic hyperjerk system having two exponential nonlinearities. Dynamical properties of the chaotic hyperjerk system are discovered through equilibrium point analysis, bifurcation diagram, dissipativity and Lyapunov exponents. Moreover, an adaptive backstepping controller is designed for the synchronization of the chaotic hyperjerk system. Also, a real circuit of the chaotic hyperjerk system has been carried out to show the feasibility of the theoretical hyperjerk model. The chaotic hyperjerk system can also be useful in scientific fields such as Random Number Generators (RNGs), data security, data hiding, etc. In this work, three implementations of the chaotic hyperjerk system, viz. RNG, image encryption and sound steganography have been performed by using complex dynamics characteristics of the system.
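A hyperjerk system is a single-variable ODE of order four or higher; the sketch below integrates a generic fourth-order example with one exponential nonlinearity purely for illustration. It is not the system reported in the paper (whose exact equations and parameters are not given in the abstract), so the right-hand side and coefficients here are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

def hyperjerk(t, state, a=1.0, b=0.5, c=1.0):
    """Generic 4th-order hyperjerk flow written as a first-order system.

    state = (x, dx, d2x, d3x); the highest derivative depends on the lower
    ones through an exponential nonlinearity (illustrative form only).
    """
    x, dx, d2x, d3x = state
    d4x = -a * d3x - b * d2x - dx - c * (np.exp(x) - 1.0)
    return [dx, d2x, d3x, d4x]

# Integrate from a small initial perturbation and inspect the trajectory.
sol = solve_ivp(hyperjerk, (0.0, 200.0), [0.1, 0.0, 0.0, 0.0],
                max_step=0.01, dense_output=True)
print(sol.y.shape)  # (4, n_steps): x and its first three derivatives
```

Qualitative properties such as chaos depend on the particular nonlinearities and parameter values, which is exactly what the paper characterizes via bifurcation diagrams and Lyapunov exponents.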
Developments in the Tools and Methodologies of Synthetic Biology
Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul
2014-01-01
Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788
Contribution of Mössbauer spectroscopy to the investigation of Fe/S biogenesis.
Garcia-Serres, Ricardo; Clémancey, Martin; Latour, Jean-Marc; Blondin, Geneviève
2018-01-19
Fe/S cluster biogenesis involves a complex machinery comprising several mitochondrial and cytosolic proteins. Fe/S cluster biosynthesis is closely intertwined with iron trafficking in the cell. Defects in Fe/S cluster elaboration result in severe diseases such as Friedreich ataxia. Deciphering this machinery is a challenge for the scientific community. Because iron is a key player, 57Fe-Mössbauer spectroscopy is especially appropriate for the characterization of Fe species and monitoring the iron distribution. This minireview intends to illustrate how Mössbauer spectroscopy contributes to unravelling steps in Fe/S cluster biogenesis. Studies were performed on isolated proteins that may be present in multiple protein complexes. For the past few decades, Mössbauer spectroscopy has also been performed on whole cells or on isolated compartments such as mitochondria and vacuoles, affording an overview of the iron trafficking. This minireview aims at presenting selected applications of 57Fe-Mössbauer spectroscopy to Fe/S cluster biogenesis.
NASA Astrophysics Data System (ADS)
Sorensen, A. E.; Dauer, J. M.; Corral, L.; Fontaine, J. J.
2017-12-01
A core component of public scientific literacy, and thereby informed decision-making, is the ability of individuals to reason about complex systems. In response to students having difficulty learning about complex systems, educational research suggests that conceptual representations, or mental models, may help orient student thinking. Mental models provide a framework to support students in organizing and developing ideas. The PMC-2E model is a productive tool in teaching ideas of modeling complex systems in the classroom because the conceptual representation framework allows for self-directed learning where students can externalize systems thinking. Beyond mental models, recent work emphasizes the importance of facilitating integration of authentic science into the formal classroom. To align these ideas, a university class was developed around the theme of carnivore ecology, founded on PMC-2E framework and authentic scientific data collection. Students were asked to develop a protocol, collect, and analyze data around a scientific question in partnership with a scientist, and then use data to inform their own learning about the system through the mental model process. We identified two beneficial outcomes (1) scientific data is collected to address real scientific questions at a larger scale and (2) positive outcomes for student learning and views of science. After participating in the class, students report enjoying class structure, increased support for public understanding of science, and shifts in nature of science and interest in pursuing science metrics on post-assessments. Further work is ongoing investigating the linkages between engaging in authentic scientific practices that inform student mental models, and how it might promote students' systems-thinking skills, implications for student views of nature of science, and development of student epistemic practices.
Design tools for complex dynamic security systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson
2007-01-01
The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and math is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include: insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Process, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next-generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle, and an unmanned aerial vehicle.
ObsPy: Establishing and maintaining an open-source community package
NASA Astrophysics Data System (ADS)
Krischer, L.; Megies, T.; Barsch, R.
2017-12-01
Python's ecosystem has evolved into one of the most powerful and productive research environments across disciplines. ObsPy (https://obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It does so by offering read and write support for essentially every commonly used data format in seismology, integrated access to the largest data centers, web services and real-time data streams, a powerful signal processing toolbox tuned to the specific needs of seismologists, and utility functionality such as travel time calculations, geodetic functions and data visualizations. ObsPy has been in constant unfunded development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. By now around 70 people have directly contributed code to ObsPy and we aim to make it a self-sustaining community project. This contribution focuses on several meta aspects of open-source software in science, in particular how we experienced them. During the panel we would like to discuss obvious questions like long-term sustainability with very limited to no funding, insufficient computer science training in many sciences, and gaining hard scientific credit for software development, but also the following questions: How to best deal with the fact that a lot of scientific software is very specialized, and thus usually solves a complex problem but can only ever reach a limited pool of developers and users by virtue of being so specialized? The "many eyes on the code" approach to developing and improving open-source software therefore only applies in a limited fashion. An initial publication for a significant new scientific software package is fairly straightforward; how does one on-board and motivate potential new contributors when they can no longer be lured by a potential co-authorship? When is spending significant time and effort on reusable scientific open-source development a reasonable choice for young researchers? The effort needed to produce purpose-tailored code for a single application resulting in a scientific publication is significantly less than that needed to generalise and engineer the code well enough to be used by others.
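As an illustration of the kind of one-stop functionality described above, the snippet below fetches a waveform from a public FDSN data centre and applies a basic filter with ObsPy. The station, time window and filter band are arbitrary example choices, and availability of the requested data is not guaranteed.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Connect to a public FDSN web service (IRIS is one of the supported data centres).
client = Client("IRIS")

# Request one hour of broadband vertical-component data for an example station.
t0 = UTCDateTime("2011-03-11T05:46:00")
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600)

# Basic processing with the built-in signal toolbox.
st.detrend("demean")
st.filter("bandpass", freqmin=0.01, freqmax=0.1)
print(st)      # summary of the Stream and its Trace objects
# st.plot()    # uncomment for a quick-look waveform plot
```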
NASA Technical Reports Server (NTRS)
Keeley, J. T.
1976-01-01
Guidelines and general requirements applicable to the development of instrument flight hardware intended for use on the GSFC Shuttle Scientific Payloads Program are given. Criteria, guidelines, and an organized approach to specifying the appropriate level of requirements for each instrument in order to permit its development at minimum cost while still assuring crew safety, are included. It is recognized that the instruments for these payloads will encompass wide ranges of complexity, cost, development risk, and safety hazards. The flexibility required to adapt the controls, documentation, and verification requirements in accord with the specific instrument is provided.
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis that included over 250,000 jobs using resources available through SCEC and the TeraGrid. © 2006 IEEE.
Estimating varying coefficients for partial differential equation models.
Zhang, Xinyu; Cao, Jiguo; Carroll, Raymond J
2017-09-01
Partial differential equations (PDEs) are used to model complex dynamical systems in multiple dimensions, and their parameters often have important scientific interpretations. In some applications, PDE parameters are not constant but can change depending on the values of covariates, a feature that we call varying coefficients. We propose a parameter cascading method to estimate varying coefficients in PDE models from noisy data. Our estimates of the varying coefficients are shown to be consistent and asymptotically normally distributed. The performance of our method is evaluated by a simulation study and by an empirical study estimating three varying coefficients in a PDE model arising from LIDAR data. © 2017, The International Biometric Society.
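To make the notion of varying coefficients concrete, a minimal hypothetical example (not the LIDAR model analysed in the paper) is a one-dimensional diffusion equation whose coefficient depends on a covariate z:

```latex
\frac{\partial u(t,x)}{\partial t} \;=\; \theta(z)\,\frac{\partial^{2} u(t,x)}{\partial x^{2}},
```

where the function θ(·) is unknown and must be estimated from noisy observations of u. Broadly, parameter cascading represents u with a basis (e.g., spline) expansion, treats the basis coefficients as nuisance parameters defined implicitly by the PDE fit, and profiles them out while estimating θ.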
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan
We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of the architecture design space by combining hardware counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM Blue Gene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including the instruction pipeline, cache, and memory, and to achieve a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.
The ups and downs of peer review.
Benos, Dale J; Bashari, Edlira; Chaves, Jose M; Gaggar, Amit; Kapoor, Niren; LaFrance, Martin; Mans, Robert; Mayhew, David; McGowan, Sara; Polter, Abigail; Qadri, Yawar; Sarfare, Shanta; Schultz, Kevin; Splittgerber, Ryan; Stephenson, Jason; Tower, Cristy; Walton, R Grace; Zotov, Alexander
2007-06-01
This article traces the history of peer review of scientific publications, plotting the development of the process from its inception to its present-day application. We discuss the merits of peer review and its weaknesses, both perceived and real, as well as the practicalities of several major proposed changes to the system. It is our hope that readers will gain a better appreciation of the complexities of the process and, when serving as reviewers themselves, will do so in a manner that will enhance the utility of the exercise. We also propose the development of an international on-line training program for accreditation of potential referees.
ART AND SCIENCE OF IMAGE MAPS.
Kidwell, Richard D.; McSweeney, Joseph A.
1985-01-01
The visual image of reflected light is influenced by the complex interplay of human color discrimination, spatial relationships, surface texture, and the spectral purity of light, dyes, and pigments. Scientific theories of image processing may not always achieve acceptable results as the variety of factors, some psychological, are in part, unpredictable. Tonal relationships that affect digital image processing and the transfer functions used to transform from the continuous-tone source image to a lithographic image, may be interpreted for an insight of where art and science fuse in the production process. The application of art and science in image map production at the U. S. Geological Survey is illustrated and discussed.
A review of lighter-than-air progress in the United States and its technological significance
NASA Technical Reports Server (NTRS)
Mayer, N. J.; Krida, R. H.
1977-01-01
Lighter-than-air craft for transportation and communications systems are discussed, with attention given to tethered balloons used to provide stable platforms for airborne surveillance equipment, freight-carrying balloons, manned scientific research balloons such as Atmosat, high-altitude superpressure aerostats employed in satellite communications systems, airport feeder airships, and naval surveillance airships. In addition, technical problems associated with the development of advanced aerostats, including the aerodynamics of hybrid combinations of large rotor systems and aerostat hulls, the application of composites to balloon shells, computer analyses of the complex geometrical structures of aerostats and propulsion systems for airships, are considered.
Boriani, Elena; Esposito, Roberto; Frazzoli, Chiara; Fantke, Peter; Hald, Tine; Rüegg, Simon R.
2017-01-01
Health intervention systems are complex and subject to multiple variables in different phases of implementation. This constitutes a concrete challenge for the application of translational science in real life. Complex systems as health-oriented interventions call for interdisciplinary approaches with carefully defined system boundaries. Exploring individual components of such systems from different viewpoints gives a wide overview and helps to understand the elements and the relationships that drive actions and consequences within the system. In this study, we present an application and assessment of a framework with focus on systems and system boundaries of interdisciplinary projects. As an example on how to apply our framework, we analyzed ALERT [an integrated sensors and biosensors’ system (BEST) aimed at monitoring the quality, health, and traceability of the chain of the bovine milk], a multidisciplinary and interdisciplinary project based on the application of measurable biomarkers at strategic points of the milk chain for improved food security (including safety), human, and ecosystem health (1). In fact, the European food safety framework calls for science-based support to the primary producers’ mandate for legal, scientific, and ethical responsibility in food supply. Because of its multidisciplinary and interdisciplinary approach involving human, animal, and ecosystem health, ALERT can be considered as a One Health project. Within the ALERT context, we identified the need to take into account the main actors, interactions, and relationships of stakeholders to depict a simplified skeleton of the system. The framework can provide elements to highlight how and where to improve the project development when project evaluations are required. PMID:28804707
Translations on USSR Science and Technology, Physical Sciences and Technology, Number 24
1977-11-30
UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, No 3, 1977 (UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, May/Jun 77)... CYBERNETICS, COMPUTERS AND AUTOMATION TECHNOLOGY... (insert pp 5-8) [Five articles from the insert] [Text] The organizing of the scientific and production complexes in the "Svetlana" association has... documentation and issuing copies to the corresponding subdivisions of the NPK [scientific and production complex], work got underway on a broad...
75 FR 23669 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
... invite comments on the question of whether instruments of equivalent scientific value, for the purposes... biological interactions at the nano scale. Justification for Duty-Free Entry: There are no instruments of the...
Laurenne, Nina; Tuominen, Jouni; Saarenmaa, Hannu; Hyvönen, Eero
2014-01-01
The scientific names of plants and animals play a major role in the Life Sciences, as information is indexed, integrated, and searched using scientific names. The main problem with names is their ambiguous nature, because more than one name may point to the same taxon and multiple taxa may share the same name. In addition, scientific names change over time, which makes them open to various interpretations. Applying machine-understandable semantics to these names enables efficient processing of biological content in information systems. The first step is to use unique persistent identifiers instead of name strings when referring to taxa. The most commonly used identifiers are Life Science Identifiers (LSIDs), which are traditionally used in relational databases, and more recently HTTP URIs, which are applied on the Semantic Web by Linked Data applications. We introduce two models for expressing taxonomic information in the form of species checklists. First, we show how species checklists are presented in a relational database system using LSIDs. Then, in order to gain a more detailed representation of taxonomic information, we introduce the meta-ontology TaxMeOn to model the same content as Semantic Web ontologies where taxa are identified using HTTP URIs. We also explore how changes in scientific names can be managed over time. The use of HTTP URIs is preferable for presenting the taxonomic information of species checklists. An HTTP URI identifies a taxon and operates as a web address from which additional information about the taxon can be located, unlike an LSID. This enables the integration of biological data from different sources on the web using Linked Data principles and prevents the formation of information silos. The Linked Data approach allows a user to assemble information and evaluate the complexity of taxonomic data based on conflicting views of taxonomic classifications. Using HTTP URIs and Semantic Web technologies also facilitates the representation of the semantics of biological data and, in this way, the creation of more "intelligent" biological applications and services.
Error-rate prediction for programmable circuits: methodology, tools and studied cases
NASA Astrophysics Data System (ADS)
Velazco, Raoul
2013-05-01
This work presents an approach to predict the error rates due to Single Event Upsets (SEUs) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault-injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of the predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measures issued from radiation ground testing performed at the Cyclone cyclotron of the HIF (Heavy Ion Facility) at Louvain-la-Neuve (Belgium).
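The prediction strategy described above is often summarized as a product of a beam-measured cross-section and a fault-injection-derived error probability. The formula below is a generic, hedged rendering of that idea, not necessarily the exact expression used in the paper:

```latex
\tau_{\text{app}} \;\approx\; \sigma_{\text{static}} \,\times\, \Phi \,\times\, \frac{N_{\text{err}}}{N_{\text{inj}}},
```

where σ_static is the per-device upset cross-section measured during radiation ground testing, Φ is the particle flux of the target environment, and N_err/N_inj is the fraction of injected SEUs that produced an observable application-level error during the off-beam fault-injection campaigns.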
77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...
Exploring Scientific Information for Policy Making under Deep Uncertainty
NASA Astrophysics Data System (ADS)
Forni, L.; Galaitsi, S.; Mehta, V. K.; Escobar, M.; Purkey, D. R.; Depsky, N. J.; Lima, N. A.
2016-12-01
Each actor evaluating potential management strategies brings her/his own distinct set of objectives to a complex decision space of system uncertainties. The diversity of these objectives requires detailed and rigorous analyses that respond to multifaceted challenges. However, the utility of this information depends on the accessibility of scientific information to decision makers. This paper demonstrates data visualization tools for presenting scientific results to decision makers in two case studies, La Paz/El Alto, Bolivia, and Yuba County, California. Visualization output from the case studies combines spatiotemporal, multivariate and multirun/multiscenario information to produce results corresponding to the objectives defined by key actors and stakeholders. These tools can manage complex data and distill scientific information into accessible formats. Using the visualizations, scientists and decision makers can navigate the decision space and potential objective trade-offs to facilitate discussion and consensus building. These efforts can support identifying stable negotiated agreements between different stakeholders.
Beem, Betsi
2012-05-01
This paper argues that information produced and then taken up for policy decision making is a function of a complex interplay within the scientific community and between scientists and the broader policy network who are all grappling with issues in a complex environment with a high degree of scientific uncertainty. The dynamics of forming and re-forming the scientific community are shaped by political processes, as are the directions and questions scientists attend to in their roles as policy advisors. Three factors: 1) social construction of scientific communities, 2) the indeterminacy of science, and 3) demands by policy makers to have concrete information for decision making; are intertwined in the production and dissemination of information that may serve as the basis for policy learning. Through this process, however, what gets learned may not be what is needed to mitigate the problem, be complete in terms of addressing multiple causations, or be correct.
Key Scientific Issues in the Health Risk Assessment of Trichloroethylene
Chiu, Weihsueh A.; Caldwell, Jane C.; Keshava, Nagalakshmi; Scott, Cheryl Siegel
2006-01-01
Trichloroethylene (TCE) is a common environmental contaminant at hazardous waste sites and in ambient and indoor air. Assessing the human health risks of TCE is challenging because of its inherently complex metabolism and toxicity and the widely varying perspectives on a number of critical scientific issues. Because of this complexity, the U.S. Environmental Protection Agency (EPA) drew upon scientific input and expertise from a wide range of groups and individuals in developing its 2001 draft health risk assessment of TCE. This scientific outreach, which was aimed at engaging a diversity of perspectives rather than developing consensus, culminated in 2000 with 16 state-of-the-science articles published together as an Environmental Health Perspectives supplement. Since that time, a substantial amount of new scientific research has been published that is relevant to assessing TCE health risks. Moreover, a number of difficult or controversial scientific issues remain unresolved and are the subject of a scientific consultation with the National Academy of Sciences coordinated by the White House Office of Science and Technology Policy and co-sponsored by a number of federal agencies, including the U.S. EPA. The articles included in this mini-monograph provide a scientific update on the most prominent of these issues: the pharmacokinetics of TCE and its metabolites, mode(s) of action and effects of TCE metabolites, the role of peroxisome proliferator–activated receptor in TCE toxicity, and TCE cancer epidemiology. PMID:16966103
The Moon in the Russian scientific-educational project: Kazan-GeoNa-2010
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.; Petrova, N.
Historically thousand-year Kazan city and the two-hundred-year Kazan university Russia carry out a role of the scientific-organizational and cultural-educational center of Volga region For the further successful development of educational and scientific-educational activity of the Russian Federation the Republic Tatarstan Kazan is offered the national project - the International Center of the Science and the Internet of Technologies bf GeoNa bf Geo metry of bf Na ture - bf GeoNa is developed - wisdom enthusiasm pride grandeur which includes a modern complex of conference halls up to 4 thousand places the Center the Internet of Technologies 3D Planetarium - development of the Moon PhysicsLand an active museum of natural sciences an oceanarium training a complex Spheres of Knowledge botanical and landscape oases In center bf GeoNa will be hosted conferences congresses fundamental scientific researches of the Moon scientific-educational actions presentation of the international scientific programs on lunar research modern lunar databases exhibition Hi-tech of the equipment the extensive cultural-educational tourist and cognitive programs Center bf GeoNa will enable scientists and teachers of the Russian universities to join to advanced achievements of a science information technologies to establish scientific communications with foreign colleagues in sphere of the high technology and educational projects with world space centers
Petzold, Andrew M; Dunbar, Robert L
2018-06-01
The ability to clearly disseminate scientific knowledge is a skill that is necessary for any undergraduate student within the sciences. Traditionally, this is accomplished through the instruction of scientific presentation or writing with a focus on peer-to-peer communication at the expense of teaching communication aimed at a nonscientific audience. One of the ramifications of focusing on peer-to-peer communication has presented itself as an apprehension toward scientific knowledge within the general populace. This apprehension can be seen in a variety of venues, including the traditional media, popular culture, and education, which generally paint scientists as aloof and with an inability to discuss scientific issues to anyone other than other scientists. This paper describes a curriculum designed to teach Introduction to Anatomy and Physiology students the tools necessary for communicating complex concepts that were covered during the semester using approachable language. Students were assessed on their word usage in associated writing activities, the student's ability to reduce complexity of their statements, and performance in an informal scientific presentation to a lay audience. Results showed that this pedagogical approach has increased students' ability to reduce the complexity of their language in both a written and oral format. This, in turn, led to evaluators reporting greater levels of understanding of the topic presented following the presentation.
Bornmann, Lutz; Wallon, Gerlind; Ledin, Anna
2008-01-01
Does peer review fulfill its declared objective of identifying the best science and the best scientists? In order to answer this question we analyzed the Long-Term Fellowship and the Young Investigator programmes of the European Molecular Biology Organization. Both programmes aim to identify and support the best post doctoral fellows and young group leaders in the life sciences. We checked the association between the selection decisions and the scientific performance of the applicants. Our study involved publication and citation data for 668 applicants to the Long-Term Fellowship programme from the year 1998 (130 approved, 538 rejected) and 297 applicants to the Young Investigator programme (39 approved and 258 rejected applicants) from the years 2001 and 2002. If quantity and impact of research publications are used as a criterion for scientific achievement, the results of (zero-truncated) negative binomial models show that the peer review process indeed selects scientists who perform on a higher level than the rejected ones subsequent to application. We determined the extent of errors due to over-estimation (type I errors) and under-estimation (type 2 errors) of future scientific performance. Our statistical analyses point out that between 26% and 48% of the decisions made to award or reject an application show one of both error types. Even though for a part of the applicants, the selection committee did not correctly estimate the applicant's future performance, the results show a statistically significant association between selection decisions and the applicants' scientific achievements, if quantity and impact of research publications are used as a criterion for scientific achievement. PMID:18941530
On the Impact of Widening Vector Registers on Sequence Alignment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.; Kalyanaraman, Anantharaman; Krishnamoorthy, Sriram
2016-09-22
Vector extensions, such as SSE, have been part of the x86 since the 1990s, with applications in graphics, signal processing, and scientific applications. Although many algorithms and applications can naturally benefit from automatic vectorization techniques, there are still many that are difficult to vectorize due to their dependence on irregular data structures, dense branch operations, or data dependencies. Sequence alignment, one of the most widely used operations in bioinformatics workflows, has a computational footprint that features complex data dependencies. In this paper, we demonstrate that the trend of widening vector registers adversely affects the state-of-the-art sequence alignment algorithm based onmore » striped data layouts. We present a practically efficient SIMD implementation of a parallel scan based sequence alignment algorithm that can better exploit wider SIMD units. We conduct comprehensive workload and use case analyses to characterize the relative behavior of the striped and scan approaches and identify the best choice of algorithm based on input length and SIMD width.« less
Evidence-based pain management: is the concept of integrative medicine applicable?
2012-01-01
This article is dedicated to the concept of predictive, preventive, and personalized (integrative) medicine beneficial and applicable to advance pain management, overviews recent insights, and discusses novel minimally invasive tools, performed under ultrasound guidance, enhanced by model-guided approach in the field of musculoskeletal pain and neuromuscular diseases. The complexity of pain emergence and regression demands intellectual-, image-guided techniques personally specified to the patient. For personalized approach, the combination of the modalities of ultrasound, EMG, MRI, PET, and SPECT gives new opportunities to experimental and clinical studies. Neuromuscular imaging should be crucial for emergence of studies concerning advanced neuroimaging technologies to predict movement disorders, postural imbalance with integrated application of imaging, and functional modalities for rehabilitation and pain management. Scientific results should initiate evidence-based preventive movement programs in sport medicine rehabilitation. Traditional medicine and mathematical analytical approaches and education challenges are discussed in this review. The physiological management of exactly assessed pathological condition, particularly in movement disorders, requires participative medical approach to gain harmonized and sustainable effect. PMID:23088743
The impact of next-generation sequencing on genomics
Zhang, Jun; Chiodini, Rod; Badr, Ahmed; Zhang, Genfa
2011-01-01
This article reviews basic concepts, general applications, and the potential impact of next-generation sequencing (NGS) technologies on genomics, with particular reference to currently available and possible future platforms and bioinformatics. NGS technologies have demonstrated the capacity to sequence DNA at unprecedented speed, thereby enabling previously unimaginable scientific achievements and novel biological applications. But, the massive data produced by NGS also presents a significant challenge for data storage, analyses, and management solutions. Advanced bioinformatic tools are essential for the successful application of NGS technology. As evidenced throughout this review, NGS technologies will have a striking impact on genomic research and the entire biological field. With its ability to tackle the unsolved challenges unconquered by previous genomic technologies, NGS is likely to unravel the complexity of the human genome in terms of genetic variations, some of which may be confined to susceptible loci for some common human conditions. The impact of NGS technologies on genomics will be far reaching and likely change the field for years to come. PMID:21477781
NASA Astrophysics Data System (ADS)
Fortunato, Lorenzo
2018-03-01
In this contribution I will review some of the researches that are currently being pursued in Padova (mainly within the In:Theory and Strength projects), focusing on the interdisciplinary applications of nuclear theory to several other branches of physics, with the aim of contributing to show the centrality of nuclear theory in the Italian scientific scenario and the prominence of this fertile field in fostering new physics. In particular, I will talk about: i) the recent solution of the long-standing “electron screening puzzle” that settles a fundamental controversy in nuclear astrophysics between the outcome of lab experiments on earth and nuclear reactions happening in stars; the application of algebraic methods to very diverse systems such as: ii) the supramolecular complex H2@C60, i.e. a diatomic hydrogen molecule caged in a fullerene and iii) to the spectrum of hypernuclei, i.e. systems made of a Lambda particles trapped in (heavy) nuclei.
Modeling and analysis of hybrid pixel detector deficiencies for scientific applications
NASA Astrophysics Data System (ADS)
Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman
2015-08-01
Semiconductor hybrid pixel detectors often consist of a pixellated sensor layer bump bonded to a matching pixelated readout integrated circuit (ROIC). The sensor can range from high resistivity Si to III-V materials, whereas a Si CMOS process is typically used to manufacture the ROIC. Independent, device physics and electronic design automation (EDA) tools are used to determine sensor characteristics and verify functional performance of ROICs respectively with significantly different solvers. Some physics solvers provide the capability of transferring data to the EDA tool. However, single pixel transient simulations are either not feasible due to convergence difficulties or are prohibitively long. A simplified sensor model, which includes a current pulse in parallel with detector equivalent capacitor, is often used; even then, spice type top-level (entire array) simulations range from days to weeks. In order to analyze detector deficiencies for a particular scientific application, accurately defined transient behavioral models of all the functional blocks are required. Furthermore, various simulations, such as transient, noise, Monte Carlo, inter-pixel effects, etc. of the entire array need to be performed within a reasonable time frame without trading off accuracy. The sensor and the analog front-end can be modeling using a real number modeling language, as complex mathematical functions or detailed data can be saved to text files, for further top-level digital simulations. Parasitically aware digital timing is extracted in a standard delay format (sdf) from the pixel digital back-end layout as well as the periphery of the ROIC. For any given input, detector level worst-case and best-case simulations are performed using a Verilog simulation environment to determine the output. Each top-level transient simulation takes no more than 10-15 minutes. The impact of changing key parameters such as sensor Poissonian shot noise, analog front-end bandwidth, jitter due to clock distribution etc. can be accurately analyzed to determine ROIC architectural viability and bottlenecks. Hence the impact of the detector parameters on the scientific application can be studied.
ERIC Educational Resources Information Center
Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.
2017-01-01
We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... Applications for Duty- Free Entry of Scientific Instruments This is a decision pursuant to Section 6(c) of the Educational, Scientific, and Cultural Materials Importation Act of 1966 (Pub. L. 89- 651, as amended by Pub. L. 106-36; 80 Stat. 897; 15 CFR part 301). Related records can be viewed between 8:30 a.m. and 5 p.m. in...
Synchronization in complex oscillator networks and smart grids.
Dörfler, Florian; Chertkov, Michael; Bullo, Francesco
2013-02-05
The emergence of synchronization in a network of coupled oscillators is a fascinating topic in various scientific disciplines. A widely adopted model of a coupled oscillator network is characterized by a population of heterogeneous phase oscillators, a graph describing the interaction among them, and diffusive and sinusoidal coupling. It is known that a strongly coupled and sufficiently homogeneous network synchronizes, but the exact threshold from incoherence to synchrony is unknown. Here, we present a unique, concise, and closed-form condition for synchronization of the fully nonlinear, nonequilibrium, and dynamic network. Our synchronization condition can be stated elegantly in terms of the network topology and parameters or equivalently in terms of an intuitive, linear, and static auxiliary system. Our results significantly improve upon the existing conditions advocated thus far, they are provably exact for various interesting network topologies and parameters; they are statistically correct for almost all networks; and they can be applied equally to synchronization phenomena arising in physics and biology as well as in engineered oscillator networks, such as electrical power networks. We illustrate the validity, the accuracy, and the practical applicability of our results in complex network scenarios and in smart grid applications.
NASA Astrophysics Data System (ADS)
Saavedra-Duarte, L. A.; Angarita-Jerardino, A.; Ruiz, P. A.; Dulce-Moreno, H. J.; Vera-Rivera, F. H.; V-Niño, E. D.
2017-12-01
Information and Communication Technologies (ICT) are essential in the transfer of knowledge, and the Web tools, as part of ICT, are important for institutions seeking greater visibility of the products developed by their researchers. For this reason, we implemented an application that allows the information management of the FORISTOM Foundation (Foundation of Researchers in Science and Technology of Materials). The application shows a detailed description, not only of all its members also of all the scientific production that they carry out, such as technological developments, research projects, articles, presentations, among others. This application can be implemented by other entities committed to the scientific dissemination and transfer of technology and knowledge.
Rubini, Lauretta; Pollio, Chiara; Di Tommaso, Marco R
2017-08-29
Transnational research networks (TRN) are becoming increasingly complex. Such complexity may have both positive and negative effects on the quality of research. Our work studies the evolution over time of Chinese TRN and the role of complexity on the quality of Chinese research, given the leading role this country has recently acquired in international science. We focus on the fields of geriatrics and gerontology. We build an original dataset of all scientific publications of China in these areas in 2009, 2012 and 2015, starting from the ISI Web of Knowledge (ISI WoK) database. Using Social Network Analysis (SNA), we analyze the change in scientific network structure across time. Second, we design indices to control for the different aspects of networks complexity (number of authors, country heterogeneity and institutional heterogeneity) and we perform negative binomial regressions to identify the main determinants of research quality. Our analysis shows that research networks in the field of geriatrics and gerontology have gradually become wider in terms of countries and have become more balanced. Furthermore, our results identify that different forms of complexity have different impacts on quality, including a reciprocal moderating effect. In particular, according to our analysis, research quality benefits from complex research networks both in terms of countries and of types of institutions involved, but that such networks should be "compact" in terms of number of authors. Eventually, we suggest that complexity should be carefully taken into account when designing policies aimed at enhancing the quality of research.
Role of biomolecular logic systems in biosensors and bioactuators
NASA Astrophysics Data System (ADS)
Mailloux, Shay; Katz, Evgeny
2014-09-01
An overview of recent advances in biosensors and bioactuators based on biocomputing systems is presented. Biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce an output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multianalyte biosensing, which is particularly beneficial for biomedical applications. Multisignal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection and alert medical personnel of medical emergencies together with immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly as exemplified for liver injury. Wide-ranging applications of multianalyte digital biosensors in medicine, environmental monitoring, and homeland security are anticipated. "Smart" bioactuators, for signal-triggered drug release, for example, were designed by interfacing switchable electrodes with biocomputing systems. Integration of biosensing and bioactuating systems with biomolecular information processing systems advances the potential for further scientific innovations and various practical applications.
Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics
NASA Astrophysics Data System (ADS)
Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan
2014-03-01
We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
Science Education in Primary Schools: Is an Animation Worth a Thousand Pictures?
NASA Astrophysics Data System (ADS)
Barak, Miri; Dori, Yehudit J.
2011-10-01
Science teaching deals with abstract concepts and processes that very often cannot be seen or touched. The development of Java, Flash, and other web-based applications allow teachers and educators to present complex animations that attractively illustrate scientific phenomena. Our study evaluated the integration of web-based animated movies into primary schools science curriculum. Our goal was to examine teachers' methods for integrating animated movies and their views about the role of animations in enhancing young students' thinking skills. We also aimed at investigating the effect of animated movies on students' learning outcomes. Applying qualitative and quantitative tools, we conducted informal discussions with science teachers (N = 15) and administered pre- and post-questionnaires to 4th (N = 641) and 5th (N = 694) grade students who were divided into control and experimental groups. The experimental group students studied science while using animated movies and supplementary activities at least once a week. The control group students used only textbooks and still-pictures for learning science. Findings indicated that animated movies support the use of diverse teaching strategies and learning methods, and can promote various thinking skills among students. Findings also indicated that animations can enhance scientific curiosity, the acquisition of scientific language, and fostering scientific thinking. These encouraging results can be explained by the fact that the students made use of both visual-pictorial and auditory-verbal capabilities while exploring animated movies in diverse learning styles and teaching strategies.
Bousfield, David; McEntyre, Johanna; Velankar, Sameer; Papadatos, George; Bateman, Alex; Cochrane, Guy; Kim, Jee-Hyub; Graef, Florian; Vartak, Vid; Alako, Blaise; Blomberg, Niklas
2016-01-01
Data from open access biomolecular data resources, such as the European Nucleotide Archive and the Protein Data Bank are extensively reused within life science research for comparative studies, method development and to derive new scientific insights. Indicators that estimate the extent and utility of such secondary use of research data need to reflect this complex and highly variable data usage. By linking open access scientific literature, via Europe PubMedCentral, to the metadata in biological data resources we separate data citations associated with a deposition statement from citations that capture the subsequent, long-term, reuse of data in academia and industry. We extend this analysis to begin to investigate citations of biomolecular resources in patent documents. We find citations in more than 8,000 patents from 2014, demonstrating substantial use and an important role for data resources in defining biological concepts in granted patents to both academic and industrial innovators. Combined together our results indicate that the citation patterns in biomedical literature and patents vary, not only due to citation practice but also according to the data resource cited. The results guard against the use of simple metrics such as citation counts and show that indicators of data use must not only take into account citations within the biomedical literature but also include reuse of data in industry and other parts of society by including patents and other scientific and technical documents such as guidelines, reports and grant applications.
Bousfield, David; McEntyre, Johanna; Velankar, Sameer; Papadatos, George; Bateman, Alex; Cochrane, Guy; Kim, Jee-Hyub; Graef, Florian; Vartak, Vid; Alako, Blaise; Blomberg, Niklas
2016-01-01
Data from open access biomolecular data resources, such as the European Nucleotide Archive and the Protein Data Bank are extensively reused within life science research for comparative studies, method development and to derive new scientific insights. Indicators that estimate the extent and utility of such secondary use of research data need to reflect this complex and highly variable data usage. By linking open access scientific literature, via Europe PubMedCentral, to the metadata in biological data resources we separate data citations associated with a deposition statement from citations that capture the subsequent, long-term, reuse of data in academia and industry. We extend this analysis to begin to investigate citations of biomolecular resources in patent documents. We find citations in more than 8,000 patents from 2014, demonstrating substantial use and an important role for data resources in defining biological concepts in granted patents to both academic and industrial innovators. Combined together our results indicate that the citation patterns in biomedical literature and patents vary, not only due to citation practice but also according to the data resource cited. The results guard against the use of simple metrics such as citation counts and show that indicators of data use must not only take into account citations within the biomedical literature but also include reuse of data in industry and other parts of society by including patents and other scientific and technical documents such as guidelines, reports and grant applications. PMID:27092246
Zombie algorithms: a timesaving remote sensing systems engineering tool
NASA Astrophysics Data System (ADS)
Ardanuy, Philip E.; Powell, Dylan C.; Marley, Stephen
2008-08-01
In modern horror fiction, zombies are generally undead corpses brought back from the dead by supernatural or scientific means, and are rarely under anyone's direct control. They typically have very limited intelligence, and hunger for the flesh of the living [1]. Typical spectroradiometric or hyperspectral instruments providess calibrated radiances for a number of remote sensing algorithms. The algorithms typically must meet specified latency and availability requirements while yielding products at the required quality. These systems, whether research, operational, or a hybrid, are typically cost constrained. Complexity of the algorithms can be high, and may evolve and mature over time as sensor characterization changes, product validation occurs, and areas of scientific basis improvement are identified and completed. This suggests the need for a systems engineering process for algorithm maintenance that is agile, cost efficient, repeatable, and predictable. Experience on remote sensing science data systems suggests the benefits of "plug-n-play" concepts of operation. The concept, while intuitively simple, can be challenging to implement in practice. The use of zombie algorithms-empty shells that outwardly resemble the form, fit, and function of a "complete" algorithm without the implemented theoretical basis-provides the ground systems advantages equivalent to those obtained by integrating sensor engineering models onto the spacecraft bus. Combined with a mature, repeatable process for incorporating the theoretical basis, or scientific core, into the "head" of the zombie algorithm, along with associated scripting and registration, provides an easy "on ramp" for the rapid and low-risk integration of scientific applications into operational systems.
The Proof of the Pudding?: A Case Study of an "At-Risk" Design-Based Inquiry Science Curriculum
NASA Astrophysics Data System (ADS)
Chue, Shien; Lee, Yew-Jin
2013-12-01
When students collaboratively design and build artifacts that require relevant understanding and application of science, many aspects of scientific literacy are developed. Design-based inquiry (DBI) is one such pedagogy that can serve these desired goals of science education well. Focusing on a Projectile Science curriculum previously found to be implemented with satisfactory fidelity, we investigate the many hidden challenges when using DBI with Grade 8 students from one school in Singapore. A case study method was used to analyze video recordings of DBI lessons conducted over 10 weeks, project presentations, and interviews to ascertain the opportunities for developing scientific literacy among participants. One critical factor that hindered learning was task selection by teachers, which emphasized generic scientific process skills over more important cognitive and epistemic learning goals. Teachers and students were also jointly engaged in forms of inquiry that underscored artifact completion over deeper conceptual and epistemic understanding of science. Our research surfaced two other confounding factors that undermined the curriculum; unanticipated teacher effects and the underestimation of the complexity of DBI and of inquiry science in general. Thus, even though motivated or experienced teachers can implement an inquiry science curriculum with good fidelity and enjoy school-wide support, these by themselves will not guarantee deep learning of scientific literacy in DBI. Recommendations are made for navigating the hands- and minds-on aspects of learning science that is an asset as well as inherent danger during DBI teaching.
Lagorio, Susanna; Vecchia, P
2011-01-01
Scientific knowledge is essential for the resolution of disputes in law and administrative applications (such as toxic tort litigation and workers' compensation) and provides essential input for public policy decisions. There are no socially agreed-upon rules for the application of this knowledge except in the law. On a practical level, the legal system lacks the ability to assess the validity of scientific knowledge that can be used as evidence and therefore relies heavily on expert opinion. A key issue is how to ensure that professionals in any field provide judges with sound advice, based on relevant and reliable scientific evidence. The search for solutions to this problem seems particularly urgent in Italy, a country where a number of unprecedented verdicts of guilt have been pronounced in trials involving personal injuries from exposure to electromagnetic fields. An Italian Court has recently recognized the occupational origin of a trigeminal neuroma in a mobile telephone user, and ordered the Italian Workers' Compensation Authority (INAIL) to award the applicant compensation for a high degree (80%) of permanent disability. We describe and discuss the salient aspects of this sentence as a case-study in the framework of the use (and misuse) of scientific evidence in toxic-tort litigations. Based on the motivations of the verdict, it appears that the judge relied on seriously flawed expert testimonies. The "experts" who served in this particular trial were clearly inexperienced in forensic epidemiology in general, as well as in the topic at hand. Selective overviews of scientific evidence concerning cancer risks from mobile phone use were provided, along with misleading interpretations of findings from relevant epidemiologic studies (including the dismissal of the Interphone study results on the grounds of purported bias resulting from industry funding). The necessary requirements to proceed to causal inferences at individual level were not taken into account and inappropriate methods to derive estimates of personal risk were used. A comprehensive strategy to improve the quality of expert witness testimonies in legal proceedings and promote just and equitable verdicts is urgently needed in Italy. Contrary to other countries, such as the United States or the United Kingdom, legal standards for expert testimony, such as preliminary assessment of scientific evidence admissibility and qualification requirements for professionals acting as experts in the courtroom, are lacking in our country. In this and similar contexts, recommendations issued by professional associations (including EBEA and BEMS) could play a role of paramount importance. As examples, we refer to the guidelines recently endorsed by the UK General Medical Council and the American Academy of Pediatrics.
1977-05-10
apply this method of forecast- ing in the solution of all major scientific-technical problems of the na- tional economy. Citing the slow...the future, however, computers will "mature" and learn to recognize patterns in what amounts to a much more complex language—the language of visual...images. Photoelectronic tracking devices or "eyes" will allow the computer to take in information in a much more complex form and to perform opera
Research design: the methodology for interdisciplinary research framework.
Tobi, Hilde; Kampen, Jarl K
2018-01-01
Many of today's global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factor skills and competences. For that purpose this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of methods' combinations (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework's utility in research design in landscape architecture, mixed methods research, and provide an outlook to the framework's potential in inclusive interdisciplinary research, and last but not least, research integrity.
Li, Panlin; Su, Weiwei; Yun, Sha; Liao, Yiqiu; Liao, Yinyin; Liu, Hong; Li, Peibo; Wang, Yonggang; Peng, Wei; Yao, Hongliang
2017-01-01
Since traditional Chinese medicine (TCM) is a complex mixture of multiple components, the application of methodologies for evaluating single-components Western medicine in TCM studies may have certain limitations. Appropriate strategies that recognize the integrality of TCM and connect to TCM theories remain to be developed. Here we use multiple unique approaches to study the scientific connotation of a TCM formula Dan-hong injection (DHI) without undermining its prescription integrity. The blood circulation improving and healing promoting effects of DHI were assessed by a qi stagnation blood stasis rat model and a mouse model of laser irradiation induced cerebral microvascular thrombosis. By UFLC-PDA-Triple Q-TOF-MS/MS and relevance analysis between chemical characters and biological effects, 82 chemical constituents and nine core components, whose blood circulation promoting effects were found comparable to that of whole DHI, were successfully identified. What’s more, the rationality of DHI prescription compatibility could be reflected not only in the maximum efficacy of the original ratio, but also in the interactions of compounds from different ingredient herbs, such as complementary activities and facilitating tissues distribution. This study provides scientific evidences in explanation of the clinical benefits of DHI, and also gives a good demonstration for the comprehensive evaluation of other TCM. PMID:28393856
I/O-Efficient Scientific Computation Using TPIE
NASA Technical Reports Server (NTRS)
Vengroff, Darren Erik; Vitter, Jeffrey Scott
1996-01-01
In recent years, input/output (I/O)-efficient algorithms for a wide variety of problems have appeared in the literature. However, systems specifically designed to assist programmers in implementing such algorithms have remained scarce. TPIE is a system designed to support I/O-efficient paradigms for problems from a variety of domains, including computational geometry, graph algorithms, and scientific computation. The TPIE interface frees programmers from having to deal not only with explicit read and write calls, but also the complex memory management that must be performed for I/O-efficient computation. In this paper we discuss applications of TPIE to problems in scientific computation. We discuss algorithmic issues underlying the design and implementation of the relevant components of TPIE and present performance results of programs written to solve a series of benchmark problems using our current TPIE prototype. Some of the benchmarks we present are based on the NAS parallel benchmarks while others are of our own creation. We demonstrate that the central processing unit (CPU) overhead required to manage I/O is small and that even with just a single disk, the I/O overhead of I/O-efficient computation ranges from negligible to the same order of magnitude as CPU time. We conjecture that if we use a number of disks in parallel this overhead can be all but eliminated.
Customizable scientific web-portal for DIII-D nuclear fusion experiment
NASA Astrophysics Data System (ADS)
Abla, G.; Kim, E. N.; Schissel, D. P.
2010-04-01
Increasing utilization of the Internet and convenient web technologies has made the web-portal a major application interface for remote participation and control of scientific instruments. While web-portals have provided a centralized gateway for multiple computational services, the amount of visual output often is overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in the experiment, filtering and organizing information based on the individual user's need can increase the usability and efficiency of a web-portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web-portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and list of services for users to select. Each individual user can create a unique working environment to fit his own needs and interests. Customizable services are: real-time experiment status monitoring, diagnostic data access, interactive data analysis and visualization. The web-portal also supports interactive collaborations by providing collaborative logbook, and online instant announcement services. The DIII-D web-portal development utilizes multi-tier software architecture, and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly-interactive and customizable user interface.
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278
NASA Astrophysics Data System (ADS)
Germer, S.; Bens, O.; Hüttl, R. F.
2008-12-01
The scepticism of non-scientific local stakeholders about results from complex physical based models is a major problem concerning the development and implementation of local climate change adaptation measures. This scepticism originates from the high complexity of such models. Local stakeholders perceive complex models as black-box models, as it is impossible to gasp all underlying assumptions and mathematically formulated processes at a glance. The use of physical based models is, however, indispensible to study complex underlying processes and to predict future environmental changes. The increase of climate change adaptation efforts following the release of the latest IPCC report indicates that the communication of facts about what has already changed is an appropriate tool to trigger climate change adaptation. Therefore we suggest increasing the practice of empirical data analysis in addition to modelling efforts. The analysis of time series can generate results that are easier to comprehend for non-scientific stakeholders. Temporal trends and seasonal patterns of selected hydrological parameters (precipitation, evapotranspiration, groundwater levels and river discharge) can be identified and the dependence of trends and seasonal patters to land use, topography and soil type can be highlighted. A discussion about lag times between the hydrological parameters can increase the awareness of local stakeholders for delayed environment responses.
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2013 CFR
2013-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2012 CFR
2012-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
50 CFR 21.23 - Scientific collecting permits.
Code of Federal Regulations, 2014 CFR
2014-10-01
... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...
Impact of research investment on scientific productivity of junior researchers.
Farrokhyar, Forough; Bianco, Daniela; Dao, Dyda; Ghert, Michelle; Andruszkiewicz, Nicole; Sussman, Jonathan; Ginsberg, Jeffrey S
2016-12-01
There is a demand for providing evidence on the effectiveness of research investments on the promotion of novice researchers' scientific productivity and production of research with new initiatives and innovations. We used a mixed method approach to evaluate the funding effect of the New Investigator Fund (NIF) by comparing scientific productivity between award recipients and non-recipients. We reviewed NIF grant applications submitted from 2004 to 2013. Scientific productivity was assessed by confirming the publication of the NIF-submitted application. Online databases were searched, independently and in duplicate, to locate the publications. Applicants' perceptions and experiences were collected through a short survey and categorized into specified themes. Multivariable logistic regression was performed. Odds ratios (OR) with 95 % confidence intervals (CI) are reported. Of 296 applicants, 163 (55 %) were awarded. Gender, affiliation, and field of expertise did not affect funding decisions. More physicians with graduate education (32.0 %) and applicants with a doctorate degree (21.5 %) were awarded than applicants without postgraduate education (9.8 %). Basic science research (28.8 %), randomized controlled trials (24.5 %), and feasibility/pilot trials (13.3 %) were awarded more than observational designs (p < 0.001). Adjusting for applicants and application factors, awardees published the NIF application threefold more than non-awardees (OR = 3.4, 95 %, CI = 1.9, 5.9). The survey response rate was 90.5 %, and only 58 % commented on their perceptions, successes, and challenges of the submission process. These findings suggest that research investments as small as seed funding are effective for scientific productivity and professional growth of novice investigators and production of research with new initiatives and innovations. Further efforts are recommended to enhance the support of small grant funding programs.
NASA Technical Reports Server (NTRS)
Schoenwald, Adam J.; Bradley, Damon C.; Mohammed, Priscilla N.; Piepmeier, Jeffrey R.; Wong, Mark
2016-01-01
In the field of microwave radiometry, Radio Frequency Interference (RFI) consistently degrades the value of scientific results. Through the use of digital receivers and signal processing, the effects of RFI on scientific measurements can be reduced depending on certain circumstances. As technology allows us to implement wider band digital receivers for radiometry, the problem of RFI mitigation changes. Our work focuses on finding a detector that outperforms real kurtosis in wide band scenarios. The algorithm implemented is a complex signal kurtosis detector which was modeled and simulated. The performance of both complex and real signal kurtosis is evaluated for continuous wave, pulsed continuous wave, and wide band quadrature phase shift keying (QPSK) modulations. The use of complex signal kurtosis increased the detectability of interference.
Structural Biology of Proteins of the Multi-enzyme Assembly Human Pyruvate Dehydrogenase Complex
NASA Technical Reports Server (NTRS)
2003-01-01
Objectives and research challenges of this effort include: 1. Need to establish Human Pyruvate Dehydrogenase Complex protein crystals; 2. Need to test value of microgravity for improving crystal quality of Human Pyruvate Dehydrogenase Complex protein crystals; 3. Need to improve flight hardware in order to control and understand the effects of microgravity on crystallization of Human Pyruvate Dehydrogenase Complex proteins; 4. Need to integrate sets of national collaborations with the restricted and specific requirements of flight experiments; 5. Need to establish a highly controlled experiment in microgravity with a rigor not yet obtained; 6. Need to communicate both the rigor of microgravity experiments and the scientific value of results obtained from microgravity experiments to the national community; and 7. Need to advance the understanding of Human Pyruvate Dehydrogenase Complex structures so that scientific and commercial advance is identified for these proteins.
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2013-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that resulted from the 2007 Science Strategy, "Facing Tomorrow's Challenges: U.S. Geological Survey Science in the Decade 2007-2017." This report describes the Core Science Systems vision and outlines a strategy to facilitate integrated characterization and understanding of the complex Earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of the USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on Earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet-food, water, raw materials to build infrastructure, homes and automobiles, fuel to heat homes and cities, and many others, are derived from or affect ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex Earth and biological systems through research, modeling, mapping, and the production of high quality data on the Nation's natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping. The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make interdisciplinary research easier and more efficient. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the Earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the Nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information. 
Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Science strategy for Core Science Systems in the U.S. Geological Survey, 2013-2023
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2012-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that grew out of the 2007 Science Strategy, “Facing Tomorrow’s Challenges: U.S. Geological Survey Science in the Decade 2007–2017.” This report describes the vision for this USGS mission and outlines a strategy for Core Science Systems to facilitate integrated characterization and understanding of the complex earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science.The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet—food, water, raw materials to build infrastructure, homes and automobiles, fuel to heat homes and cities, and many others, are derived from or effect ecosystems.The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex earth and biological systems through research, modeling, mapping, and the production of high quality data on the nation’s natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping. The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make it easier and more efficient to conduct interdisciplinary research over time. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible.The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. 
The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information. Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Resolving cryptic species complexes of major tephritid pests
Hendrichs, Jorge; Vera, M. Teresa; De Meyer, Marc; Clarke, Anthony R.
2015-01-01
Abstract An FAO/IAEA Co-ordinated Research Project (CRP) on “Resolution of Cryptic Species Complexes of Tephritid Pests to Overcome Constraints to SIT Application and International Trade” was conducted from 2010 to 2015. As captured in the CRP title, the objective was to undertake targeted research into the systematics and diagnostics of taxonomically challenging fruit fly groups of economic importance. The scientific output was the accurate alignment of biological species with taxonomic names, which led to the applied outcome of assisting FAO and IAEA Member States in overcoming technical constraints to the application of the Sterile Insect Technique (SIT) against pest fruit flies and the facilitation of international agricultural trade. Close to 50 researchers from over 20 countries participated in the CRP, using coordinated, multidisciplinary research to address, within an integrative taxonomic framework, cryptic species complexes of major tephritid pests. The following progress was made for the four complexes selected and studied: Anastrepha fraterculus complex – Eight morphotypes and their geographic and ecological distributions in Latin America were defined. The morphotypes can be considered as distinct biological species on the basis of differences in karyotype, sexual incompatibility, post-mating isolation, cuticular hydrocarbon, pheromone, and molecular analyses. Discriminative taxonomic tools using linear and geometric morphometrics of both adult and larval morphology were developed for this complex. Bactrocera dorsalis complex – Based on genetic, cytogenetic, pheromonal, morphometric, and behavioural data, which showed no or only minor variation between the Asian/African pest fruit flies Bactrocera dorsalis, Bactrocera papayae, Bactrocera philippinensis and Bactrocera invadens, the latter three species were synonymized with Bactrocera dorsalis. Of the five target pest taxa studied, only Bactrocera dorsalis and Bactrocera carambolae remain as scientifically valid names. Molecular and pheromone markers are now available to distinguish Bactrocera dorsalis from Bactrocera carambolae. Ceratitis FAR Complex (Ceratitis fasciventris, Ceratitis anonae, Ceratitis rosa) – Morphology, morphometry, genetic, genomic, pheromone, cuticular hydrocarbon, ecology, behaviour, and developmental physiology data provide evidence for the existence of five different entities within this fruit fly complex from the African region. These are currently recognised as Ceratitis anonae, Ceratitis fasciventris (F1 and F2), Ceratitis rosa and a new species related to Ceratitis rosa (R2). The biological limits within Ceratitis fasciventris (i.e. F1 and F2) are not fully resolved. Microsatellite markers and morphological identification tools for the adult males of the five different FAR entities were developed based on male leg structures. Zeugodacus cucurbitae (formerly Bactrocera (Zeugodacus) cucurbitae) – Genetic variability was studied among melon fly populations throughout its geographic range in Africa and the Asia/Pacific region and found to be limited. Cross-mating studies indicated no incompatibility or sexual isolation. Host preference and genetic studies showed no evidence for the existence of host races. It was concluded that the melon fly does not represent a cryptic species complex, neither with regard to geographic distribution nor to host range. 
Nevertheless, the higher taxonomic classification under which this species had been placed at the time the CRP started was found to be paraphyletic; as a result, the subgenus Zeugodacus was elevated to genus level. PMID:26798252
A precautionary principle for dual use research in the life sciences.
Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan
2011-01-01
Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. © 2009 Blackwell Publishing Ltd.
Towards reversible basic linear algebra subprograms: A performance study
Perumalla, Kalyan S.; Yoginath, Srikanth B.
2014-12-06
Problems such as fault tolerance and scalable synchronization can be efficiently solved using reversibility of applications. Making applications reversible by relying on computation rather than on memory is ideal for large scale parallel computing, especially for the next generation of supercomputers in which memory is expensive in terms of latency, energy, and price. In this direction, a case study is presented here in reversing a computational core, namely, Basic Linear Algebra Subprograms, which is widely used in scientific applications. A new Reversible BLAS (RBLAS) library interface has been designed, and a prototype has been implemented with two modes: (1) a memory-mode in which reversibility is obtained by checkpointing to memory in forward and restoring from memory in reverse, and (2) a computational-mode in which nothing is saved in the forward, but restoration is done entirely via inverse computation in reverse. The article is focused on detailed performance benchmarking to evaluate the runtime dynamics and performance effects, comparing reversible computation with checkpointing on both traditional CPU platforms and recent GPU accelerator platforms. For BLAS Level-1 subprograms, data indicates over an order of magnitude better speed of reversible computation compared to checkpointing. For BLAS Level-2 and Level-3, a more complex tradeoff is observed between reversible computation and checkpointing, depending on computational and memory complexities of the subprograms.
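The trade-off between the two modes can be made concrete with a toy example. The Python sketch below is not the RBLAS interface (whose function names are not given in the abstract); it simply contrasts checkpoint-and-restore with inverse computation for an AXPY-like update y <- a*x + y, which has an exact algebraic inverse y <- y - a*x.

```python
import numpy as np

def axpy_forward(a, x, y):
    """Forward step of an AXPY-like update: y <- a*x + y (in place)."""
    y += a * x

def axpy_reverse_memory(y, checkpoint):
    """Memory mode: restore y from a checkpoint saved before the forward step."""
    y[:] = checkpoint

def axpy_reverse_compute(a, x, y):
    """Computational mode: undo the forward step by the exact inverse update."""
    y -= a * x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, x = 2.5, rng.standard_normal(5)
    y = rng.standard_normal(5)

    # Memory mode: pay memory for the checkpoint, no extra flops in reverse.
    ckpt = y.copy()
    axpy_forward(a, x, y)
    axpy_reverse_memory(y, ckpt)

    # Computational mode: no checkpoint, pay flops for the inverse in reverse.
    axpy_forward(a, x, y)
    axpy_reverse_compute(a, x, y)
    print(np.allclose(y, ckpt))   # True, up to floating-point round-off
```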
Distributed Information System for Dynamic Ocean Data in Indonesia
NASA Astrophysics Data System (ADS)
Romero, Laia; Sala, Joan; Polo, Isabel; Cases, Oscar; López, Alejandro; Jolibois, Tony; Carbou, Jérome
2014-05-01
Information systems are widely used to enable access to scientific data by different user communities. MyOcean information system is a good example of such applications in Europe. The present work describes a specific distributed information system for Ocean Numerical Model (ONM) data in the scope of the INDESO project, a project focused on Infrastructure Development of Space Oceanography in Indonesia. INDESO, as part of the Blue Revolution policy conducted by the Indonesian government for the sustainable development of fisheries and aquaculture, presents challenging service requirements in terms of service performance, reliability, security and overall usability. Following state-of-the-art technologies on scientific data networks, this robust information system provides a high level of interoperability of services to discover, view and access INDESO dynamic ONM scientific data. The entire system is automatically updated four times a day, including dataset metadata, taking into account every new file available in the data repositories. The INDESO system architecture has been designed in great part around the extension and integration of open-source, flexible and mature technologies. It involves three separate modules: web portal, dissemination gateway, and user administration. Supporting different gridded and non-gridded data, the INDESO information system features search-based data discovery, data access by temporal and spatial subset extraction, direct download and FTP, and multiple-layer visualization of datasets. A complex authorization system has been designed and applied throughout all components, in order to enable service authorization at dataset level, according to the different user profiles stated in the data policy. Finally, a web portal has been developed as the single entry point and standardized interface to all data services (discover, view, and access). Apache SOLR has been implemented as the search server, allowing faceted browsing among ocean data products and the connection to an external catalogue of metadata records. ncWMS and Godiva2 have been the basis of the viewing server and client technologies developed; MOTU has been used for data subsetting and intelligent management of data queues, and has allowed the deployment of a centralised download interface applicable to all ONM products. Unidata's THREDDS server has been employed to provide file metadata and remote access to ONM data. CAS has been used as the single sign-on protocol for all data services. The user management application developed has been based on GOSA2. Joomla and Bootstrap have been the technologies used for the web portal, compatible with mobile phone and tablet devices. The resulting INDESO information system is scalable and easy to use, operate, and maintain. This will facilitate the extensive use of ocean numerical model data by the scientific community in Indonesia. Built mostly from open-source solutions, the system is able to meet strict operational requirements and carry out complex functions. It is feasible to adapt this architecture to different static and dynamic oceanographic data sources and large data volumes, in an accessible, fast, and comprehensive manner.
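Because the viewing services are built on ncWMS, clients can retrieve rendered layers through standard OGC WMS requests. The following Python sketch issues a plain WMS 1.3.0 GetMap call; the endpoint URL, layer name, and bounding box are placeholder assumptions for illustration, not values taken from the INDESO deployment.

```python
import requests

# Hypothetical ncWMS endpoint; the real INDESO service URL is not given in the abstract.
WMS_URL = "https://example.org/ncWMS/wms"

def get_map_tile(layer, bbox, time, width=512, height=512):
    """Request a rendered map tile for one ocean-model layer via standard OGC WMS 1.3.0."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,          # e.g. a sea-surface-temperature variable name
        "STYLES": "",             # let the server pick its default style
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # min_lat,min_lon,max_lat,max_lon (WMS 1.3.0 axis order)
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TIME": time,             # ISO 8601 timestamp of the model output
    }
    resp = requests.get(WMS_URL, params=params, timeout=60)
    resp.raise_for_status()
    return resp.content          # PNG bytes

if __name__ == "__main__":
    png = get_map_tile("sst", bbox=(-11.0, 94.0, 6.0, 141.0), time="2014-05-01T00:00:00Z")
    with open("sst_indonesia.png", "wb") as fh:
        fh.write(png)
```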
The Scientific Competitiveness of Nations.
Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco
2014-01-01
We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation, that is, the competitiveness of its research system, and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research systems as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of "markers" of national scientific competitiveness are not necessarily those with high technological requirements, but rather those addressing the most "sophisticated" needs of the society.
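The non-linear iterative algorithm referred to above is of the fitness-complexity type: national fitness and domain complexity are updated jointly until convergence. The Python sketch below follows the commonly published form of this iteration and is only illustrative; the exact normalization and convergence criteria used in the paper may differ.

```python
import numpy as np

def fitness_complexity(M, n_iter=200):
    """Iterate a fitness-complexity style map on a binary nation x domain matrix M.

    F estimates each nation's fitness (diversification-weighted competitiveness);
    Q estimates each domain's complexity. Sketch of the commonly published form
    of the algorithm, not necessarily the exact variant used in the paper.
    """
    M = np.asarray(M, dtype=float)
    F = np.ones(M.shape[0])
    Q = np.ones(M.shape[1])
    for _ in range(n_iter):
        F_new = M @ Q                        # nations gain from the domains they occupy
        Q_new = 1.0 / (M.T @ (1.0 / F))      # domains are penalized by low-fitness nations
        F, Q = F_new / F_new.mean(), Q_new / Q_new.mean()
    return F, Q

if __name__ == "__main__":
    # Toy matrix: rows = nations, columns = scientific domains (1 = competitive).
    M = np.array([[1, 1, 1, 1],
                  [1, 1, 0, 0],
                  [0, 1, 0, 0]])
    F, Q = fitness_complexity(M)
    print(F)   # the fully diversified first nation ends up with the highest fitness
    print(Q)
```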
Standardization of Color Palettes for Scientific Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesza, Joel A.; Spencer, Joshua Bradly; Sood, Avneet
The purpose of this white paper is to demonstrate the importance of color palette choice in scientific visualizations and to promote an effort to convene an interdisciplinary team of researchers to study and recommend color palettes based on intended application(s) and audience(s).
Topological Landscapes: A Terrain Metaphor for ScientificData
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2007-08-01
Scientific visualization and illustration tools are designed to help people understand the structure and complexity of scientific data with images that are as informative and intuitive as possible. In this context, the use of metaphors plays an important role, since they make complex information easily accessible by using commonly known concepts. In this paper we propose a new metaphor, called 'Topological Landscapes', which facilitates understanding the topological structure of scalar functions. The basic idea is to construct a terrain with the same topology as a given dataset and to display the terrain as an easily understood representation of the actual input data. In this projection from an n-dimensional scalar function to a two-dimensional (2D) model we preserve function values of critical points, the persistence (function span) of topological features, and one possible additional metric property (in our examples, volume). By displaying this topologically equivalent landscape together with the original data we harness the natural human proficiency in understanding terrain topography and make complex topological information easily accessible.
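The "persistence (function span)" that the landscape preserves can be computed, in the simplest one-dimensional case, with a union-find sweep over the sublevel-set filtration. The Python sketch below is a generic illustration of that bookkeeping, not the authors' terrain-construction algorithm.

```python
def sublevel_persistence_1d(values):
    """0-dimensional sublevel-set persistence of a sampled 1D scalar function.

    Returns (birth, death) pairs; persistence = death - birth is the function
    span of each feature. One pair is produced per local minimum except the
    global minimum, whose component never dies.
    """
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    parent = [None] * n          # None = vertex not yet added to the filtration
    birth = [None] * n           # birth value stored at each component root

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    pairs = []
    for i in order:
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):            # neighbours on the line graph
            if 0 <= j < n and parent[j] is not None:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # The younger component (larger birth value) dies at values[i].
                old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                if birth[young] < values[i]:
                    pairs.append((birth[young], values[i]))
                parent[young] = old
    return pairs

if __name__ == "__main__":
    print(sublevel_persistence_1d([3.0, 1.0, 4.0, 0.0, 2.0]))   # [(1.0, 4.0)]
```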
RAPPORT: running scientific high-performance computing applications on the cloud.
Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt
2013-01-28
Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.
ASI's space automation and robotics programs: The second step
NASA Technical Reports Server (NTRS)
Dipippo, Simonetta
1994-01-01
The strategic decisions taken by ASI in the last few years in building up the overall A&R program represent the technological drivers for other applications (i.e., internal automation of the Columbus Orbital Facility in the ESA Manned Space program, applications to mobile robots both in space and non-space environments, etc.). In this context, the main area of application now emerging is the scientific missions domain. Due to the broad range of applications of the developed technologies, both in the in-orbit servicing and maintenance of space structures and in scientific missions, ASI foresaw the need to have a common technological development path, mainly focusing on: (1) control; (2) manipulation; (3) on-board computing; (4) sensors; and (5) teleoperation. Before entering into new applications in the scientific missions field, a brief overview of the status of the SPIDER-related projects is given, also underlining possible new applications for LEO/GEO space structures.
46 CFR 188.05-33 - Scientific personnel-interpretive rulings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 7 2010-10-01 2010-10-01 false Scientific personnel-interpretive rulings. 188.05-33... VESSELS GENERAL PROVISIONS Application § 188.05-33 Scientific personnel—interpretive rulings. (a) Scientific personnel on oceanographic research vessels are not considered to be seamen or passengers, but are...
Climate Change and Everyday Life: Repertoires Children Use to Negotiate a Socio-Scientific Issue
ERIC Educational Resources Information Center
Byrne, Jenny; Ideland, Malin; Malmberg, Claes; Grace, Marcus
2014-01-01
There are only a few studies about how primary school students engage in socio-scientific discussions. This study aims to add to this field of research by focusing on how 9-10-year-olds in Sweden and England handle climate change as a complex environmental socio-scientific issue (SSI), within the context of their own lives and in relation to…
Sochat, Vanessa
2018-01-01
Background Here, we present the Scientific Filesystem (SCIF), an organizational format that supports exposure of executables and metadata for discoverability of scientific applications. The format includes a known filesystem structure, a definition for a set of environment variables describing it, and functions for generation of the variables and interaction with the libraries, metadata, and executables located within. SCIF makes it easy to expose metadata, multiple environments, installation steps, files, and entry points to render scientific applications consistent, modular, and discoverable. A SCIF can be installed on a traditional host or in a container technology such as Docker or Singularity. We start by reviewing the background and rationale for the SCIF, followed by an overview of the specification and the different levels of internal modules (“apps”) that the organizational format affords. Finally, we demonstrate that SCIF is useful by implementing and discussing several use cases that improve user interaction and understanding of scientific applications. SCIF is released along with a client and integration in the Singularity 2.4 software to quickly install and interact with SCIF. When used inside of a reproducible container, a SCIF is a recipe for reproducibility and introspection of the functions and users that it serves. Results We use SCIF to evaluate container software, provide metrics, serve scientific workflows, and execute a primary function under different contexts. To encourage collaboration and sharing of applications, we developed tools along with an open source, version-controlled, tested, and programmatically accessible web infrastructure. SCIF and associated resources are available at https://sci-f.github.io. The ease of using SCIF, especially in the context of containers, offers promise for scientists’ work to be self-documenting and programmatically parseable for maximum reproducibility. SCIF opens up an abstraction from underlying programming languages and packaging logic for working with scientific applications, creating new opportunities for scientific software development. PMID:29718213
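As a loose illustration of how a known filesystem structure plus a set of environment variables makes applications discoverable, the Python sketch below walks a SCIF-style tree and emits one set of variables per internal app. The root path and variable names are assumptions made for the sketch, not quotations from the SCIF specification.

```python
import os

# Illustrative only: this root path and these variable names are assumptions
# for the sketch, not the SCIF specification itself.
SCIF_ROOT = "/scif/apps"

def discover_apps(root=SCIF_ROOT):
    """Walk a SCIF-style tree and build per-app environment variables.

    Each immediate subdirectory of `root` is treated as one app; the variables
    expose its executables and data without any other knowledge of the host
    or container it lives in.
    """
    apps = {}
    if not os.path.isdir(root):
        return apps
    for name in sorted(os.listdir(root)):
        app_root = os.path.join(root, name)
        if not os.path.isdir(app_root):
            continue
        apps[name] = {
            "SCIF_APPNAME": name,
            "SCIF_APPROOT": app_root,
            "SCIF_APPBIN": os.path.join(app_root, "bin"),
            "SCIF_APPDATA": os.path.join(app_root, "data"),
        }
    return apps

if __name__ == "__main__":
    for name, env in discover_apps().items():
        print(name, env)
```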
A Scientific Software Product Line for the Bioinformatics domain.
Costa, Gabriella Castro B; Braga, Regina; David, José Maria N; Campos, Fernanda
2015-08-01
Most specialized users (scientists) who use bioinformatics applications do not have suitable training in software development. A Software Product Line (SPL) employs the concept of reuse: it is defined as a set of systems that are developed from a common set of base artifacts. In some contexts, such as in bioinformatics applications, it is advantageous to develop a collection of related software products using the SPL approach. If software products are similar enough, it becomes possible to predict their commonalities and differences and then reuse these common features to support the development of new applications in the bioinformatics area. This paper presents the PL-Science approach, which considers the context of SPL and ontology in order to assist scientists to define a scientific experiment and to specify a workflow that encompasses the bioinformatics applications of a given experiment. This paper also focuses on the use of ontologies to enable Software Product Lines in biological domains. In the context of this paper, a Scientific Software Product Line (SSPL) differs from a conventional Software Product Line in that the SSPL uses an abstract scientific workflow model. This workflow is defined according to a scientific domain, and using this abstract workflow model the products (scientific applications/algorithms) are instantiated. Through the use of ontology as a knowledge representation model, we can provide domain restrictions as well as add semantic aspects in order to facilitate the selection and organization of bioinformatics workflows in a Scientific Software Product Line. The use of ontologies enables not only the expression of formal restrictions but also inference over these restrictions, considering that a scientific domain needs a formal specification. This paper presents the development of the PL-Science approach, encompassing a methodology and an infrastructure, and also presents an evaluation of the approach through case studies in bioinformatics conducted at two renowned research institutions in Brazil. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
McNeill, Katherine Lynch
An essential goal of classroom science is to help all students become scientifically literate to encourage greater public understanding in a science infused world. This type of literacy requires that students participate in scientific inquiry practices such as construction of arguments or scientific explanations in which they justify their claims with appropriate evidence and reasoning. Although scientific explanations are an important learning goal, this complex inquiry practice is frequently omitted from k-12 science classrooms and students have difficulty creating them. I investigated how two different curricular scaffolds (context-specific vs. generic), teacher instructional practices, and the interaction between these two types of support influence student learning of scientific explanations. This study focuses on an eight-week middle school chemistry curriculum, How can I make new stuff from old stuff?, which was enacted by six teachers with 578 students during the 2004-2005 school year. Overall, students' written scientific explanations improved during the unit in which they were provided with multiple forms of teacher and curricular support. A growth curve model of student learning showed that there was a significant difference in the effect of the two curricular scaffolds towards the end of the unit and on the posttest. The context-specific scaffolds resulted in greater student learning of how to write scientific explanations, but only for three of the six teachers. The case studies created from the videotapes of classroom enactments revealed that teachers varied in which instructional practices they engaged in and the quality of those practices. Analyses suggested that the curricular scaffolds and teacher instructional practices were synergistic in that the supports interacted and the effect of the written curricular scaffolds depended on the teacher's enactment of the curriculum. The context-specific curricular scaffolds were more successful in supporting students in this complex task only when teachers' enactments provided generic support for scientific explanation through instructional practices. For teachers who did not provide their students with generic support, neither curricular scaffold was more effective. Classrooms are complex systems in which multiple factors and the interactions between those factors influence student learning.
Writing Cancer Grant Applications | Center for Cancer Research
This course focuses on how to write clear and persuasive grant applications. The purpose is to increase the quality of your grant application by successfully communicating scientific data and ideas. Emphasis is placed on how to use the title, abstract, and introduction sections to draw in reviewers and how to write an organized and focused proposal using specific scientific aims.
Bölling, Christian; Weidlich, Michael; Holzhütter, Hermann-Georg
2014-01-01
Background Accounts of evidence are vital to evaluate and reproduce scientific findings and integrate data on an informed basis. Currently, such accounts are often inadequate, unstandardized and inaccessible for computational knowledge engineering even though computational technologies, among them those of the semantic web, are ever more employed to represent, disseminate and integrate biomedical data and knowledge. Results We present SEE (Semantic EvidencE), an RDF/OWL based approach for detailed representation of evidence in terms of the argumentative structure of the supporting background for claims even in complex settings. We derive design principles and identify minimal components for the representation of evidence. We specify the Reasoning and Discourse Ontology (RDO), an OWL representation of the model of scientific claims, their subjects, their provenance and their argumentative relations underlying the SEE approach. We demonstrate the application of SEE and illustrate its design patterns in a case study by providing an expressive account of the evidence for certain claims regarding the isolation of the enzyme glutamine synthetase. Conclusions SEE is suited to provide coherent and computationally accessible representations of evidence-related information such as the materials, methods, assumptions, reasoning and information sources used to establish a scientific finding by adopting a consistently claim-based perspective on scientific results and their evidence. SEE allows for extensible evidence representations, in which the level of detail can be adjusted and which can be extended as needed. It supports representation of arbitrarily many consecutive layers of interpretation and attribution and different evaluations of the same data. SEE and its underlying model could be a valuable component in a variety of use cases that require careful representation or examination of evidence for data presented on the semantic web or in other formats. PMID:25093070
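A claim-centred RDF representation of evidence of this kind can be sketched in a few triples. The Python example below uses rdflib with an invented namespace; the class and property names are placeholders standing in for the actual RDO vocabulary, which is not spelled out in the abstract.

```python
from rdflib import Graph, Namespace, Literal, RDF

# Placeholder namespace: the real RDO IRIs and term names are not given in the
# abstract, so these identifiers are assumptions for illustration only.
EX = Namespace("http://example.org/rdo-sketch#")

g = Graph()
g.bind("ex", EX)

claim = EX.claim1
experiment = EX.experiment1
paper = EX.source1

# A claim about glutamine synthetase, the experiment supporting it,
# and the publication in which that experiment is reported.
g.add((claim, RDF.type, EX.Claim))
g.add((claim, EX.states, Literal("Glutamine synthetase was isolated from the extract.")))
g.add((claim, EX.isSupportedBy, experiment))
g.add((experiment, RDF.type, EX.Experiment))
g.add((experiment, EX.usedMethod, Literal("ammonium sulfate fractionation")))
g.add((experiment, EX.reportedIn, paper))
g.add((paper, RDF.type, EX.Publication))

print(g.serialize(format="turtle"))
```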
High performance computing and communications: Advancing the frontiers of information technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.
Safety assessment for In-service Pressure Bending Pipe Containing Incomplete Penetration Defects
NASA Astrophysics Data System (ADS)
Wang, M.; Tang, P.; Xia, J. F.; Ling, Z. W.; Cai, G. Y.
2017-12-01
Incomplete penetration is a common defect in the welded joints of pressure pipes. However, the safety classification of pressure pipes containing incomplete penetration defects under current periodic inspection regulations is rather conservative. To reduce unnecessary repair of incomplete penetration defects, a scientific and applicable safety assessment method for pressure pipes is needed. In this paper, the stress analysis model of the pipe system was established for an in-service pressure bending pipe containing incomplete penetration defects. A local finite element model was set up to analyze the stress distribution at the defect location and to perform stress linearization. The applicability of two assessment methods, the simplified assessment and the U-factor assessment method, to incomplete penetration defects located in pressure bending pipes was then analyzed. The results can provide technical support for the safety assessment of complex pipelines in the future.
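The stress linearization step mentioned above reduces a through-thickness stress profile to membrane and bending components. The Python sketch below applies the standard linearization integrals to a made-up profile; it is a generic utility under that assumption, not the assessment procedure developed in the paper.

```python
import numpy as np

def linearize_stress(x, sigma):
    """Membrane and bending components of a through-thickness stress profile.

    x     : positions through the wall thickness, from 0 to t
    sigma : stress values at those positions
    Uses the standard membrane/bending linearization integrals:
      sigma_m = (1/t) * int sigma dx
      sigma_b = (6/t^2) * int sigma * (t/2 - x) dx
    """
    t = x[-1] - x[0]
    xi = x - x[0]
    sigma_m = np.trapz(sigma, xi) / t
    sigma_b = 6.0 / t**2 * np.trapz(sigma * (t / 2.0 - xi), xi)
    return sigma_m, sigma_b

if __name__ == "__main__":
    x = np.linspace(0.0, 10.0, 101)                       # wall thickness t = 10 mm
    sigma = 80.0 + 12.0 * (5.0 - x) + 5.0 * np.sin(x)     # made-up stress profile [MPa]
    print(linearize_stress(x, sigma))                     # membrane ~80, bending ~60 plus the sine part
```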
Data Transfer Advisor with Transport Profiling Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Yun, Daqing
The network infrastructures have been rapidly upgraded in many high-performance networks (HPNs). However, such infrastructure investment has not led to corresponding performance improvement in big data transfer, especially at the application layer, largely due to the complexity of optimizing transport control on end hosts. We design and implement ProbData, a PRofiling Optimization Based DAta Transfer Advisor, to help users determine the most effective data transfer method with the most appropriate control parameter values to achieve the best data transfer performance. ProbData employs a profiling optimization based approach to exploit the optimal operational zone of various data transfer methods in support of big data transfer in extreme scale scientific applications. We present a theoretical framework of the optimized profiling approach employed in ProbData as well as its detailed design and implementation. The advising procedure and performance benefits of ProbData are illustrated and evaluated by proof-of-concept experiments in real-life networks.
Liu, X-L; Liu, H-N; Tan, P-H
2017-08-01
Resonant Raman spectroscopy requires that the wavelength of the laser used is close to that of an electronic transition. A tunable laser source and a triple spectrometer are usually necessary for resonant Raman profile measurements. However, such a system is complex and has low signal throughput, which limits its wide application by the scientific community. Here, a tunable micro-Raman spectroscopy system based on a supercontinuum laser, a transmission grating, tunable filters, and a single-stage spectrometer is introduced to measure the resonant Raman profile. The supercontinuum laser in combination with the transmission grating provides a tunable excitation source with sub-nanometer bandwidth. Such a system exhibits continuous excitation tunability and high signal throughput. Its good performance and flexible tunability are verified by resonant Raman profile measurement of twisted bilayer graphene, which demonstrates its potential for resonant Raman spectroscopy applications.
The FDA's role in medical device clinical studies of human subjects
NASA Astrophysics Data System (ADS)
Saviola, James
2005-03-01
This paper provides an overview of the United States Food and Drug Administration's (FDA) role as a regulatory agency in medical device clinical studies involving human subjects. The FDA's regulations and responsibilities are explained and the device application process discussed. The specific medical device regulatory authorities are described as they apply to the development and clinical study of retinal visual prosthetic devices. The FDA medical device regulations regarding clinical studies of human subjects are intended to safeguard the rights and safety of subjects. The data gathered in pre-approval clinical studies provide a basis of valid scientific evidence in order to demonstrate the safety and effectiveness of a medical device. The importance of a working understanding of applicable medical device regulations from the beginning of the device development project is emphasized particularly for novel, complex products such as implantable visual prosthetic devices.
A Domain Analysis Model for eIRB Systems: Addressing the Weak Link in Clinical Research Informatics
He, Shan; Narus, Scott P.; Facelli, Julio C.; Lau, Lee Min; Botkin, Jefferey R.; Hurdle, John F.
2014-01-01
Institutional Review Boards (IRBs) are a critical component of clinical research and can become a significant bottleneck due to the dramatic increase in both the volume and complexity of clinical research. Despite the interest in developing clinical research informatics (CRI) systems and supporting data standards to increase clinical research efficiency and interoperability, informatics research in the IRB domain has not attracted much attention in the scientific community. The lack of standardized and structured application forms across different IRBs causes inefficient and inconsistent proposal reviews and cumbersome workflows. These issues are even more prominent in multi-institutional clinical research that is rapidly becoming the norm. This paper proposes and evaluates a domain analysis model for electronic IRB (eIRB) systems, paving the way for streamlined clinical research workflow via integration with other CRI systems and improved IRB application throughput via computer-assisted decision support. PMID:24929181
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Lin; Gupta, Hoshin V.; Gao, Xiaogang; Sorooshian, Soroosh; Imam, Bisher
2002-12-01
Artificial neural networks (ANNs) can be useful in the prediction of hydrologic variables, such as streamflow, particularly when the underlying processes have complex nonlinear interrelationships. However, conventional ANN structures suffer from network training issues that significantly limit their widespread application. This paper presents a multivariate ANN procedure entitled self-organizing linear output map (SOLO), whose structure has been designed for rapid, precise, and inexpensive estimation of network structure/parameters and system outputs. More important, SOLO provides features that facilitate insight into the underlying processes, thereby extending its usefulness beyond forecast applications as a tool for scientific investigations. These characteristics are demonstrated using a classic rainfall-runoff forecasting problem. Various aspects of model performance are evaluated in comparison with other commonly used modeling approaches, including multilayer feedforward ANNs, linear time series modeling, and conceptual rainfall-runoff modeling.
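SOLO's two-stage structure, classifying inputs onto a grid of nodes and then attaching a separate linear output map to each node, can be imitated in a few lines of NumPy. The Python sketch below is only in that spirit: a plain nearest-prototype clustering stands in for the self-organizing map stage, so this is an illustration, not the published SOLO algorithm.

```python
import numpy as np

def fit_solo_like(X, y, n_nodes=9, n_iter=50, rng=None):
    """Cluster inputs into nodes, then fit one linear output map per node."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    protos = X[rng.choice(len(X), n_nodes, replace=False)]

    for _ in range(n_iter):                        # Lloyd-style prototype updates
        labels = np.argmin(((X[:, None, :] - protos) ** 2).sum(-1), axis=1)
        for k in range(n_nodes):
            if np.any(labels == k):
                protos[k] = X[labels == k].mean(axis=0)

    coefs = np.zeros((n_nodes, X.shape[1] + 1))    # per-node linear map with bias term
    Xb = np.hstack([X, np.ones((len(X), 1))])
    for k in range(n_nodes):
        idx = labels == k
        if idx.sum() >= Xb.shape[1]:
            coefs[k], *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        else:                                      # too few samples: fall back to a constant
            coefs[k, -1] = y[idx].mean() if idx.any() else y.mean()
    return protos, coefs

def predict_solo_like(X, protos, coefs):
    """Route each input to its nearest node and apply that node's linear map."""
    X = np.asarray(X, dtype=float)
    labels = np.argmin(((X[:, None, :] - protos) ** 2).sum(-1), axis=1)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.einsum("ij,ij->i", Xb, coefs[labels])
```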
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology to extract the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has produced huge volumes of data, and the associated warehouse management, together with the need to optimize analysis and mining procedures, has changed how modern science is conducted. Classical data exploration, based on users' own local data storage and limited computing infrastructures, is no longer efficient in the case of MDS, spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, and related techniques, that is, of all those techniques which fall into the domain of so-called e-science (recently described as the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object oriented programming, distributed computing, and parallel programming need to become an essential part of scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications that offer a wide variety of e-science communities a large spectrum of computational facilities for exploiting the wealth of available massive data sets with powerful machine learning and statistical algorithms will also be introduced.
Scientific Discovery through Advanced Computing in Plasma Science
NASA Astrophysics Data System (ADS)
Tang, William
2005-03-01
Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.
Future Sky Surveys: New Discovery Frontiers
NASA Astrophysics Data System (ADS)
Tyson, J. Anthony; Borne, Kirk D.
2012-03-01
Driven by the availability of new instrumentation, there has been an evolution in astronomical science toward comprehensive investigations of new phenomena. Major advances in our understanding of the Universe over the history of astronomy have often arisen from dramatic improvements in our capability to observe the sky to greater depth, in previously unexplored wavebands, with higher precision, or with improved spatial, spectral, or temporal resolution. Substantial progress in the important scientific problems of the next decade (determining the nature of dark energy and dark matter, studying the evolution of galaxies and the structure of our own Milky Way, opening up the time domain to discover faint variable objects, and mapping both the inner and outer Solar System) can be achieved through the application of advanced data mining methods and machine learning algorithms operating on the numerous large astronomical databases that will be generated from a variety of revolutionary future sky surveys. Over the next decade, astronomy will irrevocably enter the era of big surveys and of really big telescopes. New sky surveys (some of which will produce petabyte-scale data collections) will begin their operations, and one or more very large telescopes (ELTs = Extremely Large Telescopes) will enter the construction phase. These programs and facilities will generate a remarkable wealth of data of high complexity, endowed with enormous scientific knowledge discovery potential. New parameter spaces will be opened, in multiple wavelength domains as well as the time domain, across wide areas of the sky, and down to unprecedented faint source flux limits. The synergies of grand facilities, massive data collections, and advanced machine learning algorithms will come together to enable discoveries within most areas of astronomical science, including Solar System, exo-planets, star formation, stellar populations, stellar death, galaxy assembly, galaxy evolution, quasar evolution, and cosmology. Current and future sky surveys, comprising an alphabet soup of project names (e.g., Pan-STARRS, WISE, Kepler, DES, VST, VISTA, GAIA, EUCLID, SKA, LSST, and WFIRST; some of which are discussed in Chapters 17, 18, and 20), will contribute to the exponential explosion of complex data in astronomy. The scientific goals of these projects are as monumental as the programs themselves. The core scientific output of all of these will be their scientific data collection. Consequently, data mining and machine learning algorithms and specialists will become a common component of future astronomical research with these facilities. This synergistic combination and collaboration among multiple disciplines are essential in order to maximize the scientific discovery potential, the science output, the research efficiency, and the success of these projects.
Early patterns of commercial activity in graphene
NASA Astrophysics Data System (ADS)
Shapira, Philip; Youtie, Jan; Arora, Sanjay
2012-03-01
Graphene, a novel nanomaterial consisting of a single layer of carbon atoms, has attracted significant attention due to its distinctive properties, including great strength, electrical and thermal conductivity, lightness, and potential benefits for diverse applications. The commercialization of scientific discoveries such as graphene is inherently uncertain, with the lag time between the scientific development of a new technology and its adoption by corporate actors revealing the extent to which firms are able to absorb knowledge and engage in learning to implement applications based on the new technology. From this perspective, we test for the existence of three different corporate learning and activity patterns: (1) a linear process where patenting follows scientific discovery; (2) a double-boom phenomenon where corporate (patenting) activity is first concentrated in technological improvements and then followed by a period of technology productization; and (3) a concurrent model where scientific discovery in publications occurs in parallel with patenting. By analyzing corporate publication and patent activity across country and application lines, we find that, while graphene as a whole is experiencing concurrent scientific development and patenting growth, country- and application-specific trends offer some evidence of the linear and double-boom models.
Text mining applications in psychiatry: a systematic literature review.
Abbe, Adeline; Grouin, Cyril; Zweigenbaum, Pierre; Falissard, Bruno
2016-06-01
The expansion of biomedical literature is creating the need for efficient tools to keep pace with increasing volumes of information. Text mining (TM) approaches are becoming essential to facilitate the automated extraction of useful biomedical information from unstructured text. We reviewed the applications of TM in psychiatry, and explored its advantages and limitations. A systematic review of the literature was carried out using the CINAHL, Medline, EMBASE, PsycINFO and Cochrane databases. In this review, 1103 papers were screened, and 38 were included as applications of TM in psychiatric research. Using TM and content analysis, we identified four major areas of application: (1) Psychopathology (i.e. observational studies focusing on mental illnesses), (2) the Patient perspective (i.e. patients' thoughts and opinions), (3) Medical records (i.e. safety issues, quality of care and description of treatments), and (4) Medical literature (i.e. identification of new scientific information in the literature). The information sources were qualitative studies, Internet postings, medical records and biomedical literature. Our work demonstrates that TM can contribute to complex research tasks in psychiatry. We discuss the benefits, limits, and further applications of this tool in the future. Copyright © 2015 John Wiley & Sons, Ltd.
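As a minimal illustration of the kind of automated extraction TM performs on unstructured text, the Python sketch below builds a term-frequency matrix from a few toy clinical-style notes with scikit-learn. The documents and vocabulary are invented, and none of the reviewed studies' pipelines are reproduced here.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Invented, toy clinical-style notes (not data from the reviewed studies).
notes = [
    "patient reports low mood and poor sleep",
    "anxiety symptoms improved after therapy",
    "poor sleep and anxiety reported at follow-up",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(notes)          # documents x terms count matrix

# Aggregate counts across documents to see which terms dominate the corpus.
for term, count in zip(vectorizer.get_feature_names_out(), X.sum(axis=0).A1):
    print(f"{term}: {count}")
```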
"gnparser": a powerful parser for scientific names based on Parsing Expression Grammar.
Mozzherin, Dmitry Y; Myltsev, Alexander A; Patterson, David J
2017-05-26
Scientific names in biology act as universal links. They allow us to cross-reference information about organisms globally. However, variations in spelling of scientific names greatly diminish their ability to interconnect data. Such variations may include abbreviations, annotations, misspellings, etc. Authorship is a part of a scientific name and may also differ significantly. To match all possible variations of a name we need to divide them into their elements and classify each element according to its role. We refer to this as 'parsing' the name. Parsing categorizes a name's elements into those that are stable and those that are prone to change. Names are matched first by combining them according to their stable elements. Matches are then refined by examining their varying elements. This two-stage process dramatically improves the number and quality of matches. It is especially useful for the automatic data exchange within the context of "Big Data" in biology. We introduce Global Names Parser (gnparser). It is a tool for parsing scientific names, written in Scala (a language for the Java Virtual Machine) and based on a Parsing Expression Grammar. The parser can be applied to scientific names of any complexity. It assigns a semantic meaning (such as genus name, species epithet, rank, year of publication, authorship, annotations, etc.) to all elements of a name. It is able to work with nested structures as in the names of hybrids. gnparser performs with ≈99% accuracy and processes 30 million name-strings/hour per CPU thread. The gnparser library is compatible with Scala, Java, R, Jython, and JRuby. The parser can be used as a command line application, as a socket server, a web-app or as a RESTful HTTP-service. It is released under an open-source MIT license. Global Names Parser (gnparser) is a fast, high-precision tool for biodiversity informaticians and biologists working with large numbers of scientific names. It can replace expensive and error-prone manual parsing and standardization of scientific names in many situations, and can quickly enhance the interoperability of distributed biological information.
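The idea of assigning semantic roles to a name's elements can be illustrated with a deliberately tiny grammar. The Python sketch below uses a regular expression rather than a Parsing Expression Grammar and handles only simple binomials with optional authorship and year; it is a toy illustration, not gnparser's actual grammar or API.

```python
import re

# Toy grammar (NOT gnparser's PEG): split a simple binomial name-string into
# semantically labelled elements: genus, species epithet, authorship, year.
NAME_RE = re.compile(
    r"^(?P<genus>[A-Z][a-z]+)"
    r"\s+(?P<species>[a-z]+)"
    r"(?:\s+\(?(?P<authorship>[A-Z][^,()]+),\s*(?P<year>\d{4})\)?)?\s*$"
)

def parse_name(name_string):
    """Return a dict of labelled name elements, or None if the toy grammar fails."""
    m = NAME_RE.match(name_string.strip())
    return m.groupdict() if m else None

if __name__ == "__main__":
    print(parse_name("Bactrocera dorsalis (Hendel, 1912)"))
    # {'genus': 'Bactrocera', 'species': 'dorsalis', 'authorship': 'Hendel', 'year': '1912'}
```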
Teaching Scientific Communication Skills in Science Studies: Does It Make a Difference?
ERIC Educational Resources Information Center
Spektor-Levy, Ornit; Eylon, Bat-Sheva; Scherz, Zahava
2009-01-01
This study explores the impact of "Scientific Communication" (SC) skills instruction on students' performances in scientific literacy assessment tasks. We present a general model for skills instruction, characterized by explicit and spiral instruction, integration into content learning, practice in several scientific topics, and application of…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... DEPARTMENT OF VETERANS AFFAIRS Health Services Research and Development Service Scientific Merit... Research and Development Service Scientific Merit Review Board will meet on February 13-14, 2013, at the... research. Applications are reviewed for scientific and technical merit. Recommendations regarding funding...
78 FR 18963 - Endangered and Threatened Species; Take of Anadromous Fish
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-28
... applications for scientific research and enhancement. SUMMARY: Notice is hereby given that NMFS has received three scientific research and enhancement permit applications relating to anadromous species listed under the Endangered Species Act (ESA). The proposed research activities are intended to increase...
Systems in Science: Modeling Using Three Artificial Intelligence Concepts.
ERIC Educational Resources Information Center
Sunal, Cynthia Szymanski; Karr, Charles L.; Smith, Coralee; Sunal, Dennis W.
2003-01-01
Describes an interdisciplinary course focusing on modeling scientific systems. Investigates elementary education majors' applications of three artificial intelligence concepts used in modeling scientific systems before and after the course. Reveals a great increase in understanding of concepts presented but inconsistent application. (Author/KHR)