Sample records for open distributed processing

  1. 75 FR 4741 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... proposes to revise its standards to reflect changes and updates for Express Mail® Open and Distribute and Priority Mail® Open and Distribute to improve efficiencies in processing and to control...

  2. 75 FR 14076 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... to reflect changes and updates for Express Mail® Open and Distribute and Priority Mail® Open and Distribute to improve efficiencies in processing and to control costs. DATES: Effective Date...

  3. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework that supports the rapid development of high-performance processing pipelines for astronomical big data. We first detail the OpenCluster design principles and implementation and present the APIs provided by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems in developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby offers an easily integrated distributed computing framework for quickly developing a high-performance data processing system for astronomical telescopes while significantly reducing software development expenses.
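
    The record does not reproduce OpenCluster's API, but the manager/worker fan-out that such frameworks provide can be sketched with Python's standard library. The frame list and per-frame task below are hypothetical stand-ins for pipeline stages, not OpenCluster calls.

        # Generic manager/worker fan-out in the spirit of frameworks like
        # OpenCluster; this is NOT the OpenCluster API, only the pattern.
        from concurrent.futures import ProcessPoolExecutor, as_completed

        def reduce_frame(path):
            # Stand-in for a per-frame pipeline stage (calibration, imaging, ...).
            return path, "ok"

        if __name__ == "__main__":
            frames = [f"frame_{i:04d}.fits" for i in range(16)]  # hypothetical inputs
            with ProcessPoolExecutor(max_workers=4) as pool:
                futures = [pool.submit(reduce_frame, f) for f in frames]
                for done in as_completed(futures):
                    print(*done.result())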

  4. "WWW.MDTF.ORG": a World Wide Web forum for developing open-architecture, freely distributed, digital teaching file software by participant consensus.

    PubMed

    Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R

    2001-06-01

    To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process will include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open-source DTF software that will include function and interface design decisions from community participation on the website forums.

  5. Characterizing Crowd Participation and Productivity of Foldit Through Web Scraping

    DTIC Science & Technology

    2016-03-01

    Berkeley Open Infrastructure for Network Computing CDF Cumulative Distribution Function CPU Central Processing Unit CSSG Crowdsourced Serious Game...computers at once can create a similar capacity. According to Anderson [6], principal investigator for the Berkeley Open Infrastructure for Network...extraterrestrial life. From this project, a software-based distributed computing platform called the Berkeley Open Infrastructure for Network Computing

  6. Object Management Group object transaction service based on an X/Open and International Organization for Standardization open systems interconnection transaction processing kernel

    NASA Astrophysics Data System (ADS)

    Liang, J.; Sédillot, S.; Traverson, B.

    1997-09-01

    This paper addresses federation of a transactional object standard - Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are somewhat complementary. OTS defines a set of external interfaces without specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS compliant system, which, by initiating the extensibility and openness strengths of OSI TP, is able to provide interoperability between X/Open DTP and OMG OTS models.

  7. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  8. Open quantum random walk in terms of quantum Bernoulli noise

    NASA Astrophysics Data System (ADS)

    Wang, Caishi; Wang, Ce; Ren, Suling; Tang, Yuling

    2018-03-01

    In this paper, we introduce an open quantum random walk, which we call the QBN-based open walk, by means of quantum Bernoulli noise, and study its properties from a random walk point of view. We prove that, with the localized ground state as its initial state, the QBN-based open walk has the same limit probability distribution as the classical random walk. We also show that the probability distributions of the QBN-based open walk include those of the unitary quantum walk recently introduced by Wang and Ye (Quantum Inf Process 15:1897-1908, 2016) as a special case.
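
    For reference, the classical limit invoked here is the central-limit behaviour of the simple symmetric random walk: after n steps of ±1,

        \[
          \frac{X_n}{\sqrt{n}} \;\xrightarrow{d}\; \mathcal{N}(0,1), \qquad n \to \infty,
        \]

    so the QBN-based open walk started from the localized ground state shares this Gaussian limiting profile.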

  9. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
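
    A converter of this kind is essentially text-model parsing and re-emission. As a sketch of the general approach (not the authors' tool), the snippet below parses an illustrative OpenDSS-style "New Line." definition into a dictionary from which a phasor-model input row could be built.

        import re

        # Illustrative OpenDSS-style line definition (values are made up).
        dss = "New Line.650632 Bus1=RG60.1.2.3 Bus2=632.1.2.3 Length=2000 units=ft"

        def parse_dss_line(text):
            # Split "New Line.NAME key=value ..." into a flat dict.
            name = re.match(r"New\s+Line\.(\S+)", text, re.IGNORECASE).group(1)
            return {"name": name, **dict(re.findall(r"(\w+)=(\S+)", text))}

        print(parse_dss_line(dss))
        # {'name': '650632', 'Bus1': 'RG60.1.2.3', 'Bus2': '632.1.2.3',
        #  'Length': '2000', 'units': 'ft'}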

  10. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  11. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    NASA Astrophysics Data System (ADS)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noted the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data and expert judgments, we have divided reliability estimation based on a distribution hypothesis into a cognition process and a reliability calculation. To illustrate this modification, we take information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and complete the reliability estimation for the open function of a cabin door affected by imprecise judgment corresponding to the distribution hypothesis.

  12. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, iii) run and explore simulations in many ways: using the OpenFLUID software interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site: http://www.openfluid-project.org
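
    The landscape-as-hierarchical-graph data model can be illustrated with a toy directed graph of spatial units through which a flux is routed downstream; this sketch uses networkx as a stand-in and does not reflect OpenFLUID's actual C++ API.

        # Toy "landscape as a directed graph" flux routing (requires networkx).
        import networkx as nx

        g = nx.DiGraph()
        # Hypothetical surface units draining into a reach: SU1 -> SU2 -> RS1.
        g.add_edges_from([("SU1", "SU2"), ("SU2", "RS1")])
        runoff = {"SU1": 1.0, "SU2": 0.5, "RS1": 0.0}  # locally produced flux

        for unit in nx.topological_sort(g):            # upstream before downstream
            for downstream in g.successors(unit):
                runoff[downstream] += runoff[unit]     # route accumulated flux

        print(runoff)  # {'SU1': 1.0, 'SU2': 1.5, 'RS1': 1.5}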

  13. LibIsopach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas

    2016-12-06

    LibIsopach is a toolkit for high performance distributed immersive visualization, leveraging modern OpenGL. It features a multi-process scenegraph, explicit instance rendering, mesh generation, and three-dimensional user interaction event processing.

  14. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  15. Overview of Sea-Ice Properties, Distribution and Temporal Variations, for Application to Ice-Atmosphere Chemical Processes.

    NASA Astrophysics Data System (ADS)

    Moritz, R. E.

    2005-12-01

    The properties, distribution and temporal variation of sea-ice are reviewed for application to problems of ice-atmosphere chemical processes. Typical vertical structure of sea-ice is presented for different ice types, including young ice, first-year ice and multi-year ice, emphasizing factors relevant to surface chemistry and gas exchange. Time average annual cycles of large scale variables are presented, including ice concentration, ice extent, ice thickness and ice age. Spatial and temporal variability of these large scale quantities is considered on time scales of 1-50 years, emphasizing recent and projected changes in the Arctic pack ice. The amount and time evolution of open water and thin ice are important factors that influence ocean-ice-atmosphere chemical processes. Observations and modeling of the sea-ice thickness distribution function are presented to characterize the range of variability in open water and thin ice.

  16. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
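
    The "spatially distributed, time-dependent flux balance" idea can be caricatured in a few lines of numpy: each cell grows at a rate that in COMETS would come from a flux balance solve (a constant stand-in here), and biomass diffuses between neighbouring cells. Purely illustrative, not COMETS code.

        import numpy as np

        n_cells, n_steps = 50, 200
        biomass = np.zeros(n_cells)
        biomass[n_cells // 2] = 1.0      # inoculate the central cell
        mu, D, dt = 0.1, 0.2, 1.0        # growth rate (FBA stand-in), diffusivity, step

        for _ in range(n_steps):
            growth = mu * biomass        # in COMETS this rate comes from an FBA solve
            flux = D * (np.roll(biomass, 1) - 2 * biomass + np.roll(biomass, -1))
            biomass += dt * (growth + flux)  # periodic boundaries for simplicity

        print(f"total biomass after {n_steps} steps: {biomass.sum():.3g}")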

  17. Challenges of Using CSCL in Open Distributed Learning.

    ERIC Educational Resources Information Center

    Nilsen, Anders Grov; Instefjord, Elen J.

    As a compulsory part of the study in Pedagogical Information Science at the University of Bergen and Stord/Haugesund College (Norway) during the spring term of 1999, students participated in a distributed group activity that provided experience on distributed collaboration and use of online groupware systems. The group collaboration process was…

  18. Plastic debris in the open ocean

    PubMed Central

    Cózar, Andrés; Echevarría, Fidel; González-Gordillo, J. Ignacio; Irigoien, Xabier; Úbeda, Bárbara; Hernández-León, Santiago; Palma, Álvaro T.; Navarro, Sandra; García-de-Lomas, Juan; Ruiz, Andrea; Fernández-de-Puelles, María L.; Duarte, Carlos M.

    2014-01-01

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However, the global load of plastic on the open ocean surface was estimated to be on the order of tens of thousands of tons, far less than expected. Our observations of the size distribution of floating plastic debris point at important size-selective sinks removing millimeter-sized fragments of floating plastic on a large scale. This sink may involve a combination of fast nano-fragmentation of the microplastic into particles of microns or smaller, their transference to the ocean interior by food webs and ballasting processes, and processes yet to be discovered. Resolving the fate of the missing plastic debris is of fundamental importance to determine the nature and significance of the impacts of plastic pollution in the ocean. PMID:24982135

  19. Plastic debris in the open ocean.

    PubMed

    Cózar, Andrés; Echevarría, Fidel; González-Gordillo, J Ignacio; Irigoien, Xabier; Ubeda, Bárbara; Hernández-León, Santiago; Palma, Alvaro T; Navarro, Sandra; García-de-Lomas, Juan; Ruiz, Andrea; Fernández-de-Puelles, María L; Duarte, Carlos M

    2014-07-15

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However, the global load of plastic on the open ocean surface was estimated to be on the order of tens of thousands of tons, far less than expected. Our observations of the size distribution of floating plastic debris point at important size-selective sinks removing millimeter-sized fragments of floating plastic on a large scale. This sink may involve a combination of fast nano-fragmentation of the microplastic into particles of microns or smaller, their transference to the ocean interior by food webs and ballasting processes, and processes yet to be discovered. Resolving the fate of the missing plastic debris is of fundamental importance to determine the nature and significance of the impacts of plastic pollution in the ocean.

  20. Rock fracture processes in chemically reactive environments

    NASA Astrophysics Data System (ADS)

    Eichhubl, P.

    2015-12-01

    Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in a wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems, with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing the mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interaction associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip, allowing fracture propagation under subcritical stress loading conditions. We are evaluating the effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or the subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the host rock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone, we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed solution-precipitation creep in the host rock can independently affect fracture opening displacement and thus fracture aperture profiles and aperture distribution.
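
    For context, the subcritical index mentioned above is the exponent n in a commonly used empirical (Charles-type) crack-growth law; this form is standard in the subcritical fracture literature rather than quoted from the abstract:

        \[
          v \;=\; v_0 \left( \frac{K_I}{K_{Ic}} \right)^{n}, \qquad K_I < K_{Ic},
        \]

    where v is crack velocity and K_I the stress intensity factor; reactive fluids that lower K_Ic or n therefore promote fracture growth at lower loads.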

  1. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

    in Open-Source Intelligence OSINT, as one of the intelligence disciplines, bears some of the general problems of the intelligence "business" OSINT...ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE Knowledge work is the core business of military intelligence. As...NAVAL POSTGRADUATE SCHOOL Monterey, California THESIS Approved for public release; distribution is unlimited OPEN-SOURCE INTELLIGENCE IN THE

  2. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    NASA Astrophysics Data System (ADS)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete-time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete-time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete-time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site, in contrast to the (single-particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, which we name the "fused" matrix ansatz, to build explicitly the stationary distribution in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.
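
    For orientation, the matrix product form generalized by the "fused" ansatz is the standard one for the open ASEP, in which the stationary weight of an occupation configuration (τ_1, …, τ_L), τ_i ∈ {0, 1}, reads

        \[
          P(\tau_1,\dots,\tau_L) \;\propto\; \langle W \,|\, \prod_{i=1}^{L}
          \bigl( \tau_i D + (1-\tau_i) E \bigr) \,|\, V \rangle,
        \]

    with matrices D, E and boundary vectors ⟨W|, |V⟩ satisfying an algebra fixed by the dynamics (e.g. DE = D + E for the open TASEP with unit bulk rates).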

  3. An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software.

    PubMed

    Suhanic, West; Crandall, Ian; Pennefather, Peter

    2009-07-17

    Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations. The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model. The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears while also supporting remote diagnostic tracking, quality assessment and diagnostic process development. The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and groups in a distributed and collaborative fashion. The workstation enables local control over the creation and use of diagnostic data, while allowing for remote collaborative support of diagnostic data interpretation and tracking. It can enable global pooling of malaria disease information and the development of open, participatory, and adaptable laboratory medicine practices. The informatics model highlights how the larger issue of access to generic commoditized measurement, information processing, and communication technology in both high- and low-income countries can enable diagnostic services that are much less expensive, but substantially equivalent to those currently in use in high-income countries.

  4. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  5. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components using the framework's structural solvers and several gradient-based optimizers, along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a NASA GRC developed code. The reliability and efficiency of the OpenMDAO framework were compared and are reported here.
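
    The component/driver workflow described above looks roughly like this in the current Python API of OpenMDAO, which has evolved considerably since the version benchmarked in this 2012 report, so treat it as an illustrative sketch rather than the paper's setup:

        import openmdao.api as om

        # Minimization of a paraboloid with a gradient-based driver.
        prob = om.Problem()
        prob.model.add_subsystem(
            "parab", om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
            promotes=["*"])

        prob.driver = om.ScipyOptimizeDriver()
        prob.driver.options["optimizer"] = "SLSQP"
        prob.model.add_design_var("x", lower=-50.0, upper=50.0)
        prob.model.add_design_var("y", lower=-50.0, upper=50.0)
        prob.model.add_objective("f")

        prob.setup()
        prob.run_driver()
        # Analytic optimum: x = 20/3, y = -22/3, f = -27.333...
        print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))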

  6. 75 FR 56920 - Express Mail Open and Distribute and Priority Mail Open and Distribute

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and...] Open and Distribute containers. The Postal Service also proposes to revise the service commitment for Express Mail Open and Distribute as a guaranteed end of day product; and to add a five-pound minimum...

  7. New insight into California’s drought through open data

    USGS Publications Warehouse

    Read, Emily K.; Bucknell, Mary; Hines, Megan K.; Kreft, James M.; Lucido, Jessica M.; Read, Jordan S.; Schroedl, Carl; Sibley, David M.; Stephan, Shirley; Suftin, Ivan; Thongsavanh, Phethala; Van Den Hoek, Jamon; Walker, Jordan I.; Wernimont, Martin R; Winslow, Luke A.; Yan, Andrew N.

    2015-01-01

    Historically unprecedented drought in California has brought water issues to the forefront of the nation’s attention. Crucial investigations that concern water policy, management, and research, in turn, require extensive information about the quality and quantity of California’s water. Unfortunately, key sources of pertinent data are unevenly distributed and frequently hard to find. Thankfully, the vital importance of integrating water data across federal, state, tribal, academic, and private entities has recently been recognized and addressed through federal initiatives such as the Climate Data Initiative of President Obama’s Climate Action Plan and the Advisory Committee on Water Information’s Open Water Data Initiative. Here, we demonstrate an application of integrated open water data, visualized and made available online using open source software, for the purpose of exploring the impact of the current California drought. Our collaborative approach and technical tools enabled a rapid, distributed development process. Many positive outcomes have resulted: the application received recognition within and outside of the Federal Government, inspired others to visualize open water data, spurred new collaborations for our group, and strengthened the collaborative relationships within the team of developers. In this article, we describe the technical tools and collaborative process that enabled the success of the application.

  8. Weiqi games as a tree: Zipf's law of openings and beyond

    NASA Astrophysics Data System (ADS)

    Xu, Li-Gong; Li, Ming-Xia; Zhou, Wei-Xing

    2015-06-01

    Weiqi is one of the most complex board games played by two persons. The placement strategies adopted by Weiqi players are often used as analogues for the philosophy of human wars. In contrast to Western chess, Weiqi games are less studied by academics, partly because Weiqi is popular only in East Asia, especially in China, Japan and Korea. Here, we propose to construct a directed tree using a database of extensive Weiqi games and perform a quantitative analysis of the Weiqi tree. We find that the popularity distribution of Weiqi openings with the same number of moves is distributed according to a power law and the tail exponent increases with the number of moves. Intriguingly, the superposition of the popularity distributions of Weiqi openings with a number of moves not greater than a given number also has a power-law tail whose exponent increases with the number of moves, and the superposed distribution approaches the Zipf law. These findings are the same as for chess and support the conjecture that the popularity distribution of board game openings follows the Zipf law with a universal exponent. We also find that the distribution of out-degrees has a power-law form, the distribution of branching ratios has a very complicated pattern, and the distribution of uniqueness scores defined by the path lengths from the root vertex to the leaf vertices exhibits a unimodal shape. Our work provides a promising direction for the study of the decision-making process of Weiqi playing from the perspective of the directed branching tree.
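
    The Zipf law in question says that when openings are ranked by popularity, the frequency of the opening of rank r decays as a power of its rank,

        \[
          f(r) \;\propto\; r^{-\alpha},
        \]

    with α ≈ 1 in the classical Zipf case; the abstract's claim is that the superposed popularity distribution of openings approaches this form with a universal exponent.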

  9. School District Purchasing.

    ERIC Educational Resources Information Center

    Natale, Joseph L.

    This chapter of "Principles of School Business Management" discusses the effective management of purchasing processes in a school district. These processes include obtaining materials, supplies, and equipment of maximum value for the least expense, and receiving, storing, and distributing the items obtained. The chapter opens with an overview of…

  10. Microfracture spacing distributions and the evolution of fracture patterns in sandstones

    NASA Astrophysics Data System (ADS)

    Hooker, J. N.; Laubach, S. E.; Marrett, R.

    2018-03-01

    Natural fracture patterns in sandstone were sampled using scanning electron microscope-based cathodoluminescence (SEM-CL) imaging. All fractures are opening-mode and are fully or partially sealed by quartz cement. Most sampled fractures are too small to be height-restricted by sedimentary layers. At very low strains (<∼0.001), fracture spatial distributions are indistinguishable from random, whereas at higher strains, fractures are generally statistically clustered. All 12 large (N > 100) datasets show spacings that are best fit by log-normal size distributions, compared to exponential, power law, or normal distributions. The clustering of fractures suggests that the locations of natural fractures are not determined by a random process. To investigate natural fracture localization, we reconstructed the opening history of a cluster of fractures within the Huizachal Group in northeastern Mexico, using fluid inclusions from synkinematic cements and thermal-history constraints. The largest fracture, which is the only fracture in the cluster visible to the naked eye, among 101 present, opened relatively late in the sequence. This result suggests that the growth of sets of fractures is a self-organized process, in which small, initially isolated fractures grow and progressively interact, with preferential growth of a subset of fractures developing at the expense of growth of the rest. Size-dependent sealing of fractures within sets suggests that synkinematic cementation may contribute to fracture clustering.
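
    For reference, a log-normal spacing distribution has density

        \[
          p(s) \;=\; \frac{1}{s\,\sigma\sqrt{2\pi}}
          \exp\!\left( -\frac{(\ln s - \mu)^2}{2\sigma^2} \right), \qquad s > 0,
        \]

    in contrast to the exponential spacings expected for spatially random (Poisson) fracture placement, which is the null hypothesis the clustering result is weighed against.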

  11. Barista: A Framework for Concurrent Speech Processing by USC-SAIL

    PubMed Central

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G.; Narayanan, Shrikanth S.

    2016-01-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0. PMID:27610047
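
    Barista's networks are libcppa (C++) actors; purely to make the dataflow-of-actors idea concrete in a short runnable form, here is a toy two-stage pipeline with message-passing workers in Python. The stages are invented and this is not Barista's API.

        import threading, queue

        def actor(inbox, outbox, work):
            # Minimal "actor": consume messages until None, forward results.
            for msg in iter(inbox.get, None):
                outbox.put(work(msg))
            outbox.put(None)                  # propagate shutdown downstream

        frames, feats, hyps = queue.Queue(), queue.Queue(), queue.Queue()
        # Invented stages standing in for Kaldi-style feature/decoder actors.
        threading.Thread(target=actor, args=(frames, feats, lambda x: x * 2)).start()
        threading.Thread(target=actor, args=(feats, hyps, lambda x: x + 1)).start()

        for x in range(5):
            frames.put(x)
        frames.put(None)                      # end-of-stream marker

        print([y for y in iter(hyps.get, None)])  # [1, 3, 5, 7, 9]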

  12. Barista: A Framework for Concurrent Speech Processing by USC-SAIL.

    PubMed

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S

    2014-05-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.

  13. Telescience - Optimizing aerospace science return through geographically distributed operations

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad M.

    1990-01-01

    The paper examines the objectives and requirements of telescience, defined as the means and process for scientists, NASA operations personnel, and astronauts to conduct payload operations as if they were colocated. This process is described in terms of Space Station era platforms. Some of the enabling technologies are discussed, including open architecture workstations, distributed computing, transaction management, expert systems, and high-speed networks. Recent testbedding experiments are surveyed to highlight some of the human factors requirements.

  14. Coastal Processes with Improved Tidal Opening in Chilika Lagoon (east Coast of India)

    NASA Astrophysics Data System (ADS)

    Jayaraman, Girija; Dube, Anumeha

    Chilika Lagoon (19°28′-19°54′N and 85°06′-85°36′E) is the largest brackish water lagoon with estuarine character. Interest in detailed analysis of the ecology of the lagoon and the various factors affecting it is due to the opening of the new mouth on September 23, 2000 to resolve the threat to its environment from various factors — eutrophication, weed proliferation, siltation, industrial pollution, and depletion of bioresources. The opening of the new mouth has changed the lagoon environment significantly, with better socio-economic implications. There is serious concern whether the significant improvement in the biological productivity of the lagoon after the mouth opening is indeed sustainable. The present study focuses on the changes in the coastal processes as a result of the additional opening of a new mouth. Our results, based on mathematical modeling and numerical simulation, compare the dynamics, nutrient, and plankton distribution before and after the new mouth opening. The model could confirm the significant increase (14-66% depending on the sector) in salinity after the new mouth opening, the maximum change being observed in the channel which connects the lagoon to the sea. The constriction in the lagoon which blocks the tidal effects entering the lagoon must be responsible for maintaining the main body of the lagoon at low salinity. The ecological model is first tested for different sectors individually before a complete model, covering the entire lagoon area and incorporating the sectors' distinct characteristics, is constructed. The model is validated with available observations of plankton and nutrients made before the opening of the new mouth. It predicts the annual distribution of plankton in all the sectors of the lagoon after the mouth opening, which is to be verified when the data become available.

  15. Coastal Processes with Improved Tidal Opening in Chilika Lagoon (east Coast of India)

    NASA Astrophysics Data System (ADS)

    Jayaraman, Girija; Dube, Anumeha

    Chilika Lagoon (19°28′-19°54′N and 85°06′-85°36′E) is the largest brackish water lagoon with estuarine character. Interest in detailed analysis of the ecology of the lagoon and the various factors affecting it is due to the opening of the new mouth on September 23, 2000 to resolve the threat to its environment from various factors — eutrophication, weed proliferation, siltation, industrial pollution, and depletion of bioresources. The opening of the new mouth has changed the lagoon environment significantly, with better socio-economic implications. There is serious concern whether the significant improvement in the biological productivity of the lagoon after the mouth opening is indeed sustainable. The present study focuses on the changes in the coastal processes as a result of the additional opening of a new mouth. Our results, based on mathematical modeling and numerical simulation, compare the dynamics, nutrient, and plankton distribution before and after the new mouth opening. The model could confirm the significant increase (14-66% depending on the sector) in salinity after the new mouth opening, the maximum change being observed in the channel which connects the lagoon to the sea. The constriction in the lagoon which blocks the tidal effects entering the lagoon must be responsible for maintaining the main body of the lagoon at low salinity. The ecological model is first tested for different sectors individually before a complete model, covering the entire lagoon area and incorporating the sectors' distinct characteristics, is constructed. The model is validated with available observations of plankton and nutrients made before the opening of the new mouth. It predicts the annual distribution of plankton in all the sectors of the lagoon after the mouth opening, which is to be verified when the data become available.

  16. 75 FR 72686 - Express Mail Open and Distribute and Priority Mail Open and Distribute

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... "DB" prefix along with new Tag 257, Tag 267, or Label 257S, on all Express Mail® Open and Distribute containers. The Postal Service is also revising the service commitment for Express Mail Open and...

  17. Harness That S.O.B.: Distributing Remote Sensing Analysis in a Small Office/Business

    NASA Astrophysics Data System (ADS)

    Kramer, J.; Combe, J.; McCord, T. B.

    2009-12-01

    Researchers in a small office/business (SOB) operate with limited funding, equipment, and software availability. To mitigate these issues, we developed a distributed computing framework that: 1) leverages open source software to implement functionality otherwise reliant on proprietary software and 2) harnesses the unused power of (semi-)idle office computers with mixed operating systems (OSes). This abstract outlines some reasons for the effort, its conceptual basis and implementation, and provides brief speedup results. The Multiple-Endmember Linear Spectral Unmixing Model (MELSUM) [1] processes remote-sensing (hyper-)spectral images. The algorithm is computationally expensive, sometimes taking a full week or more for a 1 million pixel/100 wavelength image. Analysis of pixels is independent, so a large benefit can be gained from parallel processing techniques. Job concurrency is limited by the number of active processing units. MELSUM was originally written in the Interactive Data Language (IDL). Despite its multi-threading capabilities, an IDL instance executes on a single machine, and so concurrency is limited by the machine's number of central processing units (CPUs). Network distribution can access more CPUs to provide a greater speedup, while also taking advantage of (often) underutilized extant equipment. Appropriately integrating open source software magnifies the impact by avoiding the purchase of additional licenses. Our method of distribution breaks into four conceptual parts: 1) the top- or task-level user interface; 2) a mid-level program that manages hosts and jobs, called the distribution server; 3) a low-level executable for individual pixel calculations; and 4) a control program to synchronize sequential sub-tasks. Each part is a separate OS process, passing information via shell commands and/or temporary files. While the control and low-level executables are short-lived, the top-level program and distribution server run (at least) for the entirety of a task. While any language that supports "spawning" of OS processes can serve as the top-level interface, our solution, d-MELSUM, has been integrated with the IDL code. Doing so extracts the core calculation from IDL, but otherwise preserves IDL features and functionality. The distribution server is an extension of the ADE [2] mobile robot software, written in Java. Network connections rely on a secure shell (SSH) implementation, whether natively available (e.g., Linux or OS X) or user installed (e.g., OpenSSH available via Cygwin on Windows). Both the low-level and control programs are relatively small C++ programs (~54K, or 1500 lines, total) that were developed in-house and use GNU's g++ compiler. The low-level code also relies on Linear Algebra PACKage (LAPACK) libraries for pixel calculations. Although performance is contingent to some degree on data size, CPU speed, and network communication rate and latency, results have generally demonstrated a time reduction by a factor proportional to the number of open connections (one per CPU). For example, the task mentioned above requiring a week to process took 18 hours with d-MELSUM, using 10 CPUs on 2 computers. [1] J.-Ph. Combe, et al., PSS 56, 2008. [2] J. Kramer and M. Scheutz, IROS 2006, 2006.
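
    As a minimal sketch of the SSH fan-out idea (not the authors' Java/IDL/C++ implementation), independent work units can be pushed to idle hosts with a thread pool and ssh; the host names and the remote "unmix" command below are placeholders.

        # Sketch of SSH fan-out over idle office machines, in the spirit of
        # d-MELSUM's distribution server (not its actual implementation).
        import subprocess
        from concurrent.futures import ThreadPoolExecutor

        hosts = ["office-pc1", "office-pc2", "office-pc3"]   # hypothetical hosts
        chunks = [f"pixels_{i:03d}.dat" for i in range(12)]  # hypothetical work units

        def run_chunk(job):
            host, chunk = job
            # Assumes passwordless SSH and a remote 'unmix' executable (placeholders).
            return subprocess.run(["ssh", host, "unmix", chunk],
                                  capture_output=True, text=True).returncode

        with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
            jobs = [(hosts[i % len(hosts)], c) for i, c in enumerate(chunks)]
            print(list(pool.map(run_chunk, jobs)))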

  18. Biointervention makes leather processing greener: an integrated cleansing and tanning system.

    PubMed

    Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2003-06-01

    The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. This includes a great deal of solid waste such as lime and chrome sludge. In the approach described here, hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to those produced by traditional methods. This has been substantiated through scanning electron microscopy, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of the control. The process also demonstrates reductions in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.

  19. Matrix product representation of the stationary state of the open zero range process

    NASA Astrophysics Data System (ADS)

    Bertin, Eric; Vanicat, Matthieu

    2018-06-01

    Many one-dimensional lattice particle models with open boundaries, like the paradigmatic asymmetric simple exclusion process (ASEP), have their stationary states represented in the form of a matrix product, with matrices that do not explicitly depend on the lattice site. In contrast, the stationary state of the open 1D zero-range process (ZRP) takes an inhomogeneous factorized form, with site-dependent probability weights. We show that in spite of the absence of correlations, the stationary state of the open ZRP can also be represented in a matrix product form, where the matrices are site-independent, non-commuting and determined from algebraic relations resulting from the master equation. We recover the known distribution of the open ZRP in two different ways: first, using an explicit representation of the matrices and boundary vectors; second, from the sole knowledge of the algebraic relations satisfied by these matrices and vectors. Finally, an interpretation of the relation between the matrix product form and the inhomogeneous factorized form is proposed within the framework of hidden Markov chains.
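
    The inhomogeneous factorized form referred to here is the standard open-ZRP result: with hop rate u(n) out of a site occupied by n particles and site-dependent fugacities z_i set by the boundary rates, the stationary distribution is

        \[
          P(n_1,\dots,n_L) \;=\; \prod_{i=1}^{L} \frac{1}{Z_i} \,
          \frac{z_i^{\,n_i}}{\prod_{k=1}^{n_i} u(k)},
        \]

    with Z_i a single-site normalization; the paper's point is that this same state also admits a site-independent matrix product representation.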

  20. The Miniaturization of the AFIT Random Noise Radar

    DTIC Science & Technology

    2013-03-01

    RANDOM NOISE RADAR I. Introduction Recent advances in technology and signal processing techniques have opened the door to using an ultra-wide band random...AIR FORCE INSTITUTE OF TECHNOLOGY Wright-Patterson Air Force Base, Ohio DISTRIBUTION STATEMENT A. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED...and Computer Engineering Graduate School of Engineering and Management Air Force Institute of Technology Air University Air Education and Training

  1. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door for new ways of processing data on demand, alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems, such as handling different file formats and data types or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), OpenStreetMap tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
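
    Serving results through OGC standards means any client can pull a rendered product with a plain WMS GetMap request; the endpoint and layer below are placeholders, while the parameters are the standard WMS 1.3.0 set.

        import requests

        # Standard WMS 1.3.0 GetMap parameters; endpoint and layer are hypothetical.
        params = {
            "service": "WMS", "version": "1.3.0", "request": "GetMap",
            "layers": "landsat_ndvi", "styles": "",
            "crs": "EPSG:4326", "bbox": "-44,112,-10,154",  # lat/lon axis order in 1.3.0
            "width": 512, "height": 512, "format": "image/png",
        }
        resp = requests.get("https://example.org/wms", params=params, timeout=30)
        with open("ndvi.png", "wb") as f:
            f.write(resp.content)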

  2. Architectural Improvements and New Processing Tools for the Open XAL Online Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M

    The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.

  3. Magma Vesiculation and Infrasonic Activity in Open Conduit Volcanoes

    NASA Astrophysics Data System (ADS)

    Colo', L.; Baker, D. R.; Polacci, M.; Ripepe, M.

    2007-12-01

    At persistently active basaltic volcanoes such as Stromboli, Italy, degassing of the magma column can occur in "passive" and "active" conditions. Passive degassing is generally understood as a continuous, non-explosive release of gas, mainly from the open summit vents and subordinately from the conduit's walls or from fumaroles. In passive degassing the gas is generally in equilibrium with atmospheric pressure, while in active degassing the gas approaches the surface in overpressurized conditions. During active degassing (or puffing), the magma column is affected by the bursting of small gas bubbles at the magma free surface, and as a consequence the active degassing process generates infrasonic signals. We postulated, in this study, that the rate and the amplitude of infrasonic activity are linked to the rate and the volume of the overpressured gas bubbles generated in the magma column. Our hypothesis is that infrasound is controlled by the quantity of gas exsolved in the magma column and, therefore, that a relationship between infrasound and the vesiculation process should exist. In order to test this, infrasonic records and bubble size distributions of scoria samples from normal explosive activity at Stromboli, processed via X-ray tomography, have been compared. We observed that the cumulative distributions for both data sets follow similar power laws, indicating that both processes are controlled by a scale-invariant phenomenon. However, the power law is not stable but changes in different scoria clasts, reflecting when gas bubble nucleation is predominant over bubble coalescence and vice versa. The power law also changes for the infrasonic activity from time to time, suggesting that infrasound may also be controlled by varying gas exsolution within the magma column. Changes in power-law distributions are the same for infrasound and scoria, indicating that they are linked to the same process acting in the magmatic system. We suggest that monitoring infrasound on an active volcano could represent an alternative way to monitor the vesiculation process of an open conduit system.
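
    The compared cumulative distributions have the generic power-law form

        \[
          N(>V) \;\propto\; V^{-\gamma},
        \]

    where N(>V) counts bubbles larger than volume V (or infrasonic events above a given amplitude); the parallel changes in the exponent γ across the two data sets are what tie infrasound to vesiculation.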

  4. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    PubMed

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high-quality archetypes which enable comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals, including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.

  5. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by protective branch-switching operations. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-processing (OpenMP) on a shared-memory platform, and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performances for running parallel dynamic simulation are compared and demonstrated.
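
    As a rough illustration of the two paradigms being compared (the paper's implementations use OpenMP and MPI directly; this Python analogy uses multiprocessing for the shared-memory flavor and mpi4py for the distributed-memory flavor, and a trivial placeholder in place of the actual solver step):

        import numpy as np
        from multiprocessing import Pool

        def integrate_chunk(states):
            # stand-in for one step of the differential-algebraic solve
            return states + 0.01 * np.tanh(states)

        # Shared-memory flavor (single node; the OpenMP analogue).
        def shared_memory_step(states, nworkers=4):
            chunks = np.array_split(states, nworkers)
            with Pool(nworkers) as pool:
                return np.concatenate(pool.map(integrate_chunk, chunks))

        # Distributed-memory flavor (the MPI analogue); launch with e.g.
        #   mpiexec -n 4 python sim.py
        def mpi_step(global_states):
            from mpi4py import MPI
            comm = MPI.COMM_WORLD
            chunk = np.array_split(global_states, comm.Get_size())[comm.Get_rank()]
            local = integrate_chunk(chunk)                # each rank owns a sub-system
            return np.concatenate(comm.allgather(local))  # exchange results

        if __name__ == "__main__":
            print(shared_memory_step(np.linspace(-1, 1, 16))[:4])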

  6. Easy research data handling with an OpenEarth DataLab for geo-monitoring research

    NASA Astrophysics Data System (ADS)

    Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen

    2015-04-01

    OpenEarth DataLab is an open-source collaboration and processing platform that enables streamlined research data management, from raw data ingest and transformation to interoperable distribution. It enables geo-scientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programs. This DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack, an environment has been developed to orchestrate numerous geo-related open-source software components that can empower researchers and increase the overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace and hit & run data transformation processing in cloud infrastructure using MATLAB and Python; synchronisation of data and scripts (SVN); and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analyses and visualisation, and can be distributed via interoperable channels such as THREDDS (OpenDAP) and GeoServer. An example of a successful application of the OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved, concerned with weather, waves and currents, sand distribution, water table and water quality, flora and fauna, recreation and management. Researchers share and transform their data in the OpenEarth DataLab, which makes it possible to combine their data and to see the influence of different aspects of the coastal protection on their models. During the project the data are available only to the researchers involved. After the project, a large part of the data and scripts will be published with a DOI in the Data Archive of 3TU.Datacentrum for reuse in new research. For the 83 project members of the Sand Motor, the OpenEarth DataLab is available at www.zandmotordata.nl. The OpenEarth DataLab not only saves time and increases quality, but has the potential to open new frontiers for exploring cross-domain analysis and visualisation, revealing new scientific insights.
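
    Consuming a product served through the THREDDS/OpenDAP channel mentioned above requires only a NetCDF client. A minimal sketch, assuming a hypothetical dataset URL and variable name:

        # Sketch: reading an interoperable NetCDF product over OPeNDAP, the
        # kind of THREDDS-served output the DataLab distributes. The URL and
        # variable names are hypothetical placeholders.
        from netCDF4 import Dataset

        url = "http://example.org/thredds/dodsC/sandmotor/bathymetry.nc"
        with Dataset(url) as ds:                    # netCDF4 opens OPeNDAP URLs directly
            print(ds.variables.keys())              # discover available variables
            depth = ds.variables["depth"][0, :, :]  # first time slice
            print("mean depth:", depth.mean())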

  7. Extracting Message Inter-Departure Time Distributions from the Human Electroencephalogram

    PubMed Central

    Mišić, Bratislav; Vakorin, Vasily A.; Kovačević, Nataša; Paus, Tomáš; McIntosh, Anthony R.

    2011-01-01

    The complex connectivity of the cerebral cortex is a topic of much study, yet the link between structure and function is still unclear. The processing capacity and throughput of information at individual brain regions remains an open question and one that could potentially bridge these two aspects of neural organization. The rate at which information is emitted from different nodes in the network and how this output process changes under different external conditions are general questions that are not unique to neuroscience, but are of interest in multiple classes of telecommunication networks. In the present study we show how some of these questions may be addressed using tools from telecommunications research. An important system statistic for modeling and performance evaluation of distributed communication systems is the time between successive departures of units of information at each node in the network. We describe a method to extract and fully characterize the distribution of such inter-departure times from the resting-state electroencephalogram (EEG). We show that inter-departure times are well fitted by the two-parameter Gamma distribution. Moreover, they are not spatially or neurophysiologically trivial and instead are regionally specific and sensitive to the presence of sensory input. In both the eyes-closed and eyes-open conditions, inter-departure time distributions were more dispersed over posterior parietal channels, close to regions which are known to have the most dense structural connectivity. The biggest differences between the two conditions were observed at occipital sites, where inter-departure times were significantly more variable in the eyes-open condition. Together, these results suggest that message departure times are indicative of network traffic and capture a novel facet of neural activity. PMID:21673866
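
    Fitting the two-parameter Gamma distribution described here is straightforward with standard tools. A minimal sketch with synthetic inter-departure times in place of the EEG-derived ones (the paper's actual extraction procedure is not reproduced):

        # Sketch: fit a Gamma distribution to inter-departure times with SciPy,
        # fixing the location at zero so only shape and scale are estimated.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        inter_departure = rng.gamma(shape=2.0, scale=0.05, size=1000)  # seconds

        shape, loc, scale = stats.gamma.fit(inter_departure, floc=0)
        print(f"shape k = {shape:.2f}, scale theta = {scale:.3f}")

        # Goodness of fit via a Kolmogorov-Smirnov test against the fitted model.
        ks_stat, p_value = stats.kstest(inter_departure, "gamma", args=(shape, loc, scale))
        print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")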

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes one such reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have building information modeling (BIM), a building automation system (BAS), and a computerized maintenance management system (CMMS). The process can be deployed using an open-source transactional network platform, VOLTTRON™, designed for distributed sensing and controls and supporting both energy efficiency and grid services.

  9. Gas distributor for fluidized bed coal gasifier

    DOEpatents

    Worley, Arthur C.; Zboray, James A.

    1980-01-01

    A gas distributor for distributing high temperature reaction gases to a fluidized bed of coal particles in a coal gasification process. The distributor includes a pipe with a refractory reinforced lining and a plurality of openings in the lining through which gas is fed into the bed. These feed openings have an expanding tapered shape in the downstream or exhaust direction which aids in reducing the velocity of the gas jets as they enter the bed.

  10. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    NASA Astrophysics Data System (ADS)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated "off-the-shelf" watershed model in an introductory hydrology course, and 4) reduce the learning curve associated with analyzing meaningful real-world problems. The open-access MOD-WET and e-textbook have already been successfully incorporated within our undergraduate curriculum.
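
    MOD-WET itself is MATLAB-based; purely as an illustration of the modular design philosophy, the following Python sketch composes toy process functions (not MOD-WET's physical equations) into a minimal watershed model:

        # Illustrative sketch: each hydrologic process is its own function,
        # and the watershed model is simply their composition in a time loop.
        def snowmelt(swe, temp_c, ddf=3.0):
            melt = max(0.0, ddf * temp_c)          # degree-day melt (mm/day)
            melt = min(melt, swe)
            return swe - melt, melt

        def infiltration(rain_plus_melt, capacity=20.0):
            infil = min(rain_plus_melt, capacity)
            return infil, rain_plus_melt - infil   # infiltrated, surface runoff

        def simple_watershed(precip, temp_c):
            """Compose the process modules over a daily time series."""
            swe, runoff = 0.0, []
            for p, t in zip(precip, temp_c):
                snow = p if t <= 0 else 0.0
                rain = p - snow
                swe, melt = snowmelt(swe + snow, t)
                _, overland = infiltration(rain + melt)
                runoff.append(overland)
            return runoff

        print(simple_watershed([10, 0, 5, 30], [-2, 1, 3, 5]))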

  11. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.

  12. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen; hide

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  13. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    NASA Astrophysics Data System (ADS)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The now-completed, multi-disciplinary research project GLOWA-Danube has developed a regional-scale, integrated modeling system, which was successfully applied to the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open-source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Second, the model framework supporting integrated simulations, and all simulation models developed for OpenDanubia in the scope of GLOWA-Danube, remain available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects, supporting both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. These include the land-surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs. Although the complete system is relatively demanding in its data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced set of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for system installation was created, and both the program code of the framework and that of all major components is licensed under the GNU General Public License. In addition, helpful programs and scripts necessary for the operation and processing of input and result datasets are provided.

  14. Sensitivity of open-water ice growth and ice concentration evolution in a coupled atmosphere-ocean-sea ice model

    NASA Astrophysics Data System (ADS)

    Shi, Xiaoxu; Lohmann, Gerrit

    2017-09-01

    A coupled atmosphere-ocean-sea ice model is applied to investigate to what degree the area-thickness distribution of new ice formed in open water affects the ice and ocean properties. Two sensitivity experiments are performed which modify the horizontal-to-vertical aspect ratio of open-water ice growth. The resulting changes in the Arctic sea-ice concentration strongly affect the surface albedo, the ocean heat release to the atmosphere, and the sea-ice production. The changes are further amplified through a positive feedback mechanism among the Arctic sea ice, the Atlantic Meridional Overturning Circulation (AMOC), and the surface air temperature in the Arctic, as the Fram Strait sea ice import influences the freshwater budget in the North Atlantic Ocean. Anomalies in sea-ice transport lead to changes in sea surface properties of the North Atlantic and the strength of the AMOC. For the Southern Ocean, the most pronounced change is a warming along the Antarctic Circumpolar Current (ACC), owing to the interhemispheric bipolar seesaw linked to AMOC weakening. Another insight of this study lies in the improvement of our climate model: the ocean component FESOM is a newly developed ocean-sea ice model with an unstructured mesh and multi-resolution capability. We find that the subpolar sea-ice boundary in the Northern Hemisphere can be improved by tuning the process of open-water ice growth, which strongly influences the sea ice concentration in the marginal ice zone, the North Atlantic circulation, salinity, and Arctic sea ice volume. Since the distribution of new ice on open water relies on many uncertain parameters and knowledge of the detailed processes is currently too crude, it is a challenge to implement these processes realistically into models. Based on our sensitivity experiments, we conclude that there is a pronounced uncertainty related to open-water sea ice growth which could significantly affect the sensitivity of the climate system.
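
    The tunable step in question resembles the lead-closing parameterization used in many sea-ice models: a prescribed new-ice thickness controls how much of a given freezing volume goes into added area versus added thickness. A hedged sketch with illustrative numbers (not the FESOM implementation):

        # Sketch: distribute a volume of new open-water ice between area and
        # thickness. h0 plays the role of the aspect-ratio control: larger h0
        # means thicker, less extensive new ice. Values are illustrative.
        def grow_open_water_ice(conc, thick, new_volume, h0=0.5):
            """conc: ice concentration [0-1]; thick: mean ice thickness (m);
            new_volume: grid-cell mean new-ice volume (m); h0: thickness
            assigned to ice formed in open water (m)."""
            open_frac = 1.0 - conc
            d_conc = min(new_volume / h0, open_frac)   # area gained by lead closing
            new_conc = conc + d_conc
            # total ice volume per unit cell area is conserved and redistributed
            new_thick = (conc * thick + new_volume) / new_conc if new_conc > 0 else 0.0
            return new_conc, new_thick

        print(grow_open_water_ice(conc=0.6, thick=1.2, new_volume=0.05, h0=0.5))
        # A larger aspect ratio (bigger h0) closes less open water:
        print(grow_open_water_ice(conc=0.6, thick=1.2, new_volume=0.05, h0=1.5))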

  15. Distribution history and climatic controls of the Late Miocene Pikermian chronofauna.

    PubMed

    Eronen, Jussi T; Ataabadi, Majid Mirzaie; Micheels, Arne; Karme, Aleksis; Bernor, Raymond L; Fortelius, Mikael

    2009-07-21

    The Late Miocene development of faunas and environments in western Eurasia is well known, but the climatic and environmental processes that controlled its details are incompletely understood. Here we map the rise and fall of the classic Pikermian fossil mammal chronofauna between 12 and 4.2 Ma, using genus-level faunal similarity between localities. To directly relate land mammal community evolution to environmental change, we use the hypsodonty paleoprecipitation proxy and paleoclimate modeling. The geographic distribution of faunal similarity and paleoprecipitation in successive timeslices shows the development of the open biome that favored the evolution and spread of the open-habitat adapted large mammal lineages. In the climate model run, this corresponds to a decrease in precipitation over its core area south of the Paratethys Sea. The process began in the latest Middle Miocene and climaxed in the medial Late Miocene, about 7-8 million years ago. The geographic range of the Pikermian chronofauna contracted in the latest Miocene, a time of increasing summer drought and regional differentiation of habitats in Eastern Europe and Southwestern Asia. Its demise at the Miocene-Pliocene boundary coincides with an environmental reversal toward increased humidity and forestation, changes inevitably detrimental to open-adapted, wide-ranging large mammals.

  16. Swan: A tool for porting CUDA programs to OpenCL

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model, supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance.
    Program summary: Program title: Swan. Catalogue identifier: AEIH_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU Public License version 2. No. of lines in distributed program, including test data, etc.: 17 736. No. of bytes in distributed program, including test data, etc.: 131 177. Distribution format: tar.gz. Programming language: C. Computer: PC. Operating system: Linux. RAM: 256 Mbytes. Classification: 6.5. External routines: NVIDIA CUDA, OpenCL. Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion. Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time. Restrictions: No support for CUDA C++ features. Running time: Nominal.
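
    To make concrete the kind of rewriting Swan automates (Swan itself is written in C and is far more complete), here is a toy Python fragment applying a few standard CUDA-to-OpenCL correspondences; a real port must also add address-space qualifiers such as __global to pointer arguments, which this fragment omits:

        # Toy illustration only: a handful of textbook CUDA -> OpenCL rewrites.
        import re

        CUDA_TO_OPENCL = [
            (r"\b__global__\b", "__kernel"),
            (r"\b__shared__\b", "__local"),
            (r"\b__syncthreads\(\)", "barrier(CLK_LOCAL_MEM_FENCE)"),
            (r"\bblockIdx\.x\s*\*\s*blockDim\.x\s*\+\s*threadIdx\.x\b",
             "get_global_id(0)"),
            (r"\bthreadIdx\.x\b", "get_local_id(0)"),
        ]

        def translate(cuda_src):
            for pattern, repl in CUDA_TO_OPENCL:
                cuda_src = re.sub(pattern, repl, cuda_src)
            return cuda_src

        kernel = """__global__ void saxpy(float a, float *x, float *y, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) y[i] = a * x[i] + y[i];
        }"""
        print(translate(kernel))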

  17. Size tunable gold nanorods evenly distributed in the channels of mesoporous silica.

    PubMed

    Li, Zhi; Kübel, Christian; Pârvulescu, Vasile I; Richards, Ryan

    2008-06-01

    Uniformly distributed gold nanorods in mesoporous silica were synthesized in situ by performing a seed-mediated growth process in the channels of SBA-15, which functions as a hard template to confine the diameter of the gold nanorods. By changing the amount of gold precursor, gold nanorods were prepared with a fixed diameter (6-7 nm) and tunable aspect ratios from 3 to 30. Transmission electron microscopy and electron tomography were utilized to visualize the gold nanorods supported on a single SBA-15 segment and showed a fairly uniform 3-dimensional distribution of gold nanorods within the SBA-15 channels. The longitudinal plasmon resonances of the gold nanorod/SBA-15 composites, analyzed by diffuse reflectance UV-vis spectra, were found to be tunable depending on the length of the gold nanorods. No significant decrease in surface area and/or pore size of the composite was found after growth, indicating the growth process did not disrupt the open mesoporous structure of SBA-15. The combination of the tunable size of the nanorods and their 3-dimensional distribution within the open supporting matrix makes the gold nanorod/SBA-15 composites interesting candidates for systematically studying the influence of the aspect ratio of gold nanorods on their properties and potential applications, e.g., as catalysts, optical polarizers, and in ultrasensitive medical imaging.

  18. Standardization as an Arena for Open Innovation

    NASA Astrophysics Data System (ADS)

    Grøtnes, Endre

    This paper argues that anticipatory standardization can be viewed as an arena for open innovation and shows this through two cases from mobile telecommunication standardization. One case is the Android initiative by Google and the Open Handset Alliance, while the second case is the general standardization work of the Open Mobile Alliance. The paper shows how anticipatory standardization intentionally uses inbound and outbound streams of research and intellectual property to create new innovations. This is at the heart of the open innovation model. The standardization activities use both pooling of R&D and the distribution of freely available toolkits to create products and architectures that can be utilized by the participants and third parties to leverage their innovation. The paper shows that the technology being standardized needs to have a systemic nature to be part of an open innovation process.

  19. A Structured Approach for Reviewing Architecture Documentation

    DTIC Science & Technology

    2009-12-01

    as those found in ISO 12207 [ISO/IEC 12207:2008] (for software engineering), ISO 15288 [ISO/IEC 15288:2008] (for systems engineering), the Rational... Open Distributed Processing - Reference Model: Foundations (ISO/IEC 10746-2). 1996. [ISO/IEC 12207:2008] International Organization for Standardization & International Electrotechnical Commission. Systems and software engineering – Software life cycle processes (ISO/IEC 12207). 2008. [ISO

  20. High-uniformity centimeter-wide Si etching method for MEMS devices with large opening elements

    NASA Astrophysics Data System (ADS)

    Okamoto, Yuki; Tohyama, Yukiya; Inagaki, Shunsuke; Takiguchi, Mikio; Ono, Tomoki; Lebrasseur, Eric; Mita, Yoshio

    2018-04-01

    We propose a compensated mesh-pattern filling method to achieve highly uniform deep etching of wafers (over hundreds of microns) with large-area openings (over a centimeter). The mesh opening diameter is gradually changed between the center and the edge of a large etching area. With such a design, the etching-depth dependence on sidewall distance (known as the local loading effect) inversely compensates for the over-centimeter-scale etching depth distribution, known as the global or within-die (chip)-scale loading effect. A single DRIE run with test-structure patterns provides a micro-electromechanical systems (MEMS) designer with the etched-depth dependence on mesh opening size as well as on distance from the chip edge, and the designer only has to set the opening sizes so as to obtain a uniform etching depth over the entire chip. This method is useful when process optimization cannot be performed, such as when using the standard conditions of a foundry service or in short turn-around-time prototyping. As a demonstration, a large MEMS mirror requiring over 1 cm2 of backside etching was successfully fabricated using as-provided DRIE conditions.
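
    The compensation logic can be sketched as follows: calibrate depth versus opening size (local loading) and depth versus edge distance (global loading) from one test run, then invert the local curve so every location etches to the same target depth. All calibration numbers below are hypothetical:

        # Sketch of the compensation idea with invented calibration data.
        import numpy as np

        # Test-structure measurements (hypothetical): depth grows with opening size.
        opening_um = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
        depth_vs_opening = np.array([310., 340., 360., 372., 380.])   # microns

        # Global loading (hypothetical): deeper near the chip edge than the center.
        dist_mm = np.array([0.0, 2.0, 5.0, 10.0])
        depth_vs_dist = np.array([380., 368., 352., 340.])            # at 10 um opening

        def opening_for(distance_mm, target_depth=340.0):
            """Pick the mesh opening that cancels the global depth variation."""
            excess = np.interp(distance_mm, dist_mm, depth_vs_dist) - target_depth
            wanted_local_depth = depth_vs_opening[-1] - excess
            # invert the (monotonic) depth-vs-opening calibration curve
            return float(np.interp(wanted_local_depth, depth_vs_opening, opening_um))

        for d in [0.0, 2.0, 5.0, 10.0]:
            print(f"{d:4.1f} mm from edge -> opening {opening_for(d):.1f} um")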

  1. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    NASA Astrophysics Data System (ADS)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

    The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems require appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we applied in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with a local-search metaheuristic. We also consider current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of the number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.
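
    To illustrate the problem shape (DINGO itself combines a CVRP formulation with local-search metaheuristics and power-flow validation), here is a toy capacity-limited nearest-neighbor pass in Python with invented data:

        # Toy sketch: build MV feeders from a substation to demand nodes with a
        # greedy, capacity-limited nearest-neighbor heuristic. Data are invented.
        import math

        substation = (0.0, 0.0)
        loads = {"a": ((1, 2), 3.0), "b": ((2, 1), 4.0),   # node: (xy, demand in MVA)
                 "c": ((5, 5), 2.0), "d": ((6, 4), 5.0)}
        CAPACITY = 8.0   # max cumulative demand per feeder

        def dist(p, q):
            return math.hypot(p[0] - q[0], p[1] - q[1])

        def build_feeders(loads, capacity):
            unserved, feeders = dict(loads), []
            while unserved:
                route, used, pos = [], 0.0, substation
                while True:
                    # nearest unserved node that still fits on this feeder
                    candidates = [(dist(pos, xy), n) for n, (xy, dem) in unserved.items()
                                  if used + dem <= capacity]
                    if not candidates:
                        break
                    _, n = min(candidates)
                    xy, dem = unserved.pop(n)
                    route.append(n); used += dem; pos = xy
                feeders.append(route)
            return feeders

        print(build_feeders(loads, CAPACITY))   # -> [['a', 'b'], ['c', 'd']]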

  2. Anisotropy in Fracking: A Percolation Model for Observed Microseismicity

    NASA Astrophysics Data System (ADS)

    Norris, J. Quinn; Turcotte, Donald L.; Rundle, John B.

    2015-01-01

    Hydraulic fracturing (fracking), using high pressures and a low-viscosity fluid, allows the extraction of large quantities of oil and gas from very low permeability shale formations. The initial production of oil and gas at depth leads to high pressures and an extensive distribution of natural fractures, which reduce the pressures. With time these fractures heal, sealing the remaining oil and gas in place. High-volume fracking opens the healed fractures, allowing the oil and gas to flow to horizontal production wells. We model the injection process using invasion percolation. We use a 2D square lattice of bonds to model the sealed natural fractures. The bonds are assigned random strengths, and the fluid, injected at a point, opens the weakest bond adjacent to the growing cluster of opened bonds. Our model exhibits burst dynamics, in which the clusters extend rapidly into regions with weak bonds. We associate these bursts with the microseismic activity generated by fracking injections. A principal object of this paper is to study the role of anisotropic stress distributions. Bonds in the y-direction are assigned higher random strengths than bonds in the x-direction. We illustrate the spatial distribution of clusters and the spatial distribution of bursts (small earthquakes) for several degrees of anisotropy. The results are compared with observed distributions of microseismicity in a fracking injection. Both our bursts and the observed microseismicity satisfy Gutenberg-Richter frequency-size statistics.
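
    The injection model lends itself to a compact implementation. A minimal sketch of anisotropic invasion percolation, in which a bond's strength is drawn when it first becomes a candidate and the weakest frontier bond is opened at each step:

        # Minimal sketch: 2D invasion percolation with anisotropic bond strengths.
        # y-direction bonds are scaled up, so the cluster spreads more in x.
        import heapq
        import random

        N, ANISOTROPY = 50, 1.5
        random.seed(0)

        def candidate_bonds(site):
            x, y = site
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < N and 0 <= ny < N:
                    # strength drawn when the bond first becomes a candidate
                    strength = random.random() * (ANISOTROPY if dy else 1.0)
                    yield strength, (nx, ny)

        cluster = {(N // 2, N // 2)}            # injection point
        frontier = list(candidate_bonds((N // 2, N // 2)))
        heapq.heapify(frontier)

        for _ in range(400):                    # open 400 bonds
            while True:
                strength, site = heapq.heappop(frontier)
                if site not in cluster:         # skip bonds into already-opened sites
                    break
            cluster.add(site)
            for bond in candidate_bonds(site):
                heapq.heappush(frontier, bond)

        xs = [x for x, _ in cluster]; ys = [y for _, y in cluster]
        print("x-extent:", max(xs) - min(xs), " y-extent:", max(ys) - min(ys))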

  3. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    NASA Astrophysics Data System (ADS)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems do not respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed, interoperable water information systems that are user-friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages for carrying out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers of the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, and mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study illustrates how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.

  4. Web-client based distributed generalization and geoprocessing

    USGS Publications Warehouse

    Wolf, E.B.; Howe, K.

    2009-01-01

    Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web-accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open-source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
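
    The benchmarked algorithm is easy to restate outside the browser. A plain Python rendition of RDP (the paper's implementation is JavaScript inside OpenLayers):

        # Ramer-Douglas-Peucker: recursively drop points that lie within
        # epsilon of the chord between the segment endpoints.
        import math

        def perpendicular_distance(pt, start, end):
            (x, y), (x1, y1), (x2, y2) = pt, start, end
            dx, dy = x2 - x1, y2 - y1
            if dx == dy == 0:
                return math.hypot(x - x1, y - y1)
            return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

        def rdp(points, epsilon):
            if len(points) < 3:
                return list(points)
            dists = [perpendicular_distance(p, points[0], points[-1])
                     for p in points[1:-1]]
            imax = max(range(len(dists)), key=dists.__getitem__) + 1
            if dists[imax - 1] <= epsilon:
                return [points[0], points[-1]]
            # keep the farthest point and recurse on both halves
            return rdp(points[:imax + 1], epsilon)[:-1] + rdp(points[imax:], epsilon)

        line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
        print(rdp(line, epsilon=0.5))   # -> [(0, 0), (2, -0.1), (3, 5), (7, 9)]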

  5. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    NASA Astrophysics Data System (ADS)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired at increasingly high spatial and spectral resolutions. These data may then be used to inform government decision making and to answer a range of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. The software required to process the data is often varied and can be too technical for a novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain and allows users to submit jobs and process data remotely over a web interface. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools, taking radiometrically corrected data from sensor geometry into geocorrected form and generating simple or complex band-ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network through a simple-to-use web interface.

  6. Needs assessment under the Maternal and Child Health Services Block Grant: Massachusetts.

    PubMed

    Guyer, B; Schor, L; Messenger, K P; Prenney, B; Evans, F

    1984-09-01

    The Massachusetts maternal and child health (MCH) agency has developed a needs assessment process which includes four components: a statistical measure of need based on indirect, proxy health and social indicators; clinical standards for services to be provided; an advisory process which guides decision making and involves constituency groups; and a management system for implementing funds distribution, namely open competitive bidding in response to a Request for Proposals. In Fiscal Years 1982 and 1983, the process was applied statewide in the distribution of primary prenatal (MIC) and pediatric (C&Y) care services and lead poisoning prevention projects. Both processes resulted in clearer definitions of services to be provided under contract to the state as well as redistribution of funds to serve localities that had previously received no resources. Although the needs assessment process does not provide a direct measure of unmet need in a complex system of private and public services, it can be used to advocate for increased MCH funding and guide the distribution of new MCH service dollars.

  7. Correlation between structure and compressive strength in a reticulated glass-reinforced hydroxyapatite foam.

    PubMed

    Callcut, S; Knowles, J C

    2002-05-01

    Glass-reinforced hydroxyapatite (HA) foams were produced by reticulated foam technology using a polyurethane template with two different pore size distributions. The mechanical properties were evaluated and the structure analyzed through density measurements, image analysis, X-ray diffraction (XRD) and scanning electron microscopy (SEM). Regarding the mechanical properties, the use of a glass significantly improved the ultimate compressive strength (UCS), as did the use of a second coating. All the samples tested showed the classic three regions characteristic of an elastic brittle foam. From the density measurements, after applying a correction to compensate for closed porosity, the bulk and apparent densities showed a 1:1 correlation. When relative bulk density was plotted against UCS, a non-linear relationship was found, characteristic of an isotropic open-celled material. Image analysis showed that the pore size distribution did not change and that there was no degradation of the macrostructure when replicating the ceramic from the initial polyurethane template during processing. However, the pore size distributions did shift to a lower size by about 0.5 mm due to the firing process. The ceramic foams were found to exhibit mechanical properties typical of isotropic open cellular foams.

  8. Web Monitoring of EOS Front-End Ground Operations, Science Downlinks and Level 0 Processing

    NASA Technical Reports Server (NTRS)

    Cordier, Guy R.; Wilkinson, Chris; McLemore, Bruce

    2008-01-01

    This paper addresses the efforts undertaken and the technology deployed to aggregate and distribute the metadata characterizing the real-time operations associated with NASA Earth Observing Systems (EOS) high-rate front-end systems and the science data collected at multiple ground stations and forwarded to the Goddard Space Flight Center for level 0 processing. Station operators, mission project management personnel, spacecraft flight operations personnel and data end-users for various EOS missions can retrieve the information at any time from any location having access to the internet. The users are distributed and the EOS systems are distributed but the centralized metadata accessed via an external web server provide an effective global and detailed view of the enterprise-wide events as they are happening. The data-driven architecture and the implementation of applied middleware technology, open source database, open source monitoring tools, and external web server converge nicely to fulfill the various needs of the enterprise. The timeliness and content of the information provided are key to making timely and correct decisions which reduce project risk and enhance overall customer satisfaction. The authors discuss security measures employed to limit access of data to authorized users only.

  9. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  10. IAU astroEDU: an open-access platform for peer-reviewed astronomy education activities

    NASA Astrophysics Data System (ADS)

    Heenatigala, Thilina; Russo, Pedro; Strubbe, Linda; Gomez, Edward

    2015-08-01

    astroEDU is an open-access platform for peer-reviewed astronomy education activities. It addresses key problems of educational repositories, such as variable quality, lack of regular maintenance and updates, and limited content review. This is achieved through a peer-review process similar to that used for scholarly articles. Submitted activities are peer-reviewed by an educator and a professional astronomer, which lends credibility to the activities. astroEDU activities are open-access in order to make them available to educators around the world, while letting educators discover, review, distribute and remix them. The activity submission process helps authors learn how to apply enquiry-based learning in an activity, identify the process skills required, develop core goals and objectives, and evaluate the activity to determine its outcome. astroEDU is endorsed by the International Astronomical Union, meaning each activity is given an official stamp by the international organisation of professional astronomers.

  11. Structural-Diagenetic Controls on Fracture Opening in Tight Gas Sandstone Reservoirs, Alberta Foothills

    NASA Astrophysics Data System (ADS)

    Ukar, Estibalitz; Eichhubl, Peter; Fall, Andras; Hooker, John

    2013-04-01

    In tight gas reservoirs, understanding the characteristics, orientation and distribution of natural open fractures, and how these relate to the structural and stratigraphic setting are important for exploration and production. Outcrops provide the opportunity to sample fracture characteristics that would otherwise be unknown due to the limitations of sampling by cores and well logs. However, fractures in exhumed outcrops may not be representative of fractures in the reservoir because of differences in burial and exhumation history. Appropriate outcrop analogs of producing reservoirs with comparable geologic history, structural setting, fracture networks, and diagenetic attributes are desirable but rare. The Jurassic to Lower Cretaceous Nikanassin Formation from the Alberta Foothills produces gas at commercial rates where it contains a network of open fractures. Fractures from outcrops have the same diagenetic attributes as those observed in cores <100 km away, thus offering an ideal opportunity to 1) evaluate the distribution and characteristics of opening mode fractures relative to fold cores, hinges and limbs, 2) compare the distribution and attributes of fractures in outcrop vs. core samples, 3) estimate the timing of fracture formation relative to the evolution of the fold-and-thrust belt, and 4) estimate the degradation of fracture porosity due to postkinematic cementation. Cathodoluminescence images of cemented fractures in both outcrop and core samples reveal several generations of quartz and ankerite cement that is synkinematic and postkinematic relative to fracture opening. Crack-seal textures in synkinematic quartz are ubiquitous, and well-developed cement bridges abundant. Fracture porosity may be preserved in fractures wider than ~100 microns. 1-D scanlines in outcrop and core samples indicate fractures are most abundant within small parasitic folds within larger, tight, mesoscopic folds. Fracture intensity is lower away from parasitic folds; intensity progressively decreases from the faulted cores of mesoscopic folds to their forelimbs, with lowest intensities within relatively undeformed backlimb strata. Fracture apertures locally increase adjacent to reverse faults without an overall increase in fracture frequency. Fluid inclusion analyses of crack-seal quartz cement indicate both aqueous and methane-rich inclusions are present. Homogenization temperatures of two-phase inclusions indicate synkinematic fracture cement precipitation and fracture opening under conditions at or near maximum burial of 190-210°C in core samples, and 120-160°C in outcrop samples. In comparison with the fracture evolution in other, less deformed tight-gas sandstone reservoirs such as the Piceance and East Texas basins where fracture opening is primarily controlled by gas generation, gas charge, and pore fluid pressure, these results suggest a strong control of regional tectonic processes on fracture generation. In conjunction with timing and rate of gas charge, rates of fracture cement growth, and stratigraphic-lithological controls, these processes determine the overall distribution of open fractures in these reservoirs.

  13. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions, which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed. A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It allows the number of distributed processes to be increased and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem ("Locally Optimal Block Conjugate Gradient"), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPUs). As no simple performance model exists, the complexity of use has increased; the code's efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. In parallel, a substantial effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of the code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described. They are based on the exploration of new diagonalization algorithms, as well as the use of external optimized libraries. Part of this work has been supported by the European PRACE project (Partnership for Advanced Computing in Europe) in the framework of its work package 8.
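
    The automatic choice of a process distribution can be caricatured as a search over factorizations of the process count, scored by a performance model. A toy sketch with invented efficiency weights (not ABINIT's actual predictor):

        # Toy sketch: enumerate ways to factor N MPI processes over the
        # k-point, band, and plane-wave levels and pick the best-scoring one.
        # The per-level efficiency weights are invented for illustration.
        from itertools import product

        NPROC, NKPT = 64, 8
        WEIGHT = {"kpt": 0.99, "band": 0.95, "pw": 0.85}

        def predicted_speedup(nk, nb, npw):
            s = 1.0
            for n, level in ((nk, "kpt"), (nb, "band"), (npw, "pw")):
                s *= n * WEIGHT[level] ** max(0, n - 1)   # efficiency decays per process
            return s

        candidates = [
            (nk, nb, NPROC // (nk * nb))
            for nk, nb in product(range(1, NKPT + 1), range(1, NPROC + 1))
            if NPROC % (nk * nb) == 0 and NKPT % nk == 0
        ]
        best = max(candidates, key=lambda c: predicted_speedup(*c))
        print("chosen (kpt, band, pw) distribution:", best)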

  14. Implicit Processing of the Eyes and Mouth: Evidence from Human Electrophysiology.

    PubMed

    Pesciarelli, Francesca; Leo, Irene; Sarlo, Michela

    2016-01-01

    The current study examined the time course of implicit processing of distinct facial features and the associated event-related potential (ERP) components. To this end, we used a masked priming paradigm to investigate implicit processing of the eyes and mouth in upright and inverted faces, using a prime duration of 33 ms. Two types of prime-target pairs were used: 1. congruent (e.g., open eyes only in both prime and target, or open mouth only in both prime and target); 2. incongruent (e.g., open mouth only in the prime and open eyes only in the target, or vice versa). The identity of the faces changed between prime and target. Participants pressed one button when the target face had the eyes open and another button when the target face had the mouth open. The behavioral results showed faster RTs for the eyes in upright faces than for the eyes in inverted faces or the mouth in upright and inverted faces. They also revealed a congruent priming effect for the mouth in upright faces. The ERP findings showed a face orientation effect across all ERP components studied (P1, N1, N170, P2, N2, P3), starting at about 80 ms, and a congruency/priming effect on late components (P2, N2, P3), starting at about 150 ms. Crucially, the results showed that the orientation effect was driven by the eye region (N170, P2) and that the congruency effect started earlier for the eyes (P2) than for the mouth (N2). These findings mark the time course of the processing of internal facial features and provide further evidence that the eyes are automatically processed and that they are very salient facial features that strongly affect the amplitude, latency, and distribution of neural responses to faces.

  16. Distributed fiber optic sensor-enhanced detection and prediction of shrinkage-induced delamination of ultra-high-performance concrete overlay

    NASA Astrophysics Data System (ADS)

    Bao, Yi; Valipour, Mahdi; Meng, Weina; Khayat, Kamal H.; Chen, Genda

    2017-08-01

    This study develops a delamination detection system for smart ultra-high-performance concrete (UHPC) overlays using a fully distributed fiber optic sensor. Three 450 mm (length) × 200 mm (width) × 25 mm (thickness) UHPC overlays were cast over an existing 200 mm thick concrete substrate. The initiation and propagation of delamination due to early-age shrinkage of the UHPC overlay were detected as sudden increases, and their spatial extension, in the distribution of shrinkage-induced strains measured by the sensor based on pulse pre-pump Brillouin optical time domain analysis. The distributed sensor is demonstrated to be effective in detecting delamination openings from microns to hundreds of microns. A three-dimensional finite element model with experimental material properties is proposed to understand the complete delamination process measured by the distributed sensor. The model is validated against the distributed sensor data. The finite element model, with cohesive elements for the overlay-substrate interface, can predict the complete delamination process.
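
    As a minimal illustration of the detection principle described above, the following Python sketch flags delamination as sudden local jumps in a distributed strain profile; the threshold, sampling interval, and synthetic profile are hypothetical and are not the study's processing chain.

      import numpy as np

      def detect_delamination(strain, positions, jump_threshold=50.0):
          """Flag locations where the strain profile jumps abruptly.

          A sketch only: delamination is assumed to appear as a sudden
          local increase in the distributed strain profile. `strain` is
          in microstrain, `positions` in metres; `jump_threshold` is an
          illustrative value, not taken from the paper.
          """
          diffs = np.abs(np.diff(strain))            # change between adjacent points
          idx = np.where(diffs > jump_threshold)[0]
          return positions[idx]                      # candidate delamination locations

      # Example: a synthetic 450 mm overlay sampled every 5 mm with one jump
      x = np.linspace(0.0, 0.45, 91)
      profile = 20.0 * np.ones_like(x)
      profile[40:] += 120.0                          # simulated delamination opening
      print(detect_delamination(profile, x))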

  17. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public-facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public-facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from the use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE), will also be discussed.

  18. THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS

    DTIC Science & Technology

    2017-09-01

    Evan Sparks, Oliver Zahn, Michael J. Franklin, David A. Patterson, Saul Perlmutter. Scientific Computing Meets Big Data Technology: An Astronomy ... Processing Astronomy Imagery Using Big Data Technology. IEEE Transactions on Big Data, 2016. Approved for Public Release; Distribution Unlimited.

  19. Outcasts on the Inside: Academics Reinventing Themselves Online

    ERIC Educational Resources Information Center

    Costa, Cristina

    2015-01-01

    Recent developments in digital scholarship point out that academic practices supported by technologies may not only be transformed through the obvious process of digitization, but also renovated through distributed knowledge networks that digital technologies enable, and the practices of openness that such networks develop. Yet, this apparent…

  20. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    NASA Astrophysics Data System (ADS)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. In line with the general trend in the area of imaging, network-capable, general-purpose workstations with capabilities for open-system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support for different image interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.

  1. Big Geo Data Management: AN Exploration with Social Media and Telecommunications Open Data

    NASA Astrophysics Data System (ADS)

    Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.

    2016-06-01

    The term Big Data has recently been used to define big, highly varied, complex data sets, which are created and updated at high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed), made public by governments, agencies, private enterprises, and others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: first, the gathering and geoprocessing of these datasets are very computationally intensive, so it is necessary to integrate high-performance, preferably internet-based, solutions to achieve the goals; second, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, and they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN offer functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.
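
    As a hedged illustration of the MongoDB approach mentioned above, the Python sketch below stores geolocated records and runs a simple geospatial query with pymongo; the database and collection names are invented for the example.

      from pymongo import MongoClient, GEOSPHERE

      # Minimal sketch: store geolocated social-media records and query them.
      # Database/collection names ("biggeo", "tweets") are hypothetical.
      client = MongoClient("mongodb://localhost:27017")
      tweets = client.biggeo.tweets
      tweets.create_index([("loc", GEOSPHERE)])   # 2dsphere index for geo queries

      tweets.insert_one({
          "text": "example post",
          "loc": {"type": "Point", "coordinates": [9.19, 45.46]},  # lon, lat (Milan)
      })

      # Find records within roughly 10 km of a point; $centerSphere takes the
      # radius in radians (distance in km divided by Earth's radius, ~6378 km).
      nearby = tweets.find({
          "loc": {"$geoWithin": {"$centerSphere": [[9.19, 45.46], 10 / 6378.0]}}
      })
      for doc in nearby:
          print(doc["text"])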

  2. MPPhys—A many-particle simulation package for computational physics education

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2014-03-01

    In a first course on classical mechanics, elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is strong motivation, for high-school students in particular, because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which can be illustrated by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a basic idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations.
    Catalogue identifier: AERR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 111327
    No. of bytes in distributed program, including test data, etc.: 608411
    Distribution format: tar.gz
    Programming language: C++, OpenGL, GLSL, OpenCL.
    Computer: Linux and Windows platforms with OpenGL support.
    Operating system: Linux and Windows.
    RAM: Source code 4.5 MB; complete package 242 MB
    Classification: 14, 16.9.
    External routines: OpenGL, OpenCL
    Nature of problem: Integrate N-body simulations, mass-spring models.
    Solution method: Numerical integration of N-body simulations, 3D rendering via OpenGL.
    Running time: Problem dependent
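
    The "missing link" mentioned above, numerically solving the equations of motion with the Euler method, can be sketched in a few lines. The following Python example is purely illustrative and is not code from the MPPhys package.

      import numpy as np

      G = 1.0                                    # gravitational constant (scaled units)

      def euler_step(pos, vel, mass, dt):
          """Advance N-body positions and velocities by one explicit Euler step."""
          n = len(mass)
          acc = np.zeros_like(pos)
          for i in range(n):
              for j in range(n):
                  if i != j:
                      r = pos[j] - pos[i]
                      acc[i] += G * mass[j] * r / np.linalg.norm(r) ** 3
          return pos + dt * vel, vel + dt * acc

      # Two equal masses on an approximately circular orbit, in scaled units
      pos = np.array([[-0.5, 0.0], [0.5, 0.0]])
      vel = np.array([[0.0, -0.707], [0.0, 0.707]])
      mass = np.array([1.0, 1.0])
      for _ in range(1000):
          pos, vel = euler_step(pos, vel, mass, dt=0.001)
      print(pos)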

  3. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  4. Sailfish: A flexible multi-GPU implementation of the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Januszewski, M.; Kostur, M.

    2014-09-01

    We present Sailfish, an open source fluid simulation package implementing the lattice Boltzmann method (LBM) on modern Graphics Processing Units (GPUs) using CUDA/OpenCL. We take a novel approach to GPU code implementation and use run-time code generation techniques and a high-level programming language (Python) to achieve state-of-the-art performance, while allowing easy experimentation with different LBM models and tuning for various types of hardware. We discuss the general design principles of the code, scaling to multiple GPUs in a distributed environment, and the GPU implementation and optimization of many different LBM models, both single-component (BGK, MRT, ELBM) and multicomponent (Shan-Chen, free energy). The paper also presents results of performance benchmarks spanning the last three NVIDIA GPU generations (Tesla, Fermi, Kepler), which we hope will be useful for researchers working with this type of hardware and similar codes.
    Catalogue identifier: AETA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AETA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Lesser General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 225864
    No. of bytes in distributed program, including test data, etc.: 46861049
    Distribution format: tar.gz
    Programming language: Python, CUDA C, OpenCL.
    Computer: Any with an OpenCL- or CUDA-compliant GPU.
    Operating system: No limits (tested on Linux and Mac OS X).
    RAM: Hundreds of megabytes to tens of gigabytes for typical cases.
    Classification: 12, 6.5.
    External routines: PyCUDA/PyOpenCL, Numpy, Mako, ZeroMQ (for multi-GPU simulations), scipy, sympy
    Nature of problem: GPU-accelerated simulation of single- and multi-component fluid flows.
    Solution method: A wide range of relaxation models (LBGK, MRT, regularized LB, ELBM, Shan-Chen, free energy, free surface) and boundary conditions within the lattice Boltzmann method framework. Simulations can be run in single or double precision using one or more GPUs.
    Restrictions: The lattice Boltzmann method works for low Mach number flows only.
    Unusual features: The actual numerical calculations run exclusively on GPUs. The numerical code is built dynamically at run-time in CUDA C or OpenCL, using templates and symbolic formulas. The high-level control of the simulation is maintained by a Python process.
    Additional comments: The distribution file for this program is over 45 MB and therefore is not delivered directly when Download or Email is requested; instead, an HTML file giving details of how the program can be obtained is sent.
    Running time: Problem-dependent, typically minutes (for small cases or short simulations) to hours (large cases or long simulations).
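
    A toy sketch of the run-time code-generation idea: Sailfish emits CUDA C/OpenCL kernels from Mako templates at run time, whereas the Python sketch below generates plain Python source instead, purely to illustrate the technique.

      # The collision kernel is produced from a template string at run time
      # rather than written by hand; the LBGK formula is the standard
      # single-relaxation-time collision, but everything else is a toy.
      KERNEL_TEMPLATE = """
      def bgk_collide(f, f_eq, tau):
          # LBGK collision: relax the population f towards equilibrium f_eq.
          return f - (f - f_eq) / tau
      """

      namespace = {}
      exec(compile(KERNEL_TEMPLATE, "<generated>", "exec"), namespace)
      bgk_collide = namespace["bgk_collide"]

      print(bgk_collide(1.0, 0.8, 0.6))   # one post-collision population value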

  5. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects

    PubMed Central

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable size are considered in which, due to immigration and demographic effects, the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computation of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrate that this approach is particularly well suited for studying the impact of varying rates of immigration, births, deaths, infection, recovery from infection, and loss of immunity. PMID:27010993
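
    For intuition, the Python sketch below computes the equilibrium distribution of a small one-dimensional immigration-birth-death chain by direct linear algebra; the paper's matrix-analytic QBD machinery targets much larger structured state spaces, and the rates used here are illustrative, not taken from the paper.

      import numpy as np

      N = 50                                   # truncation of the population size
      imm, birth, death = 1.0, 0.2, 0.3        # illustrative rates

      Q = np.zeros((N + 1, N + 1))             # CTMC generator matrix
      for n in range(N + 1):
          if n < N:
              Q[n, n + 1] = imm + birth * n    # immigration plus births
          if n > 0:
              Q[n, n - 1] = death * n          # deaths
          Q[n, n] = -Q[n].sum()                # generator rows sum to zero

      # Solve pi Q = 0 with sum(pi) = 1 by replacing one balance equation.
      A = Q.T.copy()
      A[-1, :] = 1.0
      b = np.zeros(N + 1)
      b[-1] = 1.0
      pi = np.linalg.solve(A, b)
      print(pi[:5])                            # equilibrium probabilities of small populations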

  6. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects.

    PubMed

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable size are considered in which, due to immigration and demographic effects, the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computation of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrate that this approach is particularly well suited for studying the impact of varying rates of immigration, births, deaths, infection, recovery from infection, and loss of immunity.

  7. Personality and complex brain networks: The role of openness to experience in default network efficiency

    PubMed Central

    Kaufman, Scott Barry; Benedek, Mathias; Jung, Rex E.; Kenett, Yoed N.; Jauk, Emanuel; Neubauer, Aljoscha C.; Silvia, Paul J.

    2015-01-01

    Abstract The brain's default network (DN) has been a topic of considerable empirical interest. In fMRI research, DN activity is associated with spontaneous and self‐generated cognition, such as mind‐wandering, episodic memory retrieval, future thinking, mental simulation, theory of mind reasoning, and creative cognition. Despite large literatures on developmental and disease‐related influences on the DN, surprisingly little is known about the factors that impact normal variation in DN functioning. Using structural equation modeling and graph theoretical analysis of resting‐state fMRI data, we provide evidence that Openness to Experience—a normally distributed personality trait reflecting a tendency to engage in imaginative, creative, and abstract cognitive processes—underlies efficiency of information processing within the DN. Across two studies, Openness predicted the global efficiency of a functional network comprised of DN nodes and corresponding edges. In Study 2, Openness remained a robust predictor—even after controlling for intelligence, age, gender, and other personality variables—explaining 18% of the variance in DN functioning. These findings point to a biological basis of Openness to Experience, and suggest that normally distributed personality traits affect the intrinsic architecture of large‐scale brain systems. Hum Brain Mapp 37:773–779, 2016. © 2015 Wiley Periodicals, Inc. PMID:26610181

  8. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and its hallmark will be 'team science'.

  9. Database technology and the management of multimedia data in the Mirror project

    NASA Astrophysics Data System (ADS)

    de Vries, Arjen P.; Blanken, H. M.

    1998-10-01

    Multimedia digital libraries require an open distributed architecture instead of a monolithic database system. In the Mirror project, we use the Monet extensible database kernel to manage different representations of multimedia objects. To maintain independence between content, meta-data, and the creation of meta-data, we allow distribution of data and operations using CORBA. This open architecture introduces new problems for data access. From an end user's perspective, the problem is how to search the available representations to fulfill an actual information need; the conceptual gap between human perceptual processes and the meta-data is too large. From a system's perspective, several representations of the data may semantically overlap or be irrelevant. We address these problems with an iterative query process and active user participation through relevance feedback. A retrieval model based on inference networks assists the user with query formulation. The integration of this model into the database design has two advantages. First, the user can query both the logical and the content structure of multimedia objects. Second, the use of different data models in the logical and the physical database design provides data independence and allows algebraic query optimization. We illustrate query processing with a music retrieval application.
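
    The iterative, feedback-driven query process can be illustrated with the classic Rocchio update, shown in the Python sketch below as a stand-in; the Mirror project itself uses an inference-network retrieval model, so this only conveys how user feedback reshapes a query vector.

      import numpy as np

      def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
          """Move the query vector towards relevant and away from non-relevant docs."""
          q = alpha * query
          if len(relevant):
              q += beta * np.mean(relevant, axis=0)
          if len(nonrelevant):
              q -= gamma * np.mean(nonrelevant, axis=0)
          return np.clip(q, 0.0, None)         # keep term weights non-negative

      query = np.array([1.0, 0.0, 0.5])        # toy 3-term feature space
      relevant = np.array([[0.9, 0.1, 0.8]])   # documents the user marked relevant
      nonrelevant = np.array([[0.1, 0.9, 0.0]])
      print(rocchio(query, relevant, nonrelevant))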

  10. Derivation of hydrous pyrolysis kinetic parameters from open-system pyrolysis

    NASA Astrophysics Data System (ADS)

    Tseng, Yu-Hsin; Huang, Wuu-Liang

    2010-05-01

    Kinetic information is essential to predict the temperature, timing, or depth of hydrocarbon generation within a hydrocarbon system. The most common experiments for deriving kinetic parameters rely on open-system pyrolysis. However, it has been shown that the conditions of open-system pyrolysis deviate from nature because of its low, near-ambient pressure and high temperatures; the extrapolation of open-system heating rates to geological conditions may also be questionable. A recent study by Lewan and Ruble shows that hydrous-pyrolysis conditions simulate natural conditions better, and its applications are supported by two case studies with natural thermal-burial histories. Nevertheless, performing hydrous pyrolysis experiments is tedious and requires large amounts of sample, while open-system pyrolysis is convenient and efficient. Therefore, the present study aims at deriving convincing distributed hydrous pyrolysis Ea from routine open-system Rock-Eval data alone. Our results reveal a good correlation between the open-system Rock-Eval parameter Tmax and the activation energy (Ea) derived from hydrous pyrolysis. The single hydrous pyrolysis Ea can be predicted from Tmax based on this correlation, while the frequency factor (A0) is estimated from the linear relationship between single Ea and log A0. Because a distributed Ea is more rational than a single Ea, we convert the predicted single hydrous pyrolysis Ea into a distributed Ea by shifting the pattern of the Ea distribution from open-system pyrolysis until its weighted mean equals the single hydrous pyrolysis Ea. Moreover, it has been shown that the shape of the Ea distribution closely resembles the shape of the Tmax curve; thus, in the absence of an open-system Ea distribution, the shape of the Tmax curve may be used to obtain the distributed hydrous pyrolysis Ea. The study offers a simple new approach for obtaining distributed hydrous pyrolysis Ea from routine open-system Rock-Eval data alone, which will allow better estimation of hydrocarbon generation.
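
    The shifting step can be sketched numerically: translate the open-system Ea distribution until its weighted mean equals the single Ea predicted from Tmax. In the Python sketch below, the linear Tmax-Ea relation and the example distribution are placeholders, not this study's calibration.

      import numpy as np

      def predict_single_ea(tmax_c, slope=0.55, intercept=-187.0):
          """Hypothetical linear correlation between Rock-Eval Tmax and Ea (kcal/mol)."""
          return slope * tmax_c + intercept

      def shift_distribution(ea_bins, weights, target_mean):
          """Translate the Ea distribution so its weighted mean hits the target."""
          weights = np.asarray(weights, float) / np.sum(weights)
          current_mean = np.sum(ea_bins * weights)
          return ea_bins + (target_mean - current_mean), weights

      ea_bins = np.arange(48.0, 60.0)                  # open-system Ea bins, kcal/mol
      weights = np.array([1, 2, 5, 12, 20, 25, 18, 9, 4, 2, 1, 1], float)
      target = predict_single_ea(435.0)                # Tmax = 435 degC (example)
      shifted, w = shift_distribution(ea_bins, weights, target)
      print(shifted, np.sum(shifted * w))              # weighted mean equals the target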

  11. 'Predatory' open access: a longitudinal study of article volumes and market characteristics.

    PubMed

    Shen, Cenyu; Björk, Bo-Christer

    2015-10-01

    A negative consequence of the rapid growth of scholarly open access publishing funded by article processing charges is the emergence of publishers and journals with highly questionable marketing and peer review practices. These so-called predatory publishers are causing unfounded negative publicity for open access publishing in general. Reports about this branch of e-business have so far mainly concentrated on exposing lacking peer review and scandals involving publishers and journals. There is a lack of comprehensive studies about several aspects of this phenomenon, including its extent and regional distribution. After an initial scan of all predatory publishers and journals included in the so-called Beall's list, a sample of 613 journals was constructed using a stratified sampling method from the total of over 11,000 journals identified. Information about the subject field, country of publisher, article processing charge, and article volumes published between 2010 and 2014 was manually collected from the journal websites. For a subset of journals, individual articles were sampled in order to study the country affiliation of authors and the publication delays. Over the studied period, predatory journals rapidly increased their publication volumes from 53,000 articles in 2010 to an estimated 420,000 in 2014, published by around 8,000 active journals. Early on, publishers with more than 100 journals dominated the market, but since 2012 publishers in the 10-99 journal size category have captured the largest market share. The regional distribution of both publisher country and authorship is highly skewed; in particular, Asia and Africa contributed three quarters of the authors. Authors paid an average article processing charge of 178 USD per article, for articles typically published within 2 to 3 months of submission. Despite a total number of journals and publishing volumes comparable to respectable open access journals (those indexed by the Directory of Open Access Journals), the problem of predatory open access seems highly contained to just a few countries where academic evaluation practices strongly favor international publication without further quality checks.

  12. A Scalable Infrastructure for Lidar Topography Data Distribution, Processing, and Discovery

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Phan, M.; Cowart, C. A.; Arrowsmith, R.; Baru, C.

    2010-12-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology have emerged as a fundamental tool in the Earth sciences, and are also being widely utilized for ecological, planning, engineering, and environmental applications. Collected from airborne, terrestrial, and space-based platforms, these data are revolutionary because they permit analysis of geologic and biologic processes at resolutions essential for their appropriate representation. Public domain lidar data collected by federal, state, and local agencies are a valuable resource to the scientific community; however, the data pose significant distribution challenges because of the volume and complexity of data that must be stored, managed, and processed. Lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative products. This massive volume of data is often challenging to host for resource-limited agencies. Furthermore, these data can be technically challenging for users who lack appropriate software, computing resources, and expertise. The National Science Foundation-funded OpenTopography Facility (www.opentopography.org) has developed a cyberinfrastructure-based solution to enable online access to Earth science-oriented high-resolution lidar topography data, online processing tools, and derivative products. OpenTopography provides access to terabytes of point cloud data, standard DEMs, and Google Earth image data, all co-located with computational resources for on-demand data processing. The OpenTopography portal is built upon a cyberinfrastructure platform that utilizes a Services Oriented Architecture (SOA) to provide a modular system that is highly scalable and flexible enough to support the growing needs of the Earth science lidar community. OpenTopography strives to host and provide access to datasets as soon as they become available, and also to expose greater application-level functionality to our end-users (such as generation of custom DEMs via various gridding algorithms, and hydrological modeling algorithms). In the future, the SOA will enable direct authenticated access to back-end functionality through simple Web service Application Programming Interfaces (APIs), so that users may access our data and compute resources via clients other than Web browsers. In addition to an overview of the OpenTopography SOA, this presentation will discuss our recently developed lidar data ingestion and management system for point cloud data delivered in the binary LAS standard. This system complements our existing partitioned database approach for data delivered in ASCII format, and permits rapid ingestion of data. The system has significantly reduced data ingestion times and has implications for data distribution in emergency response situations. We will also address ongoing work to develop a community lidar metadata catalog based on the OGC Catalogue Service for Web (CSW) standard, which will help to centralize discovery of public domain lidar data.
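
    Programmatic access of the kind the planned Web service APIs would allow might look like the following Python sketch; the endpoint URL, parameter names, and key are hypothetical placeholders, not a documented OpenTopography interface.

      import requests

      API_URL = "https://example.org/opentopo/api/dem"   # hypothetical endpoint

      params = {
          "minx": -112.0, "miny": 36.0,       # bounding box of the requested DEM
          "maxx": -111.9, "maxy": 36.1,
          "resolution": 1.0,                  # grid spacing in metres
          "format": "GTiff",
          "api_key": "YOUR_KEY_HERE",         # placeholder credential
      }

      resp = requests.get(API_URL, params=params, timeout=300)
      resp.raise_for_status()
      with open("custom_dem.tif", "wb") as fh:
          fh.write(resp.content)              # save the returned custom DEM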

  13. Numerical and Experimental Study on the Effect of Over Fire Air on NOx Distribution in Furnace

    NASA Astrophysics Data System (ADS)

    Wang, Qian; Deng, Yong-qiang; Xia, Yong-jun; Wu, Ying

    2018-05-01

    In this paper, numerical simulation and experimental study were used to investigate the effect of the over fire air opening size on the NOx concentration distribution inside the furnace of a power plant's 600 MW supercritical four-wall tangentially fired boiler; the numerical and experimental results agree. Four cases were considered in all. The formation of NOx during pulverized-coal combustion in the furnace, and the influences on it, were analyzed. The research proved that the over fire air has a great effect on the NOx concentration distribution in the furnace.

  14. Intratidal Overdistention and Derecruitment in the Injured Lung: A Simulation Study.

    PubMed

    Amini, Reza; Herrmann, Jacob; Kaczka, David W

    2017-03-01

    Ventilated patients with the acute respiratory distress syndrome (ARDS) are predisposed to cyclic parenchymal overdistention and derecruitment, which may worsen existing injury. We hypothesized that intratidal variations in global mechanics, as assessed at the airway opening, would reflect such distributed processes. We developed a computational lung model for determining local instantaneous pressure distributions and mechanical impedances continuously during a breath. Based on these distributions and previous literature, we simulated the within-breath variability of airway segment dimensions, parenchymal viscoelasticity, and acinar recruitment in an injured canine lung for tidal volumes (VT) of 10, 15, and 20 mL/kg and positive end-expiratory pressures (PEEP) of 5, 10, and 15 cm H2O. Acini were allowed to transition between recruited and derecruited states when exposed to stochastically determined critical opening and closing pressures, respectively. For conditions of low VT and low PEEP, we observed small intratidal variations in global resistance and elastance, with a small number of cyclically recruited acini. However, with higher VT and PEEP, larger variations in resistance and elastance were observed, and the majority of acini remained open throughout the breath. Changes in intratidal resistance, elastance, and impedance followed well-defined parabolic trajectories with tracheal pressure, achieving minima near 12 to 16 cm H2O. Intratidal variations in lung mechanics may allow for optimization of ventilator settings in patients with ARDS, by balancing lung recruitment against parenchymal overdistention. Titration of airway pressures based on variations in intratidal mechanics may mitigate processes associated with injurious ventilation.
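
    A simplified sketch of the recruitment rule described above: each acinus opens when pressure exceeds its stochastically drawn critical opening pressure and closes when pressure falls below its critical closing pressure. The distribution parameters in the Python sketch below are illustrative, not the calibrated canine values used in the study.

      import numpy as np

      rng = np.random.default_rng(0)
      n_acini = 10000
      p_open = rng.normal(20.0, 5.0, n_acini)             # critical opening pressures, cm H2O
      p_close = p_open - rng.uniform(5.0, 10.0, n_acini)  # closing below opening (hysteresis)

      state = np.zeros(n_acini, dtype=bool)               # False = derecruited
      for p_tracheal in [5.0, 15.0, 25.0, 15.0, 5.0]:     # crude pressure sweep
          state |= p_tracheal > p_open                    # recruit above opening pressure
          state &= ~(p_tracheal < p_close)                # derecruit below closing pressure
          print(f"P = {p_tracheal:5.1f} cm H2O, open fraction = {state.mean():.2f}")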

  15. Design and control of rotating soil-like substrate plant-growing facility based on plant water requirement and computational fluid dynamics simulation

    NASA Astrophysics Data System (ADS)

    Hu, Dawei; Li, Leyuan; Liu, Hui; Zhang, Houkai; Fu, Yuming; Sun, Yi; Li, Liang

    It is necessary to process inedible plant biomass into soil-like substrate (SLS) by bio-composting to realize sustainable utilization of biological resources. Although similar to natural soil in structure and function, SLS often has an uneven water distribution that adversely affects plant growth because of unsatisfactory porosity, permeability, and gravity distribution. In this article, an SLS plant-growing facility (SLS-PGF) was therefore rotated at appropriate rates for cultivating lettuce; the Brinkman equations coupled with laminar flow equations were taken as the governing equations, and boundary conditions were specified from the actual operating characteristics of the rotating SLS-PGF. The optimal open-loop control law for the angular and inflow velocities was determined from the lettuce water requirement and CFD simulations. The experimental results clearly showed that water content was more uniformly distributed in the SLS under the action of centrifugal and Coriolis forces, and that rotating the SLS-PGF with the optimal open-loop control law could meet the lettuce water requirement at every growth stage and achieve precise irrigation.

  16. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
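
    The relational work-queue idea can be sketched with SQLite: simulation work units wait in a table, and each volunteer session claims the next pending one. The schema and column names in the Python sketch below are illustrative, not the platform's actual design.

      import sqlite3

      db = sqlite3.connect(":memory:")
      db.execute("""CREATE TABLE work_units (
          id INTEGER PRIMARY KEY,
          watershed_cell TEXT,
          status TEXT DEFAULT 'pending',
          result REAL
      )""")
      db.executemany("INSERT INTO work_units (watershed_cell) VALUES (?)",
                     [(f"cell-{i}",) for i in range(100)])

      def claim_next_unit(conn):
          """Hand the next pending unit to a volunteer node, within one transaction."""
          with conn:
              row = conn.execute(
                  "SELECT id, watershed_cell FROM work_units "
                  "WHERE status = 'pending' LIMIT 1").fetchone()
              if row:
                  conn.execute("UPDATE work_units SET status = 'running' "
                               "WHERE id = ?", (row[0],))
          return row

      print(claim_next_unit(db))   # -> (1, 'cell-0')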

  17. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data because of their volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
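
    The MapReduce pattern can be illustrated with a Hadoop Streaming job in Python: the mapper bins LiDAR points into tiles and the reducer aggregates per tile. The real framework invokes PCL algorithms; the toy aggregation below only shows the plumbing, and the 100 m tile size is arbitrary.

      #!/usr/bin/env python
      import sys

      def mapper():
          # Input: one "x y z" point per line; emit "tile<TAB>z".
          for line in sys.stdin:
              x, y, z = map(float, line.split())
              tile = f"{int(x // 100)}_{int(y // 100)}"   # 100 m tile key
              print(f"{tile}\t{z}")

      def reducer():
          # Hadoop delivers mapper output sorted by key; report max z per tile.
          current, zmax = None, float("-inf")
          for line in sys.stdin:
              tile, z = line.rstrip("\n").split("\t")
              if tile != current:
                  if current is not None:
                      print(f"{current}\t{zmax}")
                  current, zmax = tile, float("-inf")
              zmax = max(zmax, float(z))
          if current is not None:
              print(f"{current}\t{zmax}")

      if __name__ == "__main__":
          # Local test: cat points.txt | python job.py map | sort | python job.py reduce
          mapper() if sys.argv[1] == "map" else reducer()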

  18. Chemical Imaging Analysis of Environmental Particles Using the Focused Ion Beam/Scanning Electron Microscopy Technique. Microanalysis Insights into Atmospheric Chemistry of Fly Ash

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Haihan; Grassian, Vicki H.; Saraf, Laxmikant V.

    2012-11-08

    Airborne fly ash from coal combustion may represent a source of bioavailable iron (Fe) in the open ocean. However, few studies have focused on Fe speciation and distribution in coal fly ash. In this study, chemical imaging of fly ash has been performed using a dual-beam FIB/SEM (focused ion beam/scanning electron microscope) system for a better understanding of how simulated atmospheric processing modifies the morphology, chemical composition, and element distributions of individual particles. A novel approach has been applied for cross-sectioning of fly ash specimens with a FIB in order to explore the element distribution within the interior of individual particles. Our results indicate that simulated atmospheric processing causes disintegration of aluminosilicate glass, a dominant material in fly ash particles. Aluminosilicate-phase Fe in the inner core of fly ash particles is more easily mobilized than oxide-phase Fe present as surface aggregates on fly ash spheres. Fe release behavior depends strongly on Fe speciation in aerosol particles. The approach for preparation of cross-sectioned specimens described here opens new opportunities for particle microanalysis, particularly with respect to inorganic refractory materials like fly ash and mineral dust.

  19. The global distribution and dynamics of chromophoric dissolved organic matter.

    PubMed

    Nelson, Norman B; Siegel, David A

    2013-01-01

    Chromophoric dissolved organic matter (CDOM) is a ubiquitous component of the open ocean dissolved matter pool, and is important owing to its influence on the optical properties of the water column, its role in photochemistry and photobiology, and its utility as a tracer of deep ocean biogeochemical processes and circulation. In this review, we discuss the global distribution and dynamics of CDOM in the ocean, concentrating on developments in the past 10 years and restricting our discussion to open ocean and deep ocean (below the main thermocline) environments. CDOM has been demonstrated to exert primary control on ocean color by its absorption of light energy, which matches or exceeds that of phytoplankton pigments in most cases. This has important implications for assessing the ocean biosphere via ocean color-based remote sensing and the evaluation of ocean photochemical and photobiological processes. The general distribution of CDOM in the global ocean is controlled by a balance between production (primarily microbial remineralization of organic matter) and photolysis, with vertical ventilation circulation playing an important role in transporting CDOM to and from intermediate water masses. Significant decadal-scale fluctuations in the abundance of global surface ocean CDOM have been observed using remote sensing, indicating a potentially important role for CDOM in ocean-climate connections through its impact on photochemistry and photobiology.

  20. What Multilevel Parallel Programs do when you are not Watching: A Performance Analysis Case Study Comparing MPI/OpenMP, MLP, and Nested OpenMP

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    With the current trend in parallel computer architectures towards clusters of shared-memory symmetric multi-processors, parallel programming techniques have evolved to support parallelism beyond a single level. When comparing the performance of applications based on different programming paradigms, it is important to differentiate between the influence of the programming model itself and other factors, such as implementation-specific behavior of the operating system (OS) or architectural issues. Rewriting a large scientific application in order to employ a new programming paradigm is usually a time-consuming and error-prone task. Before embarking on such an endeavor it is important to determine that there is really a gain that would not be possible with the current implementation. A detailed performance analysis is crucial to clarify these issues. The multilevel programming paradigms considered in this study are hybrid MPI/OpenMP, MLP, and nested OpenMP. The hybrid MPI/OpenMP approach is based on using MPI [7] for the coarse-grained parallelization and OpenMP [9] for fine-grained loop-level parallelism. The MPI programming paradigm assumes a private address space for each process. Data are transferred by explicitly exchanging messages via calls to the MPI library. This model was originally designed for distributed-memory architectures but is also suitable for shared-memory systems. The second paradigm under consideration is MLP, which was developed by Taft. The approach is similar to MPI/OpenMP, using a mix of coarse-grain process-level parallelization and loop-level OpenMP parallelization. As is the case with MPI, a private address space is assumed for each process. The MLP approach was developed for ccNUMA architectures and explicitly takes advantage of the availability of shared memory. A shared memory arena accessible by all processes is required. Communication is done by reading from and writing to the shared memory.
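
    A rough Python analogue of the hybrid pattern, using mpi4py for the coarse-grained process level (as MPI does in the paper) and a thread pool standing in for OpenMP's fine-grained loop-level parallelism, is sketched below; the work decomposition is illustrative, and note that CPython threads do not give true loop-level speedup for pure-Python work.

      from mpi4py import MPI
      from concurrent.futures import ThreadPoolExecutor

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Coarse grain: each MPI process owns one slice of the global domain.
      my_slice = range(rank * 1000, (rank + 1) * 1000)

      def work(x):
          return x * x                  # stand-in for the inner loop body

      # Fine grain: threads split the local loop, mirroring OpenMP's role
      # structurally (CPython's GIL limits actual thread parallelism here).
      with ThreadPoolExecutor(max_workers=4) as pool:
          local_sum = sum(pool.map(work, my_slice))

      total = comm.reduce(local_sum, op=MPI.SUM, root=0)   # explicit message exchange
      if rank == 0:
          print("global sum:", total)

      # Launch with, e.g.: mpiexec -n 4 python hybrid_sketch.py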

  1. Zooplankton data: Vertical distributions of zooplankton in the Norwegian and Greenland Seas during summer, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, P.V.Z.; Smith, S.L.; Schwarting, E.M.

    1993-08-01

    Recent studies of zooplankton populations in the Greenland Sea have focused on processes at the Marginal Ice Zone (MIZ) and the areas immediately adjacent to it, under the ice and in open water. These studies have shown a relatively short period of intense secondary productivity which is closely linked temporally and spatially to phytoplankton blooms occurring near the ice edge in spring and early summer. During the summer of 1989 we participated in a project focusing on benthic and water column processes in the basins of the Norwegian and Greenland Seas. This study allowed us to compare biological processes at the MIZ with those occurring in the open waters of the Greenland Sea, and to compare processes at both of these locations with those in the Norwegian Sea. The data presented in this report are the results of zooplankton net tows covering the upper 1000 meters of the water column over the Norwegian Sea basin and the Greenland Sea basin, and the upper 500 meters of open water adjacent to the MIZ in the Greenland Sea. Sampling was conducted between 12 and 29 July 1989.

  2. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has created the need to process and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error-prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security, and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To meet these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
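
    The chained-reprocessing requirement can be sketched as a dependency-graph walk: when a human analyst corrects one product, exactly its downstream descendants (and nothing else) are recomputed. The graph and product names in the Python sketch below are illustrative.

      deps = {                       # product -> products derived from it
          "raw_feed": ["entities"],
          "entities": ["geo_tags", "link_graph"],
          "geo_tags": ["threat_map"],
          "link_graph": [],
          "threat_map": [],
      }

      def downstream(node):
          """Collect everything that must be reprocessed after `node` changes."""
          stale, stack = set(), [node]
          while stack:
              for child in deps[stack.pop()]:
                  if child not in stale:
                      stale.add(child)
                      stack.append(child)
          return stale

      # A human correction lands in 'entities': only its descendants are redone.
      print(downstream("entities"))   # {'geo_tags', 'link_graph', 'threat_map'}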

  3. Computational Modeling of Aerosol Hazard Arising from the Opening of an Anthrax Letter in an Open-Office Complex

    NASA Astrophysics Data System (ADS)

    Lien, F. S.; Ji, H.; Yee, E.

    Early experimental work, conducted at Defence R&D Canada — Suffield, measured and characterized the personal and environmental contamination associated with the simulated opening of anthrax-tainted letters under a number of different scenarios. A better understanding of the physical and biological processes involved is of considerable significance for detecting, assessing, and formulating potential mitigation strategies for managing these risks. These preliminary experimental investigations have been extended to simulate the contamination from the opening of anthrax-tainted letters in an open-office environment using Computational Fluid Dynamics (CFD). Bacillus globigii (BG) was used as a biological simulant for anthrax, with 0.1 gram of the simulant released from opened letters in the experiments conducted. The accuracy of the model for prediction of the spatial distribution of BG spores in the office is first assessed quantitatively by comparison with measured SF6 concentrations (the baseline experiment), and then qualitatively by comparison with measured BG concentrations obtained under a number of scenarios, some involving people moving within various offices.

  4. 75 FR 52309 - Pacific Fishery Management Council; Tule Chinook Workgroup Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-25

    ... management approach for Columbia River natural tule chinook. This meeting of the TCW is open to the public... distributed to State and Federal recovery planning processes. In the event a usable approach emerges from this...: The Pacific Fishery Management Council's (Pacific Council) Tule Chinook Workgroup (TCW) will hold a...

  5. Podcast Pilots for Distance Planning, Programming, and Development

    ERIC Educational Resources Information Center

    Cordes, Sean

    2005-01-01

    This paper examines podcasting as a library support for distance learning and information systems and services. The manuscript provides perspective on the knowledge base in the growing area of podcasting in libraries and academia. A walkthrough of the podcast creation and distribution process using basic computing skills and open source tools is…

  6. Model for calorimetric measurements in an open quantum system

    NASA Astrophysics Data System (ADS)

    Donvil, Brecht; Muratore-Ginanneschi, Paolo; Pekola, Jukka P.; Schwieger, Kay

    2018-05-01

    We investigate the experimental setup proposed in New J. Phys. 15, 115006 (2013), 10.1088/1367-2630/15/11/115006 for calorimetric measurements of thermodynamic indicators in an open quantum system. As a theoretical model we consider a periodically driven qubit coupled with a large yet finite electron reservoir, the calorimeter. The calorimeter is initially at equilibrium with an infinite phonon bath. As time elapses, the temperature of the calorimeter varies as a consequence of energy exchanges with the qubit and the phonon bath. We show how, under weak-coupling assumptions, the evolution of the qubit-calorimeter system can be described by a generalized quantum jump process that includes the temperature of the calorimeter as a dynamical variable. We study the jump process by numerical and analytic methods. Asymptotically with the duration of the drive, the qubit-calorimeter attains a steady state. In this same limit, we use multiscale perturbation theory to derive a Fokker-Planck equation governing the calorimeter temperature distribution. We examine the properties of the temperature probability distribution close to and at the steady state. In particular, we predict the behavior of measurable statistical indicators versus the qubit-calorimeter coupling constant.

  7. Integrated surface and groundwater modelling in the Thames Basin, UK using the Open Modelling Interface

    NASA Astrophysics Data System (ADS)

    Mackay, Jonathan; Abesser, Corinna; Hughes, Andrew; Jackson, Chris; Kingdon, Andrew; Mansour, Majdi; Pachocka, Magdalena; Wang, Lei; Williams, Ann

    2013-04-01

    The River Thames catchment is situated in the south-east of England. It covers approximately 16,000 km2 and is the most heavily populated river basin in the UK. It is also one of the driest and has experienced severe drought events in the recent past. With the onset of climate change and human exploitation of our environment, there are now serious concerns over the sustainability of water resources in this basin with 6 million m3 consumed every day for public water supply alone. Groundwater in the Thames basin is extremely important, providing 40% of water for public supply. The principal aquifer is the Chalk, a dual permeability limestone, which has been extensively studied to understand its hydraulic properties. The fractured Jurassic limestone in the upper catchment also forms an important aquifer, supporting baseflow downstream during periods of drought. These aquifers are unconnected other than through the River Thames and its tributaries, which provide two-thirds of London's drinking water. Therefore, to manage these water resources sustainably and to make robust projections into the future, surface and groundwater processes must be considered in combination. This necessitates the simulation of the feedbacks and complex interactions between different parts of the water cycle, and the development of integrated environmental models. The Open Modelling Interface (OpenMI) standard provides a method through which environmental models of varying complexity and structure can be linked, allowing them to run simultaneously and exchange data at each timestep. This architecture has allowed us to represent the surface and subsurface flow processes within the Thames basin at an appropriate level of complexity based on our understanding of particular hydrological processes and features. We have developed a hydrological model in OpenMI which integrates a process-driven, gridded finite difference groundwater model of the Chalk with a more simplistic, semi-distributed conceptual model of the Jurassic limestone. A distributed river routing model of the Thames has also been integrated to connect the surface and subsurface hydrological processes. This application demonstrates the potential benefits and issues associated with implementing this approach.
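
    The OpenMI-style coupling can be caricatured in a few lines of Python: two component models advance in lockstep and exchange a value at every timestep through a small common interface. The components, rates, and exchange item below are placeholders, not the actual Thames basin models.

      class GroundwaterModel:
          def __init__(self):
              self.baseflow = 5.0                 # m3/s, illustrative
          def step(self, dt):
              self.baseflow *= 0.99               # slow recession between steps
          def get_value(self):
              return self.baseflow                # exchange item: baseflow to river

      class RiverModel:
          def __init__(self):
              self.discharge = 20.0               # m3/s
          def step(self, dt, baseflow_in):
              self.discharge = 0.9 * self.discharge + baseflow_in

      gw, river = GroundwaterModel(), RiverModel()
      for day in range(5):                        # daily coupling loop
          gw.step(dt=1.0)
          river.step(dt=1.0, baseflow_in=gw.get_value())  # data exchanged each timestep
          print(f"day {day}: river discharge = {river.discharge:.2f} m3/s")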

  8. Squid - a simple bioinformatics grid.

    PubMed

    Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M

    2005-08-03

    BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computationally intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need for high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large-scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and to recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than with a single computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation with a pre-configured example.
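
    The query-splitting idea behind this kind of grid can be sketched as follows in Python; actual job dispatch, fault tolerance, and result merging in Squid are more involved than this illustration.

      def split_fasta(text, n_chunks):
          """Split FASTA text into roughly equal groups of whole records."""
          records = [">" + r for r in text.split(">") if r.strip()]
          chunks = [[] for _ in range(n_chunks)]
          for i, rec in enumerate(records):
              chunks[i % n_chunks].append(rec)    # round-robin assignment
          return ["".join(c) for c in chunks]

      # Each chunk would be shipped to one node for an independent BLAST run.
      fasta = ">seq1\nACGT\n>seq2\nGGCC\n>seq3\nTTAA\n"
      for node_id, chunk in enumerate(split_fasta(fasta, 2)):
          print(f"node {node_id} receives:\n{chunk}")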

  9. Importance of vesicle release stochasticity in neuro-spike communication.

    PubMed

    Ramezani, Hamideh; Akan, Ozgur B

    2017-07-01

    The aim of this paper is to propose a stochastic model for the vesicle release process, a part of neuro-spike communication. Hence, we study the biological events occurring in this process and use microphysiological simulations to observe the functionality of these events. Since the most important source of variability in vesicle release probability is the opening of voltage-dependent calcium channels (VDCCs), followed by the influx of calcium ions through these channels, we propose a stochastic model for this event, while using a deterministic model for the other sources of variability. To capture the stochasticity of calcium influx into the pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined in terms of the Normal and Logistic distributions.

  10. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    NASA Astrophysics Data System (ADS)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present the implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypical application of the system at the museum ZKM (Center for Art and Media) in Karlsruhe.

  11. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.

  12. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    PubMed

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

    Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as into the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  13. The Central Limit Theorem for Supercritical Oriented Percolation in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Tzioufas, Achillefs

    2018-04-01

    We consider the cardinality of supercritical oriented bond percolation in two dimensions. We show that, whenever the origin is conditioned to percolate, the process appropriately normalized converges asymptotically in distribution to the standard normal law. This resolves a longstanding open problem pointed out in several instances in the literature. The result applies also to the continuous-time analog of the process, viz. the basic one-dimensional contact process. We also derive general random-indices central limit theorems for associated random variables as byproducts of our proof.
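
    Schematically (not quoting the paper's exact statement), writing |\xi_n| for the cardinality at time n and conditioning on percolation, the theorem has the familiar form below; the centering and scaling constants a, b > 0 are model-dependent:

    ```latex
    % Schematic CLT statement; a, b > 0 are model-dependent constants and
    % \bar{P} denotes the law conditioned on percolation of the origin.
    \bar{P}\!\left( \frac{|\xi_n| - a\,n}{b\sqrt{n}} \le x \right)
      \;\longrightarrow\; \Phi(x)
      = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-u^2/2}\,\mathrm{d}u,
      \qquad n \to \infty
    ```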

  14. The Central Limit Theorem for Supercritical Oriented Percolation in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Tzioufas, Achillefs

    2018-06-01

    We consider the cardinality of supercritical oriented bond percolation in two dimensions. We show that, whenever the origin is conditioned to percolate, the process appropriately normalized converges asymptotically in distribution to the standard normal law. This resolves a longstanding open problem pointed out in several instances in the literature. The result applies also to the continuous-time analog of the process, viz. the basic one-dimensional contact process. We also derive general random-indices central limit theorems for associated random variables as byproducts of our proof.

  15. Open discovery: An integrated live Linux platform of Bioinformatics tools.

    PubMed

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for Bioinformatics have paved the way for portability of the Bioinformatics workbench in a platform-independent manner. However, most of the existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery builds on an advanced customizable configuration of Fedora, with data persistence accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.

  16. Funding free and universal access to Journal of Neuroinflammation.

    PubMed

    Mrak, Robert E; Griffin, W Sue T

    2004-10-14

    Journal of Neuroinflammation is an Open Access, online journal published by BioMed Central. Open Access publishing provides instant and universal availability of published work to any potential reader, worldwide, completely free of subscriptions, passwords, and charges. Further, authors retain copyright for their work, facilitating its dissemination. Open Access publishing is made possible by article-processing charges assessed "on the front end" to authors, their institutions, or their funding agencies. Beginning November 1, 2004, the Journal of Neuroinflammation will introduce article-processing charges of around US$525 for accepted articles. This charge will be waived for authors from institutions that are BioMed Central members, and in additional cases for reasons of genuine financial hardship. These article-processing charges pay for an electronic submission process that facilitates efficient and thorough peer review, for publication costs involved in providing the article freely and universally accessible in various formats online, and for the processes required for the article's inclusion in PubMed and its archiving in PubMed Central, e-Depot, Potsdam and INIST. There is no remuneration of any kind provided to the Editors-in-Chief, to any members of the Editorial Board, or to peer reviewers, all of whose work is entirely voluntary. Our article-processing charge is less than charges frequently levied by traditional journals: the Journal of Neuroinflammation does not levy any additional page or color charges on top of this fee, and there are no reprint costs as publication-quality PDF files are provided, free, for distribution in lieu of reprints. Our article-processing charge will enable full, immediate, and continued Open Access for all work published in Journal of Neuroinflammation. The benefits from such Open Access will accrue to readers, through unrestricted access; to authors, through the widest possible dissemination of their work; and to science and society in general, through facilitation of information availability and scientific advancement.

  17. Improving flow distribution in influent channels using computational fluid dynamics.

    PubMed

    Park, No-Suk; Yoon, Sukmin; Jeong, Woochang; Lee, Seungjae

    2016-10-01

    Although the flow distribution in an influent channel, where the inflow is split between the treatment processes of a wastewater treatment plant, greatly affects process efficiency, and although a weir is the typical structure used for flow distribution, to the authors' knowledge there is a paucity of research on flow distribution in an open channel with a weir. In this study, the influent channel of a real-scale wastewater treatment plant was used, in which a suppressed rectangular weir, whose horizontal crest spans the full channel width, was installed. The flow distribution in the influent channel was analyzed using a validated computational fluid dynamics model to investigate (1) a comparison of single-phase and two-phase simulations, (2) an improved design procedure for the prototype channel, and (3) the effect of the inflow rate on flow distribution. The results show that two-phase simulation is more reliable because it captures free-surface fluctuations. Preventing short-circuit flow should be the first consideration when improving flow distribution, and differences in kinetic energy with inflow rate lead to different flow distribution trends. The authors believe that this case study is helpful for improving flow distribution in influent channels.

  18. AMBIT RESTful web services: an implementation of the OpenTox application programming interface.

    PubMed

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2011-05-16

    The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share datasets and models online. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
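
    The "read, process, write" paradigm can be pictured with a few HTTP calls; the base URL and resource paths below are hypothetical placeholders for an OpenTox-style service, not the actual AMBIT endpoints:

    ```python
    # Sketch of the "read data from a web address, perform processing,
    # write to a web address" paradigm. URLs are hypothetical placeholders.
    import requests

    BASE = "https://example.org/opentox"  # hypothetical service instance

    # Read: every dataset is a resource with its own web address.
    dataset = requests.get(f"{BASE}/dataset/1",
                           headers={"Accept": "application/rdf+xml"})
    dataset.raise_for_status()

    # Process: apply a model to the dataset; REST services of this style
    # typically answer with the URI of the newly created result resource.
    task = requests.post(f"{BASE}/model/42",
                         data={"dataset_uri": f"{BASE}/dataset/1"})
    task.raise_for_status()
    result_uri = task.text.strip()

    # Read back: the prediction is itself addressable like any resource.
    prediction = requests.get(result_uri,
                              headers={"Accept": "application/rdf+xml"})
    print(prediction.status_code, result_uri)
    ```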

  19. AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    PubMed Central

    2011-01-01

    The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share datasets and models online. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems. PMID:21575202

  20. Open discovery: An integrated live Linux platform of Bioinformatics tools

    PubMed Central

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for Bioinformatics have paved the way for portability of the Bioinformatics workbench in a platform-independent manner. However, most of the existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery builds on an advanced customizable configuration of Fedora, with data persistence accessible via USB drive or DVD. Availability: Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in PMID:19238235

  1. Open ended intelligence: the individuation of intelligent agents

    NASA Astrophysics Data System (ADS)

    Weinbaum Weaver, David; Veitas, Viktoras

    2017-03-01

    Artificial general intelligence is a field of research aiming to distil the principles of intelligence that operate independently of a specific problem domain and utilise these principles in order to synthesise systems capable of performing any intellectual task a human being is capable of and beyond. While "narrow" artificial intelligence, which focuses on solving specific problems such as speech recognition, text comprehension, visual pattern recognition and robotic motion, has shown impressive breakthroughs lately, understanding general intelligence remains elusive. We propose a paradigm shift from intelligence perceived as a competence of individual agents defined in relation to an a priori given problem domain or a goal, to intelligence perceived as a formative process of self-organisation. We call this process open-ended intelligence. Starting with a brief introduction of the current conceptual approach, we expose a number of serious limitations that are traced back to the ontological roots of the concept of intelligence. Open-ended intelligence is then developed as an abstraction of the process of human cognitive development, so its application can be extended to general agents and systems. We introduce and discuss three facets of the idea: the philosophical concept of individuation, sense-making and the individuation of general cognitive agents. We further show how open-ended intelligence can be framed in terms of a distributed, self-organising network of interacting elements and how such a process is scalable. The framework highlights an important relation between coordination and intelligence and a new understanding of values.

  2. Pu Anion Exchange Process Intensification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor-Pashow, Kathryn M. L.

    This research is focused on improving the efficiency of the anion exchange process for purifying plutonium. While initially focused on plutonium, the technology could also be applied to other ion-exchange processes. Work in FY17 focused on the improvement and optimization of porous foam columns that were initially developed in FY16. These foam columns were surface functionalized with poly(4-vinylpyridine) (PVP) to provide the Pu-specific anion-exchange sites. Two different polymerization methods were explored for maximizing the surface functionalization with the PVP. The open-celled polymeric foams have large open pores and large surface areas available for sorption. The fluid passes through the large open pores of this material, allowing convection to be the dominant mechanism by which mass transport takes place. These materials generally have very low densities, open-celled structures with high cell interconnectivity, small cell sizes, uniform cell size distributions, and high structural integrity. These porous foam columns provide advantages over the typical porous resin beads by eliminating the slow diffusion through resin beads, making the anion-exchange sites easily accessible on the foam surfaces. The best performing samples exceeded the Pu capacity of the commercially available resin, and also offered the advantage of sharper elution profiles, resulting in a more concentrated product, with less loss of material to the dilute heads and tails cuts. An alternate approach to improving the efficiency of this process was also explored through the development of a microchannel array system for performing the anion exchange.

  3. Characterization of polypropylene–polyethylene blends by temperature rising elution and crystallization analysis fractionation

    PubMed Central

    del Hierro, Pilar

    2010-01-01

    The introduction of single-site catalysts in the polyolefins industry opens new routes to design resins with improved performance through multicatalyst-multireactor processes. Physical combination of various polyolefin types in a secondary extrusion process is also a common practice to achieve new products with improved properties. The new resins have complex structures, especially in terms of composition distribution, and their characterization is not always an easy task. Techniques like temperature rising elution fractionation (TREF) or crystallization analysis fractionation (CRYSTAF) are currently used to characterize the composition distribution of these resins. It has been shown that certain combinations of polyolefins may result in equivocal results if only TREF or CRYSTAF is used separately for their characterization. PMID:20730530

  4. SeqPig: simple and scalable scripting for large sequencing data sets in Hadoop.

    PubMed

    Schumacher, André; Pireddu, Luca; Niemenmaa, Matti; Kallio, Aleksi; Korpelainen, Eija; Zanetti, Gianluigi; Heljanko, Keijo

    2014-01-01

    Hadoop MapReduce-based approaches have become increasingly popular due to their scalability in processing large sequencing datasets. However, as these methods typically require in-depth expertise in Hadoop and Java, they are still out of reach of many bioinformaticians. To solve this problem, we have created SeqPig, a library and a collection of tools to manipulate, analyze and query sequencing datasets in a scalable and simple manner. SeqPig scripts use the Hadoop-based distributed scripting engine Apache Pig, which automatically parallelizes and distributes data processing tasks. We demonstrate SeqPig's scalability over many computing nodes and illustrate its use with example scripts. Available under the open source MIT license at http://sourceforge.net/projects/seqpig/

  5. Avalanches and power-law behaviour in lung inflation

    NASA Astrophysics Data System (ADS)

    Suki, Béla; Barabási, Albert-László; Hantos, Zoltán; Peták, Ferenc; Stanley, H. Eugene

    1994-04-01

    When lungs are emptied during exhalation, peripheral airways close up [1]. For people with lung disease, they may not reopen for a significant portion of inhalation, impairing gas exchange [2,3]. A knowledge of the mechanisms that govern reinflation of collapsed regions of lungs is therefore central to the development of ventilation strategies for combating respiratory problems. Here we report measurements of the terminal airway resistance, Rt, during the opening of isolated dog lungs. When inflated by a constant flow, Rt decreases in discrete jumps. We find that the probability distributions of the sizes of the jumps and of the time intervals between them exhibit power-law behaviour over two decades. We develop a model of the inflation process in which 'avalanches' of airway openings are seen, with power-law distributions of both the size of avalanches and the time intervals between them, which agree quantitatively with those seen experimentally and are reminiscent of the power-law behaviour observed for self-organized critical systems [4]. Thus power-law distributions, arising from avalanches associated with threshold phenomena propagating down a branching tree structure, appear to govern the recruitment of terminal airspaces.

  6. Creating Hierarchical Pores by Controlled Linker Thermolysis in Multivariate Metal-Organic Frameworks.

    PubMed

    Feng, Liang; Yuan, Shuai; Zhang, Liang-Liang; Tan, Kui; Li, Jia-Luo; Kirchon, Angelo; Liu, Ling-Mei; Zhang, Peng; Han, Yu; Chabal, Yves J; Zhou, Hong-Cai

    2018-02-14

    Sufficient pore size, appropriate stability, and hierarchical porosity are three prerequisites for open frameworks designed for drug delivery, enzyme immobilization, and catalysis involving large molecules. Herein, we report a powerful and general strategy, linker thermolysis, to construct ultrastable hierarchically porous metal-organic frameworks (HP-MOFs) with tunable pore size distribution. Linker instability, usually an undesirable trait of MOFs, was exploited to create mesopores by generating crystal defects throughout a microporous MOF crystal via thermolysis. The crystallinity and stability of HP-MOFs remain after thermolabile linkers are selectively removed from multivariate metal-organic frameworks (MTV-MOFs) through a decarboxylation process. A domain-based linker spatial distribution was found to be critical for creating hierarchical pores inside MTV-MOFs. Furthermore, linker thermolysis promotes the formation of ultrasmall metal oxide nanoparticles immobilized in an open framework that exhibits high catalytic activity for Lewis acid-catalyzed reactions. Most importantly, this work provides fresh insights into the connection between linker apportionment and vacancy distribution, which may shed light on probing the disordered linker apportionment in multivariate systems, a long-standing challenge in the study of MTV-MOFs.

  7. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  8. What does it take to build a medium scale scientific cloud to process significant amounts of Earth observation data?

    NASA Astrophysics Data System (ADS)

    Hollstein, André; Diedrich, Hannes; Spengler, Daniel

    2017-04-01

    The deployment of the operational fleet of Sentinel satellites by Copernicus offers an unprecedented influx of freely available Earth Observation data, with Sentinel-2 being a great example. It offers a broad range of land applications due to its high spatial sampling from 10 m to 20 m and its multi-spectral imaging capabilities with 13 spectral bands. The open access policy allows unrestricted use by everybody, with data downloads provided on the respective sites. For a small area of interest and shorter time series, data processing and exploitation can easily be done manually. However, for multi-temporal analysis of larger areas, the data size can quickly increase such that it is not manageable in practice on a personal computer, which leads to an increasing interest in central data exploitation platforms. Prominent examples are Google Earth Engine, NASA Earth Exchange (NEX) or current developments such as CODE-DE in Germany. Open standards are still evolving, and the choice of a platform may create lock-in scenarios and a situation where scientists are no longer in full control of all aspects of their analysis. Securing the intellectual property of researchers can become a major issue in the future. Partnering with a startup company that is dedicated to providing tools for farm management and precision farming, GFZ is building a small-scale science cloud named GTS2 for processing and distribution of Sentinel-2 data. The service includes a sophisticated atmospheric correction algorithm, spatial co-registration of time series data, as well as a web API for data distribution. This approach differs from the pull toward centralized research on infrastructures controlled by others. By keeping the full licensing rights, it allows new business models to be developed independently of the initially chosen processing provider. Currently, data is held for the greater German area but is extendable to larger areas on short notice thanks to a scalable distributed network file system. For a given area of interest, band selection, and time range, the API quickly returns only the data that was requested, thereby saving storage space on the user's machine. A JupyterHub instance is the main tool for data exploitation by our users. Nearly all of the software used is open source, is based on open standards, and can be transferred to other infrastructures. In the talk, we give an overview of the current status of the project and the service, but also want to share our experience with its development.

  9. Does climate have heavy tails?

    NASA Astrophysics Data System (ADS)

    Bermejo, Miguel; Mudelsee, Manfred

    2013-04-01

    When we speak of a distribution with heavy tails, we mean that the probability of extreme values is relatively large. Several heavy-tail models are constructed from Poisson processes, which are the most tractable models. Among such processes, one of the most important classes is the Lévy processes, those processes with independent, stationary increments and stochastic continuity. If the random component of a climate process that generates the data exhibits a heavy-tail distribution, and if that fact is ignored by assuming a finite-variance distribution, then there would be serious consequences (in the form, e.g., of bias) for the analysis of extreme values. Yet, it appears to be an open question to what extent and degree climate data exhibit heavy-tail phenomena. We present a study of statistical inference in the presence of heavy-tail distributions. In particular, we explore (1) the estimation of the tail index of the marginal distribution using several estimation techniques (e.g., the Hill estimator and the Pickands estimator) and (2) the power of hypothesis tests. The performance of the different methods is compared on artificial time series by means of Monte Carlo experiments. We systematically apply heavy-tail inference to observed climate data; in particular, we focus on time series data. We study several proxy and directly observed climate variables from the instrumental period, the Holocene and the Pleistocene. This work receives financial support from the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme).
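
    As a small illustration of the tail-index estimation mentioned above, here is a sketch of the Hill estimator, one of the estimation techniques the study compares:

    ```python
    # Sketch of the Hill estimator of the tail index alpha, computed from
    # the k largest order statistics of a positive, heavy-tailed sample.
    import numpy as np

    def hill_estimator(x, k):
        """Hill estimate of alpha from the k largest values of x (k < len(x))."""
        x = np.sort(np.asarray(x, dtype=float))[::-1]   # descending order
        log_excesses = np.log(x[:k]) - np.log(x[k])     # vs. (k+1)-th largest
        return 1.0 / log_excesses.mean()                # mean estimates 1/alpha

    # Monte Carlo check on synthetic Pareto(alpha = 1.5) data; the estimate
    # should come out near 1.5.
    rng = np.random.default_rng(0)
    sample = rng.pareto(1.5, size=10_000) + 1.0
    print(hill_estimator(sample, k=500))
    ```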

  10. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow-code architectures that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using the Web Services Business Process Execution Language (WS-BPEL) to develop them.
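
    The basic asynchrony pattern discussed here (initiate a request, resume other work, check back later) can be sketched as follows; the endpoints and JSON fields are hypothetical placeholders, not a real OGC service:

    ```python
    # Sketch of the submit-then-poll asynchrony pattern: the client kicks
    # off a long-running geospatial process and polls a status resource
    # instead of blocking. All URLs and fields are hypothetical.
    import time
    import requests

    BASE = "https://example.org/wps"  # hypothetical processing service

    # Initiate: an asynchronous service responds immediately with a
    # status location rather than the final result.
    resp = requests.post(f"{BASE}/execute",
                         json={"process": "buffer", "distance_m": 100})
    resp.raise_for_status()
    status_url = resp.json()["statusLocation"]

    # The client is free to do other work between occasional polls.
    while True:
        status = requests.get(status_url).json()
        if status["state"] in ("succeeded", "failed"):
            break
        time.sleep(5)

    print(status["state"], status.get("resultLocation"))
    ```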

  11. Crackles and instabilities during lung inflation

    NASA Astrophysics Data System (ADS)

    Alencar, Adriano M.; Majumdar, Arnab; Hantos, Zoltan; Buldyrev, Sergey V.; Eugene Stanley, H.; Suki, Béla

    2005-11-01

    In a variety of physico-chemical reactions, the actual process takes place in a reactive zone, called the "active surface". We define the active surface of the lung as the set of airway segments that are closed but connected to the trachea through an open pathway, which is the interface between closed and open regions in a collapsed lung. To study the active surface and the time interval between consecutive openings, we measured the sound pressure of crackles, associated with the opening of collapsed airway segments, in isolated dog lungs inflating from the collapsed state in 120 s. We analyzed the sequence of crackle amplitudes, inter-crackle intervals, and low frequency energy from the acoustic data. The series of spike amplitudes spans two orders of magnitude, and the inter-crackle intervals span over five orders of magnitude. The distribution of spike amplitudes follows a power law for nearly two decades, while the distribution of time intervals between consecutive crackles shows two regimes of power-law behavior, where the first region represents crackles coming from avalanches of openings whereas the second region is due to the time intervals between separate avalanches. Using the time intervals between measured crackles, we estimated the time evolution of the active surface during lung inflation. In addition, we show that recruitment and instabilities along the pressure-volume curve are associated with airway opening and recruitment. We find good agreement between the theory of the dynamics of lung inflation and the experimental data, which, combined with numerical results, may prove useful in the clinical diagnosis of lung diseases.

  12. Software LS-MIDA for efficient mass isotopomer distribution analysis in metabolic modelling.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eisenreich, Wolfgang; Dandekar, Thomas

    2013-07-09

    The knowledge of metabolic pathways and fluxes is important to understand the adaptation of organisms to their biotic and abiotic environment. The specific distribution of stable isotope labelled precursors into metabolic products can be taken as a fingerprint of the metabolic events and dynamics through the metabolic networks. Open-source software is required that easily and rapidly calculates, from mass spectra of labelled metabolites, derivatives and their fragments, the global isotope excess and isotopomer distribution. The open-source software "Least Square Mass Isotopomer Analyzer" (LS-MIDA) is presented, which processes experimental mass spectrometry (MS) data on the basis of metabolite information such as the number of atoms in the compound, the mass-to-charge ratio (m/e or m/z) values of the compounds and fragments under study, and the experimental relative MS intensities reflecting the enrichment of isotopomers in 13C- or 15N-labelled compounds in comparison to the natural abundances in the unlabelled molecules. The software uses Brauman's least-squares method of linear regression. As a result, the global isotope enrichment of the metabolite or fragment under study and the molar abundances of each isotopomer are obtained and displayed. The new software provides an open-source platform that easily and rapidly converts experimental MS patterns of labelled metabolites into isotopomer enrichments that are the basis for subsequent observation-driven analysis of pathways and fluxes, as well as for model-driven metabolic flux calculations.
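
    The least-squares idea can be pictured as follows; this is a toy sketch under simplifying assumptions (a hypothetical two-carbon fragment with an assumed pattern matrix), not the LS-MIDA code itself:

    ```python
    # Sketch of least-squares mass isotopomer analysis: solve M @ f ~ y,
    # where column j of M is the mass pattern isotopomer j would produce
    # and y is the measured MS intensity pattern. Numbers are made up.
    import numpy as np

    # Rows are mass channels (m+0, m+1, m+2); columns are the 0-, 1- and
    # 2-(13C) isotopomers of a hypothetical two-carbon fragment, already
    # corrected for natural isotope abundance.
    M = np.array([
        [0.98, 0.01, 0.00],
        [0.02, 0.98, 0.02],
        [0.00, 0.01, 0.98],
    ])
    y = np.array([0.50, 0.30, 0.20])    # measured relative intensities

    f, *_ = np.linalg.lstsq(M, y, rcond=None)
    f = np.clip(f, 0.0, None)
    f /= f.sum()                        # molar isotopomer abundances
    print("isotopomer fractions:", f)

    # Global 13C enrichment: average labeled positions per total positions.
    print("global enrichment:", (0*f[0] + 1*f[1] + 2*f[2]) / 2)
    ```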

  13. Multiphysics Modeling and Simulations of Mil A46100 Armor-Grade Martensitic Steel Gas Metal Arc Welding Process

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Ramaswami, S.; Snipes, J. S.; Yen, C.-F.; Cheeseman, B. A.; Montgomery, J. S.

    2013-10-01

    A multiphysics computational model has been developed for the conventional Gas Metal Arc Welding (GMAW) joining process and used to analyze butt-welding of MIL A46100, a prototypical high-hardness armor martensitic steel. The model consists of five distinct modules, each covering a specific aspect of the GMAW process, i.e., (a) dynamics of welding-gun behavior; (b) heat transfer from the electric arc and mass transfer from the electrode to the weld; (c) development of thermal and mechanical fields during the GMAW process; (d) the associated evolution and spatial distribution of the material microstructure throughout the weld region; and (e) the final spatial distribution of the as-welded material properties. To make the newly developed GMAW process model applicable to MIL A46100, the basic physical-metallurgy concepts and principles for this material have to be investigated and properly accounted for/modeled. The newly developed GMAW process model enables establishment of the relationship between the GMAW process parameters (e.g., open circuit voltage, welding current, electrode diameter, electrode-tip/weld distance, filler-metal feed speed, and gun travel speed), workpiece material chemistry, and the spatial distribution of as-welded material microstructure and properties. The predictions of the present GMAW model pertaining to the spatial distribution of the material microstructure and properties within the MIL A46100 weld region are found to be consistent with general expectations and prior observations.

  14. Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.

    2017-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, ORNL DAAC has been continuously building itself into a trustworthy environmental science data repository by not only ensuring the quality and usability of its data holdings, but also optimizing its data publication and management process. This paper describes the lessons learned from ORNL DAAC's effort toward this goal. ORNL DAAC has been proactively implementing international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions of its life cycle. ORNL DAAC's data citation policy assures data producers receive appropriate recognition of use of their products. Web service standards, such as OpenSearch and Open Geospatial Consortium (OGC), promotes the discovery, visualization, distribution, and integration of ORNL DAAC's data holdings. Recently, ORNL DAAC began efforts to optimize and standardize its data archival and data publication workflows, to improve the efficiency and transparency of its data archival and management processes.

  15. Steady-state kinetics of solitary batrachotoxin-treated sodium channels. Kinetics on a bounded continuum of polymer conformations.

    PubMed Central

    Rubinson, K A

    1992-01-01

    The underlying principles of the kinetics and equilibrium of a solitary sodium channel in the steady state are examined. Both the open and closed kinetics are postulated to result from round-trip excursions from a transition region that separates the openable and closed forms. Exponential behavior of the kinetics can have origins different from small-molecule systems. These differences suggest that the probability density functions (PDFs) that describe the time dependences of the open and closed forms arise from a distribution of rate constants. The distribution is likely to arise from a thermal modulation of the channel structure, and this provides a physical basis for the following three-variable equation: [formula; see text] Here, A0 is a scaling term, k is the mean rate constant, and sigma quantifies the Gaussian spread for the contributions of a range of effective rate constants. The maximum contribution is made by k, with rates faster and slower contributing less. (When sigma, the standard deviation of the spread, goes to zero, then p(t) = A0 e^(-kt).) The equation is applied to the single-channel steady-state probability density functions for batrachotoxin-treated sodium channels (Keller et al., 1986, J. Gen. Physiol. 88:1-23). The following characteristics are found: (a) The data for both open and closed forms of the channel are fit well with the above equation, which represents a Gaussian distribution of first-order rate processes. (b) The simple relationship [formula; see text] holds for the mean effective rate constants. Or, equivalently stated, the values of P open calculated from the k values closely agree with the P open values found directly from the PDF data. (c) In agreement with the known behavior of voltage-dependent rate constants, the voltage dependences of the mean effective rate constants for the opening and closing of the channel are equal and opposite over the voltage range studied. That is, [formula; see text] "Bursts" are related to the well-known cage effect of solution chemistry. PMID:1312365

  16. Multimedia Network Design Study

    DTIC Science & Technology

    1989-09-30

    manipulation and analysis of the equations involved, thereby providing the application of the great range of powerful mathematical optimization...be treated by this analysis. First, all arrivals to the network have the Poisson distribution, and separate traffic classes may have separate arrival...different for open and closed networks, so these two situations will be treated separately in the following subsections. 2.3.1 The Computational Process in

  17. Raster Data Partitioning for Supporting Distributed GIS Processing

    NASA Astrophysics Data System (ADS)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data. Other research studies treat geospatial data as if it had always been big data (Lee and Kang, 2015). Nevertheless, data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). From the increasing volume of raw data produced in different formats and representations and for different purposes, the wealth of derived information is what represents truly valuable results. However, computing capability and processing speed still face limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Lately, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing all the more requires appropriate processing algorithms that can be distributed to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capability to process non-GIS big data. But it is sometimes inconvenient or inefficient to rewrite existing algorithms for the Map-Reduce programming model, and GIS data cannot be partitioned like text-based data, by lines or by bytes. Hence, we would like to find an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of GIS (raster) data partitioning, distribution, and the distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution and processing. The first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area; therefore, we may consider data partitioning a preprocessing step before applying processing services to the data. As a proof of concept, we have implemented a simple tile-based partitioning method, splitting an image into smaller grids (NxM tiles) and comparing the processing time to existing methods by NDVI calculation; a sketch of this tiling approach follows below. The concept is demonstrated using our own open-source processing framework.
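
    A minimal single-machine sketch of the tile-based partitioning with per-tile NDVI computation (the per-tile step is what a distributed framework would ship to worker nodes; array sizes and grid shape are arbitrary):

    ```python
    # Sketch of tile-based raster partitioning with per-tile NDVI.
    import numpy as np

    def ndvi(red, nir):
        """Per-pixel NDVI = (NIR - RED) / (NIR + RED), guarding against /0."""
        denom = np.maximum(nir + red, 1e-12)
        return (nir - red) / denom

    def tiled_ndvi(red, nir, n, m):
        """Split both bands into an n x m tile grid and process each tile
        independently; in a cluster each tile would be a separate job."""
        out = np.empty(red.shape, dtype=float)
        rows = np.array_split(np.arange(red.shape[0]), n)
        cols = np.array_split(np.arange(red.shape[1]), m)
        for r in rows:
            for c in cols:
                block = np.ix_(r, c)
                out[block] = ndvi(red[block], nir[block])
        return out

    rng = np.random.default_rng(1)
    red, nir = rng.random((100, 100)), rng.random((100, 100))
    print(tiled_ndvi(red, nir, 4, 4).mean())
    ```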

  18. A Tour of Big Data, Open Source Data Management Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2012-12-01

    The Apache Software Foundation, a non-profit foundation charged with the dissemination of open source software for the public good, provides a suite of data management technologies for distributed archiving, data ingestion, data dissemination, processing, triage and a host of other functionalities that are becoming critical in the Big Data regime. Apache is the world's largest open source software organization, boasting over 3000 developers from around the world all contributing to some of the most pervasive technologies in use today, from the HTTPD web server that powers a majority of Internet web sites to the Hadoop technology that now underpins an industry projected at over $1B. Apache data management technologies are emerging as de facto off-the-shelf components for searching, distributing, processing and archiving key science data sets, from the geophysical, space and planetary domains all the way to biomedicine. In this talk, I will give a virtual tour of the Apache Software Foundation, its meritocracy and governance structure, and also its key big data technologies that organizations can take advantage of today and use to save cost, schedule, and resources in implementing their Big Data needs. I'll illustrate the Apache technologies in the context of several national priority projects, including the U.S. National Climate Assessment (NCA), and the International Square Kilometre Array (SKA) project, which are stretching the boundaries of volume, velocity, complexity, and other key Big Data dimensions.

  19. A Framework for Open, Flexible and Distributed Learning.

    ERIC Educational Resources Information Center

    Khan, Badrul H.

    Designing open, flexible distance learning systems on the World Wide Web requires thoughtful analysis and investigation combined with an understanding of both the Web's attributes and resources and the ways instructional design principles can be applied to tap the Web's potential. A framework for open, flexible, and distributed learning has been…

  20. A framework for integration of scientific applications into the OpenTopography workflow

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C.; Baru, C.

    2012-12-01

    The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service oriented architecture that is comprised of an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g., a portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines developed and maintained by the community to be integrated into the OpenTopography system, so that the wider earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions with the help of a back-end database. It allows monitoring of job and system status by providing charting tools. All core components in OpenTopography have been developed, maintained and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most of the new scientific applications will have their own active development teams performing regular updates, maintenance and other improvements. It would be optimal to have each application co-located with its developers, who can continue to actively work on it while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system. This will be accomplished by virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. This involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an Opal-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs, along with metadata annotations encoded in XML, can be utilized to automate the generation of user interfaces, with appropriate tool tips and user help features, and the generation of other internal software. The OpenTopography Opal toolkit will also include customizations that enable security authentication, authorization and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information could then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable application developers to continue to work on enhancing their applications while making the latest iteration available in a timely manner to the earth sciences community. This will also help us establish an overall framework that other scientific application providers will be able to use going forward.

  1. Levels and distribution of polybrominated diphenyl ethers in soil, sediment and dust samples collected from various electronic waste recycling sites within Guiyu town, southern China.

    PubMed

    Labunska, Iryna; Harrad, Stuart; Santillo, David; Johnston, Paul; Brigden, Kevin

    2013-02-01

    Electronic waste recycling operations in some parts of Asia are conducted using rudimentary techniques which result in workplace and environmental contamination with toxic metals and persistent organic pollutants. This study reports concentrations of 14 polybrominated diphenyl ethers (PBDEs), from tri- to deca-brominated, in 31 samples of soil, sediment, dust or ash collected in the vicinity of e-waste recycling sites in Guiyu (southeast China) which were engaged in common activities such as dismantling, shredding, solder recovery, acid processing and open burning. The concentrations detected in this study far exceed those reported previously in urban soil and sediment and are consistent with or exceed those reported in previous studies around e-waste processing facilities. Some of the highest PBDE concentrations reported to date (e.g. 390,000 ng g⁻¹ dw, ∑14PBDEs) were found in a sample collected from a site used for open burning of e-waste, while an average concentration of 220,000 ng g⁻¹ dw (∑14PBDEs) occurred in sediments impacted by circuit board shredding. A decrease in PBDE concentrations observed with increasing distance from workshops in samples associated with acid processing of wastes provides evidence that such operations are a significant source of PBDEs to the environment. Principal components analysis reveals a complex PBDE congener distribution, suggesting contamination by two or even three commercial formulations, consistent with the diverse range of wastes processed.

  2. Detecting tree-fall gap disturbances in tropical rain forests with airborne lidar

    NASA Astrophysics Data System (ADS)

    Espirito-Santo, F. D. B.; Saatchi, S.; Keller, M.

    2017-12-01

    Forest inventory studies in the Amazon indicate a large terrestrial carbon sink. However, field plots may fail to represent forest mortality processes at the landscape scales of tropical forests. Here we characterize the frequency distribution of tree-fall gap disturbances in natural tropical forests using a novel combination of forest inventory and airborne lidar data. We quantify the gap size frequency distribution along vertical and horizontal dimensions in ten Neotropical forest canopies distributed across gradients of climate and landscape using airborne lidar measurements. We assessed all canopy openings related to each class of tree height, which yields a three-dimensional structure of the distribution of canopy gaps. Gap frequency distributions from lidar CHM data vary markedly with the minimum gap size threshold, but we found that natural forest disturbances (tree-fall gaps) follow a power-law distribution with a narrow range of power-law exponents (-1.2 to -1.3). These power-law exponents from gap frequency distributions provide insights into how natural forest disturbances are distributed over tropical forest landscapes.
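
    For illustration, an exponent in the reported range can be recovered from synthetic gap sizes with a simple log-log regression over logarithmic bins (a common quick method; maximum-likelihood fitting is preferable for rigorous work):

    ```python
    # Sketch: estimate a gap-size frequency power-law exponent by linear
    # regression in log-log space. Data are synthetic, not lidar-derived.
    import numpy as np

    rng = np.random.default_rng(2)
    gaps = rng.pareto(0.25, size=5000) + 1.0     # density ~ x^(-1.25)

    edges = np.logspace(0.0, np.log10(gaps.max()), 20)
    counts, _ = np.histogram(gaps, bins=edges)
    density = counts / np.diff(edges)            # per-unit-size frequency
    centers = np.sqrt(edges[:-1] * edges[1:])    # geometric bin centers

    mask = counts > 0
    slope, _ = np.polyfit(np.log10(centers[mask]),
                          np.log10(density[mask]), 1)
    print("estimated power-law exponent:", slope)  # expect about -1.25
    ```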

  3. On the Performance of an Algebraic Multigrid Solver on Multicore Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A H; Schulz, M; Yang, U M

    2010-04-29

    Algebraic multigrid (AMG) solvers have proven to be extremely efficient on distributed-memory architectures. However, when executed on modern multicore cluster architectures, we face new challenges that can significantly harm AMG's performance. We discuss our experiences on such an architecture and present a set of techniques that help users to overcome the associated problems, including thread and process pinning and correct memory associations. We have implemented most of the techniques in a MultiCore SUPport library (MCSup), which helps to map OpenMP applications to multicore machines. We present results using both an MPI-only and a hybrid MPI/OpenMP model.

  4. Evolution of the Campanian Ignimbrite Magmatic System II: Trace Element and Th Isotopic Evidence for Open-System Processes

    NASA Astrophysics Data System (ADS)

    Bohrson, W. A.; Spera, F. J.; Fowler, S.; Belkin, H.; de Vivo, B.

    2005-12-01

    The Campanian Ignimbrite, a large-volume (~200 km³ DRE) trachytic to phonolitic ignimbrite, was deposited at ~39.3 ka and represents the largest of a number of highly explosive volcanic events in the region near Naples, Italy. Thermodynamic modeling of the major element evolution using the MELTS algorithm (see the companion contribution by Fowler et al.) provides detailed information about the identity of and changes in the proportions of solids along the liquid line of descent during isobaric fractional crystallization. We have derived trace element mass balance equations that explicitly accommodate changing mineral-melt bulk distribution coefficients during crystallization and also simultaneously satisfy energy and major element mass conservation. Although major element patterns are reasonably modeled assuming closed-system fractional crystallization, modeling of trace elements that represent a range of behaviors (e.g. Zr, Nb, Th, U, Rb, Sm, Sr) yields trends for closed-system fractionation that are distinct from those observed. These results suggest open-system processes were also important in the evolution of the Campanian magmatic system. Th isotope data yield an apparent isochron that is ~20 kyr younger than the age of the deposit, and age-corrected Th isotope data indicate that the magma body was an open system at the time of eruption. Because open-system processes can profoundly change the isotopic characteristics of a magma body, these results illustrate that it is critical to understand the contribution that open-system processes make to silicic magma bodies before assigning relevance to age or timescale information derived from isotope systematics. Fluid-magma interaction has been proposed as a mechanism to change the isotopic and elemental characteristics of magma bodies, but an evaluation of the mass and thermal constraints on such a process suggests large-scale fluid-melt interaction at liquidus temperatures is unlikely. In the case of the magma body associated with the Campanian Ignimbrite, the most likely source of open-system signatures is assimilation of partial melts of compositionally heterogeneous basement composed of older cumulates and intrusive equivalents of volcanic activity within the Campanian region. Additional trace element modeling, explicitly evaluating the mass and energy balance effects that fluids, solids, and melt have on trace element evolution, will further elucidate the contributions of open- vs. closed-system processes within the Campanian magma body.
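
    For orientation, the closed-system end member that such trace element mass balance generalizes is the classic Rayleigh fractionation law with a constant bulk distribution coefficient D; the formulation described above instead lets D evolve with the crystallizing assemblage (standard petrology, not a formula quoted from the abstract):

    ```latex
    % Classic Rayleigh fractional crystallization (constant D), the
    % special case generalized by letting D change during crystallization:
    \frac{C_l}{C_0} = F^{\,D-1},
    \qquad D = \sum_i x_i\, K_d^{\,i}
    ```

    Here C_l is the trace element concentration in the residual liquid, C_0 the initial concentration, F the melt fraction remaining, x_i the mass proportions of the crystallizing minerals, and K_d^i their mineral-melt partition coefficients.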

  5. Characteristics of Polybrominated Diphenyl Ethers Released from Thermal Treatment and Open Burning of E-Waste.

    PubMed

    Li, Ting-Yu; Zhou, Jun-Feng; Wu, Chen-Chou; Bao, Lian-Jun; Shi, Lei; Zeng, Eddy Y

    2018-04-17

    Primitive processing of e-waste potentially releases abundant organic contaminants to the environment, but the magnitudes and mechanisms remain to be adequately addressed. We conducted thermal treatment and open burning of typical e-wastes, that is, plastics and printed circuit boards. Emission factors of the sum of 39 polybrominated diphenyl ethers (∑39PBDE) were 817-1.60 × 10⁵ ng g⁻¹ in thermal treatment and non-detected to 9.14 × 10⁴ ng g⁻¹ in open burning. Airborne particles (87%) were the main carriers of PBDEs, followed by residual ashes (13%) and gaseous constituents (0.3%), in thermal treatment, while the corresponding fractions were 30%, 43% and 27% in open burning. The output-input mass ratios of ∑39PBDE were 0.12-3.76 in thermal treatment and 0-0.16 in open burning. All PBDEs were largely affiliated with fine particles, with geometric mean diameters of 0.61-0.83 μm in thermal degradation and 0.57-1.16 μm in open burning from plastic casings, and 0.44-0.56 μm and non-detected to 0.55 μm, respectively, from printed circuit boards. Evaporation and reabsorption may be the main emission mechanisms for lightly brominated BDEs, but heavily brominated BDEs tend to affiliate with particles from heating or combustion. The different size distributions of particulate PBDEs in emission sources and adjacent air implicated a noteworthy redeposition process during atmospheric dispersal.

  6. Open Data and Open Science for better Research in the Geo and Space Domain

    NASA Astrophysics Data System (ADS)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    Main open data principles had been worked out in the run-up to, and were finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland, in June 2013. Important principles are also valid for science data, such as Open Data by Default, Quality and Quantity, Useable by All, Releasing Data for Improved Governance, and Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying the clearance of intellectual property rights. Open Science is not just a list of principles that have been known for a long time; it stands for a range of initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility to scientific communication, and the use of social media to facilitate scientific collaboration. Some projects concentrate on the open sharing of free and open-source software, and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer-review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. The semantic-Web-based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf

  7. Local delivery of molecules from a nanopipette for quantitative receptor mapping on live cells.

    PubMed

    Babakinejad, Babak; Jönsson, Peter; López Córdoba, Ainara; Actis, Paolo; Novak, Pavel; Takahashi, Yasufumi; Shevchuk, Andrew; Anand, Uma; Anand, Praveen; Drews, Anna; Ferrer-Montiel, Antonio; Klenerman, David; Korchev, Yuri E

    2013-10-01

    Using nanopipettes to locally deliver molecules to the surface of living cells could potentially open up studies of biological processes down to the level of single molecules. However, in order to achieve precise and quantitative local delivery it is essential to be able to determine the amount and distribution of the molecules being delivered. In this work, we investigate how the size of the nanopipette, the magnitude of the applied pressure or voltage that drives the delivery, and the distance to the underlying surface influence the number and spatial distribution of the delivered molecules. Analytical expressions describing the delivery are derived and compared with the results from finite element simulations and experiments on delivery from a 100 nm nanopipette in bulk solution and to the surface of sensory neurons. We then developed a setup for rapid and quantitative delivery to multiple subcellular areas, delivering the molecule capsaicin to stimulate opening of Transient Receptor Potential Vanilloid subfamily member 1 (TRPV1) channels, membrane receptors involved in pain sensation. Overall, precise and quantitative delivery of molecules from nanopipettes has been demonstrated, opening up many applications in biology such as locally stimulating and mapping receptors on the surface of live cells.

  8. An Analysis of Fifth-Grade Students' Performance When Solving Selected Open Distributive Sentences. Technical Report No. 397.

    ERIC Educational Resources Information Center

    Hobbs, Charles Eugene

    The author investigates elementary school students' performance when solving selected open distributive sentences in relation to three factors (Open Sentence Type, Context, Number Size) and identifies and classifies solution methods attempted by students and students' errors in performance. Eighty fifth-grade students participated in the…

  9. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    NASA Astrophysics Data System (ADS)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system has therefore become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests run against the Ceph POSIX file system presented surprising results, indicating true potential for fast I/O and reliability. STAR's online compute farm has historically been used for job submission and first-hand data analysis; reusing it to maintain both a storage cluster and job submission will be an efficient use of the current infrastructure.
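
    The single- and parallel-I/O testing described above can be approximated with a short script. The sketch below is a minimal illustration rather than the STAR test suite: the mount point, file size, and worker count are hypothetical, and it simply times concurrent synced writes against a POSIX-mounted file system such as a CephFS share.

      import os
      import time
      from concurrent.futures import ProcessPoolExecutor

      MOUNT = "/mnt/cephfs"      # hypothetical CephFS mount point
      FILE_SIZE = 256 * 1024**2  # 256 MiB per worker
      WORKERS = 8

      def write_file(i):
          """Write FILE_SIZE bytes and force them to the backing store."""
          path = os.path.join(MOUNT, f"iotest_{i}.dat")
          start = time.perf_counter()
          with open(path, "wb") as f:
              f.write(b"\0" * FILE_SIZE)
              f.flush()
              os.fsync(f.fileno())
          return time.perf_counter() - start

      if __name__ == "__main__":
          t0 = time.perf_counter()
          with ProcessPoolExecutor(max_workers=WORKERS) as pool:
              per_worker = list(pool.map(write_file, range(WORKERS)))
          wall = time.perf_counter() - t0
          total_mib = WORKERS * FILE_SIZE / 1024**2
          print(f"aggregate {total_mib / wall:.1f} MiB/s; "
                f"slowest worker {max(per_worker):.2f} s")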

  10. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available in open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
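
    The abstract does not spell out DAX's internal API, so the sketch below only illustrates the general pattern it describes: generating one batch script per unit of work and submitting it to a PBS grid with qsub. The script template, resource request, and pipeline command are hypothetical.

      import subprocess
      from pathlib import Path

      def submit_pbs_job(job_name: str, command: str, workdir: Path) -> str:
          """Write a minimal PBS script and submit it with qsub; return the job id."""
          script = workdir / f"{job_name}.pbs"
          script.write_text("\n".join([
              "#!/bin/bash",
              f"#PBS -N {job_name}",
              "#PBS -l nodes=1:ppn=2,walltime=04:00:00",  # hypothetical resources
              command,
              "",
          ]))
          out = subprocess.run(["qsub", str(script)],
                               capture_output=True, text=True, check=True)
          return out.stdout.strip()

      # Hypothetical use: one processing job per imaging session.
      for session in ["SESS001", "SESS002"]:
          job_id = submit_pbs_job(f"proc_{session}",
                                  f"run_pipeline.sh {session}",  # hypothetical pipeline script
                                  Path("/tmp"))
          print(session, "->", job_id)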

  11. Defining Success in Open Science

    PubMed Central

    Ali-Khan, Sarah E.; Jean, Antoine; MacDonald, Emily; Gold, E. Richard

    2018-01-01

    Mounting evidence indicates that, worldwide, innovation systems are increasingly unsustainable. Equally, concerns about inequities in the science and innovation process, and in access to its benefits, continue. Against a backdrop of growing health, economic and scientific challenges, global stakeholders are urgently seeking to spur innovation and maximize the just distribution of benefits for all. Open Science collaboration (OS) – comprising a variety of approaches to increase open, public, and rapid mobilization of scientific knowledge – is seen to be one of the most promising ways forward. Yet many decision-makers hesitate to construct policy to support the adoption and implementation of OS without access to substantive, clear and reliable evidence. In October 2017, international thought-leaders gathered at an Open Science Leadership Forum in the Washington DC offices of the Bill and Melinda Gates Foundation to share their views on what successful Open Science looks like. Delegates from developed and developing nations, national governments, science agencies and funding bodies, philanthropy, researchers, patient organizations and the biotechnology, pharma and artificial intelligence (AI) industries discussed the outcomes that would rally them to invest in OS, as well as wider issues of policy and implementation. This first of two reports summarizes delegates' views on what they believe OS will deliver in terms of research, innovation and social impact in the life sciences. Through an open and collaborative process over the next months, we will translate these success outcomes into a toolkit of quantitative and qualitative indicators to assess when, where and how open science collaborations best advance research, innovation and social benefit. Ultimately, this work aims to develop and openly share tools to allow stakeholders to evaluate and re-invent their innovation ecosystems, to maximize value for the global public and patients, and to address long-standing questions about the mechanics of innovation. PMID:29553146

  13. Alluvial deposits and plant distribution in an Amazonian lowland megafan

    NASA Astrophysics Data System (ADS)

    Zani, H.; Rossetti, D.; Cremon; Cohen, M.; Pessenda, L. C.

    2012-12-01

    A large volume of sandy alluvial deposits (>1000 km²) characterizes a flat wetland in northern Amazonia. These have recently been described as the sedimentary record of a megafan system, which has a distinct triangular shape produced by highly migratory distributary rivers. The vegetation map suggests that this megafan is dominated by open vegetation in sharp contact with the surrounding rainforest. Understanding the relationship between geomorphological processes and vegetation distribution is crucial to decipher and conserve the biodiversity in this Amazonian ecosystem. In this study we interpret plant dynamics over time and investigate their potential control by sedimentary processes during landscape evolution. The study area is located in the Viruá National Park. Two field campaigns were undertaken in the dry seasons of 2010 and 2011, and the sampling sites were selected by combining accessibility and representativeness. Vegetation contrasts were recorded along a transect in the medial section of the Viruá megafan. Due to the absence of outcrops, samples were extracted using a core device, which allowed sampling up to a depth of 7.5 m. All cores were opened and described in the field, with 5 cm³ samples collected at 20 cm intervals. The δ¹³C of organic matter was used as a proxy to distinguish between C3 and C4 plant communities. The chronology was established based on radiocarbon dating. The results suggest that the cores from forested areas show the most depleted δ¹³C values, ranging from -32.16 to -27.28‰. The δ¹³C curve in these areas displays typical C3 land-plant values for the entire record, which covers most of the Holocene. This finding indicates that either the vegetation remained stable over time or the sites were dominated by aquatic environments with freshwater plants before forest establishment. The cores from the open vegetation areas show a progressive upward enrichment in δ¹³C values, which range from -28.50 to -19.59‰. This trend is more pronounced after the mid-Holocene, suggesting that the open vegetation, represented mostly by C4 land plants, evolved only more recently. Based on our isotope data, a model is proposed taking into account the influence of sedimentary dynamics on the modern pattern of plant distribution. The establishment of open vegetation occurred at different times depending on location over the megafan area, varying from around 3,000 to 6,400 cal yrs BP. As sedimentation took place, areas located far from the surrounding rainforest were prone to inputs of organic matter derived from open vegetation, whereas the contribution of organic matter derived from arboreous vegetation increases toward the areas located closer to the rainforest. In general, open vegetation is constrained to depositional sites that remained active until relatively recent Holocene times, while surrounding areas with a relatively older geological history are covered by dense forest. The results presented here constitute a striking example of the influence of sedimentary processes during the Late Pleistocene-Holocene on the development of the modern plants of this Amazonian lowland.

  14. Enabling Access to High-Resolution Lidar Topography for Earth Science Research

    NASA Astrophysics Data System (ADS)

    Crosby, Christopher; Nandigam, Viswanath; Arrowsmith, Ramon; Baru, Chaitan

    2010-05-01

    High-resolution topography data acquired with lidar (light detection and ranging, a.k.a. laser scanning) technology are revolutionizing the way we study the geomorphic processes acting along the Earth's surface. These data, acquired from either an airborne platform or from a tripod-mounted scanner, are emerging as a fundamental tool for research on a variety of topics ranging from earthquake hazards to ice sheet dynamics. Lidar topography data allow earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible yet essential for their appropriate representation. These datasets also have significant implications for earth science education and outreach because they provide an accurate digital representation of landforms and geologic hazards. However, along with the potential of lidar topography comes an increase in the volume and complexity of data that must be efficiently managed, archived, distributed, processed and integrated in order for them to be of use to the community. A single lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative imagery. This massive volume of data is often difficult to manage and poses significant distribution challenges when trying to allow access to the data for a large scientific user community. Furthermore, the datasets can be technically challenging to work with and may require specific software and computing resources that are not readily available to many users. The U.S. National Science Foundation (NSF)-funded OpenTopography Facility (http://www.opentopography.org) is an online data access and processing system designed to address the challenges posed by lidar data, and to democratize access to these data for the scientific user community. OpenTopography provides free, online access to lidar data in a number of forms, including raw lidar point cloud data, standard DEMs, and easily accessible Google Earth visualizations. OpenTopography uses cyberinfrastructure resources to allow users, regardless of their level of expertise, to access lidar data products that can be applied to their research. In addition to data access, the system uses customized algorithms and high-performance computing resources to allow users to perform on-the-fly data processing tasks such as the generation of custom DEMs. OpenTopography's primary focus is on large, community-oriented scientific data sets, such as those acquired by the NSF-funded EarthScope project. We are actively expanding our holdings through collaborations with researchers and data providers to include data from a wide variety of landscapes and geologic domains. Ultimately, the goal is for OpenTopography to be the primary clearinghouse for Earth science-oriented high-resolution topography. This presentation will provide an overview of the OpenTopography Facility, including available data, processing capabilities and resources, examples from scientific use cases, and a snapshot of system and data usage thus far. We will also discuss current development activities related to deploying high-performance algorithms for hydrologic processing of DEMs, geomorphic change detection analysis, and the incorporation of full waveform lidar data into the system.
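
    On-the-fly DEM generation of the kind described usually reduces to binning point-cloud returns onto a regular grid and aggregating elevations per cell. The sketch below is a generic illustration with numpy, not OpenTopography's production algorithm; the synthetic points and 5 m cell size are assumptions.

      import numpy as np

      def points_to_dem(x, y, z, cell=1.0):
          """Grid lidar returns into a mean-elevation DEM with the given cell size."""
          xi = ((x - x.min()) / cell).astype(int)
          yi = ((y - y.min()) / cell).astype(int)
          sums = np.zeros((yi.max() + 1, xi.max() + 1))
          counts = np.zeros_like(sums)
          np.add.at(sums, (yi, xi), z)     # accumulate elevations per cell
          np.add.at(counts, (yi, xi), 1)   # count returns per cell
          with np.errstate(invalid="ignore"):
              return sums / counts         # NaN where a cell received no returns

      # Synthetic demonstration: 10,000 returns over a gently sloping 100 m square.
      rng = np.random.default_rng(0)
      x, y = rng.uniform(0, 100, 10_000), rng.uniform(0, 100, 10_000)
      z = 0.05 * x + rng.normal(0, 0.1, 10_000)
      print(points_to_dem(x, y, z, cell=5.0).shape)  # (20, 20)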

  15. A novel process route for the production of spherical SLS polymer powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Jochen; Sachs, Marius; Blümel, Christina

    2015-05-22

    Currently, rapid prototyping is gradually being transferred to additive manufacturing, opening new applications. Selective laser sintering (SLS) is especially promising. One drawback is the limited choice of polymer materials available as optimized powders. Powders produced by cryogenic grinding show poor powder flowability, resulting in poor device quality. Within this account we present a novel process route for the production of spherical polymer micron-sized particles of good flowability. The feasibility of the process chain is demonstrated for polystyrene. In a first step, polymer microparticles are produced by a wet grinding method. By this approach the mean particle size and the particle size distribution can be tuned between a few microns and several tens of microns. The applicability of this method will be discussed for different polymers, and the dependencies of product particle size distribution on stressing conditions and process temperature will be outlined. The comminution products consist of microparticles of irregular shape and poor powder flowability. An improvement of flowability of the ground particles is achieved by changing their shape: they are rounded using a heated downer reactor. The influence of temperature profile and residence time on the product properties will be addressed applying a viscous-flow sintering model. To further improve the flowability of the cohesive spherical polymer particles, nanoparticles are adhered onto the microparticles' surface. The improvement of flowability is remarkable: rounded and dry-coated powders exhibit a strongly reduced tensile strength as compared to the comminution product. The improved polymer powders obtained by the proposed process route open new possibilities in SLS processing, including the use of much smaller polymer beads.

  16. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    NASA Astrophysics Data System (ADS)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g., ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; and mechanisms for user involvement and collaboration and for supporting interdisciplinary and domain-specific applications. ESG has played a key role in recent GEOSS Service Network (GSN) demos and workshops, acting not only as a service and data catalog and discovery client, but also as a portrayal and visualization client to distributed data.
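
    Access through the open standards named above often comes down to a parameterized HTTP request. The following sketch issues a standard OGC WMS 1.1.1 GetMap call; the endpoint URL and layer name are hypothetical placeholders, not ESG's actual service.

      import requests

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "earth_observations",  # hypothetical layer name
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",
          "WIDTH": 1024,
          "HEIGHT": 512,
          "FORMAT": "image/png",
      }
      resp = requests.get("https://example.org/wms", params=params, timeout=30)
      resp.raise_for_status()
      with open("map.png", "wb") as f:
          f.write(resp.content)  # rendered map, usable in clients such as Google Earth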

  17. Effects of Land Cover on the Movement of Frugivorous Birds in a Heterogeneous Landscape.

    PubMed

    Da Silveira, Natalia Stefanini; Niebuhr, Bernardo Brandão S; Muylaert, Renata de Lara; Ribeiro, Milton Cezar; Pizo, Marco Aurélio

    2016-01-01

    Movement is a key spatiotemporal process that enables interactions between animals and other elements of nature. The understanding of animal trajectories and the mechanisms that influence them at the landscape level can yield insight into ecological processes and potential solutions to specific ecological problems. Based upon optimal foraging models and empirical evidence, we hypothesized that movement by thrushes is highly tortuous (low average movement speeds and homogeneous distribution of turning angles) inside forests, moderately tortuous in urban areas, which present intermediate levels of resources, and minimally tortuous (high movement speeds and turning angles near 0 radians) in open matrix types (e.g., crops and pasture). We used data on the trajectories of two common thrush species (Turdus rufiventris and Turdus leucomelas) collected by radio telemetry in a fragmented region in Brazil. Using a maximum likelihood model selection approach, we fit four probability distribution models to average speed data, considering short-tailed, long-tailed, and scale-free distributions (to represent different regimes of movement variation), and one distribution to relative angle data. Models included land cover type and distance from forest-matrix edges as explanatory variables. Speed was greater farther away from forest edges and increased faster inside forest habitat compared to urban and open matrices. However, turning angle was not influenced by land cover. Thrushes presented a very tortuous trajectory, with many displacements followed by turns near 180 degrees. Thrush trajectories resembled habitat- and edge-dependent, tortuous random walks, with a well-defined movement scale inside each land cover type. Although thrushes are habitat generalists, they showed a greater preference for forest edges, and thus may be considered edge specialists. Our results reinforce the importance of studying animal movement patterns in order to understand ecological processes such as seed dispersal in fragmented areas, where the percentage of remaining habitat is dwindling.

  18. Distributed representations in memory: Insights from functional brain imaging

    PubMed Central

    Rissman, Jesse; Wagner, Anthony D.

    2015-01-01

    Forging new memories for facts and events, holding critical details in mind on a moment-to-moment basis, and retrieving knowledge in the service of current goals all depend on a complex interplay between neural ensembles throughout the brain. Over the past decade, researchers have increasingly leveraged powerful analytical tools (e.g., multi-voxel pattern analysis) to decode the information represented within distributed fMRI activity patterns. In this review, we discuss how these methods can sensitively index neural representations of perceptual and semantic content, and how leveraging the engagement of distributed representations provides unique insights into distinct aspects of memory-guided behavior. We emphasize that, in addition to characterizing the contents of memories, analyses of distributed patterns shed light on the processes that influence how information is encoded, maintained, or retrieved, and thus inform memory theory. We conclude by highlighting open questions about memory that can be addressed through distributed pattern analyses. PMID:21943171

  19. Unbiased free energy estimates in fast nonequilibrium transformations using Gaussian mixtures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procacci, Piero

    2015-04-21

    In this paper, we present an improved method for obtaining unbiased estimates of the free energy difference between two thermodynamic states using the work distribution measured in nonequilibrium driven experiments connecting these states. The method is based on the assumption that any observed work distribution is given by a mixture of Gaussian distributions, whose normal components are identical in either direction of the nonequilibrium process, with weights regulated by the Crooks theorem. Using the prototypical example of the driven unfolding/folding of deca-alanine, we show that the predicted behavior of the forward and reverse work distributions, assuming a combination of only two Gaussian components with Crooks-derived weights, explains surprisingly well the striking asymmetry in the observed distributions at fast pulling speeds. The proposed methodology opens the way for a perfectly parallel implementation of Jarzynski-based free energy calculations in complex systems.
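
    For reference, the standard relations underlying this construction, written here in their textbook form rather than quoted from the paper, are the Crooks fluctuation theorem and the Jarzynski equality that follows from it:

      % Crooks fluctuation theorem: P_F, P_R are the forward and reverse work
      % distributions, beta = 1/(k_B T), and Delta F the free energy difference.
      \frac{P_F(W)}{P_R(-W)} = e^{\beta\,(W - \Delta F)}
      % Integrating over W recovers the Jarzynski equality:
      \left\langle e^{-\beta W} \right\rangle_F = e^{-\beta \Delta F}

    In a Gaussian-mixture ansatz, the first relation ties together the weights of the shared normal components in the forward and reverse directions, which is what allows the free energy to be estimated from fast, strongly dissipative pulls.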

  20. Asteroids@Home

    NASA Astrophysics Data System (ADS)

    Durech, Josef; Hanus, J.; Vanco, R.

    2012-10-01

    We present a new project called Asteroids@home (http://asteroidsathome.net/boinc). It is a volunteer-computing project that uses the open-source BOINC (Berkeley Open Infrastructure for Network Computing) software to distribute tasks to volunteers, who provide their computing resources. The project was created at the Astronomical Institute, Charles University in Prague, in cooperation with the Czech National Team. The scientific aim of the project is to solve the time-consuming inverse problem of shape reconstruction of asteroids from sparse-in-time photometry. The time-demanding nature of the problem comes from the fact that with sparse-in-time photometry the rotation period of an asteroid is not a priori known, and a huge parameter space must be densely scanned for the best solution. The nature of the problem makes it an ideal task to be solved by distributed computing: the period parameter space can be divided into small bins that can be scanned separately and then joined together to give the globally best solution. In the framework of the project, we process asteroid photometric data from surveys together with asteroid lightcurves, and we derive asteroid shapes and spin states. The algorithm is based on the lightcurve inversion method developed by Kaasalainen et al. (Icarus 153, 37, 2001). The enormous potential of distributed computing will also enable us to effectively process the data from future surveys (Large Synoptic Survey Telescope, Gaia mission, etc.). We also plan to process data of a synthetic asteroid population to reveal biases of the method. In our presentation, we will describe the project, show the first results (new models of asteroids), and discuss the possibilities of its further development. This work has been supported by the grant GACR P209/10/0537 of the Czech Science Foundation and by the Research Program MSM0021620860 of the Ministry of Education of the Czech Republic.
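
    The bin-and-scan strategy described above is simple to express in code. The sketch below is a toy illustration, with a hypothetical period range and a stand-in objective function in place of the real lightcurve-inversion fit; each bin is independent, which is what makes the search embarrassingly parallel.

      import math

      def period_bins(p_min=2.0, p_max=100.0, n_bins=1000):
          """Split the rotation-period range [p_min, p_max] (hours) into work units."""
          step = (p_max - p_min) / n_bins
          return [(p_min + i * step, p_min + (i + 1) * step) for i in range(n_bins)]

      def scan_bin(lo, hi, n_trial=50):
          """Densely sample trial periods in [lo, hi) and keep the best fit.
          The quadratic toy objective stands in for the real chi-square of a
          lightcurve-inversion model against the photometry."""
          best = (math.inf, None)
          for k in range(n_trial):
              period = lo + (hi - lo) * k / n_trial
              chi2 = (period - 37.3) ** 2  # toy objective with a known optimum
              best = min(best, (chi2, period))
          return best

      # Bins can be farmed out to volunteer hosts; joining the per-bin minima
      # yields the globally best solution.
      print(min(scan_bin(lo, hi) for lo, hi in period_bins()))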

  1. Measurements of spectral optical properties and their relation to biogeochemical variables and processes in Crater Lake, Crater Lake National Park, OR

    USGS Publications Warehouse

    Boss, E.S.; Collier, R.; Larson, G.; Fennel, K.; Pegau, W.S.

    2007-01-01

    Spectral inherent optical properties (IOPs) have been measured at Crater Lake, OR, an extremely clear sub-alpine lake. Indeed, pure water IOPs are major contributors to the total IOPs, and thus to the color of the lake. Variations in the spatial distribution of IOPs were observed in June and September 2001, and reflect biogeochemical processes in the lake. Absorption by colored dissolved organic material increases with depth and between June and September in the upper 300 m. This pattern is consistent with a net release of dissolved organic materials from primary and secondary production through the summer and its photo-oxidation near the surface. Waters fed by a tributary near the lake's rim exhibited low levels of absorption by dissolved organic materials. Scattering is mostly dominated by organic particulate material, though inorganic material is found to enter the lake from the rim following a rain storm. Several similarities to oceanic oligotrophic regions are observed: (a) the beam attenuation correlates well with particulate organic material (POM) and the relationship is similar to that observed in the open ocean; (b) the specific absorption of colored dissolved organic material has a value similar to that of open ocean humic material; (c) the distribution of chlorophyll with depth does not follow the distribution of particulate organic material due to photo-acclimation, resulting in a subsurface pigment maximum located about 50 m below the POM maximum. © 2007 Springer Science+Business Media B.V.

  2. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  3. The effect of wind mixing on the vertical distribution of buoyant plastic debris

    NASA Astrophysics Data System (ADS)

    Kukulka, T.; Proskurowski, G.; Morét-Ferguson, S.; Meyer, D. W.; Law, K. L.

    2012-04-01

    Micro-plastic marine debris is widely distributed in vast regions of the subtropical gyres and has emerged as a major open ocean pollutant. The fate and transport of plastic marine debris is governed by poorly understood geophysical processes, such as ocean mixing within the surface boundary layer. Based on profile observations and a one-dimensional column model, we demonstrate that plastic debris is vertically distributed within the upper water column due to wind-driven mixing. These results suggest that total oceanic plastics concentrations are significantly underestimated by traditional surface measurements, requiring a reinterpretation of existing plastic marine debris data sets. A geophysical approach must be taken in order to properly quantify and manage this form of marine pollution.
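
    A minimal sketch of the physics behind such a one-dimensional column model, in its standard form rather than the authors' exact formulation: at steady state the upward flux from particle buoyancy balances the downward turbulent flux, giving an exponential profile.

      % z: depth (positive downward); N(z): debris concentration;
      % w_b: buoyant rise speed; A: wind-driven eddy diffusivity.
      w_b\, N(z) + A\, \frac{dN}{dz} = 0
      \quad\Longrightarrow\quad
      N(z) = N(0)\, e^{-w_b z / A}
      % Surface tows sample only the top of this profile, so stronger winds
      % (larger A) imply larger corrections to surface-derived totals.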

  4. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    NASA Astrophysics Data System (ADS)

    Wright, Nicholas C.; Polashenski, Chris M.

    2018-04-01

    Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.

  5. The mass-ratio and eccentricity distributions of barium and S stars, and red giants in open clusters

    NASA Astrophysics Data System (ADS)

    Van der Swaelmen, M.; Boffin, H. M. J.; Jorissen, A.; Van Eck, S.

    2017-01-01

    Context. A complete set of orbital parameters for barium stars, including the longest orbits, has recently been obtained thanks to a radial-velocity monitoring with the HERMES spectrograph installed on the Flemish Mercator telescope. Barium stars are supposed to belong to post-mass-transfer systems. Aims: In order to identify diagnostics distinguishing between pre- and post-mass-transfer systems, the properties of barium stars (more precisely their mass-function distribution and their period-eccentricity (P-e) diagram) are compared to those of binary red giants in open clusters. As a side product, we aim to identify possible post-mass-transfer systems among the cluster giants from the presence of s-process overabundances. We investigate the relation between the s-process enrichment, the location in the (P-e) diagram, and the cluster metallicity and turn-off mass. Methods: To invert the mass-function distribution and derive the mass-ratio distribution, we used the method pioneered by Boffin et al. (1992) that relies on a Richardson-Lucy deconvolution algorithm. The derivation of s-process abundances in the open-cluster giants was performed through spectral synthesis with MARCS model atmospheres. Results: A fraction of 22% of post-mass-transfer systems is found among the cluster binary giants (with companion masses between 0.58 and 0.87 M⊙, typical for white dwarfs), and these systems occupy a wider area than barium stars in the (P-e) diagram. Barium stars have on average lower eccentricities at a given orbital period. When the sample of binary giant stars in clusters is restricted to the subsample of systems occupying the same locus as the barium stars in the (P-e) diagram, and with a mass function compatible with a WD companion, 33% (=4/12) show a chemical signature of mass transfer in the form of s-process overabundances (from rather moderate - about 0.3 dex - to more extreme - about 1 dex). The only strong barium star in our sample is found in the cluster with the lowest metallicity in the sample (i.e., star 173 in NGC 2420, with [Fe/H] = -0.26), whereas the barium stars with mild s-process abundance anomalies (from 0.25 to 0.6 dex) are found in the clusters with slightly subsolar metallicities. Our finding confirms the classical prediction that s-process nucleosynthesis is more efficient at low metallicities, since the s-process overabundance is not clearly correlated with the cluster turn-off (TO) mass; such a correlation would instead hint at the importance of the dilution factor. We also find a mild barium star in NGC 2335, a cluster with a large TO mass of 4.3 M⊙, which implies that asymptotic giant branch stars that massive still operate the s-process and the third dredge-up. Based on observations made with the Mercator Telescope, operated on the island of La Palma by the Flemish Community, at the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofisica de Canarias, and on observations made with the HARPS spectrograph installed on the 3.6 m telescope at the European Southern Observatory.

  6. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source at www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman serves as the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and the standardization impact of open-source rasdaman, as well as experiences made.
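
    Array queries of this kind are typically shipped to the server over HTTP. The sketch below posts a WCPS expression through a WCS ProcessCoverages request, the pattern rasdaman's web service endpoint accepts; the host, coverage name, and subset bounds are hypothetical.

      import requests

      # Hypothetical coverage and bounds; the query itself follows OGC WCPS syntax.
      query = """
      for c in (AvgTemperature)
      return encode(c[Lat(50:55), Long(5:10)], "image/png")
      """

      resp = requests.post(
          "https://example.org/rasdaman/ows",
          data={"service": "WCS", "version": "2.0.1",
                "request": "ProcessCoverages", "query": query},
          timeout=60,
      )
      resp.raise_for_status()
      with open("subset.png", "wb") as f:
          f.write(resp.content)  # server-side subsetted and encoded array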

  7. iBiology: communicating the process of science.

    PubMed

    Goodwin, Sarah S

    2014-08-01

    The Internet hosts an abundance of science video resources aimed at communicating scientific knowledge, including webinars, massive open online courses, and TED talks. Although these videos are efficient at disseminating information for diverse types of users, they often do not demonstrate the process of doing science, the excitement of scientific discovery, or how new scientific knowledge is developed. iBiology (www.ibiology.org), a project that creates open-access science videos about biology research and science-related topics, seeks to fill this need by producing videos by science leaders that make their ideas, stories, and experiences available to anyone with an Internet connection. © 2014 Goodwin. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. Numerical Study of Hydrothermal Wave Suppression in Thermocapillary Flow Using a Predictive Control Method

    NASA Astrophysics Data System (ADS)

    Muldoon, F. H.

    2018-04-01

    Hydrothermal waves in flows driven by thermocapillary and buoyancy effects are suppressed by applying a predictive control method. Hydrothermal waves arise in the manufacturing of crystals, including the "open boat" crystal growth process, and lead to undesirable impurities in crystals. The open boat process is modeled using the two-dimensional unsteady incompressible Navier-Stokes equations under the Boussinesq approximation and the linear approximation of the surface thermocapillary force. The flow is controlled by a spatially and temporally varying heat flux density through the free surface. The heat flux density is determined by a conjugate gradient optimization algorithm. The gradient of the objective function with respect to the heat flux density is found by solving adjoint equations derived from the Navier-Stokes ones in the Boussinesq approximation. Special attention is given to heat flux density distributions over small free-surface areas and to the maximum admissible heat flux density.

  9. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points: - The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline) - The challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned) - Code changes - Software-license-related challenges - Storage requirements - System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission). Software in use and software licenses: IDL - used for pre- and post-processing of data; licensed under a proprietary software license held by Exelis. Apache OODT - used for data management and workflow processing; licensed under the Apache License Version 2. GDAL - geospatial data processing library, currently used for data re-projection; licensed under the X/MIT license. GeoServer - WMS server; licensed under the General Public License Version 2.0. Leaflet.js - JavaScript web mapping library; licensed under the Berkeley Software Distribution License. Python - glue code and miscellaneous data processing support; licensed under the Python Software Foundation License. Perl - script wrapper for running the SCAG algorithm; licensed under the General Public License Version 3. PHP - front-end web application programming; licensed under the PHP License Version 3.01.

  10. Open star clusters and Galactic structure

    NASA Astrophysics Data System (ADS)

    Joshi, Yogesh C.

    2018-04-01

    In order to understand the Galactic structure, we perform a statistical analysis of the distribution of various cluster parameters based on an almost complete sample of the Galactic open clusters available to date. The geometrical and physical characteristics of a large number of open clusters given in the MWSC catalogue are used to study the spatial distribution of clusters in the Galaxy and to determine the scale height, solar offset, local mass density and the distribution of reddening material in the solar neighbourhood. We also explore the mass-radius and mass-age relations in Galactic open star clusters. We find that the estimated parameters of the Galactic disk are largely influenced by the choice of cluster sample.
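
    The scale height and solar offset mentioned here are conventionally obtained by fitting an exponential profile to the vertical distribution of clusters; a standard form (not quoted from the paper) is:

      % N(z): density of clusters at height z measured from the Sun's position;
      % the fitted peak z_0 gives the solar offset and h_z the scale height.
      N(z) = N_0 \exp\!\left( -\frac{|z - z_0|}{h_z} \right)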

  11. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
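
    The Monte Carlo step of such an analysis, sampling the uncertain kinetic parameters and pushing each sample through the model, together with SRC-based sensitivity ranking, can be sketched generically. The toy model and parameter ranges below are assumptions; a real study would propagate through the population balance model of the crystallizer.

      import numpy as np

      rng = np.random.default_rng(42)

      def crystal_model(kb, g):
          """Toy stand-in returning a scalar output (say, mean crystal size)
          from nucleation-order kb and growth-order g parameters."""
          return kb**0.3 * g**1.2

      # Propagate input uncertainty by Monte Carlo sampling.
      N = 10_000
      kb = rng.uniform(1.5, 2.5, N)  # hypothetical uncertainty ranges
      g = rng.uniform(0.8, 1.2, N)
      out = crystal_model(kb, g)
      print(f"mean {out.mean():.3f}, 95% interval "
            f"[{np.percentile(out, 2.5):.3f}, {np.percentile(out, 97.5):.3f}]")

      # Standardized regression coefficients (SRC) rank parameter importance.
      X = np.column_stack([kb, g])
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      ys = (out - out.mean()) / out.std()
      src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
      print("SRC (kb, g):", np.round(src, 3))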

  12. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper involves easily accessible, integrated web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine, without much effort, their own data with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software, comprising a complete package of image processing, spatial analysis and digital mapping, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been done since 2007 in the context of 52°North, which is an open initiative that advances the development of cutting-edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes available satellite images via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins which convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on their machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of building plug-ins, and outline our plans to implement other OGC standards, such as WCS and WPS, in the same context. The latter, especially, can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable, on-demand cloud computing environments.

  13. Data on nearshore wave process and surficial beach deposits, central Tamil Nadu coast, India.

    PubMed

    Joevivek, V; Chandrasekar, N

    2017-08-01

    The chronicles of nearshore morphology and surficial beach deposits provide valuable information about the nature of the beach condition and the depositional environment. They impart an understanding of the spatial and temporal relationship of nearshore waves and their influence over the distribution of beach sediments. This article contains data about the wave and sediment dynamics of ten sandy beaches along the central Tamil Nadu coast, India. The present dataset comprises nearshore wave parameters, breaker wave type, beach morphodynamic state, grain size distribution, and the weight percentages of heavy and light mineral distributions. The dataset characterizes the beach morphology and hydrodynamic conditions with respect to the different monsoonal seasons and will act as a field reference for understanding coastal dynamics in open sea conditions. The nearshore entities were obtained from an intensive field survey between January 2011 and December 2011, while the characteristics of the beach sediments were examined by chemical processing in the laboratory.

  14. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD-interferometry-based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
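
    A predefined TCP/IP message set of the kind described can be as simple as length-prefixed JSON. The sketch below is a generic pattern, not the MIT system's actual wire format; the message fields are hypothetical.

      import json
      import socket
      import struct

      def send_msg(sock: socket.socket, msg: dict) -> None:
          """Send one JSON message prefixed with a 4-byte big-endian length."""
          payload = json.dumps(msg).encode()
          sock.sendall(struct.pack(">I", len(payload)) + payload)

      def recv_msg(sock: socket.socket) -> dict:
          """Receive one length-prefixed JSON message."""
          (length,) = struct.unpack(">I", sock.recv(4, socket.MSG_WAITALL))
          return json.loads(sock.recv(length, socket.MSG_WAITALL))

      # Hypothetical exchange between a remote client and the cell controller:
      # send_msg(sock, {"type": "SET_RECIPE", "step": "etch", "rf_power_w": 300})
      # reply = recv_msg(sock)  # e.g. {"type": "ACK", "status": "ok"}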

  15. Weighted Distances in Scale-Free Configuration Models

    NASA Astrophysics Data System (ADS)

    Adriaans, Erwin; Komjáthy, Júlia

    2018-01-01

    In this paper we study first-passage percolation in the configuration model with empirical degree distribution that follows a power law with exponent τ ∈ (2, 3). We assign independent and identically distributed (i.i.d.) weights to the edges of the graph. We investigate the weighted distance (the length of the shortest weighted path) between two uniformly chosen vertices, called the typical distance. When the underlying age-dependent branching process approximating the local neighborhoods of vertices produces infinitely many individuals in finite time (a so-called explosive branching process), Baroni, Hofstad and the second author showed in Baroni et al. (J Appl Probab 54(1):146-164, 2017) that typical distances converge in distribution to a bounded random variable. The order of magnitude of typical distances remained open for the τ ∈ (2, 3) case when the underlying branching process is not explosive. We close this gap by determining the first order of magnitude of typical distances in this regime for arbitrary, not necessarily continuous edge-weight distributions that produce a non-explosive age-dependent branching process with infinite-mean power-law offspring distributions. This sequence tends to infinity with the number of vertices and, by choosing an appropriate weight distribution, can be tuned to be any growing function that is O(log log n), where n is the number of vertices in the graph. We show that the result remains valid for the erased configuration model as well, where we delete loops and any second and further edges between two vertices.

  16. Method of determining interwell oil field fluid saturation distribution

    DOEpatents

    Donaldson, Erle C.; Sutterfield, F. Dexter

    1981-01-01

    A method of determining the oil and brine saturation distribution in an oil field by taking electrical current and potential measurements among a plurality of open-hole wells geometrically distributed throughout the oil field. Poisson's equation is utilized to develop fluid saturation distributions from the electrical current and potential measurements. Both signal generating equipment and chemical means are used to develop current flow among the several open-hole wells.
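
    The governing relation referenced in the patent can be written in the standard potential-field form (a generic statement, with the conductivity carrying the saturation dependence):

      % phi: electric potential; sigma(r): conductivity set by the local
      % oil/brine saturation; I: current injected at well location r_s.
      \nabla \cdot \left( \sigma(\mathbf{r})\, \nabla \phi(\mathbf{r}) \right)
        = -I\, \delta(\mathbf{r} - \mathbf{r}_s)
      % Potentials measured at the open-hole wells constrain sigma(r), and
      % hence the interwell saturation distribution.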

  17. Explosion-assisted preparation of dispersed gold-bearing different-grade ore for selective mining

    NASA Astrophysics Data System (ADS)

    Trubachev, AI; Zykov, NV

    2017-02-01

    It is found that an ore body contains transitional zones (between on-grade and off-grade ore areas) with correspondingly intermediate content of the useful component, and a variant of explosive treatment of such zones before selective mining is put forward. The practicability of two processing technologies is evaluated: processing of high-grade and low-grade ore from the transitional zones, and heap leaching of metals from the low-grade and impoverished ore. The open mining technology is a conventional truck-and-shovel scheme, with ore flows distributed to the processing plant and/or to heap leaching, which generally enhances the mine efficiency.

  18. Grid Integrated Distributed PV (GridPV) Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.
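
    The toolbox itself is MATLAB, but the same command-and-retrieve pattern can be sketched from Python through OpenDSS's COM automation interface (Windows-only). The feeder file and PV system definition below are hypothetical, and the property names reflect the OpenDSS COM API as commonly documented; treat this as a sketch rather than GridPV itself.

      import win32com.client  # requires an OpenDSS installation (COM server)

      dss = win32com.client.Dispatch("OpenDSSEngine.DSS")
      dss.Start(0)
      text = dss.Text

      # Hypothetical feeder model; compile it, add a PV system, and solve.
      text.Command = r"compile C:\feeders\example13bus.dss"
      text.Command = "new PVSystem.pv1 bus1=675 phases=3 kv=4.16 kva=500 pmpp=500"
      dss.ActiveCircuit.Solution.Solve()

      # Inspect the PV impact via per-unit bus voltage magnitudes.
      vpu = dss.ActiveCircuit.AllBusVmagPu
      print(min(vpu), max(vpu))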

  19. Measurements of Pressure Distributions and Force Coefficients in a Squeeze Film Damper. Part 1: Fully Open Ended Configuration

    NASA Technical Reports Server (NTRS)

    Jung, S. Y.; Sanandres, Luis A.; Vance, J. M.

    1991-01-01

    Measurements of pressure distributions and force coefficients were carried out in two types of squeeze film dampers executing a circular centered orbit, an open-ended configuration and a partially sealed one, in order to investigate the effect of fluid inertia and cavitation on pressure distributions and force coefficients. Dynamic pressure measurements were carried out for two orbit radii, ε = 0.5 and 0.8. It was found that the partially sealed configuration was less influenced by fluid inertia than the open-ended configuration.

  20. OpenSim: A Flexible Distributed Neural Network Simulator with Automatic Interactive Graphics.

    PubMed

    Jarosch, Andreas; Leber, Jean Francois

    1997-06-01

    An object-oriented simulator called OpenSim is presented that achieves a high degree of flexibility by relying on a small set of building blocks. The state variables and algorithms put in this framework can easily be accessed through a command shell. This allows one to distribute a large-scale simulation over several workstations and to generate the interactive graphics automatically. OpenSim opens new possibilities for cooperation among Neural Network researchers. Copyright 1997 Elsevier Science Ltd.

  1. OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.

    PubMed

    Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2013-02-15

    Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
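
    For orientation, the snippet below shows the kind of term lookup OntoMaton performs, issued directly against the public NCBO BioPortal REST search endpoint; the API key is a placeholder, and the response fields follow the BioPortal documentation as we understand it.

      import requests

      resp = requests.get(
          "https://data.bioontology.org/search",
          params={"q": "mitochondrion", "apikey": "YOUR-BIOPORTAL-KEY"},  # placeholder key
          timeout=30,
      )
      resp.raise_for_status()
      for hit in resp.json().get("collection", [])[:3]:
          print(hit.get("prefLabel"), hit.get("@id"))   # matched term and its IRI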

  2. Prediction of biodiversity hotspots in the Anthropocene: The case of veteran oaks.

    PubMed

    Skarpaas, Olav; Blumentrath, Stefan; Evju, Marianne; Sverdrup-Thygeson, Anne

    2017-10-01

    Over the past centuries, humans have transformed large parts of the biosphere, and there is a growing need to understand and predict the distribution of biodiversity hotspots influenced by the presence of humans. Our basic hypothesis is that human influence in the Anthropocene is ubiquitous, and we predict that biodiversity hot spot modeling can be improved by addressing three challenges raised by the increasing ecological influence of humans: (i) anthropogenically modified responses to individual ecological factors, (ii) fundamentally different processes and predictors in landscape types shaped by different land use histories and (iii) a multitude and complexity of natural and anthropogenic processes that may require many predictors and even multiple models in different landscape types. We modeled the occurrence of veteran oaks in Norway, and found, in accordance with our basic hypothesis and predictions, that humans influence the distribution of veteran oaks throughout its range, but in different ways in forests and open landscapes. In forests, geographical and topographic variables related to the oak niche are still important, but the occurrence of veteran oaks is shifted toward steeper slopes, where logging is difficult. In open landscapes, land cover variables are more important, and veteran oaks are more common toward the north than expected from the fundamental oak niche. In both landscape types, multiple predictor variables representing ecological and human-influenced processes were needed to build a good model, and several models performed almost equally well. Models accounting for the different anthropogenic influences on landscape structure and processes consistently performed better than models based exclusively on natural biogeographical and ecological predictors. Thus, our results for veteran oaks clearly illustrate the challenges to distribution modeling raised by the ubiquitous influence of humans, even in a moderately populated region, but also show that predictions can be improved by explicitly addressing these anthropogenic complexities.

  3. A Hybrid Approach to Composite Damage and Failure Analysis Combining Synergistic Damage Mechanics and Peridynamics

    DTIC Science & Technology

    2017-12-31

    random radial displacement a fiber is given in simulation of the manufacturing process. As seen in the figure, the crack driving force increases...will incorporate voids along with irregular fiber distributions as consequences of composite manufacturing. The crack opening displacement in the as...subjected to 1 MPa pressure (ANSYS does not allow the, mathematically equivalent, tensile stresses applied at both ends without any displacement constraints)

  4. Exciton Hybridisation in Organic-Inorganic Semiconductor Microcavities

    DTIC Science & Technology

    2002-02-01

    hybridizing organic and inorganic semiconductors in microcavities to produce a highly efficient light source that could be either a laser or a very efficient...such process may also have an important effect on the spectral distribution of photoluminescence from the microcavity and can be considered as a...Absorption (solid dots) and photoluminescence emission (open circles) of a thin film of J-aggregated cyanine dyes in a PVA matrix. Note, the chemical

  5. Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation

    DTIC Science & Technology

    2015-03-26

    Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of...correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of “big data” daily. When attempting to process...open source implementation of Google’s MapReduce programming paradigm [13] which has been used for many different things. Using Apache Hadoop, Yahoo

  6. Introduction: The SERENITY vision

    NASA Astrophysics Data System (ADS)

    Maña, Antonio; Spanoudakis, George; Kokolakis, Spyros

    In this chapter we present an overview of the SERENITY approach. We describe the SERENITY model of secure and dependable applications and show how it addresses the challenge of developing, integrating and dynamically maintaining security and dependability mechanisms in open, dynamic, distributed and heterogeneous computing systems and in particular Ambient Intelligence scenarios. The chapter describes the basic concepts used in the approach and introduces the different processes supported by SERENITY, along with the tools provided.

  7. Phylogenetic evidence that both ancient vicariance and dispersal have contributed to the biogeographic patterns of anchialine cave shrimps.

    PubMed

    Jurado-Rivera, José A; Pons, Joan; Alvarez, Fernando; Botello, Alejandro; Humphreys, William F; Page, Timothy J; Iliffe, Thomas M; Willassen, Endre; Meland, Kenneth; Juan, Carlos; Jaume, Damià

    2017-06-06

    Cave shrimps from the genera Typhlatya, Stygiocaris and Typhlopatsa (Atyidae) are restricted to specialised coastal subterranean habitats or nearby freshwaters and have a highly disconnected distribution (Eastern Pacific, Caribbean, Atlantic, Mediterranean, Madagascar, Australia). The combination of a wide distribution and a limited dispersal potential suggests that a large-scale process has generated this geographic pattern. Fragmentation of ancestral ranges by tectonic plates (vicariance) has often been assumed to be this process, with the biota as passive passengers on continental blocks. The ancestors of these cave shrimps are believed to have inhabited the ancient Tethys Sea, with three particular geological events hypothesised to have led to their isolation and divergence: (1) the opening of the Atlantic Ocean, (2) the breakup of Gondwana, and (3) the closure of the Tethys Seaway. We test the relative contributions of vicariance and dispersal in the evolutionary history of this group using mitochondrial genomes to reconstruct phylogenetic and biogeographic scenarios with fossil-based calibrations. Given that the Australia/Madagascar shrimp divergence postdates the Gondwanan breakup, our results suggest both vicariance (the Atlantic opening) and dispersal. The Tethys closure appears not to have been influential; however, we hypothesise that changing marine currents had an important early influence on their biogeography.

  8. The textual characteristics of traditional and Open Access scientific journals are similar.

    PubMed

    Verspoor, Karin; Cohen, K Bretonnel; Hunter, Lawrence

    2009-06-15

    Recent years have seen an increased amount of natural language processing (NLP) work on full text biomedical journal publications. Much of this work is done with Open Access journal articles. Such work assumes that Open Access articles are representative of biomedical publications in general and that methods developed for analysis of Open Access full text publications will generalize to the biomedical literature as a whole. If this assumption is wrong, the cost to the community will be large, including not just wasted resources, but also flawed science. This paper examines that assumption. We collected two sets of documents, one consisting only of Open Access publications and the other consisting only of traditional journal publications. We examined them for differences in surface linguistic structures that have obvious consequences for the ease or difficulty of natural language processing and for differences in semantic content as reflected in lexical items. Regarding surface linguistic structures, we examined the incidence of conjunctions, negation, passives, and pronominal anaphora, and found that the two collections did not differ. We also examined the distribution of sentence lengths and found that both collections were characterized by the same mode. Regarding lexical items, we found that the Kullback-Leibler divergence between the two collections was low, and was lower than the divergence between either collection and a reference corpus. Where small differences did exist, log likelihood analysis showed that they were primarily in the area of formatting and in specific named entities. We did not find structural or semantic differences between the Open Access and traditional journal collections.
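
    As a minimal sketch of the lexical comparison described (not the authors' code), the function below computes the Kullback-Leibler divergence between the word distributions of two toy collections, with add-one smoothing over the shared vocabulary.

      import math
      from collections import Counter

      def kl_divergence(corpus_a, corpus_b):
          """KL(P_a || P_b) in bits, with add-one smoothing."""
          ca, cb = Counter(corpus_a), Counter(corpus_b)
          vocab = set(ca) | set(cb)
          na = sum(ca.values()) + len(vocab)
          nb = sum(cb.values()) + len(vocab)
          return sum(
              (ca[w] + 1) / na * math.log2(((ca[w] + 1) / na) / ((cb[w] + 1) / nb))
              for w in vocab
          )

      open_access = "the protein binds the receptor".split()
      traditional = "the enzyme cleaves the substrate".split()
      print(kl_divergence(open_access, traditional))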

  9. Open Source Cloud-Based Technologies for Bim

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups for complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been growing rapidly, and such tools are increasingly used together. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages; these plug-ins will be distributed and made freely available to a large community of professionals. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  10. The textual characteristics of traditional and Open Access scientific journals are similar

    PubMed Central

    Verspoor, Karin; Cohen, K Bretonnel; Hunter, Lawrence

    2009-01-01

    Background Recent years have seen an increased amount of natural language processing (NLP) work on full text biomedical journal publications. Much of this work is done with Open Access journal articles. Such work assumes that Open Access articles are representative of biomedical publications in general and that methods developed for analysis of Open Access full text publications will generalize to the biomedical literature as a whole. If this assumption is wrong, the cost to the community will be large, including not just wasted resources, but also flawed science. This paper examines that assumption. Results We collected two sets of documents, one consisting only of Open Access publications and the other consisting only of traditional journal publications. We examined them for differences in surface linguistic structures that have obvious consequences for the ease or difficulty of natural language processing and for differences in semantic content as reflected in lexical items. Regarding surface linguistic structures, we examined the incidence of conjunctions, negation, passives, and pronominal anaphora, and found that the two collections did not differ. We also examined the distribution of sentence lengths and found that both collections were characterized by the same mode. Regarding lexical items, we found that the Kullback-Leibler divergence between the two collections was low, and was lower than the divergence between either collection and a reference corpus. Where small differences did exist, log likelihood analysis showed that they were primarily in the area of formatting and in specific named entities. Conclusion We did not find structural or semantic differences between the Open Access and traditional journal collections. PMID:19527520

  11. Pinhole induced efficiency variation in perovskite solar cells

    NASA Astrophysics Data System (ADS)

    Agarwal, Sumanshu; Nair, Pradeep R.

    2017-10-01

    Process induced efficiency variation is a major concern for all thin film solar cells, including the emerging perovskite based solar cells. In this article, we address the effect of pinholes or process induced surface coverage aspects on the efficiency of such solar cells through detailed numerical simulations. Interestingly, we find that the pinhole size distribution affects the short circuit current and open circuit voltage in contrasting manners. Specifically, while the J_SC is heavily dependent on the pinhole size distribution, surprisingly, the V_OC seems to be only nominally affected by it. Further, our simulations also indicate that, with appropriate interface engineering, it is indeed possible to design a nanostructured device with efficiencies comparable to those of ideal planar structures. Additionally, we propose a simple technique based on terminal I-V characteristics to estimate the surface coverage in perovskite solar cells.

  12. SeqPig: simple and scalable scripting for large sequencing data sets in Hadoop

    PubMed Central

    Schumacher, André; Pireddu, Luca; Niemenmaa, Matti; Kallio, Aleksi; Korpelainen, Eija; Zanetti, Gianluigi; Heljanko, Keijo

    2014-01-01

    Summary: Hadoop MapReduce-based approaches have become increasingly popular due to their scalability in processing large sequencing datasets. However, as these methods typically require in-depth expertise in Hadoop and Java, they are still out of reach of many bioinformaticians. To solve this problem, we have created SeqPig, a library and a collection of tools to manipulate, analyze and query sequencing datasets in a scalable and simple manner. SeqPig scripts use the Hadoop-based distributed scripting engine Apache Pig, which automatically parallelizes and distributes data processing tasks. We demonstrate SeqPig’s scalability over many computing nodes and illustrate its use with example scripts. Availability and Implementation: Available under the open source MIT license at http://sourceforge.net/projects/seqpig/ Contact: andre.schumacher@yahoo.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24149054

  13. IDCDACS: IDC's Distributed Application Control System

    NASA Astrophysics Data System (ADS)

    Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena

    2015-04-01

    The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose time series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted to the International Data Centre (IDC) in Vienna in near-real-time, where they are processed to locate events that may be nuclear explosions. We redesigned the distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly-scalable solution preserves the existing architecture of the IDC processing system that proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time-series data received from monitoring stations is organized in segments based on the time when the data was recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker and a relational database. All processes communicate through message queues hosted on the message broker. Persistent state information is stored in the database. A configurable processing controller instantiates and monitors all data processing applications. Due to decoupling by message queues, the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform that is based upon the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and open industry standard. IDCDACS uses high availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in (Olivieri et al. 2012, Kværna et al. 2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data and trigger logic. A possible future development would be to use the IDCDACS framework for different scientific domains, e.g. for processing of Earth observation satellite data extending the one-dimensional time-series intervals to spatio-temporal data cubes. REFERENCES Olivieri M., J. Clinton (2012) An almost fair comparison between Earthworm and SeisComp3, Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, D. A. Dodge (2012) Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions, Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
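
    The queue-based decoupling described above can be sketched with the pika client for RabbitMQ; this is not IDC code, and the queue name and message format are invented for illustration.

      import pika

      conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
      ch = conn.channel()
      ch.queue_declare(queue="pipeline.seismic", durable=True)

      # A "data monitor" publishes a time interval once its trigger condition
      # (data availability, elapsed time, ...) is met.
      ch.basic_publish(
          exchange="",
          routing_key="pipeline.seismic",
          body=b"interval:2015-04-01T00:00/PT10M",
          properties=pika.BasicProperties(delivery_mode=2),  # persist the message
      )

      # The processing controller consumes intervals and launches applications.
      def on_interval(channel, method, properties, body):
          print("processing", body.decode())
          channel.basic_ack(delivery_tag=method.delivery_tag)

      ch.basic_consume(queue="pipeline.seismic", on_message_callback=on_interval)
      ch.start_consuming()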

  14. Ray Meta: scalable de novo metagenome assembly and profiling

    PubMed Central

    2012-01-01

    Voluminous parallel sequencing datasets, especially metagenomic experiments, require distributed computing for de novo assembly and taxonomic profiling. Ray Meta is a massively distributed metagenome assembler that is coupled with Ray Communities, which profiles microbiomes based on uniquely-colored k-mers. It can accurately assemble and profile a three billion read metagenomic experiment representing 1,000 bacterial genomes of uneven proportions in 15 hours with 1,024 processor cores, using only 1.5 GB per core. The software will facilitate the processing of large and complex datasets, and will help in generating biological insights for specific environments. Ray Meta is open source and available at http://denovoassembler.sf.net. PMID:23259615
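
    As a toy, single-node illustration of the k-mer idea (Ray Communities itself colors k-mers across a distributed de Bruijn graph), the snippet below matches a read's k-mers against a small table of k-mers labeled by genome of origin.

      from collections import Counter

      def kmers(read: str, k: int = 21):
          return (read[i:i + k] for i in range(len(read) - k + 1))

      colored = {"ACGTACGTACGTACGTACGTA": "genome_1"}   # toy "colored" k-mer table
      read = "ACGTACGTACGTACGTACGTACGT"
      print(Counter(colored.get(km, "unknown") for km in kmers(read)))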

  15. Lattice QCD Studies of Transverse Momentum-Dependent Parton Distribution Functions

    NASA Astrophysics Data System (ADS)

    Engelhardt, M.; Musch, B.; Hägler, P.; Negele, J.; Schäfer, A.

    2015-09-01

    Transverse momentum-dependent parton distributions (TMDs) relevant for semi-inclusive deep inelastic scattering and the Drell-Yan process can be defined in terms of matrix elements of a quark bilocal operator containing a staple-shaped gauge link. Such a definition opens the possibility of evaluating TMDs within lattice QCD. By parametrizing the aforementioned matrix elements in terms of invariant amplitudes, the problem can be cast in a Lorentz frame suited for the lattice calculation. Results for selected TMD observables are presented, including a particular focus on their dependence on a Collins-Soper-type evolution parameter, which quantifies proximity of the staple-shaped gauge links to the light cone.
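
    Schematically, and suppressing Wilson-line and soft-factor subtleties, the correlator in question has the standard light-front form (Γ a Dirac matrix, U the staple-shaped gauge link):

      \Phi^{[\Gamma]}(x, \mathbf{k}_T) =
        \int \frac{d\xi^{-}\, d^{2}\xi_T}{(2\pi)^{3}}\;
        e^{i k \cdot \xi}\,
        \langle P, S \,|\, \bar{q}(0)\, \Gamma\, \mathcal{U}_{[0,\xi]}\, q(\xi) \,|\, P, S \rangle
        \Big|_{\xi^{+}=0,\; k^{+}=xP^{+}}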

  16. Integrating Data Distribution and Data Assimilation Between the OOI CI and the NOAA DIF

    NASA Astrophysics Data System (ADS)

    Meisinger, M.; Arrott, M.; Clemesha, A.; Farcas, C.; Farcas, E.; Im, T.; Schofield, O.; Krueger, I.; Klacansky, I.; Orcutt, J.; Peach, C.; Chave, A.; Raymer, D.; Vernon, F.

    2008-12-01

    The Ocean Observatories Initiative (OOI) is an NSF-funded program to establish the ocean observing infrastructure of the 21st century benefiting research and education. It is currently approaching final design and promises to deliver cyber and physical observatory infrastructure components as well as substantial core instrumentation to study environmental processes of the ocean at various scales, from coastal shelf-slope exchange processes to the deep ocean. The OOI's data distribution network lies at the heart of its cyberinfrastructure, which enables a multitude of science and education applications, ranging from data analysis, to processing, visualization and ontology supported query and mediation. In addition, it fundamentally supports a class of applications exploiting the knowledge gained from analyzing observational data for objective-driven ocean observing applications, such as automatically triggered response to episodic environmental events and interactive instrument tasking and control. The U.S. Department of Commerce through NOAA operates the Integrated Ocean Observing System (IOOS) providing continuous data in various formats, rates and scales on open oceans and coastal waters to scientists, managers, businesses, governments, and the public to support research and inform decision-making. The NOAA IOOS program initiated development of the Data Integration Framework (DIF) to improve management and delivery of an initial subset of ocean observations with the expectation of achieving improvements in a select set of NOAA's decision-support tools. Both OOI and NOAA through DIF collaborate on an effort to integrate the data distribution, access and analysis needs of both programs. We present details and early findings from this collaboration; one part of it is the development of a demonstrator combining web-based user access to oceanographic data through ERDDAP, efficient science data distribution, and scalable, self-healing deployment in a cloud computing environment. ERDDAP is a web-based front-end application integrating oceanographic data sources of various formats, for instance netCDF data files aggregated through NcML or presented using a THREDDS server. The OOI-designed data distribution network provides global traffic management and computational load balancing for observatory resources; it makes use of the OPeNDAP Data Access Protocol (DAP) for efficient canonical science data distribution over the network. A cloud computing strategy is the basis for scalable, self-healing organization of an observatory's computing and storage resources, independent of the physical location and technical implementation of these resources.
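
    To make the DAP-based access pattern concrete, here is a hedged Python sketch using the netCDF4 library, which can read OPeNDAP URLs directly when built with DAP support; the URL and variable name are placeholders, not a real OOI or NOAA endpoint.

      from netCDF4 import Dataset

      url = "http://example.org/thredds/dodsC/ocean/sst_aggregation.ncml"  # hypothetical
      ds = Dataset(url)                    # netCDF4 speaks DAP for http(s) URLs
      sst = ds.variables["sst"]            # metadata only; no bulk transfer yet
      print(sst.shape, sst[0, :10, :10])   # only the requested slice crosses the wire
      ds.close()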

  17. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Treesearch

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  18. The water-filled versus air-filled status of vessels cut open in air: the 'Scholander assumption' revisited

    Treesearch

    M.T. Tyree; H. Cochard; P. Cruziat

    2003-01-01

    When petioles of transpiring leaves are cut in the air, according to the 'Scholander assumption', the vessels cut open should fill with air as the water is drained away by continued transpiration. The distribution of air-filled vessels versus distance from the cut surface should match the distribution of lengths of 'open vessels', i.e. vessels cut...

  19. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James; hide

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
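
    The scheduling idea, read and preprocess each scene once, then fan out to every registered analytic, can be sketched in a few lines of Python; this is a conceptual sketch, not Project Matsu code.

      from typing import Callable, Iterable

      analytics: list[Callable] = []

      def register(fn: Callable) -> Callable:
          analytics.append(fn)
          return fn

      @register
      def anomaly_detector(scene):
          return {"anomalies": [px for px in scene if px > 0.9]}

      @register
      def land_cover_classifier(scene):
          return {"water_fraction": sum(px < 0.2 for px in scene) / len(scene)}

      def wheel(incoming_scenes: Iterable):
          for raw in incoming_scenes:          # scan for newly arrived data
              scene = [v / 255 for v in raw]   # preprocess once per scene
              for analytic in analytics:       # all analytics share the one pass
                  print(analytic.__name__, analytic(scene))

      wheel([[10, 240, 55], [200, 30, 250]])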

  20. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 2: numerical application

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.

  1. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.

  2. The distribution and mechanism of pore formation in copper foams fabricated by Lost Carbonate Sintering method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahzeydi, Mohammad Hosein; Parvanian, Amir Masoud; Panjepour, Masoud, E-mail: panjepour@cc.iut.ac.ir

    2016-01-15

    In this research, utilizing X-ray computed tomography (XCT), geometrical characterization, and pore formation mechanisms of highly porous copper foams manufactured by powder metallurgical (PM) process are investigated. Open-cell copper foams with porosity percentages of 60% and 80% and with a pore size within the range of 300–600 μm were manufactured by using potassium carbonate as a space holder agent via the Lost Carbonate Sintering (LCS) technique. XCT and SEM were also employed to investigate the three-dimensional structure of foams and to find the effect of the parameters of the space holders on the structural properties of copper foams. The result showed an excellent correlation between the structural properties of the foams including the size and shape of the pores, porosity percentage, volume percentage, particle size, and the shape of the sacrificial agent used. Also, the advanced image analysis of XCT images indicated fluctuations up to ± 10% in porosity distribution across different cross-sections of the foams. Simultaneous thermal analysis (STA: DTA–TG) was also used to study the thermal history of the powders used during the manufacturing process of the foams. The results indicated that the melting and thermal decomposition of the potassium carbonate occurred simultaneously at 920 °C and created the porous structure of the foams. By combining the STA result with the result of the tension analysis of cell walls, the mechanisms of open-pore formation were suggested. In fact, most open pores in the samples were formed due to the direct contact of potassium carbonate particles with each other in the green compact. Also, it was found that the thermal decomposition of potassium carbonate particles into gaseous CO2 led to the production of gas pressure inside the closed pores, which eventually caused the creation of cracks on the cell walls and the opening of the pores in the foam's structure. Highlights: • Structural characterization of copper foam produced by the LCS method is investigated by tomography images. • The ability of the LCS technique to control structural features of produced foams was proved. • The mechanisms of open-pore formation were presented.

  3. Simulation model of fatigue crack opening/closing phenomena for predicting RPG load under arbitrary stress distribution field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toyosada, M.; Niwa, T.

    1995-12-31

    In this paper, Newman's calculation model is modified to account for the effect of the change of stress distribution ahead of a crack, which he neglected, and to leave elastic-plastic material along the crack surface for compatibility with the Dugdale model. In addition to the above treatment, the authors introduce plastic shrinkage at the immediate generation of new crack surfaces, due to the release of internal forces of yield-stress magnitude, during the unloading process in the model. Moreover, the model is extended to arbitrary stress distribution fields. Using the model, the RPG load is simulated for a center-notched specimen under constant-amplitude loading with various stress ratios and decreased maximum load while keeping the minimum load constant.

  4. The physics of heavy quark distributions in hadrons: Collider tests

    NASA Astrophysics Data System (ADS)

    Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; Smiesko, J.; Tokar, S.

    2017-03-01

    We present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in the hadron wavefunction, the "intrinsic" quarks, is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ/Z/W particles, accompanied by heavy jets at large transverse momenta, can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs, including Higgs production at high x_F and novel fixed-target measurements which can be tested at the LHC.

  5. The physics of heavy quark distributions in hadrons: Collider tests

    DOE PAGES

    Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; ...

    2016-12-18

    Here, we present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in hadron wavefunction—the “intrinsic” quarks—is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ/Z/W particles, accompanied by the heavy jets at large transverse momenta can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs including Higgs production at high x_F and novel fixed target measurements which can be tested at the LHC.

  6. Analyzing Protein Clusters on the Plasma Membrane: Application of Spatial Statistical Analysis Methods on Super-Resolution Microscopy Images.

    PubMed

    Paparelli, Laura; Corthout, Nikky; Pavie, Benjamin; Annaert, Wim; Munck, Sebastian

    2016-01-01

    The spatial distribution of proteins within the cell affects their capability to interact with other molecules and directly influences cellular processes and signaling. At the plasma membrane, multiple factors drive protein compartmentalization into specialized functional domains, leading to the formation of clusters in which intermolecule interactions are facilitated. Therefore, quantifying protein distributions is a necessity for understanding their regulation and function. The recent advent of super-resolution microscopy has opened up the possibility of imaging protein distributions at the nanometer scale. In parallel, new spatial analysis methods have been developed to quantify distribution patterns in super-resolution images. In this chapter, we provide an overview of super-resolution microscopy and summarize the factors influencing protein arrangements on the plasma membrane. Finally, we highlight methods for analyzing clusterization of plasma membrane proteins, including examples of their applications.
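
    As one common, concrete choice for such an analysis (not necessarily the method of this chapter), the sketch below clusters simulated 2-D single-molecule localizations with DBSCAN, which labels sparse background points as noise.

      import numpy as np
      from sklearn.cluster import DBSCAN

      rng = np.random.default_rng(0)
      # Fake localizations (nm): two tight clusters plus uniform background
      cluster_a = rng.normal(loc=(100, 100), scale=15, size=(50, 2))
      cluster_b = rng.normal(loc=(400, 250), scale=15, size=(50, 2))
      background = rng.uniform(0, 500, size=(40, 2))
      points = np.vstack([cluster_a, cluster_b, background])

      labels = DBSCAN(eps=30, min_samples=10).fit_predict(points)  # -1 = noise
      n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
      print(n_clusters, "clusters;", int((labels == -1).sum()), "background points")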

  7. Long-Term Planning for Open Pits for Mining Sulphide-Oxide Ores in Order to Achieve Maximum Profit

    NASA Astrophysics Data System (ADS)

    Kržanović, Daniel; Conić, Vesna; Stevanović, Dejan; Kolonja, Božo; Vaduvesković, Jovan

    2017-12-01

    Profitable exploitation of mineralised material from the earth's crust is a complex and difficult task that depends on a comprehensive planning process. Answering the question of how to plan production depends on the geometry of the deposit, as well as the concentration, distribution, and type of minerals in it. The complex nature of mineral deposits largely determines the method of exploitation and profitability of mining operations. In addition to unit operating costs and metal prices, the optimal recovery of and achievement of maximum profit from deposits of sulphide-oxide ores also depend, to a significant extent, on the level of technological recovery achieved in the ore processing procedure. Therefore, in defining a long-term development strategy for open pits, special attention must be paid to the selection of an optimal procedure for ore processing in order to achieve the main objective: maximising the Net Present Value (NPV). The effect of using two different processes, flotation processing and hydrometallurgical methods (bioleaching and acid leaching), on determining the ultimate pit is shown in the case of the Kraku Bugaresku-Cementacija sulphide-oxide ore deposit in eastern Serbia. Analysis shows that the application of hydrometallurgical methods of processing sulphide-oxide ore achieved an increase in NPV of 20.42%.
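
    The objective named above reduces to a familiar discounted sum; the micro-example below compares two processing routes on invented yearly cash flows (all figures hypothetical).

      def npv(cash_flows, r):
          """NPV = sum_t CF_t / (1 + r)**t, t = 0 being the first year."""
          return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

      flotation = [-120e6, 35e6, 40e6, 40e6, 38e6, 30e6]   # USD per year
      hydromet  = [-150e6, 45e6, 50e6, 50e6, 46e6, 36e6]
      print(f"flotation NPV: {npv(flotation, 0.10) / 1e6:.1f} M$")
      print(f"hydromet  NPV: {npv(hydromet, 0.10) / 1e6:.1f} M$")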

  8. Additional Insights Into Problem Definition and Positioning From Social Science Comment on "Four Challenges That Global Health Networks Face".

    PubMed

    Quissell, Kathryn

    2017-09-10

    Commenting on a recent editorial in this journal which presented four challenges global health networks will have to tackle to be effective, this essay discusses why this type of analysis is important for global health scholars and practitioners, and why it is worth understanding and critically engaging with the complexities behind these challenges. Focusing on the topics of problem definition and positioning, I outline additional insights from social science theory to demonstrate how networks and network researchers can evaluate these processes, and how these processes contribute to better organizing, advocacy, and public health outcomes. This essay also raises multiple questions regarding these processes for future research. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  9. Interarrival times of message propagation on directed networks.

    PubMed

    Mihaljev, Tamara; de Arcangelis, Lucilla; Herrmann, Hans J

    2011-08-01

    One of the challenges in fighting cybercrime is to understand the dynamics of message propagation on botnets, networks of infected computers used to send viruses, unsolicited commercial emails (SPAM) or denial of service attacks. We map this problem to the propagation of multiple random walkers on directed networks and we evaluate the interarrival time distribution between successive walkers arriving at a target. We show that the temporal organization of this process, which models information propagation on unstructured peer to peer networks, has the same features as SPAM reaching a single user. We study the behavior of the message interarrival time distribution on three different network topologies using two different rules for sending messages. In all networks the propagation is not a pure Poisson process. It shows universal features on Poissonian networks and a more complex behavior on scale free networks. Results open the possibility to indirectly learn about the process of sending messages on networks with unknown topologies, by studying interarrival times at any node of the network.
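
    A bare-bones version of the experiment, many walkers on a directed random graph with interarrival times recorded at a target, can be simulated as below (an exploratory sketch, not the paper's exact model or topologies).

      import random
      import networkx as nx

      G = nx.gnp_random_graph(500, 0.02, directed=True, seed=1)
      target, walkers, arrivals = 0, [random.randrange(500) for _ in range(200)], []

      for step in range(1, 5_000):
          for i, node in enumerate(walkers):
              succ = list(G.successors(node))
              if not succ:
                  continue                      # dangling node: walker waits
              walkers[i] = random.choice(succ)
              if walkers[i] == target:
                  arrivals.append(step)

      interarrival = [b - a for a, b in zip(arrivals, arrivals[1:])]
      print(interarrival[:20])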

  10. Interarrival times of message propagation on directed networks

    NASA Astrophysics Data System (ADS)

    Mihaljev, Tamara; de Arcangelis, Lucilla; Herrmann, Hans J.

    2011-08-01

    One of the challenges in fighting cybercrime is to understand the dynamics of message propagation on botnets, networks of infected computers used to send viruses, unsolicited commercial emails (SPAM) or denial of service attacks. We map this problem to the propagation of multiple random walkers on directed networks and we evaluate the interarrival time distribution between successive walkers arriving at a target. We show that the temporal organization of this process, which models information propagation on unstructured peer to peer networks, has the same features as SPAM reaching a single user. We study the behavior of the message interarrival time distribution on three different network topologies using two different rules for sending messages. In all networks the propagation is not a pure Poisson process. It shows universal features on Poissonian networks and a more complex behavior on scale free networks. Results open the possibility to indirectly learn about the process of sending messages on networks with unknown topologies, by studying interarrival times at any node of the network.

  11. Soda Taxes: The Importance of Analysing Policy Processes Comment on "The Untapped Power of Soda Taxes: Incentivising Consumers, Generating Revenue, and Altering Corporate Behaviours".

    PubMed

    Le Bodo, Yann; De Wals, Philippe

    2017-10-21

    Sarah A. Roache and Lawrence O. Gostin's recent editorial comprehensively presents soda taxation rationales from a public health perspective. While we essentially agree that soda taxes are gaining momentum, this commentary expands upon the need for a better understanding of the policy processes underlying their development and implementation. Indeed, the umbrella concept of soda taxation actually covers a diversity of objectives and mechanisms, which may not only condition the feasibility and acceptability of a proposal, but also alter its impact. We briefly highlight some conditions that may have influenced soda tax policy processes and why further theory-driven case studies may be instructive. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  12. GridPV Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago

    2014-07-15

    MATLAB toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions.

  13. KSC-2013-4438

    NASA Image and Video Library

    2013-12-19

    VANDENBERG AIR FORCE BASE, Calif. -- A solid rocket motor is maneuvered toward the open high bay door of the Solid Rocket Motor Processing Facility at Vandenberg Air Force Base in California. The motor will be attached to the United Launch Alliance Delta II rocket slated to launch NASA's Orbiting Carbon Observatory-2, or OCO-2, spacecraft in July 2014. OCO-2 will collect precise global measurements of carbon dioxide in the Earth's atmosphere. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. Photo credit: NASA/Randy Beaudoin

  14. Spectral fingerprints of large-scale neuronal interactions.

    PubMed

    Siegel, Markus; Donner, Tobias H; Engel, Andreas K

    2012-01-11

    Cognition results from interactions among functionally specialized but widely distributed brain regions; however, neuroscience has so far largely focused on characterizing the function of individual brain regions and neurons therein. Here we discuss recent studies that have instead investigated the interactions between brain regions during cognitive processes by assessing correlations between neuronal oscillations in different regions of the primate cerebral cortex. These studies have opened a new window onto the large-scale circuit mechanisms underlying sensorimotor decision-making and top-down attention. We propose that frequency-specific neuronal correlations in large-scale cortical networks may be 'fingerprints' of canonical neuronal computations underlying cognitive processes.
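
    Frequency-specific coupling of the kind discussed is commonly quantified with magnitude-squared coherence; the sketch below does so for two simulated signals sharing a 20 Hz component (illustrative only, not the studies' analysis pipelines).

      import numpy as np
      from scipy.signal import coherence

      fs = 1000.0
      t = np.arange(0, 10, 1 / fs)
      shared = np.sin(2 * np.pi * 20 * t)               # common 20 Hz drive
      x = shared + 0.8 * np.random.randn(t.size)        # "region 1"
      y = 0.7 * shared + 0.8 * np.random.randn(t.size)  # "region 2"
      f, cxy = coherence(x, y, fs=fs, nperseg=1024)
      band = (f > 15) & (f < 25)
      print("peak coherence near 20 Hz:", cxy[band].max())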

  15. Electrically conductive composite material

    DOEpatents

    Clough, R.L.; Sylwester, A.P.

    1989-05-23

    An electrically conductive composite material is disclosed which comprises a conductive open-celled, low density, microcellular carbon foam filled with a non-conductive polymer or resin. The composite material is prepared in a two-step process consisting of first preparing the microcellular carbon foam from a carbonizable polymer or copolymer using a phase separation process, then filling the carbon foam with the desired non-conductive polymer or resin. The electrically conductive composites of the present invention have a uniform and consistent pattern of filler distribution, and as a result are superior to prior art materials when used in battery components, electrodes, and the like. 2 figs.

  16. Electrically conductive composite material

    DOEpatents

    Clough, R.L.; Sylwester, A.P.

    1988-06-20

    An electrically conductive composite material is disclosed which comprises a conductive open-celled, low density, microcellular carbon foam filled with a non-conductive polymer or resin. The composite material is prepared in a two-step process consisting of first preparing the microcellular carbon foam from a carbonizable polymer or copolymer using a phase separation process, then filling the carbon foam with the desired non-conductive polymer or resin. The electrically conductive composites of the present invention have a uniform and consistent pattern of filler distribution, and as a result are superior to prior art materials when used in battery components, electrodes, and the like. 2 figs.

  17. Electrically conductive composite material

    DOEpatents

    Clough, Roger L.; Sylwester, Alan P.

    1989-01-01

    An electrically conductive composite material is disclosed which comprises a conductive open-celled, low density, microcellular carbon foam filled with a non-conductive polymer or resin. The composite material is prepared in a two-step process consisting of first preparing the microcellular carbon foam from a carbonizable polymer or copolymer using a phase separation process, then filling the carbon foam with the desired non-conductive polymer or resin. The electrically conductive composites of the present invention have a uniform and consistent pattern of filler distribution, and as a result are superior to prior art materials when used in battery components, electrodes, and the like.

  18. Exact joint density-current probability function for the asymmetric exclusion process.

    PubMed

    Depken, Martin; Stinchcombe, Robin

    2004-07-23

    We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
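
    The paper's results are exact, obtained through the matrix-product operator algebra; purely to fix intuition for the model, here is a minimal random-sequential-update Monte Carlo of the totally asymmetric version with open boundaries (entry rate α, exit rate β).

      import random

      L, alpha, beta = 100, 0.6, 0.4
      tau = [0] * L                             # occupation numbers

      for _ in range(200_000):
          i = random.randrange(-1, L)           # -1 stands for the left reservoir
          if i == -1:
              if tau[0] == 0 and random.random() < alpha:
                  tau[0] = 1                    # particle enters on the left
          elif i == L - 1:
              if tau[-1] == 1 and random.random() < beta:
                  tau[-1] = 0                   # particle exits on the right
          elif tau[i] == 1 and tau[i + 1] == 0:
              tau[i], tau[i + 1] = 0, 1         # bulk hop to the right

      print("bulk density ~", sum(tau) / L)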

  19. Don't Discount Societal Value in Cost-Effectiveness Comment on "Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness".

    PubMed

    Hall, William

    2017-01-14

    As healthcare resources become increasingly scarce due to growing demand and stagnating budgets, the need for effective priority setting and resource allocation will become ever more critical to providing sustainable care to patients. While societal values should certainly play a part in guiding these processes, the methodology used to capture these values need not necessarily be limited to multi-criterion decision analysis (MCDA)-based processes including 'evidence-informed deliberative processes.' However, if decision-makers intend not only to incorporate the values of the public they serve into decisions but also to have those decisions enacted, consideration should be given to more direct involvement of stakeholders. Based on the examples provided by Baltussen et al, MCDA-based processes like 'evidence-informed deliberative processes' could be one way of achieving this laudable goal. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  20. New Aerosol Models for the Retrieval of Aerosol Optical Thickness and Normalized Water-Leaving Radiances from the SeaWiFS and MODIS Sensors Over Coastal Regions and Open Oceans

    NASA Technical Reports Server (NTRS)

    Ahmad, Ziauddin; Franz, Bryan A.; McClain, Charles R.; Kwiatkowska, Ewa J.; Werdell, Jeremy; Shettle, Eric P.; Holben, Brent N.

    2010-01-01

    We describe the development of a new suite of aerosol models for the retrieval of atmospheric and oceanic optical properties from the SeaWiFS and MODIS sensors, including aerosol optical thickness (τ), Ångström coefficient (α), and water-leaving radiance (L_w). The new aerosol models are derived from Aerosol Robotic Network (AERONET) observations and have bimodal lognormal distributions that are narrower than previous models used by the Ocean Biology Processing Group. We analyzed AERONET data over open ocean and coastal regions and found that the seasonal variability in the modal radii, particularly in the coastal region, was related to the relative humidity. These findings were incorporated into the models by making the modal radii, as well as the refractive indices, explicitly dependent on relative humidity. From those findings, we constructed a new suite of aerosol models. We considered eight relative humidity values (30%, 50%, 70%, 75%, 80%, 85%, 90%, and 95%) and, for each relative humidity value, we constructed ten distributions by varying the fine-mode fraction from zero to 1. In all, 80 distributions (8 RH values × 10 fine-mode fractions) were created to process the satellite data. We also assumed that the coarse-mode particles were nonabsorbing (sea salt) and that all observed absorption was entirely due to fine-mode particles. The composition of the fine mode was varied to ensure that the new models exhibited the same spectral dependence of single scattering albedo as observed in the AERONET data.
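
    For reference, a bimodal lognormal number-size distribution of the kind described takes the standard form below, with N_i the mode amplitudes, r_{m,i} the (relative-humidity-dependent) modal radii and σ_i the geometric standard deviations; the exact parametrization used by the authors may differ in detail.

      n(r) \;=\; \frac{dN}{dr} \;=\;
        \sum_{i=1}^{2} \frac{N_i}{\sqrt{2\pi}\; r \ln\sigma_i}
        \exp\!\left[ -\frac{(\ln r - \ln r_{m,i})^{2}}{2 \ln^{2}\sigma_i} \right]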

  1. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.

  2. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 1: theoretical development

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The Saint-Venant equations are commonly used as the governing equations for modeling spatially varied unsteady flow in open channels. The presence of uncertainties in the channel or flow parameters renders these equations stochastic, thus requiring their solution in a stochastic framework in order to quantify the ensemble behavior and the variability of the process. While the Monte Carlo approach can be used for such a solution, its computational expense and its large number of simulations act to its disadvantage. This study proposes, explains, and derives a new methodology for solving the stochastic Saint-Venant equations in only one shot, without the need for a large number of simulations. The proposed methodology is derived by developing the nonlocal Lagrangian-Eulerian Fokker-Planck equation of the characteristic form of the stochastic Saint-Venant equations for an open-channel flow process, with an uncertain roughness coefficient. A numerical method for its solution is subsequently devised. The application and validation of this methodology are provided in a companion paper, in which the statistical results computed by the proposed methodology are compared against the results obtained by the Monte Carlo approach.
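
    For orientation, the classical one-dimensional Fokker-Planck equation for an evolutionary probability density p(x, t) reads as below, with drift A and diffusion B; the nonlocal Lagrangian-Eulerian FPE derived in the paper generalizes this form to the characteristic equations of the stochastic Saint-Venant system.

      \frac{\partial p(x,t)}{\partial t} \;=\;
        -\frac{\partial}{\partial x}\big[ A(x,t)\, p(x,t) \big]
        + \frac{1}{2}\, \frac{\partial^{2}}{\partial x^{2}}\big[ B(x,t)\, p(x,t) \big]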

  3. Database Entity Persistence with Hibernate for the Network Connectivity Analysis Model

    DTIC Science & Technology

    2014-04-01

    time savings in the Java coding development process. Appendices A and B describe address setup procedures for installing the MySQL database ... development environment is required:
    • The open source MySQL Database Management System (DBMS) from Oracle, which is a Java Database Connectivity (JDBC) ... compliant DBMS
    • MySQL JDBC Driver library that comes as a plug-in with the NetBeans distribution
    • The latest Java Development Kit with the latest ...

  4. In-network processing of joins in wireless sensor networks.

    PubMed

    Kang, Hyunchul

    2013-03-11

    The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified.
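
    As a toy illustration of why in-network processing pays off (a generic sketch, not an algorithm from the survey), a node can prune its readings against a compact synopsis of the other region's values before transmitting, since sending data costs far more than local computation.

    ```python
    # Toy in-network join filtering: node B prunes its readings against a
    # synopsis of region A's values before transmitting, reducing the number
    # of costly radio transmissions. Values and tolerance are illustrative.

    def build_synopsis(readings_a, tolerance):
        """Coarse value intervals covering region A's readings."""
        return [(r - tolerance, r + tolerance) for r in readings_a]

    def filter_against_synopsis(readings_b, synopsis):
        """Keep only region-B readings that can possibly join with region A."""
        return [r for r in readings_b
                if any(lo <= r <= hi for lo, hi in synopsis)]

    region_a = [21.0, 23.5, 30.2]           # e.g., temperature readings
    region_b = [20.8, 27.0, 30.0, 40.1]
    synopsis = build_synopsis(region_a, tolerance=0.5)
    to_send = filter_against_synopsis(region_b, synopsis)
    print(to_send)                           # only the joinable readings travel
    ```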

  5. In-Network Processing of Joins in Wireless Sensor Networks

    PubMed Central

    Kang, Hyunchul

    2013-01-01

    The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified. PMID:23478603

  6. Naima: a Python package for inference of particle distribution properties from nonthermal spectra

    NASA Astrophysics Data System (ADS)

    Zabalza, V.

    2015-07-01

    The ultimate goal of observing nonthermal emission from astrophysical sources is to understand the underlying particle acceleration and evolution processes, yet few tools are publicly available to infer the particle distribution properties from the observed photon spectra from X-rays to VHE gamma rays. Here I present naima, an open source Python package that provides models for nonthermal radiative emission from homogeneous distributions of relativistic electrons and protons. Contributions from synchrotron, inverse Compton, nonthermal bremsstrahlung, and neutral-pion decay can be computed for a series of functional shapes of the particle energy distributions, with the possibility of using user-defined particle distribution functions. In addition, naima provides a set of functions that allow these models to be used to fit observed nonthermal spectra through an MCMC procedure, obtaining probability distribution functions for the particle distribution parameters. Here I present the models and methods available in naima and an example of their application to the understanding of a galactic nonthermal source. naima's documentation, including how to install the package, is available at http://naima.readthedocs.org.
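
    A minimal usage sketch in the spirit of naima's documentation; the particle-distribution parameters and source distance are placeholders, and the exact call signatures should be checked against the package docs.

    ```python
    import astropy.units as u
    import numpy as np
    from naima.models import ExponentialCutoffPowerLaw, InverseCompton

    # Placeholder electron distribution parameters (illustrative only).
    pdist = ExponentialCutoffPowerLaw(amplitude=1e36 / u.eV, e_0=10 * u.TeV,
                                      alpha=2.5, e_cutoff=50 * u.TeV)

    # Inverse Compton emission on the CMB from that electron distribution.
    ic = InverseCompton(pdist, seed_photon_fields=["CMB"])

    # Photon SED at an assumed 1.5 kpc distance over X-ray..VHE energies.
    energies = np.logspace(-1, 14, 100) * u.eV
    sed = ic.sed(energies, distance=1.5 * u.kpc)
    ```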

  7. Aeolian Sediment Trapping Efficiencies of Sparse Vegetation and its Ecohydrological Consequences in Drylands

    NASA Astrophysics Data System (ADS)

    Gonzales, H. B.; Ravi, S.; Li, J. J.; Sankey, J. B.

    2016-12-01

    Hydrological and aeolian processes control the redistribution of soil and nutrients in arid and semi-arid environments, thereby contributing to the formation of heterogeneous patchy landscapes with nutrient-rich resource islands surrounded by nutrient-depleted bare soil patches. The differential trapping of soil particles by vegetation canopies may result in textural changes beneath the vegetation, which, in turn, can alter hydrological processes such as infiltration and runoff. We conducted infiltration experiments and soil grain size analysis of several shrub (Larrea tridentata) and grass (Bouteloua eriopoda) microsites in a heterogeneous landscape in the Chihuahuan desert (New Mexico, USA). Our results indicate heterogeneity in soil texture and infiltration patterns under grass and shrub microsites. We assessed the trapping effectiveness of vegetation canopies using a novel computational fluid dynamics (CFD) approach. Open-source software (OpenFOAM) was used to validate the data gathered from particle size distribution (PSD) analysis of soil within the shrub and grass microsites and their porosities (91% for shrub and 68% for grass) determined using terrestrial LiDAR surveys. Three-dimensional architectures of the shrub and grass were created using open-source computer-aided design (CAD) software (Blender). The readily available solvers within the OpenFOAM architecture were modified to test the validity of and optimize input parameters in assessing the trapping efficiencies of sparse vegetation against aeolian sediment flux. The results from the numerical simulations explained the observed textural changes under grass and shrub canopies and highlighted the role of sediment trapping by canopies in structuring patch-scale hydrological processes.

  8. An integrated structural and geochemical study of fracture aperture growth in the Campito Formation of eastern California

    NASA Astrophysics Data System (ADS)

    Doungkaew, N.; Eichhubl, P.

    2015-12-01

    Processes of fracture formation control flow of fluid in the subsurface and the mechanical properties of the brittle crust. Understanding of fundamental fracture growth mechanisms is essential for understanding fracture formation and cementation in chemically reactive systems with implications for seismic and aseismic fault and fracture processes, migration of hydrocarbons, long-term CO2 storage, and geothermal energy production. A recent study on crack-seal veins in deeply buried sandstone of east Texas provided evidence for non-linear fracture growth, which is indicated by non-elliptical kinematic fracture aperture profiles. We hypothesize that similar non-linear fracture growth also occurs in other geologic settings, including under higher temperature where solution-precipitation reactions are kinetically favored. To test this hypothesis, we investigate processes of fracture growth in quartzitic sandstone of the Campito Formation, eastern California, by combining field structural observations, thin section petrography, and fluid inclusion microthermometry. Fracture aperture profile measurements of cemented opening-mode fractures show both elliptical and non-elliptical kinematic aperture profiles. In general, fractures that contain fibrous crack-seal cement have elliptical aperture profiles. Fractures filled with blocky cement have linear aperture profiles. Elliptical fracture aperture profiles are consistent with linear-elastic or plastic fracture mechanics. Linear aperture profiles may reflect aperture growth controlled by solution-precipitation creep, with the aperture distribution controlled by solution-precipitation kinetics. We hypothesize that synkinematic crack-seal cement preserves the elliptical aperture profiles of elastic fracture opening increments. Blocky cement, on the other hand, may form postkinematically relative to fracture opening, with fracture opening accommodated by continuous solution-precipitation creep.

  9. Distributed computing testbed for a remote experimental environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butner, D.N.; Casper, T.A.; Howard, B.C.

    1995-09-18

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady-state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high-speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large-scale experimental facility.

  10. Cloud-based distributed control of unmanned systems

    NASA Astrophysics Data System (ADS)

    Nguyen, Kim B.; Powell, Darren N.; Yetman, Charles; August, Michael; Alderson, Susan L.; Raney, Christopher J.

    2015-05-01

    Enabling warfighters to efficiently and safely execute dangerous missions, unmanned systems have been an increasingly valuable component in modern warfare. The evolving use of unmanned systems leads to vast amounts of data collected from sensors placed on the remote vehicles. As a result, many command and control (C2) systems have been developed to provide the necessary tools to perform one of the following functions: controlling the unmanned vehicle or analyzing and processing the sensory data from unmanned vehicles. These C2 systems are often disparate from one another, limiting the ability to optimally distribute data among different users. The Space and Naval Warfare Systems Center Pacific (SSC Pacific) seeks to address this technology gap through the UxV to the Cloud via Widgets project. The overarching intent of this three year effort is to provide three major capabilities: 1) unmanned vehicle control using an open service oriented architecture; 2) data distribution utilizing cloud technologies; 3) a collection of web-based tools enabling analysts to better view and process data. This paper focuses on how the UxV to the Cloud via Widgets system is designed and implemented by leveraging the following technologies: Data Distribution Service (DDS), Accumulo, Hadoop, and Ozone Widget Framework (OWF).

  11. A resilient and secure software platform and architecture for distributed spacecraft

    NASA Astrophysics Data System (ADS)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, where information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objective of this layer.

  12. Spatial distribution of juvenile and adult female Tanner crabs (Chionoecetes bairdi) in a glacial fjord ecosystem: Implications for recruitment processes

    USGS Publications Warehouse

    Nielsen, J.K.; Taggart, S. James; Shirley, Thomas C.; Mondragon, Jennifer

    2007-01-01

    A systematic pot survey in Glacier Bay, Alaska, was conducted to characterize the spatial distribution of juvenile and adult female Tanner crabs, and their association with depth and temperature. The information was used to infer important recruitment processes for Tanner crabs in glaciated ecosystems. High-catch areas for juvenile and adult female Tanner crabs were identified using local autocorrelation statistics. Spatial segregation by size class corresponded to features in the glacial landscape: high-catch areas for juveniles were located at the distal ends of two narrow glacial fjords, and high-catch areas for adults were located in the open waters of the central Bay. Juvenile female Tanner crabs were found at nearly all sampled depths (15–439 m) and temperatures (4–8°C), but the biggest catches were at depths <150 m where adults were scarce. Because adults may prey on or compete with juveniles, the distribution of juveniles could be influenced by the distribution of adults. Areas where adults or predators are scarce, such as glacially influenced fjords, could serve as refuges for juvenile Tanner crabs.

  13. Multi-port valve

    DOEpatents

    Lewin, Keith F.

    1997-04-15

    A multi-port valve for regulating, as a function of ambient air having varying wind velocity and wind direction in an open-field control area, the distribution of a fluid, particularly carbon dioxide (CO2) gas, in a fluid distribution system so that the control area remains generally at an elevated fluid concentration or level of said fluid. The multi-port valve generally includes a multi-port housing having a plurality of outlets therethrough disposed in a first pattern of outlets and at least one second pattern of outlets, and a movable plate having a plurality of apertures extending therethrough disposed in a first pattern of apertures and at least one second pattern of apertures. The first pattern of apertures is alignable with the first pattern of outlets, and the at least one second pattern of apertures is alignable with the second pattern of outlets. The first pattern of apertures has a predetermined orientation with the at least one second pattern of apertures. For an open-field control area subject to ambient wind having a low velocity from any direction, the movable plate is positioned to equally distribute the supply of fluid in a fluid distribution system to the open-field control area. For an open-field control area subject to ambient wind having a high velocity from a given direction, the movable plate is positioned to generally distribute a supply of fluid in a fluid distribution system to that portion of the open-field control area located upwind.

  14. Multi-port valve

    DOEpatents

    Lewin, K.F.

    1997-04-15

    A multi-port valve is described for regulating, as a function of ambient air having varying wind velocity and wind direction in an open-field control area, the distribution of a fluid, particularly carbon dioxide (CO2) gas, in a fluid distribution system so that the control area remains generally at an elevated fluid concentration or level of said fluid. The multi-port valve generally includes a multi-port housing having a plurality of outlets therethrough disposed in a first pattern of outlets and at least one second pattern of outlets, and a movable plate having a plurality of apertures extending therethrough disposed in a first pattern of apertures and at least one second pattern of apertures. The first pattern of apertures is alignable with the first pattern of outlets, and the at least one second pattern of apertures is alignable with the second pattern of outlets. The first pattern of apertures has a predetermined orientation with the at least one second pattern of apertures. For an open-field control area subject to ambient wind having a low velocity from any direction, the movable plate is positioned to equally distribute the supply of fluid in a fluid distribution system to the open-field control area. For an open-field control area subject to ambient wind having a high velocity from a given direction, the movable plate is positioned to generally distribute a supply of fluid in a fluid distribution system to that portion of the open-field control area located upwind. 7 figs.

  15. Visual servoing of a laser ablation based cochleostomy

    NASA Astrophysics Data System (ADS)

    Kahrs, Lüder A.; Raczkowsky, Jörg; Werner, Martin; Knapp, Felix B.; Mehrwald, Markus; Hering, Peter; Schipper, Jörg; Klenzner, Thomas; Wörn, Heinz

    2008-03-01

    The aim of this study is a defined, visually based and camera-controlled bone removal by a navigated CO2 laser on the promontory of the inner ear. A precise and minimally traumatic opening procedure of the cochlea for the implantation of a cochlear implant electrode (a so-called cochleostomy) is intended. Harming the membrane linings of the inner ear can result in damage to remaining organ functions (e.g., complete deafness or vertigo). A precise tissue removal by a laser-based bone ablation system is investigated. Inside the borehole, the pulsed laser beam is guided automatically over the bone by a two-mirror galvanometric scanner. The ablation process is controlled by visual servoing. For the detection of the boundary layers of the inner ear, the ablation area is monitored by a color camera. The acquired pictures are analyzed by image processing, and the results of this analysis are used to control the laser ablation process. This publication describes the complete system, including the image processing algorithms and the concept for the resulting distribution of single laser pulses. The system has been tested on human cochleae in ex-vivo studies. Further developments could lead to safe intraoperative openings of the cochlea by a robot-based surgical laser instrument.

  16. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to the development of computer technology. However, much of the recent improvement is due to the emergence of multi-core high-performance computers, so parallel computing has become key to achieving good software performance. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocols and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions along with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.

  17. Influence of fiber packing structure on permeability

    NASA Technical Reports Server (NTRS)

    Cai, Zhong; Berdichevsky, Alexander L.

    1993-01-01

    The study on the permeability of an aligned fiber bundle is the key building block in modeling the permeability of advanced woven and braided preforms. Available results on the permeability of fiber bundles in the literature show that a substantial difference exists between numerical and analytical calculations on idealized fiber packing structures, such as square and hexagonal packing, and experimental measurements on practical fiber bundles. The present study focuses on the variation of the permeability of a fiber bundle under practical process conditions. Fiber bundles are modeled as containing openings and fiber clusters within the bundle. Numerical simulations of the influence of various openings on the permeability were conducted. Idealized packing structures are used, but with introduced openings distributed in different patterns. Both longitudinal and transverse flows are considered. The results show that openings within the fiber bundle have a substantial effect on the permeability. In the longitudinal-flow case, the openings become the dominant flow path. In the transverse-flow case, the fiber clusters reduce the gap sizes among fibers. The permeability is therefore strongly influenced by these openings and clusters. In addition to the porosity or fiber volume fraction, which is commonly used in the permeability expression, another fiber bundle status parameter, the ultimate fiber volume fraction, is introduced to capture the disturbance within a fiber bundle.

  18. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in tools, procedures, and methods to handle spatially related information. Free Open Source Software projects devoted to geospatial data handling are gaining good traction, as the use of interoperable formats and protocols allows the user to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to a specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two roles. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free Open Source GIS, open GIS formats, and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using Free Open Source Software for geospatial work will be presented, as well as the benefits, and solutions to possible drawbacks, arising from the effort required to use, support, and contribute to these projects.
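
    The GDAL planetary support mentioned above can be exercised from Python; the file name below is a placeholder for a real PDS product, and the PDS driver must be available in the local GDAL build.

    ```python
    from osgeo import gdal

    # Open a Planetary Data System image through GDAL; the file name is a
    # placeholder for an actual PDS .img/.lbl product.
    dataset = gdal.Open("example_planetary_product.img")

    if dataset is not None:
        print("Driver:", dataset.GetDriver().ShortName)
        print("Size:", dataset.RasterXSize, "x", dataset.RasterYSize)
        print("Geotransform:", dataset.GetGeoTransform())
        band = dataset.GetRasterBand(1)
        data = band.ReadAsArray()   # NumPy array of pixel values
    ```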

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A., E-mail: kaurov@uchicago.edu

    We explore a time-dependent energy dissipation of the energetic electrons in the inhomogeneous intergalactic medium (IGM) during the epoch of cosmic reionization. In addition to the atomic processes, we take into account the inverse Compton (IC) scattering of the electrons on the cosmic microwave background photons, which is the dominant channel of energy loss for electrons with energies above a few MeV. We show that: (1) the effect on the IGM has both local (atomic processes) and non-local (IC radiation) components; (2) the energy distribution between hydrogen and helium ionizations depends on the initial energy of an electron; (3) the local baryon overdensity significantly affects the fractions of energy distributed in each channel; and (4) the relativistic effect of the atomic cross-section becomes important during the epoch of cosmic reionization. We release our code as open source for further modification by the community.

  20. OpenCMISS: a multi-physics & multi-scale computational infrastructure for the VPH/Physiome project.

    PubMed

    Bradley, Chris; Bowery, Andy; Britten, Randall; Budelmann, Vincent; Camara, Oscar; Christie, Richard; Cookson, Andrew; Frangi, Alejandro F; Gamage, Thiranja Babarenda; Heidlauf, Thomas; Krittian, Sebastian; Ladd, David; Little, Caton; Mithraratne, Kumar; Nash, Martyn; Nickerson, David; Nielsen, Poul; Nordbø, Oyvind; Omholt, Stig; Pashaei, Ali; Paterson, David; Rajagopal, Vijayaraghavan; Reeve, Adam; Röhrle, Oliver; Safaei, Soroush; Sebastián, Rafael; Steghöfer, Martin; Wu, Tim; Yu, Ting; Zhang, Heye; Hunter, Peter

    2011-10-01

    The VPH/Physiome Project is developing the model encoding standards CellML (cellml.org) and FieldML (fieldml.org) as well as web-accessible model repositories based on these standards (models.physiome.org). Freely available open source computational modelling software is also being developed to solve the partial differential equations described by the models and to visualise results. The OpenCMISS code (opencmiss.org), described here, has been developed by the authors over the last six years to replace the CMISS code that has supported a number of organ system Physiome projects. OpenCMISS is designed to encompass multiple sets of physical equations and to link subcellular and tissue-level biophysical processes into organ-level processes. In the Heart Physiome project, for example, the large deformation mechanics of the myocardial wall need to be coupled to both ventricular flow and embedded coronary flow, and the reaction-diffusion equations that govern the propagation of electrical waves through myocardial tissue need to be coupled with equations that describe the ion channel currents that flow through the cardiac cell membranes. In this paper we discuss the design principles and distributed memory architecture behind the OpenCMISS code. We also discuss the design of the interfaces that link the sets of physical equations across common boundaries (such as fluid-structure coupling), or between spatial fields over the same domain (such as coupled electromechanics), and the concepts behind CellML and FieldML that are embodied in the OpenCMISS data structures. We show how all of these provide a flexible infrastructure for combining models developed across the VPH/Physiome community. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet. PMID:16539707

  2. SU-F-T-380: Comparing the Effect of Respiration On Dose Distribution Between Conventional Tangent Pair and IMRT Techniques for Adjuvant Radiotherapy in Early Stage Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M; Ramaseshan, R

    2016-06-15

    Purpose: In this project, we compared the conventional tangent pair technique to the IMRT technique by analyzing the dose distribution. We also investigated the effect of respiration on planning target volume (PTV) dose coverage in both techniques. Methods: In order to implement the IMRT technique, a template-based planning protocol, dose constraints, and treatment process were developed. Two open fields with optimized field weights were combined with two beamlet-optimization fields in the IMRT plans. We compared the dose distribution between the standard tangential pair and IMRT. The improvement in dose distribution was measured by parameters such as the conformity index, homogeneity index, and coverage index. Another endpoint was whether the IMRT technique would reduce planning time for staff. The effect of the patient's respiration on the dose distribution was also estimated. Four-dimensional computed tomography (4DCT) over the phases of the breathing cycle was used to evaluate the effect of respiration on the IMRT-planned dose distribution. Results: We accumulated 10 patients who underwent 4DCT and were planned with both techniques. Based on the preliminary analysis, the dose distribution in the IMRT technique was better than in the conventional tangent pair technique. Furthermore, the effect of respiration in the IMRT plan was not significant, as evident from the 95% isodose line coverage of the PTV drawn on all phases of 4DCT. Conclusion: Based on the 4DCT images, the breathing effect on the dose distribution was smaller than expected. We suspect there are two reasons. First, the PTV movement due to respiration was not significant, possibly because a tilted breast board was used to set up patients. Second, the open fields with optimized field weights in the IMRT technique might reduce the breathing effect on the dose distribution. Further investigation is necessary.

  3. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet.

  4. The development and performance of smud grid-connected photovoltaic projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, D.E.; Collier, D.E.

    1995-11-01

    The utility grid-connected market has been identified as a key market to be developed to accelerate the commercialization of photovoltaics. The Sacramento Municipal Utility District (SMUD) has completed the first two years of a continuing commercialization effort based on the sustained, orderly development of the grid-connected utility PV market. This program is aimed at developing the experience needed to successfully integrate PV as distributed generation into the utility system and to stimulate the collaborative processes needed to accelerate the cost reductions necessary for PV to be cost-effective in these applications by the year 2000. In the first two years, SMUD has installed over 240 residential and commercial building, grid-connected, rooftop "PV Pioneer" systems totaling over 1 MW of capacity and four substation-sited, grid-support PV systems totaling 600 kW, bringing the SMUD distributed PV power systems to over 3.7 MW. The 1995 SMUD PV Program will add approximately another 800 kW of PV systems to the District's distributed PV power system. SMUD also established a partnership with its customers through the PV Pioneer "green pricing" program to advance PV commercialization.

  5. Real-time sensor validation and fusion for distributed autonomous sensors

    NASA Astrophysics Data System (ADS)

    Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.

    2004-04-01

    Multi-sensor data fusion has found widespread applications in industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture consisting of four layers: the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates the distribution of intelligence to the sensor level and the sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform to test different sensor validation and fusion algorithms, and thus facilitates the selection of near-optimal algorithms for a specific sensor fusion application. In the version of the model presented in this paper, confidence-weighted averaging is employed to address the dynamic system state issue noted above. The state is computed using an adaptive estimator and a dynamic validation curve for numeric data fusion, and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms and a traditional numerical weighted average.
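
    The confidence-weighted averaging is described only at a high level, so the sketch below is one plausible reading: readings are fused with weights supplied by a validation stage, given here directly as assumed inputs standing in for the paper's adaptive estimator and dynamic validation curve.

    ```python
    import numpy as np

    def confidence_weighted_fusion(values, confidences):
        """Fuse redundant sensor readings by confidence-weighted averaging.

        `confidences` in [0, 1] would normally come from a validation stage
        (e.g., a dynamic validation curve); supplying them directly is an
        assumption of this sketch.
        """
        values = np.asarray(values, dtype=float)
        weights = np.asarray(confidences, dtype=float)
        if weights.sum() == 0:
            raise ValueError("no valid sensors")
        return float(np.dot(values, weights) / weights.sum())

    # Three redundant temperature sensors; the third is drifting and gets a
    # low confidence from the (not shown) validation stage.
    estimate = confidence_weighted_fusion([501.2, 499.8, 540.0],
                                          [0.95, 0.90, 0.10])
    print(round(estimate, 1))
    ```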

  6. 25 Years of Self-Organized Criticality: Solar and Astrophysics

    NASA Astrophysics Data System (ADS)

    Aschwanden, Markus J.; Crosby, Norma B.; Dimitropoulou, Michaila; Georgoulis, Manolis K.; Hergarten, Stefan; McAteer, James; Milovanov, Alexander V.; Mineshige, Shin; Morales, Laura; Nishizuka, Naoto; Pruessner, Gunnar; Sanchez, Raul; Sharma, A. Surja; Strugarek, Antoine; Uritsky, Vadim

    2016-01-01

    Shortly after the seminal paper "Self-Organized Criticality: An explanation of 1/f noise" by Bak et al. (1987), the idea was applied to solar physics, in "Avalanches and the Distribution of Solar Flares" by Lu and Hamilton (1991). In the following years, an inspiring cross-fertilization from complexity theory to solar and astrophysics took place, where the SOC concept was initially applied to solar flares, stellar flares, and magnetospheric substorms, and later extended to the radiation belt, the heliosphere, lunar craters, the asteroid belt, the Saturn ring, pulsar glitches, soft X-ray repeaters, blazars, black-hole objects, cosmic rays, and boson clouds. The application of SOC concepts has been performed by numerical cellular automaton simulations, by analytical calculations of statistical (powerlaw-like) distributions based on physical scaling laws, and by observational tests of theoretically predicted size distributions and waiting time distributions. Attempts have been undertaken to import physical models into the numerical SOC toy models, such as the discretization of magneto-hydrodynamics (MHD) processes. The novel applications also stimulated vigorous debates about the discrimination between SOC models, SOC-like, and non-SOC processes, such as phase transitions, turbulence, random-walk diffusion, percolation, branching processes, network theory, chaos theory, fractality, multi-scale, and other complexity phenomena. We review SOC studies from the last 25 years and highlight new trends, open questions, and future challenges, as discussed during two recent ISSI workshops on this theme.

  7. Learning from Multiple Collaborating Intelligent Tutors: An Agent-based Approach.

    ERIC Educational Resources Information Center

    Solomos, Konstantinos; Avouris, Nikolaos

    1999-01-01

    Describes an open distributed multi-agent tutoring system (MATS) and discusses issues related to learning in such open environments. Topics include modeling a one student-many teachers approach in a computer-based learning context; distributed artificial intelligence; implementation issues; collaboration; and user interaction. (Author/LRW)

  8. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
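
    ChemStar itself computed the descriptors with Marvin and JOELib; purely as an open-source stand-in for illustration (not part of ChemStar), the same two properties can be obtained with RDKit:

    ```python
    from rdkit import Chem
    from rdkit.Chem import Descriptors

    # Compute log P and TPSA for one molecule, the two descriptors whose
    # agreement across toolkits the ChemStar study compared at PubChem scale.
    mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin
    print("logP:", Descriptors.MolLogP(mol))
    print("TPSA:", Descriptors.TPSA(mol))
    ```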

  9. Electrical study of DSA shrink process and CD rectification effect at sub-60nm using EUV test vehicle

    NASA Astrophysics Data System (ADS)

    Chi, Cheng; Liu, Chi-Chun; Meli, Luciana; Guo, Jing; Parnell, Doni; Mignot, Yann; Schmidt, Kristin; Sanchez, Martha; Farrell, Richard; Singh, Lovejeet; Furukawa, Tsuyoshi; Lai, Kafai; Xu, Yongan; Sanders, Daniel; Hetzer, David; Metz, Andrew; Burns, Sean; Felix, Nelson; Arnold, John; Corliss, Daniel

    2017-03-01

    In this study, the integrity and the benefits of the DSA shrink process were verified through a via-chain test structure, which was fabricated with either a DSA or a baseline litho/etch process for via-layer formation while the metal-layer processes remained the same. The nearest distance between vias in this test structure is below 60nm; therefore, the following process components were included: 1) a lamella-forming BCP for forming self-aligned vias (SAV), 2) an EUV-printed guiding pattern, and 3) a PS-philic sidewall. The local CDU (LCDU) of the minor axis was improved by 30% after the DSA shrink process. We compared two DSA via shrink processes and a DSA_Control process, in which guiding patterns (GP) were directly transferred to the bottom OPL without DSA shrink. The DSA_Control process resulted in larger CDs and thus showed much higher open current and shorted the dense via chains. The non-optimized DSA shrink process showed a much broader current distribution than the improved DSA shrink process, which we attribute to distortion and dislocation of the vias and ineffective SAV. Furthermore, a preliminary defectivity study of our latest DSA process showed that the primary defect mode is likely etch-related. The challenges, the strategies applied to improve local CD uniformity and electrical current distribution, and potential adjustments are also discussed.

  10. The Influence of Ziegler-Natta and Metallocene Catalysts on Polyolefin Structure, Properties, and Processing Ability

    PubMed Central

    Shamiri, Ahmad; Chakrabarti, Mohammed H.; Jahan, Shah; Hussain, Mohd Azlan; Kaminsky, Walter; Aravind, Purushothaman V.; Yehye, Wageeh A.

    2014-01-01

    Fifty years ago, Karl Ziegler and Giulio Natta were awarded the Nobel Prize for their discovery of the catalytic polymerization of ethylene and propylene using titanium compounds and aluminum-alkyls as co-catalysts. Polyolefins have grown to become one of the biggest of all produced polymers. New metallocene/methylaluminoxane (MAO) catalysts open the possibility of synthesizing polymers with highly defined microstructure, tacticity, and stereoregularity, as well as long-chain branched or blocky copolymers with excellent properties. This improvement in polymerization is possible due to the single active sites available on the metallocene catalysts, in contrast to their traditional counterparts. Moreover, these catalysts (half-titanocenes/MAO, zirconocenes, and other single-site catalysts) can control various important parameters, such as co-monomer distribution, molecular weight, molecular weight distribution, molecular architecture, stereo-specificity, degree of linearity, and branching of the polymer. However, research in this area has largely receded in academia as olefin polymerization has seen significant advancements in industry. Therefore, this paper aims to further motivate interest in polyolefin research in academia by highlighting promising and open areas for the future. PMID:28788120

  11. Understanding User Behavioral Patterns in Open Knowledge Communities

    ERIC Educational Resources Information Center

    Yang, Xianmin; Song, Shuqiang; Zhao, Xinshuo; Yu, Shengquan

    2018-01-01

    Open knowledge communities (OKCs) have become popular in the era of knowledge economy. This study aimed to explore how users collaboratively create and share knowledge in OKCs. In particular, this research identified the behavior distribution and behavioral patterns of users by conducting frequency distribution and lag sequential analyses. Some…

  12. Higgs bosons with large transverse momentum at the LHC

    NASA Astrophysics Data System (ADS)

    Kudashkin, Kirill; Lindert, Jonas M.; Melnikov, Kirill; Wever, Christopher

    2018-07-01

    We compute the next-to-leading order QCD corrections to the production of Higgs bosons with large transverse momentum p⊥ ≫ 2m_t at the LHC. To accomplish this, we combine the two-loop amplitudes for the processes gg → Hg, qg → Hq and qq̄ → Hg, recently computed in the approximation of nearly massless top quarks, with the numerical calculation of the squared one-loop amplitudes for the gg → Hgg, qg → Hqg and qq̄ → Hgg processes. The latter computation is performed with OpenLoops. We find that the QCD corrections to the Higgs transverse momentum distribution at very high p⊥ are large but quite similar to the QCD corrections obtained for a point-like Hgg coupling. Our result removes one of the largest sources of theoretical uncertainty in the description of high-p⊥ Higgs boson production and opens a way to use the high-p⊥ region to search for physics beyond the Standard Model.

  13. General Formalism of Decision Making Based on Theory of Open Quantum Systems

    NASA Astrophysics Data System (ADS)

    Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.

    2013-01-01

    We present the general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered as a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that in the process of evolution, biosystems (especially human beings) developed such "mental Hamiltonians" and GKSL operators that any solution of the corresponding GKSL equation stabilizes to a diagonal density operator (in the basis of decision making). This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used for the prediction of the distribution of possible decisions in human populations.
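
    For reference, the standard GKSL (Lindblad) master equation the abstract relies on, where H plays the role of the "mental Hamiltonian" and the operators L_k encode the interaction with the mental bath:

    ```latex
    \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H, \rho]
      + \sum_{k} \gamma_{k} \left( L_{k}\, \rho\, L_{k}^{\dagger}
      - \tfrac{1}{2} \left\{ L_{k}^{\dagger} L_{k},\, \rho \right\} \right)
    ```

    Stabilization to a diagonal density operator in the decision basis means the off-diagonal terms of ρ, the superpositions of possible decisions, decay under this dynamics.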

  14. Laser beam coupling with capillary discharge plasma for laser wakefield acceleration applications

    NASA Astrophysics Data System (ADS)

    Bagdasarov, G. A.; Sasorov, P. V.; Gasilov, V. A.; Boldarev, A. S.; Olkhovskaya, O. G.; Benedetti, C.; Bulanov, S. S.; Gonsalves, A.; Mao, H.-S.; Schroeder, C. B.; van Tilborg, J.; Esarey, E.; Leemans, W. P.; Levato, T.; Margarone, D.; Korn, G.

    2017-08-01

    One of the most robust methods demonstrated to date for accelerating electron beams with laser-plasma sources is the use of plasma channels generated by capillary discharges. Although the spatial structure of the installation is simple in principle, there may be important effects caused by the open ends of the capillary, by the supply channels, etc., which require detailed 3D modeling of the processes. In the present work, such simulations are performed using the code MARPLE. First, the filling of the capillary with cold hydrogen through the side supply channels, before the discharge is fired, is simulated. Second, the capillary discharge itself is simulated with the goal of obtaining a time-dependent spatial distribution of the electron density near the open ends of the capillary as well as inside it. Finally, to evaluate the effectiveness of the beam coupling with the plasma channel waveguide and of the electron acceleration, modeling of the laser-plasma interaction was performed with the code INF&RNO.

  15. Tunable and high-purity room temperature single-photon emission from atomic defects in hexagonal boron nitride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grosso, Gabriele; Moon, Hyowon; Lienhard, Benjamin

    Two-dimensional van der Waals materials have emerged as promising platforms for solid-state quantum information processing devices with unusual potential for heterogeneous assembly. Recently, bright and photostable single photon emitters were reported from atomic defects in layered hexagonal boron nitride (hBN), but controlling the inhomogeneous spectral distribution and reducing multi-photon emission presented open challenges. Here, we demonstrate that strain control allows spectral tunability of hBN single photon emitters over 6 meV, and material processing sharply improves the single photon purity. We observe high single photon count rates exceeding 7 × 10^6 counts per second at saturation, after correcting for uncorrelated photon background. Furthermore, these emitters are stable under material transfer to other substrates. High-purity and photostable single photon emission at room temperature, together with spectral tunability and transferability, opens the door to scalable integration of high-quality quantum emitters in photonic quantum technologies.

  16. Tunable and high-purity room temperature single-photon emission from atomic defects in hexagonal boron nitride

    DOE PAGES

    Grosso, Gabriele; Moon, Hyowon; Lienhard, Benjamin; ...

    2017-09-26

    Two-dimensional van der Waals materials have emerged as promising platforms for solid-state quantum information processing devices with unusual potential for heterogeneous assembly. Recently, bright and photostable single photon emitters were reported from atomic defects in layered hexagonal boron nitride (hBN), but controlling the inhomogeneous spectral distribution and reducing multi-photon emission presented open challenges. Here, we demonstrate that strain control allows spectral tunability of hBN single photon emitters over 6 meV, and material processing sharply improves the single photon purity. We observe high single photon count rates exceeding 7 × 10^6 counts per second at saturation, after correcting for uncorrelated photon background. Furthermore, these emitters are stable under material transfer to other substrates. High-purity and photostable single photon emission at room temperature, together with spectral tunability and transferability, opens the door to scalable integration of high-quality quantum emitters in photonic quantum technologies.

  17. Fair Processes for Priority Setting: Putting Theory into Practice Comment on "Expanded HTA: Enhancing Fairness and Legitimacy".

    PubMed

    Jansen, Maarten P; Helderman, Jan-Kees; Boer, Bert; Baltussen, Rob

    2016-07-03

    Embedding health technology assessment (HTA) in a fair process has great potential to capture societal values relevant to public reimbursement decisions on health technologies. However, the development of such processes for priority setting has largely been theoretical. In this paper, we provide further practical lead ways on how these processes can be implemented. We first present the misconception about the relation between facts and values that is since long misleading the conduct of HTA and underlies the current assessment-appraisal split. We then argue that HTA should instead be explicitly organized as an ongoing evidence-informed deliberative process, that facilitates learning among stakeholders. This has important consequences for whose values to consider, how to deal with vested interests, how to consider all values in the decision-making process, and how to communicate decisions. This is in stark contrast to how HTA processes are implemented now. It is time to set the stage for HTA as learning. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  18. Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing

    ERIC Educational Resources Information Center

    Armbruster, Chris

    2008-01-01

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…

  19. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.

  20. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper. PMID:26710255

  1. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

    OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations that often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model that allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test it with benchmark codes and a cloud modeling code.
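
    The proposed constructs are OpenMP directives for C or Fortran; as a language-neutral illustration of what multidimensional work distribution means, the sketch below hand-distributes a 2D domain over a thread pool in (i, j) tiles. All names (process_tile, the tile and domain sizes) are invented for this example, and a thread pool is only an analogy for OpenMP threads.

      from concurrent.futures import ThreadPoolExecutor
      from itertools import product
      import numpy as np

      # Toy 2D domain split into disjoint BS x BS tiles; each tile is one
      # unit of work, distributed over workers in both dimensions at once.
      NX, NY, BS = 512, 512, 128
      grid = np.zeros((NX, NY))

      def process_tile(origin):
          i, j = origin
          grid[i:i + BS, j:j + BS] += 1.0   # stand-in for the real kernel
          return origin

      tiles = product(range(0, NX, BS), range(0, NY, BS))
      with ThreadPoolExecutor(max_workers=4) as pool:
          list(pool.map(process_tile, tiles))   # 16 tiles over 4 workers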

  2. EPRI and Schneider Electric Demonstrate Distributed Resource Communications

    Science.gov Websites

    The Electric Power Research Institute (EPRI) is designing, building, and testing a flexible, open-source approach to distributed resource communications involving a Schneider Electric ADMS, open software platforms, and an open-platform home energy management system.

  3. KSC-2013-4437

    NASA Image and Video Library

    2013-12-19

    VANDENBERG AIR FORCE BASE, Calif. -- A solid rocket motor is hauled away from its delivery truck and toward the open high bay door of the Solid Rocket Motor Processing Facility at Vandenberg Air Force Base in California. The motor will be attached to the United Launch Alliance Delta II rocket slated to launch NASA's Orbiting Carbon Observatory-2, or OCO-2, spacecraft in July 2014. OCO-2 will collect precise global measurements of carbon dioxide in the Earth's atmosphere. Scientists will analyze these data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. Photo credit: NASA/Randy Beaudoin

  4. Public opinion by a poll process: model study and Bayesian view

    NASA Astrophysics Data System (ADS)

    Lee, Hyun Keun; Kim, Yong Woon

    2018-05-01

    We study the formation of public opinion in a poll process where the current score is open to the public. The voters are assumed to vote probabilistically for or against their own preference, considering the group opinion collected up to that point in the score. The poll-score probability is found to follow the beta distribution in the limit of a large number of polls. We demonstrate that various poll results, even those contradictory to the population preference, are possible with non-zero probability density, and that such deviations are readily triggered by an initial bias. We note that our poll model can be understood from a Bayesian viewpoint.
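
    A minimal simulation makes the beta-distribution claim concrete. The sketch below reads the poll as a Polya-type urn in which each voter sides with the published score with probability equal to the current 'for' fraction; the paper's voters also weigh their own preference, which this simplified reading omits. Function and parameter names are ours.

      import numpy as np

      def simulate_poll(n_votes, a0=1.0, b0=1.0, rng=None):
          """One open-score poll: each voter votes 'for' with probability
          equal to the current 'for' fraction (Polya urn dynamics); a0 and
          b0 encode the initial bias of the published score."""
          rng = rng or np.random.default_rng()
          a, b = a0, b0
          for _ in range(n_votes):
              if rng.random() < a / (a + b):
                  a += 1.0
              else:
                  b += 1.0
          return a / (a + b)

      # Over many polls the final score is approximately Beta(a0, b0)
      # distributed, so scores far from 1/2 occur with non-zero density.
      scores = [simulate_poll(2000) for _ in range(500)]
      print(np.mean(scores), np.var(scores))  # Beta(1,1): mean 1/2, var 1/12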

  5. VOLOBSIS: An Infrastructure for Open Access to Seismic and GNSS Data from the Volcanological and Seismological French Observatories

    NASA Astrophysics Data System (ADS)

    Satriano, C.; Lemarchand, A.; Saurel, J. M. M.; Pardo, C.; Vincent, D.; de Chabalier, J. B.; Beauducel, F.; Shapiro, N.; Cyril, G.

    2016-12-01

    The three Volcanological and Seismological Observatories of the Institut de Physique du Globe de Paris (IPGP) are situated in the overseas French territories: the Martinique and Guadeloupe observatories in the Lesser Antilles and La Réunion Island in the Indian Ocean. The main missions of the IPGP observatories are to monitor French active volcanoes and the seismic activity associated with regional tectonics, and to foster scientific research on the Lesser Antilles arc and the La Réunion hotspot. To that end, the observatories operate, among others, permanent seismological and geodetic networks and continuously process and analyze the acquired data. The IPGP observatories have a long history of seismic and geodetic monitoring: the first seismograph in Martinique was installed in 1902, and starting in the early 1980s the three observatories began deploying permanent networks of analog sensors. During the 2010s, seismic and geodetic monitoring at the three observatories saw a significant breakthrough with the advent of broadband seismic sensors, digital recording and continuous GNSS receivers. This wealth of data today comprises 81 seismological stations (broadband and short period; networks GL, MQ, PF and WI) and 48 permanent GNSS stations. Data of both types are continuously recorded and acquired at the three observatories, as well as at the IPGP Data Center in Paris. Real-time streams for seismic data are available through a SeedLink server. Seismic and GNSS data are further validated and completed at IPGP and distributed through the VOLOBSIS web portal (http://volobsis.ipgp.fr), which provides download links as well as a web service interface. Seismic data are also available through IRIS, the European Integrated Data Archive (EIDA) and the French RESIF portal (http://seismology.resif.fr). Here we discuss the different steps of data recording, quality control and distribution behind VOLOBSIS, which provides an open data infrastructure for advancing the understanding of volcanic and tectonic deformation processes across the Lesser Antilles arc and at the La Réunion hotspot. We further discuss planned future updates, including an upcoming real-time catalog of seismicity and the open, real-time distribution of additional data types, such as tiltmeter and extensometer data, as well as geochemistry and meteorology data.

  6. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user cloud computing infrastructure hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter Notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with a spatial component. We tested the performance of the platform on a taxi trajectory analysis. The results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
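
    GISpark's own API is not reproduced here, but the flavour of its Spark-based parallel computing can be suggested with a few lines of generic PySpark. The HDFS path and the CSV schema below are hypothetical, and the grid-binning aggregation merely stands in for the platform's richer spatial operators.

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("taxi-grid-demo").getOrCreate()

      # Hypothetical file of taxi GPS fixes with columns: id, ts, lon, lat
      df = spark.read.csv("hdfs:///data/taxi_tracks.csv",
                          header=True, inferSchema=True)

      # Bin fixes into ~0.01-degree cells and count visits per cell; the
      # groupBy is executed in parallel across the cluster by Spark.
      cells = (df.withColumn("gx", F.floor(F.col("lon") / 0.01))
                 .withColumn("gy", F.floor(F.col("lat") / 0.01))
                 .groupBy("gx", "gy").count()
                 .orderBy(F.desc("count")))
      cells.show(10)   # the ten busiest grid cells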

  7. Omnetric Group Demonstrates Distributed Grid-Edge Control Hierarchy at NREL

    Science.gov Websites

    At NREL's Energy Systems Integration Facility (ESIF), OMNETRIC Group demonstrated a distributed grid-edge control hierarchy based on an open field message bus (OpenFMB) for coordinating distributed resources. OMNETRIC Group first developed and validated the system in the ESIF with a combination of distributed resources.

  8. Is There Evidence for Myelin Modeling by Astrocytes in the Normal Adult Brain?

    PubMed Central

    Varela-Echevarría, Alfredo; Vargas-Barroso, Víctor; Lozano-Flores, Carlos; Larriva-Sahd, Jorge

    2017-01-01

    A set of astrocytic processes associated with altered myelinated axons is described in the forebrain of normal adult rodents with confocal and electron microscopy and 3D reconstructions. Each process consists of a protuberance that contains secretory organelles, including numerous lysosomes, which polarize and open next to disrupted myelinated axons. Because of the distinctive asymmetric organelle distribution and ubiquity throughout the forebrain neuropil, this enlargement is named the paraxial process (PAP). The myelin envelope contiguous to the PAP displays focal disruption or disintegration. In routine electron microscopy, clusters of large, confluent lysosomes proved to be an effective landmark for PAP identification. In 3D assemblies, lysosomes organize a series of interconnected saccules that open up to the plasmalemma next to the disrupted myelin envelope(s). Activity for acid hydrolases was visualized in lysosomes, and extracellularly at the PAP-myelin interface and/or between the glial and neuronal outer aspects. Organelles in astrocytic processes involved in digesting pyknotic cells and debris resemble those encountered in PAPs, supporting a likewise lytic function of the latter. Conversely, processes entangling tripartite synapses and glomeruli were devoid of lysosomes. Neither oligodendrocytic nor microglial processes were associated with altered myelin envelopes. The possible roles of the PAP in myelin remodeling, in the context of oligodendrocyte-astrocyte interactions and the astrocyte's secretory pathways, are discussed. PMID:28932188

  9. Towards a Cloud Based Smart Traffic Management Framework

    NASA Astrophysics Data System (ADS)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework answers the technical challenges of efficient, real-time storage, management, processing and analysis of traffic big data. For the evaluation of the framework, we used OpenStreetMap (OSM) real trajectories and road networks in a distributed environment. Our evaluation results indicate that the speed of data import into this framework exceeds 8000 records per second when the dataset size is close to 5 million records. We also evaluated the performance of data retrieval in the proposed framework: the retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework through the parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  10. MIST: An Open Source Environmental Modelling Programming Language Incorporating Easy to Use Data Parallelism.

    NASA Astrophysics Data System (ADS)

    Bellerby, Tim

    2014-05-01

    Model Integration System (MIST) is an open-source environmental modelling programming language that directly incorporates data parallelism. The language is designed to enable straightforward programming structures, such as nested loops and conditional statements, to be directly translated into sequences of whole-array (or, more generally, whole-data-structure) operations. MIST thus enables the programmer to use well-understood constructs, directly relating to the mathematical structure of the model, without having to explicitly vectorize code or worry about details of parallelization. A range of common modelling operations are supported by dedicated language structures operating on cell neighbourhoods rather than individual cells (e.g., the 3x3 local neighbourhood needed to implement an averaging image filter can be simply accessed from within a simple loop traversing all image pixels). This facility hides details of inter-process communication behind more mathematically relevant descriptions of model dynamics. The MIST automatic vectorization/parallelization process serves both to distribute work among available nodes and to control storage requirements for intermediate expressions, enabling operations on very large domains for which memory availability may be an issue. MIST is designed to facilitate efficient interpreter-based implementations. A prototype open-source interpreter is available, coded in standard FORTRAN 95, with tools to rapidly integrate existing FORTRAN 77 or 95 code libraries. The language is formally specified and thus not limited to a FORTRAN implementation or to an interpreter-based approach. A MIST-to-FORTRAN compiler is under development, and volunteers are sought to create an ANSI C implementation. Parallel processing is currently implemented using OpenMP; however, the parallelization code is fully modularised and could be replaced with implementations using other libraries. A GPU implementation is potentially possible.
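
    The translation MIST performs can be illustrated in NumPy: the 3x3 averaging filter mentioned above, written as a per-pixel loop in MIST, corresponds to the whole-array form below. This is an analogy under our own assumptions (periodic boundaries via np.roll), not MIST output.

      import numpy as np

      def average_filter_3x3(img):
          """Whole-array 3x3 mean filter: nine shifted copies of the image
          are summed, the vectorized equivalent of visiting each pixel's
          local neighbourhood in a loop. np.roll wraps at the edges, i.e.
          periodic boundary conditions."""
          out = np.zeros_like(img, dtype=float)
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  out += np.roll(np.roll(img, di, axis=0), dj, axis=1)
          return out / 9.0

      img = np.random.rand(256, 256)
      smoothed = average_filter_3x3(img)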

  11. Hidden biosphere in an oxygen-deficient Atlantic open ocean eddy: future implications of ocean deoxygenation on primary production in the eastern tropical North Atlantic

    NASA Astrophysics Data System (ADS)

    Loescher, Carolin; Fischer, Martin; Neulinger, Sven; Fiedler, Björn; Philippi, Miriam; Schütte, Florian; Singh, Arvind; Hauss, Helena; Karstensen, Johannes; Körtzinger, Arne; Schmitz, Ruth

    2016-04-01

    The eastern tropical North Atlantic (ETNA) is characterized by a highly productive coastal upwelling system and a moderate oxygen minimum zone with lowest open ocean oxygen (O2) concentrations of approximately 40 μmol kg-1. The recent discovery of re-occurring mesoscale eddies with close to anoxic O2 concentrations (<1 μmol kg-1) located just below the mixed layer has challenged our understanding of O2 distribution and biogeochemical processes in this area. Here, we present the first microbial community study from a deoxygenated anticyclonic modewater eddy in the open waters of the ETNA. In the eddy, we observed significantly lower bacterial diversity compared to surrounding waters, along with a significant community shift. We detected enhanced primary productivity in the surface layer of the eddy, indicated by elevated chlorophyll concentrations and carbon uptake rates up to three times as high as in surrounding waters. Carbon uptake rates below the euphotic zone correlated to the presence of a specific high-light ecotype of Prochlorococcus, which is usually underrepresented in the ETNA. Our data indicate that high primary production in the eddy fuels export production and supports enhanced respiration in a specific microbial community at shallow depths, below the mixed layer base. The O2-depleted core waters of the eddy promoted transcription of the key gene for denitrification, nirS. This process is usually absent from the open ETNA waters. In light of projected future ocean deoxygenation, our results show that even distinct events of anoxia have the potential to alter microbial community structure, with critical impacts on primary productivity and biogeochemical processes of oceanic water bodies.

  12. Hidden biosphere in an oxygen-deficient Atlantic open ocean eddy: future implications of ocean deoxygenation on primary production in the eastern tropical North Atlantic

    NASA Astrophysics Data System (ADS)

    Löscher, C. R.; Fischer, M. A.; Neulinger, S. C.; Fiedler, B.; Philippi, M.; Schütte, F.; Singh, A.; Hauss, H.; Karstensen, J.; Körtzinger, A.; Künzel, S.; Schmitz, R. A.

    2015-08-01

    The eastern tropical North Atlantic (ETNA) is characterized by a highly productive coastal upwelling system and a moderate oxygen minimum zone with lowest open ocean oxygen (O2) concentrations of around 40 μmol kg-1. Only recently, the discovery of re-occurring mesoscale eddies with O2 concentrations sometimes close to anoxic (<1 μmol kg-1), located just below the mixed layer, challenged our understanding of O2 distribution and biogeochemical processes in this area. Here, we present the first metagenomic dataset from a deoxygenated anticyclonic modewater eddy in the open waters of the ETNA. In the eddy, we observed a significantly lower bacterial diversity compared to surrounding waters, along with a significant community shift. We detected enhanced primary productivity in the surface layer of the eddy, indicated by elevated chlorophyll concentrations and carbon uptake rates up to three times as high as in surrounding waters. Carbon uptake below the euphotic zone correlated to the presence of a specific high-light ecotype of Prochlorococcus, which is usually underrepresented in the ETNA. Our combined data indicate that high primary production in the eddy fuels export production and the presence of a specific microbial community responsible for enhanced respiration at shallow depths, below the mixed layer base. Progressively decreasing O2 concentrations in the eddy were found to promote transcription of the key gene for denitrification, nirS, in the O2-depleted core waters. This process is usually absent from the open ETNA waters. In light of future ocean deoxygenation, our results exemplify that even distinct events of anoxia have the potential to alter microbial community structures and thereby critically impact primary productivity and biogeochemical processes of oceanic water bodies.

  13. Hidden biosphere in an oxygen-deficient Atlantic open-ocean eddy: future implications of ocean deoxygenation on primary production in the eastern tropical North Atlantic

    NASA Astrophysics Data System (ADS)

    Löscher, C. R.; Fischer, M. A.; Neulinger, S. C.; Fiedler, B.; Philippi, M.; Schütte, F.; Singh, A.; Hauss, H.; Karstensen, J.; Körtzinger, A.; Künzel, S.; Schmitz, R. A.

    2015-12-01

    The eastern tropical North Atlantic (ETNA) is characterized by a highly productive coastal upwelling system and a moderate oxygen minimum zone with lowest open-ocean oxygen (O2) concentrations of approximately 40 μmol kg-1. The recent discovery of re-occurring mesoscale eddies with close to anoxic O2 concentrations (< 1 μmol kg-1) located just below the mixed layer has challenged our understanding of O2 distribution and biogeochemical processes in this area. Here, we present the first microbial community study from a deoxygenated anticyclonic modewater eddy in the open waters of the ETNA. In the eddy, we observed significantly lower bacterial diversity compared to surrounding waters, along with a significant community shift. We detected enhanced primary productivity in the surface layer of the eddy, indicated by elevated chlorophyll concentrations and carbon uptake rates up to three times as high as in surrounding waters. Carbon uptake rates below the euphotic zone correlated to the presence of a specific high-light ecotype of Prochlorococcus, which is usually underrepresented in the ETNA. Our data indicate that high primary production in the eddy fuels export production and supports enhanced respiration in a specific microbial community at shallow depths, below the mixed-layer base. The transcription of the key functional marker gene for denitrification, nirS, further indicated a potential for nitrogen loss processes in the O2-depleted core waters of the eddy. Denitrification is usually absent from the open ETNA waters. In light of projected future ocean deoxygenation, our results show that even distinct events of anoxia have the potential to alter microbial community structure, with critical impacts on primary productivity and biogeochemical processes of oceanic water bodies.

  14. MzJava: An open source library for mass spectrometry data processing.

    PubMed

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS-related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans, as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export, MzJava implements readers and writers for commonly used data formats. For many classes, support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) cluster-computing frameworks was implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use, the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
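
    MzJava is a Java API, so its actual classes are not reproduced here; the sketch below only illustrates, in Python, the kind of peak processing ('centroiding') such a library provides: collapsing each above-threshold run of profile points into a single intensity-weighted peak. All names and the toy spectrum are ours.

      import numpy as np

      def centroid(mz, intensity, threshold):
          """Collapse a profile-mode spectrum into centroided peaks: every
          contiguous run of points above the threshold becomes one peak at
          the intensity-weighted mean m/z of the run."""
          peaks, i, n = [], 0, len(mz)
          while i < n:
              if intensity[i] > threshold:
                  j = i
                  while j < n and intensity[j] > threshold:
                      j += 1
                  w = intensity[i:j]
                  peaks.append((float(np.dot(mz[i:j], w) / w.sum()),
                                float(w.max())))
                  i = j
              else:
                  i += 1
          return peaks

      # Tiny synthetic profile spectrum with two Gaussian peaks
      mz = np.linspace(100.0, 100.5, 200)
      inten = (np.exp(-((mz - 100.1) / 0.01) ** 2)
               + 0.5 * np.exp(-((mz - 100.4) / 0.01) ** 2))
      print(centroid(mz, inten, threshold=0.05))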

  15. Self-organised criticality and 1/f noise in single-channel current of voltage-dependent anion channel

    NASA Astrophysics Data System (ADS)

    Banerjee, J.; Verma, M. K.; Manna, S.; Ghosh, S.

    2006-02-01

    The noise profile of the voltage-dependent anion channel (VDAC) is investigated in the open channel state. Single-channel currents through VDAC from rat brain mitochondria reconstituted into a planar lipid bilayer are recorded under different voltage-clamped conditions across the membrane. Power spectrum analysis of the current indicates power-law noise of 1/f nature. Moreover, this 1/f nature of the open channel noise is seen throughout the range of applied membrane potentials from -30 to +30 mV. We propose that 1/f noise in the open ion channel arises from obstruction in the passage of ions across the membrane. The process is recognised as a phenomenon of self-organized criticality (SOC), like sandpile avalanches and other physical systems. Based on SOC, it is established theoretically that the ion-channel system follows power-law noise, as observed in our experiments. We also show that the first-time return probability of the current fluctuations obeys a power-law distribution.
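
    The power-spectrum analysis referred to above amounts to fitting a slope on a log-log periodogram. A minimal sketch, assuming the current trace is a uniformly sampled NumPy array; the paper's exact estimator may differ (e.g. Welch averaging).

      import numpy as np

      def spectral_exponent(current, dt):
          """Estimate alpha in S(f) ~ 1/f**alpha for a single-channel
          current trace via a straight-line fit to the log-log
          periodogram (DC bin excluded)."""
          spec = np.abs(np.fft.rfft(current - current.mean())) ** 2
          freq = np.fft.rfftfreq(len(current), d=dt)
          mask = freq > 0
          slope, _ = np.polyfit(np.log(freq[mask]), np.log(spec[mask]), 1)
          return -slope   # alpha close to 1 indicates 1/f noise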

  16. Vertical mercury distributions in the oceans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, G.A.; Fitzgerald, W.F.

    1988-06-01

    The vertical distribution of mercury (Hg) was determined at coastal and open ocean sites in the northwest Atlantic and Pacific Oceans. Reliable and diagnostic Hg distributions were obtained, permitting major processes governing the marine biogeochemistry of Hg to be identified. The northwest Atlantic near Bermuda showed surface water Hg concentrations near 4 pM, a maximum of 10 pM within the main thermocline, and concentrations less than or equal to surface water values below the depth of the maximum. The maximum appears to result from lateral transport of Hg-enriched waters from higher latitudes. In the central North Pacific, surface waters (to 940 m) were slightly elevated (1.9 ± 0.7 pM) compared to deeper waters (1.4 ± 0.4 pM), but no thermocline Hg maximum was observed. At similar depths, Hg concentrations near Bermuda were elevated compared to the central North Pacific Ocean. The authors hypothesize that the source of this Hg is diagenetic reactions in oxic margin sediments, releasing dissolved Hg to the overlying water. Geochemical steady-state box modeling arguments predict a relatively short (~350 years) mean residence time for Hg in the oceans, demonstrating the reactive nature of Hg in seawater and precluding significant involvement in nutrient-type recycling. Mercury's distributional features and reactive nature suggest that interactions of Hg with settling particulate matter and margin sediments play important roles in regulating oceanic Hg concentrations. Oceanic Hg distributions are governed by an external cycling process, in which water column distributions reflect a rapid competition between the magnitude of the input source and the intensity of the (water column) removal process.

  17. Open-source sea ice drift algorithm for Sentinel-1 SAR imagery using a combination of feature-tracking and pattern-matching

    NASA Astrophysics Data System (ADS)

    Muckenhuber, Stefan; Sandven, Stein

    2017-04-01

    An open-source sea ice drift algorithm for Sentinel-1 SAR imagery is introduced, based on the combination of feature tracking and pattern matching. A computationally efficient feature-tracking algorithm produces an initial drift estimate and limits the search area for the pattern matching, which provides small- to medium-scale drift adjustments and normalised cross-correlation values as a quality measure. The algorithm is designed to utilise the respective advantages of the two approaches and allows drift calculation at user-defined locations. The pre-processing of the Sentinel-1 data has been optimised to retrieve a feature distribution that depends less on SAR backscatter peak values. A recommended parameter set for the algorithm was found using a representative image pair over the Fram Strait and 350 manually derived drift vectors as validation. Applying the algorithm with this parameter setting, sea ice drift retrieval with a vector spacing of 8 km on Sentinel-1 images covering 400 km x 400 km takes less than 3.5 minutes on a standard 2.7 GHz processor with 8 GB memory. For validation, buoy GPS data, collected between 15 January and 22 April 2015 and covering an area from 81° N to 83.5° N and 12° E to 27° E, were compared to calculated drift results from 261 corresponding Sentinel-1 image pairs. We found a logarithmic distribution of the error, with a peak at 300 m. All software requirements of the presented sea ice drift algorithm are open-source, ensuring free implementation and easy distribution.
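
    The two-stage logic, feature tracking for a first guess and pattern matching for the fine adjustment plus a quality measure, can be sketched as follows. The window sizes and the brute-force search are illustrative choices of ours, not the published implementation, and (x, y) is assumed to lie well inside the first image.

      import numpy as np

      def refine_drift(img1, img2, x, y, guess, tpl=32, search=16):
          """Refine a feature-tracking drift estimate (guess, in pixels)
          by normalised cross-correlation of a template around (x, y) in
          img1 against a small search window in img2."""
          h = tpl // 2
          t = img1[y - h:y + h, x - h:x + h].astype(float)
          t = (t - t.mean()) / (t.std() + 1e-12)
          gx, gy = guess
          best, best_cc = guess, -1.0
          for dy in range(gy - search, gy + search + 1):
              for dx in range(gx - search, gx + search + 1):
                  w = img2[y + dy - h:y + dy + h,
                           x + dx - h:x + dx + h].astype(float)
                  if w.shape != t.shape:
                      continue          # search window left the image
                  w = (w - w.mean()) / (w.std() + 1e-12)
                  cc = float((t * w).mean())
                  if cc > best_cc:
                      best_cc, best = cc, (dx, dy)
          return best, best_cc   # drift in pixels and its NCC quality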

  18. Making geospatial data in ASF archive readily accessible

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces developed specifically for its particular archive and data sets. ASF then moved to an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools for more tailored access to the data they needed. We present two new approaches for serving data to users. In response to the recent Nepal earthquake, we developed a data feed for distributing ESA's Sentinel data: users can subscribe to the feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.
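
    Because the WFS follows the OGC standard, any client can pull the archive's metadata with an ordinary GetFeature request. A minimal sketch with the Python requests library; the endpoint URL, feature type name and property names below are placeholders, not ASF's actual service details.

      import requests

      # Hypothetical endpoint and layer; the real service details are
      # published by ASF and are not reproduced here.
      WFS_URL = "https://example.asf.alaska.edu/geoserver/wfs"

      params = {
          "service": "WFS",
          "version": "2.0.0",
          "request": "GetFeature",
          "typeNames": "asf:granules",        # hypothetical feature type
          "outputFormat": "application/json",
          "count": 10,
      }
      resp = requests.get(WFS_URL, params=params, timeout=30)
      resp.raise_for_status()
      for feature in resp.json()["features"]:
          props = feature["properties"]       # hypothetical property names
          print(props.get("granule_name"), props.get("download_url"))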

  19. Advanced capabilities for materials modelling with Quantum ESPRESSO

    NASA Astrophysics Data System (ADS)

    Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.

    2017-11-01

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  20. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Giannozzi, P; Andreussi, O; Brumme, T; Bunau, O; Buongiorno Nardelli, M; Calandra, M; Car, R; Cavazzoni, C; Ceresoli, D; Cococcioni, M; Colonna, N; Carnimeo, I; Dal Corso, A; de Gironcoli, S; Delugas, P; DiStasio, R A; Ferretti, A; Floris, A; Fratesi, G; Fugallo, G; Gebauer, R; Gerstmann, U; Giustino, F; Gorni, T; Jia, J; Kawamura, M; Ko, H-Y; Kokalj, A; Küçükbenli, E; Lazzeri, M; Marsili, M; Marzari, N; Mauri, F; Nguyen, N L; Nguyen, H-V; Otero-de-la-Roza, A; Paulatto, L; Poncé, S; Rocca, D; Sabatini, R; Santra, B; Schlipf, M; Seitsonen, A P; Smogunov, A; Timrov, I; Thonhauser, T; Umari, P; Vast, N; Wu, X; Baroni, S

    2017-10-24

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  1. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Andreussi, Oliviero; Brumme, Thomas; Bunau, Oana; Buongiorno Nardelli, Marco; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Cococcioni, Matteo; Colonna, Nicola; Carnimeo, Ivan; Dal Corso, Andrea; de Gironcoli, Stefano; Delugas, Pietro; DiStasio, Robert; Ferretti, Andrea; Floris, Andrea; Fratesi, Guido; Fugallo, Giorgia; Gebauer, Ralph; Gerstmann, Uwe; Giustino, Feliciano; Gorni, Tommaso; Jia, Junteng; Kawamura, Mitsuaki; Ko, Hsin-Yu; Kokalj, Anton; Küçükbenli, Emine; Lazzeri, Michele; Marsili, Margherita; Marzari, Nicola; Mauri, Francesco; Nguyen, Ngoc Linh; Nguyen, Huy-Viet; Otero-de-la-Roza, Alberto; Paulatto, Lorenzo; Poncé, Samuel; Giannozzi, Paolo; Rocca, Dario; Sabatini, Riccardo; Santra, Biswajit; Schlipf, Martin; Seitsonen, Ari Paavo; Smogunov, Alexander; Timrov, Iurii; Thonhauser, Timo; Umari, Paolo; Vast, Nathalie; Wu, Xifan; Baroni, Stefano

    2017-09-27

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software. © 2017 IOP Publishing Ltd.

  2. Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation

    NASA Astrophysics Data System (ADS)

    Lu, B.; Piasecki, M.

    2008-12-01

    This paper presents the development of an integrated hydrologic model that involves functionalities for digital watershed processing, online data retrieval, hydrologic simulation and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step in developing this system, a physics-based distributed hydrologic model, PIHM (Penn State Integrated Hydrologic Model), is wrapped into the OpenMI (Open Modeling Interface and Environment) environment so as to seamlessly interact with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial and boundary conditions. They are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a further module automatically displays output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are either stored in an observation database (OD) following the schema of the Observation Data Model (ODM), in the case of time series, or in grid-based storage, which may be a format like netCDF or a grid-based database schema. Specific development steps include the creation of a bridge to overcome the interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model; this module is responsible for delineating the watershed and stream network from digital elevation models. Visualizing and editing geospatial data is achieved with MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a practical watershed, the performance of the model can be tested by the post-event analysis module.

  3. Optimization of a point-focusing, distributed receiver solar thermal electric system

    NASA Technical Reports Server (NTRS)

    Pons, R. L.

    1979-01-01

    This paper presents an approach to optimization of a solar concept which employs solar-to-electric power conversion at the focus of parabolic dish concentrators. The optimization procedure is presented through a series of trade studies, which include the results of optical/thermal analyses and individual subsystem trades. Alternate closed-cycle and open-cycle Brayton engines and organic Rankine engines are considered to show the influence of the optimization process, and various storage techniques are evaluated, including batteries, flywheels, and hybrid-engine operation.

  4. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing, by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; and 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096

  5. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing, by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; and 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
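
    The Map and Reduce tasks named above are ordinary functional-programming steps; stripped of Hadoop's distribution, fault tolerance and HDFS, the paradigm reduces to the following toy Python example. The record layout is invented for illustration.

      from functools import reduce
      from itertools import groupby

      # Toy records standing in for clinical events: (patient, lab, value)
      records = [("p1", "GLU", 5.4), ("p2", "GLU", 7.9), ("p1", "GLU", 6.1)]

      # Map task: emit (key, value) pairs.
      mapped = [(lab, value) for _, lab, value in records]

      # Shuffle: group pairs by key, as the framework does between phases.
      mapped.sort(key=lambda kv: kv[0])
      grouped = {k: [v for _, v in g]
                 for k, g in groupby(mapped, key=lambda kv: kv[0])}

      # Reduce task: fold each key's values into one result (here, a mean).
      reduced = {k: reduce(lambda a, b: a + b, vs) / len(vs)
                 for k, vs in grouped.items()}
      print(reduced)   # {'GLU': 6.466...}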

  6. Do substorms energise the ring current?

    NASA Astrophysics Data System (ADS)

    Sandhu, J. K.; Rae, J.; Freeman, M. P.; Forsyth, C.; Jackman, C. M.; Lam, M. M.

    2017-12-01

    The substorm phenomenon is a highly dynamic and variable process that results in the global reconfiguration and redistribution of energy within the magnetosphere. There are many open questions surrounding substorms, particularly how the energy released during a substorm is distributed throughout the magnetosphere, and how the energy loss varies from one substorm to the next. In this study, we explore whether energy lost during the substorm plays a role in energising the ring current. Using observations of the particle energy flux from RBSPICE/RBSP, we are able to quantitatively observe how the energy is distributed spatially and across the different ion species (H+, He+, and O+). Furthermore, we can observe how the total energy content of the ring current changes during the substorm process, using substorm phases defined by the SOPHIE algorithm. This analysis provides information on how the energy released from a substorm is partitioned throughout the magnetosphere, and on the processes determining the energy provided to the ring current. Overall, our results show that the substorm-ring current coupling is more complex than originally thought, and we discuss the reasons behind this complex response.

  7. Flows and Stratification of an Enclosure Containing Both Localised and Vertically Distributed Sources of Buoyancy

    NASA Astrophysics Data System (ADS)

    Partridge, Jamie; Linden, Paul

    2013-11-01

    We examine the flows and stratification established in a naturally ventilated enclosure containing both a localised and a vertically distributed source of buoyancy. The enclosure is ventilated through upper and lower openings which connect the space to an external ambient. Small-scale laboratory experiments were carried out with water as the working medium and buoyancy driven directly by temperature differences. A point-source plume provided localised heating, while the distributed source was driven by a controllable heater mat located in the side wall of the enclosure. The transient temperatures, as well as steady-state temperature profiles, were recorded and are reported here. The temperature profiles inside the enclosure were found to depend on the effective opening area A*, a combination of the upper and lower openings, and on the ratio of buoyancy fluxes from the distributed and localised sources, Ψ = Bw/Bp. Industrial CASE award with ARUP.

  8. Low-Pressure Burst-Mode Focused Ultrasound Wave Reconstruction and Mapping for Blood-Brain Barrier Opening: A Preclinical Examination

    PubMed Central

    Xia, Jingjing; Tsui, Po-Hsiang; Liu, Hao-Li

    2016-01-01

    Burst-mode focused ultrasound (FUS) exposure has been shown to induce transient blood-brain barrier (BBB) opening for potential CNS drug delivery. FUS-BBB opening requires imaging guidance during the intervention, yet current imaging technology only enables postoperative outcome confirmation. In this study, we propose an approach to visualize the short-burst, low-pressure focal beam distribution that can be applied during FUS-BBB opening interventions in small animals. A backscattered acoustic-wave reconstruction method based on synchronization among focused ultrasound emission, diagnostic ultrasound receiving, and passively beamformed processing was developed. We observed that the focal beam could be successfully visualized for in vitro FUS exposure at 0.5–2 MHz without the involvement of microbubbles. The detectable level of FUS exposure was 0.467 MPa in pressure and 0.05 ms in burst length. The signal intensity (SI) of the reconstructions was linearly correlated with the FUS exposure level both in vitro (r2 = 0.9878) and in vivo (r2 = 0.9943), and the SI level of the reconstructed focal beam also correlated with the success and level of BBB opening. The proposed approach provides a feasible way to perform real-time and closed-loop control of FUS-based brain drug delivery. PMID:27295608
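
    The 'passively beamformed processing' step is, in essence, delay-and-sum: each diagnostic-array channel is shifted by the travel time from the focal point to that element, and the aligned traces are summed. A generic sketch under our own simplifying assumptions (linear array, constant sound speed, a single focal point), not the authors' pipeline.

      import numpy as np

      def delay_and_sum(signals, fs, element_x, focus, c=1540.0):
          """Passive delay-and-sum beamforming. signals has shape
          (n_elem, n_samp); element_x holds element positions (m);
          focus is the focal point (x, z) in metres; fs is in Hz."""
          n_samp = signals.shape[1]
          out = np.zeros(n_samp)
          for trace, x in zip(signals, element_x):
              tau = np.hypot(focus[0] - x, focus[1]) / c   # travel time, s
              shift = min(int(round(tau * fs)), n_samp)
              out[:n_samp - shift] += trace[shift:]        # align and sum
          return out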

  9. openBIS: a flexible framework for managing and analyzing complex data in biology research

    PubMed Central

    2011-01-01

    Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share and publish data, and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573

  10. Low-Pressure Burst-Mode Focused Ultrasound Wave Reconstruction and Mapping for Blood-Brain Barrier Opening: A Preclinical Examination

    NASA Astrophysics Data System (ADS)

    Xia, Jingjing; Tsui, Po-Hsiang; Liu, Hao-Li

    2016-06-01

    Burst-mode focused ultrasound (FUS) exposure has been shown to induce transient blood-brain barrier (BBB) opening for potential CNS drug delivery. FUS-BBB opening requires imaging guidance during the intervention, yet current imaging technology only enables postoperative outcome confirmation. In this study, we propose an approach to visualize the short-burst, low-pressure focal beam distribution that can be applied during FUS-BBB opening interventions in small animals. A backscattered acoustic-wave reconstruction method based on synchronization among focused ultrasound emission, diagnostic ultrasound receiving, and passively beamformed processing was developed. We observed that the focal beam could be successfully visualized for in vitro FUS exposure at 0.5-2 MHz without the involvement of microbubbles. The detectable level of FUS exposure was 0.467 MPa in pressure and 0.05 ms in burst length. The signal intensity (SI) of the reconstructions was linearly correlated with the FUS exposure level both in vitro (r2 = 0.9878) and in vivo (r2 = 0.9943), and the SI level of the reconstructed focal beam also correlated with the success and level of BBB opening. The proposed approach provides a feasible way to perform real-time and closed-loop control of FUS-based brain drug delivery.

  11. Averaging in SU(2) open quantum random walk

    NASA Astrophysics Data System (ADS)

    Clement, Ampadu

    2014-03-01

    We study the average position and the symmetry of the distribution in the SU(2) open quantum random walk (OQRW). We show that the average position in the central limit theorem (CLT) regime is non-uniform compared with the average position in the non-CLT regime. The symmetry of the distribution is shown to be even in the CLT regime.
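
    For readers unfamiliar with OQRWs, the sketch below iterates the defining map rho'_x = B_L rho_{x+1} B_L^† + B_R rho_{x-1} B_R^† on the line and reads off the position distribution and its mean. The Kraus pair used here (two fixed unitaries scaled by cos θ and sin θ, so that B_L^†B_L + B_R^†B_R = I) is an illustrative choice of ours, not the paper's specific SU(2) parametrization.

      import numpy as np

      def oqrw(steps=100, theta=np.pi / 3):
          """Minimal open quantum random walk on the integers."""
          U1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
          U2 = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
          BL, BR = np.cos(theta) * U1, np.sin(theta) * U2
          rho = {0: np.eye(2, dtype=complex) / 2}       # start at origin
          for _ in range(steps):
              new = {}
              for x, r in rho.items():
                  for dx, B in ((-1, BL), (+1, BR)):
                      new[x + dx] = new.get(x + dx, 0) + B @ r @ B.conj().T
              rho = new
          xs = sorted(rho)
          probs = np.array([rho[x].trace().real for x in xs])
          return xs, probs, float(np.dot(xs, probs))    # mean position

      xs, probs, mean = oqrw()
      print(mean)   # average position after 100 steps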

  12. ISEE 1 charged particle observations indicative of open magnetospheric field lines near the subsolar region

    NASA Technical Reports Server (NTRS)

    Williams, D. J.; Frank, L. A.

    1980-01-01

    On November 20, 1977, at 0230-0300 UT, ISEE 1 encountered unusual charged particle distributions within the magnetosphere. The three-dimensional distribution observations for energetic (greater than 24 keV) ions and plasma show the development of field-aligned asymmetries in the energetic ion distributions simultaneously with a marked change in plasma flow. It is concluded that the most likely explanation for these observations is that ISEE 1 encountered open magnetospheric field lines at its position within the magnetosphere (1030 LT and 1200 plus or minus 300 km from the magnetopause). Field lines were open near the geomagnetic equator, and the geometry was spatially or temporally variable. Other features of the field line topology are presented.

  13. Interactive, process-oriented climate modeling with CLIMLAB

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2016-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and the freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The Jupyter Notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields.

  14. Voltage-Load Sensitivity Matrix Based Demand Response for Voltage Control in High Solar Penetration Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Xiangqi; Wang, Jiyu; Mulcahy, David

    This paper presents a voltage-load sensitivity matrix (VLSM) based voltage control method that deploys demand response resources to control voltage in high-solar-penetration distribution feeders. The IEEE 123-bus system in OpenDSS is used to test the performance of the preliminary VLSM-based voltage control approach. A load disaggregation process is applied to disaggregate the total load profile at the feeder head to each load node along the feeder, so that loads are modeled at the residential house level. Measured solar generation profiles are used in the simulation to model the impact of solar power on distribution feeder voltage profiles. Different case studies involving various PV penetration levels and installation locations have been performed. Simulation results show that the VLSM algorithm meets the voltage control requirements and is an effective voltage control strategy.
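
    At its core, a VLSM-based dispatch solves a small linear system: if S maps per-node demand-response load (kW) to per-node voltage drop (p.u.), the controller picks the load change whose predicted drop cancels the observed PV-driven overvoltage. A minimal sketch with invented numbers; the actual method and feeder model live in OpenDSS.

      import numpy as np

      # Hypothetical 3-node feeder: S[i, j] is the voltage drop (p.u.) at
      # node i per kW of demand-response load added at node j, i.e. a VLSM
      # estimated from perturbation runs (all values invented here).
      S = np.array([[4e-4, 1e-4, 0e-4],
                    [1e-4, 5e-4, 2e-4],
                    [0e-4, 2e-4, 6e-4]])

      v_over = np.array([0.004, 0.012, 0.018])   # overvoltage above limit

      # Dispatch: load increases dP (kW) whose combined voltage drop
      # cancels the overvoltage, in the least-squares sense.
      dP, *_ = np.linalg.lstsq(S, v_over, rcond=None)
      print(np.round(dP, 1))               # ~ [ 6.9 12.2 25.9] kW
      print(np.round(v_over - S @ dP, 6))  # residual violation ~ 0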

  15. Distribution of Nd3+ ions in oxyfluoride glass ceramics

    PubMed Central

    2012-01-01

    It has been an open question whether Nd3+ ions are incorporated into the crystalline phase in oxyfluoride glass ceramics. Moreover, related research has indicated that spectral characteristics display only minor differences before and after heat treatment in oxyfluoride glass, compared to similar Er3+-, Yb3+-, Tm3+-, Eu3+-, etc.-doped materials. Here, we have studied the distribution of Nd3+ ions in oxyfluoride glass ceramics by X-ray diffraction quantitative analysis and found that almost none of the Nd3+ ions are incorporated into the crystalline phase. To confirm the soundness of this approach, conventional mathematical calculation and energy-dispersive spectrometry line scanning are employed, and the results show good consistency. The distribution of Nd3+ ions in oxyfluoride glass ceramics reported here is significant for further optical investigations and applications of rare-earth-doped oxyfluoride glass ceramics. PMID:22647385

  16. Magnetospheric Multiscale Observation of Plasma Velocity-Space Cascade: Hermite Representation and Theory.

    PubMed

    Servidio, S; Chasapis, A; Matthaeus, W H; Perrone, D; Valentini, F; Parashar, T N; Veltri, P; Gershman, D; Russell, C T; Giles, B; Fuselier, S A; Phan, T D; Burch, J

    2017-11-17

    Plasma turbulence is investigated using unprecedented high-resolution ion velocity distribution measurements by the Magnetospheric Multiscale mission (MMS) in the Earth's magnetosheath. This novel observation of a highly structured particle distribution suggests a cascadelike process in velocity space. Complex velocity space structure is investigated using a three-dimensional Hermite transform, revealing, for the first time in observational data, a power-law distribution of moments. In analogy to hydrodynamics, a Kolmogorov approach leads directly to a range of predictions for this phase-space transport. The scaling theory is found to be in agreement with observations. The combined use of state-of-the-art MMS data sets, novel implementation of a Hermite transform method, and scaling theory of the velocity cascade opens new pathways to the understanding of plasma turbulence and the crucial velocity space features that lead to dissipation in plasmas.
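
    The Hermite representation used here projects the measured distribution onto orthonormal Hermite functions and examines how the coefficient magnitudes decay with order, in analogy with a Fourier spectrum. A 1D sketch under our own normalization conventions (the MMS analysis is three-dimensional):

      import numpy as np
      from math import factorial, pi, sqrt
      from numpy.polynomial.hermite_e import hermeval

      def hermite_moments(v, f, m_max=15):
          """Project a sampled 1D velocity distribution f(v) onto
          orthonormal (probabilists') Hermite functions psi_m; the decay
          of |p_m| with m plays the role of a spectrum for the
          velocity-space cascade."""
          moments = []
          for m in range(m_max):
              c = np.zeros(m + 1)
              c[m] = 1.0
              psi_m = (hermeval(v, c) * np.exp(-v ** 2 / 4)
                       / ((2 * pi) ** 0.25 * sqrt(factorial(m))))
              moments.append(abs(np.trapz(f * psi_m, v)))
          return np.array(moments)

      # Example: a slightly non-Maxwellian distribution on a velocity grid
      v = np.linspace(-8, 8, 400)
      f = np.exp(-v ** 2 / 2) * (1 + 0.1 * v ** 3 * np.exp(-v ** 2 / 8))
      print(hermite_moments(v, f))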

  17. How the deployment of attention determines what we see

    PubMed Central

    Treisman, Anne

    2007-01-01

    Attention is a tool to adapt what we see to our current needs. It can be focused narrowly on a single object or spread over several or distributed over the scene as a whole. In addition to increasing or decreasing the number of attended objects, these different deployments may have different effects on what we see. This chapter describes some research both on focused attention and its use in binding features, and on distributed attention and the kinds of information we gain and lose with the attention window opened wide. One kind of processing that we suggest occurs automatically with distributed attention results in a statistical description of sets of similar objects. Another gives the gist of the scene, which may be inferred from sets of features registered in parallel. Flexible use of these different modes of attention allows us to reconcile sharp capacity limits with a richer understanding of the visual scene. PMID:17387378

  18. Tubular filamentation for laser material processing

    PubMed Central

    Xie, Chen; Jukna, Vytautas; Milián, Carles; Giust, Remo; Ouadghiri-Idrissi, Ismail; Itina, Tatiana; Dudley, John M.; Couairon, Arnaud; Courvoisier, Francois

    2015-01-01

    An open challenge in the important field of femtosecond laser material processing is the controlled internal structuring of dielectric materials. Although the availability of high-energy, high-repetition-rate femtosecond lasers has led to many advances in this field, writing structures within transparent dielectrics at intensities exceeding 10^13 W/cm^2 has remained difficult as it is associated with significant nonlinear spatial distortion. This letter reports the existence of a new propagation regime for femtosecond pulses at high power that overcomes this challenge, associated with the generation of a hollow, uniform and intense light tube that remains propagation-invariant even at intensities associated with dense plasma formation. This regime is seeded from higher-order nondiffracting Bessel beams, which carry an optical vortex charge. Numerical simulations are quantitatively confirmed by experiments where a novel experimental approach allows direct imaging of the 3D fluence distribution within transparent solids. We also analyze the transitions to other propagation regimes in near and far fields. We demonstrate how the generation of plasma in this tubular geometry can lead to applications in ultrafast laser material processing in terms of single-shot index writing, and discuss how it opens important perspectives for material compression and filamentation guiding in atmosphere. PMID:25753215
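
    For intuition, the ring-shaped ("tubular") transverse profile of a vortex-carrying Bessel beam can be reproduced in a few lines; the order and radial wavevector below are arbitrary choices, not the experiment's parameters.

```python
import numpy as np
from scipy.special import jv

n = 5                      # Bessel order = optical vortex charge (assumed)
k_r = 4.0e6                # radial wavevector, 1/m (assumed)
r = np.linspace(0.0, 10e-6, 500)

# |J_n(k_r r)|^2: zero on axis for n > 0, with a bright ring around it --
# the hollow light tube seeded by a higher-order Bessel beam.
intensity = jv(n, k_r * r) ** 2
print("ring radius ≈ %.2f µm" % (1e6 * r[np.argmax(intensity)]))
```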

  19. New Health Technologies: A UK Perspective Comment on "Providing Value to New Health Technology: The Early Contribution of Entrepreneurs, Investors, and Regulatory Agencies".

    PubMed

    Parvizi, Nassim; Parvizi, Sahar

    2017-05-09

    New health technologies require development and evaluation ahead of being incorporated into the patient care pathway. In light of the recent publication by Lehoux et al, who discuss the role of entrepreneurs, investors and regulators in providing value to new health technologies, we summarise the processes involved in making new health technologies available for use in the United Kingdom.

  20. Breakdown flash at telecom wavelengths in InGaAs avalanche photodiodes

    NASA Astrophysics Data System (ADS)

    Shi, Yicheng; Lim, Janet Zheng Jie; Poh, Hou Shun; Tan, Peng Kian; Tan, Peiyu Amelia; Ling, Alexander; Kurtsiefer, Christian

    2017-11-01

    Quantum key distribution (QKD) at telecom wavelengths (1260-1625 nm) has the potential for fast deployment due to existing optical fibre infrastructure and mature telecom technologies. At these wavelengths, indium gallium arsenide (InGaAs) avalanche photodiode (APD) based detectors are the preferred choice for photon detection. Similar to their silicon counterparts used at shorter wavelengths, they exhibit fluorescence from recombination of electron-hole pairs generated in the avalanche breakdown process. This fluorescence may open side channels for attacks on QKD systems. Here, we characterize the breakdown fluorescence from two commercial InGaAs single photon counting modules, and find a spectral distribution between 1000 nm and 1600 nm. We also show that by spectral filtering, this side channel can be efficiently suppressed.

  1. Breakdown flash at telecom wavelengths in InGaAs avalanche photodiodes.

    PubMed

    Shi, Yicheng; Lim, Janet Zheng Jie; Poh, Hou Shun; Tan, Peng Kian; Tan, Peiyu Amelia; Ling, Alexander; Kurtsiefer, Christian

    2017-11-27

    Quantum key distribution (QKD) at telecom wavelengths (1260-1625 nm) has the potential for fast deployment due to existing optical fibre infrastructure and mature telecom technologies. At these wavelengths, indium gallium arsenide (InGaAs) avalanche photodiode (APD) based detectors are the preferred choice for photon detection. Similar to their silicon counterparts used at shorter wavelengths, they exhibit fluorescence from recombination of electron-hole pairs generated in the avalanche breakdown process. This fluorescence may open side channels for attacks on QKD systems. Here, we characterize the breakdown fluorescence from two commercial InGaAs single photon counting modules, and find a spectral distribution between 1000 nm and 1600 nm. We also show that by spectral filtering, this side channel can be efficiently suppressed.

  2. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can span task parallelism as well as data parallelism. Parallel and distributed architectures such as Grid, Cloud and Multicore offer the functionalities needed to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of Geospatial data, and execution of complex processing through task and data parallelism. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, together with standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. For achieving these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models and Geospatial Web services standardized by the Open Geospatial Consortium (OGC), on parallel and distributed architectures to maximize performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project in different use cases, such as the execution of Geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing is still a paradigm with critical open problems despite great efforts and investments, delivering resources in a new way while drawing on a large set of both old and new technologies and tools. The main challenges in Cloud computing, most of them also identified in the Open Cloud Manifesto 2009, concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different Geospatial applications on different parallel and distributed architectures such as Grid, Cloud and Multicore, with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost, etc. The redirection of execution to a selected architecture is realized through a specialized component, with the purpose of offering a flexible way of achieving the best performance under the existing constraints.

  3. A Concept for Measuring Electron Distribution Functions Using Collective Thomson Scattering

    NASA Astrophysics Data System (ADS)

    Milder, A. L.; Froula, D. H.

    2017-10-01

    A.B. Langdon proposed that stable non-Maxwellian distribution functions are realized in coronal inertial confinement fusion plasmas via inverse bremsstrahlung heating. For Z v_osc^2 / v_th^2 > 1, the inverse bremsstrahlung heating rate is sufficiently fast to compete with electron-electron collisions. This process preferentially heats the subthermal electrons, leading to super-Gaussian distribution functions. A method to identify the super-Gaussian order of the distribution functions in these plasmas using collective Thomson scattering will be proposed. By measuring the collective Thomson spectra over a range of angles, the density, temperature and super-Gaussian order can be determined. This is accomplished by fitting non-Maxwellian distribution data with a super-Gaussian model; in order to match the density and electron temperature to within 10%, the super-Gaussian order must be varied. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
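
    A toy version of the fitting step, with invented data: generate a super-Gaussian distribution of order m and recover m by least squares (the actual analysis fits angularly resolved Thomson spectra, not f(v) directly).

```python
import numpy as np
from scipy.optimize import curve_fit

def super_gaussian(v, A, v0, m):
    # f(v) ~ exp(-(|v|/v0)^m); m = 2 is Maxwellian, m -> 5 under strong
    # inverse bremsstrahlung heating (the Langdon effect).
    return A * np.exp(-(np.abs(v) / v0) ** m)

v = np.linspace(-3, 3, 200)
truth = super_gaussian(v, 1.0, 1.0, 3.4)
noisy = truth + 0.01 * np.random.default_rng(0).normal(size=v.size)

popt, _ = curve_fit(super_gaussian, v, noisy, p0=(1.0, 1.0, 2.0))
print("recovered super-Gaussian order m = %.2f" % popt[2])
```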

  4. Uncertainty Quantification Techniques for Population Density Estimates Derived from Sparse Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N; White, Devin A; Urban, Marie L

    2013-01-01

    The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life based largely on information available in open source. Currently, activity based density estimates are based on simple summary data statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort, which considers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty was well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
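
    A much-simplified sketch of the encoding step, under assumed semantics for the questions: an expert's "typical fraction" answer sets the Beta mean and a confidence answer acts as an effective sample size. The paper's actual algorithm goes through a bivariate Gaussian on the Beta parameter space; this shortcut is only illustrative.

```python
from scipy import stats

def elicit_beta(typical, confidence):
    """Turn two non-statistical answers into a Beta prior (hypothetical rule).

    typical    -- expert's 'usual' occupancy fraction, in (0, 1)
    confidence -- pseudo-count expressing how sure the expert is
    """
    return stats.beta(typical * confidence, (1 - typical) * confidence)

prior = elicit_beta(typical=0.3, confidence=20)
lo, hi = prior.interval(0.9)
print(f"90% credible interval: {lo:.2f} - {hi:.2f}")
```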

  5. Emergence and stability of intermediate open vesicles in disk-to-vesicle transitions.

    PubMed

    Li, Jianfeng; Zhang, Hongdong; Qiu, Feng; Shi, An-Chang

    2013-07-01

    The transition between two basic structures, a disk and an enclosed vesicle, of a finite membrane is studied by examining the minimum energy path (MEP) connecting these two states. The MEP is constructed using the string method applied to continuum elastic membrane models. The results reveal that, besides the commonly observed disk and vesicle, open vesicles (bowl-shaped vesicles or vesicles with a pore) can become stable or metastable shapes. The emergence, stability, and probability distribution of these open vesicles are analyzed. It is demonstrated that open vesicles can be stabilized by higher-order elastic energies. The estimated probability distribution of the different structures is in good agreement with available experiments.

  6. A compositional framework for Markov processes

    NASA Astrophysics Data System (ADS)

    Baez, John C.; Fong, Brendan; Pollard, Blake S.

    2016-03-01

    We define the concept of an "open" Markov process, or more precisely, continuous-time Markov chain, which is one where probability can flow in or out of certain states called "inputs" and "outputs." One can build up a Markov process from smaller open pieces. This process is formalized by making open Markov processes into the morphisms of a dagger compact category. We show that the behavior of a detailed balanced open Markov process is determined by a principle of minimum dissipation, closely related to Prigogine's principle of minimum entropy production. Using this fact, we set up a functor mapping open detailed balanced Markov processes to open circuits made of linear resistors. We also describe how to "black box" an open Markov process, obtaining the linear relation between input and output data that holds in any steady state, including nonequilibrium steady states with a nonzero flow of probability through the system. We prove that black boxing gives a symmetric monoidal dagger functor sending open detailed balanced Markov processes to Lagrangian relations between symplectic vector spaces. This allows us to compute the steady state behavior of an open detailed balanced Markov process from the behaviors of smaller pieces from which it is built. We relate this black box functor to a previously constructed black box functor for circuits.
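
    The "open" boundary conditions can be made concrete with a small linear-algebra example, assuming a four-state chain whose first and last states are held at fixed probabilities by the environment (all rates are invented):

```python
import numpy as np

# Generator Q (rows sum to zero): Q[i, j] is the rate from state i to j.
Q = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 0.5, -1.5,  1.0,  0.0],
              [ 0.0,  0.7, -1.7,  1.0],
              [ 0.0,  0.0,  0.3, -0.3]])
A = Q.T                                  # master equation: dp/dt = A @ p

boundary, interior = [0, 3], [1, 2]
p_b = np.array([0.4, 0.1])               # clamped input/output probabilities

# Steady state: (A p)[interior] = 0  =>  A_ii p_i = -A_ib p_b
p_i = np.linalg.solve(A[np.ix_(interior, interior)],
                      -A[np.ix_(interior, boundary)] @ p_b)

p = np.zeros(4)
p[boundary], p[interior] = p_b, p_i
print("steady state:", p)
print("probability inflow at state 0:", -(A @ p)[0])  # nonzero: a NESS
```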

  7. Size distribution of dust grains: A problem of self-similarity

    NASA Technical Reports Server (NTRS)

    Henning, TH.; Dorschner, J.; Guertler, J.

    1989-01-01

    Distribution functions describing the results of natural processes frequently take the shape of power laws, e.g., the mass functions of stars and molecular clouds, the velocity spectrum of turbulence, and the size distributions of asteroids, micrometeorites and also interstellar dust grains. It is an open question whether this behavior simply comes about through the chosen mathematical representation of the observational data or reflects a deep-seated principle of nature. The authors suppose the latter is the case. Using a dust model consisting of silicate and graphite grains, Mathis et al. (1977) showed that the interstellar extinction curve can be represented by taking a grain radii distribution of power-law type n(a) ∝ a^(-p) with 3.3 ≤ p ≤ 3.6 (example 1) as a basis. A different approach to understanding power laws like that in example 1 becomes possible through the theory of self-similar processes (scale invariance). The beta model of turbulence (Frisch et al., 1978) leads in an elementary way to the concept of the self-similarity dimension D, a special case of Mandelbrot's (1977) fractal dimension. In the frame of this beta model, it is supposed that at each stage of a cascade the system decays into N clumps, of which only βN remain active further on. An important feature of this model is that the active eddies become less and less space-filling. In the following, the authors assume that grain-grain collisions are such a scale-invariant process and that the remaining grains are the inactive (frozen) clumps of the cascade. In this way, a size distribution n(a) da ∝ a^(-(D+1)) da (example 2) results. It seems highly probable that the power-law character of the size distribution of interstellar dust grains is the result of a self-similar process. We cannot, however, exclude that the process leading to the interstellar grain size distribution is not fragmentation at all. It could be, e.g., diffusion-limited growth as discussed by Sander (1986), who applied the theory of fractal geometry to the classification of non-equilibrium growth processes and obtained D = 2.4 for diffusion-limited aggregation in 3d-space.
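
    As a numerical aside, a size distribution like example 1 is easy to sample and test: the snippet below draws radii from n(a) ∝ a^(-3.5) by inverse-transform sampling and recovers the exponent with the maximum-likelihood estimator (exact only when a_max >> a_min); all numbers are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
p, a_min, a_max = 3.5, 0.005, 0.25     # exponent and grain-radius range (µm)

# Inverse-transform sampling of n(a) ∝ a^(-p) on [a_min, a_max]
u = rng.uniform(size=100_000)
c = 1.0 - p
a = (a_min**c + u * (a_max**c - a_min**c)) ** (1.0 / c)

# Maximum-likelihood exponent estimate (large-range approximation)
p_hat = 1.0 + a.size / np.log(a / a_min).sum()
print("recovered exponent:", round(p_hat, 2))
```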

  8. Optical simulation of a Popescu-Rohrlich Box

    PubMed Central

    Chu, Wen-Jing; Zong, Xiao-Lan; Yang, Ming; Pan, Guo-Zhu; Cao, Zhuo-Liang

    2016-01-01

    It is well known that the fair-sampling loophole in Bell tests, opened by the selection of the state to be measured, can lead to post-quantum correlations. In this paper, we make the selection of the results after measurement, which opens the fair-sampling loophole too, and thus can lead to post-quantum correlations. This kind of result-selection loophole can be realized by pre- and post-selection processes within the “two-state vector formalism”, and a physical simulation of a Popescu-Rohrlich (PR) box is designed in a linear optical system. The probability distribution of the PR box has a maximal CHSH value of 4, i.e. it can maximally violate the CHSH inequality. Because the “two-state vector formalism” violates information causality, it opens the locality loophole too, which means that this kind of result selection within the “two-state vector formalism” leads to both the fair-sampling loophole and the locality loophole, so we call it a comprehensive loophole in a Bell test. The comprehensive loophole opened by result selection within the “two-state vector formalism” may be another possible explanation of why post-quantum correlations are incompatible with quantum mechanics and seem not to exist in nature. PMID:27329203

  9. Optical simulation of a Popescu-Rohrlich Box.

    PubMed

    Chu, Wen-Jing; Zong, Xiao-Lan; Yang, Ming; Pan, Guo-Zhu; Cao, Zhuo-Liang

    2016-06-22

    It is well known that the fair-sampling loophole in Bell tests, opened by the selection of the state to be measured, can lead to post-quantum correlations. In this paper, we make the selection of the results after measurement, which opens the fair-sampling loophole too, and thus can lead to post-quantum correlations. This kind of result-selection loophole can be realized by pre- and post-selection processes within the "two-state vector formalism", and a physical simulation of a Popescu-Rohrlich (PR) box is designed in a linear optical system. The probability distribution of the PR box has a maximal CHSH value of 4, i.e. it can maximally violate the CHSH inequality. Because the "two-state vector formalism" violates information causality, it opens the locality loophole too, which means that this kind of result selection within the "two-state vector formalism" leads to both the fair-sampling loophole and the locality loophole, so we call it a comprehensive loophole in a Bell test. The comprehensive loophole opened by result selection within the "two-state vector formalism" may be another possible explanation of why post-quantum correlations are incompatible with quantum mechanics and seem not to exist in nature.
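
    The PR box statistics themselves fit in a few lines: the box enforces a ⊕ b = x·y with uniform marginals, and summing the four correlators reproduces the maximal CHSH value of 4 quoted above.

```python
import itertools

# PR box: binary outputs a, b for binary inputs x, y, with a XOR b = x AND y
# and each outcome pair equally likely given that constraint.
def pr_prob(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    # E(x, y) = sum over a, b of (-1)^(a+b) * P(a, b | x, y)
    return sum((-1) ** (a + b) * pr_prob(a, b, x, y)
               for a, b in itertools.product((0, 1), repeat=2))

chsh = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
print("CHSH value:", chsh)   # 4, beyond the quantum (Tsirelson) bound 2*sqrt(2)
```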

  10. Open Distribution of Virtual Containers as a Key Framework for Open Educational Resources and STEAM Subjects

    ERIC Educational Resources Information Center

    Corbi, Alberto; Burgos, Daniel

    2017-01-01

    This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…

  11. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    ERIC Educational Resources Information Center

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  12. Review and Content Analysis of the "International Review of Research in Open and Distance/Distributed Learning" (2000-2015)

    ERIC Educational Resources Information Center

    Zawacki-Richter, Olaf; Alturki, Uthman; Aldraiweesh, Ahmed

    2017-01-01

    This paper presents a review of distance education literature published in the "International Review of Research in Open and Distance/Distributed Learning" (IRRODL) to describe the status thereof and to identify gaps and priority areas in distance education research based on a validated classification of research areas. All articles (N =…

  13. Vessel contents of leaves after excision: a test of the Scholander assumption

    Treesearch

    Melvin T. Tyree; Herve Cochard

    2003-01-01

    When petioles of transpiring leaves are cut in the air, according to the 'Scholander assumption', the vessels cut open should fill with air as the water is drained away by tissue rehydration and/or continued transpiration. The distribution of air-filled vessels versus distance from the cut surface should match the distribution of lengths of 'open vessels...

  14. DISTRIBUTED RC NETWORKS WITH RATIONAL TRANSFER FUNCTIONS,

    DTIC Science & Technology

    A distributed RC circuit analogous to a continuously tapped transmission line can be made to have a rational short-circuit transfer admittance and one rational short-circuit driving-point admittance. A subcircuit of the same structure has a rational open-circuit transfer impedance and one rational open-circuit driving-point impedance. Hence, rational transfer functions may be obtained while considering either generator impedance or load

  15. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values; they allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with them.

  16. NDL-v2.0: A new version of the numerical differentiation library for parallel architectures

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Voglis, C.; Papageorgiou, D. G.; Lagaris, I. E.

    2014-07-01

    We present a new version of the numerical differentiation library (NDL) used for the numerical estimation of first and second order partial derivatives of a function by finite differencing. In this version we have restructured the serial implementation of the code so as to achieve optimal task-based parallelization. The pure shared-memory parallelization of the library has been based on the lightweight OpenMP tasking model allowing for the full extraction of the available parallelism and efficient scheduling of multiple concurrent library calls. On multicore clusters, parallelism is exploited by means of TORC, an MPI-based multi-threaded tasking library. The new MPI implementation of NDL provides optimal performance in terms of function calls and, furthermore, supports asynchronous execution of multiple library calls within legacy MPI programs. In addition, a Python interface has been implemented for all cases, exporting the functionality of our library to sequential Python codes. Catalog identifier: AEDG_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEDG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 63036 No. of bytes in distributed program, including test data, etc.: 801872 Distribution format: tar.gz Programming language: ANSI Fortran-77, ANSI C, Python. Computer: Distributed systems (clusters), shared memory systems. Operating system: Linux, Unix. Has the code been vectorized or parallelized?: Yes. RAM: The library uses O(N) internal storage, N being the dimension of the problem. It can use up to O(N2) internal storage for Hessian calculations, if a task throttling factor has not been set by the user. Classification: 4.9, 4.14, 6.5. Catalog identifier of previous version: AEDG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180(2009)1404 Does the new version supersede the previous version?: Yes Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, and sensitivity analysis. For a large number of scientific and engineering applications, the underlying functions correspond to simulation codes for which analytical estimation of derivatives is difficult or almost impossible. A parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems. Solution method: Finite differencing is used with a carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries. Reasons for new version: The updated version was motivated by our endeavors to extend a parallel Bayesian uncertainty quantification framework [1], by incorporating higher order derivative information as in most state-of-the-art stochastic simulation methods such as Stochastic Newton MCMC [2] and Riemannian Manifold Hamiltonian MC [3]. The function evaluations are simulations with significant time-to-solution, which also varies with the input parameters such as in [1, 4]. The runtime of the N-body-type of problem changes considerably with the introduction of a longer cut-off between the bodies. 
In the first version of the library, the OpenMP-parallel subroutines spawn a new team of threads and distribute the function evaluations with a PARALLEL DO directive. This limits the functionality of the library as multiple concurrent calls require nested parallelism support from the OpenMP environment. Therefore, either their function evaluations will be serialized or processor oversubscription is likely to occur due to the increased number of OpenMP threads. In addition, the Hessian calculations include two explicit parallel regions that compute first the diagonal and then the off-diagonal elements of the array. Due to the barrier between the two regions, the parallelism of the calculations is not fully exploited. These issues have been addressed in the new version by first restructuring the serial code and then running the function evaluations in parallel using OpenMP tasks. Although the MPI-parallel implementation of the first version is capable of fully exploiting the task parallelism of the PNDL routines, it does not utilize the caching mechanism of the serial code and, therefore, performs some redundant function evaluations in the Hessian and Jacobian calculations. This can lead to: (a) higher execution times if the number of available processors is lower than the total number of tasks, and (b) significant energy consumption due to wasted processor cycles. Overcoming these drawbacks, which become critical as the time of a single function evaluation increases, was the primary goal of this new version. Due to the code restructure, the MPI-parallel implementation (and the OpenMP-parallel in accordance) avoids redundant calls, providing optimal performance in terms of the number of function evaluations. Another limitation of the library was that the library subroutines were collective and synchronous calls. In the new version, each MPI process can issue any number of subroutines for asynchronous execution. We introduce two library calls that provide global and local task synchronizations, similarly to the BARRIER and TASKWAIT directives of OpenMP. The new MPI-implementation is based on TORC, a new tasking library for multicore clusters [5-7]. TORC improves the portability of the software, as it relies exclusively on the POSIX-Threads and MPI programming interfaces. It allows MPI processes to utilize multiple worker threads, offering a hybrid programming and execution environment similar to MPI+OpenMP, in a completely transparent way. Finally, to further improve the usability of our software, a Python interface has been implemented on top of both the OpenMP and MPI versions of the library. This allows sequential Python codes to exploit shared and distributed memory systems. Summary of revisions: The revised code improves the performance of both parallel (OpenMP and MPI) implementations. The functionality and the user-interface of the MPI-parallel version have been extended to support the asynchronous execution of multiple PNDL calls, issued by one or multiple MPI processes. A new underlying tasking library increases portability and allows MPI processes to have multiple worker threads. For both implementations, an interface to the Python programming language has been added. Restrictions: The library uses only double precision arithmetic. The MPI implementation assumes the homogeneity of the execution environment provided by the operating system. 
Specifically, the processes of a single MPI application must have identical address space and a user function resides at the same virtual address. In addition, address space layout randomization should not be used for the application. Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and given the level of the desired accuracy, the proper formula is automatically employed. Running time: Running time depends on the function's complexity. The test run took 23 ms for the serial distribution, 25 ms for the OpenMP with 2 threads, 53 ms and 1.01 s for the MPI parallel distribution using 2 threads and 2 processes respectively and yield-time for idle workers equal to 10 ms. References: [1] P. Angelikopoulos, C. Paradimitriou, P. Koumoutsakos, Bayesian uncertainty quantification and propagation in molecular dynamics simulations: a high performance computing framework, J. Chem. Phys 137 (14). [2] H.P. Flath, L.C. Wilcox, V. Akcelik, J. Hill, B. van Bloemen Waanders, O. Ghattas, Fast algorithms for Bayesian uncertainty quantification in large-scale linear inverse problems based on low-rank partial Hessian approximations, SIAM J. Sci. Comput. 33 (1) (2011) 407-432. [3] M. Girolami, B. Calderhead, Riemann manifold Langevin and Hamiltonian Monte Carlo methods, J. R. Stat. Soc. Ser. B (Stat. Methodol.) 73 (2) (2011) 123-214. [4] P. Angelikopoulos, C. Paradimitriou, P. Koumoutsakos, Data driven, predictive molecular dynamics for nanoscale flow simulations under uncertainty, J. Phys. Chem. B 117 (47) (2013) 14808-14816. [5] P.E. Hadjidoukas, E. Lappas, V.V. Dimakopoulos, A runtime library for platform-independent task parallelism, in: PDP, IEEE, 2012, pp. 229-236. [6] C. Voglis, P.E. Hadjidoukas, D.G. Papageorgiou, I. Lagaris, A parallel hybrid optimization algorithm for fitting interatomic potentials, Appl. Soft Comput. 13 (12) (2013) 4481-4492. [7] P.E. Hadjidoukas, C. Voglis, V.V. Dimakopoulos, I. Lagaris, D.G. Papageorgiou, Supporting adaptive and irregular parallelism for non-linear numerical optimization, Appl. Math. Comput. 231 (2014) 544-559.
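
    The step-size trade-off at the core of a finite-difference library like NDL is easy to demonstrate: for a central difference, the truncation error grows as h^2 while the round-off error grows as eps/h, so the total error is minimized near h ≈ (3·eps)^(1/3) for well-scaled f. A standalone Python check (NDL itself is Fortran/C; this is only illustrative):

```python
import numpy as np

def central_diff(f, x, h):
    # Central difference: error ~ f'''(x) h^2 / 6 + eps |f(x)| / h
    return (f(x + h) - f(x - h)) / (2 * h)

eps = np.finfo(float).eps
h_opt = (3 * eps) ** (1 / 3)   # balances truncation vs round-off (|f|,|f'''|~1)

for h in (1e-2, h_opt, 1e-10):
    err = abs(central_diff(np.sin, 1.0, h) - np.cos(1.0))
    print(f"h = {h:.2e}  error = {err:.2e}")
```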

  17. BioWord: A sequence manipulation suite for Microsoft Word

    PubMed Central

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  18. Nanomaterial release characteristics in a single-walled carbon nanotube manufacturing workplace

    NASA Astrophysics Data System (ADS)

    Ji, Jun Ho; Kim, Jong Bum; Lee, Gwangjae; Bae, Gwi-Nam

    2015-02-01

    As carbon nanotubes (CNTs) are widely used in various applications, exposure assessment is growing in importance alongside the various toxicity tests for CNTs. We conducted 24-h continuous nanoaerosol measurements to identify possible nanomaterial release in a single-walled carbon nanotube (SWCNT) manufacturing workplace. Four real-time aerosol instruments were used to determine the nanosized and microsized particle numbers, particle surface area, and carbonaceous species. Task-based exposure assessment was carried out for SWCNT synthesis using the arc plasma process and for the thermal decomposition process used to remove amorphous carbon impurities. During SWCNT synthesis, the black carbon (BC) concentration was 2-12 μg/m3. The maximum BC mass concentrations occurred when the synthesis chamber was opened for harvesting the SWCNTs. The number concentrations of particles with sizes of 10-420 nm were 10,000-40,000 particles/cm3 during the tasks. The maximum number concentration occurred when a vacuum pump was operated to remove exhaust air from the SWCNT synthesis chamber, owing to the penetration of highly concentrated oil mists through the opened window. We analyzed the particle mass size distribution and particle number size distribution for each peak episode. Using real-time aerosol detectors, we distinguished the SWCNT releases from background nanoaerosols such as oil mist and atmospheric photochemical smog particles. SWCNT aggregates with sizes of 1-10 μm were mainly released from the arc plasma synthesis. The harvesting process was the main release route of SWCNTs in the workplace.

  19. BioWord: a sequence manipulation suite for Microsoft Word.

    PubMed

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
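
    For flavor, one of the everyday manipulations such a suite bundles, the reverse complement, written here in Python for illustration (BioWord itself implements its tools in VBA inside Word):

```python
# Map each base to its complement; lowercase entries preserve input case.
COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

print(reverse_complement("ATGCGTTA"))  # -> TAACGCAT
```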

  20. Service Oriented Architecture for Wireless Sensor Networks in Agriculture

    NASA Astrophysics Data System (ADS)

    Sawant, S. A.; Adinarayana, J.; Durbha, S. S.; Tripathy, A. K.; Sudharsan, D.

    2012-08-01

    Rapid advances in Wireless Sensor Networks (WSN) for agricultural applications have provided a platform for better decision making in crop planning and management, particularly in precision agriculture. Due to the ever-increasing spread of WSNs, there is a need for standards, i.e. a set of specifications and encodings, to bring multiple sensor networks onto a common platform. Distributed sensing systems, when brought together, can facilitate better decision making in the agricultural domain. The Open Geospatial Consortium (OGC), through Sensor Web Enablement (SWE), provides guidelines for the semantic and syntactic standardization of sensor networks. In this work, two distributed sensing systems (Agrisens and FieldServer) were selected to implement the OGC SWE standards through a Service Oriented Architecture (SOA) approach. Online interoperable data processing was developed through SWE components such as the Sensor Model Language (SensorML) and the Sensor Observation Service (SOS). An integrated web client was developed to visualize the sensor observations and measurements, enabling the retrieval of crop water resource availability and requirements in a systematic manner for both sensing devices. Further, the client can also operate in an interoperable manner with any other OGC-standardized WSN system. The study of these WSN systems has shown a need to augment the operations and processing capabilities of SOS in order to reason about the collected sensor data and implement modelling services. Also, given the decreasing cost of WSN systems, it will become possible to implement the OGC-standardized SWE framework for agricultural applications with open-source software tools.
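
    A client interaction like the one described reduces to standard SWE requests; the sketch below issues an SOS 2.0 GetObservation call over the key-value HTTP binding, with a placeholder endpoint and made-up offering/property identifiers.

```python
import requests

# Standard SOS 2.0 key-value-pair GetObservation request; the endpoint,
# offering and observedProperty below are placeholders, not this project's.
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "ag_weather_station_1",        # hypothetical offering id
    "observedProperty": "soil_moisture",       # hypothetical property id
    "responseFormat": "http://www.opengis.net/om/2.0",
}
resp = requests.get("http://example.org/sos", params=params, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))
# resp.text would hold an O&M (Observations & Measurements) XML document.
```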

  1. Voltage-Gated Lipid Ion Channels

    PubMed Central

    Blicher, Andreas; Heimburg, Thomas

    2013-01-01

    Synthetic lipid membranes can display channel-like ion conduction events even in the absence of proteins. We show here that these events are voltage-gated, with a quadratic voltage dependence as expected from the electrostatic theory of capacitors. To this end, we recorded channel traces and current histograms in patch-clamp experiments on lipid membranes. We derived a theoretical current-voltage relationship for pores in lipid membranes that describes the experimental data very well when assuming an asymmetric membrane. We determined the equilibrium constant between closed and open states and the open probability as a function of voltage. The voltage dependence of the lipid pores is found to be comparable to that of protein channels. Lifetime distributions of open and closed events indicate that the channel open-time distribution does not follow exponential statistics but rather power-law behavior for long open times. PMID:23823188
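
    The quadratic voltage dependence has a compact two-state form: if charging the membrane capacitor lowers the pore free energy by roughly alpha·V^2, then P_open(V) = 1 / (1 + exp((dG0 - alpha·V^2) / kT)). A numerical illustration with invented constants (not the paper's fitted values):

```python
import numpy as np

kT = 4.1e-21                 # thermal energy in J at ~300 K
dG0 = 8 * kT                 # closed->open free energy at V = 0 (assumed)
alpha = 5e-19                # capacitive coupling in J/V^2 (assumed)

V = np.linspace(-0.3, 0.3, 7)                       # transmembrane voltage (V)
P_open = 1 / (1 + np.exp((dG0 - alpha * V**2) / kT))
for v, p in zip(V, P_open):
    print(f"V = {v:+.2f} V  P_open = {p:.3f}")      # symmetric, quadratic gating
```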

  2. Hadoop-BAM: directly manipulating next generation sequencing data in the cloud

    PubMed Central

    Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo

    2012-01-01

    Summary: Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps. Availability: Available under the open-source MIT license at http://sourceforge.net/projects/hadoop-bam/ Contact: matti.niemenmaa@aalto.fi Supplementary information: Supplementary material is available at Bioinformatics online. PMID:22302568

  3. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents Matlab-based (MathWorks Inc.) software called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. The tool is designed for researchers in both engineering and medicine who have to collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation, plotting several kinds of signals while integrating the tools physicians commonly use. The main advantages compared to other existing programs are multi-dataset display, synchronization with video, and online processing. On top of that, the program uses object-oriented programming, so the interface can be driven by both graphic controls and command lines. It can be used as an EEGLAB plug-in, but since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net) under the terms of the GNU Public License for non-commercial use and open-source development.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acosta, D.; Affolder, Anthony A.; Albrow, M.G.

    The authors have measured the azimuthal angular correlation of bb̄ production, using 86.5 pb^-1 of data collected by the Collider Detector at Fermilab (CDF) in pp̄ collisions at √s = 1.8 TeV during 1994-1995. In high-energy pp̄ collisions, such as at the Tevatron, bb̄ production can be schematically categorized into three mechanisms. The leading-order (LO) process is "flavor creation", where both b and b̄ quarks substantially participate in the hard scattering and result in a distinct back-to-back signal in the final state. The "flavor excitation" and "gluon splitting" processes, which appear at next-to-leading order (NLO), are known to make a comparable contribution to the total bb̄ cross section, while producing opening angle distributions very different from the LO process. The azimuthal opening angle between bottom and anti-bottom, Δφ, has been used for the correlation measurement to probe the interaction creating bb̄ pairs. The Δφ distribution has been obtained by two different methods. One method measures the Δφ between bottom hadrons using events with two reconstructed secondary vertex tags. The other method uses bb̄ → (J/ψ X)(ℓX') events, where the charged lepton (ℓ) is an electron (e) or a muon (μ), to measure Δφ between bottom quarks. The bb̄ purity is determined as a function of Δφ by fitting the decay length of the J/ψ and the impact parameter of the ℓ. Both methods quantify the contribution from higher-order production mechanisms by the fraction of bb̄ pairs produced in the same azimuthal hemisphere, f_toward. The measured f_toward values are consistent with both parton shower Monte Carlo and NLO QCD predictions.

  5. Log-normal distribution of the trace element data results from a mixture of stochastic input and deterministic internal dynamics.

    PubMed

    Usuda, Kan; Kono, Koichi; Dote, Tomotaro; Shimizu, Hiroyasu; Tominaga, Mika; Koizumi, Chisato; Nakase, Emiko; Toshina, Yumi; Iwai, Junko; Kawasaki, Takashi; Akashi, Mitsuya

    2002-04-01

    In a previous article, we showed a log-normal distribution of boron and lithium in human urine. This type of distribution is common in both biological and nonbiological applications. It can be observed when the effects of many independent variables are combined, each of which may have any underlying distribution. Although elemental excretion depends on many variables, the one-compartment open model following a first-order process can be used to explain the elimination of elements. The rate of excretion is proportional to the amount of any given element present; that is, the same percentage of an existing element is eliminated per unit time, and the element concentration is represented by a deterministic negative power function of time over the elimination time-course. Sampling is of a stochastic nature, so the dataset of time variables in the elimination phase when the sample was obtained is expected to show a Normal distribution. The time variable appears as an exponent of the power function, so a concentration histogram is that of an exponential transformation of Normally distributed time. This is the reason why the element concentration shows a log-normal distribution. The distribution is determined not by the element concentration itself, but by the time variable that enters the pharmacokinetic equation.
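
    The mechanism can be replayed in simulation with invented pharmacokinetic constants: first-order elimination C(t) = C0 * exp(-k t) evaluated at Normally distributed sampling times yields log-normal concentrations, since ln C is linear in t.

```python
import numpy as np

rng = np.random.default_rng(42)
C0, k = 100.0, 0.3                                 # assumed dose and rate const.
t = rng.normal(loc=8.0, scale=2.0, size=50_000)    # stochastic sampling times
C = C0 * np.exp(-k * t)                            # deterministic elimination

# ln C = ln C0 - k t is a linear function of Normal t, hence Normal itself,
# so C is log-normal -- exactly the exponential transformation in the text.
log_C = np.log(C)
print("ln C mean / std:", log_C.mean(), log_C.std())
```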

  6. Extending Marine Species Distribution Maps Using Non-Traditional Sources

    PubMed Central

    Moretzsohn, Fabio; Gibeaut, James

    2015-01-01

    Background: Traditional sources of species occurrence data such as peer-reviewed journal articles and museum-curated collections are included in species databases after rigorous review by species experts and evaluators. The distribution maps created in this process are an important component of species survival evaluations, and are used to adapt, extend and sometimes contract polygons used in the distribution mapping process. New information: During an IUCN Red List Gulf of Mexico Fishes Assessment Workshop held at The Harte Research Institute for Gulf of Mexico Studies, a session included an open discussion on the topic of including other sources of species occurrence data. During the last decade, advances in portable electronic devices and applications have enabled 'citizen scientists' to record images, location and data about species sightings, and submit that data to larger species databases. These applications typically generate point data. Attendees of the workshop expressed an interest in how that data could be incorporated into existing datasets, how best to ascertain the quality and value of that data, and what other alternate data sources are available. This paper addresses those issues, and provides recommendations to ensure quality data use. PMID:25941453

  7. A Process-Based Transport-Distance Model of Aeolian Transport

    NASA Astrophysics Data System (ADS)

    Naylor, A. K.; Okin, G.; Wainwright, J.; Parsons, A. J.

    2017-12-01

    We present a new approach to modeling aeolian transport based on transport distance. Particle fluxes are based on statistical probabilities of particle detachment and distributions of transport lengths, which are functions of particle size classes. A computational saltation model is used to simulate transport distances over a variety of sizes. These are fit to an exponential distribution, which has the advantages of computational economy, concordance with current field measurements, and a meaningful relationship to theoretical assumptions about mean and median particle transport distance. This novel approach includes particle-particle interactions, which are important for sustaining aeolian transport and dust emission. Results from this model are compared with results from both bulk- and particle-sized-specific transport equations as well as empirical wind tunnel studies. The transport-distance approach has been successfully used for hydraulic processes, and extending this methodology from hydraulic to aeolian transport opens up the possibility of modeling joint transport by wind and water using consistent physics. Particularly in nutrient-limited environments, modeling the joint action of aeolian and hydraulic transport is essential for understanding the spatial distribution of biomass across landscapes and how it responds to climatic variability and change.
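
    A stripped-down numerical sketch of the transport-distance formulation, with invented parameters: each size class gets a detachment probability and an exponential mean hop length, and the mean per-particle transport follows directly.

```python
import numpy as np

rng = np.random.default_rng(7)

# (detachment probability, mean transport distance in grid cells) per size
# class; all values are invented for illustration.
size_classes = {"fine": (0.30, 6.0), "medium": (0.15, 2.5), "coarse": (0.05, 0.8)}
n_particles = 100_000

for name, (p_detach, mean_hop) in size_classes.items():
    detached = rng.random(n_particles) < p_detach     # Bernoulli detachment
    hops = rng.exponential(mean_hop, n_particles)     # exponential hop lengths
    displacement = np.where(detached, hops, 0.0)
    print(f"{name:>6}: mean transport = {displacement.mean():.2f} cells "
          f"(analytic {p_detach * mean_hop:.2f})")
```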

  8. Nuclear transporters in a multinucleated organism: functional and localization analyses in Aspergillus nidulans

    PubMed Central

    Markina-Iñarrairaegui, Ane; Etxebeste, Oier; Herrero-García, Erika; Araújo-Bazán, Lidia; Fernández-Martínez, Javier; Flores, Jairo A.; Osmani, Stephen A.; Espeso, Eduardo A.

    2011-01-01

    Nuclear transporters mediate bidirectional macromolecule traffic through the nuclear pore complex (NPC), thus participating in vital processes of eukaryotic cells. A systematic functional analysis in Aspergillus nidulans permitted the identification of 4 essential nuclear transport pathways out of a hypothetical number of 14. The absence of phenotypes for most deletants indicates redundant roles for these nuclear receptors. Subcellular distribution studies of these carriers show three main distributions: nuclear, nucleocytoplasmic, and in association with the nuclear envelope. These locations are not specific to predicted roles as exportins or importins but indicate that bidirectional transport may occur coordinately in all nuclei of a syncytium. Coinciding with mitotic NPC rearrangements, transporters dynamically modified their localizations, suggesting roles supplementary to nucleocytoplasmic transport specifically during mitosis. Loss of transportin-SR and Mex/TAP from the nuclear envelope indicates an absence of RNA transport during the partially open mitosis of Aspergillus, whereas nucleolar accumulation of Kap121 and Kap123 homologues suggests a role in nucleolar disassembly. This work provides new insight into the roles of nuclear transporters and opens an avenue for future studies of the molecular mechanisms of transport among nuclei within a common cytoplasm, using A. nidulans as a model organism. PMID:21880896

  9. Fabrication and investigation of electrochromatographic columns with a simplex configuration.

    PubMed

    Liu, Qing; Yang, Lijun; Wang, Qiuquan; Zhang, Bo

    2014-07-04

    Duplex capillary columns with a packed and an open section are widely used in capillary electrochromatography (CEC). The duplex column configuration leads to a non-uniform voltage drop, electrical field distribution and separation performance, and it adds to the complexity of understanding and optimizing the electrochromatographic process. In this study, we introduce a simplex column configuration based on single-particle fritting technology. The new column configuration has an essentially uniform packed bed through the entire column length, with only a 1 mm length left unpacked to serve as the optical detection window. The study shows that a simplex column has higher separation efficiency than a duplex column, especially in the high-voltage range, due to the consistent distribution of the electrical field over the column length. In comparison to the duplex column, the simplex column presented a lower flow rate at the same applied voltage, suggesting that an open section may support a higher flow speed than a packed section. In practice, the long and short ends of the simplex column can be used as independent CEC columns. This "two-in-one" bi-functional column configuration provides extra flexibility in selecting and optimizing electrochromatographic conditions.

  10. Field Distribution and Coupling Investigation of an Eight-Channel RF Coil Consisting of Different Dipole Coil Elements for 7 T MRI.

    PubMed

    Chen, Zhichao; Solbach, Klaus; Erni, Daniel; Rennings, Andreas

    2017-06-01

    In this contribution, we investigate the [Formula: see text] distribution and coupling characteristics of a multichannel radio frequency (RF) coil consisting of different dipole coil elements for 7 T MRI, and explore the feasibility to achieve a compromise between field distribution and decoupling by combining different coil elements. Two types of dipole elements are considered here: the meander dipole element with a chip-capacitor-based connection to the RF shield which achieves a sufficient decoupling between the neighboring elements; and the open-ended meander dipole element which exhibits a broader magnetic field distribution. By nesting the open-ended dipole elements in between the ones with end-capacitors, the [Formula: see text] distribution, in terms of field penetration depth and homogeneity, is improved in comparison to the dipole coil consisting only of the elements with end-capacitors, and at the same time, the adjacent elements are less coupled to each other in comparison to the dipole coil consisting only of the open-ended elements. The proposed approach is validated by both full-wave simulation and experimental results.

  11. Cognitive processes and neural basis of language switching: proposal of a new model.

    PubMed

    Moritz-Gasser, Sylvie; Duffau, Hugues

    2009-12-09

    Although studies on bilingualism are abundant, the cognitive processes and neural foundations of language switching have received less attention. The aim of our study is to provide new insights into this still open question: do dedicated regions for language switching exist, or is this function underlain by a distributed circuit of interconnected brain areas, part of a more general cognitive system? On the basis of recent behavioral, neuroimaging, and brain stimulation studies, we propose an original 'hodological' model of language switching. This process might be subserved by a large-scale cortico-subcortical network, with an executive system (prefrontal cortex, anterior cingulum, caudate nucleus) controlling a more dedicated language subcircuit, which involves postero-temporal areas, the supramarginal and angular gyri, Broca's area, and the superior longitudinal fasciculus.

  12. Processing and properties of Titanium alloy based materials with tailored porosity and composition

    NASA Astrophysics Data System (ADS)

    Cabezas-Villa, Jose Luis; Olmos, Luis; Lemus-Ruiz, Jose; Bouvard, Didier; Chavez, Jorge; Jimenez, Omar; Manuel Solorio, Victor

    2017-06-01

    This paper deals with powder processing of Ti6Al4V titanium alloy based materials with tailored porosity and composition. Ti6Al4V powder was mixed either with salt particles acting as space holder, so as to provide two-scale porosity, or with hard TiN particles that significantly modified the microstructure of the material and increased its hardness. Finally an original three-layer component was produced. Sample microstructure was observed by SEM and micro-tomography with special interest in pore size and shape, inclusion distribution and connectivity. Compression tests provided elastic modulus and yield stress as functions of density. These materials are representative of bone implants subjected to complex biological and mechanical conditions. These results thus open avenues for processing personalized implants by powder metallurgy.

  13. Study on process design of partially-balanced, hydraulically lifting vertical ship lift

    NASA Astrophysics Data System (ADS)

    Xin, Shen; Xiaofeng, Xu; Lu, Zhang; Bing, Zhu; Fei, Li

    2017-11-01

    The hub ship lift in Panjin is the first navigation structure in China linking inland waters with the open sea. It adopts a novel partially-balanced, hydraulically lifting design that can meet such requirements as fast and large water level changes in the open sea, the large draft of a yacht, and the launching of the ship reception chamber. Its balancing weight system can effectively reduce the load on the primary lifting cylinder and optimize the force distribution of the ship reception chamber. The paper introduces the main equipment, basic principles, main features and system composition of the ship lift. The unique power system and balancing system of the completed ship lift offer experience for the construction of tourism-type ship lifts with lower lifting heights.

  14. Utilization of ERTS-1 for appraising changes in continental migratory bird habitat

    NASA Technical Reports Server (NTRS)

    Gilmer, D. S. (Principal Investigator); Work, E. A., Jr.; Klett, A. T.

    1974-01-01

    The author has identified the following significant results. Information on the numbers, distribution, and quality of wetlands in the breeding range of migratory waterfowl is important for the management of this wildlife resource. Using computer processing of data gathered by the ERTS-1 multispectral scanner, techniques for obtaining indices of annual waterfowl recruitment and habitat quality are examined. As a primary task, thematic maps and statistics relating to open surface water were produced. Discrimination of water was based upon water's low apparent radiance in a single, near-infrared waveband. An advanced technique using multispectral information for discerning open water at a level of detail finer than the virtual resolution of the data was also successfully tested. In another related task, vegetation indicators were used for detecting conditions of latent or occluded water and upland habitat characteristics.

  15. Demographic management in a federated healthcare environment.

    PubMed

    Román, I; Roa, L M; Reina-Tosina, J; Madinabeitia, G

    2006-09-01

    The purpose of this paper is to provide a further step toward the decentralization of identification and demographic information about persons by solving issues related to the integration of demographic agents in a federated healthcare environment. The aim is to identify a particular person in every system of a federation and to obtain a unified view of his/her demographic information stored in different locations. This work is based on semantic models and techniques, and pursues the reconciliation of several current standardization works, including ITU-T's Open Distributed Processing, CEN's prEN 12967, OpenEHR's dual and reference models, CEN's General Purpose Information Components and CORBAmed's PID service. We propose a new paradigm for the management of person identification and demographic data, based on the development of an open architecture of specialized distributed components together with the incorporation of techniques for the efficient management of domain ontologies, in order to provide a federated demographic service. This new service enhances previous correlation solutions, sharing ideas with different standards and domains such as semantic techniques and database systems. The federation philosophy forces us to devise solutions to the semantic, functional and instance incompatibilities in our approach. Although this work is based on several models and standards, we have improved on them by combining their contributions and developing a federated architecture that does not require the centralization of demographic information. The solution is thus a good approach for facing integration problems, and the applied methodology can easily be extended to other tasks involved in the healthcare organization.

  16. Blood Vessel Adaptation with Fluctuations in Capillary Flow Distribution

    PubMed Central

    Hu, Dan; Cai, David; Rangan, Aaditya V.

    2012-01-01

    Throughout the life of animals and human beings, blood vessel systems are continuously adapting their structures – the diameter of vessel lumina, the thickness of vessel walls, and the number of micro-vessels – to meet the changing metabolic demand of the tissue. The competition between an ever-decreasing tendency of luminal diameters and an increasing stimulus from the wall shear stress plays a key role in the adaptation of luminal diameters. However, it has been shown in previous studies that adaptation dynamics based only on these two effects is unstable. In this work, we propose a minimal adaptation model of vessel luminal diameters, in which we take into account the effects of metabolic flow regulation in addition to wall shear stresses and the decreasing tendency of luminal diameters. In particular, we study the role, in the adaptation process, of fluctuations in the capillary flow distribution, which is an important means of metabolic flow regulation. The fluctuation in the flow of a capillary group is idealized as a switch between two states, i.e., an open state and a closed state. Using this model, we show that the adaptation of a blood vessel system driven by wall shear stress can be efficiently stabilized when the open time ratio responds sensitively to capillary flows. As micro-vessel rarefaction is observed in our simulations with a uniformly decreased open time ratio of capillary flows, our results point to a possible origin of micro-vessel rarefaction, which is believed to induce hypertension. PMID:23029014

  17. Properties of single NMDA receptor channels in human dentate gyrus granule cells

    PubMed Central

    Lieberman, David N; Mody, Istvan

    1999-01-01

    Cell-attached single-channel recordings of NMDA channels were carried out in human dentate gyrus granule cells acutely dissociated from slices prepared from hippocampi surgically removed for the treatment of temporal lobe epilepsy (TLE). The channels were activated by l-aspartate (250–500 nm) in the presence of saturating glycine (8 μm). The main conductance was 51 ± 3 pS. In ten of thirty granule cells, clear subconductance states were observed with a mean conductance of 42 ± 3 pS, representing 8 ± 2% of the total openings. The mean open times varied from cell to cell, possibly owing to differences in the epileptogenicity of the tissue of origin. The mean open time was 2.70 ± 0.95 ms (range, 1.24–4.78 ms). In 87% of the cells, three exponential components were required to fit the apparent open time distributions. In the remaining neurons, as in control rat granule cells, two exponentials were sufficient. Shut time distributions were fitted by five exponential components. The average numbers of openings in bursts (1.74 ± 0.09) and clusters (3.06 ± 0.26) were similar to values obtained in rodents. The mean burst (6.66 ± 0.9 ms), cluster (20.1 ± 3.3 ms) and supercluster lengths (116.7 ± 17.5 ms) were longer than those in control rat granule cells, but approached the values previously reported for TLE (kindled) rats. As in rat NMDA channels, adjacent open and shut intervals appeared to be inversely related to each other, but it was only the relative areas of the three open time constants that changed with adjacent shut time intervals. The long openings of human TLE NMDA channels resembled those produced by calcineurin inhibitors in control rat granule cells. Yet the calcineurin inhibitor FK-506 (500 nm) did not prolong the openings of human channels, consistent with a decreased calcineurin activity in human TLE. Many properties of the human NMDA channels resemble those recorded in rat hippocampal neurons. Both have similar slope conductances, five exponential shut time distributions, complex groupings of openings, and a comparable number of openings per grouping. Other properties of human TLE NMDA channels correspond to those observed in kindling; the openings are considerably long, requiring an additional exponential component to fit their distributions, and inhibition of calcineurin is without effect in prolonging the openings. PMID:10373689

  18. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions are still threatened with frequent floods and water resource shortage problems in China. Consequently, the task of reproducing and predicting the hydrological process in watersheds is hard and unavoidable for reducing the risks of damage and loss. Thus, it is necessary to develop an efficient and cost-effective hydrological tool in China, as many areas need to be modeled. Currently, developed hydrological tools such as Mike SHE and ArcSWAT (soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability both in land cover and in soil type. However, adopting developed commercial tools in such a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out simulation, thus lowering the efficiency of the modeling process. Besides, commercial hydrological models usually cannot be modified or improved to suit some special hydrological conditions in China. Some other hydrological models are open source, but integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on open-source MapWindow GIS, the purpose of which is to establish the first open-source GIS-based distributed hydrological model tool in China by integrating modules of preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of totally open-source GIS software, MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules including a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module contains parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM enables efficient modeling and calibration of EasyDHM, and promises further development of cost-effective applications in various watersheds.
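
    As an illustration of the kind of computation behind the meteorological interpolation submodule described above, the following is a minimal inverse-distance-weighting sketch in Python; the station coordinates and rainfall values are made up, and this is not MWEasyDHM's actual code.

        import numpy as np

        def idw(stations_xy, values, target_xy, power=2.0):
            # Weight each station by the inverse of its distance to the target.
            d = np.linalg.norm(stations_xy - target_xy, axis=1)
            if np.any(d == 0):
                return values[np.argmin(d)]  # target coincides with a station
            w = 1.0 / d ** power
            return np.sum(w * values) / np.sum(w)

        stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # assumed
        rain_mm = np.array([12.0, 8.0, 20.0])                        # assumed
        print(idw(stations, rain_mm, np.array([3.0, 4.0])))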

  19. Apparatus tube configuration and mounting for solid oxide fuel cells

    DOEpatents

    Zymboly, G.E.

    1993-09-14

    A generator apparatus is made containing long, hollow, tubular fuel cells containing an inner air electrode, an outer fuel electrode, and solid electrolyte therebetween, placed between a fuel distribution board and a board which separates the combustion chamber from the generating chamber, where each fuel cell has an insertable open end and an insertable, plugged, closed end, the plugged end being inserted into the fuel distribution board and the open end being inserted through the separator board, where the plug is completely within the fuel distribution board. 3 figures.

  20. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    PubMed Central

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples. PMID:21352538
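
    To make the dataflow idea concrete, here is a minimal pipeline of two data-coupled components evaluated lazily on a worker pool in plain Python; this sketches the paradigm only and is not the PaPy API, and the components and records are invented.

        from multiprocessing import Pool

        def parse(record):
            # Component 1: normalize a raw sequence record.
            return record.strip().upper()

        def gc_content(seq):
            # Component 2: fraction of G/C bases in the sequence.
            return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

        if __name__ == "__main__":
            records = ["acgtacgt", "ggccggcc", "atatatat"]
            with Pool(processes=2) as pool:
                # imap evaluates lazily in adjustable chunks, the same
                # parallelism/memory trade-off the abstract describes.
                for gc in map(gc_content, pool.imap(parse, records, chunksize=1)):
                    print(gc)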

  1. Distributed Modelling of Stormflow Generation: Assessing the Effect of Ground Cover

    NASA Astrophysics Data System (ADS)

    Jarihani, B.; Sidle, R. C.; Roth, C. H.; Bartley, R.; Wilkinson, S. N.

    2017-12-01

    Understanding the effects of grazing management and land cover changes on surface hydrology is important for water resources and land management. A distributed hydrological modelling platform, wflow (developed as part of Deltares's OpenStreams project), is used to assess the effect of land management practices on runoff generation processes. The model was applied to Weany Creek, a small catchment (13.6 km2) of the Burdekin Basin, North Australia, which is being studied to understand sources of sediment and nutrients to the Great Barrier Reef. Satellite and drone-based ground cover data, high resolution topography from LiDAR, soil properties, and distributed rainfall data were used to parameterise the model. Wflow was used to predict total runoff, peak runoff, time of rise, and lag time for several events of varying magnitudes and antecedent moisture conditions. A nested approach was employed to calibrate the model by using recorded flow hydrographs at three scales: (1) a hillslope sub-catchment; (2) a gullied sub-catchment; and (3) the 13.6 km2 catchment outlet. Model performance was evaluated by comparing observed and predicted stormflow hydrograph attributes using the Nash-Sutcliffe efficiency metric. By using a nested approach, spatiotemporal patterns of overland flow occurrence across the catchment can also be evaluated. The results show that a process-based distributed model can be calibrated to simulate spatial and temporal patterns of runoff generation processes, to help identify dominant processes which may be addressed by land management to improve rainfall retention. The model will be used to assess the effects of ground cover changes due to management practices in grazed lands on storm runoff.
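
    For reference, the Nash-Sutcliffe efficiency used above compares the squared model error with the variance of the observations; below is a minimal numpy sketch with invented hydrograph values, not wflow's implementation.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2);
            # 1 is a perfect fit, 0 is no better than the observed mean.
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        print(nash_sutcliffe([1.0, 3.0, 2.5, 4.0], [0.8, 2.9, 2.7, 3.6]))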

  2. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.

    PubMed

    Cieślik, Marcin; Mura, Cameron

    2011-02-25

    Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.

  3. Neuronal Assemblies Evidence Distributed Interactions within a Tactile Discrimination Task in Rats

    PubMed Central

    Deolindo, Camila S.; Kunicki, Ana C. B.; da Silva, Maria I.; Lima Brasil, Fabrício; Moioli, Renan C.

    2018-01-01

    Accumulating evidence suggests that neural interactions are distributed and relate to animal behavior, but many open questions remain. The neural assembly hypothesis, formulated by Hebb, states that synchronously active single neurons may transiently organize into functional neural circuits—neuronal assemblies (NAs)—and that these would constitute the fundamental unit of information processing in the brain. However, the formation, vanishing, and temporal evolution of NAs are not fully understood. In particular, characterizing NAs in multiple brain regions over the course of behavioral tasks is relevant to assess the highly distributed nature of brain processing. In the context of NA characterization, active tactile discrimination tasks with rats are elucidative because they engage several cortical areas in the processing of information that is otherwise masked in passive or anesthetized scenarios. In this work, we investigate the dynamic formation of NAs within and among four different cortical regions in long-range fronto-parieto-occipital networks (primary somatosensory, primary visual, prefrontal, and posterior parietal cortices), simultaneously recorded from seven rats engaged in an active tactile discrimination task. Our results first confirm that task-related neuronal firing rate dynamics in all four regions are significantly modulated. Notably, a support vector machine decoder reveals that neural populations contain more information about the tactile stimulus than the majority of single neurons alone. Then, over the course of the task, we identify the emergence and vanishing of NAs whose participating neurons are shown to contain more information about animal behavior than randomly chosen neurons. Taken together, our results further support the role of multiple and distributed neurons as the functional unit of information processing in the brain (NA hypothesis) and their link to active animal behavior. PMID:29375324

  4. Embracing Open Software Development in Solar Physics

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.

  5. Web processing service for climate impact and extreme weather event analyses. Flyingpigeon (Version 1.0)

    NASA Astrophysics Data System (ADS)

    Hempelmann, Nils; Ehbrecht, Carsten; Alvarez-Castro, Carmen; Brockmann, Patrick; Falk, Wolfgang; Hoffmann, Jörg; Kindermann, Stephan; Koziol, Ben; Nangini, Cathy; Radanovics, Sabine; Vautard, Robert; Yiou, Pascal

    2018-01-01

    Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files 'at home' with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the current processes we developed in flyingpigeon, covering commonly-used operations (preprocessing steps, spatial subsets at continent, country or region level, and climate indices) as well as methods for specific climate data analysis (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
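
    As a usage illustration, a WPS like flyingpigeon can be queried from Python with the OWSLib client library; the endpoint URL below is a placeholder, not an official flyingpigeon service.

        from owslib.wps import WebProcessingService

        # Hypothetical endpoint; substitute a real flyingpigeon deployment.
        wps = WebProcessingService("http://example.org/wps")
        wps.getcapabilities()
        for process in wps.processes:
            print(process.identifier, "-", process.title)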

  6. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems

    PubMed Central

    Abu-Nimeh, Faisal T.; Choong, Woon-Seng

    2017-01-01

    Link efficiency, data integrity, and continuity for high-throughput and real-time systems are crucial. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which can achieve 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card's ring buffers, bypassing the operating system kernel's networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity. PMID:28630948
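
    The receive/process split described in (b) and (c) can be sketched with standard sockets and a synchronized queue, as below; Netmap, PAUSE frames and the OpenPET packet format are not modeled, and the port and packet size are assumptions.

        import socket
        import threading
        import queue

        PORT, PKT_SIZE = 9000, 8192      # assumed values, not OpenPET's
        buf = queue.Queue(maxsize=4096)  # synchronized queue between threads

        def receiver():
            # Drain the socket as fast as possible so receive buffers never
            # overflow (the paper uses Netmap ring buffers; plain sockets here).
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            sock.bind(("", PORT))
            while True:
                buf.put(sock.recv(PKT_SIZE))

        def worker():
            # Heavier per-packet processing happens off the receive path.
            while True:
                packet = buf.get()
                # ... event building / histogramming would go here ...
                buf.task_done()

        threading.Thread(target=receiver, daemon=True).start()
        threading.Thread(target=worker, daemon=True).start()
        threading.Event().wait(10)       # keep the sketch alive briefly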

  7. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems.

    PubMed

    Abu-Nimeh, Faisal T; Choong, Woon-Seng

    2017-03-01

    Link efficiency, data integrity, and continuity for high-throughput and real-time systems are crucial. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which can achieve 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card's ring buffers, bypassing the operating system kernel's networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity.

  8. Wind-sea surface temperature-sea ice relationship in the Chukchi-Beaufort Seas during autumn

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Stegall, Steve T.; Zhang, Xiangdong

    2018-03-01

    Dramatic climate changes, especially the largest sea ice retreat during September and October, in the Chukchi-Beaufort Seas could be a consequence of, and further enhance, complex air-ice-sea interactions. To detect these interaction signals, statistical relationships between surface wind speed, sea surface temperature (SST), and sea ice concentration (SIC) were analyzed. The results show a negative correlation between wind speed and SIC. The relationships between wind speed and SST are complicated by the presence of sea ice, with a negative correlation over open water but a positive correlation in sea ice dominated areas. The examination of spatial structures indicates that wind speed tends to increase when approaching the ice edge from open water and the area fully covered by sea ice. The anomalous downward radiation and thermal advection, as well as their regional distribution, play important roles in shaping these relationships, though wind-driven sub-grid scale boundary layer processes may also have contributions. Considering the feedback loop involved in the wind-SST-SIC relationships, climate model experiments would be required to further untangle the underlying complex physical processes.

  9. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation with the one from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
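
    In the spirit of the comparison, swapping FFT backends and timing them looks roughly like the following; the grid size is arbitrary and this is not the authors' WavePy benchmark.

        import time
        import numpy as np
        import cv2  # OpenCV

        grid = np.random.rand(2048, 2048).astype(np.float32)  # arbitrary size

        t0 = time.perf_counter()
        np.fft.fft2(grid)                                # NumPy backend
        t1 = time.perf_counter()
        cv2.dft(grid, flags=cv2.DFT_COMPLEX_OUTPUT)      # OpenCV backend
        t2 = time.perf_counter()

        print(f"numpy fft2: {t1 - t0:.3f} s, cv2.dft: {t2 - t1:.3f} s")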

  10. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.
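
    As a flavor of the BNP priors on offer, the stick-breaking construction of Dirichlet-process mixture weights can be sketched in a few lines of numpy; this is illustrative only, not the package's MATLAB internals.

        import numpy as np

        rng = np.random.default_rng(0)
        alpha, K = 1.0, 20                    # concentration; truncation level
        betas = rng.beta(1.0, alpha, size=K)  # stick-breaking proportions
        # weight_k = beta_k * prod_{j<k} (1 - beta_j)
        weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
        print(weights.sum())                  # approaches 1 as K grows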

  11. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    PubMed Central

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-01-01

    Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Conclusion Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/. PMID:19292916
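
    One of the modular steps named above, routine peak finding, can be sketched with scipy on a synthetic two-peak spectrum; this stands in for, and is not, Decon2LS's own processing classes.

        import numpy as np
        from scipy.signal import find_peaks

        # Synthetic m/z axis with two Gaussian isotope peaks (invented data).
        mz = np.linspace(400.0, 402.0, 2000)
        intensity = (np.exp(-((mz - 400.5) ** 2) / 2e-4)
                     + 0.5 * np.exp(-((mz - 401.0) ** 2) / 2e-4))

        peaks, _ = find_peaks(intensity, height=0.1)
        print(mz[peaks])  # approximate centroids of the two peaks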

  12. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    PubMed

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/.

  13. Research on Closed Residential Area Based on Balanced Distribution Theory

    NASA Astrophysics Data System (ADS)

    Lan, Si; Fang, Ni; Lin, Hai Peng; Ye, Shi Qi

    2018-06-01

    With the promotion of the open street-block system, residential quarters and unit compounds are gradually being opened. In this paper, a traffic flow model is established for the external roads, and a road resistance model is established for the internal roads. We propose a balanced distribution model covering both road opening conditions and traffic flow inside and outside the district, and quantitatively analyze the impact of opening a closed residential area on the surrounding roads. Finally, we put forward feasible suggestions to improve the traffic situation and optimize the network structure.
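
    A conventional form for the road resistance model mentioned above is the BPR-style link travel time function sketched below; the parameter defaults are the textbook values, not necessarily the authors'.

        def travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
            # Link travel time grows with the volume/capacity ratio:
            # t = t0 * (1 + alpha * (v/c)^beta)
            return t0 * (1.0 + alpha * (volume / capacity) ** beta)

        print(travel_time(t0=60.0, volume=800.0, capacity=1000.0))  # seconds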

  14. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, and availability. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, is built from interoperable code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will be focused on distributing the cloud application on additional VMs, and testing the scalability and availability of services.

  15. New Process Controls for the Hera Cryogenic Plant

    NASA Astrophysics Data System (ADS)

    Böckmann, T.; Clausen, M.; Gerke, Chr.; Prüß, K.; Schoeneburg, B.; Urbschat, P.

    2010-04-01

    The cryogenic plant built for the HERA accelerator at DESY in Hamburg (Germany) has now been in operation for more than two decades, and the commercial process control system for the plant has been in operation for the same period. Since then, the operator stations, the control network and the CPU boards in the process controllers have gone through several upgrade stages. Only the centralized input/output system was kept unchanged, and many components have been running beyond their expected lifetime. The control system for one of the three parts of the cryogenic plant has recently been replaced by a distributed I/O system. The I/O nodes are connected to several Profibus-DP field buses. Profibus provides the infrastructure to attach intelligent sensors and actuators directly to the process controllers, which run the open-source process control software EPICS. This paper describes the modification process on all levels, from cabling through I/O configuration and the process control software up to the operator displays.
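
    For a sense of how EPICS process variables are accessed programmatically, here is a minimal read/write sketch using the pyepics client library; the PV names are placeholders, not the HERA plant's.

        from epics import caget, caput

        temperature = caget("CRYO:HE:TEMP")      # hypothetical PV name
        print("He temperature:", temperature)
        caput("CRYO:VALVE1:SETPOINT", 42.0)      # hypothetical setpoint PV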

  16. Advanced porous electrodes with flow channels for vanadium redox flow battery

    NASA Astrophysics Data System (ADS)

    Bhattarai, Arjun; Wai, Nyunt; Schweiss, Ruediger; Whitehead, Adam; Lim, Tuti M.; Hng, Huey Hoon

    2017-02-01

    Improving the overall energy efficiency by reducing pumping power and improving the flow distribution of electrolyte is a major challenge for developers of flow batteries. The use of suitable channels can improve flow distribution through the electrodes and reduce flow resistance, hence reducing the energy consumption of the pumps. Although several studies of the vanadium redox flow battery have proposed the use of bipolar plates with flow channels, similar to fuel cell designs, this paper presents the use of flow channels in the porous electrode as an alternative approach. Four types of electrodes with channels: rectangular open channels, interdigitated open cut channels, interdigitated circular poked channels and cross poked circular channels, are studied and compared with a conventional electrode without channels. Our study shows that interdigitated open channels can improve the overall energy efficiency by up to 2.7% owing to improved flow distribution and reduced pump power, while interdigitated poked channels can improve it by up to 2.5% owing to improved flow distribution.

  17. A portable approach for PIC on emerging architectures

    NASA Astrophysics Data System (ADS)

    Decyk, Viktor

    2016-03-01

    A portable approach for designing Particle-in-Cell (PIC) algorithms on emerging exascale computers is based on the recognition that 3 distinct programming paradigms are needed. They are: low level vector (SIMD) processing, middle level shared memory parallel programming, and high level distributed memory programming. In addition, there is a memory hierarchy associated with each level. Such algorithms can be initially developed using vectorizing compilers, OpenMP, and MPI. This is the approach recommended by Intel for the Phi processor. These algorithms can then be translated and possibly specialized to other programming models and languages, as needed. For example, the vector processing and shared memory programming might be done with CUDA instead of vectorizing compilers and OpenMP, but generally the algorithm itself is not greatly changed. The UCLA PICKSC web site at http://www.idre.ucla.edu/ contains example open source skeleton codes (mini-apps) illustrating each of these three programming models, individually and in combination. Fortran2003 now supports abstract data types, and design patterns can be used to support a variety of implementations within the same code base. Fortran2003 also supports interoperability with C, so that implementations in C languages are also easy to use. Finally, main codes can be translated into dynamic environments such as Python, while still taking advantage of high-performing compiled languages. Parallel languages are still evolving, with interesting developments in Co-array Fortran, UPC, and OpenACC, among others, and these can also be supported within the same software architecture. Work supported by NSF and DOE Grants.
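
    The three levels can be mimicked in a single Python sketch: numpy vectorization standing in for SIMD, a process pool for shared-memory parallelism, and mpi4py for distributed memory; this illustrates the layering only and is not one of the PICKSC skeleton codes.

        # Run with e.g.: mpirun -n 2 python pic_levels.py
        import numpy as np
        from multiprocessing import Pool
        from mpi4py import MPI

        comm = MPI.COMM_WORLD                  # high level: distributed memory
        rank = comm.Get_rank()

        x = np.linspace(rank, rank + 1, 1_000_000)

        def kernel(chunk):
            # Middle level: shared-memory worker; low level: numpy
            # vectorization stands in for SIMD processing.
            return float(np.sum(np.sin(chunk)))

        if __name__ == "__main__":
            with Pool(processes=2) as pool:
                local = sum(pool.map(kernel, np.array_split(x, 2)))
            total = comm.reduce(local, op=MPI.SUM, root=0)
            if rank == 0:
                print("global sum:", total)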

  18. Innovative Ways of Visualising Meta Data in 4D Using Open Source Libraries

    NASA Astrophysics Data System (ADS)

    Balhar, Jakub; Valach, Pavel; Veselka, Jonas; Voumard, Yann

    2016-08-01

    There are more and more data being measured by different Earth Observation satellites around the world. The ever-increasing amount of these data presents new challenges and opportunities for their visualization. In this paper we propose how to visualize the amount, distribution and structure of the data in a transparent way that takes the time dimension into account as well. Our approach allows us to get a global overview as well as detailed regional information about the distribution of the products from EO missions. We focus on introducing our mobile-friendly and easy-to-use web mapping application for 4D visualization of the data. Apart from that, we also present the Java application which can read and process the data from various data sources.

  19. Liberalization of the Spanish electricity sector: An advanced model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unda, J.I.

    1998-06-01

    Spain's electricity industry is being restructured to provide a competitive generation market, a regulated, open access transmission and distribution system, and phased-in customer choice. But while the reform is radical in its objectives, it will be gradual in its implementation. This article briefly describes the current state of affairs within the Spanish electricity sector and details the reform plans set out in the act, focusing on the adopted institutional design and the established transition period. It also offers an overview of the role that the regulatory authority will play throughout the process.

  20. Chemistry in Bioinformatics

    PubMed Central

    Murray-Rust, Peter; Mitchell, John BO; Rzepa, Henry S

    2005-01-01

    Chemical information is now seen as critical for most areas of the life sciences. But unlike bioinformatics, where data are openly available and freely re-usable, most chemical information is closed and cannot be re-distributed without permission. This has led to a failure to adopt modern informatics and software techniques, and therefore to a paucity of chemistry in bioinformatics. New technology, however, offers the hope of making chemical data (compounds and properties) free during the authoring process. We argue that the technology is already available; we require a collective agreement to enhance publication protocols. PMID:15941476

  1. Brillouin lasing in single-mode tapered optical fiber with inscribed fiber Bragg grating array

    NASA Astrophysics Data System (ADS)

    Popov, S. M.; Butov, O. V.; Chamorovskiy, Y. K.; Isaev, V. A.; Kolosovskiy, A. O.; Voloshin, V. V.; Vorob'ev, I. L.; Vyatkin, M. Yu.; Mégret, P.; Odnoblyudov, M.; Korobko, D. A.; Zolotovskii, I. O.; Fotiadi, A. A.

    2018-06-01

    A tapered optical fiber has been manufactured with an array of fiber Bragg gratings (FBG) inscribed during the drawing process. The total fiber peak reflectivity is 5% and the reflection bandwidth is ∼3.5 nm. Coherent frequency-domain reflectometry has been applied for precise profiling of the fiber core diameter and grating reflectivity, both distributed along the whole fiber length. These measurements are in good agreement with the specific features of Brillouin lasing achieved in the semi-open fiber cavity configuration.

  2. Big Data Analytics in Medicine and Healthcare.

    PubMed

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics: value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions on the use of suitable and promising open-source distributed data processing software platforms are given.

  3. Active and Passive User Trust in Sociotechnical Systems

    DTIC Science & Technology

    2015-09-28

    (Drury, 2000) was used for measuring trust quantitatively and an open-ended question was used for... is severe. “If something goes wrong, there is an accountability nightmare.” (P) DISTRIBUTION A: Approved for public release. AFOSR Grant FA9550-12-1-0311. ...it did not take the whole task process into...

  4. Locked Nucleic Acid Flow Cytometry-fluorescence in situ Hybridization (LNA flow-FISH): A Method for Bacterial Small RNA Detection

    DTIC Science & Technology

    2012-01-10

    Keywords: flow cytometry, locked nucleic acid, sRNA, Vibrio. Date Published: 1/10/2012. This is an open-access article distributed under the terms of the Creative... solubilization process to maintain a 10 mL volume. Aliquot the 60% dextran sulfate solution and store at -20 °C until use. 1. Harvest 1x10^8 cells of... bioluminescent Vibrio campbellii or your bacteria of interest and transfer them into a 1.5 mL microcentrifuge tube. This quantity of cells provides...

  5. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  6. Microclimatic, chemical, and mineralogical evidence for tafoni weathering processes on the Miaowan Island, South China

    NASA Astrophysics Data System (ADS)

    Huang, Rihui; Wang, Wei

    2017-02-01

    Tafoni are widely distributed around the world; however, their processes of development remain unclear. In this study, the roles of microclimatic, geochemical and mineralogical processes in tafoni development along the subtropical coastline of the Miaowan Island, south China, are investigated. Field observations were carried out during three visits to the island over a four-year period (2011-2015). The orientations of 184 tafoni openings were measured, and the micrometeorological changes of three tafoni on opposite sides of the island were monitored by pocket weather trackers (Kestrel 4500) in two periods. Samples of residual debris inside three tafoni hosted in a large boulder, the parent rock of the tafoni, and from the weathering profile of a nearby bedrock outcrop were collected for X-ray fluorescence (XRF) and X-ray diffraction (XRD) analyses. The field observations showed that the tafoni were of different sizes and constantly produced flakes and debris inside the tafoni caves, indicating their on-going active development. An increase in Na in the residual debris in tafoni caves on the Miaowan Island is the most obvious evidence of salt weathering. Salt weathering inside the tafoni caves is not intense and does not match the salt-rich environment outside the caves, indicating that the influence of salt is not strong. The loss of K, Ca, and Mg in the residue samples, and the appearance of the clay mineral montmorillonite, are caused by chemical weathering. Most of the tafoni openings face mountains, demonstrating the effect of humidity in tafoni weathering. Tafoni cave shapes are related to the distribution of humid water vapour, which tends to collect at the top of the cave and leads to more intensive development there than in other parts. Drastic daily changes in relative humidity inside tafoni caves accelerate mechanical weathering owing to the swelling and shrinking of salt and clay minerals. The Miaowan Island tafoni are formed by weathering, but they cannot be simply interpreted as the product of a single weathering process.

  7. A hybrid analytical model for open-circuit field calculation of multilayer interior permanent magnet machines

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Xia, Changliang; Yan, Yan; Geng, Qiang; Shi, Tingna

    2017-08-01

    Due to the complicated rotor structure and nonlinear saturation of the rotor bridges, it is difficult to build a fast and accurate analytical field calculation model for multilayer interior permanent magnet (IPM) machines. In this paper, a hybrid analytical model suitable for the open-circuit field calculation of multilayer IPM machines is proposed by coupling the magnetic equivalent circuit (MEC) method and the subdomain technique. In the proposed analytical model, the rotor magnetic field is calculated by the MEC method based on Kirchhoff's laws, while the field in the stator slot, slot opening and air-gap is calculated by the subdomain technique based on Maxwell's equations. To solve the whole field distribution of the multilayer IPM machines, the coupled boundary conditions on the rotor surface are deduced for the coupling of the rotor MEC and the analytical field distribution of the stator slot, slot opening and air-gap. The hybrid analytical model can be used to calculate the open-circuit air-gap field distribution, back electromotive force (EMF) and cogging torque of multilayer IPM machines. Compared with finite element analysis (FEA), it has the advantages of faster modeling, lower computational cost and shorter computation times, while achieving comparable accuracy. The analytical model is helpful and applicable for the open-circuit field calculation of multilayer IPM machines of any size and pole/slot number combination.
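
    The MEC half of the model amounts to solving a Kirchhoff-style network of reluctances; the toy two-node circuit below shows the computation pattern, with invented reluctance and MMF values rather than a real rotor geometry.

        import numpy as np

        R1, R2, R3 = 1e6, 2e6, 5e5   # reluctances (A-turns/Wb), assumed values
        F = 100.0                    # magnet MMF source (A-turns), assumed
        # Nodal analysis G * u = b, with permeances G = 1/R on each branch:
        # the source F drives node 1 through R1, R2 links the nodes, and R3
        # (the "air-gap" branch) returns node 2 to the reference.
        G = np.array([[1/R1 + 1/R2, -1/R2],
                      [-1/R2,        1/R2 + 1/R3]])
        b = np.array([F / R1, 0.0])
        u = np.linalg.solve(G, b)    # magnetic scalar potentials at the nodes
        flux_gap = u[1] / R3         # flux through the air-gap branch (Wb)
        print(flux_gap)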

  8. A distributed data component for the open modeling interface

    USDA-ARS?s Scientific Manuscript database

    As the volume of collected data continues to increase in the environmental sciences, so does the need for effective means for accessing those data. We have developed an Open Modeling Interface (OpenMI) data component that retrieves input data for model components from environmental information systems...

  9. Partial liquid ventilation: effects of closed breathing systems, heat-and-moisture-exchangers and sodalime absorbers on perfluorocarbon evaporation.

    PubMed

    Wilms, C T; Schober, P; Kalb, R; Loer, S A

    2006-01-01

    During partial liquid ventilation perfluorocarbons are instilled into the airways, from where they subsequently evaporate via the bronchial system. This process is influenced by multiple factors, such as the vapour pressure of the perfluorocarbons, the instilled volume, intrapulmonary perfluorocarbon distribution, postural positioning and ventilatory settings. In our study we compared the effects of open and closed breathing systems, a heat-and-moisture-exchanger and a sodalime absorber on perfluorocarbon evaporation during partial liquid ventilation. Isolated rat lungs were suspended from a force transducer. After intratracheal perfluorocarbon instillation (10 mL kg(-1)) the lungs were ventilated with either an open breathing system (n = 6), a closed breathing system (n = 6), an open breathing system with an integrated heat-and-moisture-exchanger (n = 6), an open breathing system with an integrated sodalime absorber (n = 6), or a closed breathing system with an integrated heat-and-moisture-exchanger and a sodalime absorber (n = 6). Evaporative perfluorocarbon elimination was determined gravimetrically. When compared to the elimination half-life in an open breathing system (1.2 +/- 0.07 h), the elimination half-life was longer with a closed system (6.4 +/- 0.9 h, P < 0.05). Evaporative perfluorocarbon loss can be reduced effectively with closed breathing systems, followed by the use of sodalime absorbers and heat-and-moisture-exchangers.

  10. New particle formation in the Svalbard region 2006-2015

    NASA Astrophysics Data System (ADS)

    Heintzenberg, Jost; Tunved, Peter; Galí, Martí; Leck, Caroline

    2017-05-01

    Events of new particle formation (NPF) were analyzed in a 10-year data set of hourly particle size distributions recorded on Mt. Zeppelin, Spitsbergen, Svalbard. Three different types of NPF events were identified through objective search algorithms. The first and simplest algorithm utilizes short-term increases in particle concentrations below 25 nm (PCT (percentiles) events). The second one builds on the growth of the sub-50 nm diameter median (DGR (diameter growth) events) and is most closely related to the classical banana type of event. The third and most complex, multiple-size approach to identifying NPF events builds on a hypothesis suggesting the concurrent production of polymer gel particles at several sizes below ca. 60 nm (MEV (multi-size growth) events). As a first and general conclusion, we can state that NPF events are a summer phenomenon and not related to Arctic haze, which is a late winter to early spring feature. The occurrence of NPF events appears to be somewhat sensitive to the available data on precipitation. The seasonal distribution of solar flux suggests some photochemical control that may affect marine biological processes generating particle precursors and/or atmospheric photochemical processes that generate condensable vapors from precursor gases. Notably, the seasonal distribution of biogenic methanesulfonate (MSA) follows that of the solar flux, although it peaks before the maxima in NPF occurrence. A host of ancillary data and findings point to varying and rather complex marine biological source processes. The potential source regions for all types of new particle formation appear to be restricted to the marginal-ice and open-water areas between northeastern Greenland and eastern Svalbard. Depending on conditions yet to be clarified, new particle formation may become visible as short bursts of particles around 20 nm (PCT events), longer events involving condensation growth (DGR events), or extended events with elevated concentrations of particles at several sizes below 100 nm (MEV events). The seasonal distribution of NPF events peaks later than that of MSA, and DGR and, in particular, MEV events reach into late summer and early fall with open, warm, and biologically active waters around Svalbard. Consequently, a simple model to describe the seasonal distribution of the total number of NPF events can be based on solar flux and sea surface temperature, representing environmental conditions for marine biological activity and condensation sink, controlling the balance between new particle nucleation and their condensational growth. Based on the sparse knowledge about the seasonal cycle of gel-forming marine microorganisms and their controlling factors, we hypothesize that the seasonal distributions of DGR and, more so, MEV events reflect the seasonal cycle of the gel-forming phytoplankton.

  11. nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab

    PubMed Central

    Cajigas, I.; Malik, W.Q.; Brown, E.N.

    2012-01-01

    Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process generalized linear model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with the problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
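
    For example, the peri-stimulus time histogram mentioned above is straightforward to express outside Matlab as well; the following is a small illustrative Python/NumPy sketch, not nSTAT code.

        import numpy as np

        def psth(spike_times, stim_times, window=(-0.1, 0.5), bin_width=0.01):
            """Peri-stimulus time histogram: mean firing rate (Hz) around stimulus onsets."""
            edges = np.arange(window[0], window[1] + bin_width, bin_width)
            counts = np.zeros(len(edges) - 1)
            for t0 in stim_times:
                rel = np.asarray(spike_times) - t0          # spikes relative to onset
                counts += np.histogram(rel, bins=edges)[0]
            rate = counts / (len(stim_times) * bin_width)   # average across trials
            return edges[:-1], rate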

  12. Flow processes on the catchment scale - modeling of initial structural states and hydrological behavior in an artificial exemplary catchment

    NASA Astrophysics Data System (ADS)

    Maurer, Thomas; Caviedes-Voullième, Daniel; Hinz, Christoph; Gerke, Horst H.

    2017-04-01

    Landscapes that are heavily disturbed or newly formed by either natural processes or human activity are in a state of disequilibrium. Their initial development is thus characterized by highly dynamic processes under all climatic conditions. The primary distribution and structure of the solid phase (i.e. mineral particles forming the pore space) is one of the decisive factors for the development of hydrological behavior of the eco-hydrological system and therefore (co-)determining for its - more or less - stable final state. The artificially constructed 'Hühnerwasser' catchment (a 6 ha area located in the open-cast lignite mine Welzow-Süd, southern Brandenburg, Germany) is a landscape laboratory where the initial eco-hydrological development has been observed since 2005. The specific formation (or construction) processes generated characteristic sediment structures and distributions, resulting in a spatially heterogeneous initial state of the catchment. We developed a structure generator that simulates the characteristic distribution of the solid phase for such constructed landscapes. The program is able to generate quasi-realistic structures and sediment compositions on multiple spatial levels (1 cm up to 100 m scale). The generated structures can be i) conditioned to actual measurement values (e.g., soil texture and bulk distribution); ii) stochastically generated; or iii) calculated deterministically according to the geology and technical processes at the excavation site. Results are visualized using the GOCAD software package and the free software Paraview. Based on the 3D spatial sediment distributions, effective hydraulic van Genuchten parameters are calculated using pedotransfer functions. The hydraulic behavior of different sediment distributions (i.e. versions or variations of the catchment's porous body) is calculated using a numerical model developed by one of us (Caviedes-Voullième). Observation data from catchment monitoring are available for i) determining the boundary conditions (e.g., precipitation), and ii) the calibration/validation of the model (catchment discharge, ground water). The analysis of multiple sediment distribution scenarios should allow us to approximately determine the influence of starting conditions on the initial development of hydrological behavior. We present first flow modeling results for a reference (conditioned) catchment model and variations thereof. We will also give an outlook on further methodical development of our approach.
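
    As a toy illustration of the pedotransfer step only (not the authors' structure generator), the following Python sketch assigns van Genuchten parameters to a stochastically generated texture field; the grid, class probabilities, and parameter values are placeholders drawn from typical literature tables.

        import numpy as np

        # Stochastically generated sediment classes on a plan-view grid.
        rng = np.random.default_rng(42)
        classes = rng.choice(["sand", "loamy_sand", "sandy_loam"],
                             size=(50, 50), p=[0.5, 0.3, 0.2])

        # Pedotransfer lookup: alpha [1/cm] and n [-] per texture class
        # (typical literature values, used here as placeholders).
        pedotransfer = {"sand": (0.145, 2.68), "loamy_sand": (0.124, 2.28),
                        "sandy_loam": (0.075, 1.89)}
        alpha = np.vectorize(lambda c: pedotransfer[c][0])(classes)
        n = np.vectorize(lambda c: pedotransfer[c][1])(classes)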

  13. Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node

    NASA Astrophysics Data System (ADS)

    Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten

    2016-04-01

    The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continuously increase during the next 5 years. IPSL holds replicas of the output of different global and regional climate models, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. In order to let scientists perform analysis of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC), in the framework of the birdhouse software, is used. The processes can be run by users remotely through a web-based WPS client or by using a command-line tool. All the calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they will be downloaded and cached by a WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar-archives or NetCDF files. We present the architecture of WPS at IPSL along with the processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.: regridding/interpolation/aggregation; ocgis (OpenClimateGIS)-based polygon subsetting of the data; average seasonal cycle, multimodel mean, and multimodel mean bias; calculation of climate indices with the icclim library (CERFACS); and atmospheric modes of variability. In order to evaluate the performance of any new model, once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node following the scientific community's needs.
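
    For illustration, a WPS process on such a node can be discovered and executed from Python with the OWSLib client; the endpoint URL, process identifier, and inputs below are hypothetical, not the actual IPSL deployment.

        from owslib.wps import WebProcessingService

        # Hypothetical endpoint; the constructor fetches the capabilities document.
        wps = WebProcessingService("http://example.ipsl.fr/wps")
        print([p.identifier for p in wps.processes])      # discover available processes

        # Execute a (hypothetical) polygon-subsetting process on a CMIP5 dataset.
        execution = wps.execute(
            "polygon_subset",
            inputs=[("dataset", "tas_Amon_IPSL-CM5A-LR_historical.nc"),
                    ("region", "POLYGON((-10 35, 30 35, 30 60, -10 60, -10 35))")],
        )
        execution.checkStatus()                           # poll until the server is done
        print(execution.status, [o.reference for o in execution.processOutputs])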

  14. The role of fractional time-derivative operators on anomalous diffusion

    NASA Astrophysics Data System (ADS)

    Tateishi, Angel A.; Ribeiro, Haroldo V.; Lenzi, Ervin K.

    2017-10-01

    The generalized diffusion equations with fractional-order derivatives have been shown to be quite efficient in describing diffusion in complex systems, with the advantage of producing exact expressions for the underlying diffusive properties. Recently, researchers have proposed different fractional-time operators (namely, the Caputo-Fabrizio and Atangana-Baleanu operators) which, differently from the well-known Riemann-Liouville operator, are defined by non-singular memory kernels. Here we propose to use these new operators to generalize the usual diffusion equation. By analyzing the corresponding fractional diffusion equations within the continuous time random walk framework, we obtained waiting time distributions characterized by exponential, stretched exponential, and power-law functions, as well as a crossover between two behaviors. For the mean square displacement, we found crossovers between usual and confined diffusion, and between usual and sub-diffusion. We obtained the exact expressions for the probability distributions, where non-Gaussian and stationary distributions emerged. The latter feature is remarkable because the fractional diffusion equation is solved without external forces and subjected to the free diffusion boundary conditions. We have further shown that these new fractional diffusion equations are related to diffusive processes with stochastic resetting, and to fractional diffusion equations with derivatives of distributed order. Thus, our results suggest that these new operators may be a simple and efficient way of incorporating different structural aspects into the system, opening new possibilities for modeling and investigating anomalous diffusive processes.
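
    For reference, the two non-singular operators are commonly defined for 0 < α < 1 as follows (a sketch in standard notation, with normalization functions M(α) and B(α) and the Mittag-Leffler function E_α):

        {}^{CF}D_t^{\alpha} f(t) = \frac{M(\alpha)}{1-\alpha} \int_0^t f'(\tau)\, \exp\!\left[-\frac{\alpha (t-\tau)}{1-\alpha}\right] \mathrm{d}\tau,

        {}^{AB}D_t^{\alpha} f(t) = \frac{B(\alpha)}{1-\alpha} \int_0^t f'(\tau)\, E_{\alpha}\!\left[-\frac{\alpha (t-\tau)^{\alpha}}{1-\alpha}\right] \mathrm{d}\tau.

    Replacing the first-order time derivative in the usual diffusion equation \partial_t \rho = D\, \partial_x^2 \rho with either operator yields the generalized equations discussed above.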

  15. Dynamics starting at a conical intersection: Application to the photochemistry of pyrrole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellner, Bernhard; Barbatti, Mario; Lischka, Hans

    The photochemical ring opening process in pyrrole has been investigated by performing classical on-the-fly dynamics using the multiconfiguration self-consistent field method for the computation of energies and energy gradients. As starting point for the dynamics the conical intersection corresponding to the ring-puckered ring-opened structure, determined previously [Barbatti et al., J. Chem. Phys. 125, 164323 (2006)], has been chosen. Two sets of initial conditions for the nuclear velocities were constructed: (i) nuclear velocities in the branching (g,h) plane of the conical intersection and (ii) statistical distribution for all atoms. Both sets of initial conditions show very similar results. Reactive trajectories are only found in a very limited sector in the (g,h) plane and reaction products are very similar. Within the simulation time of 1 ps, ring opening of pyrrole to the biradical NH=CH-CH-CH=CH chain followed by ring closure to a substituted cyclopropene structure (NH=CH-C3H3) is observed. The computed structural data correlate well with the experimentally observed dissociation products.

  16. Transcription closed and open complex dynamics studies reveal balance between genetic determinants and co-factors

    NASA Astrophysics Data System (ADS)

    Sala, Adrien; Shoaib, Muhammad; Anufrieva, Olga; Mutharasu, Gnanavel; Jahan Hoque, Rawnak; Yli-Harja, Olli; Kandhavelu, Meenakshisundaram

    2015-05-01

    In E. coli, promoter closed and open complexes are key steps in transcription initiation, where magnesium-dependent RNA polymerase catalyzes RNA synthesis. However, the exact mechanism of initiation remains to be fully elucidated. Here, using single mRNA detection and dual reporter studies, we show that increased intracellular magnesium concentration affects Plac initiation complex formation resulting in a highly dynamic process over the cell growth phases. Mg2+ regulates transcription transition, which modulates bimodality of mRNA distribution in the exponential phase. We reveal that Mg2+ regulates the size and frequency of the mRNA burst by changing the open complex duration. Moreover, increasing magnesium concentration leads to higher intrinsic and extrinsic noise in the exponential phase. RNAP-Mg2+ interaction simulation reveals critical movements creating a shorter contact distance between aspartic acid residues and Nucleotide Triphosphate residues and increasing electrostatic charges in the active site. Our findings provide unique biophysical insights into the balanced mechanism of genetic determinants and magnesium ion in transcription initiation regulation during cell growth.

  17. Modeling of lipase catalyzed ring-opening polymerization of epsilon-caprolactone.

    PubMed

    Sivalingam, G; Madras, Giridhar

    2004-01-01

    Enzymatic ring-opening polymerization of epsilon-caprolactone by various lipases was investigated in toluene at various temperatures. The determination of molecular weight and structural identification was carried out with gel permeation chromatography and proton NMR, respectively. Among the various lipases employed, an immobilized lipase from Candida antarctica B (Novozym 435) showed the highest catalytic activity. The polymerization of epsilon-caprolactone by Novozym 435 showed an optimal temperature of 65 degrees C and an optimum toluene content of 50/50 v/v of toluene and epsilon-caprolactone. As lipases can degrade polyesters, a maximum in the molecular weight with time was obtained due to the competition of ring-opening polymerization and degradation by specific chain-end scission. The optimum temperature, toluene content, and the variation of molecular weight with time are consistent with earlier observations. A comprehensive model based on continuous distribution kinetics was developed to model these phenomena. The model accounts for simultaneous polymerization, degradation and enzyme deactivation and provides a technique to determine the rate coefficients for these processes. The dependence of these rate coefficients on temperature and monomer concentration is also discussed.

  18. Leveraging the BPEL Event Model to Support QoS-aware Process Execution

    NASA Astrophysics Data System (ADS)

    Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf

    Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault-handling mechanisms to handle functional faults like invalid message types, it still lacks a flexible native mechanism to handle non-functional exceptions associated with violations of QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach by an extension to an open-source BPEL engine.
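
    Schematically, such a rule pairs a BPEL engine event with a QoS condition and a recovery action. The Python sketch below is purely illustrative: the event fields, threshold, and recovery actions are invented for this sketch, and real ECA rules would normally be expressed declaratively (e.g. in XML) rather than in code.

        SLA_MAX_RESPONSE_MS = 2000  # assumed SLA response-time bound

        def reroute_to_backup_service(partner_link):
            print(f"switching {partner_link} to backup endpoint")

        def log_sla_violation(event):
            print("SLA violation:", event)

        def on_event(event):
            """Event-Condition-Action handler fired on engine events."""
            if event["type"] != "invokeCompleted":            # Event
                return
            if event["response_ms"] <= SLA_MAX_RESPONSE_MS:   # Condition: QoS level met
                return
            reroute_to_backup_service(event["partnerLink"])   # Action: recovery
            log_sla_violation(event)

        # on_event({"type": "invokeCompleted", "partnerLink": "supplier", "response_ms": 3500})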

  19. Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.

    PubMed

    Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel

    2017-01-01

    OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A demonstration of the flexibility of our solution showcases three neuroimaging pipelines harnessing distributed computing environments as heterogeneous as local clusters or the European Grid Infrastructure (EGI).

  20. Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System

    PubMed Central

    Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C.; Parisot, Sarah; Rueckert, Daniel

    2017-01-01

    OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A demonstration of the flexibility of our solution showcases three neuroimaging pipelines harnessing distributed computing environments as heterogeneous as local clusters or the European Grid Infrastructure (EGI). PMID:28381997

  1. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is the repository for Building Energy Management Open Source Software (BEMOSS), which is an open source operating system that is engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system that is built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute to adding additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed cost-effective because it is built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high-occupancy building can be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, be comfortable with its operation, and later expand the deployment to the entire building to make it more energy efficient. (6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote monitoring ability with role-based access control. (7) Security – In addition to built-in security features provided by VOLTTRON, BEMOSS provides enhanced security features, including a BEMOSS discovery approval process, encrypted core-to-node communication, a thermostat anti-tampering feature and many more. (8) Support from the Advisory Committee – BEMOSS was developed in consultation with an advisory committee from the beginning of the project. The BEMOSS advisory committee comprises representatives from 22 organizations from government and industry.

  2. Coordinating the Commons: Diversity & Dynamics in Open Collaborations

    ERIC Educational Resources Information Center

    Morgan, Jonathan T.

    2013-01-01

    The success of Wikipedia demonstrates that open collaboration can be an effective model for organizing geographically-distributed volunteers to perform complex, sustained work at a massive scale. However, Wikipedia's history also demonstrates some of the challenges that large, long-term open collaborations face: the core community of Wikipedia…

  3. Innovation in Open & Distance Learning: Successful Development of Online and Web-Based Learning.

    ERIC Educational Resources Information Center

    Lockwood, Fred, Ed.; Gooley, Anne, Ed.

    This book contains 19 papers examining innovation in open and distance learning through development of online and World Wide Web-based learning. The following papers are included: "Innovation in Distributed Learning: Creating the Environment" (Fred Lockwood); "Innovation in Open and Distance Learning: Some Lessons from Experience…

  4. Addressing the Health of Formerly Imprisoned Persons in a Distressed Neighborhood Through a Community Collaborative Board

    PubMed Central

    Smith, Vivian C.; Jemal, Alexis

    2016-01-01

    This article provides a case study evaluating the structure and dynamic process of a Community Collaborative Board that had the goal of creating an evidence-based substance abuse/health intervention for previously incarcerated individuals. Meeting agendas, attendance, minutes, video recording of meetings, and in-depth interviews with 13 Community Collaborative Board members were used to conduct an independent process evaluation. Open coding identified quotes exemplifying specific themes and/or patterns across answers related to the desired domain. Several themes were identified regarding membership engagement, retention, and power distribution. Results showed member retention was due to strong personal commitment to the targeted problem. Analysis also revealed an unequal power distribution based on participants' background. Nevertheless, the development of an innovative, community-based health intervention manual was accomplished. Aspects of the process, such as incentives, subcommittees, and trainings, enhanced the Board's ability to integrate the community and scientific knowledge to accomplish its research agenda. Community-based participatory research was a useful framework in enhancing quality and efficiency in the development of an innovative, substance abuse/health intervention manual for distressed communities. Overall, this article sheds light on a process that illustrates the integration of community-based and scientific knowledge to address the health, economic, and societal marginalization of low-income, minority communities. PMID:26055460

  5. Photosynthetic microbial mats in the 3,416-Myr-old ocean.

    PubMed

    Tice, Michael M; Lowe, Donald R

    2004-09-30

    Recent re-evaluations of the geological record of the earliest life on Earth have led to the suggestion that some of the oldest putative microfossils and carbonaceous matter were formed through abiotic hydrothermal processes. Similarly, many early Archaean (more than 3,400-Myr-old) cherts have been reinterpreted as hydrothermal deposits rather than products of normal marine sedimentary processes. Here we present the results of a field, petrographic and geochemical study testing these hypotheses for the 3,416-Myr-old Buck Reef Chert, South Africa. From sedimentary structures and distributions of sand and mud, we infer that deposition occurred in normal open shallow to deep marine environments. The siderite enrichment that we observe in deep-water sediments is consistent with a stratified early ocean. We show that most carbonaceous matter was formed by photosynthetic mats within the euphotic zone and distributed as detrital matter by waves and currents to surrounding environments. We find no evidence that hydrothermal processes had any direct role in the deposition of either the carbonaceous matter or the enclosing sediments. Instead, we conclude that photosynthetic organisms had evolved and were living in a stratified ocean supersaturated in dissolved silica 3,416 Myr ago.

  6. Photosynthetic microbial mats in the 3,416-Myr-old ocean

    NASA Astrophysics Data System (ADS)

    Tice, Michael M.; Lowe, Donald R.

    2004-09-01

    Recent re-evaluations of the geological record of the earliest life on Earth have led to the suggestion that some of the oldest putative microfossils and carbonaceous matter were formed through abiotic hydrothermal processes. Similarly, many early Archaean (more than 3,400-Myr-old) cherts have been reinterpreted as hydrothermal deposits rather than products of normal marine sedimentary processes. Here we present the results of a field, petrographic and geochemical study testing these hypotheses for the 3,416-Myr-old Buck Reef Chert, South Africa. From sedimentary structures and distributions of sand and mud, we infer that deposition occurred in normal open shallow to deep marine environments. The siderite enrichment that we observe in deep-water sediments is consistent with a stratified early ocean. We show that most carbonaceous matter was formed by photosynthetic mats within the euphotic zone and distributed as detrital matter by waves and currents to surrounding environments. We find no evidence that hydrothermal processes had any direct role in the deposition of either the carbonaceous matter or the enclosing sediments. Instead, we conclude that photosynthetic organisms had evolved and were living in a stratified ocean supersaturated in dissolved silica 3,416 Myr ago.

  7. Health Technology Assessment: Global Advocacy and Local Realities Comment on "Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness".

    PubMed

    Chalkidou, Kalipso; Li, Ryan; Culyer, Anthony J; Glassman, Amanda; Hofman, Karen J; Teerawattananon, Yot

    2016-08-29

    Cost-effectiveness analysis (CEA) can help countries attain and sustain universal health coverage (UHC), as long as it is context-specific and considered within deliberative processes at the country level. Institutionalising robust deliberative processes requires significant time and resources, however, and countries often begin by demanding evidence (including local CEA evidence as well as evidence about local values), whilst striving to strengthen the governance structures and technical capacities with which to generate, consider and act on such evidence. In low- and middle-income countries (LMICs), such capacities could be developed initially around a small technical unit in the health ministry or health insurer. The role of networks, development partners, and global norm setting organisations is crucial in supporting the necessary capacities.

  8. ON THE BRIGHTNESS AND WAITING-TIME DISTRIBUTIONS OF A TYPE III RADIO STORM OBSERVED BY STEREO/WAVES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastwood, J. P.; Hudson, H. S.; Krucker, S.

    2010-01-10

    Type III solar radio storms, observed at frequencies below ~16 MHz by space-borne radio experiments, correspond to the quasi-continuous, bursty emission of electron beams onto open field lines above active regions. The mechanisms by which a storm can persist in some cases for more than a solar rotation whilst exhibiting considerable radio activity are poorly understood. To address this issue, the statistical properties of a type III storm observed by the STEREO/WAVES radio experiment are presented, examining both the brightness distribution and (for the first time) the waiting-time distribution (WTD). Single power-law behavior is observed in the number distribution as a function of brightness; the power-law index is ~2.1 and is largely independent of frequency. The WTD is found to be consistent with a piecewise-constant Poisson process. This indicates that during the storm individual type III bursts occur independently and suggests that the storm dynamics are consistent with avalanche-type behavior in the underlying active region.
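
    In symbols, the two reported statistics take the following standard forms (a sketch, with B the burst brightness and Δt the waiting time between consecutive bursts):

        \frac{dN}{dB} \propto B^{-\gamma}, \quad \gamma \approx 2.1, \qquad P(\Delta t) = \lambda\, e^{-\lambda \Delta t},

    with the Poisson rate λ taking piecewise-constant values over the course of the storm.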

  9. Towards multifocal ultrasonic neural stimulation: pattern generation algorithms

    NASA Astrophysics Data System (ADS)

    Hertzberg, Yoni; Naor, Omer; Volovick, Alexander; Shoham, Shy

    2010-10-01

    Focused ultrasound (FUS) waves directed onto neural structures have been shown to dynamically modulate neural activity and excitability, opening up a range of possible systems and applications where the non-invasiveness, safety, mm-range resolution and other characteristics of FUS are advantageous. As in other neuro-stimulation and modulation modalities, the highly distributed and parallel nature of neural systems and neural information processing call for the development of appropriately patterned stimulation strategies which could simultaneously address multiple sites in flexible patterns. Here, we study the generation of sparse multi-focal ultrasonic distributions using phase-only modulation in ultrasonic phased arrays. We analyse the relative performance of an existing algorithm for generating multifocal ultrasonic distributions and new algorithms that we adapt from the field of optical digital holography, and find that generally the weighted Gerchberg-Saxton algorithm leads to overall superior efficiency and uniformity in the focal spots, without significantly increasing the computational burden. By combining phased-array FUS and magnetic-resonance thermometry we experimentally demonstrate the simultaneous generation of tightly focused multifocal distributions in a tissue phantom, a first step towards patterned FUS neuro-modulation systems and devices.
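
    The weighted Gerchberg-Saxton idea can be sketched in a few lines of Python. This generic FFT-based version is illustrative only (the authors' implementation propagates between a phased-array aperture and the focal plane rather than via a plain FFT): it iterates between planes, keeps only the phase in the aperture, and boosts the weights of under-performing foci.

        import numpy as np

        def weighted_gs(target_amp, n_iter=50, seed=0):
            """Phase-only pattern whose far field approximates target_amp (focal spots)."""
            rng = np.random.default_rng(seed)
            weights = target_amp.astype(float).copy()
            field = np.exp(2j * np.pi * rng.random(target_amp.shape))  # random start phase
            spots = target_amp > 0
            for _ in range(n_iter):
                far = np.fft.fft2(field)                  # propagate aperture -> focal plane
                amp = np.abs(far)
                # weighted step: boost foci that came out too dim relative to target
                weights[spots] *= target_amp[spots] / np.maximum(amp[spots], 1e-12)
                far = weights * np.exp(1j * np.angle(far))        # impose weighted amplitude
                field = np.exp(1j * np.angle(np.fft.ifft2(far)))  # phase-only constraint
            return np.angle(field)

        # usage: target = np.zeros((64, 64)); target[20, 20] = target[40, 44] = 1.0
        #        phase_mask = weighted_gs(target)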

  10. Apparatus tube configuration and mounting for solid oxide fuel cells

    DOEpatents

    Zymboly, Gregory E.

    1993-01-01

    A generator apparatus (10) is made containing long, hollow, tubular fuel cells containing an inner air electrode (64), an outer fuel electrode (56), and solid electrolyte (54) therebetween, placed between a fuel distribution board (29) and a board (32) which separates the combustion chamber (16) from the generating chamber (14), where each fuel cell has an insertable open end and an insertable, plugged, closed end (44), the plugged end being inserted into the fuel distribution board (29) and the open end being inserted through the separator board (32), where the plug (60) is completely within the fuel distribution board (29).

  11. Investigation of the effect of phase nonuniformities and the microwave field distribution on the electronic efficiency of a diffraction-radiation generator

    NASA Astrophysics Data System (ADS)

    Maksimov, P. P.; Tsvyk, A. I.; Shestopalov, V. P.

    1985-10-01

    The effect of local phase nonuniformities of the diffraction gratings and the field distribution of the open cavity on the electronic efficiency of a diffraction-radiation generator (DRG) is analyzed numerically on the basis of a self-consistent system of nonlinear stationary equations for the DRG. It is shown that the interaction power and efficiency of a DRG can be increased by the use of an open cavity with a nonuniform diffraction grating and a complex form of microwave field distribution over the interaction space.

  12. Drake passage and central american seaway controls on the distribution of the oceanic carbon reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fyke, Jeremy G.; D'Orgeville, Marc; Weaver, Andrew J.

    2015-05-01

    A coupled carbon/climate model is used to explore the impact of Drake Passage opening and Central American Seaway closure on the distribution of carbon in the global oceans. We find that gateway evolution likely played an important role in setting the modern day distribution of oceanic dissolved inorganic carbon (DIC), which is currently characterized by relatively low concentrations in the Atlantic ocean, and high concentrations in the Southern, Indian, and Pacific oceans. In agreement with previous studies, we find a closed Drake Passage in the presence of an open Central American Seaway results in suppressed Atlantic meridional overturning and enhanced southern hemispheric deep convection. Opening of the Drake Passage triggers Antarctic Circumpolar Current flow and a weak Atlantic meridional overturning circulation (AMOC). Subsequent Central American Seaway closure reinforces the AMOC while also stagnating equatorial Pacific subsurface waters. These gateway-derived oceanographic changes are reflected in large shifts to the global distribution of DIC. An initially closed Drake Passage results in high DIC concentrations in the Atlantic and Arctic oceans, and lower DIC concentrations in the Pacific/Indian/Southern oceans. Opening Drake Passage reverses this gradient by lowering mid-depth Atlantic and Arctic DIC concentrations and raising deep Pacific/Indian/Southern Ocean DIC concentrations. Central American Seaway closure further reinforces this trend through additional Atlantic mid-depth DIC decreases, as well as Pacific mid-depth DIC concentration increases, with the net effect being a transition to a modern distribution of oceanic DIC.

  13. Advanced and secure architectural EHR approaches.

    PubMed

    Blobel, Bernd

    2006-01-01

    Electronic Health Records (EHRs) provided as a lifelong patient record advance towards core applications of distributed and co-operating health information systems and health networks. For meeting the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and model driven, separating platform-independent and platform-specific models. To keep models manageable, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model - Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives comprise the enterprise view (business process, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. Those views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical, technical, etc. Thus, security-related component models reflecting all views mentioned have to be established for enabling both application and communication security services as an integral part of the system's architecture. Besides decomposition and simplification of systems regarding the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on properties of basic components to form a more complex structure. The resulting models describe both structure and behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles. In that context, the Australian GEHR project, the openEHR initiative and the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, but also the HL7 version 3 activities are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, HL7's clinical document architecture (CDA), as well as the set of models from use cases, activity diagrams and sequence diagrams up to Domain Information Models (DMIMs) and their building blocks, Common Message Element Types (CMETs), constraining models to their underlying concepts. The future-proof EHR architecture as an open, user-centric, user-friendly, flexible, scalable, portable core application in health information systems and health networks has to follow advanced architectural paradigms.

  14. The statistics of Pearce element diagrams and the Chayes closure problem

    NASA Astrophysics Data System (ADS)

    Nicholls, J.

    1988-05-01

    Pearce element ratios are defined as having a constituent in their denominator that is conserved in a system undergoing change. The presence of a conserved element in the denominator simplifies the statistics of such ratios and renders them subject to statistical tests, especially tests of significance of the correlation coefficient between Pearce element ratios. Pearce element ratio diagrams provide unambiguous tests of petrologic hypotheses because they are based on the stoichiometry of rock-forming minerals. There are three ways to recognize a conserved element: 1. The petrologic behavior of the element can be used to select conserved ones. They are usually the incompatible elements. 2. The ratio of two conserved elements will be constant in a comagmatic suite. 3. An element ratio diagram that is not constructed with a conserved element in the denominator will have a trend with a near zero intercept. The last two criteria can be tested statistically. The significance of the slope, intercept and correlation coefficient can be tested by estimating the probability of obtaining the observed values from a random population of arrays. This population of arrays must satisfy two criteria: 1. The population must contain at least one array that has the means and variances of the array of analytical data for the rock suite. 2. Arrays with the means and variances of the data must not be so abundant in the population that nearly every array selected at random has the properties of the data. The population of random closed arrays can be obtained from a population of open arrays whose elements are randomly selected from probability distributions. The means and variances of these probability distributions are themselves selected from probability distributions which have means and variances equal to a hypothetical open array that would give the means and variances of the data on closure. This hypothetical open array is called the Chayes array. Alternatively, the population of random closed arrays can be drawn from the compositional space available to rock-forming processes. The minerals comprising the available space can be described with one additive component per mineral phase and a small number of exchange components. This space is called Thompson space. Statistics based on either space lead to the conclusion that Pearce element ratios are statistically valid and that Pearce element diagrams depict the processes that create chemical inhomogeneities in igneous rock suites.
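
    A sketch of the underlying logic: if Z is conserved while a mineral phase containing X and Y in the stoichiometric ratio a:b is added or removed, every comagmatic sample i satisfies

        \frac{X_i}{Z_i} = \frac{a}{b} \cdot \frac{Y_i}{Z_i} + c,

    so the data fall on a straight line whose slope a/b tests the mineralogical hypothesis; the statistical question addressed here is how significant the observed slope, intercept, and correlation coefficient are relative to random closed arrays.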

  15. Atomic Processes in a Plasma Opening Switch.

    NASA Astrophysics Data System (ADS)

    Klepper, C. C.; Moschella, J. J.; Hazelton, R. C.; Yadlowsky, E. J.; Maron, Y.

    1998-11-01

    Detailed measurements of carbon emission have been carried out in a Plasma Opening Switch (POS) with a planar geometry, in order to characterize the plasma conditions and the ionization process in the POS. Emission from various transitions of C^0 to C^3+ has been measured as a function of time from several viewing chords. For these experiments, the POS was operated with a shorted load at 130 kA and with a ~700 ns conduction time. A single-chord, heterodyne interferometer measured the electron density evolution along a chord coincident with one of the spectroscopic views. The passage of the ionization front across the line of sight is witnessed by both diagnostics. The data are interpreted by analyzing the time-dependent atomic processes. The measured ne rises from 1.5×10^15 to 3×10^15 cm^-3 as the current crosses the view. An initial electron temperature in the 1.3-2 eV range is obtained from the ratio of the C II 4267 Å and 6578 Å lines. The time-dependent line emission of the various charge states shows that Te rises to a few tens of eV at the peak current. The charge state distribution during the pulse will be discussed.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    An OpenStudio Measure is a script that can manipulate an OpenStudio model and associated data to apply energy conservation measures (ECMs), run supplemental simulations, or visualize simulation results. The OpenStudio software development kit (SDK) and accessibility of the Ruby scripting language makes measure authorship accessible to both software developers and energy modelers. This paper discusses the life cycle of an OpenStudio Measure from development, testing, and distribution, to application.

  17. Mechanical Network in Titin Immunoglobulin from Force Distribution Analysis

    PubMed Central

    Wilmanns, Matthias; Gräter, Frauke

    2009-01-01

    The role of mechanical force in cellular processes is increasingly revealed by single molecule experiments and simulations of force-induced transitions in proteins. How the applied force propagates within proteins determines their mechanical behavior yet remains largely unknown. We present a new method based on molecular dynamics simulations to disclose the distribution of strain in protein structures, here for the newly determined high-resolution crystal structure of I27, a titin immunoglobulin (IG) domain. We obtain a sparse, spatially connected, and highly anisotropic mechanical network. This allows us to detect load-bearing motifs composed of interstrand hydrogen bonds and hydrophobic core interactions, including parts distal to the site to which force was applied. The role of the force distribution pattern for mechanical stability is tested by in silico unfolding of I27 mutants. We then compare the observed force pattern to the sparse network of coevolved residues found in this family. We find a remarkable overlap, suggesting the force distribution to reflect constraints for the evolutionary design of mechanical resistance in the IG family. The force distribution analysis provides a molecular interpretation of coevolution and opens the road to the study of the mechanism of signal propagation in proteins in general. PMID:19282960

  18. Microcracking and Healing in Semibrittle Salt-Rock: Elastic and Plastic Behavior

    NASA Astrophysics Data System (ADS)

    Ding, J.; Chester, F. M.; Chester, J. S.; Shen, X.; Arson, C. F.

    2017-12-01

    Microcracking and healing during semibrittle deformation are important processes that affect physical properties such as elastic moduli and permeability. We study these processes through triaxial compression tests involving cyclic differential loading and isostatic holds on synthetic salt-rock at room temperature and low confining pressure (Pc, 1 to 4 MPa). The salt samples are produced by uniaxial pressing of granular (300 µm dia.) halite to 75 MPa at 150˚C for 10^3 s, to create low-porosity (~5%) aggregates of nearly equant, work-hardened grains. Alternating large- and small-load cycles are performed to track the evolution of plastic and elastic properties, respectively, with progressive strain to 8% axial shortening. 24-hour holds are carried out at about 4% axial shortening followed by renewed cyclic loading to investigate healing. During large load cycles samples yield and exhibit distributed flow with dilatancy and small work hardening. Young's Modulus (YM) decreases and then tends to stabilize, while Poisson's Ratio (PR) increases at a reducing rate, with progressive strain. Microstructures at sequential stages show that opening-mode grain-boundary cracking, grain-boundary sliding, and some intracrystalline plasticity are the dominant deformation processes. Opening and shear occur preferentially on boundaries that are parallel and inclined to the shortening axis, respectively, leading to progressive redistribution of porosity. Opening-mode grain-boundary cracks increase in number and aperture with strain, and are linked by sliding grain-boundaries to form en echelon arrays. After a 24-hour hold, samples show yielding and flow behavior consistent with that prior to the hold, whereas YM and PR are reset to the same values documented at zero strain and subsequently evolve with additional strain similar to that documented at smaller strains prior to the hold. Open grain-boundary cracks are not closed or healed during the hold. Observations suggest that changes in elastic properties in the semibrittle salt-rock reflect weakening and healing of grain boundaries undergoing sliding rather than progressive dilatancy or healing of opening-mode cracks. Findings are being used to inform and develop continuum damage mechanics models of semibrittle deformation in polycrystalline aggregates.

  19. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  20. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other hand, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict the air quality one day in advance. In order to overcome the computational requirements for large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is very suitable for tackling large-scale air pollution prediction problems.
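
    The per-node learning step can be pictured with a small scikit-learn sketch; the features, sizes, and hyperparameters below are placeholders, and the Hadoop/MapReduce layer (which distributes such fits over data splits) is omitted.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Placeholder feature matrix: each row holds one day's measurements
        # (pollutant levels, meteorology); y is the next day's pollution index.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 6))
        y = 2.0 * X[:, 0] + X[:, 3] + rng.normal(scale=0.1, size=500)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        model.fit(X[:400], y[:400])          # train on the first 400 days
        pred = model.predict(X[400:])        # one-day-ahead predictions
        print("MAE:", np.mean(np.abs(pred - y[400:])))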

  1. Open access, library and publisher competition, and the evolution of general commerce.

    PubMed

    Odlyzko, Andrew M

    2015-02-01

    Discussions of the economics of scholarly communication are usually devoted to Open Access, rising journal prices, publisher profits, and boycotts. That ignores what seems a much more important development in this market. Publishers, through the oft-reviled Big Deal packages, are providing much greater and more egalitarian access to the journal literature, an approximation to true Open Access. In the process, they are also marginalizing libraries and obtaining a greater share of the resources going into scholarly communication. This is enabling a continuation of publisher profits as well as of what for decades has been called "unsustainable journal price escalation." It is also inhibiting the spread of Open Access and potentially leading to an oligopoly of publishers controlling distribution through large-scale licensing. The Big Deal practices are worth studying for several general reasons. The degree to which publishers succeed in diminishing the role of libraries may be an indicator of the degree and speed at which universities transform themselves. More importantly, these Big Deals appear to point the way to the future of the whole economy, where progress is characterized by declining privacy, increasing price discrimination, increasing opaqueness in pricing, increasing reliance on low-paid or unpaid work of others for profits, and business models that depend on customer inertia.

  2. Terror in the Board Room: The Bid-Opening Process

    ERIC Educational Resources Information Center

    Shoop, James

    2009-01-01

    Competitive bids and the bid-opening process are the cornerstones of public school purchasing. The bid-opening process does not begin on the day of the bid opening. It begins with good planning by the purchasing agent to ensure that the advertised bid complies with the public school contracts law. In New Jersey, that raises the following…

  3. Measuring charged particle multiplicity with early ATLAS public data

    NASA Astrophysics Data System (ADS)

    Üstün, G.; Barut, E.; Bektaş, E.; Özcan, V. E.

    2017-07-01

    We study 100 images of early LHC collisions that were recorded by the ATLAS experiment and made public for outreach purposes, and extract the charged particle multiplicity as a function of momentum for proton-proton collisions at a centre-of-mass energy of 7 TeV. As these collisions have already been pre-processed by the ATLAS Collaboration, the particle tracks are visible, but are available to the public only in the form of low-resolution bitmaps. We describe two separate image processing methods, one based on the industry-standard OpenCV library and C++, another based on self-developed algorithms in Python. We present our analysis of the transverse momentum and azimuthal angle distributions of the particles, in agreement with the literature.
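
    A minimal sketch of the OpenCV route (in Python rather than C++; the file name and Canny/Hough parameters are illustrative, not the values used in the paper):

        import cv2
        import numpy as np

        img = cv2.imread("event.png", cv2.IMREAD_GRAYSCALE)  # low-resolution event display
        edges = cv2.Canny(img, 50, 150)                      # extract track edges
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                threshold=30, minLineLength=20, maxLineGap=5)
        for x1, y1, x2, y2 in (lines[:, 0] if lines is not None else []):
            phi = np.arctan2(y2 - y1, x2 - x1)               # azimuthal angle estimate
            print(f"track candidate at phi = {phi:.2f} rad")

    Straight Hough segments only approximate the curved tracks near the beam line, which is one reason dedicated track-finding heuristics are needed in practice.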

  4. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e^-/Å^2) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
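
    Random (non-adaptive) sub-sampling amounts to visiting a random subset of probe positions; a minimal NumPy sketch, with names invented for illustration:

        import numpy as np

        def random_scan_mask(shape, fraction, seed=1):
            """Boolean mask selecting a random subset of probe positions;
            dose scales with `fraction` at a fixed per-pixel dwell time."""
            rng = np.random.default_rng(seed)
            return rng.random(shape) < fraction

        # e.g. acquire only 20% of a 256x256 STEM raster, then inpaint the rest
        mask = random_scan_mask((256, 256), 0.20)
        # measured = full_image * mask   # the detector only visits masked positions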

  5. CellAnimation: an open source MATLAB framework for microscopy assays.

    PubMed

    Georgescu, Walter; Wikswo, John P; Quaranta, Vito

    2012-01-01

    Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such a wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays, such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms developed by us and others that is best suited for each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip. Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. Contact: walter.georgescu@vanderbilt.edu. Supplementary data are available at Bioinformatics online.
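
    CellAnimation itself is written in MATLAB, but the module-pipeline architecture is easy to picture. The Python sketch below (our illustration, with invented module names, not the CellAnimation API) shows the pattern: each module reads and extends a shared data record, so modules can be recombined freely into new assays.

    ```python
    # Toy pipeline-of-modules pattern: each module takes and returns a dict.
    from typing import Callable, Dict, List
    import numpy as np

    Module = Callable[[Dict], Dict]

    def threshold(data: Dict) -> Dict:
        data["mask"] = data["image"] > data.get("threshold", 0.5)
        return data

    def measure(data: Dict) -> Dict:
        # placeholder metric: number of above-threshold pixels
        data["cell_pixels"] = int(data["mask"].sum())
        return data

    def run_pipeline(modules: List[Module], data: Dict) -> Dict:
        for module in modules:      # modules chain into a pipeline
            data = module(data)
        return data

    result = run_pipeline([threshold, measure],
                          {"image": np.random.rand(64, 64), "threshold": 0.8})
    print(result["cell_pixels"])
    ```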

  6. Statistical organelle dissection of Arabidopsis guard cells using image database LIPS.

    PubMed

    Higaki, Takumi; Kutsuna, Natsumaro; Hosokawa, Yoichiroh; Akita, Kae; Ebine, Kazuo; Ueda, Takashi; Kondo, Noriaki; Hasezawa, Seiichiro

    2012-01-01

    To comprehensively grasp cell biological events in plant stomatal movement, we have captured microscopic images of guard cells with various organelle markers. The 28,530 serial optical sections of 930 pairs of Arabidopsis guard cells have been released as a new image database, named Live Images of Plant Stomata (LIPS). We visualized the average organellar distributions in guard cells using probabilistic mapping and image clustering techniques. The results indicated that actin microfilaments and the endoplasmic reticulum (ER) are mainly localized to the dorsal side and connection regions of guard cells. Subtractive images of open and closed stomata showed distribution changes in intracellular structures, including the ER, during stomatal movement. Time-lapse imaging showed that similar ER distribution changes occurred during stomatal opening induced by light irradiation or femtosecond laser shots on neighboring epidermal cells, indicating that our image analysis approach has identified a novel ER relocation in stomatal opening.

  7. A review on the sources and spatial-temporal distributions of Pb in Jiaozhou Bay

    NASA Astrophysics Data System (ADS)

    Yang, Dongfang; Zhang, Jie; Wang, Ming; Zhu, Sixi; Wu, Yunjie

    2017-12-01

    This paper reviews the sources, spatial distribution and temporal variation of Pb in Jiaozhou Bay, based on investigations of Pb in surface and bottom waters in different seasons during 1979-1983. The strengths of the Pb sources in Jiaozhou Bay showed increasing trends, and the pollution level of Pb in the bay was slight to moderate in the early stage of reform and opening-up. Pb contents in the bay were mainly determined by the strength and frequency of Pb inputs from human activities, and Pb could move from high-content areas to low-content areas in the ocean interior. Surface waters were polluted directly by human activities, while bottom waters were polluted through vertical water exchange. The spatial distribution of Pb in the waters developed in three steps: 1) Pb was transferred to surface waters in the bay, 2) Pb was transported across the surface waters, and 3) Pb was transferred to and accumulated in bottom waters.

  8. [Distribution of acetylcholinesterase activity in the digestive system of the gastropod molluscs Littorina littorea and Achatina fulica].

    PubMed

    Zaĭtseva, O V; Kuznetsova, T V

    2008-01-01

    Using a histochemical procedure for the demonstration of acetylcholinesterase (AchE) activity, the distribution of cholinergic regulatory elements was studied in the esophagus, pharynx, stomach, liver (digestive gland) and intestine of marine and terrestrial gastropod molluscs that differ in their general level of organization, lifestyle, habitat and feeding type. In both molluscs, all parts of the digestive tract contained a significant number of intraepithelial AchE-positive cells of the open type, single subepithelial neurons, and nerve fibers localized among the muscle cells of the organ walls. The basal processes of the AchE-positive intraepithelial cells were shown to form an intraepithelial nerve plexus and to pass under the epithelium. The peculiarities and common principles in the distribution of the detected nervous elements, their possible functions and their regulatory role in digestion in gastropod molluscs and other animals are discussed.

  9. Preparation and Distribution of Cannabis and Cannabis-Derived Dosage Formulations for Investigational and Therapeutic Use in the United States

    PubMed Central

    Thomas, Brian F.; Pollard, Gerald T.

    2016-01-01

    Cannabis is classified as a schedule I controlled substance by the US Drug Enforcement Administration, meaning that it is considered to have no accepted medicinal value. Production is legally restricted to a single supplier at the University of Mississippi, and distribution to researchers is tightly controlled. However, a majority of the population is estimated to believe that cannabis has legitimate medical or recreational value, numerous states have legalized or decriminalized possession to some degree, and the federal government does not strictly enforce its law and is considering rescheduling. The explosive increase in the open sale and use of herbal cannabis and its products has occurred with widely variable, and in many cases grossly inadequate, quality control at all levels: growing, processing, storage, distribution, and use. This paper discusses elements of the analytical and regulatory system that need to be put in place to ensure standardization for the researcher and to reduce the hazards of contamination, overdosing, and underdosing for the end-user. PMID:27630566

  10. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes or demons provide an event-driven means of giving active objects shared access to resources, and to each other, while not violating their security.
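
    The blackboard-plus-knowledge-sources organization described here is a classic pattern; the toy Python version below (our illustration, not the AI Bus C++ API) shows knowledge sources firing opportunistically against a shared blackboard until none has anything left to contribute.

    ```python
    # Minimal blackboard sketch: knowledge sources watch a shared store and
    # post contributions when their trigger condition holds.
    from typing import Callable, Dict, List

    class Blackboard:
        def __init__(self) -> None:
            self.facts: Dict[str, object] = {}

    KnowledgeSource = Callable[[Blackboard], bool]  # True if it acted

    def sensor_ks(bb: Blackboard) -> bool:
        if "reading" not in bb.facts:
            bb.facts["reading"] = 42.0        # pretend sensor input
            return True
        return False

    def analysis_ks(bb: Blackboard) -> bool:
        if "reading" in bb.facts and "alarm" not in bb.facts:
            bb.facts["alarm"] = bb.facts["reading"] > 40.0
            return True
        return False

    def control_loop(sources: List[KnowledgeSource], bb: Blackboard) -> None:
        # keep cycling while at least one knowledge source contributes
        while any(ks(bb) for ks in sources):
            pass

    bb = Blackboard()
    control_loop([sensor_ks, analysis_ks], bb)
    print(bb.facts)  # {'reading': 42.0, 'alarm': True}
    ```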

  11. Spatial distribution of heavy metals in surficial sediments from Guanabara Bay: Rio de Janeiro, Brazil

    NASA Astrophysics Data System (ADS)

    Neto, José Antônio Baptista; Gingele, Franz Xaver; Leipe, Thomas; Brehme, Isa

    2006-04-01

    Ninety-two surface sediment samples were collected in Guanabara Bay, one of the most prominent urban bays in SE Brazil, to investigate the spatial distribution of anthropogenic pollutants. The concentrations of heavy metals, organic carbon and particle size were examined in all samples. Large spatial variations of heavy metals and particle size were observed. The highest concentrations of heavy metals were found in the muddy sediments from the north western region of the bay near the main outlets of the most polluted rivers, municipal waste drainage systems and one of the major oil refineries. Another anomalous concentration of metals was found adjacent to Rio de Janeiro Harbour. The heavy metal concentrations decrease to the northeast, due to intact rivers and the mangrove systems in this area, and to the south where the sand fraction and open-marine processes dominate. The geochemical normalization of metal data to Li or Al has also demonstrated that the anthropogenic input of heavy metals has altered the natural sediment heavy metal distribution.

  12. Permeability changes induced by microfissure closure and opening in tectonized materials. Effect on slope pore pressure regime.

    NASA Astrophysics Data System (ADS)

    De la Fuente, Maria; Vaunat, Jean; Pedone, Giuseppe; Cotecchia, Federica; Sollecito, Francesca; Casini, Francesca

    2015-04-01

    Tectonized clays are complex materials characterized by several levels of structure that may evolve during loading and wetting/drying processes. Some microstructural patterns, such as microfissures, have a particular influence on the value of permeability, which is one of the main factors controlling the pore pressure regime in slopes. In this work, the pore pressure regime measured in a real slope of tectonized clay in Southern Italy is analyzed with a numerical model that considers changes in permeability induced by microfissure closure and opening during the wetting and drying processes resulting from climatic actions. The permeability model accounts for the changes in pore size distribution observed by mercury intrusion porosimetry (MIP). MIP tests are performed on representative samples of the ground in initial conditions ("in situ" conditions) and final conditions (deformed samples after applying a wetting path that aims to reproduce the saturation of the soil under heavy rain). The resulting measurements allow for the characterization of the soil at the microstructural level, identifying the distribution of the dominant pore families in the sample and their evolution under external actions. Moreover, comparison of pore size density functions allows the definition of a microstructural parameter that depends on void ratio and degree of saturation and controls the variation of permeability. The model has been implemented in a thermo-hydro-mechanical code provided with a special boundary condition for climatic actions. The tool is used to analyze pore pressure measurements obtained in the tectonized clay slope. Results are analyzed in the light of the effect that permeability changes during wetting and drying have on the pore pressure regime.

  13. TopFed: TCGA tailored federated query processing and linking to LOD.

    PubMed

    Saleem, Muhammad; Padmanabhuni, Shanmukha S; Ngomo, Axel-Cyrille Ngonga; Iqbal, Aftab; Almeida, Jonas S; Decker, Stefan; Deus, Helena F

    2014-01-01

    The Cancer Genome Atlas (TCGA) is a multidisciplinary, multi-institutional effort to catalogue genetic mutations responsible for cancer using genome analysis techniques. One of the aims of this project is to create a comprehensive and open repository of cancer related molecular analysis, to be exploited by bioinformaticians towards advancing cancer knowledge. However, devising bioinformatics applications to analyse such a large dataset is still challenging, as it often requires downloading large archives and parsing the relevant text files, which makes it difficult to enable virtual data integration and to collect the critical co-variates necessary for analysis. We address these issues by transforming the TCGA data into the Semantic Web standard Resource Description Framework (RDF), linking it to relevant datasets in the Linked Open Data (LOD) cloud, and further proposing an efficient data distribution strategy to host the resulting 20.4 billion triples via several SPARQL endpoints. With the TCGA data distributed across multiple SPARQL endpoints, we enable biomedical scientists to query and retrieve information from them through a TCGA-tailored federated SPARQL query processing engine named TopFed. We compare TopFed with the well-established federation engine FedX in terms of source selection and query execution time, using 10 different federated SPARQL queries with varying requirements. Our evaluation shows that TopFed selects on average less than half of the sources (with 100% recall), with a query execution time equal to one third of that of FedX. With TopFed, we aim to offer biomedical scientists a single point of access through which distributed TCGA data can be accessed in unison. We believe the proposed system can greatly help researchers in the biomedical domain to carry out their research effectively with TCGA, as the amount and diversity of data exceeds the ability of local resources to handle its retrieval and parsing.
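
    For readers unfamiliar with SPARQL endpoints, the Python sketch below shows the general access pattern using the SPARQLWrapper library; the endpoint URL and the predicates are placeholders, not TopFed's actual deployment or the TCGA vocabulary.

    ```python
    # Query a (hypothetical) SPARQL endpoint and print the bindings.
    from SPARQLWrapper import SPARQLWrapper, JSON

    endpoint = SPARQLWrapper("http://example.org/tcga/sparql")  # placeholder
    endpoint.setQuery("""
        SELECT ?patient ?result
        WHERE {
            ?patient a <http://example.org/tcga/Patient> .
            ?patient <http://example.org/tcga/hasResult> ?result .
        }
        LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)

    for row in endpoint.query().convert()["results"]["bindings"]:
        print(row["patient"]["value"], row["result"]["value"])
    ```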

  14. ENKI - An Open Source environmental modelling platform

    NASA Astrophysics Data System (ADS)

    Kolberg, S.; Bruland, O.

    2012-04-01

    The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for the evaluation and comparison of distributed hydrological model compositions, ENKI can be used for simulating any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and to provide all the administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration, uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions within a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines built as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface that allows the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, that time series exist for input variables, that states are initialised, that GIS data sets exist for static map data, that parameters have manually or automatically calibrated values, etc. By using function calls and in-memory data structures to invoke routines and facilitate information flow, ENKI provides good performance: for a typical distributed hydrological model setup over a spatial domain of 25000 grid cells, 3-4 simulated time steps per second can be expected. Future adaptation to parallel processing may further increase this speed. Recent modifications to ENKI include a full separation of API and user interface, making it possible to run ENKI from GIS programs and other software environments. ENKI currently compiles under Windows and Visual Studio only, but ambitions exist to remove the platform and compiler dependencies.
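
    ENKI's plug-in routines are C++ DLLs, but the narrow declare-your-variables interface can be sketched in a few lines of Python. Everything below (class names, variable kinds, the melt model) is invented purely to illustrate how a framework can introspect a routine's variables and own the simulation loop.

    ```python
    # A routine only declares its variables; the "framework" allocates them
    # over the region and calls the routine once per time step.
    import numpy as np

    class SnowMelt:
        # declared interface: (name, kind) pairs the framework can inspect
        inputs = [("temperature", "map"), ("precipitation", "map")]
        states = [("swe", "map")]           # snow water equivalent
        params = [("melt_rate", "scalar")]

        def step(self, temperature, precipitation, swe, melt_rate):
            melt = np.clip(temperature, 0.0, None) * melt_rate
            return {"swe": np.maximum(swe + precipitation - melt, 0.0)}

    def run(routine, forcing, n_cells, n_steps, **params):
        data = {name: np.zeros(n_cells) for name, _ in routine.states}
        for t in range(n_steps):            # the framework owns the loop
            data.update(routine.step(**forcing[t], **data, **params))
        return data

    forcing = [{"temperature": np.full(4, 1.5),
                "precipitation": np.full(4, 2.0)} for _ in range(3)]
    print(run(SnowMelt(), forcing, n_cells=4, n_steps=3, melt_rate=0.5))
    ```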

  15. From the γγ → pp̄ reaction to the production of pp̄ pairs in ultraperipheral ultrarelativistic heavy-ion collisions at the LHC

    NASA Astrophysics Data System (ADS)

    Kłusek-Gawenda, Mariola; Lebiedowicz, Piotr; Nachtmann, Otto; Szczurek, Antoni

    2017-11-01

    In this paper we consider the production of proton-antiproton pairs in two-photon interactions in electron-positron and heavy-ion collisions. We try to understand the dependence of the total cross section on the photon-photon c.m. energy, as well as the corresponding angular distributions measured by the Belle Collaboration for the γγ → pp̄ process. To describe the Belle data we include the proton exchange, the f2(1270) and f2(1950) s-channel exchanges, as well as the handbag mechanism. The helicity amplitudes for the γγ → f2 → pp̄ process are written explicitly based on a Lagrangian approach. The parameters of the vertex form factors are adjusted to the Belle data. Having described the angular distributions for the γγ → pp̄ process, we present first predictions for the ultraperipheral, ultrarelativistic heavy-ion reaction 208Pb 208Pb → 208Pb 208Pb pp̄. Both the total cross section and several differential distributions for experimental cuts corresponding to the ALICE, ATLAS, CMS, and LHCb experiments are presented. We find total cross sections of 100 μb for the ALICE cuts, 160 μb for the ATLAS cuts, 500 μb for the CMS cuts, and 104 μb taking into account the LHCb cuts. This opens the possibility of studying the γγ → pp̄ process at the LHC.

  16. Assessing and Improving Performance: A Longitudinal Evaluation of Priority Setting and Resource Allocation in a Canadian Health Region.

    PubMed

    Hall, William; Smith, Neale; Mitton, Craig; Urquhart, Bonnie; Bryan, Stirling

    2017-08-22

    In order to meet the challenges presented by increasing demand and scarcity of resources, healthcare organizations are faced with difficult decisions related to resource allocation. Tools to facilitate evaluation and improvement of these processes could enable greater transparency and more optimal distribution of resources. The Resource Allocation Performance Assessment Tool (RAPAT) was implemented in a healthcare organization in British Columbia, Canada. Recommendations for improvement were delivered, and a follow-up evaluation exercise was conducted to assess the trajectory of the organization's priority setting and resource allocation (PSRA) process two years after the original evaluation. Implementation of RAPAT in the pilot organization identified strengths and weaknesses of the organization's PSRA process at the time of the original evaluation. Strengths included the use of criteria and evidence, an ability to reallocate resources, and the involvement of frontline staff in the process. Weaknesses included training, communication, and lack of program budgeting. Although the follow-up revealed a regression from a more formal PSRA process, a legacy of explicit resource allocation was reported to be providing ongoing benefit for the organization. While past studies have taken a cross-sectional approach, this paper introduces the first longitudinal evaluation of PSRA in a healthcare organization. By including the strengths, weaknesses, and evolution of one organization's journey, the authors intend that this paper will assist other healthcare leaders in meeting the challenges of allocating scarce resources. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  17. The directivity of the sound radiation from panels and openings.

    PubMed

    Davy, John L

    2009-06-01

    This paper presents a method for calculating the directivity of the radiation of sound from a panel or opening, whose vibration is forced by the incidence of sound from the other side. The directivity of the radiation depends on the angular distribution of the incident sound energy in the room or duct in whose wall or end the panel or opening occurs. The angular distribution of the incident sound energy is predicted using a model which depends on the sound absorption coefficient of the room or duct surfaces. If the sound source is situated in the room or duct, the sound absorption coefficient model is used in conjunction with a model for the directivity of the sound source. For angles of radiation approaching 90 degrees to the normal to the panel or opening, the effect of the diffraction by the panel or opening, or by the finite baffle in which the panel or opening is mounted, is included. A simple empirical model is developed to predict the diffraction of sound into the shadow zone when the angle of radiation is greater than 90 degrees to the normal to the panel or opening. The method is compared with published experimental results.

  18. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or as source code from the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology and provides flexible ray-tracing capabilities for modeling concentrating solar power systems. With the release of the SolTrace open source project, the software has adopted an open development model.

  19. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    NASA Astrophysics Data System (ADS)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive size, locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework that extends Apache Spark for scaling scientific computations. Apache Spark improves on the map-reduce implementation in Apache Hadoop for parallel computing on a cluster by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show the usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries, and we evaluate the performance of various matrix libraries, such as Nd4j and Breeze, in distributed pipelines. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and the parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids. These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
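
    The partitioning idea translates directly into a small PySpark sketch: distribute the time indices of a NetCDF variable across workers and compute a per-slice statistic. The file name, variable name and slice count are placeholders, the file is assumed to be visible to every worker (e.g. on shared storage), and sRDD itself is a Scala structure, so plain Spark is used here.

    ```python
    # Distribute per-time-step processing of a NetCDF variable with Spark.
    from pyspark import SparkContext
    import numpy as np

    def step_mean(args):
        path, t = args
        from netCDF4 import Dataset         # imported on the worker
        with Dataset(path) as nc:
            grid = np.asarray(nc.variables["precip"][t, :, :])  # one slice
        return t, float(grid.mean())

    sc = SparkContext(appName="gridMeans")
    path = "merg_2006091100.nc"             # placeholder dataset
    n_steps = 24
    means = (sc.parallelize([(path, t) for t in range(n_steps)], numSlices=8)
               .map(step_mean)
               .collect())
    for t, m in sorted(means):
        print(f"t={t:02d} mean={m:.3f}")
    sc.stop()
    ```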

  20. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    NASA Astrophysics Data System (ADS)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
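
    A minimal Monte Carlo sketch of the underlying kinetic exchange rule with a fixed saving propensity lambda, in the spirit of the Chakraborti-Chakrabarti model; the paper's variants add distributed propensities and trap agents. Parameter values are illustrative, and each trade conserves the pair's total money.

    ```python
    # Kinetic exchange with a fixed saving propensity lam: in each trade the
    # two agents keep a fraction lam and randomly split the rest.
    import numpy as np

    rng = np.random.default_rng(1)
    N, steps, lam = 1000, 200_000, 0.5
    money = np.ones(N)                  # everyone starts with unit wealth

    for _ in range(steps):
        i, j = rng.integers(0, N, size=2)
        if i == j:
            continue
        eps = rng.random()
        pool = (1.0 - lam) * (money[i] + money[j])  # tradable part
        money[i] = lam * money[i] + eps * pool
        money[j] = lam * money[j] + (1.0 - eps) * pool

    # For lam > 0 the stationary distribution is gamma-like rather than the
    # pure exponential of the lam = 0 case.
    print("mean:", money.mean(),
          "coefficient of variation:", money.std() / money.mean())
    ```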

  1. Simulation and Analysis of Electric Field for the Disconnector Switch Incomplete Opening Position Based on 220kV GIS

    NASA Astrophysics Data System (ADS)

    Wang, Feifeng; Huang, Huimin; Su, Yi; Yan, Dandan; Lu, Yufeng; Xia, Xiaofei; Yang, Jian

    2018-05-01

    Disconnector switches that fail to reach the fully open or fully closed position account for a large proportion of GIS equipment defects. When an opening operation is incomplete, continuous arcing between the contacts reduces the insulation strength, and the intense heat can burn the contacts, which has a severe effect on the safe operation of the power grid. This paper analyzes some typical defect cases of incomplete opening operations of GIS disconnector switches. COMSOL Multiphysics is applied to verify the influence on the electric field distribution. The results show that when the moving contact protrudes 20 mm out of the shield, the electric field distribution on the moving contact surface is uneven, and the maximum electric field value can reach 9.74 kV/mm.

  2. Distribution of soft-bottom macrofauna in the deep open Baltic Sea in relation to environmental variability

    NASA Astrophysics Data System (ADS)

    Laine, Ari O.

    2003-05-01

    To analyse the large-scale distribution of soft-bottom macrofauna in the open Baltic Sea, samples for species abundance and biomass were collected in 1996-1997. Benthic community structure was used to classify and describe different assemblages and the observed distribution of communities was related to environmental factors. Distinct benthic assemblages were found that were dominated by only a few species ( Harmothoe sarsi, Saduria entomon, Monoporeia affinis, Pontoporeia femorata and Macoma balthica). These assemblages were related to different subareas and/or depth zones of the Baltic Sea. Salinity or the combined effects of salinity, dissolved oxygen and sediment organic matter content best explained the patterns in community distribution, indicating the importance of hydrography and sediment quality as structuring factors of the macrozoobenthos communities. When compared to long-term studies on Baltic macrozoobenthos it is evident that the results represent only a momentary state in the succession of the open-sea communities, which have been affected by past changes in hydrography, and will be subject to future changes in accordance with the variable environment, affected by climate.

  3. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  4. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  5. Dehydration reactions, mass transfer and rock deformation relationships during subduction of Alpine metabauxites: insights from LIBS compositional profiles between metamorphic veins

    NASA Astrophysics Data System (ADS)

    Verlaguet, Anne; Brunet, Fabrice; Goffé, Bruno; Menut, Denis; Findling, Nathaniel; Poinssot, Christophe

    2013-04-01

    In subduction zones, the significant amounts of aqueous fluid released in the course of the successive dehydration reactions occurring during prograde metamorphism are expected to strongly influence the rock rheology, as well as kinetics of metamorphic reactions and mass transfer efficiency. Mineralized veins, ubiquitous in metamorphic rocks, can be seen as preserved witnesses of fluid and mass redistribution that partly accommodate the rock deformation (lateral segregation). However, the driving forces and mechanisms of mass transfer towards fluid-filled open spaces remain somewhat unclear. The aim of this study is to investigate the vein-forming processes and the modalities of mass transfer during local fluid-rock interactions, and their links with fluid production and rock deformation, with new insights from Laser Induced Breakdown Spectroscopy (LIBS) profiles. This study focuses on karstic pockets (metre scale) of Triassic metabauxites embedded in thick carbonate units, that have been isolated from large-scale fluid flow during HP-LT Alpine metamorphism (W. Vanoise, French Alps). These rocks display several generations of metamorphic veins containing various Al-bearing minerals, which give particular insights into mass transfer processes. It is proposed that the internally-derived fluid (~13 vol% produced by successive dehydration reactions) has promoted the opening of fluid-filled open spaces (euhedral habits of vein minerals) and served as medium for diffusive mass transfer from rock to vein. Based on mineralogical and textural features, two vein types can be distinguished: (1) some veins are filled with newly formed products of either prograde (chloritoid) or retrograde (chlorite) metamorphic reactions; in this case, fluid-filled open spaces seem to offer energetically favourable nucleation/growth sites; (2) the second vein type is filled with cookeite (Li-Al-rich chlorite) or pyrophyllite, that were present in the host rock prior to the vein formation. In this closed chemical system, mass transfer from rock to vein was achieved through the fluid, in a dissolution-transport-precipitation process, possibly stress-assisted. To investigate the modalities of mass transfer towards this second vein type, LIBS profiles were performed in the rock matrix, taking Li concentration as a proxy for cookeite distribution. Cookeite is highly concentrated (40-70 vol%) in regularly spaced veins, and the LIBS profiles show that cookeite is evenly distributed in the rock matrix comprised between two veins. The absence of diffusion profiles suggests that the characteristic diffusion length for Li, Al and Si is greater than or equal to the distance separating two cookeite veins (3-6 cm). This is in agreement with characteristic diffusion lengths calculated from both grain boundary and pore fluid diffusion coefficients, for the estimated duration of the peak of metamorphism. Concerning mass transfer driving forces, phyllosilicates have very different morphologies in the rock matrix (fibers) compared to veins (euhedral crystals): fluid-mineral interfacial energy may be maximal in the small matrix pores, which can maintain higher cookeite solubility than in fluid-filled open spaces. Therefore, as soon as veins open, chemical potential gradients may develop and drive cookeite transfer from rock matrix to veins.

  6. Stewardship and management challenges within a cloud-based open data ecosystem (Invited Paper 211863)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.

  7. Implications of fractured Arctic perennial ice cover on thermodynamic and dynamic sea ice processes

    NASA Astrophysics Data System (ADS)

    Asplin, Matthew G.; Scharien, Randall; Else, Brent; Howell, Stephen; Barber, David G.; Papakyriakou, Tim; Prinsenberg, Simon

    2014-04-01

    Decline of the Arctic summer minimum sea ice extent is characterized by large expanses of open water in the Siberian, Laptev, Chukchi, and Beaufort Seas, and introduces large fetch distances in the Arctic Ocean. Long waves can propagate deep into the pack ice, thereby causing flexural swell and failure of the sea ice. This process shifts the floe size distribution toward smaller diameters, increases floe surface area, and thereby affects sea ice dynamic and thermodynamic processes. The results of Radarsat-2 imagery analysis show that a flexural fracture event which occurred in the Beaufort Sea region on 6 September 2009 affected ~40,000 km². Open water fractional area in the affected region initially decreased from 3.7% to 2.7%, but later increased to ~20% following wind-forced divergence of the ice pack. Energy available for lateral melting was assessed by estimating the change in energy entrainment from longwave and shortwave radiation in the mixed layer of the ocean following flexural fracture; 11.54 MJ m⁻² of additional energy for lateral melting of ice floes was identified in affected areas. The impact of this process in future Arctic sea ice melt seasons was assessed using estimations of earlier occurrences of fracture during the melt season, and is discussed in the context of ocean heat fluxes, atmospheric mixing of the ocean mixed layer, and declining sea ice cover. We conclude that this process is an important positive feedback to Arctic sea ice loss, and that its timing of initiation is critical to how it affects sea ice thermodynamic and dynamic processes.

  8. Lessons Learned through the Development and Publication of AstroImageJ

    NASA Astrophysics Data System (ADS)

    Collins, Karen

    2018-01-01

    As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.

  9. BEANS - a software package for distributed Big Data analysis

    NASA Astrophysics Data System (ADS)

    Hypki, Arkadiusz

    2018-07-01

    BEANS software is a web-based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of data sets. Its main purpose is to simplify the process of storing, examining, and finding new relations in huge data sets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse, and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open-source software too.

  10. BEANS - a software package for distributed Big Data analysis

    NASA Astrophysics Data System (ADS)

    Hypki, Arkadiusz

    2018-03-01

    BEANS software is a web based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open source software too.

  11. Intelligent manufacturing: the challenge for manufacturing strategy in China in the 21st century--what we will do

    NASA Astrophysics Data System (ADS)

    Yang, Shuzi; Lei, Ming; Guan, Zai-Lin; Xiong, Youlun

    1995-08-01

    This paper first introduces the project of intelligent manufacturing in China and the research state of the IIMRC (Intelligent and Integrated Manufacturing Research Centre) of HUST (Huazhong University of Science and Technology), then reviews the recent advances in object-oriented and distributed artificial intelligence and puts forth the view that these advances open up the prospect of systems that will enable the true integration of enterprises. In an attempt to identify domain requirements and match them with research achievements, the paper examines the current literature and distinguishes 14 features that are common. It argues that effective enterprise-wide support could be greatly facilitated by the existence of intelligent software entities with autonomous processing capabilities, that possess coordination and negotiation facilities and are organized in distributed hierarchical states.

  12. Open Rotor Tone Shielding Methods for System Noise Assessments Using Multiple Databases

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Thomas, Russell H.; Lopes, Leonard V.; Burley, Casey L.; Van Zante, Dale E.

    2014-01-01

    Advanced aircraft designs such as the hybrid wing body, in conjunction with open rotor engines, may allow for significant improvements in the environmental impact of aviation. System noise assessments allow for the prediction of the aircraft noise of such designs while they are still in the conceptual phase. Due to significant requirements of computational methods, these predictions still rely on experimental data to account for the interaction of the open rotor tones with the hybrid wing body airframe. Recently, multiple aircraft system noise assessments have been conducted for hybrid wing body designs with open rotor engines. These assessments utilized measured benchmark data from a Propulsion Airframe Aeroacoustic interaction effects test. The measured data demonstrated airframe shielding of open rotor tonal and broadband noise with legacy F7/A7 open rotor blades. Two methods are proposed for improving the use of these data on general open rotor designs in a system noise assessment. The first, direct difference, is a simple octave band subtraction which does not account for tone distribution within the rotor acoustic signal. The second, tone matching, is a higher-fidelity process incorporating additional physical aspects of the problem, where isolated rotor tones are matched by their directivity to determine tone-by-tone shielding. A case study is conducted with the two methods to assess how well each reproduces the measured data and identify the merits of each. Both methods perform similarly for system level results and successfully approach the experimental data for the case study. The tone matching method provides additional tools for assessing the quality of the match to the data set. Additionally, a potential path to improve the tone matching method is provided.
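
    The direct-difference method amounts to band-by-band subtraction of the measured shielding benefit, as in the toy numpy sketch below. All SPL values are invented, and the real assessment applies the deltas per octave band and per directivity angle.

    ```python
    # Apply measured octave-band shielding deltas to a new isolated rotor
    # spectrum (the "direct difference" idea; numbers are made up).
    import numpy as np

    bands_hz = np.array([250, 500, 1000, 2000, 4000])        # octave bands
    isolated_new = np.array([95.0, 97.0, 99.0, 96.0, 92.0])  # new rotor, dB

    # benchmark test: isolated vs. airframe-shielded legacy F7/A7 rotor
    isolated_ref = np.array([94.0, 96.0, 98.0, 95.0, 91.0])
    shielded_ref = np.array([90.0, 91.0, 92.0, 90.0, 87.0])

    shielding_delta = isolated_ref - shielded_ref  # dB attenuation per band
    shielded_new = isolated_new - shielding_delta  # direct-difference result

    for f, spl in zip(bands_hz, shielded_new):
        print(f"{f:>5} Hz: {spl:.1f} dB")
    ```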

  13. OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets

    NASA Astrophysics Data System (ADS)

    Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa

    2017-04-01

    The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and a Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have been quality-controlled: the shot gathers have been cross-checked and a comprehensive errata list has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided, together with guidelines for setting up a computing environment and plotting the data. An open access web service, "OpenFIRE", for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources, such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, such as an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe)-compliant discovery metadata have been produced, and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed, and the service could be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire

  14. Open Markov Processes and Reaction Networks

    NASA Astrophysics Data System (ADS)

    Swistock Pollard, Blake Stephen

    We begin by defining the concept of `open' Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain `boundary' states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow of probability vanishes between all pairs of states. External couplings which fix the probabilities of boundary states can maintain such systems in non-equilibrium steady states in which non-zero probability currents flow. We show that these non-equilibrium steady states minimize a quadratic form which we call 'dissipation.' This is closely related to Prigogine's principle of minimum entropy production. We bound the rate of change of the entropy of a driven non-equilibrium steady state relative to the underlying equilibrium state in terms of the flow of probability through the boundary of the process. We then consider open Markov processes as morphisms in a symmetric monoidal category by splitting up their boundary states into certain sets of `inputs' and `outputs.' Composition corresponds to gluing the outputs of one such open Markov process onto the inputs of another so that the probability flowing out of the first process is equal to the probability flowing into the second. Tensoring in this category corresponds to placing two such systems side by side. We construct a `black-box' functor characterizing the behavior of an open Markov process in terms of the space of possible steady state probabilities and probability currents along the boundary. The fact that this is a functor means that the behavior of a composite open Markov process can be computed by composing the behaviors of the open Markov processes from which it is composed. We prove a similar black-boxing theorem for reaction networks whose dynamics are given by the non-linear rate equation. Along the way we describe a more general category of open dynamical systems where composition corresponds to gluing together open dynamical systems.
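
    In symbols (our own schematic notation, not necessarily the dissertation's), an open Markov process pairs a master equation on the internal states with externally clamped boundary probabilities, and detailed balance characterizes the equilibria that the boundary couplings then drive into non-equilibrium steady states:

    ```latex
    % H is an infinitesimal stochastic matrix; B is the set of boundary
    % states whose probabilities are fixed by the external coupling.
    \begin{align*}
      \frac{dp_i}{dt} &= \sum_{j}\bigl(H_{ij}\,p_j - H_{ji}\,p_i\bigr),
          && i \notin B \text{ (internal states evolve freely)},\\
      p_b &= \text{externally fixed}, && b \in B,\\
      H_{ij}\,q_j &= H_{ji}\,q_i,
          && \text{detailed balance for an equilibrium } q.
    \end{align*}
    ```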

  15. Prominin-1 Localizes to the Open Rims of Outer Segment Lamellae in Xenopus laevis Rod and Cone Photoreceptors

    PubMed Central

    Han, Zhou; Anderson, David W.

    2012-01-01

    Purpose. Prominin-1 expresses in rod and cone photoreceptors. Mutations in the prominin-1 gene cause retinal degeneration in humans. In this study, the authors investigated the expression and subcellular localization of xlProminin-1 protein, the Xenopus laevis ortholog of prominin-1, in rod and cone photoreceptors of this frog. Methods. Antibodies specific for xlProminin-1 were generated. Immunoblotting was used to study the expression and posttranslational processing of xlProminin-1 protein. Immunocytochemical light and electron microscopy and transgenesis were used to study the subcellular distribution of xlProminin-1. Results. xlProminin-1 is expressed and is subject to posttranslational proteolytic processing in the retina, brain, and kidney. xlProminin-1 is differently expressed and localized in outer segments of rod and cone photoreceptors of X. laevis. Antibodies specific for the N or C termini of xlProminin-1 labeled the open rims of lamellae of cone outer segments (COS) and the open lamellae at the base of rod outer segments (ROS). By contrast, anti–peripherin-2/rds antibody, Xper5A11, labeled the closed rims of cone lamellae adjacent to the ciliary axoneme and the rims of the closed ROS disks. The extent of labeling of the basal ROS by anti–xlProminin-1 antibodies varied with the light cycle in this frog. The entire ROS was also faintly labeled by both antibodies, a result that contrasts with the current notion that prominin-1 localizes only to the basal ROS. Conclusions. These findings suggest that xlProminin-1 may serve as an anti–fusogenic factor in the regulation of disk morphogenesis and may help to maintain the open lamellar structure of basal ROS and COS disks in X. laevis photoreceptors. PMID:22076989

  16. Extending Supernova Spectral Templates for Next Generation Space Telescope Observations

    NASA Astrophysics Data System (ADS)

    Roberts-Pierel, Justin; Rodney, Steven A.; Steven Rodney

    2018-01-01

    Widely used empirical supernova (SN) spectral energy distributions (SEDs) have not historically extended meaningfully into the ultraviolet (UV) or the infrared (IR). However, both are critical for current and future SN research, including UV spectra as probes of poorly understood SN Ia physical properties, and expanding our view of the universe with high-redshift James Webb Space Telescope (JWST) IR observations. We therefore present a comprehensive set of SN SED templates that have been extended into the UV and IR, as well as an open-source software package written in Python that enables a user to generate their own extrapolated SEDs. We have taken a sampling of core-collapse (CC) and Type Ia SNe to obtain a time-dependent distribution of UV and IR colors (U-B, r'-[JHK]), and the generated color curves are then used to extrapolate the SEDs into the UV and IR. The SED extrapolation process is easily reproduced using a user's own data and parameters via our open-source Python package, SNSEDextend. This work develops the tools necessary to explore the JWST's ability to discriminate between CC and Type Ia SNe, and provides a repository of SN SEDs that will be invaluable to future JWST and WFIRST SN studies.
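
    Conceptually, the color-curve extrapolation works as in the sketch below (our own illustration; see the SNSEDextend package for the authors' actual interface): given template magnitudes in an optical band and a color curve, the corresponding IR magnitudes, and hence fluxes for the red end of the SED, follow directly. All numbers are invented.

    ```python
    # Extend an optical template into the IR via an assumed r'-J color curve.
    import numpy as np

    phases = np.array([-5.0, 0.0, 5.0, 10.0])    # days from peak (assumed)
    r_mag = np.array([18.2, 17.8, 18.1, 18.6])   # template r' magnitudes
    rJ_color = np.array([0.1, 0.3, 0.5, 0.8])    # assumed r'-J color curve

    J_mag = r_mag - rJ_color                     # r' - J = color  =>  J
    J_flux = 10.0 ** (-0.4 * J_mag)              # relative fluxes for the SED
    for p, m, f in zip(phases, J_mag, J_flux):
        print(f"phase {p:+.0f} d: J = {m:.2f} mag, flux ~ {f:.3e}")
    ```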

  17. Using WNTR to Model Water Distribution System Resilience ...

    EPA Pesticide Factsheets

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (i.e., pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
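
    A minimal usage sketch, assuming one of the standard EPANET example networks (here "Net3.inp") is on disk; the file path and the choice of solver are illustrative.

    ```python
    # Build a water network model from an EPANET INP file and simulate it.
    import wntr

    wn = wntr.network.WaterNetworkModel("Net3.inp")  # model from INP file

    # Run a hydraulic simulation with the bundled EPANET solver.
    sim = wntr.sim.EpanetSimulator(wn)
    results = sim.run_sim()

    pressure = results.node["pressure"]              # pandas DataFrame
    print(pressure.head())                           # time x node pressures
    ```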

  18. Electrified Flow in Slender V-Groove Microchannels: Generalized Stability of Steady State Configurations

    NASA Astrophysics Data System (ADS)

    Markeviciute, Vilda; White, Nicholas; Troian, Sandra

    2017-11-01

    Although spontaneous capillary flow can be an especially rapid process in slender open microchannels resembling V-grooves, enhanced flow control is possible through implementation of electric field distributions which generate opposing electrohydrodynamic pressures along the air/liquid interface to modulate the capillary pressures. Important fundamental work by Romero and Yost (1996) and Weislogel (1996) has elucidated the behavior of Newtonian films in slender V-grooves driven to flow solely by the streamwise change in capillary pressure due to the change in radius of curvature of the circular arc describing the interface of wetting or non-wetting fluids. Here we augment the Romero and Yost model with inclusion of Maxwell stresses for perfectly conducting wetting films and examine which electric field distributions allow formation of steady state film shapes for various inlet and outlet boundary conditions. We investigate the stability of these steady solutions to small perturbations in film thickness using a generalized stability analysis. These results reveal how the ratio of Maxwell to capillary stresses influences the degree of linearized transient growth or decay for thin films confined to flow within an open V-groove. Funding from the 2017 Caltech Summer Undergraduate Research Fellowship Program (Markeviciute) as well as a 2017 NASA Space Technology Research Fellowship (White) is gratefully acknowledged.

  19. Sensing Slow Mobility and Interesting Locations for Lombardy Region (italy): a Case Study Using Pointwise Geolocated Open Data

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.

    2016-06-01

    During the past years Web 2.0 technologies have caused the emergence of platforms where users can share data related to their activities, which in some cases are then publicly released with open licenses. Popular categories for this include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, the sole use of these user-generated contents makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the data available on community sharing websites allows near real-time open data streams to be collected and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to the data uploaded in the Lombardy Region (Northern Italy), corresponding to millions of pointwise data records. The collected data were processed with Free and Open Source Software (FOSS) in order to organize them into a suitable database. This allowed us to run statistical analyses on the data distribution in both time and space, enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.

  20. Unglazed transpired solar collector having a low thermal-conductance absorber

    DOEpatents

    Christensen, Craig B.; Kutscher, Charles F.; Gawlik, Keith M.

    1997-01-01

    An unglazed transpired solar collector using solar radiation to heat incoming air for distribution, comprising an unglazed absorber formed of low thermal-conductance material having a front surface for receiving the solar radiation and openings in the unglazed absorber for passage of the incoming air such that the incoming air is heated as it passes towards the front surface of the absorber and the heated air passes through the openings in the absorber for distribution.

  1. Unglazed transpired solar collector having a low thermal-conductance absorber

    DOEpatents

    Christensen, C.B.; Kutscher, C.F.; Gawlik, K.M.

    1997-12-02

    An unglazed transpired solar collector using solar radiation to heat incoming air for distribution, comprises an unglazed absorber formed of low thermal-conductance material having a front surface for receiving the solar radiation and openings in the unglazed absorber for passage of the incoming air such that the incoming air is heated as it passes towards the front surface of the absorber and the heated air passes through the openings in the absorber for distribution. 3 figs.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, Adam; Connaughton, Valerie; Briggs, Michael S.

    We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.
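
    Schematically, the inversion proceeds as follows, with the normalization A, pivot E_0, and slope k taken from fits in the literature (the often-quoted Ghirlanda slope is roughly 0.7):

      \[
        E_{\gamma} = (1 - \cos\theta_j)\,E_{\mathrm{iso}},
        \qquad
        E_{\mathrm{peak}} \simeq A\left(\frac{E_{\gamma}}{E_0}\right)^{k}
        \;\Longrightarrow\;
        \theta_j = \arccos\!\left[1 - \frac{E_0}{E_{\mathrm{iso}}}\left(\frac{E_{\mathrm{peak}}}{A}\right)^{1/k}\right],
      \]

    so a measured peak energy and isotropic-equivalent energy (given a redshift, observed or drawn from the GBM redshift distribution) yield a jet opening angle.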

  3. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  4. Experimental observations of granular debris flows

    NASA Astrophysics Data System (ADS)

    Ghilardi, P.

    2003-04-01

    Various tests are run using two different laboratory flumes with rectangular cross sections and transparent walls. The grains used in a single experiment have a nearly constant grain size; mean diameters range from 5 mm to 20 mm. In each test various measurements are taken: hydrograms, velocity distribution near the transparent walls and on the free surface, and average flow concentration. Concentration values are measured by taking samples. Velocity distributions are obtained from movies recorded by high-speed video cameras capable of 350 frames per second; flow rates and depth hydrograms are computed from the same velocity distributions. A gate is installed at the beginning of one of the flumes; this gate slides normal to the bed and opens very quickly, reproducing a dam break. Several tests are run using this device, varying channel slope, sediment concentration, and initial mixture thickness behind the gate. The velocity distribution in the flume is almost constant from left to right, except in the flow sections near the front. The observed discharges and velocities are less than those given by the classic dam-break formula and depend on sediment concentration. The other flume is fed by a mixture with constant discharge and concentration, and is mainly used for measuring velocity distributions when the flow is uniform, with both rigid and granular beds, and for studying erosion/deposition processes near debris-flow dams or other mitigation devices. The equilibrium slope of the granular bed is very close to that given by the classical equilibrium formulas for debris flow. Different deposition processes are observed depending on mixture concentration and channel geometry.
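
    For reference, the classic dam-break formula alluded to above is usually the Ritter solution for an instantaneous release in a frictionless horizontal channel, whose front celerity

      \[
        u_f = 2\sqrt{g\,h_0}
      \]

    (with h_0 the initial mixture thickness behind the gate) is the benchmark that the dense granular fronts are observed to undershoot.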

  5. Tentacle: distributed quantification of genes in metagenomes.

    PubMed

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.
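
    The dynamic master-worker pattern that Tentacle implements can be caricatured with Python's standard library. This is a conceptual sketch only, not Tentacle's actual interfaces: the alignment step is stubbed out and fragments are plain strings.

      import multiprocessing as mp
      from collections import Counter

      def align_and_count(fragment):
          # Stand-in for handing a DNA fragment to a sequence aligner and
          # returning the reference gene it maps to (hypothetical logic).
          return "geneA" if fragment.startswith("ACGT") else "geneB"

      def quantify(fragments, workers=4):
          # The master streams fragments to a pool of workers and tallies
          # per-gene counts as results arrive, mirroring the paper's
          # master-worker design at toy scale.
          with mp.Pool(workers) as pool:
              return Counter(pool.imap_unordered(align_and_count,
                                                 fragments, chunksize=64))

      if __name__ == "__main__":
          fragments = ["ACGTTT", "GGGACG", "ACGTAA"] * 1000
          print(quantify(fragments))  # Counter({'geneA': 2000, 'geneB': 1000})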

  6. Distributed neural system for emotional intelligence revealed by lesion mapping.

    PubMed

    Barbey, Aron K; Colom, Roberto; Grafman, Jordan

    2014-03-01

    Cognitive neuroscience has made considerable progress in understanding the neural architecture of human intelligence, identifying a broadly distributed network of frontal and parietal regions that support goal-directed, intelligent behavior. However, the contributions of this network to social and emotional aspects of intellectual function remain to be well characterized. Here we investigated the neural basis of emotional intelligence in 152 patients with focal brain injuries using voxel-based lesion-symptom mapping. Latent variable modeling was applied to obtain measures of emotional intelligence, general intelligence and personality from the Mayer, Salovey, Caruso Emotional Intelligence Test (MSCEIT), the Wechsler Adult Intelligence Scale and the Neuroticism-Extroversion-Openness Inventory, respectively. Regression analyses revealed that latent scores for measures of general intelligence and personality reliably predicted latent scores for emotional intelligence. Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal and parietal brain regions. The results support an integrative framework for understanding the architecture of executive, social and emotional processes and make specific recommendations for the interpretation and application of the MSCEIT to the study of emotional intelligence in health and disease.

  7. Distributed neural system for emotional intelligence revealed by lesion mapping

    PubMed Central

    Colom, Roberto; Grafman, Jordan

    2014-01-01

    Cognitive neuroscience has made considerable progress in understanding the neural architecture of human intelligence, identifying a broadly distributed network of frontal and parietal regions that support goal-directed, intelligent behavior. However, the contributions of this network to social and emotional aspects of intellectual function remain to be well characterized. Here we investigated the neural basis of emotional intelligence in 152 patients with focal brain injuries using voxel-based lesion-symptom mapping. Latent variable modeling was applied to obtain measures of emotional intelligence, general intelligence and personality from the Mayer, Salovey, Caruso Emotional Intelligence Test (MSCEIT), the Wechsler Adult Intelligence Scale and the Neuroticism-Extroversion-Openness Inventory, respectively. Regression analyses revealed that latent scores for measures of general intelligence and personality reliably predicted latent scores for emotional intelligence. Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal and parietal brain regions. The results support an integrative framework for understanding the architecture of executive, social and emotional processes and make specific recommendations for the interpretation and application of the MSCEIT to the study of emotional intelligence in health and disease. PMID:23171618

  8. Fission Fragment Mass Distributions and Total Kinetic Energy Release of 235-Uranium and 238-Uranium in Neutron-Induced Fission at Intermediate and Fast Neutron Energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, Dana Lynn

    2015-11-12

    This Ph.D. dissertation describes a measurement of the change in mass distributions and average total kinetic energy (TKE) release with increasing incident neutron energy for fission of 235U and 238U. Although fission was discovered over seventy-five years ago, open questions remain about the physics of the fission process. The energy of the incident neutron, En, changes the division of energy release in the resulting fission fragments; however, the details of energy partitioning remain ambiguous because the nucleus is a many-body quantum system. Creating a full theoretical model is difficult and experimental data to validate existing models are lacking. Additional fission measurements will lead to higher-quality models of the fission process, thereby improving applications such as the development of next-generation nuclear reactors and defense. This work also paves the way for precision experiments such as the Time Projection Chamber (TPC) for fission cross section measurements and the Spectrometer for Ion Determination in Fission (SPIDER) for precision mass yields.

  9. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  10. Steady hydromagnetic flows in open magnetic fields. II - Global flows with static zones

    NASA Technical Reports Server (NTRS)

    Tsinganos, K.; Low, B. C.

    1989-01-01

    A theoretical study of an axisymmetric steady stellar wind with a static zone is presented, with emphasis on the situation where the global magnetic field is symmetric about the stellar equator and is partially open. In this scenario, the wind escapes along open magnetic flux originating from a region at the stellar pole, while an equatorial belt of closed magnetic field remains in static equilibrium. The two-dimensional balance of the pressure gradient and the inertial, gravitational, and Lorentz forces in different parts of the flow is studied, along with the static interplay between external sources of energy (heating and/or cooling) distributed in the flow and the pressure distribution.

  11. Optimal Power Scheduling for a Medium Voltage AC/DC Hybrid Distribution Network

    DOE PAGES

    Zhu, Zhenshan; Liu, Dichen; Liao, Qingfen; ...

    2018-01-26

    With the great increase of renewable generation as well as DC loads in the distribution network, DC distribution technology is receiving more attention, since the DC distribution network can improve operating efficiency and power quality by reducing the number of energy conversion stages. This paper presents a new architecture for the medium voltage AC/DC hybrid distribution network, where the AC and DC subgrids are looped by normally closed AC soft open points (ACSOPs) and DC soft open points (DCSOPs), respectively. The proposed AC/DC hybrid distribution systems contain renewable generation (i.e., wind power and photovoltaic (PV) generation), energy storage systems (ESSs), soft open points (SOPs), and both AC and DC flexible demands. An energy management strategy for the hybrid system is presented based on the dynamic optimal power flow (DOPF) method. The main objective of the proposed power scheduling strategy is to minimize the operating cost and reduce the curtailment of renewable generation while meeting operational and technical constraints. The proposed approach is verified in five scenarios: pure AC system, hybrid AC/DC system, hybrid system with interlinking converter, hybrid system with DC flexible demand, and hybrid system with SOPs. Results show that the proposed scheduling method can successfully dispatch the controllable elements, and that the presented architecture for the AC/DC hybrid distribution system is beneficial for reducing operating cost and renewable generation curtailment.
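
    To make the scheduling objective concrete, the toy single-period dispatch below minimizes purchase cost plus a curtailment penalty under a power-balance constraint using scipy. All numbers and variable names are invented for illustration; the paper's DOPF formulation is far richer (multi-period, network constraints, storage, SOPs).

      from scipy.optimize import linprog

      demand = 8.0             # MW of combined AC and DC load (assumed)
      renewable_avail = 5.0    # MW of available wind + PV (assumed)
      cost_grid, penalty = 50.0, 200.0  # $/MWh import cost, curtailment penalty

      # Decision variables x = [grid_import, renewable_used]. Penalizing the
      # curtailment (renewable_avail - renewable_used) is, up to a constant,
      # the same as rewarding renewable_used with coefficient -penalty.
      c = [cost_grid, -penalty]
      A_eq, b_eq = [[1.0, 1.0]], [demand]        # grid + renewables = demand
      bounds = [(0, None), (0, renewable_avail)]

      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      grid, used = res.x
      print(f"import {grid:.1f} MW, renewables {used:.1f} MW, "
            f"curtailed {renewable_avail - used:.1f} MW")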

  12. Optimal Power Scheduling for a Medium Voltage AC/DC Hybrid Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Zhenshan; Liu, Dichen; Liao, Qingfen

    With the great increase of renewable generation as well as DC loads in the distribution network, DC distribution technology is receiving more attention, since the DC distribution network can improve operating efficiency and power quality by reducing the number of energy conversion stages. This paper presents a new architecture for the medium voltage AC/DC hybrid distribution network, where the AC and DC subgrids are looped by normally closed AC soft open points (ACSOPs) and DC soft open points (DCSOPs), respectively. The proposed AC/DC hybrid distribution systems contain renewable generation (i.e., wind power and photovoltaic (PV) generation), energy storage systems (ESSs), soft open points (SOPs), and both AC and DC flexible demands. An energy management strategy for the hybrid system is presented based on the dynamic optimal power flow (DOPF) method. The main objective of the proposed power scheduling strategy is to minimize the operating cost and reduce the curtailment of renewable generation while meeting operational and technical constraints. The proposed approach is verified in five scenarios: pure AC system, hybrid AC/DC system, hybrid system with interlinking converter, hybrid system with DC flexible demand, and hybrid system with SOPs. Results show that the proposed scheduling method can successfully dispatch the controllable elements, and that the presented architecture for the AC/DC hybrid distribution system is beneficial for reducing operating cost and renewable generation curtailment.

  13. Engineering Review Information System

    NASA Technical Reports Server (NTRS)

    Grems, III, Edward G. (Inventor); Henze, James E. (Inventor); Bixby, Jonathan A. (Inventor); Roberts, Mark (Inventor); Mann, Thomas (Inventor)

    2015-01-01

    A disciplinal engineering review computer information system and method by defining a database of disciplinal engineering review process entities for an enterprise engineering program, opening a computer supported engineering item based upon the defined disciplinal engineering review process entities, managing a review of the opened engineering item according to the defined disciplinal engineering review process entities, and closing the opened engineering item according to the opened engineering item review.

  14. Single Plant Root System Modeling under Soil Moisture Variation

    NASA Astrophysics Data System (ADS)

    Yabusaki, S.; Fang, Y.; Chen, X.; Scheibe, T. D.

    2016-12-01

    A prognostic Virtual Plant-Atmosphere-Soil System (vPASS) model is being developed that integrates comprehensively detailed mechanistic single plant modeling with microbial, atmospheric, and soil system processes in its immediate environment. Three broad areas of process module development are targeted: (1) incorporating models for root growth and function, rhizosphere interactions with bacteria and other organisms, and litter decomposition and soil respiration into established porous media flow and reactive transport models; (2) incorporating root/shoot transport, growth, photosynthesis, and carbon allocation process models into an integrated plant physiology model; and (3) incorporating transpiration, volatile organic compound (VOC) emission, particulate deposition, and local atmospheric processes into a coupled plant/atmosphere model. The integrated plant ecosystem simulation capability is being developed as open source process modules and associated interfaces under a modeling framework. The initial focus addresses the coupling of root growth, the vascular transport system, and soil under drought scenarios. Two types of root water uptake modeling approaches are tested: continuous root distribution and constitutive root system architecture. The continuous root distribution models are based on spatially averaged root development process parameters, which are relatively straightforward to accommodate in the continuum soil flow and reactive transport module. Conversely, the constitutive root system architecture models use root growth rates, root growth direction, and root branching to evolve explicit root geometries. The branching topologies require more complex data structures and additional input parameters. Preliminary results are presented for root model development and the vascular response to temporal and spatial variations in soil conditions.

  15. Simplified Distributed Computing

    NASA Astrophysics Data System (ADS)

    Li, G. G.

    2006-05-01

    Distributed computing ranges from high-performance parallel computing and GRID computing to environments where the idle CPU cycles and storage space of numerous networked systems are harnessed to work together through the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications based on existing technology and hardware resources. This system consists of a series of controllers. When a job request is detected by a monitor or initiated by an end user, the job manager launches the specific job handler for that job. The job handler pre-processes the job, partitions it into relatively independent tasks, and distributes the tasks into the processing queue. The task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request when all tasks are finished for each job. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage software downloads and report status. The processing queue is the key to the success of this distributed system. We use BEA's Weblogic JMS queue in our implementation; it guarantees message delivery and has message-priority and retry features so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adaptors are provided to connect existing applications to the system so that applications and algorithms running on Unix, Linux, and Windows can all work together. The system is easy and fast to develop because it is based on the industry's well-adopted technology. It is highly scalable and heterogeneous. It is an open system, and any number and type of machines can join to provide computational power. This asynchronous message-based system can achieve response times on the order of a second. For efficiency, communication between distributed tasks is often done at the start and end of the tasks, but intermediate task status can also be provided.
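
    A stripped-down version of the job/task flow described above can be written with Python's standard library standing in for the Weblogic JMS queue; all names are illustrative, and a real deployment would add the persistence, priority, and retry guarantees that JMS provides here.

      import queue
      import threading

      task_q, result_q = queue.Queue(), queue.Queue()

      def task_handler():
          # Task handlers pick tasks off the processing queue, process them,
          # and put the results back, exactly as in the architecture above.
          while True:
              task = task_q.get()
              if task is None:
                  break
              idx, data = task
              result_q.put((idx, data ** 2))  # stand-in for real processing

      def job_handler(job):
          # The job handler partitions the job into independent tasks,
          # distributes them, then assembles the results into the solution.
          for task in enumerate(job):
              task_q.put(task)
          results = [None] * len(job)
          for _ in job:
              idx, value = result_q.get()
              results[idx] = value
          return results

      workers = [threading.Thread(target=task_handler, daemon=True)
                 for _ in range(4)]
      for w in workers:
          w.start()
      print(job_handler([1, 2, 3, 4, 5]))  # -> [1, 4, 9, 16, 25]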

  16. Distributions and Changes of Carbonate Parameters Along the U.S. East Coast

    NASA Astrophysics Data System (ADS)

    Xu, Y. Y.; Cai, W. J.; Wanninkhof, R. H.; Salisbury, J., II

    2017-12-01

    On top of anthropogenic climate change, upwelling, eutrophication, river discharge, and interactions with the open ocean have affected carbonate chemistry in coastal waters. In this study, we present the large-scale variations of carbonate parameters along the U.S. east coast using in situ observations obtained during an East Coast Ocean Acidification (ECOA) cruise in summer 2015. Compared with previous large-scale cruises along the east coast, the ECOA cruise increases spatial coverage in the Gulf of Maine region and has more offshore stations for a better understanding of carbon dynamics in coastal waters and their interactions with open ocean waters. Our results show that the spatial distribution of water mass properties sets up the large-scale advection of salt and heat and the distribution of total alkalinity (TA). However, dissolved inorganic carbon (DIC) shows a distinct pattern. Coastal water pH displays high variability in the Gulf of Maine and the Mid-Atlantic Bight (MAB), but it is relatively homogeneous in the South Atlantic Bight (SAB). In contrast, the aragonite saturation state (Ω) increases from north to south, similar to the distributions of TA, SST, and SSS. A mechanistic discussion will be presented to explain the controls on Ω in eastern coastal waters. A comparison with previous cruises also suggests very different changes of pH and Ω in the MAB and SAB: preliminary analysis suggests an overall increase in surface pH and Ω in the MAB, whereas pH and Ω in SAB surface waters have decreased over the past two decades. This work serves as a platform for monitoring large-scale carbon cycling along the U.S. east coast. It is also important to identify the physical and biogeochemical processes that affect these distributions and changes over time for a better understanding of carbon cycling and ocean acidification in coastal waters.
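
    For readers unfamiliar with the notation, the aragonite saturation state is the standard ratio

      \[
        \Omega_{\mathrm{arag}} = \frac{[\mathrm{Ca}^{2+}]\,[\mathrm{CO}_3^{2-}]}{K'_{\mathrm{sp}}},
      \]

    where K'_sp is the stoichiometric solubility product of aragonite at the in situ temperature, salinity, and pressure; waters with \Omega < 1 are corrosive to aragonite shells.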

  17. Web Strategies for the Curation and Discovery of Open Educational Resources

    ERIC Educational Resources Information Center

    Rolfe, Vivien

    2016-01-01

    For those receiving funding from the UK HEFCE-funded Open Educational Resource Programme (2009-2012), the sustainability of project outputs was one of a number of essential goals. Our approach for the hosting and distribution of health and life science open educational resources (OER) was based on the utilisation of the WordPress.org blogging…

  18. Open sharing of genomic data: Who does it and why?

    PubMed

    Haeusermann, Tobias; Greshake, Bastian; Blasimme, Alessandro; Irdam, Darja; Richards, Martin; Vayena, Effy

    2017-01-01

    We explored the characteristics and motivations of people who, having obtained their genetic or genomic data from Direct-To-Consumer genetic testing (DTC-GT) companies, voluntarily decide to share them on the publicly accessible web platform openSNP. The study is the first attempt to describe open data sharing activities undertaken by individuals without institutional oversight. In the paper we provide a detailed overview of the distribution of the demographic characteristics and motivations of people engaged in genetic or genomic open data sharing. The geographical distribution of the respondents showed the USA as dominant. There was no significant gender divide, the age distribution was broad, educational background varied and respondents with and without children were equally represented. Health, even though prominent, was not the respondents' primary or only motivation to be tested. As to their motivations to openly share their data, 86.05% indicated wanting to learn about themselves as relevant, followed by contributing to the advancement of medical research (80.30%), improving the predictability of genetic testing (76.02%) and considering it fun to explore genotype and phenotype data (75.51%). Whereas most respondents were well aware of the privacy risks of their involvement in open genetic data sharing and considered the possibility of direct, personal repercussions troubling, they estimated the risk of this happening to be negligible. Our findings highlight the diversity of DTC-GT consumers who decide to openly share their data. Instead of focusing exclusively on health-related aspects of genetic testing and data sharing, our study emphasizes the importance of taking into account benefits and risks that stretch beyond the health spectrum. Our results thus lend further support to the call for a broader and multi-faceted conceptualization of genomic utility.

  19. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
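
    The hybrid pattern evaluated here pairs message passing across nodes with shared-memory threading within a node. The sketch below reproduces only that structure in Python, with mpi4py in place of MPI-from-C and Python threads in place of OpenMP, so it is a structural analogy rather than a performance recipe; run it with, e.g., `mpiexec -n 4 python hybrid.py` (assuming mpi4py is installed).

      import threading
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      # Computation phase: two threads per rank each reduce half of the
      # rank-local data (the role OpenMP threads play in the paper's codes).
      local = list(range(rank * 1000, (rank + 1) * 1000))
      partial = [0, 0]

      def work(tid):
          half = len(local) // 2
          partial[tid] = sum(local[tid * half:(tid + 1) * half])

      threads = [threading.Thread(target=work, args=(t,)) for t in range(2)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()

      # Communication phase: combine the per-rank results across all ranks.
      total = comm.allreduce(sum(partial), op=MPI.SUM)
      if rank == 0:
          print("global sum:", total)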

  20. Ceramic-like open-celled geopolymer foam as a porous substrate for water treatment catalyst

    NASA Astrophysics Data System (ADS)

    Kovářík, T.; Křenek, T.; Pola, M.; Rieger, D.; Kadlec, J.; Franče, P.

    2017-02-01

    This paper presents results from an experimental study of the microstructural and mechanical properties of geopolymer-based foam filters. The process for making the porous ceramic-like geopolymer body was experimentally established and consists of (a) geopolymer paste synthesis, (b) ceramic filler incorporation, (c) coating of open-celled polyurethane foam with the geopolymer mixture, (d) a rapid setting procedure, and (e) thermal treatment. The geopolymer paste was based on a potassium silicate solution with n(SiO2)/n(K2O) = 1.6 and a powder mixture of calcined kaolin and precipitated silica. Various types of granular ceramic filler (alumina, calcined schistous clay, and cordierite) were tested in relation to aggregate gradation design and particle size distribution. Small-amplitude oscillatory rheometry in a strain-controlled regime (0.01% strain, angular frequency 10 rad/s) was applied to determine the rheological behavior of the prepared mixtures. Thermal treatment was applied in the temperature range 1100 - 1300 °C. The developed porous ceramic-like foam served effectively as a substrate for highly active nanoparticles of selected Fe2+ spinels. This new type of nanocomposite was tested as a heterogeneous catalyst for the advanced oxidative degradation of resistant antibiotics occurring in waste waters.

  1. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data across spatio-temporal scales. Multiplatform infrastructures like PLOCAN hold a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, which offers a deeper understanding of the ocean but requires a bigger effort in the data management process. There are general data management requirements in such environments, including processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to final users according to their specific needs. Managing large amounts of data can be tedious and complex without the right tools and operational procedures; hence automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to assimilate different sources for comparison and validation. The use of web applications has boosted the necessary scientific dissemination. Within this context, PLOCAN has implemented a set of independent but compatible applications to process, store, and disseminate information gathered from different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent and quality-controlled datasets as well as fostering open access to glider data.

  2. "The Jackson Table Is a Pain in the…": A Qualitative Study of Providers' Perception Toward a Spinal Surgery Table.

    PubMed

    Asiedu, Gladys B; Lowndes, Bethany R; Huddleston, Paul M; Hallbeck, Susan

    2016-01-07

    The aim of this study was to define health care providers' perceptions toward prone patient positioning for spine surgery using the Jackson Table, which has not been hitherto explored. We analyzed open-ended questionnaire data and interviews conducted with the spine surgical team regarding the current process of spinal positioning/repositioning using the Jackson Table. Participants were asked to provide an open-ended explanation as to whether they think the current process of spinal positioning/repositioning is safe for the staff or patients. Follow-up qualitative interviews were conducted with 11 of the participants to gain an in-depth understanding of the challenges and safety issues related to prone patient positioning. Data analysis resulted in 6 main categories: general challenges with patient positioning, role-specific challenges, challenges with the Jackson Table and the "sandwich" mechanism, safety concerns for patients, safety concerns for the medical staff, and recommendations for best practices. This study is relevant to everyday practice for spinal surgical team members and advances our understanding of how surgical teams qualitatively view the current process of patient positioning for spinal surgery. Providers recommended best practices for using the Jackson Table, which can be achieved through standardized practice for transfer of patients, educational tools, and checklists for equipment before patient transfer and positioning. This research has identified several important practice opportunities for improving provider and patient safety in spine surgery.

  3. Basic to Advanced InSAR Processing: GMTSAR

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.

    2017-12-01

    Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.

  4. Generation co-ordination and energy trading systems in an open market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichler, R.

    1998-07-01

    The power industry in many parts of the world is currently undergoing dramatic changes: deregulation, privatization, competition and 3rd party access are the keywords. The major trends are summarized at the beginning of the paper to provide the basis for the evolving consequences for the power generation industry. In the restructured environment of the Open Market, power generation companies frequently are organizationally separated from transmission, distribution, and supply and now have to sell their product directly to customers. This necessitates the introduction of energy trading support functions for both bilateral trading and power exchange trading. On the other hand, there is a close relationship between energy trading and the technical process of energy production. The paper discusses design principles for software systems supporting maximum economic benefits. First practical application experience is also presented. The energy trading process requires the break-up of proprietary databases and proprietary data structures, as this process has a major need to communicate with external partners who normally use different systems. This directly leads to 3rd party products for the database, standardized data structures and standardized communication protocols. The Open Market environment calls for new and modified planning functions: in some cases measured value information necessary for updating load forecasts cannot be directly obtained. This leads to the need for an estimator of the actual load situation, a completely new function. Power scheduling has to take care of the generation company's balance, but it need not always be forced to zero; regulating services from the grid companies can be used instead. This gives the scheduling functions additional freedom for determining more economic overall solutions considering both purchases of energy and services and sales of energy.

  5. Opticks : GPU Optical Photon Simulation for Particle Physics using NVIDIA® OptiX™

    NASA Astrophysics Data System (ADS)

    Blyth, Simon C.

    2017-10-01

    Opticks is an open source project that integrates the NVIDIA OptiX GPU ray tracing engine with Geant4 toolkit based simulations. Massive parallelism brings drastic performance improvements with optical photon simulation speedup expected to exceed 1000 times Geant4 when using workstation GPUs. Optical photon simulation time becomes effectively zero compared to the rest of the simulation. Optical photons from scintillation and Cherenkov processes are allocated, generated and propagated entirely on the GPU, minimizing transfer overheads and allowing CPU memory usage to be restricted to optical photons that hit photomultiplier tubes or other photon detectors. Collecting hits into standard Geant4 hit collections then allows the rest of the simulation chain to proceed unmodified. Optical physics processes of scattering, absorption, scintillator reemission and boundary processes are implemented in CUDA OptiX programs based on the Geant4 implementations. Wavelength dependent material and surface properties as well as inverse cumulative distribution functions for reemission are interleaved into GPU textures providing fast interpolated property lookup or wavelength generation. Geometry is provided to OptiX in the form of CUDA programs that return bounding boxes for each primitive and ray geometry intersection positions. Some critical parts of the geometry such as photomultiplier tubes have been implemented analytically with the remainder being tessellated. OptiX handles the creation and application of a choice of acceleration structures such as boundary volume hierarchies and the transparent use of multiple GPUs. OptiX supports interoperation with OpenGL and CUDA Thrust that has enabled unprecedented visualisations of photon propagations to be developed using OpenGL geometry shaders to provide interactive time scrubbing and CUDA Thrust photon indexing to enable interactive history selection.

  6. VisTrails SAHM: visualization and workflow management for species habitat modeling

    USGS Publications Warehouse

    Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew A.; Talbert, Marian; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.

    2013-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.

  7. A survey of the role of thermodynamic stability in viscous flow

    NASA Technical Reports Server (NTRS)

    Horne, W. C.; Smith, C. A.; Karamcheti, K.

    1991-01-01

    The stability of near-equilibrium states has been studied as a branch of the general field of nonequilibrium thermodynamics. By treating steady viscous flow as an open thermodynamic system, nonequilibrium principles such as the condition of minimum entropy-production rate for steady, near-equilibrium processes can be used to generate flow distributions from variational analyses. Examples considered in this paper are steady heat conduction, channel flow, and unconstrained three-dimensional flow. The entropy-production-rate condition has also been used for hydrodynamic stability criteria, and calculations of the stability of a laminar wall jet support this interpretation.
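
    For context, in standard linear nonequilibrium thermodynamics the quantity being minimized is the entropy-production rate,

      \[
        \sigma = \sum_i J_i X_i \;\ge\; 0,
        \qquad
        J_i = \sum_k L_{ik} X_k,
      \]

    a bilinear sum of fluxes J_i and conjugate forces X_k related by phenomenological coefficients L_{ik}; Prigogine's theorem states that, near equilibrium, the steady state minimizes the total production \int \sigma \, dV subject to the imposed constraints, which is the variational condition used above to generate flow distributions.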

  8. Regioregular narrow-bandgap-conjugated polymers for plastic electronics

    NASA Astrophysics Data System (ADS)

    Ying, Lei; Huang, Fei; Bazan, Guillermo C.

    2017-03-01

    Progress in the molecular design and processing protocols of semiconducting polymers has opened significant opportunities for the fabrication of low-cost plastic electronic devices. Recent studies indicate that field-effect transistors and organic solar cells fabricated using narrow-bandgap regioregular polymers with translational symmetries in the direction of the backbone vector often outperform those containing analogous regiorandom polymers. This review addresses the cutting edge of regioregularity chemistry, in particular how to control the spatial distribution in the molecular structures and how this order translates to more ordered bulk morphologies. The effect of regioregularity on charge transport and photovoltaic properties is also outlined.

  9. Teaching earth science

    USGS Publications Warehouse

    Alpha, Tau Rho; Diggles, Michael F.

    1998-01-01

    This CD-ROM contains 17 teaching tools: 16 interactive HyperCard 'stacks' and a printable model. They are separated into the following categories: Geologic Processes, Earthquakes and Faulting, and Map Projections and Globes. A 'navigation' stack, Earth Science, is provided as a 'launching' place from which to access all of the other stacks. You can also open the HyperCard Stacks folder and launch any of the 16 stacks yourself. In addition, a 17th tool, Earth and Tectonic Globes, is provided as a printable document. Each of the tools can be copied onto a 1.4-MB floppy disk and distributed freely.

  10. 43 CFR 418.27 - Distribution system operation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... authorized employees or agents to open and close individual turnouts and operate the distribution system... variable field conditions, weather, etc., they must immediately notify the District so proper adjustments...

  11. Open Quantum Walks with Noncommuting Jump Operators

    NASA Astrophysics Data System (ADS)

    Caballar, Roland Cristopher; Petruccione, Francesco; Sinayskiy, Ilya

    2014-03-01

    We examine homogeneous open quantum walks along a line, wherein each forward step is due to one quantum jump operator, and each backward step due to another quantum jump operator. We assume that these two quantum jump operators do not commute with each other. We show that if the system has N internal degrees of freedom, for particular forms of these quantum jump operators, we can obtain exact probability distributions which fall into two distinct classes, namely Gaussian distributions and solitonic distributions. We also show that it is possible for a maximum of 2 solitonic distributions to be present simultaneously in the system. Finally, we consider applications of these classes of jump operators in quantum state preparation and quantum information. We acknowledge support from the National Institute for Theoretical Physics (NITheP).
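
    In the standard open-quantum-walk formalism (our notation; the paper's operators may differ), one time step of the walk on the line updates the internal-state density matrix at each site i as

      \[
        \rho_i^{(n+1)} = B\,\rho_{i-1}^{(n)}\,B^{\dagger} + C\,\rho_{i+1}^{(n)}\,C^{\dagger},
        \qquad
        B^{\dagger}B + C^{\dagger}C = \mathbb{1},
      \]

    with B generating forward steps and C backward steps; the position distribution is p_i^{(n)} = Tr ρ_i^{(n)}, and the result above is that noncommuting choices of B and C acting on N internal degrees of freedom can yield either Gaussian or solitonic forms for p_i^{(n)}.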

  12. OpenStudio: A Platform for Ex Ante Incentive Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, Amir; Brackney, Larry; Parker, Andrew

    Many utilities operate programs that provide ex ante (up front) incentives for building energy conservation measures (ECMs). A typical incentive program covers two kinds of ECMs. ECMs that deliver similar savings in different contexts are associated with pre-calculated 'deemed' savings values. ECMs that deliver different savings in different contexts are evaluated on a 'custom' per-project basis. Incentive programs often operate at less than peak efficiency because both deemed ECMs and custom projects have lengthy and effort-intensive review processes--deemed ECMs to gain confidence that they are sufficiently context insensitive, custom projects to ensure that savings are claimed appropriately. DOE's OpenStudio platform can be used to automate ex ante processes and help utilities operate programs more efficiently, consistently, and transparently, resulting in greater project throughput and energy savings. A key concept of the platform is the OpenStudio Measure, a script that queries and transforms building energy models. Measures can be simple or surgical, e.g., applying different transformations based on space-type, orientation, etc. Measures represent ECMs explicitly and are easier to review than ECMs that are represented implicitly as the difference between with-ECM and without-ECM models. Measures can be automatically applied to large numbers of prototype models--and instantiated from uncertainty distributions--facilitating the large scale analysis required to develop deemed savings values. For custom projects, Measures can also be used to calibrate existing building models, to automatically create code baseline models, and to perform quality assurance screening.

  13. Structures and energetics of hydrated deprotonated cis-pinonic acid anion clusters and their atmospheric relevance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Gao-Lei; Zhang, Jun; Valiev, Marat

    2017-01-01

    Pinonic acid, a C10-monocarboxylic acid with a hydrophilic –CO2H group and a hydrophobic hydrocarbon backbone, is a key intermediate oxidation product of α-pinene – an important monoterpene compound in biogenic emission processes that influences the atmosphere. Molecular interaction between cis-pinonic acid and water is essential for understanding its role in the formation and growth of pinene-derived secondary organic aerosols. In this work, we studied the structures, energetics, and optical properties of hydrated clusters of the cis-pinonate anion (cPA–), the deprotonated form of cis-pinonic acid, by negative ion photoelectron spectroscopy and ab initio theoretical calculations. Our results show that cPA– can adopt two different structural configurations – open and folded. In the absence of water, the open configuration has the lowest energy and provides the best agreement with the experiment. The addition of waters, which mainly interact with the negatively charged –CO2– group, gradually stabilizes the folded configuration and lowers its energy difference relative to the most stable open-configured structure. Thermochemical and equilibrium hydrate distribution analysis suggests that the mono- and di-hydrates are likely to exist in humid atmospheric environments with high populations. The detailed molecular description of cPA– hydrated clusters unraveled in this study provides a valuable reference for understanding the initial nucleation process and aerosol formation involving organics containing both hydrophilic and hydrophobic groups, as well as for analyzing the optical properties of those organic aerosols.

  14. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model-sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
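
    On the client side, consuming such a model service can look like the following OWSLib sketch; the endpoint URL, process identifier, and inputs are hypothetical placeholders, not the actual service published by the authors.

      # Requires the OWSLib package (pip install OWSLib).
      from owslib.wps import WebProcessingService, monitorExecution

      wps = WebProcessingService("https://example.org/wps")  # hypothetical endpoint
      wps.getcapabilities()
      for process in wps.processes:
          print(process.identifier, "-", process.title)

      # Execute a (hypothetical) wetland hydrology process with literal inputs.
      execution = wps.execute("WetlandHydrologyModel",
                              inputs=[("precipitation_mm", "650"),
                                      ("basin_id", "PPR-042")])
      monitorExecution(execution)  # poll until the asynchronous job completes
      print(execution.status)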

  15. Velocity and size of droplets in dense region of diesel fuel spray on transient needle opening condition

    NASA Astrophysics Data System (ADS)

    Ueki, Hironobu; Ishida, Masahiro; Sakaguchi, Daisaku

    2005-06-01

    In order to investigate the effect of transient needle opening on the early stage of spray behavior, simultaneous measurements of droplet velocity and size were conducted with a newly developed laser 2-focus velocimeter (L2F). The micro-scale probe of the L2F consisted of two foci separated by 36 µm. The tested nozzle had a single hole with a diameter of 0.2 mm. The measurements of injection pressure, needle lift, and crank angle were synchronized with the spray measurement by the L2F at a position 10 mm downstream of the nozzle exit. It has been clearly shown that the velocity and size of droplets increase with needle valve opening and that the probability density distribution of droplet size can be fitted by the Nukiyama-Tanasawa distribution under transient needle opening conditions.
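
    The Nukiyama-Tanasawa form referred to above is the empirical family

      \[
        f(D) = a\,D^{p}\,\exp\!\left(-b\,D^{q}\right),
      \]

    fitted to the measured droplet-diameter histogram, where a, b, p, and q are fit constants (classic spray work often fixes p = 2).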

  16. Characterization of the biomechanical properties of T4 pili expressed by Streptococcus pneumoniae--a comparison between helix-like and open coil-like pili.

    PubMed

    Castelain, Mickaël; Koutris, Efstratios; Andersson, Magnus; Wiklund, Krister; Björnham, Oscar; Schedin, Staffan; Axner, Ove

    2009-07-13

    Bacterial adhesion organelles, known as fimbriae or pili, are expressed by gram-positive as well as gram-negative bacteria families. These appendages play a key role in the first steps of the invasion and infection processes, and they therefore provide bacteria with pathogenic abilities. To improve the knowledge of pili-mediated bacterial adhesion to host cells and of how these pili behave under an external force, we first characterize, using force-measuring optical tweezers, the open coil-like T4 pili expressed by gram-positive Streptococcus pneumoniae with respect to their biomechanical properties. It is shown that their elongation behavior can be well described by the worm-like chain model and that they possess a large degree of flexibility. Their properties are then compared with those of helix-like pili expressed by gram-negative uropathogenic Escherichia coli (UPEC), which have a different pilus architecture. The differences suggest that these two types of pili have distinctly dissimilar mechanisms to adhere and sustain external forces. Helix-like pili expressed by UPEC bacteria adhere to host cells by single adhesins located at the distal end of the pili, while their helix-like structures act as shock absorbers to dampen the irregular shear forces induced by urine flow and to increase the cooperativity of the pili ensemble, whereas open coil-like pili expressed by S. pneumoniae adhere to cells by a multitude of adhesins distributed along the pili. It is hypothesized that these two types of pili represent different strategies of adhering to host cells in the presence of external forces. When exposed to significant forces, bacteria expressing helix-like pili remain attached by distributing the external force among a multitude of pili, whereas bacteria expressing open coil-like pili sustain large forces primarily by their multitude of binding adhesins, which presumably detach sequentially.
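
    For reference, the worm-like chain force-extension behavior invoked above is commonly approximated by the Marko-Siggia interpolation formula

      \[
        F(x) = \frac{k_B T}{L_p}
        \left[\frac{1}{4\left(1 - x/L_c\right)^{2}} - \frac{1}{4} + \frac{x}{L_c}\right],
      \]

    where L_p is the persistence length (small for these highly flexible open coil-like pili) and L_c the contour length; fitting the optical-tweezers elongation data to F(x) yields both parameters.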

  17. Natural thermodynamics

    NASA Astrophysics Data System (ADS)

    Annila, Arto

    2016-02-01

    The principle of increasing entropy is derived from the statistical physics of open systems, assuming that quanta of action, as indivisible basic building blocks, embody everything. According to this tenet, all systems evolve from one state to another either by acquiring quanta from their surroundings or by discarding quanta to the surroundings in order to attain energetic balance in least time. These natural processes result in ubiquitous scale-free patterns: skewed distributions that accumulate in a sigmoid manner and hence span log-log scales mostly as straight lines. Moreover, the equation for least-time motions reveals that evolution is by nature a non-deterministic process. Although the insight into thermodynamics obtained from the notion of quanta in motion yields nothing new, it accentuates that contemporary comprehension is impaired when modeling evolution as a computable process by imposing conservation of energy and thereby ignoring that quanta of action are the carriers of energy from the system to its surroundings.

  18. Globalisation and women in India.

    PubMed

    Krishnaraj, M

    1999-11-01

    Globalization arrived in India through an external and internal alignment of political and economic forces that led to the opening of the country to the outside world. The five processes under globalization are: 1) commercialism wherein more services become monetized and incomes are received in money rather than in kind; 2) more capitalization; 3) foreign trade becomes important for the production and distribution process; 4) greater financialization develops; and 5) international capital moves freely. These changes affect women more than men in different ways. Capitalization results in more self-employed marginal farmers becoming wage workers, making it less possible for women to manage domestic duties alongside their productive work. In general, macro-economic policies affect women through the household, market, and gender relations. In countries like India where women suffer from serious discrimination, whatever affects the household will worsen women's position. Thus, the process of liberalization, privatization, and globalization will put the clock back for women and for the poor in general.

  19. Magnetic Memory from Site Isolated Dy(III) on Silica Materials

    PubMed Central

    2017-01-01

    Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. During the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence. PMID:28386602

  20. Surface-micromachined and high-aspect ratio electrostatic actuators for aeronautic and space applications: design and lifetime considerations

    NASA Astrophysics Data System (ADS)

    Vescovo, P.; Joseph, E.; Bourbon, G.; Le Moal, P.; Minotti, P.; Hibert, C.; Pont, G.

    2003-09-01

    This paper focuses on recent advances in the field of MEMS-based actuators and distributed microelectromechanical systems (MEMS). IC-processed actuators (i.e., actuators machined using integrated circuit batch processes) are expected to open a wide range of industrial applications in the near term. The most promising investigations deal with high-aspect-ratio electric-field-driven microactuators suitable for use in numerous technical fields such as the aeronautics and space industries. Because silicon micromachining technology has the potential to integrate both mechanical components and control circuits within a single process, MEMS-based active control of microscopic and macroscopic structures appears to be one of the most promising challenges for the next decade. As a first step towards new generations of MEMS-based smart structures, recent investigations dealing with silicon mechanisms involving MEMS-based actuators are briefly discussed in this paper.

  1. Magnetic memory from site isolated Dy(III) on silica materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allouche, Florian; Lapadula, Giuseppe; Siddiqi, Georges

    Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. Throughout the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence.

  2. Magnetic memory from site isolated Dy(III) on silica materials

    DOE PAGES

    Allouche, Florian; Lapadula, Giuseppe; Siddiqi, Georges; ...

    2017-02-22

    Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. Throughout the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence.

  3. Spin-Off Successes of SETI Research at Berkeley

    NASA Astrophysics Data System (ADS)

    Douglas, K. A.; Anderson, D. P.; Bankay, R.; Chen, H.; Cobb, J.; Korpela, E. J.; Lebofsky, M.; Parsons, A.; von Korff, J.; Werthimer, D.

    2009-12-01

    Our group contributes to the Search for Extra-Terrestrial Intelligence (SETI) by developing and using world-class signal processing computers to analyze data collected on the Arecibo telescope. Although no patterned signal of extra-terrestrial origin has yet been detected, and the immediate prospects for making such a detection are highly uncertain, the SETI@home project has nonetheless proven the value of pursuing such research through its impact on the fields of distributed computing, real-time signal processing, and radio astronomy. The SETI@home project has spun off the Center for Astronomy Signal Processing and Electronics Research (CASPER) and the Berkeley Open Infrastructure for Network Computing (BOINC), both of which are responsible for catalyzing a smorgasbord of new research in scientific disciplines in countries around the world. Furthermore, the data collected and archived for the SETI@home project are proving valuable in data-mining experiments for mapping neutral galactic hydrogen and for detecting black-hole evaporation.

  4. BioMAJ: a flexible framework for databanks synchronization and processing.

    PubMed

    Filangi, Olivier; Beausse, Yoann; Assi, Anthony; Legrand, Ludovic; Larré, Jean-Marc; Martin, Véronique; Collin, Olivier; Caron, Christophe; Leroy, Hugues; Allouche, David

    2008-08-15

    Large- and medium-scale computational molecular biology projects require accurate bioinformatics software and numerous heterogeneous biological databanks, which are distributed around the world. BioMAJ provides a flexible, robust, fully automated environment for managing such massive amounts of data. The Java application enables automation of the data-update cycle and supervision of the locally mirrored data repository. We have developed workflows that handle some of the most commonly used bioinformatics databases. A set of scripts is also available for post-synchronization data treatment, consisting of indexation or format conversion (for NCBI BLAST, SRS, EMBOSS, GCG, etc.). BioMAJ can easily be extended with custom processing scripts. Source history can be kept via HTML reports describing the state of locally managed databanks. http://biomaj.genouest.org. BioMAJ is free, open-source software. It is freely available under the CeCILL version 2 license.
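
    As a concrete (and entirely hypothetical) sketch of the update cycle just described, the following Python fragment mirrors a databank only when the remote release changes and then runs post-synchronization commands; the function names and the makeblastdb command are illustrative and are not BioMAJ's actual API or configuration format.

        import subprocess
        from pathlib import Path

        # Hypothetical post-synchronization hooks, e.g. indexation for NCBI BLAST.
        POST_PROCESSES = ["makeblastdb -in {db} -dbtype nucl"]

        def mirror(db_path: Path) -> None:
            """Placeholder for the actual synchronization (FTP/rsync) step."""
            db_path.parent.mkdir(parents=True, exist_ok=True)

        def update_cycle(remote_release: str, release_file: Path, db_path: Path) -> None:
            """Mirror the databank only when the remote release is newer, then post-process."""
            local_release = release_file.read_text().strip() if release_file.exists() else ""
            if remote_release == local_release:
                return  # already up to date; nothing to do
            mirror(db_path)
            for template in POST_PROCESSES:
                subprocess.run(template.format(db=db_path).split(), check=True)
            release_file.write_text(remote_release)  # record the new release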

  5. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    PubMed

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
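
    A minimal sketch of the proportional-growth-with-entry mechanism (Simon's model) commonly conjectured to generate Zipf's law, with illustrative parameters; this is not the authors' estimation procedure, but it reproduces a rank-size slope near -1.

        import random
        import numpy as np

        def simon_model(steps: int, alpha: float = 0.05, seed: int = 1) -> np.ndarray:
            """Each arriving unit founds a new project with probability alpha,
            otherwise joins an existing project with probability proportional
            to its current size (proportional growth with entry)."""
            rng = random.Random(seed)
            sizes = [1]
            units = [0]  # one entry per unit, holding its project index
            for _ in range(steps):
                if rng.random() < alpha:
                    sizes.append(1)
                    units.append(len(sizes) - 1)
                else:
                    # uniform over units == proportional over project sizes
                    j = units[rng.randrange(len(units))]
                    sizes[j] += 1
                    units.append(j)
            return np.sort(np.array(sizes))[::-1]

        sizes = simon_model(200_000)
        ranks = np.arange(1, len(sizes) + 1)
        top = slice(0, max(10, len(sizes) // 10))  # fit the large-size end only
        slope = np.polyfit(np.log(ranks[top]), np.log(sizes[top]), 1)[0]
        print(f"rank-size slope ~ {slope:.2f} (Zipf's law corresponds to -1)")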

  6. Environmental constraints on the invasion of Triadica sebifera in the eastern United States: an experimental field assessment.

    PubMed

    Pattison, Robert R; Mack, Richard N

    2009-01-01

    Identifying the environmental constraints that affect the distribution of an invasive species is fundamental to its effective control. Triadica sebifera (Chinese tallow tree) has invaded the southeastern United States, but its potential for further range and habitat extension has been unresolved. We explored experimentally environmental factors in macro- and microhabitats that affect its persistence at five widely separated sites along the Atlantic seaboard of the United States and at two sites inland; three sites occur well beyond the tree's current range. At each site, seeds and young vegetative plants (0.5-0.65 m tall) of T. sebifera were placed in four microhabitats (closed-canopy upland, closed-canopy lowland, open-canopy upland, and open-canopy lowland). Plant growth, leaf CO(2) assimilation rates, leaf N concentrations and delta(13)C ratios, and stem water potential were measured for two growing seasons. Percent seed germination was consistently higher in open-canopy microhabitats and lowest at northern and inland sites. T. sebifera grew in all open-canopy microhabitats, even 300-500 km beyond its current distribution. Plant growth in closed-canopy habitats was lower, attributable to lower carbon gain per unit leaf area in shaded compared with open-canopy environments, especially at northern and inland sites. Neither competition, other than canopy shade, nor grazing was a key constraint on distribution at any scale. Our results demonstrate that T. sebifera is dispersal limited at landscape scales but limited locally by dispersal and overstory shade; it has yet to occupy the full extent of its new range in North America. Quantifying environmental factors both within and well beyond a species' current range can effectively highlight the limits on its distribution.

  7. Holographic Jet Shapes and their Evolution in Strongly Coupled Plasma

    NASA Astrophysics Data System (ADS)

    Brewer, Jasmine; Rajagopal, Krishna; Sadofyev, Andrey; van der Schee, Wilke

    2017-11-01

    Recently, our group analyzed how the probability distribution for the jet opening angle is modified in an ensemble of jets that has propagated through an expanding cooling droplet of plasma [K. Rajagopal, A. V. Sadofyev, W. van der Schee, Phys. Rev. Lett. 116 (2016) 211603]. Each jet in the ensemble is represented holographically by a string in the dual 4+1-dimensional gravitational theory, with the distribution of initial energies and opening angles in the ensemble given by perturbative QCD. In [K. Rajagopal, A. V. Sadofyev, W. van der Schee, Phys. Rev. Lett. 116 (2016) 211603], the full string dynamics were approximated by assuming that the string moves at the speed of light. We are now able to analyze the full string dynamics for a range of possible initial conditions, giving us access to the dynamics of holographic jets just after their creation. The nullification timescale and the features of the string when it has nullified are all results of the string evolution. This emboldens us to analyze the full jet shape modification, rather than just the opening-angle modification of each jet in the ensemble as in [K. Rajagopal, A. V. Sadofyev, W. van der Schee, Phys. Rev. Lett. 116 (2016) 211603]. We find that the jet shape scales with the opening angle at any particular energy. We construct an ensemble of dijets with energies and energy-asymmetry distributions taken from events in proton-proton collisions, an opening-angle distribution as in [K. Rajagopal, A. V. Sadofyev, W. van der Schee, Phys. Rev. Lett. 116 (2016) 211603], and a jet shape taken from proton-proton collisions and scaled according to our result. We study how these observables are modified after we send the ensemble of dijets through the strongly coupled plasma.
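
    A toy numerical illustration of the stated scaling, under the assumption that the modified jet shape follows from a reference shape by rescaling the radial variable with the ratio of opening angles; the functional form of the reference shape below is invented for the example and is not taken from the paper.

        import numpy as np

        r = np.linspace(1e-4, 1.0, 2000)

        def rho0(r, a=0.05):
            """Toy reference jet shape, normalized so its integral over r is 1."""
            shape = np.exp(-r / a)
            return shape / np.trapz(shape, r)

        def rescaled_shape(r, theta, theta0=0.2):
            """Shape at opening angle theta obtained by rescaling the reference
            shape at theta0; the prefactor preserves the normalization."""
            return (theta0 / theta) * rho0(r * theta0 / theta)

        for theta in (0.1, 0.2, 0.4):
            # each integral is ~1, up to truncation of the tail at r = 1
            print(theta, np.trapz(rescaled_shape(r, theta), r))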

  8. Reconstructing quantum entropy production to probe irreversibility and correlations

    NASA Astrophysics Data System (ADS)

    Gherardini, Stefano; Müller, Matthias M.; Trombettoni, Andrea; Ruffo, Stefano; Caruso, Filippo

    2018-07-01

    One of the major goals of quantum thermodynamics is the characterization of irreversibility and its consequences in quantum processes. Here, we discuss how entropy production provides a quantification of the irreversibility in open quantum systems through the quantum fluctuation theorem. We start by introducing a two-time quantum measurement scheme, in which the dynamical evolution between the measurements is described by a completely positive, trace-preserving (CPTP) quantum map (the forward process). By inverting the measurement scheme and applying the time-reversed version of the quantum map, we can study how this backward process differs from the forward one. When the CPTP map is unital, we show that the stochastic quantum entropy production is a function only of the probabilities of the initial measurement outcomes of the forward and backward processes. For bipartite open quantum systems we also prove that the mean value of the stochastic quantum entropy production is sub-additive with respect to the bipartition (except for product states). Hence, we find a method to detect correlations between the subsystems. Our main result is the proposal of an efficient protocol to determine and reconstruct the characteristic functions of the stochastic entropy production for each subsystem. This procedure also enables the reconstruction of other thermodynamic quantities, such as the work distribution of the composite system and the corresponding internal energy. Efficiency and possible extensions of the protocol are also discussed. Finally, we show how our findings might be experimentally tested by exploiting state-of-the-art trapped-ion platforms.
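
    A minimal numerical sketch of the unital-map special case described above, assuming a qubit depolarizing channel and illustrative measurement bases (none of which come from the paper); it checks the integral fluctuation theorem <exp(-sigma)> = 1 and the non-negativity of the mean entropy production.

        import numpy as np

        p_dep = 0.3
        rho0 = np.diag([0.7, 0.3])                    # initial state, diagonal in the first basis
        P_in = [np.diag([1.0, 0]), np.diag([0, 1.0])] # first (computational-basis) measurement
        h = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
        P_out = [np.outer(h[:, k], h[:, k]) for k in range(2)]  # second (Hadamard-basis) measurement

        def channel(rho):
            """Depolarizing channel: unital, so it maps the identity to itself."""
            return (1 - p_dep) * rho + p_dep * np.eye(2) / 2

        p_m = np.array([np.trace(P @ rho0).real for P in P_in])   # initial outcome probabilities
        T = np.array([[np.trace(P_out[k] @ channel(P_in[m])).real
                       for k in range(2)] for m in range(2)])     # T[m, k] = p(k | m)
        q_k = p_m @ T                                             # final outcome probabilities

        # Unital case: sigma depends only on the outcome probabilities of the
        # forward process and of the backward (time-reversed) process.
        sigma = np.log(p_m)[:, None] - np.log(q_k)[None, :]
        joint = p_m[:, None] * T
        print("<sigma>       =", (joint * sigma).sum())           # non-negative on average
        print("<exp(-sigma)> =", (joint * np.exp(-sigma)).sum())  # fluctuation theorem: = 1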

  9. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However, CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
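
    A minimal sketch of the kind of interactive workflow described, assuming climlab's built-in energy-balance model; constructor and method names may differ between package versions.

        import climlab

        # Zonal-mean energy balance model with default parameters.
        model = climlab.EBM()
        model.integrate_years(5)        # step the coupled processes forward in time
        print(model.Ts.mean())          # global-mean surface temperature after spin-up

        # Component processes can be inspected (and, per the abstract, swapped)
        # individually:
        for name, proc in model.subprocess.items():
            print(name, type(proc).__name__)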

  10. Open Markov Processes and Reaction Networks

    ERIC Educational Resources Information Center

    Swistock Pollard, Blake Stephen

    2017-01-01

    We begin by defining the concept of "open" Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain "boundary" states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow…

  11. Processing and Conversion of Algae to Bioethanol

    NASA Astrophysics Data System (ADS)

    Kampfe, Sara Katherine

    We begin by defining the concept of "open" Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain "boundary" states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow of probability vanishes between all pairs of states. External couplings which fix the probabilities of boundary states can maintain such systems in non-equilibrium steady states in which non-zero probability currents flow. We show that these non-equilibrium steady states minimize a quadratic form which we call "dissipation." This is closely related to Prigogine's principle of minimum entropy production. We bound the rate of change of the entropy of a driven non-equilibrium steady state relative to the underlying equilibrium state in terms of the flow of probability through the boundary of the process. We then consider open Markov processes as morphisms in a symmetric monoidal category by splitting up their boundary states into certain sets of "inputs" and "outputs." Composition corresponds to gluing the outputs of one such open Markov process onto the inputs of another so that the probability flowing out of the first process is equal to the probability flowing into the second. Tensoring in this category corresponds to placing two such systems side by side. We construct a "black-box" functor characterizing the behavior of an open Markov process in terms of the space of possible steady state probabilities and probability currents along the boundary. The fact that this is a functor means that the behavior of a composite open Markov process can be computed by composing the behaviors of the open Markov processes from which it is composed. We prove a similar black-boxing theorem for reaction networks whose dynamics are given by the non-linear rate equation. Along the way we describe a more general category of open dynamical systems where composition corresponds to gluing together open dynamical systems.
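
    A small numerical sketch of the boundary-driven steady states described above: a four-state chain whose end states are clamped to fixed probabilities, with the interior solved for the non-equilibrium steady state and the boundary probability currents read off. The rate matrix and boundary values are invented for the example.

        import numpy as np

        # Infinitesimal generator H (columns sum to zero): dp/dt = H p.
        H = np.array([[-1.0,  1.0,  0.0,  0.0],
                      [ 1.0, -2.0,  1.0,  0.0],
                      [ 0.0,  1.0, -2.0,  1.0],
                      [ 0.0,  0.0,  1.0, -1.0]])
        boundary, interior = [0, 3], [1, 2]
        p = np.zeros(4)
        p[boundary] = [0.4, 0.1]        # externally clamped boundary probabilities

        # Steady state: (H p)_i = 0 for interior states i, boundary entries fixed.
        A = H[np.ix_(interior, interior)]
        b = -H[np.ix_(interior, boundary)] @ p[boundary]
        p[interior] = np.linalg.solve(A, b)

        # Boundary currents: the inflow an external agent must supply to hold the
        # boundary probabilities fixed; they sum to zero in steady state.
        currents = -(H @ p)[boundary]
        print("steady state p    =", p)
        print("boundary currents =", currents, "(sum:", currents.sum(), ")")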

  12. B-HIT - A Tool for Harvesting and Indexing Biodiversity Data

    PubMed Central

    Barker, Katharine; Braak, Kyle; Cawsey, E. Margaret; Coddington, Jonathan; Robertson, Tim; Whitacre, Jamie

    2015-01-01

    With the rapidly growing number of data publishers, the process of harvesting and indexing information to offer advanced search and discovery becomes a critical bottleneck in globally distributed primary biodiversity data infrastructures. The Global Biodiversity Information Facility (GBIF) implemented a Harvesting and Indexing Toolkit (HIT), which largely automates data harvesting activities for hundreds of collection and observational data providers. The team of the Botanic Garden and Botanical Museum Berlin-Dahlem has extended this well-established system with a range of additional functions, including improved processing of multiple taxon identifications, the ability to represent associations between specimen and observation units, new data quality control and new reporting capabilities. The open source software B-HIT can be freely installed and used for setting up thematic networks serving the demands of particular user groups. PMID:26544980

  13. B-HIT - A Tool for Harvesting and Indexing Biodiversity Data.

    PubMed

    Kelbert, Patricia; Droege, Gabriele; Barker, Katharine; Braak, Kyle; Cawsey, E Margaret; Coddington, Jonathan; Robertson, Tim; Whitacre, Jamie; Güntsch, Anton

    2015-01-01

    With the rapidly growing number of data publishers, the process of harvesting and indexing information to offer advanced search and discovery becomes a critical bottleneck in globally distributed primary biodiversity data infrastructures. The Global Biodiversity Information Facility (GBIF) implemented a Harvesting and Indexing Toolkit (HIT), which largely automates data harvesting activities for hundreds of collection and observational data providers. The team of the Botanic Garden and Botanical Museum Berlin-Dahlem has extended this well-established system with a range of additional functions, including improved processing of multiple taxon identifications, the ability to represent associations between specimen and observation units, new data quality control and new reporting capabilities. The open source software B-HIT can be freely installed and used for setting up thematic networks serving the demands of particular user groups.

  14. Porous inorganic capsules in action: modelling transmembrane cation-transport parameter-dependence based on water as vehicle.

    PubMed

    Haupt, Erhard T K; Wontorra, Claudia; Rehder, Dieter; Müller, Achim

    2005-08-21

    Insight into basic principles of cation transport through "molecular channels", and especially details of the related fundamental H2O vehicle function, could be obtained via 7Li NMR studies of the Li+ uptake/release processes by the unique porous nanocapsule [{(MoVI)MoVI5O21(H2O)6}12{MoV2O4(SO4)}30]72-, which behaves as a semi-permeable inorganic membrane open to H2O and small cations. Channel traffic as well as internal cavity-distribution processes show a strong dependence on "environmental" effects, such as those exerted by solvent properties, the amount of water present, and competing complexing ligands, and end in a complex equilibrium situation, as in biological leak channels.

  15. Experiences using OpenMP based on Compiler Directed Software DSM on a PC Cluster

    NASA Technical Reports Server (NTRS)

    Hess, Matthias; Jost, Gabriele; Mueller, Matthias; Ruehle, Roland

    2003-01-01

    In this work we report on our experiences running OpenMP programs on a commodity cluster of PCs running a software distributed shared memory (DSM) system. We describe our test environment and report on the performance of a subset of the NAS Parallel Benchmarks that have been automatically parallelized for OpenMP. We compare the performance of the OpenMP implementations with that of their message-passing counterparts and discuss the performance differences.

  16. An open, object-based modeling approach for simulating subsurface heterogeneity

    NASA Astrophysics Data System (ADS)

    Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.

    2017-12-01

    Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
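
    A deliberately simplified, hypothetical sketch of object-based facies simulation in the hierarchical spirit described above (larger architectural elements placed first, smaller sub-units overprinting them); none of the names or parameters come from the actual HYVR code base.

        import numpy as np

        rng = np.random.default_rng(42)
        nx, ny, nz = 100, 80, 40
        facies = np.zeros((nx, ny, nz), dtype=int)   # 0 = background matrix
        X, Y, Z = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz), indexing="ij")

        def place_ellipsoid(code, a, b, c):
            """Assign a facies code inside a randomly centred ellipsoid (one object)."""
            cx, cy, cz = rng.uniform(0, nx), rng.uniform(0, ny), rng.uniform(0, nz)
            inside = ((X - cx) / a) ** 2 + ((Y - cy) / b) ** 2 + ((Z - cz) / c) ** 2 <= 1
            facies[inside] = code

        for _ in range(15):       # hierarchy level 1: large elements
            place_ellipsoid(code=1, a=25, b=15, c=4)
        for _ in range(60):       # hierarchy level 2: smaller sub-units overprint
            place_ellipsoid(code=2, a=8, b=5, c=2)

        # Map facies codes to hydraulic conductivity for use in a flow model.
        K = np.choose(facies, [1e-5, 1e-3, 1e-7])
        print("facies proportions:", np.bincount(facies.ravel()) / facies.size)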

  17. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open-source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open-source nature of the project will engender the development of additional model drivers by third-party scientists.
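
    An illustrative transcription of the driver pattern described above into Python (the actual MAD-GIS drivers are .NET classes discovered through MEF; all names below are invented for the example).

        from abc import ABC, abstractmethod

        class ForwardModelDriver(ABC):
            """Contract every external forward model must fulfil to plug into the core."""

            @abstractmethod
            def write_inputs(self, parameter_field: dict) -> None: ...

            @abstractmethod
            def run(self) -> None: ...

            @abstractmethod
            def read_outputs(self) -> dict: ...

        class ModflowDriver(ForwardModelDriver):
            def write_inputs(self, parameter_field: dict) -> None:
                print("writing MODFLOW input files for", sorted(parameter_field))

            def run(self) -> None:
                print("launching the MODFLOW executable")

            def read_outputs(self) -> dict:
                return {"head": [101.3, 100.8]}   # stubbed simulated heads

        DRIVERS = {"modflow": ModflowDriver}      # registry the core looks drivers up in

        driver = DRIVERS["modflow"]()
        driver.write_inputs({"hydraulic_conductivity": "field_realization_001"})
        driver.run()
        print(driver.read_outputs())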

  18. Exploring the Role of Distributed Learning in Distance Education at Allama Iqbal Open University: Academic Challenges at Postgraduate Level

    ERIC Educational Resources Information Center

    Bukhsh, Qadir; Chaudhary, Muhammad Ajmal

    2015-01-01

    Distributed learning is derived from the concept of distributed resources. Different institutions around the globe are connected through networks, and the learners are diverse, located in different cultures and communities. Distributed learning provides global standards of quality to all learners through synchronous and asynchronous communications…

  19. Catalytic and electrochemical behaviour of solid oxide fuel cell operated with simulated-biogas mixtures

    NASA Astrophysics Data System (ADS)

    Dang-Long, T.; Quang-Tuyen, T.; Shiratori, Y.

    2016-06-01

    Produced from the organic matter of wastes (bio-wastes) through a fermentation process, biogas is mainly composed of CH4 and CO2 and can be considered a secondary energy carrier derived from solar energy. Generating electricity from biogas through the electrochemical process in fuel cells is a state-of-the-art technology offering higher energy-conversion efficiency, without the harmful emissions of combustion in heat engines. Benefiting from its high operating temperature, which enables direct internal reforming and activates the electrochemical reactions to increase overall system efficiency, a solid oxide fuel cell (SOFC) system operated with biogas is a promising candidate for distributed power generation in rural applications, reducing the environmental burden of greenhouse gases and bio-wastes. CO2 reforming of CH4 and electrochemical oxidation of the produced syngas (H2-CO mixture) are the two main reaction processes within the porous anode material of the SOFC. Here, the catalytic and electrochemical behaviour of a Ni-ScSZ (scandia-stabilized zirconia) anode fed with CH4-CO2 mixtures as simulated biogas at 800 °C was evaluated. The results showed that CO2 had a strong influence on both reaction processes. Increasing the CO2 partial pressure decreased the anode overvoltage, although the open-circuit voltage dropped. In addition, simulation results based on a power-law model for an equimolar CH4-CO2 mixture revealed that the coking hazard could be suppressed along the fuel flow channel under both open-circuit and closed-circuit conditions.
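
    For reference, the two anode reaction processes named above take the standard textbook forms of dry (CO2) reforming of methane followed by electrochemical oxidation of the resulting syngas at the anode:

        \mathrm{CH_4 + CO_2 \longrightarrow 2\,CO + 2\,H_2} \quad \text{(dry reforming)}
        \mathrm{H_2 + O^{2-} \longrightarrow H_2O + 2e^-}, \qquad \mathrm{CO + O^{2-} \longrightarrow CO_2 + 2e^-} \quad \text{(electrochemical oxidation)}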

  20. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis. As a consequence, many different approaches for image segmentation have been proposed. The watershed transform is a well-known image segmentation tool, and a very data-intensive task. To achieve acceleration and obtain real-time processing of watershed algorithms, parallel architectures and programming models for multicore computing have been developed. This paper presents a survey of approaches for parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. In this paper, we give a comparison of various parallelizations of sequential watershed algorithms on shared-memory multicore architecture. We analyze the performance measurements of each parallel implementation and the impact of the different sources of overhead on the performance of the parallel implementations. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for shared-memory multiprocessing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.
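
    For orientation, a compact serial version of the marker-based flooding watershed, the algorithm whose parallelizations the survey compares, can be sketched as follows; this simplified version omits watershed-line pixels, and parallel implementations typically tile the image and synchronize across tile borders.

        import heapq
        import numpy as np

        def watershed(image, markers):
            """Marker-based watershed by flooding, serial reference version.
            image: 2-D grayscale array; markers: 2-D int array, >0 where seeded."""
            labels = markers.copy()
            rows, cols = image.shape
            heap = []
            # Seed the queue with every labelled pixel, ordered by grey level.
            for r in range(rows):
                for c in range(cols):
                    if labels[r, c] > 0:
                        heapq.heappush(heap, (image[r, c], r, c))
            # Flood from the lowest grey levels outward.
            while heap:
                _, r, c = heapq.heappop(heap)
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and labels[nr, nc] == 0:
                        labels[nr, nc] = labels[r, c]   # flood the neighbour from this basin
                        heapq.heappush(heap, (image[nr, nc], nr, nc))
            return labels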
