Sample records for system exploiting distributed

  1. Modeling a hierarchical structure of factors influencing exploitation policy for water distribution systems using ISM approach

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, Małgorzata; Wyczółkowski, Ryszard; Gładysiak, Violetta

    2017-12-01

    Water distribution systems are one of the basic elements of the contemporary technical infrastructure of urban and rural areas. A water distribution system is a complex engineering system composed of transmission networks and auxiliary equipment (e.g., controllers, check valves), scattered territorially over a large area. From the operational point of view, its basic features are functional variability, resulting from the need to adjust the system to temporary fluctuations in water demand, and territorial dispersion. The main research questions are: What external factors should be taken into account when developing an effective water distribution policy? Do the size and nature of the water distribution system significantly affect the exploitation policy implemented? These questions have shaped the objectives of the research and the method of its implementation.

  2. Informatic analysis for hidden pulse attack exploiting spectral characteristics of optics in plug-and-play quantum key distribution system

    NASA Astrophysics Data System (ADS)

    Ko, Heasin; Lim, Kyongchun; Oh, Junsang; Rhee, June-Koo Kevin

    2016-10-01

    Quantum channel loopholes due to imperfect implementations of practical devices expose quantum key distribution (QKD) systems to potential eavesdropping attacks. Even though QKD systems are implemented with optical devices that are highly selective on spectral characteristics, an information-theoretic analysis of a pertinent attack strategy built within a reasonable framework exploiting them has never been clarified. This paper proposes a new type of Trojan horse attack, called the hidden pulse attack, that can be applied in a plug-and-play QKD system, using general and optimal attack strategies that can extract quantum information from the phase-disturbed quantum states of the eavesdropper's hidden pulses. It exploits the spectral characteristics of a photodiode used in a plug-and-play QKD system in order to probe the modulation states of photon qubits. We analyze the security performance of the decoy-state BB84 QKD system under the optimal hidden pulse attack model, which shows enormous performance degradation in terms of both secret key rate and transmission distance.

  3. Exploiting geo-distributed clouds for an e-health monitoring system with minimum service delay and privacy preservation.

    PubMed

    Shen, Qinghua; Liang, Xiaohui; Shen, Xuemin; Lin, Xiaodong; Luo, Henry Y

    2014-03-01

    In this paper, we propose an e-health monitoring system with minimum service delay and privacy preservation by exploiting geo-distributed clouds. In the system, the resource allocation scheme enables the distributed cloud servers to cooperatively assign servers to the requesting users under a load-balance condition, so that the service delay for users is minimized. In addition, a traffic-shaping algorithm is proposed that converts user health data traffic into non-health data traffic, largely reducing the capability of traffic analysis attacks. Through numerical analysis, we show the efficiency of the proposed traffic-shaping algorithm in terms of service delay and privacy preservation. Furthermore, through simulations, we demonstrate that the proposed resource allocation scheme significantly reduces the service delay compared to two alternatives that jointly use shortest-queue and distributed control laws.
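    The server-assignment idea in this abstract can be sketched as a simple greedy rule: send each user to the nearest server that still has spare capacity. This is a hypothetical illustration under assumed inputs (planar coordinates stand in for network delay, and the function name and capacity model are invented), not the paper's actual allocation scheme.

```python
import math

def assign_users(users, servers, capacity):
    """users: list of (x, y) positions; servers: dict name -> (x, y).
    Returns dict user_index -> server name. Assumes total capacity
    covers all users (otherwise no candidate server remains)."""
    load = {name: 0 for name in servers}
    assignment = {}
    for i, (ux, uy) in enumerate(users):
        # Consider only servers with spare capacity, nearest first
        # (distance is a stand-in for service delay).
        candidates = sorted(
            (name for name in servers if load[name] < capacity),
            key=lambda name: math.dist((ux, uy), servers[name]),
        )
        choice = candidates[0]
        assignment[i] = choice
        load[choice] += 1
    return assignment
```

The capacity cap is what enforces the load-balance condition: once the nearest server fills up, later users spill over to the next-nearest one.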

  4. Programming your way out of the past: ISIS and the META Project

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Marzullo, Keith

    1989-01-01

    The ISIS distributed programming system and the META Project are described. The ISIS programming toolkit is an aid to low-level programming that makes it easy to build fault-tolerant distributed applications that exploit replication and concurrent execution. The META Project is reexamining high-level mechanisms such as the filesystem, shell language, and administration tools in distributed systems.

  5. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  6. Artificial neural networks in models of specialization, guild evolution and sympatric speciation.

    PubMed

    Holmgren, Noél M A; Norrström, Niclas; Getz, Wayne M

    2007-03-29

    Sympatric speciation can arise as a result of disruptive selection with assortative mating as a pleiotropic by-product. Studies on host choice, employing artificial neural networks as models for the host recognition system in exploiters, illustrate how disruptive selection on host choice coupled with assortative mating can arise as a consequence of selection for specialization. Our studies demonstrate that a generalist exploiter population can evolve into a guild of specialists with an 'ideal free' frequency distribution across hosts. The ideal free distribution arises from variability in host suitability and density-dependent exploiter fitness on different host species. Specialists are less subject to inter-phenotypic competition than generalists and to harmful mutations that are common in generalists exploiting multiple hosts. When host signals used as cues by exploiters coevolve with exploiter recognition systems, our studies show that evolutionary changes may be continuous and cyclic. Selection changes back and forth between specialization and generalization in the exploiters, and weak and strong mimicry in the hosts, where non-defended hosts use the host investing in defence as a model. Thus, host signals and exploiter responses are engaged in a red-queen mimicry process that is ultimately cyclic rather than directional. In one phase, evolving signals of exploitable hosts mimic those of hosts less suitable for exploitation (i.e. the model). Signals in the model hosts also evolve through selection to escape the mimic and its exploiters. Response saturation constraints in the model hosts lead to the mimic hosts finally perfecting their mimicry, after which specialization in the exploiter guild is lost. This loss of exploiter specialization provides an opportunity for the model hosts to escape their mimics, and the cycle then repeats.
We suggest that a species can readily evolve sympatrically when disruptive selection for specialization on hosts is the first step. In a sexual reproduction setting, partial reproductive isolation may first evolve by mate choice being confined to individuals on the same host. Secondly, this disruptive selection will favour assortative mate choice on genotype, thereby leading to increased reproductive isolation.

  7. The StarLite Project

    DTIC Science & Technology

    1988-09-01

    The current prototyping tool also provides a multiversion data object control mechanism. In a real-time database system, synchronization protocols...data in distributed real-time systems. The semantic information of read-only transactions is exploited for improved efficiency, and a multiversion...are discussed. Index Terms: distributed system, replication, read-only transaction, consistency, multiversion.

  8. Mixed Poisson distributions in exact solutions of stochastic autoregulation models.

    PubMed

    Iyer-Biswas, Srividya; Jayaprakash, C

    2014-11-01

    In this paper we study the interplay between stochastic gene expression and system design using simple stochastic models of autoactivation and autoinhibition. Using the Poisson representation, a technique whose particular usefulness in the context of nonlinear gene regulation models we elucidate, we find exact results for these feedback models in the steady state. Further, we exploit this representation to analyze the parameter spaces of each model, determine which dimensionless combinations of rates are the shape determinants for each distribution, and thus demarcate where in the parameter space qualitatively different behaviors arise. These behaviors include power-law-tailed distributions, bimodal distributions, and sub-Poisson distributions. We also show how these distribution shapes change when the strength of the feedback is tuned. Using our results, we reexamine how well the autoinhibition and autoactivation models serve their conventionally assumed roles as paradigms for noise suppression and noise exploitation, respectively.
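    The shape-changing behavior of mixed Poisson distributions can be illustrated with a generic example: mixing the Poisson rate over a gamma distribution produces over-dispersion (variance exceeding the mean), unlike a plain Poisson, whose variance tracks its mean. This is a minimal sketch of the general mixed-Poisson property, not the paper's autoregulation models; the sampler and rate parameters are arbitrary choices.

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    """Knuth's product-of-uniforms Poisson sampler (adequate for modest lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def mixed_poisson_sample(shape, scale, rng):
    """Mixed Poisson: the rate itself is drawn from a gamma distribution,
    so the marginal distribution is over-dispersed (negative binomial)."""
    return poisson_sample(rng.gammavariate(shape, scale), rng)
```

With shape 2 and scale 2.5 the mean rate is 5, but the marginal variance is the mean plus the variance of the rate, so the mixed samples are visibly broader than Poisson(5) samples.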

  9. Exploiting replication in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, T. A.

    1989-01-01

    Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.

  10. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

    Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users, including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang.
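    The kind of argument-driven query generation described above can be sketched as follows; the function and template are hypothetical illustrations of the idea, not SPANG's actual interface.

```python
def build_select_query(subject=None, predicate=None, obj=None, limit=10):
    """Generate a basic SELECT query from the arguments supplied;
    any unspecified triple position becomes a result variable."""
    s = subject or "?s"
    p = predicate or "?p"
    o = obj or "?o"
    # Project only the positions left as variables (or everything).
    variables = " ".join(v for v in (s, p, o) if v.startswith("?")) or "*"
    return (
        f"SELECT {variables}\n"
        f"WHERE {{ {s} {p} {o} . }}\n"
        f"LIMIT {limit}"
    )
```

For example, fixing only the predicate yields a query that retrieves subject/object pairs for that property.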

  11. Exploiting Virtual Synchrony in Distributed Systems

    DTIC Science & Technology

    1987-02-01

    for distributed systems yield the best performance relative to the level of synchronization guaranteed by the primitive. A programmer could then...synchronization facility. Semaphores: replicated binary and general semaphores. Monitors: monitor lock, condition variables and signals. Deadlock detection...We describe applications of a new software abstraction called the virtually synchronous process group. Such a group consists of a set of processes

  12. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  13. Uncertainty of exploitation estimates made from tag returns

    USGS Publications Warehouse

    Miranda, L.E.; Brock, R.E.; Dorr, B.S.

    2002-01-01

    Over 6,000 crappies Pomoxis spp. were tagged in five water bodies to estimate exploitation rates by anglers. Exploitation rates were computed as the percentage of tags returned after adjustment for three sources of uncertainty: postrelease mortality due to the tagging process, tag loss, and the reporting rate of tagged fish. Confidence intervals around exploitation rates were estimated by resampling from the probability distributions of tagging mortality, tag loss, and reporting rate. Estimates of exploitation rates ranged from 17% to 54% among the five study systems. Uncertainty around estimates of tagging mortality, tag loss, and reporting resulted in 90% confidence intervals around the median exploitation rate as narrow as 15 percentage points and as broad as 46 percentage points. The greatest source of estimation error was uncertainty about tag reporting. Because the large investments required by tagging and reward operations produce imprecise estimates of the exploitation rate, it may be worth considering other approaches to estimating it or simply circumventing the exploitation question altogether.
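    The resampling procedure described above can be sketched as a Monte Carlo adjustment: divide the observed return rate by resampled correction factors for tagging mortality, tag loss, and reporting, then read confidence limits off the percentiles. The beta distributions and parameter values below are hypothetical stand-ins, not the study's data.

```python
import random
import statistics

def exploitation_interval(tags_out, tags_returned, rng, n_draws=5000):
    """90% interval and median for the exploitation rate, propagating
    uncertainty in the three correction factors by resampling."""
    rates = []
    for _ in range(n_draws):
        tag_mortality = rng.betavariate(2, 18)   # ~10% post-release mortality
        tag_loss = rng.betavariate(2, 18)        # ~10% tag shedding
        reporting = rng.betavariate(12, 8)       # ~60% of recovered tags reported
        # Tags effectively "at large" after the three adjustments.
        effective = tags_out * (1 - tag_mortality) * (1 - tag_loss) * reporting
        rates.append(tags_returned / effective)
    rates.sort()
    return (rates[int(0.05 * n_draws)],
            statistics.median(rates),
            rates[int(0.95 * n_draws)])
```

As the abstract notes, the reporting-rate term dominates the spread: its distribution is the widest of the three, so it contributes most of the interval width.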

  14. Low-Rank Correction Methods for Algebraic Domain Decomposition Preconditioners

    DOE PAGES

    Li, Ruipeng; Saad, Yousef

    2017-08-01

    This study presents a parallel preconditioning method for distributed sparse linear systems, based on an approximate inverse of the original matrix, that adopts a general framework of distributed sparse matrices and exploits domain decomposition (DD) and low-rank corrections. The DD approach decouples the matrix and, once inverted, a low-rank approximation is applied by exploiting the Sherman--Morrison--Woodbury formula, which yields two variants of the preconditioning methods. The low-rank expansion is computed by the Lanczos procedure with reorthogonalizations. Numerical experiments indicate that, when combined with Krylov subspace accelerators, this preconditioner can be efficient and robust for solving symmetric sparse linear systems. Comparisons with pARMS, a DD-based parallel incomplete LU (ILU) preconditioning method, are presented for solving Poisson's equation and linear elasticity problems.
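    The rank-one case of the Sherman--Morrison--Woodbury identity that such corrections exploit can be checked in a few lines of pure Python (a pedagogical sketch with a diagonal D, not the paper's distributed implementation): (D + u v^T)^-1 = D^-1 - (D^-1 u v^T D^-1) / (1 + v^T D^-1 u).

```python
def sherman_morrison_inverse(d, u, v):
    """d: diagonal of D; u, v: vectors. Returns the dense inverse of
    D + u v^T as nested lists, via the rank-one Sherman-Morrison formula."""
    n = len(d)
    dinv_u = [u[i] / d[i] for i in range(n)]          # D^-1 u  (column)
    dinv_v = [v[i] / d[i] for i in range(n)]          # v^T D^-1 (row)
    denom = 1.0 + sum(v[i] * u[i] / d[i] for i in range(n))
    return [
        [(1.0 / d[i] if i == j else 0.0) - dinv_u[i] * dinv_v[j] / denom
         for j in range(n)]
        for i in range(n)
    ]
```

The attraction in a preconditioning context is that the expensive part reduces to solves with D plus a small (here scalar) correction, which is cheap to apply repeatedly.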

  15. Low-Rank Correction Methods for Algebraic Domain Decomposition Preconditioners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ruipeng; Saad, Yousef

    This study presents a parallel preconditioning method for distributed sparse linear systems, based on an approximate inverse of the original matrix, that adopts a general framework of distributed sparse matrices and exploits domain decomposition (DD) and low-rank corrections. The DD approach decouples the matrix and, once inverted, a low-rank approximation is applied by exploiting the Sherman--Morrison--Woodbury formula, which yields two variants of the preconditioning methods. The low-rank expansion is computed by the Lanczos procedure with reorthogonalizations. Numerical experiments indicate that, when combined with Krylov subspace accelerators, this preconditioner can be efficient and robust for solving symmetric sparse linear systems. Comparisons with pARMS, a DD-based parallel incomplete LU (ILU) preconditioning method, are presented for solving Poisson's equation and linear elasticity problems.

  16. Content-based image exploitation for situational awareness

    NASA Astrophysics Data System (ADS)

    Gains, David

    2008-04-01

    Image exploitation is of increasing importance to the enterprise of building situational awareness from multi-source data. It involves image acquisition, identification of objects of interest in imagery, storage, search and retrieval of imagery, and the distribution of imagery over possibly bandwidth-limited networks. This paper describes an image exploitation application that uses image content alone to detect objects of interest, and that automatically establishes and preserves spatial and temporal relationships between images, cameras and objects. The application features an intuitive user interface that exposes all images and information generated by the system to an operator, thus facilitating the formation of situational awareness.

  17. A Cooperative V2V Alert System to Mitigate Vehicular Traffic ShockWaves

    DOT National Transportation Integrated Search

    2017-03-01

    Vehicle traffic on highway systems is typically not uniformly distributed. In our work, we introduce a protocol that exploits this phenomenon by considering the formations of shock waves and opportunities in adjacent lanes. The objective of this pro...

  18. EVALUATING DISCONTINUITIES IN COMPLEX SYSTEMS: TOWARD QUANTITATIVE MEASURE OF RESILIENCE

    EPA Science Inventory

    The textural discontinuity hypothesis (TDH) is based on the observation that animal body mass distributions exhibit discontinuities that may reflect the texture of the landscape available for exploitation. This idea has been extended to other complex systems, hinting that the ide...

  19. Optimized distributed computing environment for mask data preparation

    NASA Astrophysics Data System (ADS)

    Ahn, Byoung-Sup; Bang, Ju-Mi; Ji, Min-Kyu; Kang, Sun; Jang, Sung-Hoon; Choi, Yo-Han; Ki, Won-Tai; Choi, Seong-Woon; Han, Woo-Sung

    2005-11-01

    As the critical dimension (CD) becomes smaller, various resolution enhancement techniques (RET) are widely adopted. In developing sub-100nm devices, the complexity of optical proximity correction (OPC) is severely increased, and applied OPC layers are expanded to non-critical layers. The transformation of designed pattern data by the OPC operation increases complexity, which causes runtime overheads in following steps such as mask data preparation (MDP) and collapses the existing design hierarchy. Therefore, many mask shops exploit the distributed computing method to reduce the runtime of mask data preparation rather than exploit the design hierarchy. Distributed computing uses a cluster of computers connected to a local network. However, two things limit the benefit of the distributed computing method in MDP. First, running every sequential MDP job with the maximum number of available CPUs is not efficient compared to parallel MDP job execution, due to the input data characteristics. Second, the runtime enhancement over input cost is not sufficient, since the scalability of fracturing tools is limited. In this paper, we discuss an optimum load-balancing environment that increases the uptime of a distributed computing system by assigning an appropriate number of CPUs to each input design data. We also describe distributed processing (DP) parameter optimization to obtain maximum throughput in MDP job processing.
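    The idea of assigning an appropriate number of CPUs per input design can be sketched as a largest-remainder proportional split over data sizes; this is a hypothetical illustration of the allocation idea, not the paper's optimized load-balancing environment.

```python
def allot_cpus(data_sizes, total_cpus):
    """Give every job at least one CPU, split the rest in proportion to
    input data size, and hand leftover CPUs to the largest remainders."""
    n = len(data_sizes)
    spare = total_cpus - n
    total = sum(data_sizes)
    shares = [spare * s / total for s in data_sizes]
    alloc = [1 + int(sh) for sh in shares]            # floor of each share
    leftovers = total_cpus - sum(alloc)
    # Distribute remaining CPUs to jobs with the largest fractional parts.
    order = sorted(range(n), key=lambda i: shares[i] - int(shares[i]), reverse=True)
    for i in order[:leftovers]:
        alloc[i] += 1
    return alloc
```

Sizing allocations this way keeps big fracturing jobs from monopolizing the cluster while still guaranteeing every job makes progress.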

  20. Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea

    2014-05-01

    National Space Agencies under the umbrella of the European Space Agency are performing a strong activity to handle and provide solutions to Big Data and related knowledge (metadata, software tools and services) management and exploitation. The continuously increasing amount of long-term and historic data in EO facilities in the form of online datasets and archives, the incoming satellite observation platforms that will generate an impressive amount of new data, and the new EU approach to data distribution policy make it necessary to address technologies for the long-term management of these data sets, including their consolidation, preservation, distribution, continuation and curation across multiple missions. The management of long EO data time series of continuing or historic missions - with more than 20 years of data already available today - requires technical solutions and technologies which differ considerably from the ones exploited by existing systems. Several tools, both open source and commercial, already provide technologies to handle data and metadata preparation, access and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the Data Curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.

  1. A study of the viability of exploiting memory content similarity to improve resilience to memory errors

    DOE PAGES

    Levy, Scott; Ferreira, Kurt B.; Bridges, Patrick G.; ...

    2014-12-09

    Building the next-generation of extreme-scale distributed systems will require overcoming several challenges related to system resilience. As the number of processors in these systems grows, the failure rate increases proportionally. One of the most common sources of failure in large-scale systems is memory. In this paper, we propose a novel runtime for transparently exploiting memory content similarity to improve system resilience by reducing the rate at which memory errors lead to node failure. We evaluate the viability of this approach by examining memory snapshots collected from eight high-performance computing (HPC) applications and two important HPC operating systems. Based on the characteristics of the similarity uncovered, we conclude that our proposed approach shows promise for addressing system resilience in large-scale systems.
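    A minimal sketch of detecting memory content similarity is to hash fixed-size pages and count how many duplicate an already-seen page; this illustrates the general idea only and is not the runtime proposed in the paper.

```python
import hashlib

def duplicate_page_fraction(memory: bytes, page_size: int = 4096) -> float:
    """Fraction of fixed-size pages whose content duplicates an earlier page."""
    seen = set()
    duplicates = 0
    pages = range(0, len(memory), page_size)
    for off in pages:
        # Hashing keeps the working set small: one digest per distinct page.
        digest = hashlib.sha256(memory[off:off + page_size]).digest()
        if digest in seen:
            duplicates += 1
        else:
            seen.add(digest)
    return duplicates / len(pages)
```

A high duplicate fraction suggests a corrupted page could be repaired from an identical replica elsewhere in memory, which is the resilience opportunity the abstract describes.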

  2. ISIS and META projects

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth; Cooper, Robert; Marzullo, Keith

    1990-01-01

    The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High performance multicast, large scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor, and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.

  3. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault-tolerant systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  4. Coaching the exploration and exploitation in active learning for interactive video retrieval.

    PubMed

    Wei, Xiao-Yong; Yang, Zhen-Qun

    2013-03-01

    Conventional active learning approaches for interactive video/image retrieval usually assume the query distribution is unknown, as it is difficult to estimate with only a limited number of labeled instances available. Thus, the system is easily caught in a dilemma: whether to explore the feature space in uncertain areas for a better understanding of the query distribution, or to harvest in certain areas for more relevant instances. In this paper, we propose a novel approach called coached active learning that makes the query distribution predictable through training and, therefore, avoids the risk of searching a completely unknown space. The estimated distribution, which provides a more global view of the feature space, can be used to schedule not only the timing but also the step sizes of the exploration and the exploitation in a principled way. The results of experiments on a large-scale data set from TRECVID 2005-2009 validate the efficiency and effectiveness of our approach, which demonstrates encouraging performance when facing domain shift, outperforms eight conventional active learning methods, and shows superiority to six state-of-the-art interactive video retrieval systems.

  5. Exploiting virtual synchrony in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, Thomas A.

    1987-01-01

    Applications of a virtually synchronous environment are described for distributed programming, which underlies a collection of distributed programming tools in the ISIS2 system. A virtually synchronous environment allows processes to be structured into process groups, and makes events like broadcasts to the group as an entity, group membership changes, and even migration of an activity from one place to another appear to occur instantaneously, in other words, synchronously. A major advantage to this approach is that many aspects of a distributed application can be treated independently without compromising correctness. Moreover, user code that is designed as if the system were synchronous can often be executed concurrently. It is argued that this approach to building distributed and fault tolerant software is more straightforward, more flexible, and more likely to yield correct solutions than alternative approaches.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality and load balance.
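    The locality/load-balance trade-off described above can be sketched with a toy placement rule: prefer the producing node while its cache has room, otherwise fall back to hash-based placement across the store's servers. This is a hypothetical illustration of the trade-off, not Hercules' actual policy.

```python
import hashlib

def place(key, producer, servers, load, capacity):
    """Choose a cache server for an intermediate datum named `key`.
    Locality: keep it on the producing worker while that cache has room.
    Load balance: otherwise spread by a stable hash of the key."""
    if load[producer] < capacity:
        choice = producer
    else:
        digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
        choice = servers[digest % len(servers)]
    load[choice] += 1
    return choice
```

Raising the capacity parameter biases placement toward locality; lowering it biases toward even load, which is the tuning knob the abstract's "trade off" refers to in this sketch.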

  7. Long-distance measurement-device-independent quantum key distribution with coherent-state superpositions.

    PubMed

    Yin, H-L; Cao, W-F; Fu, Y; Tang, Y-L; Liu, Y; Chen, T-Y; Chen, Z-B

    2014-09-15

    Measurement-device-independent quantum key distribution (MDI-QKD) with the decoy-state method is believed to be securely applicable against various hacking attacks on practical quantum key distribution systems. Recently, coherent-state superpositions (CSS) have emerged as an alternative to single-photon qubits for quantum information processing and metrology. In this Letter, CSS are exploited as the source in MDI-QKD. We present an analytical method that gives two tight formulas to estimate the lower bound of the yield and the upper bound of the bit error rate. We exploit standard statistical analysis and the Chernoff bound to perform the parameter estimation. The Chernoff bound can provide good bounds in long-distance MDI-QKD. Our results show that with CSS, both the secure transmission distance and the secret key rate are significantly improved compared with those of weak coherent states in the finite-data case.
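    The Chernoff bound mentioned above can be illustrated with the textbook multiplicative tail inequalities (a generic sketch, not the Letter's exact finite-data estimator): for a sum X of independent Bernoulli trials with mean mu, P(X >= (1+d)mu) <= exp(-d^2 mu / 3) and P(X <= (1-d)mu) <= exp(-d^2 mu / 2), valid for 0 < d <= 1.

```python
import math

def chernoff_range(mu, eps=1e-10):
    """Range containing the observed count except with probability <= 2*eps,
    by inverting the two multiplicative Chernoff tails (valid while d <= 1,
    i.e. for mu large relative to ln(1/eps))."""
    d_hi = math.sqrt(3.0 * math.log(1.0 / eps) / mu)   # upper-tail deviation
    d_lo = math.sqrt(2.0 * math.log(1.0 / eps) / mu)   # lower-tail deviation
    return (1.0 - d_lo) * mu, (1.0 + d_hi) * mu
```

The relative width shrinks like 1/sqrt(mu), which is why finite-data penalties fade as more pulses are accumulated.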

  8. Key issues for determining the exploitable water resources in a Mediterranean river basin.

    PubMed

    Pedro-Monzonís, María; Ferrer, Javier; Solera, Abel; Estrela, Teodoro; Paredes-Arquiola, Javier

    2015-01-15

    One of the major difficulties in water planning is to determine the water availability in a water resource system in order to distribute water sustainably. In this paper, we analyze the key issues for determining the exploitable water resources as an indicator of water availability in a Mediterranean river basin. Historically, these territories are characterized by heavily regulated water resources and the extensive use of unconventional resources (desalination and wastewater reuse); hence, emulating the hydrological cycle is not enough. This analysis considers the Jucar River Basin as a case study. We have analyzed the different possible combinations between the streamflow time series, the length of the simulation period and the reliability criteria. As expected, the results show a wide dispersion, proving the great influence of the reliability criteria used for the quantification and localization of the exploitable water resources in the system. Therefore, it is considered risky to provide a single value to represent the water availability in the Jucar water resource system. In this sense, it is necessary that policymakers and stakeholders make a decision about the methodology used to determine the exploitable water resources in a river basin. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. The architecture of a distributed medical dictionary.

    PubMed

    Fowler, J; Buffone, G; Moreau, D

    1995-01-01

    Exploiting high-speed computer networks to provide a national medical information infrastructure is a goal for medical informatics. The Distributed Medical Dictionary under development at Baylor College of Medicine is a model for an architecture that supports collaborative development of a distributed online medical terminology knowledge-base. A prototype is described that illustrates the concept. Issues that must be addressed by such a system include high availability, acceptable response time, support for local idiom, and control of vocabulary.

  10. ISIS and META projects

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth; Cooper, Robert; Marzullo, Keith

    1990-01-01

    ISIS and META are two distributed systems projects at Cornell University. The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. This approach is directly supported by the ISIS Toolkit, a programming system that has been distributed to over 300 academic and industrial sites. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are presented, along with this approach to distributed computing - a philosophy that is believed to significantly distinguish the work from that of others in the field.

  11. How to do research fairly in an unjust world.

    PubMed

    Ballantyne, Angela J

    2010-06-01

    International research, sponsored by for-profit companies, is regularly criticised as unethical on the grounds that it exploits research subjects in developing countries. Many commentators agree that exploitation occurs when the benefits of cooperative activity are unfairly distributed between the parties. To determine whether international research is exploitative we therefore need an account of fair distribution. Procedural accounts of fair bargaining have been popular solutions to this problem, but I argue that they are insufficient to protect against exploitation. I argue instead that a maximin principle of fair distribution provides a more compelling normative account of fairness in relationships characterised by extreme vulnerability and inequality of bargaining potential between the parties. A global tax on international research would provide a mechanism for implementing the maximin account of fair benefits. This model has the capacity to ensure fair benefits and thereby prevent exploitation in international research.

  12. A numerical differentiation library exploiting parallel architectures

    NASA Astrophysics Data System (ADS)

    Voglis, C.; Hadjidoukas, P. E.; Lagaris, I. E.; Papageorgiou, D. G.

    2009-08-01

    We present a software library for numerically estimating first and second order partial derivatives of a function by finite differencing. Various truncation schemes are offered, resulting in corresponding formulas that are accurate to order O(h), O(h^2), and O(h^4), h being the differencing step. The derivatives are calculated via forward, backward and central differences. Care has been taken that only feasible points are used in the case where bound constraints are imposed on the variables. The Hessian may be approximated either from function or from gradient values. There are three versions of the software: a sequential version, an OpenMP version for shared memory architectures and an MPI version for distributed systems (clusters). The parallel versions exploit the multiprocessing capability offered by computer clusters, as well as modern multi-core systems, and due to the independent character of the derivative computation, the speedup scales almost linearly with the number of available processors/cores.

    Program summary
    Program title: NDL (Numerical Differentiation Library)
    Catalogue identifier: AEDG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 73 030
    No. of bytes in distributed program, including test data, etc.: 630 876
    Distribution format: tar.gz
    Programming language: ANSI FORTRAN-77, ANSI C, MPI, OPENMP
    Computer: Distributed systems (clusters), shared memory systems
    Operating system: Linux, Solaris
    Has the code been vectorised or parallelized?: Yes
    RAM: The library uses O(N) internal storage, N being the dimension of the problem
    Classification: 4.9, 4.14, 6.5
    Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, etc. A parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems.
    Solution method: Finite differencing is used with a carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries.
    Restrictions: The library uses only double precision arithmetic.
    Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and given the level of the desired accuracy, the proper formula is automatically employed.
    Running time: Running time depends on the function's complexity. The test run took 15 ms for the serial distribution, 0.6 s for the OpenMP and 4.2 s for the MPI parallel distribution on 2 processors.
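    The library itself is FORTRAN-77/C, but the core idea (central differencing with a step chosen to balance truncation error against round-off error) can be illustrated with a short language-neutral sketch. The function name and the h ~ eps^(1/3) step heuristic below are illustrative assumptions, not the NDL API:

```python
def central_diff_gradient(f, x, h=None):
    """Estimate the gradient of f at x by central differences, O(h^2).

    If h is not given, each component uses a step near the classic
    optimum for central differences, h ~ eps^(1/3), scaled by |x[i]|,
    which roughly balances truncation and round-off errors.
    """
    eps = 2.2e-16  # double-precision machine epsilon
    n = len(x)
    grad = [0.0] * n
    for i in range(n):
        hi = h if h is not None else (eps ** (1.0 / 3.0)) * max(1.0, abs(x[i]))
        xp = list(x)
        xm = list(x)
        xp[i] += hi
        xm[i] -= hi
        grad[i] = (f(xp) - f(xm)) / (2.0 * hi)
    return grad

# Example: f(x, y) = x^2 + 3y has gradient (2x, 3); at (1, 2) -> (2, 3).
g = central_diff_gradient(lambda v: v[0] ** 2 + 3 * v[1], [1.0, 2.0])
```

As in NDL, each component's difference is independent of the others, which is why the parallel versions can scale almost linearly.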

  13. Use of communication technologies in document exchange for the management of construction projects

    NASA Astrophysics Data System (ADS)

    Mesároš, Peter; Mandičák, Tomáš

    2016-06-01

    Information and communication technologies represent a set of people, processes, and technical and software tools providing collection, transport, storage and processing of data for the distribution and presentation of information. Communication systems in particular are the main tool for information exchange. At the same time, these technologies have a broad focus and use. One of them is the exchange of documents in the management of construction projects. The paper discusses the exploitation level of communication technologies in construction project management. The main objective of this paper is to analyze the exploitation level of communication technologies. Another aim of the paper is to compare the exploitation rate of document exchange by electronic communication devices with that of face-to-face communication.

  14. CERN experience and strategy for the maintenance of cryogenic plants and distribution systems

    NASA Astrophysics Data System (ADS)

    Serio, L.; Bremer, J.; Claudet, S.; Delikaris, D.; Ferlin, G.; Pezzetti, M.; Pirotte, O.; Tavian, L.; Wagner, U.

    2015-12-01

    CERN operates and maintains the world's largest cryogenic infrastructure, ranging from ageing installations feeding detectors, test facilities and general services, to the state-of-the-art cryogenic system serving the flagship LHC machine complex. After several years of exploitation of a wide range of cryogenic installations, and in particular following the last two years' major shutdown to maintain and consolidate the LHC machine, we have analysed and reviewed the maintenance activities to implement efficient and reliable exploitation of the installations. We report the results, statistics and lessons learned from the maintenance activities performed, in particular the required consolidations and major overhauling, and the organization, management and methodologies implemented.

  15. DISPAQ: Distributed Profitable-Area Query from Big Taxi Trip Data.

    PubMed

    Putri, Fadhilah Kurnia; Song, Giltae; Kwon, Joonho; Rao, Praveen

    2017-09-25

    One of the crucial problems for taxi drivers is to efficiently locate passengers in order to increase profits. The rapid advancement and ubiquitous penetration of Internet of Things (IoT) technology into transportation industries enables us to provide taxi drivers with locations that have more potential passengers (more profitable areas) by analyzing and querying taxi trip data. In this paper, we propose a query processing system, called Distributed Profitable-Area Query (DISPAQ), which efficiently identifies profitable areas by exploiting the Apache Software Foundation's Spark framework and a MongoDB database. DISPAQ first maintains a profitable-area query index (PQ-index) by extracting area summaries and route summaries from raw taxi trip data. It then identifies candidate profitable areas by searching the PQ-index during query processing. Then, it exploits a Z-Skyline algorithm, which is an extension of skyline processing with a Z-order space filling curve, to quickly refine the candidate profitable areas. To improve the performance of distributed query processing, we also propose local Z-Skyline optimization, which reduces the number of dominance tests by distributing killer profitable areas to each cluster node. Through extensive evaluation with real datasets, we demonstrate that our DISPAQ system provides a scalable and efficient solution for processing profitable-area queries from huge amounts of big taxi trip data.
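    The Z-Skyline details are specific to the paper, but the Z-order (Morton) space filling curve it builds on is standard bit interleaving, which maps 2-D points to 1-D codes while largely preserving locality. A minimal sketch (the function name is an illustrative assumption):

```python
def z_order(x, y, bits=16):
    """Interleave the bits of two non-negative integer coordinates to
    get their Z-order (Morton) code: x bits land at even positions,
    y bits at odd positions. Points that are close on the resulting
    1-D curve tend to be close in 2-D space."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)      # bit i of x -> position 2i
        z |= ((y >> i) & 1) << (2 * i + 1)  # bit i of y -> position 2i+1
    return z

# Example: (x=2, y=3) is 0b10 and 0b11; interleaved -> 0b1110 = 14.
```

Sorting grid cells by such codes lets a distributed skyline algorithm process spatially coherent blocks of candidates together.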

  16. DISPAQ: Distributed Profitable-Area Query from Big Taxi Trip Data †

    PubMed Central

    Putri, Fadhilah Kurnia; Song, Giltae; Rao, Praveen

    2017-01-01

    One of the crucial problems for taxi drivers is to efficiently locate passengers in order to increase profits. The rapid advancement and ubiquitous penetration of Internet of Things (IoT) technology into transportation industries enables us to provide taxi drivers with locations that have more potential passengers (more profitable areas) by analyzing and querying taxi trip data. In this paper, we propose a query processing system, called Distributed Profitable-Area Query (DISPAQ), which efficiently identifies profitable areas by exploiting the Apache Software Foundation’s Spark framework and a MongoDB database. DISPAQ first maintains a profitable-area query index (PQ-index) by extracting area summaries and route summaries from raw taxi trip data. It then identifies candidate profitable areas by searching the PQ-index during query processing. Then, it exploits a Z-Skyline algorithm, which is an extension of skyline processing with a Z-order space filling curve, to quickly refine the candidate profitable areas. To improve the performance of distributed query processing, we also propose local Z-Skyline optimization, which reduces the number of dominance tests by distributing killer profitable areas to each cluster node. Through extensive evaluation with real datasets, we demonstrate that our DISPAQ system provides a scalable and efficient solution for processing profitable-area queries from huge amounts of big taxi trip data. PMID:28946679

  17. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, a thin client and a PDA client. As the GIDB Portal System has grown rapidly over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  18. Parallelization of NAS Benchmarks for Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.

  19. Towards scalable Byzantine fault-tolerant replication

    NASA Astrophysics Data System (ADS)

    Zbierski, Maciej

    2017-08-01

    Byzantine fault-tolerant (BFT) replication is a powerful technique, enabling distributed systems to remain available and correct even in the presence of arbitrary faults. Unfortunately, existing BFT replication protocols are mostly load-unscalable, i.e. they fail to respond with adequate performance increase whenever new computational resources are introduced into the system. This article proposes a universal architecture facilitating the creation of load-scalable distributed services based on BFT replication. The suggested approach exploits parallel request processing to fully utilize the available resources, and uses a load balancer module to dynamically adapt to the properties of the observed client workload. The article additionally provides a discussion on selected deployment scenarios, and explains how the proposed architecture could be used to increase the dependability of contemporary large-scale distributed systems.

  20. Randomness determines practical security of BB84 quantum key distribution.

    PubMed

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-10

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but a general security analysis model covering all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems.
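    As a toy illustration of the ideal protocol that these security analyses take as their starting point (not of the attack models themselves), BB84 basis sifting can be simulated in a few lines. All names and parameters here are illustrative; real systems add error estimation, reconciliation and privacy amplification:

```python
import random

def bb84_sift(n, seed=0):
    """Toy BB84 sifting with ideal devices and no eavesdropper:
    Alice sends n random bits encoded in random bases; Bob measures
    each in a random basis; positions where the bases match form the
    sifted key (about half of n on average)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0=rectilinear, 1=diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(1000)
# With matching bases and no noise, Alice and Bob share `key` exactly;
# its length is close to 500 on average.
```

The attacks discussed in the abstract matter precisely because real state preparation and measurement deviate from this idealization.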

  1. Randomness determines practical security of BB84 quantum key distribution

    PubMed Central

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-01-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but a general security analysis model covering all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems. PMID:26552359

  2. Randomness determines practical security of BB84 quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but a general security analysis model covering all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems.

  3. Frequency distributions from birth, death, and creation processes.

    PubMed

    Bartley, David L; Ogden, Trevor; Song, Ruiguang

    2002-01-01

    The time-dependent frequency distribution of groups of individuals versus group size was investigated within a continuum approximation, assuming a simplified individual growth, death and creation model. The analogy of the system to a physical fluid exhibiting both convection and diffusion was exploited in obtaining various solutions to the distribution equation. A general solution was approximated through the application of a Green's function. More specific exact solutions were also found to be useful. The solutions were continually checked against the continuum approximation through extensive simulation of the discrete system. Over limited ranges of group size, the frequency distributions were shown to closely exhibit a power-law dependence on group size, as found in many realizations of this type of system, ranging from colonies of mutated bacteria to the distribution of surnames in a given population. As an example, the modeled distributions were successfully fit to the distribution of surnames in several countries by adjusting the parameters specifying growth, death and creation rates.
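    A minimal simulation in the spirit of this growth-and-creation model can make the emergence of heavy-tailed group-size distributions concrete. The sketch below simplifies the paper's model (no deaths; a reproducing individual is chosen uniformly, so a group is selected in proportion to its size); all names and parameters are illustrative:

```python
import random

def simulate_groups(steps, p_new=0.1, seed=1):
    """Toy growth/creation process: at each step, with probability
    p_new a new group of size 1 is created; otherwise one existing
    individual, chosen uniformly at random, reproduces, so a group
    grows in proportion to its current size. This preferential growth
    is what produces heavy-tailed (roughly power-law) group-size
    frequencies."""
    rng = random.Random(seed)
    groups = [1]   # sizes of the existing groups
    total = 1      # total number of individuals
    for _ in range(steps):
        if rng.random() < p_new:
            groups.append(1)
        else:
            # pick an individual uniformly -> group chosen with
            # probability proportional to its size
            k = rng.randrange(total)
            for i, size in enumerate(groups):
                k -= size
                if k < 0:
                    groups[i] += 1
                    break
        total += 1
    return groups

sizes = simulate_groups(5000)
# Most groups stay small while a few grow very large; the frequency of
# groups of size s falls off roughly as a power law in s.
```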

  4. The spatial distribution of threats to plant species with extremely small populations

    NASA Astrophysics Data System (ADS)

    Wang, Chunjing; Zhang, Jing; Wan, Jizhong; Qu, Hong; Mu, Xianyun; Zhang, Zhixiang

    2017-03-01

    Many biological conservationists take actions to conserve plant species with extremely small populations (PSESP) in China; however, there have been few studies on the spatial distribution of threats to PSESP. Hence, we selected distribution data of PSESP and made a map of the spatial distribution of threats to PSESP in China. First, we used the weight assignment method to evaluate the threat risk to PSESP at both country and county scales. Second, we used a geographic information system to map the spatial distribution of threats to PSESP, and explored the threat factors based on linear regression analysis. Finally, we suggested some effective conservation options. We found that the PSESP with high values of protection, such as the plants with high scientific research values and ornamental plants, were threatened by over-exploitation and utilization, habitat fragmentation, and a small sized wild population in broad-leaved forests and bush fallows. We also identified some risk hotspots for PSESP in China. Regions with low elevation should be given priority for ex- and in-situ conservation. Moreover, climate change should be considered for conservation of PSESP. To avoid intensive over-exploitation or utilization and habitat fragmentation, in-situ conservation should be practiced in regions with high temperatures and low temperature seasonality, particularly in the high risk hotspots for PSESP that we proposed. Ex-situ conservation should be applied in these same regions, and over-exploitation and utilization of natural resources should be prevented. It is our goal to apply the concept of PSESP to the global scale in the future.

  5. SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio

    2016-08-01

    SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation specific tools. SenSyF is both a development and validation platform for data intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO Service providers during the project.

  6. Using ontologies for structuring organizational knowledge in Home Care assistance.

    PubMed

    Valls, Aida; Gibert, Karina; Sánchez, David; Batet, Montserrat

    2010-05-01

    Information Technologies and Knowledge-based Systems can significantly improve the management of complex distributed health systems, where supporting multidisciplinarity is crucial and communication and synchronization between the different professionals and tasks become essential. This work proposes the use of the ontological paradigm to describe the organizational knowledge of such complex healthcare institutions as a basis to support their management. The ontology engineering process is detailed, as well as the way the ontology is kept up to date as changes occur. The paper also analyzes how such an ontology can be exploited in a real healthcare application and the role of the ontology in the customization of the system. The particular case of senior Home Care assistance is addressed, as this is a highly distributed field as well as a strategic goal in an ageing Europe. The proposed ontology design is based on a Home Care medical model defined by a European consortium of Home Care professionals, framed in the scope of the K4Care European project (FP6). Due to the complexity of the model and the knowledge gap existing between the textual medical model and the strict formalization of an ontology, an ontology engineering methodology (On-To-Knowledge) has been followed.
    After applying the On-To-Knowledge steps, the following results were obtained: the feasibility study concluded that the ontological paradigm and the expressiveness of modern ontology languages were enough to describe the required medical knowledge; after the kick-off and refinement stages, a complete and non-ambiguous definition of the Home Care model, including its main components and interrelations, was obtained; the formalization stage expressed HC medical entities in the form of ontological classes, which are interrelated by means of hierarchies, properties and semantically rich class restrictions; the evaluation, carried out by exploiting the ontology in a knowledge-driven e-health application running on a real scenario, showed that the ontology design and its exploitation brought several benefits with regard to flexibility, adaptability and work efficiency from the end-user point of view; for the maintenance stage, two software tools are presented, aimed at addressing the incorporation and modification of healthcare units and the personalization of ontological profiles. The paper shows that the ontological paradigm and the expressiveness of modern ontology languages can be exploited not only to represent terminology in a non-ambiguous way, but also to formalize the interrelations and organizational structures involved in a real and distributed healthcare environment. This kind of ontology facilitates adaptation to changes in the healthcare organization or Care Units, supports the creation of profile-based interaction models in a transparent and seamless way, and increases the reusability and generality of the developed software components. As a conclusion of the exploitation of the developed ontology in a real medical scenario, we can say that an ontology formalizing organizational interrelations is a key component for building effective distributed knowledge-driven e-health systems. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Smartphone Application Enabling Global Graph Exploitation and Proactive Dissemination Service (DSPro) Integration (Revised Fiscal Year 2015)

    DTIC Science & Technology

    2015-09-01

    Subject terms: smartphone, HDPT, global graph, DSPro, ozone widget framework, distributed common ground system, web service. Cited report: Lee M. Lessons learned with a global graph and ozone widget framework (OWF) testbed. Aberdeen Proving Ground (MD): Army Research Laboratory (US); 2013.

  8. Exploiting Discrete Structure for Learning On-Line in Distributed Robot Systems

    DTIC Science & Technology

    2009-10-21

    accelerating rate over the next 20 years. Service robotics currently shares some important characteristics with the automobile industry in the early...Authorization Act for Fiscal Year 2001, S. 2549, Sec. 217). The same impact is expected for pilotless air and water vehicles, where drone aircraft for

  9. Integrated Sensing and Processing (ISP) Phase II: Demonstration and Evaluation for Distributed Sensor Networks and Missile Seeker Systems

    DTIC Science & Technology

    2007-02-28

    Citation fragment: Shah, D. Waagen, H. Schmitt, S. Bellofiore, A. Spanias, and D. Cochran, 32nd International Conference on Acoustics, Speech, and Signal Processing... Glossary fragment: Information Exploitation Office; kNN, k-Nearest Neighbor; LEAN, Laplacian Eigenmap Adaptive Neighbor; LIP, Linear Integer Programming; ISP, Integrated Sensing and Processing.

  10. Efficient Software Systems for Cardio Surgical Departments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Diomidous, M. J.

    2009-08-01

    Herein, the design, implementation and deployment of an object oriented software system, suitable for the monitoring of cardio surgical departments, is investigated. Distributed design architectures are applied and the implemented software system can be deployed on distributed infrastructures. The software is flexible and adaptable to any cardio surgical environment regardless of the department resources used. The system exploits the relations and the interdependency of the successive bed positions that the patients occupy at the different health care units during their stay in a cardio surgical department, to determine bed availability and to perform patient scheduling and instant rescheduling whenever necessary. It also aims at successful monitoring of the workings of cardio surgical departments in an efficient manner.

  11. Predictive Anomaly Management for Resilient Virtualized Computing Infrastructures

    DTIC Science & Technology

    2015-05-27

    PREC: Practical Root Exploit Containment for Android Devices, ACM Conference on Data and Application Security and Privacy (CODASPY), 03-MAR-14. Hiep Nguyen, Yongmin Tan, Xiaohui Gu. Propagation-aware Anomaly Localization for Cloud Hosted Distributed Applications, ACM Workshop on Managing Large-Scale Systems via the Analysis of System Logs and the Application of Machine Learning Techniques (SLAML) in conjunction with SOSP, 05-OCT-11.
  12. INDIGO-DataCloud solutions for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Fiore, Sandro; Monna, Stephen; Chen, Yin

    2017-04-01

    INDIGO-DataCloud (https://www.indigo-datacloud.eu/) is a European Commission funded project aiming to develop a data and computing platform targeting scientific communities, deployable on multiple hardware and provisioned over hybrid (private or public) e-infrastructures. The development of INDIGO solutions covers the different layers of cloud computing (IaaS, PaaS, SaaS), and provides tools to exploit resources like HPC or GPGPUs. INDIGO is oriented to support European scientific research communities, which are well represented in the project. Twelve different Case Studies from different fields have been analyzed in detail: Biological & Medical sciences, Social sciences & Humanities, Environmental and Earth sciences, and Physics & Astrophysics. INDIGO-DataCloud provides solutions to emerging challenges in Earth Science such as:
    -Enabling an easy deployment of community services at different cloud sites. Many Earth Science research infrastructures involve distributed observation stations across countries, and also have distributed data centers to support the corresponding data acquisition and curation. There is a need to easily deploy new data center services while the research infrastructure continues to expand. As an example, LifeWatch (ESFRI, Ecosystems and Biodiversity) uses INDIGO solutions to manage the deployment of services to perform complex hydrodynamics and water quality modelling over a Cloud Computing environment, predicting algae blooms, using the Docker technology: TOSCA requirement description, Docker repository, Orchestrator for deployment, AAI (AuthN, AuthZ) and OneData (Distributed Storage System).
    -Supporting Big Data Analysis. Nowadays, many Earth Science research communities produce large amounts of data and are challenged by the difficulties of processing and analysing it.
    A climate model intercomparison data analysis case study for the European Network for Earth System Modelling (ENES) community has been set up, based on the Ophidia big data analysis framework and the Kepler workflow management system. Such services normally involve a large and distributed set of data and computing resources. In this regard, this case study exploits the INDIGO PaaS for a flexible and dynamic allocation of the resources at the infrastructural level.
    -Providing Distributed Data Storage Solutions. In order to allow scientific communities to perform heavy computation on huge datasets, INDIGO provides global data access solutions allowing researchers to access data in a distributed environment, regardless of its location, and also to publish and share their research results with public or closed communities. INDIGO solutions that support access to distributed data storage (OneData) are being tested on EMSO infrastructure (Ocean Sciences and Geohazards) data. Another aspect of interest to the EMSO community is efficient data processing exploiting INDIGO services like the PaaS Orchestrator. Further, for HPC exploitation, a new solution named Udocker has been implemented, enabling users to execute Docker containers on supercomputers without requiring administration privileges. This presentation will overview the INDIGO solutions that are interesting and useful for Earth science communities and will show how they can be applied to other Case Studies.

  13. Theseus: tethered distributed robotics (TDR)

    NASA Astrophysics Data System (ADS)

    Digney, Bruce L.; Penzes, Steven G.

    2003-09-01

    Defence Research and Development Canada's (DRDC) Autonomous Intelligent Systems program conducts research to increase the independence and effectiveness of military vehicles and systems. DRDC-Suffield's Autonomous Land Systems (ALS) is creating new concept vehicles and autonomous control systems for use in outdoor areas, urban streets, urban interiors and urban subspaces. This paper will first give an overview of the ALS program and then give a specific description of the work being done for mobility in urban subspaces. Discussed will be the Theseus: Tethered Distributed Robotics (TDR) system, which will not only manage an unavoidable tether but exploit it for mobility and navigation. Also discussed will be the prototype robot called the Hedgehog, which uses conformal 3D mobility in ducts, sewer pipes, collapsed rubble voids and chimneys.

  14. Distributed Compressive CSIT Estimation and Feedback for FDD Multi-User Massive MIMO Systems

    NASA Astrophysics Data System (ADS)

    Rao, Xiongbin; Lau, Vincent K. N.

    2014-06-01

    To fully utilize the spatial multiplexing gains or array gains of massive MIMO, the channel state information at the transmitter (CSIT) must be obtained. However, conventional CSIT estimation approaches are not suitable for FDD massive MIMO systems because of the overwhelming training and feedback overhead. In this paper, we consider multi-user massive MIMO systems and deploy the compressive sensing (CS) technique to reduce the training as well as the feedback overhead in the CSIT estimation. Multi-user massive MIMO systems exhibit a hidden joint sparsity structure in the user channel matrices due to the shared local scatterers in the physical propagation environment. As such, instead of naively applying conventional CS to the CSIT estimation, we propose a distributed compressive CSIT estimation scheme in which the compressed measurements are observed at the users locally, while the CSIT recovery is performed at the base station jointly. A joint orthogonal matching pursuit recovery algorithm is proposed to perform the CSIT recovery, with the capability of exploiting the hidden joint sparsity in the user channel matrices. We analyze the obtained CSIT quality in terms of the normalized mean absolute error, and through closed-form expressions we obtain simple insights into how the joint channel sparsity can be exploited to improve the CSIT recovery performance.
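The joint-recovery idea can be illustrated with a small simultaneous orthogonal matching pursuit (SOMP) sketch. This is a generic textbook variant, not the authors' algorithm: all user channels are assumed to share one support set, the atom whose correlation energy summed over users is largest is picked at each step, and a joint least-squares fit updates the residuals. The dimensions, measurement matrix, and support below are illustrative.

```python
import numpy as np

def joint_omp(Y, A, sparsity):
    """Recover jointly sparse columns X (one per user) from Y = A @ X.

    All users share a common support, so each step picks the dictionary
    column whose correlation energy summed across users is largest.
    """
    m, n = A.shape
    residual = Y.copy()
    support = []
    coef = None
    for _ in range(sparsity):
        corr = A.T @ residual                  # per-user correlations, (n, K)
        energy = np.sum(corr ** 2, axis=1)     # joint selection statistic
        energy[support] = -np.inf              # never re-pick an atom
        support.append(int(np.argmax(energy)))
        sub = A[:, support]                    # joint least squares on support
        coef, *_ = np.linalg.lstsq(sub, Y, rcond=None)
        residual = Y - sub @ coef
    X_hat = np.zeros((n, Y.shape[1]))
    X_hat[support, :] = coef
    return X_hat, sorted(support)

rng = np.random.default_rng(0)
n, m, K, s = 64, 32, 4, 3                      # atoms, measurements, users, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
true_support = [5, 17, 42]                     # shared local scatterers
X = np.zeros((n, K))
X[true_support, :] = rng.uniform(1.0, 2.0, size=(s, K)) * rng.choice([-1, 1], size=(s, K))
Y = A @ X                                      # noiseless compressed measurements
X_hat, found = joint_omp(Y, A, s)
```

Because the selection statistic pools all K users, the shared support is recovered far more reliably than by running OMP per user.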

  15. Contrast statistics for foveated visual systems: fixation selection by minimizing contrast entropy

    NASA Astrophysics Data System (ADS)

    Raj, Raghu; Geisler, Wilson S.; Frazor, Robert A.; Bovik, Alan C.

    2005-10-01

    The human visual system combines a wide field of view with a high-resolution fovea and uses eye, head, and body movements to direct the fovea to potentially relevant locations in the visual scene. This strategy is sensible for a visual system with limited neural resources. However, for this strategy to be effective, the visual system needs sophisticated central mechanisms that efficiently exploit the varying spatial resolution of the retina. To gain insight into some of the design requirements of these central mechanisms, we have analyzed the effects of variable spatial resolution on local contrast in 300 calibrated natural images. Specifically, for each retinal eccentricity (which produces a certain effective level of blur), and for each value of local contrast observed at that eccentricity, we measured the probability distribution of the local contrast in the unblurred image. These conditional probability distributions can be regarded as posterior probability distributions for the ``true'' unblurred contrast, given an observed contrast at a given eccentricity. We find that these conditional probability distributions are adequately described by a few simple formulas. To explore how these statistics might be exploited by central perceptual mechanisms, we consider the task of selecting successive fixation points, where the goal on each fixation is to maximize total contrast information gained about the image (i.e., minimize total contrast uncertainty). We derive an entropy minimization algorithm and find that it performs optimally at reducing total contrast uncertainty and that it also works well at reducing the mean squared error between the original image and the image reconstructed from the multiple fixations. Our results show that measurements of local contrast alone could efficiently drive the scan paths of the eye when the goal is to gain as much information about the spatial structure of a scene as possible.
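The fixation-selection idea can be sketched with a toy greedy loop: uncertainty (entropy) is tracked per image location, effective resolution falls off with eccentricity, and each fixation is chosen to maximize the expected reduction in total contrast entropy. The exponential falloff and the numbers below are illustrative stand-ins for the measured conditional contrast statistics, not the paper's actual model.

```python
import numpy as np

def gain(entropy, fix, falloff=0.2):
    """Expected entropy reduction if the fovea lands at index `fix`.

    Resolution decays with eccentricity (here a simple exponential),
    so distant locations contribute little information per fixation.
    """
    ecc = np.abs(np.arange(entropy.size) - fix)
    return np.sum(entropy * np.exp(-falloff * ecc))

def next_fixation(entropy):
    # greedy choice: the location with the largest expected information gain
    return int(np.argmax([gain(entropy, f) for f in range(entropy.size)]))

def look(entropy, fix, falloff=0.2):
    # entropy remaining after a fixation: fully resolved at the fovea,
    # barely reduced in the periphery
    ecc = np.abs(np.arange(entropy.size) - fix)
    return entropy * (1.0 - np.exp(-falloff * ecc))

# a 1D "image" with two high-uncertainty regions
entropy = np.array([0.1, 0.2, 3.0, 0.2, 0.1, 0.1, 2.0, 0.1])
initial_total = entropy.sum()
path = []
for _ in range(3):
    f = next_fixation(entropy)
    path.append(f)
    entropy = look(entropy, f)
final_total = entropy.sum()
```

The greedy scan path visits the two uncertainty peaks first, mirroring the paper's finding that local contrast alone can drive informative eye movements.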

  16. Ropes: Support for collective operations among distributed threads

    NASA Technical Reports Server (NTRS)

    Haines, Matthew; Mehrotra, Piyush; Cronk, David

    1995-01-01

    Lightweight threads are becoming increasingly useful in supporting parallelism and asynchronous control structures in applications and language implementations. Recently, systems have been designed and implemented to support interprocessor communication between lightweight threads so that threads can be exploited in a distributed memory system. Their use, in this setting, has been largely restricted to supporting latency hiding techniques and functional parallelism within a single application. However, to execute data parallel codes independent of other threads in the system, collective operations and relative indexing among threads are required. This paper describes the design of ropes: a scoping mechanism for collective operations and relative indexing among threads. We present the design of ropes in the context of the Chant system, and provide performance results evaluating our initial design decisions.

  17. Ion distributions in electrolyte confined by multiple dielectric interfaces

    NASA Astrophysics Data System (ADS)

    Jing, Yufei; Zwanikken, Jos W.; Jadhao, Vikram; de La Cruz, Monica

    2014-03-01

    The distribution of ions at dielectric interfaces between liquids characterized by different dielectric permittivities is crucial to nanoscale assembly processes in many biological and synthetic materials such as cell membranes, colloids and oil-water emulsions. Knowledge of the ionic structure of these systems is also exploited in energy storage devices such as double-layer super-capacitors. The presence of multiple dielectric interfaces often complicates computing the desired ionic distributions via simulations or theory. Here, we use coarse-grained models to compute the ionic distributions in a system of electrolyte confined by two planar dielectric interfaces, using Car-Parrinello molecular dynamics simulations and liquid state theory. We compute the density profiles for various electrolyte concentrations, stoichiometric ratios and dielectric contrasts. Explanations for the trends in these profiles, and a discussion of their effects on the behavior of the confined charged fluid, are also presented.

  18. Work stealing for GPU-accelerated parallel programs in a global address space framework: WORK STEALING ON GPU-ACCELERATED SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arafat, Humayun; Dinan, James; Krishnamoorthy, Sriram

    Task parallelism is an attractive approach to automatically load balance the computation in a parallel system and adapt to dynamism exhibited by parallel systems. Exploiting task parallelism through work stealing has been extensively studied in shared and distributed-memory contexts. In this paper, we study the design of a system that uses work stealing for dynamic load balancing of task-parallel programs executed on hybrid distributed-memory CPU-graphics processing unit (GPU) systems in a global-address space framework. We take into account the unique nature of the accelerator model employed by GPUs, the significant performance difference between GPU and CPU execution as a function of problem size, and the distinct CPU and GPU memory domains. We consider various alternatives in designing a distributed work stealing algorithm for CPU-GPU systems, while taking into account the impact of task distribution and data movement overheads. These strategies are evaluated using microbenchmarks that capture various execution configurations as well as the state-of-the-art CCSD(T) application module from the computational chemistry domain.
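The core of any work-stealing scheduler is the per-worker deque discipline: an owner pushes and pops tasks LIFO at one end, while an idle thief steals FIFO from the other end of a victim's deque. The single-threaded round-robin sketch below shows just that discipline; names and sizes are illustrative, and it deliberately ignores the GPU/CPU execution and data-movement issues the paper addresses.

```python
import collections
import random

class Worker:
    def __init__(self, wid):
        self.wid = wid
        self.tasks = collections.deque()   # owner works the right end
        self.done = []

    def step(self, workers, rng):
        """Run one task: pop own work LIFO, or steal FIFO from a victim."""
        if self.tasks:
            task = self.tasks.pop()                    # owner's (hot) end
        else:
            victims = [w for w in workers if w is not self and w.tasks]
            if not victims:
                return False                           # nothing left anywhere
            task = rng.choice(victims).tasks.popleft() # thief's (cold) end
        self.done.append(task)
        return True

rng = random.Random(1)
workers = [Worker(0), Worker(1)]
for t in range(20):
    workers[0].tasks.append(t)          # all work initially lands on worker 0

progress = True
while progress:                         # round-robin stand-in for real threads
    progress = False
    for w in workers:
        if w.step(workers, rng):
            progress = True
```

Stealing from the opposite end keeps the thief away from the owner's cache-hot work and tends to move the oldest (often largest) tasks, which is why the discipline balances load so cheaply.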

  19. Anomalous Transient Amplification of Waves in Non-normal Photonic Media

    NASA Astrophysics Data System (ADS)

    Makris, K. G.; Ge, L.; Türeci, H. E.

    2014-10-01

    Dissipation is a ubiquitous phenomenon in dynamical systems encountered in nature because no finite system is fully isolated from its environment. In optical systems, a key challenge facing any technological application has traditionally been the mitigation of optical losses. Recent work has shown that a new class of optical materials that consist of a precisely balanced distribution of loss and gain can be exploited to engineer novel functionalities for propagating and filtering electromagnetic radiation. Here we show a generic property of optical systems that feature an unbalanced distribution of loss and gain, described by non-normal operators, namely, that an overall lossy optical system can transiently amplify certain input signals by several orders of magnitude. We present a mathematical framework to analyze the dynamics of wave propagation in media with an arbitrary distribution of loss and gain, and we construct the initial conditions to engineer such non-normal power amplifiers. Our results point to a new design space for engineered optical systems employed in photonics and quantum optics.
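The transient-growth mechanism can be checked numerically on a 2x2 toy operator: both eigenvalues are negative (the system is overall lossy), yet strong non-normal coupling makes the propagator norm, i.e. the worst-case input amplification, grow by orders of magnitude before decaying. The matrix below is an illustrative example, not one from the paper; for this upper-triangular A the matrix exponential has a simple closed form.

```python
import numpy as np

# Stable but highly non-normal: both eigenvalues are -decay,
# yet the off-diagonal coupling c drives large transient growth.
decay, c = 0.05, 8.0

def propagator(t):
    """exp(A t) for A = [[-decay, c], [0, -decay]].

    Since A = -decay*I + N with N nilpotent (N @ N = 0),
    exp(A t) = exp(-decay t) * (I + N t) exactly.
    """
    return np.exp(-decay * t) * np.array([[1.0, c * t],
                                          [0.0, 1.0]])

def gain(t):
    # spectral norm of the propagator = worst-case amplification at time t
    return np.linalg.norm(propagator(t), 2)

times = np.linspace(0.0, 120.0, 601)
gains = np.array([gain(t) for t in times])
peak = gains.max()
```

A normal matrix with the same eigenvalues would give a monotone decay from gain 1; here the gain rises to roughly c/(decay*e) before the exponential envelope wins, which is exactly the transient amplification the abstract describes.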

  20. Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Alruwaili, Manal

    With the growing technology, the number of processors is becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass-scale application software development on the massively parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massively parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10 and UPC++, exploit distributed computing, data-parallel computing and thread-parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; or 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object cloning and object migration; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work on different elements of distributed data concurrently using remote method invocations. I present the new constructs, their grammar and their behavior, and explain them using simple programs that utilize these constructs.

  1. Surface plasmon holographic microscopy for near-field refractive index detection and thin film mapping

    NASA Astrophysics Data System (ADS)

    Zhao, Jianlin; Zhang, Jiwei; Dai, Siqing; Di, Jianglei; Xi, Teli

    2018-02-01

    Surface plasmon microscopy (SPM) is widely applied for label-free detection of changes in refractive index and concentration, as well as for mapping thin films in the near field. Traditionally, SPM systems are based on the detection of light intensity or phase changes. Here, we present two kinds of surface plasmon holographic microscopy (SPHM) systems for simultaneous amplitude- and phase-contrast imaging. By recording off-axis holograms and performing numerical reconstruction, the complex amplitude distributions of surface plasmon resonance (SPR) images can be obtained. According to the Fresnel formula, in a prism/gold/dielectric structure the reflection phase shift is uniquely determined by the refractive index of the dielectric. By measuring the phase shift difference of the reflected light with a prism-coupling SPHM system based on a common-path interference configuration, tiny refractive index variations are monitored and biological tissue is imaged. Furthermore, to characterize thin film thickness in the near field, we employ a four-layer SPR model in which the third (film) layer is within the evanescent field. The complex reflection coefficient, including the reflectivity and reflection phase shift, is uniquely determined by the film thickness. By measuring the complex amplitude distributions of the SPR images with an objective-coupling SPHM system based on a common-path interference configuration, the thickness distributions of thin films are mapped with theoretically sub-nanometer resolution. Owing to their high temporal stability, the proposed SPHMs show great potential for monitoring tiny refractive index variations, imaging biological tissues and mapping thin films in the near field, with dynamic, nondestructive and full-field measurement capabilities in chemistry, biomedicine, etc.

  3. On the Path to SunShot - Emerging Issues and Challenges with Integrating High Levels of Solar into the Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Broderick, Robert; Mather, Barry

    2016-05-01

    Wide use of advanced inverters could double the electricity-distribution system's hosting capacity for distributed PV at low cost, from about 170 GW to 350 GW (see Palmintier et al. 2016). At the distribution system level, increased variable generation due to high penetrations of distributed PV (typically rooftop and smaller ground-mounted systems) could challenge the management of distribution voltage, potentially increase wear and tear on electromechanical utility equipment, and complicate the configuration of circuit breakers and other protection systems, all of which could increase costs, limit further PV deployment, or both. However, improved analysis of distribution system hosting capacity (the amount of distributed PV that can be interconnected without changing the existing infrastructure or prematurely wearing out equipment) has overturned previous rule-of-thumb assumptions, such as the idea that distributed PV penetrations higher than 15% require detailed impact studies. For example, new analysis suggests that the hosting capacity for distributed PV could rise from approximately 170 GW using traditional inverters to about 350 GW with the use of advanced inverters for voltage management, and it could be even higher using accessible and low-cost strategies such as careful siting of PV systems within a distribution feeder and additional minor changes in distribution operations. Also critical to facilitating distributed PV deployment is the improvement of interconnection processes, associated standards and codes, and compensation mechanisms so they embrace PV's contributions to system-wide operations. Ultimately, SunShot-level PV deployment will require unprecedented coordination of the historically separate distribution and transmission systems, along with incorporation of energy storage and "virtual storage," which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Additional analysis and innovation are needed.

  4. Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems

    NASA Astrophysics Data System (ADS)

    Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant

    2004-08-01

    The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.

  5. Multiobjective evolutionary optimization of water distribution systems: Exploiting diversity with infeasible solutions.

    PubMed

    Tanyimboh, Tiku T; Seyoum, Alemtsehay G

    2016-12-01

    This article investigates the computational efficiency of constraint handling in multi-objective evolutionary optimization algorithms for water distribution systems. The methodology investigated here encourages the co-existence and simultaneous development, including crossbreeding, of subpopulations of cost-effective feasible and infeasible solutions based on Pareto dominance. This yields a boundary search approach that also promotes diversity in the gene pool throughout the progress of the optimization by exploiting the full spectrum of non-dominated infeasible solutions. The relative effectiveness of small and moderate population sizes with respect to the number of decision variables is also investigated. The results reveal the optimization algorithm to be efficient, stable and robust. It found optimal and near-optimal solutions reliably and efficiently. The optimization problem, based on a real-world system, involved multiple variable-head supply nodes, 29 fire-fighting flows, extended period simulation and multiple demand categories including water loss. The least-cost solutions found satisfied the flow and pressure requirements consistently. The best solutions achieved indicative savings of 48.1% and 48.2%, based on the cost of the pipes in the existing network, for populations of 200 and 1000, respectively. The population of 1000 achieved slightly better results overall. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
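The constraint-handling idea, treating constraint violation as an additional objective so that cheap but slightly infeasible networks survive alongside feasible ones, reduces to ordinary Pareto dominance on (cost, violation) pairs. A minimal sketch with made-up numbers, not the paper's test network:

```python
def dominates(a, b):
    """a, b are (cost, violation) pairs; both objectives are minimised.

    Keeping non-dominated infeasible solutions in the pool is the
    boundary-search idea: they crossbreed with feasible ones and pull
    the search toward the feasibility boundary where optima lie.
    """
    not_worse = all(x <= y for x, y in zip(a, b))
    strictly_better = any(x < y for x, y in zip(a, b))
    return not_worse and strictly_better

def nondominated(pop):
    return [p for p in pop if not any(dominates(q, p) for q in pop if q != p)]

# (pipe cost, pressure deficit) for a toy set of candidate networks
pop = [(100, 0.0), (80, 0.5), (60, 2.0), (90, 0.2), (120, 0.0), (70, 2.5)]
front = nondominated(pop)
```

Note that the expensive feasible design (120, 0.0) is eliminated by a cheaper feasible one, while the cheap infeasible designs remain on the front rather than being discarded outright.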

  6. Quantum hacking on quantum key distribution using homodyne detection

    NASA Astrophysics Data System (ADS)

    Huang, Jing-Zheng; Kunz-Jacques, Sébastien; Jouguet, Paul; Weedbrook, Christian; Yin, Zhen-Qiang; Wang, Shuang; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2014-03-01

    Imperfect devices in commercial quantum key distribution systems open security loopholes that an eavesdropper may exploit. An example of one such imperfection is the wavelength-dependent coupling ratio of the fiber beam splitter. Utilizing this loophole, the eavesdropper can vary the transmittances of the fiber beam splitter at the receiver's side by inserting light with wavelengths different from those normally used. Here, we propose a wavelength attack on a practical continuous-variable quantum key distribution system using homodyne detection. By inserting light pulses at different wavelengths, this attack allows the eavesdropper to bias the shot-noise estimation even if it is done in real time. Based on experimental data, we discuss the feasibility of this attack and suggest a prevention scheme by improving the previously proposed countermeasures.

  7. Distributed Similarity based Clustering and Compressed Forwarding for wireless sensor networks.

    PubMed

    Arunraja, Muruganantham; Malathi, Veluchamy; Sakthivel, Erulappan

    2015-11-01

    Wireless sensor networks are engaged in various data gathering applications. The major bottleneck in wireless data gathering systems is the finite energy of the sensor nodes. By conserving the on-board energy, the life span of a wireless sensor network can be extended considerably. Since data communication is the dominant energy-consuming activity of a wireless sensor network, data reduction serves well in conserving nodal energy. Spatial and temporal correlation among the sensor data is exploited to reduce data communication. Forming clusters of nodes with similar data is an effective way to exploit spatial correlation among neighboring sensors. Sending only a subset of the data and estimating the rest from this subset is the contemporary way of exploiting temporal correlation. In Distributed Similarity based Clustering and Compressed Forwarding for wireless sensor networks, we construct data-similar iso-clusters with minimal communication overhead. The intra-cluster communication is reduced using an adaptive normalized least mean squares (NLMS) based dual prediction framework. The cluster head reduces the inter-cluster data payload using a lossless compressive forwarding technique. The proposed work achieves significant data reduction in both the intra-cluster and the inter-cluster communications, while maintaining the accuracy of the collected data. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
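The intra-cluster mechanism, a dual prediction framework, can be sketched as follows: the sensor and the cluster head run identical NLMS predictors, and a reading is transmitted only when the shared prediction misses it by more than a tolerance, which bounds the per-sample reconstruction error by that tolerance. The filter order, step size, tolerance and signal below are illustrative, not the paper's settings.

```python
import numpy as np

def dual_prediction(signal, order=4, mu=0.5, eps=1e-6, tol=0.15):
    """NLMS-based dual prediction: returns (reconstruction, #transmissions).

    Sender and sink keep identical filter state, so the sink can
    reproduce every prediction; only readings whose prediction error
    exceeds `tol` are actually sent over the radio.
    """
    w = np.zeros(order)        # shared NLMS weights
    hist = np.zeros(order)     # shared regressor of past reconstructed values
    recon, sent = [], 0
    for d in signal:
        y = w @ hist                       # prediction both ends can compute
        if abs(d - y) > tol:               # prediction failed: transmit reading
            value, sent = d, sent + 1
        else:                              # sink silently keeps its prediction
            value = y
        e = value - y                      # adapt on the value the sink holds
        w = w + mu * e * hist / (eps + hist @ hist)
        hist = np.roll(hist, 1)
        hist[0] = value
        recon.append(value)
    return np.array(recon), sent

t = np.arange(200)
signal = np.sin(2 * np.pi * t / 40)        # smooth, temporally correlated data
recon, sent = dual_prediction(signal)
```

By construction every suppressed sample differs from the true reading by at most `tol`, so energy is saved without sacrificing a bounded accuracy guarantee.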

  8. Suboptimal distributed control and estimation: application to a four coupled tanks system

    NASA Astrophysics Data System (ADS)

    Orihuela, Luis; Millán, Pablo; Vivas, Carlos; Rubio, Francisco R.

    2016-06-01

    The paper proposes an innovative estimation and control scheme that enables the distributed monitoring and control of large-scale processes. The proposed approach considers a discrete linear time-invariant process controlled by a network of agents that may both collect information about the evolution of the plant and apply control actions to drive its behaviour. The problem makes full sense when local observability/controllability is not assumed and the communication between agents can be exploited to reach system-wide goals. Additionally, to reduce the agents' bandwidth requirements and power consumption, an event-based communication policy is studied. The design procedure guarantees system stability, allowing the designer to trade off performance, control effort and communication requirements. The obtained controllers and observers are implemented in a fully distributed fashion. To illustrate the performance of the proposed technique, experimental results on a quadruple-tank process are provided.

  9. Compressive Feedback Control Design for Spatially Distributed Systems

    DTIC Science & Technology

    2017-01-03

    NecSys 2015 & 2016 Abstract The goal of this research is the development of new fundamental insights and methodologies to exploit structural properties of...Measures One of the simplest class of dynamical networks that our proposed methodology can be explained in a simple setting is the class of first–order...developed a novel methodology to obtain tight lower and upper bounds for the class of systemic measures. In the following, some of the key ideas behind our

  10. Finite-key analysis for measurement-device-independent quantum key distribution.

    PubMed

    Curty, Marcos; Xu, Feihu; Cui, Wei; Lim, Charles Ci Wen; Tamaki, Kiyoshi; Lo, Hoi-Kwong

    2014-04-29

    Quantum key distribution promises unconditionally secure communications. However, as practical devices tend to deviate from their specifications, the security of some practical systems is no longer valid. In particular, an adversary can exploit imperfect detectors to learn a large part of the secret key, even though the security proof claims otherwise. Recently, a practical approach--measurement-device-independent quantum key distribution--has been proposed to solve this problem. However, so far its security has only been fully proven under the assumption that the legitimate users of the system have unlimited resources. Here we fill this gap and provide a rigorous security proof against general attacks in the finite-key regime. This is obtained by applying large deviation theory, specifically the Chernoff bound, to perform parameter estimation. For the first time we demonstrate the feasibility of long-distance implementations of measurement-device-independent quantum key distribution within a reasonable time frame of signal transmission.
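The role of the Chernoff bound in finite-key parameter estimation can be illustrated with its standard multiplicative form, P[X >= (1+delta) mu] <= exp(-delta^2 mu / 3) (valid for 0 < delta <= 1): given an observed error rate, it yields a confidence upper bound whose width shrinks as the number of detections grows. The numbers below are illustrative, not taken from the paper's security proof.

```python
import math

def chernoff_delta(mu, eps_pe):
    """Smallest multiplicative deviation delta satisfying
    exp(-delta**2 * mu / 3) <= eps_pe, i.e. the slack one must allow
    so that the estimate fails only with probability eps_pe."""
    return math.sqrt(3.0 * math.log(1.0 / eps_pe) / mu)

n = 10**6              # detected signals used for parameter estimation
qber_obs = 0.02        # observed quantum bit error rate
eps_pe = 1e-10         # allowed failure probability of the estimate

mu = n * qber_obs                       # expected error count
delta = chernoff_delta(mu, eps_pe)
qber_upper = qber_obs * (1.0 + delta)   # worst-case QBER assumed by the proof
```

The key point of the finite-key analysis is visible directly: with 100x more detections, delta shrinks by a factor of 10, so the pessimistic QBER (and hence the key-rate penalty) approaches the observed value.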

  11. Bibliography--Unclassified Technical Reports, Special Reports, and Technical Notes: FY 1982.

    DTIC Science & Technology

    1982-11-01

    in each category are listed in chronological order under seven areas: manpower management, personnel administration , organization management, education...7633). Technical reports listed that have unlimited distribution can also be obtained from the National Technical Information Service , 5285 Port Royal...simulations of manpower systems. This research exploits the technology of computer-managed large-scale data bases. PERSONNEL ADMINISTRATION The personnel

  12. EMASS (tm): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1992-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work product will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. This paper describes the expandable architecture of the E-Systems Modular Automated Storage System (EMASS (TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century.

  13. EMASS (trademark): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1991-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. The expandable architecture of the E-Systems Modular Automated Storage System (EMASS(TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century, is described.

  14. TALON: the telescope alert operation network system: intelligent linking of distributed autonomous robotic telescopes

    NASA Astrophysics Data System (ADS)

    White, Robert R.; Wren, James; Davis, Heath R.; Galassi, Mark; Starr, Daniel; Vestrand, W. T.; Wozniak, P.

    2004-09-01

    The internet has brought about great change in the astronomical community, but this interconnectivity is just starting to be exploited for use in instrumentation. Utilizing the internet for communicating between distributed astronomical systems is still in its infancy, but it already shows great potential. Here we present an example of a distributed network of telescopes that performs more efficiently in synchronous operation than as individual instruments. RAPid Telescopes for Optical Response (RAPTOR) is a system of telescopes at LANL that has intelligent intercommunication, combined with wide-field optics, temporal monitoring software, and deep-field follow-up capability, all working in closed-loop real-time operation. The Telescope ALert Operations Network (TALON) is a network server that allows intercommunication of alert triggers from external and internal resources and controls the distribution of these to each of the telescopes on the network. TALON is designed to grow, allowing any number of telescopes to be linked together and communicate. Coupled with an intelligent alert client at each telescope, it can analyze and respond to each distributed TALON alert based on each telescope's needs and schedule.

  15. A comparison of decentralized, distributed, and centralized vibro-acoustic control.

    PubMed

    Frampton, Kenneth D; Baumann, Oliver N; Gardonio, Paolo

    2010-11-01

    Direct velocity feedback control of structures is well known to increase structural damping and thus reduce vibration. In multi-channel systems the way in which the velocity signals are used to inform the actuators ranges from decentralized control, through distributed or clustered control to fully centralized control. The objective of distributed controllers is to exploit the anticipated performance advantage of the centralized control while maintaining the scalability, ease of implementation, and robustness of decentralized control. However, and in seeming contradiction, some investigations have concluded that decentralized control performs as well as distributed and centralized control, while other results have indicated that distributed control has significant performance advantages over decentralized control. The purpose of this work is to explain this seeming contradiction in results, to explore the effectiveness of decentralized, distributed, and centralized vibro-acoustic control, and to expand the concept of distributed control to include the distribution of the optimization process and the cost function employed.

  16. Wireless Infrastructure M2M Network For Distributed Power Grid Monitoring

    PubMed Central

    Gharavi, Hamid; Hu, Bin

    2018-01-01

    With the massive integration of distributed renewable energy sources (RESs) into the power system, the demand for timely and reliable network quality monitoring, control, and fault analysis is rapidly growing. Following the successful deployment of Phasor Measurement Units (PMUs) in transmission systems for power monitoring, a new opportunity to utilize PMU measurement data for power quality assessment in distribution grid systems is emerging. The main problem, however, is that a distribution grid system does not normally have the support of an infrastructure network. Therefore, the main objective in this paper is to develop a Machine-to-Machine (M2M) communication network that can support wide-ranging sensory data, including high-rate synchrophasor data for real-time communication. In particular, we evaluate the suitability of the emerging IEEE 802.11ah standard by exploiting its important features, such as classifying the power grid sensory data into different categories according to their traffic characteristics. For performance evaluation we use our hardware-in-the-loop grid communication network testbed to assess the performance of the network. PMID:29503505

  17. Study of Solid State Drives performance in PROOF distributed analysis system

    NASA Astrophysics Data System (ADS)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  18. Wireless Infrastructure M2M Network For Distributed Power Grid Monitoring.

    PubMed

    Gharavi, Hamid; Hu, Bin

    2017-01-01

    With the massive integration of distributed renewable energy sources (RESs) into the power system, the demand for timely and reliable network quality monitoring, control, and fault analysis is rapidly growing. Following the successful deployment of Phasor Measurement Units (PMUs) in transmission systems for power monitoring, a new opportunity to utilize PMU measurement data for power quality assessment in distribution grid systems is emerging. The main problem, however, is that a distribution grid system does not normally have the support of an infrastructure network. Therefore, the main objective in this paper is to develop a Machine-to-Machine (M2M) communication network that can support wide-ranging sensory data, including high-rate synchrophasor data for real-time communication. In particular, we evaluate the suitability of the emerging IEEE 802.11ah standard by exploiting its important features, such as classifying the power grid sensory data into different categories according to their traffic characteristics. For performance evaluation we use our hardware-in-the-loop grid communication network testbed to assess the performance of the network.

  19. Social Media as Seismic Networks for the Earthquake Damage Assessment

    NASA Astrophysics Data System (ADS)

    Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.

    2014-12-01

    The growing popularity of online platforms based on user-generated content is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real-life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social media emergency management, in order to obtain a very fast, but still reliable, estimate of the scale of an emergency. First of all we designed and implemented a decision support system for the detection and damage assessment of earthquakes. Our system exploits the messages shared in real time on Twitter. In the detection phase, data mining and natural language processing techniques are first adopted to select meaningful and comprehensive sets of tweets. Then we applied a burst detection algorithm in order to promptly identify outbreaking seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to official Italian INGV reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5 with a precision of 75% and a recall of 81.82%. We then focused our attention on the damage assessment phase, investigating the possibility of exploiting social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets exploited to compute our earthquake features, and on more than 7,000 globally distributed earthquakes, acquired semi-automatically from USGS, serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time and linguistic. We ran diagnostic tests and simulations on the generated models to assess their significance and avoid overfitting. Overall results show a correlation between the messages shared in social media and intensity estimations based on online survey data (CDI).
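    The burst-detection step described in this record can be illustrated with a minimal sketch: flag time bins whose tweet count jumps well above a trailing baseline. The window length, threshold factor, and absolute floor below are illustrative assumptions, not the project's actual parameters.

    ```python
    from collections import deque

    def detect_bursts(counts, window=10, factor=3.0, min_count=20):
        """Flag time bins whose tweet count exceeds `factor` times the
        trailing-window mean (and an absolute floor) as candidate events.
        A toy stand-in for the burst-detection step described above."""
        baseline = deque(maxlen=window)
        bursts = []
        for i, c in enumerate(counts):
            mean = sum(baseline) / len(baseline) if baseline else 0.0
            if len(baseline) == window and c >= min_count and c > factor * mean:
                bursts.append(i)
            baseline.append(c)
        return bursts

    quiet = [5, 6, 4, 7, 5, 6, 5, 4, 6, 5]
    series = quiet + [80, 95, 70] + quiet
    print(detect_bursts(series))  # the three spike bins are flagged
    ```

    A real deployment would run this after the tweet-filtering step the abstract mentions, since raw streams are far noisier than a clean count series.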

  20. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively.
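    As a minimal sketch of how a BN can turn security evidence into a risk estimate, the toy network below has two parent nodes (threat, vulnerability) feeding one incident node, evaluated by enumeration. The structure and all probabilities are invented for illustration; they are not the model from the paper.

    ```python
    # Toy Bayesian network: threat and vulnerability are parents of "incident".
    P_threat = {True: 0.3, False: 0.7}
    # Conditional probability table: P(incident | threat, vulnerability).
    P_incident = {(True, True): 0.9, (True, False): 0.4,
                  (False, True): 0.3, (False, False): 0.05}

    def p_incident_given_vuln(vuln):
        """P(incident | vulnerability state), enumerating the hidden threat node."""
        return sum(P_threat[t] * P_incident[(t, vuln)] for t in (True, False))

    print(round(p_incident_given_vuln(True), 3))   # risk with a known vulnerability
    print(round(p_incident_given_vuln(False), 3))  # risk without it
    ```

    In the paper's setting, the predicted risk level would then drive the security manager's choice of safeguard; here the inference step alone is shown.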

  1. Distributed cooperative control of AC microgrids

    NASA Astrophysics Data System (ADS)

    Bidram, Ali

    In this dissertation, the comprehensive secondary control of electric power microgrids is of concern. Microgrid technical challenges are mainly addressed through the hierarchical control structure, comprising primary, secondary, and tertiary control levels. The primary control level is locally implemented at each distributed generator (DG), while the secondary and tertiary control levels are conventionally implemented through a centralized control structure. The centralized structure requires a central controller, which raises reliability concerns by introducing a single point of failure. In this dissertation, a distributed control structure using the distributed cooperative control of multi-agent systems is exploited to increase the reliability of the secondary control. The secondary control objectives are the microgrid voltage and frequency and the DGs' active and reactive powers. Fully distributed control protocols are implemented through distributed communication networks. In the distributed control structure, each DG only requires its own information and the information of its neighbors on the communication network. The distributed structure obviates the requirement for a central controller and a complex communication network which, in turn, improves the system reliability. Since the DG dynamics are nonlinear and non-identical, input-output feedback linearization is used to transform the nonlinear dynamics of DGs into linear dynamics. The proposed control frameworks cover the control of microgrids containing inverter-based DGs. Typical microgrid test systems are used to verify the effectiveness of the proposed control protocols.
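    The core idea, each DG correcting its frequency from neighbor information only, with one "pinned" DG that also sees the nominal reference, can be sketched as a discrete-time consensus iteration. The topology, gains, step count, and initial frequencies below are illustrative assumptions, not the dissertation's controller.

    ```python
    import numpy as np

    # Toy cooperative secondary frequency control over a ring of 4 DGs.
    adj = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)   # communication graph
    pin = np.array([1.0, 0, 0, 0])                # only DG 0 sees the reference
    f_ref, gain, steps = 60.0, 0.2, 300
    f = np.array([59.7, 60.2, 59.9, 60.4])        # post-primary-droop frequencies

    for _ in range(steps):
        # consensus term: disagreement with neighbours plus (if pinned) reference
        u = adj @ f - adj.sum(axis=1) * f + pin * (f_ref - f)
        f = f + gain * u

    print(np.round(f, 4))  # all DGs driven back toward the 60 Hz reference
    ```

    The pinning term is what restores the nominal value; pure consensus without it would only make the DGs agree on some average frequency.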

  2. The Global File System

    NASA Technical Reports Server (NTRS)

    Soltis, Steven R.; Ruwart, Thomas M.; O'Keefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via networks such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility such that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across the processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of in the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.

  3. New tools: potential medical applications of data from new and old environmental satellites.

    PubMed

    Huh, O K; Malone, J B

    2001-04-27

    The last 40 years, beginning with the first TIROS (television infrared observational satellite) launched on 1 April 1960, have seen an explosion of earth environmental satellite systems and their capabilities. They can provide measurements in globe-encircling arrays or over small select areas, with increasing resolutions and new capabilities. Concurrently, there are expanding numbers of existing and emerging infectious diseases, many distributed according to areal patterns of physical conditions at the earth's surface. For these reasons, the medical and remote sensing communities can beneficially collaborate with the objective of making needed progress in public health activities by exploiting the advances of the national and international space programs. Major improvements in the applicability of remotely sensed data are becoming possible with increases in the four kinds of resolution scheduled over the next few years: spatial, temporal, radiometric and spectral. Much collaborative research will be necessary before data from these systems are fully exploited by the medical community.

  4. Optimal post-experiment estimation of poorly modeled dynamic systems

    NASA Technical Reports Server (NTRS)

    Mook, D. Joseph

    1988-01-01

    Recently, a novel strategy for post-experiment state estimation of discretely-measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than those of filters for applications in which the dynamic model error clearly violates the typical process noise assumptions, and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem, and may be exploited for numerous purposes. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.

  5. Quantum cryptography with entangled photons

    PubMed

    Jennewein; Simon; Weihs; Weinfurter; Zeilinger

    2000-05-15

    By realizing a quantum cryptography system based on polarization entangled photon pairs we establish highly secure keys, because a single photon source is approximated and the inherent randomness of quantum measurements is exploited. We implement a novel key distribution scheme using Wigner's inequality to test the security of the quantum channel, and, alternatively, realize a variant of the BB84 protocol. Our system has two completely independent users separated by 360 m, and generates raw keys at rates of 400-800 bits/s with bit error rates around 3%.

  6. Progressive retry for software error recovery in distributed systems

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.

  7. Megawatt Space Power Conditioning, Distribution, and Control Study

    DTIC Science & Technology

    1988-03-01

    also must be given to the design of an ac transmission line for this relatively high frequency . 2.3.2 Medium High Voltage Systems. Figure 2-4 shows a...systems are designed to exploit the use of 2 MW klystrode tubes (see Section 3.1) which require a dc voltage of about 140 kV. This high voltage can be...the concerns is that to date there have been no three-phase high voltage, high frequency transmission lines designed . Figure 5-6. While the previous

  8. Distributed MIMO chaotic radar based on wavelength-division multiplexing technology.

    PubMed

    Yao, Tingfeng; Zhu, Dan; Ben, De; Pan, Shilong

    2015-04-15

    A distributed multiple-input multiple-output chaotic radar based on wavelength-division multiplexing technology (WDM) is proposed and demonstrated. The wideband quasi-orthogonal chaotic signals generated by different optoelectronic oscillators (OEOs) are emitted by separated antennas to gain spatial diversity against the fluctuation of a target's radar cross section and enhance the detection capability. The received signals collected by the receive antennas and the reference signals from the OEOs are delivered to the central station for joint processing by exploiting WDM technology. The centralized signal processing avoids precise time synchronization of the distributed system and greatly simplifies the remote units, which improves the localization accuracy of the entire system. A proof-of-concept experiment for two-dimensional localization of a metal target is demonstrated. The maximum position error is less than 6.5 cm.
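    The centralized delay estimation underlying the localization step can be sketched with a cross-correlation between a stored reference waveform and its delayed, noisy echo; the lag of the correlation peak gives the round-trip delay and hence the range. The Gaussian surrogate waveform, attenuation, and noise level below are illustrative stand-ins for the OEO-generated chaotic signals.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Surrogate wideband waveform; the real system uses OEO-generated chaos.
    ref = rng.standard_normal(4000)

    true_delay = 137                              # round-trip delay in samples
    echo = np.zeros_like(ref)
    echo[true_delay:] = 0.5 * ref[:-true_delay]   # attenuated, delayed copy
    echo += 0.3 * rng.standard_normal(ref.size)   # receiver noise

    # Cross-correlate the received echo with the reference; the lag of the
    # correlation peak estimates the target delay.
    corr = np.correlate(echo, ref, mode="full")
    lag = corr.argmax() - (ref.size - 1)
    print(lag)  # recovered delay in samples
    ```

    The quasi-orthogonality of the chaotic waveforms is what lets several such correlations run jointly at the central station without the transmitters interfering.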

  9. A fiber-based quasi-continuous-wave quantum key distribution system

    PubMed Central

    Shen, Yong; Chen, Yan; Zou, Hongxin; Yuan, Jianmin

    2014-01-01

    We report a fiber-based quasi-continuous-wave (CW) quantum key distribution (QKD) system with continuous variables (CV). This system employs coherent light pulses and time multiplexing to maximally reduce cross talk in the fiber. A no-switching detection scheme is adopted to optimize the repetition rate. Information is encoded on the sideband of the pulsed coherent light to fully exploit the continuous-wave nature of the laser field. With this configuration, a high secret key rate can be achieved. For the 50 MHz detected bandwidth in our experiment, when the multidimensional reconciliation protocol is applied, a secret key rate of 187 kb/s can be achieved over 50 km of optical fiber against collective attacks, which have been shown to be asymptotically optimal. Moreover, recently studied loopholes have been fixed in our system. PMID:24691409

  10. Operational Reconnaissance for the Anti-Access /Area Denial environment

    DTIC Science & Technology

    2015-04-01

    locations, the Air Force Distributed Common Ground System ( DCGS ) collects, processes, analyzes, and disseminates over 1.3 million megabits of... DCGS ; satellite data link between the aircraft and ground based receiver; and fiber- optic connection between the receiver, RPA crew, and DCGS . This...analysts and end users. DCGS Integration The Air Force global ISR enterprise is not configured to efficiently receive, exploit, or disseminate fighter

  11. Rotorcraft digital advanced avionics system (RODAAS) functional description

    NASA Technical Reports Server (NTRS)

    Peterson, E. M.; Bailey, J.; Mcmanus, T. J.

    1985-01-01

    A functional design of a rotorcraft digital advanced avionics system (RODAAS) to transfer the technology developed for general aviation in the Demonstration Advanced Avionics System (DAAS) program to rotorcraft operation was undertaken. The objective was to develop an integrated avionics system design that enhances rotorcraft single pilot IFR operations without increasing the required pilot training/experience by exploiting advanced technology in computers, busing, displays and integrated systems design. A key element of the avionics system is the functionally distributed architecture that has the potential for high reliability with low weight, power and cost. A functional description of the RODAAS hardware and software functions is presented.

  12. A versatile fibre optic sensor interrogation system for the Ariane Launcher based on an electro-optically tuneable laser diode

    NASA Astrophysics Data System (ADS)

    Plattner, M. P.; Hirth, F.; Müller, M. S.; Hoffmann, L.; Buck, T. C.; Koch, A. W.

    2017-11-01

    Availability of reliable flight sensor data and knowledge of the structural behaviour are essential for safe operation of the Ariane launcher. The Ariane launcher is currently monitored by hundreds of electric sensors during test and qualification. Fibre optic sensors are regarded as a potential technique to overcome limitations of recent monitoring systems for the Ariane launcher [1]. These limitations include the cumbersome application of sensors and harness as well as a very limited degree of distributed sensing capability. However, in order to exploit the various advantages of fibre optic sensors (high degree of multiplexing, distributed sensing capability, lower mass impact, etc.), dedicated measurement systems have to be developed and investigated. State-of-the-art fibre optic measurement systems often use free-beam setups, making them bulky and sensitive to vibration. Therefore a new measurement system is developed as part of the ESA study [2].

  13. Insecurity of Detector-Device-Independent Quantum Key Distribution.

    PubMed

    Sajeed, Shihan; Huang, Anqi; Sun, Shihai; Xu, Feihu; Makarov, Vadim; Curty, Marcos

    2016-12-16

    Detector-device-independent quantum key distribution (DDI-QKD) held the promise of being robust to detector side channels, a major security loophole in quantum key distribution (QKD) implementations. In contrast to what has been claimed, however, we demonstrate that the security of DDI-QKD is not based on postselected entanglement, and we introduce various eavesdropping strategies that show that DDI-QKD is in fact insecure against detector side-channel attacks as well as against other attacks that exploit device imperfections of the receiver. Our attacks are valid even when the QKD apparatuses are built by the legitimate users of the system themselves, and are thus free of malicious modifications, which is a key assumption in DDI-QKD.

  14. Transit-time and age distributions for nonlinear time-dependent compartmental systems.

    PubMed

    Metzler, Holger; Müller, Markus; Sierra, Carlos A

    2018-02-06

    Many processes in nature are modeled using compartmental systems (reservoir/pool/box systems). Usually, they are expressed as a set of first-order differential equations describing the transfer of matter across a network of compartments. The concepts of the age of matter in compartments and the time required for particles to transit the system are important diagnostics of these models, with applications to a wide range of scientific questions. Until now, explicit formulas for transit-time and age distributions of nonlinear time-dependent compartmental systems were not available. We compute densities for these types of systems under the assumption of well-mixed compartments. Assuming that a solution of the nonlinear system is available at least numerically, we show how to construct a linear time-dependent system with the same solution trajectory. We demonstrate how to exploit this solution to compute transit-time and age distributions as functions of given start values and initial age distributions. Furthermore, we derive equations for the time evolution of quantiles and moments of the age distributions. Our results generalize available density formulas for the linear time-independent case and mean-age formulas for the linear time-dependent case. As an example, we apply our formulas to a nonlinear and a linear version of a simple global carbon cycle model driven by a time-dependent input signal which represents fossil fuel additions. We derive time-dependent age distributions for all compartments and calculate the time it takes to remove fossil carbon in a business-as-usual scenario.
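    For the linear time-independent special case that this record generalizes, the mean transit time and mean system age have well-known closed forms in terms of the compartmental matrix B and input vector u. The two-pool system below is an illustrative example, not the paper's carbon-cycle model.

    ```python
    import numpy as np

    # Linear autonomous compartmental system dx/dt = B x + u (two pools).
    B = np.array([[-1.0,  0.0],
                  [ 0.5, -0.2]])     # pool 1 decays; half its outflux feeds pool 2
    u = np.array([1.0, 0.0])         # external input enters pool 1 only

    x_star = -np.linalg.solve(B, u)            # steady-state pool contents
    mean_transit = x_star.sum() / u.sum()      # mean transit time through the system
    mean_age = (-np.linalg.solve(B, x_star)).sum() / x_star.sum()  # mean system age

    print(x_star, round(mean_transit, 3), round(mean_age, 3))
    ```

    Sanity check on the transit time: half the input leaves after pool 1 (mean residence 1), half continues through pool 2 (mean 1 + 5), giving 0.5·1 + 0.5·6 = 3.5, as computed.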

  15. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  16. Simulation of Stochastic Processes by Coupled ODE-PDE

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2008-01-01

    A document discusses the emergence of randomness in solutions of coupled, fully deterministic ODE-PDE (ordinary differential equations-partial differential equations) systems due to failure of the Lipschitz condition, as a new phenomenon. It is possible to exploit the special properties of ordinary differential equations (represented by an arbitrarily chosen dynamical system) coupled with the corresponding Liouville equations (used to describe the evolution of initial uncertainties in terms of a joint probability distribution) in order to simulate stochastic processes with prescribed probability distributions. The important advantage of the proposed approach is that the simulation does not require a random-number generator.
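    The ODE-Liouville correspondence can be illustrated by the method of characteristics: propagating an ensemble of sampled initial conditions through the deterministic flow reproduces the density that the Liouville equation transports. The linear drift below is an illustrative stand-in for the arbitrarily chosen dynamical system (note the sampling here uses an ordinary random-number generator, unlike the paper's construction).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Deterministic drift dx/dt = -x; the Liouville equation transports an
    # initial density rho0 along these trajectories (method of characteristics).
    t = 1.0
    x0 = rng.normal(0.0, 1.0, 100_000)   # samples of the initial uncertainty
    xt = x0 * np.exp(-t)                 # exact flow map of the ODE at time t

    # Liouville prediction for a Gaussian rho0: N(0, e^{-2t}).
    print(round(xt.std(), 3), round(np.exp(-t), 3))
    ```

    The ensemble standard deviation matches the analytic e^{-t} contraction, which is exactly what the transported density predicts.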

  17. Exploiting broad-area surface emitting lasers to manifest the path-length distributions of finite-potential quantum billiards.

    PubMed

    Yu, Y T; Tuan, P H; Chang, K C; Hsieh, Y H; Huang, K F; Chen, Y F

    2016-01-11

    Broad-area vertical-cavity surface-emitting lasers (VCSELs) with different cavity sizes are experimentally exploited to manifest the influence of finite confinement strength on the path-length distribution of quantum billiards. The subthreshold emission spectra of the VCSELs are measured and the path-length distributions are obtained by Fourier transform. It is verified that the number of resonant peaks in the path-length distribution decreases as the confinement strength decreases. Theoretical analyses of finite-potential quantum billiards are numerically performed to confirm that the mesoscopic phenomena of quantum billiards with finite confinement strength can be analogously revealed by using broad-area VCSELs.

  18. Dynamic federation of grid and cloud storage

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

    The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities of smoothly evolving the current computing models and of supporting new scientific computing projects that rely on massive distribution of data and that would benefit from systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.

  19. Low-cost distributed solar-thermal-electric power generation

    NASA Astrophysics Data System (ADS)

    Der Minassians, Artin; Aschenbach, Konrad H.; Sanders, Seth R.

    2004-01-01

    Due to their high relative cost, solar electric energy systems have yet to be exploited on a widespread basis. It is believed in the energy community that a technology similar to photovoltaics (PV), but offered at about $1/W, would lead to widespread deployment at residential and commercial sites. This paper addresses the investigation and feasibility study of a low-cost solar thermal electricity generation technology suitable for distributed deployment. Specifically, we discuss a system based on nonimaging solar concentrators integrated with free-piston Stirling engine devices incorporating integrated electric generation. We target concentrator-collector operation at moderate temperatures, in the range of 125°C to 150°C. This temperature is consistent with the use of optical concentrators with concentration ratios on the order of 1-2. These low-ratio concentrators admit wide angles of radiation acceptance and are thus compatible with no diurnal tracking and no, or only a few, seasonal adjustments. Thus, the costs and reliability hazards associated with tracking hardware are avoided. Further, we note that in the intended application there is no shortage of incident solar energy; rather, it is the capital cost of the solar-electric system that is most precious. Thus, we outline a strategy for exploiting solar resources in a cost-constrained manner. The paper outlines design issues and a specific design for an appropriately dimensioned free-piston Stirling engine. Only standard low-cost materials and manufacturing methods are required to realize such a machine.

  20. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively. PMID:24563626

  1. Strong light illumination on gain-switched semiconductor lasers helps the eavesdropper in practical quantum key distribution systems

    NASA Astrophysics Data System (ADS)

    Fei, Yang-yang; Meng, Xiang-dong; Gao, Ming; Yang, Yi; Wang, Hong; Ma, Zhi

    2018-07-01

    The temperature of a semiconductor diode increases under strong light illumination whether or not a thermoelectric cooler is installed, which changes the output wavelength of the laser (Lee et al., 2017). However, other characteristics also vary as temperature increases. These variations may help the eavesdropper in practical quantum key distribution systems. We study the effects of temperature increase on gain-switched semiconductor lasers by simulating temperature-dependent rate equations. The results show that a temperature increase may cause large intensity fluctuations, decrease the output intensity, and render the signal state and decoy state distinguishable. We also propose a modified photon number splitting attack that exploits the effects of temperature increase. Countermeasures are also proposed.

  2. Highly efficient router-based readout algorithm for single-photon-avalanche-diode imagers for time-correlated experiments

    NASA Astrophysics Data System (ADS)

    Cominelli, A.; Acconcia, G.; Caldi, F.; Peronio, P.; Ghioni, M.; Rech, I.

    2018-02-01

    Time-Correlated Single Photon Counting (TCSPC) is a powerful tool that permits recording extremely fast optical signals with a precision down to a few picoseconds. On the other hand, it is recognized as a relatively slow technique, especially when a large time-resolved image is acquired by exploiting a single acquisition channel and a scanning system. During the last years, much effort has been made towards the parallelization of many acquisition and conversion chains. In particular, the exploitation of Single-Photon Avalanche Diodes in standard CMOS technology has paved the way to the integration of thousands of independent channels on the same chip. Unfortunately, the presence of a large number of detectors can give rise to a huge rate of events, which can easily lead to saturation of the transfer rate toward the elaboration unit. As a result, a smart readout approach is needed to guarantee efficient exploitation of the limited transfer bandwidth. We recently introduced a novel readout architecture aimed at maximizing the counting efficiency of the system in typical TCSPC measurements. It features a limited number of high-performance converters, which are shared with a much larger array, while a smart routing logic provides dynamic multiplexing between the two parts. Here we propose a novel routing algorithm, which exploits standard digital gates distributed among a large 32x32 array to ensure a dynamic connection between detectors and external time-measurement circuits.

  3. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature

    PubMed Central

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from the biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches such as the linear kernel, tree kernel, graph kernel and combinations of multiple kernels have achieved promising results in the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction which exploits both syntactic (structural) and semantic vector information, known as the Distributed Smoothed Tree Kernel (DSTK). The DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a composition of a feature-based kernel and the DSTK was combined using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better F-score on all five corpora compared to other state-of-the-art systems. PMID:29099838
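    The idea of composing a structural kernel with a distributional-semantic one can be sketched as a convex combination of two similarity functions: one over syntactic production rules, one over embedding vectors. The toy rule sets, vectors, and mixing weight below are illustrative assumptions, not the paper's actual DSTK.

    ```python
    import numpy as np

    def semantic_kernel(u, v):
        """Cosine similarity of distributional embedding vectors."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    def structural_kernel(a, b):
        """Toy syntactic kernel: Jaccard overlap of production-rule sets."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b)

    def combined_kernel(pair1, pair2, alpha=0.5):
        """Convex combination of a structural and a semantic kernel, in the
        spirit of composing a feature-based kernel with a DSTK-like kernel.
        A sum of positive semi-definite kernels is itself a valid kernel."""
        (rules1, vec1), (rules2, vec2) = pair1, pair2
        return alpha * structural_kernel(rules1, rules2) + \
               (1 - alpha) * semantic_kernel(vec1, vec2)

    s1 = ({"S->NP VP", "VP->V NP"}, np.array([1.0, 0.0, 1.0]))
    s2 = ({"S->NP VP", "VP->V PP"}, np.array([1.0, 0.5, 1.0]))
    print(round(combined_kernel(s1, s2), 3))
    ```

    A Gram matrix built from such a combined kernel could then be fed to any kernelized classifier, as the ensemble SVM does in the system described above.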

  4. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature.

    PubMed

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches such as the linear kernel, tree kernel, graph kernel and combinations of multiple kernels have achieved promising results in the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction, known as the Distributed Smoothed Tree Kernel (DSTK), which exploits both syntactic (structural) and semantic vector information. The DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a feature-based kernel and the DSTK were combined using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better F-score on all five corpora compared to other state-of-the-art systems.
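    Combining a structural kernel with a feature-based kernel, as described in this record, generally reduces to blending two Gram matrices before training. The sketch below is a generic illustration of that step, with a toy kernel-space nearest-neighbour standing in for the ensemble SVM; it is not the authors' DSTK implementation.

```python
def combine_kernels(k1, k2, alpha=0.5):
    """Convex combination of two precomputed kernel (Gram) matrices.
    A weighted sum of positive semi-definite kernels is itself a valid
    kernel, which is the usual way to blend a structural (tree) kernel
    with a feature-based one before training an SVM."""
    n = len(k1)
    return [[alpha * k1[i][j] + (1 - alpha) * k2[i][j] for j in range(n)]
            for i in range(n)]

def kernel_nn_predict(K, labels, test_idx, train_idx):
    """1-nearest-neighbour in the kernel-induced metric
    d(i, j)^2 = K[i][i] + K[j][j] - 2 * K[i][j]."""
    best_label, best_d = None, float("inf")
    for j in train_idx:
        d = K[test_idx][test_idx] + K[j][j] - 2 * K[test_idx][j]
        if d < best_d:
            best_label, best_d = labels[j], d
    return best_label
```

    In practice the combined matrix would be passed to an SVM that accepts a precomputed kernel; the 1-NN here merely keeps the sketch self-contained.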

  5. Defense Science Board Task Force Report: The Role of Autonomy in DoD Systems

    DTIC Science & Technology

    2012-07-01

    ASD(R&E) and the Military Services should schedule periodic, on-site collaborations that bring together academia, government and not-for-profit labs...expressing UxV activities, increased problem solving, planning and scheduling capabilities to enable dynamic tasking of distributed UxVs and tools for...industrial, governmental and military. Manufacturing has long exploited planning for logistics and matching product demand to production schedules

  6. Attacks exploiting deviation of mean photon number in quantum key distribution and coin tossing

    NASA Astrophysics Data System (ADS)

    Sajeed, Shihan; Radchenko, Igor; Kaiser, Sarah; Bourgoin, Jean-Philippe; Pappa, Anna; Monat, Laurent; Legré, Matthieu; Makarov, Vadim

    2015-03-01

    The security of quantum communication using a weak coherent source requires an accurate knowledge of the source's mean photon number. Finite calibration precision or an active manipulation by an attacker may cause the actual emitted photon number to deviate from the known value. We model effects of this deviation on the security of three quantum communication protocols: the Bennett-Brassard 1984 (BB84) quantum key distribution (QKD) protocol without decoy states, Scarani-Acín-Ribordy-Gisin 2004 (SARG04) QKD protocol, and a coin-tossing protocol. For QKD we model both a strong attack using technology possible in principle and a realistic attack bounded by today's technology. To maintain the mean photon number in two-way systems, such as plug-and-play and relativistic quantum cryptography schemes, bright pulse energy incoming from the communication channel must be monitored. Implementation of a monitoring detector has largely been ignored so far, except for ID Quantique's commercial QKD system Clavis2. We scrutinize this implementation for security problems and show that designing a hack-proof pulse-energy-measuring detector is far from trivial. Indeed, the first implementation has three serious flaws confirmed experimentally, each of which may be exploited in a cleverly constructed Trojan-horse attack. We discuss requirements for a loophole-free implementation of the monitoring detector.
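    The reason a deviation of the mean photon number matters can be seen from Poissonian source statistics: the multi-photon fraction available to an attacker grows roughly quadratically with the mean photon number μ. A minimal illustration using standard weak-coherent-pulse statistics (not the paper's full security model):

```python
import math

def multiphoton_prob(mu):
    """P(n >= 2) for a Poissonian weak coherent pulse with mean photon
    number mu: 1 - P(0) - P(1) = 1 - exp(-mu) * (1 + mu)."""
    return 1.0 - math.exp(-mu) * (1.0 + mu)

# A 20% upward deviation of mu raises the multi-photon fraction by
# roughly 40% in the small-mu regime (since P(n >= 2) ~ mu^2 / 2),
# which is why the emitted mean photon number must be known accurately.
p_nominal = multiphoton_prob(0.10)
p_deviated = multiphoton_prob(0.12)
```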

  7. Distribution of high-dimensional entanglement via an intra-city free-space link

    PubMed Central

    Steinlechner, Fabian; Ecker, Sebastian; Fink, Matthias; Liu, Bo; Bavaresco, Jessica; Huber, Marcus; Scheidl, Thomas; Ursin, Rupert

    2017-01-01

    Quantum entanglement is a fundamental resource in quantum information processing and its distribution between distant parties is a key challenge in quantum communications. Increasing the dimensionality of entanglement has been shown to improve robustness and channel capacities in secure quantum communications. Here we report on the distribution of genuine high-dimensional entanglement via a 1.2-km-long free-space link across Vienna. We exploit hyperentanglement, that is, simultaneous entanglement in polarization and energy-time bases, to encode quantum information, and observe high-visibility interference for successive correlation measurements in each degree of freedom. These visibilities impose lower bounds on entanglement in each subspace individually and certify four-dimensional entanglement for the hyperentangled system. The high-fidelity transmission of high-dimensional entanglement under real-world atmospheric link conditions represents an important step towards long-distance quantum communications with more complex quantum systems and the implementation of advanced quantum experiments with satellite links. PMID:28737168

  8. Distribution of high-dimensional entanglement via an intra-city free-space link.

    PubMed

    Steinlechner, Fabian; Ecker, Sebastian; Fink, Matthias; Liu, Bo; Bavaresco, Jessica; Huber, Marcus; Scheidl, Thomas; Ursin, Rupert

    2017-07-24

    Quantum entanglement is a fundamental resource in quantum information processing and its distribution between distant parties is a key challenge in quantum communications. Increasing the dimensionality of entanglement has been shown to improve robustness and channel capacities in secure quantum communications. Here we report on the distribution of genuine high-dimensional entanglement via a 1.2-km-long free-space link across Vienna. We exploit hyperentanglement, that is, simultaneous entanglement in polarization and energy-time bases, to encode quantum information, and observe high-visibility interference for successive correlation measurements in each degree of freedom. These visibilities impose lower bounds on entanglement in each subspace individually and certify four-dimensional entanglement for the hyperentangled system. The high-fidelity transmission of high-dimensional entanglement under real-world atmospheric link conditions represents an important step towards long-distance quantum communications with more complex quantum systems and the implementation of advanced quantum experiments with satellite links.

  9. The exploitation argument against commercial surrogacy.

    PubMed

    Wilkinson, Stephen

    2003-04-01

    This paper discusses the exploitation argument against commercial surrogacy: the claim that commercial surrogacy is morally objectionable because it is exploitative. The following questions are addressed. First, what exactly does the exploitation argument amount to? Second, is commercial surrogacy in fact exploitative? Third, if it were exploitative, would this provide a sufficient reason to prohibit (or otherwise legislatively discourage) it? The focus throughout is on the exploitation of paid surrogates, although it is noted that other parties (e.g. 'commissioning parents') may also be the victims of exploitation. It is argued that there are good reasons for believing that commercial surrogacy is often exploitative. However, even if we accept this, the exploitation argument for prohibiting (or otherwise legislatively discouraging) commercial surrogacy remains quite weak. One reason for this is that prohibition may well 'backfire' and lead to potential surrogates having to do other things that are more exploitative and/or more harmful than paid surrogacy. It is concluded therefore that those who oppose exploitation should (rather than attempting to stop particular practices like commercial surrogacy) concentrate on: (a) improving the conditions under which paid surrogates 'work'; and (b) changing the background conditions (in particular, the unequal distribution of power and wealth) which generate exploitative relationships.

  10. On a simple attack, limiting the range transmission of secret keys in a system of quantum cryptography based on coding in a sub-carrier frequency

    NASA Astrophysics Data System (ADS)

    Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.; Potapova, T. A.

    2017-03-01

    In the paper by Gleim et al (2016 Opt. Express 24 2619), it was claimed that a quantum cryptography system exploiting the quantum key distribution (QKD) protocol BB84 with an additional reference state and encoding on a sub-carrier frequency is able to distribute secret keys over a distance of 210 km. The following shows that a simple attack realized with a beam splitter results in a loss of privacy of the keys over substantially smaller distances. It turns out that the actual secret key transmission range for the QKD system encoding on the sub-carrier frequency is ten times less than that declared in Gleim et al (2016 Opt. Express 24 2619). It is therefore unsafe to use keys distributed over a communication channel longer than the limit shown below. The maximum communication distance does not exceed 22 km, even in the most optimistic scenario.
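    The distance argument ultimately rests on exponential channel loss. The helper below sketches that loss budget for standard telecom fiber; the 0.2 dB/km attenuation figure is a typical assumption, not taken from the paper, and the actual beam-splitting security analysis is protocol-dependent.

```python
def channel_transmission(length_km, loss_db_per_km=0.2):
    """Fraction of photons surviving a fiber channel of the given length,
    for a nominal attenuation of 0.2 dB/km."""
    return 10.0 ** (-loss_db_per_km * length_km / 10.0)

# At 22 km roughly 36% of the light survives; at the claimed 210 km
# only about 6e-5 does, so an eavesdropper splitting off the 'lost'
# fraction of the signal holds nearly all of it.
t_22 = channel_transmission(22)
t_210 = channel_transmission(210)
```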

  11. Monte Carlo sampling in diffusive dynamical systems

    NASA Astrophysics Data System (ADS)

    Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.

    2018-05-01

    We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
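    The importance-sampling idea, favoring trajectories in the tail of the displacement distribution via Metropolis-Hastings over initial conditions, can be sketched as follows. The logistic-map observable and the proposal parameters are illustrative stand-ins for the paper's box-map and Lorentz-gas systems.

```python
import math
import random

def displacement(x0, n_steps=50):
    """Toy observable: accumulated drift of a logistic-map orbit, standing
    in for the displacement of a tracer in a chaotic diffusive system."""
    x, total = x0, 0.0
    for _ in range(n_steps):
        x = 3.9 * x * (1.0 - x)
        total += x - 0.5
    return total

def mh_sample_tails(n_samples, beta=1.0, step=0.05, seed=1):
    """Metropolis-Hastings over initial conditions with target weight
    exp(beta * |displacement|), biasing the chain toward the tails of the
    displacement distribution (the importance-sampling idea). The clipped
    Gaussian proposal makes this a toy sketch, not an exact sampler."""
    rng = random.Random(seed)
    x = rng.random()
    w = abs(displacement(x))
    samples = []
    for _ in range(n_samples):
        y = min(max(x + rng.gauss(0.0, step), 1e-9), 1.0 - 1e-9)
        wy = abs(displacement(y))
        # Accept with probability min(1, exp(beta * (wy - w))).
        if math.log(rng.random() + 1e-300) < beta * (wy - w):
            x, w = y, wy
        samples.append(x)
    return samples
```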

  12. TALON - The Telescope Alert Operation Network System : intelligent linking of distributed autonomous robotic telescopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, R. R.; Wren, J.; Davis, H. R.

    2004-01-01

    The internet has brought about great change in the astronomical community, but this interconnectivity is just starting to be exploited for use in instrumentation. Utilizing the internet for communicating between distributed astronomical systems is still in its infancy, but it already shows great potential. Here we present an example of a distributed network of telescopes that performs more efficiently in synchronous operation than as individual instruments. RAPid Telescopes for Optical Response (RAPTOR) is a system of telescopes at LANL that has intelligent intercommunication, combined with wide-field optics, temporal monitoring software, and deep-field follow-up capability, all working in closed-loop real-time operation. The Telescope ALert Operations Network (TALON) is a network server that allows intercommunication of alert triggers from external and internal resources and controls the distribution of these to each of the telescopes on the network. TALON is designed to grow, allowing any number of telescopes to be linked together and communicate. Coupled with an intelligent alert client at each telescope, it can analyze and respond to each distributed TALON alert based on the telescope's needs and schedule.

  13. A QoS adaptive multimedia transport system: design, implementation and experiences

    NASA Astrophysics Data System (ADS)

    Campbell, Andrew; Coulson, Geoff

    1997-03-01

    The long-awaited 'new environment' of high-speed broadband networks and multimedia applications is fast becoming a reality. However, few systems in existence today, whether they be large-scale pilots or small-scale test-beds in research laboratories, offer a fully integrated and flexible environment where multimedia applications can maximally exploit the quality of service (QoS) capabilities of supporting networks and end-systems. In this paper we describe the implementation of an adaptive transport system that incorporates a QoS-oriented API and a range of mechanisms to assist applications in exploiting QoS and adapting to fluctuations in QoS. The system, which is an instantiation of the Lancaster QoS Architecture, is implemented in a multi-ATM-switch network environment with Linux-based PC end systems and continuous media file servers. A performance evaluation of the system configured to support a video-on-demand application scenario is presented and discussed. Emphasis is placed on novel features of the system and on their integration into a complete prototype. The most prominent novelty of our design is a 'distributed QoS adaptation' scheme which allows applications to delegate to the system responsibility for augmenting and reducing the perceptual quality of video and audio flows when resource availability increases or decreases.
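    The 'distributed QoS adaptation' idea, where the system itself raises or lowers perceptual quality as resources fluctuate, reduces to a small control rule. The encoding levels and one-step-per-round hysteresis below are illustrative choices, not the Lancaster QoS Architecture's actual mechanism.

```python
def adapt_quality(current_kbps, available_kbps, levels=(64, 128, 256, 512, 1024)):
    """Pick the next encoding bitrate for a flow: step quality up when the
    measured bandwidth supports the next level, step it down under
    congestion, and move at most one level per round to damp oscillation."""
    idx = levels.index(current_kbps)
    if idx + 1 < len(levels) and available_kbps >= levels[idx + 1]:
        return levels[idx + 1]  # headroom available: raise quality
    if available_kbps < current_kbps and idx > 0:
        return levels[idx - 1]  # congestion: lower quality
    return current_kbps         # hold the current level
```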

  14. Hillslope characterization: Identifying key controls on local-scale plant communities' distribution using remote sensing and subsurface data fusion.

    NASA Astrophysics Data System (ADS)

    Falco, N.; Wainwright, H. M.; Dafflon, B.; Leger, E.; Peterson, J.; Steltzer, H.; Wilmer, C.; Williams, K. H.; Hubbard, S. S.

    2017-12-01

    Mountainous watershed systems are characterized by extreme heterogeneity in hydrological and pedological properties that influence biotic activities, plant communities and their dynamics. To gain a predictive understanding of how ecosystems and watershed systems evolve under climate change, it is critical to capture such heterogeneity and to quantify the effect of key environmental variables such as topography and soil properties. In this study, we exploit advanced geophysical and remote sensing techniques, coupled with machine learning, to better characterize and quantify the interactions between plant communities' distribution and subsurface properties. First, we developed a remote sensing data fusion framework based on the random forest (RF) classification algorithm to estimate the spatial distribution of plant communities. The framework allows the integration of both plant spectral and structural information, which are derived from multispectral satellite images and airborne LiDAR data. We then use the RF method to evaluate the estimated plant community map, exploiting subsurface properties (such as bedrock depth and soil moisture) and geomorphological parameters (such as slope and curvature) as predictors. Datasets include high-resolution geophysical data (electrical resistivity tomography) and LiDAR digital elevation maps. We demonstrate our approach on a mountain hillslope and meadow within the East River watershed in Colorado, which is considered to be a representative headwater catchment in the Upper Colorado Basin. The obtained results show the existence of co-evolution between above- and below-ground processes, in particular dominant shrub communities in wet and flat areas. We show that successful integration of remote sensing data with geophysical measurements allows identifying and quantifying the key environmental controls on plant communities' distribution, and provides insights into their potential changes under future climate conditions.

  15. Architectures for Parafermionic Topological Matter in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Burrello, Michele; van Heck, Bernard; Cobanera, Emilio

    2013-03-01

    Recent proposals exploit edge modes of fractional topological insulators (FTIs), induced superconducting pairing, and back-scattering to realize one-dimensional systems of parafermions. We extend these proposals to two dimensions and analyze the effect of the superconducting islands' charging energy on the parafermions they host. We focus on two two-dimensional architectures, the tile and stripe configurations, characterized by different distributions of FTIs, and derive the associated parafermionic effective Hamiltonians. The tile model realizes the Z2m toric code in low-order perturbation theory and hence possesses full topological quantum order. By exploiting dualities, we obtain the phase diagram and generalized order parameters for both the tile and stripe models of parafermions. This work was supported by the Dutch Science Foundation NWO/FOM and an ERC Advanced Investigator grant.

  16. Autonomic Management in a Distributed Storage System

    NASA Astrophysics Data System (ADS)

    Tauber, Markus

    2010-07-01

    This thesis investigates the application of autonomic management to a distributed storage system. Effects on performance and resource consumption were measured in experiments, which were carried out in a local area test-bed. The experiments were conducted with components of one specific distributed storage system, but seek to be applicable to a wide range of such systems, in particular those exposed to varying conditions. The perceived characteristics of distributed storage systems depend on their configuration parameters and on various dynamic conditions. For a given set of conditions, one specific configuration may be better than another with respect to measures such as resource consumption and performance. Here, configuration parameter values were set dynamically and the results compared with a static configuration. It was hypothesised that under non-changing conditions this would allow the system to converge on a configuration that was more suitable than any that could be set a priori. Furthermore, the system could react to a change in conditions by adopting a more appropriate configuration. Autonomic management was applied to the peer-to-peer (P2P) and data retrieval components of ASA, a distributed storage system. The effects were measured experimentally for various workload and churn patterns. The management policies and mechanisms were implemented using a generic autonomic management framework developed during this work. The experimental evaluations of autonomic management show promising results, and suggest several future research topics. The findings of this thesis could be exploited in building other distributed storage systems that focus on harnessing storage on user workstations, since these are particularly likely to be exposed to varying, unpredictable conditions.
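    The core autonomic-management loop described here, dynamically adjusting a configuration parameter so the system converges under static conditions, can be sketched as a simple feedback tuner. The hill-climbing policy below is a hypothetical stand-in for the thesis's management framework.

```python
def autonomic_tune(measure, initial, step, rounds=20):
    """Minimal autonomic-management loop: perturb one configuration
    parameter, keep the change when the measured cost improves, otherwise
    reverse direction and shrink the step. Converges under static
    conditions and re-adapts when the measured conditions change."""
    value = initial
    cost = measure(value)
    direction = 1.0
    for _ in range(rounds):
        candidate = value + direction * step
        candidate_cost = measure(candidate)
        if candidate_cost < cost:
            value, cost = candidate, candidate_cost
        else:
            direction = -direction
            step *= 0.5
    return value

# E.g. tuning a hypothetical refresh interval whose cost is minimised at 7 s:
tuned = autonomic_tune(lambda v: (v - 7.0) ** 2, initial=0.0, step=4.0)
```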

  17. Design and Implementation of High Interaction Client Honeypot for Drive-by-Download Attacks

    NASA Astrophysics Data System (ADS)

    Akiyama, Mitsuaki; Iwamura, Makoto; Kawakoya, Yuhei; Aoki, Kazufumi; Itoh, Mitsutaka

    Nowadays, the number of web-browser targeted attacks that lead users to adversaries' web sites and exploit web browser vulnerabilities is increasing, and a clarification of their methods and countermeasures is urgently needed. In this paper, we introduce the design and implementation of a new client honeypot for drive-by-download attacks that has the capacity to detect and investigate a variety of malicious web sites. On the basis of the problems of existing client honeypots, we enumerate the requirements of a client honeypot: 1) detection accuracy and variety, 2) collection variety, 3) performance efficiency, and 4) safety and stability. We improve our system with regard to these requirements. The key features of our developed system are stepwise detection focusing on exploit phases, multiple crawler processing, tracking of malware distribution networks, and malware infection prevention. Our evaluation of our developed system in a laboratory experiment and field experiment indicated that its detection variety and crawling performance are higher than those of existing client honeypots. In addition, our system is able to collect information for countermeasures and is secure and stable for continuous operation. We conclude that our system can investigate malicious web sites comprehensively and support countermeasures.

  18. Arc Fault Detection & Localization by Electromagnetic-Acoustic Remote Sensing

    NASA Astrophysics Data System (ADS)

    Vasile, C.; Ioana, C.

    2017-05-01

    Electrical arc faults that occur in photovoltaic systems represent a danger due to their economic impact on production and distribution. In this paper we propose a complete system, with focus on the methodology, that enables the detection and localization of arc faults through the use of an electromagnetic-acoustic sensing system. By exploiting the multiple emissions of the arc fault, in conjunction with a real-time signal processing method for detection, we ensure accurate detection and localization. In its final form, this work will describe the complete system, the methods employed, and its results and performance in greater detail, alongside further work to be carried out.

  19. Exploiting the flexibility of a family of models for taxation and redistribution

    NASA Astrophysics Data System (ADS)

    Bertotti, M. L.; Modanese, G.

    2012-08-01

    We discuss a family of models expressed by nonlinear differential equation systems describing closed market societies in the presence of taxation and redistribution. We focus in particular on three example models obtained in correspondence to different parameter choices. We analyse the influence of the various choices on the long time shape of the income distribution. Several simulations suggest that behavioral heterogeneity among the individuals plays a definite role in the formation of fat tails of the asymptotic stationary distributions. This is in agreement with results found with different approaches and techniques. We also show that an excellent fit for the computational outputs of our models is provided by the κ-generalized distribution introduced by Kaniadakis in [Physica A 296, 405 (2001)].
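    Models of this family are nonlinear differential-equation systems; a cruder but related illustration of how taxation and redistribution shape a wealth distribution is an agent-based exchange toy like the one below. The trading and flat-tax rules are illustrative assumptions, not the authors' equations.

```python
import random

def simulate_economy(n_agents=200, n_steps=5000, tax=0.2, seed=3):
    """Toy exchange economy: each step two random agents trade a random
    fraction of the poorer agent's wealth; a flat tax on the transfer is
    redistributed equally to all agents. Total wealth is conserved, and
    the tax rate controls how concentrated the long-run distribution gets."""
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(n_steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        amount = rng.random() * min(w[i], w[j])
        levy = tax * amount
        w[i] -= amount
        w[j] += amount - levy
        share = levy / n_agents  # equal redistribution of the levy
        for k in range(n_agents):
            w[k] += share
    return w
```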

  20. Hybrid Communication Architectures for Distributed Smart Grid Applications

    DOE PAGES

    Zhang, Jianhua; Hasandka, Adarsh; Wei, Jin; ...

    2018-04-09

    Wired and wireless communications both play an important role in the blend of communications technologies necessary to enable future smart grid communications. Hybrid networks exploit independent mediums to extend network coverage and improve performance. However, whereas individual technologies have been applied in simulation networks, as far as we know only limited attention has been paid to the development of a suite of hybrid communication simulation models for communications system design. Hybrid simulation models are needed to capture the mixed communication technologies and IP address mechanisms in one simulation. To close this gap, we have developed a suite of hybrid communication system simulation models to validate the critical system design criteria for a distributed solar photovoltaic (PV) communications system, including a single-trip latency of 300 ms, throughput of 9.6 Kbps, and packet loss rate of 1%. In conclusion, the results show that three low-power wireless personal area network (LoWPAN)-based hybrid architectures can satisfy three performance metrics that are critical for distributed energy resource communications.
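    The three design criteria quoted in this record translate directly into a pass/fail check on a simulated link; a minimal sketch (the function name and interface are hypothetical):

```python
def meets_design_criteria(latency_ms, throughput_kbps, loss_rate):
    """Validate a link against the distributed-PV criteria from the
    abstract: single-trip latency <= 300 ms, throughput >= 9.6 Kbps,
    and packet loss rate <= 1%."""
    return (latency_ms <= 300.0
            and throughput_kbps >= 9.6
            and loss_rate <= 0.01)
```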

  1. Hybrid Communication Architectures for Distributed Smart Grid Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jianhua; Hasandka, Adarsh; Wei, Jin

    Wired and wireless communications both play an important role in the blend of communications technologies necessary to enable future smart grid communications. Hybrid networks exploit independent mediums to extend network coverage and improve performance. However, whereas individual technologies have been applied in simulation networks, as far as we know only limited attention has been paid to the development of a suite of hybrid communication simulation models for communications system design. Hybrid simulation models are needed to capture the mixed communication technologies and IP address mechanisms in one simulation. To close this gap, we have developed a suite of hybrid communication system simulation models to validate the critical system design criteria for a distributed solar photovoltaic (PV) communications system, including a single-trip latency of 300 ms, throughput of 9.6 Kbps, and packet loss rate of 1%. In conclusion, the results show that three low-power wireless personal area network (LoWPAN)-based hybrid architectures can satisfy three performance metrics that are critical for distributed energy resource communications.

  2. Evolutionary prisoner's dilemma on Newman-Watts social networks with an asymmetric payoff distribution mechanism

    NASA Astrophysics Data System (ADS)

    Du, Wen-Bo; Cao, Xian-Bin; Yang, Han-Xin; Hu, Mao-Bin

    2010-01-01

    In this paper, we introduce an asymmetric payoff distribution mechanism into the evolutionary prisoner's dilemma game (PDG) on Newman-Watts social networks, and study its effects on the evolution of cooperation. The asymmetric payoff distribution mechanism can be adjusted by the parameter α: if α > 0, the rich exploit the poor to get richer; if α < 0, the rich are forced to offer part of their income to the poor. Numerical results show that the cooperator frequency monotonically increases with α and is remarkably promoted when α > 0. The effects of updating order and self-interaction are also investigated. The co-action of random updating and self-interaction can induce the highest cooperation level. Moreover, we employ the Gini coefficient to investigate the effect of asymmetric payoff distribution on the system's wealth distribution. This work may be helpful for understanding cooperative behaviour and wealth inequality in society.
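    The Gini coefficient used here to quantify wealth inequality has a standard mean-absolute-difference form that is easy to compute directly:

```python
def gini(wealth):
    """Gini coefficient via the mean-absolute-difference form
    G = sum_{i,j} |w_i - w_j| / (2 * n^2 * mean(w)):
    0 for perfect equality, approaching 1 for extreme concentration."""
    n = len(wealth)
    mean = sum(wealth) / n
    diff_sum = sum(abs(a - b) for a in wealth for b in wealth)
    return diff_sum / (2.0 * n * n * mean)
```

    For example, an equal distribution gives G = 0, while concentrating all wealth in one of four agents gives G = 0.75.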

  3. Transport infrastructure surveillance and monitoring by electromagnetic sensing: the ISTIMES project.

    PubMed

    Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédrèric; Cottineau, Louis-Marie; Cuomo, Vincenzo; Della Vecchia, Pietro; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Fédrèric

    2010-01-01

    The ISTIMES project, funded by the European Commission in the frame of a joint Call "ICT and Security" of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optic fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project.

  4. Transport Infrastructure Surveillance and Monitoring by Electromagnetic Sensing: The ISTIMES Project

    PubMed Central

    Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédrèric.; Cottineau, Louis-Marie; Cuomo, Vincenzo; Vecchia, Pietro Della; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Fédrèric

    2010-01-01

    The ISTIMES project, funded by the European Commission in the frame of a joint Call “ICT and Security” of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optic fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project. PMID:22163489

  5. Fishing and temperature effects on the size structure of exploited fish stocks.

    PubMed

    Tu, Chen-Yi; Chen, Kuan-Ting; Hsieh, Chih-Hao

    2018-05-08

    The size structure of a fish stock plays an important role in maintaining the sustainability of the population. The size distribution of an exploited stock is predicted to shift toward small individuals under size-selective fishing and/or warming; however, their relative contributions remain relatively unexplored. In addition, existing analyses of size structure have focused on univariate size-based indicators (SBIs), such as mean length, evenness of size classes, or the upper 95th percentile of the length frequency distribution; these approaches may not capture the full information of size structure. To bridge the gap, we used the variation partitioning approach to examine how the size structure (composition of size classes) responded to fishing, warming and their interaction. We analyzed 28 exploited stocks in the western US, Alaska and the North Sea. Our results show that fishing has the most prominent effect on the size structure of the exploited stocks. In addition, fish stocks that experienced higher variability in fishing are more responsive to temperature effects in their size structure, suggesting that fishing may elevate the sensitivity of exploited stocks in responding to environmental effects. The variation partitioning approach provides complementary information to univariate SBIs in analyzing size structure.

  6. Quantum key distribution with hacking countermeasures and long term field trial.

    PubMed

    Dixon, A R; Dynes, J F; Lucamarini, M; Fröhlich, B; Sharpe, A W; Plews, A; Tam, W; Yuan, Z L; Tanizawa, Y; Sato, H; Kawamura, S; Fujiwara, M; Sasaki, M; Shields, A J

    2017-05-16

    Quantum key distribution's (QKD's) central and unique claim is information-theoretic security. However, there is an increasing understanding that the security of a QKD system relies not only on theoretical security proofs, but also on how closely the physical system matches the theoretical models and prevents attacks due to discrepancies. These side-channel or hacking attacks exploit physical devices which do not necessarily behave precisely as the theory expects. As such, there is a need for QKD systems to be demonstrated to provide security in both the theoretical and physical implementation. We report here a QKD system designed with this goal in mind, providing a more resilient target against possible hacking attacks including Trojan horse, detector blinding, phase randomisation and photon number splitting attacks. The QKD system was installed into a 45 km link of a metropolitan telecom network for a 2.5 month period, during which time the system operated continuously and distributed 1.33 Tbits of secure key data with a stable secure key rate over 200 kbit/s. In addition, security is demonstrated against coherent attacks that are more general than the collective class of attacks usually considered.
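    The reported figures are internally consistent: 1.33 Tbit over the 2.5-month trial averages to roughly 200 kbit/s. A quick sanity check, taking 2.5 months as about 76 days (an assumption for illustration):

```python
def avg_key_rate_kbps(total_bits, days):
    """Average secure key rate in kbit/s over a trial of given length."""
    return total_bits / (days * 86400.0) / 1e3

rate = avg_key_rate_kbps(1.33e12, 76)  # ~2.5 months of continuous operation
```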

  7. Population-based learning of load balancing policies for a distributed computer system

    NASA Technical Reports Server (NTRS)

    Mehra, Pankaj; Wah, Benjamin W.

    1993-01-01

    Effective load-balancing policies use dynamic resource information to schedule tasks in a distributed computer system. We present a novel method for automatically learning such policies. At each site in our system, we use a comparator neural network to predict the relative speedup of an incoming task using only the resource-utilization patterns obtained prior to the task's arrival. Outputs of these comparator networks are broadcast periodically over the distributed system, and the resource schedulers at each site use these values to determine the best site for executing an incoming task. The delays incurred in propagating workload information and tasks from one site to another, as well as the dynamic and unpredictable nature of workloads in multiprogrammed multiprocessors, may cause the workload pattern at the time of execution to differ from patterns prevailing at the times of load-index computation and decision making. Our load-balancing policy accommodates this uncertainty by using certain tunable parameters. We present a population-based machine-learning algorithm that adjusts these parameters in order to achieve high average speedups with respect to local execution. Our results show that our load-balancing policy, when combined with the comparator neural network for workload characterization, is effective in exploiting idle resources in a distributed computer system.
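    The broadcast-and-pick scheduling described above can be sketched as follows. The comparator network itself is not reproduced here; its predicted relative speedups are passed in as plain numbers, and the `tolerance` threshold stands in for the paper's tunable parameters that hedge against stale workload information (all names are hypothetical):

```python
def pick_site(local_site, predicted_speedups, tolerance=0.1):
    """Choose where to run an incoming task.

    predicted_speedups maps each site to its broadcast load-index value
    (a stand-in for the comparator network's predicted speedup relative
    to local execution).  The task stays local unless some remote site
    is better by more than `tolerance`, which hedges against workload
    information that is stale by the time the task actually runs.
    """
    best = max(predicted_speedups, key=predicted_speedups.get)
    if predicted_speedups[best] > predicted_speedups[local_site] + tolerance:
        return best
    return local_site

# Site B's broadcast index clearly beats local execution at A:
choice = pick_site("A", {"A": 1.0, "B": 1.4, "C": 0.8})
```

    Raising the tolerance makes the policy more conservative under uncertain, rapidly changing workloads; tuning such parameters is what the paper's population-based learning automates.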

  8. Exploiting position effects and the gypsy retrovirus insulator to engineer precisely expressed transgenes.

    PubMed

    Markstein, Michele; Pitsouli, Chrysoula; Villalta, Christians; Celniker, Susan E; Perrimon, Norbert

    2008-04-01

    A major obstacle to creating precisely expressed transgenes lies in the epigenetic effects of the host chromatin that surrounds them. Here we present a strategy to overcome this problem, employing a Gal4-inducible luciferase assay to systematically quantify position effects of host chromatin and the ability of insulators to counteract these effects at phiC31 integration loci randomly distributed throughout the Drosophila genome. We identify loci that can be exploited to deliver precise doses of transgene expression to specific tissues. Moreover, we uncover a previously unrecognized property of the gypsy retrovirus insulator to boost gene expression to levels severalfold greater than at most or possibly all un-insulated loci, in every tissue tested. These findings provide the first opportunity to create a battery of transgenes that can be reliably expressed at high levels in virtually any tissue by integration at a single locus, and conversely, to engineer a controlled phenotypic allelic series by exploiting several loci. The generality of our approach makes it adaptable to other model systems to identify and modify loci for optimal transgene expression.

  9. Derived virtual devices: a secure distributed file system mechanism

    NASA Technical Reports Server (NTRS)

    VanMeter, Rodney; Hotz, Steve; Finn, Gregory

    1996-01-01

    This paper presents the design of derived virtual devices (DVDs). DVDs are the mechanism used by the Netstation Project to provide secure shared access to network-attached peripherals distributed in an untrusted network environment. DVDs improve Input/Output efficiency by allowing user processes to perform I/O operations directly from devices without intermediate transfer through the controlling operating system kernel. The security enforced at the device through the DVD mechanism includes resource boundary checking, user authentication, and restricted operations, e.g., read-only access. To illustrate the application of DVDs, we present the interactions between a network-attached disk and a file system designed to exploit the DVD abstraction. We further discuss third-party transfer as a mechanism intended to provide for efficient data transfer in a typical NAP environment. We show how DVDs facilitate third-party transfer, and provide the security required in a more open network environment.

  10. Product Distribution Theory for Control of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Lee, Chia Fan; Wolpert, David H.

    2004-01-01

    Product Distribution (PD) theory is a new framework for controlling Multi-Agent Systems (MAS's). First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the probability distribution of the joint state of the agents. Accordingly we can consider a team game in which the shared utility is a performance measure of the behavior of the MAS. For such a scenario the game is at equilibrium - the Lagrangian is optimized - when the joint distribution of the agents optimizes the system's expected performance. One common way to find that equilibrium is to have each agent run a reinforcement learning algorithm. Here we investigate the alternative of exploiting PD theory to run gradient descent on the Lagrangian. We present computer experiments validating some of the predictions of PD theory for how best to do that gradient descent. We also demonstrate how PD theory can improve performance even when we are not allowed to rerun the MAS from different initial conditions, a requirement implicit in some previous work.

  11. Boldness predicts an individual's position along an exploration-exploitation foraging trade-off.

    PubMed

    Patrick, Samantha C; Pinaud, David; Weimerskirch, Henri

    2017-09-01

    Individuals do not have complete information about the environment, and therefore they face a trade-off between gathering information (exploration) and gathering resources (exploitation). Studies have shown individual differences in components of this trade-off, but how stable these strategies are in a population, and the intrinsic drivers of these differences, are not well understood. Top marine predators are expected to experience a particularly strong trade-off as many species have large foraging ranges and their prey often have a patchy distribution. This environment leads these species to exhibit pronounced exploration and exploitation phases, but differences between individuals are poorly resolved. Personality differences are known to be important in foraging behaviour but also in the trade-off between exploration and exploitation. Here we test whether personality predicts an individual's exploration-exploitation strategy using wide-ranging wandering albatrosses (Diomedea exulans) as a model system. Using GPS tracking data from 276 wandering albatrosses, we extract foraging parameters indicative of exploration (searching) and exploitation (foraging) and show that foraging effort, time in patch and size of patch are strongly correlated, demonstrating these are indicative of an exploration-exploitation (EE) strategy. Furthermore, we show these are consistent within individuals and appear stable in the population, with no reproductive advantage. The searching and foraging behaviour of bolder birds placed them towards the exploration end of the trade-off, whereas shy birds showed greater exploitation. This result provides a mechanism through which individual foraging strategies may emerge.
Age and sex affected components of the trade-off, but not the trade-off itself, suggesting these factors may drive behavioural compensation to maintain resource acquisition; this was supported by the evidence that there were no fitness consequences of any EE trait or of the trade-off itself. These results demonstrate a clear trade-off between information gathering and exploitation of prey patches, and reveal for the first time that boldness may drive these differences. This provides a mechanism through which widely reported links between personality and foraging may emerge. © 2017 The Authors. Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.

  12. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, developing bioinformatics grid applications has been extremely tedious: one must design and implement the component algorithms and parallelization techniques for different classes of problems, and access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  13. Decentralized diagnostics based on a distributed micro-genetic algorithm for transducer networks monitoring large experimental systems.

    PubMed

    Arpaia, P; Cimmino, P; Girone, M; La Commara, G; Maisto, D; Manna, C; Pezzetti, M

    2014-09-01

    The evolutionary approach to centralized multiple-fault diagnostics is extended to distributed transducer networks monitoring large experimental systems. Given a set of anomalies detected by the transducers, each instance of the multiple-fault problem is formulated as several parallel communicating sub-tasks running on different transducers, and thus solved one-by-one in spatially separated parallel processes. A micro-genetic algorithm merges the evaluation-time efficiency arising from a small population distributed on parallel-synchronized processors with the effectiveness of centralized evolutionary techniques due to an optimal mix of exploitation and exploration. In this way, the holistic view and effectiveness advantages of evolutionary global diagnostics are combined with the reliability and efficiency benefits of distributed parallel architectures. The proposed approach was validated both (i) by simulation at CERN, on a case study of a cold box for enhancing the cryogenics diagnostics of the Large Hadron Collider, and (ii) by experiments, under the framework of the industrial research project MONDIEVOB (Building Remote Monitoring and Evolutionary Diagnostics), co-funded by the EU and the company Del Bo srl, Napoli, Italy.

  14. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and output file transfer to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from the 2010 official productions are reported.

  15. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    NASA Astrophysics Data System (ADS)

    Campana, S.; Atlas Collaboration

    2014-06-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R&D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  16. Attacks on practical quantum key distribution systems (and how to prevent them)

    NASA Astrophysics Data System (ADS)

    Jain, Nitin; Stiller, Birgit; Khan, Imran; Elser, Dominique; Marquardt, Christoph; Leuchs, Gerd

    2016-07-01

    With the emergence of an information society, the idea of protecting sensitive data is steadily gaining importance. Conventional encryption methods may not be sufficient to guarantee data protection in the future. Quantum key distribution (QKD) is an emerging technology that exploits fundamental physical properties to guarantee perfect security in theory. However, it is not easy to ensure in practice that the implementations of QKD systems are exactly in line with the theoretical specifications. Such theory-practice deviations can open loopholes and compromise security. Several such loopholes have been discovered and investigated in the last decade. These activities have motivated the proposal and implementation of appropriate countermeasures, thereby preventing future attacks and enhancing the practical security of QKD. This article introduces the so-called field of quantum hacking by summarising a variety of attacks and their prevention mechanisms.

  17. [Applications of GIS in biomass energy source research].

    PubMed

    Su, Xian-Ming; Wang, Wu-Kui; Li, Yi-Wei; Sun, Wen-Xiang; Shi, Hai; Zhang, Da-Hong

    2010-03-01

    Biomass resources are characterized by widespread but dispersed distribution and are closely related to environment, climate, soil, and land use. Geographic information systems (GIS) provide spatial analysis functions and the flexibility to integrate with other application models and algorithms, making them well suited to biomass energy research. This paper summarized research on GIS applications in biomass energy, with a focus on feasibility studies of bioenergy development, assessment of the amount and distribution of biomass resources, layout of biomass exploitation and utilization, evaluation of gaseous emissions from biomass burning, and biomass energy information systems. Three perspectives for GIS applications in biomass energy research were proposed: to enrich the data sources, to improve data processing and decision-support capacity, and to generate online proposals.

  18. Performance analysis of distributed symmetric sparse matrix vector multiplication algorithm for multi-core architectures

    DOE PAGES

    Oryspayev, Dossay; Aktulga, Hasan Metin; Sosonkina, Masha; ...

    2015-07-14

    Sparse matrix-vector multiplication (SpMVM) is an important kernel that frequently arises in high performance computing applications. Due to its low arithmetic intensity, several approaches have been proposed in the literature to improve its scalability and efficiency in large-scale computations. In this paper, our target systems are high-end multi-core architectures and we use a message passing interface + open multiprocessing (MPI+OpenMP) hybrid programming model for parallelism. We analyze the performance of a recently proposed implementation of distributed symmetric SpMVM, originally developed for large sparse symmetric matrices arising in ab initio nuclear structure calculations. We also study important features of this implementation and compare with previously reported implementations that do not exploit the underlying symmetry. Our SpMVM implementations leverage the hybrid paradigm to efficiently overlap expensive communications with computations. Our main comparison criterion is the "CPU core hours" metric, which is the main measure of resource usage on supercomputers. We analyze the effects of a topology-aware mapping heuristic using a simplified network load model. Furthermore, we have tested the different SpMVM implementations on two large clusters with 3D Torus and Dragonfly topology. Our results show that the distributed SpMVM implementation that exploits matrix symmetry and hides communication yields the best value for the "CPU core hours" metric and significantly reduces data movement overheads.
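    The core idea of exploiting symmetry, storing only one triangle and letting each off-diagonal entry contribute to two output rows, can be sketched in a few lines (a serial toy version, without the MPI+OpenMP communication overlap the paper studies):

```python
import numpy as np

def spmv_symmetric(rows, cols, vals, x):
    """y = A @ x for symmetric A, given only entries with row <= col.

    Each stored off-diagonal entry (i, j, v) contributes v*x[j] to y[i]
    and, by symmetry, v*x[i] to y[j]; only about half the nonzeros are
    stored and streamed through memory.
    """
    y = np.zeros_like(x, dtype=float)
    for i, j, v in zip(rows, cols, vals):
        y[i] += v * x[j]
        if i != j:
            y[j] += v * x[i]      # mirrored contribution
    return y

# Upper triangle of the symmetric matrix [[2,1,0],[1,3,4],[0,4,5]]:
rows = [0, 0, 1, 1, 2]
cols = [0, 1, 1, 2, 2]
vals = [2.0, 1.0, 3.0, 4.0, 5.0]
x = np.array([1.0, 2.0, 3.0])
y = spmv_symmetric(rows, cols, vals, x)
```

    The mirrored update is what makes the distributed version harder: the contribution to y[j] may belong to a different process, which is where the communication-hiding techniques come in.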

  19. A broadband multimedia TeleLearning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ruiping; Karmouch, A.

    1996-12-31

    In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.

  20. Interrogation of weak Bragg grating sensors based on dual-wavelength differential detection.

    PubMed

    Cheng, Rui; Xia, Li

    2016-11-15

    It is shown that for weak Bragg gratings the logarithmic ratio of reflected intensities at any two wavelengths within the spectrum follows a linear relationship with the Bragg wavelength shift, with a slope proportional to their wavelength spacing. This finding is exploited to develop a flexible, efficient, and cheap interrogation solution for weak fiber Bragg gratings (FBGs), especially ultra-short FBGs, in distributed sensing based on dual-wavelength differential detection. The concept is experimentally studied in both single and distributed sensing systems with ultra-short FBG sensors. The work may form the basis of new and promising FBG interrogation techniques based on detecting discrete rather than continuous spectra.
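    The claimed linearity is easy to check for an assumed Gaussian reflection profile: the log-ratio reduces to a linear function of the Bragg shift with slope -(lam2 - lam1)/sigma^2, i.e. proportional to the wavelength spacing. A minimal numeric sketch (all wavelengths and widths hypothetical):

```python
import numpy as np

sigma = 0.2        # assumed spectral width of the weak grating, nm

def log_ratio(lam1, lam2, lam_bragg):
    """log R(lam1) - log R(lam2) for a Gaussian reflection profile."""
    R = lambda lam: np.exp(-((lam - lam_bragg) ** 2) / (2 * sigma ** 2))
    return np.log(R(lam1)) - np.log(R(lam2))

# Sweep the Bragg wavelength and fit the log-ratio against the shift:
lam1, lam2 = 1550.0, 1550.1                   # probe wavelengths, nm
shifts = np.linspace(-0.05, 0.05, 11)
vals = np.array([log_ratio(lam1, lam2, 1550.0 + s) for s in shifts])
slope = np.polyfit(shifts, vals, 1)[0]
# For this model the slope is exactly -(lam2 - lam1) / sigma**2,
# i.e. proportional to the wavelength spacing, as the abstract states.
```

    The quadratic terms in the two exponents cancel in the subtraction, which is why only the linear dependence on the shift survives.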

  1. Recent advances in multiview distributed video coding

    NASA Astrophysics Data System (ADS)

    Dufaux, Frederic; Ouaret, Mourad; Ebrahimi, Touradj

    2007-04-01

    We consider dense networks of surveillance cameras capturing overlapped images of the same scene from different viewing directions, such a scenario being referred to as multi-view. Data compression is paramount in such a system due to the large amount of captured data. In this paper, we propose a Multi-view Distributed Video Coding approach. It allows for low complexity / low power consumption at the encoder side, and the exploitation of inter-view correlation without communications among the cameras. We introduce a combination of temporal intra-view side information and homography inter-view side information. Simulation results show both the improvement of the side information, as well as a significant gain in terms of coding efficiency.

  2. Misreporting behaviour in iterated prisoner's dilemma game with combined trust strategy

    NASA Astrophysics Data System (ADS)

    Chen, Bo; Zhang, Bin; Wu, Hua-qing

    2015-01-01

    Effects of agents' misreporting behaviour on system cooperation are studied in a multi-agent iterated prisoner's dilemma game. Agents adopting a combined trust strategy (denoted CTS) are classified into three groups: honest CTS, positive-reporting CTS and negative-reporting CTS. The differences in cooperation frequency and pay-off under three different systems, i.e., a system with only honest CTS, a system with honest CTS and positive-reporting CTS, and a system with honest CTS and negative-reporting CTS, are compared. Furthermore, we also investigate the effects of misreporting behaviour on an exploiter who adopts an exploiting strategy (denoted EXPL) in a system with two CTSs and one EXPL. Finally, numerical simulations are performed to understand the effects of misreporting behaviour on CTS. The results reveal that positive-reporting behaviour can strengthen system cooperation, while negative-reporting behaviour cannot. When an EXPL exists in a system, positive-reporting behaviour helps the exploiter reduce its exploiting cost and encourages agents to adopt the exploiting strategy, but hurts other agents' interests.

  3. Spatio-temporal patterns of key exploited marine species in the Northwestern Mediterranean Sea.

    PubMed

    Morfin, Marie; Fromentin, Jean-Marc; Jadaud, Angélique; Bez, Nicolas

    2012-01-01

    This study analyzes the temporal variability/stability of the spatial distributions of key exploited species in the Gulf of Lions (Northwestern Mediterranean Sea). To do so, we analyzed data from the MEDITS bottom-trawl scientific surveys from 1994 to 2010 at 66 fixed stations and selected 12 key exploited species. We proposed a geostatistical approach to handle zero-inflated and non-stationary distributions and to test for the temporal stability of the spatial structures. Empirical Orthogonal Functions and other descriptors were then applied to investigate the temporal persistence and the characteristics of the spatial patterns. The spatial structure of the distribution (i.e. the pattern of spatial autocorrelation) of the 12 key species studied remained highly stable over the time period sampled. The spatial distributions of all species obtained through kriging also appeared to be stable over time, while each species displayed a specific spatial distribution. Furthermore, adults were generally more densely concentrated than juveniles and occupied areas included in the distribution of juveniles. Despite the strong persistence of spatial distributions, we also observed that the area occupied by each species was correlated to its abundance: the more abundant the species, the larger the occupation area. Such a result tends to support MacCall's basin theory, according to which density-dependence responses would drive the expansion of those 12 key species in the Gulf of Lions. Further analyses showed that these species never saturated their habitats, suggesting that they are below their carrying capacity; an assumption in agreement with the overexploitation of several of these species. Finally, the stability of their spatial distributions over time and their potential ability to diffuse outside their main habitats give support to Marine Protected Areas as a potential pertinent management tool.

  4. Advanced unambiguous state discrimination attack and countermeasure strategy in a practical B92 QKD system

    NASA Astrophysics Data System (ADS)

    Ko, Heasin; Choi, Byung-Seok; Choe, Joong-Seon; Youn, Chun Ju

    2018-01-01

    Even though unconditional security of B92 quantum key distribution (QKD) system is based on the assumption of perfect positive-operator-valued measures, practical B92 systems only utilize two projective measurements. Unfortunately, such implementation may degrade the security of the B92 QKD system due to Eve's potential attack exploiting the imperfection of system. In this paper, we propose an advanced attack strategy with an unambiguous state discrimination (USD) measurement which makes practical B92 QKD systems insecure even under a lossless channel. In addition, we propose an effective countermeasure against the advanced USD attack model by monitoring double-click events. We further address a fundamental approach to make the B92 QKD system tolerable to attack strategies with USD measurements using a multi-qubit scheme.

  5. Decision algorithm for data center vortex beam receiver

    NASA Astrophysics Data System (ADS)

    Kupferman, Judy; Arnon, Shlomi

    2017-12-01

    We present a new scheme for a vortex beam communications system which exploits the radial component p of Laguerre-Gauss modes in addition to the azimuthal component l generally used. We derive a new encoding algorithm which makes use of the spatial distribution of intensity to create an alphabet dictionary for communication. We suggest an application of the scheme as part of an optical wireless link for intra data center communication. We investigate the probability of error in decoding, for several detector options.
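    The multiplicative growth of the alphabet from using the radial index p alongside the azimuthal index l can be sketched directly (the index ranges below are hypothetical, not the paper's):

```python
import math
from itertools import product

# Azimuthal (l) and radial (p) index ranges -- hypothetical choices:
ls = range(-2, 3)       # l in {-2, -1, 0, 1, 2}
ps = range(0, 3)        # p in {0, 1, 2}

# Each symbol of the alphabet is one (l, p) Laguerre-Gauss mode, so the
# alphabet size is the product of the two index ranges rather than just
# the number of l values, and each symbol carries log2(|alphabet|) bits.
alphabet = {symbol: mode for symbol, mode in enumerate(product(ls, ps))}
bits_per_symbol = math.log2(len(alphabet))
```

    With azimuthal indices alone this toy alphabet would carry log2(5) bits per symbol; adding the radial index raises it to log2(15).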

  6. Protecting Cryptographic Keys and Functions from Malware Attacks

    DTIC Science & Technology

    2010-12-01

    ... registers. Modifies RSA private key signing in OpenSSL to use the technique. The resulting system has the following features: 1. No special hardware is ... the above method based on OpenSSL, by exploiting the Streaming SIMD Extension (SSE) XMM registers of modern Intel and AMD x86-compatible CPUs [22 ... one can store a 2048-bit exponent. Our prototype is based on OpenSSL 0.9.8e, the Ubuntu 6.06 Linux distribution with a 2.6.15 kernel, and SSE2, which ...

  7. Wealth redistribution in our small world

    NASA Astrophysics Data System (ADS)

    Iglesias, J. R.; Gonçalves, S.; Pianegonda, S.; Vega, J. L.; Abramson, G.

    2003-09-01

    We present a simplified model for the exploitation of resources by interacting agents in an economy with small-world properties. It is shown that Gaussian distributions of wealth, with a cutoff at a poverty line, are present for all values of the parameters, while the frequency of maxima and minima strongly depends on the connectivity and the disorder of the lattice. Finally, we compare a system where the commercial links are frozen with an economy where agents can choose their commercial partners at each time step.
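    A toy variant of such an exchange model (our own illustrative sketch, not the authors' exact dynamics) shows the two ingredients the abstract highlights: a lattice with shortcuts, and a poverty-line cutoff that wealth cannot cross while the total is conserved:

```python
import random

random.seed(1)

N, poverty_line = 100, 0.1
wealth = [1.0] * N

# Ring lattice plus a few random shortcuts (small-world flavour):
links = [(i, (i + 1) % N) for i in range(N)]
links += [(random.randrange(N), random.randrange(N)) for _ in range(10)]

for _ in range(5000):
    i, j = random.choice(links)
    if i == j:
        continue
    # Only wealth above the poverty line is at stake, so the loser can
    # never be pushed below the cutoff:
    stake = 0.1 * min(wealth[i] - poverty_line, wealth[j] - poverty_line)
    if stake <= 0:
        continue
    winner, loser = (i, j) if random.random() < 0.5 else (j, i)
    wealth[winner] += stake
    wealth[loser] -= stake
```

    Varying the number of shortcut links is the natural knob for studying the connectivity dependence the abstract reports.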

  8. Single-user MIMO versus multi-user MIMO in distributed antenna systems with limited feedback

    NASA Astrophysics Data System (ADS)

    Schwarz, Stefan; Heath, Robert W.; Rupp, Markus

    2013-12-01

    This article investigates the performance of cellular networks employing distributed antennas in addition to the central antennas of the base station. Distributed antennas are likely to be implemented using remote radio units, enabled by a low-latency, high-bandwidth dedicated link to the base station. This facilitates coherent transmission from potentially all available antennas at the same time. Such a distributed antenna system (DAS) is an effective way to deal with path loss and large-scale fading in cellular systems. A DAS can apply precoding across multiple transmission points to implement single-user MIMO (SU-MIMO) and multi-user MIMO (MU-MIMO) transmission. The throughput performance of various SU-MIMO and MU-MIMO transmission strategies is investigated in this article, employing a Long-Term Evolution (LTE) standard compliant simulation framework. The previously theoretically established cell-capacity improvement of MU-MIMO in comparison to SU-MIMO in DASs is confirmed under the practical constraints imposed by the LTE standard, even under the assumption of imperfect channel state information (CSI) at the base station. Because practical systems will use quantized feedback, the performance of different CSI feedback algorithms for DASs is investigated. It is shown that significant gains in CSI quantization accuracy and in the throughput of especially MU-MIMO systems can be achieved with relatively simple quantization codebook constructions that exploit the available temporal correlation and channel gain differences.
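    Limited feedback of the kind discussed above can be sketched in its simplest form: the receiver quantizes its channel vector to the best-matching entry of a shared codebook and feeds back only the index. This sketch uses a random codebook and ignores the temporal-correlation refinements the article evaluates (all sizes hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_csi(h, codebook):
    """Index of the codeword best aligned with channel vector h."""
    return int(np.argmax([abs(np.vdot(h, c)) for c in codebook]))

ntx, bits = 4, 4
# Random unit-norm codebook shared by transmitter and receiver:
codebook = (rng.normal(size=(2 ** bits, ntx))
            + 1j * rng.normal(size=(2 ** bits, ntx)))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

h = rng.normal(size=ntx) + 1j * rng.normal(size=ntx)  # channel realisation
idx = quantize_csi(h, codebook)   # only `bits` bits are fed back
```

    MU-MIMO is more sensitive to this quantization error than SU-MIMO, which is why codebook design matters more there.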

  9. Seasonal source-sink dynamics at the edge of a species' range

    USGS Publications Warehouse

    Kanda, L.L.; Fuller, T.K.; Sievert, P.R.; Kellogg, R.L.

    2009-01-01

    The roles of dispersal and population dynamics in determining species' range boundaries recently have received theoretical attention but little empirical work. Here we provide data on survival, reproduction, and movement for a Virginia opossum (Didelphis virginiana) population at a local distributional edge in central Massachusetts (USA). Most juvenile females that apparently exploited anthropogenic resources survived their first winter, whereas those using adjacent natural resources died of starvation. In spring, adult females recolonized natural areas. A life-table model suggests that a population exploiting anthropogenic resources may grow, acting as a source for a geographically interlaced sink of opossums using only natural resources, and also providing emigrants for further range expansion to new human-dominated landscapes. In a geographical model, this source-sink dynamic is consistent with the local distribution identified through road-kill surveys. The Virginia opossum's exploitation of human resources likely ameliorates energetically restrictive winters and may explain both their local distribution and their northward expansion in unsuitable natural climatic regimes. Landscape heterogeneity, such as that created by urbanization, may result in source-sink dynamics at highly localized scales. Differential fitness and individual dispersal movements within local populations are key to generating regional distributions, and thus species ranges, that exceed expectations. © 2009 by the Ecological Society of America.

  10. The Exploitative Nature of Work-Based Studies: A Sketch of an Idea

    ERIC Educational Resources Information Center

    Gibbs, Paul

    2004-01-01

    This article argues that, if education is considered as a means of increasing human capital, then the potential exists for exploitation of the learners through the inequitable distribution of the value accruing from their research activities. To illustrate the argument, I discuss these issues through the lens of a work-based professional doctorate…

  11. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarje, Abhinav; Jacobsen, Douglas W.; Williams, Samuel W.

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development toward the exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences with it in a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.

  12. Test of mutually unbiased bases for six-dimensional photonic quantum systems

    PubMed Central

    D'Ambrosio, Vincenzo; Cardano, Filippo; Karimi, Ebrahim; Nagali, Eleonora; Santamato, Enrico; Marrucci, Lorenzo; Sciarrino, Fabio

    2013-01-01

    In quantum information, complementarity of quantum mechanical observables plays a key role. The eigenstates of two complementary observables form a pair of mutually unbiased bases (MUBs). More generally, a set of MUBs consists of bases that are all pairwise unbiased. Except for specific dimensions of the Hilbert space, the maximal sets of MUBs are unknown in general. Even for a dimension as low as six, the identification of a maximal set of MUBs remains an open problem, although there is strong numerical evidence that no more than three simultaneous MUBs exist. Here, by exploiting a newly developed holographic technique, we implement and test different sets of three MUBs for a single-photon six-dimensional quantum state (a “qusix”), encoded by exploiting the polarization and orbital angular momentum of photons. A close agreement is observed between theory and experiments. Our results can find applications in state tomography, quantitative wave-particle duality, and quantum key distribution. PMID:24067548

  13. Test of mutually unbiased bases for six-dimensional photonic quantum systems.

    PubMed

    D'Ambrosio, Vincenzo; Cardano, Filippo; Karimi, Ebrahim; Nagali, Eleonora; Santamato, Enrico; Marrucci, Lorenzo; Sciarrino, Fabio

    2013-09-25

    In quantum information, complementarity of quantum mechanical observables plays a key role. The eigenstates of two complementary observables form a pair of mutually unbiased bases (MUBs). More generally, a set of MUBs consists of bases that are all pairwise unbiased. Except for specific dimensions of the Hilbert space, the maximal sets of MUBs are unknown in general. Even for a dimension as low as six, the identification of a maximal set of MUBs remains an open problem, although there is strong numerical evidence that no more than three simultaneous MUBs exist. Here, by exploiting a newly developed holographic technique, we implement and test different sets of three MUBs for a single-photon six-dimensional quantum state (a "qusix"), encoded by exploiting the polarization and orbital angular momentum of photons. A close agreement is observed between theory and experiments. Our results can find applications in state tomography, quantitative wave-particle duality, and quantum key distribution.

  14. Coupling bimolecular PARylation biosensors with genetic screens to identify PARylation targets.

    PubMed

    Krastev, Dragomir B; Pettitt, Stephen J; Campbell, James; Song, Feifei; Tanos, Barbara E; Stoynov, Stoyno S; Ashworth, Alan; Lord, Christopher J

    2018-05-22

    Poly(ADP-ribose)ylation is a dynamic protein modification that regulates multiple cellular processes. Here, we describe a system for identifying and characterizing PARylation events that exploits the ability of a PBZ (PAR-binding zinc finger) protein domain to bind PAR with high affinity. By linking PBZ domains to bimolecular fluorescent complementation biosensors, we developed fluorescent PAR biosensors that allow the detection of temporal and spatial PARylation events in live cells. Exploiting transposon-mediated recombination, we integrate the PAR biosensor en masse into thousands of protein-coding genes in living cells. Using these PAR-biosensor "tagged" cells in a genetic screen, we carry out a large-scale identification of PARylation targets. This identifies CTIF (CBP80/CBP20-dependent translation initiation factor) as a novel PARylation target of the tankyrase enzymes in the centrosomal region of cells, which plays a role in the distribution of the centrosomal satellites.

  15. Exploiting the Adaptation Dynamics to Predict the Distribution of Beneficial Fitness Effects

    PubMed Central

    2016-01-01

    Adaptation of asexual populations is driven by beneficial mutations, and therefore the dynamics of this process, besides other factors, depend on the distribution of beneficial fitness effects. It is known that on uncorrelated fitness landscapes, this distribution can only be of three types: truncated, exponential and power law. We performed extensive stochastic simulations to study the adaptation dynamics on rugged fitness landscapes, and identified two quantities that can be used to distinguish the underlying distribution of beneficial fitness effects. The first quantity studied here is the fitness difference between successive mutations that spread in the population, which is found to decrease in the case of truncated distributions, remain nearly constant for exponentially decaying distributions and increase when the fitness distribution decays as a power law. The second quantity of interest, namely the rate of change of fitness with time, also shows quantitatively different behaviour for different beneficial fitness distributions. The patterns displayed by the two aforementioned quantities are found to hold for both low and high mutation rates. We discuss how these patterns can be exploited to determine the distribution of beneficial fitness effects in microbial experiments. PMID:26990188
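    The diagnostic can be mimicked with a toy adaptive-walk sketch in which each sweep fixes a fitness value drawn from the assumed beneficial-effect distribution conditioned on exceeding the current fitness. This conditional-draw rule and all parameters are our own simplification, not the paper's simulation:

```python
import random

random.seed(7)

def walk(next_fitness, steps=20, f0=1.0):
    """Return the successive fitness increments of an adaptive walk in which
    each substitution draws the next fitness, conditioned on beating f."""
    f, incs = f0, []
    for _ in range(steps):
        g = next_fitness(f)
        incs.append(g - f)
        f = g
    return incs

# Exponential tail: memoryless, so increments stay constant on average.
exp_incs = walk(lambda f: f + random.expovariate(1.0))
# Truncated (uniform up to 2): increments shrink as the bound is approached.
uni_incs = walk(lambda f: f + random.random() * (2.0 - f))
# Power-law (Pareto) tail: conditional draws scale with f, so increments grow.
par_incs = walk(lambda f: f * random.paretovariate(3.0))

def trend(incs):
    """Mean increment of the second half minus that of the first half."""
    h = len(incs) // 2
    return sum(incs[h:]) / (len(incs) - h) - sum(incs[:h]) / h

print(round(trend(exp_incs), 3), round(trend(uni_incs), 3), round(trend(par_incs), 3))
```

    The sign of the trend typically separates the three classes, mirroring the paper's first diagnostic quantity: negative for a truncated distribution, near zero for an exponential tail, positive for a power law.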

  16. Quantitative Mapping of the Spatial Distribution of Nanoparticles in Endo-Lysosomes by Local pH.

    PubMed

    Wang, Jing; MacEwan, Sarah R; Chilkoti, Ashutosh

    2017-02-08

    Understanding the intracellular distribution and trafficking of nanoparticle drug carriers is necessary to elucidate their mechanisms of drug delivery and is helpful in the rational design of novel nanoparticle drug delivery systems. The traditional immunofluorescence method to study intracellular distribution of nanoparticles using organelle-specific antibodies is laborious and subject to artifacts. As an alternative, we developed a new method that exploits ratiometric fluorescence imaging of a pH-sensitive Lysosensor dye to visualize and quantify the spatial distribution of nanoparticles in the endosomes and lysosomes of live cells. Using this method, we compared the endolysosomal distribution of cell-penetrating peptide (CPP)-functionalized micelles to unfunctionalized micelles and found that CPP-functionalized micelles exhibited faster endosome-to-lysosome trafficking than unfunctionalized micelles. Ratiometric fluorescence imaging of pH-sensitive Lysosensor dye allows rapid quantitative mapping of nanoparticle distribution in endolysosomes in live cells while minimizing artifacts caused by extensive sample manipulation typical of alternative approaches. This new method can thus serve as an alternative to traditional immunofluorescence approaches to study the intracellular distribution and trafficking of nanoparticles within endosomes and lysosomes.
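    A minimal sketch of the ratiometric idea: each pixel's two-channel intensity ratio is mapped to pH through a calibration curve and then to a compartment call. The log-linear calibration, pKa value, and pH threshold below are invented for illustration, not the LysoSensor calibration:

```python
import math

def pixel_ph(i_ch1, i_ch2, pka=5.2):
    """Map a two-channel intensity ratio to pH via a log-linear calibration
    (hypothetical curve; a real dye needs an empirical calibration)."""
    ratio = i_ch1 / i_ch2
    return pka + math.log10(ratio)

def classify(ph):
    """Crude compartment call from local pH (threshold is illustrative)."""
    return "lysosome" if ph < 5.0 else "endosome"

pixels = [(40.0, 160.0), (90.0, 110.0), (150.0, 60.0)]  # (ch1, ch2) intensities
for ch1, ch2 in pixels:
    ph = pixel_ph(ch1, ch2)
    print(round(ph, 2), classify(ph))
```

    Because the ratio cancels dye concentration and path length, the same mapping applied pixel by pixel yields a quantitative pH (and hence compartment) map of nanoparticle locations.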

  17. Nautilus at Risk – Estimating Population Size and Demography of Nautilus pompilius

    PubMed Central

    Dunstan, Andrew; Bradshaw, Corey J. A.; Marshall, Justin

    2011-01-01

    The low fecundity, late maturity, long gestation and long life span of Nautilus suggest that this species is vulnerable to over-exploitation. Demand from the ornamental shell trade has contributed to their rapid decline in localized populations. More data from wild populations are needed to design management plans which ensure Nautilus persistence. We used a variety of techniques including capture-mark-recapture, baited remote underwater video systems, ultrasonic telemetry and remotely operated vehicles to estimate population size, growth rates, distribution and demographic characteristics of an unexploited Nautilus pompilius population at Osprey Reef (Coral Sea, Australia). We estimated a small and dispersed population of between 844 and 4467 individuals (14.6–77.4 individuals km⁻²) dominated by males (83∶17 male∶female) and comprised of few juveniles (<10%). These results provide the first nautilid population and density estimates, which are essential elements for long-term management of populations via sustainable catch models. Results from baited remote underwater video systems provide confidence for their more widespread use to assess efficiently the size and density of exploited and unexploited Nautilus populations worldwide. PMID:21347360

  18. Fast evaluation of scaled opposite spin second-order Møller-Plesset correlation energies using auxiliary basis expansions and exploiting sparsity.

    PubMed

    Jung, Yousung; Shao, Yihan; Head-Gordon, Martin

    2007-09-01

    The scaled opposite spin Møller-Plesset method (SOS-MP2) is an economical way of obtaining correlation energies that are computationally cheaper, and yet, in a statistical sense, of higher quality than standard MP2 theory, by introducing one empirical parameter. But SOS-MP2 still has a fourth-order scaling step that makes the method inapplicable to very large molecular systems. We reduce the scaling of SOS-MP2 by exploiting the sparsity of expansion coefficients and local integral matrices, by performing local auxiliary basis expansions for the occupied-virtual product distributions. To exploit sparsity of 3-index local quantities, we use a blocking scheme in which entire zero-rows and columns, for a given third global index, are deleted by comparison against a numerical threshold. This approach minimizes sparse matrix book-keeping overhead, and also provides sufficiently large submatrices after blocking, to allow efficient matrix-matrix multiplies. The resulting algorithm is formally cubic scaling, and requires only moderate computational resources (quadratic memory and disk space) and, in favorable cases, is shown to yield effective quadratic scaling behavior in the size regime we can apply it to. Errors associated with local fitting using the attenuated Coulomb metric and numerical thresholds in the blocking procedure are found to be insignificant in terms of the predicted relative energies. A diverse set of test calculations shows that the size of system where significant computational savings can be achieved depends strongly on the dimensionality of the system, and the extent of localizability of the molecular orbitals. Copyright 2007 Wiley Periodicals, Inc.
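    The zero-row/column blocking idea can be sketched in miniature for a single matrix product; deleting summation indices whose entire column (in the left factor) or row (in the right factor) falls below threshold leaves the product unchanged while shrinking the multiply. The matrices and threshold are invented, and the paper's actual quantities are 3-index local tensors:

```python
THRESH = 1e-12

def matmul(A, B):
    """Plain dense product of list-of-lists matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def blocked_matmul(A, B):
    """Delete summation indices k whose A-column or B-row is entirely below
    threshold, then multiply the compacted blocks: the same product at a
    fraction of the work when sparsity is high."""
    keep = [k for k in range(len(B))
            if any(abs(A[i][k]) > THRESH for i in range(len(A)))
            and any(abs(x) > THRESH for x in B[k])]
    Ac = [[row[k] for k in keep] for row in A]
    Bc = [B[k] for k in keep]
    return matmul(Ac, Bc)

A = [[1.0, 0.0, 0.0, 2.0],
     [0.0, 0.0, 0.0, 1.0]]
B = [[3.0, 0.0],
     [9.0, 9.0],   # row 1 of B is nonzero, but column 1 of A is all zero
     [0.0, 0.0],
     [1.0, 4.0]]
print(blocked_matmul(A, B) == matmul(A, B))  # → True: k = 1, 2 contribute nothing
```

    Keeping whole rows and columns (rather than individual elements) is what keeps the compacted blocks dense enough for efficient matrix-matrix multiplies, which is the point of the blocking scheme described above.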

  19. Availability Control for Means of Transport in Decisive Semi-Markov Models of Exploitation Process

    NASA Astrophysics Data System (ADS)

    Migawa, Klaudiusz

    2012-12-01

    The issues presented in this paper concern the control of the exploitation process implemented in complex systems of exploitation for technical objects. The article describes a method of availability control for technical objects (means of transport), based on a mathematical model of the exploitation process formulated as a semi-Markov decision process. The method comprises two stages: first, building the semi-Markov decision model of the exploitation process for the technical objects, and then selecting the best (optimal) control strategy from among the possible decision variants, according to the adopted criterion (or criteria) for evaluating the performance of the exploitation system. Specifying the optimal availability-control strategy means choosing the sequence of control decisions, made in the individual states of the modelled exploitation process, for which the criterion function reaches its extreme value. A genetic algorithm was chosen to find the optimal control strategy. The method is illustrated with the exploitation process of means of transport in a real municipal bus transport system. The model of the exploitation process was built on the basis of data collected in this real transport system, assuming that the process is a homogeneous semi-Markov process.
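    A minimal sketch of the approach pairs a toy three-state semi-Markov exploitation model with a small genetic algorithm. The states, decision variants, sojourn times, and transition rows are all invented for illustration; the criterion is limiting availability:

```python
import random

random.seed(0)

# States: 0 = operation, 1 = preventive maintenance, 2 = repair. Each state
# offers two decision variants: (mean sojourn time, embedded transition row).
VARIANTS = [
    [(80.0, [0.0, 0.3, 0.7]), (60.0, [0.0, 0.7, 0.3])],  # operate hard / gently
    [(4.0,  [1.0, 0.0, 0.0]), (8.0,  [1.0, 0.0, 0.0])],  # quick / thorough PM
    [(24.0, [1.0, 0.0, 0.0]), (12.0, [1.0, 0.0, 0.0])],  # full / fast repair
]
UP = {0}  # only the operation state counts as available

def availability(strategy):
    """Limiting availability sum_up(pi_i tau_i) / sum_all(pi_i tau_i), with pi
    the stationary distribution of the embedded chain (damped power iteration)."""
    taus = [VARIANTS[s][d][0] for s, d in enumerate(strategy)]
    P = [VARIANTS[s][d][1] for s, d in enumerate(strategy)]
    pi = [1.0 / 3.0] * 3
    for _ in range(500):
        nxt = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
        pi = [(p + q) / 2.0 for p, q in zip(pi, nxt)]  # damping handles periodicity
    total = sum(p * t for p, t in zip(pi, taus))
    return sum(pi[i] * taus[i] for i in UP) / total

def genetic_search(pop_size=12, generations=25, pmut=0.2):
    """Tiny genetic algorithm over decision vectors (one decision per state)."""
    pop = [[random.randint(0, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=availability, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randint(1, 2)
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < pmut:
                k = random.randrange(3)
                child[k] ^= 1                  # flip one decision
            children.append(child)
        pop = elite + children
    return max(pop, key=availability)

best = genetic_search()
print(best, round(availability(best), 4))
```

    With a realistic state space the brute-force enumeration of strategies is infeasible, which is where the genetic search over decision sequences earns its keep.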

  20. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  1. The population ecology of despotism. Concessions and migration between central and peripheral habitats.

    PubMed

    Bell, Adrian Viliami; Winterhalder, Bruce

    2014-03-01

    Since despotism is a common evolutionary development in human history, we seek to understand the conditions under which it can originate, persist, and affect population trajectories. We describe a general system of population ecology equations representing the Ideal Free and Despotic Distributions for one and two habitats, one of which contains a despotic class that controls the distribution of resources. Using analytical and numerical solutions we derive the optimal concession strategy by despots with and without subordinate migration to an alternative habitat. We show that low concessions exponentially increase the time it takes for the despotic habitat to fill, and we discuss the trade-offs despots and subordinates confront at various levels of exploitation. Contrary to previous hypotheses, higher levels of despotism do not necessarily cause faster migration to alternative habitats. We further show how, during colonization, divergent population trajectories may arise if despotic systems experience Allee-type economies of scale.
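    The concession effect on habitat filling can be sketched with a one-habitat logistic model in which subordinates keep only a concession fraction c of the resources they produce, so the habitat fills at a rate scaled by c. The functional form and all parameters are our own illustration, not the paper's equations:

```python
def fill_time(concession, r=0.5, carrying_capacity=1000.0, n0=10.0, dt=0.1):
    """Euler-integrated logistic growth with per-capita gain scaled by the
    despot's concession; returns the time to reach 95% of capacity."""
    n, t = n0, 0.0
    while n < 0.95 * carrying_capacity:
        n += dt * r * concession * n * (1.0 - n / carrying_capacity)
        t += dt
    return t

for c in (0.9, 0.6, 0.3, 0.15):
    print(c, round(fill_time(c), 1))
```

    In this toy model the fill time scales roughly as 1/c, so halving the concession roughly doubles the time for the despotic habitat to fill, echoing the paper's observation that low concessions sharply slow habitat filling.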

  2. Practical performance of real-time shot-noise measurement in continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Huang, Peng; Zhou, Yingming; Liu, Weiqi; Zeng, Guihua

    2018-01-01

    In a practical continuous-variable quantum key distribution (CVQKD) system, real-time shot-noise measurement (RTSNM) is an essential procedure for preventing an eavesdropper from exploiting practical security loopholes. However, the performance of this procedure itself has not been analyzed under real-world conditions. We therefore characterize the practical performance of RTSNM and investigate its effects on the CVQKD system. In particular, due to the finite-size effect, the shot-noise measurement at the receiver's side may decrease the precision of parameter estimation and consequently result in a tighter security bound. To mitigate this, we optimize the block size for RTSNM under the ensemble-size limitation so as to maximize the secure key rate. Moreover, the effect of the finite dynamics of the amplitude modulator in this scheme is studied and a mitigation method is proposed. Our work characterizes the practical performance of RTSNM and provides the secret key rate achievable under it.

  3. Self-Healing Networks: Redundancy and Structure

    PubMed Central

    Quattrociocchi, Walter; Caldarelli, Guido; Scala, Antonio

    2014-01-01

    We introduce the concept of self-healing in the field of complex networks modelling; in particular, self-healing capabilities are implemented through distributed communication protocols that exploit redundant links to recover the connectivity of the system. We then analyze the effect of the level of redundancy on the resilience to multiple failures; in particular, we measure the fraction of nodes still served for increasing levels of network damages. Finally, we study the effects of redundancy under different connectivity patterns—from planar grids, to small-world, up to scale-free networks—on healing performances. Small-world topologies show that introducing some long-range connections in planar grids greatly enhances the resilience to multiple failures with performances comparable to the case of the most resilient (and least realistic) scale-free structures. Obvious applications of self-healing are in the important field of infrastructural networks like gas, power, water, oil distribution systems. PMID:24533065
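    A toy version of the self-healing idea: primary links are the edges of a small grid, while redundant (normally dormant) links bridge across a single failed node and are activated only after a failure. The topology, failure set, and healing rule below are our own illustration, not the paper's model:

```python
from collections import deque

N = 5
nodes = {(x, y) for x in range(N) for y in range(N)}
failed = {(2, y) for y in range(N)}  # an entire column of nodes fails
alive = nodes - failed

def served_fraction(use_redundant):
    """Fraction of surviving nodes still reachable from the source at (0, 0)."""
    seen, queue = {(0, 0)}, deque([(0, 0)])
    while queue:
        x, y = queue.popleft()
        candidates = []
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            candidates.append((x + dx, y + dy))              # primary grid link
            if use_redundant and (x + dx, y + dy) in failed:
                candidates.append((x + 2 * dx, y + 2 * dy))  # healing bridge
        for m in candidates:
            if m in alive and m not in seen:
                seen.add(m)
                queue.append(m)
    return len(seen) / len(alive)

print(served_fraction(False), served_fraction(True))  # → 0.5 1.0
```

    Without the dormant bridges the failed column cuts the grid in two and only half the surviving nodes are served; activating the redundant links restores full connectivity, which is the healing effect the abstract measures against increasing damage.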

  4. Method of phase space beam dilution utilizing bounded chaos generated by rf phase modulation

    DOE PAGES

    Pham, Alfonse N.; Lee, S. Y.; Ng, K. Y.

    2015-12-10

    This paper explores the physics of chaos in a localized phase-space region produced by rf phase modulation applied to a double rf system. The study can be exploited to produce rapid particle bunch broadening exhibiting longitudinal particle distribution uniformity. Hamiltonian models and particle-tracking simulations are introduced to understand the mechanism and applicability of controlled particle diffusion. When phase modulation is applied to the double rf system, regions of localized chaos are produced through the disruption and overlapping of parametric resonant islands and configured to be bounded by well-behaved invariant tori to prevent particle loss. The condition of chaoticity and the degree of particle dilution can be controlled by the rf parameters. As a result, the method has applications in alleviating adverse space-charge effects in high-intensity beams, particle bunch distribution uniformization, and industrial radiation-effects experiments.

  5. Numerical Modeling of Exploitation Relics and Faults Influence on Rock Mass Deformations

    NASA Astrophysics Data System (ADS)

    Wesołowski, Marek

    2016-12-01

    This article presents numerical modeling results on the influence of fault planes and exploitation relics on the magnitude and distribution of rock mass and ground surface deformations. Numerical calculations were performed using the finite difference program FLAC. To assess the changes taking place in a rock mass, an anisotropic elasto-plastic ubiquitous joint model was used, into which the Coulomb-Mohr strength (plasticity) condition was implemented. The article takes as an example the actual exploitation of the longwall 225 area in seam 502wg of the "Pokój" coal mine. Computer simulations have shown that it is possible to determine the influence of fault planes and exploitation relics on the size and distribution of rock mass and surface deformations. The main factor causing additional deformations of the ground surface is the abandoned workings in seam 502wd. These abandoned workings are the activation factor that caused additional subsidence and, due to their significant dip, they also form a layer along which the rock mass slides down towards the extracted space. These factors are not taken into account by the geometrical and integral theories.

  6. On the way to identify microorganisms in drinking water distribution networks via DNA analysis of the gut content of freshwater isopods.

    PubMed

    Mayer, Michael; Keller, Adrian; Szewzyk, Ulrich; Warnecke, Hans-Joachim

    2015-05-10

    Pure drinking water is the basis for a healthy society. In Germany, the drinking water regulations require water to be analyzed only by cultivation-based detection of certain microbiological parameters. However, not all prokaryotes can be detected by these standard methods. How can more and better information be gained about the bacteria present in drinking water and its distribution systems? The biofilms in drinking water distribution systems are built by bacteria and therefore represent a valuable source of information about the species present. Unfortunately, these biofilms are poorly accessible. We thus exploited the circumstance that many metazoans graze on the biofilms, so that the content of their guts partly reflects the respective biofilm biocenosis. We therefore collected omnivorous isopods, dissected their guts, and examined and characterized the contents by 16S and 18S rDNA analysis. These molecular-biological investigations provide a profound basis for characterizing the biocenosis and thereby biologically assessing drinking water ecosystems. Combined with a thorough identification of the species and knowledge of their habitats, this approach can provide useful indications for the assessment of drinking water quality and the early detection of problems in the distribution system. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Recommender engine for continuous-time quantum Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Huang, Li; Yang, Yi-feng; Wang, Lei

    2017-03-01

    Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.

  8. Parallel, Distributed Scripting with Python

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, P J

    2002-05-24

    Parallel computers used to be, for the most part, one-of-a-kind systems which were extremely difficult to program portably. With SMP architectures, the advent of the POSIX thread API and OpenMP gave developers ways to portably exploit on-the-box shared memory parallelism. Since these architectures didn't scale cost-effectively, distributed memory clusters were developed. The associated MPI message passing libraries gave these systems a portable paradigm too. Having programmers effectively use this paradigm is a somewhat different question. Distributed data has to be explicitly transported via the messaging system in order for it to be useful. In high-level languages, the MPI library gives access to data distribution routines in C, C++, and FORTRAN. But we need more than that. Many reasonable and common tasks are best done in (or as extensions to) scripting languages. Consider sysadmin tools such as password crackers, file purgers, etc. These are simple to write in a scripting language such as Python (an open source, portable, and freely available interpreter). But these tasks beg to be done in parallel. Consider a password checker that checks an encrypted password against a 25,000-word dictionary. This can take around 10 seconds in Python (6 seconds in C). It is trivial to parallelize if you can distribute the information and coordinate the work.
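    The password-checker example lends itself to a compact sketch. Here a thread pool stands in for the distributed MPI workers of the abstract and SHA-256 stands in for the encryption routine; the dictionary and all names are invented:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# A toy stand-in for the 25,000-word dictionary.
DICTIONARY = ["password", "letmein", "hunter2", "qwerty", "dragon"] * 5000
TARGET = hashlib.sha256(b"hunter2").hexdigest()  # the "encrypted" password

def check_chunk(words):
    """Hash every candidate in this worker's share and report any match."""
    for w in words:
        if hashlib.sha256(w.encode()).hexdigest() == TARGET:
            return w
    return None

def crack(words, workers=4):
    """Split the dictionary evenly and let the workers scan in parallel."""
    size = (len(words) + workers - 1) // workers
    chunks = [words[i:i + size] for i in range(0, len(words), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for hit in pool.map(check_chunk, chunks):
            if hit is not None:
                return hit
    return None

print(crack(DICTIONARY))  # → hunter2
```

    The structure, distributing chunks, computing independently, and coordinating only on the result, is exactly what the article argues scripting languages should make easy across cluster nodes (where true process-level parallelism, unlike Python threads under the GIL, delivers the speedup).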

  9. Ultra-short FBG based distributed sensing using shifted optical Gaussian filters and microwave-network analysis.

    PubMed

    Cheng, Rui; Xia, Li; Sima, Chaotan; Ran, Yanli; Rohollahnejad, Jalal; Zhou, Jiaao; Wen, Yongqiang; Yu, Can

    2016-02-08

    Ultrashort fiber Bragg gratings (US-FBGs) have significant potential as weak grating sensors for distributed sensing, but their exploitation has been limited by their inherently broad spectra, which are undesirable for most traditional wavelength measurements. To address this, we have recently introduced a new interrogation concept using shifted optical Gaussian filters (SOGF), which is well suited to US-FBG measurements. Here, we apply it to demonstrate, for the first time, a US-FBG-based self-referencing distributed optical sensing technique, with the advantages of adjustable sensitivity and range, high-speed and wide-range (potentially >14,000 με) intensity-based detection, and resistance to disturbance by nonuniform parameter distribution. The entire system is essentially based on a microwave network, which incorporates the SOGF with a fiber delay line between the two arms. Differential detections of the cascaded US-FBGs are performed individually in the network's time-domain response, which can be obtained by analyzing its complex frequency response. Experimental results are presented and discussed using eight cascaded US-FBGs. A comprehensive numerical analysis is also conducted to assess the system performance, which shows that the use of US-FBGs instead of conventional weak FBGs could significantly improve the power budget and capacity of the distributed sensing system while maintaining the crosstalk level and intensity decay rate, providing a promising route for future sensing applications.

  10. Guest Editor's introduction: Special issue on distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Lea, Rodger

    1998-09-01

    Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene `see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including: Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio. Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration. Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications. e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers. 
The technology needed to support these systems crosses a number of disciplines in computer science. These include, but are certainly not limited to, real-time graphics for the accurate and realistic representation of scenes, group communications for the efficient update of shared consistent scene data, user interface modelling to exploit the use of the 3D representation and multimedia systems technology for the delivery of streamed graphics and audio-visual data into the shared scene. It is this intersection of technologies and the overriding need to provide visual realism that places such high demands on the underlying distributed systems infrastructure and makes DVEs such fertile ground for distributed systems research. Two examples serve to show how DVE developers have exploited the unique aspects of their domain. Communications. The usual tension between latency and throughput is particularly noticeable within DVEs. To ensure the timely update of multiple viewers of a particular scene requires that such updates be propagated quickly. However, the sheer volume of changes to any one scene calls for techniques that minimize the number of distinct updates that are sent to the network. Several techniques have been used to address this tension; these include the use of multicast communications, and in particular multicast in wide-area networks to reduce actual message traffic. Multicast has been combined with general group communications to partition updates to related objects or users of a scene. A less traditional approach has been the use of dead reckoning whereby a client application that visualizes the scene calculates position updates by extrapolating movement based on previous information. This allows the system to reduce the number of communications needed to update objects that move in a stable manner within the scene. Scaling. 
DVEs, especially those used for social spaces, are required to support large numbers of simultaneous users in potentially large shared scenes. The desire for scalability has driven different architectural designs, for example, the use of fully distributed architectures which scale well but often suffer performance costs versus centralized and hierarchical architectures in which the inverse is true. However, DVEs have also exploited the spatial nature of their domain to address scalability and have pioneered techniques that exploit the semantics of the shared space to reduce data updates and so allow greater scalability. Several of the systems reported in this special issue apply a notion of area of interest to partition the scene and so reduce the participants in any data updates. The specification of area of interest differs between systems. One approach has been to exploit a geographical notion, i.e. a regular portion of a scene, or a semantic unit, such as a room or building. Another approach has been to define the area of interest as a spatial area associated with an avatar in the scene. The five papers in this special issue have been chosen to highlight the distributed systems aspects of the DVE domain. The first paper, on the DIVE system, described by Emmanuel Frécon and Mårten Stenius explores the use of multicast and group communication in a fully peer-to-peer architecture. The developers of DIVE have focused on its use as the basis for collaborative work environments and have explored the issues associated with maintaining and updating large complicated scenes. The second paper, by Hiroaki Harada et al, describes the AGORA system, a DVE concentrating on social spaces and employing a novel communication technique that incorporates position update and vector information to support dead reckoning. The paper by Simon Powers et al explores the application of DVEs to the gaming domain. 
They propose a novel architecture that separates the higher-level game semantics (the conceptual model) and the lower-level scene attributes (the dynamic model), both running on servers, from the actual visual representation (the visual model) running on the client. They claim a number of benefits from this approach, including better predictability and consistency. Wolfgang Broll discusses the SmallView system, an attempt to provide a toolkit for DVEs. One of the key features of SmallView is a sophisticated application-level protocol, DWTP, that provides support for a variety of communication models. The final paper, by Chris Greenhalgh, discusses the MASSIVE system, which has been used to explore the notion of awareness in the 3D space via the concept of 'auras'. These auras define an area of interest for users and support a mapping between what a user is aware of and what data update rate the communications infrastructure can support. We hope that this selection of papers will serve to provide a clear introduction to the distributed systems issues faced by the DVE community and the approaches they have taken in solving them. Finally, we wish to thank Hubert Le Van Gong for his tireless efforts in pulling together all these papers, and both the referees and the authors for the time and effort they invested in ensuring that their contributions teased out the interesting distributed systems issues for this special issue. † E-mail address: rodger@arch.sel.sony.com
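The dead-reckoning technique described above can be sketched in a few lines. This is an illustrative Python sketch, not code from any of the systems discussed; the function names and the 0.5-unit error threshold are invented for the example:

```python
import math

def extrapolate(pos, vel, dt):
    """Dead-reckoned position: last known position plus velocity * elapsed time."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

def needs_update(true_pos, predicted_pos, threshold):
    """Send a network update only when the true position has drifted
    beyond `threshold` from what remote clients are extrapolating."""
    return math.dist(true_pos, predicted_pos) > threshold

# An object moving at constant velocity never exceeds the error threshold,
# so no updates are sent while its motion stays predictable.
last_sent_pos, last_sent_vel, t_sent = (0.0, 0.0), (1.0, 0.0), 0.0
updates = 0
for t in [1.0, 2.0, 3.0]:
    true_pos = (t, 0.0)  # straight-line motion matches the extrapolation
    pred = extrapolate(last_sent_pos, last_sent_vel, t - t_sent)
    if needs_update(true_pos, pred, 0.5):
        updates += 1
print(updates)  # prints 0
```

Only when the object turns or accelerates beyond the threshold does a fresh state packet go out, which is how the update rate is decoupled from the rendering rate.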

  11. Bioinspired magnetic reception and multimodal sensing.

    PubMed

    Taylor, Brian K

    2017-08-01

    Several animals use Earth's magnetic field in concert with other sensor modes to accomplish navigational tasks ranging from local homing to continental scale migration. However, despite extensive research, animal magnetic reception remains poorly understood. Similarly, the Earth's magnetic field offers a signal that engineered systems can leverage to navigate in environments where man-made positioning systems such as GPS are either unavailable or unreliable. This work uses a behavioral strategy inspired by the migratory behavior of sea turtles to locate a magnetic goal and respond to wind when it is present. Sensing is performed using a number of distributed sensors. Based on existing theoretical biology considerations, data processing is performed using combinations of circles and ellipses to exploit the distributed sensing paradigm. Agent-based simulation results indicate that this approach is capable of using two separate magnetic properties to locate a goal from a variety of initial conditions in both noiseless and noisy sensory environments. The system's ability to locate the goal appears robust to noise at the cost of overall path length.

  12. [Development and evaluation of the medical imaging distribution system with dynamic web application and clustering technology].

    PubMed

    Yokohama, Noriya; Tsuchimoto, Tadashi; Oishi, Masamichi; Itou, Katsuya

    2007-01-20

It has been noted that the downtime of medical informatics systems is often long. Many systems encounter downtimes of hours or even days, which can have a critical effect on daily operations. Such systems remain especially weak in the areas of database and medical imaging data. The scheme design shows the three-layer architecture of the system: application, database, and storage layers. The application layer uses the DICOM protocol (Digital Imaging and Communications in Medicine) and HTTP (Hypertext Transfer Protocol) with AJAX (Asynchronous JavaScript and XML). The database layer is decentralized and parallelized using clustering technology; consequently, the database can be restored with ease and with improved retrieval speed. In the storage layer, a network RAID (Redundant Array of Independent Disks) system makes it possible to construct exabyte-scale parallel file systems that exploit distributed storage. Development and evaluation of the test-bed has been successful for medical information data backup and recovery in a network environment. This paper presents a schematic design of a new medical informatics system that supports rapid recovery, together with a dynamic Web application for medical imaging distribution using AJAX.

  13. CompatPM: enabling energy efficient multimedia workloads for distributed mobile platforms

    NASA Astrophysics Data System (ADS)

    Nathuji, Ripal; O'Hara, Keith J.; Schwan, Karsten; Balch, Tucker

    2007-01-01

The computation and communication abilities of modern platforms are enabling increasingly capable cooperative distributed mobile systems. An example is distributed multimedia processing of sensor data in robots deployed for search and rescue, where a system manager can exploit the application's cooperative nature to optimize the distribution of roles and tasks in order to successfully accomplish the mission. Because of limited battery capacities, a critical task a manager must perform is online energy management. While support for power management has become common for the components that populate mobile platforms, what is lacking is integration and explicit coordination across the different management actions performed in a variety of system layers. This paper develops an integration approach for distributed multimedia applications, where a global manager specifies both a power operating point and a workload for a node to execute. Surprisingly, when jointly considering power and QoS, experimental evaluations show that using a simple deadline-driven approach to assigning frequencies can be non-optimal. These trends are further affected by certain characteristics of underlying power management mechanisms, which, in our research, are identified as groupings that classify component power management as "compatible" (VFC) or "incompatible" (VFI) with voltage and frequency scaling. We build on these findings to develop CompatPM, a vertically integrated control strategy for power management in distributed mobile systems. Experimental evaluations of CompatPM indicate average energy improvements of 8% when platform resources are managed jointly rather than independently, demonstrating that previous attempts to maximize battery life by simply minimizing frequency are inappropriate from a platform-level perspective.
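The non-optimality of simply minimizing frequency can be illustrated with a toy per-frame energy model. All constants below (cubic dynamic power, static leakage, sleep power, frame budget) are invented for illustration and are not CompatPM's actual model; the point is only that with significant static power and a deep-sleep idle state, "racing to idle" at a high frequency can beat the lowest deadline-meeting frequency:

```python
def frame_energy(f_hz, cycles, period_s, k=1e-28, p_static=0.5, p_sleep=0.01):
    """Energy to process one media frame at CPU frequency f_hz, then sleep
    until the frame deadline. Illustrative model: active power is static
    leakage plus k * f^3 dynamic power (rough CMOS scaling)."""
    t_active = cycles / f_hz
    if t_active > period_s:
        return None  # deadline miss
    e_active = (p_static + k * f_hz ** 3) * t_active
    e_sleep = p_sleep * (period_s - t_active)
    return e_active + e_sleep

freqs = [600e6, 800e6, 1000e6]
cycles, period = 18e6, 0.033          # ~30 fps frame budget (hypothetical)
energy = {f: frame_energy(f, cycles, period) for f in freqs}
best = min(freqs, key=lambda f: energy[f])
```

With these numbers the 1 GHz setting finishes early and sleeps, spending less total energy than the slowest deadline-meeting frequency, mirroring the paper's observation that deadline-driven frequency minimization can be non-optimal.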

  14. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current approaches to event detection utilize a variety of techniques including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which many existing event detection models leave out of focus, is confirmed to be a crucial part of the system, and modelling it with a discrete choice model improves performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
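The idea of fusing per-indicator alarms through a discrete choice model can be sketched as a binary logit. The weights and bias below are hypothetical stand-ins for the coefficients that the paper estimates by maximum likelihood jointly with the rest of the detection system via genetic algorithms:

```python
import math

def fuse_alarms(indicator_probs, weights, bias):
    """Binary logit: combine per-indicator event probabilities into a
    single event/no-event probability via a logistic link."""
    z = bias + sum(w * p for w, p in zip(weights, indicator_probs))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical calibrated coefficients for three water-quality indicators
# (e.g. chlorine, turbidity, conductivity).
weights, bias = [3.0, 2.0, 4.0], -4.0
quiet = fuse_alarms([0.1, 0.2, 0.1], weights, bias)   # all indicators calm
event = fuse_alarms([0.9, 0.8, 0.95], weights, bias)  # all indicators alarmed
```

Unlike ad hoc voting rules, the logit's weights let differently reliable indicators contribute unequally, which is the advantage the integrated model exploits.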

  15. Temporal complexity in emission from Anderson localized lasers

    NASA Astrophysics Data System (ADS)

    Kumar, Randhir; Balasubrahmaniyam, M.; Alee, K. Shadak; Mujumdar, Sushil

    2017-12-01

    Anderson localization lasers exploit resonant cavities formed due to structural disorder. The inherent randomness in the structure of these cavities realizes a probability distribution in all cavity parameters such as quality factors, mode volumes, mode structures, and so on, implying resultant statistical fluctuations in the temporal behavior. Here we provide direct experimental measurements of temporal width distributions of Anderson localization lasing pulses in intrinsically and extrinsically disordered coupled-microresonator arrays. We first illustrate signature exponential decays in the spatial intensity distributions of the lasing modes that quantify their localized character, and then measure the temporal width distributions of the pulsed emission over several configurations. We observe a dependence of temporal widths on the disorder strength, wherein the widths show a single-peaked, left-skewed distribution in extrinsic disorder and a dual-peaked distribution in intrinsic disorder. We propose a model based on coupled rate equations for an emitter and an Anderson cavity with a random mode structure, which gives excellent quantitative and qualitative agreement with the experimental observations. The experimental and theoretical analyses bring to the fore the temporal complexity in Anderson-localization-based lasing systems.

  16. Initial Results from an Energy-Aware Airborne Dynamic, Data-Driven Application System Performing Sampling in Coherent Boundary-Layer Structures

    NASA Astrophysics Data System (ADS)

    Frew, E.; Argrow, B. M.; Houston, A. L.; Weiss, C.

    2014-12-01

The energy-aware airborne dynamic, data-driven application system (EA-DDDAS) performs persistent sampling in complex atmospheric conditions by exploiting wind energy using the dynamic data-driven application system paradigm. The main challenge for future airborne sampling missions is operation with tight integration of physical and computational resources over wireless communication networks, in complex atmospheric conditions. The physical resources considered here include sensor platforms, particularly mobile Doppler radar and unmanned aircraft, the complex conditions in which they operate, and the region of interest. Autonomous operation requires distributed computational effort connected by layered wireless communication. Onboard decision-making and coordination algorithms can be enhanced by atmospheric models that assimilate input from physics-based models and wind fields derived from multiple sources. These models are generally too complex to be run onboard the aircraft, so they need to be executed in ground vehicles in the field and connected over broadband or other wireless links back to the aircraft. Finally, the wind field environment drives strong interaction between the computational and physical systems, both as a challenge to autonomous path planning algorithms and as a novel energy source that can be exploited to improve system range and endurance. Implementation details of a complete EA-DDDAS will be provided, along with preliminary flight test results targeting coherent boundary-layer structures.

  17. A Deep Learning Architecture for Temporal Sleep Stage Classification Using Multivariate and Multimodal Time Series.

    PubMed

    Chambon, Stanislas; Galtier, Mathieu N; Arnal, Pierrick J; Wainrib, Gilles; Gramfort, Alexandre

    2018-04-01

Sleep stage classification constitutes an important preliminary exam in the diagnosis of sleep disorders. It is traditionally performed by a sleep expert who assigns a sleep stage to each 30 s epoch of the signal, based on the visual inspection of signals such as electroencephalograms (EEGs), electrooculograms (EOGs), electrocardiograms, and electromyograms (EMGs). We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting handcrafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG, and EOG), and that can exploit the temporal context of each 30-s window of data. For each modality, the first layer learns linear spatial filters that exploit the array of sensors to increase the signal-to-noise ratio, and the last layer feeds the learnt representation to a softmax classifier. Our model is compared to alternative automatic approaches based on convolutional networks or decision trees. Results obtained on 61 publicly available PSG records with up to 20 EEG channels demonstrate that our network architecture yields state-of-the-art performance. Our study reveals a number of insights on the spatiotemporal distribution of the signal of interest: a good tradeoff for optimal classification performance measured with balanced accuracy is to use 6 EEG channels together with 2 EOG (left and right) and 3 EMG chin channels. Also, exploiting 1 min of data before and after each data segment offers the strongest improvement when a limited number of channels are available. Like sleep experts, our system exploits the multivariate and multimodal nature of PSG signals in order to deliver state-of-the-art classification performance with a small computational cost.

  18. Cryogenic-temperature profiling of high-power superconducting lines using local and distributed optical-fiber sensors.

    PubMed

    Chiuchiolo, Antonella; Palmieri, Luca; Consales, Marco; Giordano, Michele; Borriello, Anna; Bajas, Hugues; Galtarossa, Andrea; Bajko, Marta; Cusano, Andrea

    2015-10-01

This contribution presents distributed and multipoint fiber-optic monitoring of cryogenic temperatures along a superconducting power transmission line, down to 30 K and over a 20 m distance. Multipoint measurements were conducted using fiber Bragg grating sensors coated with two different functional overlays (epoxy and poly(methyl methacrylate), PMMA), demonstrating cryogenic operation in the range 300-4.2 K. Distributed measurements exploited optical frequency-domain reflectometry to analyze the Rayleigh scattering along two concatenated fibers with different coatings (acrylate and polyimide). The integrated system has been placed along the 20 m long cryostat of a superconducting power transmission line currently being tested at the European Organization for Nuclear Research (CERN). Cool-down events from 300 K to 30 K have been successfully measured in space and time, confirming the viability of these approaches for monitoring cryogenic temperatures along a superconducting transmission line.
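The basic FBG measurement principle behind such multipoint sensing can be sketched as follows. The effective index, grating period, and the ~6.7e-6 /K sensitivity are textbook-order illustrative values for silica near room temperature, not the calibration used in this work (the sensitivity in fact drops at cryogenic temperatures, which is why functional coatings such as PMMA or epoxy are used):

```python
def bragg_wavelength(n_eff, grating_period_nm):
    """First-order Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * grating_period_nm

def delta_temperature(delta_lambda_nm, lambda0_nm, sensitivity=6.7e-6):
    """Invert the linearized FBG response d(lambda)/lambda = S * dT,
    where S lumps the thermal-expansion and thermo-optic coefficients."""
    return delta_lambda_nm / (lambda0_nm * sensitivity)

lam0 = bragg_wavelength(1.447, 535.0)   # nominal grating near 1548 nm
dT = delta_temperature(-0.5, lam0)      # a -0.5 nm blue shift on cooling
```

Each grating along the fiber has its own nominal wavelength, so one interrogator can read many such sensors multiplexed on a single fiber.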

  19. GPS-based tracking system for TOPEX orbit determination

    NASA Technical Reports Server (NTRS)

    Melbourne, W. G.

    1984-01-01

A tracking system concept is discussed that is based on the utilization of the constellation of Navstar satellites in the Global Positioning System (GPS). The concept involves simultaneous and continuous metric tracking of the signals from all visible Navstar satellites by approximately six globally distributed ground terminals and by the TOPEX spacecraft at 1300-km altitude. Error studies indicate that this system could be capable of obtaining decimeter position accuracies and, most importantly, around 5 cm in the radial component, which is key to exploiting the full accuracy potential of the altimetric measurements for ocean topography. Topics covered include: background of the GPS, the precision mode for utilization of the system, past JPL research on using the GPS in precision applications, the present tracking system concept for high-accuracy satellite positioning, and results from a proof-of-concept demonstration.

  20. Quantifying Risk of Financial Incapacity and Financial Exploitation in Community-dwelling Older Adults: Utility of a Scoring System for the Lichtenberg Financial Decision-making Rating Scale.

    PubMed

    Lichtenberg, Peter A; Gross, Evan; Ficker, Lisa J

    2018-06-08

This work examines the clinical utility of the scoring system for the Lichtenberg Financial Decision-making Rating Scale (LFDRS) and its usefulness in assessing decision-making capacity and financial exploitation. Objective 1 was to examine the clinical utility of a person-centered, empirically supported financial decision-making scale. Objective 2 was to determine whether the risk-scoring system created for this rating scale is sufficiently accurate for the use of cutoff scores in cases of decisional capacity and cases of suspected financial exploitation. Objective 3 was to examine whether cognitive decline and decisional impairment predicted suspected financial exploitation. The sample comprised two hundred independently living, non-demented community-dwelling older adults. Participants completed the rating scale and other cognitive measures. Receiver operating characteristic curves were in the good to excellent range for decisional capacity scoring, and in the fair to good range for financial exploitation. Analyses supported the conceptual link between decision-making deficits and risk for exploitation, and supported the use of the risk-scoring system in a community-based population. This study adds to the empirical evidence supporting the use of the rating scale as a clinical tool for assessing risk of financial decisional impairment and/or financial exploitation.

  1. Integrated quantum key distribution sender unit for daily-life implementations

    NASA Astrophysics Data System (ADS)

    Mélen, Gwenaelle; Vogl, Tobias; Rau, Markus; Corrielli, Giacomo; Crespi, Andrea; Osellame, Roberto; Weinfurter, Harald

    2016-03-01

Unlike currently implemented encryption schemes, Quantum Key Distribution provides a secure way of generating and distributing a key between two parties. Although a multitude of research platforms has been developed, the integration of QKD units within classical communication systems remains a tremendous challenge. The recently achieved maturity of integrated photonic technologies could be exploited to create miniature QKD add-ons that extend the primary function of various existing systems such as mobile devices or optical stations. In this work we report on an integrated optics module enabling secure short-distance communication for, e.g., quantum access schemes. Using BB84-like protocols, Alice's low-cost mobile device can exchange a secure key and information anywhere within a trusted-node network. The new optics platform (35 × 20 × 8 mm), compatible with current smartphone technology, generates NIR faint polarised laser pulses at a 100 MHz repetition rate. Fully automated beam tracking and live basis alignment on Bob's side ensure user-friendly operation, with a quantum link efficiency as high as 50% that is stable over a few seconds.
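The BB84-like sifting step underlying such a sender unit can be sketched in a few lines. This is an idealized, noise-free toy model with no real quantum channel, decoy states, or eavesdropper; it only shows why roughly half the transmitted bits survive into the sifted key:

```python
import random

def bb84_sift(n, seed=7):
    """Toy BB84 sifting: Alice encodes random bits in random bases (Z or X);
    Bob measures in his own random bases. Only positions where the bases
    match are kept -- about half of them on average. With no noise or
    eavesdropping, Bob's kept bits equal Alice's."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("ZX") for _ in range(n)]
    bob_bases = [rng.choice("ZX") for _ in range(n)]
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

key = bb84_sift(1000)
```

In a real system the subsequent steps (error estimation, error correction, privacy amplification) shrink this sifted key further, which is what the quoted "quantum link efficiency" ultimately feeds into.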

  2. Security analysis on some experimental quantum key distribution systems with imperfect optical and electrical devices

    NASA Astrophysics Data System (ADS)

    Liang, Lin-Mei; Sun, Shi-Hai; Jiang, Mu-Sheng; Li, Chun-Yan

    2014-10-01

In general, quantum key distribution (QKD) has been proved unconditionally secure for perfect devices thanks to the quantum uncertainty principle, the no-cloning theorem, and the quantum non-dividing principle, which states that a quantum cannot be divided further. However, the practical optical and electrical devices used in real systems are imperfect, and these imperfections can be exploited by an eavesdropper to partially or totally obtain the secret key shared by the legitimate parties. In this article, we first briefly review recent international work on quantum hacking of experimental QKD systems with imperfect devices, and then present our own hacking work in detail, including the passive Faraday mirror attack, partially random phase attack, wavelength-selected photon-number-splitting attack, frequency shift attack, and single-photon-detector attack. These quantum attacks are a reminder that the security loopholes introduced into practical QKD systems by imperfect devices must be addressed, either by adding countermeasures or by adopting a different protocol, such as the measurement-device-independent protocol, that avoids quantum hacking of imperfect measurement devices [Lo, et al., Phys. Rev. Lett., 2012, 108: 130503].

  3. A robust close-range photogrammetric target extraction algorithm for size and type variant targets

    NASA Astrophysics Data System (ADS)

    Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert

    2016-05-01

    The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.
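Thresholding a bimodal intensity distribution, as the partitioning algorithm above relies on, is commonly done by maximizing the between-class variance (Otsu's method). The sketch below is a generic stand-in, not the Photo-G algorithm itself, run on a synthetic histogram with a dark background mode and a bright target mode:

```python
def otsu_threshold(hist):
    """Return the intensity threshold maximizing between-class variance
    of a (bimodal) histogram: pixels <= t form class 0, the rest class 1."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(len(hist)):
        w0 += hist[t]                 # class-0 pixel count
        if w0 == 0:
            continue
        w1 = total - w0               # class-1 pixel count
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal histogram: background mode near 40, target mode near 200.
hist = [0] * 256
for v in range(30, 51):
    hist[v] = 100
for v in range(190, 211):
    hist[v] = 60
t = otsu_threshold(hist)
```

The resulting threshold lands in the valley between the two modes, so connected bright regions above it can be grown into ROIs whose centroids mark candidate acquisition points.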

  4. Surname distribution in population genetics and in statistical physics.

    PubMed

    Rossi, Paolo

    2013-12-01

Surnames tend to behave like neutral genes, and their distribution has attracted growing attention from geneticists and physicists. We review the century-long history of surname studies and discuss the most recent developments. Isonymy has been regarded as a tool for the measurement of consanguinity of individuals and populations and has been applied to the analysis of migrations. The analogy between patrilineal surname transmission and the propagation of Y chromosomes has been exploited for the genetic characterization of families, communities and control groups. Surname distribution is the result of stochastic dynamics, which has been studied either as a Yule process or as a branching phenomenon: both approaches predict the asymptotic power-law behavior that has been observed in many empirical studies. Models of neutral evolution based on the theory of disordered systems have suggested the application of field-theoretical techniques, and in particular the Renormalization Group, to describe the dynamics leading to scale-invariant distributions and to compute the related (critical) exponents. Copyright © 2013 Elsevier B.V. All rights reserved.
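A minimal Yule-type simulation shows how surname copying with rare founding events produces the heavy-tailed family-size distributions the review describes. The 1% founding rate and population size are arbitrary illustrative choices:

```python
import random

def simulate_surnames(steps, founding_rate=0.01, seed=1):
    """Yule-type branching: each new person copies an existing surname
    with probability proportional to its current frequency (sampling a
    random existing bearer); with a small probability a new surname is
    founded. Returns family sizes, largest first."""
    random.seed(seed)
    bearers = []          # one entry per person, value = surname id
    next_id = 0
    for _ in range(steps):
        if not bearers or random.random() < founding_rate:
            bearers.append(next_id)   # found a new surname
            next_id += 1
        else:
            bearers.append(random.choice(bearers))  # inherit proportionally
    counts = {}
    for s in bearers:
        counts[s] = counts.get(s, 0) + 1
    return sorted(counts.values(), reverse=True)

sizes = simulate_surnames(20000)
```

The preferential copying gives early surnames a compounding advantage, so a few families dominate while most remain small, the qualitative signature of the observed asymptotic power law.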

  5. Surname distribution in population genetics and in statistical physics

    NASA Astrophysics Data System (ADS)

    Rossi, Paolo

    2013-12-01

Surnames tend to behave like neutral genes, and their distribution has attracted growing attention from geneticists and physicists. We review the century-long history of surname studies and discuss the most recent developments. Isonymy has been regarded as a tool for the measurement of consanguinity of individuals and populations and has been applied to the analysis of migrations. The analogy between patrilineal surname transmission and the propagation of Y chromosomes has been exploited for the genetic characterization of families, communities and control groups. Surname distribution is the result of stochastic dynamics, which has been studied either as a Yule process or as a branching phenomenon: both approaches predict the asymptotic power-law behavior that has been observed in many empirical studies. Models of neutral evolution based on the theory of disordered systems have suggested the application of field-theoretical techniques, and in particular the Renormalization Group, to describe the dynamics leading to scale-invariant distributions and to compute the related (critical) exponents.

  6. The Heliophysics Integrated Observatory

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Bentley, R. D.

    2009-12-01

HELIO is a new Europe-wide, FP7-funded distributed network of services that will address the needs of a broad community of researchers in heliophysics. This new research field explores the “Sun-Solar System Connection” and requires the joint exploitation of solar, heliospheric, magnetospheric and ionospheric observations. HELIO will provide the most comprehensive integrated information system in this domain; it will coordinate access to the distributed resources needed by the community, and will provide access to services to mine and analyse the data. HELIO will be designed as a Service-oriented Architecture. The initial infrastructure will include services based on metadata and data servers deployed by the European Grid of Solar Observations (EGSO). We will extend these to address observations from all the disciplines of heliophysics; differences in the way the domains describe and handle the data will be resolved using semantic mapping techniques. Processing and storage services will allow the user to explore the data and create the products that meet stringent standards of interoperability. These capabilities will be orchestrated with the data and metadata services using the Taverna workflow tool. HELIO will address the challenges along the FP7 I3 activities model: (1) Networking: we will cooperate closely with the community to define new standards for heliophysics and the required capabilities of the HELIO system. (2) Services: we will integrate the services developed by the project and other groups to produce an infrastructure that can easily be extended to satisfy the growing and changing needs of the community. (3) Joint Research: we will develop search tools that span disciplinary boundaries and explore new types of user-friendly interfaces. HELIO will be a key component of a worldwide effort to integrate heliophysics data and will coordinate closely with international organizations to exploit synergies with complementary domains.

  7. Large-scale absence of sharks on reefs in the greater-Caribbean: a footprint of human pressures.

    PubMed

    Ward-Paige, Christine A; Mora, Camilo; Lotze, Heike K; Pattengill-Semmens, Christy; McClenachan, Loren; Arias-Castro, Ery; Myers, Ransom A

    2010-08-05

In recent decades, large pelagic and coastal shark populations have declined dramatically with increased fishing; however, the status of sharks in other systems such as coral reefs remains largely unassessed despite a long history of exploitation. Here we explore the contemporary distribution and sighting frequency of sharks on reefs in the greater-Caribbean and assess the possible role of human pressures on observed patterns. We analyzed 76,340 underwater surveys carried out by trained volunteer divers between 1993 and 2008. Surveys were grouped into 1 km² cells, which allowed us to determine the contemporary geographical distribution and sighting frequency of sharks. Sighting frequency was calculated as the ratio of surveys with sharks to the total number of surveys in each cell. We compared sighting frequency to the number of people in the cell vicinity and used population viability analyses to assess the effects of exploitation on population trends. Sharks, with the exception of nurse sharks, occurred mainly in areas with very low human population or strong fishing regulations and marine conservation. Population viability analysis suggests that exploitation alone could explain the large-scale absence; however, this pattern is likely to be exacerbated by additional anthropogenic stressors, such as pollution and habitat degradation, that also correlate with human population. Human pressures in coastal zones have led to the broad-scale absence of sharks on reefs in the greater-Caribbean. Preventing further loss of sharks requires urgent management measures to curb fishing mortality and to mitigate other anthropogenic stressors to protect sites where sharks still exist. The fact that sharks still occur in some densely populated areas where strong fishing regulations are in place indicates the possibility of success and encourages the implementation of conservation measures.
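The per-cell sighting-frequency computation described above is simple to express. A sketch with hypothetical survey records (the cell ids and sightings are invented, not data from the study):

```python
def sighting_frequency(surveys):
    """Group surveys by 1-km^2 cell and compute, per cell, the ratio of
    surveys that recorded sharks to the total number of surveys."""
    totals, with_sharks = {}, {}
    for cell, shark_seen in surveys:
        totals[cell] = totals.get(cell, 0) + 1
        if shark_seen:
            with_sharks[cell] = with_sharks.get(cell, 0) + 1
    return {cell: with_sharks.get(cell, 0) / n for cell, n in totals.items()}

# Hypothetical records: (cell id, sharks sighted?)
surveys = [("A1", True), ("A1", False), ("A1", False), ("B7", True), ("B7", True)]
freq = sighting_frequency(surveys)
```

Normalizing by survey effort per cell is what makes frequencies comparable between heavily and lightly dived locations.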

  8. Main Sources of Occupational Stress and Symptoms of Burnout, Clinical Distress, and Post-Traumatic Stress Among Distributed Common Ground System Intelligence Exploitation Operators (2011 USAFSAM Survey Results)

    DTIC Science & Technology

    2012-09-01

Only demographic fragments of the abstract survive extraction: a total of 187 (45.49%) of respondents reported having children living at home; across survey subgroups, 52 (34.21%) were single and 85 (55.92%) were married, while 30 (7.65%), 9 (5.92%), and 11 (5.56%) were unmarried but in a significant relationship; several participants did not report their relationship status.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai Xiaoming; Chen Shu; Wang Yupeng

The superfluid-to-Anderson-insulator transition of a strongly repulsive Bose gas is studied in a one-dimensional incommensurate optical lattice. In the hard-core limit, the Bose-Fermi mapping allows us to deal with the system using the exact numerical method. Based on the Aubry-André model, we exploit the phase transition of the hard-core boson system from the superfluid phase, with all single-particle states extended, to the Bose-glass phase, with all single-particle states Anderson localized, as the strength of the incommensurate potential increases relative to the hopping amplitude. We evaluate the superfluid fraction, one-particle density matrices, momentum distributions, the natural orbitals, and their occupations. All of these quantities show that there exists a superfluid-to-insulator phase transition in the system.
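The localization transition of the underlying Aubry-André single-particle problem can be probed numerically with 2×2 transfer matrices. This sketch (golden-ratio incommensurability, an arbitrary phase, energy E = 0) reproduces the qualitative extended-vs-localized distinction, not the paper's many-body observables:

```python
import math

def lyapunov_exponent(lam, n=20000, energy=0.0, beta=(5 ** 0.5 - 1) / 2, phi=0.2):
    """Inverse localization length for the Aubry-Andre model with hopping
    t = 1 and potential V_i = lam * cos(2*pi*beta*i + phi), estimated from
    the growth rate of transfer-matrix products T_i = [[E - V_i, -1], [1, 0]].
    For lam > 2 all states are localized: gamma >= ln(lam/2) > 0."""
    a, b, c, d = 1.0, 0.0, 0.0, 1.0      # accumulated 2x2 product
    log_norm = 0.0
    for i in range(n):
        v = lam * math.cos(2 * math.pi * beta * i + phi)
        a, b, c, d = (energy - v) * a - c, (energy - v) * b - d, a, b
        norm = max(abs(a), abs(b), abs(c), abs(d))
        a, b, c, d = a / norm, b / norm, c / norm, d / norm  # renormalize
        log_norm += math.log(norm)
    return log_norm / n

gamma_localized = lyapunov_exponent(4.0)  # incommensurate potential dominates
gamma_extended = lyapunov_exponent(1.0)   # hopping dominates
```

A clearly positive exponent for lam = 4 versus a much smaller one for lam = 1 mirrors the extended-to-localized transition that drives the superfluid-to-Bose-glass physics in the hard-core limit.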

  10. A Reasoning Agent for Credit Card Fraud on the Internet Using the Event Calculus

    NASA Astrophysics Data System (ADS)

    Blackwell, Clive

We illustrate the design of an intelligent agent that helps a merchant limit fraudulent payment card purchases over the Internet. This is important because increasing fraud may limit the rise of e-commerce, and difficult because of the uncertainty in identifying and authenticating people remotely. The agent can advise the merchant what actions to take to reduce risk without complete knowledge of the circumstances. It can also negotiate flexibly to successfully conclude transactions that would otherwise be rejected. We use the Event Calculus to model the transaction system, including the participants and their actions. The idea has applications in other distributed systems where incomplete knowledge of a system may be exploited by adversaries to their advantage.

  11. The NOνA DAQ system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zalesak, Jaroslav; et al.

    2014-01-01

The NOνA experiment is a long-baseline neutrino experiment designed to make measurements to determine the neutrino mass hierarchy, neutrino mixing parameters and CP violation in the neutrino sector. In order to make these measurements the NOνA collaboration has designed a highly distributed, synchronized, continuous digitization and readout system that is able to acquire and correlate data from the Fermilab accelerator complex (NuMI), the NOνA near detector at the Fermilab site and the NOνA far detector which is located 810 km away at Ash River, MN. This system has unique properties that let it fully exploit the physics capabilities of the NOνA detector. The design of the NOνA DAQ system and its capabilities are discussed in this paper.

  12. Quantum Color Image Encryption Algorithm Based on A Hyper-Chaotic System and Quantum Fourier Transform

    NASA Astrophysics Data System (ADS)

    Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong

    2016-12-01

    To improve on the slow processing speed of classical image encryption algorithms and to enhance the security of private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which sequences generated by Chen's hyper-chaotic system scramble and diffuse the three components of the original color image. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses a large key space to resist illegal attacks, sensitive dependence on initial keys, a uniform distribution of gray values in the encrypted image, and weak correlation between adjacent pixels in the cipher-image.
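    As a rough classical illustration of the scramble-and-diffuse step (this is not the paper's quantum algorithm; the logistic map below is a simplified stand-in for Chen's hyper-chaotic system, and all names and keys are ours), one chaotic sequence can permute pixel positions while a second one XORs the gray values:

```python
import numpy as np

def logistic_sequence(n, x0=0.3141, r=3.9999):
    """Chaotic sequence from the logistic map (stand-in for a
    hyper-chaotic generator; x0 plays the role of a secret key)."""
    xs, x = np.empty(n), x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def scramble(img, key=0.3141):
    """Permute pixels by the sort order of one chaotic sequence, then
    diffuse gray values by XOR with a second chaotic keystream."""
    flat = img.ravel()
    perm = np.argsort(logistic_sequence(flat.size, x0=key))
    stream = (logistic_sequence(flat.size, x0=key / 2) * 256).astype(np.uint8)
    return (flat[perm] ^ stream).reshape(img.shape), perm

def unscramble(cipher, perm, key=0.3141):
    """Invert the diffusion (XOR is its own inverse) then the permutation."""
    flat = cipher.ravel()
    stream = (logistic_sequence(flat.size, x0=key / 2) * 256).astype(np.uint8)
    out = np.empty_like(flat)
    out[perm] = flat ^ stream
    return out.reshape(cipher.shape)
```

    Decryption succeeds only with the same key, since a slightly different x0 produces an entirely different permutation and keystream, mirroring the sensitive dependence on initial keys claimed in the abstract.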

  13. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance, these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  14. Data-Aware Retrodiction for Asynchronous Harmonic Measurement in a Cyber-Physical Energy System.

    PubMed

    Liu, Youda; Wang, Xue; Liu, Yanchi; Cui, Sujin

    2016-08-18

    Cyber-physical energy systems provide a networked solution for safety, reliability and efficiency problems in smart grids. On the demand side, a secure and trustworthy energy supply requires real-time supervision and online power-quality assessment, in which harmonics measurement is essential. However, under a large-scale distributed metering architecture, harmonic measurement faces the out-of-sequence measurement (OOSM) problem, which results from latencies in sensing or communication and introduces deviations into data fusion. This paper depicts a distributed measurement network for large-scale asynchronous harmonic analysis and exploits a nonlinear autoregressive model with exogenous inputs (NARX) network to reorder the out-of-sequence measuring data. The NARX network learns the characteristics of the electrical harmonics from practical data rather than from kinematic equations. Thus, the data-aware network approximates the behavior of the practical electrical parameter with real-time data and improves the retrodiction accuracy. Theoretical analysis demonstrates that the data-aware method maintains a reasonable consumption of computing resources. Experiments on a practical cyber-physical-system testbed are implemented, and harmonic measurement and analysis accuracy are adopted to evaluate the measuring mechanism under a distributed metering network. Results demonstrate an improvement in harmonics-analysis precision and validate the asynchronous measuring method in cyber-physical energy systems.
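    The retrodiction idea can be conveyed with a linear ARX model fitted by least squares; the paper's NARX network is a nonlinear neural generalization of this, so the snippet below is only a simplified stand-in with illustrative model orders:

```python
import numpy as np

def fit_arx(y, u, na=2, nb=2):
    """Fit a linear ARX model y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j]
    by least squares (a linear stand-in for the paper's NARX network)."""
    k0 = max(na, nb)
    rows = [np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
            for k in range(k0, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[k0:], rcond=None)
    return theta

def predict_arx(theta, y_hist, u_hist, na=2, nb=2):
    """One-step prediction (or retrodiction of a late sample) from the
    most recent outputs and exogenous inputs."""
    reg = np.concatenate([y_hist[-na:][::-1], u_hist[-nb:][::-1]])
    return reg @ theta
```

    In the OOSM setting, a delayed measurement slot would be filled by `predict_arx` from the samples around it instead of being fused out of order.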

  15. Conditional random matrix ensembles and the stability of dynamical systems

    NASA Astrophysics Data System (ADS)

    Kirk, Paul; Rolando, Delphine M. Y.; MacLean, Adam L.; Stumpf, Michael P. H.

    2015-08-01

    Random matrix theory (RMT) has found applications throughout physics and applied mathematics, in subject areas as diverse as communications networks, population dynamics, neuroscience, and models of the banking system. Many of these analyses exploit elegant analytical results, particularly the circular law and its extensions. In order to apply these results, assumptions must be made about the distribution of matrix elements. Here we demonstrate that the choice of matrix distribution is crucial. In particular, adopting an unrealistic matrix distribution for the sake of analytical tractability is liable to lead to misleading conclusions. We focus on the application of RMT to the long-standing, and at times fractious, ‘diversity-stability debate’, which is concerned with establishing whether large complex systems are likely to be stable. Early work (and subsequent elaborations) brought RMT to bear on the debate by modelling the entries of a system’s Jacobian matrix as independent and identically distributed (i.i.d.) random variables. These analyses were successful in yielding general results that were not tied to any specific system, but relied upon a restrictive i.i.d. assumption. Other studies took an opposing approach, seeking to elucidate general principles of stability through the analysis of specific systems. Here we develop a statistical framework that reconciles these two contrasting approaches. We use a range of illustrative dynamical systems examples to demonstrate that: (i) stability probability cannot be summarily deduced from any single property of the system (e.g. its diversity); and (ii) our assessment of stability depends on adequately capturing the details of the systems analysed. Failing to condition on the structure of dynamical systems will skew our analysis and can, even for very small systems, result in an unnecessarily pessimistic diagnosis of their stability.
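    The classic i.i.d. setup the authors critique is simple to reproduce: draw a Jacobian J = -dI + A with i.i.d. Gaussian entries and check the sign of the leading eigenvalue's real part. The circular law predicts stability exactly when sigma*sqrt(N) < d. (Illustrative sketch only; the matrix size and sigma values are our choices.)

```python
import numpy as np

rng = np.random.default_rng(0)

def max_real_eig(N, sigma, d=1.0):
    """Leading real part of the spectrum of J = -d*I + A, where A has
    i.i.d. N(0, sigma^2) entries. By the circular law the eigenvalues of
    A fill a disk of radius sigma*sqrt(N), so J is stable iff
    sigma*sqrt(N) < d (May's criterion)."""
    A = rng.normal(0.0, sigma, size=(N, N))
    J = -d * np.eye(N) + A
    return np.linalg.eigvals(J).real.max()

# Well inside / outside the threshold sigma*sqrt(N) = d:
stable = max_real_eig(N=200, sigma=0.5 / np.sqrt(200))    # radius 0.5
unstable = max_real_eig(N=200, sigma=2.0 / np.sqrt(200))  # radius 2.0
```

    The paper's point is that conditioning A on realistic structure, rather than sampling it i.i.d. as here, can change this verdict even for small systems.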

  16. On the Path to SunShot. Emerging Issues and Challenges in Integrating Solar with the Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Broderick, Robert; Mather, Barry

    2016-05-01

    This report analyzes distribution-integration challenges, solutions, and research needs in the context of distributed generation from PV (DGPV) deployment to date and the much higher levels of deployment expected with achievement of the U.S. Department of Energy's SunShot targets. Recent analyses have improved estimates of the DGPV hosting capacities of distribution systems. This report uses these results to statistically estimate the minimum DGPV hosting capacity for the contiguous United States at approximately 170 GW with traditional inverters and no distribution system modifications. This hosting capacity roughly doubles if advanced inverters are used to manage local voltage, and additional minor, low-cost changes could further increase these levels substantially. Key to achieving these deployment levels at minimum cost is siting DGPV based on local hosting capacities, suggesting opportunities for regulatory, incentive, and interconnection innovation. Already, pre-computed hosting capacity is beginning to expedite DGPV interconnection requests and installations in select regions; however, realizing SunShot-scale deployment will require further improvements to DGPV interconnection processes, standards and codes, and compensation mechanisms so they embrace the contributions of DGPV to system-wide operations. SunShot-scale DGPV deployment will also require unprecedented coordination of the distribution and transmission systems.
This includes harnessing DGPV's ability to relieve congestion and reduce system losses by generating closer to loads; minimizing system operating costs and reserve deployments through improved DGPV visibility; developing communication and control architectures that incorporate DGPV into system operations; providing frequency response, transient stability, and synthesized inertia with DGPV in the event of large-scale system disturbances; and potentially managing reactive power requirements due to large-scale deployment of advanced inverter functions. Finally, additional local and system-level value could be provided by integrating DGPV with energy storage and 'virtual storage,' which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Together, continued innovation across this rich distribution landscape can enable the very-high deployment levels envisioned by SunShot.

  17. Photonic elements in smart systems for use in aerospace platforms

    NASA Astrophysics Data System (ADS)

    Adamovsky, Grigory; Baumbick, Robert J.; Tabib-Azar, Massood

    1998-07-01

    To compete globally in the next millennium, designers of new transportation vehicles will have to be innovative. Keen competition will reward innovative concepts that are developed and proven first. In order to improve the reliability of aerospace platforms and reduce operating costs, new technologies must be exploited to produce autonomous systems, based on highly distributed, smart subsystems, that can be treated as line-replaceable units. These technologies include photonics, which provides sensing and information-transfer functions, and microelectromechanical systems, which will provide actuation and, in some cases, may even provide a computing capability resembling the hydromechanical control systems used in older aircraft. The combination of these technologies will yield unique systems that achieve the reliability and cost goals dictated by the global market. In this article we review some of these issues and discuss the role of photonics in smart systems for aerospace platforms.

  18. Equalizer: a scalable parallel rendering framework.

    PubMed

    Eilemann, Stefan; Makhinya, Maxim; Pajarola, Renato

    2009-01-01

    Continuing improvements in CPU and GPU performance, together with increasing multi-core and cluster-based parallelism, demand flexible and scalable parallel rendering solutions that can exploit multipipe, hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop, and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic, supporting various types of data and visualization applications, and at the same time work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL that provides an application programming interface (API) for developing scalable graphics applications for a wide range of systems, from large distributed visualization clusters and multi-processor, multipipe graphics systems to single-processor, single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations, usage scenarios and scalability results.
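    Equalizer itself is a C++/OpenGL toolkit; purely to illustrate the sort-first (screen-space) decomposition that such frameworks automate, the following sketch splits a viewport into per-node tiles. This is our simplification: a real compound description would also balance load dynamically and composite the partial frames.

```python
def sort_first_tiles(width, height, n_nodes):
    """Split a viewport into horizontal bands, one per render node:
    the simplest sort-first (screen-space) decomposition. Each tile is
    (x, y, w, h); together the tiles cover the viewport exactly."""
    ys = [round(i * height / n_nodes) for i in range(n_nodes + 1)]
    return [(0, ys[i], width, ys[i + 1] - ys[i]) for i in range(n_nodes)]
```

    Each node renders only its band's frustum, so fill-rate cost scales down with the number of nodes, at the price of a final composition step.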

  19. JPEG 2000 in advanced ground station architectures

    NASA Astrophysics Data System (ADS)

    Chien, Alan T.; Brower, Bernard V.; Rajan, Sreekanth D.

    2000-11-01

    The integration and management of information from distributed and heterogeneous information producers and providers must be a key foundation of any developing imagery intelligence system. Historically, imagery providers acted as production agencies for imagery, imagery intelligence, and geospatial information. In the future, these imagery producers will evolve to act more like e-business information brokers. The management of imagery and geospatial information (visible, spectral, infrared (IR), radar, elevation, or other feature and foundation data) is crucial from a quality and content perspective. By 2005, there will be significantly advanced collection systems and a myriad of storage devices. There will also be a number of automated and man-in-the-loop correlation, fusion, and exploitation capabilities. All of these new imagery collection and storage systems will result in a higher volume and greater variety of imagery being disseminated and archived. This paper illustrates the importance, from a collection, storage, exploitation, and dissemination perspective, of the proper selection and implementation of standards-based compression technology for ground stations and dissemination/archive networks. It specifically discusses the new compression capabilities featured in JPEG 2000 and how this commercially based technology can provide significant improvements to the overall imagery and geospatial enterprise, from both an architectural and a user's perspective.

  20. An ecology-oriented exploitation mode of groundwater resources in the northern Tianshan Mountains, China

    NASA Astrophysics Data System (ADS)

    Shang, Haimin; Wang, Wenke; Dai, Zhenxue; Duan, Lei; Zhao, Yaqian; Zhang, Jing

    2016-12-01

    In recent years, ecological degradation caused by irrational groundwater exploitation has been of growing concern in arid and semiarid regions. To address the groundwater-ecological issues, this paper proposes a groundwater-resource exploitation mode to evaluate the tradeoff between groundwater development and ecological environment in the northern Tianshan Mountains, northwest China's Xinjiang Uygur Autonomous Region. Field surveys and remote sensing studies were conducted to analyze the relation between the distribution of hydrological conditions and the occurrence of ecological types. The results show that there is a good correlation between groundwater depth and the supergene ecological type. Numerical simulations and ecological assessment models were applied to develop an ecology-oriented exploitation mode of groundwater resources. The mode allows the groundwater levels in different zones to be regulated by optimizing groundwater exploitation modes. The prediction results show that the supergene ecological quality will be better in 2020 and even more groundwater can be exploited in this mode. This study provides guidance for regional groundwater management, especially in regions with an obvious water scarcity.

  1. An ecology-oriented exploitation mode of groundwater resources in the northern foot of Tianshan Mountain, China

    DOE PAGES

    Shang, Haimin; Wang, Wenke; Dai, Zhenxue; ...

    2016-10-10

    In recent years, ecological degradation caused by irrational groundwater exploitation has been of growing concern in arid and semiarid regions. To address the groundwater-ecological issues, this paper proposes a groundwater-resource exploitation mode to evaluate the tradeoff between groundwater development and ecological environment in the northern Tianshan Mountains, northwest China's Xinjiang Uygur Autonomous Region. Field surveys and remote sensing studies were conducted to analyze the relation between the distribution of hydrological conditions and the occurrence of ecological types. The results show that there is a good correlation between groundwater depth and the supergene ecological type. Numerical simulations and ecological assessment models were applied to develop an ecology-oriented exploitation mode of groundwater resources. The mode allows the groundwater levels in different zones to be regulated by optimizing groundwater exploitation modes. The prediction results show that the supergene ecological quality will be better in 2020 and even more groundwater can be exploited in this mode. This study provides guidance for regional groundwater management, especially in regions with obvious water scarcity.

  2. An ecology-oriented exploitation mode of groundwater resources in the northern foot of Tianshan Mountain, China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, Haimin; Wang, Wenke; Dai, Zhenxue

    In recent years, ecological degradation caused by irrational groundwater exploitation has been of growing concern in arid and semiarid regions. To address the groundwater-ecological issues, this paper proposes a groundwater-resource exploitation mode to evaluate the tradeoff between groundwater development and ecological environment in the northern Tianshan Mountains, northwest China's Xinjiang Uygur Autonomous Region. Field surveys and remote sensing studies were conducted to analyze the relation between the distribution of hydrological conditions and the occurrence of ecological types. The results show that there is a good correlation between groundwater depth and the supergene ecological type. Numerical simulations and ecological assessment models were applied to develop an ecology-oriented exploitation mode of groundwater resources. The mode allows the groundwater levels in different zones to be regulated by optimizing groundwater exploitation modes. The prediction results show that the supergene ecological quality will be better in 2020 and even more groundwater can be exploited in this mode. This study provides guidance for regional groundwater management, especially in regions with obvious water scarcity.

  3. Role of social interactions in dynamic patterns of resource patches and forager aggregation.

    PubMed

    Tania, Nessy; Vanderlei, Ben; Heath, Joel P; Edelstein-Keshet, Leah

    2012-07-10

    The dynamics of resource patches, and of the species that exploit such patches, are of interest to ecologists, conservation biologists, modelers, and mathematicians. Here we consider how social interactions can create unique, evolving patterns in space and time. Whereas simple prey taxis (with consumable prey) promotes spatially uniform distributions, here we show that taxis in producer-scrounger groups can lead to pattern formation. We consider two types of foragers: those that search for food directly ("producers") and those that exploit other foragers to find food ("scroungers" or exploiters). We show that such groups can sustain fluctuating spatiotemporal patterns, akin to "waves of pursuit." Investigating the relative benefits to individuals, we observed conditions under which either strategy leads to enhanced success, defined as net food consumption. Foragers that search for food directly have an advantage when food patches are localized. Those that seek aggregations of group mates do better when their ability to track group mates exceeds the foragers' food-sensing acuity. When behavioral switching or reproductive success of the strategies is included, the relative abundance of foragers and exploiters is dynamic over time, in contrast with classic models that predict stable frequencies. Our work shows the importance of considering two-way interaction, i.e., how food distribution both influences and is influenced by social foraging and the aggregation of predators.
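    A minimal agent-based caricature of the producer-scrounger setup (our sketch, not the authors' continuum model; all parameters are arbitrary) has producers climb the food gradient while scroungers climb the producer-density gradient:

```python
import numpy as np

rng = np.random.default_rng(1)

def step(pos, field):
    """Crude taxis rule: move each agent one cell toward the larger
    neighbouring value of `field`, staying inside the domain."""
    left = field[np.clip(pos - 1, 0, field.size - 1)]
    right = field[np.clip(pos + 1, 0, field.size - 1)]
    return np.clip(pos + np.sign(right - left).astype(int), 0, field.size - 1)

L = 100
food = np.exp(-((np.arange(L) - 70) ** 2) / 50.0)    # localized food patch
producers = rng.integers(0, L, 200)
scroungers = rng.integers(0, L, 200)
for _ in range(60):
    producers = step(producers, food)                 # producers sense food
    dens = np.bincount(producers, minlength=L).astype(float)
    scroungers = step(scroungers, dens)               # scroungers track producers
```

    With a localized patch the producers converge on it, consistent with the abstract's claim that direct searchers win when food is localized; making the food field consumable and mobile is what generates the "waves of pursuit" studied in the paper.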

  4. A tele-home care system exploiting the DVB-T technology and MHP.

    PubMed

    Angius, G; Pani, D; Raffo, L; Randaccio, P; Seruis, S

    2008-01-01

    The aim of this research work is the development of a low-cost system for telemedicine based on DVB-T technology. The diffusion of the DVB-T standard and the low cost of DVB-T set-top boxes bring closer the vision of a capillary distribution of tele-home care monitoring systems with an easy-to-use patient interface. Exploiting the potential of the DVB-T set-top box, we transformed it into an "on-demand tele-home care interface". The Xlet we developed is able to govern an external microcontroller-based unit for the acquisition of the bio-signals of interest. The uplink connection is used to send the exam results to a remote care center. The Xlet providing the patient interface on the set-top box is uploaded by a DVB-T broadcaster without any intervention in the patient's home. A prototypal low-cost base station for the acquisition of the patient's signals (1-lead ECG) has been developed; it connects to the set-top box via an infrared link. A smart-card-based system is in charge of customizing the Xlet for each patient. The proposed system, based on a currently widespread infrastructure, allows patients to be monitored at home without any installation procedure. Even untrained (or elderly) people can easily use such a system, given their familiarity with basic DVB-T home-entertainment equipment.

  5. Experimental demonstration of subcarrier multiplexed quantum key distribution system.

    PubMed

    Mora, José; Ruiz-Alba, Antonio; Amaya, Waldimar; Martínez, Alfonso; García-Muñoz, Víctor; Calvo, David; Capmany, José

    2012-06-01

    We provide, to our knowledge, the first experimental demonstration of the feasibility of sending several parallel keys by exploiting the technique of subcarrier multiplexing (SCM) widely employed in microwave photonics. This approach brings several advantages, such as high spectral efficiency compatible with current secure key rates, the sharing of the optical faint pulse by all the quantum multiplexed channels (reducing system complexity), and the possibility of upgrading with wavelength division multiplexing in a two-tier scheme to increase the number of parallel keys. Two independent quantum SCM channels featuring a sifted key rate of 10 kb/s per channel over a link with a quantum bit error rate below 2% are reported.

  6. The PLUTO code for astrophysical gasdynamics .

    NASA Astrophysics Data System (ADS)

    Mignone, A.

    Present numerical codes appeal to a consolidated theory based on finite-difference and Godunov-type schemes. In this context we have developed a versatile numerical code, PLUTO, suitable for the solution of high-Mach-number flow in 1, 2 and 3 spatial dimensions and in different systems of coordinates. Different hydrodynamic modules and algorithms may be independently selected to properly describe Newtonian, relativistic, MHD, or relativistic MHD fluids. The modular structure exploits a general framework for integrating a system of conservation laws, built on modern Godunov-type shock-capturing schemes. The code is freely distributed under the GNU public license and is available for download to the astrophysical community at http://plutocode.to.astro.it.
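    The Godunov-type finite-volume machinery that codes like PLUTO build on can be shown in miniature with inviscid Burgers' equation and the Rusanov (local Lax-Friedrichs) flux. This one-dimensional toy is our illustration, not PLUTO code:

```python
import numpy as np

def burgers_godunov(u0, dx, t_end, cfl=0.45):
    """Advance u_t + (u^2/2)_x = 0 on a periodic grid with a first-order
    finite-volume scheme and the Rusanov flux, the simplest member of the
    Godunov-type shock-capturing family. The update is conservative:
    u_i^{n+1} = u_i^n - dt/dx * (F_{i+1/2} - F_{i-1/2})."""
    u, t = u0.copy(), 0.0
    while t < t_end:
        dt = min(cfl * dx / max(np.abs(u).max(), 1e-12), t_end - t)
        up = np.roll(u, -1)                      # periodic right neighbour
        a = np.maximum(np.abs(u), np.abs(up))    # local wave-speed bound
        f = 0.25 * (u**2 + up**2) - 0.5 * a * (up - u)   # F_{i+1/2}
        u = u - dt / dx * (f - np.roll(f, 1))    # np.roll(f, 1) is F_{i-1/2}
        t += dt
    return u
```

    Because the scheme is written in flux form, the cell average sum is conserved to round-off even after a shock forms, which is the core property production codes generalize to full MHD systems.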

  7. Commercial imagery archive, management, exploitation, and distribution project development

    NASA Astrophysics Data System (ADS)

    Hollinger, Bruce; Sakkas, Alysa

    1999-10-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components, and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) the ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds, whether from the archive or in real time; and (7) scalability that maintains information throughput as the digital library grows.

  8. Commercial imagery archive, management, exploitation, and distribution product development

    NASA Astrophysics Data System (ADS)

    Hollinger, Bruce; Sakkas, Alysa

    1999-12-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components, and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (a) a potentially unbounded data archive (5000 TB range); (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, whether from the archive or in real time; and (g) scalability that maintains information throughput as the digital library grows.

  9. Parallelization and checkpointing of GPU applications through program transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solano-Quinde, Lizandro Damian

    2012-01-01

    GPUs have emerged as a powerful tool for accelerating general-purpose applications. The availability of programming languages that make writing general-purpose applications for GPUs tractable has consolidated GPUs as an alternative for accelerating general-purpose applications. Among the areas that have benefited from GPU acceleration are signal and image processing, computational fluid dynamics, quantum chemistry, and, in general, the high-performance computing (HPC) industry. In order to continue to exploit higher levels of parallelism with GPUs, multi-GPU systems are gaining popularity, and single-GPU applications are parallelized for running on them. Furthermore, multi-GPU systems help overcome the GPU memory limitation for applications with large memory footprints. Parallelizing single-GPU applications has been approached with libraries that distribute the workload at runtime; however, these impose execution overhead and are not portable. On traditional CPU systems, by contrast, parallelization has been approached through application transformation at pre-compile time, which enhances the application to distribute the workload at the application level and avoids the issues of library-based approaches. Hence, a parallelization scheme for GPU systems based on application transformation is needed. Like any computing engine of today, GPUs also raise reliability concerns: they are vulnerable to transient and permanent failures, and current checkpoint/restart techniques are not suitable for systems with GPUs. Checkpointing for GPU systems presents new and interesting challenges, primarily due to the natural differences imposed by the hardware design, the memory subsystem architecture, the massive number of threads, and the limited amount of synchronization among threads. Therefore, a checkpoint/restart technique suitable for GPU systems is needed.
The goal of this work is to exploit higher levels of parallelism and to develop support for application-level fault tolerance in applications using multiple GPUs. Our techniques reduce the burden of enhancing single-GPU applications to support these features. To achieve our goal, this work designs and implements a framework for enhancing a single-GPU OpenCL application through application transformation.
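    The application-level checkpoint/restart idea can be sketched in miniature, with no GPU specifics: a loop periodically persists its state so that a restart resumes from the last checkpoint instead of recomputing from scratch. The file format, interval, and workload below are arbitrary choices for illustration.

```python
import os
import pickle

def checkpointed_sum(n_steps, ckpt_path, every=100):
    """Toy application-level checkpoint/restart: persist (step, acc)
    every `every` iterations so a crash (or GPU failure) resumes from
    the last checkpoint rather than from step 0."""
    step, acc = 0, 0
    if os.path.exists(ckpt_path):            # resume if a checkpoint exists
        with open(ckpt_path, "rb") as f:
            step, acc = pickle.load(f)
    while step < n_steps:
        acc += step                          # stand-in for real GPU work
        step += 1
        if step % every == 0:
            with open(ckpt_path, "wb") as f:  # a real system would write
                pickle.dump((step, acc), f)   # atomically (temp + rename)
    return acc
```

    The GPU-specific difficulty the dissertation addresses is exactly what this sketch hides: deciding when device memory and in-flight kernels form a consistent state that can be serialized.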

  10. Secure detection in quantum key distribution by real-time calibration of receiver

    NASA Astrophysics Data System (ADS)

    Marøy, Øystein; Makarov, Vadim; Skaar, Johannes

    2017-12-01

    The single-photon detection efficiency of the detector unit is crucial for the security of common quantum key distribution protocols like Bennett-Brassard 1984 (BB84). A low value of the efficiency indicates a possible eavesdropping attack that exploits the photon receiver's imperfections. We present a method for estimating the detection efficiency, and calculate the corresponding secure key generation rate. The estimation is done by testing gated detectors using a randomly activated photon source inside the receiver unit. This estimate gives a secure rate for any detector with non-unity single-photon detection efficiency, whether inherent or due to blinding. By adding extra optical components to the receiver, we make sure that the key is extracted from photon states for which our estimate is valid. The result is a quantum key distribution scheme that is secure against any attack that exploits detector imperfections.
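    The flavor of the approach can be conveyed with a toy calculation (our simplification, not the paper's security proof): estimate efficiency as the click fraction on internal calibration pulses, refuse to generate key when it falls below a trusted bound, and otherwise apply a GLLP-style BB84 rate with binary-entropy costs for error correction and privacy amplification. The `eta_min` bound and the hard abort are assumptions of this sketch.

```python
import numpy as np

def binary_entropy(p):
    """Binary Shannon entropy h(p) in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def estimate_efficiency(calibration_clicks, calibration_pulses):
    """Detection-efficiency estimate from test pulses fired by a
    calibration source inside the receiver (illustrative only)."""
    return calibration_clicks / calibration_pulses

def bb84_key_rate(qber, sifted_rate, efficiency, eta_min=0.5):
    """Toy BB84 secret-key rate: abort if the measured efficiency is
    below a trusted bound (possible blinding attack); otherwise pay
    h(qber) for error correction and h(qber) for privacy amplification."""
    if efficiency < eta_min:
        return 0.0
    return max(0.0, sifted_rate * (1.0 - 2.0 * binary_entropy(qber)))
```

    As expected for BB84-style rates, the key rate vanishes as the QBER approaches 11%, and a suspiciously low measured efficiency forces an abort regardless of the error rate.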

  11. Assessing the Validity of Automated Webcrawlers as Data Collection Tools to Investigate Online Child Sexual Exploitation.

    PubMed

    Westlake, Bryce; Bouchard, Martin; Frank, Richard

    2017-10-01

    The distribution of child sexual exploitation (CE) material has been aided by the growth of the Internet. The graphic nature and prevalence of the material have made it difficult to research and combat. Although used to study online CE distribution, automated data collection tools (e.g., webcrawlers) have yet to be shown effective at targeting only relevant data. Using CE-related image and keyword criteria, we compare networks seeded from CE websites to those seeded from similar non-CE sexuality websites and from dissimilar sports websites. Our results provide evidence that (a) webcrawlers have the potential to provide valid CE data if the appropriate criterion is selected; (b) CE distribution is still heavily image-based, suggesting images as an effective criterion; and (c) CE-seeded networks are more hub-based and differ from non-CE-seeded networks on several website characteristics. Recommendations for improving the reliability of criteria selection are discussed.

  12. Model-Unified Planning and Execution for Distributed Autonomous System Control

    NASA Technical Reports Server (NTRS)

    Aschwanden, Pascal; Baskaran, Vijay; Bernardini, Sara; Fry, Chuck; Moreno, Maria; Muscettola, Nicola; Plaunt, Chris; Rijsman, David; Tompkins, Paul

    2006-01-01

    The Intelligent Distributed Execution Architecture (IDEA) is a real-time architecture that exploits artificial intelligence planning as the core reasoning engine for interacting autonomous agents. Rather than enforcing separate deliberation and execution layers, IDEA unifies them under a single planning technology. Deliberative and reactive planners reason about, and act according to, a single representation of the past, present and future domain state. The domain state evolves according to the rules dictated by a declarative model of the subsystem to be controlled, the internal processes of the IDEA controller, and interactions with other agents. We present the IDEA concepts (modeling, the IDEA core architecture, and the unification of deliberation and reaction under planning) and illustrate their use in a simple example. Finally, we present several real-world applications of IDEA and compare it to other high-level control approaches.

  13. Probing and exploiting the chaotic dynamics of a hydrodynamic photochemical oscillator to implement all the basic binary logic functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayashi, Kenta; Gotoda, Hiroshi

    2016-05-15

    The convective motions within a solution of a photochromic spiro-oxazine, irradiated by UV only on the bottom part of its volume, give rise to aperiodic spectrophotometric dynamics. In this paper, we study three nonlinear properties of the aperiodic time series: permutation entropy, short-term predictability and long-term unpredictability, and the degree distribution of the visibility-graph networks. After ascertaining the extracted chaotic features, we show how the aperiodic time series can be exploited to implement all the fundamental two-input binary logic functions (AND, OR, NAND, NOR, XOR, and XNOR) and some basic arithmetic operations (half-adder, full-adder, half-subtractor). This is possible due to the wide range of states a nonlinear system accesses in the course of its evolution. The solution of the convective photochemical oscillator therefore serves as hardware for chaos computing, an alternative to conventional complementary metal-oxide semiconductor-based integrated circuits.
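    Of the three nonlinear measures named above, permutation entropy is the most compact to state. As a minimal sketch (not the authors' code), the normalized Bandt-Pompe permutation entropy of a time series can be computed as:

```python
from itertools import permutations
from math import log2

def permutation_entropy(series, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy of a time series.

    Counts the relative frequency of ordinal patterns of length `order`
    and returns the Shannon entropy of that distribution, normalized by
    log2(order!) so the result lies in [0, 1].
    """
    counts = {p: 0 for p in permutations(range(order))}
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # ordinal pattern: indices of the window sorted by value
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    probs = [c / n for c in counts.values() if c > 0]
    h = -sum(p * log2(p) for p in probs)
    return h / log2(len(counts))  # normalize by log2(order!)

# A strictly increasing series exhibits a single ordinal pattern -> entropy 0
print(permutation_entropy(list(range(100))))  # 0.0
```

    A value near 0 indicates a highly ordered series, values near 1 indicate pattern frequencies close to white noise, and chaotic series typically fall in between.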

  14. Expanding the biotechnology potential of lactobacilli through comparative genomics of 213 strains and associated genera.

    PubMed

    Sun, Zhihong; Harris, Hugh M B; McCann, Angela; Guo, Chenyi; Argimón, Silvia; Zhang, Wenyi; Yang, Xianwei; Jeffery, Ian B; Cooney, Jakki C; Kagawa, Todd F; Liu, Wenjun; Song, Yuqin; Salvetti, Elisa; Wrobel, Agnieszka; Rasinkangas, Pia; Parkhill, Julian; Rea, Mary C; O'Sullivan, Orla; Ritari, Jarmo; Douillard, François P; Paul Ross, R; Yang, Ruifu; Briner, Alexandra E; Felis, Giovanna E; de Vos, Willem M; Barrangou, Rodolphe; Klaenhammer, Todd R; Caufield, Page W; Cui, Yujun; Zhang, Heping; O'Toole, Paul W

    2015-09-29

    Lactobacilli are a diverse group of species that occupy diverse nutrient-rich niches associated with humans, animals, plants and food. They are used widely in biotechnology and food preservation, and are being explored as therapeutics. Exploiting lactobacilli has been complicated by metabolic diversity, unclear species identity and uncertain relationships between them and other commercially important lactic acid bacteria. The capacity for biotransformations catalysed by lactobacilli is an untapped biotechnology resource. Here we report the genome sequences of 213 Lactobacillus strains and associated genera, and their encoded genetic catalogue for modifying carbohydrates and proteins. In addition, we describe the broad and diverse presence of novel CRISPR-Cas immune systems in lactobacilli that may be exploited for genome editing. We rationalize the phylogenomic distribution of host interaction factors and bacteriocins that affect their natural and industrial environments, and mechanisms to withstand stress during technological processes. We present a robust phylogenomic framework of existing species and for classifying new species.

  15. Regional groundwater flow modeling of the Geba basin, northern Ethiopia

    NASA Astrophysics Data System (ADS)

    Gebreyohannes, Tesfamichael; De Smedt, Florimond; Walraevens, Kristine; Gebresilassie, Solomon; Hussien, Abdelwassie; Hagos, Miruts; Amare, Kassa; Deckers, Jozef; Gebrehiwot, Kindeya

    2017-05-01

    The Geba basin is one of the most food-insecure areas of the Tigray regional state in northern Ethiopia due to recurrent drought resulting from the erratic distribution of rainfall. Since the beginning of the 1990s, rain-fed agriculture has been supported through small-scale irrigation schemes, mainly by surface-water harvesting, but success has been limited. Hence, the use of groundwater for irrigation purposes has gained considerable attention. The main purpose of this study is to assess the groundwater resources of the Geba basin by means of a MODFLOW modeling approach. The model is calibrated using observed groundwater levels, yielding a clear insight into the groundwater flow systems and reserves. Results show that none of the hydrogeological formations can be considered aquifers suitable for large-scale groundwater exploitation. However, aquitards can be identified that can support small-scale groundwater abstraction for irrigation needs in regions that are either designated as groundwater discharge areas or where groundwater levels are shallow and can be tapped by hand-dug wells or shallow boreholes.

  16. Experimental Satellite Quantum Communications

    NASA Astrophysics Data System (ADS)

    Vallone, Giuseppe; Bacco, Davide; Dequal, Daniele; Gaiarin, Simone; Luceri, Vincenza; Bianco, Giuseppe; Villoresi, Paolo

    2015-07-01

    Quantum communication (QC), namely, the faithful transmission of generic quantum states, is a key ingredient of quantum information science. Here we demonstrate QC with polarization encoding from space to ground by exploiting satellite corner cube retroreflectors as quantum transmitters in orbit and the Matera Laser Ranging Observatory of the Italian Space Agency in Matera, Italy, as a quantum receiver. The quantum bit error ratio (QBER) has been kept steadily low, at a level suitable for several quantum information protocols, such as the violation of Bell inequalities or quantum key distribution (QKD). Indeed, by taking data from different satellites, we demonstrate an average value of QBER = 4.6% for a total link duration of 85 s. The mean photon number per pulse μsat leaving the satellites was estimated to be of the order of one. In addition, we propose a fully operational satellite QKD system by exploiting our communication scheme with orbiting retroreflectors equipped with a modulator, a very compact payload. Our scheme paves the way toward the implementation of a QC worldwide network leveraging existing receivers.
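    The headline QBER figure is a counts-weighted ratio over all satellite passes. A minimal sketch (the pass counts below are invented for illustration, not the measured data):

```python
def average_qber(passes):
    """Counts-weighted quantum bit error ratio over satellite passes.

    Each entry is (erroneous_detections, total_sifted_detections) for
    one pass; pooling the counts weights each pass by its statistics.
    """
    wrong = sum(w for w, _ in passes)
    total = sum(t for _, t in passes)
    return wrong / total

# Two hypothetical passes whose pooled error ratio comes out at 4.6%
passes = [(46, 1000), (92, 2000)]
print(f"QBER = {average_qber(passes):.1%}")
```

    Keeping the pooled ratio below the security threshold of the chosen protocol (around 11% for BB84-style QKD) is what makes the link usable for key distribution.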

  17. Expanding the biotechnology potential of lactobacilli through comparative genomics of 213 strains and associated genera

    PubMed Central

    Sun, Zhihong; Harris, Hugh M. B.; McCann, Angela; Guo, Chenyi; Argimón, Silvia; Zhang, Wenyi; Yang, Xianwei; Jeffery, Ian B; Cooney, Jakki C.; Kagawa, Todd F.; Liu, Wenjun; Song, Yuqin; Salvetti, Elisa; Wrobel, Agnieszka; Rasinkangas, Pia; Parkhill, Julian; Rea, Mary C.; O'Sullivan, Orla; Ritari, Jarmo; Douillard, François P.; Paul Ross, R.; Yang, Ruifu; Briner, Alexandra E.; Felis, Giovanna E.; de Vos, Willem M.; Barrangou, Rodolphe; Klaenhammer, Todd R.; Caufield, Page W.; Cui, Yujun; Zhang, Heping; O'Toole, Paul W.

    2015-01-01

    Lactobacilli are a diverse group of species that occupy diverse nutrient-rich niches associated with humans, animals, plants and food. They are used widely in biotechnology and food preservation, and are being explored as therapeutics. Exploiting lactobacilli has been complicated by metabolic diversity, unclear species identity and uncertain relationships between them and other commercially important lactic acid bacteria. The capacity for biotransformations catalysed by lactobacilli is an untapped biotechnology resource. Here we report the genome sequences of 213 Lactobacillus strains and associated genera, and their encoded genetic catalogue for modifying carbohydrates and proteins. In addition, we describe broad and diverse presence of novel CRISPR-Cas immune systems in lactobacilli that may be exploited for genome editing. We rationalize the phylogenomic distribution of host interaction factors and bacteriocins that affect their natural and industrial environments, and mechanisms to withstand stress during technological processes. We present a robust phylogenomic framework of existing species and for classifying new species. PMID:26415554

  18. Diffusive coevolution and mutualism maintenance mechanisms in a fig-fig wasp system.

    PubMed

    Wang, Rui-Wu; Sun, Bao-Fa; Zheng, Qi

    2010-05-01

    In reciprocal mutualism systems, exploitation events by exploiters might disrupt the reciprocal mutualism, wherein one exploiter species might even exclude other coexisting exploiter species over an evolutionary time frame. What remains unclear is how such a community is maintained. Niche partitioning, or spatial heterogeneity among the mutualists and exploiters, is generally believed to enable stability within a mutualistic system. However, our examination of a reciprocal mutualism between a fig species (Ficus racemosa) and its pollinator wasp (Ceratosolen fusciceps) shows that spatial niche partitioning does not sufficiently prevent exploiters from overexploiting the common resource (i.e., the female flowers), because of the considerable niche overlap between the mutualists and exploiters. In response to an exploiter, our experiment shows that the fig can (1) abort syconia whose flowers have been galled by the exploiter Apocryptophagus testacea, which oviposits before the pollinators do; and (2) retain syconia whose flowers are galled by Apocryptophagus mayri, which oviposits later than the pollinators. However, as a result of (2), there is decreased development of adult non-pollinator or pollinator species in syconia that have not been sufficiently pollinated but are not aborted. Such discriminative abortion of figs, or reduction in the offspring development of exploiters while rewarding cooperative individuals with higher offspring development, will increase the fitness of cooperative pollinating wasps but decrease the fitness of exploiters. The fig-fig wasp interactions are diffusively coevolved, a case in which fig wasps diversify their genotype, phenotype, or behavior as a result of competition between wasps, while figs diversify their strategies to facilitate the evolution of cooperative fig wasps or lessen the detrimental behavior of associated fig wasps.
In habitats or syconia that suffer overexploitation, discriminative abortion of figs or reduction in the offspring development of exploiters in syconia that are not or not sufficiently pollinated will decrease exploiter fitness and perhaps even drive the population of exploiters to local extinction, enabling the evolution and maintenance of cooperative pollinators through the movement between habitats or syconia (i.e., the metapopulations).

  19. MQW Optical Feedback Modulators And Phase Shifters

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J.

    1995-01-01

    Laser diodes equipped with proposed multiple-quantum-well (MQW) optical feedback modulators prove useful in a variety of analog and digital optical-communication applications, including fiber-optic signal-distribution networks and high-speed, low-crosstalk interconnections among supercomputers or very-high-speed integrated circuits. Development exploits accompanying electro-optical aspect of QCSE - variation in index of refraction with applied electric field. Also exploits sensitivity of laser diodes to optical feedback. Approach is reverse of prior approach.

  20. Rapid establishment of a regular distribution of adult tropical Drosophila parasitoids in a multi-patch environment by patch defence behaviour.

    PubMed

    de Jong, Peter W; Hemerik, Lia; Gort, Gerrit; van Alphen, Jacques J M

    2011-01-01

    Females of the larval parasitoid of Drosophila, Asobara citri, from sub-Saharan Africa, defend patches with hosts by fighting and chasing conspecific females upon encounter. Females of the closely related, palearctic species Asobara tabida do not defend patches and often search simultaneously in the same patch. The effect of patch defence by A. citri females on their distribution in a multi-patch environment was investigated, and their distributions were compared with those of A. tabida. For both species 20 females were released from two release-points in replicate experiments. Females of A. citri quickly reached a regular distribution across 16 patches, with a small variance/mean ratio per patch. Conversely, A. tabida females initially showed a clumped distribution, and after gradual dispersion, a more Poisson-like distribution across patches resulted (variance/mean ratio was closer to 1 and higher than for A. citri). The dispersion of A. tabida was most probably an effect of exploitation: these parasitoids increasingly made shorter visits to already exploited patches. We briefly discuss hypotheses on the adaptive significance of patch defence behaviour or its absence in the light of differences in the natural history of both parasitoid species, notably the spatial distribution of their hosts.
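    The variance/mean ratio used above to distinguish regular, Poisson-like, and clumped distributions is straightforward to compute. A minimal sketch, with invented patch counts and an arbitrary tolerance band around 1:

```python
from statistics import mean, pvariance

def dispersion_index(counts):
    """Variance-to-mean ratio of parasitoid counts per patch.

    ~1 indicates a random (Poisson-like) distribution, <1 a regular
    (evenly spread) one, >1 a clumped one.
    """
    return pvariance(counts) / mean(counts)

def classify(counts, tol=0.2):
    # tol is an illustrative tolerance band, not a statistical test
    d = dispersion_index(counts)
    if d < 1 - tol:
        return "regular"
    if d > 1 + tol:
        return "clumped"
    return "random"

# 20 wasps over 16 patches: an even spread vs. all on two patches
even = [1, 1, 1, 1, 2, 1, 1, 1, 2, 1, 1, 2, 1, 1, 2, 1]
clumped = [10, 10] + [0] * 14
print(classify(even), classify(clumped))  # regular clumped
```

    A formal analysis would compare the observed ratio against the sampling distribution under a Poisson null rather than a fixed band, but the ordering of the three regimes is the same.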

  1. Root Exploit Detection and Features Optimization: Mobile Device and Blockchain Based Medical Data Management.

    PubMed

    Firdaus, Ahmad; Anuar, Nor Badrul; Razak, Mohd Faizal Ab; Hashem, Ibrahim Abaker Targio; Bachok, Syafiq; Sangaiah, Arun Kumar

    2018-05-04

    The increasing demand for Android mobile devices and blockchain has motivated malware creators to develop mobile malware to compromise the blockchain. Although the blockchain is secure, attackers have managed to gain access to the blockchain as legal users, thereby compromising important and crucial information. Examples of mobile malware include root exploits, botnets, and Trojans, and root exploit is one of the most dangerous types. It compromises the operating system kernel in order to gain root privileges, which are then used by attackers to bypass security mechanisms, gain complete control of the operating system, install other types of malware on the device, and, finally, steal victims' private keys linked to the blockchain. For the purpose of maximizing the security of blockchain-based medical data management (BMDM), it is crucial to investigate the novel features and approaches contained in root exploit malware. This study proposes the bio-inspired method of particle swarm optimization (PSO), which automatically selects exclusive features, including those involving the Android Debug Bridge (ADB). The study also adopts boosting (AdaBoost, RealAdaBoost, LogitBoost, and MultiBoost) to enhance the machine learning prediction that detects unknown root exploits, and scrutinizes three categories of features: (1) system commands, (2) directory paths, and (3) code-based features. The evaluation suggests a marked accuracy value of 93% with LogitBoost in the simulation. LogitBoost also predicted all the root exploit samples in our developed system, the root exploit detection system (RODS).
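    As an illustration of the boosting-with-stumps idea behind such a detector (a hedged stand-in, not the RODS implementation, with hypothetical binary indicators in place of the paper's real feature set), a tiny AdaBoost over one-feature decision stumps:

```python
from math import log, exp

def train_stumps(X, y, rounds=5):
    """Tiny AdaBoost over one-feature decision stumps on binary
    feature vectors (e.g. presence of a system command, an ADB
    string, or a suspicious directory path). Labels: +1 / -1.
    """
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        for f in range(len(X[0])):          # exhaustive stump search
            for sign in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (sign if xi[f] else -sign) != yi)
                if best is None or err < best[0]:
                    best = (err, f, sign)
        err, f, sign = best
        err = max(err, 1e-10)               # avoid log(0) on perfect stumps
        alpha = 0.5 * log((1 - err) / err)
        ensemble.append((alpha, f, sign))
        # re-weight: boost the samples this stump got wrong
        w = [wi * exp(-alpha * yi * (sign if xi[f] else -sign))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (s if x[f] else -s) for a, f, s in ensemble)
    return 1 if score >= 0 else -1

# Toy dataset: feature 0 = "su binary call", feature 1 = "ADB usage",
# feature 2 = "writes to /data" (all hypothetical indicators)
X = [(1, 1, 0), (1, 0, 1), (0, 1, 1), (0, 0, 0), (0, 0, 1), (1, 1, 1)]
y = [1, 1, -1, -1, -1, 1]
model = train_stumps(X, y)
print([predict(model, x) for x in X])
```

    LogitBoost differs from AdaBoost in fitting stumps to Newton-step residuals of the logistic loss, but the ensemble-of-weak-learners structure sketched here is the same.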

  2. Fracture distribution and porosity in a fault-bound hydrothermal system (Grimsel Pass, Swiss Alps)

    NASA Astrophysics Data System (ADS)

    Egli, Daniel; Küng, Sulamith; Baumann, Rahel; Berger, Alfons; Baron, Ludovic; Herwegh, Marco

    2017-04-01

    The spatial distribution, orientation and continuity of brittle and ductile structures strongly control fluid pathways in a rock mass by joining existing pores and creating new pore space (fractures, joints), but they can also act as seals to fluid flow (e.g. ductile shear zones, clay-rich fault gouges). In long-lived hydrothermal systems, permeability and the related fluid flow paths are therefore dynamic in space and time. Understanding the evolution and behaviour of naturally porous and permeable rock masses is critical for the successful exploration and sustainable exploitation of hydrothermal systems, and can advance methods for the planning and implementation of enhanced geothermal systems. This study focuses on an active fault-bound hydrothermal system in the crystalline basement of the Aar Massif (hydrothermal field of Grimsel Pass, Swiss Alps) that has been exhumed from a few kilometres depth and documents at least 3 Ma of hydrothermal activity. The explored rock unit of the Aar Massif is part of the External Crystalline Massifs, which host a multitude of thermal springs on their southern border in the Swiss Rhône valley, and furthermore represents the exhumed equivalent of potentially exploitable geothermal reservoirs in the deep crystalline subsurface of the northern Alpine foreland basin. This study combines structural data collected from a 125 m long drillhole across the hydrothermal zone, the corresponding drill core, and surface mapping. Different methods are applied to estimate the porosity and the structural evolution with regard to porosity, permeability and fracture distribution. Analyses are carried out from the micrometre to the decametre scale, with a main focus on the evolution of flow paths over time. These span a large variety of porosity types, from fracture porosity with apertures of up to centimetre size down to grain-scale porosity. The main rock types are granitoid host rocks, mylonites, paleo-breccias and recent breccias.
    The porosity of the host rock, as well as of the cemented paleo-hydrothermal breccia, is typically very low, with values <1%. The high volume of mineralized fractures in the paleo-breccia indicates high porosity in former times, which is today closed by newly developed cements. The preservation of such paleo-breccias allows the investigation of contrasts between the fossil porosity/permeability and the present-day active flow paths, which are defined by fracture porosity that generally follows the regional deformation pattern and forms a wide network of interconnected fractures of variable orientation.

  3. Organization of an optimal adaptive immune system

    NASA Astrophysics Data System (ADS)

    Walczak, Aleksandra; Mayer, Andreas; Balasubramanian, Vijay; Mora, Thierry

    The repertoire of lymphocyte receptors in the adaptive immune system protects organisms from a diverse set of pathogens. A well-adapted repertoire should be tuned to the pathogenic environment to reduce the cost of infections. I will discuss a general framework for predicting the optimal repertoire that minimizes the cost of infections contracted from a given distribution of pathogens. The theory predicts that the immune system will have more receptors for rare antigens than expected from the frequency of encounters and individuals exposed to the same infections will have sparse repertoires that are largely different, but nevertheless exploit cross-reactivity to provide the same coverage of antigens. I will show that the optimal repertoires can be reached by dynamics that describes the competitive binding of antigens by receptors, and selective amplification of stimulated receptors.

  4. Visualization of terahertz surface waves propagation on metal foils

    PubMed Central

    Wang, Xinke; Wang, Sen; Sun, Wenfeng; Feng, Shengfei; Han, Peng; Yan, Haitao; Ye, Jiasheng; Zhang, Yan

    2016-01-01

    Exploitation of surface plasmonic devices (SPDs) in the terahertz (THz) band is beneficial for broadening the application potential of THz technologies. To clarify the features of SPDs, a practical characterization method is essential for accurately observing the complex field distribution of a THz surface wave (TSW). Here, a THz digital holographic imaging system is employed to coherently exhibit the temporal variations and spectral properties of TSWs activated by a rectangular or semicircular slit structure on metal foils. Advantages of the imaging system are comprehensively elucidated, including the exclusive measurement of TSWs and the reduced measurement time. Numerical simulations of the experimental procedures further verify the accuracy of the imaging measurement. It can be anticipated that this imaging system will provide a versatile tool for analyzing the performance and principles of SPDs. PMID:26729652

  5. SIOUX project: a simultaneous multiband camera for exoplanet atmospheres studies

    NASA Astrophysics Data System (ADS)

    Christille, Jean Marc; Bonomo, Aldo Stefano; Borsa, Francesco; Busonero, Deborah; Calcidese, Paolo; Claudi, Riccardo; Damasso, Mario; Giacobbe, Paolo; Molinari, Emilio; Pace, Emanuele; Riva, Alberto; Sozzetti, Alesandro; Toso, Giorgio; Tresoldi, Daniela

    2016-08-01

    The exoplanet revolution is well underway. The last decade has seen order-of-magnitude increases in the number of known planets beyond the Solar system. Detailed characterization of exoplanetary atmospheres provides the best means for distinguishing the makeup of their outer layers, and the only hope for understanding the interplay between initial composition chemistry, temperature-pressure atmospheric profiles, dynamics and circulation. While pioneering work on the observational side has produced the first important detections of atmospheric molecules for the class of transiting exoplanets, important limitations are still present due to the lack of systematic, repeated measurements with optimized instrumentation at both visible (VIS) and near-infrared (NIR) wavelengths. It is thus of fundamental importance to explore quantitatively possible avenues for improvement. In this paper we report initial results of a feasibility study for the prototype of a versatile multi-band imaging system for very high-precision differential photometry that exploits the choice of specifically selected narrow-band filters and novel ideas for the execution of simultaneous VIS and NIR measurements. Starting from the fundamental system requirements driven by the science case at hand, we describe a set of three opto-mechanical solutions for the instrument prototype: 1) a radial distribution of the optical flux using dichroic filters for the wavelength separation, and narrow-band filters or liquid crystal filters for the observations; 2) a tree distribution of the optical flux (implying two separate foci), with the same technique used for beam separation and filtering; 3) an 'exotic' solution consisting of the study of a complete optical system (i.e. a brand-new telescope) that exploits the chromatic errors of a reflecting surface to direct the different wavelengths to different foci.
In this paper we present the first results of the study phase for the three solutions, as well as the results of two laboratory prototypes (related to the first two options), that simulate the most critical aspects of the future instrument.

  6. Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J.

    2018-06-01

    The paper presents a novel frequency-domain interpretation of the Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and dispenses with open-loop pole distribution, contour/locus orientation and prior frequency sweeping. Exploiting the technique to alternatively reveal the positive realness of transfer functions, we explicate a re-interpretation of the Popov criteria. More specifically, the suggested frequency-domain stability conditions are conformable in both the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.
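    For reference, the classical Popov condition being re-interpreted can be checked numerically on a frequency grid. A minimal sketch for a scalar Lur'e system with a sector-[0, k] nonlinearity (the transfer function and grid are illustrative):

```python
def popov_holds(G, k, q, omegas):
    """Numeric Popov test on a frequency grid.

    Checks Re[(1 + j*omega*q) * G(j*omega)] + 1/k > 0 at each sampled
    omega; a frequency-domain sufficient condition for absolute
    stability of a Lur'e system with a sector-[0, k] nonlinearity.
    """
    return all(((1 + 1j * w * q) * G(1j * w)).real + 1.0 / k > 0
               for w in omegas)

# Example: G(s) = 1 / (s^2 + s + 1), sector gain k = 1, Popov slope q = 1
G = lambda s: 1.0 / (s * s + s + 1.0)
omegas = [i * 0.05 for i in range(1, 400)]  # 0.05 .. ~20 rad/s
print(popov_holds(G, k=1.0, q=1.0, omegas=omegas))  # True
```

    A grid check like this is only a numeric surrogate for the analytic condition over all omega, which is precisely the kind of frequency sweeping the paper's complex scaling technique is designed to avoid.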

  7. [Distribution Characteristics and Source of Fluoride in Groundwater in Lower Plain Area of North China Plain: A Case Study in Nanpi County].

    PubMed

    Kong, Xiao-le; Wang, Shi-qin; Zhao, Huan; Yuan, Rui-qiang

    2015-11-01

    There is a clear regional mismatch between water resources and agricultural production in the lower plain area of the North China Plain; moreover, excessive fluoride in deep groundwater further limits the use of regional water resources. In order to understand the spatial distribution characteristics and sources of F(-) in groundwater, a study was carried out in Nanpi County using field survey and sampling, hydrogeochemical analysis, and stable isotope methods. The results showed that the center of low fluoride concentrations in shallow groundwater was located around the reservoir of Dalang Lake, and centers of high fluoride concentrations were located in the southeast and southwest of the study area. The region with high fluoride concentrations was consistent with the over-exploitation region of deep groundwater. Point-source pollution from subsurface drainage and non-point-source pollution from irrigation with deep groundwater in some regions were the main causes of the increasing F(-) concentrations of shallow groundwater at some of the sampling sites. Rock deposition and hydrogeological conditions were the main causes of the high F(-) concentrations (above 1.00 mg x L(-1), the threshold of the drinking water quality standard in China) in deep groundwater. F(-) released from clay minerals into the water increased the F(-) concentrations in deep groundwater because of over-exploitation. With the increasing exploitation and utilization of brackish shallow groundwater and the compression and restriction of deep groundwater exploitation, the water environment in the middle and east of the lower plain area of North China will undergo significant change, and it is important to identify the distribution and sources of F(-) in surface water and groundwater for the reasonable development and use of water resources in the future.

  8. The Nimrod computational workbench: a case study in desktop metacomputing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramson, D.; Sosic, R.; Foster, I.

    The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.
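    The core of a Nimrod-style parametric experiment is enumerating a parameter cross product and dealing the jobs out to distributed resources. A minimal single-process sketch (the round-robin scheduler and parameter names are illustrative, not Nimrod's actual scheduling policy):

```python
from itertools import product

def parametric_experiment(param_grid, run, workers=4):
    """Nimrod-style parametric study: enumerate the cross product of
    parameter values, deal the jobs round-robin to `workers` (a
    stand-in for machines on a wide-area network), gather results.
    """
    jobs = [dict(zip(param_grid, values))
            for values in product(*param_grid.values())]
    schedule = {w: jobs[w::workers] for w in range(workers)}  # round-robin
    results = {}
    for w, assigned in schedule.items():
        for params in assigned:          # on a real metacomputer this inner
            results[tuple(params.items())] = run(params)  # loop runs remotely
    return schedule, results

# Hypothetical experiment: 2 viscosities x 3 mesh sizes = 6 jobs
grid = {"viscosity": [0.1, 0.2], "mesh": [32, 64, 128]}
schedule, results = parametric_experiment(
    grid, run=lambda p: p["mesh"] / p["viscosity"])
print(len(results))  # 6
```

    The real system adds fault tolerance, dynamic dispatch as machines become available, and result staging, but the enumerate-schedule-gather shape is the same.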

  9. Risk analysis and detection of thrombosis by measurement of electrical resistivity of blood.

    PubMed

    Sapkota, Achyut; Asakura, Yuta; Maruyama, Osamu; Kosaka, Ryo; Yamane, Takashi; Takei, Masahiro

    2013-01-01

    Monitoring of the thrombogenic process is very important in ventricular assist devices (VADs) used as temporary or permanent measures in patients with advanced heart failure. Currently, there is no system that can perform real-time monitoring of thrombogenic activity. Electrical signals vary according to changes in the concentration of coagulation factors as well as the distribution of blood cells, and thus have the potential to detect the thrombogenic process at an early stage. In the present work, we have assessed an instrumentation system exploiting the electrical properties of blood. The experiments were conducted using bovine blood. Electrical resistance tomography with an eight-electrode sensor was used to monitor the spatio-temporal change in the electrical resistivity of blood under thrombogenic and non-thrombogenic conditions. Under non-thrombogenic conditions, the resistivity was uniform across the cross-section, and the average resistivity monotonically decreased with time before levelling off. In contrast, under thrombogenic conditions, the distribution across the cross-section was non-uniform, and the average resistivity fluctuated with time.
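    The reported signature of thrombus formation, non-uniform resistivity across the cross-section, suggests a simple screening rule. A hedged sketch (the resistivity values and the 5% threshold are invented for illustration, not the paper's criterion):

```python
from statistics import mean, pstdev

def thrombosis_flag(frame, rel_tol=0.05):
    """Flag a tomography frame as potentially thrombogenic when the
    resistivity across the cross-section is markedly non-uniform,
    i.e. the coefficient of variation exceeds `rel_tol`.

    `frame` holds one resistivity value per reconstructed region.
    """
    return pstdev(frame) / mean(frame) > rel_tol

# One uniform frame vs. one with two locally elevated regions (ohm*m)
uniform = [1.52, 1.51, 1.53, 1.52, 1.52, 1.51, 1.53, 1.52]
clotting = [1.52, 1.51, 1.90, 1.88, 1.52, 1.51, 1.53, 1.52]
print(thrombosis_flag(uniform), thrombosis_flag(clotting))  # False True
```

    A deployed monitor would also track the temporal trend of the average resistivity, since the paper reports fluctuation over time as a second thrombogenic signature.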

  10. Coordinated Science Campaign Scheduling for Sensor Webs

    NASA Technical Reports Server (NTRS)

    Edgington, Will; Morris, Robert; Dungan, Jennifer; Williams, Jenny; Carlson, Jean; Fleming, Damian; Wood, Terri; Yorke-Smith, Neil

    2005-01-01

    Future Earth observing missions will study different aspects and interacting pieces of the Earth's eco-system. Scientists are designing increasingly complex, interdisciplinary campaigns to exploit the diverse capabilities of multiple Earth sensing assets. In addition, spacecraft platforms are being configured into clusters, trains, or other distributed organizations in order to improve either the quality or the coverage of observations. These simultaneous advances in the design of science campaigns and in the missions that will provide the sensing resources to support them offer new challenges in the coordination of data and operations that are not addressed by current practice. For example, the scheduling of scientific observations for satellites in low Earth orbit is currently conducted independently by each mission operations center. The absence of an information infrastructure enabling the scheduling of coordinated observations involving multiple sensors makes it difficult to execute campaigns involving multiple assets. This paper proposes a software architecture and describes a prototype system called DESOPS (Distributed Earth Science Observation Planning and Scheduling) that will address this deficiency.

  11. Energy harvesting from coupled bending-twisting oscillations in carbon-fibre reinforced polymer laminates

    NASA Astrophysics Data System (ADS)

    Xie, Mengying; Zhang, Yan; Kraśny, Marcin J.; Rhead, Andrew; Bowen, Chris; Arafa, Mustafa

    2018-07-01

    The energy harvesting capability of resonant harvesting structures, such as piezoelectric cantilever beams, can be improved by utilizing coupled oscillations that generate favourable strain mode distributions. In this work, we present the first demonstration of the use of a laminated carbon fibre reinforced polymer to create cantilever beams that undergo coupled bending-twisting oscillations for energy harvesting applications. Piezoelectric layers that operate in bending and shear mode are attached to the bend-twist coupled beam surface at locations of maximum bending and torsional strains in the first mode of vibration to fully exploit the strain distribution along the beam. Modelling of this new bend-twist harvesting system is presented, which compares favourably with experimental results. It is demonstrated that the variety of bending and torsional modes of the harvesters can be utilized to create a harvester that operates over a wider range of frequencies, and such multi-modal device architectures provide a unique approach to tuning the frequency response of resonant harvesting systems.

  12. Performance improvement of eight-state continuous-variable quantum key distribution with an optical amplifier

    NASA Astrophysics Data System (ADS)

    Guo, Ying; Li, Renjie; Liao, Qin; Zhou, Jian; Huang, Duan

    2018-02-01

    Discrete modulation is proven to be beneficial for improving the performance of continuous-variable quantum key distribution (CVQKD) in long-distance transmission. In this paper, we propose a scheme to improve the maximal secret key rate of discretely modulated eight-state CVQKD using an optical amplifier (OA), at a slight cost in transmission distance. In the proposed scheme, an optical amplifier is exploited to compensate for the imperfections of Bob's apparatus, so that the secret key rate of the eight-state protocol is enhanced. Specifically, we investigate two types of optical amplifiers, the phase-insensitive amplifier (PIA) and the phase-sensitive amplifier (PSA), and obtain approximately equivalent performance improvements for the eight-state CVQKD system with either amplifier. Numerical simulation shows that the proposed scheme improves the secret key rate of eight-state CVQKD in both the asymptotic limit and the finite-size regime. We also show that the proposed scheme can achieve relatively high-rate transmission in long-distance communication systems.

  13. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Boccali, T.; Giordano, D.; Girone, M.; Neri, M.; Magini, N.; Kuznetsov, V.; Wildish, T.

    2015-12-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collision data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among the various Tier levels, and transformed/slimmed in format and content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the Worldwide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulations of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments for Run-2 and beyond. Studying data-access patterns involves validating the quality of the monitoring data collected on the “popularity” of each dataset; analysing the frequency and pattern of accesses to different datasets by analysis end-users; exploring different views of the popularity data (by physics activity, by region, by data type); studying the evolution of Run-1 data exploitation over time; and evaluating the impact of different data placement and distribution choices on the available network and storage resources and on computing operations. This work presents some insights from studies on the popularity data from the CMS experiment. We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for how to tune the initial distribution of data in anticipation of how it will be used in Run-2 and beyond.

  14. A new approach to modelling schistosomiasis transmission based on stratified worm burden.

    PubMed

    Gurarie, D; King, C H; Wang, X

    2010-11-01

    Multiple factors affect schistosomiasis transmission in distributed meta-population systems, including age, behaviour, and environment. The traditional approach to modelling macroparasite transmission often exploits the 'mean worm burden' (MWB) formulation for human hosts. However, typical worm distribution in humans is overdispersed, and classic models either ignore this characteristic or make ad hoc assumptions about its pattern (e.g., by assuming a negative binomial distribution). Such oversimplifications can produce incorrect predictions of the impact of control interventions. We propose a new modelling approach to macroparasite transmission by stratifying human populations according to worm burden, and replacing MWB dynamics with the dynamics of 'population strata'. We developed proper calibration procedures for such multi-component systems, based on typical epidemiological and demographic field data, and implemented them using Wolfram Mathematica. Model programming and calibration proved to be straightforward. Our calibrated system provided good agreement with the individual-level field data from the Msambweni region of eastern Kenya. The Stratified Worm Burden (SWB) approach offers many advantages, in that it accounts naturally for overdispersion and accommodates other important factors and measures of human infection and demographics. Future work will apply this model and methodology to evaluate innovative control intervention strategies, including expanded drug treatment programmes proposed by the World Health Organization and its partners.
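    The stratification idea can be sketched as a small system of ordinary differential equations: hosts occupy worm-burden strata, move up a stratum on acquiring a worm, and move down when one of their worms dies. The sketch below uses made-up rates and a handful of strata for illustration; it is not the calibrated model of the paper.

```python
# A minimal stratified-worm-burden (SWB) sketch: instead of tracking a mean
# worm burden, the host population is split into burden strata 0, 1, 2, ...
# and moves between them. Rates and strata counts here are illustrative.

def swb_step(n, acq=0.2, death=0.05, dt=0.1):
    """One explicit Euler step for strata counts n[0..K]."""
    K = len(n) - 1
    out = n[:]
    for k in range(K + 1):
        up = acq * n[k] if k < K else 0.0   # hosts acquiring one more worm
        down = death * k * n[k]             # hosts losing one of their k worms
        out[k] -= dt * (up + down)
        if k < K:
            out[k + 1] += dt * up
        if k > 0:
            out[k - 1] += dt * down
    return out

n = [100.0, 0.0, 0.0, 0.0]  # everyone starts uninfected
for _ in range(200):
    n = swb_step(n)
```

    Because every transition moves hosts between strata rather than removing them, the total population is conserved, which is a useful sanity check on any implementation.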

  15. Application of Bayesian Classification to Content-Based Data Management

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
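    A simple Bayesian classification of the kind described can be sketched with per-class Gaussian likelihoods over a few band radiances. The class names, band statistics and priors below are illustrative placeholders, not the GES DAAC's actual model.

```python
import math

# A minimal Gaussian naive-Bayes pixel classifier. Each class stores a prior
# and the mean/variance of two hypothetical band radiances; a pixel is
# assigned to the class with the highest log-posterior.
CLASSES = {
    "clear": {"prior": 0.5, "mean": (0.05, 0.30), "var": (0.01, 0.02)},
    "cloud": {"prior": 0.4, "mean": (0.60, 0.55), "var": (0.05, 0.05)},
    "glint": {"prior": 0.1, "mean": (0.40, 0.10), "var": (0.03, 0.01)},
}

def log_gauss(x, mu, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def classify(pixel):
    """Return the class with the highest posterior for one pixel."""
    best, best_score = None, -math.inf
    for name, stats in CLASSES.items():
        score = math.log(stats["prior"])
        for x, mu, var in zip(pixel, stats["mean"], stats["var"]):
            score += log_gauss(x, mu, var)
        if score > best_score:
            best, best_score = name, score
    return best

print(classify((0.62, 0.58)))  # a pixel bright in both bands
```

    A content-based subsetter would then keep only the pixels whose class matches the user's criteria (e.g. "clear") before distribution.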

  16. Design of Provider-Provisioned Website Protection Scheme against Malware Distribution

    NASA Astrophysics Data System (ADS)

    Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka

    Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.

  17. The intellectual information system for management of geological and technical arrangements during oil field exploitation

    NASA Astrophysics Data System (ADS)

    Markov, N. G.; E Vasilyeva, E.; Evsyutkin, I. V.

    2017-01-01

    An intellectual information system for the management of geological and technical arrangements during oil field exploitation has been developed. A distinctive feature of the system is the service-oriented architecture of its software. Results of a cluster analysis of real field data obtained with this system are presented.

  18. Improving security of the ping-pong protocol

    NASA Astrophysics Data System (ADS)

    Zawadzki, Piotr

    2013-01-01

    A security layer for the asymptotically secure ping-pong protocol is proposed and analyzed in the paper. The operation of the improvement exploits the inevitable errors introduced by eavesdropping in the control and message modes. Its role is similar to the privacy amplification algorithms known from quantum key distribution schemes. Messages are processed in blocks, which guarantees that an eavesdropper is faced with a computationally infeasible problem as long as the system parameters are within reasonable limits. The additional information preprocessing introduced here does not require quantum memory registers, and confidential communication is possible without prior key agreement or a shared secret.

  19. Role of social interactions in dynamic patterns of resource patches and forager aggregation

    PubMed Central

    Tania, Nessy; Vanderlei, Ben; Heath, Joel P.; Edelstein-Keshet, Leah

    2012-01-01

    The dynamics of resource patches and species that exploit such patches are of interest to ecologists, conservation biologists, modelers, and mathematicians. Here we consider how social interactions can create unique, evolving patterns in space and time. Whereas simple prey taxis (with consumable prey) promotes spatially uniform distributions, here we show that taxis in producer–scrounger groups can lead to pattern formation. We consider two types of foragers: those that search directly (“producers”) and those that exploit other foragers to find food (“scroungers” or exploiters). We show that such groups can sustain fluctuating spatiotemporal patterns, akin to “waves of pursuit.” Investigating the relative benefits to the individuals, we observed conditions under which either strategy leads to enhanced success, defined as net food consumption. Foragers that search for food directly have an advantage when food patches are localized. Those that seek aggregations of group mates do better when their ability to track group mates exceeds the foragers’ food-sensing acuity. When behavioral switching or reproductive success of the strategies is included, the relative abundance of foragers and exploiters is dynamic over time, in contrast with classic models that predict stable frequencies. Our work shows the importance of considering two-way interaction—i.e., how food distribution both influences and is influenced by social foraging and aggregation of predators. PMID:22745167

  20. Data-Aware Retrodiction for Asynchronous Harmonic Measurement in a Cyber-Physical Energy System

    PubMed Central

    Liu, Youda; Wang, Xue; Liu, Yanchi; Cui, Sujin

    2016-01-01

    Cyber-physical energy systems provide a networked solution for safety, reliability and efficiency problems in smart grids. On the demand side, a secure and trustworthy energy supply requires real-time supervision and online power quality assessment. Harmonics measurement is necessary in power quality evaluation. However, under a large-scale distributed metering architecture, harmonic measurement faces the out-of-sequence measurement (OOSM) problem, which is the result of latencies in the sensing or communication process and introduces deviations in data fusion. This paper depicts a distributed measurement network for large-scale asynchronous harmonic analysis and exploits a nonlinear autoregressive model with exogenous inputs (NARX) network to reorder the out-of-sequence measuring data. The NARX network learns the characteristics of the electrical harmonics from practical data rather than from kinematic equations. Thus, the data-aware network approximates the behavior of the practical electrical parameter with real-time data and improves the retrodiction accuracy. Theoretical analysis demonstrates that the data-aware method maintains a reasonable consumption of computing resources. Experiments on a practical testbed of a cyber-physical system are implemented, and harmonic measurement and analysis accuracy are used to evaluate the measuring mechanism under a distributed metering network. Results demonstrate an improvement of the harmonics analysis precision and validate the asynchronous measuring method in cyber-physical energy systems. PMID:27548171
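    As an illustration of the retrodiction idea, the sketch below fits a linear autoregressive model with an exogenous input (a simple linear stand-in for the trained NARX network) and uses it to re-estimate a sample that arrived out of sequence. The signal and lag structure are made up for the example.

```python
import numpy as np

# Fit a linear ARX predictor y[t] ~ y[t-1], u[t-1], u[t-2] from history,
# then use it to retrodict a late-arriving sample from its neighbours' lags.
t = np.arange(200)
u = np.sin(2 * np.pi * t / 50)                 # exogenous input
y = 0.8 * np.roll(u, 1) + 0.2 * np.roll(u, 2)  # harmonic-like response

# Regression matrix of lagged values, target is the next sample.
X = np.column_stack([y[1:-1], u[1:-1], u[:-2]])
target = y[2:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

def retrodict(k):
    """Re-estimate sample y[k] (k >= 2) from lagged measurements."""
    return X[k - 2] @ coef

k = 100
print(abs(retrodict(k) - y[k]))  # retrodiction error for one OOSM sample
```

    In the actual system the predictor is a trained neural network and the out-of-sequence sample is replaced by the data-aware estimate before fusion; the linear fit here only demonstrates the mechanics.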

  1. Laboratory information management system for membrane protein structure initiative--from gene to crystal.

    PubMed

    Troshin, Petr V; Morris, Chris; Prince, Stephen M; Papiz, Miroslav Z

    2008-12-01

    Membrane Protein Structure Initiative (MPSI) exploits laboratory competencies to work collaboratively and distribute work among the different sites. This is possible as protein structure determination requires a series of steps, starting with target selection, through cloning, expression, purification, crystallization and finally structure determination. Distributed sites create a unique set of challenges for integrating and passing on information on the progress of targets. This role is played by the Protein Information Management System (PIMS), which is a laboratory information management system (LIMS), serving as a hub for MPSI, allowing collaborative structural proteomics to be carried out in a distributed fashion. It holds key information on the progress of cloning, expression, purification and crystallization of proteins. PIMS is employed to track the status of protein targets and to manage constructs, primers, experiments, protocols, sample locations and their detailed histories: thus playing a key role in MPSI data exchange. It also serves as the centre of a federation of interoperable information resources such as local laboratory information systems and international archival resources, like PDB or NCBI. During the challenging task of PIMS integration, within the MPSI, we discovered a number of prerequisites for successful PIMS integration. In this article we share our experiences and provide invaluable insights into the process of LIMS adaptation. This information should be of interest to partners who are thinking about using LIMS as a data centre for their collaborative efforts.

  2. Multi-client quantum key distribution using wavelength division multiplexing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grice, Warren P; Bennink, Ryan S; Earl, Dennis Duncan

    Quantum Key Distribution (QKD) exploits the rules of quantum mechanics to generate and securely distribute a random sequence of bits to two spatially separated clients. Typically a QKD system can support only a single pair of clients at a time, and so a separate quantum link is required for every pair of users. We overcome this limitation with the design and characterization of a multi-client entangled-photon QKD system with the capacity for up to 100 clients simultaneously. The time-bin entangled QKD system includes a broadband down-conversion source with two unique features that enable the multi-user capability. First, the photons are emitted across a very large portion of the telecom spectrum. Second, and more importantly, the photons are strongly correlated in their energy degree of freedom. Using standard wavelength division multiplexing (WDM) hardware, the photons can be routed to different parties on a quantum communication network, while the strong spectral correlations ensure that each client is linked only to the client receiving the conjugate wavelength. In this way, a single down-conversion source can support dozens of channels simultaneously, and to the extent that the WDM hardware can send different spectral channels to different clients, the system can support multiple client pairings. We will describe the design and characterization of the down-conversion source, as well as the client stations, which must be tunable across the emission spectrum.
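    The conjugate-wavelength pairing follows from energy conservation in down-conversion, 1/λ_s + 1/λ_i = 1/λ_p, so the WDM hardware can link clients in fixed conjugate pairs. A minimal sketch, with an assumed pump wavelength and illustrative channel values rather than the actual system parameters:

```python
# Energy conservation pairs each signal wavelength with a unique idler:
# 1/lambda_s + 1/lambda_i = 1/lambda_p. The pump value below is a
# hypothetical choice giving degenerate emission at 1550 nm.
PUMP_NM = 775.0

def conjugate(lambda_s_nm):
    """Idler wavelength (nm) paired with a given signal wavelength (nm)."""
    return 1.0 / (1.0 / PUMP_NM - 1.0 / lambda_s_nm)

# Route a few ITU-like channels to their partner channels.
for ch in (1540.0, 1545.0, 1550.0):
    print(f"{ch:.1f} nm <-> {conjugate(ch):.1f} nm")
```

    Each client receiving channel λ_s shares entangled pairs only with the client receiving conjugate(λ_s), which is what lets one source serve many client pairings at once.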

  3. Quantitative groundwater modelling for a sustainable water resource exploitation in a Mediterranean alluvial aquifer

    NASA Astrophysics Data System (ADS)

    Laïssaoui, Mounir; Mesbah, Mohamed; Madani, Khodir; Kiniouar, Hocine

    2018-05-01

    To analyze the water budget under human influences in the Isser wadi alluvial aquifer in the northeast of Algeria, we built a mathematical model which can be used to better manage groundwater exploitation. A modular three-dimensional finite-difference groundwater flow model (MODFLOW) was used. The modelling system is largely based on physical laws and employs a finite-difference numerical method to simulate water movement and fluxes in a horizontally discretized field. After calibration in steady state, the model reproduced the initial heads with good precision. It enabled us to quantify the terms of the aquifer water balance and to obtain a distribution of hydraulic conductivity zones. The model also highlighted the relevant role of the Isser wadi, which constitutes a drain of great importance for the aquifer, ensuring almost all outflows on its own. The scenarios explored in transient simulations showed that an increase in pumping would only lower groundwater levels further and disrupt the natural balance of the aquifer. However, it is clear that this situation depends primarily on the position of the pumping wells in the plain as well as on the extracted volumes of water. As the promising results show, this physically based, distributed-parameter model is a valuable contribution to the ever-advancing technology of hydrological modelling and water resources assessment.

  4. Multimodal Estimation of Distribution Algorithms.

    PubMed

    Yang, Qiang; Chen, Wei-Neng; Li, Yun; Chen, C L Philip; Xu, Xiang-Min; Zhang, Jun

    2016-02-15

    Taking advantage of the ability of estimation of distribution algorithms (EDAs) to preserve high diversity, this paper proposes a multimodal EDA. Integrated with clustering strategies for crowding and speciation, two versions of this algorithm are developed, which operate at the niche level. These two algorithms are then equipped with three distinctive techniques: 1) a dynamic cluster sizing strategy; 2) an alternating use of Gaussian and Cauchy distributions to generate offspring; and 3) an adaptive local search. The dynamic cluster sizing affords a potential balance between exploration and exploitation and reduces the sensitivity to the cluster size in the niching methods. Taking advantage of the Gaussian and Cauchy distributions, we generate the offspring at the niche level by alternately sampling from these two distributions, which can also offer a balance between exploration and exploitation. Further, solution accuracy is enhanced through a new local search scheme conducted probabilistically around the seeds of niches, with probabilities determined self-adaptively according to the fitness values of these seeds. Extensive experiments conducted on 20 benchmark multimodal problems confirm that both algorithms can achieve competitive performance compared with several state-of-the-art multimodal algorithms, which is supported by nonparametric tests. In particular, the proposed algorithms are very promising for complex problems with many local optima.
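    The alternating offspring-generation step can be sketched as follows: within a niche, candidates are drawn alternately from a Gaussian centred on the niche (exploitation) and a heavy-tailed Cauchy (exploration). The niche data and the from-scratch Cauchy sampler below are illustrative, not the paper's exact operators:

```python
import math
import random

# Within one niche, generate offspring by alternating Gaussian and Cauchy
# samples centred on the niche mean. The Cauchy draw uses the inverse-CDF
# trick mu + sigma * tan(pi * (U - 0.5)) for uniform U.
def sample_offspring(niche, n, rng):
    mu = sum(niche) / len(niche)
    sigma = max(1e-12,
                math.sqrt(sum((x - mu) ** 2 for x in niche) / len(niche)))
    out = []
    for i in range(n):
        if i % 2 == 0:
            out.append(rng.gauss(mu, sigma))  # Gaussian: local exploitation
        else:
            u = rng.random()
            out.append(mu + sigma * math.tan(math.pi * (u - 0.5)))  # Cauchy
    return out

rng = random.Random(42)
kids = sample_offspring([1.0, 1.2, 0.9, 1.1], 6, rng)
print(kids)
```

    The Gaussian draws stay near the niche, while the Cauchy's heavy tails occasionally produce distant candidates, which is the exploration/exploitation balance the abstract refers to.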

  5. The Language Environment and Syntactic Word-Class Acquisition.

    ERIC Educational Resources Information Center

    Zavrel, Jakub; Veenstra, Jorn

    A study analyzed the distribution of words in a three-million-word corpus of text from the "Wall Street Journal," in order to test a theory of the acquisition of word categories. The theory, an alternative to the semantic bootstrapping hypothesis, proposes that the child exploits multiple sources of cues (distributional, semantic, or…

  6. Mathematics of Sensing, Exploitation, and Execution (MSEE) Hierarchical Representations for the Evaluation of Sensed Data

    DTIC Science & Technology

    2016-06-01

    theories of the mammalian visual system, and exploiting descriptive text that may accompany a still image for improved inference. The focus of the Brown team was on single images. Keywords: computer vision, semantic description, street scenes, belief propagation, generative models, nonlinear filtering, sufficient statistics.

  7. GALAXY: A new hybrid MOEA for the optimal design of Water Distribution Systems

    NASA Astrophysics Data System (ADS)

    Wang, Q.; Savić, D. A.; Kapelan, Z.

    2017-03-01

    A new hybrid optimizer, called the genetically adaptive leaping algorithm for approximation and diversity (GALAXY), is proposed for the discrete, combinatorial, multiobjective design of Water Distribution Systems (WDSs), which is NP-hard and computationally intensive. The merit of GALAXY is its ability to alleviate, to a great extent, the parameterization issue and the high computational overhead. It follows the generational framework of Multiobjective Evolutionary Algorithms (MOEAs) and includes six search operators and several important strategies. The operators are selected based on their leaping ability in the objective space from the global and local search perspectives; the strategies steer the optimization and balance the exploration and exploitation aspects simultaneously. A highlighted feature of GALAXY is that it eliminates the majority of parameters, making it robust and easy to use. Comparative studies between GALAXY and three representative MOEAs on five benchmark WDS design problems confirm its competitiveness. GALAXY can identify better converged and distributed boundary solutions efficiently and consistently, indicating a much more balanced capability between global and local search. Moreover, its advantages over other MOEAs become more substantial as the complexity of the design problem increases.

  8. LENSED: a code for the forward reconstruction of lenses and sources from strong lensing observations

    NASA Astrophysics Data System (ADS)

    Tessore, Nicolas; Bellagamba, Fabio; Metcalf, R. Benton

    2016-12-01

    Robust modelling of strong lensing systems is fundamental to exploiting the information they contain about the distribution of matter in galaxies and clusters. In this work, we present LENSED, a new code which performs forward parametric modelling of strong lenses. LENSED takes advantage of a massively parallel ray-tracing kernel to perform the necessary calculations on a modern graphics processing unit (GPU). This makes the precise rendering of the background lensed sources much faster, and allows the simultaneous optimization of tens of parameters for the selected model. With a single run, the code is able to obtain the full posterior probability distribution for the lens light, the mass distribution and the background source at the same time. LENSED is first tested on mock images which reproduce realistic space-based observations of lensing systems. In this way, we show that it is able to recover unbiased estimates of the lens parameters, even when the sources do not follow exactly the assumed model. Then, we apply it to a subsample of the Sloan Lens ACS Survey lenses, in order to demonstrate its use on real data. The results generally agree with the literature, and highlight the flexibility and robustness of the algorithm.

  9. Integrating Agent Models of Subsistence Farming With Dynamic Models of Water Distribution

    NASA Astrophysics Data System (ADS)

    Bithell, M.; Brasington, J.

    2004-12-01

    Subsistence farming communities are dependent on the landscape to provide the resource base upon which their societies can be built. A key component of this is the role of climate, and the feedback between rainfall, crop growth and land clearance, and their coupling to the hydrological cycle. Temporal fluctuations in rainfall on timescales from annual through to decadal and longer, and the associated changes in the spatial distribution of water availability mediated by soil type, slope and land cover, determine the locations within the landscape that can support agriculture and control the sustainability of farming practices. We seek to build an integrated modelling system to represent land use change by coupling an agent-based model of subsistence farming, and the associated exploitation of natural resources, to a realistic representation of the hydrology at the catchment scale, using TOPMODEL to map the spatial distribution of crop water stress for given time-series of rainfall. In this way we can, for example, investigate how demographic changes and the associated removal of forest cover influence the possibilities for field locations within the catchment, through changes in groundwater availability. The framework for this modelling exercise will be presented and preliminary results from this system will be discussed.
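    TOPMODEL's mapping from topography to water availability rests on the topographic wetness index ln(a/tanβ), where a is the upslope contributing area per unit contour length and β the local slope; high-index cells saturate first and so are the wettest field locations. A minimal sketch with illustrative cell values:

```python
import math

# Topographic wetness index ln(a / tan(beta)) for single grid cells.
# Upslope areas, contour lengths and slopes below are made-up examples.
def wetness_index(upslope_area_m2, contour_len_m, slope_rad):
    a = upslope_area_m2 / contour_len_m   # contributing area per unit contour
    return math.log(a / math.tan(slope_rad))

valley = wetness_index(50000.0, 10.0, math.radians(2.0))   # flat, converging
ridge = wetness_index(200.0, 10.0, math.radians(20.0))     # steep, divergent
print(f"valley index {valley:.2f}, ridge index {ridge:.2f}")
```

    An agent model coupled to TOPMODEL can then rank candidate field locations by this index (modulated by rainfall time series) when deciding where clearing and planting are viable.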

  10. Robust Operation of Soft Open Points in Active Distribution Networks with High Penetration of Photovoltaic Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Ji, Haoran; Wang, Chengshan

    Distributed generators (DGs), including photovoltaic panels (PVs), have been integrated dramatically into active distribution networks (ADNs). Due to their strong volatility and uncertainty, the high penetration of PV generation immensely exacerbates voltage violations in ADNs. However, the emerging flexible interconnection technology based on soft open points (SOPs) provides increased controllability and flexibility for system operation. To fully exploit the regulation ability of SOPs to address the problems caused by PV, this paper proposes a robust optimization method to achieve robust optimal operation of SOPs in ADNs. A two-stage adjustable robust optimization model is built to tackle the uncertainties of PV outputs, in which robust operation strategies of SOPs are generated to eliminate the voltage violations and reduce the power losses of ADNs. A column-and-constraint generation (C&CG) algorithm is developed to solve the proposed robust optimization model, which is formulated as a second-order cone program (SOCP) to improve accuracy and computational efficiency. Case studies on the modified IEEE 33-node system and comparisons with a deterministic optimization approach are conducted to verify the effectiveness and robustness of the proposed method.

  11. Plasmonic modes in nanowire dimers: A study based on the hydrodynamic Drude model including nonlocal and nonlinear effects

    NASA Astrophysics Data System (ADS)

    Moeferdt, Matthias; Kiel, Thomas; Sproll, Tobias; Intravaia, Francesco; Busch, Kurt

    2018-02-01

    A combined analytical and numerical study of the modes in two distinct plasmonic nanowire systems is presented. The computations are based on a discontinuous Galerkin time-domain approach, and a fully nonlinear and nonlocal hydrodynamic Drude model for the metal is utilized. In the linear regime, these computations demonstrate the strong influence of nonlocality on the field distributions as well as on the scattering and absorption spectra. Based on these results, second-harmonic-generation efficiencies are computed over a frequency range that covers all relevant modes of the linear spectra. In order to interpret the physical mechanisms that lead to corresponding field distributions, the associated linear quasielectrostatic problem is solved analytically via conformal transformation techniques. This provides an intuitive classification of the linear excitations of the systems that is then applied to the full Maxwell case. Based on this classification, group theory facilitates the determination of the selection rules for the efficient excitation of modes in both the linear and nonlinear regimes. This leads to significantly enhanced second-harmonic generation via judiciously exploiting the system symmetries. These results regarding the mode structure and second-harmonic generation are of direct relevance to other nanoantenna systems.

  12. Marginal Contribution-Based Distributed Subchannel Allocation in Small Cell Networks.

    PubMed

    Shah, Shashi; Kittipiyakul, Somsak; Lim, Yuto; Tan, Yasuo

    2018-05-10

    The paper presents a game-theoretic solution for the distributed subchannel allocation problem in small cell networks (SCNs), analyzed under the physical interference model. The objective is to find a distributed solution that maximizes the welfare of the SCNs, defined as the total system capacity. Although the problem can be addressed through best-response (BR) dynamics, the existence of a steady-state solution, i.e., a pure strategy Nash equilibrium (NE), cannot be guaranteed. Potential games (PGs) ensure convergence to a pure strategy NE when players rationally play according to some specified learning rules. However, such a performance guarantee comes at the expense of complete knowledge of the SCNs. To overcome such requirements, properties of PGs are exploited for scalable implementations, where we utilize the concept of marginal contribution (MC) as a tool to design the learning rules of players' utilities and propose the marginal contribution-based best-response (MCBR) algorithm, of low computational complexity, for the distributed subchannel allocation problem. Finally, we validate and evaluate the proposed scheme through simulations for various performance metrics.
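    A toy version of marginal-contribution best response can be sketched as follows: each cell moves to the subchannel that maximizes the system-capacity gain it contributes (welfare with the cell minus welfare without it). The link gains, interference model and parameter values below are illustrative, not taken from the paper:

```python
import math

# Three cells, two subchannels, a crude co-channel interference model.
GAIN = {0: 1.0, 1: 0.8, 2: 0.6}   # direct link gains per cell (illustrative)
CROSS = 0.3                        # cross-interference gain between cells
NOISE = 0.1

def capacity(assign, active):
    """Sum-rate over the active cells for a channel assignment dict."""
    total = 0.0
    for i in active:
        interf = sum(CROSS for j in active
                     if j != i and assign[j] == assign[i])
        total += math.log2(1 + GAIN[i] / (NOISE + interf))
    return total

def mcbr_step(assign, i, channels=(0, 1)):
    """Move cell i to the channel with the largest marginal contribution."""
    others = [j for j in assign if j != i]
    base = capacity(assign, others)            # welfare without cell i
    best = max(channels,
               key=lambda c: capacity({**assign, i: c}, list(assign)) - base)
    assign[i] = best
    return assign

assign = {0: 0, 1: 0, 2: 0}                    # all cells start co-channel
for _ in range(3):                             # iterate best responses
    for i in assign:
        mcbr_step(assign, i)
print(assign)
```

    Because the marginal contribution aligns each cell's utility with the change in total welfare, the dynamics behave like a potential game and settle into a stable channel split instead of cycling.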

  13. Single-shot coherent diffraction imaging of microbunched relativistic electron beams for free-electron laser applications.

    PubMed

    Marinelli, A; Dunning, M; Weathersby, S; Hemsing, E; Xiang, D; Andonian, G; O'Shea, F; Miao, Jianwei; Hast, C; Rosenzweig, J B

    2013-03-01

    With the advent of coherent x rays provided by the x-ray free-electron laser (FEL), strong interest has been kindled in sophisticated diffraction imaging techniques. In this Letter, we exploit such techniques for the diagnosis of the density distribution of the intense electron beams typically utilized in an x-ray FEL itself. We have implemented this method by analyzing the far-field coherent transition radiation emitted by an inverse-FEL microbunched electron beam. This analysis utilizes an oversampling phase retrieval method on the transition radiation angular spectrum to reconstruct the transverse spatial distribution of the electron beam. This application of diffraction imaging represents a significant advance in electron beam physics, having critical applications to the diagnosis of high-brightness beams, as well as the collective microbunching instabilities afflicting these systems.

  14. Predicting species richness and distribution ranges of centipedes at the northern edge of Europe

    NASA Astrophysics Data System (ADS)

    Georgopoulou, Elisavet; Djursvoll, Per; Simaiakis, Stylianos M.

    2016-07-01

    In recent decades, interest in understanding species distributions and exploring processes that shape species diversity has increased, leading to the development of advanced methods for the exploitation of occurrence data for analytical and ecological purposes. Here, with the use of georeferenced centipede data, we explore the importance and contribution of bioclimatic variables and land cover, and predict distribution ranges and potential hotspots in Norway. We used a maximum entropy analysis (Maxent) to model species' distributions, aiming at exploring centres of distribution, latitudinal spans and northern range boundaries of centipedes in Norway. The performance of all Maxent models was better than random with average test area under the curve (AUC) values above 0.893 and True Skill Statistic (TSS) values above 0.593. Our results showed a highly significant latitudinal gradient of increased species richness in southern grid-cells. Mean temperatures of warmest and coldest quarters explained much of the potential distribution of species. Predictive modelling analyses revealed that south-eastern Norway and the Atlantic coast in the west (inclusive of the major fjord system of Sognefjord), are local biodiversity hotspots with regard to high predictive species co-occurrence. We conclude that our predicted northward shifts of centipedes' distributions in Norway are likely a result of post-glacial recolonization patterns, species' ecological requirements and dispersal abilities.

  15. Quantum network with trusted and untrusted relays

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Annabestani, Razieh; Fung, Chi-Hang Fred; Lo, Hoi-Kwong; Lütkenhaus, Norbert; PitkäNen, David; Razavi, Mohsen

    2012-02-01

    Quantum key distribution enables two distant users to establish a random secure key by exploiting properties of quantum mechanics, and its security has been proven in theory. In practice, many lab and field demonstrations have been performed over the last 20 years. Nowadays, quantum networks with quantum key distribution systems are being tested around the world, such as in China, Europe, Japan and the US. In this talk, I will give a brief introduction to recent developments in quantum networks. For the untrusted-relay part, I will introduce the measurement-device-independent quantum key distribution scheme and a quantum relay with linear optics. The security of this scheme is proven without assumptions on the detection devices, where most quantum hacking strategies are launched. The scheme can be realized with current technology. For the trusted-relay part, I will introduce so-called delayed privacy amplification, with which no error correction or privacy amplification needs to be performed between the users and the relay. In this way, the classical-communication and computational-power requirements on the relay site are reduced.
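As background, the basis-sifting step shared by all BB84-family systems can be sketched in a few lines. This toy simulation assumes an ideal, eavesdropper-free channel; it is not a model of the measurement-device-independent scheme itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

alice_bits  = rng.integers(0, 2, n)
alice_basis = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal
bob_basis   = rng.integers(0, 2, n)

# Ideal channel, no Eve: Bob's result equals Alice's bit when the bases match
# and is random when they differ.
match = alice_basis == bob_basis
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Sifting: keep only rounds with matching bases (about half of them).
key_a = alice_bits[match]
key_b = bob_bits[match]
qber = np.mean(key_a != key_b)   # quantum bit error rate; 0 without an eavesdropper
print(len(key_a), qber)
```

An intercept-resend eavesdropper would show up here as a nonzero QBER in the sifted key, which is what error correction and privacy amplification then have to deal with.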

  16. Energy harvesting through gas dynamics in the free molecular flow regime between structured surfaces at different temperatures

    NASA Astrophysics Data System (ADS)

    Baier, Tobias; Dölger, Julia; Hardt, Steffen

    2014-05-01

    For a gas confined between surfaces held at different temperatures the velocity distribution shows a significant deviation from the Maxwell distribution when the mean free path of the molecules is comparable to or larger than the channel dimensions. If one of the surfaces is suitably structured, this nonequilibrium distribution can be exploited for momentum transfer in a tangential direction between the two surfaces. This opens up the possibility to extract work from the system which operates as a heat engine. Since both surfaces are held at constant temperatures, the mode of momentum transfer is different from the thermal creep flow that has gained more attention so far. This situation is studied in the limit of free-molecular flow for the case that an unstructured surface is allowed to move tangentially with respect to a structured surface. Parameter studies are conducted, and configurations with maximum thermodynamic efficiency are identified. Overall, it is shown that significant efficiencies can be obtained by tangential momentum transfer between structured surfaces.

  17. Energy harvesting through gas dynamics in the free molecular flow regime between structured surfaces at different temperatures.

    PubMed

    Baier, Tobias; Dölger, Julia; Hardt, Steffen

    2014-05-01

    For a gas confined between surfaces held at different temperatures the velocity distribution shows a significant deviation from the Maxwell distribution when the mean free path of the molecules is comparable to or larger than the channel dimensions. If one of the surfaces is suitably structured, this nonequilibrium distribution can be exploited for momentum transfer in a tangential direction between the two surfaces. This opens up the possibility to extract work from the system which operates as a heat engine. Since both surfaces are held at constant temperatures, the mode of momentum transfer is different from the thermal creep flow that has gained more attention so far. This situation is studied in the limit of free-molecular flow for the case that an unstructured surface is allowed to move tangentially with respect to a structured surface. Parameter studies are conducted, and configurations with maximum thermodynamic efficiency are identified. Overall, it is shown that significant efficiencies can be obtained by tangential momentum transfer between structured surfaces.

  18. An entropy-variables-based formulation of residual distribution schemes for non-equilibrium flows

    NASA Astrophysics Data System (ADS)

    Garicano-Mena, Jesús; Lani, Andrea; Degrez, Gérard

    2018-06-01

    In this paper we present an extension of Residual Distribution techniques for the simulation of compressible flows in non-equilibrium conditions. The latter are modeled by means of a state-of-the-art multi-species and two-temperature model. An entropy-based variable transformation that symmetrizes the projected advective Jacobian for such a thermophysical model is introduced. Moreover, the transformed advection Jacobian matrix presents a block diagonal structure, with mass-species and electronic-vibrational energy being completely decoupled from the momentum and total energy sub-system. The advantageous structure of the transformed advective Jacobian can be exploited by contour-integration-based Residual Distribution techniques: established schemes that operate on dense matrices can be replaced by the same scheme operating on the momentum-energy subsystem matrix together with repeated application of a scalar scheme to the mass-species and electronic-vibrational energy terms. Finally, the performance gain of the symmetrizing-variables formulation is quantified on a selection of representative test cases, ranging from subsonic to hypersonic, in inviscid or viscous conditions.

  19. Few-mode optical fiber based simultaneously distributed curvature and temperature sensing.

    PubMed

    Wu, Hao; Tang, Ming; Wang, Meng; Zhao, Can; Zhao, Zhiyong; Wang, Ruoxu; Liao, Ruolin; Fu, Songnian; Yang, Chen; Tong, Weijun; Shum, Perry Ping; Liu, Deming

    2017-05-29

    Few-mode fiber (FMF) based Brillouin sensing operated in quasi-single mode (QSM) has been reported to achieve distributed curvature measurement by monitoring the bend-induced strain variation. However, its practicality is limited by the inherent temperature-strain cross-sensitivity of Brillouin sensors. Here we propose and experimentally demonstrate an approach for simultaneous distributed curvature and temperature sensing that exploits a hybrid QSM-operated Raman-Brillouin system in FMFs. Thanks to the larger spot size of the fundamental mode in the FMF, the Brillouin frequency shift change of the FMF is used for curvature estimation, while the temperature variation is resolved through Raman signals with an enhanced signal-to-noise ratio (SNR). Within a 2-minute measuring time, a 1.5 m spatial resolution is achieved along a 2 km FMF. The worst-case resolution of the square of the fiber curvature is 0.333 cm^-2, while the temperature resolution is 1.301 °C at the end of the fiber.
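The cross-sensitivity correction at the heart of such hybrid schemes reduces to removing the temperature contribution from the measured Brillouin frequency shift, using the temperature obtained independently from the Raman channel. The coefficients below are typical textbook values for silica fiber, assumed for illustration and not taken from this paper:

```python
# Assumed, typical coefficients for silica fiber (illustrative, not the paper's):
C_T = 1.0     # Brillouin temperature coefficient, MHz per degree C
C_eps = 0.05  # Brillouin strain coefficient, MHz per microstrain

def strain_from_hybrid(delta_nu_B, delta_T_raman):
    """Subtract the temperature part of the Brillouin frequency shift (MHz),
    using the temperature change measured by the Raman channel (deg C),
    and convert the remainder to strain (microstrain)."""
    return (delta_nu_B - C_T * delta_T_raman) / C_eps

# e.g. a 30 MHz total BFS change, of which 10 deg C of heating explains 10 MHz:
print(strain_from_hybrid(30.0, 10.0))   # ~400 microstrain attributed to bending
```

The bend-induced strain is then mapped to curvature through the fiber geometry (offset of the core from the neutral axis), which is fiber-specific and omitted here.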

  20. How to resolve microsecond current fluctuations in single ion channels: The power of beta distributions

    PubMed Central

    Schroeder, Indra

    2015-01-01

    Abstract A main ingredient for the understanding of structure/function correlates of ion channels is the quantitative description of single-channel gating and conductance. However, a wealth of information provided by fast current fluctuations beyond the temporal resolution of the recording system is often ignored, even though it is close to the time window accessible to molecular dynamics simulations. Such current fluctuations present a special technical challenge, because individual opening/closing or blocking/unblocking events cannot be resolved, and the resulting averaging over undetected events decreases the apparent single-channel current. Here, I briefly summarize the history of fast-current-fluctuation analysis and focus on the so-called “beta distributions.” This tool exploits the characteristics of fluctuation-induced excess noise in current amplitude histograms to reconstruct the true single-channel current and kinetic parameters. A guideline for the analysis and recent applications demonstrate that the construction of theoretical beta distributions by Markov-model simulations offers maximum flexibility as compared to analytical solutions. PMID:26368656
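The averaging effect described above, where unresolved fast gating pulls the apparent single-channel amplitude toward the mean, can be reproduced with a toy simulation. The rate constants, filter width, and amplitudes below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-state (open/blocked) Markov channel gating much faster than the filter.
n = 200_000
p_ob, p_bo = 0.4, 0.4          # per-sample open->blocked / blocked->open probabilities
state = np.empty(n, dtype=int)
state[0] = 1
for t in range(1, n):
    if state[t - 1] == 1:
        state[t] = 0 if rng.random() < p_ob else 1
    else:
        state[t] = 1 if rng.random() < p_bo else 0

i_true = 1.0                   # true single-channel current (arbitrary units)
current = i_true * state

# Recording-system bandwidth limit, crudely modeled as a moving-average filter.
w = 50
filtered = np.convolve(current, np.ones(w) / w, mode="valid")

print(current.mean(), filtered.mean())  # both near the open probability of 0.5
print(filtered.max())                   # apparent amplitude stays well below i_true
```

The amplitude histogram of `filtered` is exactly the kind of excess-noise-broadened distribution that beta-distribution analysis inverts to recover `i_true` and the gating rates.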

  1. INTEGRATING PARASITES AND PATHOGENS INTO THE STUDY OF GEOGRAPHIC RANGE LIMITS.

    PubMed

    Bozick, Brooke A; Real, Leslie A

    2015-12-01

    The geographic distributions of all species are limited, and the determining factors that set these limits are of fundamental importance to the fields of ecology and evolutionary biology. Plant and animal ranges have been of primary concern, while those of parasites, which represent much of the Earth's biodiversity, have been neglected. Here, we review the determinants of the geographic ranges of parasites and pathogens, and explore how parasites provide novel systems with which to investigate the ecological and evolutionary processes governing host/parasite spatial distributions. Although there is significant overlap in the causative factors that determine range borders of parasites and free-living species, parasite distributions are additionally constrained by the geographic range and ecology of the host species' population, as well as by evolutionary factors that promote host-parasite coevolution. Recently, parasites have been used to infer population demographic and ecological information about their host organisms and we conclude that this strategy can be further exploited to understand geographic range limitations of both host and parasite populations.

  2. A prototype Infrastructure for Cloud-based distributed services in High Availability over WAN

    NASA Astrophysics Data System (ADS)

    Bulfon, C.; Carlino, G.; De Salvo, A.; Doria, A.; Graziosi, C.; Pardi, S.; Sanchez, A.; Carboni, M.; Bolletta, P.; Puccio, L.; Capone, V.; Merola, L.

    2015-12-01

    In this work we present the architectural and performance studies concerning a prototype of a distributed Tier2 infrastructure for HEP, instantiated between the two Italian sites of INFN-Roma1 and INFN-Napoli. The network infrastructure is based on a Layer-2 geographical link, provided by the Italian NREN (GARR), directly connecting the two remote LANs of the named sites. By exploiting the possibilities offered by the new distributed file systems, a shared storage area with synchronous copy has been set up. The computing infrastructure, based on an OpenStack facility, uses a set of distributed Hypervisors installed at both sites. The main parameter to be taken into account when managing two remote sites with a single framework is the effect of latency, due to the distance and the end-to-end service overhead. In order to understand the capabilities and limits of our setup, the impact of latency has been investigated by means of a set of stress tests, including data I/O throughput, metadata access performance evaluation and network occupancy, during the life cycle of a Virtual Machine. A set of resilience tests has also been performed, in order to verify the stability of the system in the event of hardware or software faults. The results of this work show that the reliability and robustness of the chosen architecture are effective enough to build a production system and to provide common services. This prototype can also be extended to multiple sites with small changes of the network topology, thus creating a National Network of Cloud-based distributed services, in HA over WAN.

  3. Network Penetration Testing and Research

    NASA Technical Reports Server (NTRS)

    Murphy, Brandon F.

    2013-01-01

    This paper will focus on the research and testing done on penetrating a network for security purposes. This research will provide the IT security office with new methods of attack across and against a company's network, as well as introduce new platforms and software that can be used to better assist with protecting against such attacks. Throughout this paper, testing and research have been done on two different Linux-based operating systems for attacking and compromising a Windows-based host computer. Backtrack 5 and BlackBuntu (Linux-based penetration-testing operating systems) are two different "attacker" computers that will attempt to plant viruses and exploits on a host Windows 7 operating system, as well as try to retrieve information from the host. On each Linux OS (Backtrack 5 and BlackBuntu) there is penetration-testing software which provides the necessary tools to create exploits that can compromise a Windows system as well as other operating systems. This paper will focus on two main methods of deploying exploits onto a host computer in order to retrieve information from a compromised system. The first method tested is known as a "social engineering" exploit. This type of method requires interaction from an unsuspecting user; with that interaction, a deployed exploit may allow a malicious user to gain access to the unsuspecting user's computer as well as the network that computer is connected to. Due to more advanced security settings and antivirus protection and detection, this method is easily identified and defended against. The second method of exploit deployment is the method mainly focused upon within this paper. This method required extensive research on the best way to compromise a security-enabled protected network. Once a network has been compromised, any and all devices connected to that network have the potential to be compromised as well. With a compromised network, computers and devices can be penetrated through deployed exploits. This paper will illustrate the research done to test the ability to penetrate a network without user interaction, in order to retrieve personal information from a targeted host.

  4. Study on the groundwater sustainable problem by numerical simulation in a multi-layered coastal aquifer system of Zhanjiang, China

    NASA Astrophysics Data System (ADS)

    Zhou, Pengpeng; Li, Ming; Lu, Yaodong

    2017-10-01

    Assessing sustainability of coastal groundwater is significant for groundwater management, as coastal groundwater is vulnerable to over-exploitation and contamination. To address the issues of serious groundwater level drawdown and potential seawater intrusion risk in a multi-layered coastal aquifer system in Zhanjiang, China, this paper presents a numerical modelling study to investigate the groundwater sustainability of this aquifer system. The transient modelling results show that the groundwater budget was negative (-3826×10^4 to -4502×10^4 m^3/a) during the years 2008-2011, revealing that this aquifer system was over-exploited. Meanwhile, the groundwater sustainability was assessed by evaluating the negative hydraulic pressure area (NHPA) of the unconfined aquifer and the groundwater level dynamics and flow velocity of the offshore boundaries of the confined aquifers. The results demonstrate that Nansan Island is most influenced by the NHPA and that the local groundwater there should not be exploited. The results also suggest that, with the current groundwater exploitation scheme, the sustainable yield should be 1.784×10^8 m^3/a (i.e., decreased by 20% from the current exploitation amount). To satisfy public water demands, the 20% decrease of the exploitation amount can be offset by groundwater sourced from the Taiping groundwater resource field. These results provide valuable guidance for groundwater management in Zhanjiang.

  5. Generating Spatiotemporal Joint Torque Patterns from Dynamical Synchronization of Distributed Pattern Generators

    PubMed Central

    Pitti, Alexandre; Lungarella, Max; Kuniyoshi, Yasuo

    2009-01-01

    Pattern generators found in the spinal cord are no longer seen as simple rhythmic oscillators for motion control. Indeed, they achieve flexible and dynamical coordination in interaction with the body and the environment dynamics, giving rise to motor synergies. Discovering the mechanisms underlying the control of motor synergies constitutes an important research question not only for neuroscience but also for robotics: motor coordination in high-dimensional robotic systems remains a challenge, and new control methods based on biological solutions may reduce their overall complexity. We propose to model the flexible combination of motor synergies in embodied systems via partial phase synchronization of distributed chaotic systems; for specific coupling strengths, chaotic systems are able to phase synchronize their dynamics to the resonant frequencies of an external force. We take advantage of this property to explore and exploit the intrinsic dynamics of a given embodied system. In two experiments with bipedal walkers, we show how motor synergies emerge when the controllers phase synchronize to the body's dynamics, entraining it to its intrinsic behavioral patterns. This stage is characterized by directed information flow from the sensors to the motors, exhibiting the optimal situation when the body dynamics drive the controllers (mutual entrainment). Based on our results, we discuss the relevance of our findings for modeling the modular control of distributed pattern generators exhibited in the spinal cord, and for exploring motor synergies in robots. PMID:20011216

  6. Phase Distribution and Selection of Partially Correlated Persistent Scatterers

    NASA Astrophysics Data System (ADS)

    Lien, J.; Zebker, H. A.

    2012-12-01

    Interferometric synthetic aperture radar (InSAR) time-series methods can effectively estimate temporal surface changes induced by geophysical phenomena. However, such methods are susceptible to decorrelation due to spatial and temporal baselines (radar pass separation), changes in orbital geometries, atmosphere, and noise. These effects limit the number of interferograms that can be used for differential analysis and obscure the deformation signal. InSAR decorrelation effects may be ameliorated by exploiting pixels that exhibit phase stability across the stack of interferograms. These so-called persistent scatterer (PS) pixels are dominated by a single point-like scatterer that remains phase-stable over the spatial and temporal baseline. By identifying a network of PS pixels for use in phase unwrapping, reliable deformation measurements may be obtained even in areas of low correlation, where traditional InSAR techniques fail to produce useful observations. Many additional pixels can be added to the PS list if we are able to identify those in which a dominant scatterer exhibits partial, rather than complete, correlation across all radar scenes. In this work, we quantify and exploit the phase stability of partially correlated PS pixels. We present a new system model for producing interferometric pixel values from a complex surface backscatter function characterized by signal-to-clutter ratio (SCR). From this model, we derive the joint probabilistic distribution for PS pixel phases in a stack of interferograms as a function of SCR and spatial baselines. This PS phase distribution generalizes previous results that assume the clutter phase contribution is uncorrelated between radar passes. We verify the analytic distribution through a series of radar scattering simulations. We use the derived joint PS phase distribution with maximum-likelihood SCR estimation to analyze an area of the Hayward Fault Zone in the San Francisco Bay Area. 
We obtain a series of 38 interferometric images of the area from C-band ERS radar satellite passes between May 1995 and December 2000. We compare the estimated SCRs to those calculated with previously derived PS phase distributions. Finally, we examine the PS network density resulting from varying selection thresholds of SCR and compare to other PS identification techniques.
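The link between signal-to-clutter ratio and PS phase stability can be sketched numerically. The high-SCR relation var(phi) ~ 1/(2·SCR), for a unit-amplitude dominant scatterer plus circular Gaussian clutter, is a standard approximation; the numbers below are illustrative and this moment-based estimator is a simplification, not the authors' maximum-likelihood procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ps_pixel(scr, n_scenes):
    """Pixel value per scene: unit-amplitude dominant scatterer plus
    circular complex Gaussian clutter with power 1/scr."""
    clutter_power = 1.0 / scr
    clutter = np.sqrt(clutter_power / 2) * (
        rng.standard_normal(n_scenes) + 1j * rng.standard_normal(n_scenes))
    return 1.0 + clutter

# High-SCR approximation: phase variance ~ 1/(2*SCR), so SCR can be
# estimated from the observed phase dispersion across the stack.
results = {}
for scr in [10.0, 100.0]:
    phases = np.angle(simulate_ps_pixel(scr, n_scenes=5000))
    results[scr] = 1.0 / (2.0 * np.var(phases))
    print(scr, round(results[scr], 1))
```

With only the 38 scenes of a real stack, the estimate is far noisier, which is one motivation for the full likelihood treatment of the phase distribution derived in the paper.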

  7. Contention Modeling for Multithreaded Distributed Shared Memory Machines: The Cray XMT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Secchi, Simone; Tumeo, Antonino; Villa, Oreste

    Distributed Shared Memory (DSM) machines are a wide class of multi-processor computing systems where a large virtually-shared address space is mapped on a network of physically distributed memories. High memory latency and network contention are two of the main factors that limit performance scaling of such architectures. Modern high-performance computing DSM systems have evolved toward exploitation of massive hardware multi-threading and fine-grained memory hashing to tolerate irregular latencies, avoid network hot-spots and enable high scaling. In order to model the performance of such large-scale machines, parallel simulation has been proved to be a promising approach to achieve good accuracy in reasonable times. One of the most critical factors in solving the simulation speed-accuracy trade-off is network modeling. The Cray XMT is a massively multi-threaded supercomputing architecture that belongs to the DSM class, since it implements a globally-shared address space abstraction on top of a physically distributed memory substrate. In this paper, we discuss the development of a contention-aware network model intended to be integrated in a full-system XMT simulator. We start by measuring the effects of network contention in a 128-processor XMT machine and then investigate the trade-off that exists between simulation accuracy and speed, by comparing three network models which operate at different levels of accuracy. The comparison and model validation is performed by executing a string-matching algorithm on the full-system simulator and on the XMT, using three datasets that generate noticeably different contention patterns.

  8. Astronomical Verification of a Stabilized Frequency Reference Transfer System for the Square Kilometer Array

    NASA Astrophysics Data System (ADS)

    Gozzard, David R.; Schediwy, Sascha W.; Dodson, Richard; Rioja, María J.; Hill, Mike; Lennon, Brett; McFee, Jock; Mirtschin, Peter; Stevens, Jamie; Grainge, Keith

    2017-07-01

    In order to meet its cutting-edge scientific objectives, the Square Kilometre Array (SKA) telescope requires high-precision frequency references to be distributed to each of its antennas. The frequency references are distributed via fiber-optic links and must be actively stabilized to compensate for phase noise imposed on the signals by environmental perturbations on the links. SKA engineering requirements demand that any proposed frequency reference distribution system be proved in “astronomical verification” tests. We present results of the astronomical verification of a stabilized frequency reference transfer system proposed for SKA-mid. The dual-receiver architecture of the Australia Telescope Compact Array was exploited to subtract the phase noise of the sky signal from the data, allowing the phase noise of observations performed using a standard frequency reference, as well as the stabilized frequency reference transfer system transmitting over 77 km of fiber-optic cable, to be directly compared. Results are presented for the fractional frequency stability and phase drift of the stabilized frequency reference transfer system for celestial calibrator observations at 5 and 25 GHz. These observations plus additional laboratory results for the transferred signal stability over a 166 km metropolitan fiber-optic link are used to show that the stabilized transfer system under test exceeds all SKA phase-stability requirements within a broad range of observing conditions. Furthermore, we have shown that alternative reference dissemination systems that use multiple synthesizers to supply reference signals to sub-sections of an array may limit the imaging capability of the telescope.
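Fractional frequency stability of the kind quoted for such transfer systems is commonly summarized by the overlapping Allan deviation. A minimal sketch with synthetic white frequency noise; the noise level and sample count are invented, and this is not the SKA verification pipeline:

```python
import numpy as np

def overlapping_adev(y, m):
    """Overlapping Allan deviation of fractional-frequency samples y,
    at an averaging time of m sample intervals."""
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")   # m-sample averages
    d = ybar[m:] - ybar[:-m]                              # overlapping pairs
    return np.sqrt(0.5 * np.mean(d ** 2))

rng = np.random.default_rng(6)
y = 1e-13 * rng.standard_normal(100_000)   # white frequency noise, sigma_y = 1e-13

adev_1 = overlapping_adev(y, 1)
adev_100 = overlapping_adev(y, 100)
# For white frequency noise the Allan deviation falls as 1/sqrt(m),
# so these come out near 1e-13 and 1e-14 respectively.
print(adev_1, adev_100)
```

Link phase noise with long correlation times would instead flatten or raise the curve at long averaging times, which is what the active stabilization is designed to suppress.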

  9. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
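The "reduce then sample" idea, running MCMC against a cheap surrogate instead of the expensive forward model, can be sketched with a one-parameter toy problem. The forward models, data, and prior below are invented for illustration and bear no relation to the project's groundwater testbed:

```python
import numpy as np

rng = np.random.default_rng(4)

# "Full" forward model (expensive in practice) and a cheap reduced surrogate
# (here, a truncated series expansion standing in for a reduced-order model).
def forward_full(theta):
    return np.sin(theta) + 0.1 * theta ** 2

def forward_reduced(theta):
    return theta - theta ** 3 / 6 + 0.1 * theta ** 2

y_obs, sigma = 0.5, 0.2

def log_post(theta, forward):
    # Gaussian likelihood, flat prior on [-2, 2].
    if abs(theta) > 2:
        return -np.inf
    return -0.5 * ((y_obs - forward(theta)) / sigma) ** 2

# "Reduce then sample": every Metropolis step evaluates only the surrogate.
theta, chain = 0.0, []
for _ in range(20_000):
    prop = theta + 0.5 * rng.standard_normal()
    if np.log(rng.random()) < (log_post(prop, forward_reduced)
                               - log_post(theta, forward_reduced)):
        theta = prop
    chain.append(theta)

post_mean = float(np.mean(chain[5000:]))
print(post_mean)   # posterior concentrates near the theta matching y_obs
```

The project's finding that the reduced-model posterior barely differs from the full-model one corresponds here to `forward_reduced` tracking `forward_full` closely over the prior range.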

  10. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  11. Phylogenetic distribution of a male pheromone that may exploit a nonsexual preference in lampreys

    USGS Publications Warehouse

    Buchinger, Tyler J.; Bussy, Ugo; Li, Ke; Wang, Huiyong; Huertas, Mar; Baker, Cindy F.; Jia, Liang; Hayes, Michael C.; Li, Weiming; Johnson, Nicholas

    2017-01-01

    Pheromones are among the most important sexual signals used by organisms throughout the animal kingdom. However, few are identified in vertebrates, leaving the evolutionary mechanisms underlying vertebrate pheromones poorly understood. Pre-existing biases in receivers’ perceptual systems shape visual and auditory signaling systems, but studies on how receiver biases influence the evolution of pheromone communication remain sparse. The lamprey Petromyzon marinus uses a relatively well-understood suite of pheromones and offers a unique opportunity to study the evolution of vertebrate pheromone communication. Previous studies indicate that male signaling with the mating pheromone 3-keto petromyzonol sulfate (3kPZS) may exploit a nonsexual attraction to juvenile-released 3kPZS that guides migration into productive rearing habitat. Here, we infer the distribution of male signaling with 3kPZS using a phylogenetic comparison comprising six of ten genera and two of three families. Our results indicate that only P. marinus and Ichthyomyzon castaneus release 3kPZS at high rates. Olfactory and behavioral assays with P. marinus, I. castaneus and a subset of three other species that do not use 3kPZS as a sexual signal indicate that male signaling might drive the evolution of female adaptations to detect 3kPZS with specific olfactory mechanisms and respond to 3kPZS with targeted attraction relevant during mate search. We postulate that 3kPZS communication evolved independently in I. castaneus and P. marinus, but cannot eliminate the alternative that other species lost 3kPZS communication. Regardless, our results represent a rare macroevolutionary investigation of a vertebrate pheromone and insight into the evolutionary mechanisms underlying pheromone communication.

  12. A Kernel Embedding-Based Approach for Nonstationary Causal Model Inference.

    PubMed

    Hu, Shoubo; Chen, Zhitang; Chan, Laiwan

    2018-05-01

    Although nonstationary data are more common in the real world, most existing causal discovery methods do not take nonstationarity into consideration. In this letter, we propose a kernel embedding-based approach, ENCI, for nonstationary causal model inference where data are collected from multiple domains with varying distributions. In ENCI, we transform the complicated relation of a cause-effect pair into a linear model of variables whose observations correspond to the kernel embeddings of the cause and effect distributions in different domains. In this way, we are able to estimate the causal direction by exploiting the causal asymmetry of the transformed linear model. Furthermore, we extend ENCI to causal graph discovery for multiple variables by transforming the relations among them into a linear non-Gaussian acyclic model. We show that by exploiting the nonstationarity of distributions, both cause-effect pairs and two kinds of causal graphs are identifiable under mild conditions. Experiments on synthetic and real-world data are conducted to justify the efficacy of ENCI over major existing methods.

  13. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
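The operators analyzed above, proportional selection as a global search operator and recombination as the similarity-exploiting search process, can be illustrated on the standard OneMax toy problem. All parameter values here are arbitrary choices for illustration, not from the report:

```python
import numpy as np

rng = np.random.default_rng(5)

# Maximize f(x) = number of ones in a bit string (OneMax).
n_bits, pop_size, gens = 30, 60, 80

def fitness(pop):
    return pop.sum(axis=1).astype(float)

pop = rng.integers(0, 2, (pop_size, n_bits))
for _ in range(gens):
    f = fitness(pop)
    # Proportional (roulette-wheel) selection: the global search operator.
    probs = f / f.sum()
    parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]
    # One-point recombination: exploits similarities between candidate solutions.
    half = pop_size // 2
    cut = rng.integers(1, n_bits, half)
    children = parents.copy()
    for i, c in enumerate(cut):
        children[2 * i, c:] = parents[2 * i + 1, c:]
        children[2 * i + 1, c:] = parents[2 * i, c:]
    # Light mutation keeps the search from stalling on lost alleles.
    mask = rng.random(children.shape) < 0.01
    pop = np.where(mask, 1 - children, children)

best = fitness(pop).max()
print(best)   # typically close to the optimum of 30
```

Note that selection and recombination act directly on the candidate solutions' similarities, which is the point the paper argues the schema-theoretic view obscures.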

  14. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.

  15. A parallel time integrator for noisy nonlinear oscillatory systems

    NASA Astrophysics Data System (ADS)

    Subber, Waad; Sarkar, Abhijit

    2018-06-01

    In this paper, we adapt a parallel time integration scheme to track the trajectories of noisy non-linear dynamical systems. Specifically, we formulate a parallel algorithm to generate the sample path of a nonlinear oscillator defined by stochastic differential equations (SDEs) using the so-called parareal method for ordinary differential equations (ODEs). The presence of the Wiener process in SDEs causes difficulties in the direct application of numerical integration techniques for ODEs, including the parareal algorithm. The parallel implementation of the algorithm involves two SDE solvers, namely a fine-level scheme to integrate the system in parallel and a coarse-level scheme to generate and correct the required initial conditions to start the fine-level integrators. For the numerical illustration, a randomly excited Duffing oscillator is investigated in order to study the performance of the stochastic parallel algorithm with respect to a range of system parameters. The distributed implementation of the algorithm exploits the Message Passing Interface (MPI).
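
    The parareal scheme summarized above alternates a cheap serial coarse sweep with fine solves that are mutually independent across subintervals, which is what an MPI implementation parallelizes. A minimal serial sketch for a deterministic Duffing oscillator (explicit Euler at both levels; the stochastic forcing and the distributed MPI aspects of the paper are omitted, and all parameter values are illustrative):

```python
import numpy as np

def duffing(t, y, delta=0.2, alpha=1.0, beta=1.0, gamma=0.3, omega=1.2):
    """Deterministic Duffing oscillator, state y = (position, velocity)."""
    x, v = y
    return np.array([v, -delta * v - alpha * x - beta * x**3 + gamma * np.cos(omega * t)])

def euler(f, y0, t0, t1, steps):
    """Explicit Euler integration of y' = f(t, y) over [t0, t1]."""
    y, t = np.asarray(y0, dtype=float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        y = y + h * f(t, y)
        t += h
    return y

def parareal(f, y0, t0, t1, n_sub=10, k_iters=5, coarse_steps=2, fine_steps=200):
    """Parareal iteration: the coarse sweep predicts initial conditions
    for each subinterval; the fine solves in each iteration are
    independent and could run on separate processes."""
    ts = np.linspace(t0, t1, n_sub + 1)
    U = [np.asarray(y0, dtype=float)]
    for n in range(n_sub):                       # initial serial coarse sweep
        U.append(euler(f, U[n], ts[n], ts[n + 1], coarse_steps))
    for _ in range(k_iters):
        F = [euler(f, U[n], ts[n], ts[n + 1], fine_steps) for n in range(n_sub)]
        G_old = [euler(f, U[n], ts[n], ts[n + 1], coarse_steps) for n in range(n_sub)]
        U_new = [U[0]]
        for n in range(n_sub):
            # predictor-corrector update: G(new) + F(old) - G(old)
            G_new = euler(f, U_new[n], ts[n], ts[n + 1], coarse_steps)
            U_new.append(G_new + F[n] - G_old[n])
        U = U_new
    return U
```

    Each iteration makes one additional subinterval exact with respect to the fine propagator, so with k_iters equal to n_sub the scheme reproduces the serial fine solution.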

  16. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each of which may be modularly formulated by different departments and solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
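
    The coordination pattern investigated in this record can be illustrated with a toy problem: two "departmental" subproblems share one linking variable, each is solved modularly against a system-level target, and a coordinator iterates until the local copies agree. The quadratic objectives below are hypothetical, and the coordination is written as consensus ADMM, a close augmented-Lagrangian relative of ATC rather than the paper's exact formulation:

```python
def coordinate(a=(1.0, 3.0), rho=1.0, iters=200):
    """Coordinate subproblems min (z - a_i)^2 that share linking
    variable z. Each subproblem keeps a local copy z_i tied to the
    system-level target t by a quadratic penalty with multiplier u_i."""
    t = 0.0                       # system-level target for z
    u = [0.0 for _ in a]          # per-subproblem multiplier estimates
    z = [0.0 for _ in a]
    for _ in range(iters):
        # each subproblem (closed form): min (z_i - a_i)^2 + (rho/2)(z_i - t + u_i)^2
        z = [(2 * ai + rho * (t - ui)) / (2 + rho) for ai, ui in zip(a, u)]
        # coordinator: consensus target and multiplier updates
        t = sum(zi + ui for zi, ui in zip(z, u)) / len(z)
        u = [ui + zi - t for ui, zi in zip(u, z)]
    return t, z

t, z = coordinate()   # system-level optimum of (z-1)^2 + (z-3)^2 is z = 2
```

    The subproblems never see each other's objectives, only the target and their own multiplier, which mirrors how organizationally compartmentalized services can still reach a system-level optimum.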

  17. Communication architecture for large geostationary platforms

    NASA Technical Reports Server (NTRS)

    Bond, F. E.

    1979-01-01

    Large platforms have been proposed for supporting multipurpose communication payloads to exploit economy of scale, reduce congestion in the geostationary orbit, provide interconnectivity between diverse earth stations, and obtain significant frequency reuse with large multibeam antennas. This paper addresses a specific system design, starting with traffic projections in the next two decades and discussing tradeoffs and design approaches for major components including: antennas, transponders, and switches. Other issues explored are selection of frequency bands, modulation, multiple access, switching methods, and techniques for servicing areas with nonuniform traffic demands. Three major services are considered: a high-volume trunking system, a direct-to-user system, and a broadcast system for video distribution and similar functions. Estimates of payload weight and d.c. power requirements are presented. Other subjects treated are: considerations of equipment layout for servicing by an orbit transfer vehicle, mechanical stability requirements for the large antennas, and reliability aspects of the large number of transponders employed.

  18. Synthetic depth data creation for sensor setup planning and evaluation of multi-camera multi-person trackers

    NASA Astrophysics Data System (ADS)

    Pattke, Marco; Martin, Manuel; Voit, Michael

    2017-05-01

    Tracking people with cameras in public areas is common today. However, with an increasing number of cameras it becomes harder and harder to view the data manually. Especially in safety-critical areas, automatic image exploitation could help to solve this problem. Setting up such a system can, however, be difficult because of its increased complexity. Sensor placement is critical to ensure that people are detected and tracked reliably. We try to solve this problem using a simulation framework that is able to simulate different camera setups in the desired environment including animated characters. We combine this framework with our self-developed distributed and scalable system for people tracking to test its effectiveness and can show the results of the tracking system in real time in the simulated environment.

  19. Morphological communication: exploiting coupled dynamics in a complex mechanical structure to achieve locomotion

    PubMed Central

    Rieffel, John A.; Valero-Cuevas, Francisco J.; Lipson, Hod

    2010-01-01

    Traditional engineering approaches strive to avoid, or actively suppress, nonlinear dynamic coupling among components. Biological systems, in contrast, are often rife with these dynamics. Could there be, in some cases, a benefit to high degrees of dynamical coupling? Here we present a distributed robotic control scheme inspired by the biological phenomenon of tensegrity-based mechanotransduction. This emergence of morphology-as-information-conduit or ‘morphological communication’, enabled by time-sensitive spiking neural networks, presents a new paradigm for the decentralized control of large, coupled, modular systems. These results significantly bolster, both in magnitude and in form, the idea of morphological computation in robotic control. Furthermore, they lend further credence to ideas of embodied anatomical computation in biological systems, on scales ranging from cellular structures up to the tendinous networks of the human hand. PMID:19776146

  20. Revealing physical interaction networks from statistics of collective dynamics

    PubMed Central

    Nitzan, Mor; Casadiego, Jose; Timme, Marc

    2017-01-01

    Revealing physical interactions in complex systems from observed collective dynamics constitutes a fundamental inverse problem in science. Current reconstruction methods require access to a system’s model or dynamical data at a level of detail often not available. We exploit changes in invariant measures, in particular distributions of sampled states of the system in response to driving signals, and use compressed sensing to reveal physical interaction networks. Dynamical observations following driving suffice to infer physical connectivity even if they are temporally disordered, are acquired at large sampling intervals, and stem from different experiments. Testing various nonlinear dynamic processes emerging on artificial and real network topologies indicates high reconstruction quality for existence as well as type of interactions. These results advance our ability to reveal physical interaction networks in complex synthetic and natural systems. PMID:28246630
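
    The compressed-sensing step in this record can be sketched in isolation: the in-links of one node form a sparse vector, each driving experiment yields one linear measurement of it, and an l1 solver recovers the links from fewer experiments than nodes. Everything below (problem sizes, the generic ISTA solver, the synthetic data) is illustrative rather than the paper's actual pipeline:

```python
import numpy as np

def ista(A, y, lam=0.05, iters=1000):
    """Iterative soft-thresholding (ISTA), a basic solver for the lasso
    problem min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - A.T @ (A @ x - y) / L        # gradient step on the quadratic term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(0)
n, m, k = 40, 25, 4                           # nodes, driving experiments, true links
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 1.0  # sparse in-link pattern of one node
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))  # one measurement row per driving signal
y = A @ x_true
x_hat = ista(A, y)    # recovers the k links from m < n experiments
```

    The point of the construction is that m can be much smaller than n when the connectivity is sparse, so far fewer driving experiments than nodes suffice.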

  1. Geographically distributed Batch System as a Service: the INDIGO-DataCloud approach exploiting HTCondor

    NASA Astrophysics Data System (ADS)

    Aiftimiei, D. C.; Antonacci, M.; Bagnasco, S.; Boccali, T.; Bucchi, R.; Caballer, M.; Costantini, A.; Donvito, G.; Gaido, L.; Italiano, A.; Michelotto, D.; Panella, M.; Salomoni, D.; Vallero, S.

    2017-10-01

    One of the challenges a scientific computing center has to face is to keep delivering well consolidated computational frameworks (i.e. the batch computing farm), while conforming to modern computing paradigms. The aim is to ease system administration at all levels (from hardware to applications) and to provide a smooth end-user experience. Within the INDIGO-DataCloud project, we adopt two different approaches to implement a PaaS-level, on-demand Batch Farm Service based on HTCondor and Mesos. In the first approach, described in this paper, the various HTCondor daemons are packaged inside pre-configured Docker images and deployed as Long Running Services through Marathon, profiting from its health checks and failover capabilities. In the second approach, we are going to implement an ad-hoc HTCondor framework for Mesos. Container-to-container communication and isolation have been addressed exploring a solution based on overlay networks (based on the Calico Project). Finally, we have studied the possibility of deploying an HTCondor cluster that spans different sites, exploiting the Condor Connection Broker component, which allows communication across a private network boundary or firewall, as in the case of multi-site deployments. In this paper, we describe and motivate our implementation choices and show the results of the first tests performed.

  2. THE HERSCHEL EXPLOITATION OF LOCAL GALAXY ANDROMEDA (HELGA). VI. THE DISTRIBUTION AND PROPERTIES OF MOLECULAR CLOUD ASSOCIATIONS IN M31

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirk, J. M.; Gear, W. K.; Smith, M. W. L.

    In this paper we present a catalog of giant molecular clouds (GMCs) in the Andromeda (M31) galaxy extracted from the Herschel Exploitation of Local Galaxy Andromeda (HELGA) data set. GMCs are identified from the Herschel maps using a hierarchical source extraction algorithm. We present the results of this new catalog and characterize the spatial distribution and spectral energy properties of its clouds based on the radial dust/gas properties found by Smith et al. A total of 326 GMCs in the mass range 10^4-10^7 M_☉ are identified; their cumulative mass distribution is found to be proportional to M^(-2.34), in agreement with earlier studies. The GMCs appear to follow the same correlation of cloud mass to L_CO observed in the Milky Way. However, comparison between this catalog and interferometry studies also shows that the GMCs are substructured below the Herschel resolution limit, suggesting that we are observing associations of GMCs. Following Gordon et al., we study the spatial structure of M31 by splitting the observed structure into a set of spiral arms and offset rings. We fit radii of 10.3 and 15.5 kpc to the two most prominent rings. We then fit a logarithmic spiral with a pitch angle of 8.9° to the GMCs not associated with either ring. Last, we comment on the effects of deprojection on our results and investigate the effect different models for M31's inclination will have on the projection of an unperturbed spiral arm system.

  3. Automatic forest-fire measuring using ground stations and Unmanned Aerial Systems.

    PubMed

    Martínez-de Dios, José Ramiro; Merino, Luis; Caballero, Fernando; Ollero, Anibal

    2011-01-01

    This paper presents a novel system for automatic forest-fire measurement using cameras distributed at ground stations and mounted on Unmanned Aerial Systems (UAS). It can obtain geometrical measurements of forest fires in real-time such as the location and shape of the fire front, flame height and rate of spread, among others. Measurement of forest fires is a challenging problem that is affected by numerous potential sources of error. The proposed system addresses them by exploiting the complementarities between infrared and visual cameras located at different ground locations together with others onboard Unmanned Aerial Systems (UAS). The system applies image processing and geo-location techniques to obtain forest-fire measurements individually from each camera and then integrates the results from all the cameras using statistical data fusion techniques. The proposed system has been extensively tested and validated in close-to-operational conditions in field fire experiments with controlled safety conditions carried out in Portugal and Spain from 2001 to 2006.

  4. Automatic Forest-Fire Measuring Using Ground Stations and Unmanned Aerial Systems

    PubMed Central

    Martínez-de Dios, José Ramiro; Merino, Luis; Caballero, Fernando; Ollero, Anibal

    2011-01-01

    This paper presents a novel system for automatic forest-fire measurement using cameras distributed at ground stations and mounted on Unmanned Aerial Systems (UAS). It can obtain geometrical measurements of forest fires in real-time such as the location and shape of the fire front, flame height and rate of spread, among others. Measurement of forest fires is a challenging problem that is affected by numerous potential sources of error. The proposed system addresses them by exploiting the complementarities between infrared and visual cameras located at different ground locations together with others onboard Unmanned Aerial Systems (UAS). The system applies image processing and geo-location techniques to obtain forest-fire measurements individually from each camera and then integrates the results from all the cameras using statistical data fusion techniques. The proposed system has been extensively tested and validated in close-to-operational conditions in field fire experiments with controlled safety conditions carried out in Portugal and Spain from 2001 to 2006. PMID:22163958

  5. A reactive transport model for the quantification of risks induced by groundwater heat pump systems in urban aquifers

    NASA Astrophysics Data System (ADS)

    García-Gil, Alejandro; Epting, Jannis; Ayora, Carlos; Garrido, Eduardo; Vázquez-Suñé, Enric; Huggenberger, Peter; Gimenez, Ana Cristina

    2016-11-01

    Shallow geothermal resource exploitation through the use of groundwater heat pump systems not only has hydraulic and thermal effects on the environment but also induces physicochemical changes that can compromise the operability of installations. This study focuses on chemical clogging and dissolution subsidence processes observed during the geothermal re-injection of pumped groundwater into an urban aquifer. To explain these phenomena, two transient reactive transport models of a groundwater heat pump installation in an alluvial aquifer were used to reproduce groundwater-solid matrix interactions occurring in a surrounding aquifer environment during system operation. The models couple groundwater flow, heat and solute transport together with chemical reactions. In these models, the permeability distribution in space changes with precipitation-dissolution reactions over time. The simulations allowed us to estimate the calcite precipitation rates and porosity variations over space and time as a function of existent hydraulic gradients in an aquifer as well as the intensity of CO2 exchanges with the atmosphere. The results obtained from the numerical model show how CO2 exsolution processes that occur during groundwater reinjection into an aquifer and calcite precipitation are related to hydraulic efficiency losses in exploitation systems. Finally, the performance of reinjection wells was evaluated over time according to different scenarios until the systems were fully obstructed. Our simulations also show a reduction in hydraulic conductivity that forces re-injected water to flow downwards, thereby enhancing the dissolution of evaporitic bedrock and producing subsidence that can ultimately result in a dramatic collapse of the injection well infrastructure.

  6. Rational selective exploitation and distress: employee reactions to performance-based and mobility-based reward allocations.

    PubMed

    Rusbult, C E; Campbell, M A; Price, M E

    1990-09-01

    Prior research has demonstrated that allocators frequently distribute greater rewards to persons with high professional and geographic mobility than to persons with constrained mobility, especially among the very competent. This phenomenon has been termed rational selective exploitation. Do the recipients of such allocations actually experience this distribution rule as unjust and distressing, or is it a misnomer to refer to this phenomenon as exploitation? Two studies were conducted to explore this question. Study 1 was a laboratory experiment in which we manipulated relative performance level, relative mobility level, and allocation standard: performance based versus mobility based. Study 2 was a cross-sectional survey of actual employees in which subjects reported the degree to which performance and mobility were the basis for pay decisions at their places of employment, as well as the degree to which they perceived each standard to be fair. Both studies demonstrated that people regard mobility-based allocations as less fair and more distressing than performance-based allocations. Furthermore, the degree of distress resulting from mobility-based allocations is greater among persons who are disadvantaged by that standard: among people with constrained mobility, especially those who perform at high levels. These findings provide good support for the assertion that so-called rational selective exploitation is indeed distressing to employees. Reactions to this form of distress are also explored, and the implications of these findings for the allocation process are discussed.

  7. The ESA scientific exploitation element results and outlook

    NASA Astrophysics Data System (ADS)

    Desnos, Yves-louis; Regner, Peter; Delwart, Steven; Benveniste, Jerome; Engdahl, Marcus; Donlon, Craig; Mathieu, Pierre-Philippe; Fernandez, Diego; Gascon, Ferran; Zehner, Claus; Davidson, Malcolm; Goryl, Philippe; Koetz, Benjamin; Pinnock, Simon

    2017-04-01

    The prime objective of the Scientific Exploitation of Operational Missions (SEOM) element of ESA's fourth Earth Observation Envelope Programme (EOEP4) is to federate, support and expand the international research community built up over the last 25 years exploiting ESA's EO missions. SEOM enables the science community to address new scientific research areas that are opened by the free and open access to data from operational EO missions. Based on community-wide recommendations, gathered through a series of international thematic workshops and scientific user consultation meetings, key research studies have been launched over the last years to further exploit data from the Sentinels (http://seom.esa.int/). During 2016, several science user consultation workshops were organized, new results from scientific studies were published, and open-source multi-mission scientific toolboxes were distributed (SNAP: 80,000 users from 190 countries). In addition, the first ESA Massive Open Online Courses on climate from space were deployed (20,000 participants), and the second EO Open Science conference was organized at ESA in September 2016, bringing together young EO scientists and data scientists. The new EOEP5 exploitation element, approved in 2016 and starting in 2017, takes stock of all precursor activities in EO Open Science and Innovation; in particular, a work plan for ESA scientific exploitation activities has been presented to Member States, taking full benefit of the latest information and communication technology. The results and highlights from current scientific exploitation activities will be presented, and an outlook on the upcoming activities under the new EOEP5 exploitation element will be given.

  8. Assessing groundwater availability and the response of the groundwater system to intensive exploitation in the North China Plain by analysis of long-term isotopic tracer data

    NASA Astrophysics Data System (ADS)

    Su, Chen; Cheng, Zhongshuang; Wei, Wen; Chen, Zongyu

    2018-03-01

    The use of isotope tracers as a tool for assessing aquifer responses to intensive exploitation is demonstrated and used to attain a better understanding of the sustainability of intensively exploited aquifers in the North China Plain. Eleven well sites were selected that have long-term (years 1985-2014) analysis data of isotopic tracers. The stable isotopes δ18O and δ2H and hydrochemistry were used to understand the hydrodynamic responses of the aquifer system, including unconfined and confined aquifers, to groundwater abstraction. The time series data of 14C activity were also used to assess groundwater age, thereby contributing to an understanding of groundwater sustainability and aquifer depletion. Enrichment of the heavy oxygen isotope (18O) and elevated concentrations of chloride, sulfate, and nitrate were found in groundwater abstracted from the unconfined aquifer, which suggests that intensive exploitation might induce the potential for aquifer contamination. The time series data of 14C activity showed an increase of groundwater age with exploitation of the confined parts of the aquifer system, which indicates that a larger fraction of old water has been exploited over time, and that the groundwater from the deep aquifer has been mined. The current water demand exceeds the sustainable production capabilities of the aquifer system in the North China Plain. Some measures must be taken to ensure major cuts in groundwater withdrawals from the aquifers after a long period of depletion.

  9. Social Familiarity Governs Prey Patch-Exploitation, - Leaving and Inter-Patch Distribution of the Group-Living Predatory Mite Phytoseiulus persimilis

    PubMed Central

    Zach, Gernot J.; Peneder, Stefan; Strodl, Markus A.; Schausberger, Peter

    2012-01-01

    Background In group-living animals, social interactions and their effects on other life activities such as foraging are commonly determined by discrimination among group members. Accordingly, many group-living species evolved sophisticated social recognition abilities such as the ability to recognize familiar individuals, i.e. individuals encountered before. Social familiarity may affect within-group interactions and between-group movements. In environments with patchily distributed prey, group-living predators must repeatedly decide whether to stay with the group in a given prey patch or to leave and search for new prey patches and groups. Methodology/Principal Findings Based on the assumption that in group-living animals social familiarity allows individuals to optimize their performance in other tasks, as for example predicted by limited attention theory, we assessed the influence of social familiarity on prey patch exploitation, patch-leaving, and inter-patch distribution of the group-living, plant-inhabiting predatory mite Phytoseiulus persimilis. P. persimilis is highly specialized on herbivorous spider mite prey such as the two-spotted spider mite Tetranychus urticae, which is patchily distributed on its host plants. We conducted two experiments with (1) groups of juvenile P. persimilis under limited food on interconnected detached leaflets, and (2) groups of adult P. persimilis females under limited food on whole plants. Familiar individuals of both juvenile and adult predator groups were more exploratory and dispersed earlier from a given spider mite patch, occupied more leaves and depleted prey more quickly than individuals of unfamiliar groups. Moreover, familiar juvenile predators had higher survival chances than unfamiliar juveniles. 
Conclusions/Significance We argue that patch-exploitation and -leaving, and inter-patch dispersion were more favorably coordinated in groups of familiar than unfamiliar predators, alleviating intraspecific competition and improving prey utilization and suppression. PMID:22900062

  10. Social familiarity governs prey patch-exploitation, -leaving and inter-patch distribution of the group-living predatory mite Phytoseiulus persimilis.

    PubMed

    Zach, Gernot J; Peneder, Stefan; Strodl, Markus A; Schausberger, Peter

    2012-01-01

    In group-living animals, social interactions and their effects on other life activities such as foraging are commonly determined by discrimination among group members. Accordingly, many group-living species evolved sophisticated social recognition abilities such as the ability to recognize familiar individuals, i.e. individuals encountered before. Social familiarity may affect within-group interactions and between-group movements. In environments with patchily distributed prey, group-living predators must repeatedly decide whether to stay with the group in a given prey patch or to leave and search for new prey patches and groups. Based on the assumption that in group-living animals social familiarity allows individuals to optimize their performance in other tasks, as for example predicted by limited attention theory, we assessed the influence of social familiarity on prey patch exploitation, patch-leaving, and inter-patch distribution of the group-living, plant-inhabiting predatory mite Phytoseiulus persimilis. P. persimilis is highly specialized on herbivorous spider mite prey such as the two-spotted spider mite Tetranychus urticae, which is patchily distributed on its host plants. We conducted two experiments with (1) groups of juvenile P. persimilis under limited food on interconnected detached leaflets, and (2) groups of adult P. persimilis females under limited food on whole plants. Familiar individuals of both juvenile and adult predator groups were more exploratory and dispersed earlier from a given spider mite patch, occupied more leaves and depleted prey more quickly than individuals of unfamiliar groups. Moreover, familiar juvenile predators had higher survival chances than unfamiliar juveniles. We argue that patch-exploitation and -leaving, and inter-patch dispersion were more favorably coordinated in groups of familiar than unfamiliar predators, alleviating intraspecific competition and improving prey utilization and suppression.

  11. Subjective evaluation of two stereoscopic imaging systems exploiting visual attention to improve 3D quality of experience

    NASA Astrophysics Data System (ADS)

    Hanhart, Philippe; Ebrahimi, Touradj

    2014-03-01

    Crosstalk and vergence-accommodation rivalry negatively impact the quality of experience (QoE) provided by stereoscopic displays. However, exploiting visual attention and adapting the 3D rendering process on the fly can reduce these drawbacks. In this paper, we propose and evaluate two different approaches that exploit visual attention to improve 3D QoE on stereoscopic displays: an offline system, which uses a saliency map to predict gaze position, and an online system, which uses a remote eye tracking system to measure real time gaze positions. The gaze points were used in conjunction with the disparity map to extract the disparity of the object-of-interest. Horizontal image translation was performed to bring the fixated object on the screen plane. The user preference between standard 3D mode and the two proposed systems was evaluated through a subjective evaluation. Results show that exploiting visual attention significantly improves image quality and visual comfort, with a slight advantage for real time gaze determination. Depth quality is also improved, but the difference is not significant.
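
    The screen-plane retargeting step described above is simple to state: read the disparity d at the gaze point and translate the two views horizontally in opposite directions so that the fixated object ends up with zero disparity. A sketch with synthetic images; the sign convention (disparity d means a feature at column c in the left view sits at column c - d in the right view) is an assumption:

```python
import numpy as np

def zero_disparity_shift(left, right, d):
    """Horizontally translate a stereo pair so that a feature with
    disparity d lands on the screen plane (zero disparity).
    np.roll wraps pixels around the borders; a real renderer would
    crop or inpaint the revealed edges instead."""
    shift_left = -(d // 2)           # move the left view by half the disparity...
    shift_right = shift_left + d     # ...and the right view by the remainder
    return (np.roll(left, shift_left, axis=1),
            np.roll(right, shift_right, axis=1))

# synthetic pair: one bright 'object' pixel with disparity 6
left = np.zeros((20, 64)); left[10, 30] = 1.0
right = np.zeros((20, 64)); right[10, 24] = 1.0
left2, right2 = zero_disparity_shift(left, right, d=6)
```

    An online variant would re-read d from the disparity map at each new gaze sample and low-pass filter the shifts to avoid visible jumps as the gaze moves.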

  12. Common Readout Unit (CRU) - A new readout architecture for the ALICE experiment

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Khan, S. A.; Mukherjee, S.; Paul, R.

    2016-03-01

    The ALICE experiment at the CERN Large Hadron Collider (LHC) is presently undergoing a major upgrade in order to fully exploit the scientific potential of the upcoming high luminosity run, scheduled to start in the year 2021. The high interaction rate and the large event size will result in an experimental data flow of about 1 TB/s from the detectors, which needs to be processed before sending to the online computing system and data storage. This processing is done in a dedicated Common Readout Unit (CRU), proposed for data aggregation, trigger and timing distribution and control moderation. It acts as a common interface between sub-detector electronic systems, computing system and trigger processors. The interface links include GBT, TTC-PON and PCIe. GBT (Gigabit transceiver) is used for detector data payload transmission and a fixed-latency path for trigger distribution between CRU and detector readout electronics. TTC-PON (Timing, Trigger and Control via Passive Optical Network) is employed for time-multiplexed trigger distribution between CRU and Central Trigger Processor (CTP). PCIe (Peripheral Component Interconnect Express) is the high-speed serial computer expansion bus standard for bulk data transport between CRU boards and processors. In this article, we give an overview of the CRU architecture in ALICE, discuss the different interfaces, along with the firmware design and implementation of the CRU on the LHCb PCIe40 board.

  13. Energy manager design for microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, Ryan; Marnay, Chris

    2005-01-01

    On-site energy production, known as distributed energy resources (DER), offers consumers many benefits, such as bill savings and predictability, improved system efficiency, improved reliability, control over power quality, and in many cases, greener electricity. Additionally, DER systems can benefit electric utilities by reducing congestion on the grid, reducing the need for new generation and transmission capacity, and offering ancillary services such as voltage support and emergency demand response. Local aggregations of distributed energy resources (DER) that may include active control of on-site end-use energy devices can be called microgrids. Microgrids require control to ensure safe operation and to make dispatch decisions that achieve system objectives such as cost minimization, reliability, efficiency and emissions requirements, while abiding by system constraints and regulatory rules. This control is performed by an energy manager (EM). Preferably, an EM will achieve operation reasonably close to the attainable optimum, it will do this by means robust to deviations from expected conditions, and it will not itself incur insupportable capital or operation and maintenance costs. Also, microgrids can include supervision over end-uses, such as curtailing or rescheduling certain loads. By viewing a unified microgrid as a system of supply and demand, rather than simply a system of on-site generation devices, the benefits of integrated supply and demand control can be exploited, such as economic savings and improved system energy efficiency.

  14. Improving Search Algorithms by Using Intelligent Coordinates

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Tumer, Kagan; Bandari, Esfandiar

    2004-01-01

    We consider algorithms that maximize a global function G in a distributed manner, using a different adaptive computational agent to set each variable of the underlying space. Each agent eta is self-interested; it sets its variable to maximize its own function g (sub eta). Three factors govern such a distributed algorithm's performance, related to exploration/exploitation, game theory, and machine learning. We demonstrate how to exploit all three factors by modifying a search algorithm's exploration stage: rather than random exploration, each coordinate of the search space is now controlled by a separate machine-learning-based player engaged in a noncooperative game. Experiments demonstrate that this modification improves simulated annealing (SA) by up to an order of magnitude for bin packing and for a model of an economic process run over an underlying network. These experiments also reveal interesting small-world phenomena.

  15. Improving search algorithms by using intelligent coordinates

    NASA Astrophysics Data System (ADS)

    Wolpert, David; Tumer, Kagan; Bandari, Esfandiar

    2004-01-01

    We consider algorithms that maximize a global function G in a distributed manner, using a different adaptive computational agent to set each variable of the underlying space. Each agent η is self-interested; it sets its variable to maximize its own function gη. Three factors govern such a distributed algorithm’s performance, related to exploration/exploitation, game theory, and machine learning. We demonstrate how to exploit all three factors by modifying a search algorithm’s exploration stage: rather than random exploration, each coordinate of the search space is now controlled by a separate machine-learning-based “player” engaged in a noncooperative game. Experiments demonstrate that this modification improves simulated annealing (SA) by up to an order of magnitude for bin packing and for a model of an economic process run over an underlying network. These experiments also reveal interesting small-world phenomena.
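
The mechanism described above can be sketched in a few lines. The toy below is illustrative only, not the paper's bin-packing setup: each binary coordinate is owned by a learning "player" that biases SA's exploration toward settings with higher estimated payoff, and every player learns from the shared global reward (the simplest, team-game choice of private utility g). Objective, schedule, and parameters are all made up.

```python
import math, random

random.seed(0)

N = 12           # number of coordinates, one learning "player" per coordinate
VALUES = (0, 1)  # each coordinate is binary in this toy problem

def G(x):
    """Toy global objective: reward adjacent disagreement; maximum is N - 1."""
    return sum(x[i] != x[i + 1] for i in range(len(x) - 1))

# Each player keeps a running payoff estimate for each setting of its variable.
est = [{v: 0.0 for v in VALUES} for _ in range(N)]
cnt = [{v: 0 for v in VALUES} for _ in range(N)]

x = [random.choice(VALUES) for _ in range(N)]
best, T = G(x), 2.0
for _ in range(4000):
    # Exploration stage: instead of a uniform random move, the chosen
    # coordinate samples from a softmax over its own learned estimates.
    i = random.randrange(N)
    m = max(est[i].values())
    weights = [math.exp((est[i][v] - m) / max(T, 1e-3)) for v in VALUES]
    v = random.choices(VALUES, weights)[0]
    cand = x[:]
    cand[i] = v
    g_new, g_old = G(cand), G(x)
    # Standard simulated-annealing acceptance, maximization form.
    if g_new >= g_old or random.random() < math.exp((g_new - g_old) / T):
        x = cand
    # Player i learns from the observed global reward (team-game payoff).
    cnt[i][v] += 1
    est[i][v] += (g_new - est[i][v]) / cnt[i][v]
    best = max(best, G(x))
    T *= 0.999

print(best)  # typically reaches the optimum N - 1 = 11 on this easy landscape
```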

  16. A Cognitive Approach to e-Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Rice, Douglas M.; Eaton, Sharon L.

    2003-12-01

    Like traditional classroom instruction, distributed learning derives from passive training paradigms. Just as student-centered classroom teaching methods have been applied over several decades of classroom instruction, interactive approaches have been encouraged for distributed learning. While implementation of multimedia-based training features may appear to produce active learning, sophisticated use of multimedia features alone does not necessarily enhance learning. This paper describes the results of applying cognitive science principles to enhance learning in a student-centered, distributed learning environment, and lessons learned in developing and delivering this training. Our interactive, scenario-based approach exploits multimedia technology within a systematic, cognitive framework for learning. The basis of the application of cognitive principles is the innovative use of multimedia technology to implement interaction elements. These simple multimedia interactions, which are used to support new concepts, are later combined with other interaction elements to create more complex, integrated practical exercises. This technology-based approach may be applied in a variety of training and education contexts, but is especially well suited for training of equipment operators and maintainers. For example, it has been used in a sustainment training application for the United States Army's Combat Support System Automated Information System Interface (CAISI). The CAISI provides a wireless communications capability that allows various logistics systems to communicate across the battlefield. Based on classroom training material developed by the CAISI Project Office, the Pacific Northwest National Laboratory designed and developed an interactive, student-centered distributed-learning application for CAISI operators and maintainers. This web-based CAISI training system is also distributed on CD media for use on individual computers, and material developed for the computer-based course can be used in the classroom. In addition to its primary role in sustainment training, this distributed learning course can complement or replace portions of the classroom instruction, thus supporting a blended learning solution.

  17. Integration of robotic resources into FORCEnet

    NASA Astrophysics Data System (ADS)

    Nguyen, Chinh; Carroll, Daniel; Nguyen, Hoa

    2006-05-01

    The Networked Intelligence, Surveillance, and Reconnaissance (NISR) project integrates robotic resources into Composeable FORCEnet to control and exploit unmanned systems over extremely long distances. The foundations are built upon FORCEnet (the U.S. Navy's process to define C4ISR for net-centric operations) and the Navy Unmanned Systems Common Control Roadmap to develop technologies and standards for interoperability, data sharing, publish-and-subscribe methodology, and software reuse. The paper defines the goals and boundaries for NISR with a focus on the system architecture, including the design tradeoffs necessary for unmanned systems in a net-centric model. Special attention is given to two specific scenarios demonstrating the integration of unmanned ground and water surface vehicles into the open-architecture, web-based command-and-control information-management system of Composeable FORCEnet. Planned spiral development for NISR will improve collaborative control, expand robotic sensor capabilities, address multiple domains including underwater and aerial platforms, and extend the distributed communications infrastructure for battlespace optimization of unmanned systems in net-centric operations.

  18. Feasibility analysis of a hydrogen backup power system for Russian telecom market

    NASA Astrophysics Data System (ADS)

    Borzenko, V. I.; Dunikov, D. O.

    2017-11-01

    We performed a feasibility analysis of a 10 kW hydrogen backup power system (H2BS) consisting of a water electrolyzer, a metal hydride hydrogen storage and a fuel cell. Capital investments in H2BS are mostly determined by the costs of the PEM electrolyzer, the fuel cell and solid-state hydrogen storage materials; for single-unit or small-series manufacture the cost of the AB5-type intermetallic compound can reach 50% of the total system cost. Today the capital investments in H2BS are three times higher than in a conventional lead-acid system of the same capacity. Wide distribution of fuel cell hydrogen vehicles, development of hydrogen infrastructure, and mass production of hydrogen power systems will certainly lower capital investments in fuel cell backup power. Operational expenditures for H2BS are only 15% of the expenditures for lead-acid systems, and after 4-5 years of operation the total cost of ownership becomes lower than for batteries.
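
The break-even claim can be checked with simple arithmetic. Only the ratios come from the abstract (H2BS capex = 3x lead-acid capex, H2BS opex = 15% of lead-acid opex); the absolute cost figures below are hypothetical placeholders chosen so that the crossover lands in the quoted 4-5 year range.

```python
# Illustrative total-cost-of-ownership comparison using only the ratios
# quoted in the abstract; absolute figures are hypothetical placeholders.
CAPEX_LEAD_ACID = 10_000.0   # hypothetical capital cost, arbitrary currency
OPEX_LEAD_ACID = 5_300.0     # hypothetical annual operating cost

capex_h2 = 3.0 * CAPEX_LEAD_ACID     # "3 times higher" capital investment
opex_h2 = 0.15 * OPEX_LEAD_ACID      # "only 15%" operational expenditure

def tco(capex, opex, years):
    """Total cost of ownership after a number of years of operation."""
    return capex + opex * years

# Break-even: 3C + 0.15*O*t = C + O*t  =>  t = 2C / (0.85*O)
breakeven = 2 * CAPEX_LEAD_ACID / (0.85 * OPEX_LEAD_ACID)
print(round(breakeven, 1))           # ~4.4 years with these placeholders

for years in (3, 5):
    h2_cheaper = tco(capex_h2, opex_h2, years) < tco(CAPEX_LEAD_ACID, OPEX_LEAD_ACID, years)
    print(years, h2_cheaper)         # False before break-even, True after
```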

  19. Applications of computer algebra to distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Storch, Joel A.

    1993-01-01

    In the analysis of vibrations of continuous elastic systems, one often encounters complicated transcendental equations with roots directly related to the system's natural frequencies. Typically, these equations contain system parameters whose values must be specified before a numerical solution can be obtained. The present paper presents a method whereby the fundamental frequency can be obtained in analytical form to any desired degree of accuracy. The method is based upon truncation of rapidly converging series involving inverse powers of the system natural frequencies. A straightforward method for developing these series and summing them in closed form is presented. It is demonstrated how computer algebra can be exploited to perform the intricate analytical procedures which would otherwise render the technique difficult to apply in practice. We illustrate the method by developing two analytical approximations to the fundamental frequency of a vibrating cantilever carrying a rigid tip body. The results are compared to the numerical solution of the exact (transcendental) frequency equation over a range of system parameters.
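
The kind of transcendental frequency equation referred to can be made concrete with the classic uniform cantilever without a tip body (the tip-body case of the paper adds parameter-dependent terms): its characteristic equation 1 + cos(b)cosh(b) = 0 must normally be solved numerically, which is exactly what the series approximations avoid. A minimal root-finding sketch:

```python
import math

def f(b):
    # Characteristic equation of a uniform cantilever beam; its smallest
    # positive root b1 gives the fundamental frequency via
    # w1 = (b1 / L)**2 * sqrt(E*I / (rho*A)).
    return 1.0 + math.cos(b) * math.cosh(b)

def bisect(lo, hi, tol=1e-10):
    """Plain bisection; assumes f changes sign on [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

b1 = bisect(1.0, 3.0)
print(round(b1, 4))  # ~1.8751, the well-known first cantilever eigenvalue
```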

  20. The biogeodynamics of microbial landscapes

    NASA Astrophysics Data System (ADS)

    Battin, T. J.; Hödl, I.; Bertuzzo, E.; Mari, L.; Suweis, S. S.; Rinaldo, A.

    2011-12-01

    Spatial configuration is fundamental in defining the structural and functional properties of biological systems. Biofilms, surface-attached and matrix-enclosed microorganisms, are a striking example of spatial organisation. Coupled biotic and abiotic processes shape the spatial organisation, across scales, of the landscapes formed by these benthic biofilms in streams and rivers. Experimenting with such biofilms in streams, we found that, depending on the streambed topography and the related hydrodynamic microenvironment, biofilm landscapes form increasingly diverging spatial patterns as they grow. Strikingly, however, cluster size distributions tend to converge even in contrasting hydrodynamic microenvironments. To reproduce the observed cluster size distributions we used a continuous, size-structured population model. The model accounts for the formation, growth, erosion and merging of biofilm clusters. Our results suggest not only that hydrodynamic forcing induces the diverging patterning of the microbial landscape, but also that microorganisms have developed strategies to exploit spatial resources equally, independently of the physical structure of the microenvironment in which they live.

  1. Characterizations of double pulsing in neutron multiplicity and coincidence counting systems

    DOE PAGES

    Koehler, Katrina E.; Henzl, Vladimir; Croft, Stephen; ...

    2016-06-29

    Passive neutron coincidence/multiplicity counters are subject to non-ideal behavior, such as double pulsing and dead time. It has been shown in the past that double pulsing exhibits a distinct signature in a Rossi-alpha distribution, which is not readily noticed using traditional Multiplicity Shift Register analysis. However, it has been assumed that the use of a pre-delay in shift register analysis removes any effects of double pulsing. Here, we use high-fidelity simulations accompanied by experimental measurements to study the effects of double pulsing on multiplicity rates. By exploiting the information from the double-pulsing signature peak observable in the Rossi-alpha distribution, the double-pulsing fraction can be determined. Algebraic correction factors for the multiplicity rates in terms of the double-pulsing fraction have been developed. We also discuss the role of these corrections across a range of scenarios.
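
The Rossi-alpha signature is easy to visualize numerically. The sketch below is a toy, not the paper's high-fidelity simulation: a Poisson pulse train in which each pulse retriggers with some probability after a fixed short delay. All rates, fractions, and delays are hypothetical; the point is that the double-pulse pairs pile up in one bin of the time-difference histogram, well above the flat accidental background.

```python
import random

random.seed(1)

RATE = 1e4       # true count rate, counts/s (hypothetical)
T_MEAS = 1.0     # measurement time, s
P_DP = 0.05      # double-pulsing fraction (hypothetical)
DP_DELAY = 1.5e-7  # retrigger delay, 150 ns (hypothetical)

# Simulate a Poisson pulse train and inject double pulses.
t, pulses = 0.0, []
while t < T_MEAS:
    t += random.expovariate(RATE)
    pulses.append(t)
    if random.random() < P_DP:
        pulses.append(t + DP_DELAY)
pulses.sort()

# Rossi-alpha distribution: histogram of forward time differences from each
# pulse to every successor inside a short window.
WINDOW, NBINS = 2e-6, 20
binw = WINDOW / NBINS
hist = [0] * NBINS
for i, t0 in enumerate(pulses):
    for t1 in pulses[i + 1:]:
        dt = t1 - t0
        if dt >= WINDOW:
            break
        hist[int(dt / binw)] += 1

# The bin containing DP_DELAY stands out above the accidental background.
dp_bin = int(DP_DELAY / binw)
background = max(h for j, h in enumerate(hist) if j != dp_bin)
print(dp_bin, hist[dp_bin], background)
```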

  2. Mapping the distribution of materials in hyperspectral data using the USGS Material Identification and Characterization Algorithm (MICA)

    USGS Publications Warehouse

    Kokaly, R.F.; King, T.V.V.; Hoefen, T.M.

    2011-01-01

    Identifying materials by measuring and analyzing their reflectance spectra has been an important method in analytical chemistry for decades. Airborne and space-based imaging spectrometers allow scientists to detect materials and map their distributions across the landscape. With new satellite-borne hyperspectral sensors planned for the future, for example, HYSPIRI (HYPerspectral InfraRed Imager), robust methods are needed to fully exploit the information content of hyperspectral remote sensing data. A method of identifying and mapping materials using spectral-feature-based analysis of reflectance data in an expert-system framework called MICA (Material Identification and Characterization Algorithm) is described in this paper. The core concepts and calculations of MICA are presented. A MICA command file has been developed and applied to map minerals in the full-country coverage of the 2007 Afghanistan HyMap hyperspectral data. © 2011 IEEE.
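
The spirit of spectral-feature-based matching can be sketched simply (this is an illustration, not USGS MICA code): remove a straight-line continuum across a diagnostic absorption feature, then score library spectra by how well their continuum-removed feature shapes match the observation. All spectra and names below are synthetic.

```python
WL = [2.10, 2.15, 2.20, 2.25, 2.30]  # channel wavelengths, microns (toy grid)

def continuum_removed(refl):
    """Divide out the straight line joining the feature's shoulder channels."""
    n = len(refl)
    cont = [refl[0] + (refl[-1] - refl[0]) * i / (n - 1) for i in range(n)]
    return [r / c for r, c in zip(refl, cont)]

def fit(obs, lib):
    """Negative sum of squared differences of the continuum-removed shapes."""
    a, b = continuum_removed(obs), continuum_removed(lib)
    return -sum((x - y) ** 2 for x, y in zip(a, b))

observed = [0.80, 0.72, 0.55, 0.70, 0.78]  # synthetic deep 2.2-um feature

library = {  # synthetic reference spectra (names illustrative only)
    "kaolinite-like": [0.60, 0.54, 0.41, 0.52, 0.59],  # similar feature shape
    "flat-spectrum":  [0.50, 0.50, 0.50, 0.50, 0.50],  # no absorption feature
}

best = max(library, key=lambda name: fit(observed, library[name]))
print(best)  # the entry whose continuum-removed feature matches the observation
```

Note that continuum removal makes the match depend on feature shape rather than overall brightness: the "kaolinite-like" spectrum is darker than the observation yet still scores best.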

  3. Generation of vortex array laser beams with Dove prism embedded unbalanced Mach-Zehnder interferometer

    NASA Astrophysics Data System (ADS)

    Chu, Shu-Chun

    2009-02-01

    This paper introduces a scheme for generating vortex laser beams from a solid-state laser with off-axis laser-diode pumping. The proposed system consists of a Dove prism embedded in an unbalanced Mach-Zehnder interferometer configuration. This configuration allows controlled construction of p × p vortex array beams from Ince-Gaussian modes, IGe_{p,p} modes. An incident IGe_{p,p} laser beam of any order p can easily be generated from an end-pumped solid-state laser with an off-axis pumping mechanism. This study simulates this type of vortex array laser beam generation and discusses beam propagation effects. The formation of ordered transverse emission patterns has applications in a variety of areas, such as optical data storage, distribution, and processing that exploit the robustness of soliton and vortex fields, and optical manipulation of small particles and atoms in the featured intensity distribution.

  4. Power management and distribution system for a More-Electric Aircraft (MADMEL) -- Program status

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maldonado, M.A.; Shah, N.M.; Cleek, K.J.

    1995-12-31

    A number of technology breakthroughs in recent years have rekindled the concept of a more-electric aircraft. High-power solid-state switching devices, electrohydrostatic actuators (EHAs), electromechanical actuators (EMAs), and high-power generators are just a few examples of component developments that have made dramatic improvements in properties such as weight, size, power, and cost. However, these components cannot be applied piecemeal. A complete, and somewhat revolutionary, system design approach is needed to exploit the benefits that a more-electric aircraft can provide. A five-phase Power Management and Distribution System for a More-Electric Aircraft (MADMEL) program was awarded by the Air Force to the Northrop/Grumman Military Aircraft Division team in September 1991. The objective of the program is to design, develop, and demonstrate an advanced electrical power generation and distribution system for a more-electric aircraft (MEA). The MEA emphasizes the use of electrical power in place of hydraulic, pneumatic, and mechanical power to optimize the performance and life-cycle cost of the aircraft. This paper presents an overview of the MADMEL program and a top-level summary of the program results and of the development and testing of major components to date. In the Phase 1 and Phase 2 studies, the electrical load requirements were established and the electrical power system architecture was defined for both near-term (NT, year 1996) and far-term (FT, year 2003) MEA applications. The detailed design and specification for the electrical power system (EPS), its interface with the Vehicle Management System, and the test set-up were developed under the recently completed Phase 3. Subsystem-level hardware fabrication and testing will be performed under the ongoing Phase 4 activities. Overall system-level integration and testing will be performed in Phase 5.

  5. Microgrid to enable optimal distributed energy retail and end-user demand response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Ming; Feng, Wei; Marnay, Chris

    In the face of unprecedented challenges in environmental sustainability and grid resilience, there is an increasingly held consensus regarding the adoption of distributed and renewable energy resources such as microgrids (MGs), and the utilization of flexible electric loads by demand response (DR) to potentially drive a necessary paradigm shift in energy production and consumption patterns. However, the potential value of distributed generation and demand flexibility has not yet been fully realized in the operation of MGs. This study investigates the pricing and operation strategy with DR for a MG retailer in an integrated energy system (IES). Based on co-optimizing retail rates and MG dispatch, formulated as a mixed integer quadratic programming (MIQP) problem, our model devises a dynamic pricing scheme that reflects the cost of generation and promotes DR, in tandem with an optimal dispatch plan that exploits spark spread and facilitates the integration of renewables, resulting in improved retailer profits and system stability. Main issues like integrated energy coupling and customer bill reduction are addressed during pricing to ensure rate competitiveness and customer protection. By evaluating on real datasets, the system is demonstrated to optimally coordinate storage, renewables, and combined heat and power (CHP), reduce carbon dioxide emissions while maintaining profits, and effectively alleviate the PV curtailment problem. Finally, the model can be used by retailers and MG operators to optimize their operations, as well as by regulators to design new utility rates in support of the ongoing transformation of energy systems.

  6. Microgrid to enable optimal distributed energy retail and end-user demand response

    DOE PAGES

    Jin, Ming; Feng, Wei; Marnay, Chris; ...

    2018-06-07

    In the face of unprecedented challenges in environmental sustainability and grid resilience, there is an increasingly held consensus regarding the adoption of distributed and renewable energy resources such as microgrids (MGs), and the utilization of flexible electric loads by demand response (DR) to potentially drive a necessary paradigm shift in energy production and consumption patterns. However, the potential value of distributed generation and demand flexibility has not yet been fully realized in the operation of MGs. This study investigates the pricing and operation strategy with DR for a MG retailer in an integrated energy system (IES). Based on co-optimizing retail rates and MG dispatch, formulated as a mixed integer quadratic programming (MIQP) problem, our model devises a dynamic pricing scheme that reflects the cost of generation and promotes DR, in tandem with an optimal dispatch plan that exploits spark spread and facilitates the integration of renewables, resulting in improved retailer profits and system stability. Main issues like integrated energy coupling and customer bill reduction are addressed during pricing to ensure rate competitiveness and customer protection. By evaluating on real datasets, the system is demonstrated to optimally coordinate storage, renewables, and combined heat and power (CHP), reduce carbon dioxide emissions while maintaining profits, and effectively alleviate the PV curtailment problem. Finally, the model can be used by retailers and MG operators to optimize their operations, as well as by regulators to design new utility rates in support of the ongoing transformation of energy systems.
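
The dispatch half of the co-optimization can be illustrated with a deliberately tiny toy (the paper solves a full MIQP; this is only a merit-order greedy with made-up resources and loads). It shows the spark-spread intuition in miniature: the CHP runs whenever its marginal cost is below the grid price, and renewables are taken first.

```python
# Hypothetical resources: (name, capacity in kW, marginal cost in $/kWh).
resources = [
    ("pv",      40.0, 0.00),   # zero marginal cost, so dispatched first
    ("chp",     60.0, 0.07),   # runs when cheaper than grid (spark spread)
    ("grid",  1000.0, 0.12),   # effectively unlimited, most expensive
]

def dispatch(load_kw):
    """Serve the load from the cheapest resource first; return kW per resource."""
    out, remaining = {}, load_kw
    for name, cap, _cost in sorted(resources, key=lambda r: r[2]):
        take = min(cap, remaining)
        out[name] = take
        remaining -= take
    return out

hourly_load = [30, 80, 120]    # kW, hypothetical demand profile
for load in hourly_load:
    plan = dispatch(load)
    cost = sum(plan[name] * c for name, _cap, c in resources)
    print(load, plan, round(cost, 2))
```

A real MG dispatch would add storage dynamics, minimum up/down times, and the retail-rate variables, which is what pushes the problem into MIQP territory.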

  7. Design considerations for parallel graphics libraries

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1994-01-01

    Applications which run on parallel supercomputers are often characterized by massive datasets. Converting these vast collections of numbers to visual form has proven to be a powerful aid to comprehension. For a variety of reasons, it may be desirable to provide this visual feedback at runtime. One way to accomplish this is to exploit the available parallelism to perform graphics operations in place. In order to do this, we need appropriate parallel rendering algorithms and library interfaces. This paper provides a tutorial introduction to some of the issues which arise in designing parallel graphics libraries and their underlying rendering algorithms. The focus is on polygon rendering for distributed memory message-passing systems. We illustrate our discussion with examples from PGL, a parallel graphics library which has been developed on the Intel family of parallel systems.
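
One core idea in distributed-memory polygon rendering is sort-last depth compositing; the sketch below is an illustration of that general technique, not PGL's actual interface. Each node rasterizes its own polygons into a local color+depth buffer, and because the per-pixel "keep the nearer fragment" merge is associative, the buffers can be combined pairwise in a reduction tree across the message-passing nodes.

```python
from functools import reduce

INF = float("inf")  # background depth

def composite(a, b):
    """Merge two (color, depth) framebuffers, keeping the nearer fragment."""
    return [(ca, za) if za <= zb else (cb, zb)
            for (ca, za), (cb, zb) in zip(a, b)]

# Three nodes, each with a 4-pixel framebuffer of (color, depth) pairs;
# None/INF marks an empty (background) pixel.
node0 = [("red", 2.0), (None, INF), ("red", 5.0), (None, INF)]
node1 = [(None, INF), ("blue", 1.0), ("blue", 3.0), (None, INF)]
node2 = [("green", 4.0), (None, INF), (None, INF), (None, INF)]

# In a real system this reduction runs over the network; here it is local.
image = reduce(composite, [node0, node1, node2])
print([color for color, _depth in image])  # ['red', 'blue', 'blue', None]
```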

  8. Full-field implementation of a perfect eavesdropper on a quantum cryptography system.

    PubMed

    Gerhardt, Ilja; Liu, Qin; Lamas-Linares, Antía; Skaar, Johannes; Kurtsiefer, Christian; Makarov, Vadim

    2011-06-14

    Quantum key distribution (QKD) allows two remote parties to grow a shared secret key. Its security is founded on the principles of quantum mechanics, but in reality it significantly relies on the physical implementation. Technological imperfections of QKD systems have been previously explored, but no attack on an established QKD connection has been realized so far. Here we show the first full-field implementation of a complete attack on a running QKD connection. An installed eavesdropper obtains the entire 'secret' key, while none of the parameters monitored by the legitimate parties indicate a security breach. This confirms that non-idealities in physical implementations of QKD can be fully exploitable in practice, and must be given increased scrutiny if quantum cryptography is to become highly secure.

  9. A Scalable Data Access Layer to Manage Structured Heterogeneous Biomedical Data.

    PubMed

    Delussu, Giovanni; Lianas, Luca; Frexia, Francesca; Zanetti, Gianluigi

    2016-01-01

    This work presents a scalable data access layer, called PyEHR, designed to support the implementation of data management systems for secondary use of structured heterogeneous biomedical and clinical data. PyEHR adopts the openEHR's formalisms to guarantee the decoupling of data descriptions from implementation details and exploits structure indexing to accelerate searches. Data persistence is guaranteed by a driver layer with a common driver interface. Interfaces for two NoSQL Database Management Systems are already implemented: MongoDB and Elasticsearch. We evaluated the scalability of PyEHR experimentally through two types of tests, called "Constant Load" and "Constant Number of Records", with queries of increasing complexity on synthetic datasets of ten million records each, containing very complex openEHR archetype structures, distributed on up to ten computing nodes.
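
The "driver layer with a common driver interface" pattern can be sketched as follows (class and method names here are illustrative, not PyEHR's actual API): backends implement one abstract interface, so the data-access layer never depends on a specific NoSQL engine.

```python
from abc import ABC, abstractmethod

class RecordDriver(ABC):
    """Common driver interface; each storage backend implements it."""
    @abstractmethod
    def save(self, record_id, document): ...
    @abstractmethod
    def load(self, record_id): ...

class InMemoryDriver(RecordDriver):
    """Dict-backed stand-in for a MongoDB or Elasticsearch driver."""
    def __init__(self):
        self._store = {}
    def save(self, record_id, document):
        self._store[record_id] = document
    def load(self, record_id):
        return self._store.get(record_id)

def archive(driver: RecordDriver, record_id, document):
    # Application code sees only the common interface, so swapping the
    # backend requires no changes above the driver layer.
    driver.save(record_id, document)
    return driver.load(record_id)

drv = InMemoryDriver()
print(archive(drv, "ehr-1", {"archetype": "openEHR-EHR-OBSERVATION", "value": 42}))
```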

  10. How to end the nuclear nightmare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speiser, S.M.

    1984-01-01

    An international lawyer examines the underlying causes of and ways to remove the Soviet-US enmity. He traces the history of distrust between the superpowers to differences in ideology and political systems, expansionism based on nationalism, and the human drive for power, and concludes that these tensions compel us to be enemies. Ideology is the only difference between the superpowers that can be removed. Capitalism could be made compatible with communism by distributing corporate stock to everyone. Eliminating poverty and exploitation would remove the ideological barrier to world peace and nuclear disarmament, making it worthwhile for the Soviet self-interest to cooperate with the US. The capitalistic system and Western freedoms would be preserved, and religious leaders could solve the dilemma of liberation theology. 61 references, 3 figures.

  11. Harmonic Fourier beads method for studying rare events on rugged energy surfaces.

    PubMed

    Khavrutskii, Ilja V; Arora, Karunesh; Brooks, Charles L

    2006-11-07

    We present a robust, distributable method for computing minimum free energy paths of large molecular systems with rugged energy landscapes. The method, which we call harmonic Fourier beads (HFB), exploits the Fourier representation of a path in an appropriate coordinate space and proceeds iteratively by evolving a discrete set of harmonically restrained path points (beads) to generate positions for the next path. The HFB method does not require explicit knowledge of the free energy to locate the path. To compute the free energy profile along the final path we employ an umbrella sampling method in two generalized dimensions. The proposed HFB method is anticipated to aid the study of rare events in biomolecular systems. Its utility is demonstrated with an application to conformational isomerization of the alanine dipeptide in the gas phase.
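
The path parameterization underlying the method can be sketched in toy 2-D coordinates (the real method iteratively evolves the coefficients from restrained sampling of a molecular system; here the coefficients are simply fixed, hypothetical values): a path between fixed endpoints A and B is the straight line plus a sine series, and the "beads" are the path evaluated at discrete values of a progress variable t in [0, 1].

```python
import math

A, B = (0.0, 0.0), (1.0, 0.0)        # fixed endpoints of the path
# One (x, y) sine coefficient pair per harmonic; values are hypothetical.
coeffs = [(0.0, 0.3), (0.0, -0.1)]

def path(t):
    """Fourier representation: straight line A->B plus a sine series.

    The sine basis vanishes at t = 0 and t = 1, so the endpoints stay fixed
    no matter how the coefficients evolve.
    """
    point = []
    for d in range(2):  # loop over the two coordinates
        x = A[d] + (B[d] - A[d]) * t
        for k, c in enumerate(coeffs, start=1):
            x += c[d] * math.sin(k * math.pi * t)
        point.append(x)
    return tuple(point)

M = 5  # number of beads
beads = [path(i / (M - 1)) for i in range(M)]
for b in beads:
    print(tuple(round(v, 3) for v in b))
```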

  12. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the run-time system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.
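
The data-driven computation model can be illustrated with a minimal dataflow executor (an illustration of the general model only, not the MENTAT C++ runtime): an actor fires as soon as all of its input tokens have arrived, which is what lets independent actors run in parallel without explicit synchronization.

```python
class Actor:
    """Fires its function once all input tokens have arrived."""
    def __init__(self, name, fn, arity):
        self.name, self.fn, self.arity = name, fn, arity
        self.inbox, self.subscribers = [], []

    def send(self, value, queue):
        self.inbox.append(value)
        if len(self.inbox) == self.arity:   # all tokens present: fire
            queue.append((self, self.fn(*self.inbox)))

def run(queue):
    """Drain the ready queue, forwarding each result to downstream actors."""
    results = {}
    while queue:
        actor, value = queue.pop(0)
        results[actor.name] = value
        for sub in actor.subscribers:
            sub.send(value, queue)
    return results

# (a + b) * c expressed as a tiny dataflow graph.
add = Actor("add", lambda a, b: a + b, 2)
mul = Actor("mul", lambda a, b: a * b, 2)
add.subscribers.append(mul)

q = []
mul.inbox.append(10)   # constant operand c = 10
add.send(3, q)         # token a = 3 arrives
add.send(4, q)         # token b = 4 arrives -> "add" fires, feeding "mul"
results = run(q)
print(results["mul"])  # (3 + 4) * 10 = 70
```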

  13. Parallel and fault-tolerant algorithms for hypercube multiprocessors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aykanat, C.

    1988-01-01

    Several techniques for increasing the performance of parallel algorithms on distributed-memory message-passing multiprocessor systems are investigated. These techniques are effectively implemented for the parallelization of the Scaled Conjugate Gradient (SCG) algorithm on a hypercube-connected message-passing multiprocessor. Significant performance improvement is achieved by using these techniques. The SCG algorithm is used for the solution phase of an FE modeling system. Almost linear speed-up is achieved, and the hypercube topology is shown to be scalable for this class of FE problems. The SCG algorithm is also shown to be suitable for vectorization, and near-supercomputer performance is achieved on a vector hypercube multiprocessor by exploiting both parallelization and vectorization. Fault-tolerance issues for the parallel SCG algorithm and for the hypercube topology are also addressed.
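
For orientation, here is a plain (unscaled, serial) conjugate-gradient solver for an SPD system Ax = b. In the parallel SCG of the thesis, the matrix-vector product and the dot products are the kernels distributed across the hypercube nodes; the iteration structure itself is unchanged.

```python
def dot(u, v):
    # In the parallel version, each node sums its local slice and the
    # partial sums are combined across the hypercube.
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    # Likewise distributed row-wise in the parallel version.
    return [dot(row, x) for row in A]

def cg(A, b, tol=1e-10, max_iter=100):
    """Conjugate gradient for symmetric positive-definite A."""
    x = [0.0] * len(b)
    r = b[:]            # residual b - A*x with x = 0
    p = r[:]            # initial search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]     # classic small SPD test system
b = [1.0, 2.0]
x = cg(A, b)
print([round(v, 6) for v in x])  # [0.090909, 0.636364], i.e. (1/11, 7/11)
```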

  14. The NERC Data Assimilation Research Centre and Envisat

    NASA Astrophysics Data System (ADS)

    LAHOZ, W. A.

    2001-12-01

    The NERC Data Assimilation Research Centre (DARC), a Centre of Excellence in Earth Observation, has been recently set up in the UK. DARC is a distributed centre, with participation from the universities of Reading, Oxford, Cambridge and Edinburgh, and the Rutherford Appleton Laboratory. It has strong links with the UK Met Office, and with European data assimilation groups. One of the remits of DARC is the exploitation of research satellite data (e.g. from ESA's Envisat, due to be launched in November 2001). This presentation will describe the participation of DARC in the Envisat programme. This participation involves: (1) the calibration/validation of Envisat data using an NWP assimilation system, and (2) the production of 4-d quality-controlled datasets of temperature, ozone and water vapour from Envisat using an NWP assimilation system.

  15. Characterization of attacks on public telephone networks

    NASA Astrophysics Data System (ADS)

    Lorenz, Gary V.; Manes, Gavin W.; Hale, John C.; Marks, Donald; Davis, Kenneth; Shenoi, Sujeet

    2001-02-01

    The U.S. Public Telephone Network (PTN) is a massively connected distributed information system, much like the Internet. PTN signaling, transmission and operations functions must be protected from physical and cyber attacks to ensure the reliable delivery of telecommunications services. The increasing convergence of PTNs with wireless communications systems, computer networks and the Internet itself poses serious threats to our nation's telecommunications infrastructure. Legacy technologies and advanced services harbor well-known and as-yet-undiscovered vulnerabilities that render them susceptible to cyber attacks. This paper presents a taxonomy of cyber attacks on PTNs in converged environments that synthesizes exploits in the computer and communications network domains. The taxonomy provides an opportunity for the systematic exploration of mitigative and preventive strategies, as well as for the identification and classification of emerging threats.

  16. Heralded amplification of path entangled quantum states

    NASA Astrophysics Data System (ADS)

    Monteiro, F.; Verbanis, E.; Caprara Vivoli, V.; Martin, A.; Gisin, N.; Zbinden, H.; Thew, R. T.

    2017-06-01

    Device-independent quantum key distribution (DI-QKD) represents one of the most fascinating challenges in quantum communication, exploiting concepts of fundamental physics, namely Bell tests of nonlocality, to ensure the security of a communication link. This requires the loophole-free violation of a Bell inequality, which is intrinsically difficult due to losses in fibre optic transmission channels. Heralded photon amplification (HPA) is a teleportation-based protocol that has been proposed as a means to overcome transmission loss for DI-QKD. Here we demonstrate HPA for path entangled states and characterise the entanglement before and after loss by exploiting a recently developed displacement-based detection scheme. We demonstrate that by exploiting HPA we are able to reliably maintain high fidelity entangled states over loss-equivalent distances of more than 50 km.

  17. Neutron tomography of axially symmetric objects using 14 MeV neutrons from a portable neutron generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, P., E-mail: peter.andersson@physics.uu.se; Andersson-Sunden, E.; Sjöstrand, H.

    2014-08-01

    In nuclear boiling water reactor cores, the distribution of water and steam (void) is essential for both safety and efficiency reasons. In order to enhance predictive capabilities, void distribution assessment is performed in two-phase test-loops under reactor-relevant conditions. This article proposes the novel technique of fast-neutron tomography using a portable deuterium-tritium neutron generator to determine the time-averaged void distribution in these loops. Fast neutrons have the advantage of high transmission through the metallic structures and pipes typically concealing a thermal-hydraulic test loop, while still being fairly sensitive to the water/void content. However, commercially available fast-neutron generators have the disadvantage of a relatively low yield, and fast-neutron detection also suffers from relatively low detection efficiency. Fortunately, some loops are axially symmetric, a property which can be exploited to reduce the amount of data needed for tomographic measurement, thus limiting the interrogation time needed. In this article, three axially symmetric test objects depicting a thermal-hydraulic test loop have been examined: steel pipes with outer diameter 24 mm, thickness 1.5 mm, and with three different distributions of the plastic material POM inside the pipes. Data recorded with the FANTOM fast-neutron tomography instrument have been used to perform tomographic reconstructions to assess their radial material distribution. Here, a dedicated tomographic algorithm that exploits the symmetry of these objects has been applied, which is described in the paper. Results are presented as 20-rixel (radial pixel) reconstructions of the interior constitution, and 2D visualization of the pipe interior is demonstrated. The local POM attenuation coefficients in the rixels were measured with errors (RMS) of 0.025, 0.020, and 0.022 cm⁻¹, solid POM attenuation coefficient. The accuracy and precision are high enough to provide a useful indication of the flow mode, and a visualization of the radial material distribution can be obtained. A benefit of this system is its potential to be mounted at any axial height of a two-phase test section without requirements for pre-fabricated entrances or windows. This could mean a significant increase in flexibility of the void distribution assessment capability at many existing two-phase test loops.
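
How axial symmetry reduces the tomographic problem can be sketched with the classic "onion-peeling" scheme (an illustration of the general idea, not the paper's dedicated algorithm): for parallel chords through concentric annular rixels, the path-length matrix is triangular, so the radial attenuation coefficients follow by back-substitution from the measured line integrals. Geometry and attenuation values below are made up.

```python
import math

R = [1.0, 2.0, 3.0]            # outer radii of 3 concentric rixels (cm)
mu_true = [0.10, 0.05, 0.02]   # attenuation per rixel (1/cm), innermost first

def half_chord(r, y):
    """Half-length of the chord at impact parameter y inside radius r."""
    return math.sqrt(r * r - y * y) if r > y else 0.0

def chord_lengths(y):
    """Path length of the chord at impact parameter y through each rixel."""
    lengths, inner = [], 0.0
    for rj in R:
        lengths.append(2.0 * (half_chord(rj, y) - half_chord(inner, y)))
        inner = rj
    return lengths

# One chord per rixel, aimed at the middle of each annulus; the "measured"
# line integrals are synthesized from the true attenuation (noise-free).
offsets = [0.5, 1.5, 2.5]
meas = [sum(l * m for l, m in zip(chord_lengths(y), mu_true)) for y in offsets]

# Peel from the outside in: the outermost chord crosses only the outer rixel,
# so each step has exactly one new unknown (a triangular system).
mu = [0.0] * len(R)
for i in reversed(range(len(offsets))):
    L = chord_lengths(offsets[i])
    known = sum(L[j] * mu[j] for j in range(i + 1, len(R)))
    mu[i] = (meas[i] - known) / L[i]

print([round(m, 6) for m in mu])   # recovers mu_true in this noise-free toy
```

With real, noisy transmission data the peeling amplifies errors toward the axis, which is one reason a dedicated reconstruction algorithm is worthwhile.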

  18. Neutron tomography of axially symmetric objects using 14 MeV neutrons from a portable neutron generator.

    PubMed

    Andersson, P; Andersson-Sunden, E; Sjöstrand, H; Jacobsson-Svärd, S

    2014-08-01

    In nuclear boiling water reactor cores, the distribution of water and steam (void) is essential for both safety and efficiency reasons. In order to enhance predictive capabilities, void distribution assessment is performed in two-phase test loops under reactor-relevant conditions. This article proposes the novel technique of fast-neutron tomography using a portable deuterium-tritium neutron generator to determine the time-averaged void distribution in these loops. Fast neutrons have the advantage of high transmission through the metallic structures and pipes typically concealing a thermal-hydraulic test loop, while still being fairly sensitive to the water/void content. However, commercially available fast-neutron generators have the disadvantage of a relatively low yield, and fast-neutron detection also suffers from relatively low detection efficiency. Fortunately, some loops are axially symmetric, a property which can be exploited to reduce the amount of data needed for tomographic measurement, thus limiting the interrogation time needed. In this article, three axially symmetric test objects depicting a thermal-hydraulic test loop have been examined: steel pipes with outer diameter 24 mm, thickness 1.5 mm, and with three different distributions of the plastic material POM inside the pipes. Data recorded with the FANTOM fast-neutron tomography instrument have been used to perform tomographic reconstructions to assess their radial material distribution. Here, a dedicated tomographic algorithm that exploits the symmetry of these objects has been applied, which is described in the paper. Results are demonstrated in 20-rixel (radial pixel) reconstructions of the interior constitution, and a 2D visualization of the pipe interior is demonstrated. The local POM attenuation coefficients in the rixels were measured with errors (RMS) of 0.025, 0.020, and 0.022 cm(-1), to be compared with the attenuation coefficient of solid POM. The accuracy and precision are high enough to provide a useful indication of the flow mode, and a visualization of the radial material distribution can be obtained. A benefit of this system is its potential to be mounted at any axial height of a two-phase test section without requirements for pre-fabricated entrances or windows. This could mean a significant increase in the flexibility of void distribution assessment capability at many existing two-phase test loops.
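    The symmetry-exploiting reconstruction described above has a simple computable core: for an axially symmetric object, the attenuation along each parallel chord is a weighted sum of per-shell ("rixel") attenuation coefficients, with weights given by geometric chord lengths through each annulus, so the radial profile follows from a small linear solve. The sketch below illustrates that generic idea; the function name and phantom values are ours, not the paper's FANTOM algorithm.

```python
import numpy as np

def onion_peel(radii, offsets, log_atten):
    """Recover radial attenuation coefficients mu_j (one per 'rixel',
    i.e. annular shell) from parallel-beam transmission data through
    an axially symmetric object.

    radii     : outer radii of the n shells, increasing (cm)
    offsets   : chord offsets from the symmetry axis (cm)
    log_atten : -ln(I/I0) measured for each chord
    """
    r = np.concatenate(([0.0], np.asarray(radii, dtype=float)))
    # Path-length matrix: length of chord i inside shell j
    L = np.zeros((len(offsets), len(radii)))
    for i, y in enumerate(offsets):
        outer = 2.0 * np.sqrt(np.maximum(r[1:] ** 2 - y ** 2, 0.0))
        inner = 2.0 * np.sqrt(np.maximum(r[:-1] ** 2 - y ** 2, 0.0))
        L[i] = outer - inner
    # Least-squares solve; with one chord per shell the system is
    # triangular and can be peeled from the outermost shell inwards
    mu, *_ = np.linalg.lstsq(L, np.asarray(log_atten, dtype=float),
                             rcond=None)
    return mu
```

With noise-free synthetic data the shell coefficients are recovered exactly; with real counting statistics the least-squares residuals give the RMS-type errors quoted in the abstract.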

  19. On the viability of exploiting L-shell fluorescence for X-ray polarimetry

    NASA Technical Reports Server (NTRS)

    Weisskopf, M. C.; Sutherland, P. G.; Elsner, R. F.; Ramsey, B. D.

    1985-01-01

    It has been suggested that one may build an X-ray polarimeter by exploiting the polarization dependence of the angular distribution of L-shell fluorescence photons. In this paper the sensitivity of this approach to polarimetry is examined theoretically. The calculations are applied to several detection schemes using imaging proportional counters that would have direct application in X-ray astronomy. It is found, however, that the sensitivity of this method for measuring X-ray polarization is too low to be of use for other than laboratory applications.

  20. Light-controlled intracellular transport in Caenorhabditis elegans.

    PubMed

    Harterink, Martin; van Bergeijk, Petra; Allier, Calixte; de Haan, Bart; van den Heuvel, Sander; Hoogenraad, Casper C; Kapitein, Lukas C

    2016-02-22

    To establish and maintain their complex morphology and function, neurons and other polarized cells exploit cytoskeletal motor proteins to distribute cargoes to specific compartments. Recent studies in cultured cells have used inducible motor protein recruitment to explore how different motors contribute to polarized transport and to control the subcellular positioning of organelles. Such approaches also seem promising avenues for studying motor activity and organelle positioning within more complex cellular assemblies, but their applicability to multicellular in vivo systems has so far remained unexplored. Here, we report the development of an optogenetic organelle transport strategy in the in vivo model system Caenorhabditis elegans. We demonstrate that movement and pausing of various organelles can be achieved by recruiting the proper cytoskeletal motor protein with light. In neurons, we find that kinesin and dynein exclusively target the axon and dendrite, respectively, revealing the basic principles for polarized transport. In vivo control of motor attachment and organelle distributions will be widely useful in exploring the mechanisms that govern the dynamic morphogenesis of cells and tissues, within the context of a developing animal. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. How ecology shapes exploitation: a framework to predict the behavioural response of human and animal foragers along exploration-exploitation trade-offs.

    PubMed

    Monk, Christopher T; Barbier, Matthieu; Romanczuk, Pawel; Watson, James R; Alós, Josep; Nakayama, Shinnosuke; Rubenstein, Daniel I; Levin, Simon A; Arlinghaus, Robert

    2018-06-01

    Understanding how humans and other animals behave in response to changes in their environments is vital for predicting population dynamics and the trajectory of coupled social-ecological systems. Here, we present a novel framework for identifying emergent social behaviours in foragers (including humans engaged in fishing or hunting) in predator-prey contexts based on the exploration difficulty and exploitation potential of a renewable natural resource. A qualitative framework is introduced that predicts when foragers should behave territorially, search collectively, act independently or switch among these states. To validate it, we derived quantitative predictions from two models of different structure: a generic mathematical model, and a lattice-based evolutionary model emphasising exploitation and exclusion costs. These models independently identified that the exploration difficulty and exploitation potential of the natural resource controls the social behaviour of resource exploiters. Our theoretical predictions were finally compared to a diverse set of empirical cases focusing on fisheries and aquatic organisms across a range of taxa, substantiating the framework's predictions. Understanding social behaviour for given social-ecological characteristics has important implications, particularly for the design of governance structures and regulations to move exploited systems, such as fisheries, towards sustainability. Our framework provides concrete steps in this direction. © 2018 John Wiley & Sons Ltd/CNRS.

  2. 31 CFR 500.206 - Exemption of information and informational materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exploit in the United States the Vietnamese film in any medium, including home video distribution, for... alterations made by persons subject to the jurisdiction of the United States to information or informational...

  3. ParBiBit: Parallel tool for binary biclustering on modern distributed-memory systems

    PubMed Central

    Expósito, Roberto R.

    2018-01-01

    Biclustering techniques are gaining attention in the analysis of large-scale datasets as they identify two-dimensional submatrices where both rows and columns are correlated. In this work we present ParBiBit, a parallel tool to accelerate the search for interesting biclusters on binary datasets, which are very popular in different fields such as genetics, marketing, or text mining. It is based on the state-of-the-art sequential Java tool BiBit, which has been proven accurate by several studies, especially in scenarios that result in many large biclusters. ParBiBit uses the same methodology as BiBit (grouping the binary information into patterns) and provides the same results. Nevertheless, our tool significantly improves performance thanks to an efficient implementation based on C++11 that includes support for threads and MPI processes in order to exploit the compute capabilities of modern distributed-memory systems, which provide several multicore CPU nodes interconnected through a network. Our performance evaluation with 18 representative input datasets on two different eight-node systems shows that our tool is significantly faster than the original BiBit. Source code in C++ and MPI running on Linux systems as well as a reference manual are available at https://sourceforge.net/projects/parbibit/. PMID:29608567

  4. ParBiBit: Parallel tool for binary biclustering on modern distributed-memory systems.

    PubMed

    González-Domínguez, Jorge; Expósito, Roberto R

    2018-01-01

    Biclustering techniques are gaining attention in the analysis of large-scale datasets as they identify two-dimensional submatrices where both rows and columns are correlated. In this work we present ParBiBit, a parallel tool to accelerate the search for interesting biclusters on binary datasets, which are very popular in different fields such as genetics, marketing, or text mining. It is based on the state-of-the-art sequential Java tool BiBit, which has been proven accurate by several studies, especially in scenarios that result in many large biclusters. ParBiBit uses the same methodology as BiBit (grouping the binary information into patterns) and provides the same results. Nevertheless, our tool significantly improves performance thanks to an efficient implementation based on C++11 that includes support for threads and MPI processes in order to exploit the compute capabilities of modern distributed-memory systems, which provide several multicore CPU nodes interconnected through a network. Our performance evaluation with 18 representative input datasets on two different eight-node systems shows that our tool is significantly faster than the original BiBit. Source code in C++ and MPI running on Linux systems as well as a reference manual are available at https://sourceforge.net/projects/parbibit/.
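    The pattern-grouping methodology that ParBiBit parallelises can be sketched compactly: each binary row is packed into a bitmask, candidate column patterns are seeded by AND-ing row pairs, and every row containing a pattern joins its bicluster. The sequential core below is a simplified illustration under our own naming; ParBiBit additionally distributes the row-pair loop over MPI processes and threads.

```python
from itertools import combinations

def bibit_like(matrix, min_rows=2, min_cols=2):
    """Enumerate all-ones biclusters of a binary matrix, BiBit-style:
    AND row pairs to seed column patterns, then gather every row that
    contains each pattern. A simplified sequential sketch."""
    ncols = len(matrix[0])
    # Pack each row into an integer bitmask (bit j <-> column j)
    rows = [sum(bit << j for j, bit in enumerate(r)) for r in matrix]
    seen, result = set(), []
    for a, b in combinations(range(len(rows)), 2):
        pattern = rows[a] & rows[b]
        if pattern == 0 or pattern in seen:
            continue  # empty or already-processed pattern
        seen.add(pattern)
        cols = [j for j in range(ncols) if pattern >> j & 1]
        members = [i for i, r in enumerate(rows)
                   if r & pattern == pattern]
        if len(members) >= min_rows and len(cols) >= min_cols:
            result.append((members, cols))
    return result
```

Because the row-pair iterations are independent, they can be split across processes with only the `seen` deduplication needing coordination, which is what makes the approach amenable to distributed-memory parallelisation.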

  5. Constructions of secure entanglement channels assisted by quantum dots inside single-sided optical cavities

    NASA Astrophysics Data System (ADS)

    Heo, Jino; Kang, Min-Sung; Hong, Chang-Ho; Choi, Seong-Gon; Hong, Jong-Phil

    2017-08-01

    We propose quantum information processing schemes to generate and swap entangled states based on the interactions between flying photons and quantum dots (QDs) confined within optical cavities for quantum communication. To produce and distribute entangled states (Bell and Greenberger-Horne-Zeilinger [GHZ] states) between the photonic qubits of flying photons of consumers (Alice and Bob) and electron-spin qubits of a provider (trust center, or TC), the TC employs the interactions of the QD-cavity system, which is composed of a charged QD (negatively charged exciton) inside a single-sided cavity. Subsequently, the TC constructs an entanglement channel (Bell state and 4-qubit GHZ state) to link one consumer with another through entanglement swapping, which can be realized by exploiting a probe photon, interactions with the QD-cavity systems, and single-qubit measurements, without Bell state measurement, for quantum communication between consumers. Consequently, the TC, which has quantum nodes (QD-cavity systems), can accomplish construction of the entanglement channel (authenticated channel) between two separated consumers from the distribution of entangled states and entanglement swapping. Furthermore, our schemes using QD-cavity systems, which are feasible with a certain probability of success and high fidelity, can be experimentally implemented with technology currently in use.

  6. The evolution of phenotypic correlations and ‘developmental memory’

    PubMed Central

    Watson, Richard A.; Wagner, Günter P.; Pavlicev, Mihaela; Weinreich, Daniel M.; Mills, Rob

    2014-01-01

    Development introduces structured correlations among traits that may constrain or bias the distribution of phenotypes produced. Moreover, when suitable heritable variation exists, natural selection may alter such constraints and correlations, affecting the phenotypic variation available to subsequent selection. However, exactly how the distribution of phenotypes produced by complex developmental systems can be shaped by past selective environments is poorly understood. Here we investigate the evolution of a network of recurrent non-linear ontogenetic interactions, such as a gene regulation network, in various selective scenarios. We find that evolved networks of this type can exhibit several phenomena that are familiar in cognitive learning systems. These include formation of a distributed associative memory that can ‘store’ and ‘recall’ multiple phenotypes that have been selected in the past, recreate complete adult phenotypic patterns accurately from partial or corrupted embryonic phenotypes, and ‘generalise’ (by exploiting evolved developmental modules) to produce new combinations of phenotypic features. We show that these surprising behaviours follow from an equivalence between the action of natural selection on phenotypic correlations and associative learning, well-understood in the context of neural networks. This helps to explain how development facilitates the evolution of high-fitness phenotypes and how this ability changes over evolutionary time. PMID:24351058
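    The associative-memory behaviour the authors relate to neural networks is classically captured by a Hopfield network: a Hebbian weight matrix "stores" patterns as attractors, and corrupted inputs are "recalled" to the nearest stored pattern. The sketch below shows that well-understood mechanism in its simplest form; it illustrates the analogy only and is not the paper's gene-regulation-network model.

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian weight matrix storing +/-1 patterns (zero diagonal)."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / len(P)
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronously update a (possibly corrupted) state until it
    settles onto a stored attractor."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0  # break ties deterministically
    return s
```

Storing two patterns and flipping one bit of the first still recalls the complete pattern, mirroring the abstract's "recreate complete adult phenotypic patterns from partial or corrupted embryonic phenotypes".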

  7. Regionalization of Habitat Suitability of Masson’s Pine based on geographic information system and Fuzzy Matter-Element Model

    PubMed Central

    Zhou, Xiuteng; Zhao, Manxi; Zhou, Liangyun; Yang, Guang; Huang, Luqi; Yan, Cuiqi; Huang, Quanshu; Ye, Liang; Zhang, Xiaobo; Guo, Lanpin; Ke, Xiao; Guo, Jiao

    2016-01-01

    Pine needles have been widely used in the development of anti-hypertensive and anti-hyperlipidemic agents and health food. However, the widespread distribution of this tree poses great obstacles to the quality control and efficacy evaluation. To facilitate the effective and rational exploitation of Masson’s pine (Pinus massoniana Lamb), as well as ensure effective development of Masson’s pine needles as a medicinal agent, we investigated the spatial distribution of habitat suitability and evaluated the optimal ranges of ecological factors of P. massoniana with 280 samples collected from 12 provinces in China through the evaluation of four constituents known to be effective medicinally. The results of habitat suitability evaluation were also verified by Root Mean Square Error (RMSE). Finally, five ecological factors were chosen in the establishment of a habitat suitability evaluation system. The most suitable areas for P. massoniana growth were mainly concentrated in the middle and lower reaches of the Yangtze River basin, such as Sichuan, Guizhou, and Jiangxi provinces, while the best quality needles were from Guizhou, Sichuan, and the junction area of Chongqing, Hunan, and Hubei provinces. This information revealed that suitable areas for effective constituent accumulation of Masson’s pine needles accounted for only 7.41% of its distribution area. PMID:27694967

  8. Regionalization of Habitat Suitability of Masson’s Pine based on geographic information system and Fuzzy Matter-Element Model

    NASA Astrophysics Data System (ADS)

    Zhou, Xiuteng; Zhao, Manxi; Zhou, Liangyun; Yang, Guang; Huang, Luqi; Yan, Cuiqi; Huang, Quanshu; Ye, Liang; Zhang, Xiaobo; Guo, Lanpin; Ke, Xiao; Guo, Jiao

    2016-10-01

    Pine needles have been widely used in the development of anti-hypertensive and anti-hyperlipidemic agents and health food. However, the widespread distribution of this tree poses great obstacles to the quality control and efficacy evaluation. To facilitate the effective and rational exploitation of Masson’s pine (Pinus massoniana Lamb), as well as ensure effective development of Masson’s pine needles as a medicinal agent, we investigated the spatial distribution of habitat suitability and evaluated the optimal ranges of ecological factors of P. massoniana with 280 samples collected from 12 provinces in China through the evaluation of four constituents known to be effective medicinally. The results of habitat suitability evaluation were also verified by Root Mean Square Error (RMSE). Finally, five ecological factors were chosen in the establishment of a habitat suitability evaluation system. The most suitable areas for P. massoniana growth were mainly concentrated in the middle and lower reaches of the Yangtze River basin, such as Sichuan, Guizhou, and Jiangxi provinces, while the best quality needles were from Guizhou, Sichuan, and the junction area of Chongqing, Hunan, and Hubei provinces. This information revealed that suitable areas for effective constituent accumulation of Masson’s pine needles accounted for only 7.41% of its distribution area.

  9. Analysis of Water Resource Utilization Potential for Jiangsu Coastal Area in Nantong City

    NASA Astrophysics Data System (ADS)

    Ren, Li; Liu, Jin-Tao; Ni, Jian-Jun

    2015-04-01

    With continuing population growth and socio-economic development, demands on water quality and quantity in coastal areas are rising. Yet, owing to the uneven inter-annual distribution of rainfall and the current level of water exploitation, use, and management, the shortage of water resources has become increasingly prominent, seriously restricting sustainable social and economic development in the region. Assessing the water resource utilization potential of the Jiangsu coastal region is therefore vital for regional water security. Taking Nantong City as the study area, the status of regional water resources development and utilization was evaluated. This paper first discusses the concepts of water resources, their development and utilization, and the three stages of that development viewed as a system. The regional evaluation was then carried out, covering the significance of water resources for regional society, economy, resources, and environment, together with their current state of development and utilization. Based on local conditions and water sources, an evaluation index system for the development and utilization of Nantong's water resources was built up, with an index layer composed of 16 indicators. The analytic hierarchy process (AHP) was used to determine the weights of indicators at all levels of the index system, and a multistage fuzzy comprehensive evaluation model was selected to evaluate the development and utilization status of Nantong's water resources, from which the city's water resource utilization potential was analyzed.
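    The AHP weighting step mentioned above has a standard computable core: indicator weights are taken as the principal eigenvector of a pairwise-comparison matrix, with a consistency ratio checking that the judgments are coherent. The sketch below shows that generic procedure with illustrative values, not the paper's actual comparison matrices.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix:
    the principal right eigenvector, normalised to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def consistency_ratio(pairwise, ri={3: 0.58, 4: 0.90, 5: 1.12}):
    """CR = CI / RI; values below 0.1 are conventionally acceptable.
    ri holds Saaty's random-consistency indices for small n."""
    A = np.asarray(pairwise, dtype=float)
    n = len(A)
    lam = np.max(np.real(np.linalg.eigvals(A)))
    return (lam - n) / (n - 1) / ri[n]
</n```

For a perfectly consistent matrix A[i][j] = w[i]/w[j], the recovered weights equal w and the consistency ratio is zero; real expert judgments deviate, and the CR quantifies by how much.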

  10. Exploiting mosquito sugar feeding to detect mosquito-borne pathogens

    PubMed Central

    Hall-Mendelin, Sonja; Ritchie, Scott A.; Johansen, Cheryl A.; Zborowski, Paul; Cortis, Giles; Dandridge, Scott; Hall, Roy A.; van den Hurk, Andrew F.

    2010-01-01

    Arthropod-borne viruses (arboviruses) represent a global public health problem, with dengue viruses causing millions of infections annually, while emerging arboviruses, such as West Nile, Japanese encephalitis, and chikungunya viruses have dramatically expanded their geographical ranges. Surveillance of arboviruses provides vital data regarding their prevalence and distribution that may be utilized for biosecurity measures and the implementation of disease control strategies. However, current surveillance methods that involve detection of virus in mosquito populations or sero-conversion in vertebrate hosts are laborious, expensive, and logistically problematic. We report a unique arbovirus surveillance system to detect arboviruses that exploits the process whereby mosquitoes expectorate virus in their saliva during sugar feeding. In this system, infected mosquitoes captured by CO2-baited updraft box traps are allowed to feed on honey-soaked nucleic acid preservation cards within the trap. The cards are then analyzed for expectorated virus using real-time reverse transcription-PCR. In field trials, this system detected the presence of Ross River and Barmah Forest viruses in multiple traps deployed at two locations in Australia. Viral RNA was preserved for at least seven days on the cards, allowing for long-term placement of traps and continuous collection of data documenting virus presence in mosquito populations. Furthermore, no mosquito handling or processing was required, and cards were conveniently shipped to the laboratory overnight. The simplicity and efficacy of this approach has the potential to transform current approaches to vector-borne disease surveillance by streamlining the monitoring of pathogens in vector populations. PMID:20534559

  11. Designing and validating the joint battlespace infosphere

    NASA Astrophysics Data System (ADS)

    Peterson, Gregory D.; Alexander, W. Perry; Birdwell, J. Douglas

    2001-08-01

    Fielding and managing the dynamic, complex information systems infrastructure necessary for defense operations presents significant opportunities for revolutionary improvements in capabilities. An example of this technology trend is the creation and validation of the Joint Battlespace Infosphere (JBI) being developed by the Air Force Research Lab. The JBI is a system of systems that integrates, aggregates, and distributes information to users at all echelons, from the command center to the battlefield. The JBI is a key enabler of meeting the Air Force's Joint Vision 2010 core competencies such as Information Superiority, by providing increased situational awareness, planning capabilities, and dynamic execution. At the same time, creating this new operational environment introduces significant risk due to an increased dependency on computational and communications infrastructure combined with more sophisticated and frequent threats. Hence, the challenge facing the nation is the most effective means to exploit new computational and communications technologies while mitigating the impact of attacks, faults, and unanticipated usage patterns.

  12. Method for detecting core malware sites related to biomedical information systems.

    PubMed

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. The proactive elimination of core malicious websites results in an average improvement in zero-day attack detection of more than 20%.
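    The idea of converting several centrality measures into a single quantified risk vector can be sketched generically: compute two centralities over the malware-redirect graph, normalise each, and average them; the core-hub candidate is then the highest-scoring node. The measures and equal weighting below are our illustrative assumptions, not the paper's exact formula.

```python
import numpy as np

def risk_index(adj, alpha=0.85, iters=100):
    """Toy risk vector over a malware-redirect graph.
    adj[i][j] = 1 if site i links/redirects to site j.
    Combines normalised in-degree centrality with a PageRank-style
    eigenvector centrality computed by power iteration."""
    A = np.asarray(adj, dtype=float)
    n = len(A)
    indeg = A.sum(axis=0)
    if indeg.max() > 0:
        indeg = indeg / indeg.max()
    # Column-stochastic-ish link matrix (dangling rows get weight 1)
    out = np.where(A.sum(axis=1) == 0, 1.0, A.sum(axis=1))
    M = (A / out[:, None]).T
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - alpha) / n + alpha * (M @ r)
    r = r / r.max()
    # Single quantified vector: equal-weight average of the measures
    return (indeg + r) / 2.0
```

On a star-shaped redirect graph the hub site receives the top score, matching the intuition that removing the core-hub node dismantles the distribution network.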

  13. Self-Aware Vehicles: Mission and Performance Adaptation to System Health

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Leonard, Charles; Scotti, Stephen J.

    2016-01-01

    Advances in sensing (miniaturization, distributed sensor networks) combined with improvements in computational power leading to significant gains in perception, real-time decision making/reasoning and dynamic planning under uncertainty as well as big data predictive analysis have set the stage for realization of autonomous system capability. These advances open the design and operating space for self-aware vehicles that are able to assess their own capabilities and adjust their behavior to either complete the assigned mission or to modify the mission to reflect their current capabilities. This paper discusses the self-aware vehicle concept and associated technologies necessary for full exploitation of the concept. A self-aware aircraft, spacecraft or system is one that is aware of its internal state, has situational awareness of its environment, can assess its capabilities currently and project them into the future, understands its mission objectives, and can make decisions under uncertainty regarding its ability to achieve its mission objectives.

  14. Spoken language achieves robustness and evolvability by exploiting degeneracy and neutrality.

    PubMed

    Winter, Bodo

    2014-10-01

    As with biological systems, spoken languages are strikingly robust against perturbations. This paper shows that languages achieve robustness in a way that is highly similar to many biological systems. For example, speech sounds are encoded via multiple acoustically diverse, temporally distributed and functionally redundant cues, characteristics that bear similarities to what biologists call "degeneracy". Speech is furthermore adequately characterized by neutrality, with many different tongue configurations leading to similar acoustic outputs, and different acoustic variants understood as the same by recipients. This highlights the presence of a large neutral network of acoustic neighbors for every speech sound. Such neutrality ensures that a steady backdrop of variation can be maintained without impeding communication, assuring that there is "fodder" for subsequent evolution. Thus, studying linguistic robustness is not only important for understanding how linguistic systems maintain their functioning upon the background of noise, but also for understanding the preconditions for language evolution. © 2014 WILEY Periodicals, Inc.

  15. Exploiting epoxidized natural rubber latex (ENRL) as a starting raw material for latex-based products

    NASA Astrophysics Data System (ADS)

    Siti Nor Qamarina, M.; Fatimah Rubaizah, M. R.; Nurul Suhaira, A.; Norhanifah, M. Y.

    2017-12-01

    Epoxidized natural rubber latex (ENRL) is a chemically modified natural rubber latex produced by an epoxidation process that involves the use of organic peracids. ENRL converted into dry rubber products is known to exhibit many beneficial properties; however, little published work was found on diversifying the applications of ENRL latex-based products. In this preliminary work, different sources of raw material and different neutralization systems were investigated, with the objective of exploring possibilities for producing distinctive ENRL. Findings demonstrated that the source of raw material and the neutralization system influenced the typical ENRL specifications, stability behavior, and particle size distribution. Morphological observations performed on these ENRL systems appeared to agree with the ENRL characteristics achieved. Since varying these two main factors yielded encouraging results, detailed further work will search for optimum conditions for producing marketable ENRL, specifically for latex-based product applications.

  16. Method for Detecting Core Malware Sites Related to Biomedical Information Systems

    PubMed Central

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. The proactive elimination of core malicious websites results in an average improvement in zero-day attack detection of more than 20%. PMID:25821511

  17. Efficient Energy Conversion by Grafting Nanochannels with End-charged Stimuli-responsive Polyelectrolyte Brush

    NASA Astrophysics Data System (ADS)

    Chen, Guang; Das, Siddhartha

    2017-11-01

    Polyelectrolyte (PE) brushes have attracted increasing attention for applications in energy conversion and chemical sensing due to their environmentally responsive and designable nature. PE brushes are charged polymer chains densely grafted on solid-liquid interfaces. By designing copolymeric systems, one can localize the ionizable sites at the brush tip in order to obtain end-charged PE brushes. Such brushes demonstrate anomalous shrinking/swelling behaviors with tunable environmental parameters such as pH and salt concentration. In this study, we probe the conformation and electrostatics of such PE brush systems with various sizes, grafting densities, and charge distributions, and exploit the electrochemomechanical energy conversion capabilities of nanochannels grafted with such PE brush systems. Our results indicate that the presence of the end-charged PE brush layer can massively enhance the streaming-potential-mediated energy conversion efficiency, and the improvement is more significant in strongly ionic solution.

  18. Study of Thread Level Parallelism in a Video Encoding Application for Chip Multiprocessor Design

    NASA Astrophysics Data System (ADS)

    Debes, Eric; Kaine, Greg

    2002-11-01

    In media applications there is a high level of available thread-level parallelism (TLP). In this paper we study the intra TLP in a video encoder. We show that a well-distributed, highly optimized encoder running on a symmetric multiprocessor (SMP) system can run 3.2 times faster on a 4-way SMP machine than on a single processor. The multithreaded encoder running on an SMP system is then used to understand the requirements of a chip multiprocessor (CMP) architecture, which is one possible architectural direction to better exploit TLP. In the framework of this study, we use a software approach to evaluate the dataflow between processors for the video encoder running on an SMP system. An estimation of the dataflow is done with L2 cache miss event counters using the Intel® VTune™ performance analyzer. The experimental measurements are compared to theoretical results.
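    A measured speedup of 3.2 on four processors can be interpreted through Amdahl's law: inverting the law suggests that roughly 92% of the encoder's work was parallelised. The sketch below is our back-of-envelope interpretation of the reported number, not an analysis taken from the paper.

```python
def amdahl_speedup(parallel_fraction, n_procs):
    """Predicted speedup for a workload whose parallelisable share is
    parallel_fraction, run on n_procs processors (Amdahl's law)."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / n_procs)

def parallel_fraction_from_speedup(speedup, n_procs):
    """Invert Amdahl's law: the parallel fraction implied by a
    measured speedup, e.g. the encoder's 3.2x on 4 processors."""
    return (1 - 1 / speedup) / (1 - 1 / n_procs)
```

Plugging in speedup 3.2 and 4 processors gives a parallel fraction of about 0.917, i.e. a serial residue of roughly 8%, which is where cache-miss-driven dataflow between processors would show up as overhead.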

  19. Component masses of young, wide, non-magnetic white dwarf binaries in the Sloan Digital Sky Survey Data Release 7

    NASA Astrophysics Data System (ADS)

    Baxter, R. B.; Dobbie, P. D.; Parker, Q. A.; Casewell, S. L.; Lodieu, N.; Burleigh, M. R.; Lawrie, K. A.; Külebi, B.; Koester, D.; Holland, B. R.

    2014-06-01

    We present a spectroscopic component analysis of 18 candidate young, wide, non-magnetic, double-degenerate binaries identified from a search of the Sloan Digital Sky Survey Data Release 7 (DR7). All but two pairings are likely to be physical systems. We show SDSS J084952.47+471247.7 + SDSS J084952.87+471249.4 to be a wide DA + DB binary, only the second identified to date. Combining our measurements for the components of 16 new binaries with results for three similar, previously known systems within the DR7, we have constructed a mass distribution for the largest sample to date (38) of white dwarfs in young, wide, non-magnetic, double-degenerate pairings. This is broadly similar in form to that of the isolated field population with a substantial peak around M ˜ 0.6 M⊙. We identify an excess of ultramassive white dwarfs and attribute this to the primordial separation distribution of their progenitor systems peaking at relatively larger values and the greater expansion of their binary orbits during the final stages of stellar evolution. We exploit this mass distribution to probe the origins of unusual types of degenerates, confirming a mild preference for the progenitor systems of high-field-magnetic white dwarfs, at least within these binaries, to be associated with early-type stars. Additionally, we consider the 19 systems in the context of the stellar initial mass-final mass relation. None appear to be strongly discordant with current understanding of this relationship.

  20. An integrated approach to historical population assessment of the great whales: case of the New Zealand southern right whale.

    PubMed

    Jackson, Jennifer A; Carroll, Emma L; Smith, Tim D; Zerbini, Alexandre N; Patenaude, Nathalie J; Baker, C Scott

    2016-03-01

    Accurate estimation of historical abundance provides an essential baseline for judging the recovery of the great whales. This is particularly challenging for whales hunted prior to twentieth century modern whaling, as population-level catch records are often incomplete. Assessments of whale recovery using pre-modern exploitation indices are therefore rare, despite the intensive, global nature of nineteenth century whaling. Right whales (Eubalaena spp.) were particularly exploited: slow swimmers with strong fidelity to sheltered calving bays, the species made predictable and easy targets. Here, we present the first integrated population-level assessment of the whaling impact and pre-exploitation abundance of a right whale, the New Zealand southern right whale (E. australis). In this assessment, we use a Bayesian population dynamics model integrating multiple data sources: nineteenth century catches, genetic constraints on bottleneck size and individual sightings histories informing abundance and trend. Different catch allocation scenarios are explored to account for uncertainty in the population's offshore distribution. From a pre-exploitation abundance of 28 800-47 100 whales, nineteenth century hunting reduced the population to approximately 30-40 mature females between 1914 and 1926. Today, it stands at less than 12% of pre-exploitation abundance. Despite the challenges of reconstructing historical catches and population boundaries, conservation efforts of historically exploited species benefit from targets for ecological restoration.
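    The deterministic backbone of such catch-driven assessments is commonly a generalised logistic (Pella-Tomlinson) production model projected through the catch series. The sketch below shows only that backbone, with illustrative parameter values; the paper's Bayesian model additionally integrates genetic bottleneck constraints and individual sightings histories.

```python
def project_population(n0, r, k, catches, z=2.39):
    """Project abundance under a generalised logistic model:
        N[t+1] = N[t] + r * N[t] * (1 - (N[t]/K)**z) - C[t]
    n0      : starting abundance (e.g. pre-exploitation level)
    r       : intrinsic growth rate
    k       : carrying capacity
    catches : per-period catch series C[t]
    z       : shape parameter (2.39 places maximum net productivity
              at ~0.6K, a convention in whale assessments)."""
    traj = [float(n0)]
    for c in catches:
        n = traj[-1]
        traj.append(max(n + r * n * (1 - (n / k) ** z) - c, 0.0))
    return traj
```

Starting the projection at carrying capacity with zero catch leaves the population flat, while a sustained catch series drives the trajectory down, which is the mechanism the assessment inverts to infer pre-exploitation abundance from recorded catches.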

  2. Effects of recruitment, growth, and exploitation on walleye population size structure in northern Wisconsin lakes

    USGS Publications Warehouse

    Hansen, Michael J.; Nate, Nancy A.

    2014-01-01

We evaluated the dynamics of walleye Sander vitreus population size structure, as indexed by the proportional size distribution (PSD) of quality-length fish, in Escanaba Lake during 1967–2003 and in 204 other lakes in northern Wisconsin during 1990–2011. We estimated PSD from angler-caught walleyes in Escanaba Lake and from spring electrofishing in 204 other lakes, and then related PSD to annual estimates of recruitment to age 3, length at age 3, and annual angling exploitation rate. In Escanaba Lake during 1967–2003, annual estimates of PSD were highly dynamic, growth (positively) explained 35% of PSD variation, recruitment explained only 3% of PSD variation, and exploitation explained only 7% of PSD variation. In 204 other northern Wisconsin lakes during 1990–2011, PSD varied widely among lakes, recruitment (negatively) explained 29% of PSD variation, growth (positively) explained 21% of PSD variation, and exploitation explained only 4% of PSD variation. We conclude that population size structure was most strongly driven by recruitment and growth, rather than exploitation, in northern Wisconsin walleye populations. Studies of other species over wide spatial and temporal ranges of recruitment, growth, and mortality are needed to determine which dynamic rate most strongly influences population size structure of other species. Our findings indicate a need to be cautious about assuming exploitation is a strong driver of walleye population size structure.
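For readers unfamiliar with the index, PSD is the percentage of stock-length fish in a sample that also reach quality length. A minimal Python sketch; the 250 mm stock and 380 mm quality thresholds are the commonly published values for walleye, assumed here rather than taken from the paper:

```python
def proportional_size_distribution(lengths_mm, stock_mm=250, quality_mm=380):
    """PSD = 100 * (number of quality-length fish) / (number of stock-length fish).

    Default thresholds are the commonly cited stock (250 mm) and quality
    (380 mm) lengths for walleye; verify against current fisheries standards.
    """
    stock = [L for L in lengths_mm if L >= stock_mm]
    quality = [L for L in stock if L >= quality_mm]
    if not stock:
        raise ValueError("no stock-length fish in sample")
    return 100.0 * len(quality) / len(stock)

sample = [210, 260, 300, 390, 410, 455, 270, 330, 500, 240]
print(proportional_size_distribution(sample))  # 8 stock-length fish, 4 quality -> 50.0
```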

  3. Maximizing root/rhizosphere efficiency to improve crop productivity and nutrient use efficiency in intensive agriculture of China.

    PubMed

    Shen, Jianbo; Li, Chunjian; Mi, Guohua; Li, Long; Yuan, Lixing; Jiang, Rongfeng; Zhang, Fusuo

    2013-03-01

Root and rhizosphere research has been conducted for many decades, but the underlying strategy of root/rhizosphere processes and management in intensive cropping systems remains largely to be determined. Improved grain production to meet the food demand of an increasing population has been highly dependent on chemical fertilizer input based on the traditionally assumed notion of 'high input, high output', which results in overuse of fertilizers but ignores the biological potential of roots or the rhizosphere for efficient mobilization and acquisition of soil nutrients. Root exploration of soil nutrient resources and root-induced rhizosphere processes play an important role in controlling nutrient transformation, efficient nutrient acquisition and use, and thus crop productivity. The efficiency of the root/rhizosphere in terms of improved nutrient mobilization, acquisition, and use can be fully exploited by: (1) manipulating root growth (i.e. root development and size, root system architecture, and distribution); (2) regulating rhizosphere processes (i.e. rhizosphere acidification, organic anion and acid phosphatase exudation, localized application of nutrients, rhizosphere interactions, and use of efficient crop genotypes); and (3) optimizing root zone management to synchronize root growth and soil nutrient supply with the demand of nutrients in cropping systems. Experiments have shown that root/rhizosphere management is an effective approach to increase both nutrient use efficiency and crop productivity for sustainable crop production. The objectives of this paper are to summarize the principles of root/rhizosphere management and provide an overview of some successful case studies on how to exploit the biological potential of the root system and rhizosphere processes to improve crop productivity and nutrient use efficiency.

  4. Surveillance and reconnaissance ground system architecture

    NASA Astrophysics Data System (ADS)

    Devambez, Francois

    2001-12-01

Modern conflicts induce various modes of deployment, depending on the type of conflict, the type of mission, and the phase of the conflict. It is therefore impossible to define fixed-architecture systems for surveillance ground segments. Thales has developed a structure for a ground segment based on the operational functions required and on the definition of modules and networks. These modules are software and hardware modules, including communications and networks. This ground segment is called the MGS (Modular Ground Segment) and is intended for use in airborne reconnaissance systems, surveillance systems, and UAV systems. The main parameters for the definition of a modular ground image exploitation system are: compliance with various operational configurations; easy adaptation to the evolution of these configurations; interoperability with NATO and multinational forces; security; multi-sensor, multi-platform capabilities; technical modularity; evolvability; and reduction of life-cycle cost. The general performance of the MGS is presented: types of sensors, acquisition process, exploitation of images, report generation, database management, dissemination, and interface with C4I. The MGS is then described as a set of hardware and software modules and their organization to build numerous operational configurations. Architectures range from a minimal configuration intended for a mono-sensor image exploitation system to a full image intelligence center for multilevel exploitation of multiple sensors.

  5. Laser source for dimensional metrology: investigation of an iodine stabilized system based on narrow linewidth 633 nm DBR diode

    NASA Astrophysics Data System (ADS)

    Rerucha, Simon; Yacoot, Andrew; Pham, Tuan M.; Cizek, Martin; Hucl, Vaclav; Lazar, Josef; Cip, Ondrej

    2017-04-01

We demonstrated that an iodine-stabilized distributed Bragg reflector (DBR) diode laser system, lasing at a wavelength in close proximity to λ = 633 nm, could be used as an alternative laser source to helium-neon lasers in both scientific and industrial metrology. Besides optical frequency stability and coherence, this yields additional advantages: inherent traceability, a wider optical frequency tuning range, higher output power and high-frequency modulation capability. We experimentally investigated the characteristics of the laser source in two major steps: first using a wavelength meter referenced to a frequency comb controlled with a hydrogen maser, and then on an interferometric optical bench testbed where we compared the performance of the laser system with that of a traditional frequency-stabilized He-Ne laser. The results indicate that the DBR diode laser system provides a good laser source for applications in dimensional (nano)metrology, especially in conjunction with novel interferometric detection methods exploiting high-frequency modulation or multiaxis measurement systems.

  6. Impact of Targeted Programs on Health Systems: A Case Study of the Polio Eradication Initiative

    PubMed Central

    Loevinsohn, Benjamin; Aylward, Bruce; Steinglass, Robert; Ogden, Ellyn; Goodman, Tracey; Melgaard, Bjorn

    2002-01-01

    The results of 2 large field studies on the impact of the polio eradication initiative on health systems and 3 supplementary reports were presented at a December 1999 meeting convened by the World Health Organization. All of these studies concluded that positive synergies exist between polio eradication and health systems but that these synergies have not been vigorously exploited. The eradication of polio has probably improved health systems worldwide by broadening distribution of vitamin A supplements, improving cooperation among enterovirus laboratories, and facilitating linkages between health workers and their communities. The results of these studies also show that eliminating polio did not cause a diminution of funding for immunization against other illnesses. Relatively little is known about the opportunity costs of polio eradication. Improved planning in disease eradication initiatives can minimize disruptions in the delivery of other services. Future initiatives should include indicators and baseline data for monitoring effects on health systems development. PMID:11772750

  7. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
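The three components can be illustrated with a toy three-variable graph in plain Python (the variables and all probabilities are invented for illustration; a real application would use a graphical-model library with message passing rather than brute-force enumeration):

```python
from itertools import product

# Toy directed graph for a catchment: wet season (W) -> high storage (S) -> flood (F).
# Each factor is a local conditional probability table; their product is the joint.
p_W = {True: 0.3, False: 0.7}
p_S_given_W = {True: {True: 0.8, False: 0.2}, False: {True: 0.25, False: 0.75}}
p_F_given_S = {True: {True: 0.5, False: 0.5}, False: {True: 0.05, False: 0.95}}

def joint(w, s, f):
    """Joint distribution as the product of the local factors."""
    return p_W[w] * p_S_given_W[w][s] * p_F_given_S[s][f]

def posterior_flood(given_wet):
    """P(F=True | W=given_wet), summing the joint over the hidden storage state S."""
    num = sum(joint(given_wet, s, True) for s in (True, False))
    den = sum(joint(given_wet, s, f) for s, f in product((True, False), repeat=2))
    return num / den

print(posterior_flood(True))   # P(flood | wet season) = 0.41
print(posterior_flood(False))  # P(flood | normal season) = 0.1625
```

Algorithms that exploit the graph structure (belief propagation, variational inference) replace these exponential-cost sums with local computations, which is what makes inference feasible in larger hydrological models.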

  8. Equalization in Aeronautical Telemetry Using Multiple Antennas

    DTIC Science & Technology

    2014-04-01

April 2014. DISTRIBUTION STATEMENT A: approved for public release; distribution unlimited. Test Resource Management Center, contract number W900KK-13-C-0026. [Report documentation page residue; the abstract is truncated:] "...employing two transmit antennas and as a method for exploiting partial channel state information by the transmitter. The generalization involves..."

  9. Distributed measurement of high electric current by means of polarimetric optical fiber sensor.

    PubMed

    Palmieri, Luca; Sarchi, Davide; Galtarossa, Andrea

    2015-05-04

    A novel distributed optical fiber sensor for spatially resolved monitoring of high direct electric current is proposed and analyzed. The sensor exploits Faraday rotation and is based on the polarization analysis of the Rayleigh backscattered light. Preliminary laboratory tests, performed on a section of electric cable for currents up to 2.5 kA, have confirmed the viability of the method.
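The underlying principle can be sketched numerically: by Ampère's law, the Faraday rotation accumulated by light guided around a conductor is proportional to the enclosed current. A hedged Python sketch, with an assumed illustrative Verdet constant and geometry (not values from the paper):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def current_from_rotation(theta_rad, n_turns, verdet_rad_per_T_m):
    """Invert the Faraday relation theta = V * mu0 * N * I for the enclosed
    current I (the line integral of H along N closed fiber turns around the
    conductor equals N*I by Ampere's law)."""
    return theta_rad / (verdet_rad_per_T_m * MU0 * n_turns)

# Illustrative numbers only: the Verdet constant of silica fiber is wavelength
# dependent (on the order of 1 rad/(T*m) in the near infrared).
theta = 0.0085  # measured polarization rotation, rad
I = current_from_rotation(theta, n_turns=10, verdet_rad_per_T_m=0.54)
print(round(I), "A")
```

The distributed aspect of the sensor comes from resolving this rotation along the fiber via polarization analysis of Rayleigh backscatter, rather than from a single end-to-end measurement.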

  10. Spatial correlation-based side information refinement for distributed video coding

    NASA Astrophysics Data System (ADS)

    Taieb, Mohamed Haj; Chouinard, Jean-Yves; Wang, Demin

    2013-12-01

Distributed video coding (DVC) architecture designs, based on distributed source coding principles, have benefited from significant progress lately, notably in terms of achievable rate-distortion performance. However, a significant performance gap still remains when compared to prediction-based video coding schemes such as H.264/AVC. This is mainly due to the non-ideal exploitation of the video sequence temporal correlation properties during the generation of side information (SI). In fact, the decoder-side motion estimation provides only an approximation of the true motion. In this paper, a progressive DVC architecture is proposed, which exploits the spatial correlation of the video frames to improve the motion-compensated temporal interpolation (MCTI). Specifically, Wyner-Ziv (WZ) frames are divided into several spatially correlated groups that are then sent progressively to the receiver. SI refinement (SIR) is performed as these groups are decoded, thus providing more accurate SI for the next groups. It is shown that the proposed progressive SIR method leads to significant improvements over the Discover DVC codec as well as other SIR schemes recently introduced in the literature.

  11. Generalized Drivers in the Mammalian Endangerment Process

    PubMed Central

    González-Suárez, Manuela; Revilla, Eloy

    2014-01-01

    An important challenge for conservation today is to understand the endangerment process and identify any generalized patterns in how threats occur and aggregate across taxa. Here we use a global database describing main current external threats in mammals to evaluate the prevalence of distinct threatening processes, primarily of anthropogenic origin, and to identify generalized drivers of extinction and their association with vulnerability status and intrinsic species' traits. We detect several primary threat combinations that are generally associated with distinct species. In particular, large and widely distributed mammals are affected by combinations of direct exploitation and threats associated with increasing landscape modification that go from logging to intense human land-use. Meanwhile, small, narrowly distributed species are affected by intensifying levels of landscape modification but are not directly exploited. In general more vulnerable species are affected by a greater number of threats, suggesting increased extinction risk is associated with the accumulation of external threats. Overall, our findings show that endangerment in mammals is strongly associated with increasing habitat loss and degradation caused by human land-use intensification. For large and widely distributed mammals there is the additional risk of being hunted. PMID:24587315

  12. Real-time updating of the flood frequency distribution through data assimilation

    NASA Astrophysics Data System (ADS)

    Aguilar, Cristina; Montanari, Alberto; Polo, María-José

    2017-07-01

We explore the memory properties of catchments for predicting the likelihood of floods based on observations of average flows in pre-flood seasons. Our approach assumes that flood formation is driven by the superimposition of short- and long-term perturbations. The former is given by the short-term meteorological forcing leading to infiltration and/or saturation excess, while the latter originates from higher-than-usual storage in the catchment. To exploit the above sensitivity to long-term perturbations, a meta-Gaussian model and a data assimilation approach are implemented for updating the flood frequency distribution a season in advance. Accordingly, the peak flow in the flood season is predicted in probabilistic terms by exploiting its dependence on the average flow in the antecedent seasons. We focus on the Po River at Pontelagoscuro and the Danube River at Bratislava. We found that the shape of the flood frequency distribution is noticeably impacted by higher-than-usual flows occurring up to several months earlier. The proposed technique may allow one to reduce the uncertainty associated with the estimation of flood frequency.
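The seasonal updating idea can be sketched with a minimal normal-scores (meta-Gaussian) conditional update in Python. This is an illustrative simplification of the authors' data-assimilation scheme; the correlation rho and the quantile levels are assumed values:

```python
from math import sqrt
from statistics import NormalDist

def updated_flood_quantile(p, rho, x_obs_quantile):
    """Conditional flood quantile in normal-scores (meta-Gaussian) space.

    Both variables are mapped to standard normal scores. Given the observed
    pre-flood-season average flow at non-exceedance probability x_obs_quantile,
    the seasonal peak-flow score is N(rho*x, 1 - rho^2). Returns the level in
    the *unconditional* flood distribution matching conditional probability p.
    """
    nd = NormalDist()
    x = nd.inv_cdf(x_obs_quantile)           # normal score of antecedent flow
    z = nd.inv_cdf(p)                        # conditional p-quantile score
    y = rho * x + sqrt(1.0 - rho ** 2) * z   # conditional normal score of peak
    return nd.cdf(y)                         # back to a probability level

# A wetter-than-usual pre-flood season (90th percentile) with rho = 0.5 shifts
# the conditional median flood upward in the unconditional distribution:
print(updated_flood_quantile(0.5, 0.5, 0.90))
```

With rho = 0 the antecedent observation carries no information and the update returns p unchanged, which is a useful sanity check on the construction.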

  13. Research status of geothermal resources in China

    NASA Astrophysics Data System (ADS)

    Zhang, Lincheng; Li, Guang

    2017-08-01

As a representative new green energy source, geothermal resources are characterized by large reserves, wide distribution, cleanness and environmental protection, good stability, a high utilization factor and other advantages. According to the characteristics of exploitation and utilization, they can be divided into high-temperature, medium-temperature and low-temperature geothermal resources. The abundant and widely distributed geothermal resources in China have a broad prospect for development. The medium- and low-temperature geothermal resources are broadly distributed in the continental crustal uplift and subsidence areas inside the plate, represented by the geothermal belt on the southeast coast, while the high-temperature geothermal resources concentrate on the Southern Tibet-Western Sichuan-Western Yunnan Geothermal Belt and the Taiwan Geothermal Belt. Currently, the geothermal resources in China are mainly used for bathing, recuperation, heating and power generation, and China is the world's largest direct user of geothermal energy. However, China's geothermal power generation, including installed generating capacity and power output, lags far behind that of Western European countries and the USA. Studies on the exploitation and development of geothermal resources also remain weak.

  14. Space-Data Routers: Advanced data routing protocols for enhancing data exploitation for space weather applications

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Daglis, Ioannis A.; Balasis, George; Papadimitriou, Constantinos; Tsaoussidis, Vassilios; Diamantopoulos, Sotirios

    2014-05-01

    Data sharing and access are major issues in space sciences, as they influence the degree of data exploitation. The availability of multi-spacecraft distributed observation methods and adaptive mission architectures require computationally intensive analysis methods. Moreover, accurate space weather forecasting and future space exploration far from Earth will be in need of real-time data distribution and assimilation technologies. The FP7-Space collaborative research project "Space-Data Routers" (SDR) relies on space internetworking and in particular on Delay Tolerant Networking (DTN), which marks the new era in space communications. SDR unifies space and earth communication infrastructures and delivers a set of tools and protocols for space-data exploitation. The main goal is to allow space agencies, academic institutes and research centers to share space-data generated by single or multiple missions, in an efficient, secure and automated manner. Here we are presenting the architecture and basic functionality of a DTN-based application specifically designed in the framework of the SDR project, for data query, retrieval and administration that will enable addressing outstanding science questions related to space weather, through the provision of simultaneous real-time data sampling at multiple points in space. The work leading to this paper has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2010-1) under grant agreement no. 263330 for the SDR (Space-Data Routers for Exploiting Space Data) collaborative research project. This paper reflects only the authors' views and the Union is not liable for any use that may be made of the information contained therein.

  15. Cascadia Slow Earthquakes: Strategies for Time Independent Inversion of Displacement Fields

    NASA Astrophysics Data System (ADS)

    Szeliga, W. M.; Melbourne, T. I.; Miller, M. M.; Santillan, V. M.

    2004-12-01

Continuous observations using Global Positioning System geodesy (CGPS) have revealed periodic slow or silent earthquakes along the Cascadia subduction zone with a spectrum of timing and periodicity. These creep events perturb time series of GPS observations and yield coherent displacement fields that relate to the extent and magnitude of fault displacement. In this study, time independent inversions of the surface displacement fields that accompany eight slow earthquakes characterize slip distributions along the plate interface for each event. The inversions employed in this study utilize Okada's elastic dislocation model and a non-negative least squares approach. Methodologies for optimizing the slip distribution smoothing parameter for a particular station distribution have also been investigated, significantly reducing the number of possible slip distributions and the range of estimates for total moment release for each event. The discretized slip distribution calculated for multiple creep events identifies areas of the Cascadia plate interface where slip persistently recurs. The current hypothesis, that slow earthquakes are modulated by forced fluid flow, leads to the possibility that some regions of the Cascadia plate interface may display fault patches preferentially exploited by fluid flow. Thus, the identification of regions of the plate interface that repeatedly slip during slow events may yield important information regarding the identification of these fluid pathways.

  16. Dead time corrections for in-beam γ-spectroscopy measurements

    NASA Astrophysics Data System (ADS)

    Boromiza, M.; Borcea, C.; Negret, A.; Olacel, A.; Suliman, G.

    2017-08-01

Relatively high counting rates were registered in a proton inelastic scattering experiment on 16O and 28Si using HPGe detectors, performed at the Tandem facility of IFIN-HH, Bucharest. Consequently, dead time corrections were needed in order to determine the absolute γ-production cross sections. Considering that the true counting rate follows a Poisson distribution, the dead time correction procedure is reformulated in statistical terms: the inter-arrival time between incoming events (Δt) obeys an exponential distribution with a single parameter, the mean of the associated Poisson distribution. We use this mathematical connection to calculate and implement the dead time corrections for the counting rates of the experiment. Additionally, exploiting an idea introduced by Pommé et al., we describe a consistent method for calculating the dead time correction that completely avoids the complicated problem of measuring the dead time of a given detection system. Several comparisons are made between the corrections implemented through this method and those obtained using standard (phenomenological) dead time models, and we show how these results were used to correct our experimental cross sections.
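As a generic illustration of the statistical reformulation (not the method of Pommé et al. used in the paper), the textbook non-paralyzable dead-time correction can be checked against a Monte Carlo simulation that draws exponential inter-arrival times:

```python
import random

def true_rate_nonparalyzable(measured_rate, tau):
    """Classic non-paralyzable dead-time correction: n = m / (1 - m*tau)."""
    return measured_rate / (1.0 - measured_rate * tau)

def simulate_measured_rate(true_rate, tau, n_events=200_000, seed=1):
    """Monte Carlo check: draw exponential inter-arrival times (the Delta-t
    distribution of a Poisson process) and count only events falling outside
    the dead window tau following the previously *recorded* event."""
    rng = random.Random(seed)
    t, last_recorded, recorded = 0.0, -float("inf"), 0
    for _ in range(n_events):
        t += rng.expovariate(true_rate)
        if t - last_recorded >= tau:
            recorded += 1
            last_recorded = t
    return recorded / t

m = simulate_measured_rate(true_rate=50_000.0, tau=5e-6)  # 50 kcps, 5 us dead time
print(round(true_rate_nonparalyzable(m, 5e-6)))  # recovers ~50000
```

For these parameters the analytic expectation is m = n/(1 + n·τ) = 40 kcps, and applying the correction to the simulated measured rate recovers the true 50 kcps within statistical error.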

  17. Creating Hierarchical Pores by Controlled Linker Thermolysis in Multivariate Metal-Organic Frameworks.

    PubMed

    Feng, Liang; Yuan, Shuai; Zhang, Liang-Liang; Tan, Kui; Li, Jia-Luo; Kirchon, Angelo; Liu, Ling-Mei; Zhang, Peng; Han, Yu; Chabal, Yves J; Zhou, Hong-Cai

    2018-02-14

    Sufficient pore size, appropriate stability, and hierarchical porosity are three prerequisites for open frameworks designed for drug delivery, enzyme immobilization, and catalysis involving large molecules. Herein, we report a powerful and general strategy, linker thermolysis, to construct ultrastable hierarchically porous metal-organic frameworks (HP-MOFs) with tunable pore size distribution. Linker instability, usually an undesirable trait of MOFs, was exploited to create mesopores by generating crystal defects throughout a microporous MOF crystal via thermolysis. The crystallinity and stability of HP-MOFs remain after thermolabile linkers are selectively removed from multivariate metal-organic frameworks (MTV-MOFs) through a decarboxylation process. A domain-based linker spatial distribution was found to be critical for creating hierarchical pores inside MTV-MOFs. Furthermore, linker thermolysis promotes the formation of ultrasmall metal oxide nanoparticles immobilized in an open framework that exhibits high catalytic activity for Lewis acid-catalyzed reactions. Most importantly, this work provides fresh insights into the connection between linker apportionment and vacancy distribution, which may shed light on probing the disordered linker apportionment in multivariate systems, a long-standing challenge in the study of MTV-MOFs.

  18. NESDIS OSPO Data Access Policy and CRM

    NASA Astrophysics Data System (ADS)

    Seybold, M. G.; Donoho, N. A.; McNamara, D.; Paquette, J.; Renkevens, T.

    2012-12-01

    The Office of Satellite and Product Operations (OSPO) is the NESDIS office responsible for satellite operations, product generation, and product distribution. Access to and distribution of OSPO data was formally established in a Data Access Policy dated February, 2011. An extension of the data access policy is the OSPO Customer Relationship Management (CRM) Database, which has been in development since 2008 and is reaching a critical level of maturity. This presentation will provide a summary of the data access policy and standard operating procedure (SOP) for handling data access requests. The tangential CRM database will be highlighted including the incident tracking system, reporting and notification capabilities, and the first comprehensive portfolio of NESDIS satellites, instruments, servers, applications, products, user organizations, and user contacts. Select examples of CRM data exploitation will show how OSPO is utilizing the CRM database to more closely satisfy the user community's satellite data needs with new product promotions, as well as new data and imagery distribution methods in OSPO's Environmental Satellite Processing Center (ESPC). In addition, user services and outreach initiatives from the Satellite Products and Services Division will be highlighted.

  19. cuTauLeaping: A GPU-Powered Tau-Leaping Stochastic Simulator for Massive Parallel Analyses of Biological Systems

    PubMed Central

    Besozzi, Daniela; Pescini, Dario; Mauri, Giancarlo

    2014-01-01

Tau-leaping is a stochastic simulation algorithm that efficiently reconstructs the temporal evolution of biological systems, modeled according to the stochastic formulation of chemical kinetics. The analysis of the dynamical properties of these systems in physiological and perturbed conditions usually requires the execution of a large number of simulations, leading to high computational costs. Since each simulation can be executed independently from the others, a massive parallelization of tau-leaping can bring relevant reductions in the overall running time. The emerging field of General Purpose Graphic Processing Units (GPGPU) provides power-efficient high-performance computing at a relatively low cost. In this work we introduce cuTauLeaping, a stochastic simulator of biological systems that makes use of GPGPU computing to execute multiple parallel tau-leaping simulations, by fully exploiting the Nvidia's Fermi GPU architecture. We show how a considerable computational speedup is achieved on GPU by partitioning the execution of tau-leaping into multiple separated phases, and we describe how to avoid some implementation pitfalls related to the scarcity of memory resources on the GPU streaming multiprocessors. Our results show that cuTauLeaping largely outperforms the CPU-based tau-leaping implementation when the number of parallel simulations increases, with a break-even directly depending on the size of the biological system and on the complexity of its emergent dynamics. In particular, cuTauLeaping is exploited to investigate the probability distribution of bistable states in the Schlögl model, and to carry out a bidimensional parameter sweep analysis to study the oscillatory regimes in the Ras/cAMP/PKA pathway in S. cerevisiae. PMID:24663957
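cuTauLeaping itself is a CUDA code, but the core leap step is easy to sketch serially in Python. The birth-death system below is a toy stand-in (not the Schlögl or Ras/cAMP/PKA models from the paper):

```python
import random

def poisson_sample(rng, lam):
    """Knuth's method; adequate for the small per-leap means used here."""
    L, k, p = 2.718281828459045 ** -lam, 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap(x0, rates, stoich, propensity, tau, steps, seed=42):
    """One tau-leaping trajectory: in each leap of length tau, reaction j fires
    Poisson(a_j(x) * tau) times, with all propensities evaluated at the
    start-of-leap state; the species count is clipped at zero."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        fires = [poisson_sample(rng, propensity(j, x, rates) * tau)
                 for j in range(len(stoich))]
        x = max(x + sum(k * v for k, v in zip(fires, stoich)), 0)
    return x

# Toy birth-death system (illustrative only):
#   reaction 0: 0 -> X with propensity c1;  reaction 1: X -> 0 with c2*x
def propensity(j, x, c):
    return c[0] if j == 0 else c[1] * x

x_end = tau_leap(x0=0, rates=(10.0, 0.1), stoich=(+1, -1),
                 propensity=propensity, tau=0.1, steps=1000)
print(x_end)  # fluctuates around the steady state c1/c2 = 100
```

The GPU version parallelizes trivially because each trajectory like this one is independent; the engineering effort in cuTauLeaping goes into memory layout and phase partitioning, not the leap arithmetic.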

  20. [Ecotourism exploitation model in Bita Lake Natural Reserve of Yunnan].

    PubMed

    Yang, G; Wang, Y; Zhong, L

    2000-12-01

Bita Lake provincial natural reserve is located in the Shangri-La region of northwestern Yunnan and was designated a demonstration area for ecotourism exploitation in 1998. After a year of exploitation construction and half a year of operation receiving tourists as a branch of the '99 Kunming International Horticulture Exposition, it was shown that the ecotourism demonstration area attained four integrated functions of ecotourism: tourism, protection, poverty alleviation and environmental education. Five exploitation and management models, including a function-zoned exploitation model, a featured tourism communication model, a signs-system design model, a local Tibetan family reception model and an environmental monitoring model, were also successful, and these were demonstrated and spread across the whole province. Bita Lake provincial natural reserve can serve as a good example for ecotourism exploitation in natural reserves throughout the country.

  1. To defer or to stand up? How offender formidability affects third party moral outrage.

    PubMed

    Jensen, Niels Holm; Petersen, Michael Bang

    2011-03-16

According to models of animal behavior, the relative formidability of conspecifics determines the utility of deferring versus aggressing in situations of conflict. Here we apply and extend these models by investigating how the formidability of exploiters shapes third party moral outrage in humans. Deciding whether to defer to or stand up against a formidable exploiter is a complicated decision as there is both much to lose (formidable individuals are able and prone to retaliate) and much to gain (formidable individuals pose a great future threat). An optimally designed outrage system should, therefore, be sensitive to these cost-benefit trade-offs. To test this argument, participants read scenarios containing exploitative acts (trivial vs. serious) and were presented with head-shot photos of the apparent exploiters (formidable vs. non-formidable). As predicted, results showed that, compared to the non-formidable exploiter, the formidable exploiter activated significantly more outrage in male participants when the exploitative act was serious. Conversely, when it was trivial, the formidable exploiter activated significantly less outrage in male participants. However, these findings were conditioned by the exploiters' perceived trustworthiness. Among female participants, the results showed that moral outrage was not modulated by exploiter formidability.

  2. Exploitation of commercial remote sensing images: reality ignored?

    NASA Astrophysics Data System (ADS)

    Allen, Paul C.

    1999-12-01

The remote sensing market is on the verge of being awash in commercial high-resolution images. Market estimates are based on the growing numbers of planned commercial remote sensing electro-optical, radar, and hyperspectral satellites and aircraft. EarthWatch, Space Imaging, SPOT, and RDL, among others, are all working towards launch and service of one- to five-meter panchromatic or radar-imaging satellites. Additionally, new advances in digital air surveillance and reconnaissance systems, both manned and unmanned, are also expected to expand the geospatial customer base. Regardless of platform, image type, or location, each system promises images with some combination of increased resolution, greater spectral coverage, reduced turn-around time (request-to-delivery), and/or reduced image cost. For the most part, however, market estimates for these new sources focus on the raw digital images (from collection to the ground station) while ignoring the requirements for a processing and exploitation infrastructure composed of exploitation tools, exploitation training, library systems, and image management systems. From this it would appear the commercial imaging community has failed to learn the hard lessons of national government experience, choosing instead to ignore reality and replicate the bias of collection over processing and exploitation. While this trend may not impact the small number of users that exist today, it will certainly adversely affect the mid- to large-sized users of the future.

  3. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed within the black-hat and white-hat segments of the Information Technology (IT) security research community. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified, and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

  4. Technologies for distributed defense

    NASA Astrophysics Data System (ADS)

    Seiders, Barbara; Rybka, Anthony

    2002-07-01

    For Americans, the nature of warfare changed on September 11, 2001. Our national security henceforth will require distributed defense. One extreme of distributed defense is represented by fully deployed military troops responding to a threat from a hostile nation state. At the other extreme is a country of 'citizen soldiers', with families and communities securing their common defense through heightened awareness, engagement as good neighbors, and local support of and cooperation with local law enforcement, emergency and health care providers. Technologies - for information exploitation, biological agent detection, health care surveillance, and security - will be critical to ensuring success in distributed defense.

  5. Accelerating Sequences in the Presence of Metal by Exploiting the Spatial Distribution of Off-Resonance

    PubMed Central

    Smith, Matthew R.; Artz, Nathan S.; Koch, Kevin M.; Samsonov, Alexey; Reeder, Scott B.

    2014-01-01

    Purpose: To demonstrate feasibility of exploiting the spatial distribution of off-resonance surrounding metallic implants for accelerating multispectral imaging techniques. Theory: Multispectral imaging (MSI) techniques perform time-consuming independent 3D acquisitions with varying RF frequency offsets to address the extreme off-resonance from metallic implants. Each off-resonance bin provides a unique spatial sensitivity that is analogous to the sensitivity of a receiver coil, and therefore provides a unique opportunity for acceleration. Methods: Fully sampled MSI was performed to demonstrate retrospective acceleration. A uniform sampling pattern across off-resonance bins was compared to several adaptive sampling strategies using a total hip replacement phantom. Monte Carlo simulations were performed to compare noise propagation of two of these strategies. With a total knee replacement phantom, positive and negative off-resonance bins were strategically sampled with respect to the B0 field to minimize aliasing. Reconstructions were performed with a parallel imaging framework to demonstrate retrospective acceleration. Results: An adaptive sampling scheme dramatically improved reconstruction quality, which was supported by the noise propagation analysis. Independent acceleration of negative and positive off-resonance bins demonstrated reduced overlapping of aliased signal to improve the reconstruction. Conclusion: This work presents the feasibility of acceleration in the presence of metal by exploiting the spatial sensitivities of off-resonance bins. PMID:24431210
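
    The bin-as-coil analogy above can be sketched with a toy calculation. The following illustrative Python example (all sensitivity profiles, sizes, and image values are invented, not taken from the paper) treats two off-resonance bins as SENSE-style receiver sensitivities: 2x undersampling folds each bin image, and a per-pixel least-squares solve unfolds the object.

```python
import numpy as np

# Toy 1D "SENSE-like" unfolding that treats two off-resonance bins as
# receiver-coil-style sensitivities. All profiles, sizes, and image values
# are invented for illustration.
n = 8                                        # image length (even)
x = np.arange(n)
true_img = np.cos(2 * np.pi * x / n) + 2.0   # synthetic object
s1 = np.exp(-((x - 2.0) ** 2) / 8.0)         # spatial sensitivity of bin 1
s2 = np.exp(-((x - 5.0) ** 2) / 8.0)         # spatial sensitivity of bin 2

# 2x undersampling folds pixel i onto pixel i + n/2 in each bin image.
half = n // 2

def fold(s):
    weighted = s * true_img
    return weighted[:half] + weighted[half:]

a1, a2 = fold(s1), fold(s2)

# Unfold: at each aliased location, solve a 2x2 least-squares system whose
# rows are the two bins' sensitivities at the two folded pixel positions.
recon = np.zeros(n)
for i in range(half):
    S = np.array([[s1[i], s1[i + half]],
                  [s2[i], s2[i + half]]])
    y = np.array([a1[i], a2[i]])
    sol, *_ = np.linalg.lstsq(S, y, rcond=None)
    recon[i], recon[i + half] = sol

print(np.allclose(recon, true_img))   # distinct sensitivities give an exact unfold
```

    Because the two Gaussian sensitivities differ at every folded pixel pair, the 2x2 systems are well conditioned and the unfold is exact; overlapping (similar) sensitivities would amplify noise, which is what the abstract's adaptive sampling strategies are designed to avoid.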

  6. Spatio-temporal distribution of net-collected phytoplankton community and its response to marine exploitation in Xiangshan Bay

    NASA Astrophysics Data System (ADS)

    Jiang, Zhibing; Zhu, Xuyu; Gao, Yu; Chen, Quanzhen; Zeng, Jiangning; Zhu, Genhai

    2013-07-01

    To explore the spatio-temporal distribution of the phytoplankton community and evaluate the combined effects of marine resource exploitation, net-collected phytoplankton and physical-chemical parameters were investigated in Xiangshan Bay during the four seasons of 2010. A total of eight phyla, 97 genera, and 310 species were found, including 232 diatom species, 45 dinoflagellate species, and 33 other taxa. Phytoplankton abundance showed a significant (P < 0.001) seasonal difference, with an average of 60.66×10⁴ cells/m³. Diatoms (mainly consisting of Coscinodiscus jonesianus, Cerataulina pelagica, Skeletonema costatum, and the genus Chaetoceros) dominated the phytoplankton assemblage in all seasons. We found great spatio-temporal variation in community composition based on multidimensional scaling and similarity analysis. Canonical correspondence analysis showed that temperature, nutrient levels, illumination, and salinity were the main variables associated with the microalgal assemblage. Compared with previous studies, an increase in phytoplankton abundance and a change in the dominant species coincided with increased exploitation activities in this bay (e.g. operation of coastal power plants, intensive mariculture, tidal flat reclamation, and industrial and agricultural development). The present findings suggest that the government should exercise caution when deciding upon developmental patterns in the sea-related economy.

  7. Artillery/mortar round type classification to increase system situational awareness

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Grasing, David; Morcos, Amir; Hohil, Myron

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, and of the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors capture the sound waveform generated by the blast, and analysis of that waveform identifies the mortar or artillery variant by type. Distinct characteristics arise among the different mortar/artillery variants because varying HE mortar payloads and related charges produce launch events of varying size. The waveform holds harmonic properties distinct to a given mortar/artillery variant that advanced signal processing and data mining techniques can exploit to classify type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set that is highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies that employ acoustic sensor systems to provide such situational awareness.
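
    As a rough illustration of the band-wise statistical feature extraction described above, the sketch below computes a skewness coefficient for each energy band of a synthetic waveform. The sample rate, band edges, and toy "launch transient" are assumptions for illustration, not the authors' actual processing chain.

```python
import numpy as np

# Sketch of band-wise statistical feature extraction from a waveform:
# skewness of the signal restricted to each energy band. The sample rate,
# band edges, and synthetic "launch transient" are illustrative assumptions.
rng = np.random.default_rng(0)
fs = 10_000                                  # sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)
blast = np.exp(-40 * t) * np.sin(2 * np.pi * 60 * t)   # toy blast waveform
x = blast + 0.05 * rng.standard_normal(t.size)         # plus ambient noise

def skewness(v):
    d = v - v.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

def band_feature(sig, lo, hi):
    """Skewness of the signal band-limited to [lo, hi) Hz."""
    spec = np.fft.rfft(sig)
    f = np.fft.rfftfreq(sig.size, 1 / fs)
    spec[(f < lo) | (f >= hi)] = 0.0         # zero everything outside the band
    return skewness(np.fft.irfft(spec, n=sig.size))

bands = [(0, 100), (100, 500), (500, 2000)]
features = [band_feature(x, lo, hi) for lo, hi in bands]
print(features)   # one statistical coefficient per energy band
```

    A feature vector built this way depends on the waveform's shape statistics rather than its absolute level, which is one route to the range-independence the abstract emphasizes.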

  8. Design of Protease Activated Optical Contrast Agents That Exploit a Latent Lysosomotropic Effect for Use in Fluorescence-Guided Surgery

    PubMed Central

    2015-01-01

    There is a need for new molecular-guided contrast agents to enhance surgical procedures such as tumor resection that require a high degree of precision. Cysteine cathepsins are highly up-regulated in a wide variety of cancers, both in tumor cells and in the tumor-supporting cells of the surrounding stroma. Therefore, tools that can be used to dynamically monitor their activity in vivo could be used as imaging contrast agents for intraoperative fluorescence-guided surgery (FGS). Although multiple classes of cathepsin-targeted substrate probes have been reported, most suffer from overall fast clearance from sites of protease activation, leading to reduced signal intensity and duration in vivo. Here we describe the design and synthesis of a series of near-infrared fluorogenic probes that exploit a latent cationic lysosomotropic effect (LLE) to promote cellular retention upon protease activation. These probes show tumor-specific retention, fast activation kinetics, and rapid systemic distribution. We demonstrate that they are suitable for detection of diverse cancer types including breast, colon, and lung tumors. Most importantly, the agents are compatible with the existing, FDA-approved da Vinci surgical system for fluorescence-guided tumor resection. Therefore, our data suggest that the probes reported here can be used with existing clinical instrumentation to detect tumors, and potentially other types of inflammatory lesions, to guide surgical decision making in real time. PMID:26039341

  9. Processing Diabetes Mellitus Composite Events in MAGPIE.

    PubMed

    Brugués, Albert; Bromuri, Stefano; Barry, Michael; Del Toro, Óscar Jiménez; Mazurkiewicz, Maciej R; Kardas, Przemyslaw; Pegueroles, Josep; Schumacher, Michael

    2016-02-01

    The focus of this research is the definition of programmable expert Personal Health Systems (PHS) to monitor patients affected by chronic diseases, using agent-oriented programming and mobile computing to represent the interactions happening amongst the components of the system. The paper also discusses issues of knowledge representation within the medical domain when dealing with temporal patterns concerning the physiological values of the patient. In the presented agent-based PHS, doctors can personalize monitoring rules for each patient and define them in a graphical way. Furthermore, to achieve better scalability, the computations for monitoring the patients are distributed among their devices rather than being performed in a centralized server. The system is evaluated using data from 21 diabetic patients to detect temporal patterns according to a defined set of monitoring rules. The system's scalability is evaluated by comparing it with a centralized approach. The evaluation concerning the detection of temporal patterns highlights the system's ability to monitor chronic patients affected by diabetes. Regarding scalability, the results show that an approach exploiting mobile computing is more scalable than a centralized approach, and therefore more likely to satisfy the needs of next-generation PHSs. PHSs are becoming an adopted technology to deal with the surge of patients affected by chronic illnesses. This paper discusses architectural choices to make an agent-based PHS more scalable by using a distributed mobile computing approach. It also discusses how to model the medical knowledge in the PHS in such a way that it is modifiable at run time. The evaluation highlights the necessity of distributing the reasoning to the mobile part of the system, and that modifiable rules are able to deal with changes in the lifestyle of patients affected by chronic illnesses.
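
    A graphical rule editor is out of scope here, but the kind of per-patient temporal monitoring rule described above can be sketched as a small predicate over a reading series. The threshold, window length, and readings below are illustrative assumptions, not MAGPIE's implementation; the point is that such a rule is plain data (two parameters) and can be changed at run time without redeploying the monitor.

```python
# Sketch of a modifiable temporal monitoring rule (not MAGPIE's code):
# raise an alert when `n` consecutive glucose readings exceed `threshold`.
# Threshold, window, and readings are illustrative.
def consecutive_high(readings, threshold=180, n=3):
    """True if `n` consecutive readings exceed `threshold` (mg/dL)."""
    run = 0
    for value in readings:
        run = run + 1 if value > threshold else 0
        if run >= n:
            return True
    return False

glucose = [110, 150, 190, 200, 185, 140]   # one patient's reading series
print(consecutive_high(glucose))           # → True (190, 200, 185 all > 180)
```

    Running such a predicate on the patient's own device, rather than shipping every reading to a central server, is the distribution choice whose scalability the paper evaluates.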

  10. Applying acoustic telemetry to understand contaminant exposure and bioaccumulation patterns in mobile fishes.

    PubMed

    Taylor, Matthew D; van der Meulen, Dylan E; Brodie, Stephanie; Cadiou, Gwenaël; Knott, Nathan A

    2018-06-01

    Contamination in urbanised estuaries presents a risk to human health, and to the viability of populations of exploited species. Assessing animal movements in relation to contaminated areas may help to explain patterns in bioaccumulation, and assist in the effective management of health risks associated with consumption of exploited species. Using polychlorinated dibenzodioxin and polychlorinated dibenzofuran (PCDD/F) contamination in the Sydney Harbour estuary as a case study, we present a study that links movement patterns resolved using acoustic telemetry to the accumulation of contaminants in mobile fish on a multi-species basis. Fifty-four individuals across six exploited species (Sea Mullet Mugil cephalus; Luderick Girella tricuspidata; Yellowfin Bream Acanthopagrus australis; Silver Trevally Pseudocaranx georgianus; Mulloway Argyrosomus japonicus; Yellowtail Kingfish Seriola lalandi) were tagged with acoustic transmitters, and their movements tracked for up to 3 years. There was substantial inter-specific variation in fish distribution along the estuary. The proportion of distribution that overlapped with contaminated areas explained 84-98% of the inter-specific variation in lipid-standardised biota PCDD/F concentration. There was some seasonal variation in distribution along the estuary, but movement patterns indicated that Sea Mullet, Yellowfin Bream, Silver Trevally, and Mulloway were likely to be exposed to contaminated areas during the period of gonadal maturation. Acoustic telemetry allows examination of spatial and temporal patterns in exposure to contamination. When used alongside biota sampling and testing, this offers a powerful approach to assess exposure, bioaccumulation, and potential risks faced by different species, as well as human health risks associated with their consumption. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  11. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has left groundwater over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes the natural and social attributes of over-exploitation areas, and expounds their evaluation methods, including single-factor evaluation, multi-factor system analysis, and numerical methods. The different methods are also compared and analyzed. Taking Northern Weifang as an example, the paper then demonstrates the practicality of these appraisal methods.

  12. Cloud Based Earth Observation Data Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years the data produced daily by several private and public Earth Observation (EO) satellites has reached the order of tens of terabytes, representing for scientists and commercial application developers both a big opportunity for its exploitation and a challenge for its management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer scientists and application developers the means to access and use EO data in a quick and cost-effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, and (v) collaboration tools (e.g. forums, wikis, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas, and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) multi-cloud data discovery, (ii) multi-cloud data management and access, and (iii) multi-cloud application deployment.
This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.

  13. Exploiting Non-Markovianity for Quantum Control.

    PubMed

    Reich, Daniel M; Katz, Nadav; Koch, Christiane P

    2015-07-22

    Quantum technology, exploiting entanglement and the wave nature of matter, relies on the ability to accurately control quantum systems. Quantum control is often compromised by the interaction of the system with its environment since this causes loss of amplitude and phase. However, when the dynamics of the open quantum system is non-Markovian, amplitude and phase flow not only from the system into the environment but also back. Interaction with the environment is then not necessarily detrimental. We show that the back-flow of amplitude and phase can be exploited to carry out quantum control tasks that could not be realized if the system was isolated. The control is facilitated by a few strongly coupled, sufficiently isolated environmental modes. Our paradigmatic example considers a weakly anharmonic ladder with resonant amplitude control only, restricting realizable operations to SO(N). The coupling to the environment, when harnessed with optimization techniques, allows for full SU(N) controllability.

  14. Micro-UAV tracking framework for EO exploitation

    NASA Astrophysics Data System (ADS)

    Browning, David; Wilhelm, Joe; Van Hook, Richard; Gallagher, John

    2012-06-01

    Historically, the Air Force's research into aerial platforms for sensing systems has focused on low-, mid-, and high-altitude platforms. Though these systems are likely to comprise the majority of the Air Force's assets for the foreseeable future, they have limitations. Specifically, these platforms, their sensor packages, and their data exploitation software are unsuited for close-quarter surveillance, such as in alleys and inside of buildings. Micro-UAVs have been gaining in popularity, especially non-fixed-wing platforms such as quad-rotors. These platforms are much more appropriate for confined spaces. However, the types of video exploitation techniques that can be used effectively differ from those for the typical nadir-looking aerial platform. This paper discusses the creation of a framework for testing existing and new video exploitation algorithms, and describes a sample micro-UAV-based tracker.

  15. Biofuels for sustainable transportation

    DOT National Transportation Integrated Search

    2000-06-01

    Biomass is an attractive energy source for a number of reasons. First, it is renewable as long as it is properly managed. It is also more evenly distributed over the Earth's surface than are finite energy sources, and may be exploited using m...

  16. Maxwell: A semi-analytic 4D code for earthquake cycle modeling of transform fault systems

    NASA Astrophysics Data System (ADS)

    Sandwell, David; Smith-Konter, Bridget

    2018-05-01

    We have developed a semi-analytic approach (and computational code) for rapidly calculating 3D time-dependent deformation and stress caused by screw dislocations embedded within an elastic layer overlying a Maxwell viscoelastic half-space. The Maxwell model is developed in the Fourier domain to exploit the computational advantages of the convolution theorem, hence substantially reducing the computational burden associated with an arbitrarily complex distribution of force couples necessary for fault modeling. The new aspect of this development is the ability to model lateral variations in shear modulus. Ten benchmark examples are provided for testing and verification of the algorithms and code. One final example simulates interseismic deformation along the San Andreas Fault System, where lateral variations in shear modulus are included to simulate lateral variations in lithospheric structure.
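
    The computational advantage being exploited is the convolution theorem: convolving a source distribution with a response kernel in space equals a pointwise product in the Fourier domain, turning an O(n²) sum into an O(n log n) transform. A minimal 1D sketch follows; the arrays are illustrative, not fault models.

```python
import numpy as np

# Convolution-theorem sketch: circular convolution of a source (force-
# couple) distribution with a response kernel, done directly and via FFT.
n = 64
rng = np.random.default_rng(1)
source = rng.standard_normal(n)            # e.g. body-force distribution
kernel = np.exp(-np.arange(n) / 5.0)       # e.g. decaying elastic response

# Direct circular convolution, O(n^2):
direct = np.array([sum(source[j] * kernel[(i - j) % n] for j in range(n))
                   for i in range(n)])

# Fourier-domain pointwise product, O(n log n):
fast = np.fft.ifft(np.fft.fft(source) * np.fft.fft(kernel)).real

print(np.allclose(direct, fast))   # → True
```

    In the 3D code the same identity is applied per wavenumber with the analytic layered-half-space response, which is what makes an arbitrarily complex force-couple distribution cheap to evaluate.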

  17. A Scalable Data Access Layer to Manage Structured Heterogeneous Biomedical Data

    PubMed Central

    Lianas, Luca; Frexia, Francesca; Zanetti, Gianluigi

    2016-01-01

    This work presents a scalable data access layer, called PyEHR, designed to support the implementation of data management systems for secondary use of structured heterogeneous biomedical and clinical data. PyEHR adopts the openEHR’s formalisms to guarantee the decoupling of data descriptions from implementation details and exploits structure indexing to accelerate searches. Data persistence is guaranteed by a driver layer with a common driver interface. Interfaces for two NoSQL Database Management Systems are already implemented: MongoDB and Elasticsearch. We evaluated the scalability of PyEHR experimentally through two types of tests, called “Constant Load” and “Constant Number of Records”, with queries of increasing complexity on synthetic datasets of ten million records each, containing very complex openEHR archetype structures, distributed on up to ten computing nodes. PMID:27936191

  18. High-Dimensional Circular Quantum Secret Sharing Using Orbital Angular Momentum

    NASA Astrophysics Data System (ADS)

    Tang, Dawei; Wang, Tie-jun; Mi, Sichen; Geng, Xiao-Meng; Wang, Chuan

    2016-11-01

    Quantum secret sharing distributes a secret message securely between multiple parties. Here, exploiting the orbital angular momentum (OAM) state of single photons as the information carrier, we propose a high-dimensional circular quantum secret sharing protocol which greatly increases the channel capacity. In the proposed protocol, the secret message is split into two parts, each encoded on the OAM state of single photons. The security of the protocol is guaranteed by the no-cloning theorem, and the secret message cannot be recovered unless the two receivers collaborate with each other. Moreover, the proposed protocol can be extended to higher-dimensional quantum systems for enhanced security.
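
    The two-part splitting step has a simple classical analogue, sketched below: the message is divided into two shares such that neither share alone reveals anything, and both receivers must collaborate to recover it. This shows only the two-out-of-two secret-splitting logic; the protocol itself encodes the shares on photonic OAM states.

```python
import secrets

# Classical two-out-of-two secret splitting (XOR shares). Either share
# alone is uniformly random and independent of the message; combined,
# they recover it exactly.
message = b"meet at dawn"
share1 = secrets.token_bytes(len(message))              # uniformly random share
share2 = bytes(m ^ s for m, s in zip(message, share1))  # message XOR share1

recovered = bytes(a ^ b for a, b in zip(share1, share2))
print(recovered == message)   # → True
```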

  19. Antiferromagnetic MnN layer on the MnGa(001) surface

    NASA Astrophysics Data System (ADS)

    Guerrero-Sánchez, J.; Takeuchi, Noboru

    2016-12-01

    Spin-polarized first-principles total energy calculations have been applied to study the stability and magnetic properties of the MnGa(001) surface and the formation of a topmost MnN layer upon nitrogen deposition. Before nitrogen adsorption, surface formation energies show a stable gallium-terminated ferromagnetic surface. After incorporation of nitrogen atoms, the antiferromagnetic manganese-terminated surface becomes stable due to the formation of a MnN layer (Mn-N bonding at the surface). The spin density distribution shows a ferromagnetic/antiferromagnetic arrangement in the first surface layers. This thermodynamically stable structure may be exploited to grow MnGa/MnN magnetic heterostructures as well as to search for exchange-biased systems.

  20. Measurement-device-independent quantum cryptography

    DOE PAGES

    Xu, Feihu; Curty, Marcos; Qi, Bing; ...

    2014-12-18

    In theory, quantum key distribution (QKD) provides information-theoretic security based on the laws of physics. Owing to the imperfections of real-life implementations, however, there is a big gap between the theory and practice of QKD, which has recently been exploited by several quantum hacking activities. To fill this gap, a novel approach called measurement-device-independent QKD (mdiQKD) has been proposed. It can remove all side-channels from the measurement unit, arguably the most vulnerable part in QKD systems, thus offering a clear avenue toward secure QKD realisations. In this study, we review the latest developments in the framework of mdiQKD, together with its assumptions, strengths, and weaknesses.

  1. The perspective of the permanent monitoring with an FBG sensor network in oil and gas production in China

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanzhong; Xiao, Lizhi; Fu, Jianwei; Chen, Haifeng; Zhao, Xiaoliang

    2005-12-01

    Most of the onshore oilfields in China are in the middle and late development stages, and a great deal of residual oil awaits exploitation. Downhole permanent sensor monitoring technology is an effective means to enhance oil and gas recovery. The concept of the downhole permanent sensor network is introduced, and its research status is reviewed. The measurement principle, applications, and some issues of the Distributed Temperature System (DTS) and Fiber Bragg Grating (FBG) sensors are discussed. Some potential applications of permanent monitoring with FBG sensors in oil and gas production, including enhancing oil and gas recovery and real-time monitoring of casing damage, are also reviewed.

  2. Twin Jet

    NASA Technical Reports Server (NTRS)

    Henderson, Brenda; Bozak, Rick

    2010-01-01

    Many subsonic and supersonic vehicles in the current fleet have multiple engines mounted near one another. Some future vehicle concepts may use innovative propulsion systems such as distributed propulsion which will result in multiple jets mounted in close proximity. Engine configurations with multiple jets have the ability to exploit jet-by-jet shielding which may significantly reduce noise. Jet-by-jet shielding is the ability of one jet to shield noise that is emitted by another jet. The sensitivity of jet-by-jet shielding to jet spacing and simulated flight stream Mach number are not well understood. The current experiment investigates the impact of jet spacing, jet operating condition, and flight stream Mach number on the noise radiated from subsonic and supersonic twin jets.

  3. Countermeasure Study on Deep-sea Oil Exploitation in the South China Sea——A Comparison between Deep-sea Oil Exploitation in the South China Sea and the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Qiu, Weiting; Qu, Weilu

    2018-02-01

    The unpromising situation of terrestrial oil resources has made the deep-sea oil industry an important development strategy. The South China Sea has a vast sea area with widely distributed oil and gas resources, but exploration and census rates are low and oil exploitation is limited. To address these problems, this article analyzes the geology, oil and gas exploration, and exploration equipment of the South China Sea and the Gulf of Mexico. Comparing the political environment of the Chinese and United States energy industries and the economic environment of their oil companies, the article points out problems that may exist in China's deep-sea oil exploration and extraction. Finally, it assesses the feasibility of oil exploration and exploitation in the South China Sea, providing a reference for improving the conditions of oil exploration in the South China Sea and promoting the stable development of China's oil industry.

  4. Exploiting vibrational resonance in weak-signal detection

    NASA Astrophysics Data System (ADS)

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.
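
    A minimal numerical sketch of the vibrational-resonance effect described above, assuming a hard-threshold detector and invented amplitudes and frequencies: a sub-threshold low-frequency signal produces no detector output on its own, an intermediate-amplitude high-frequency interference makes it detectable, and an excessive interference washes the modulation out.

```python
import numpy as np

# Deterministic sketch of vibrational resonance with a hard-threshold
# detector. Amplitudes, frequencies, and the threshold are illustrative.
fs = 20000.0                                   # sample rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
f_sig, f_hf = 1.0, 40.0                        # weak signal / interference (Hz)
weak = 0.2 * np.sin(2 * np.pi * f_sig * t)     # sub-threshold signal
template = np.sin(2 * np.pi * f_sig * t)
threshold = 1.0

def response(b):
    """Correlation of the thresholded mixture with the signal template."""
    hf = b * np.sin(2 * np.pi * f_hf * t)
    y = (weak + hf > threshold).astype(float)
    return 2.0 * np.mean(y * template)

# b = 0: the weak signal alone never crosses the threshold (score 0).
# b = 1: threshold crossings become strongly signal-modulated.
# b = 8: crossings barely depend on the signal, so the score collapses.
scores = {b: response(b) for b in [0.0, 1.0, 8.0]}
print(scores)
```

    The non-monotonic dependence on the interference amplitude, with a maximum at an intermediate value, is the resonance the paper exploits; the paper's detector and noise model are more general than this hard threshold.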

  5. Quantitative relations between fishing mortality, spawning stress mortality and biomass growth rate (computed with numerical model FISHMO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laevastu, T.

    1983-01-01

    The effects of fishing on a given species' biomass have been quantitatively evaluated. Constant recruitment is assumed in this study, but the evaluation can be computed on any known age distribution of exploitable biomass. Fishing mortality is assumed to be constant with age; however, spawning stress mortality increases with age. When fishing (mortality) increases, spawning stress mortality decreases relative to total and exploitable biomasses. These changes are quantitatively shown for two species from the Bering Sea: walleye pollock, Theragra chalcogramma, and yellowfin sole, Limanda aspera.
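
    The competing-mortality bookkeeping can be sketched as follows. This toy cohort model is illustrative only (the rates and the competing-risks apportionment rule are assumptions, not FISHMO): fishing mortality F is constant with age, spawning-stress mortality increases with age, and the share of deaths attributable to spawning stress falls as F rises, because heavier fishing removes fish before they reach the high-stress older ages.

```python
import math

# Toy cohort model (not FISHMO): fishing mortality F is constant with age,
# spawning-stress mortality S(age) increases with age, and deaths are
# apportioned between causes by the competing-risks rule (rate / total rate).
def spawning_stress_share(F, max_age=10):
    M = 0.2                                   # other natural mortality (illustrative)
    N = 1.0                                   # cohort size at recruitment
    stress_deaths = total_deaths = 0.0
    for age in range(1, max_age + 1):
        S = 0.05 * age                        # age-increasing spawning stress rate
        Z = F + M + S                         # total instantaneous mortality
        deaths = N * (1.0 - math.exp(-Z))
        stress_deaths += deaths * S / Z       # stress's share of this year's deaths
        total_deaths += deaths
        N *= math.exp(-Z)                     # survivors entering the next age
    return stress_deaths / total_deaths

# Heavier fishing lowers the relative contribution of spawning stress.
print(spawning_stress_share(0.1), spawning_stress_share(0.8))
```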

  6. Exploiting vibrational resonance in weak-signal detection.

    PubMed

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.

  7. Small-x asymptotics of the quark helicity distribution: Analytic results

    DOE PAGES

    Kovchegov, Yuri V.; Pitonyak, Daniel; Sievert, Matthew D.

    2017-06-15

    In this Letter, we analytically solve the evolution equations for the small-x asymptotic behavior of the (flavor singlet) quark helicity distribution in the large-Nc limit. These evolution equations form a set of coupled integro-differential equations, which previously could only be solved numerically. This approximate numerical solution, however, revealed simplifying properties of the small-x asymptotics, which we exploit here to obtain an analytic solution.

  8. The growth and exploitation rate of yellowstripe scad (selaroides leptolepis cuvier, 1833) in the Malacca Strait, Medan Belawan Subdistrict, North Sumatera Province

    NASA Astrophysics Data System (ADS)

    Tambun, J.; Bakti, D.; Desrita

    2018-02-01

    Yellowstripe scad is one of the commodities of important economic value in the Malacca Strait. The abundance of this fish in Indonesian waters has made it one of the main catch targets, which can have a negative impact on its population. The study was conducted in Belawan waters from March to May 2017 with the aims of describing the length-frequency distribution, estimating the growth parameters, and determining the mortality and exploitation rates, in order to propose an appropriate management model for this fish resource. Around 360 samples of yellowstripe scad were observed, with lengths ranging from 110 to 175 mm. Cohorts were separated by the Bhattacharya method using the FISAT II software. Yellowstripe scad showed a negative allometric growth pattern, with a growth coefficient (K) of 1.1 and an asymptotic length (L∞) of 181.65 mm. The total mortality rate (Z) was 4.34 per year, with a natural mortality rate (M) of 1.204 per year and a fishing mortality rate (F) of 3.136 per year, giving an exploitation rate of 0.722. This exploitation rate exceeds the optimum value of 0.5.
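    The mortality and exploitation figures in this abstract are related by the standard fisheries identities Z = M + F and E = F/Z; a minimal sketch reproducing the quoted values (the function name is ours, not from the study):

```python
# Standard fisheries relations: total mortality Z = M + F and
# exploitation rate E = F / Z. Values are taken from the abstract above.

def exploitation_rate(M: float, F: float) -> float:
    """Exploitation rate E = F / (M + F)."""
    return F / (M + F)

M = 1.204  # natural mortality, per year
F = 3.136  # fishing mortality, per year

Z = M + F
E = exploitation_rate(M, F)
print(f"Z = {Z:.2f} per year")  # Z = 4.34 per year
print(f"E = {E:.3f}")           # E = 0.723 (the abstract rounds to 0.722)
```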

  9. Geostatistics and remote sensing as predictive tools of tick distribution: a cokriging system to estimate Ixodes scapularis (Acari: Ixodidae) habitat suitability in the United States and Canada from advanced very high resolution radiometer satellite imagery.

    PubMed

    Estrada-Peña, A

    1998-11-01

    Geostatistics (cokriging) was used to model the cross-correlated information between satellite-derived vegetation and climate variables and the distribution of the tick Ixodes scapularis (Say) in the Nearctic. Output was used to map the habitat suitability for I. scapularis on a continental scale. A data base of the localities where I. scapularis was collected in the United States and Canada was developed from a total of 346 published and geocoded records. This data base was cross-correlated with satellite imagery from the advanced very high resolution radiometer sensor, obtained over the Nearctic from 1984 to 1994 at 10-d intervals, with a resolution of 8 km per pixel. Eight climate and vegetation variables were tabulated from this imagery. A cokriging system was generated to exploit satellite-derived data and to estimate the distribution of I. scapularis. Results obtained using 2 vegetation (standard NDVI) and 4 temperature variables closely agreed with actual records of the tick, with a sensitivity of 0.97 and a specificity of 0.89, with 6 and 4% false-positive and false-negative sites, respectively. Such statistical analysis can be used to guide field work toward the correct interpretation of the distribution limits of I. scapularis and can also be used to make predictions about the impact of global change on tick range.

  10. Photon-Number-Resolving Transition-Edge Sensors for the Metrology of Quantum Light Sources

    NASA Astrophysics Data System (ADS)

    Schmidt, M.; von Helversen, M.; López, M.; Gericke, F.; Schlottmann, E.; Heindel, T.; Kück, S.; Reitzenstein, S.; Beyer, J.

    2018-05-01

    Low-temperature photon-number-resolving detectors allow for direct access to the photon number distribution of quantum light sources and can thus be exploited to explore the photon statistics of, e.g., solid-state-based non-classical light sources. In this work, we report on the setup and calibration of a detection system based on fiber-coupled tungsten transition-edge sensors (W-TESs). Our stand-alone system comprises two W-TESs, read out by two 2-stage-SQUID current sensors, operated in a compact detector unit that is integrated in an adiabatic demagnetization refrigerator. Fast low-noise analog amplifiers and digitizers are used for signal acquisition. The detection efficiency of the single-mode fiber-coupled detector system in the spectral region of interest (850-950 nm) is determined to be larger than 87%. The presented detector system opens up new routes in the characterization of quantum light sources for quantum information, quantum-enhanced sensing, and quantum metrology.

  11. AI and simulation: What can they learn from each other

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano P.

    1988-01-01

    Simulation and Artificial Intelligence share a fertile common ground, both from a practical and from a conceptual point of view. Strengths and weaknesses of both Knowledge Based Systems and Modeling and Simulation are examined, and three types of systems that combine the strengths of both technologies are discussed. These types of systems are a practical starting point; however, the real strengths of both technologies will be exploited only when they are combined in a common knowledge representation paradigm. From an even deeper conceptual point of view, one might argue that the ability to reason from a set of facts (i.e., an Expert System) is less representative of human reasoning than the ability to make a model of the world, change it as required, and derive conclusions about the expected behavior of world entities. This is a fundamental problem in AI, and Modeling Theory can contribute to its solution. The application of Knowledge Engineering technology to a Distributed Processing Network Simulator (DPNS) is discussed.

  12. Polarimetric Intensity Parameterization of Radar and Other Remote Sensing Sources for Advanced Exploitation and Data Fusion: Theory

    DTIC Science & Technology

    2008-10-01

    is theoretically similar to the concept of “partial or compact polarimetry”, yields comparable results to full or quadrature-polarized systems by...to the emerging “compact polarimetry” methodology [9]-[13] that exploits scattering system response to an incomplete set of input EM field components...a scattering operator or matrix. Although as theoretically discussed earlier, performance of such fully-polarized radar system (i.e., quadrature

  13. Odor Landscapes in Turbulent Environments

    NASA Astrophysics Data System (ADS)

    Celani, Antonio; Villermaux, Emmanuel; Vergassola, Massimo

    2014-10-01

    The olfactory system of male moths is exquisitely sensitive to pheromones emitted by females and transported in the environment by atmospheric turbulence. Moths respond to minute amounts of pheromones, and their behavior is sensitive to the fine-scale structure of turbulent plumes where pheromone concentration is detectible. The signal of pheromone whiffs is qualitatively known to be intermittent, yet quantitative characterization of its statistical properties is lacking. This challenging fluid dynamics problem is also relevant for entomology, neurobiology, and the technological design of olfactory stimulators aimed at reproducing physiological odor signals in well-controlled laboratory conditions. Here, we develop a Lagrangian approach to the transport of pheromones by turbulent flows and exploit it to predict the statistics of odor detection during olfactory searches. The theory yields explicit probability distributions for the intensity and the duration of pheromone detections, as well as their spacing in time. Predictions are favorably tested by using numerical simulations, laboratory experiments, and field data for the atmospheric surface layer. The resulting signal of odor detections lends itself to implementation with state-of-the-art technologies and quantifies the amount and the type of information that male moths can exploit during olfactory searches.

  14. Exploiting range imagery: techniques and applications

    NASA Astrophysics Data System (ADS)

    Armbruster, Walter

    2009-07-01

    Practically no applications exist for which automatic processing of 2D intensity imagery can equal human visual perception. This is not the case for range imagery. The paper gives examples of 3D laser radar applications, for which automatic data processing can exceed human visual cognition capabilities and describes basic processing techniques for attaining these results. The examples are drawn from the fields of helicopter obstacle avoidance, object detection in surveillance applications, object recognition at high range, multi-object-tracking, and object re-identification in range image sequences. Processing times and recognition performances are summarized. The techniques used exploit the bijective continuity of the imaging process as well as its independence of object reflectivity, emissivity and illumination. This allows precise formulations of the probability distributions involved in figure-ground segmentation, feature-based object classification and model based object recognition. The probabilistic approach guarantees optimal solutions for single images and enables Bayesian learning in range image sequences. Finally, due to recent results in 3D-surface completion, no prior model libraries are required for recognizing and re-identifying objects of quite general object categories, opening the way to unsupervised learning and fully autonomous cognitive systems.

  15. Evaluation methodology for query-based scene understanding systems

    NASA Astrophysics Data System (ADS)

    Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.

    2015-05-01

    In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.

  16. Vienna FORTRAN: A FORTRAN language extension for distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Zima, Hans

    1991-01-01

    Exploiting the performance potential of distributed memory machines requires a careful distribution of data across the processors. Vienna FORTRAN is a language extension of FORTRAN which provides the user with a wide range of facilities for such mapping of data structures. However, programs in Vienna FORTRAN are written using global data references. Thus, the user has the advantage of a shared memory programming paradigm while explicitly controlling the placement of data. The basic features of Vienna FORTRAN are presented along with a set of examples illustrating the use of these features.

  17. Programming in Vienna Fortran

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Zima, Hans

    1992-01-01

    Exploiting the full performance potential of distributed memory machines requires a careful distribution of data across the processors. Vienna Fortran is a language extension of Fortran which provides the user with a wide range of facilities for such mapping of data structures. In contrast to current programming practice, programs in Vienna Fortran are written using global data references. Thus, the user has the advantages of a shared memory programming paradigm while explicitly controlling the data distribution. In this paper, we present the language features of Vienna Fortran for FORTRAN 77, together with examples illustrating the use of these features.

  18. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  19. Precise Distances for Main-belt Asteroids in Only Two Nights

    NASA Astrophysics Data System (ADS)

    Heinze, Aren N.; Metchev, Stanimir

    2015-10-01

    We present a method for calculating precise distances to asteroids using only two nights of data from a single location—far too little for an orbit—by exploiting the angular reflex motion of the asteroids due to Earth’s axial rotation. We refer to this as the rotational reflex velocity method. While the concept is simple and well-known, it has not been previously exploited for surveys of main belt asteroids (MBAs). We offer a mathematical development, estimates of the errors of the approximation, and a demonstration using a sample of 197 asteroids observed for two nights with a small, 0.9-m telescope. This demonstration used digital tracking to enhance detection sensitivity for faint asteroids, but our distance determination works with any detection method. Forty-eight asteroids in our sample had known orbits prior to our observations, and for these we demonstrate a mean fractional error of only 1.6% between the distances we calculate and those given in ephemerides from the Minor Planet Center. In contrast to our two-night results, distance determination by fitting approximate orbits requires observations spanning 7-10 nights. Once an asteroid’s distance is known, its absolute magnitude and size (given a statistically estimated albedo) may immediately be calculated. Our method will therefore greatly enhance the efficiency with which 4m and larger telescopes can probe the size distribution of small (e.g., 100 m) MBAs. This distribution remains poorly known, yet encodes information about the collisional evolution of the asteroid belt—and hence the history of the Solar System.
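    The geometry behind the method can be caricatured in a few lines: an observer at latitude φ is carried by Earth's rotation at speed v = ωR⊕cos φ, and an asteroid at distance d shows a reflex angular rate of order v/d, so measuring that rate yields the distance. The sketch below is a hypothetical simplification that ignores projection effects and the asteroid's own motion, which the actual method models:

```python
import math

# Simplified rotational-reflex-velocity idea: observer velocity from
# Earth's rotation is v = omega * R_earth * cos(latitude); an asteroid
# at distance d shows an apparent reflex angular rate ~ v / d, so the
# rate can be inverted to recover d. Projection effects and the
# asteroid's own motion are ignored in this toy version.

OMEGA = 2.0 * math.pi / 86164.0   # Earth's sidereal rotation rate, rad/s
R_EARTH = 6.371e6                 # Earth radius, m
AU = 1.495978707e11               # astronomical unit, m

def reflex_rate(distance_m: float, latitude_deg: float) -> float:
    """Apparent angular rate (rad/s) induced purely by Earth's rotation."""
    v = OMEGA * R_EARTH * math.cos(math.radians(latitude_deg))
    return v / distance_m

def distance_from_rate(rate_rad_s: float, latitude_deg: float) -> float:
    """Invert the reflex rate to recover the distance (m)."""
    v = OMEGA * R_EARTH * math.cos(math.radians(latitude_deg))
    return v / rate_rad_s

# Round trip at 1.8 au (a typical main-belt distance), observer at 32 deg.
d_true = 1.8 * AU
rate = reflex_rate(d_true, 32.0)
d_est = distance_from_rate(rate, 32.0)
print(f"recovered distance: {d_est / AU:.2f} au")  # 1.80 au
```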

  20. Automating usability of ATLAS Distributed Computing resources

    NASA Astrophysics Data System (ADS)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both tasks: providing global monitoring and performing automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows the storage resources' status to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
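    The inference SAAB performs over monitoring history can be caricatured with a far simpler rule (a hypothetical stand-in for SAAB's actual algorithm): blacklist a storage area after N consecutive test failures and clear it after N consecutive passes, staying undecided otherwise:

```python
# Much-simplified sketch of history-based automatic blacklisting in the
# spirit of SAAB. The rule below (blacklist after n consecutive test
# failures, clear after n consecutive passes) is a hypothetical
# stand-in for SAAB's actual inference algorithm.

def evaluate(history, n=3):
    """Return 'blacklist', 'ok', or 'undecided' from a chronological
    list of monitoring-test outcomes (True = pass, False = fail)."""
    if len(history) < n:
        return "undecided"
    tail = history[-n:]
    if not any(tail):       # last n tests all failed
        return "blacklist"
    if all(tail):           # last n tests all passed
        return "ok"
    return "undecided"

print(evaluate([True, True, False, False, False]))  # blacklist
print(evaluate([False, True, True, True]))          # ok
```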

  1. Liking and hyperlinking: Community detection in online child sexual exploitation networks.

    PubMed

    Westlake, Bryce G; Bouchard, Martin

    2016-09-01

    The online sexual exploitation of children is facilitated by websites that form virtual communities, via hyperlinks, to distribute images, videos, and other material. However, how these communities form, are structured, and evolve over time is unknown. Collected using a custom-designed webcrawler, we begin from known child sexual exploitation (CE) seed websites and follow hyperlinks to connected, related, websites. Using a repeated measure design we analyze 10 networks of 300+ websites each (over 4.8 million unique webpages in total) over a period of 60 weeks. Community detection techniques reveal that CE-related networks were dominated by two large communities hosting varied material - not necessarily matching the seed website. Community stability, over 60 weeks, varied across networks. Reciprocity in hyperlinking between community members was substantially higher than within the full network; however, websites were not more likely to connect to homogeneous-content websites. Copyright © 2016 Elsevier Inc. All rights reserved.
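    The reciprocity measure discussed above - the fraction of directed hyperlinks whose reverse link also exists - can be computed directly from an edge list. A toy sketch (not the study's code or data):

```python
# Reciprocity of a directed hyperlink network: the fraction of directed
# edges (u, v) for which the reverse edge (v, u) also exists. The edge
# list below is invented for illustration.

def reciprocity(edges):
    """Fraction of directed edges that are reciprocated."""
    edge_set = set(edges)
    if not edge_set:
        return 0.0
    mutual = sum(1 for (u, v) in edge_set if (v, u) in edge_set)
    return mutual / len(edge_set)

links = [("A", "B"), ("B", "A"), ("A", "C"), ("C", "D"), ("D", "C")]
print(reciprocity(links))  # 4 of 5 links are reciprocated -> 0.8
```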

  2. Operational Decision Aids for Exploiting or Mitigating Electromagnetic Propagation Effects

    DTIC Science & Technology

    1989-09-01

    Exploitation or mitigation of environmental effects ranks equal in importance with weapons systems. The rapidly changing propagation environment ...global in nature. It not only involves the ocean environment from the tropics to the poles, but also the coastal and land environments. Some of the...tactics must take environmental conditions into account and either mitigate or exploit their effects. There are many environmental factors that influence

  3. Optical sensors for electrical elements of a medium voltage distribution network

    NASA Astrophysics Data System (ADS)

    De Maria, Letizia; Bartalesi, Daniele; Serragli, Paolo; Paladino, Domenico

    2012-04-01

    The aging of most of the components of the National transmission and distribution system can potentially influence the reliability of power supply in a Medium Voltage (MV) network. In order to prevent possible dangerous situations, selected diagnostic indicators on electrical parts exploiting reliable and potentially low-cost sensors are required. This paper presents results concerning two main research activities regarding the development and application of innovative optical sensors for the diagnostic of MV electrical components. The first concerns a multi-sensor prototype for the detection of pre-discharges in MV switchboards: it is the combination of three different types of sensors operating simultaneously to detect incipient failure and to reduce the occurrence of false alarms. The system is real-time controlled by an embedded computer through a LabView interface. The second activity refers to a diagnostic tool to provide significant real-time information about early aging of MV/Low Voltage (LV) transformers by means of its vibration fingerprint. A miniaturized Optical Micro-Electro-Mechanical System (MEMS) based unit has been assembled for vibration measurements, wireless connected to a remote computer and controlled via LabView interface. Preliminary comparative tests were carried out with standard piezoelectric accelerometers on a conventional MV/LV test transformer under open circuit and in short-circuited configuration.

  4. Large expansion of oil industry in the Ecuadorian Amazon: biodiversity vulnerability and conservation alternatives.

    PubMed

    Lessmann, Janeth; Fajardo, Javier; Muñoz, Jesús; Bonaccorso, Elisa

    2016-07-01

    Ecuador will experience a significant expansion of the oil industry in its Amazonian region, one of the most biodiverse areas of the world. In view of the changes that are about to come, we explore the conflicts between oil extraction interests and biodiversity protection and apply systematic conservation planning to identify priority areas that should be protected in different oil exploitation scenarios. First, we quantified the current extent of oil blocks and protected zones and their overlap with two biodiversity indicators: 25 ecosystems and 745 species (whose distributions were estimated via species distribution models). With the new scheme of oil exploitation, oil blocks cover 68% (68,196 km²) of the Ecuadorian Amazon; half of it occupied by new blocks open for bids in the southern Amazon. This region is especially vulnerable to biodiversity losses, because peaks of species diversity, 19 ecosystems, and a third of its protected zones coincide spatially with oil blocks. Under these circumstances, we used Marxan software to identify priority areas for conservation outside oil blocks, but their coverage was insufficient to completely represent biodiversity. Instead, priority areas that include southern oil blocks provide a higher representation of biodiversity indicators. Therefore, preserving the southern Amazon becomes essential to improve the protection of Amazonian biodiversity in Ecuador, and avoiding oil exploitation in these areas (33% of the extent of southern oil blocks) should be considered a conservation alternative. Also, it is highly recommended to improve current oil exploitation technology to reduce environmental impacts in the region, especially within five oil blocks that we identified as most valuable for the conservation of biodiversity. The application of these and other recommendations depends heavily on the Ecuadorian government, which needs to find a better balance between the use of the Amazon resources and biodiversity conservation.

  5. Taking movement data to new depths: Inferring prey availability and patch profitability from seabird foraging behavior.

    PubMed

    Chimienti, Marianna; Cornulier, Thomas; Owen, Ellie; Bolton, Mark; Davies, Ian M; Travis, Justin M J; Scott, Beth E

    2017-12-01

    Detailed information acquired using tracking technology has the potential to provide accurate pictures of the types of movements and behaviors performed by animals. To date, such data have not been widely exploited to provide inferred information about the foraging habitat. We collected data using multiple sensors (GPS, time depth recorders, and accelerometers) from two species of diving seabirds, razorbills (Alca torda, N = 5, from Fair Isle, UK) and common guillemots (Uria aalge, N = 2 from Fair Isle and N = 2 from Colonsay, UK). We used a clustering algorithm to identify pursuit and catching events and the time spent pursuing and catching underwater, which we then used as indicators for inferring prey encounters throughout the water column and responses to changes in prey availability of the areas visited at two levels: individual dives and groups of dives. For each individual dive (N = 661 for guillemots, 6214 for razorbills), we modeled the number of pursuit and catching events in relation to dive depth, duration, and type of dive performed (benthic vs. pelagic). For groups of dives (N = 58 for guillemots, 156 for razorbills), we modeled the total time spent pursuing and catching in relation to time spent underwater. Razorbills performed only pelagic dives, most likely exploiting prey available at shallow depths as indicated by the vertical distribution of pursuit and catching events. In contrast, guillemots were more flexible in their behavior, switching between benthic and pelagic dives. Capture attempt rates indicated that they were exploiting deep prey aggregations. The study highlights how novel analysis of movement data can give new insights into how animals exploit food patches, offering a unique opportunity to comprehend the behavioral ecology behind different movement patterns and understand how animals might respond to changes in prey distributions.

  6. Life-history plasticity and sustainable exploitation: a theory of growth compensation applied to walleye management.

    PubMed

    Lester, Nigel P; Shuter, Brian J; Venturelli, Paul; Nadeau, Daniel

    2014-01-01

    A simple population model was developed to evaluate the role of plastic and evolutionary life-history changes on sustainable exploitation rates. Plastic changes are embodied in density-dependent compensatory adjustments to somatic growth rate and larval/juvenile survival, which can compensate for the reductions in reproductive lifetime and mean population fecundity that accompany the higher adult mortality imposed by exploitation. Evolutionary changes are embodied in the selective pressures that higher adult mortality imposes on age at maturity, length at maturity, and reproductive investment. Analytical development, based on a biphasic growth model, led to simple equations that show explicitly how sustainable exploitation rates are bounded by each of these effects. We show that density-dependent growth combined with a fixed length at maturity and fixed reproductive investment can support exploitation-driven mortality that is 80% of the level supported by evolutionary changes in maturation and reproductive investment. Sustainable fishing mortality is proportional to natural mortality (M) times the degree of density-dependent growth, as modified by both the degree of density-dependent early survival and the minimum harvestable length. We applied this model to estimate sustainable exploitation rates for North American walleye populations (Sander vitreus). Our analysis of demographic data from walleye populations spread across a broad latitudinal range indicates that density-dependent variation in growth rate can vary by a factor of 2. Implications of this growth response are generally consistent with empirical studies suggesting that optimal fishing mortality is approximately 0.75M for teleosts. This approach can be adapted to the management of other species, particularly when significant exploitation is imposed on many, widely distributed, but geographically isolated populations.
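    The paper's rule of thumb for teleosts translates into a one-line calculation; the function and the example M value below are illustrative, not results from the paper:

```python
# Rule of thumb quoted above: optimal fishing mortality for teleosts is
# approximately 0.75 * M. The scale factor is a stand-in for the paper's
# analytical bounds, which depend on density-dependent growth.

def optimal_fishing_mortality(M: float, scale: float = 0.75) -> float:
    """Rule-of-thumb sustainable fishing mortality."""
    return scale * M

# Hypothetical walleye population with natural mortality M = 0.3 per year.
F_opt = optimal_fishing_mortality(0.3)
print(f"F_opt = {F_opt:.3f} per year")  # 0.225
```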

  7. An EMSO data case study within the INDIGO-DC project

    NASA Astrophysics Data System (ADS)

    Monna, Stephen; Marcucci, Nicola M.; Marinaro, Giuditta; Fiore, Sandro; D'Anca, Alessandro; Antonacci, Marica; Beranzoli, Laura; Favali, Paolo

    2017-04-01

    We present our experience based on a case study within the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) project (www.indigo-datacloud.eu). The aim of INDIGO-DC is to develop a data and computing platform targeting scientific communities. Our case study is an example of activities performed by INGV using data from seafloor observatories that are nodes of the infrastructure EMSO (European Multidisciplinary Seafloor and water column Observatory)-ERIC (www.emso-eu.org). EMSO is composed of several deep-seafloor and water column observatories, deployed at key sites in the European waters, thus forming a widely distributed pan-European infrastructure. In our case study we consider data collected by the NEMO-SN1 observatory, one of the EMSO nodes used for geohazard monitoring, located in the Western Ionian Sea in proximity of Etna volcano. Starting from the case study, through an agile approach, we defined some requirements for INDIGO developers, and tested some of the proposed INDIGO solutions that are of interest for our research community. Given that EMSO is a distributed infrastructure, we are interested in INDIGO solutions that allow access to distributed data storage. Access should be both user-oriented and machine-oriented, and should use a common identity and access system. For this purpose, we have been testing: - ONEDATA (https://onedata.org), as a global data management system. - INDIGO-IAM as the Identity and Access Management system. Another aspect we are interested in is efficient data processing, and we have focused on two types of INDIGO products: - Ophidia (http://ophidia.cmcc.it), a big data analytics framework for eScience for the analysis of multidimensional data. - A collection of INDIGO Services to run processes for scientific computing through the INDIGO Orchestrator.

  8. Distributed adaptive diagnosis of sensor faults using structural response data

    NASA Astrophysics Data System (ADS)

    Dragos, Kosmas; Smarsly, Kay

    2016-10-01

    The reliability and consistency of wireless structural health monitoring (SHM) systems can be compromised by sensor faults, leading to miscalibrations, corrupted data, or even data loss. Several research approaches towards fault diagnosis, referred to as ‘analytical redundancy’, have been proposed that analyze the correlations between different sensor outputs. In wireless SHM, most analytical redundancy approaches require centralized data storage on a server for data analysis, while other approaches exploit the on-board computing capabilities of wireless sensor nodes, analyzing the raw sensor data directly on board. However, using raw sensor data poses an operational constraint due to the limited power resources of wireless sensor nodes. In this paper, a new distributed autonomous approach towards sensor fault diagnosis based on processed structural response data is presented. The inherent correlations among Fourier amplitudes of acceleration response data, at peaks corresponding to the eigenfrequencies of the structure, are used for diagnosis of abnormal sensor outputs at a given structural condition. Representing an entirely data-driven analytical redundancy approach that does not require any a priori knowledge of the monitored structure or of the SHM system, artificial neural networks (ANN) are embedded into the sensor nodes enabling cooperative fault diagnosis in a fully decentralized manner. The distributed analytical redundancy approach is implemented into a wireless SHM system and validated in laboratory experiments, demonstrating the ability of wireless sensor nodes to self-diagnose sensor faults accurately and efficiently with minimal data traffic. Besides enabling distributed autonomous fault diagnosis, the embedded ANNs are able to adapt to the actual condition of the structure, thus ensuring accurate and efficient fault diagnosis even in case of structural changes.
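    The analytical-redundancy idea of flagging a sensor whose output stops agreeing with its peers can be sketched without the paper's ANNs or Fourier preprocessing; the toy below flags the sensor whose mean pairwise correlation with the others collapses (all signals, the threshold, and the fault model are invented for illustration):

```python
import math
import random

# Minimal analytical-redundancy sketch: sensor outputs that should be
# correlated are compared pairwise, and a sensor whose average
# correlation with its peers falls below a threshold is flagged as
# faulty. This is a toy stand-in for the paper's ANN-based,
# Fourier-amplitude approach.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def flag_faulty(signals, threshold=0.3):
    """Return indices of sensors whose mean correlation with the
    other sensors falls below `threshold`."""
    faulty = []
    for i, si in enumerate(signals):
        corrs = [pearson(si, sj) for j, sj in enumerate(signals) if j != i]
        if sum(corrs) / len(corrs) < threshold:
            faulty.append(i)
    return faulty

random.seed(1)
base = [math.sin(0.1 * t) for t in range(200)]
healthy1 = [v + random.gauss(0, 0.05) for v in base]
healthy2 = [1.5 * v + random.gauss(0, 0.05) for v in base]
broken = [random.gauss(0, 1.0) for _ in base]       # pure noise: a fault
print(flag_faulty([healthy1, healthy2, broken]))    # [2]
```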

  9. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. Complexity and changeability of technological factors that induce IIS, may result in significant deviations of the observed distributions of seismic process parameters from the models, which perform well in natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks - magnitudes are governed by a Gutenberg-Richter born exponential distribution. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate of magnitude distribution model leads to significant systematic errors of hazard estimates. It is therefore of paramount importance to check whether the mentioned, commonly used in natural seismicity assumptions on the seismic process, can be safely used in IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer to the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the Gutenberg-Richter relation born exponential distribution model for magnitude is, in general, inappropriate. 
The magnitude distribution can be complex and multimodal, with no ready-to-use functional model. We therefore recommend using non-parametric kernel estimators of the magnitude distribution in hazard analyses. The earthquake occurrence process of IIS is not a Poisson process. When earthquake occurrences are influenced by a multitude of inducing factors, the interevent time distribution can be modelled by the Weibull distribution, supporting a negative ageing property of the process. When earthquake occurrences are due to a specific injection activity, the earthquake rate depends directly on the injection rate and responds immediately to its changes. Furthermore, this response is not limited to correlated variations of the seismic activity; it also involves significant changes in the shape of the interevent time distribution. Unlike the event rate, the shape of the magnitude distribution does not correlate with the injection rate. This work was supported within SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project funded from the Horizon 2020 - R&I Framework Programme, call H2020-LCE 16-2014-1, and within statutory activities No3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
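The recommended non-parametric approach can be illustrated with a minimal sketch (a hypothetical numpy example on synthetic magnitudes, not SHEER data): a Gaussian kernel estimator resolves a multimodal magnitude distribution that a single exponential (Gutenberg-Richter) model could not.

```python
import numpy as np

def gaussian_kde_pdf(samples, grid, bandwidth):
    """Non-parametric kernel density estimate of a magnitude distribution.

    Each sample contributes a Gaussian kernel of width `bandwidth`;
    the estimate is the average of the kernels evaluated on `grid`.
    """
    samples = np.asarray(samples)[:, None]          # shape (n, 1)
    z = (grid[None, :] - samples) / bandwidth       # shape (n, m)
    kernels = np.exp(-0.5 * z**2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return kernels.mean(axis=0)

# Synthetic bimodal magnitudes (illustration only):
rng = np.random.default_rng(0)
mags = np.concatenate([rng.normal(1.0, 0.2, 500), rng.normal(2.0, 0.3, 500)])

grid = np.linspace(0.0, 3.5, 701)
pdf = gaussian_kde_pdf(mags, grid, bandwidth=0.1)

# The estimate integrates to ~1 and resolves both modes.
area = pdf.sum() * (grid[1] - grid[0])
```

The bandwidth choice is the main tuning knob of such estimators; 0.1 here is an illustrative value, not a recommendation from the study.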

  10. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    NASA Astrophysics Data System (ADS)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail on one facet of the battlespace visualization concept described in last year's paper, Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper presents the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system's flexibility. Using an intelligent software agent technology, the Open Agent Architecture™ (OAA™), as its system backbone, IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS application software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA™ achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. 
The reasoning component will provide for the best information to be developed in the timeline available, and it will also provide statistical pedigree data. This pedigree data provides both the uncertainties associated with the information and an audit trail cataloging the raw data sources and the processing/exploitation applied to derive the final product. Collaboration provides for a close union between the information producer(s)/exploiter(s) and the information user(s), as well as between local and remote producer(s)/exploiter(s). From a military operational perspective, IMAGES is a step toward further uniting NIMA with its customers and further blurring the dividing line between operational command and control (C2) and its supporting intelligence activities. IMAGES also provides a foundation for reachback to remote data sources, data stores, application software, and computational resources for achieving 'just-in-time' information delivery -- all of which is transparent to the analyst or operator employing the system.
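The pedigree idea described above, an uncertainty estimate plus an audit trail of sources and processing steps, can be sketched as a simple data structure. This is a hypothetical Python sketch, not the IMAGES design: the field names and the quadrature combination of uncertainties are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PedigreeRecord:
    """Sketch of a statistical pedigree: each derived product carries its
    uncertainty and an audit trail of sources and processing steps."""
    product_id: str
    uncertainty: float                              # e.g. positional error (m)
    sources: List[str] = field(default_factory=list)
    processing: List[str] = field(default_factory=list)

    def derive(self, product_id, step, added_uncertainty):
        """Create a downstream product, extending the audit trail and
        combining uncertainties in quadrature (an assumption here)."""
        return PedigreeRecord(
            product_id=product_id,
            uncertainty=(self.uncertainty**2 + added_uncertainty**2) ** 0.5,
            sources=list(self.sources),
            processing=self.processing + [step],
        )

raw = PedigreeRecord("uav-frame-0042", uncertainty=5.0, sources=["UAV sensor"])
ortho = raw.derive("ortho-0042", "orthorectification", added_uncertainty=2.0)
```

Each derived product thus answers both "how good is this?" and "where did it come from?", which is exactly the role the pedigree data plays in the text.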

  11. Characterization of the seascape used by juvenile and wintering adult Southern Giant Petrels from Patagonia Argentina

    NASA Astrophysics Data System (ADS)

    Blanco, Gabriela S.; Pisoni, Juan P.; Quintana, Flavio

    2015-02-01

    The characterization of the seascape used by marine top predators provides a wide perspective on pelagic habitat use and is necessary for understanding the functioning of marine systems. The goal of this study was to characterize the oceanographic and biological features of the marine areas used by adult and first-year juvenile southern giant petrels (SGP, Macronectes giganteus) from northern Patagonian colonies (Isla Arce and Gran Robredo) during the austral fall and winter (2005, 2006, 2007, and 2008). The marine environment exploited by the SGP was characterized using sea surface temperature (SST), SST gradients, chlorophyll-a concentration, water depth, oceanographic regimes, and ocean surface winds. In addition, the biological seascape was defined by considering the distribution of squid during the months of study. Juvenile SGP exploited a wide range of environments, focusing mainly on productive neritic waters and using a variety of oceanographic regimes. Juveniles were exposed to eutrophic and enriched waters, probably because of the frequent presence of thermal fronts in their utilization areas. Adults' environments lacked thermal fronts; adults remained the majority of their time within the oceanographic regime "Continental Shelf", in water depths of 100-200 m, exploiting mesotrophic and eutrophic environments and remaining in areas of known food resources related to the presence of squid. For the most part, juveniles were exposed to westerly winds, which may have helped them in their initial flight to the shelf break, east of the colony. Wintering adult SGP also explored areas characterized by westerly winds, but this did not play a primary role in the selection of their residence areas. Juveniles during their first year at sea have to search for food by exploring a variety of unknown environments. During their search, they remained in productive environments associated with fronts and probably also with fisheries operating in their foraging areas. 
Understanding pelagic birds' habitat selection and preferences through the year is crucial for monitoring anthropogenic impacts on these species. Further studies should focus on predicting the variables that determine the distribution of these species through the year and during different life stages.

  12. Protein location prediction using atomic composition and global features of the amino acid sequence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherian, Betsy Sheena, E-mail: betsy.skb@gmail.com; Nair, Achuthsankar S.

    2010-01-22

    The subcellular location of a protein is valuable information for determining its function, screening drug candidates, designing vaccines, annotating gene products, and selecting relevant proteins for further studies. Computational prediction of subcellular localization deals with predicting the location of a protein from its amino acid sequence. For a computational localization prediction method to be more accurate, it should exploit all possible relevant biological features that contribute to subcellular localization. In this work, we extracted biological features from the full-length protein sequence to incorporate more biological information. A new biological feature, the distribution of atomic composition, is effectively used together with multiple physicochemical properties, amino acid composition, three-part amino acid composition, and sequence similarity for predicting the subcellular location of the protein. Support Vector Machines are designed for four modules, and the prediction is made by a weighted voting system. Our system makes predictions with accuracies of 100%, 82.47%, and 88.81% for the self-consistency test, jackknife test, and independent data test, respectively. Our results provide evidence that prediction based on biological features derived from the full-length amino acid sequence gives better accuracy than those derived from the N-terminal alone. Considering the features as a distribution within the entire sequence brings out the underlying property distribution in greater detail and enhances the prediction accuracy.
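The weighted voting step over the per-module predictions might look like the following minimal sketch. The module names, labels, and weights are hypothetical; the abstract does not specify how its weights are chosen.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine per-module predicted locations by weighted voting.

    predictions: {module_name: predicted_label}
    weights:     {module_name: vote weight, e.g. a per-module accuracy}
    Returns the label with the largest total weight.
    """
    totals = defaultdict(float)
    for module, label in predictions.items():
        totals[label] += weights[module]
    return max(totals, key=totals.get)

# Four hypothetical modules (atomic composition, physicochemical
# properties, amino acid composition, sequence similarity):
preds = {"atomic": "mitochondrial", "physchem": "nuclear",
         "aa_comp": "mitochondrial", "similarity": "nuclear"}
wts = {"atomic": 0.88, "physchem": 0.80, "aa_comp": 0.82, "similarity": 0.95}
label = weighted_vote(preds, wts)   # "nuclear": 1.75 vs 1.70
```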

  13. Using sketch-map coordinates to analyze and bias molecular dynamics simulations

    PubMed Central

    Tribello, Gareth A.; Ceriotti, Michele; Parrinello, Michele

    2012-01-01

    When examining complex problems, such as the folding of proteins, coarse-grained descriptions of the system drive our investigation and help us to rationalize the results. Oftentimes collective variables (CVs), derived through some chemical intuition about the process of interest, serve this purpose. Because finding these CVs is the most difficult part of any investigation, we recently developed a dimensionality reduction algorithm, sketch-map, that can be used to build a low-dimensional map of a high-dimensional phase space. In this paper we discuss how these machine-generated CVs can be used to accelerate the exploration of phase space and to reconstruct free-energy landscapes. To do so, we develop a formalism in which high-dimensional configurations are no longer represented by low-dimensional position vectors. Instead, for each configuration we calculate a probability distribution, which has a domain that encompasses the entirety of the low-dimensional space. To construct a biasing potential, we exploit an analogy with metadynamics and use the trajectory to adaptively construct a repulsive, history-dependent bias from the distributions that correspond to the previously visited configurations. This potential forces the system to explore more of phase space by making it desirable to adopt configurations whose distributions do not overlap with the bias. We apply this algorithm to a small model protein and succeed in reproducing the free-energy surface that we obtain from a parallel tempering calculation. PMID:22427357
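The history-dependent repulsive bias can be sketched as follows. For simplicity this hypothetical sketch deposits point Gaussians on a one-dimensional map, as in plain metadynamics, whereas the paper accumulates the bias from the full sketch-map probability distributions.

```python
import numpy as np

def add_gaussian(bias, grid, center, height=1.0, width=0.1):
    """Deposit a repulsive Gaussian at the current position on the
    low-dimensional map (metadynamics-like history-dependent bias)."""
    return bias + height * np.exp(-0.5 * ((grid - center) / width) ** 2)

grid = np.linspace(-1.0, 1.0, 201)
bias = np.zeros_like(grid)

# The system lingers near x = 0; repeated deposits pile up there and
# make the already-visited region energetically unfavourable.
for x in [0.0, 0.02, -0.01, 0.01, 0.0]:
    bias = add_gaussian(bias, grid, x)

visited = bias[np.argmin(np.abs(grid - 0.0))]
unvisited = bias[np.argmin(np.abs(grid - 0.8))]
```

The growing bias at visited points is what pushes the trajectory toward configurations whose distributions do not yet overlap with it.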

  14. MSAProbs-MPI: parallel multiple sequence aligner for distributed-memory systems.

    PubMed

    González-Domínguez, Jorge; Liu, Yongchao; Touriño, Juan; Schmidt, Bertil

    2016-12-15

    MSAProbs is a state-of-the-art protein multiple sequence alignment tool based on hidden Markov models. It can achieve high alignment accuracy at the expense of relatively long runtimes for large-scale input datasets. In this work we present MSAProbs-MPI, a distributed-memory parallel version of the multithreaded MSAProbs tool that is able to reduce runtimes by exploiting the compute capabilities of common multicore CPU clusters. Our performance evaluation on a cluster with 32 nodes (each containing two Intel Haswell processors) shows reductions in execution time of over one order of magnitude for typical input datasets. Furthermore, MSAProbs-MPI using eight nodes is faster than the GPU-accelerated QuickProbs running on a Tesla K20. Another strong point is that MSAProbs-MPI can deal with large datasets for which MSAProbs and QuickProbs might fail due to time and memory constraints, respectively. Source code in C++ and MPI, running on Linux systems, as well as a reference manual, are available at http://msaprobs.sourceforge.net. Contact: jgonzalezd@udc.es. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
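How a distributed-memory aligner might spread its quadratic all-against-all stage over processes can be sketched in plain Python. This is an illustrative round-robin partition under assumed parameters; MSAProbs-MPI itself is C++/MPI and its actual scheduling may differ.

```python
from itertools import combinations

def partition_pairs(n_seqs, n_ranks):
    """Round-robin assignment of all unordered sequence pairs to ranks,
    mimicking how a distributed-memory aligner could spread the
    O(n^2) pairwise-comparison stage over MPI processes."""
    buckets = [[] for _ in range(n_ranks)]
    for i, pair in enumerate(combinations(range(n_seqs), 2)):
        buckets[i % n_ranks].append(pair)
    return buckets

# 100 sequences over 8 ranks: 100*99/2 = 4950 pairs in total.
buckets = partition_pairs(n_seqs=100, n_ranks=8)
sizes = [len(b) for b in buckets]
```

Because each pair can be scored independently, the buckets need no shared memory, which is what makes this stage embarrassingly parallel.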

  15. The ALMA software architecture

    NASA Astrophysics Data System (ADS)

    Schwarz, Joseph; Farris, Allen; Sommer, Heiko

    2004-09-01

    The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
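The separation of functional from technical concerns can be sketched in miniature. This is a hypothetical Python sketch, not ACS code: the Container supplies a technical service (here just logging), while the Component holds only functional logic.

```python
class Container:
    """Minimal sketch of the Container role: it owns technical services
    and hands them to Components on activation."""
    def __init__(self):
        self.log_lines = []

    def get_logger(self, name):
        # In ACS this would also cover remote access, persistence, etc.
        return lambda msg: self.log_lines.append(f"{name}: {msg}")

    def activate(self, component_cls, name):
        return component_cls(self.get_logger(name))


class CorrelatorComponent:
    """A Component implements only functionality; every technical
    concern comes from its Container."""
    def __init__(self, logger):
        self.log = logger

    def configure(self, n_antennas):
        self.log(f"configured for {n_antennas} antennas")
        return n_antennas * (n_antennas - 1) // 2   # number of baselines


container = Container()
correlator = container.activate(CorrelatorComponent, "correlator")
baselines = correlator.configure(64)   # 64 antennas -> 2016 baselines
```

The payoff of this split is that the same Component can run unchanged wherever a Container exists, which is what lets separated groups develop with minimal dependence on one another.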

  16. Pervasive sensing

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2000-11-01

    The coordinated exploitation of modern communication, micro-sensor and computer technologies makes it possible to give global reach to our senses. Web-cameras for vision, web-microphones for hearing and web-'noses' for smelling, plus the abilities to sense many factors we cannot ordinarily perceive, are either available or will be soon. Applications include (1) determination of weather and environmental conditions on dense grids or over large areas, (2) monitoring of energy usage in buildings, (3) sensing the condition of hardware in electrical power distribution and information systems, (4) improving process control and other manufacturing, (5) development of intelligent terrestrial, marine, aeronautical and space transportation systems, (6) managing the continuum of routine security monitoring, diverse crises and military actions, and (7) medicine, notably the monitoring of the physiology and living conditions of individuals. Some of the emerging capabilities, such as the ability to measure remotely the conditions inside of people in real time, raise interesting social concerns centered on privacy issues. Methods for sensor data fusion and designs for human-computer interfaces are both crucial for the full realization of the potential of pervasive sensing. Computer-generated virtual reality, augmented with real-time sensor data, should be an effective means for presenting information from distributed sensors.

  17. Discovery potential for directional dark matter detection with nuclear emulsions

    NASA Astrophysics Data System (ADS)

    Guler, A. M.; NEWSdm Collaboration

    2017-06-01

    Direct dark matter searches are nowadays one of the most exciting research topics. Several experimental efforts are concentrated on the development, construction, and operation of detectors looking for the scattering of target nuclei with Weakly Interacting Massive Particles (WIMPs). In this field a new frontier can be opened by directional detectors able to reconstruct the direction of the WIMP-recoiled nucleus, thus making it possible to extend dark matter searches beyond the neutrino floor. Exploiting directionality would also give proof of the galactic origin of dark matter, providing a clear and unambiguous signal-to-background separation. The angular distribution of WIMP-scattered nuclei is indeed expected to peak in the direction of motion of the Solar System in the Galaxy, i.e. toward the Cygnus constellation, while the background distribution is expected to be isotropic. Current directional experiments are based on gas TPCs, whose sensitivity is limited by the small achievable detector mass. In this paper we show the potential, in terms of exclusion limits, of a directional experiment based on a solid target made of newly developed nuclear emulsions and read-out systems reaching sub-micrometric resolution.

  18. Reassessment of psychological distress and post-traumatic stress disorder in United States Air Force Distributed Common Ground System operators.

    PubMed

    Prince, Lillian; Chappelle, Wayne L; McDonald, Kent D; Goodman, Tanya; Cowper, Sara; Thompson, William

    2015-03-01

    The goal of this study was to assess the main sources of occupational stress, as well as self-reported symptoms of distress and post-traumatic stress disorder, among U.S. Air Force (USAF) Distributed Common Ground System (DCGS) intelligence exploitation and support personnel. DCGS intelligence operators (n = 1091) and nonintelligence personnel (n = 447) assigned to a USAF Intelligence, Surveillance, and Reconnaissance Wing responded to the web-based survey. The overall survey response rate was 31%. Study results revealed that the most problematic stressors among DCGS intelligence personnel included high workload and low manning, as well as organizational leadership and shift-work issues. Results also revealed that 14.35% of DCGS intelligence operators self-reported high levels of psychological distress (twice the rate of DCGS nonintelligence support personnel). Furthermore, 2.0% to 2.5% self-reported high levels of post-traumatic stress disorder symptoms, with no significant difference between groups. The implications of these findings are discussed along with recommendations for USAF medical and mental health providers, as well as operational leadership. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  19. Econo-Thermodynamics: The Nature of Economic Interactions

    NASA Astrophysics Data System (ADS)

    Mimkes, Juergen

    2006-03-01

    Physicists often model economic interactions like collisions of atoms in gases: through an interaction one agent gains and the other loses. This leads to a Boltzmann distribution of capital, which has been observed in the wealth distributions of different countries. However, economists object: no economic agent will attend a market in which he gets robbed! This conflict may be resolved by writing the basic laws of economics in terms of calculus. In these terms the daily struggle for survival of all economic systems turns out to be a Carnot cycle that is driven by energy: heat pumps and economic production depend on oil, and GNP and oil consumption run parallel for all countries. Motors and markets are based on the same laws of calculus (macro-economics) and statistics (micro-economics). Economic interactions mean exploiting a third party (nature) and are indeed close to robbing! A baker sells bread to his customers, but the flour comes from nature. Banks sell loans to investors, but the money comes from savers. Econo-thermodynamics is a thrilling new interdisciplinary field.

  20. Detecting Distributed SQL Injection Attacks in a Eucalyptus Cloud Environment

    NASA Technical Reports Server (NTRS)

    Kebert, Alan; Barnejee, Bikramjit; Solano, Juan; Solano, Wanda

    2013-01-01

    The cloud computing environment offers malicious users the ability to spawn multiple instances of cloud nodes that are similar to virtual machines, except that they can have separate external IP addresses. In this paper we demonstrate how this ability can be exploited by an attacker to distribute his/her attack, in particular SQL injection attacks, in such a way that an intrusion detection system (IDS) could fail to identify this attack. To demonstrate this, we set up a small private cloud, established a vulnerable website in one instance, and placed an IDS within the cloud to monitor the network traffic. We found that an attacker could quite easily defeat the IDS by periodically altering its IP address. To detect such an attacker, we propose to use multi-agent plan recognition, where the multiple source IPs are considered as different agents who are mounting a collaborative attack. We show that such a formulation of this problem yields a more sophisticated approach to detecting SQL injection attacks within a cloud computing environment.
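The core idea above, that evidence too weak to trigger a per-IP detector becomes conclusive once the rotating sources are treated as one coordinated team, can be sketched as follows. The indicator patterns, thresholds, and traffic are hypothetical, and the paper's detector uses multi-agent plan recognition rather than this simple score aggregation.

```python
import re
from collections import defaultdict

# Illustrative SQL-injection indicators (not an exhaustive rule set):
SQLI_PATTERNS = [r"union\s+select", r"or\s+1\s*=\s*1", r"--", r"drop\s+table"]

def score(request):
    """Count how many injection indicators appear in one request."""
    return sum(bool(re.search(p, request, re.IGNORECASE)) for p in SQLI_PATTERNS)

def detect(traffic, per_ip_threshold=3, global_threshold=3):
    """traffic: list of (source_ip, request) tuples.

    A per-IP detector fires only when a single address accumulates
    enough indicators; aggregating over all sources catches an attack
    that has been split across rotating IPs.
    """
    per_ip = defaultdict(int)
    for ip, request in traffic:
        per_ip[ip] += score(request)
    per_ip_alarm = any(v >= per_ip_threshold for v in per_ip.values())
    global_alarm = sum(per_ip.values()) >= global_threshold
    return per_ip_alarm, global_alarm

# Attack fragments spread over rotating IPs (hypothetical traffic):
traffic = [
    ("10.0.0.1", "id=1 UNION SELECT password FROM users"),
    ("10.0.0.2", "name=' OR 1=1 --"),
    ("10.0.0.3", "q=1; DROP TABLE logs"),
]
per_ip_alarm, global_alarm = detect(traffic)
```

With these inputs the per-IP detector stays silent while the aggregated view raises an alarm, mirroring how the IDS in the experiment was defeated by periodic IP changes.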

  1. Is there substructure around M87?

    NASA Astrophysics Data System (ADS)

    Oldham, L. J.; Evans, N. W.

    2016-10-01

    We present a general method to identify infalling substructure in discrete data sets with position and line-of-sight velocity data. We exploit the fact that galaxies falling on to a brightest cluster galaxy (BCG) in a virialized cluster, or dwarf satellites falling on to a central galaxy like the Milky Way, follow nearly radial orbits. If the orbits are exactly radial, we show how to find the probability distribution for a satellite's energy, given a tracer density for the satellite population, by solving an Abel integral equation. This is an extension of Eddington's classical formula for the isotropic distribution function. When applied to a system of galaxies, clustering in energy space can then be quantified using the Kullback-Leibler divergence, and groups of objects can be identified which, though separated in the sky, may be falling in on the same orbit. This method is tested using mock data and applied to the satellite galaxy population around M87, the BCG in Virgo; a number of associations are found that may represent infalling galaxy groups.
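Quantifying clustering in energy space with the Kullback-Leibler divergence can be sketched on discretized distributions. The histograms below are hypothetical stand-ins, not the M87 data.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) between two
    normalised histograms, e.g. of satellite energies."""
    p = np.asarray(p, float) + eps      # eps guards against log(0)
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical energy histograms: a tightly clustered candidate group
# versus the smooth expectation from the tracer density.
group      = np.array([0.0, 0.1, 0.7, 0.2, 0.0])
background = np.array([0.2, 0.2, 0.2, 0.2, 0.2])

d_group = kl_divergence(group, background)       # > 0: clustered in energy
d_self  = kl_divergence(background, background)  # ~ 0: no clustering
```

A large divergence flags objects that share an energy despite being separated on the sky, i.e. candidates for a common infall orbit.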

  2. Quantum centipedes: collective dynamics of interacting quantum walkers

    NASA Astrophysics Data System (ADS)

    Krapivsky, P. L.; Luck, J. M.; Mallick, K.

    2016-08-01

    We consider the quantum centipede made of N fermionic quantum walkers on the one-dimensional lattice interacting by means of the simplest of all hard-bound constraints: the distance between two consecutive fermions is either one or two lattice spacings. This composite quantum walker spreads ballistically, just as the simple quantum walk. However, because of the interactions between the internal degrees of freedom, the distribution of its center-of-mass velocity displays numerous ballistic fronts in the long-time limit, corresponding to singularities in the empirical velocity distribution. The spectrum of the centipede and the corresponding group velocities are analyzed by direct means for the first few values of N. Some analytical results are obtained for arbitrary N by exploiting an exact mapping of the problem onto a free-fermion system. We thus derive the maximal velocity describing the ballistic spreading of the two extremal fronts of the centipede wavefunction, including its non-trivial value in the large-N limit.

  3. Acoustic scattering by arbitrary distributions of disjoint, homogeneous cylinders or spheres.

    PubMed

    Hesford, Andrew J; Astheimer, Jeffrey P; Waag, Robert C

    2010-05-01

    A T-matrix formulation is presented to compute acoustic scattering from arbitrary, disjoint distributions of cylinders or spheres, each with arbitrary, uniform acoustic properties. The generalized approach exploits the similarities in these scattering problems to present a single system of equations that is easily specialized to cylindrical or spherical scatterers. By employing field expansions based on orthogonal harmonic functions, continuity of pressure and normal particle velocity are directly enforced at each scatterer using diagonal, analytic expressions to eliminate the need for integral equations. The effect of a cylinder or sphere that encloses all other scatterers is simulated with an outer iterative procedure that decouples the inner-object solution from the effect of the enclosing object to improve computational efficiency when interactions among the interior objects are significant. Numerical results establish the validity and efficiency of the outer iteration procedure for nested objects. Two- and three-dimensional methods that employ this outer iteration are used to measure and characterize the accuracy of two-dimensional approximations to three-dimensional scattering of elevation-focused beams.

  4. Numerical investigation of combustion field of hypervelocity scramjet engine

    NASA Astrophysics Data System (ADS)

    Zhang, Shikong; Li, Jiang; Qin, Fei; Huang, Zhiwei; Xue, Rui

    2016-12-01

    A numerical study of the ground testing of a hydrogen-fueled scramjet engine was undertaken using the commercial computational-fluid-dynamics code CFD++. The simulated Mach number was 12. A 7-species, 9-step hydrogen-air chemical kinetics scheme was adopted for the Reynolds-averaged Navier-Stokes simulation. The two-equation SST turbulence model, incorporating wall functions, was used to model the turbulence. The results were validated against the experimentally measured wall pressure distribution, and the values obtained proved to be in good agreement. The flow patterns for the non-reacting and reacting cases are presented, as are analyses of the supersonic premixed/non-premixed flame structure, the reaction heat-release distribution in different modes, and the effect of changing the equivalence ratio. In this study, we characterize the working mode of a hypervelocity engine, provide some suggestions for the combustion organization of the engine, and offer insight into the potential for exploiting the coupled processes of combustion and flow.

  5. MCdevelop - a universal framework for Stochastic Simulations

    NASA Astrophysics Data System (ADS)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. Efficient development, testing, and parallel running of SS software require a convenient framework to develop source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs have terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system.

    Program summary
    Program title: MCdevelop
    Catalogue identifier: AEHW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 48 136
    No. of bytes in distributed program, including test data, etc.: 355 698
    Distribution format: tar.gz
    Programming language: ANSI C++
    Computer: Any computer system or cluster with a C++ compiler and a UNIX-like operating system.
    Operating system: Most UNIX systems, Linux. The application programs were thoroughly tested under Ubuntu 7.04, 8.04 and CERN Scientific Linux 5.
    Has the code been vectorised or parallelised?: Tools (scripts) for optional parallelisation on a PC farm are included.
    RAM: 500 bytes
    Classification: 11.3
    External routines: ROOT package version 5.0 or higher (http://root.cern.ch/drupal/).
    Nature of problem: Developing any type of stochastic simulation program for high energy physics and other areas.
    Solution method: Object-oriented programming in C++ with an added persistency mechanism, batch scripts for running on PC farms, and Autotools.

  6. Spatial ecology of refuge selection by an herbivore under risk of predation

    USGS Publications Warehouse

    Wilson, Tammy L.; Rayburn, Andrew P.; Edwards, Thomas C.

    2012-01-01

    Prey species use structures such as burrows to minimize predation risk. The spatial arrangement of these resources can have important implications for individual and population fitness. For example, there is evidence that clustered resources can benefit individuals by reducing predation risk and increasing foraging opportunity concurrently, which leads to higher population density. However, the scale of clustering that is important in these processes has been ignored during theoretical and empirical development of resource models. Ecological understanding of refuge exploitation by prey can be improved by spatial analysis of refuge use and availability that incorporates the effect of scale. We measured the spatial distribution of pygmy rabbit (Brachylagus idahoensis) refugia (burrows) through censuses in four 6-ha sites. Point pattern analyses were used to evaluate burrow selection by comparing the spatial distribution of used and available burrows. The presence of food resources and additional overstory cover resources was further examined using logistic regression. Burrows were spatially clustered at scales up to approximately 25 m, and then regularly spaced at distances beyond ~40 m. Pygmy rabbit exploitation of burrows did not match availability. Burrows used by pygmy rabbits were likely to be located in areas with high overall burrow density (resource clusters) and high overstory cover, which together minimized predation risk. However, in some cases we observed an interaction between either overstory cover (safety) or understory cover (forage) and burrow density. The interactions show that pygmy rabbits will use burrows in areas with low relative burrow density (high relative predation risk) if understory food resources are high. This points to a potential trade-off whereby rabbits must sacrifice some safety afforded by additional nearby burrows to obtain ample forage resources. 
Observed patterns of clustered burrows and non-random burrow use improve understanding of the importance of spatial distribution of refugia for burrowing herbivores. The analyses used allowed for the estimation of the spatial scale where subtle trade-offs between predation avoidance and foraging opportunity are likely to occur in a natural system.

  7. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective; in general they have met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data are a mixture of multivariate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. 
The parameters thus learned are used for calculating the joint distribution of the observations. However, this GMM assumption is essentially an approximation and signals the potential viability of non-parametric density estimators. This is the key idea underlying the new approach.
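The GMM/EM scoring idea described above can be sketched compactly. The following is a minimal, diagonal-covariance illustration in NumPy, not the authors' implementation: a mixture is fitted with EM, and each reading is scored by its joint log-density, so the lowest-likelihood points flag anomalies. The data, initialisation scheme, and variance floor are all illustrative assumptions.

```python
import numpy as np

def fit_gmm_em(X, k, n_iter=100):
    """Fit a diagonal-covariance Gaussian mixture with EM (illustrative sketch)."""
    n, d = X.shape
    # deterministic, spread-out initialisation of the k means
    mu = X[(((np.arange(k) + 0.5) * n) // k).astype(int)].copy()
    var = np.tile(X.var(axis=0), (k, 1))
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: per-component log-densities -> responsibilities
        log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(axis=2) + np.log(pi))
        r = np.exp(log_p - np.logaddexp.reduce(log_p, axis=1, keepdims=True))
        # M-step: re-estimate weights, means, and variances (with a small floor)
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ (X ** 2)) / nk[:, None] - mu ** 2 + 1e-6
    return pi, mu, var

def log_likelihood(X, pi, mu, var):
    """Joint log-density of each row of X under the fitted mixture."""
    log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                     + np.log(2 * np.pi * var)).sum(axis=2) + np.log(pi))
    return np.logaddexp.reduce(log_p, axis=1)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),     # nominal cluster 1
               rng.normal(5.0, 1.0, (200, 2)),     # nominal cluster 2
               [[20.0, 20.0]]])                    # last row: injected anomaly
pi, mu, var = fit_gmm_em(X, 2)
scores = log_likelihood(X, pi, mu, var)
anomaly_idx = int(np.argmin(scores))               # lowest-likelihood reading
```

In this toy setting the injected outlier receives by far the lowest joint log-density, which is exactly the scoring criterion the abstract describes; the non-parametric density estimators mentioned next would replace the Gaussian-mixture density with a kernel estimate.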

  8. Thermodiffusion in multicomponent n-alkane mixtures.

    PubMed

    Galliero, Guillaume; Bataller, Henri; Bazile, Jean-Patrick; Diaz, Joseph; Croccolo, Fabrizio; Hoang, Hai; Vermorel, Romain; Artola, Pierre-Arnaud; Rousseau, Bernard; Vesovic, Velisa; Bou-Ali, M Mounir; Ortiz de Zárate, José M; Xu, Shenghua; Zhang, Ke; Montel, François; Verga, Antonio; Minster, Olivier

    2017-01-01

    Compositional grading within a mixture has a strong impact on the evaluation of the pre-exploitation distribution of hydrocarbons in underground layers and sediments. Thermodiffusion, which leads to a partial diffusive separation of species in a mixture due to the geothermal gradient, is thought to play an important role in determining the distribution of species in a reservoir. However, despite recent progress, thermodiffusion is still difficult to measure and model in multicomponent mixtures. In this work, we report on experimental investigations of the thermodiffusion of multicomponent n-alkane mixtures at pressures above 30 MPa. The experiments were conducted in space onboard the Shi Jian 10 spacecraft so as to isolate the studied phenomena from convection. For the two exploitable cells, containing a ternary liquid mixture and a condensate gas, measurements showed that the lightest and heaviest species had a tendency to migrate, relative to the rest of the species, to the hot and cold regions, respectively. These trends have been confirmed by molecular dynamics simulations. The measured condensate gas data have been used to quantify the influence of thermodiffusion on the initial fluid distribution of an idealised one-dimensional reservoir. The results indicate that thermodiffusion tends to noticeably counteract the influence of gravitational segregation on the vertical distribution of species, which could result in an unstable fluid column. This confirms that, in oil and gas reservoirs, the availability of thermodiffusion data for multicomponent mixtures is crucial for a correct evaluation of the initial-state fluid distribution.

  9. Spatio-temporal dynamics of a fish predator: Density-dependent and hydrographic effects on Baltic Sea cod population

    PubMed Central

    Bartolino, Valerio; Tian, Huidong; Bergström, Ulf; Jounela, Pekka; Aro, Eero; Dieterich, Christian; Meier, H. E. Markus; Cardinale, Massimiliano; Bland, Barbara

    2017-01-01

    Understanding the mechanisms of spatial population dynamics is crucial for the successful management of exploited species and ecosystems. However, the underlying mechanisms of spatial distribution are generally complex due to the concurrent forcing of both density-dependent species interactions and density-independent environmental factors. Despite the high economic value and central ecological importance of cod in the Baltic Sea, the drivers of its spatio-temporal population dynamics have not been analytically investigated so far. In this paper, we used an extensive trawl survey dataset in combination with environmental data to investigate the spatial dynamics of the distribution of the Eastern Baltic cod during the past three decades using Generalized Additive Models. The results showed that adult cod distribution was mainly affected by cod population size, and to a minor degree by small-scale hydrological factors and the extent of suitable reproductive areas. As population size decreases, the cod population concentrates in the southern part of the Baltic Sea, where its preferred, more marine environmental conditions are encountered. Using the fitted models, we predicted the Baltic cod distribution back to the 1970s and developed a temporal index of cod spatial occupation. Our study will contribute to the management and conservation of this important resource and of the ecosystem where it occurs, by showing the forces shaping its spatial distribution and therefore the potential response of the population to future exploitation and environmental changes. PMID:28207804

  10. Where-Fi: a dynamic energy-efficient multimedia distribution framework for MANETs

    NASA Astrophysics Data System (ADS)

    Mohapatra, Shivajit; Carbunar, Bogdan; Pearce, Michael; Chaudhri, Rohit; Vasudevan, Venu

    2008-01-01

    Next generation mobile ad-hoc applications will revolve around users' need for sharing content/presence information with co-located devices. However, keeping such information fresh requires frequent meta-data exchanges, which could result in significant energy overheads. To address this issue, we propose distributed algorithms for energy efficient dissemination of presence and content usage information between nodes in mobile ad-hoc networks. First, we introduce a content dissemination protocol (called CPMP) for effectively distributing frequent small meta-data updates between co-located devices using multicast. We then develop two distributed algorithms that use the CPMP protocol to achieve "phase locked" wake up cycles for all the participating nodes in the network. The first algorithm is designed for fully-connected networks and then extended in the second to handle hidden terminals. The "phase locked" schedules are then exploited to adaptively transition the network interface to a deep sleep state for energy savings. We have implemented a prototype system (called "Where-Fi") on several Motorola Linux-based cell phone models. Our experimental results show that for all network topologies our algorithms were able to achieve "phase locking" between nodes even in the presence of hidden terminals. Moreover, we achieved battery lifetime extensions of as much as 28% for fully connected networks and about 20% for partially connected networks.

  11. Off-Grid Direction of Arrival Estimation Based on Joint Spatial Sparsity for Distributed Sparse Linear Arrays

    PubMed Central

    Liang, Yujie; Ying, Rendong; Lu, Zhenqi; Liu, Peilin

    2014-01-01

    In the design phase of sensor arrays during array signal processing, the estimation performance and system cost are largely determined by array aperture size. In this article, we address the problem of joint direction-of-arrival (DOA) estimation with distributed sparse linear arrays (SLAs) and propose an off-grid synchronous approach based on distributed compressed sensing to obtain a larger array aperture. We focus on the complex source distributions found in practical applications and classify the sources into common and innovation parts according to whether a source's signal impinges on all the SLAs or only a specific one. For each SLA, we construct a corresponding virtual uniform linear array (ULA) to establish a random linear mapping between the signals observed by these two arrays. The signal ensembles, including the common/innovation sources for the different SLAs, are abstracted as a joint spatial sparsity model, and we minimize a concatenated atomic norm via semidefinite programming to solve the joint DOA estimation problem. Joint processing of the signals observed by all the SLAs exploits the redundancy caused by the common sources and reduces the required array size. The numerical results illustrate the advantages of the proposed approach. PMID:25420150
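As a much-simplified, on-grid, single-array sketch of the sparse-DOA idea (the paper's off-grid, multi-array atomic-norm formulation via semidefinite programming is considerably more involved), greedy orthogonal matching pursuit over a grid of candidate angles already shows how a sparsity prior substitutes for aperture. The array size, source angles, and amplitudes below are illustrative assumptions.

```python
import numpy as np

def steering(angles_deg, m):
    """Steering vectors of an m-element ULA with half-wavelength spacing."""
    a = np.deg2rad(np.asarray(angles_deg, dtype=float))
    return np.exp(1j * np.pi * np.outer(np.arange(m), np.sin(a)))

def omp_doa(y, grid_deg, m, k):
    """Greedy on-grid DOA: pick the k grid angles that best explain snapshot y."""
    A = steering(grid_deg, m)
    residual, support = y.copy(), []
    for _ in range(k):
        # correlate the residual with every candidate steering vector
        support.append(grid_deg[int(np.argmax(np.abs(A.conj().T @ residual)))])
        As = steering(support, m)
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)  # refit on the support
        residual = y - As @ coef
    return sorted(support)

m, true_doas = 16, [-20.0, 30.0]
y = steering(true_doas, m) @ np.array([1.0, 0.8])      # noise-free snapshot
grid = np.arange(-90.0, 90.5, 1.0)                     # 1-degree candidate grid
est = omp_doa(y, grid, m, 2)
```

The grid mismatch that motivates the paper's off-grid formulation shows up here directly: sources between grid points smear energy over neighbouring candidates, which the atomic-norm approach avoids by optimizing over a continuum of angles.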

  12. Inferring epidemiological parameters from phylogenetic information for the HIV-1 epidemic among MSM

    NASA Astrophysics Data System (ADS)

    Quax, Rick; van de Vijver, David A. M. C.; Frentz, Dineke; Sloot, Peter M. A.

    2013-09-01

    The HIV-1 epidemic in Europe is primarily sustained by a dynamic topology of sexual interactions among MSM who have individual immune systems and behavior. This epidemiological process shapes the phylogeny of the virus population. Both epidemic modeling and phylogenetics have long histories; however, it remains difficult to use phylogenetic data to infer epidemiological parameters such as the structure of the sexual network and the per-act infectiousness, because phylogenetic data are necessarily incomplete and ambiguous. Here we show, using detailed numerical experiments, that the cluster-size distribution indeed contains information about epidemiological parameters. We simulate the HIV epidemic among MSM many times using the Monte Carlo method, with all parameter values and their ranges taken from the literature. For each simulation and the corresponding set of parameter values we calculate the likelihood of reproducing an observed cluster-size distribution. The result is an estimated likelihood distribution of all parameters from the phylogenetic data, in particular the structure of the sexual network, the per-act infectiousness, and the risk-behavior reduction upon diagnosis. These likelihood distributions encode the knowledge provided by the observed cluster-size distribution, which we quantify using information theory. Our work suggests that the growing body of genetic data of patients can be exploited to understand the underlying epidemiological process.
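The likelihood machinery can be illustrated with a toy stand-in. Assuming, purely for illustration, that cluster sizes follow a geometric law whose parameter plays the role of an epidemiological parameter (the paper's full Monte Carlo epidemic simulator replaces this closed form), scanning candidate values for the one that maximizes the likelihood of an observed cluster-size distribution recovers the generating value:

```python
import numpy as np

def cluster_loglik(sizes, p):
    """Log-likelihood of observed cluster sizes under a geometric(p) size law,
    P(S = s) = (1 - p)**(s - 1) * p  (a stand-in for the epidemic simulator)."""
    s = np.asarray(sizes)
    return float(np.sum((s - 1) * np.log(1.0 - p) + np.log(p)))

rng = np.random.default_rng(0)
observed = rng.geometric(0.3, size=500)       # stand-in "observed" cluster sizes
grid = np.linspace(0.05, 0.95, 91)            # candidate parameter values
loglik = np.array([cluster_loglik(observed, p) for p in grid])
p_hat = float(grid[np.argmax(loglik)])        # maximum-likelihood estimate
```

In the paper the closed-form likelihood is unavailable, so each candidate parameter set is scored by how often repeated stochastic simulations reproduce the observed cluster-size distribution, yielding a likelihood surface over all parameters rather than a single point estimate.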

  13. An Educational Paper on the Effect Of Even Distribution in the Use of Available Energy Resources So as to Enhance Sustainability

    NASA Astrophysics Data System (ADS)

    Njuh Elongo, A. I.

    2017-12-01

    This paper proposes a solution to the problem of over-exploiting a single major energy resource until it nears extinction or passes its peak value. To test this hypothesis, an experiment was conducted that varied the available natural resources and the demand (and hence consumption) rate. The question was: "Can sharing the exploitation and consumption of available resources equally improve sustainability?" This was investigated through a simple farm experiment using goats grazing on grass. On the test farm, the goats were fed three different types of grass found in the area simultaneously, while the number of goats was increased over time. On the control farm, each type of grass was introduced to the feeding process only when its predecessor was nearly exhausted; the goat population was likewise increased over time. It was found that the resources (the grass) lasted longer on the test farm. It was therefore concluded that the even distribution of resource exploitation and consumption leads to an increase in sustainability.

  14. Surface-Modified Nanocarriers for Nose-to-Brain Delivery: From Bioadhesion to Targeting

    PubMed Central

    Clementino, Adryana; Buttini, Francesca; Colombo, Gaia; Pescina, Silvia; Stanisçuaski Guterres, Silvia; Nicoli, Sara

    2018-01-01

    In the field of nasal drug delivery, nose-to-brain delivery is among the most fascinating applications, directly targeting the central nervous system, bypassing the blood brain barrier. Its benefits include dose lowering and direct brain distribution of potent drugs, ultimately reducing systemic side effects. Recently, nasal administration of insulin showed promising results in clinical trials for the treatment of Alzheimer’s disease. Nanomedicines could further contribute to making nose-to-brain delivery a reality. While not disregarding the need for devices enabling a formulation deposition in the nose’s upper part, surface modification of nanomedicines appears the key strategy to optimize drug delivery from the nasal cavity to the brain. In this review, nanomedicine delivery based on particle engineering exploiting surface electrostatic charges, mucoadhesive polymers, or chemical moieties targeting the nasal epithelium will be discussed and critically evaluated in relation to nose-to-brain delivery. PMID:29543755

  15. Intrinsic imperfection of self-differencing single-photon detectors harms the security of high-speed quantum cryptography systems

    NASA Astrophysics Data System (ADS)

    Jiang, Mu-Sheng; Sun, Shi-Hai; Tang, Guang-Zhao; Ma, Xiang-Chun; Li, Chun-Yan; Liang, Lin-Mei

    2013-12-01

    Thanks to the high-speed self-differencing single-photon detector (SD-SPD), the secret key rate of quantum key distribution (QKD), which can, in principle, offer unconditionally secure private communications between two users (Alice and Bob), can exceed 1 Mbit/s. However, the SD-SPD may contain loopholes, which can be exploited by an eavesdropper (Eve) to break the unconditional security of high-speed QKD systems. In this paper, we show that the SD-SPD can be remotely controlled by Eve in order to spy on full information without being discovered, and we demonstrate proof-of-principle experiments. We point out that this loophole is introduced directly by the operating principle of the SD-SPD and thus cannot be removed unless active countermeasures are applied by the legitimate parties.

  16. Finite size effects in epidemic spreading: the problem of overpopulated systems

    NASA Astrophysics Data System (ADS)

    Ganczarek, Wojciech

    2013-12-01

    In this paper we analyze the impact of network size on the dynamics of epidemic spreading. In particular, we investigate the pace of infection in overpopulated systems. In order to do that, we design a model for epidemic spreading on a finite complex network with a restriction to at most one contamination per time step, which can serve as a model for sexually transmitted diseases spreading in some student communes. Because of the highly discrete character of the process, the analysis cannot use the continuous approximation widely exploited for most models. Using a discrete approach, we investigate the epidemic threshold and the quasi-stationary distribution. The main results are two theorems about the mixing time for the process: it scales like the logarithm of the network size and it is proportional to the inverse of the distance from the epidemic threshold.
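The model's key restriction, at most one contamination per time step, can be sketched in a few lines (population size, transmission probability, and step count below are illustrative assumptions, not the paper's values): drawing a single random contact per discrete step on a fully connected population means the infected count can grow by at most one per step, which is what makes the process highly discrete in overpopulated systems.

```python
import random

def si_one_per_step(n, beta, steps, seed=0):
    """SI epidemic on a complete graph: one random contact per time step,
    hence at most one new infection per step (the discreteness cap)."""
    rng = random.Random(seed)
    infected = [False] * n
    infected[0] = True                  # a single initial case
    counts = [1]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if infected[i] and not infected[j] and rng.random() < beta:
            infected[j] = True          # the one allowed contamination
        counts.append(sum(infected))
    return counts

counts = si_one_per_step(n=50, beta=0.8, steps=20000)
```

Because the increment per step is capped at one, the continuous (mean-field) approximation used for most epidemic models breaks down here, which is why the abstract's analysis proceeds with a discrete approach instead.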

  17. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    PubMed

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".
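The normalization step can be sketched as per-dimension quantile matching, a common way to implement histogram equalization of feature vectors (this minimal NumPy version is illustrative, not the authors' implementation): each noisy feature value is replaced by the clean-reference value at the same empirical rank, so all moments of the noisy distribution are pulled toward the clean one.

```python
import numpy as np

def histogram_equalize(noisy, clean_ref):
    """Map each feature dimension of `noisy` onto the empirical distribution
    of `clean_ref` by quantile (CDF) matching."""
    n = noisy.shape[0]
    out = np.empty_like(noisy, dtype=float)
    for d in range(noisy.shape[1]):
        # empirical rank of each noisy value in (0, 1)
        ranks = (np.argsort(np.argsort(noisy[:, d])) + 0.5) / n
        # look up the clean-reference quantile at that rank
        out[:, d] = np.quantile(clean_ref[:, d], ranks)
    return out

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, (1000, 2))               # "clean-condition" features
noisy = 3.0 * rng.normal(0.0, 1.0, (1000, 2)) + 5.0   # shifted/scaled mismatch
equalized = histogram_equalize(noisy, clean)
```

After equalization the features carry the clean reference's statistics (here, roughly zero mean and unit variance), which is precisely the moment-normalizing effect the abstract attributes to the method.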

  18. Enabling three-dimensional densitometric measurements using laboratory source X-ray micro-computed tomography

    NASA Astrophysics Data System (ADS)

    Pankhurst, M. J.; Fowler, R.; Courtois, L.; Nonni, S.; Zuddas, F.; Atwood, R. C.; Davis, G. R.; Lee, P. D.

    2018-01-01

    We present new software allowing significantly improved quantitative mapping of the three-dimensional density distribution of objects using laboratory-source polychromatic X-rays via a beam-characterisation approach (cf. filtering or comparison to phantoms). One key advantage is that a precise representation of the specimen material is not required. The method exploits well-established, widely available, non-destructive and increasingly accessible laboratory-source X-ray tomography. Beam characterisation is performed in two stages: (1) projection data are collected through a range of known materials utilising a novel hardware design integrated into the rotation stage; and (2) a Python code optimises a spectral response model of the system. We provide hardware designs for use with a tiltable rotation stage, yet the concept is easily adaptable to virtually any laboratory system and sample, and implicitly corrects the image artefact known as beam hardening.

  19. INTERPOLATING VANCOUVER'S DAILY AMBIENT PM 10 FIELD

    EPA Science Inventory

    In this article we develop a spatial predictive distribution for the ambient space- time response field of daily ambient PM10 in Vancouver, Canada. Observed responses have a consistent temporal pattern from one monitoring site to the next. We exploit this feature of the field b...

  20. Diminishing incidence of Internet child pornographic images.

    PubMed

    Bagley, Christopher

    2003-08-01

    Indecent images of children posted to web sites and newsgroups over a 4-yr. period were sampled. A significant decline in the number of such images posted was observed, probably accounted for by the pressure of groups opposed to the distribution of such exploitive material.

  1. Eradication and control of livestock ticks: biological, economic and social perspectives.

    PubMed

    Walker, Alan R

    2011-07-01

    Comparisons of successful and failed attempts to eradicate livestock ticks reveal that the social context of farming and the management of the campaigns have greater influence than the techniques of treatment. The biology of ticks is considered principally where it has contributed to the control of ticks as practiced on farms. The timing of treatments by life cycle and season can be exploited to reduce the number of treatments per year. Pastures can be managed to starve and desiccate vulnerable larvae questing on vegetation. Immunity to ticks acquired by hosts can be enhanced by livestock breeding. The aggregated distribution of ticks on hosts with poor immunity can be used to select animals for removal from the herd. Models of tick population dynamics required for predicting the outcomes of control methods need a better understanding of the drivers of distribution, aggregation, stability, and density-dependent mortality. Changing social circumstances, especially of land use, have an influence on exposure to tick-borne pathogens that can be exploited for disease control.

  2. Aerial Surveys Give New Estimates for Orangutans in Sabah, Malaysia

    PubMed Central

    Gimenez, Olivier; Ambu, Laurentius; Ancrenaz, Karine; Andau, Patrick; Goossens, Benoît; Payne, John; Sawang, Azri; Tuuga, Augustine; Lackman-Ancrenaz, Isabelle

    2005-01-01

    Great apes are threatened with extinction, but precise information about the distribution and size of most populations is currently lacking. We conducted orangutan nest counts in the Malaysian state of Sabah (North Borneo), using a combination of ground and helicopter surveys, and provided a way to estimate the current distribution and size of the populations living throughout the entire state. We show that the number of nests detected during aerial surveys is directly related to the estimated true animal density and that a helicopter is an efficient tool to provide robust estimates of orangutan numbers. Our results reveal that with a total estimated population size of about 11,000 individuals, Sabah is one of the main strongholds for orangutans in North Borneo. More than 60% of orangutans living in the state occur outside protected areas, in production forests that have been through several rounds of logging extraction and are still exploited for timber. The role of exploited forests clearly merits further investigation for orangutan conservation in Sabah. PMID:15630475

  3. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structure of the rule base for purposes of analysis. Techniques such as Petri nets and DAGs have been proposed as representational structures that allow complete analysis. Much has been made of proving isomorphisms between the rule bases and these mechanisms, and of examining the theoretical power of this analysis. In this paper we describe some early work on a new system with much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and an FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.

  4. High-speed single-photon signaling for daytime QKD

    NASA Astrophysics Data System (ADS)

    Bienfang, Joshua; Restelli, Alessandro; Clark, Charles

    2011-03-01

    The distribution of quantum-generated cryptographic key at high throughputs can be critically limited by the performance of the systems' single-photon detectors. While noise and afterpulsing are considerations for all single-photon QKD systems, high-transmission-rate systems also have critical detector timing-resolution and recovery-time requirements. We present experimental results exploiting the high timing resolution and count-rate stability of modified single-photon avalanche diodes (SPADs) in our GHz QKD system operating over a 1.5 km free-space link that demonstrate the ability to apply extremely short temporal gates, enabling daytime free-space QKD with a 4% QBER. We also discuss recent advances in gating techniques for InGaAs SPADs that are suitable for high-speed fiber-based QKD. We present afterpulse-probability measurements that demonstrate the ability to support single-photon count rates above 100 MHz with low afterpulse probability. These results will benefit the design and characterization of free-space and fiber QKD systems. A. Restelli, J. C. Bienfang, A. Mink, and C. W. Clark, IEEE J. Sel. Topics Quantum Electron. 16, 1084 (2010).
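The benefit of short temporal gates can be seen with back-of-the-envelope arithmetic (all numbers below are illustrative assumptions, not the measured values from this work): signal counts cluster within the detector's Gaussian timing jitter while background counts arrive uniformly in time, so narrowing the gate rejects background much faster than it loses signal.

```python
import math

def qber(gate_ps, jitter_ps=60.0, signal_per_gate=1.0, noise_per_ns=0.2):
    """Error rate vs. gate width (illustrative model): signal counts follow a
    Gaussian timing-jitter profile; background counts are uniform in time."""
    half = gate_ps / 2.0
    # fraction of the Gaussian-jittered signal falling inside the gate
    signal = signal_per_gate * math.erf(half / (jitter_ps * math.sqrt(2.0)))
    # background counts admitted by the gate
    noise = noise_per_ns * gate_ps / 1000.0
    # half of the background counts land in the wrong detection basis
    return 0.5 * noise / (signal + noise)

wide, narrow = qber(1000.0), qber(100.0)   # 1 ns gate vs. 100 ps gate
```

Under these assumed numbers the narrow gate yields a markedly lower QBER despite clipping some signal, which is the mechanism that makes daytime free-space operation feasible.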

  5. Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)

    NASA Astrophysics Data System (ADS)

    Raskovic, Dejan

    Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.

  6. Laser micromachining of biofactory-on-a-chip devices

    NASA Astrophysics Data System (ADS)

    Burt, Julian P.; Goater, Andrew D.; Hayden, Christopher J.; Tame, John A.

    2002-06-01

    Excimer laser micromachining provides a flexible means for the manufacture and rapid prototyping of miniaturized systems such as Biofactory-on-a-Chip devices. Biofactories are miniaturized diagnostic devices capable of characterizing, manipulating, separating and sorting suspensions of particles such as biological cells. Such systems operate by exploiting the electrical properties of microparticles and controlling particle movement in non-uniform stationary and moving AC electric fields. Applications of Biofactory devices are diverse and include, among others, the healthcare, pharmaceutical, chemical processing, environmental monitoring and food diagnostic markets. To achieve such characterization and separation, Biofactory devices employ laboratory-on-a-chip type components such as complex multilayer microelectrode arrays, microfluidic channels, manifold systems and on-chip detection systems. Here we discuss the manufacturing requirements of Biofactory devices and describe the use of different excimer laser micromachining methods, both as stand-alone processes and in conjunction with conventional fabrication processes such as photolithography and thermal molding. Particular attention is given to the production of large-area multilayer microelectrode arrays and the manufacture of complex cross-section microfluidic channel systems for use in simple distribution and device interfacing.

  7. A Response Surface Methodology for Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Altus, Troy David; Sobieski, Jaroslaw (Technical Monitor)

    2002-01-01

    The report describes a new method for the optimization of engineering systems, such as aerospace vehicles, whose design must harmonize a number of subsystems and various physical phenomena, each represented by a separate computer code, e.g., aerodynamics, structures, propulsion, performance, etc. To represent the system's internal couplings, the codes receive output from other codes as part of their inputs. The system analysis and optimization task is decomposed into subtasks that can be executed concurrently, each conducted using local state and design variables while holding constant a set of the system-level design variables. The subtask results are stored in the form of Response Surfaces (RS) fitted in the space of the system-level variables, to be used as subtask surrogates in a system-level optimization whose purpose is to optimize the system objective(s) and to reconcile the system's internal couplings. By virtue of decomposition and execution concurrency, the method enables a broad work front in the organization of an engineering project involving a number of specialty groups that might be geographically dispersed, and it exploits the contemporary computing technology of massively concurrent and distributed processing. The report includes a demonstration test case of a supersonic business jet design.
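In its simplest one-variable form, the surrogate idea reduces to fitting a cheap polynomial response surface to a few expensive subtask evaluations and optimizing the polynomial instead of the original analysis. The quadratic stand-in objective below is an assumption for illustration only:

```python
import numpy as np

def fit_quadratic_rs(x, y):
    """Least-squares quadratic response surface y ~ c0 + c1*x + c2*x^2 (1-D sketch)."""
    A = np.vander(x, 3, increasing=True)      # design matrix columns: 1, x, x^2
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Stand-in for an expensive subtask analysis, sampled at a few design points.
xs = np.linspace(-2.0, 2.0, 9)
ys = (xs - 0.7) ** 2 + 1.0
c0, c1, c2 = fit_quadratic_rs(xs, ys)
x_opt = -c1 / (2.0 * c2)                      # minimize the cheap surrogate instead
```

In BLISS the same pattern is applied per subtask over the system-level variables, so the expensive disciplinary codes are queried only when building or refreshing the response surfaces, and the system-level optimizer works entirely on the surrogates.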

  8. Biodiversity loss in seagrass meadows due to local invertebrate fisheries and harbour activities

    NASA Astrophysics Data System (ADS)

    Nordlund, Lina Mtwana; Gullström, Martin

    2013-12-01

    Seagrass meadows provide a wide variety of ecosystem services, but their distribution and health are adversely affected by man. In the present study, we examined the influence of coastal exploitation, in terms of invertebrate harvesting and harbour activity, on invertebrate community composition in subtropical seagrass meadows at Inhaca Island, Mozambique, in the Western Indian Ocean. There was a fivefold higher invertebrate density and biomass, and clearly higher invertebrate species richness, in the protected (control) site compared to the two exploited sites. The clear differences between the protected and exploited sites probably resulted from (1) the directed removal of large edible or saleable invertebrates (mostly molluscs) and the absence of boat traffic in the harvested site, and (2) harbour activities. Invertebrate community composition also differed (although less clearly) between the two exploited sites, likely due to inherent differences in the type of disturbance. Our findings revealed that protection of seagrass habitat is necessary and that disturbances of different origin might require different forms of management and conservation. Designing protected areas is, however, a complex process due to competition for use and space with activities such as invertebrate harvesting and harbours.

  9. Collaborative real-time motion video analysis by human observer and image exploitation algorithms

    NASA Astrophysics Data System (ADS)

    Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2015-05-01

    Motion video analysis is a challenging task, especially in real-time applications. In most safety and security critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into the current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention revealed by the eye gaze direction for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Beside the system design, a first pilot study is presented, where we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy to use interaction technique when performing selection operations on moving targets in videos in order to initialize an object tracking function.

  10. What does it take to build a medium scale scientific cloud to process significant amounts of Earth observation data?

    NASA Astrophysics Data System (ADS)

    Hollstein, André; Diedrich, Hannes; Spengler, Daniel

    2017-04-01

    The installment of the operational fleet of Sentinels by Copernicus offers an unprecedented influx of freely available Earth Observation data, with Sentinel-2 being a prime example. It offers a broad range of land applications due to its high spatial sampling from 10 m to 20 m and its multi-spectral imaging capabilities with 13 spectral bands. The open access policy allows unrestricted use by everybody and provides data downloads on the respective sites. For a small area of interest and shorter time series, data processing and exploitation can easily be done manually. However, for multi-temporal analysis of larger areas, the data volume can quickly grow beyond what is manageable in practice on a personal computer, which leads to an increasing interest in central data exploitation platforms. Prominent examples are Google Earth Engine, NASA Earth Exchange (NEX), and current developments such as CODE-DE in Germany. Open standards are still evolving, and the choice of a platform may create lock-in scenarios and a situation where scientists are no longer in full control of all aspects of their analysis. Securing the intellectual property of researchers can become a major issue in the future. Partnering with a startup company dedicated to providing tools for farm management and precision farming, GFZ is building a small-scale science cloud named GTS2 for the processing and distribution of Sentinel-2 data. The service includes a sophisticated atmospheric correction algorithm, spatial co-registration of time-series data, as well as a web API for data distribution. This approach counters the pull toward centralized research on infrastructures controlled by others; by keeping full licensing rights, it allows new business models to be developed independently of the initially chosen processing provider. Currently, data is held for the greater German area but is extendable to larger areas on short notice thanks to a scalable distributed network file system.
For a given area of interest, band and time range selection, the API returns only the data that was requested in a fast manner and thereby saves storage space on the user's machine. A jupyterhub instance is a main tool for data exploitation by our users. Nearly all used software is open source, is based on open standards, and allows to transfer software to other infrastructures. In the talk, we give an overview of the current status of the project and the service, but also want to share our experience with its development.

  11. Universal Hitting Time Statistics for Integrable Flows

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.; Marklof, Jens; Strömbergsson, Andreas

    2017-02-01

    The perceived randomness in the time evolution of "chaotic" dynamical systems can be characterized by universal probabilistic limit laws, which do not depend on the fine features of the individual system. One important example is the Poisson law for the times at which a particle with random initial data hits a small set. This was proved in various settings for dynamical systems with strong mixing properties. The key result of the present study is that, despite the absence of mixing, the hitting times of integrable flows also satisfy universal limit laws which are, however, not Poisson. We describe the limit distributions for "generic" integrable flows and a natural class of target sets, and illustrate our findings with two examples: the dynamics in central force fields and ellipse billiards. The convergence of the hitting time process follows from a new equidistribution theorem in the space of lattices, which is of independent interest. Its proof exploits Ratner's measure classification theorem for unipotent flows, and extends earlier work of Elkies and McMullen.

  12. IT behind a platform for Translational Cancer Research - concept and objectives.

    PubMed

    Steffens, Michael; Husmann, Gabriele; Koca, Mithat; Lablans, Martin; Komor, Martina; Zeissig, Sylke; Emrich, Katharina; Brandts, Christian; Serve, Hubert; Blettner, Maria; Uckert, Frank

    2012-01-01

    The German Consortium for Translational Cancer Research (DKTK) and the Rhine-Main Translational Cancer Research Network (RM-TCRN) are designed to exploit large population cohorts of cancer patients for the purposes of bio-banking, clinical trials, and clinical cancer registration. Hence, the success of these platforms depends heavily on the close interlinking of clinical data from cancer patients, information from study registries, and data from the bio-banking systems of different laboratories and scientific institutions. This article, which accompanies the poster, discusses the main challenges of the platforms from an information technology point of view, addresses legal and data security issues, and outlines an integrative IT concept based on a decentralized, distributed search approach in which data management and search comply with existing legislative rules.

  13. Automatic mission planning algorithms for aerial collection of imaging-specific tasks

    NASA Astrophysics Data System (ADS)

    Sponagle, Paul; Salvaggio, Carl

    2017-05-01

    The rapid advancement and availability of small unmanned aircraft systems (sUAS) has led to many novel exploitation tasks that utilize this unique aerial imagery data. Collection of this unique data requires novel flight planning to accomplish the task at hand. This work describes novel flight planning to better support structure-from-motion missions by minimizing occlusions, autonomous and periodic overflight of reflectance calibration panels to permit more efficient and accurate data collection under varying illumination conditions, and the collection of imagery data to study optical properties such as the bidirectional reflectance distribution function without disturbing the target in sensitive or remote areas of interest. These novel mission planning algorithms will provide scientists with additional tools to meet their future data collection needs.

  14. A Hybrid P2P Overlay Network for Non-strictly Hierarchically Categorized Content

    NASA Astrophysics Data System (ADS)

    Wan, Yi; Asaka, Takuya; Takahashi, Tatsuro

    In P2P content distribution systems, there are many cases in which the content can be classified into hierarchically organized categories. In this paper, we propose a hybrid overlay network design suitable for such content, called Pastry/NSHCC (Pastry for Non-Strictly Hierarchically Categorized Content). The semantic information of the content's classification hierarchies can be utilized regardless of whether they form a strict tree structure. By doing so, the search scope can be restricted to any granularity, and the number of query messages decreases while keyword-search availability is maintained. Through simulation, we show that the proposed method provides better performance and lower overhead than unstructured overlays exploiting the same semantic information.
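The scope-restriction idea described above can be sketched independently of the overlay itself: a query carries a category-path prefix, and only items whose hierarchical category path starts with that prefix fall in scope. The sketch below is illustrative only (the catalog and identifiers are hypothetical); Pastry/NSHCC would additionally map such prefixes onto overlay key ranges.

```python
def in_search_scope(item_category, query_prefix):
    """An item is in scope if its hierarchical category path starts
    with the query's category prefix (any granularity)."""
    return item_category[:len(query_prefix)] == query_prefix

# Hypothetical catalog; categories need not form a strict tree.
catalog = {
    "song1": ("music", "rock", "90s"),
    "song2": ("music", "jazz"),
    "clip1": ("video", "news"),
}

# A query scoped to the whole "music" branch:
hits = [k for k, c in catalog.items() if in_search_scope(c, ("music",))]
# hits == ["song1", "song2"]
```

A longer prefix such as `("music", "rock")` narrows the scope further, which is the granularity control the abstract refers to.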

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chao; Pouransari, Hadi; Rajamanickam, Sivasankaran

    We present a parallel hierarchical solver for general sparse linear systems on distributed-memory machines. For large-scale problems, this fully algebraic algorithm is faster and more memory-efficient than sparse direct solvers because it exploits the low-rank structure of fill-in blocks. Depending on the accuracy of low-rank approximations, the hierarchical solver can be used either as a direct solver or as a preconditioner. The parallel algorithm is based on data decomposition and requires only local communication for updating boundary data on every processor. Moreover, the computation-to-communication ratio of the parallel algorithm is approximately the volume-to-surface-area ratio of the subdomain owned by every processor. We also provide various numerical results to demonstrate the versatility and scalability of the parallel algorithm.
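The low-rank compression that makes such hierarchical solvers memory-efficient can be illustrated in isolation. The sketch below (illustrative only, not the solver's actual kernel) compresses an interaction block between two well-separated point clusters with a truncated SVD, a standard way to approximate numerically low-rank fill-in blocks.

```python
import numpy as np

def low_rank_approx(block, tol=1e-10):
    """Compress a dense block with a truncated SVD, keeping singular
    values above tol * (largest singular value). Returns U, V with
    block ~= U @ V, stored in far less memory when the rank is low."""
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    k = max(1, int(np.sum(s > tol * s[0])))
    return U[:, :k] * s[:k], Vt[:k, :]

# A smooth kernel evaluated between two well-separated clusters of
# points gives a numerically low-rank block (the typical fill-in case).
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(10.0, 11.0, 50)
block = 1.0 / np.abs(x[:, None] - y[None, :])

U, V = low_rank_approx(block)
err = np.linalg.norm(block - U @ V) / np.linalg.norm(block)
# U.shape[1] is far below 50, while err stays tiny
```

Tightening or loosening `tol` is what moves the solver between direct-solver accuracy and cheaper preconditioner accuracy.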

  16. An exploitation-competition system with negative effect of prey on its predator.

    PubMed

    Wang, Yuanshi

    2015-05-01

    This paper considers an exploitation-competition system in which exploitation is the dominant interaction when the prey is at low density, while competition is dominant when the prey is at high density due to its negative effect on the predator. The two-species system is characterized by differential equations that combine the Lotka-Volterra competitive and predator-prey models. Global dynamics of the model demonstrate some basic properties of exploitation-competition systems: (i) When the growth rate of the prey is extremely small, the prey cannot promote the growth of the predator. (ii) When the growth rate is small, an obligate predator can survive by preying on the prey, while a facultative predator can reach a high density through predation. (iii) When the growth rate is intermediate, the predator can reach its maximal density with intermediate predation. (iv) When the growth rate is large, the predator can persist only if it has a large density and its predation on the prey is strong. (v) Intermediate predation is beneficial to the predator within a certain parameter range, while over- or under-predation is not. Extremely strong or weak predation would lead to extinction of species. Numerical simulations confirm and extend our results. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Special purpose parallel computer architecture for real-time control and simulation in robotic applications

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Bejczy, Antal K. (Inventor)

    1993-01-01

    This is a real-time robotic controller and simulator (RRCS) with a MIMD-SIMD parallel architecture for interfacing with an external host computer, providing a high degree of parallelism in computations for robotic control and simulation. It includes a host processor for receiving instructions from the external host computer and for transmitting answers back to it. There is a plurality of SIMD microprocessors, each being a SIMD parallel processor capable of exploiting fine-grain parallelism and also able to operate asynchronously to form a MIMD architecture. Each SIMD processor comprises a SIMD architecture capable of performing two matrix-vector operations in parallel while fully exploiting the parallelism in each operation. A system bus connects the host processor to the plurality of SIMD microprocessors, and a common clock provides a continuous sequence of clock pulses. A ring structure interconnects the SIMD microprocessors and is connected to the clock, providing the clock pulses to the SIMD microprocessors and a path for the flow of data and instructions between them. The host processor includes logic for controlling the RRCS by interpreting instructions sent by the external host computer, decomposing the instructions into a series of computations to be performed by the SIMD microprocessors, using the system bus to distribute associated data among the SIMD microprocessors, and initiating activity of the SIMD microprocessors to perform the computations on the data by procedure call.

  18. New Quantum Key Distribution Scheme Based on Random Hybrid Quantum Channel with EPR Pairs and GHZ States

    NASA Astrophysics Data System (ADS)

    Yan, Xing-Yu; Gong, Li-Hua; Chen, Hua-Ying; Zhou, Nan-Run

    2018-05-01

    A theoretical quantum key distribution scheme based on a random hybrid quantum channel with EPR pairs and GHZ states is devised. In this scheme, EPR pairs and tripartite GHZ states are exploited to set up the random hybrid quantum channel. Only one photon in each entangled state needs to travel back and forth in the channel. The security of the quantum key distribution scheme is guaranteed by more than one round of eavesdropping-check procedures. The scheme is of high capacity, since one particle can carry more than two bits of information via quantum dense coding.

  19. Ensemble of electrophoretically captured gold nanoparticles as a fingerprint of Boltzmann velocity distribution

    NASA Astrophysics Data System (ADS)

    Hong, S. H.; Kang, M. G.; Lim, J. H.; Hwang, S. W.

    2008-07-01

    An ensemble of electrophoretically captured gold nanoparticles is exploited to fingerprint their velocity distribution in solution. The electrophoretic capture is performed using a dc-biased nanogap electrode, and panoramic scanning electron microscope images are inspected to obtain the regional density of the captured gold nanoparticles. The regional density profile along the surface of the electrode is in quantitative agreement with the calculated density of the captured nanoparticles. The calculated density is obtained by counting, in the Boltzmann distribution, the number of nanoparticles whose thermal velocity is smaller than the electrophoretic velocity.
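The counting step at the end of the abstract can be mimicked with a small Monte Carlo sketch: sample Maxwell-Boltzmann speeds for a gold nanosphere and count the fraction below a threshold electrophoretic speed. The particle size, temperature, and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not values from the paper):
kB = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                    # temperature, K
rho_gold = 19300.0           # gold density, kg/m^3
r = 10e-9                    # particle radius, m
m = rho_gold * (4.0 / 3.0) * np.pi * r**3   # particle mass, kg

# Each velocity component is Gaussian with std sqrt(kB*T/m); the
# resulting speed then follows the Maxwell-Boltzmann distribution.
sigma = np.sqrt(kB * T / m)
v = rng.normal(0.0, sigma, size=(100000, 3))
speed = np.linalg.norm(v, axis=1)

# Fraction of particles slow enough to be captured when the
# electrophoretic speed equals the mean thermal speed (hypothetical).
v_e = speed.mean()
captured_fraction = float(np.mean(speed < v_e))
```

Sweeping `v_e` (e.g. by varying the dc bias) and recording `captured_fraction` is, in spirit, how the captured-density profile fingerprints the underlying velocity distribution.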

  20. France's State of the Art Distributed Optical Fibre Sensors Qualified for the Monitoring of the French Underground Repository for High Level and Intermediate Level Long Lived Radioactive Wastes.

    PubMed

    Delepine-Lesoille, Sylvie; Girard, Sylvain; Landolt, Marcel; Bertrand, Johan; Planes, Isabelle; Boukenter, Aziz; Marin, Emmanuel; Humbert, Georges; Leparmentier, Stéphanie; Auguste, Jean-Louis; Ouerdane, Youcef

    2017-06-13

    This paper presents the state-of-the-art distributed sensing systems, based on optical fibres, developed and qualified for the French Cigéo project, the underground repository for high-level and intermediate-level long-lived radioactive wastes. Four main parameters, namely strain, temperature, radiation and hydrogen concentration, are currently investigated by optical fibre sensors, as are the tolerances of the selected technologies to the unique constraints of Cigéo's severe environment. Using fluorine-doped silica optical fibre surrounded by a carbon layer and polyimide coating, it is possible to exploit its Raman, Brillouin and Rayleigh scattering signatures to achieve distributed sensing of the temperature and the strain inside the repository cells of radioactive wastes. Regarding dose measurement, promising solutions are proposed based on the Radiation Induced Attenuation (RIA) responses of sensitive fibres such as P-doped ones. For hydrogen measurements, the potential of specialty optical fibres with Pd particles embedded in their silica matrix is currently being studied, monitoring this gas through its impact on the evolution of the fibre's Brillouin signature.

  2. The influence of climate change on the global distribution and fate processes of anthropogenic persistent organic pollutants.

    PubMed

    Kallenborn, Roland; Halsall, Crispin; Dellong, Maud; Carlsson, Pernilla

    2012-11-01

    The effect of climate change on the global distribution and fate of persistent organic pollutants (POPs) is of growing interest to scientists and policy makers alike. The impact of warmer temperatures, and of the resulting changes to earth system processes, on chemical fate is, however, unclear, although a growing number of studies are beginning to examine these impacts and changes in a quantitative way. In this review, we examine broad areas where changes are occurring, or are likely to occur, with regard to the environmental cycling and fate of chemical contaminants. For this purpose we examine scientific information from long-term monitoring data, with particular emphasis on the Arctic, to show apparent changes in chemical patterns and behaviour. In addition, we examine evidence of changing chemical processes in a number of environmental compartments, as well as indirect effects of climate change on contaminant emissions and behaviour. We also recommend areas of research to address knowledge gaps. In general, our findings indicate that the indirect consequences of climate change (i.e. shifts in agriculture, resource exploitation opportunities, etc.) will have a more marked impact on contaminant distribution and fate than direct climate change.

  3. Models@Home: distributed computing in bioinformatics using a screensaver based approach.

    PubMed

    Krieger, Elmar; Vriend, Gert

    2002-02-01

    Due to the steadily growing computational demands in bioinformatics and related scientific disciplines, one is forced to make optimal use of the available resources. A straightforward solution is to build a network of idle computers and let each of them work on a small piece of a scientific challenge, as done by Seti@Home (http://setiathome.berkeley.edu), the world's largest distributed computing project. We developed a generally applicable distributed computing solution that uses a screensaver system similar to Seti@Home. The software exploits the coarse-grained nature of typical bioinformatics projects. Three major considerations for the design were: (1) often, many different programs are needed, while the time is lacking to parallelize them. Models@Home can run any program in parallel without modifications to the source code; (2) in contrast to the Seti project, bioinformatics applications are normally more sensitive to lost jobs. Models@Home therefore includes stringent control over job scheduling; (3) to allow use in heterogeneous environments, Linux and Windows based workstations can be combined with dedicated PCs to build a homogeneous cluster. We present three practical applications of Models@Home, running the modeling programs WHAT IF and YASARA on 30 PCs: force field parameterization, molecular dynamics docking, and database maintenance.
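The "stringent control over job scheduling" mentioned in consideration (2) amounts to tracking dispatched jobs and re-queuing any whose client disappears. A minimal sketch of that pattern (not the Models@Home code; the class name and timeout are hypothetical) might look like:

```python
import time

class JobScheduler:
    """Tracks dispatched jobs and re-queues any whose client has not
    reported back within a timeout, so lost jobs are never dropped."""

    def __init__(self, jobs, timeout=60.0):
        self.pending = list(jobs)   # not yet dispatched
        self.in_flight = {}         # job -> dispatch time
        self.done = {}              # job -> result
        self.timeout = timeout

    def next_job(self, now=None):
        now = time.monotonic() if now is None else now
        # Re-queue jobs whose client vanished (e.g. screensaver stopped).
        for job, t0 in list(self.in_flight.items()):
            if now - t0 > self.timeout:
                del self.in_flight[job]
                self.pending.append(job)
        if not self.pending:
            return None
        job = self.pending.pop(0)
        self.in_flight[job] = now
        return job

    def submit(self, job, result):
        self.in_flight.pop(job, None)
        self.done[job] = result

# Driving the scheduler with an explicit clock:
sched = JobScheduler(["a", "b"], timeout=1.0)
j1 = sched.next_job(now=0.0)   # dispatches "a"
j2 = sched.next_job(now=0.0)   # dispatches "b"
# the client running "a" disappears; past the timeout it is re-queued
j3 = sched.next_job(now=2.0)   # dispatches "a" again
```

Because each program runs as an unmodified external process, the scheduler only needs job identity and timing, which is what makes the approach applicable to arbitrary coarse-grained workloads.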

  4. Contextual classification of multispectral image data: An unbiased estimator for the context distribution

    NASA Technical Reports Server (NTRS)

    Tilton, J. C.; Swain, P. H. (Principal Investigator); Vardeman, S. B.

    1981-01-01

    A key input to a statistical classification algorithm, which exploits the tendency of certain ground cover classes to occur more frequently in some spatial context than in others, is a statistical characterization of the context: the context distribution. An unbiased estimator of the context distribution is discussed which, besides having the advantage of statistical unbiasedness, has the additional advantage over other estimation techniques of being amenable to an adaptive implementation in which the context distribution estimate varies according to local contextual information. Results from applying the unbiased estimator to the contextual classification of three real LANDSAT data sets are presented and contrasted with results from non-contextual classifications and from contextual classifications utilizing other context distribution estimation techniques.

  5. ISTIMES Integrated System for Transport Infrastructures Surveillance and Monitoring by Electromagnetic Sensing

    NASA Astrophysics Data System (ADS)

    Argenti, M.; Giannini, V.; Averty, R.; Bigagli, L.; Dumoulin, J.

    2012-04-01

    The EC FP7 ISTIMES project has the goal of realizing an ICT-based system exploiting distributed and local sensors for non destructive electromagnetic monitoring in order to make critical transport infrastructures more reliable and safe. Higher situation awareness thanks to real time and detailed information and images of the controlled infrastructure status allows improving decision capabilities for emergency management stakeholders. Web-enabled sensors and a service-oriented approach are used as core of the architecture providing a system that adopts open standards (e.g. OGC SWE, OGC CSW etc.) and makes efforts to achieve full interoperability with other GMES and European Spatial Data Infrastructure initiatives as well as compliance with INSPIRE. The system exploits an open easily scalable network architecture to accommodate a wide range of sensors integrated with a set of tools for handling, analyzing and processing large data volumes from different organizations with different data models. Situation Awareness tools are also integrated in the system. Definition of sensor observations and services follows a metadata model based on the ISO 19115 Core set of metadata elements and the O&M model of OGC SWE. The ISTIMES infrastructure is based on an e-Infrastructure for geospatial data sharing, with a Data Catalog that implements the discovery services for sensor data retrieval, acting as a broker through static connections based on standard SOS and WNS interfaces; a Decision Support component which helps decision makers providing support for data fusion and inference and generation of situation indexes; a Presentation component which implements system-users interaction services for information publication and rendering, by means of a WEB Portal using SOA design principles; a security framework using Shibboleth open source middleware based on the Security Assertion Markup Language supporting Single Sign On (SSO).
ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663

  6. Mobile agent location in distributed environments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Argyropoulos, I. P.

    2012-12-01

    An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they can be capable of both behaving autonomously and showing an elementary decision ability regarding movement and some specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Also, agents are used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id to discriminate itself from other agents and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for the purpose of position location of mobile agents. The most basic of all employs a fixed computing node, which acts as agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node, responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. A location method, which is investigated in this paper, attempts to exploit this model.
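The basic scheme described above, a fixed repository node that records every agent's reported position and answers location queries, can be sketched in a few lines (class and identifiers are hypothetical, for illustration only):

```python
class PositionRepository:
    """Fixed node acting as an agent-position repository: agents report
    each migration, and queries return the last reported node."""

    def __init__(self):
        self._positions = {}   # agent id -> hosting node address

    def report(self, agent_id, node):
        self._positions[agent_id] = node

    def locate(self, agent_id):
        return self._positions.get(agent_id)   # None if unknown

repo = PositionRepository()
repo.report("agent-7", "node-A")
repo.report("agent-7", "node-C")   # the agent migrated
# repo.locate("agent-7") == "node-C"
```

The pair/triple model investigated in the paper generalizes this single-agent bookkeeping, so the repository would track small groups of agents rather than individual ids.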

  7. Improved understanding of the searching behavior of ant colony optimization algorithms applied to the water distribution design problem

    NASA Astrophysics Data System (ADS)

    Zecchin, A. C.; Simpson, A. R.; Maier, H. R.; Marchi, A.; Nixon, J. B.

    2012-09-01

    Evolutionary algorithms (EAs) have been applied successfully to many water resource problems, such as system design, management decision formulation, and model calibration. The performance of an EA with respect to a particular problem type is dependent on how effectively its internal operators balance the exploitation/exploration trade-off to iteratively find solutions of an increasing quality. For a given problem, different algorithms are observed to produce a variety of different final performances, but there have been surprisingly few investigations into characterizing how the different internal mechanisms alter the algorithm's searching behavior, in both the objective and decision space, to arrive at this final performance. This paper presents metrics for analyzing the searching behavior of ant colony optimization algorithms, a particular type of EA, for the optimal water distribution system design problem, which is a classical NP-hard problem in civil engineering. Using the proposed metrics, behavior is characterized in terms of three different attributes: (1) the effectiveness of the search in improving its solution quality and entering into optimal or near-optimal regions of the search space, (2) the extent to which the algorithm explores as it converges to solutions, and (3) the searching behavior with respect to the feasible and infeasible regions. A range of case studies is considered, where a number of ant colony optimization variants are applied to a selection of water distribution system optimization problems. The results demonstrate the utility of the proposed metrics to give greater insight into how the internal operators affect each algorithm's searching behavior.
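One way to quantify the exploration attribute (2) above is a decision-space diversity metric. The sketch below is an illustrative metric, not the specific metrics proposed in the paper: it measures the mean pairwise normalized Hamming distance between the discrete pipe-size choices in an iteration's population, where high values indicate broad exploration and low values indicate convergence.

```python
import numpy as np

def search_diversity(population):
    """Mean pairwise normalized Hamming distance between the decision
    vectors (e.g. pipe-size indices) of one iteration's population."""
    pop = np.asarray(population)
    n = len(pop)
    if n < 2:
        return 0.0
    dists = [np.mean(pop[i] != pop[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

# Early iteration: ants pick widely different designs (exploration);
# late iteration: the colony has nearly converged (exploitation).
early = [[0, 3, 1, 2], [2, 0, 3, 1], [1, 2, 0, 3]]
late = [[1, 2, 0, 3], [1, 2, 0, 3], [1, 2, 1, 3]]
# search_diversity(early) > search_diversity(late)
```

Tracking such a metric per iteration, alongside solution quality, gives the kind of searching-behavior trace the paper uses to compare algorithm variants.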

  8. Near real-time, on-the-move software PED using VPEF

    NASA Astrophysics Data System (ADS)

    Green, Kevin; Geyer, Chris; Burnette, Chris; Agarwal, Sanjeev; Swett, Bruce; Phan, Chung; Deterline, Diane

    2015-05-01

    The scope of the Micro-Cloud for Operational, Vehicle-Based EO-IR Reconnaissance System (MOVERS) development effort, managed by the Night Vision and Electronic Sensors Directorate (NVESD), is to develop, integrate, and demonstrate new sensor technologies and algorithms that improve improvised device/mine detection using efficient and effective exploitation and fusion of sensor data and target cues from existing and future Route Clearance Package (RCP) sensor systems. Unfortunately, the majority of forward looking Full Motion Video (FMV) and computer vision processing, exploitation, and dissemination (PED) algorithms are often developed using proprietary, incompatible software. This makes the insertion of new algorithms difficult due to the lack of standardized processing chains. In order to overcome these limitations, EOIR developed the Government off-the-shelf (GOTS) Video Processing and Exploitation Framework (VPEF) to be able to provide standardized interfaces (e.g., input/output video formats, sensor metadata, and detected objects) for exploitation software and to rapidly integrate and test computer vision algorithms. EOIR developed a vehicle-based computing framework within the MOVERS and integrated it with VPEF. VPEF was further enhanced for automated processing, detection, and publishing of detections in near real-time, thus improving the efficiency and effectiveness of RCP sensor systems.

  9. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to identify an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  10. A causal role for right frontopolar cortex in directed, but not random, exploration

    PubMed Central

    Zajkowski, Wojciech K; Kossut, Malgorzata

    2017-01-01

    The explore-exploit dilemma occurs anytime we must choose between exploring unknown options for information and exploiting known resources for reward. Previous work suggests that people use two different strategies to solve the explore-exploit dilemma: directed exploration, driven by information seeking, and random exploration, driven by decision noise. Here, we show that these two strategies rely on different neural systems. Using transcranial magnetic stimulation to inhibit the right frontopolar cortex, we were able to selectively inhibit directed exploration while leaving random exploration intact. This suggests a causal role for right frontopolar cortex in directed, but not random, exploration and that directed and random exploration rely on (at least partially) dissociable neural systems. PMID:28914605
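The two strategies can be captured in a toy choice model: directed exploration adds an information bonus that shrinks as an option becomes well sampled, while random exploration adds decision noise. This sketch is an illustrative model of the distinction, not the task used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def choose(means, counts, info_bonus=0.0, noise=0.0):
    """Pick an option: `info_bonus` implements directed exploration
    (a bonus that shrinks with how often an option was sampled), while
    `noise` implements random exploration (decision noise)."""
    value = np.asarray(means, float) + info_bonus / np.sqrt(np.asarray(counts, float))
    value = value + noise * rng.normal(size=len(value))
    return int(np.argmax(value))

means, counts = [1.0, 0.9], [100, 2]   # option 1 is the lesser-known one
choose(means, counts)                   # pure exploitation -> picks 0
choose(means, counts, info_bonus=1.0)   # directed exploration -> picks 1
choose(means, counts, noise=0.5)        # random exploration: either option
```

In this framing, the TMS result corresponds to selectively zeroing the `info_bonus` pathway while leaving the `noise` pathway intact.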

  11. Untethered magnetic millirobot for targeted drug delivery.

    PubMed

    Iacovacci, Veronica; Lucarini, Gioia; Ricotti, Leonardo; Dario, Paolo; Dupont, Pierre E; Menciassi, Arianna

    2015-01-01

    This paper reports the design and development of a novel millimeter-sized robotic system for targeted therapy. The proposed medical robot is conceived to perform therapy in relatively small diameter body canals (spine, urinary system, ovary, etc.), and to release several kinds of therapeutics, depending on the pathology to be treated. The robot is a nearly-buoyant bi-component system consisting of a carrier, in which the therapeutic agent is embedded, and a piston. The piston, by exploiting magnetic effects, docks with the carrier and compresses a drug-loaded hydrogel, thus activating the release mechanism. External magnetic fields are exploited to propel the robot towards the target region, while intermagnetic forces are exploited to trigger drug release. After designing and fabricating the robot, the system has been tested in vitro with an anticancer drug (doxorubicin) embedded in the carrier. The efficiency of the drug release mechanism has been demonstrated by both quantifying the amount of drug released and by assessing the efficacy of this therapeutic procedure on human bladder cancer cells.

  12. Exploiting Lexical Regularities in Designing Natural Language Systems.

    DTIC Science & Technology

    1988-04-01

    This paper presents the lexical component of the START Question Answering system developed at the MIT Artificial Intelligence Laboratory.

  13. [Establishment of industry promotion technology system in Chinese medicine secondary exploitation based on "component structure theory"].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Jia, Xiao-Bin

    2014-10-01

    The purpose of the secondary exploitation of Chinese medicine is to improve the quality of Chinese medicine products, enhance their core competitiveness, serve clinical practice better, and relieve patient suffering more effectively. Herbs, extraction, separation, refining, preparation and quality control are all involved in the industrial production behind the industry promotion of Chinese medicine secondary exploitation. Quality improvement and industry promotion of Chinese medicine can be realized through whole-process optimization, quality control, and overall process improvement. Based on the "component structure theory", the "multi-dimensional structure & process dynamic quality control system", and the systematic and holistic character of Chinese medicine, impacts across the whole process are discussed. A technology system for Chinese medicine industry promotion was built to provide a theoretical basis for improving the quality and efficacy of secondarily developed traditional Chinese medicine products.

  14. Optimizing Search and Ranking in Folksonomy Systems by Exploiting Context Information

    NASA Astrophysics Data System (ADS)

    Abel, Fabian; Henze, Nicola; Krause, Daniel

    Tagging systems enable users to annotate resources with freely chosen keywords. The evolving bunch of tag assignments is called folksonomy and there exist already some approaches that exploit folksonomies to improve resource retrieval. In this paper, we analyze and compare graph-based ranking algorithms: FolkRank and SocialPageRank. We enhance these algorithms by exploiting the context of tags, and evaluate the results on the GroupMe! dataset. In GroupMe!, users can organize and maintain arbitrary Web resources in self-defined groups. When users annotate resources in GroupMe!, this can be interpreted in context of a certain group. The grouping activity itself is easy for users to perform. However, it delivers valuable semantic information about resources and their context. We present GRank that uses the context information to improve and optimize the detection of relevant search results, and compare different strategies for ranking result lists in folksonomy systems.

  15. Setting Up a Sentinel 1 Based Soil Moisture - Data Assimilation System for Flash Flood Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Cenci, Luca; Pulvirenti, Luca; Boni, Giorgio; Chini, Marco; Matgen, Patrick; Gabellani, Simone; Squicciarino, Giuseppe; Pierdicca, Nazzareno

    2017-04-01

    Several studies have shown that the assimilation of satellite-derived soil moisture products (SM-DA) within hydrological modelling is able to reduce the uncertainty of discharge predictions. This can be exploited to improve early warning systems (EWS) and is thus particularly useful for flash flood risk mitigation (Cenci et al., 2016a). The objective of this research was to evaluate the potential of an advanced SM-DA system based on the assimilation of synthetic aperture radar (SAR) observations derived from Sentinel 1 (S1) acquisitions. A time-continuous, spatially distributed, physically based hydrological model was used: Continuum (Silvestro et al., 2013). The latter is currently exploited for civil protection activities in Italy, at both national and regional scale; its adoption therefore allows a better understanding of the real potential of the aforementioned SM-DA system for improving EWS. The novelty of this research lay in the use of S1-derived SM products obtained with a multitemporal retrieval algorithm (Cenci et al., 2016b), in which the correction of the vegetation effect was obtained by means of both SAR (Cosmo-SkyMed) and optical (Landsat) images. The maps were characterised by a comparatively higher spatial/lower temporal resolution (respectively, 100 m and 12 days) with respect to maps obtained from microwave sensors commonly used for such applications (e.g. the Advanced SCATterometer, ASCAT). The experiment was carried out in the period October 2014 - February 2015 in a representative Mediterranean catchment prone to flash floods: the Orba Basin (Italy). The Nudging assimilation scheme was chosen for its computational efficiency, which is particularly useful for operational applications. The impact of the assimilation was evaluated by comparing simulated and observed discharge values; in particular, the impact on higher flows was analysed.
Results were compared with those obtained by assimilating an ASCAT-derived SM product (H08) that, for hydrological applications, can be considered to have high spatial (1 km) and high temporal (36 h) resolution (Wagner et al., 2013). Findings revealed the potential of an S1-based SM-DA system for improving discharge predictions, especially of higher flows, and suggested the most appropriate pre-processing techniques to apply to S1 data before the assimilation. The comparison with H08 highlighted the importance of the temporal resolution of the observations. Results are promising, but further research is needed before the actual implementation of the aforementioned S1-based SM-DA system for operational applications. References - Cenci L., et al.: Assimilation of H-SAF Soil Moisture Products for Flash Flood Early Warning Systems. Case Study: Mediterranean Catchments, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 9(12), 5634-5646, doi:10.1109/JSTARS.2016.2598475, 2016a. - Cenci L., et al.: Satellite Soil Moisture Assimilation: Preliminary Assessment of the Sentinel 1 Potentialities, 2016 IEEE Int. Geosci. Remote Sens. Symp. (IGARSS), Beijing, 3098-3101, doi:10.1109/IGARSS.2016.7729801, 2016b. - Silvestro F., et al.: Exploiting Remote Sensing Land Surface Temperature in Distributed Hydrological Modelling: the Example of the Continuum Model, Hydrol. Earth Syst. Sci., 17(1), 39-62, doi:10.5194/hess-17-39-2013, 2013. - Wagner W., et al.: The ASCAT Soil Moisture Product: A Review of its Specifications, Validation Results, and Emerging Applications, Meteorol. Zeitschrift, 22(1), 5-33, doi:10.1127/0941-2948/2013/0399, 2013.
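    The Nudging scheme chosen above can be sketched in a few lines: whenever a satellite observation is available, the modelled soil moisture is relaxed toward it by a gain factor. The gain and the synthetic series below are illustrative assumptions, not the study's calibrated values.

```python
# Nudging assimilation sketch: relax the model state toward an observation
# whenever one is available. Gain and data values are illustrative.

def nudge(model_sm, obs_sm, gain=0.5):
    """Return the analysed soil moisture after one nudging update."""
    return model_sm + gain * (obs_sm - model_sm)

# Hypothetical daily model states; one S1-like observation on day 2 only.
model = [0.30, 0.32, 0.35, 0.40]
obs_at = {2: 0.28}

analysed = []
for day, sm in enumerate(model):
    if day in obs_at:
        sm = nudge(sm, obs_at[day])  # pulled halfway toward the observation
    analysed.append(sm)

print(analysed)
```

    The computational appeal for operational use is clear from the sketch: each update touches only the current state, with no error-covariance propagation as in Kalman-type schemes.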

  16. Stochastic Gain in Population Dynamics

    NASA Astrophysics Data System (ADS)

    Traulsen, Arne; Röhl, Torsten; Schuster, Heinz Georg

    2004-07-01

    We introduce an extension of the usual replicator dynamics to adaptive learning rates. We show that a population with a dynamic learning rate can gain an increased average payoff in transient phases and can also exploit external noise, leading the system away from the Nash equilibrium, in a resonance-like fashion. The payoff-versus-noise curve resembles the signal-to-noise ratio curve in stochastic resonance. Seen in this broad context, we introduce another mechanism that exploits fluctuations in order to improve properties of the system. Such a mechanism could be of particular interest in economic systems.
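    The replicator dynamics underlying the extension can be sketched as a discrete frequency update; the two-strategy payoff matrix and the fixed learning rate below are illustrative, and the paper's adaptive-rate and noise mechanisms are omitted.

```python
# Discrete replicator dynamics with an explicit learning rate (sketch).
# The payoff matrix is a toy anti-coordination game; values are illustrative.
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 1.0]])

def step(x, eta):
    """One replicator update of the strategy frequencies x with rate eta."""
    f = A @ x            # per-strategy payoffs
    phi = x @ f          # mean population payoff
    return x + eta * x * (f - phi)

x = np.array([0.9, 0.1])
for _ in range(200):
    x = step(x, eta=0.1)
print(x)  # approaches the interior equilibrium (2/3, 1/3)
```

    Making eta itself respond to the payoff history is the adaptive element that the paper shows can turn transient phases and external noise into extra average payoff.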

  17. Emulytics for Cyber-Enabled Physical Attack Scenarios: Interim LDRD Report of Year One Results.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clem, John; Urias, Vincent; Atkins, William Dee

    Sandia National Laboratories has funded the research and development of a new capability to interactively explore the effects of cyber exploits on the performance of physical protection systems. This informal, interim report of progress summarizes the project’s basis and year one (of two) accomplishments. It includes descriptions of confirmed cyber exploits against a representative testbed protection system and details the development of an emulytics capability to support live, virtual, and constructive experiments. This work will support stakeholders to better engineer, operate, and maintain reliable protection systems.

  18. The 1987 RIACS annual report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. 
Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.

  19. Exploiting the dynamics of S-phase tracers in developing brain: interkinetic nuclear migration for cells entering versus leaving the S-phase

    NASA Technical Reports Server (NTRS)

    Hayes, N. L.; Nowakowski, R. S.

    2000-01-01

    Two S-phase markers for in vivo studies of cell proliferation in the developing central nervous system, tritiated thymidine ((3)H-TdR) and bromodeoxyuridine (BUdR), were compared using double-labeling techniques in the developing mouse cortex at embryonic day 14 (E14). The labeling efficiencies and detectability of the two tracers were approximately equivalent, and there was no evidence of significant tracer interactions that depend on order of administration. For both tracers, the loading time needed to label an S-phase cell to detectability is estimated at <0.2 h shortly after the injection of the label, but, as the concentration of the label falls, it increases to approximately 0.65 h after about 30 min. Thereafter, cells that enter the S-phase continue to become detectably labeled for approximately 5-6 h. The approximate equivalence of these two tracers was exploited to observe directly the numbers and positions of nuclei entering (labeled with the second tracer only) and leaving (labeled with the first tracer only) the S-phase. As expected, the numbers of nuclei entering and leaving the S-phase both increased as the interval between the two injections lengthened. Also, nuclei leaving the S-phase rapidly move towards the ventricular surface during G2, but, unexpectedly, the distribution of the entering nuclei does not differ significantly from the distribution of the nuclei in the S-phase. This indicates that: (1) the extent and rate of abventricular nuclear movement during G1 is variable, such that not all nuclei traverse the entire width of the ventricular zone, and (2) interkinetic nuclear movements are minimal during S-phase. Copyright 2000 S. Karger AG, Basel.

  20. Asymptotic Normality Through Factorial Cumulants and Partition Identities

    PubMed Central

    Bobecka, Konstancja; Hitczenko, Paweł; López-Blázquez, Fernando; Rempała, Grzegorz; Wesołowski, Jacek

    2013-01-01

    In the paper we develop an approach to asymptotic normality through factorial cumulants. Factorial cumulants arise in the same manner from factorial moments as do (ordinary) cumulants from (ordinary) moments. Another tool we exploit is a new identity for ‘moments’ of partitions of numbers. The general limiting result is then used to (re-)derive asymptotic normality for several models including classical discrete distributions, occupancy problems in some generalized allocation schemes and two models related to negative multinomial distribution. PMID:24591773

  1. Security of continuous-variable quantum key distribution against general attacks.

    PubMed

    Leverrier, Anthony; García-Patrón, Raúl; Renner, Renato; Cerf, Nicolas J

    2013-01-18

    We prove the security of Gaussian continuous-variable quantum key distribution with coherent states against arbitrary attacks in the finite-size regime. In contrast to previously known proofs of principle (based on the de Finetti theorem), our result is applicable in the practically relevant finite-size regime. This is achieved using a novel proof approach, which exploits phase-space symmetries of the protocols as well as the postselection technique introduced by Christandl, Koenig, and Renner [Phys. Rev. Lett. 102, 020504 (2009)].

  2. [Current status of the knowledge on Moroccan anophelines (Diptera: Culicidae): systematic, geographical distribution and vectorial competence].

    PubMed

    Faraj, C; Ouahabi, S; Adlaoui, E; Elaouad, R

    2010-10-01

    This bibliographical study, based on published works, Ministry of Health reports, exploitation of the database of the entomological surveillance conducted within the framework of the National Malaria Control Program, and unpublished results obtained within the framework of the European project "Emerging disease in a changing European environment", summarizes and completes with new data the current knowledge on the systematics, distribution and vectorial competence of Moroccan anophelines. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  3. Robonaut's Flexible Information Technology Infrastructure

    NASA Technical Reports Server (NTRS)

    Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert

    2003-01-01

    Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-sized telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible both with COTS hardware and with flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.

  4. Computer-aided modelling and analysis of PV systems: a comparative study.

    PubMed

    Koukouvaos, Charalambos; Kandris, Dionisis; Samarakou, Maria

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, giving them a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with these kinds of problems, appropriate software tools have been developed, either as standalone products or as parts of general-purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of such software tools may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely, Simulink and LabVIEW, with regard to their application in photovoltaic systems.

  5. Computer-Aided Modelling and Analysis of PV Systems: A Comparative Study

    PubMed Central

    Koukouvaos, Charalambos

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, giving them a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with these kinds of problems, appropriate software tools have been developed, either as standalone products or as parts of general-purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of such software tools may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely, Simulink and LabVIEW, with regard to their application in photovoltaic systems. PMID:24772007

  6. The Balancing Act of Moose and Wolves.

    ERIC Educational Resources Information Center

    Haber, Gordon C.

    1980-01-01

    Discussed is the predator-prey relationship between the moose and the wolves, and the added effect of human exploitations on this relationship. Described is the moose behavior at the population and system levels resulting from the predation pressures on them by the wolves and human exploitation. (DS)

  7. 26 CFR 1.1348-3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... contract. (ii) In the case of a nonresident alien individual, earned income includes only earned income... feature length motion picture which is distributed to exhibitors by Corporation N pursuant to a... net profits derived by N from the exhibition and exploitation of the picture. A was employed by M as...

  8. 26 CFR 1.1348-3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... contract. (ii) In the case of a nonresident alien individual, earned income includes only earned income... feature length motion picture which is distributed to exhibitors by Corporation N pursuant to a... net profits derived by N from the exhibition and exploitation of the picture. A was employed by M as...

  9. 26 CFR 1.1348-3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... contract. (ii) In the case of a nonresident alien individual, earned income includes only earned income... feature length motion picture which is distributed to exhibitors by Corporation N pursuant to a... net profits derived by N from the exhibition and exploitation of the picture. A was employed by M as...

  10. 26 CFR 1.1348-3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... contract. (ii) In the case of a nonresident alien individual, earned income includes only earned income... feature length motion picture which is distributed to exhibitors by Corporation N pursuant to a... net profits derived by N from the exhibition and exploitation of the picture. A was employed by M as...

  11. The Dutch Identity: A New Tool for the Study of Item Response Models.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    1990-01-01

    The Dutch Identity is presented as a useful tool for expressing the basic equations of item response models that relate the manifest probabilities to the item response functions and the latent trait distribution. Ways in which the identity may be exploited are suggested and illustrated. (SLD)

  12. Distributed Cognition in a Virtual World

    ERIC Educational Resources Information Center

    Gillen, Julia; Ferguson, Rebecca; Peachey, Anna; Twining, Peter

    2012-01-01

    Over a 13-month period, the Schome Park Programme operated the first "closed" (i.e. protected) Teen Second Life project in Europe. The project organised diverse educational events that centred on use of a virtual world and an associated asynchronous forum and wiki. Students and staff together exploited the affordances of the environment…

  13. Baseline Characteristics of Dependent Youth Who Have Been Commercially Sexually Exploited: Findings From a Specialized Treatment Program.

    PubMed

    Landers, Monica; McGrath, Kimberly; Johnson, Melissa H; Armstrong, Mary I; Dollard, Norin

    2017-01-01

    Commercial sexual exploitation of children has emerged as a critical issue within child welfare, but little is currently known about this population or effective treatment approaches to address their unique needs. Children in foster care and runaways are reported to be vulnerable to exploitation because they frequently have unmet needs for family relationships, and they have had inadequate supervision and histories of trauma of which traffickers take advantage. The current article presents data on the demographic characteristics, trauma history, mental and behavioral health needs, physical health needs, and strengths collected on a sample of 87 commercially sexually exploited youth. These youth were served in a specialized treatment program in Miami-Dade County, Florida, for exploited youth involved with the child welfare system. Findings revealed that the youth in this study have high rates of previous sexual abuse (86% of the youth) and other traumatic experiences prior to their exploitation. Youth also exhibited considerable mental and behavioral health needs. Given that few programs emphasize the unique needs of children who have been sexually exploited, recommendations are offered for providing a continuum of specialized housing and treatment services to meet the needs of sexually exploited youth, based on the authors' experiences working with this population.

  14. A framework for evolutionary systems biology

    PubMed Central

    Loewe, Laurence

    2009-01-01

    Background Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However, experimental methods are too insensitive to observe small effects. Results Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. Conclusion EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications. PMID:19239699

  15. Bifurcation Analysis of a DC-DC Bidirectional Power Converter Operating with Constant Power Loads

    NASA Astrophysics Data System (ADS)

    Cristiano, Rony; Pagano, Daniel J.; Benadero, Luis; Ponce, Enrique

    Direct current (DC) microgrids (MGs) are an emergent option to satisfy new demands for power quality and integration of renewable resources in electrical distribution systems. This work addresses the large-signal stability analysis of a DC-DC bidirectional converter (DBC) connected to a storage device in an islanding MG. This converter is responsible for controlling the balance of power (load demand and generation) under constant power loads (CPLs). In order to control the DC bus voltage through a DBC, we propose a robust sliding mode control (SMC) based on a washout filter. Dynamical systems techniques are exploited to assess the quality of this switching control strategy. In this sense, a bifurcation analysis is performed to study the nonlinear stability of a reduced model of this system. The appearance of different bifurcations when load parameters and control gains are changed is studied in detail. In the specific case of Teixeira Singularity (TS) bifurcation, some experimental results are provided, confirming the mathematical predictions. Both a deeper insight in the dynamic behavior of the controlled system and valuable design criteria are obtained.

  16. Dark states and delocalization: Competing effects of quantum coherence on the efficiency of light harvesting systems.

    PubMed

    Hu, Zixuan; Engel, Gregory S; Alharbi, Fahhad H; Kais, Sabre

    2018-02-14

    Natural light harvesting systems exploit electronic coupling of identical chromophores to generate efficient and robust excitation transfer and conversion. Dark states created by strong coupling between chromophores in the antenna structure can significantly reduce radiative recombination and enhance energy conversion efficiency. Increasing the number of the chromophores increases the number of dark states and the associated enhanced energy conversion efficiency yet also delocalizes excitations away from the trapping center and reduces the energy conversion rate. Therefore, a competition between dark state protection and delocalization must be considered when designing the optimal size of a light harvesting system. In this study, we explore the two competing mechanisms in a chain-structured antenna and show that dark state protection is the dominant mechanism, with an intriguing dependence on the parity of the number of chromophores. This dependence is linked to the exciton distribution among eigenstates, which is strongly affected by the coupling strength between chromophores and the temperature. Combining these findings, we propose that increasing the coupling strength between the chromophores can significantly increase the power output of the light harvesting system.
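    The parity dependence can be illustrated with a toy tight-binding chain: diagonalize a nearest-neighbour Hamiltonian and score each eigenstate by its collective transition dipole (the squared sum of its site amplitudes); eigenstates whose amplitudes sum to zero are dark. The on-site energy and coupling values below are arbitrary, and this is only a schematic model, not the paper's full open-system treatment.

```python
# Dark-state counting in a uniformly coupled chromophore chain (toy model).
import numpy as np

def chain_states(n, eps=0.0, j=1.0):
    """Eigenenergies and per-eigenstate brightness of an n-site chain."""
    h = (np.diag([eps] * n)
         + np.diag([j] * (n - 1), 1)
         + np.diag([j] * (n - 1), -1))
    energies, vecs = np.linalg.eigh(h)
    brightness = np.abs(vecs.sum(axis=0)) ** 2   # collective dipole
    return energies, brightness

_, b = chain_states(6)
dark = int(np.sum(b < 1e-10))
print(dark)  # even chain of 6 sites: 3 of its eigenstates are dark
```

    Repeating the count for odd and even n shows the parity effect directly: a chain of n sites has n//2 dark states in this model.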

  17. Gray: a ray tracing-based Monte Carlo simulator for PET

    NASA Astrophysics Data System (ADS)

    Freese, David L.; Olcott, Peter D.; Buss, Samuel R.; Levin, Craig S.

    2018-05-01

    Monte Carlo simulation software plays a critical role in PET system design. Performing complex, repeated Monte Carlo simulations can be computationally prohibitive, as even a single simulation can require a large amount of time and a computing cluster to complete. Here we introduce Gray, a Monte Carlo simulation software for PET systems. Gray exploits ray tracing methods used in the computer graphics community to greatly accelerate simulations of PET systems with complex geometries. We demonstrate the implementation of models for positron range, annihilation acolinearity, photoelectric absorption, Compton scatter, and Rayleigh scatter. For validation, we simulate the GATE PET benchmark, and compare energy, distribution of hits, coincidences, and run time. We show a speedup using Gray, compared to GATE for the same simulation, while demonstrating nearly identical results. We additionally simulate the Siemens Biograph mCT system with both the NEMA NU-2 scatter phantom and sensitivity phantom. We estimate the total sensitivity within % when accounting for differences in peak NECR. We also estimate the peak NECR to be kcps, or within % of published experimental data. The activity concentration of the peak is also estimated within 1.3%.
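    The elementary Monte Carlo step behind any such simulator, sampling a photon's free path length from the attenuation coefficient, can be sketched as follows. The mu value is an illustrative figure for water near 511 keV; this is a generic sketch, not Gray's ray tracing implementation.

```python
# Sampling photon free paths: l = -ln(U)/mu gives an exponential path-length
# distribution with mean 1/mu. The mu value is illustrative (water, 511 keV).
import math
import random

random.seed(1)  # deterministic for the sake of the example

def free_path(mu_per_cm):
    """Sample one free path length (cm) before the next interaction."""
    return -math.log(1.0 - random.random()) / mu_per_cm

mu = 0.096  # cm^-1
paths = [free_path(mu) for _ in range(100_000)]
mean_path = sum(paths) / len(paths)
print(round(mean_path, 1))  # close to 1/mu ≈ 10.4 cm
```

    A full simulator then tests, at each sampled interaction point, whether the event is photoelectric absorption, Compton scatter or Rayleigh scatter, which is where ray tracing against the detector geometry pays off.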

  18. An Eddy Current Testing Platform System for Pipe Defect Inspection Based on an Optimized Eddy Current Technique Probe Design.

    PubMed

    Rifai, Damhuji; Abdalla, Ahmed N; Razali, Ramdan; Ali, Kharudin; Faraj, Moneer A

    2017-03-13

    The use of the eddy current technique (ECT) for the non-destructive testing of conducting materials has become increasingly important in the past few years. Non-destructive ECT plays a key role in ensuring the safety and integrity of large industrial structures such as oil and gas pipelines. This paper introduces a novel ECT probe design integrated with a distributed ECT inspection system (DSECT), used for crack inspection of inner ferromagnetic pipes. The system consists of an array of giant magneto-resistive (GMR) sensors, a pneumatic system, a rotating magnetic field excitation source and a host PC acting as the data analysis center. The probe design parameters, namely the probe diameter, the excitation coil and the number of GMR sensors in the array, are optimized using numerical optimization based on the desirability approach. The main benefits of DSECT lie in its modularity and flexibility in the use of different types of magnetic transducers/sensors, with signals of different natures and either digital or analog outputs, making it well suited to an ECT probe design using an array of GMR magnetic sensors. A real-time application of the DSECT distributed system is demonstrated for the inspection of a 70 mm carbon steel pipe. In order to predict axial and circumferential defect detection, a mathematical model is developed based on the technique known as response surface methodology (RSM). The inspection results for a carbon steel pipe sample with artificial defects indicate that the system design is highly efficient.
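    Response surface methodology (RSM), named above, fits a second-order polynomial to responses measured over a designed experiment. The sketch below fits such a model by least squares to a synthetic 3 x 3 factorial of hypothetical probe parameters; it illustrates the technique only, not the paper's actual model or data.

```python
# Second-order response-surface fit on a synthetic 3x3 factorial design.
import numpy as np

levels_d = np.array([1.0, 1.5, 2.0])   # hypothetical probe diameter levels
levels_i = np.array([0.5, 1.0, 1.5])   # hypothetical excitation levels
x1, x2 = (g.ravel() for g in np.meshgrid(levels_d, levels_i))

# Noiseless toy response with a linear + interaction structure.
y = 2.0 + 1.0 * x1 + 0.5 * x2 - 0.3 * x1 * x2

# Model: y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))  # recovers 2.0, 1.0, 0.5, 0.0, 0.0, -0.3
```

    A desirability-based optimization, as used in the paper, would then search this fitted surface for the parameter settings that maximize the predicted response.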

  19. Use of a stochastic approach for description of water balance and runoff production dynamics

    NASA Astrophysics Data System (ADS)

    Gioia, A.; Manfreda, S.; Iacobellis, V.; Fiorentino, M.

    2009-04-01

    The present study exploits an analytical model (Manfreda, NHESS [2008]) for the description of the probability density function of the soil water balance and runoff generation over a set of river basins in Southern Italy. The model is based on a stochastic differential equation in which the rainfall forcing is interpreted as an additive noise in the soil water balance; watershed heterogeneity is described using the conceptual lumped Xinanjiang model (widely used in China), which adopts a parabolic curve for the distribution of the soil water storage capacity (Zhao et al. [1980]). The model, characterized by parameters that depend on soil, vegetation and basin morphology, allows the derivation of the probability density functions of the relative saturation and the surface runoff of a basin, accounting for the spatial heterogeneity in soil water storage. Its application to river basins of Southern Italy provides interesting insights into the role played by the dynamic interaction between climate, soil and vegetation in soil moisture and runoff production dynamics. Manfreda, S., Runoff Generation Dynamics within a Humid River Basin, Natural Hazards and Earth System Sciences, 8, 1349-1357, 2008. Zhao, R.-J., Zhang, Y. L., and Fang, L. R.: The Xinanjiang model, Hydrological Forecasting Proceedings Oxford Symposium, IAHS Publ. 129, 351-356, 1980.
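    The parabolic storage-capacity curve of the Xinanjiang model can be written down directly: the fraction of the basin whose point storage capacity is below c is F(c) = 1 - (1 - c/cmax)**b. The sketch below uses it to partition a rainfall depth into runoff and infiltration; all parameter values are illustrative, and the stochastic forcing of the paper is not modelled here.

```python
# Xinanjiang runoff partitioning via the parabolic capacity curve (sketch).
# All parameter values below are illustrative.

def xinanjiang_runoff(p, w, wmax, b):
    """Runoff (mm) from rainfall p (mm) at basin-average storage w (mm)."""
    cmax = wmax * (1.0 + b)                       # maximum point capacity
    # Water level corresponding to the current basin-average storage.
    a = cmax * (1.0 - (1.0 - w / wmax) ** (1.0 / (1.0 + b)))
    if p + a >= cmax:                             # entire basin saturates
        return p - (wmax - w)
    # Only the saturated fraction of the basin produces runoff.
    return p - (wmax - w) + wmax * (1.0 - (p + a) / cmax) ** (1.0 + b)

# Example: half-full basin (60 of 120 mm) receiving 50 mm of rain.
runoff = xinanjiang_runoff(p=50.0, w=60.0, wmax=120.0, b=0.4)
print(round(runoff, 2))  # roughly 13.5 mm runs off; the rest infiltrates
```

    Feeding this partitioning with stochastic rainfall arrivals is what yields the probability density functions of relative saturation and surface runoff discussed above.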

  20. A 24-GHz Front-End Integrated on a Multilayer Cellulose-Based Substrate for Doppler Radar Sensors.

    PubMed

    Alimenti, Federico; Palazzi, Valentina; Mariotti, Chiara; Virili, Marco; Orecchini, Giulia; Bonafoni, Stefania; Roselli, Luca; Mezzanotte, Paolo

    2017-09-12

    This paper presents a miniaturized Doppler radar that can be used as a motion sensor for low-cost Internet of things (IoT) applications. For the first time, a radar front-end and its antenna are integrated on a multilayer cellulose-based substrate, built up by alternating paper, glue and metal layers. The circuit exploits a distributed microstrip structure that is realized using a copper adhesive laminate, so as to obtain a low-loss conductor. The radar operates at 24 GHz and transmits 5 mW of power. The antenna has a gain of 7.4 dBi and features a half-power beamwidth of 48 degrees. The sensor, which is just the size of a stamp, is able to detect the movement of a walking person up to 10 m in distance, while a minimum speed of 50 mm/s up to 3 m is clearly measured. Beyond this specific result, the present paper demonstrates that the attractive features of cellulose, including ultra-low cost and eco-friendliness (i.e., recyclability and biodegradability), can even be exploited for the realization of future high-frequency hardware. This opens the door to the implementation on cellulose of devices and systems which make up the "sensing layer" at the base of the IoT ecosystem.
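    The detection-speed figures quoted above follow from the Doppler relation f_d = 2*v*f_c/c. At a 24 GHz carrier, a walking person and the 50 mm/s minimum speed map to shifts of a few hundred Hz and a few Hz respectively, as the sketch below checks; the 1.4 m/s walking speed is an assumed typical value, not a figure from the paper.

```python
# Doppler shift f_d = 2*v*f_c/c for a target at radial speed v (sketch).

C = 299_792_458.0  # speed of light (m/s)

def doppler_shift(v_mps, f_carrier_hz=24e9):
    """Doppler frequency (Hz) seen by a CW radar at carrier f_carrier_hz."""
    return 2.0 * v_mps * f_carrier_hz / C

walk = doppler_shift(1.4)    # assumed typical walking speed
slow = doppler_shift(0.05)   # the 50 mm/s minimum speed from the abstract
print(round(walk), round(slow))  # ~224 Hz and ~8 Hz
```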
