Sample records for implementing distributed generation

  1. Integrated-Circuit Pseudorandom-Number Generator

    NASA Technical Reports Server (NTRS)

    Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur

    1992-01-01

    Integrated circuit produces 8-bit pseudorandom numbers from a specified probability distribution at a rate of 10 MHz. Using Boolean logic, the circuit implements a pseudorandom-number-generating algorithm. The circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying the specified nonuniform probability distribution are generated by processing the uniformly distributed outputs of the eight 12-bit generators through a "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
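    In software, the comparator/memory pipeline that maps uniform 12-bit words to a nonuniform 8-bit distribution amounts to an inverse-CDF lookup. A minimal Python sketch, assuming a hypothetical triangular target distribution (the circuit's actual distribution is not specified in the abstract):

```python
import random
from bisect import bisect_right

# Hypothetical target: a triangular probability distribution over 8-bit values,
# standing in for the "specified nonuniform distribution" of the circuit.
weights = [min(k + 1, 256 - k) for k in range(256)]
total = sum(weights)

# Precompute the cumulative distribution, scaled to the 12-bit range (0..4095),
# mimicking the comparator/memory pipeline that maps uniform words to outputs.
cdf, acc = [], 0
for w in weights:
    acc += w
    cdf.append(acc * 4096 // total)

def next_sample(rng):
    """Map one uniform 12-bit word to one 8-bit nonuniform sample."""
    u = rng.getrandbits(12)      # stand-in for one 12-bit uniform generator
    return bisect_right(cdf, u)  # threshold comparisons, as in the pipeline

rng = random.Random(0)
samples = [next_sample(rng) for _ in range(10000)]
```

    The sample mean should sit near 127.5, the center of the triangular target.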

  2. Distributed Energy Planning for Climate Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stout, Sherry R; Hotchkiss, Elizabeth L; Day, Megan H

    At various levels of government, across the United States and globally, climate-resilient solutions are being adopted and implemented. Solutions vary based on predicted hazards, community context, priorities, complexity, and available resources. Lessons are being learned through the implementation process, which can be replicated regardless of the level or type of government entity carrying out the resiliency planning. Through a number of analyses and technical-support engagements across the world, NREL has learned key lessons related to resilience planning associated with power generation and water distribution. Distributed energy generation is a large factor in building resilience with clean energy technologies and solutions. The technical and policy solutions associated with distributed energy implementation for resilience fall into a few major categories, including spatial diversification, microgrids, the water-energy nexus, policy, and redundancy.

  3. Design and implementation of the NaI(Tl)/CsI(Na) detectors output signal generator

    NASA Astrophysics Data System (ADS)

    Zhou, Xu; Liu, Cong-Zhan; Zhao, Jian-Ling; Zhang, Fei; Zhang, Yi-Fei; Li, Zheng-Wei; Zhang, Shuo; Li, Xu-Fang; Lu, Xue-Feng; Xu, Zhen-Ling; Lu, Fang-Jun

    2014-02-01

    We designed and implemented a signal generator that can simulate the output of the NaI(Tl)/CsI(Na) detectors' pre-amplifiers onboard the Hard X-ray Modulation Telescope (HXMT). Using FPGA (Field Programmable Gate Array) development in the VHDL language and adding a random component, we produced a double-exponential random pulse signal generator. The statistical distribution of the signal amplitude is programmable, and the time intervals between adjacent signals statistically follow a negative exponential distribution.
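    The described generator can be approximated in software: double-exponential pulse shapes whose arrival times have negative-exponentially distributed intervals. A sketch with assumed time constants and event rate (the flight hardware's actual parameters are not given in the abstract):

```python
import math
import random

def double_exp_pulse(t, amp, tau_rise=0.1, tau_decay=1.0):
    """Double-exponential pulse shape: fast rise, slow decay (assumed constants)."""
    if t < 0:
        return 0.0
    return amp * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))

def pulse_train(n, rate, rng):
    """Arrival times with exponential inter-arrival intervals, random amplitudes."""
    t, events = 0.0, []
    for _ in range(n):
        t += rng.expovariate(rate)  # negative exponential time intervals
        events.append((t, rng.uniform(0.5, 1.5)))
    return events

rng = random.Random(42)
events = pulse_train(1000, rate=10.0, rng=rng)
intervals = [b[0] - a[0] for a, b in zip(events, events[1:])]
```

    With a rate of 10 events per unit time, the mean interval converges to 0.1.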

  4. A new low-energy bremsstrahlung generator for GEANT4.

    PubMed

    Peralta, L; Rodrigues, P; Trindade, A; Pia, M G

    2005-01-01

    The 2BN bremsstrahlung cross section is a distribution well adapted to describing radiative processes at low electron kinetic energies (E(k) < 500 keV). In this work, a method to implement this distribution in a Monte Carlo generator is developed.

  5. Efficient implementation of the Metropolis-Hastings algorithm, with application to the Cormack-Jolly-Seber model

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2008-01-01

    Judicious choice of candidate generating distributions improves efficiency of the Metropolis-Hastings algorithm. In Bayesian applications, it is sometimes possible to identify an approximation to the target posterior distribution; this approximate posterior distribution is a good choice for candidate generation. These observations are applied to analysis of the Cormack-Jolly-Seber model and its extensions.
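    Drawing candidates from an approximation to the posterior corresponds to an independence Metropolis-Hastings sampler. A toy Python sketch on a one-dimensional Gaussian target (not the Cormack-Jolly-Seber model itself; target and proposal below are illustrative):

```python
import math
import random

def metropolis_hastings(log_target, propose, log_prop_pdf, n, rng, x0=0.0):
    """Independence MH: candidates drawn from an approximate posterior."""
    x, lp_x = x0, log_target(x0) - log_prop_pdf(x0)
    chain = []
    for _ in range(n):
        y = propose(rng)
        lp_y = log_target(y) - log_prop_pdf(y)
        # Accept with probability min(1, importance ratio of candidate vs current).
        if math.log(rng.random()) < lp_y - lp_x:
            x, lp_x = y, lp_y
        chain.append(x)
    return chain

# Toy example: target N(2, 1); proposal N(1.8, 1.2^2) plays the role of
# "an approximation to the target posterior distribution".
log_target = lambda x: -0.5 * (x - 2.0) ** 2
propose = lambda rng: rng.gauss(1.8, 1.2)
log_prop_pdf = lambda x: -0.5 * ((x - 1.8) / 1.2) ** 2 - math.log(1.2)

rng = random.Random(1)
chain = metropolis_hastings(log_target, propose, log_prop_pdf, 20000, rng)
mean = sum(chain[5000:]) / len(chain[5000:])
```

    Because the proposal overlaps the target well, acceptance is high and the chain mixes quickly; a poorly matched proposal would make the same sampler far less efficient, which is the paper's point.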

  6. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks.

    PubMed

    Gehring, Tobias; Händchen, Vitus; Duhme, Jörg; Furrer, Fabian; Franz, Torsten; Pacher, Christoph; Werner, Reinhard F; Schnabel, Roman

    2015-10-30

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our implementation is based on the distribution of continuous-variable Einstein-Podolsky-Rosen entangled light. It is one-sided device independent, which means the security of the generated key is independent of any memory-free attacks on the remote detector. Since continuous-variable encoding is compatible with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components.

  7. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks

    PubMed Central

    Gehring, Tobias; Händchen, Vitus; Duhme, Jörg; Furrer, Fabian; Franz, Torsten; Pacher, Christoph; Werner, Reinhard F.; Schnabel, Roman

    2015-01-01

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our implementation is based on the distribution of continuous-variable Einstein–Podolsky–Rosen entangled light. It is one-sided device independent, which means the security of the generated key is independent of any memory-free attacks on the remote detector. Since continuous-variable encoding is compatible with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components. PMID:26514280

  8. Distributed cooperative control of AC microgrids

    NASA Astrophysics Data System (ADS)

    Bidram, Ali

    In this dissertation, the comprehensive secondary control of electric power microgrids is addressed. Microgrid technical challenges are mainly handled through the hierarchical control structure, comprising primary, secondary, and tertiary control levels. The primary control level is locally implemented at each distributed generator (DG), while the secondary and tertiary control levels are conventionally implemented through a centralized control structure. The centralized structure requires a central controller, which raises reliability concerns by posing a single point of failure. In this dissertation, a distributed control structure using the distributed cooperative control of multi-agent systems is exploited to increase secondary control reliability. The secondary control objectives are the microgrid voltage and frequency and the DGs' active and reactive powers. Fully distributed control protocols are implemented through distributed communication networks. In the distributed control structure, each DG only requires its own information and the information of its neighbors on the communication network. The distributed structure obviates the requirement for a central controller and a complex communication network, which, in turn, improves system reliability. Since the DG dynamics are nonlinear and non-identical, input-output feedback linearization is used to transform the nonlinear dynamics of the DGs into linear dynamics. The proposed control frameworks cover the control of microgrids containing inverter-based DGs. Typical microgrid test systems are used to verify the effectiveness of the proposed control protocols.

  9. Distributed plug-and-play optimal generator and load control for power system frequency regulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Changhong; Mallada, Enrique; Low, Steven H.

    A distributed control scheme, which can be implemented on generators and controllable loads in a plug-and-play manner, is proposed for power system frequency regulation. The proposed scheme is based on local measurements, local computation, and neighborhood information exchanges over a communication network with an arbitrary (but connected) topology. In the event of a sudden change in generation or load, the proposed scheme can restore the nominal frequency and the reference inter-area power flows, while minimizing the total cost of control for participating generators and loads. Power network stability under the proposed control is proved with a relatively realistic model which includes nonlinear power flow and a generic (potentially nonlinear or high-order) turbine-governor model, and further with first- and second-order turbine-governor models as special cases. Finally, in simulations, the proposed control scheme shows a comparable performance to the existing automatic generation control (AGC) when implemented only on the generator side, and demonstrates better dynamic characteristics than AGC when each scheme is implemented on both generators and controllable loads. Simulation results also show robustness of the proposed scheme to communication link failure.

  10. Distributed plug-and-play optimal generator and load control for power system frequency regulation

    DOE PAGES

    Zhao, Changhong; Mallada, Enrique; Low, Steven H.; ...

    2018-03-14

    A distributed control scheme, which can be implemented on generators and controllable loads in a plug-and-play manner, is proposed for power system frequency regulation. The proposed scheme is based on local measurements, local computation, and neighborhood information exchanges over a communication network with an arbitrary (but connected) topology. In the event of a sudden change in generation or load, the proposed scheme can restore the nominal frequency and the reference inter-area power flows, while minimizing the total cost of control for participating generators and loads. Power network stability under the proposed control is proved with a relatively realistic model which includes nonlinear power flow and a generic (potentially nonlinear or high-order) turbine-governor model, and further with first- and second-order turbine-governor models as special cases. Finally, in simulations, the proposed control scheme shows a comparable performance to the existing automatic generation control (AGC) when implemented only on the generator side, and demonstrates better dynamic characteristics than AGC when each scheme is implemented on both generators and controllable loads. Simulation results also show robustness of the proposed scheme to communication link failure.
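    The flavor of such a scheme can be illustrated with a toy consensus iteration: each agent adjusts its marginal cost using only neighbor exchanges plus a feedback term driving the generation/load mismatch to zero (here the global mismatch stands in for the frequency deviation each agent would measure locally). This is a simplified sketch under assumed cost coefficients and topology, not the authors' controller:

```python
# Each agent i contributes u_i = lam_i / c_i; the iteration drives the
# marginal costs lam_i to consensus (cost-minimizing allocation) while
# restoring the total contribution sum(u) to the lost generation dP.
c = [1.0, 2.0, 4.0]                      # assumed quadratic cost coefficients
neighbors = {0: [1], 1: [0, 2], 2: [1]}  # line-graph communication topology
dP = 3.0                                 # sudden load increase to be balanced

lam = [0.0, 0.0, 0.0]
for _ in range(2000):
    u = [lam[i] / c[i] for i in range(3)]
    mismatch = dP - sum(u)               # proxy for the frequency deviation
    new_lam = []
    for i in range(3):
        cons = sum(lam[j] - lam[i] for j in neighbors[i])  # neighbor exchange
        new_lam.append(lam[i] + 0.1 * cons + 0.1 * mismatch)
    lam = new_lam

u = [lam[i] / c[i] for i in range(3)]    # converged control allocation
```

    At the fixed point the marginal costs are equal and the mismatch is zero, so the cheap generator (small c) supplies the largest share, which is the cost-minimizing split.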

  11. A Plan for Revolutionary Change in Gas Turbine Engine Control System Architecture

    NASA Technical Reports Server (NTRS)

    Culley, Dennis E.

    2011-01-01

    The implementation of Distributed Engine Control technology on the gas turbine engine has been a vexing challenge for the controls community. A successful implementation requires the resolution of multiple technical issues in areas such as network communications, power distribution, and system integration, but especially in the area of high-temperature electronics. Impeding the achievement has been the lack of a clearly articulated message about the importance of distributed control technology to future turbine engine system goals and objectives. Resolving these issues and bringing the technology to fruition has required, and will continue to require, a broad coalition of resources from government, industry, and academia. This presentation will describe the broad challenges facing the next generation of advanced control systems and the plan which is being put into action to successfully implement the technology on the next generation of gas turbine engine systems.

  12. Distributed source model for the full-wave electromagnetic simulation of nonlinear terahertz generation.

    PubMed

    Fumeaux, Christophe; Lin, Hungyen; Serita, Kazunori; Withayachumnankul, Withawat; Kaufmann, Thomas; Tonouchi, Masayoshi; Abbott, Derek

    2012-07-30

    The process of terahertz generation through optical rectification in a nonlinear crystal is modeled using discretized equivalent current sources. The equivalent terahertz sources are distributed in the active volume and computed based on a separately modeled near-infrared pump beam. This approach can be used to define an appropriate excitation for full-wave electromagnetic numerical simulations of the generated terahertz radiation. This enables predictive modeling of the near-field interactions of the terahertz beam with micro-structured samples, e.g. in a near-field time-resolved microscopy system. The distributed source model is described in detail, and an implementation in a particular full-wave simulation tool is presented. The numerical results are then validated through a series of measurements on square apertures. The general principle can be applied to other nonlinear processes with possible implementation in any full-wave numerical electromagnetic solver.

  13. Transverse circular-polarized Bessel beam generation by inward cylindrical aperture distribution.

    PubMed

    Pavone, S C; Ettorre, M; Casaletti, M; Albani, M

    2016-05-16

    In this paper, the focusing capability of a radiating aperture implementing an inward cylindrical traveling-wave tangential electric field distribution, directed along a fixed polarization unit vector, is investigated. In particular, it is shown that such an aperture distribution generates a non-diffractive Bessel beam whose transverse component of the electric field (with respect to the normal of the radiating aperture) takes the form of a zeroth-order Bessel function. As a practical implementation of the theoretical analysis, a circularly polarized Bessel beam launcher, made of a radial parallel-plate waveguide loaded with several slot pairs arranged in a spiral pattern, is designed and optimized. The proposed launcher's performance agrees with the theoretical model and exhibits excellent polarization purity.

  14. Polarization-multiplexed plasmonic phase generation with distributed nanoslits.

    PubMed

    Lee, Seung-Yeol; Kim, Kyuho; Lee, Gun-Yeal; Lee, Byoungho

    2015-06-15

    Methods for multiplexing surface plasmon polaritons (SPPs) have been attracting much attention due to their potential for plasmonic integrated systems, plasmonic holography, and optical tweezing. Here, using closely spaced distributed nanoslits, we propose a method for generating polarization-multiplexed SPP phase profiles which can be applied to implement general SPP phase distributions. Two independent types of SPP phase generation mechanisms - polarization-independent and polarization-reversible ones - are combined to generate fully arbitrary phase profiles for each optical handedness. As a simple verification of the proposed scheme, we experimentally demonstrate that the location of the plasmonic focus can be arbitrarily designed, and switched by changing the optical handedness.

  15. A Low-Cost, Passive Approach for Bacterial Growth and Distribution for Large-Scale Implementation of Bioaugmentation

    DTIC Science & Technology

    2012-07-01

    …technologies with significant capital costs, secondary waste streams, the involvement of hazardous materials, and the potential for additional worker … or environmental exposure. A more ideal technology would involve lower capital costs, would not generate secondary waste streams, would be … of bioaugmentation technology in general include low risk to human health and the environment during implementation, low secondary waste generation…

  16. Energy Transitions | Integrated Energy Solutions | NREL

    Science.gov Websites

    …clean energy access to remote populations across West Africa … NREL Supports Effort to Take Distributed … develops and implements pilot projects to accelerate the development of distributed photovoltaics … Renewable Energy into India's Electric Grid, Volume 1, Volume 2 … Designing Distributed Generation in Mexico…

  17. Successfully Implementing Net-Zero Energy Policy through the Air Force Military Construction Program

    DTIC Science & Technology

    2013-03-01

    [Table fragment: whether candidate approaches ("Renewable Farms", "On-Site (Distributed Generation)") meet net-zero criteria.] …independence, nor does it allow for net-zero energy installations. Developing centralized renewable energy farms is another method for obtaining … combination of centralized renewable energy farms and distributed generation methods. The specific combination of methods an installation will utilize…

  18. The Impacts of Changes to Nevada’s Net Metering Policy on the Financial Performance and Adoption of Distributed Photovoltaics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gagnon, Pieter; Sigrin, Ben; Gleason, Mike

    Net energy metering (NEM) is a billing mechanism that has historically compensated owners of distributed generation systems at retail rates for any electricity that they export back to the grid rather than consume on-site. NEM can significantly enhance the financial performance of distributed generation systems from the owner’s perspective. The following analysis was designed to illustrate the potential impact of NEM policy and tariff changes implemented in early 2016 in Nevada.

  19. Kansas Wind Energy Consortium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruenbacher, Don

    2015-12-31

    This project addresses both fundamental and applied research problems that will help with problems defined by the DOE “20% Wind by 2030 Report”. In particular, this work focuses on increasing the capacity of small or community wind generation capabilities that would be operated in a distributed generation approach. A consortium (KWEC – Kansas Wind Energy Consortium) of researchers from Kansas State University and Wichita State University aims to dramatically increase the penetration of wind energy via distributed wind power generation. We believe distributed generation through wind power will play a critical role in the ability to reach and extend the renewable energy production targets set by the Department of Energy. KWEC aims to find technical and economic solutions to enable widespread implementation of distributed renewable energy resources that would apply to wind.

  20. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for a distribution grid with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park, and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization, and linearized power flow, an optimal power flow problem whose objective is to minimize the cost of conventional power generation is solved. A reliability assessment for the distribution grid is thus carried out quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast assessment method computes the reliability indices much faster than the Monte Carlo method while preserving accuracy.
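    The first modeling step can be sketched directly with the standard library: Weibull-distributed wind speeds and Beta-distributed normalized irradiance, fed through a piecewise turbine power curve. All parameter values below are illustrative assumptions, not the paper's:

```python
import random

rng = random.Random(7)

# Assumed parameters: Weibull shape k=2, scale 8 m/s for wind speed;
# Beta(2, 2) for normalized solar irradiance.
def wind_speed():
    return rng.weibullvariate(8.0, 2.0)  # stdlib order: (scale, shape)

def irradiance():
    return rng.betavariate(2.0, 2.0)     # normalized to [0, 1]

def wind_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=1.0):
    """Piecewise turbine power curve (assumed cut-in/rated/cut-out speeds)."""
    if v < v_in or v > v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * (v - v_in) / (v_rated - v_in)

speeds = [wind_speed() for _ in range(20000)]
powers = [wind_power(v) for v in speeds]
irr = [irradiance() for _ in range(20000)]
```

    Samples like these feed the probability discretization step; a Monte Carlo assessment would instead evaluate power flow on every sample, which is the cost the paper's method avoids.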

  1. Parallel grid generation algorithm for distributed memory computers

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Moitra, Anutosh

    1994-01-01

    A parallel grid-generation algorithm and its implementation on the Intel iPSC/860 computer are described. The grid-generation scheme is based on an algebraic formulation of homotopic relations. Methods for utilizing the inherent parallelism of the grid-generation scheme are described, and implementation of multiple levels of parallelism on multiple-instruction multiple-data machines is indicated. The algorithm is capable of providing near orthogonality and spacing control at solid boundaries while requiring minimal interprocessor communications. Results obtained on the Intel hypercube for a blended wing-body configuration are used to demonstrate the effectiveness of the algorithm. Fortran implementations based on the native programming model of the iPSC/860 computer and the Express system of software tools are reported. Computational gains in execution time speed-up ratios are given.

  2. Distributed state-space generation of discrete-state stochastic models

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Gluckman, Joshua; Nicol, David

    1995-01-01

    High-level formalisms such as stochastic Petri nets can be used to model complex systems. Analysis of logical and numerical properties of these models often requires the generation and storage of the entire underlying state space. This imposes practical limitations on the types of systems which can be modeled. Because of the vast amount of memory consumed, we investigate distributed algorithms for the generation of state space graphs. The distributed construction allows us to take advantage of the combined memory readily available on a network of workstations. The key technical problem is to find effective methods for on-the-fly partitioning, so that the state space is evenly distributed among processors. In this paper we report on the implementation of a distributed state-space generator that may be linked to a number of existing system modeling tools. We discuss partitioning strategies in the context of Petri net models, and report on performance observed on a network of workstations, as well as on a distributed memory multi-computer.
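    The core mechanism, hashing each newly generated state to an owning processor on the fly, can be sketched sequentially. A toy Python model with simulated per-partition work queues; the two-counter "model" below is a hypothetical stand-in for a Petri net's reachable markings:

```python
from collections import deque

# Toy model: two independent counters 0..4 that can each be incremented,
# standing in for the reachable markings of a small Petri net (25 states).
def successors(state):
    a, b = state
    out = []
    if a < 4:
        out.append((a + 1, b))
    if b < 4:
        out.append((a, b + 1))
    return out

N_PARTITIONS = 4

def owner(state):
    """On-the-fly partition function: hash the state to a processor index."""
    return hash(state) % N_PARTITIONS

# Each "processor" stores only its own slice of the state space; a successor
# owned by another partition is forwarded to that partition's work queue.
known = [set() for _ in range(N_PARTITIONS)]
queues = [deque() for _ in range(N_PARTITIONS)]

initial = (0, 0)
known[owner(initial)].add(initial)
queues[owner(initial)].append(initial)

while any(queues):
    for p in range(N_PARTITIONS):
        while queues[p]:
            s = queues[p].popleft()
            for t in successors(s):
                q = owner(t)  # "message" to the owning processor
                if t not in known[q]:
                    known[q].add(t)
                    queues[q].append(t)

total_states = sum(len(k) for k in known)
```

    How evenly `hash` spreads the states across partitions is exactly the load-balancing question the paper's partitioning strategies address.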

  3. A distributed Petri Net controller for a dual arm testbed

    NASA Technical Reports Server (NTRS)

    Bjanes, Atle

    1991-01-01

    This thesis describes the design and functionality of a Distributed Petri Net Controller (DPNC). The controller runs under X Windows to provide a graphical interface. The DPNC allows users to distribute a Petri Net across several host computers linked together via a TCP/IP interface. A sub-net executes on each host, interacting with the other sub-nets by passing a token vector from host to host. One host has a command window which monitors and controls the distributed controller. The input to the DPNC is a net definition file generated by Great SPN. Thus, a net may be designed, analyzed and verified using this package before implementation. The net is distributed to the hosts by tagging transitions that are host-critical with the appropriate host number. The controller will then distribute the remaining places and transitions to the hosts by generating the local nets, the local marking vectors and the global marking vector. Each transition can have one or more preconditions which must be fulfilled before the transition can fire, as well as one or more post-processes to be executed after the transition fires. These implement the actual input/output to the environment (machines, signals, etc.). The DPNC may also be used to simulate a Great SPN net since stochastic and deterministic firing rates are implemented in the controller for timed transitions.

  4. Quantifying Void Ratio in Granular Materials Using Voronoi Tessellation

    NASA Technical Reports Server (NTRS)

    Alshibli, Khalid A.; El-Saidany, Hany A.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The Voronoi technique was used to calculate the local void ratio distribution of granular materials. It was implemented in an application-oriented image processing and analysis algorithm capable of extracting object edges, separating adjacent particles, obtaining the centroid of each particle, generating Voronoi polygons, and calculating the local void ratio. Details of the algorithm's capabilities and features are presented. Verification calculations included performing manual digitization of synthetic images using Oda's method and the Voronoi polygon system. The developed algorithm yielded very accurate measurements of the local void ratio distribution. Voronoi tessellation has the advantage, compared to Oda's method, of offering a well-defined polygon-generation criterion that can be implemented in an algorithm to automatically calculate the local void ratio of particulate materials.
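    A discrete (pixel-based) version of this computation is easy to sketch: assign each pixel to its nearest particle centroid (a discrete Voronoi partition) and compute the void ratio of each cell as void area over solid area. The particle positions and radii below are hypothetical:

```python
import math

# Hypothetical particle centers and radii (in pixels) in a 100x100 image.
particles = [((25, 25), 10), ((70, 30), 12), ((40, 75), 11), ((80, 80), 9)]

# Discrete Voronoi partition: each pixel belongs to the nearest centroid.
cell_area = [0] * len(particles)
solid_area = [0] * len(particles)
for x in range(100):
    for y in range(100):
        d = [math.hypot(x - cx, y - cy) for (cx, cy), _ in particles]
        i = d.index(min(d))
        cell_area[i] += 1
        # The pixel is solid if it lies inside its particle's disc.
        (cx, cy), r = particles[i]
        if math.hypot(x - cx, y - cy) <= r:
            solid_area[i] += 1

# Local void ratio per Voronoi cell: void area / solid area.
void_ratio = [(cell_area[i] - solid_area[i]) / solid_area[i]
              for i in range(len(particles))]
```

    The continuous version in the paper builds exact Voronoi polygons from the extracted centroids; the pixel partition above converges to the same cells as resolution increases.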

  5. Generative models for discovering sparse distributed representations.

    PubMed Central

    Hinton, G E; Ghahramani, Z

    1997-01-01

    We describe a hierarchical, generative model that can be viewed as a nonlinear generalization of factor analysis and can be implemented in a neural network. The model uses bottom-up, top-down and lateral connections to perform Bayesian perceptual inference correctly. Once perceptual inference has been performed the connection strengths can be updated using a very simple learning rule that only requires locally available information. We demonstrate that the network learns to extract sparse, distributed, hierarchical representations. PMID:9304685

  6. A DICOM-based 2nd generation Molecular Imaging Data Grid implementing the IHE XDS-i integration profile.

    PubMed

    Lee, Jasper; Zhang, Jianguo; Park, Ryan; Dagliyan, Grant; Liu, Brent; Huang, H K

    2012-07-01

    A Molecular Imaging Data Grid (MIDG) was developed to address current informatics challenges in archival, sharing, search, and distribution of preclinical imaging studies between animal imaging facilities and investigator sites. This manuscript presents a 2nd generation MIDG replacing the Globus Toolkit with a new system architecture that implements the IHE XDS-i integration profile. Implementation and evaluation were conducted using a 3-site interdisciplinary test-bed at the University of Southern California. The 2nd generation MIDG design architecture replaces the initial design's Globus Toolkit with dedicated web services and XML-based messaging for dedicated management and delivery of multi-modality DICOM imaging datasets. The Cross-enterprise Document Sharing for Imaging (XDS-i) integration profile from the field of enterprise radiology informatics was adopted into the MIDG design because streamlined image registration, management, and distribution dataflow are likewise needed in preclinical imaging informatics systems as in enterprise PACS application. Implementation of the MIDG is demonstrated at the University of Southern California Molecular Imaging Center (MIC) and two other sites with specified hardware, software, and network bandwidth. Evaluation of the MIDG involves data upload, download, and fault-tolerance testing scenarios using multi-modality animal imaging datasets collected at the USC Molecular Imaging Center. The upload, download, and fault-tolerance tests of the MIDG were performed multiple times using 12 collected animal study datasets. Upload and download times demonstrated reproducibility and improved real-world performance. Fault-tolerance tests showed that automated failover between Grid Node Servers has minimal impact on normal download times. 
Building upon the 1st generation concepts and experiences, the 2nd generation MIDG system improves accessibility of disparate animal-model molecular imaging datasets to users outside a molecular imaging facility's LAN using a new architecture, dataflow, and dedicated DICOM-based management web services. Productivity and efficiency of preclinical research for translational sciences investigators has been further streamlined for multi-center study data registration, management, and distribution.

  7. FPGA and USB based control board for quantum random number generator

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Wan, Xu; Zhang, Hong-Fei; Gao, Yuan; Chen, Teng-Yun; Liang, Hao

    2009-09-01

    The design and implementation of an FPGA- and USB-based control board for quantum experiments are discussed. The use of a quantum true random number generator, the control logic in the FPGA, and communication with a computer through the USB protocol are described in this paper. Programmable controlled signal input and output ports are implemented, and error detection on the data frame header and frame length is designed in. This board has been used successfully in our decoy-state based quantum key distribution (QKD) system.

  8. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large-scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained to aid these designers and other researchers with insights into performance and scaling issues, the broader issues relevant to very large-scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst-case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis, a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues: specifically, the calculation of the Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large-scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.

  9. A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.

    PubMed

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.

  10. Free-Space Quantum Key Distribution using Polarization Entangled Photons

    NASA Astrophysics Data System (ADS)

    Kurtsiefer, Christian

    2007-06-01

    We report on a complete experimental implementation of a quantum key distribution protocol through a free-space link using polarization-entangled photon pairs from a compact parametric down-conversion source [1]. Based on a BB84-equivalent protocol, we generated, without interruption over 10 hours, a secret key over a free-space optical link distance of 1.5 km at a rate of up to 950 bits per second after error correction and privacy amplification. Our system is based on two time-stamp units and relies on no specific hardware channel for coincidence identification besides an IP link. For that, initial clock synchronization with an accuracy of better than 2 ns is achieved, based on a conventional NTP protocol and a tiered cross-correlation of time tags on both sides. Time tags are used to servo a local clock, allowing a streamed measurement on correctly identified photon pairs. Contrary to the majority of quantum key distribution systems, this approach does not require a trusted large-bandwidth random number generator, but integrates that into the physical key generation process. We discuss our current progress in implementing key distribution via an atmospheric link during daylight conditions, and possible attack scenarios on a physical timing-information side channel to an entanglement-based key distribution system. [1] I. Marcikic, A. Lamas-Linares, C. Kurtsiefer, Appl. Phys. Lett. 89, 101122 (2006).
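    The time-tag cross-correlation step can be illustrated with a minimal sketch: given two lists of arrival times that differ by an unknown clock offset, a binned histogram cross-correlation recovers the offset. All numbers here are invented for illustration; the actual system servos a local clock and resolves the offset to better than 2 ns:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical photon time tags (ns) on Alice's side, and Bob's copies of the
# same events shifted by an unknown clock offset plus small detection jitter.
alice_tags = np.sort(rng.uniform(0, 1e6, 5000))
true_offset_ns = 1234.0
bob_tags = alice_tags + true_offset_ns + rng.normal(0, 1.0, alice_tags.size)

def estimate_offset(a, b, bin_ns=2.0, span_ns=5000.0):
    """Cross-correlate binned arrival-time histograms to find the clock offset."""
    edges = np.arange(0, 1e6 + span_ns, bin_ns)
    ha, _ = np.histogram(a, bins=edges)
    hb, _ = np.histogram(b, bins=edges)
    # FFT-based circular cross-correlation; the peak sits at the offset lag.
    corr = np.fft.irfft(np.fft.rfft(hb) * np.conj(np.fft.rfft(ha)), n=ha.size)
    lags = np.arange(corr.size) * bin_ns
    return lags[np.argmax(corr)]

offset = estimate_offset(alice_tags, bob_tags)
```

    A tiered search, as in the paper, would repeat this with progressively finer bins around the coarse peak.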

  11. Distributed utility technology cost, performance, and environmental characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Y; Adelman, S

    1995-06-01

    Distributed Utility (DU) is an emerging concept in which modular generation and storage technologies sited near customer loads in distribution systems and specifically targeted demand-side management programs are used to supplement conventional central station generation plants to meet customer energy service needs. Research has shown that implementation of the DU concept could provide substantial benefits to utilities. This report summarizes the cost, performance, and environmental and siting characteristics of existing and emerging modular generation and storage technologies that are applicable under the DU concept. It is intended to be a practical reference guide for utility planners and engineers seeking information on DU technology options. This work was funded by the Office of Utility Technologies of the US Department of Energy.

  12. Analysis and Application of Microgrids

    NASA Astrophysics Data System (ADS)

    Yue, Lu

    New trends of generating electricity locally and utilizing non-conventional or renewable energy sources have attracted increasing interest due to the gradual depletion of conventional fossil fuel energy sources. This new type of power generation is called Distributed Generation (DG), and the energy sources it utilizes are termed Distributed Energy Resources (DERs). With DGs embedded in them, distribution networks evolve from passive to active networks enabling bidirectional power flows. Further incorporating flexible and intelligent controllers and employing future technologies, active distribution networks will turn into Microgrids. A Microgrid is a small-scale, low-voltage Combined Heat and Power (CHP) supply network designed to supply electrical and heat loads for a small community. To further implement Microgrids, a sophisticated Microgrid Management System must be integrated. However, because a Microgrid has multiple DERs integrated and is likely to be deregulated, the ability to perform real-time OPF and economic dispatch over a fast, advanced communication network is necessary. In this thesis, first, problems such as power system modelling, power flow solving, and power system optimization are studied. Then, Distributed Generation and Microgrids are studied and reviewed, including a comprehensive review of current distributed generation technologies and Microgrid Management Systems. Finally, a computer-based AC optimization method which minimizes the total transmission loss and generation cost of a Microgrid is proposed, together with a wireless communication scheme based on synchronized Code Division Multiple Access (sCDMA). The algorithm is tested with a 6-bus power system and a 9-bus power system.

  13. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework

    PubMed Central

    2012-01-01

    Background For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. Results We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. Conclusion The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources. PMID:23216909
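    The map/shuffle/reduce structure such a search engine relies on can be sketched in miniature. The shared-peak count below is a deliberately simplified stand-in for the K-score, and the spectra and peptide "database" are toy data, not real mass-spectrometry output:

```python
from collections import defaultdict

# Toy spectra: sets of observed fragment masses (integerized for simplicity).
spectra = {"s1": {101, 215, 330}, "s2": {99, 250, 410}}
# Toy peptide "database": candidate peptides and their theoretical fragments.
peptides = {"PEPA": {101, 215, 300}, "PEPB": {99, 250, 410, 500}}

def mapper(spectrum_id, peaks):
    # Emit a candidate score for every peptide. A real engine would compute
    # the K-score here; we substitute a shared-peak count for illustration.
    for pep, frags in peptides.items():
        yield spectrum_id, (len(peaks & frags), pep)

def reducer(spectrum_id, scored):
    # Keep the best-scoring peptide per spectrum.
    return spectrum_id, max(scored)

# Shuffle phase: group mapper output by key, as the Hadoop framework would.
groups = defaultdict(list)
for sid, peaks in spectra.items():
    for key, value in mapper(sid, peaks):
        groups[key].append(value)

best = dict(reducer(k, v) for k, v in groups.items())
```

    In Hadoop the mapper and reducer run on different nodes and the shuffle is handled by the framework, which is what lets throughput scale with the cluster size.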

  14. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    PubMed

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.

  15. Conjugate gradient minimisation approach to generating holographic traps for ultracold atoms.

    PubMed

    Harte, Tiffany; Bruce, Graham D; Keeling, Jonathan; Cassettari, Donatella

    2014-11-03

    Direct minimisation of a cost function can in principle provide a versatile and highly controllable route to computational hologram generation. Here we show that the careful design of cost functions, combined with numerically efficient conjugate gradient minimisation, establishes a practical method for the generation of holograms for a wide range of target light distributions. This results in a guided optimisation process, with a crucial advantage illustrated by the ability to circumvent optical vortex formation during hologram calculation. We demonstrate the implementation of the conjugate gradient method for both discrete and continuous intensity distributions and discuss its applicability to optical trapping of ultracold atoms.
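    A minimal sketch of cost-function minimisation by nonlinear conjugate gradient for a phase-only hologram follows, assuming a tiny 8×8 phase mask, a single-spot target intensity, and numerical gradients. The paper's carefully designed cost functions and vortex-avoidance terms are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
# Hypothetical target far-field intensity: one bright trap spot (normalized).
target = np.zeros((n, n)); target[2, 3] = 1.0

def cost(phi):
    # Phase-only hologram: unit-amplitude field with phase phi on the SLM plane;
    # the far field is its Fourier transform.
    far = np.fft.fft2(np.exp(1j * phi.reshape(n, n))) / n
    inten = np.abs(far) ** 2
    return float(np.sum((inten / inten.sum() - target) ** 2))

def num_grad(f, x, h=1e-6):
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Fletcher-Reeves nonlinear conjugate gradient with backtracking line search.
phi = rng.uniform(0, 2 * np.pi, n * n)
initial_cost = cost(phi)
g = num_grad(cost, phi)
d = -g
c = initial_cost
for _ in range(40):
    step = 1.0
    while step > 1e-10 and cost(phi + step * d) >= c:
        step *= 0.5                        # backtrack until the cost drops
    if step <= 1e-10:
        g = num_grad(cost, phi)            # restart along steepest descent
        d = -g
        continue
    phi = phi + step * d
    c = cost(phi)
    g_new = num_grad(cost, phi)
    beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coefficient
    d = -g_new + beta * d
    g = g_new
final_cost = c
```

    Production hologram calculations use analytic gradients and much larger masks; the structure of the guided optimisation is the same.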

  16. A Cloud-Based X73 Ubiquitous Mobile Healthcare System: Design and Implementation

    PubMed Central

    Ji, Zhanlin; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed “big data” processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems. PMID:24737958

  17. Effect of distributed generation installation on power loss using genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Hasibuan, A.; Masri, S.; Othman, W. A. F. W. B.

    2018-02-01

    Injection of distributed generation (DG) into the distribution network can affect the power system significantly; the effect depends on the allocation of DG in each part of the network. This paper aims to show the impact of distributed generation on distribution system losses, since the main purpose of installing DG on a distribution system is to reduce power losses. Among the power system problems that DG installation can address, the one explored in this study is the reduction of power loss in the transmission line. The approach was applied to the IEEE 30-bus standard system to find the optimum location and size of the DG that minimize power losses. Simulation results from case studies on the IEEE 30-bus system show that the system power loss decreased from 5.7781 MW to 1.5757 MW, i.e., to 27.27% of its original value. The simulated DG is injected at bus number 8, the bus with the largest voltage drop.
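    A genetic-algorithm search for DG location and size can be sketched on a toy radial feeder. The feeder, loads, and loss model below are invented for illustration and are far simpler than the IEEE 30-bus AC power flow used in the paper:

```python
import random
random.seed(7)

# Toy radial feeder: six load buses in a line, each drawing 1.0 MW.
loads = [1.0] * 6
r = 0.02          # per-segment resistance factor (loss ~ r * flow^2 per segment)

def losses(dg_bus, dg_size):
    """Approximate feeder loss with one DG unit injecting dg_size at dg_bus."""
    inj = [0.0] * len(loads)
    inj[dg_bus] = dg_size
    total = 0.0
    for seg in range(len(loads)):          # segment feeding buses seg..end
        flow = sum(loads[i] - inj[i] for i in range(seg, len(loads)))
        total += r * flow ** 2
    return total

def ga(pop_size=30, gens=60):
    """Minimal GA: elitist selection, one-point crossover, random mutation."""
    pop = [(random.randrange(6), random.uniform(0, 6)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: losses(*g))
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = (a[0], b[1])                      # crossover: bus from a, size from b
            if random.random() < 0.3:                 # mutation
                child = (random.randrange(6), child[1] + random.uniform(-1, 1))
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda g: losses(*g))

best_bus, best_size = ga()
base_loss = losses(0, 0.0)                 # no-DG baseline
opt_loss = losses(best_bus, best_size)
```

    The fitness function in the paper is the power loss from a full AC load-flow solution; here a quadratic flow approximation keeps the sketch self-contained.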

  18. Worldwide telemedicine services based on distributed multimedia electronic patient records by using the second generation Web server hyperwave.

    PubMed

    Quade, G; Novotny, J; Burde, B; May, F; Beck, L E; Goldschmidt, A

    1999-01-01

    A distributed multimedia electronic patient record (EPR) is a central component of a medicine-telematics application that supports physicians working in rural areas of South America and offers medical services to scientists in Antarctica. A Hyperwave server is used to maintain the patient record. As opposed to common web servers--and as a second-generation web server--Hyperwave provides the capability of holding documents in a distributed web space without the problem of broken links. This enables physicians to browse through a patient's record using a standard browser even if the record is distributed over several servers. The patient record is based on the "Good European Health Record" (GEHR) architecture.

  19. 100 km differential phase shift quantum key distribution experiment with low jitter up-conversion detectors

    NASA Astrophysics Data System (ADS)

    Diamanti, Eleni; Takesue, Hiroki; Langrock, Carsten; Fejer, M. M.; Yamamoto, Yoshihisa

    2006-12-01

    We present a quantum key distribution experiment in which keys that were secure against all individual eavesdropping attacks allowed by quantum mechanics were distributed over 100 km of optical fiber. We implemented the differential phase shift quantum key distribution protocol and used low timing jitter 1.55 µm single-photon detectors based on frequency up-conversion in periodically poled lithium niobate waveguides and silicon avalanche photodiodes. Based on the security analysis of the protocol against general individual attacks, we generated secure keys at a practical rate of 166 bit/s over 100 km of fiber. The use of the low jitter detectors also increased the sifted key generation rate to 2 Mbit/s over 10 km of fiber.

  20. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  1. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  2. A concurrent distributed system for aircraft tactical decision generation

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1990-01-01

    A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the computerized logic for air-to-air warfare simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of a prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.

  3. Poster - Thur Eve - 06: Comparison of an open source genetic algorithm to the commercially used IPSA for generation of seed distributions in LDR prostate brachytherapy.

    PubMed

    McGeachy, P; Khan, R

    2012-07-01

    In early stage prostate cancer, low dose rate (LDR) prostate brachytherapy is a favorable treatment modality, where small radioactive seeds are permanently implanted throughout the prostate. Treatment centres currently rely on a commercial optimization algorithm, IPSA, to generate seed distributions for treatment plans. However, commercial software does not allow the user access to the source code, thus reducing the flexibility for treatment planning and impeding any implementation of new and, perhaps, improved clinical techniques. An open source genetic algorithm (GA) has been encoded in MATLAB to generate seed distributions for a simplified prostate and urethra model. To assess the quality of the seed distributions created by the GA, both the GA and IPSA were used to generate seed distributions for two clinically relevant scenarios, and the quality of the GA distributions relative to IPSA distributions and clinically accepted standards was investigated. The first scenario involved generating seed distributions for three different prostate volumes (19.2 cc, 32.4 cc, and 54.7 cc). The second scenario involved generating distributions for three separate seed activities (0.397 mCi, 0.455 mCi, and 0.5 mCi). Both GA and IPSA met the clinically accepted criteria for the two scenarios, where distributions produced by the GA were comparable to IPSA in terms of full coverage of the prostate by the prescribed dose and minimized dose to the urethra, which passed straight through the prostate. Further, the GA offered improved reduction of high-dose regions (i.e., hot spots) within the planned target volume. © 2012 American Association of Physicists in Medicine.

  4. Energy Efficiency Programs in K-12 Schools: A Guide to Developing and Implementing Greenhouse Gas Reduction Programs. Local Government Climate and Energy Strategy Series

    ERIC Educational Resources Information Center

    US Environmental Protection Agency, 2011

    2011-01-01

    Saving energy through energy efficiency improvements can cost less than generating, transmitting, and distributing energy from power plants, and provides multiple economic and environmental benefits. Local governments can promote energy efficiency in their jurisdictions by developing and implementing strategies that improve the efficiency of…

  5. Security of Distributed-Phase-Reference Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Moroder, Tobias; Curty, Marcos; Lim, Charles Ci Wen; Thinh, Le Phuc; Zbinden, Hugo; Gisin, Nicolas

    2012-12-01

    Distributed-phase-reference quantum key distribution stands out for its easy implementation with present day technology. For many years, a full security proof of these schemes in a realistic setting has been elusive. We solve this long-standing problem and present a generic method to prove the security of such protocols against general attacks. To illustrate our result, we provide lower bounds on the key generation rate of a variant of the coherent-one-way quantum key distribution protocol. In contrast to standard predictions, it appears to scale quadratically with the system transmittance.

  6. Positive feedback : exploring current approaches in iterative travel demand model implementation.

    DOT National Transportation Integrated Search

    2012-01-01

    Currently, the models that TxDOT's Transportation Planning and Programming Division (TPP) developed are traditional three-step models (i.e., trip generation, trip distribution, and traffic assignment) that are sequentially applied. A limitation...

  7. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets.

    PubMed

    Clark, Alex M; Dole, Krishna; Coulon-Spektor, Anna; McNutt, Andrew; Grass, George; Freundlich, Joel S; Reynolds, Robert C; Ekins, Sean

    2015-06-22

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operator curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery.
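    The underlying model type, a Laplace-corrected Bernoulli naive Bayes over binary fingerprint bits, can be sketched as follows. The fingerprints and labels are fabricated toy data, not ADME/Tox measurements, and the real CDK implementation operates on FCFP6 descriptors:

```python
import math

# Toy binary fingerprints (sets of "on" bits) with active (1) / inactive (0)
# labels; hypothetical stand-ins for FCFP6 fingerprints of real compounds.
data = [
    ({1, 4, 7}, 1), ({1, 4, 9}, 1), ({1, 7, 8}, 1),
    ({2, 3, 5}, 0), ({2, 5, 9}, 0), ({3, 5, 6}, 0),
]

def train(samples, n_bits=10):
    """Count how often each bit fires in each class."""
    counts = {0: [0] * n_bits, 1: [0] * n_bits}
    totals = {0: 0, 1: 0}
    for bits, label in samples:
        totals[label] += 1
        for b in bits:
            counts[label][b] += 1
    return counts, totals

def log_odds(bits, counts, totals):
    """Laplace-corrected log-likelihood ratio of 'active' vs 'inactive'."""
    score = math.log(totals[1] / totals[0])        # class prior ratio
    for b in bits:
        p1 = (counts[1][b] + 1) / (totals[1] + 2)  # P(bit | active), smoothed
        p0 = (counts[0][b] + 1) / (totals[0] + 2)  # P(bit | inactive), smoothed
        score += math.log(p1 / p0)
    return score

counts, totals = train(data)
active_score = log_odds({1, 4}, counts, totals)    # bits typical of actives
inactive_score = log_odds({2, 5}, counts, totals)  # bits typical of inactives
```

    Scoring only the "on" bits, as here, is the usual simplification for sparse chemical fingerprints; a positive score favours the active class.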

  8. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets

    PubMed Central

    2015-01-01

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operator curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user’s own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery. PMID:25994950

  9. A distributed and intelligent system approach for the automatic inspection of steam-generator tubes in nuclear power plants

    NASA Astrophysics Data System (ADS)

    Kang, Soon Ju; Moon, Jae Chul; Choi, Doo-Hyun; Choi, Sung Su; Woo, Hee Gon

    1998-06-01

    The inspection of steam-generator (SG) tubes in a nuclear power plant (NPP) is a time-consuming, laborious, and hazardous task because of several hard constraints such as a highly radiated working environment, a tight task schedule, and the need for many experienced human inspectors. This paper presents a new distributed intelligent system architecture for automating traditional inspection methods. The proposed architecture adopts three basic technical strategies in order to reduce the complexity of system implementation. The first is the distributed task allocation into four stages: inspection planning (IP), signal acquisition (SA), signal evaluation (SE), and inspection data management (IDM). Consequently, dedicated subsystems for automation of each stage can be designed and implemented separately. The second strategy is the inclusion of several useful artificial intelligence techniques for implementing the subsystems of each stage, such as an expert system for IP and SE and machine vision and remote robot control techniques for SA. The third strategy is the integration of the subsystems using client/server-based distributed computing architecture and a centralized database management concept. Through the use of the proposed architecture, human errors, which can occur during inspection, can be minimized because the element of human intervention has been almost eliminated, while the productivity of the human inspector is equally increased. A prototype of the proposed system has been developed and successfully tested over the last six years in domestic NPPs.

  10. Extending key sharing: how to generate a key tightly coupled to a network security policy

    NASA Astrophysics Data System (ADS)

    Kazantzidis, Matheos

    2006-04-01

    Current state-of-the-art security policy technologies, besides the small-scale limitation and largely manual nature of the accompanying management methods, suffer from a) a lack of real-timeliness in policy implementation and b) vulnerabilities and inflexibility stemming from centralized policy decision making; even if, for example, a policy description or access control database is distributed, the actual decision is often a centralized action and forms a system single point of failure. In this paper we present a new fundamental concept that allows implementing a security policy by a systematic and efficient key distribution procedure. Specifically, we extend polynomial Shamir key splitting, according to which a global key is split into n parts, any k of which can re-construct the original key. We present a method in which, instead of "any k parts" being able to re-construct the original key, the latter can only be reconstructed if keys are combined as an access control policy describes. This leads to an easily deployable key generation procedure that results in a single key per entity that "knows" its role in the specific access control policy from which it was derived. The system is considered efficient as it may be used to avoid expensive PKI operations or pairwise key distributions, and it provides superior security due to its distributed nature, the fact that the key is tightly coupled to the policy, and the fact that policy changes may be implemented more easily and faster.
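    The polynomial Shamir key splitting that the paper extends can be sketched directly; the policy-coupled extension itself is not reproduced here. A secret is embedded as the constant term of a random degree k-1 polynomial over a prime field, and any k shares recover it by Lagrange interpolation at x = 0:

```python
import random
random.seed(42)

P = 2 ** 127 - 1   # a Mersenne prime used as the field modulus

def split(secret, n, k):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, n=5, k=3)
recovered = reconstruct(shares[:3])
```

    Note that `random` is not cryptographically secure; a real implementation would draw coefficients from `secrets` or an equivalent CSPRNG.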

  11. Integrated quantum key distribution sender unit for daily-life implementations

    NASA Astrophysics Data System (ADS)

    Mélen, Gwenaelle; Vogl, Tobias; Rau, Markus; Corrielli, Giacomo; Crespi, Andrea; Osellame, Roberto; Weinfurter, Harald

    2016-03-01

    Unlike currently implemented encryption schemes, Quantum Key Distribution provides a secure way of generating and distributing a key among two parties. Although a multitude of research platforms has been developed, the integration of QKD units within classical communication systems remains a tremendous challenge. The recently achieved maturity of integrated photonic technologies could be exploited to create miniature QKD add-ons that could extend the primary function of various existing systems such as mobile devices or optical stations. In this work we report on an integrated optics module enabling secure short-distance communication for, e.g., quantum access schemes. Using BB84-like protocols, Alice's mobile low-cost device can exchange a secure key and information anywhere within a trusted-node network. The new optics platform (35×20×8 mm), compatible with current smartphone technology, generates NIR faint polarised laser pulses at a 100 MHz repetition rate. Fully automated beam tracking and live basis alignment on Bob's side ensure user-friendly operation with a quantum link efficiency as high as 50%, stable over a few seconds.

  12. MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.

    PubMed

    Alic, Andy S; Blanquer, Ignacio

    2016-09-01

    Usually, the information known a priori about a newly sequenced organism is limited, and even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run in any software or hardware environment, from the command line or graphically, and in a browser or standalone. It presents information such as average length, base distribution, quality score distribution, k-mer histogram, and homopolymer analysis. MuffinInfo improves upon the existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable processing parameters, all in one software. At the moment, the extractor works with all base-space technologies such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for mildly intensive tasks encountered in bioinformatics.
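    The kind of statistics such an extractor reports can be sketched for a toy in-memory FastQ input. The reads below are invented, and the standard Phred+33 quality encoding is assumed:

```python
from collections import Counter

# A tiny in-memory FastQ record set (hypothetical reads).
fastq = """@read1
ACGTAC
+
IIIIII
@read2
GGGTCA
+
!!IIII
"""

def fastq_stats(text):
    """Average read length, base distribution, and mean Phred quality."""
    lines = text.strip().split("\n")
    seqs = lines[1::4]    # sequence line of each 4-line record
    quals = lines[3::4]   # quality line of each 4-line record
    avg_len = sum(map(len, seqs)) / len(seqs)
    base_dist = Counter("".join(seqs))
    # Phred+33 encoding: quality score = ord(char) - 33
    all_q = [ord(c) - 33 for q in quals for c in q]
    mean_q = sum(all_q) / len(all_q)
    return avg_len, base_dist, mean_q

avg_len, base_dist, mean_q = fastq_stats(fastq)
```

    A production extractor streams the file instead of splitting it in memory, and adds k-mer and homopolymer passes over the same sequence lines.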

  13. State estimation for distributed systems with sensing delay

    NASA Astrophysics Data System (ADS)

    Alexander, Harold L.

    1991-08-01

    Control of complex systems such as remote robotic vehicles requires combining data from many sensors where the data may often be delayed by sensory processing requirements. The number and variety of sensors make it desirable to distribute the computational burden of sensing and estimation among multiple processors. Classic Kalman filters do not lend themselves to distributed implementations or delayed measurement data. The alternative Kalman filter designs presented in this paper are adapted for delays in sensor data generation and for distribution of computation for sensing and estimation over a set of networked processors.
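    One common way to handle delayed measurements, buffering past estimates, applying the update at the measurement's timestamp, and re-propagating to the present, can be sketched for a 1-D filter. This simplified replay ignores intermediate measurements, and all noise values and the constant-signal model are illustrative:

```python
import random
random.seed(3)

# 1-D constant-signal model; measurements reach the filter `delay` steps late.
q, r = 0.01, 0.25            # process / measurement noise variances
history = []                 # (x, p) per step, kept so we can rewind

def predict(x, p):
    return x, p + q

def update(x, p, z):
    k = p / (p + r)          # Kalman gain
    return x + k * (z - x), (1 - k) * p

def step(x, p, z_delayed, delay):
    """Predict one step; fold in a measurement taken `delay` steps ago."""
    x, p = predict(x, p)
    history.append((x, p))
    if z_delayed is not None and len(history) > delay:
        xd, pd = history[-1 - delay]     # estimate at the measurement time
        xd, pd = update(xd, pd, z_delayed)
        for _ in range(delay):           # re-propagate to the present
            xd, pd = predict(xd, pd)
        x, p = xd, pd
        history[-1] = (x, p)             # store the corrected present estimate
    return x, p

truth = 5.0
x, p = 0.0, 1.0
for t in range(40):
    z = random.gauss(truth, r ** 0.5)    # measurement of a constant truth...
    x, p = step(x, p, z, delay=2)        # ...processed two steps late
```

    A full implementation would also replay any measurements received between the delayed timestamp and the present, which is the bookkeeping the paper's alternative filter designs address.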

  14. Automatic generation of efficient array redistribution routines for distributed memory multicomputers

    NASA Technical Reports Server (NTRS)

    Ramaswamy, Shankar; Banerjee, Prithviraj

    1994-01-01

    Appropriate data distribution has been found to be critical for obtaining good performance on Distributed Memory Multicomputers like the CM-5, Intel Paragon and IBM SP-1. It has also been found that some programs need to change their distributions during execution for better performance (redistribution). This work focuses on automatically generating efficient routines for redistribution. We present a new mathematical representation for regular distributions called PITFALLS and then discuss algorithms for redistribution based on this representation. One of the significant contributions of this work is being able to handle arbitrary source and target processor sets while performing redistribution. Another important contribution is the ability to handle an arbitrary number of dimensions for the array involved in the redistribution in a scalable manner. Our implementation of these techniques is based on an MPI-like communication library. The results presented show the low overheads for our redistribution algorithm as compared to naive runtime methods.
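    The naive element-by-element redistribution schedule that efficient algorithms such as the PITFALLS-based one improve upon can be sketched as follows, for a block source distribution and a cyclic target over arbitrary (and different) processor counts. The owner functions are textbook definitions, not the paper's representation:

```python
def owner_block(i, n, p):
    """Owner of element i under a block distribution of n elements on p procs."""
    b = -(-n // p)                 # ceil(n / p) block size
    return i // b

def owner_cyclic(i, n, p):
    """Owner of element i under a cyclic distribution on p procs."""
    return i % p

def redistribution_schedule(n, p_src, p_dst):
    """Naive schedule: for every element, record the (source, dest) message
    it belongs to.  Efficient methods compute these index sets in closed form
    instead of enumerating elements."""
    msgs = {}
    for i in range(n):
        key = (owner_block(i, n, p_src), owner_cyclic(i, n, p_dst))
        msgs.setdefault(key, []).append(i)
    return msgs

# 12 elements, block over 3 source processors, cyclic over 4 target processors.
sched = redistribution_schedule(n=12, p_src=3, p_dst=4)
```

    Each dictionary entry corresponds to one message; the point of a closed-form representation is to derive these index sets without the O(n) loop, which matters when n is large.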

  15. Noise-Induced Synchronization among Sub-RF CMOS Analog Oscillators for Skew-Free Clock Distribution

    NASA Astrophysics Data System (ADS)

    Utagawa, Akira; Asai, Tetsuya; Hirose, Tetsuya; Amemiya, Yoshihito

    We present on-chip oscillator arrays synchronized by random noise, aiming at skew-free clock distribution in synchronous digital systems. Nakao et al. recently reported that independent neural oscillators can be synchronized by applying temporal random impulses to the oscillators [1], [2]. We regard neural oscillators as independent clock sources on LSIs; i.e., clock sources are distributed on LSIs and are forced to synchronize through the use of random noise. We designed neuron-based clock generators operating in the sub-RF region (<1 GHz) by modifying the original neuron model into a new model suitable for CMOS implementation with 0.25-μm CMOS parameters. Through circuit simulations, we demonstrate that i) the clock generators are certainly synchronized by pseudo-random noise and ii) the clock generators exhibit phase-locked oscillations even with small device mismatches.
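    The noise-induced synchronization mechanism can be sketched with two uncoupled phase oscillators driven by a common random impulse train. The phase-response function and all parameters below are illustrative choices, not the paper's CMOS neuron model:

```python
import math, random
random.seed(0)

omega = 2 * math.pi        # natural frequency (1 cycle per unit time)
eps = 0.3                  # impulse coupling strength
dt = 0.001                 # integration step

def prc(phase):
    # Phase response to a common impulse; this shape contracts phase
    # differences on average (negative Lyapunov exponent).
    return -eps * math.sin(phase)

# Two uncoupled oscillators with different initial phases.
p1, p2 = 0.5, 3.0
for _ in range(20000):
    p1 = (p1 + omega * dt) % (2 * math.pi)      # free-running drift
    p2 = (p2 + omega * dt) % (2 * math.pi)
    if random.random() < 0.05:                  # common random impulse train
        p1 = (p1 + prc(p1)) % (2 * math.pi)
        p2 = (p2 + prc(p2)) % (2 * math.pi)

# Circular phase difference after driving both with the same noise.
diff = min(abs(p1 - p2), 2 * math.pi - abs(p1 - p2))
```

    Because both oscillators receive the identical impulse sequence, their phase difference shrinks without any direct coupling, which is what lets distributed on-chip clock sources align without a global clock tree.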

  16. European Science Notes Information Bulletin Reports on Current European/Middle Eastern Science

    DTIC Science & Technology

    1991-04-01

    Fault tolerance Technology and VLSI/WSI Implementation 10th IFAC Workshop on Distributed Computer Control Optimal designs Commercial and experimental...catalysts that would facilitate cooperation between applications experts and computer architects in designing and implementing a new generation of parallel...speculative. Sediments immediately north of Iceland are up to 1-km thick but thin rapidly to less than 200-m. However, they demonstrate the methodology for

  17. Quantum cryptography and applications in the optical fiber network

    NASA Astrophysics Data System (ADS)

    Luo, Yuhui

    2005-09-01

    Quantum cryptography, as part of quantum information and communications, can provide absolute security for information transmission because it is established on the fundamental laws of quantum theory, such as the uncertainty principle, the no-cloning theorem, and quantum entanglement. In this thesis research, a novel scheme to implement quantum key distribution based on multiphoton entanglement with a new protocol is proposed. Its advantages are: a larger information capacity can be obtained with a longer transmission distance, and the detection of multiple photons is easier than that of a single photon. The security and attacks pertaining to such a system are also studied. Next, quantum key distribution over wavelength division multiplexed (WDM) optical fiber networks is realized. Quantum key distribution in networks is a long-standing problem for practical applications. Here we combine quantum cryptography and WDM to solve this problem because WDM technology is universally deployed in current and next-generation fiber networks. The ultimate target is to deploy quantum key distribution over commercial networks. The problems arising from the networks are also studied in this part. Then quantum key distribution in multi-access networks using wavelength routing technology is investigated. For the first time, quantum cryptography for multiple individually targeted users has been successfully implemented, in sharp contrast to schemes using an indiscriminate broadcasting structure. This overcomes the shortcoming that every user in the network can acquire the quantum key signals intended to be exchanged between only two users. Furthermore, a more efficient scheme of quantum key distribution is adopted, resulting in a higher key rate. Lastly, a quantum random number generator based on quantum optics has been experimentally demonstrated.
This device is a key component for quantum key distribution as it can create truly random numbers, which is an essential requirement to perform quantum key distribution. This new generator is composed of a single optical fiber coupler with fiber pigtails, which can be easily used in optical fiber communications.
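    The principle, each photon randomly exiting one of the two coupler outputs to yield one raw bit, can be mimicked in a few lines. The detector simulation and the von Neumann debiasing step below are our illustrative additions, not a description of the thesis hardware:

```python
import random

def raw_bits(n, p_upper=0.52, seed=7):
    """Simulate which of the two coupler outputs each photon takes.
    p_upper is a made-up imbalance of an imperfect 50/50 coupler."""
    rng = random.Random(seed)
    return [1 if rng.random() < p_upper else 0 for _ in range(n)]

def von_neumann(bits):
    """Debias pairwise: 01 -> 0, 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

bits = von_neumann(raw_bits(10000))
ones_fraction = sum(bits) / len(bits)   # close to 0.5 after debiasing
```

Von Neumann debiasing discards roughly half the raw pairs but removes any fixed coupler imbalance, which is why post-processing of this kind typically accompanies physical randomness sources.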

  18. Not all nonnormal distributions are created equal: Improved theoretical and measurement precision.

    PubMed

    Joo, Harry; Aguinis, Herman; Bradley, Kyle J

    2017-07-01

    We offer a four-category taxonomy of individual output distributions (i.e., distributions of cumulative results): (1) pure power law; (2) lognormal; (3) exponential tail (including exponential and power law with an exponential cutoff); and (4) symmetric or potentially symmetric (including normal, Poisson, and Weibull). The four categories are uniquely associated with mutually exclusive generative mechanisms: self-organized criticality, proportionate differentiation, incremental differentiation, and homogenization. We then introduce distribution pitting, a falsification-based method for comparing distributions to assess how well each one fits a given data set. In doing so, we also introduce decision rules to determine the likely dominant shape and generative mechanism among many that may operate concurrently. Next, we implement distribution pitting using 229 samples of individual output for several occupations (e.g., movie directors, writers, musicians, athletes, bank tellers, call center employees, grocery checkers, electrical fixture assemblers, and wirers). Results suggest that for 75% of our samples, exponential tail distributions and their generative mechanism (i.e., incremental differentiation) likely constitute the dominant distribution shape and explanation of nonnormally distributed individual output. This finding challenges past conclusions indicating the pervasiveness of other types of distributions and their generative mechanisms. Our results further contribute to theory by offering premises about the link between past and future individual output. For future research, our taxonomy and methodology can be used to pit distributions of other variables (e.g., organizational citizenship behaviors). Finally, we offer practical insights on how to increase overall individual output and produce more top performers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
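    A minimal sketch of "pitting" two candidate shapes against each other, comparing maximum-likelihood fits on the same sample, might look as follows. The two-way exponential-versus-lognormal contest and all function names here are our simplification of the paper's falsification-based decision rules:

```python
import math
import random

def ll_exponential(xs):
    """Maximized exponential log-likelihood (MLE rate = 1/mean)."""
    lam = len(xs) / sum(xs)
    return len(xs) * math.log(lam) - lam * sum(xs)

def ll_lognormal(xs):
    """Maximized lognormal log-likelihood (MLE on log-transformed data)."""
    logs = [math.log(x) for x in xs]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / n)
    return (-sum(logs) - n * math.log(sigma)
            - 0.5 * n * math.log(2 * math.pi) - 0.5 * n)

rng = random.Random(42)
data = [rng.expovariate(1.0) for _ in range(5000)]   # synthetic "output" data
winner = "exponential" if ll_exponential(data) > ll_lognormal(data) else "lognormal"
```

Because the synthetic data really are exponential, the exponential fit wins the pit; on empirical output data the same comparison, extended to all four candidate families, drives the paper's decision rules.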

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ladendorff, Marlene Z.

    Considerable money and effort have been expended by generation, transmission, and distribution entities in North America to implement the North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) standards for the bulk electric system. Assumptions have been made that, as a result of the implementation of the standards, the grid is more cyber secure than it was pre-NERC CIP, but are there data supporting these claims, or only speculation? Has the implementation of the standards had a measurable effect on the grid? A research study developed to address these and other questions produced surprising results.

  20. Air quality impacts of distributed power generation in the South Coast Air Basin of California 1: Scenario development and modeling analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez, M. A.; Carreras-Sospedra, M.; Medrano, M.; Brouwer, J.; Samuelsen, G. S.; Dabdub, D.

    Distributed generation (DG) is generally defined as the operation of many small stationary power generators throughout an urban air basin. Although DG has the potential to supply a significant portion of the increased power demands in California and the rest of the United States, it may lead to increased levels of in-basin pollutants and adversely impact urban air quality. This study focuses on two main objectives: (1) the systematic characterization of DG installation in urban air basins, and (2) the simulation of potential air quality impacts using a state-of-the-art three-dimensional computational model. A general and systematic approach is devised to construct five realistic and 21 spanning scenarios of DG implementation in the South Coast Air Basin (SoCAB) of California. Realistic scenarios reflect an anticipated level of DG deployment in the SoCAB by the year 2010. Spanning scenarios are developed to determine the potential impacts of unexpected outcomes. Realistic implementations of DG in the SoCAB result in small differences in ozone and particulate matter concentrations in the basin compared to the baseline simulations. The baseline accounts for population increase, but does not consider any future emissions control measures. Model results for spanning implementations with extra high DG market penetration show that domain-wide ozone peak concentrations increase significantly. Also, air quality impacts of spanning implementations when DG operate during a 6-h period are larger than when the same amount of emissions are introduced during a 24-h period.

  1. Computer Aided Synthesis or Measurement Schemes for Telemetry applications

    DTIC Science & Technology

    1997-09-02

    5.2.5. Frame structure generation The algorithm generating the frame structure should take as inputs the sampling frequency requirements of the channels...these channels into the frame structure. Generally there can be a lot of ways to divide channels among groups. The algorithm implemented in...groups) first. The algorithm uses the function "try_permutation" recursively to distribute channels among the groups, and the function "try_subtable

  2. Intelligent Control of Micro Grid: A Big Data-Based Control Center

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng

    2018-01-01

    In this paper, a structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage, and load are analyzed through the control center, and from the results new trends are predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing, and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.

  3. Copilot: Monitoring Embedded Systems

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn

    2012-01-01

    Runtime verification (RV) is a natural fit for ultra-critical systems, where correctness is imperative. In ultra-critical systems, even if the software is fault-free, because of the inherent unreliability of commodity hardware and the adversity of operational environments, processing units (and their hosted software) are replicated, and fault-tolerant algorithms are used to compare the outputs. We investigate both software monitoring in distributed fault-tolerant systems, as well as implementing fault-tolerance mechanisms using RV techniques. We describe the Copilot language and compiler, specifically designed for generating monitors for distributed, hard real-time systems. We also describe two case-studies in which we generated Copilot monitors in avionics systems.
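    As a flavor of what such a monitor checks, here is a tiny sketch of majority voting over replicated outputs. This is our illustration of the logic only; Copilot itself is a Haskell-embedded stream language that compiles monitors to constant-time C, not Python:

```python
from collections import Counter

def monitor(replicas):
    """Compare outputs of replicated processing units: return the majority
    value (None if no strict majority) and a disagreement flag."""
    counts = Counter(replicas)
    value, n = counts.most_common(1)[0]
    majority = value if n > len(replicas) // 2 else None
    return majority, len(counts) > 1

result, fault = monitor([42, 42, 41])   # one faulty replica is out-voted
```

In the fault-tolerant setting described above, the monitor both masks the faulty value (via the majority) and records that a disagreement occurred, so degraded hardware can be flagged before it becomes unmaskable.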

  4. A distributed-memory approximation algorithm for maximum weight perfect bipartite matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azad, Ariful; Buluc, Aydin; Li, Xiaoye S.

    We design and implement an efficient parallel approximation algorithm for the problem of maximum weight perfect matching in bipartite graphs, i.e. the problem of finding a set of non-adjacent edges that covers all vertices and has maximum weight. This problem differs from the maximum weight matching problem, for which scalable approximation algorithms are known. It is primarily motivated by finding good pivots in scalable sparse direct solvers before factorization, where sequential implementations of maximum weight perfect matching algorithms, such as those available in MC64, are widely used due to the lack of scalable alternatives. To overcome this limitation, we propose a fully parallel distributed memory algorithm that first generates a perfect matching and then searches for weight-augmenting cycles of length four in parallel and iteratively augments the matching with a vertex-disjoint set of such cycles. For most practical problems the weights of the perfect matchings generated by our algorithm are very close to the optimum. An efficient implementation of the algorithm scales up to 256 nodes (17,408 cores) on a Cray XC40 supercomputer and can solve instances that are too large to be handled by a single node using the sequential algorithm.
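    The serial core of the augmenting step can be sketched directly from the description above: for any two matched pairs, swap partners whenever the alternating 4-cycle increases total weight. This sequential, dense-matrix toy ignores the paper's distributed-memory parallelism and vertex-disjoint batching:

```python
def augment_4cycles(weight, match):
    """Serial sketch: while any pair of matched edges admits a
    weight-augmenting 4-cycle, swap the two partners.
    weight[u][v] is the edge weight; match[u] is the partner of row vertex u."""
    improved = True
    while improved:
        improved = False
        rows = list(match)
        for i in range(len(rows)):
            for j in range(i + 1, len(rows)):
                a, b = rows[i], rows[j]
                ma, mb = match[a], match[b]
                gain = (weight[a][mb] + weight[b][ma]
                        - weight[a][ma] - weight[b][mb])
                if gain > 1e-12:                # cycle increases total weight
                    match[a], match[b] = mb, ma
                    improved = True
    return match

match = augment_4cycles([[1, 10], [10, 1]], {0: 0, 1: 1})   # -> {0: 1, 1: 0}
```

Starting from any perfect matching (weight 2 in the example), one pass of positive-gain swaps lifts the total weight to 20, mirroring how the parallel algorithm iteratively improves an initial perfect matching.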

  5. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonior, Jason D; Evans, Philip G; Sheets, Gregory S

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum key distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  6. Simulation of a large size inductively coupled plasma generator and comparison with experimental data

    NASA Astrophysics Data System (ADS)

    Lei, Fan; Li, Xiaoping; Liu, Yanming; Liu, Donglin; Yang, Min; Yu, Yuanyuan

    2018-01-01

    A two-dimensional axisymmetric inductively coupled plasma (ICP) model and its implementation in the COMSOL Multiphysics simulation platform are described. Specifically, a large size ICP generator filled with argon is simulated in this study. Distributions of the number density and temperature of electrons are obtained and compared for various input power and pressure settings. In addition, the electron trajectory distribution is obtained in simulation. Finally, the simulation results are compared against experimental data to assess the veracity of the two-dimensional fluid model. An approximate agreement was found (the variation tendency is the same). The main reasons for the discrepancies in numerical magnitude are the assumption of Maxwellian and Druyvesteyn distributions for the electron energy and the lack of collision cross sections and reaction rates for argon plasma.

  7. Generation of structural topologies using efficient technique based on sorted compliances

    NASA Astrophysics Data System (ADS)

    Mazur, Monika; Tajs-Zielińska, Katarzyna; Bochenek, Bogdan

    2018-01-01

    Topology optimization, although well established, is still being widely developed. It has recently gained more attention as large computational resources have become available to designers. This process is stimulated by a variety of emerging, innovative optimization methods. It is observed that traditional gradient-based mathematical programming algorithms are, in many cases, replaced by novel and efficient heuristic methods inspired by biological, chemical, or physical phenomena. These methods have become useful tools for structural optimization because of their versatility and easy numerical implementation. In this paper the engineering implementation of a novel heuristic algorithm for minimum compliance topology optimization is discussed. The performance of the topology generator is based on a special function utilizing information about the compliance distribution within the design space. To cope with engineering problems, the algorithm has been combined with the structural analysis system Ansys.
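    The idea of driving the update with sorted element compliances can be illustrated with a deliberately simplified density rule. The step size, bounds, and half/half split below are our assumptions, not the authors' actual update function:

```python
def update_densities(compliance, density, step=0.1, rho_min=0.001):
    """Toy rule: sort elements by compliance, thin the least-strained half
    and thicken the most-strained half, keeping total volume roughly fixed."""
    order = sorted(range(len(compliance)), key=lambda e: compliance[e])
    half = len(order) // 2
    new = list(density)
    for e in order[:half]:                  # low compliance: remove material
        new[e] = max(rho_min, new[e] - step)
    for e in order[len(order) - half:]:     # high compliance: add material
        new[e] = min(1.0, new[e] + step)
    return new

rho = update_densities([5.0, 0.1, 2.0, 0.4], [0.5, 0.5, 0.5, 0.5])
```

Iterating such a local rule between finite-element analyses (here is where a solver like Ansys supplies the compliances) gradually concentrates material where the structure works hardest.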

  8. An Infrastructure for UML-Based Code Generation Tools

    NASA Astrophysics Data System (ADS)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach and uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  9. Layer 1 VPN services in distributed next-generation SONET/SDH networks with inverse multiplexing

    NASA Astrophysics Data System (ADS)

    Ghani, N.; Muthalaly, M. V.; Benhaddou, D.; Alanqar, W.

    2006-05-01

    Advances in next-generation SONET/SDH along with GMPLS control architectures have enabled many new service provisioning capabilities. In particular, a key services paradigm is the emergent Layer 1 virtual private network (L1 VPN) framework, which allows multiple clients to utilize a common physical infrastructure and provision their own 'virtualized' circuit-switched networks. This precludes expensive infrastructure builds and increases resource utilization for carriers. Along these lines, a novel L1 VPN services resource management scheme for next-generation SONET/SDH networks is proposed that fully leverages advanced virtual concatenation and inverse multiplexing features. Additionally, both centralized and distributed GMPLS-based implementations are also tabled to support the proposed L1 VPN services model. Detailed performance analysis results are presented along with avenues for future research.

  10. Noniterative three-dimensional grid generation using parabolic partial differential equations

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1985-01-01

    A new algorithm for generating three-dimensional grids has been developed and implemented which numerically solves a parabolic partial differential equation (PDE). The solution procedure marches outward in two coordinate directions, and requires inversion of a scalar tridiagonal system in the third. Source terms have been introduced to control the spacing and angle of grid lines near the grid boundaries, and to control the outer boundary point distribution. The method has been found to generate grids about 100 times faster than comparable grids generated via solution of elliptic PDEs, and produces smooth grids for finite-difference flow calculations.
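    The scalar tridiagonal inversion at the heart of the marching scheme is the classic Thomas algorithm; a generic sketch (our Python rendering, not the original implementation) is:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system Ax = d, with sub-diagonal a (a[0] unused),
    main diagonal b, and super-diagonal c (c[-1] unused)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                   # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):          # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 3x3 system chosen so the exact solution is [1, 1, 1]
x = thomas([0.0, 1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0, 0.0], [3.0, 4.0, 3.0])
```

The O(n) cost of this solve, repeated once per marching step, is what makes the parabolic approach so much faster than iterating an elliptic system over the whole grid.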

  11. Ince-gauss based multiple intermodal phase-matched third-harmonic generations in a step-index silica optical fiber

    NASA Astrophysics Data System (ADS)

    Borne, Adrien; Katsura, Tomotaka; Félix, Corinne; Doppagne, Benjamin; Segonds, Patricia; Bencheikh, Kamel; Levenson, Juan Ariel; Boulanger, Benoit

    2016-01-01

    Several third-harmonic generation processes were performed in a single step-index germanium-doped silica optical fiber under intermodal phase-matching conditions. The nanosecond fundamental beam ranged between 1400 and 1600 nm. The transverse distributions of the energy were successfully modeled in the form of Ince-Gauss modes, pointing out some ellipticity of the fiber core. From these experiments and theoretical calculations, we discuss the implementation of frequency-degenerate triple photon generation, which shares the same phase-matching condition as third-harmonic generation, its reverse process.

  12. Implementation of an attack scheme on a practical QKD system

    NASA Astrophysics Data System (ADS)

    Lamas-Linares, Antia; Liu, Qin; Gerhardt, Ilja; Makarov, Vadim; Kurtsiefer, Christian

    2010-03-01

    We report on an experimental implementation of an attack on a practical quantum key distribution system [1], based on a vulnerability of single photon detectors [2]. An intercept/resend-like attack has been carried out which revealed 100% of the raw key generated between the legitimate communication partners. No increase of the error ratio was observed, which is usually considered a reliable witness for any eavesdropping attempt. We also present an experiment which shows that this attack is not revealed by key distribution protocols probing for eavesdroppers by testing a Bell inequality [3], and discuss implications for practical quantum key distribution. [1] I. Marcikic, A. Lamas-Linares, C. Kurtsiefer, Appl. Phys. Lett. 89, 101122 (2006); [2] V. Makarov, New J. Phys. 11, 065003 (2009); [3] A. Ling et al., Phys. Rev. A 78, 020301(R) (2008)

  13. Megawatt-Scale Power Hardware-in-the-Loop Simulation Testing of a Power Conversion Module for Naval Applications

    DTIC Science & Technology

    2015-06-21

    problem was detected. Protection elements were implemented to trigger on over-voltage, over-current, over/under-frequency, and zero-sequence voltage...power hardware in the loop simulation of distribution networks with photovoltaic generation," International Journal of Renewable Energy Research...source modules were intended to support both emulation of a representative gas turbine generator set, as well as a flexible, controllable voltage source

  14. The midpoint between dipole and parton showers

    DOE PAGES

    Höche, Stefan; Prestel, Stefan

    2015-09-28

    We present a new parton-shower algorithm. Borrowing from the basic ideas of dipole cascades, the evolution variable is judiciously chosen as the transverse momentum in the soft limit. This leads to a very simple analytic structure of the evolution. A weighting algorithm is implemented that allows one to consistently treat potentially negative values of the splitting functions and the parton distributions. Thus, we provide two independent, publicly available implementations for the two event generators PYTHIA and SHERPA.
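    The essence of such a weighting scheme, accept-reject sampling that carries a sign when the kernel goes negative, can be sketched with a toy one-dimensional "splitting function". The function, overestimate, and sample size here are our own, not PYTHIA's or SHERPA's kernels:

```python
import random

def signed_events(f, fmax, n, seed=3):
    """Accept-reject with signed weights: propose x uniformly on [0, 1],
    accept with probability |f(x)|/fmax, record sign(f(x)) as the weight."""
    rng = random.Random(seed)
    events = []
    for _ in range(n):
        x = rng.random()
        fx = f(x)
        if rng.random() < abs(fx) / fmax:
            events.append((x, 1.0 if fx >= 0 else -1.0))
    return events

f = lambda x: 1.0 - 1.2 * x          # toy kernel, negative for x > 5/6
events = signed_events(f, fmax=1.3, n=200000)
integral = 1.3 * sum(w for _, w in events) / 200000
# the weighted sum estimates the true value 0.4 of the integral of f
```

Averaging the signed weights recovers the integral of f even though f changes sign, which is the consistency property a shower needs when splitting functions or parton distributions dip below zero.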

  15. A Concurrent Distributed System for Aircraft Tactical Decision Generation

    NASA Technical Reports Server (NTRS)

    McManus, John W.

    1990-01-01

    A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of a concurrent version of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS) program, a second generation TDG, is presented. Concurrent computing environments and programming approaches are discussed and the design and performance of a prototype concurrent TDG system are presented.

  16. Distributed Practicum Supervision in a Managed Learning Environment (MLE)

    ERIC Educational Resources Information Center

    Carter, David

    2005-01-01

    This evaluation-research feasibility study piloted the creation of a technology-mediated managed learning environment (MLE) involving the implementation of one of a new generation of instructionally driven management information systems (IMISs). The system, and supporting information and communications technology (ICT) was employed to support…

  17. Information Superiority generated through proper application of Geoinformatics

    NASA Astrophysics Data System (ADS)

    Teichmann, F.

    2012-04-01

    Information management, and especially geoscience information delivery, is a very delicate task. If it is carried out successfully, geoscientific data will provide the main foundation of information superiority. However, improper implementation of geodata generation, assimilation, distribution, or storage will not only waste valuable resources like manpower or money, but could also give rise to crucial deficiencies in knowledge and might lead to potentially harmful disasters or wrong decisions. Comprehensive Approach, Effect Based Operations, and Network Enabled Capabilities are the current buzz terms in the security regime. However, they also apply to various interdisciplinary tasks like catastrophe relief missions, civil task operations, or even day-to-day business operations where geoscience data is used. Based on experience in the application of geoscience data for defence applications, the following procedure, or tool box, for generating geodata should lead to the desired information superiority: 1. Understand and analyse the mission, the task, and the environment for which the geodata is needed. 2. Carry out an information exchange requirement between the user or customer and the geodata provider. 3. Implement current interoperability standards and a coherent metadata structure. 4. Execute innovative data generation, data provision, data assimilation, and data storage. 5. Apply a cost-effective and reasonable data life cycle. 6. Implement IT security by focusing on the three pillar concepts of integrity, availability, and confidentiality of the critical data. 7. Draft and execute a service level agreement or a memorandum of understanding between the involved parties. 8. Execute a continuous improvement cycle. These ideas from the IT world should be transferred into the geoscience community and applied in a wide set of scenarios.
    A standardized approach to generating, providing, handling, distributing, and storing geodata can reduce costs, strengthen the ties between the service customer and the geodata provider, and improve the contribution geoscience can make toward achieving information superiority for decision makers.

  18. Generating the Local Oscillator "Locally" in Continuous-Variable Quantum Key Distribution Based on Coherent Detection

    NASA Astrophysics Data System (ADS)

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael; Grice, Warren; Bobrek, Miljko

    2015-10-01

    Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In this paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a "locally" generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology also opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.
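    In its simplest form, pilot-aided feedforward correction reduces to estimating the interferometric phase from a known reference and de-rotating the data. The following noiseless, single-pilot toy model is our construction, illustrating only the principle:

```python
import cmath

def derotate(symbols, pilot_rx, pilot_tx=1 + 0j):
    """Estimate the phase drift from a known pilot symbol and feed the
    correction forward onto the data symbols."""
    theta = cmath.phase(pilot_rx / pilot_tx)
    return [s * cmath.exp(-1j * theta) for s in symbols]

drift = cmath.exp(1j * 0.7)                 # unknown phase between the lasers
received = [(1 + 1j) * drift, (2 - 1j) * drift]
recovered = derotate(received, pilot_rx=(1 + 0j) * drift)
```

In the real scheme the pilot tone is interleaved with the quantum signal and the estimate must track a drifting phase; the residual tracking error is what the measured 0.04 rad² phase-noise variance quantifies.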

  19. Enabling Functional Neural Circuit Simulations with Distributed Computing of Neuromodulated Plasticity

    PubMed Central

    Potjans, Wiebke; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    A major puzzle in the field of computational neuroscience is how to relate system-level learning in higher organisms to synaptic plasticity. Recently, plasticity rules depending not only on pre- and post-synaptic activity but also on a third, non-local neuromodulatory signal have emerged as key candidates to bridge the gap between the macroscopic and the microscopic level of learning. Crucial insights into this topic are expected to be gained from simulations of neural systems, as these allow the simultaneous study of the multiple spatial and temporal scales that are involved in the problem. In particular, synaptic plasticity can be studied during the whole learning process, i.e., on a time scale of minutes to hours and across multiple brain areas. Implementing neuromodulated plasticity in large-scale network simulations where the neuromodulatory signal is dynamically generated by the network itself is challenging, because the network structure is commonly defined purely by the connectivity graph without explicit reference to the embedding of the nodes in physical space. Furthermore, the simulation of networks with realistic connectivity entails the use of distributed computing. A neuromodulated synapse must therefore be informed in an efficient way about the neuromodulatory signal, which is typically generated by a population of neurons located on different machines than either the pre- or post-synaptic neuron. Here, we develop a general framework to solve the problem of implementing neuromodulated plasticity in a time-driven distributed simulation, without reference to a particular implementation language, neuromodulator, or neuromodulated plasticity mechanism. We implement our framework in the simulator NEST and demonstrate excellent scaling up to 1024 processors for simulations of a recurrent network incorporating neuromodulated spike-timing dependent plasticity. PMID:21151370

  20. SLHAplus: A library for implementing extensions of the standard model

    NASA Astrophysics Data System (ADS)

    Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.

    2011-03-01

    We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules. Program summary: Program title: SLHAplus_1.3 Catalogue identifier: AEHX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 6283 No. of bytes in distributed program, including test data, etc.: 52 119 Distribution format: tar.gz Programming language: C Computer: IBM PC, MAC Operating system: UNIX (Linux, Darwin, Cygwin) RAM: 2000 MB Classification: 11.1 Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models, including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations, as well as routines for evaluating the running strong coupling constant, running quark masses, and effective quark masses. Running time: 0.001 sec
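    The SLHA format that such a reader ingests is line-oriented: `BLOCK <name>` headers followed by whitespace-separated key-value data lines, with `#` starting a comment. A minimal toy reader, our sketch and far less general than the library's routines, could be:

```python
def read_slha(text):
    """Toy SLHA-style reader: returns {block_name: {key: value}}.
    Handles only 'BLOCK <name>' headers and 'key value' data lines."""
    blocks, current = {}, None
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()     # drop comments
        if not line:
            continue
        parts = line.split()
        if parts[0].upper() == 'BLOCK':
            current = parts[1].upper()
            blocks[current] = {}
        elif current is not None:
            blocks[current][int(parts[0])] = float(parts[1])
    return blocks

sample = """
BLOCK MASS   # particle masses in GeV
   25   1.25e2   # h0
    6   1.73e2   # t
"""
masses = read_slha(sample)     # {'MASS': {25: 125.0, 6: 173.0}}
```

Supporting an arbitrary number of arbitrarily named blocks, as this sketch does in miniature, is exactly what lets the real reader serve non-supersymmetric models as well.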

  1. Elegent—An elastic event generator

    NASA Astrophysics Data System (ADS)

    Kašpar, J.

    2014-03-01

    Although elastic scattering of nucleons may look like a simple process, it presents a long-lasting challenge for theory. Due to the missing hard energy scale, perturbative QCD cannot be applied. Instead, many phenomenological/theoretical models have emerged. In this paper we present a unified implementation of some of the most prominent models in a C++ library, moreover extended to account for effects of the electromagnetic interaction. The library is complemented with a number of utilities, for instance programs to sample many distributions of interest in four-momentum transfer squared, t, impact parameter, b, and collision energy √s. These distributions at ISR, Spp̄S, RHIC, Tevatron and LHC energies are available for download from the project web site, both in the form of ROOT files and as PDF figures providing comparisons among the models. The package also includes a tool for Monte-Carlo generation of elastic scattering events, which can easily be embedded in any other program framework. Catalogue identifier: AERT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 10551 No. of bytes in distributed program, including test data, etc.: 126316 Distribution format: tar.gz Programming language: C++. Computer: Any in principle, tested on x86-64 architecture. Operating system: Any in principle, tested on GNU/Linux. RAM: Strongly depends on the task, but typically below 20 MB Classification: 11.6. External routines: ROOT, HepMC Nature of problem: Monte-Carlo simulation of elastic nucleon-nucleon collisions Solution method: Implementation of some of the most prominent phenomenological/theoretical models, providing a cumulative distribution function that is used for random event generation.
Running time: Strongly depends on the task, but typically below 1 h.
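
    The solution method described above (building a cumulative distribution function and inverting it for random event generation) can be sketched as follows. This is a toy illustration assuming a simple exponential dσ/dt ∝ exp(−B|t|) model, for which the inverse CDF has a closed form; the function names and slope parameter are illustrative, not part of the library's actual API, and realistic models would require numerical inversion.

```python
import math
import random

def make_inverse_cdf(b_slope):
    """Closed-form inverse CDF for a toy elastic model
    dsigma/dt ~ exp(-b_slope * |t|)."""
    def inverse(u):
        # u is uniform in [0, 1); returns |t| in GeV^2
        return -math.log(1.0 - u) / b_slope
    return inverse

def generate_events(n, b_slope=20.0, seed=1):
    """Draw n values of |t| by inverse-transform sampling."""
    rng = random.Random(seed)
    inv = make_inverse_cdf(b_slope)
    return [inv(rng.random()) for _ in range(n)]

events = generate_events(10000)
mean_t = sum(events) / len(events)  # expected near 1/b_slope = 0.05 GeV^2
```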

  2. Distributed Generation: Challenges and Opportunities, 7th edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2007-10-15

    The report is a comprehensive study of the Distributed Generation (DG) industry. The report takes a wide-ranging look at the current and future state of DG and both individually and collectively addresses the technologies of Microturbines, Reciprocating Engines, Stirling Engines, Fuel Cells, Photovoltaics, Concentrating Solar, Wind, and Microgrids. Topics covered include: the key technologies being used or planned for DG; the uses of DG from utility, energy service provider, and customer viewpoints; the economics of DG; the benefits of DG from multiple perspectives; the barriers that exist to implementing DG; the government programs supporting the DG industry; and an analysis of DG interconnection and net metering rules.

  3. Applying the Bootstrap to Taxometric Analysis: Generating Empirical Sampling Distributions to Help Interpret Results

    ERIC Educational Resources Information Center

    Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati

    2007-01-01

    Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…

  4. GENXICC2.0: An upgraded version of the generator for hadronic production of double heavy baryons Ξcc, Ξbc and Ξbb

    NASA Astrophysics Data System (ADS)

    Chang, Chao-Hsi; Wang, Jian-Xiong; Wu, Xing-Gang

    2010-06-01

    An upgraded (second) version of the package GENXICC (A Generator for Hadronic Production of the Double Heavy Baryons Ξcc, Ξbc and Ξbb by C.H. Chang, J.X. Wang and X.G. Wu [first version in: Comput. Phys. Comm. 177 (2007) 467]) is presented. With this version implemented in PYTHIA and a GNU C compiler, users may conveniently simulate full events of these processes in various experimental environments. In comparison with the previous version, in order to interface it with PYTHIA properly, a subprogram for the fragmentation of the produced double heavy diquark into the relevant baryon is supplied, and the interface of the generator to PYTHIA is changed accordingly. In the subprogram, certain necessary assumptions (approximations), explained in the text, are made in order to conserve the momenta and the QCD 'color' flow during the fragmentation. Program summary. Program title: GENXICC2.0 Catalogue identifier: ADZJ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZJ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 102 482 No. of bytes in distributed program, including test data, etc.: 1 469 519 Distribution format: tar.gz Programming language: Fortran 77/90 Computer: Any Linux-based PC with FORTRAN 77 or FORTRAN 90 and a GNU C compiler Operating system: Linux RAM: About 2.0 MByte Classification: 11.2 Catalogue identifier of previous version: ADZJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 467 Does the new version supersede the previous version?: No Nature of problem: Hadronic production of double heavy baryons Ξcc, Ξbc and Ξbb Solution method: The code is based on the NRQCD framework. With proper options, it can generate weighted and un-weighted events of hadronic double heavy baryon production.
When the hadronizations of the produced jets and double heavy diquark are taken into account in the production, the upgraded version with a proper interface to PYTHIA can generate full events. Reasons for new version: Responding to feedback from users, we improve the generator mainly by carefully completing the 'final non-perturbative process', i.e. the formation of the double heavy baryon from the relevant intermediate diquark. In the present version, the information about momentum flow and color flow in the fragmentation, which PYTHIA needs to generate full events, is retained, although reasonable approximations are made. In comparison with the original version, the upgraded one can be properly interfaced with PYTHIA to do full event simulation of double heavy baryon production. Summary of revisions: We explain the treatment of the momentum distribution of the process more clearly than in the original version, and show precisely how the final baryon is generated through the typical intermediate diquark. We present the color flow of the involved processes precisely, and the corresponding changes to the program are explained in the paper. Restrictions: The color flow, particularly in the piece of code implementing the fragmentation of the produced colorful double heavy diquark into the relevant double heavy baryon, is treated carefully so that it can be properly interfaced with PYTHIA. Running time: It depends on which option is chosen to configure PYTHIA when generating full events and also on which mechanism is chosen to generate the events.
Typically, for the most complicated case with the gluon-gluon fusion mechanism generating mixed events via the intermediate diquark in (cc)[³S₁] and (cc)[¹S₀] states, generating 1000 events under the option IDWTUP=1 takes about 20 hours on a 1.8 GHz Intel P4 machine, whereas under the option IDWTUP=3, generating even 10⁶ events takes about 40 minutes on the same machine.

  5. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    NASA Astrophysics Data System (ADS)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

    The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems require appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we implemented in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with local search metaheuristics. We also consider current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.
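
    The topology-synthesis step above casts feeder routing as a capacitated vehicle routing problem. A minimal nearest-neighbour construction heuristic, a common CVRP starting point before local search is applied, might look like the sketch below; the depot/load representation and capacity units are hypothetical, and DINGO's actual implementation is considerably more elaborate.

```python
import math

def nearest_neighbor_cvrp(depot, loads, capacity):
    """Greedy CVRP construction: grow each route (feeder) from the depot
    (substation) to the nearest unvisited load until capacity is reached.
    loads: dict name -> ((x, y), demand). Assumes every single demand
    fits within capacity. Returns a list of routes (lists of load names)."""
    unvisited = set(loads)
    routes = []
    while unvisited:
        route, used, pos = [], 0.0, depot
        while True:
            candidates = [n for n in unvisited
                          if used + loads[n][1] <= capacity]
            if not candidates:
                break
            nxt = min(candidates,
                      key=lambda n: math.dist(pos, loads[n][0]))
            route.append(nxt)
            used += loads[nxt][1]
            pos = loads[nxt][0]
            unvisited.discard(nxt)
        routes.append(route)
    return routes
```

    A local search phase (e.g. 2-opt or relocate moves) would then improve these initial routes, as the abstract describes.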

  6. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
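
    The local-then-global term-statistics flow described above can be sketched as follows, with Python's multiprocessing standing in for the distributed processes; the function names are illustrative, not the patent's.

```python
from collections import Counter
from multiprocessing import Pool

def local_term_stats(documents):
    """Compute term frequencies for one distinct set of documents
    (the per-process 'local set of term statistics')."""
    stats = Counter()
    for doc in documents:
        stats.update(doc.lower().split())
    return stats

def global_term_stats(document_sets, workers=2):
    """Each worker handles a distinct document set in parallel, then the
    local statistics are merged into a global set, mirroring the
    contribute-to-global step of the claims."""
    with Pool(workers) as pool:
        local = pool.map(local_term_stats, document_sets)
    total = Counter()
    for stats in local:
        total.update(stats)
    return total
```

    On spawn-based platforms, `global_term_stats` should be invoked under an `if __name__ == "__main__":` guard.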

  7. Key Barriers to the Implementation of Solar Energy in Nigeria: A Critical Analysis

    NASA Astrophysics Data System (ADS)

    Abdullahi, D.; Suresh, S.; Renukappa, S.; Oloke, D.

    2017-08-01

    Nigeria, potentially, has abundant sunshine throughout the year, making it full thirst for solar energy generation. Even though, the country’s solar energy projects have not realised a fair result over the years, due to many barriers associated with initiatives implementation. Therefore, the entire power sector remains incapacitated to generate, transmit and distribute a clean, affordable and sustainable energy to assist economic growth. The research integrated five African counterpart’s solar energy initiatives, barriers, policies and strategies adopted as a lesson learned to Nigeria. Inadequate solar initiative’s research, lack of technological know-how, short-term policies, lack of awareness and political instability are the major barriers that made the implementation of solar initiatives almost impossible in Nigeria. The shock of the barriers therefore, constitutes a major negative contribution to the crippling of the power sector in the state. Future research will concentrate on initiatives for mitigating solar and other renewable energy barriers.

  8. Parallel file system with metadata distributed across partitioned key-value store c

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
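
    The partitioned metadata store can be sketched as hash-routing each sub-file's metadata record to one of several partitions. In the patented design the partitions live on different compute nodes and communicate via a message passing interface (e.g. through MDHIM); this single-process toy, with hypothetical key and record names, omits that layer.

```python
import hashlib

class PartitionedMetadataStore:
    """Toy sketch of metadata partitioning for a shared file: each
    sub-file's metadata record is routed to a partition by hashing
    its key."""

    def __init__(self, num_partitions):
        self.partitions = [dict() for _ in range(num_partitions)]

    def _partition(self, key):
        # Stable hash so every node routes a key identically.
        digest = hashlib.sha1(key.encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def put(self, key, metadata):
        self.partitions[self._partition(key)][key] = metadata

    def get(self, key):
        return self.partitions[self._partition(key)].get(key)
```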

  9. Architecture for one-shot compressive imaging using computer-generated holograms.

    PubMed

    Macfaden, Alexander J; Kindness, Stephen J; Wilkinson, Timothy D

    2016-09-10

    We propose a synchronous implementation of compressive imaging. This method is mathematically equivalent to prevailing sequential methods, but uses a static holographic optical element to create a spatially distributed spot array from which the image can be reconstructed with an instantaneous measurement. We present the holographic design requirements and demonstrate experimentally that the linear algebra of compressed imaging can be implemented with this technique. We believe this technique can be integrated with optical metasurfaces, which will allow the development of new compressive sensing methods.

  10. A Hadoop-Based Algorithm of Generating DEM Grid from Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Jian, X.; Xiao, X.; Chengfang, H.; Zhizhong, Z.; Zhaohui, W.; Dengzhong, Z.

    2015-04-01

    Airborne LiDAR technology has proven to be among the most powerful tools for obtaining high-density, high-accuracy and significantly detailed surface information of terrain and surface objects within a short time, from which a high-quality Digital Elevation Model (DEM) can be extracted. Point cloud data generated from the pre-processed data must be classified by segmentation algorithms to separate terrain points from disorganized points, followed by a procedure that interpolates the selected points to turn them into DEM data. Because of the high point density, the whole procedure takes a long time and huge computing resources, which has been the focus of a number of studies. Hadoop is a distributed system infrastructure developed by the Apache Foundation, containing a highly fault-tolerant distributed file system (HDFS) with a high transmission rate and a parallel programming model (Map/Reduce); such a framework is appropriate for improving the efficiency of DEM generation algorithms. Point cloud data of Dongting Lake acquired by a Riegl LMS-Q680i laser scanner was used as the original data to generate a DEM with a Hadoop-based algorithm implemented in Linux, followed by a traditional procedure programmed in C++ as the comparative experiment. The algorithms' efficiency, coding complexity, and performance-cost ratio were then compared. The results demonstrate that the algorithm's speed depends on the size of the point set and the density of the DEM grid; the non-Hadoop implementation can achieve high performance when memory is large enough, but the Hadoop implementation achieves a higher performance-cost ratio when the point set is very large.
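
    The Map/Reduce structure of DEM generation can be sketched as a map step that bins ground points into grid cells and a reduce step that aggregates elevations per cell. A real pipeline would use a proper interpolator (e.g. inverse distance weighting) rather than a plain average, and the function names below are illustrative only.

```python
from collections import defaultdict

def map_points(points, cell_size):
    """Map step: emit a (grid cell, elevation) pair for each LiDAR
    ground point (x, y, z)."""
    for x, y, z in points:
        yield (int(x // cell_size), int(y // cell_size)), z

def reduce_cells(mapped):
    """Reduce step: aggregate the elevations that fall in each DEM cell;
    a plain average stands in for a real interpolator."""
    cells = defaultdict(list)
    for cell, z in mapped:
        cells[cell].append(z)
    return {cell: sum(zs) / len(zs) for cell, zs in cells.items()}

points = [(0.2, 0.3, 10.0), (0.8, 0.1, 12.0), (1.5, 0.4, 20.0)]
dem = reduce_cells(map_points(points, cell_size=1.0))
# cell (0, 0) averages the first two points
```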

  11. Stochastic DG Placement for Conservation Voltage Reduction Based on Multiple Replications Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhaoyu; Chen, Bokan; Wang, Jianhui

    2015-06-01

    Conservation voltage reduction (CVR) and distributed-generation (DG) integration are popular strategies implemented by utilities to improve energy efficiency. This paper investigates the interactions between CVR and DG placement to minimize load consumption in distribution networks, while keeping the lowest voltage level within a predefined range. The optimal placement of DG units is formulated as a stochastic optimization problem considering the uncertainty of DG outputs and load consumption. A sample average approximation algorithm-based technique is developed to solve the formulated problem effectively. A multiple replications procedure is developed to test the stability of the solution and calculate the confidence interval of the gap between the candidate solution and the optimal solution. The proposed method has been applied to the IEEE 37-bus distribution test system with different scenarios. The numerical results indicate that the implementations of CVR and DG, if combined, can achieve significant energy savings.
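
    The multiple replications procedure can be illustrated on a toy stochastic program: each replication solves an independent sample average approximation (SAA) and records the gap between the candidate solution's objective and that replication's optimum, from which a confidence interval is formed. The quadratic objective below is a simple stand-in for the DG placement problem, not the paper's formulation.

```python
import random
import statistics

def saa_optimum(samples):
    """SAA of min_x E[(x - D)^2]: the sample mean minimizes the
    sample-average objective (toy stand-in for DG placement)."""
    return sum(samples) / len(samples)

def objective(x, samples):
    return sum((x - d) ** 2 for d in samples) / len(samples)

def gap_confidence_interval(candidate, n_reps=30, n_samples=200, seed=7):
    """Multiple replications procedure: solve an independent SAA problem
    per replication and record the candidate-vs-optimum objective gap."""
    rng = random.Random(seed)
    gaps = []
    for _ in range(n_reps):
        samples = [rng.gauss(5.0, 1.0) for _ in range(n_samples)]
        x_star = saa_optimum(samples)
        gaps.append(objective(candidate, samples)
                    - objective(x_star, samples))
    mean = statistics.mean(gaps)
    half = 1.96 * statistics.stdev(gaps) / n_reps ** 0.5
    return mean - half, mean + half

lo, hi = gap_confidence_interval(candidate=5.0)  # tight interval near 0
```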

  12. A distributed control approach for power and energy management in a notional shipboard power system

    NASA Astrophysics Data System (ADS)

    Shen, Qunying

    The main goal of this thesis is to present a power control module (PCON) based approach for power and energy management and to examine its control capability in shipboard power system (SPS). The proposed control scheme is implemented in a notional medium voltage direct current (MVDC) integrated power system (IPS) for electric ship. To realize the control functions such as ship mode selection, generator launch schedule, blackout monitoring, and fault ride-through, a PCON based distributed power and energy management system (PEMS) is developed. The control scheme is proposed as two-layer hierarchical architecture with system level on the top as the supervisory control and zonal level on the bottom as the decentralized control, which is based on the zonal distribution characteristic of the notional MVDC IPS that was proposed as one of the approaches for Next Generation Integrated Power System (NGIPS) by Norbert Doerry. Several types of modules with different functionalities are used to derive the control scheme in detail for the notional MVDC IPS. Those modules include the power generation module (PGM) that controls the function of generators, the power conversion module (PCM) that controls the functions of DC/DC or DC/AC converters, etc. Among them, the power control module (PCON) plays a critical role in the PEMS. It is the core of the control process. PCONs in the PEMS interact with all the other modules, such as power propulsion module (PPM), energy storage module (ESM), load shedding module (LSHED), and human machine interface (HMI) to realize the control algorithm in PEMS. The proposed control scheme is implemented in real time using the real time digital simulator (RTDS) to verify its validity. To achieve this, a system level energy storage module (SESM) and a zonal level energy storage module (ZESM) are developed in RTDS to cooperate with PCONs to realize the control functionalities. 
In addition, a load shedding module which takes into account the reliability of power supply (in terms of quality of service) is developed. This module can supply uninterruptible power to the mission critical loads. In addition, a multi-agent system (MAS) based framework is proposed to implement the PCON based PEMS through a hardware setup that is composed of MAMBA boards and FPGA interface. Agents are implemented using Java Agent DEvelopment Framework (JADE). Various test scenarios were tested to validate the approach.

  13. Combined effect of CVR and penetration of DG in the voltage profile and losses of low-voltage secondary distribution networks

    NASA Astrophysics Data System (ADS)

    Bokhari, Abdullah

    Demarcations between traditional distribution power systems and distributed generation (DG) architectures are increasingly blurred as higher DG penetration is introduced into the system. Accommodating less restrictive interconnection policies in existing electric power systems (EPSs) while maintaining the reliability and performance of power delivery has been the major challenge for DG growth. The work in this dissertation studies power quality, energy savings and losses in a low-voltage distribution network under various DG penetration cases. A simulation platform suite that includes electric power system, distributed generation and ZIP load models is implemented to determine the impact of DGs on power system steady-state performance and on the voltage profile of the customers/loads in the network during voltage reduction events. The investigation tests the DG impact on the power system starting with one type of DG, then moves on to multiple DG types distributed in a random case and in a realistic/balanced case. The functionality of the proposed DG interconnection is designed to meet the basic requirements imposed by the various interconnection standards, most notably IEEE 1547, public service commission rules, and local utility regulation. It is found that implementing DGs on the low-voltage secondary network would improve customers' voltage profiles and system losses and would provide significant energy savings and economic benefits for utilities. In a network populated with DGs, the utility would have a uniform voltage profile at the customers' end, as the voltage profile becomes more concentrated around the targeted voltage level. The study further reinforced the concept that DG behaviour in a distribution network would improve voltage regulation, since a given percentage reduction on the utility side would ensure a uniform percentage reduction seen by all customers and reduce the number of voltage violations.
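
    The ZIP load model mentioned above expresses load power in constant-impedance (Z), constant-current (I) and constant-power (P) fractions of nominal; a minimal sketch (the coefficients below are illustrative, not from the dissertation) shows why a voltage reduction lowers consumption on a partly impedance-like load.

```python
def zip_power(p0, v_pu, z_frac, i_frac, p_frac):
    """ZIP load model: active power as a function of per-unit voltage.
    The Z, I and P fractions must sum to 1."""
    assert abs(z_frac + i_frac + p_frac - 1.0) < 1e-9
    return p0 * (z_frac * v_pu ** 2 + i_frac * v_pu + p_frac)

# A 2.5% voltage reduction (a typical CVR event) on an example load:
p_nominal = zip_power(100.0, 1.0, 0.5, 0.3, 0.2)    # 100.0 kW
p_reduced = zip_power(100.0, 0.975, 0.5, 0.3, 0.2)  # 96.78 kW
```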

  14. Impact of Mobile Dose-Tracking Technology on Medication Distribution at an Academic Medical Center.

    PubMed

    Kelm, Matthew; Campbell, Udobi

    2016-05-01

    Medication dose-tracking technologies have the potential to improve efficiency and reduce costs associated with re-dispensing doses reported as missing. Data describing this technology and its impact on the medication use process are limited. The purpose of this study is to assess the impact of dose-tracking technology on pharmacy workload and drug expense at an academic, acute care medical center. Dose-tracking technology was implemented in June 2014. Pre-implementation data were collected from February to April 2014. Post-implementation data were collected from July to September 2014. The primary endpoint was the percent of re-dispensed oral syringe and compounded sterile product (CSP) doses within the pre- and post-implementation periods per 1,000 discharges. Secondary endpoints included pharmaceutical expense generated from re-dispensing doses, labor costs, and staff satisfaction with the medication distribution process. We observed an average 6% decrease in re-dispensing of oral syringe and CSP doses from pre- to post-implementation (15,440 vs 14,547 doses; p = .047). However, when values were adjusted per 1,000 discharges, this trend did not reach statistical significance (p = .074). Pharmaceutical expense generated from re-dispensing doses was significantly reduced from pre- to post-implementation ($834,830 vs $746,466 [savings of $88,364]; p = .047). We estimated that $2,563 worth of technician labor was avoided in re-dispensing missing doses. We also saw significant improvement in staff perception of technology assisting in reducing missing doses (p = .0003), as well as improvement in effectiveness of resolving or minimizing missing doses (p = .01). The use of mobile dose-tracking technology demonstrated meaningful reductions in both the number of doses re-dispensed and cost of pharmaceuticals dispensed.

  15. A Test Generation Framework for Distributed Fault-Tolerant Algorithms

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.

    2009-01-01

    Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.

  16. Analysis of BaBar data for three meson tau decay modes using the Tauola generator

    DOE PAGES

    Shekhovtsova, Olga

    2014-11-24

    The hadronic current for the τ⁻ → π⁻π⁺π⁻ν_τ decay, calculated in the framework of Resonance Chiral Theory with an additional modification to include the σ meson, is described. Its implementation in the Monte Carlo generator Tauola and the fitting strategy used to obtain the model parameters from one-dimensional distributions are also discussed. The results of the fit to the one-dimensional invariant-mass spectrum of the BaBar data are presented.

  17. The effect of the NERC CIP standards on the reliability of the North American Bulk Electric System

    DOE PAGES

    Ladendorff, Marlene Z.

    2016-06-01

    Considerable money and effort have been expended by generation, transmission, and distribution entities in North America to implement the North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) standards for the bulk electric system. It has been assumed that, as a result of implementing the standards, the grid is more cyber-secure than it was pre-NERC CIP, but are there data supporting these claims, or only speculation? Has the implementation of the standards had a measurable effect on the grid? A research study developed to address these and other questions produced surprising results.

  18. Providing Limited Local Electric Service During a Major Grid Outage: A First Assessment Based on Customer Willingness to Pay.

    PubMed

    Baik, Sunhee; Morgan, M Granger; Davis, Alexander L

    2018-02-01

    While they are rare, widespread blackouts of the bulk power system can result in large costs to individuals and society. If local distribution circuits remain intact, it is possible to use new technologies including smart meters, intelligent switches that can change the topology of distribution circuits, and distributed generation owned by customers and the power company, to provide limited local electric power service. Many utilities are already making investments that would make this possible. We use customers' measured willingness to pay to explore when the incremental investments needed to implement these capabilities would be justified. Under many circumstances, upgrades in advanced distribution systems could be justified for a customer charge of less than a dollar a month (plus the cost of electricity used during outages), and would be less expensive and safer than the proliferation of small portable backup generators. We also discuss issues of social equity, extreme events, and various sources of underlying uncertainty. © 2017 Society for Risk Analysis.

  19. A parallel Monte Carlo code for planar and SPECT imaging: implementation, verification and applications in (131)I SPECT.

    PubMed

    Dewaraja, Yuni K; Ljungberg, Michael; Majumdar, Amitava; Bose, Abhijit; Koral, Kenneth F

    2002-02-01

    This paper reports the implementation of the SIMIND Monte Carlo code on an IBM SP2 distributed memory parallel computer. Basic aspects of running Monte Carlo particle transport calculations on parallel architectures are described. Our parallelization is based on equally partitioning photons among the processors and uses the Message Passing Interface (MPI) library for interprocessor communication and the Scalable Parallel Random Number Generator (SPRNG) to generate uncorrelated random number streams. These parallelization techniques are also applicable to other distributed memory architectures. A linear increase in computing speed with the number of processors is demonstrated for up to 32 processors. This speed-up is especially significant in Single Photon Emission Computed Tomography (SPECT) simulations involving higher energy photon emitters, where explicit modeling of the phantom and collimator is required. For (131)I, the accuracy of the parallel code is demonstrated by comparing simulated and experimental SPECT images from a heart/thorax phantom. Clinically realistic SPECT simulations using the voxel-man phantom are carried out to assess scatter and attenuation correction.
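
    The parallelization strategy described above, equally partitioning photon histories among processors with each processor drawing from its own uncorrelated random number stream, can be sketched as below. Independently seeded Python generators stand in for SPRNG streams, the workers run sequentially for clarity, and a toy detection probability replaces real photon transport.

```python
import random

def simulate_photons(n_photons, stream_seed):
    """Stand-in for one processor's photon-history loop; each worker
    gets its own independently seeded generator (SPRNG plays this role
    in the real code, guaranteeing uncorrelated streams)."""
    rng = random.Random(stream_seed)
    detected = 0
    for _ in range(n_photons):
        if rng.random() < 0.1:  # toy 10% detection probability
            detected += 1
    return detected

def parallel_run(total_photons, n_workers):
    """Equally partition photons among workers, as in the SIMIND port;
    here the 'workers' run one after another for simplicity."""
    per_worker = total_photons // n_workers
    return sum(simulate_photons(per_worker, seed)
               for seed in range(n_workers))

counts = parallel_run(100000, n_workers=4)  # close to 10% of 100000
```

    Because each stream is independent, the per-worker counts can be summed regardless of execution order, which is what makes the linear speed-up reported above possible.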

  20. Advanced Power Electronic Interfaces for Distributed Energy Systems, Part 2: Modeling, Development, and Experimental Evaluation of Advanced Control Functions for Single-Phase Utility-Connected Inverter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, S.; Kroposki, B.; Kramer, W.

    Integrating renewable energy and distributed generation into the Smart Grid architecture requires power electronics (PE) for energy conversion. The key to successful Smart Grid implementation is developing interoperable, intelligent, and advanced PE technology that improves and accelerates the use of distributed energy resource systems. This report describes the simulation, design, and testing of a single-phase DC-to-AC inverter developed to operate in both islanded and utility-connected mode. It provides results on both the simulations and the experiments conducted, demonstrating the ability of the inverter to provide advanced control functions such as power flow and VAR/voltage regulation. This report also analyzes two different techniques used for digital signal processor (DSP) code generation. Initially, the DSP code was written in the C programming language using Texas Instruments' Code Composer Studio. In a later stage of the research, the Simulink DSP toolbox was used to generate code for the DSP. The successful tests using Simulink-generated DSP code show promise for fast prototyping of PE controls.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dalimunthe, Amty Ma’rufah Ardhiyah; Mindara, Jajat Yuda; Panatarani, Camellia

    Smart grids and distributed generation may be part of the solution to global climate change and to the energy crisis caused by reliance on fossil fuel as the main source of electrical power generation. To meet rising electrical power demand and increasing service-quality requirements, and to reduce pollution, the existing power grid infrastructure should be developed into a smart grid with distributed power generation, which provides a great opportunity to address issues related to energy efficiency, energy security, power quality and aging infrastructure. The existing conventional distribution system is an AC grid, while renewable resources favour a DC grid system. This paper explores a smart DC grid model with stable power generation and minimal, compact circuitry that can be implemented very cost-effectively with simple components. PC-based application software was developed to show the condition of the grid and to control it, making the grid 'smart'. The model is then subjected to severe system perturbations, such as incremental changes in loads, to test the stability of the system. It is concluded that the system is able to detect and control voltage stability, indicating the ability of the power system to maintain steady voltage within permissible ranges under normal conditions.

  2. Smart grid integration of small-scale trigeneration systems

    NASA Astrophysics Data System (ADS)

    Vacheva, Gergana; Kanchev, Hristiyan; Hinov, Nikolay

    2017-12-01

    This paper presents a study on the possibilities for implementing local heating, air-conditioning and electricity generation (trigeneration) as a distributed energy resource in the Smart Grid. By means of microturbine-based generators and absorption chillers, buildings are able to meet their electrical load curve partially or entirely, or even supply power to the grid, by following their daily heating and air-conditioning schedule. The principles of small-scale cooling, heating and power generation systems are presented first; then the thermal calculations of an example building are performed: the heat losses due to thermal conductivity and the estimated daily heating and air-conditioning load curves. By considering daily power-consumption curves and weather data for several winter and summer days, the heating/air-conditioning schedule is estimated along with the electrical energy available from a microturbine-based cogeneration system. Simulation results confirm the potential of using cogeneration and trigeneration systems for local distributed electricity generation and for grid support during daily peaks of power consumption.
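
    The thermal calculation described above starts from transmission heat losses through the building envelope. A minimal sketch of the conduction term is given below; the U-value, area, and temperatures are illustrative figures, not the example building's.

```python
def conduction_loss(u_value, area_m2, t_in, t_out):
    """Steady-state transmission heat loss Q = U * A * (T_in - T_out),
    the basic conduction term in a building's thermal balance.
    u_value in W/(m^2 K), area in m^2, temperatures in degrees C."""
    return u_value * area_m2 * (t_in - t_out)

# Example: 120 m^2 of wall at U = 0.35 W/(m^2 K), 20 C inside, -5 C outside
q_watts = conduction_loss(0.35, 120.0, 20.0, -5.0)  # 1050 W
```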

  3. Programmable quantum random number generator without postprocessing.

    PubMed

    Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping

    2018-02-15

    We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.

  4. High-speed polarization-encoded quantum key distribution based on silicon photonic integrated devices

    NASA Astrophysics Data System (ADS)

    Bunandar, Darius; Urayama, Junji; Boynton, Nicholas; Martinez, Nicholas; Derose, Christopher; Lentine, Anthony; Davids, Paul; Camacho, Ryan; Wong, Franco; Englund, Dirk

    We present a compact polarization-encoded quantum key distribution (QKD) transmitter near a 1550-nm wavelength implemented on a CMOS-compatible silicon-on-insulator photonics platform. The transmitter generates arbitrary polarization qubits at gigahertz bandwidth with an extinction ratio better than 30 dB using high-speed carrier-depletion phase modulators. We demonstrate the performance of this device by generating secret keys at a rate of 1 Mbps in a complete QKD field test. Our work shows the potential of using advanced photonic integrated circuits to enable high-speed quantum-secure communications. This work was supported by the SECANT QKD Grand Challenge, the Samsung Global Research Outreach Program, and the Air Force Office of Scientific Research.

  5. Generating and using truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2012-01-01

    The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
    Program summary
    Program title: TRQS
    Catalogue identifier: AEKA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 7924
    No. of bytes in distributed program, including test data, etc.: 88 651
    Distribution format: tar.gz
    Programming language: Mathematica, C
    Computer: requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a machine supporting a recent version of Mathematica
    Operating system: any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
    RAM: case dependent
    Classification: 4.15
    Nature of problem: generation of random density matrices.
    Solution method: use of a physical quantum random number generator.
    Running time: generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.
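    A common recipe for the random density matrices the package produces is the Hilbert-Schmidt (Ginibre) construction sketched below. This is a standard method, not necessarily the exact ensemble TRQS uses, and numpy's pseudorandom generator stands in for the Quantis hardware source.

```python
import numpy as np

def random_density_matrix(dim, rng=None):
    """Draw a random density matrix from the Hilbert-Schmidt ensemble:
    rho = G G† / tr(G G†), with G a complex Gaussian (Ginibre) matrix.
    The result is Hermitian, positive semi-definite, and has unit trace."""
    rng = np.random.default_rng() if rng is None else rng
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rho = random_density_matrix(4)
print(np.trace(rho).real)   # ~1.0; all eigenvalues are non-negative
```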

  6. Super-resolving random-Gaussian apodized photon sieve.

    PubMed

    Sabatyan, Arash; Roshaninejad, Parisa

    2012-09-10

    A novel apodized photon sieve is presented in which a random dense Gaussian distribution is implemented to modulate the pinhole density in each zone. The randomness of this distribution causes intrazone discontinuities. The dense Gaussian distribution also generates a substantial number of pinholes, producing a large degree of overlap between the holes in a few innermost zones of the photon sieve; thereby, clear zones are formed. The role of the discontinuities in the focusing properties of the photon sieve is examined as well. Analysis shows that the secondary maxima are evidently suppressed, transmission increases enormously, and the width of the central maximum is approximately unchanged in comparison to the dense Gaussian distribution. The theoretical results are fully verified by experiment.

  7. The design and implementation of the Technical Facilities Controller (TFC) for the Goldstone deep space communications complex

    NASA Technical Reports Server (NTRS)

    Killian, D. A.; Menninger, F. J.; Gorman, T.; Glenn, P.

    1988-01-01

    The Technical Facilities Controller is a microprocessor-based energy management system that is to be implemented in the Deep Space Network facilities. This system is used in conjunction with facilities equipment at each of the complexes in the operation and maintenance of air-conditioning equipment, power generation equipment, power distribution equipment, and other primary facilities equipment. The implementation of the Technical Facilities Controller was completed at the Goldstone Deep Space Communications Complex and is now operational. The installation completed at the Goldstone Complex is described and the utilization of the Technical Facilities Controller is evaluated. The findings will be used in the decision to implement a similar system at the overseas complexes at Canberra, Australia, and Madrid, Spain.

  8. Execution time supports for adaptive scientific algorithms on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments consisting of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives the appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
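    The runtime idea can be sketched in a few lines: an "inspector" pass derives which off-processor indices a loop will touch, so the needed elements can be gathered before execution. The function names and the sequential stand-in for message passing below are our own illustration, not the PARTI API.

```python
import numpy as np

def inspector(global_indices, owned_range):
    """Split the global indices a loop references into locally owned
    indices and off-processor indices (the derived communication pattern)."""
    lo, hi = owned_range
    local = [i for i in global_indices if lo <= i < hi]
    remote = [i for i in global_indices if not (lo <= i < hi)]
    return local, remote

global_array = np.arange(100.0)   # conceptually distributed across ranks
owned = (0, 50)                   # this rank owns global indices [0, 50)
refs = [3, 48, 72, 99]            # indices the upcoming loop will touch
local, remote = inspector(refs, owned)
ghost = {i: global_array[i] for i in remote}   # "gather" of remote values
print(local, sorted(ghost))      # [3, 48] [72, 99]
```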

  9. Execution time support for scientific programs on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments consisting of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives the appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.

  10. Waste prevention in liquid detergent distribution: a comparison based on life cycle assessment.

    PubMed

    Nessi, Simone; Rigamonti, Lucia; Grosso, Mario

    2014-11-15

    The distribution of liquid detergents through self-dispensing systems has been adopted in some Italian retail stores over the last few years. By enabling the consumer to refill the same container several times, it is proposed as a less waste-generating and more environmentally friendly alternative to traditional distribution in single-use plastic containers. For this reason, its implementation is encouraged by the national waste prevention programme recently adopted in Italy. To assess such claims, a life cycle assessment was carried out to evaluate whether detergent distribution through self-dispensing systems actually achieves the expected reduction in waste generation and environmental impacts. The focus was on distribution within the large-scale retail trade and on the categories of laundry detergents, fabric softeners and hand dishwashing detergents. For each of them, a set of baseline single-use scenarios was compared with two alternative waste prevention scenarios in which the detergent is distributed through self-dispensing systems. In addition to waste generation, the Cumulative Energy Demand and thirteen midpoint-level potential impact indicators were calculated for the comparison. Results showed that a reduction in waste generation of up to 98% can be achieved, depending on the category of detergent, the baseline scenario of comparison and the number of times the refillable container is used. A progressive reduction in energy demand and in most of the potential impacts was also observed, starting from a minimum number of uses of the refillable container. Copyright © 2014 Elsevier B.V. All rights reserved.
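    The packaging-waste arithmetic underlying the comparison is simple to make concrete. The container masses below are invented for illustration; the study's inventory data are far more detailed.

```python
def waste_reduction(single_use_mass_g, refill_mass_g, n_uses):
    """Fractional packaging-waste reduction from reusing one refillable
    container n_uses times instead of n_uses single-use containers."""
    baseline = single_use_mass_g * n_uses   # n single-use bottles discarded
    refill = refill_mass_g                  # one container over its life
    return 1.0 - refill / baseline

# 40 g single-use bottle vs a 50 g refillable bottle used 20 times
print(round(waste_reduction(40.0, 50.0, 20) * 100, 1))   # 93.8 (% reduction)
```

    The reduction grows with the number of refills, which is why the study reports results as a function of container uses.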

  11. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with Web service standards such as the Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool consists mainly of two functions: a display function, implemented using OpenLayers, and a cutting function, implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool not only displays WMTS layers in a web browser but also generates a geological map image of the intended area and zoom level. At present, the available WMTS layers are limited to those distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format readable by many kinds of Geographical Information Systems. WebGL is useful for examining the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should help create better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file format.

  12. Advanced Unstructured Grid Generation for Complex Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2008-01-01

    A new approach for distribution of grid points on the surface and in the volume has been developed and implemented in the NASA unstructured grid generation code VGRID. In addition to the point and line sources of prior work, the new approach utilizes surface and volume sources for automatic curvature-based grid sizing and convenient point distribution in the volume. A new exponential growth function produces smoother and more efficient grids and provides superior control over distribution of grid points in the field. All types of sources support anisotropic grid stretching which not only improves the grid economy but also provides more accurate solutions for certain aerodynamic applications. The new approach does not require a three-dimensional background grid as in the previous methods. Instead, it makes use of an efficient bounding-box auxiliary medium for storing grid parameters defined by surface sources. The new approach is less memory-intensive and more efficient computationally. The grids generated with the new method either eliminate the need for adaptive grid refinement for certain class of problems or provide high quality initial grids that would enhance the performance of many adaptation methods.
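    The effect of a growth function on point distribution is easy to illustrate. The sketch below uses simple geometric growth of spacing away from a source, which conveys the clustering idea but is not VGRID's actual exponential function; all parameter values are invented.

```python
def graded_points(x0, first_spacing, growth, n):
    """Return n+1 point locations where spacing grows geometrically:
    s_i = first_spacing * growth**i. Small growth factors give smooth
    clustering near x0 and economy in the far field."""
    pts, x, s = [x0], x0, first_spacing
    for _ in range(n):
        x += s
        pts.append(x)
        s *= growth
    return pts

print(graded_points(0.0, 0.1, 1.5, 4))
# spacings 0.1, 0.15, 0.225, 0.3375 -> points reach about 0.8125
```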

  13. Generating the local oscillator "locally" in continuous-variable quantum key distribution based on coherent detection

    DOE PAGES

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.; ...

    2015-10-21

    Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In this paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a “locally” generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.

  14. A Distributed Fuzzy Associative Classifier for Big Data.

    PubMed

    Segatori, Armando; Bechini, Alessio; Ducange, Pietro; Marcelloni, Francesco

    2017-09-19

    Fuzzy associative classification has not been widely analyzed in the literature, although associative classifiers (ACs) have proved to be very effective in different real-domain applications. The main reason is that learning fuzzy ACs is a very heavy task, especially when dealing with large datasets. To overcome this drawback, in this paper we propose an efficient distributed fuzzy associative classification approach based on the MapReduce paradigm. The approach exploits a novel distributed discretizer based on fuzzy entropy for efficiently generating fuzzy partitions of the attributes. Then, a set of candidate fuzzy association rules is generated by employing a distributed fuzzy extension of the well-known FP-Growth algorithm. Finally, this set is pruned using three purposely adapted types of pruning. We implemented our approach on the popular Hadoop framework, which distributes the storage and processing of very large data sets on computer clusters built from commodity hardware. We performed extensive experimentation and a detailed analysis of the results using six very large datasets with up to 11,000,000 instances, and also experimented with different types of reasoning methods. Focusing on accuracy, model complexity, computation time, and scalability, we compare the results achieved by our approach with those obtained by two distributed nonfuzzy ACs recently proposed in the literature. We highlight that, although the accuracies are comparable, the complexity of the classifiers generated by the fuzzy distributed approach, evaluated in terms of the number of rules, is lower than that of the nonfuzzy classifiers.
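    The fuzzy partitions the discretizer produces are typically strong triangular partitions, where memberships over each attribute sum to one. A minimal sketch of that structure (uniform peak placement here is a simplification; the paper's discretizer places partitions using fuzzy entropy):

```python
def triangular_partition(xmin, xmax, n_fuzzy_sets):
    """Uniform strong triangular fuzzy partition of [xmin, xmax]:
    each set is a triangle (a, b, c) peaking at b, and the memberships
    of any interior point sum to 1."""
    step = (xmax - xmin) / (n_fuzzy_sets - 1)
    return [(xmin + (i - 1) * step, xmin + i * step, xmin + (i + 1) * step)
            for i in range(n_fuzzy_sets)]

def membership(x, abc):
    """Triangular membership degree of x in the fuzzy set (a, b, c)."""
    a, b, c = abc
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

sets = triangular_partition(0.0, 10.0, 3)   # peaks at 0, 5, 10
print([round(membership(4.0, s), 2) for s in sets])   # [0.2, 0.8, 0.0]
```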

  15. High-Assurance Spiral

    DTIC Science & Technology

    2017-11-01

    Public Release; Distribution Unlimited (PA# 88ABW-2017-5388, cleared 30 OCT 2017). Cyber-physical systems … physical processes that interact in intricate manners. This makes verification of the software complex and unwieldy. In this report, an approach towards … resulting implementations. Subject terms: cyber-physical systems, formal guarantees, code generation.

  16. Parallelization of NAS Benchmarks for Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.

  17. BIRD: A general interface for sparse distributed memory simulators

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1990-01-01

    Kanerva's sparse distributed memory (SDM) has now been implemented for at least six different computers, including SUN3 workstations, the Apple Macintosh, and the Connection Machine. A common interface for input of commands would both aid testing of programs on a broad range of computer architectures and assist users in transferring results from research environments to applications. A common interface also allows secondary programs to generate command sequences for a sparse distributed memory, which may then be executed on the appropriate hardware. The BIRD program is an attempt to create such an interface. Simplifying access to different simulators should assist developers in finding appropriate uses for SDM.

  18. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  19. Empirical performance of the multivariate normal universal portfolio

    NASA Astrophysics Data System (ADS)

    Tan, Choon Peng; Pang, Sook Theng

    2013-09-01

    Universal portfolios generated by the multivariate normal distribution are studied, with emphasis on the case where the variables are dependent, namely, where the covariance matrix is not diagonal. The moving-order multivariate normal universal portfolio requires a very long running time and a large amount of computer memory in its implementation. With the objective of reducing memory and running time, the finite-order universal portfolio is introduced. Some stock-price data sets are selected from the local stock exchange, and the finite-order universal portfolio is run on them for small finite orders. Empirically, it is shown that the portfolio can outperform the moving-order Dirichlet universal portfolio of Cover and Ordentlich [2] for certain parameters in the selected data sets.
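    The building block that all universal portfolios average over is the wealth of a constant-rebalanced portfolio (CRP). A minimal sketch of that computation is below; the multivariate normal universal portfolio then weights such CRPs by a normal density over the weight simplex. The price relatives here are invented.

```python
import numpy as np

def crp_wealth(b, price_relatives):
    """Wealth after rebalancing to fixed fractions b at each period.
    price_relatives[t][j] is asset j's price ratio over period t."""
    w = 1.0
    for x in price_relatives:
        w *= float(np.dot(b, x))   # period return of the rebalanced mix
    return w

# Two assets over two trading periods (illustrative numbers)
x_seq = [np.array([1.05, 0.98]), np.array([0.97, 1.06])]
print(crp_wealth(np.array([0.5, 0.5]), x_seq))
```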

  20. Learning from Massive Distributed Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Kang, E. L.; Braverman, A. J.

    2013-12-01

    Technologies for remote sensing and ever-expanding computer experiments in climate science are generating massive data sets. Meanwhile, it has become common in all areas of large-scale science for these 'big data' to be distributed over multiple physical locations, and moving large amounts of data can be impractical. In this talk, we discuss efficient ways to summarize and learn from distributed data. We formulate a graphical model to mimic the main characteristics of a distributed-data network, including the size of the data sets and the speed of moving data. With this nominal model, we investigate the trade-off between prediction accuracy and the cost of data movement, theoretically and through simulation experiments. We also discuss new implementations of spatial and spatio-temporal statistical methods optimized for distributed data.

  1. Sequential data assimilation for a distributed hydrologic model considering different time scale of internal processes

    NASA Astrophysics Data System (ADS)

    Noh, S.; Tachikawa, Y.; Shiiba, M.; Kim, S.

    2011-12-01

    Applications of sequential data assimilation methods have been increasing in hydrology as a means of reducing uncertainty in model prediction. In a distributed hydrologic model, there are many types of state variables, and each interacts with the others on different time scales. However, a framework to deal with the delayed response that originates from the different time scales of hydrologic processes has not been thoroughly addressed in hydrologic data assimilation. In this study, we propose a lagged filtering scheme to account for the lagged response of internal states in a distributed hydrologic model using two filtering schemes: particle filtering (PF) and ensemble Kalman filtering (EnKF). The EnKF is a widely used sub-optimal filter that is computationally efficient with a limited number of ensemble members but relies on a Gaussian approximation. The PF is an alternative in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles, without any assumptions about the nature of the distributions involved. For the PF, an advanced particle regularization scheme is also implemented to preserve the diversity of the particle system; for the EnKF, the ensemble square root filter (EnSRF) is implemented. Each filtering method is parallelized and implemented on a high performance computing system. A distributed hydrologic model, the water and energy transfer processes (WEP) model, is applied to the Katsura River catchment, Japan, to demonstrate the applicability of the proposed approaches. Forecast results via PF and EnKF are compared and analyzed in terms of prediction accuracy and probabilistic adequacy. The discussion focuses on the prospects and limitations of each data assimilation method.
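    The core particle-filter step that the regularization scheme builds on is resampling: mapping weighted particles to an equally weighted set. A minimal sketch using systematic resampling (a common choice; the paper's lagged and regularized variants add further machinery on top):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling for a generic SIR particle filter: return
    indices of the particles to keep, drawn so that a particle with
    normalized weight w is selected about n*w times."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one stratified draw
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(0)
w = np.array([0.1, 0.2, 0.3, 0.4])   # normalized particle weights
idx = systematic_resample(w, rng)
print(idx)   # heavier particles are duplicated more often
```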

  2. Stimulated Brillouin scattering in ultra-long distributed feedback Bragg gratings in standard optical fiber.

    PubMed

    Loranger, Sébastien; Lambin-Iezzi, Victor; Wahbeh, Mamoun; Kashyap, Raman

    2016-04-15

    Distributed feedback (DFB) fiber Bragg gratings (FBGs) are widely used as narrow-band filters and single-mode cavities for lasers. Recently, nonlinear generation has been demonstrated in 10-20 cm DFB gratings in a highly nonlinear fiber. In this Letter, we first present a novel fabrication technique for ultra-long DFBs in a standard fiber (SMF-28); we then demonstrate nonlinear generation in such gratings. A particular inscription technique was used to fabricate all-in-phase ultra-long FBGs and to implement a reproducible phase shift to form a DFB mode. We demonstrate stimulated Brillouin scattering (SBS) emission from this DFB mode and characterize the resulting laser. Such an SBS-based DFB laser appears to stabilize the pump's jitter and reduce its linewidth.

  3. Compiling global name-space programs for distributed execution

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush

    1990-01-01

    Distributed memory machines do not provide hardware support for a global address space. Thus programmers are forced to partition the data across the memories of the architecture and use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for the analysis of a high-level source program and its translation to a set of independently executing tasks communicating via messages. If the compiler has enough information, this translation can be carried out at compile time. Otherwise, run-time code is generated to implement the required data movement. The analysis required in both situations is described, and the performance of the generated code on the Intel iPSC/2 is presented.

  4. Optimal Interpolation scheme to generate reference crop evapotranspiration

    NASA Astrophysics Data System (ADS)

    Tomas-Burguera, Miquel; Beguería, Santiago; Vicente-Serrano, Sergio; Maneta, Marco

    2018-05-01

    We used an Optimal Interpolation (OI) scheme to generate a reference crop evapotranspiration (ETo) grid, forcing meteorological variables, and their respective error variance in the Iberian Peninsula for the period 1989-2011. To perform the OI we used observational data from the Spanish Meteorological Agency (AEMET) and outputs from a physically-based climate model. To compute ETo we used five OI schemes to generate grids for the five observed climate variables necessary to compute ETo using the FAO-recommended form of the Penman-Monteith equation (FAO-PM). The granularity of the resulting grids are less sensitive to variations in the density and distribution of the observational network than those generated by other interpolation methods. This is because our implementation of the OI method uses a physically-based climate model as prior background information about the spatial distribution of the climatic variables, which is critical for under-observed regions. This provides temporal consistency in the spatial variability of the climatic fields. We also show that increases in the density and improvements in the distribution of the observational network reduces substantially the uncertainty of the climatic and ETo estimates. Finally, a sensitivity analysis of observational uncertainties and network densification suggests the existence of a trade-off between quantity and quality of observations.
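    The OI analysis step applied per climate variable has a standard form: blend a model background with observations, weighted by their error covariances. A minimal two-point sketch (all covariance values below are invented):

```python
import numpy as np

# Optimal Interpolation analysis:
#   x_a = x_b + K (y - H x_b),   K = B H^T (H B H^T + R)^(-1)
# x_b: background field, B: background error covariance,
# H: observation operator, y: observations, R: observation error covariance.

def oi_analysis(x_b, B, H, y, R):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return x_b + K @ (y - H @ x_b)

x_b = np.array([10.0, 12.0])             # background at two grid points
B = np.array([[1.0, 0.5], [0.5, 1.0]])   # correlated background errors
H = np.array([[1.0, 0.0]])               # one observation, at point 1
y = np.array([11.0])
R = np.array([[1.0]])                    # observation error variance
x_a = oi_analysis(x_b, B, H, y, R)
print(x_a)   # [10.5, 12.25]: point 1 pulled halfway; point 2 via B
```

    Equal background and observation error variances split the innovation evenly, and the off-diagonal term in B spreads the correction to the unobserved point, which is how the physically based model background informs under-observed regions.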

  5. The NatCarb geoportal: Linking distributed data from the Carbon Sequestration Regional Partnerships

    USGS Publications Warehouse

    Carr, T.R.; Rich, P.M.; Bartley, J.D.

    2007-01-01

    The Department of Energy (DOE) Carbon Sequestration Regional Partnerships are generating the data for a "carbon atlas" of key geospatial data (carbon sources, potential sinks, etc.) required for rapid implementation of carbon sequestration on a broad scale. The NATional CARBon Sequestration Database and Geographic Information System (NatCarb) provides Web-based, nation-wide data access. Distributed computing solutions link partnerships and other publicly accessible repositories of geological, geophysical, natural resource, infrastructure, and environmental data. Data are maintained and enhanced locally, but assembled and accessed through a single geoportal. NatCarb, as a first attempt at a national carbon cyberinfrastructure (NCCI), assembles the data required to address technical and policy challenges of carbon capture and storage. We present a path forward to design and implement a comprehensive and successful NCCI. © 2007 The Haworth Press, Inc. All rights reserved.

  6. Secret Key Generation via a Modified Quantum Secret Sharing Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith IV, Amos M; Evans, Philip G; Lawrie, Benjamin J

    We present and experimentally demonstrate a novel protocol for distributing secret information between two and only two parties in an N-party single-qubit Quantum Secret Sharing (QSS) system. We demonstrate this new algorithm with N = 3 active parties over 6 km of telecom fiber. Our experimental device is based on the Clavis2 Quantum Key Distribution (QKD) system built by ID Quantique but is generalizable to any implementation. We show that any two of the N parties can build secret keys based on partial information from each other and with collaboration from the remaining N − 2 parties. This algorithm allows for the creation of two-party secret keys where standard QSS does not, and significantly reduces the number of resources needed to implement QKD on a highly connected network such as the electrical grid.

  7. Energy sweep compensation of induction accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sampayan, S.E.; Caporaso, G.J.; Chen, Y-J

    1990-09-12

    The ETA-II linear induction accelerator (LIA) is designed to drive a microwave free electron laser (FEL). Beam energy sweep must be limited to ±1% for 50 ns to limit beam corkscrew motion and ensure high power FEL output over the full duration of the beam flattop. To achieve this energy sweep requirement, we have implemented a pulse distribution system and are planning the implementation of a tapered pulse forming line (PFL) in the pulse generators driving the acceleration gaps. The pulse distribution system assures proper phasing of the high voltage pulse to the electron beam. Additionally, cell-to-cell coupling of beam-induced transients is reduced. The tapered PFL compensates for accelerator cell and loading nonlinearities. Circuit simulations show good agreement with preliminary data and predict that the energy sweep requirement can be met.

  8. The structure of the clouds distributed operating system

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1989-01-01

    A novel system architecture, based on the object model, is the central structuring concept used in the Clouds distributed operating system. This architecture makes Clouds attractive over a wide class of machines and environments. Clouds is a native operating system, designed and implemented at Georgia Tech, that runs on a set of general-purpose computers connected via a local area network. The system architecture of Clouds is composed of a system-wide global set of persistent (long-lived) virtual address spaces, called objects, that contain persistent data and code. The object concept is implemented at the operating system level, thus presenting a single-level storage view to the user. Lightweight threads carry computational activity through the code stored in the objects. The persistent objects and threads give rise to a programming environment composed of shared permanent memory, dispensing with the need for hardware-derived concepts such as file systems and message systems. Though the hardware may be distributed and may have disks and networks, Clouds provides applications with a logically centralized system based on a shared, structured, single-level store. The current design of Clouds uses a minimalist philosophy with respect to both the kernel and the operating system: the kernel and the operating system support a bare minimum of functionality. Clouds also adheres to the concept of separation of policy and mechanism. Most low-level operating system services are implemented above the kernel, and most high-level services are implemented at the user level. From the measured performance of the kernel mechanisms, we demonstrate that efficient implementations of the object model are feasible on commercially available hardware. Clouds provides a rich environment for conducting research in distributed systems. Some of the topics addressed in this paper include distributed programming environments, consistency of persistent data, and fault tolerance.

  9. Experimental comparison of PV-smoothing controllers using distributed generators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Ellis, Abraham; Denda, Atsushi

    The power output variability of photovoltaic systems can affect local electrical grids in locations with high renewable energy penetrations or weak distribution or transmission systems. In those rare cases, quick controllable generators (e.g., energy storage systems) or loads can counteract the destabilizing effects by compensating for the power fluctuations. Previously, control algorithms for coordinated and uncoordinated operation of a small natural gas engine-generator (genset) and a battery for smoothing PV plant output were optimized using MATLAB/Simulink simulations. The simulations demonstrated that a traditional generation resource such as a natural gas genset in combination with a battery would smooth the photovoltaic output while using a smaller battery state of charge (SOC) range and extending the life of the battery. This paper reports on the experimental implementation of the coordinated and uncoordinated controllers to verify the simulations and determine the differences in the controllers. The experiments were performed with the PNM PV and energy storage Prosperity site and a gas engine-generator located at the Aperture Center at Mesa Del Sol in Albuquerque, New Mexico. Two field demonstrations were performed to compare the different PV smoothing control algorithms: (1) implementing the coordinated and uncoordinated controls while switching off a subsection of the PV array at precise times on successive clear days, and (2) comparing the results of the battery and genset outputs for the coordinated control on a high variability day with simulations of the coordinated and uncoordinated controls. It was found that for certain PV power profiles the SOC range of the battery may be larger with the coordinated control, but the total amp-hours through the battery-which approximates battery wear-will always be smaller with the coordinated control.
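    The coordinated controller described above splits the smoothing duty between a ramp-limited genset and a battery. A minimal sketch of that idea follows; it is not the authors' MATLAB/Simulink controller, and `smooth_pv`, `coordinated_dispatch`, the exponential-smoothing target, and the ramp limit are illustrative assumptions.

```python
import numpy as np

def smooth_pv(pv_power, alpha=0.1):
    """Exponentially smoothed plant output used as the dispatch target."""
    target = np.empty_like(pv_power)
    target[0] = pv_power[0]
    for k in range(1, len(pv_power)):
        target[k] = (1 - alpha) * target[k - 1] + alpha * pv_power[k]
    return target

def coordinated_dispatch(pv_power, target, genset_ramp=0.02):
    """Coordinated split: the ramp-limited genset tracks the slow deficit,
    and the battery covers whatever fast residual remains."""
    genset = np.zeros_like(pv_power)
    battery = np.zeros_like(pv_power)
    for k in range(1, len(pv_power)):
        deficit = target[k] - pv_power[k]          # power needed to hit the target
        step = np.clip(deficit - genset[k - 1], -genset_ramp, genset_ramp)
        genset[k] = max(0.0, genset[k - 1] + step)
        battery[k] = deficit - genset[k]           # battery absorbs the rest
    return genset, battery
```

    By construction, PV plus genset plus battery power equals the smoothed target, so the battery only handles the component the genset's ramp limit cannot follow; this division of labor is the mechanism the experiments credit with reducing battery throughput.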

  10. Including a Service Learning Educational Research Project in a Biology Course-I: Assessing Community Awareness of Childhood Lead Poisoning

    ERIC Educational Resources Information Center

    Abu-Shakra, Amal; Saliim, Eric

    2012-01-01

    A university course project was developed and implemented in a biology course, focusing on environmental problems, to assess community awareness of childhood lead poisoning. A set of 385 questionnaires was generated and distributed in an urban community in North Carolina, USA. The completed questionnaires were sorted first into yes and no sets…

  11. Parameter regimes for a single sequential quantum repeater

    NASA Astrophysics Data System (ADS)

    Rozpędek, F.; Goodenough, K.; Ribeiro, J.; Kalb, N.; Caprara Vivoli, V.; Reiserer, A.; Hanson, R.; Wehner, S.; Elkouss, D.

    2018-07-01

    Quantum key distribution allows for the generation of a secret key between distant parties connected by a quantum channel such as optical fibre or free space. Unfortunately, the rate of generation of a secret key by direct transmission is fundamentally limited by the distance. This limit can be overcome by the implementation of so-called quantum repeaters. Here, we assess the performance of a specific but very natural setup called a single sequential repeater for quantum key distribution. We offer a fine-grained assessment of the repeater by introducing a series of benchmarks. The benchmarks, which should be surpassed to claim a working repeater, are based on finite-energy considerations, thermal noise and the losses in the setup. In order to boost the performance of the studied repeaters we introduce two methods. The first one corresponds to the concept of a cut-off, which reduces the effect of decoherence during the storage of a quantum state by introducing a maximum storage time. Secondly, we supplement the standard classical post-processing with an advantage distillation procedure. Using these methods, we find realistic parameters for which it is possible to achieve rates greater than each of the benchmarks, guiding the way towards implementing quantum repeaters.
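    The cut-off described above bounds how long the first entangled link sits in memory while the second is being generated. A toy Monte Carlo sketch of a single sequential repeater with such a cut-off is given below; the geometric-attempt model and the function `sequential_repeater_attempts` are simplifying assumptions, not the paper's finite-energy benchmark analysis.

```python
import random

def sequential_repeater_attempts(p_link, cutoff, rng=None):
    """Count channel uses until both links of a single sequential repeater
    succeed, discarding the stored first link once the cut-off is exceeded."""
    rng = rng or random.Random(0)
    total = 0
    while True:
        # Link 1: geometric number of attempts until success.
        n1 = 1
        while rng.random() > p_link:
            n1 += 1
        total += n1
        # Link 2: at most `cutoff` attempts while link 1 decoheres in memory.
        for n2 in range(1, cutoff + 1):
            total += 1
            if rng.random() <= p_link:
                return total, n2   # n2 = storage time of link 1, in attempts
        # Cut-off exceeded: discard link 1 and start over.
```

    Raising the cut-off reduces the number of restarts but lengthens the worst-case storage time, and hence the decoherence suffered by the stored state; this is the trade-off the cut-off method exploits.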

  12. Plan for the Characterization of HIRF Effects on a Fault-Tolerant Computer Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.; Koppen, Sandra V.

    2008-01-01

    This report presents the plan for the characterization of the effects of high intensity radiated fields on a prototype implementation of a fault-tolerant data communication system. Various configurations of the communication system will be tested. The prototype system is implemented using off-the-shelf devices. The system will be tested in a closed-loop configuration with extensive real-time monitoring. This test is intended to generate data suitable for the design of avionics health management systems, as well as redundancy management mechanisms and policies for robust distributed processing architectures.

  13. Proceedings: Sisal `93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feo, J.T.

    1993-10-01

    This report contains papers on: Programmability and performance issues; The case of an iterative partial differential equation solver; Implementing the kernel of the Australian Region Weather Prediction Model in Sisal; Even and quarter-even prime length symmetric FFTs and their Sisal implementations; Top-down thread generation for Sisal; Overlapping communications and computations on NUMA architectures; Compiling technique based on dataflow analysis for the functional programming language Valid; Copy elimination for true multidimensional arrays in Sisal 2.0; Increasing parallelism for an optimization that reduces copying in IF2 graphs; Caching in on Sisal; Cache performance of Sisal vs. FORTRAN; FFT algorithms on a shared-memory multiprocessor; A parallel implementation of nonnumeric search problems in Sisal; Computer vision algorithms in Sisal; Compilation of Sisal for a high-performance data driven vector processor; Sisal on distributed memory machines; A virtual shared addressing system for distributed memory Sisal; Developing a high-performance FFT algorithm in Sisal for a vector supercomputer; Implementation issues for IF2 on a static data-flow architecture; and Systematic control of parallelism in array-based data-flow computation. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  14. Interpreting drinking water quality in the distribution system using Dempster-Shafer theory of evidence.

    PubMed

    Sadiq, Rehan; Rodriguez, Manuel J

    2005-04-01

    Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and interpretation of water quality, methodologies for the aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the obtained results, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
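    Dempster's rule of combination, the core operation behind the data-fusion methodology above, can be sketched as follows. The water-quality frame {good, fair, poor} and the mass values are hypothetical examples, not the paper's data.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over a common frame of discernment."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    # Normalize by 1 - K, redistributing the conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence about one sampling point: a physico-chemical
# indicator and a microbiological indicator over the frame {good, fair, poor}.
chem = {frozenset({"good"}): 0.6, frozenset({"good", "fair"}): 0.4}
micro = {frozenset({"good"}): 0.5, frozenset({"good", "fair", "poor"}): 0.5}
fused = dempster_combine(chem, micro)
```

    Here the two sources agree, so no mass is lost to conflict and the fused assignment concentrates belief on "good"; with conflicting sources the normalization term redistributes the clashing mass.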

  15. Distributed Saturation

    NASA Technical Reports Server (NTRS)

    Chung, Ming-Ying; Ciardo, Gianfranco; Siminiceanu, Radu I.

    2007-01-01

    The Saturation algorithm for symbolic state-space generation has been a recent breakthrough in the exhaustive verification of complex systems, in particular globally-asynchronous/locally-synchronous systems. The algorithm uses a very compact Multiway Decision Diagram (MDD) encoding for states and the fastest symbolic exploration algorithm to date. The distributed version of Saturation uses the overall memory available on a network of workstations (NOW) to efficiently spread the memory load during the highly irregular exploration. A crucial factor in limiting the memory consumption during the symbolic state-space generation is the ability to perform garbage collection to free up the memory occupied by dead nodes. However, garbage collection over a NOW requires a nontrivial communication overhead. In addition, operation cache policies become critical while analyzing large-scale systems using the symbolic approach. In this technical report, we develop a garbage collection scheme and several operation cache policies to help solve extremely complex systems. Experiments show that our schemes improve the performance of the original distributed implementation, SmArTNow, in terms of time and memory efficiency.
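    The dead-node garbage collection discussed above can be illustrated with a toy unique table that tracks reference counts. This is a schematic sketch only; the real SmArTNow data structures and the NOW communication protocol are far more involved, and the class below is an illustrative assumption.

```python
class MDDNodeTable:
    """Toy unique table: nodes keyed by (level, children), with reference
    counts so that a sweep can reclaim dead (unreferenced) nodes."""
    def __init__(self):
        self.table = {}                      # (level, children) -> refcount

    def insert(self, level, children):
        key = (level, tuple(children))
        self.table[key] = self.table.get(key, 0) + 1
        return key

    def release(self, key):
        self.table[key] -= 1                 # may leave a dead node behind

    def collect_garbage(self):
        """Sweep and delete all nodes whose reference count reached zero."""
        dead = [k for k, rc in self.table.items() if rc == 0]
        for k in dead:
            del self.table[k]
        return len(dead)
```

    In a distributed setting, each workstation owns a slice of such a table, which is why a sweep requires coordination: a node's references may live on remote workers.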

  16. Generation of radially-polarized terahertz pulses for coupling into coaxial waveguides

    PubMed Central

    Navarro-Cía, Miguel; Wu, Jiang; Liu, Huiyun; Mitrofanov, Oleg

    2016-01-01

    Coaxial waveguides exhibit no dispersion and therefore can serve as an ideal channel for transmission of broadband THz pulses. Implementation of THz coaxial waveguide systems, however, requires THz beams with a radially polarized field distribution. We demonstrate the launching of THz pulses into coaxial waveguides using the effect of THz pulse generation at semiconductor surfaces. We find that the radial transient photo-currents produced upon optical excitation of the surface at normal incidence radiate a THz pulse with the field distribution matching the mode of the coaxial waveguide. In this simple scheme, the optical excitation beam diameter controls the spatial profile of the generated radially-polarized THz pulse and allows us to achieve efficient coupling into the TEM waveguide mode in a hollow coaxial THz waveguide. The TEM quasi-single-mode THz waveguide excitation and non-dispersive propagation of a short THz pulse are verified experimentally by time-resolved near-field mapping of the THz field at the waveguide output. PMID:27941845

  17. A two-step method for developing a control rod program for boiling water reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taner, M.S.; Levine, S.H.; Hsiao, M.Y.

    1992-01-01

    This paper reports on a two-step method that is established for the generation of a long-term control rod program for boiling water reactors (BWRs). The new method assumes a time-variant target power distribution in core depletion. In the new method, the BWR control rod programming is divided into two steps. In step 1, a sequence of optimal, exposure-dependent Haling power distribution profiles is generated, utilizing the spectral shift concept. In step 2, a set of exposure-dependent control rod patterns is developed by using the Haling profiles generated at step 1 as a target. The new method is implemented in a computer program named OCTOPUS. The optimization procedure of OCTOPUS is based on the method of approximation programming, in which the SIMULATE-E code is used to determine the nucleonics characteristics of the reactor core state. In a test, the new method achieved a gain in cycle length over a time-invariant target Haling power distribution case because of a moderate application of spectral shift. No thermal limits of the core were violated. The gain in cycle length could be increased further by broadening the extent of the spectral shift.

  18. MADANALYSIS 5, a user-friendly framework for collider phenomenology

    NASA Astrophysics Data System (ADS)

    Conte, Eric; Fuks, Benjamin; Serret, Guillaume

    2013-01-01

    We present MADANALYSIS 5, a new framework for phenomenological investigations at particle colliders. Based on a C++ kernel, this program allows us to efficiently perform, in a straightforward and user-friendly fashion, sophisticated physics analyses of event files such as those generated by a large class of Monte Carlo event generators. MADANALYSIS 5 comes with two modes of running. The first one, easier to handle, uses the strengths of a powerful PYTHON interface in order to implement physics analyses by means of a set of intuitive commands. The second one requires one to implement the analyses in the C++ programming language, directly within the core of the analysis framework. This opens unlimited possibilities concerning the level of complexity which can be reached, being only limited by the programming skills and the originality of the user. Program summary Program title: MadAnalysis 5 Catalogue identifier: AENO_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENO_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Permission to use, copy, modify and distribute this program is granted under the terms of the GNU General Public License. No. of lines in distributed program, including test data, etc.: 31087 No. of bytes in distributed program, including test data, etc.: 399105 Distribution format: tar.gz Programming language: PYTHON, C++. Computer: All platforms on which Python version 2.7, Root version 5.27 and the g++ compiler are available. Compatibility with newer versions of these programs is also ensured. However, the Python version must be below version 3.0. Operating system: Unix, Linux and Mac OS operating systems on which the above-mentioned versions of Python and Root, as well as g++, are available. Classification: 11.1.
External routines: ROOT (http://root.cern.ch/drupal/) Nature of problem: Implementing sophisticated phenomenological analyses in high-energy physics through a flexible, efficient and straightforward fashion, starting from event files such as those produced by Monte Carlo event generators. The event files can have been matched or not to parton-showering and can have been processed or not by a (fast) simulation of a detector. According to the sophistication level of the event files (parton-level, hadron-level, reconstructed-level), one must note that several input formats are possible. Solution method: We implement an interface allowing the production of predefined as well as user-defined histograms for a large class of kinematical distributions after applying a set of event selection cuts specified by the user. This therefore allows us to devise robust and novel search strategies for collider experiments, such as those currently running at the Large Hadron Collider at CERN, in a very efficient way. Restrictions: Unsupported event file format. Unusual features: The code is fully based on object representations for events, particles, reconstructed objects and cuts, which facilitates the implementation of an analysis. Running time: It depends on the purposes of the user and on the number of events to process. It varies from a few seconds to the order of the minute for several millions of events.

  19. Local Alignment Tool Based on Hadoop Framework and GPU Architecture

    PubMed Central

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next-generation sequencing technologies, such as Slex, more and more data have been discovered and published. To analyze such huge data sets, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences. To deal with big biology data, it is hard to rely on a single GPU. Therefore, we implement a distributed BLASTP by combining Hadoop and multi-GPUs. The experimental results show that the proposed method can improve on the performance of BLASTP on a single GPU, and also that it can achieve high availability and fault tolerance. PMID:24955362

  20. Local alignment tool based on Hadoop framework and GPU architecture.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next-generation sequencing technologies, such as Slex, more and more data have been discovered and published. To analyze such huge data sets, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences. To deal with big biology data, it is hard to rely on a single GPU. Therefore, we implement a distributed BLASTP by combining Hadoop and multi-GPUs. The experimental results show that the proposed method can improve on the performance of BLASTP on a single GPU, and also that it can achieve high availability and fault tolerance.

  1. Implementation of continuous-variable quantum key distribution with discrete modulation

    NASA Astrophysics Data System (ADS)

    Hirano, Takuya; Ichikawa, Tsubasa; Matsubara, Takuto; Ono, Motoharu; Oguri, Yusuke; Namiki, Ryo; Kasai, Kenta; Matsumoto, Ryutaroh; Tsurumaru, Toyohiro

    2017-06-01

    We have developed a continuous-variable quantum key distribution (CV-QKD) system that employs discrete quadrature-amplitude modulation and homodyne detection of coherent states of light. We experimentally demonstrated automated secure key generation at a rate of 50 kbps when the quantum channel is a 10-km optical fibre. The CV-QKD system utilises a four-state and post-selection protocol and generates a secure key against the entangling cloner attack. We used a pulsed light source of 1550 nm wavelength with a repetition rate of 10 MHz. A commercially available balanced receiver is used to realise shot-noise-limited pulsed homodyne detection. We used a non-binary LDPC code for error correction (reverse reconciliation) and the Toeplitz matrix multiplication for privacy amplification. A graphical processing unit card is used to accelerate the software-based post-processing.

  2. Hadoop-BAM: directly manipulating next generation sequencing data in the cloud.

    PubMed

    Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo

    2012-03-15

    Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps.
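    The coverage-summarizing tool described above follows the classic map/reduce shape: a map function emits per-bin counts for each aligned read, and a reduce function sums them. A plain-Python sketch of that shape is below; the `(reference, start, end)` record tuple is a hypothetical stand-in for real BAM records, and no Hadoop or Picard API is used here.

```python
from collections import defaultdict

def map_coverage(record, bin_size=1000):
    """Map phase: emit a ((reference, bin), 1) pair for every genomic bin
    an aligned read overlaps."""
    ref, start, end = record          # hypothetical stand-in for a BAM record
    for b in range(start // bin_size, end // bin_size + 1):
        yield (ref, b), 1

def reduce_coverage(pairs):
    """Reduce phase: sum the per-bin read counts."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)
```

    In Hadoop these two functions would run on distributed splits of the BAM file, with the framework shuffling the keyed pairs between the map and reduce phases.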

  3. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: Application to a smoking cessation trial

    PubMed Central

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald

    2014-01-01

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
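    The idea of drawing the imputation model itself from a distribution can be sketched for a single binary variable. The log-odds offset `delta` representing mechanism uncertainty, and the function below, are illustrative assumptions rather than the authors' procedure.

```python
import math
import random

def impute_binary(observed_rate, delta_draws, n_missing, m=5, rng=None):
    """Generate m imputed datasets for a binary variable. Each imputation
    first draws a log-odds offset `delta` from `delta_draws`, a discrete
    distribution encoding uncertainty about the (possibly nonignorable)
    missing data mechanism, then imputes under that shifted model."""
    rng = rng or random.Random(0)
    base_logit = math.log(observed_rate / (1 - observed_rate))
    imputations = []
    for _ in range(m):
        delta = rng.choice(delta_draws)              # draw one imputation model
        p = 1 / (1 + math.exp(-(base_logit + delta)))
        imputations.append([1 if rng.random() < p else 0
                            for _ in range(n_missing)])
    return imputations
```

    Because each imputed dataset may come from a different mechanism, the between-model variability propagates into the combined standard errors, which is the mechanism-uncertainty effect the paper quantifies with nested multiple imputation rules.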

  4. SLAM, a Mathematica interface for SUSY spectrum generators

    NASA Astrophysics Data System (ADS)

    Marquard, Peter; Zerf, Nikolai

    2014-03-01

    We present and publish a Mathematica package, which can be used to automatically obtain any numerical MSSM input parameter from SUSY spectrum generators, which follow the SLHA standard, like SPheno, SOFTSUSY, SuSeFLAV or Suspect. The package enables a very comfortable way of numerical evaluations within the MSSM using Mathematica. It implements easy-to-use predefined high-scale and low-scale scenarios like mSUGRA or mhmax and if needed enables the user to directly specify the input required by the spectrum generators. In addition it supports automatic saving and loading of SUSY spectra to and from an SQL database, avoiding the rerun of a spectrum generator for a known spectrum. Catalogue identifier: AERX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4387 No. of bytes in distributed program, including test data, etc.: 37748 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer where Mathematica version 6 or higher is running providing bash and sed. Operating system: Linux. Classification: 11.1. External routines: A SUSY spectrum generator such as SPheno, SOFTSUSY, SuSeFLAV or SUSPECT Nature of problem: Interfacing published spectrum generators for automated creation, saving and loading of SUSY particle spectra. Solution method: SLAM automatically writes/reads SLHA spectrum generator input/output and is able to save/load generated data in/from a data base. Restrictions: No general restrictions, specific restrictions are given in the manuscript. Running time: A single spectrum calculation takes much less than one second on a modern PC.
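    SLHA files exchanged with such spectrum generators are plain text organized in named blocks of key-value rows with `#` comments. A minimal reader for one block might look like the sketch below; it illustrates the file format only and is not SLAM's Mathematica interface.

```python
def read_slha_block(text, block):
    """Parse one SLHA block into {integer key: value}; a minimal sketch of
    the format (comments after '#', 'Block NAME' headers, key-value rows)."""
    entries, active = {}, False
    for line in text.splitlines():
        body = line.split("#")[0].strip()            # drop comments
        if not body:
            continue
        if body.upper().startswith("BLOCK"):
            active = body.split()[1].upper() == block.upper()
            continue
        if active:
            fields = body.split()
            entries[int(fields[0])] = float(fields[1])
    return entries
```

    Real SLHA files also carry multi-key rows and decay tables, which a full parser such as the ones inside the listed spectrum generators must handle.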

  5. The Particle Physics Data Grid. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    2002-08-16

    The main objective of the Particle Physics Data Grid (PPDG) project has been to implement and evaluate distributed (Grid-enabled) data access and management technology for current and future particle and nuclear physics experiments. The specific goals of PPDG have been to design, implement, and deploy a Grid-based software infrastructure capable of supporting the data generation, processing and analysis needs common to the physics experiments represented by the participants, and to adapt experiment-specific software to operate in the Grid environment and to exploit this infrastructure. To accomplish these goals, the PPDG focused on the implementation and deployment of several critical services: reliable and efficient file replication service, high-speed data transfer services, multisite file caching and staging service, and reliable and recoverable job management services. The focus of the activity was the job management services and the interplay between these services and distributed data access in a Grid environment. Software was developed to study the interaction between HENP applications and distributed data storage fabric. One key conclusion was the need for a reliable and recoverable tool for managing large collections of interdependent jobs. An attached document provides an overview of the current status of the Directed Acyclic Graph Manager (DAGMan) with its main features and capabilities.

  6. Feedback-Based Projected-Gradient Method for Real-Time Optimization of Aggregations of Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Bernstein, Andrey; Simonetto, Andrea

    This paper develops an online optimization method to maximize operational objectives of distribution-level distributed energy resources (DERs), while adjusting the aggregate power generated (or consumed) in response to services requested by grid operators. The design of the online algorithm is based on a projected-gradient method, suitably modified to accommodate appropriate measurements from the distribution network and the DERs. By virtue of this approach, the resultant algorithm can cope with inaccuracies in the representation of the AC power flows, it avoids pervasive metering to gather the state of noncontrollable resources, and it naturally lends itself to a distributed implementation. Optimality claims are established in terms of tracking of the solution of a well-posed time-varying convex optimization problem.
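    One feedback-based projected-gradient iteration of the kind described above can be sketched as follows. The box constraints, the externally supplied cost gradient, and the gain values are illustrative assumptions, not the paper's full formulation.

```python
def project_box(x, lo, hi):
    """Projection onto per-device box constraints."""
    return [min(max(v, l), h) for v, l, h in zip(x, lo, hi)]

def projected_gradient_step(x, measured_total, target_total, cost_grad,
                            lo, hi, alpha=0.1, lam=1.0):
    """One online step: the tracking part of the gradient uses the *measured*
    aggregate power rather than a model of the network, which is the
    feedback-based feature of the method."""
    err = measured_total - target_total
    step = [x[i] - alpha * (cost_grad[i] + lam * err) for i in range(len(x))]
    return project_box(step, lo, hi)
```

    Iterating this step with a fresh measurement each time drives the DER setpoints toward the requested aggregate without ever modeling the noncontrollable background load explicitly.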

  7. Feedback-Based Projected-Gradient Method For Real-Time Optimization of Aggregations of Energy Resources: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Bernstein, Andrey; Simonetto, Andrea

    This paper develops an online optimization method to maximize the operational objectives of distribution-level distributed energy resources (DERs) while adjusting the aggregate power generated (or consumed) in response to services requested by grid operators. The design of the online algorithm is based on a projected-gradient method, suitably modified to accommodate appropriate measurements from the distribution network and the DERs. By virtue of this approach, the resultant algorithm can cope with inaccuracies in the representation of the AC power, it avoids pervasive metering to gather the state of noncontrollable resources, and it naturally lends itself to a distributed implementation. Optimality claims are established in terms of tracking of the solution of a well-posed time-varying optimization problem.

  8. Flapping wing applied to wind generators

    NASA Astrophysics Data System (ADS)

    Colidiuc, Alexandra; Galetuse, Stelian; Suatean, Bogdan

    2012-11-01

    New international conditions for the distribution of energy sources, together with continuously increasing energy consumption, call for alternative resources that keep the environment clean. This paper offers a new approach for a wind generator, based on a theoretical aerodynamic model. The new model was used to test the effect of replacing a conventional wind-generator airfoil with a bird airfoil. The aim is to calculate the efficiency of the new wind generator model. A representative direction for using renewable energy is the transformation of wind energy into electrical energy with the help of wind turbines; the development of such systems leads to new solutions based on high efficiency, reduced costs, and suitability to the implementation conditions.

  9. Possibilities of creating a pure coal-fired power industry based on nanomaterials

    NASA Astrophysics Data System (ADS)

    Zyryanov, V. V.

    2015-08-01

    A concept of distributed multigeneration during combustion of homogenized solid fuels with the addition of oxygen-enriched (to 30-50%) air is proposed. To implement this concept, application of medium-temperature δ-Bi2O3/Ag-nanocermet-based membranes is suggested under low pressures and sweeping of oxygen by the cleaned exit gas or the air. The primary product of the multigeneration is microsphere materials. The heat, the AC and DC electric energy, the cleaned exit gases with a high CO2 content, and volatile elements adsorbed by the filters are the secondary products. To completely clean the exit gases, which is necessary to implement the distributed multigeneration, an array of successive passive plants is proposed. A thermoelectric module based on a BiTeSb-skutterudite nanocomposite is effective for generating DC electric energy at microthermoelectric power plants.

  10. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by a protective branch switching operation. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using single-processor-based dynamic simulation solutions. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-Processing (OpenMP) on a shared-memory platform, and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performance for running parallel dynamic simulation is compared and demonstrated.
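    The parallel structure described above, farming per-generator derivative evaluations out to workers inside a time-stepping loop, can be sketched in Python with a process pool standing in for OpenMP threads or MPI ranks. The simplified swing equation and its constants are assumptions for illustration.

```python
import math
from multiprocessing import Pool

def swing_derivative(state):
    """Derivatives of one generator's (angle, speed) state under a
    simplified per-unit swing equation."""
    delta, omega, p_mech, damping = state
    d_delta = omega
    d_omega = p_mech - damping * omega - 1.5 * math.sin(delta)
    return d_delta, d_omega

def simulate(states, dt=0.01, steps=100, workers=1):
    """Explicit-Euler time stepping; with workers > 1 the per-generator
    derivative evaluations are farmed out to a process pool."""
    pool = Pool(workers) if workers > 1 else None
    try:
        for _ in range(steps):
            if pool is not None:
                derivs = pool.map(swing_derivative, states)
            else:
                derivs = [swing_derivative(s) for s in states]
            states = [(d + dd * dt, w + dw * dt, pm, dmp)
                      for (d, w, pm, dmp), (dd, dw) in zip(states, derivs)]
        return states
    finally:
        if pool is not None:
            pool.close()
            pool.join()
```

    The per-step gather of derivatives mirrors the synchronization point both OpenMP and MPI implementations face; the MPI variant additionally pays explicit message-passing costs at that point.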

  11. Chaotic optical time-domain reflectometry using a distributed feedback laser diode modulated by an improved Colpitts oscillator

    NASA Astrophysics Data System (ADS)

    Li, Jing Xia; Xu, Hang; Liu, Li; Su, Peng Cheng; Zhang, Jian Guo

    2015-05-01

    We report chaotic optical time-domain reflectometry for fiber fault location, where a chaotic probe signal is generated by driving a distributed feedback laser diode with an improved Colpitts chaotic oscillator. The results show that the unterminated fiber end, the loose connector, and the mismatch connector can be precisely located. A measurement range of approximately 91 km and a range-independent resolution of 6 cm are achieved. This implementation method is easy to integrate and is cost effective, which gives it great potential for commercial applications.

  12. A digital protection system incorporating knowledge based learning

    NASA Astrophysics Data System (ADS)

    Watson, Karan; Russell, B. Don; McCall, Kurt

    A digital system architecture used to diagnose the operating state and health of electric distribution lines and to generate actions for line protection is presented. The architecture is described functionally and to a limited extent at the hardware level. This architecture incorporates multiple analysis and fault-detection techniques utilizing a variety of parameters. In addition, a knowledge-based decision maker, a long-term memory retention and recall scheme, and a learning environment are described. Preliminary laboratory implementations of the system elements have been completed. Enhanced protection for electric distribution feeders is provided by this system. Advantages of the system are enumerated.

  13. A compact free space quantum key distribution system capable of daylight operation

    NASA Astrophysics Data System (ADS)

    Benton, David M.; Gorman, Phillip M.; Tapster, Paul R.; Taylor, David M.

    2010-06-01

    A free space quantum key distribution system has been demonstrated. Consideration has been given to factors such as field of view and spectral width, to cut down the deleterious effect from background light levels. Suitable optical sources such as lasers and RCLEDs have been investigated as well as optimal wavelength choices, always with a view to building a compact and robust system. The implementation of background reduction measures resulted in a system capable of operating in daylight conditions. An autonomous system was left running and generating shared key material continuously for over 7 days.

  14. Distributed Energy Systems: Security Implications of the Grid of the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamber, Kevin L.; Kelic, Andjelka; Taylor, Robert A.

    2017-01-01

    Distributed Energy Resources (DER) are being added to the nation's electric grid, and as penetration of these resources increases, they have the potential to displace or offset large-scale, capital-intensive, centralized generation. Integration of DER into operation of the traditional electric grid requires automated operational control and communication of DER elements, from system measurement to control hardware and software, in conjunction with a utility's existing automated and human-directed control of other portions of the system. Implementation of DER technologies suggests a number of gaps from both a security and a policy perspective.

  15. High-speed continuous-variable quantum key distribution without sending a local oscillator.

    PubMed

    Huang, Duan; Huang, Peng; Lin, Dakai; Wang, Chao; Zeng, Guihua

    2015-08-15

    We report a 100-MHz continuous-variable quantum key distribution (CV-QKD) experiment over a 25-km fiber channel without sending a local oscillator (LO). We use a "locally" generated LO together with a 1-GHz shot-noise-limited homodyne detector to achieve high-speed quantum measurement, and we propose a secure phase compensation scheme to maintain a low level of excess noise. These advances make high-bit-rate CV-QKD significantly simpler for larger transmission distances compared with previous schemes, in which both the LO and the quantum signals are transmitted through the insecure quantum channel.

  16. Exorcising the Ghost in the Machine: Synthetic Spectral Data Cubes for Assessing Big Data Algorithms

    NASA Astrophysics Data System (ADS)

    Araya, M.; Solar, M.; Mardones, D.; Hochfärber, T.

    2015-09-01

    The size and quantity of the data being generated by large astronomical projects like ALMA require a paradigm change in astronomical data analysis. Complex data, such as highly sensitive spectroscopic data in the form of large data cubes, are not only difficult to manage, transfer and visualize, but they make traditional data analysis techniques unfeasible. Consequently, attention has turned to machine learning and artificial intelligence techniques, to develop approximate and adaptive methods for astronomical data analysis within a reasonable computational time. Unfortunately, these techniques are usually suboptimal, stochastic and strongly dependent on their parameters, which could easily turn into “a ghost in the machine” for astronomers and practitioners. Therefore, a proper assessment of these methods is not only desirable but mandatory for trusting them in large-scale usage. The problem is that positively verifiable results are scarce in astronomy, and moreover, science using bleeding-edge instrumentation naturally lacks reference values. We propose ASYDO (Astronomical SYnthetic Data Observations), a virtual service that generates synthetic spectroscopic data in the form of data cubes. The objective of the tool is not to produce accurate astrophysical simulations, but to generate large numbers of labelled synthetic data, to assess advanced computing algorithms for astronomy and to develop novel Big Data algorithms. The synthetic data are generated using a set of spectral lines, template functions for spatial and spectral distributions, and simple models that produce reasonable synthetic observations. Emission lines are obtained automatically using IVOA's SLAP protocol (or from a relational database), and their spectral profiles correspond to distributions in the exponential family. The spatial distributions correspond to simple functions (e.g., 2D Gaussians) or to scalable template objects. The intensity, broadening and radial velocity of each line are given by very simple and naive physical models, yet ASYDO's generic implementation supports new user-made models, which potentially allows adding more realistic simulations. The resulting data cube is saved as a FITS file, which also includes all the tables and images used for generating the cube. We expect to implement ASYDO as a virtual observatory service in the near future.
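
    The cube-generation recipe described above (spectral-line profiles times spatial templates, summed into a cube) can be sketched in a few lines of Python. This is an illustrative sketch only: the band, line parameters, and dictionary keys are hypothetical, and it omits ASYDO's SLAP line retrieval and FITS output.

```python
import numpy as np

def gaussian_2d(ny, nx, cy, cx, sigma):
    """Peak-normalized 2D Gaussian spatial template."""
    y, x = np.mgrid[0:ny, 0:nx]
    return np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))

def gaussian_line(freqs, center, fwhm):
    """Peak-normalized Gaussian spectral profile."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-((freqs - center) ** 2) / (2.0 * sigma ** 2))

def make_cube(ny, nx, freqs, lines, noise_rms=0.0, seed=0):
    """Sum intensity x (spectral profile) x (spatial template) per line,
    plus optional Gaussian noise, into a cube of shape (nchan, ny, nx)."""
    rng = np.random.default_rng(seed)
    cube = np.zeros((freqs.size, ny, nx))
    for ln in lines:
        spatial = gaussian_2d(ny, nx, ln["cy"], ln["cx"], ln["sigma_pix"])
        spectral = gaussian_line(freqs, ln["freq"], ln["fwhm"])
        cube += ln["intensity"] * spectral[:, None, None] * spatial[None, :, :]
    if noise_rms > 0:
        cube += rng.normal(0.0, noise_rms, cube.shape)
    return cube

freqs = np.linspace(100.0, 101.0, 128)  # hypothetical band, GHz
lines = [  # hypothetical line parameters
    {"freq": 100.25, "fwhm": 0.02, "cy": 16, "cx": 16, "sigma_pix": 4.0, "intensity": 1.0},
    {"freq": 100.75, "fwhm": 0.05, "cy": 8, "cx": 24, "sigma_pix": 2.0, "intensity": 0.5},
]
cube = make_cube(32, 32, freqs, lines)
```

    Noise can be enabled via `noise_rms` to mimic instrument effects; a real service would write the result with a FITS library such as astropy.io.fits.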

  17. 78 FR 28501 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota; Flint Hills Resources...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-15

    ..., refinery fuel gas is generated by the facility's processes and collected into two fuel gas mix drums, designated 41V-33 and 45V-39. The gases are then distributed from these mix drums to combustion units at the facility, such as boilers and heaters. FHR Pine Bend operates H2S CEMs on the mix drums to satisfy the...

  18. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen

    Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.

  19. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    DOE PAGES

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; ...

    2018-03-12

    Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.

  20. Encryption key distribution via chaos synchronization

    NASA Astrophysics Data System (ADS)

    Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; van der Sande, Guy

    2017-02-01

    We present a novel encryption scheme wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on photonic, optoelectronic, or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions defined by the National Institute of Standards and Technology (NIST) test suite. We demonstrate the feasibility of our concept on an electronic delay oscillator circuit and test its robustness against attacks using a state-of-the-art system identification method.
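
    As a toy illustration of the driver-response idea (not the authors' delay-oscillator circuit), two strongly driven logistic-map units with different initial states synchronize on a common chaotic drive; thresholding their states afterwards yields identical bitstreams at both ends. All maps and parameters here are hypothetical stand-ins.

```python
import numpy as np

def logistic(u):
    """Fully chaotic logistic map."""
    return 4.0 * u * (1.0 - u)

def driven_unit(x0, drive, eps=0.9):
    """Nonlinear response unit strongly forced by a common drive signal.
    For eps = 0.9 the conditional dynamics contract (|dx'/dx| <= 0.4),
    so two units with different initial states synchronize."""
    xs = np.empty(len(drive))
    x = x0
    for n, d in enumerate(drive):
        x = (1.0 - eps) * logistic(x) + eps * d
        xs[n] = x
    return xs

# Chaotic driver signal broadcast to both distant units.
drive = np.empty(500)
d = 0.123456
for n in range(500):
    d = logistic(d)
    drive[n] = d

alice = driven_unit(0.2, drive)  # two units, different initial states
bob = driven_unit(0.9, drive)

key_a = (alice[100:] > 0.5).astype(int)  # drop transient, threshold to bits
key_b = (bob[100:] > 0.5).astype(int)
```

    After the transient is discarded, both units hold the same state sequence, so the thresholded bitstreams agree without the key ever being transmitted.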

  1. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    NASA Astrophysics Data System (ADS)

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; Gauthier, Daniel J.

    2018-03-01

    We propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator-coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.
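
    For intuition, mutually unbiased bases can be written down explicitly when the dimension is an odd prime. The paper treats d = 2, 3, and 4; the sketch below uses the standard quadratic-phase construction, which covers only the odd-prime case, as an assumption-laden stand-in.

```python
import numpy as np

def mub_bases(d):
    """For an odd prime d, return d + 1 mutually unbiased bases of C^d:
    the computational basis plus d bases whose j-th vector has components
    exp(2*pi*i*(j*m + k*m^2)/d)/sqrt(d), m = 0..d-1, for k = 0..d-1."""
    bases = [np.eye(d, dtype=complex)]
    for k in range(d):
        B = np.empty((d, d), dtype=complex)
        for j in range(d):
            for m in range(d):
                B[m, j] = np.exp(2j * np.pi * (j * m + k * m * m) / d) / np.sqrt(d)
        bases.append(B)  # columns are basis vectors
    return bases
```

    Every pair of vectors drawn from two different bases then satisfies |<a|b>| = 1/sqrt(d), which is exactly the unbiasedness condition exploited in quantum key distribution.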

  2. Assessment of Moderate- and High-Temperature Geothermal Resources of the United States

    USGS Publications Warehouse

    Williams, Colin F.; Reed, Marshall J.; Mariner, Robert H.; DeAngelo, Jacob; Galanis, S. Peter

    2008-01-01

    Scientists with the U.S. Geological Survey (USGS) recently completed an assessment of our Nation's geothermal resources. Geothermal power plants are currently operating in six states: Alaska, California, Hawaii, Idaho, Nevada, and Utah. The assessment indicates that the electric power generation potential from identified geothermal systems is 9,057 Megawatts-electric (MWe), distributed over 13 states. The mean estimated power production potential from undiscovered geothermal resources is 30,033 MWe. Additionally, another estimated 517,800 MWe could be generated through implementation of technology for creating geothermal reservoirs in regions characterized by high temperature, but low permeability, rock formations.

  3. A parallel time integrator for noisy nonlinear oscillatory systems

    NASA Astrophysics Data System (ADS)

    Subber, Waad; Sarkar, Abhijit

    2018-06-01

    In this paper, we adapt a parallel time integration scheme to track the trajectories of noisy nonlinear dynamical systems. Specifically, we formulate a parallel algorithm to generate the sample paths of a nonlinear oscillator defined by stochastic differential equations (SDEs) using the so-called parareal method for ordinary differential equations (ODEs). The presence of the Wiener process in SDEs causes difficulties in the direct application of any numerical integration technique for ODEs, including the parareal algorithm. The parallel implementation of the algorithm involves two SDE solvers, namely a fine-level scheme to integrate the system in parallel and a coarse-level scheme to generate and correct the initial conditions required to start the fine-level integrators. For the numerical illustration, a randomly excited Duffing oscillator is investigated in order to study the performance of the stochastic parallel algorithm with respect to a range of system parameters. The distributed implementation of the algorithm exploits the Message Passing Interface (MPI).
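
    The structure of the algorithm, a cheap serial coarse sweep plus independent fine solves and a serial correction, with the same pre-drawn Wiener increments fed to both propagators, can be sketched as follows. This is a serial toy emulation (the paper distributes the fine solves via MPI), and the Duffing coefficients and step counts are hypothetical.

```python
import numpy as np

def duffing_drift(u, delta=0.3, alpha=-1.0, beta=1.0):
    """Duffing drift: x' = v, v' = -delta*v - alpha*x - beta*x**3
    (coefficients are hypothetical)."""
    x, v = u
    return np.array([v, -delta * v - alpha * x - beta * x ** 3])

def em_fine(u0, dW, dt, sigma=0.5):
    """Fine propagator: Euler-Maruyama over every increment in dW
    (noise acts on the velocity only)."""
    u = u0.copy()
    for dw in dW:
        u = u + duffing_drift(u) * dt + np.array([0.0, sigma * dw])
    return u

def em_coarse(u0, dW_sum, Dt, sigma=0.5):
    """Coarse propagator: one Euler-Maruyama step driven by the summed
    Wiener increment of the interval, keeping both solvers consistent."""
    return u0 + duffing_drift(u0) * Dt + np.array([0.0, sigma * dW_sum])

def parareal(u0, T=1.0, N=8, m=64, K=None, seed=1):
    """Parareal update U[n+1] <- G(U[n]) + F_old(U[n]) - G_old(U[n]);
    the fine solves F (the expensive part) are independent across intervals."""
    K = N if K is None else K
    Dt, dt = T / N, T / (N * m)
    dW = np.random.default_rng(seed).normal(0.0, np.sqrt(dt), (N, m))
    U = [u0.copy() for _ in range(N + 1)]
    for n in range(N):  # zeroth iterate: serial coarse sweep
        U[n + 1] = em_coarse(U[n], dW[n].sum(), Dt)
    for _ in range(K):
        F = [em_fine(U[n], dW[n], dt) for n in range(N)]            # parallelizable
        G_old = [em_coarse(U[n], dW[n].sum(), Dt) for n in range(N)]
        for n in range(N):  # serial correction sweep
            U[n + 1] = em_coarse(U[n], dW[n].sum(), Dt) + F[n] - G_old[n]
    return U, dW, dt
```

    By the standard parareal exactness property, after N iterations the iterate reproduces the serial fine solution on the same noise path, which is a convenient correctness check.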

  4. Deterministic binary vectors for efficient automated indexing of MEDLINE/PubMed abstracts.

    PubMed

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R; Bernstam, Elmer V; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI.

  5. Deterministic Binary Vectors for Efficient Automated Indexing of MEDLINE/PubMed Abstracts

    PubMed Central

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R.; Bernstam, Elmer V.; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI. PMID:23304369
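
    The key idea of the two records above, term vectors that can be regenerated deterministically on demand instead of being stored, can be sketched with hash-derived binary vectors and majority-vote superposition. This is an illustrative sketch, not the authors' implementation; the dimensionality and the SHA-256 bit stream are assumptions.

```python
import hashlib
import numpy as np

DIM = 256  # vector dimensionality (hypothetical; real systems use thousands)

def term_vector(term):
    """Deterministic binary vector for a term: bits come from a SHA-256
    stream seeded by the term itself, so no store of term vectors is needed."""
    n_bytes = DIM // 8
    data = b""
    counter = 0
    while len(data) < n_bytes:
        data += hashlib.sha256(f"{term}:{counter}".encode()).digest()
        counter += 1
    return np.unpackbits(np.frombuffer(data[:n_bytes], dtype=np.uint8)).astype(np.int8)

def doc_vector(terms):
    """Superpose term vectors by a per-bit majority vote (ties -> 1)."""
    votes = np.sum([2 * term_vector(t).astype(int) - 1 for t in terms], axis=0)
    return (votes >= 0).astype(np.int8)

def hamming_sim(a, b):
    """Similarity in [0, 1]: fraction of agreeing bits."""
    return float(np.mean(a == b))
```

    Documents sharing terms agree on noticeably more bits than unrelated ones, and any term vector can be recomputed at indexing time instead of being kept in RAM.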

  6. Parallelization of KENO-Va Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Ramón, Javier; Peña, Jorge

    1995-07-01

    KENO-Va is a code integrated within the SCALE system developed by Oak Ridge that solves the transport equation by the Monte Carlo method. It is being used at the Consejo de Seguridad Nuclear (CSN) to perform criticality calculations for fuel storage pools and shipping casks. Two parallel versions of the code have been generated: one for shared-memory machines and another for distributed-memory systems using the message-passing library PVM. In both versions the neutrons of each generation are tracked in parallel. In order to preserve the reproducibility of the results in both versions, advanced seeds for the random numbers were used. The CONVEX C3440 with four processors and shared memory at CSN was used to implement the shared-memory version. An FDDI network of 6 HP9000/735 workstations was employed to implement the message-passing version using proprietary PVM. The speedup obtained was 3.6 in both cases.

  7. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    NASA Astrophysics Data System (ADS)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for restoration of electric power distribution networks using a two-layered CNP is proposed. The goal of this study is to develop a restoration system suited to future power networks with distributed generators. The novelty of this study is that the two-layered CNP is applied in a practical distributed computing environment. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid task conflicts, an operating agent controls the privilege of managers to send task-announcement messages in CNP. This technique realizes coordination between agents that work asynchronously in parallel with one another. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). This study conducts simulation experiments on power distribution network restoration and compares the proposed system with the previous system. The results confirm the effectiveness of the proposed system.
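
    A single contract-net round (task announcement, bids from field agents, award to the best bidder) can be sketched as below. The agent names, capacity model, and bid function are hypothetical, and the real system runs these exchanges asynchronously over JADE.

```python
from dataclasses import dataclass

@dataclass
class FieldAgent:
    name: str
    spare_capacity: float  # MW of load this feeder section could pick up

    def bid(self, load):
        """Return a bid (lower is better) for restoring `load`, or None
        if the agent cannot serve it."""
        if load > self.spare_capacity:
            return None
        return load / self.spare_capacity  # cheaper when more headroom remains

def contract_net(load, agents):
    """One CNP round: announce the task, collect bids, award the contract.
    In the two-layered scheme an operating agent would grant the announcement
    privilege first, serializing rounds started by concurrent managers."""
    bids = {a.name: a.bid(load) for a in agents}
    valid = {name: b for name, b in bids.items() if b is not None}
    return min(valid, key=valid.get) if valid else None

agents = [FieldAgent("feeder_A", 2.0),
          FieldAgent("feeder_B", 5.0),
          FieldAgent("feeder_C", 1.0)]
```

    Awarding to the lowest bid here simply prefers the feeder with the most headroom; an unserved load returns no winner and would be re-announced.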

  8. Parallel halftoning technique using dot diffusion optimization

    NASA Astrophysics Data System (ADS)

    Molina-Garcia, Javier; Ponomaryov, Volodymyr I.; Reyes-Reyes, Rogelio; Cruz-Ramos, Clara

    2017-05-01

    In this paper, a novel approach for halftone images is proposed and implemented for images obtained by the Dot Diffusion (DD) method. The designed technique is based on an optimization of the so-called class matrix used in the DD algorithm, and it consists of generating new versions of the class matrix that contain no baron or near-baron points, in order to minimize inconsistencies during the distribution of the error. The proposed class matrices have different properties, each designed for one of two applications: applications where inverse halftoning is necessary, and applications where it is not required. The proposed method has been implemented on a GPU (NVIDIA GeForce GTX 750 Ti) and on multicore processors (an AMD FX-6300 six-core processor and an Intel Core i5-4200U), using CUDA and OpenCV on a Linux PC. Experimental results have shown that the novel framework generates good-quality halftone images and inverse-halftone images. The simulation results using parallel architectures have demonstrated the efficiency of the novel technique when implemented in real-time processing.
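
    A minimal dot-diffusion sketch, assuming a randomly generated class matrix in place of the paper's optimized one: pixels are binarized in class-matrix order, and each quantization error is pushed only to 8-neighbours that have not yet been processed.

```python
import numpy as np

# Hypothetical 8x8 class matrix (a random permutation of 0..63); the paper
# optimizes this matrix, and Knuth's original matrix would normally be used.
CLASS = np.random.default_rng(0).permutation(64).reshape(8, 8)

def dot_diffusion(img, class_matrix=CLASS):
    """Binarize img (floats in [0, 1]). Pixels are processed in class-matrix
    order; each pixel's quantization error is diffused equally to the
    8-neighbours with a higher class number (i.e. not yet processed)."""
    h, w = img.shape
    k = class_matrix.shape[0]
    order = class_matrix[np.arange(h)[:, None] % k, np.arange(w)[None, :] % k]
    out = np.zeros((h, w), dtype=np.uint8)
    work = img.astype(float).copy()
    for level in range(k * k):
        for y, x in zip(*np.where(order == level)):
            out[y, x] = 1 if work[y, x] >= 0.5 else 0
            err = work[y, x] - out[y, x]
            nbrs = [(yy, xx)
                    for yy in (y - 1, y, y + 1) for xx in (x - 1, x, x + 1)
                    if (yy, xx) != (y, x) and 0 <= yy < h and 0 <= xx < w
                    and order[yy, xx] > level]
            for yy, xx in nbrs:
                work[yy, xx] += err / len(nbrs)
    return out
```

    Because errors are redistributed rather than discarded, the mean of the binary output tracks the mean gray level of the input, which is the property halftoning must preserve.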

  9. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  10. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
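
    A software toy model of the entropy mechanism (not the FPGA design): sampling a square-wave oscillator whose phase accumulates Gaussian jitter yields raw bits, which a von Neumann step then debiases. All rates and jitter figures here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(123)

def jittered_bits(n, period=1.0, jitter=0.05, sample_dt=0.61803398875):
    """Sample the level of a square-wave oscillator whose phase accumulates
    Gaussian jitter at each sampling instant; the sampled levels form the
    raw bit stream."""
    phase = 0.0
    bits = np.empty(n, dtype=int)
    for i in range(n):
        phase += sample_dt / period + rng.normal(0.0, jitter)
        bits[i] = int((phase % 1.0) < 0.5)
    return bits

def von_neumann(bits):
    """Debias: pair up raw bits, map 01 -> 0 and 10 -> 1, discard 00 and 11."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]
```

    The incommensurate sampling step plays the role of the multi-phase taps, and the jitter term is the physical entropy source; the hardware design instead reads many phase-shifted copies per clock to reach Gbps rates.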

  11. Quantum-Assisted Learning of Hardware-Embedded Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro

    2017-10-01

    Mainstream machine-learning techniques such as deep learning and probabilistic programming rely heavily on sampling from generally intractable probability distributions. There is increasing interest in the potential advantages of using quantum computing technologies as sampling engines to speed up these tasks or to make them more effective. However, some pressing challenges in state-of-the-art quantum annealers have to be overcome before we can assess their actual performance. The sparse connectivity, resulting from the local interaction between quantum bits in physical hardware implementations, is considered the most severe limitation to the quality of constructing powerful generative unsupervised machine-learning models. Here, we use embedding techniques to add redundancy to data sets, allowing us to increase the modeling capacity of quantum annealers. We illustrate our findings by training hardware-embedded graphical models on a binarized data set of handwritten digits and two synthetic data sets in experiments with up to 940 quantum bits. Our model can be trained in quantum hardware without full knowledge of the effective parameters specifying the corresponding quantum Gibbs-like distribution; therefore, this approach avoids the need to infer the effective temperature at each iteration, speeding up learning; it also mitigates the effect of noise in the control parameters, making it robust to deviations from the reference Gibbs distribution. Our approach demonstrates the feasibility of using quantum annealers for implementing generative models, and it provides a suitable framework for benchmarking these quantum technologies on machine-learning-related tasks.

  12. Comparing pseudo-absences generation techniques in Boosted Regression Trees models for conservation purposes: A case study on amphibians in a protected area.

    PubMed

    Cerasoli, Francesco; Iannella, Mattia; D'Alessandro, Paola; Biondi, Maurizio

    2017-01-01

    Boosted Regression Trees (BRT) is one of the modelling techniques most recently applied to biodiversity conservation, and it can be implemented with presence-only data through the generation of artificial absences (pseudo-absences). In this paper, three pseudo-absence generation techniques are compared, namely the generation of pseudo-absences within the target-group background (TGB), testing both the weighted (WTGB) and unweighted (UTGB) schemes, and generation at random (RDM), evaluating their performance and applicability in distribution modelling and species conservation. The choice of the target group fell on amphibians, because of their rapid decline worldwide and the frequent lack of guidelines for conservation strategies and regional-scale planning, which could instead be provided through an appropriate implementation of species distribution models (SDMs). Bufo bufo, Salamandrina perspicillata and Triturus carnifex were considered as target species, in order to perform our analysis with species having different ecological and distributional characteristics. The study area is the "Gran Sasso-Monti della Laga" National Park, which hosts 15 Natura 2000 sites and represents one of the most important biodiversity hotspots in Europe. Our results show that model calibration improves when using the target-group-based pseudo-absences compared to the random ones, especially when applying the WTGB. By contrast, model discrimination did not vary significantly or consistently among the three approaches for the three target species. Both WTGB and RDM clearly isolate the highly contributing variables, supplying many relevant indications for species conservation actions. Moreover, the assessment of pairwise variable interactions and their three-dimensional visualization further increases the amount of useful information for protected-area managers. Finally, we suggest the use of RDM as an admissible alternative when it is not possible to identify a suitable set of species as a representative target group from which the pseudo-absences can be generated.
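
    The generation schemes compared above can be sketched on a toy study area. The coordinates, record counts, and the distance weighting used for the weighted variant are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 100 x 100 study area: candidate background cells as (x, y).
grid = np.array([(x, y) for x in range(100) for y in range(100)], dtype=float)

# Hypothetical presence records: the modelled species and the wider target
# group (all amphibian records, a proxy for survey effort).
sp_presences = grid[rng.choice(len(grid), 30, replace=False)]
group_records = grid[rng.choice(len(grid), 300, replace=False)]

def pseudo_absences_rdm(n):
    """RDM: draw pseudo-absences uniformly at random over the study area."""
    return grid[rng.choice(len(grid), n, replace=False)]

def pseudo_absences_tgb(n, weighted=False):
    """TGB: draw pseudo-absences from target-group records, so they share
    the survey bias of the presences. The weighted variant (an assumption
    here) favours records far from any presence of the modelled species."""
    if not weighted:
        return group_records[rng.choice(len(group_records), n, replace=False)]
    diffs = group_records[:, None, :] - sp_presences[None, :, :]
    d = np.min(np.linalg.norm(diffs, axis=2), axis=1)  # nearest-presence distance
    return group_records[rng.choice(len(group_records), n, replace=False, p=d / d.sum())]
```

    The pseudo-absences from either scheme would then be pooled with the presences to train the BRT model.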

  13. Source-Independent Quantum Random Number Generation

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts, a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations of the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bits. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10³ bits/s.

  14. Energy management and control of active distribution systems

    NASA Astrophysics Data System (ADS)

    Shariatzadeh, Farshid

    Advancements in communication, control, computation and information technologies have driven the transition to the next generation of active power distribution systems. Novel control techniques and management strategies are required to achieve an efficient, economic and reliable grid. The focus of this work is energy management and control of active distribution systems (ADS) with integrated renewable energy sources (RESs) and demand response (DR). Here, ADS means an automated distribution system with remotely operated controllers and distributed energy resources (DERs). DERs, as active parts of the next-generation distribution system, include distributed generation (DG), RESs, energy storage systems (ESS), plug-in hybrid electric vehicles (PHEV) and DR. Integration of DR and RESs into ADS is critical to realize the vision of sustainability. The objective of this dissertation is the development of a management architecture to control and operate ADS in the presence of DR and RES. One of the most challenging issues in operating ADS is the inherent uncertainty of DR and RES, as well as the conflicting objectives of DERs and electric utilities. An ADS can consist of different layers, such as a system layer and a building layer, and coordination between these layers is essential. In order to address these challenges, a multi-layer energy management and control architecture with robust algorithms is proposed in this work. The first layer of the proposed architecture is implemented at the system level: the developed AC optimal power flow (AC-OPF) generates a fair price for all DR and non-DR loads, which is used as a control signal for the second layer. The second layer controls DR loads in buildings using a look-ahead robust controller. A load aggregator collects information from all buildings and sends the aggregated load to the system optimizer. Because the two management layers operate on different time scales, a time-coordination scheme is developed. Robust and deterministic controllers are developed to maximize the energy usage from rooftop photovoltaic (PV) generation locally and to minimize heating, ventilation and air-conditioning (HVAC) consumption while maintaining inside temperature within a comfort zone. The performance of the developed multi-layer architecture has been analyzed using test case studies, and the results show the robustness of the developed controller in the presence of uncertainty.

  15. Evaluation of reduced point charge models of proteins through Molecular Dynamics simulations: application to the Vps27 UIM-1-Ubiquitin complex.

    PubMed

    Leherte, Laurence; Vercauteren, Daniel P

    2014-02-01

    Reduced point charge models of amino acids are designed (i) from local extrema positions in charge density distribution functions built from the Poisson equation applied to smoothed molecular electrostatic potential (MEP) functions, and (ii) from local maxima positions in promolecular electron density distribution functions. The corresponding charge values are fitted against all-atom Amber99 MEPs. To easily generate reduced point charge models for protein structures, libraries of amino acid templates are built. The program GROMACS is used to generate stable Molecular Dynamics trajectories of an Ubiquitin-ligand complex (PDB: 1Q0W) under various implementation schemes, solvation, and temperature conditions. Point charges that are not located on atoms are treated as virtual sites with null mass and radius. The results illustrate how the intra- and inter-molecular H-bond interactions are affected by the degree of reduction of the point charge models and give directions for their implementation; special attention is needed for the atoms selected to locate the virtual sites and for the Coulomb-14 interactions. Results obtained at various temperatures suggest that the use of reduced point charge models allows probing of local potential hyper-surface minima that are similar to the all-atom ones but are characterized by lower energy barriers. This makes it possible to generate various conformations of the protein complex more rapidly than with the all-atom point charge representation.

  16. Dispatch Strategy Development for Grid-tied Household Energy Systems

    NASA Astrophysics Data System (ADS)

    Cardwell, Joseph

    The prevalence of renewable generation will increase in the coming decades, offsetting more and more conventional generation. Yet this increase does not come without challenges. Solar, wind, and even some water resources are intermittent and unpredictable, and thereby create scheduling challenges due to their inherently "uncontrolled" nature. To effectively manage these distributed renewable assets, new control algorithms must be developed for applications including energy management, bridge power, and system stability. This can be accomplished through a centralized control center, though efforts are being made to align the control architecture with the organization of the renewable assets themselves: namely, distributed controls. Building energy management systems are being employed to control localized energy generation, storage, and use, in order to reduce disruption to the net utility load. One such example is VOLTTRON™, an agent-based platform for building energy control in real time. In this thesis, algorithms developed in VOLTTRON simulate a home energy management system that consists of a solar PV array, a lithium-ion battery bank, and the grid. Dispatch strategies are implemented to reduce energy charges from overall consumption ($/kWh) and demand charges ($/kW). Dispatch strategies for storage devices are tuned on a month-to-month basis under simulated scenarios to provide a meaningful economic advantage and to explore algorithm sensitivity to changing external factors. VOLTTRON agents provide automated real-time optimization of dispatch strategies to efficiently manage energy supply and demand, lower consumer costs associated with energy usage, and reduce load on the utility grid.
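
    A greedy one-pass sketch of such a dispatch rule, assuming hypothetical battery parameters and a fixed peak-shaving threshold (agents in a platform like VOLTTRON would tune these month to month):

```python
import numpy as np

def dispatch(load, pv, cap=10.0, p_max=3.0, threshold=4.0):
    """Greedy hourly dispatch: discharge the battery to clip net load above
    `threshold` (demand-charge reduction) and recharge from PV surplus.
    Units are kW/kWh with a 1-h step; all parameters are hypothetical."""
    soc = cap / 2.0  # state of charge, start half full
    net = []
    for l, p in zip(load, pv):
        residual = l - p
        if residual > threshold:          # peak shaving
            d = min(residual - threshold, p_max, soc)
            soc -= d
            residual -= d
        elif residual < 0.0:              # store PV surplus
            c = min(-residual, p_max, cap - soc)
            soc += c
            residual += c
        net.append(residual)
    return np.array(net)

load = [2.0, 3.0, 8.0, 9.0, 3.0]  # hypothetical hourly demand, kW
pv = [0.0, 1.0, 2.0, 1.0, 4.0]    # hypothetical PV output, kW
net = dispatch(load, pv)
```

    Clipping the peak from 8 kW to 5 kW directly reduces the demand charge, while the absorbed PV surplus replenishes the battery for the next peak.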

  17. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  18. An improved panel method for the solution of three-dimensional leading-edge vortex flows. Volume 1: Theory document

    NASA Technical Reports Server (NTRS)

    Johnson, F. T.; Lu, P.; Tinoco, E. N.

    1980-01-01

    An improved panel method for the solution of three-dimensional flow over wing and wing-body combinations with leading-edge vortex separation is presented. The method employs a three-dimensional inviscid flow model in which the configuration, the rolled-up vortex sheets, and the wake are represented by quadratic doublet distributions. The strength of the singularity distribution, as well as the shape and position of the vortex spirals, are computed in an iterative fashion starting from an assumed initial sheet geometry. The method calculates forces and moments as well as detailed surface pressure distributions. Improvements include improved panel numerics that eliminate the highly nonlinear effects of ring vortices around doublet panel edges, and a least-squares procedure for damping instabilities in the vortex sheet geometry update. A complete description of the method is included. A variety of cases generated by the computer program implementing the method are presented; they verify the mathematical assumptions of the method and compare computed results with experimental data to verify its underlying physical assumptions.
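    The idea of damping an unstable iterative update can be shown on a scalar toy problem. This sketch only illustrates generic under-relaxation, not the paper's vortex-sheet equations: the map F, the damping factor w = 0.5, and the tolerance are illustrative assumptions.

```python
# Hedged sketch of damped fixed-point iteration: the update
# x <- x + w*(F(x) - x), where a full step (w = 1) oscillates or
# diverges but damping w < 1 restores convergence.

def solve(F, x0, w=0.5, tol=1e-10, max_iter=200):
    x = x0
    for k in range(max_iter):
        step = F(x) - x
        if abs(step) < tol:
            return x, k
        x += w * step  # damped update, analogous to a damped
                       # geometry correction between iterations
    return x, max_iter

# Example map whose undamped iteration diverges: F(x) = 3 - 1.5*x has
# fixed point x* = 1.2 but |F'(x)| = 1.5 > 1.
F = lambda x: 3.0 - 1.5 * x
x, iters = solve(F, x0=0.0, w=0.5)
print(round(x, 6))  # → 1.2
```

    With w = 0.5 the effective iteration contracts with factor 0.25, so the damped sequence converges to the same fixed point the undamped one would overshoot.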

  19. Spin echo SPI methods for quantitative analysis of fluids in porous media.

    PubMed

    Li, Linqing; Han, Hui; Balcom, Bruce J

    2009-06-01

    Fluid density imaging is highly desirable in a wide variety of porous media measurements. The SPRITE class of MRI methods has proven robust and general in its ability to generate density images in porous media; however, the short encoding times required, with correspondingly high magnetic field gradient strengths and filter widths, and low flip angle RF pulses, yield sub-optimal S/N images, especially at low static field strength. This paper explores two implementations of pure phase-encode spin-echo 1D imaging, with application to a proposed new petroleum reservoir core analysis measurement. In the first implementation of the pulse sequence, we modify the spin echo single point imaging (SE-SPI) technique to acquire the k-space origin data point, with a near-zero evolution time, from the free induction decay (FID) following a 90° excitation pulse. Subsequent k-space data points are acquired by separately phase encoding individual echoes in a multi-echo acquisition. T(2) attenuation of the echo train yields an image convolution which causes blurring. The T(2) blur effect is moderate for porous media with T(2) lifetime distributions longer than 5 ms. As a robust, high-S/N, and fast 1D imaging method, this method will be highly complementary to SPRITE techniques for the quantitative analysis of fluid content in porous media. In the second implementation of the SE-SPI pulse sequence, modification of the basic measurement permits fast determination of spatially resolved T(2) distributions in porous media through separately phase encoding each echo in a multi-echo CPMG pulse train. An individual T(2)-weighted image may be acquired from each echo. The echo time (TE) of each T(2)-weighted image may be reduced to 500 μs or less. These profiles can be fit to extract a T(2) distribution from each pixel employing a variety of standard inverse Laplace transform methods. Fluid content 1D images are produced as an essential by-product of determining the spatially resolved T(2) distribution. These 1D images do not suffer from T(2)-related blurring. The above SE-SPI measurements are combined to generate 1D images of the local saturation and the T(2) distribution as a function of saturation, upon centrifugation of petroleum reservoir core samples. The logarithmic mean T(2) is observed to shift linearly with water saturation. This new reservoir core analysis measurement may provide a valuable calibration of the Coates equation for irreducible water saturation, which has been widely implemented in NMR well logging measurements.
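    The step of extracting a T(2) distribution from a multi-echo decay can be illustrated with a toy inversion. This is not the paper's method: the echo times, the three-point T(2) grid, and the nonnegative multiplicative-update solver below are all illustrative assumptions, standing in for the standard inverse Laplace transform solvers the abstract mentions.

```python
# Toy T2 inversion (illustrative assumptions throughout): recover
# amplitudes f_j in y(t) = sum_j f_j * exp(-t / T2_j) on a fixed T2
# grid via nonnegative multiplicative updates.
import math

def fit_t2(times, signal, t2_grid, n_iter=2000):
    K = [[math.exp(-t / T2) for T2 in t2_grid] for t in times]
    f = [1.0] * len(t2_grid)           # nonnegative amplitudes
    for _ in range(n_iter):
        model = [sum(K[i][j] * f[j] for j in range(len(f)))
                 for i in range(len(times))]
        for j in range(len(f)):        # multiplicative update keeps f >= 0
            num = sum(K[i][j] * signal[i] for i in range(len(times)))
            den = sum(K[i][j] * model[i] for i in range(len(times)))
            f[j] *= num / den
    return f

# Synthetic decay with a single true component at T2 = 50 ms.
times  = [i * 2.0 for i in range(1, 40)]          # echo times, ms
signal = [math.exp(-t / 50.0) for t in times]
f = fit_t2(times, signal, t2_grid=[10.0, 50.0, 200.0])
print(f.index(max(f)))  # → 1 (the 50 ms grid point)
```

    With this exact synthetic decay nearly all the amplitude concentrates on the 50 ms grid point; real data would call for many more grid points and regularization.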

  20. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, and quantile functions, a random number generator, simulation functions, and functions for testing.
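    The null model behind such a test can be illustrated by simulation (the package itself is R and uses exact formulas; the Python sketch below, including the gene-spacing numbers, is an invented Monte Carlo stand-in).

```python
# Illustrative simulation of the null model: r successes fall
# uniformly at random among n trials; an over-represented short
# minimum distance suggests grouping. Exact formulas (as in the
# package) would replace this Monte Carlo estimate.
import random

def min_distance(n, r, rng):
    """Minimum gap between r successes placed uniformly in n slots."""
    positions = sorted(rng.sample(range(n), r))
    return min(b - a for a, b in zip(positions, positions[1:]))

def p_short(n, r, d, trials=20000, seed=1):
    """Estimate P(minimum inter-success distance <= d) under the null."""
    rng = random.Random(seed)
    hits = sum(min_distance(n, r, rng) <= d for _ in range(trials))
    return hits / trials

# How often do 5 random "genes" on a 100-site sequence land with some
# pair at most 2 sites apart, purely by chance?
print(p_short(n=100, r=5, d=2))
```

    If an observed gene set shows a much smaller minimum distance than this null probability suggests, the grouping is unlikely to be random; the package's exact distribution makes that comparison precise even for small samples.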

  1. Implementation of a Publish-Subscribe Protocol in Microgrid Islanding and Resynchronization with Self-Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, M.; Herron, A.; King, D.

    Communications systems and protocols are becoming second nature to utilities operating distribution systems. Traditionally, centralized communication approaches have been used, while recently, in microgrid applications, distributed communication and control schemas have emerged, offering several advantages such as improved system reliability, plug-and-play operation, and distributed intelligence. Still, the operation and control of microgrids using distributed communication schemas have received little discussion in the literature. To address the challenge of multiple-inverter microgrid synchronization, this paper proposes a communication schema for microgrids based on a publish-subscribe protocol, the Data Distribution Service (DDS). The communication schema is discussed in detail for individual devices such as generators, photovoltaic systems, energy storage systems, the microgrid point-of-common-coupling switch, and supporting applications. In conclusion, islanding and resynchronization of a microgrid are demonstrated on a test-bed utilizing this schema.
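    The publish-subscribe pattern underlying such a schema can be sketched in a few lines. This is a minimal in-process illustration, not the DDS middleware or the paper's actual topics: the topic name, device names, message fields, and the 0.05 Hz resynchronization tolerance are all invented.

```python
# Minimal topic-based publish-subscribe sketch (illustrative only):
# devices publish state on named topics; the point-of-common-coupling
# (PCC) switch logic subscribes to decide when resynchronization is
# allowed. A real system would use DDS over the network.

class Bus:
    def __init__(self):
        self.subs = {}          # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for cb in self.subs.get(topic, []):
            cb(message)

bus = Bus()
state = {}

# The PCC switch subscribes to the shared frequency topic.
def on_freq(msg):
    state[msg["device"]] = msg["hz"]

bus.subscribe("microgrid/frequency", on_freq)

# Devices self-announce by publishing; no central registration step,
# which is the "plug-and-play" property the abstract highlights.
bus.publish("microgrid/frequency", {"device": "pv1", "hz": 60.02})
bus.publish("microgrid/frequency", {"device": "ess1", "hz": 59.99})

# Resynchronize only when all reported frequencies are near nominal.
in_sync = all(abs(hz - 60.0) < 0.05 for hz in state.values())
print(in_sync)  # → True
```

    Because subscribers only name a topic, new devices can join simply by publishing, which is how publish-subscribe schemas support the self-discovery mentioned in the title.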

  2. Measurement of distributions sensitive to the underlying event in inclusive Z-boson production in [Formula: see text] collisions at [Formula: see text] TeV with the ATLAS detector.

    PubMed

    Aad, G; Abbott, B; Abdallah, J; Abdel Khalek, S; Abdinov, O; Aben, R; Abi, B; Abolins, M; AbouZeid, O S; Abramowicz, H; Abreu, H; Abreu, R; Abulaiti, Y; Acharya, B S; Adamczyk, L; Adams, D L; Adelman, J; Adomeit, S; Adye, T; Agatonovic-Jovin, T; Aguilar-Saavedra, J A; Agustoni, M; Ahlen, S P; Ahmadov, F; Aielli, G; Akerstedt, H; Åkesson, T P A; Akimoto, G; Akimov, A V; Alberghi, G L; Albert, J; Albrand, S; Alconada Verzini, M J; Aleksa, M; Aleksandrov, I N; Alexa, C; Alexander, G; Alexandre, G; Alexopoulos, T; Alhroob, M; Alimonti, G; Alio, L; Alison, J; Allbrooke, B M M; Allison, L J; Allport, P P; Almond, J; Aloisio, A; Alonso, A; Alonso, F; Alpigiani, C; Altheimer, A; Alvarez Gonzalez, B; Alviggi, M G; Amako, K; Amaral Coutinho, Y; Amelung, C; Amidei, D; Amor Dos Santos, S P; Amorim, A; Amoroso, S; Amram, N; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anderson, K J; Andreazza, A; Andrei, V; Anduaga, X S; Angelidakis, S; Angelozzi, I; Anger, P; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antonaki, A; Antonelli, M; Antonov, A; Antos, J; Anulli, F; Aoki, M; Aperio Bella, L; Apolle, R; Arabidze, G; Aracena, I; Arai, Y; Araque, J P; Arce, A T H; Arguin, J-F; Argyropoulos, S; Arik, M; Armbruster, A J; Arnaez, O; Arnal, V; Arnold, H; Arratia, M; Arslan, O; Artamonov, A; Artoni, G; Asai, S; Asbah, N; Ashkenazi, A; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Atkinson, M; Atlay, N B; Auerbach, B; Augsten, K; Aurousseau, M; Avolio, G; Azuelos, G; Azuma, Y; Baak, M A; Baas, A; Bacci, C; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Backus Mayes, J; Badescu, E; Bagiacchi, P; Bagnaia, P; Bai, Y; Bain, T; Baines, J T; Baker, O K; Balek, P; Balli, F; Banas, E; Banerjee, Sw; Bannoura, A A E; Bansal, V; Bansil, H S; Barak, L; Baranov, S P; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barisonzi, M; Barklow, T; Barlow, N; Barnett, B M; Barnett, R M; Barnovska, Z; Baroncelli, A; Barone, G; Barr, A J; 
Barreiro, F; Barreiro Guimarães da Costa, J; Bartoldus, R; Barton, A E; Bartos, P; Bartsch, V; Bassalat, A; Basye, A; Bates, R L; Batley, J R; Battaglia, M; Battistin, M; Bauer, F; Bawa, H S; Beattie, M D; Beau, T; Beauchemin, P H; Beccherle, R; Bechtle, P; Beck, H P; Becker, K; Becker, S; Beckingham, M; Becot, C; Beddall, A J; Beddall, A; Bedikian, S; Bednyakov, V A; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, K; Belanger-Champagne, C; Bell, P J; Bell, W H; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belotskiy, K; Beltramello, O; Benary, O; Benchekroun, D; Bendtz, K; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez Garcia, J A; Benjamin, D P; Bensinger, J R; Benslama, K; Bentvelsen, S; Berge, D; Bergeaas Kuutmann, E; Berger, N; Berghaus, F; Beringer, J; Bernard, C; Bernat, P; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertoli, G; Bertolucci, F; Bertsche, C; Bertsche, D; Besana, M I; Besjes, G J; Bessidskaia, O; Bessner, M; Besson, N; Betancourt, C; Bethke, S; Bhimji, W; Bianchi, R M; Bianchini, L; Bianco, M; Biebel, O; Bieniek, S P; Bierwagen, K; Biesiada, J; Biglietti, M; Bilbao De Mendizabal, J; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blanchard, J-B; Blazek, T; Bloch, I; Blocker, C; Blum, W; Blumenschein, U; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Bock, C; Boddy, C R; Boehler, M; Boek, J; Boek, J; Boek, T T; Bogaerts, J A; Bogdanchikov, A G; Bogouch, A; Bohm, C; Bohm, J; Boisvert, V; Bold, T; Boldea, V; Boldyrev, A S; Bomben, M; Bona, M; Boonekamp, M; Borisov, A; Borissov, G; Borri, M; Borroni, S; Bortfeldt, J; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Boterenbrood, H; Boudreau, J; Bouffard, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Bousson, N; Boutouil, S; Boveia, A; Boyd, J; Boyko, I R; Bracinik, J; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Brazzale, S F; Brelier, B; 
Brendlinger, K; Brennan, A J; Brenner, R; Bressler, S; Bristow, K; Bristow, T M; Britton, D; Brochu, F M; Brock, I; Brock, R; Bromberg, C; Bronner, J; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Brown, J; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Brunet, S; Bruni, A; Bruni, G; Bruschi, M; Bryngemark, L; Buanes, T; Buat, Q; Bucci, F; Buchholz, P; Buckingham, R M; Buckley, A G; Buda, S I; Budagov, I A; Buehrer, F; Bugge, L; Bugge, M K; Bulekov, O; Bundock, A C; Burckhart, H; Burdin, S; Burghgrave, B; Burke, S; Burmeister, I; Busato, E; Büscher, D; Büscher, V; Bussey, P; Buszello, C P; Butler, B; Butler, J M; Butt, A I; Buttar, C M; Butterworth, J M; Butti, P; Buttinger, W; Buzatu, A; Byszewski, M; Cabrera Urbán, S; Caforio, D; Cakir, O; Calafiura, P; Calandri, A; Calderini, G; Calfayan, P; Calkins, R; Caloba, L P; Calvet, D; Calvet, S; Camacho Toro, R; Camarda, S; Cameron, D; Caminada, L M; Caminal Armadans, R; Campana, S; Campanelli, M; Campoverde, A; Canale, V; Canepa, A; Cano Bret, M; Cantero, J; Cantrill, R; Cao, T; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capua, M; Caputo, R; Cardarelli, R; Carli, T; Carlino, G; Carminati, L; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Casolino, M; Castaneda-Miranda, E; Castelli, A; Castillo Gimenez, V; Castro, N F; Catastini, P; Catinaccio, A; Catmore, J R; Cattai, A; Cattani, G; Cavaliere, V; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerio, B; Cerny, K; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cerv, M; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chalupkova, I; Chang, P; Chapleau, B; Chapman, J D; Charfeddine, D; Charlton, D G; Chau, C C; Chavez Barajas, C A; Cheatham, S; Chegwidden, A; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; Chen, L; Chen, S; Chen, X; Chen, Y; Chen, Y; Cheng, H C; Cheng, Y; Cheplakov, A; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; 
Chevalier, L; Chiarella, V; Chiefari, G; Childers, J T; Chilingarov, A; Chiodini, G; Chisholm, A S; Chislett, R T; Chitan, A; Chizhov, M V; Chouridou, S; Chow, B K B; Chromek-Burckhart, D; Chu, M L; Chudoba, J; Chwastowski, J J; Chytka, L; Ciapetti, G; Ciftci, A K; Ciftci, R; Cinca, D; Cindro, V; Ciocio, A; Cirkovic, P; Citron, Z H; Citterio, M; Ciubancan, M; Clark, A; Clark, P J; Clarke, R N; Cleland, W; Clemens, J C; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coffey, L; Cogan, J G; Coggeshall, J; Cole, B; Cole, S; Colijn, A P; Collot, J; Colombo, T; Colon, G; Compostella, G; Conde Muiño, P; Coniavitis, E; Conidi, M C; Connell, S H; Connelly, I A; Consonni, S M; Consorti, V; Constantinescu, S; Conta, C; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cooper-Smith, N J; Copic, K; Cornelissen, T; Corradi, M; Corriveau, F; Corso-Radu, A; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Côté, D; Cottin, G; Cowan, G; Cox, B E; Cranmer, K; Cree, G; Crépé-Renaudin, S; Crescioli, F; Cribbs, W A; Crispin Ortuzar, M; Cristinziani, M; Croft, V; Crosetti, G; Cuciuc, C-M; Cuhadar Donszelmann, T; Cummings, J; Curatolo, M; Cuthbert, C; Czirr, H; Czodrowski, P; Czyczula, Z; D'Auria, S; D'Onofrio, M; Cunha Sargedas De Sousa, M J Da; Via, C Da; Dabrowski, W; Dafinca, A; Dai, T; Dale, O; Dallaire, F; Dallapiccola, C; Dam, M; Daniells, A C; Dano Hoffmann, M; Dao, V; Darbo, G; Darmora, S; Dassoulas, J A; Dattagupta, A; Davey, W; David, C; Davidek, T; Davies, E; Davies, M; Davignon, O; Davison, A R; Davison, P; Davygora, Y; Dawe, E; Dawson, I; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Castro, S; De Cecco, S; De Groot, N; de Jong, P; De la Torre, H; De Lorenzi, F; De Nooij, L; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dearnaley, W J; Debbe, R; Debenedetti, C; Dechenaux, B; Dedovich, D V; Deigaard, I; Del Peso, J; Del Prete, T; Deliot, F; Delitzsch, C M; Deliyergiyev, M; Dell'Acqua, A; 
Dell'Asta, L; Dell'Orso, M; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; Deluca, C; Demers, S; Demichev, M; Demilly, A; Denisov, S P; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deterre, C; Deviveiros, P O; Dewhurst, A; Dhaliwal, S; Di Ciaccio, A; Di Ciaccio, L; Di Domenico, A; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Mattia, A; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Di Valentino, D; Dias, F A; Diaz, M A; Diehl, E B; Dietrich, J; Dietzsch, T A; Diglio, S; Dimitrievska, A; Dingfelder, J; Dionisi, C; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; do Vale, M A B; Do Valle Wemans, A; Dobos, D; Doglioni, C; Doherty, T; Dohmae, T; Dolejsi, J; Dolezal, Z; Dolgoshein, B A; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dova, M T; Doyle, A T; Dris, M; Dubbert, J; Dube, S; Dubreuil, E; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Dudziak, F; Duflot, L; Duguid, L; Dührssen, M; Dunford, M; Duran Yildiz, H; Düren, M; Durglishvili, A; Dwuznik, M; Dyndal, M; Ebke, J; Edson, W; Edwards, N C; Ehrenfeld, W; Eifert, T; Eigen, G; Einsweiler, K; Ekelof, T; El Kacimi, M; Ellert, M; Elles, S; Ellinghaus, F; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Endo, M; Engelmann, R; Erdmann, J; Ereditato, A; Eriksson, D; Ernis, G; Ernst, J; Ernst, M; Ernwein, J; Errede, D; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Esposito, B; Etienvre, A I; Etzion, E; Evans, H; Ezhilov, A; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Falla, R J; Faltova, J; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Favareto, A; Fayard, L; Federic, P; Fedin, O L; Fedorko, W; Fehling-Kaschek, M; Feigl, S; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Fernandez Perez, S; Ferrag, S; Ferrando, J; Ferrari, A; Ferrari, P; Ferrari, R; Ferreira de Lima, D E; Ferrer, A; Ferrere, D; Ferretti, C; 
Ferretto Parodi, A; Fiascaris, M; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, A; Fischer, J; Fisher, W C; Fitzgerald, E A; Flechl, M; Fleck, I; Fleischmann, P; Fleischmann, S; Fletcher, G T; Fletcher, G; Flick, T; Floderus, A; Flores Castillo, L R; Florez Bustos, A C; Flowerdew, M J; Formica, A; Forti, A; Fortin, D; Fournier, D; Fox, H; Fracchia, S; Francavilla, P; Franchini, M; Franchino, S; Francis, D; Franconi, L; Franklin, M; Franz, S; Fraternali, M; French, S T; Friedrich, C; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fulsom, B G; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gadatsch, S; Gadomski, S; Gagliardi, G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallo, V; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Gao, J; Gao, Y S; Garay Walls, F M; Garberson, F; García, C; García Navarro, J E; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Gatti, C; Gaudio, G; Gaur, B; Gauthier, L; Gauzzi, P; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Ge, P; Gecse, Z; Gee, C N P; Geerts, D A A; Geich-Gimbel, Ch; Gellerstedt, K; Gemme, C; Gemmell, A; Genest, M H; Gentile, S; George, M; George, S; Gerbaudo, D; Gershon, A; Ghazlane, H; Ghodbane, N; Giacobbe, B; Giagu, S; Giangiobbe, V; Giannetti, P; Gianotti, F; Gibbard, B; Gibson, S M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gilles, G; Gingrich, D M; Giokaris, N; Giordani, M P; Giordano, R; Giorgi, F M; Giorgi, F M; Giraud, P F; Giugni, D; Giuliani, C; Giulini, M; Gjelsten, B K; Gkaitatzis, S; Gkialas, I; Gladilin, L K; Glasman, C; Glatzer, J; Glaysher, P C F; Glazov, A; Glonti, G L; Goblirsch-Kolb, M; Goddard, J R; Godlewski, J; Goeringer, C; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gomez Fajardo, L S; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, L; González de la Hoz, S; Gonzalez Parra, G; Gonzalez-Sevilla, S; Goossens, L; Gorbounov, P A; Gordon, 
H A; Gorelov, I; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Gouighri, M; Goujdami, D; Goulette, M P; Goussiou, A G; Goy, C; Gozpinar, S; Grabas, H M X; Graber, L; Grabowska-Bold, I; Grafström, P; Grahn, K-J; Gramling, J; Gramstad, E; Grancagnolo, S; Grassi, V; Gratchev, V; Gray, H M; Graziani, E; Grebenyuk, O G; Greenwood, Z D; Gregersen, K; Gregor, I M; Grenier, P; Griffiths, J; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grishkevich, Y V; Grivaz, J-F; Grohs, J P; Grohsjean, A; Gross, E; Grosse-Knetter, J; Grossi, G C; Groth-Jensen, J; Grout, Z J; Guan, L; Guescini, F; Guest, D; Gueta, O; Guicheney, C; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Gunther, J; Guo, J; Gupta, S; Gutierrez, P; Gutierrez Ortiz, N G; Gutschow, C; Guttman, N; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Haddad, N; Haefner, P; Hageböeck, S; Hajduk, Z; Hakobyan, H; Haleem, M; Hall, D; Halladjian, G; Hamacher, K; Hamal, P; Hamano, K; Hamer, M; Hamilton, A; Hamilton, S; Hamity, G N; Hamnett, P G; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Hanke, P; Hann, R; Hansen, J B; Hansen, J D; Hansen, P H; Hara, K; Hard, A S; Harenberg, T; Hariri, F; Harkusha, S; Harper, D; Harrington, R D; Harris, O M; Harrison, P F; Hartjes, F; Hasegawa, M; Hasegawa, S; Hasegawa, Y; Hasib, A; Hassani, S; Haug, S; Hauschild, M; Hauser, R; Havranek, M; Hawkes, C M; Hawkings, R J; Hawkins, A D; Hayashi, T; Hayden, D; Hays, C P; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heim, T; Heinemann, B; Heinrich, L; Hejbal, J; Helary, L; Heller, C; Heller, M; Hellman, S; Hellmich, D; Helsens, C; Henderson, J; Heng, Y; Henderson, R C W; Hengler, C; Henrichs, A; Henriques Correia, A M; Henrot-Versille, S; Hensel, C; Herbert, G H; Hernández Jiménez, Y; Herrberg-Schubert, R; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hickling, R; Higón-Rodriguez, E; Hill, E; Hill, J C; Hiller, K H; Hillert, S; 
Hillier, S J; Hinchliffe, I; Hines, E; Hirose, M; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoenig, F; Hoffman, J; Hoffmann, D; Hofmann, J I; Hohlfeld, M; Holmes, T R; Hong, T M; Hooft van Huysduynen, L; Hopkins, W H; Horii, Y; Hostachy, J-Y; Hou, S; Hoummada, A; Howard, J; Howarth, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hsu, C; Hsu, P J; Hsu, S-C; Hu, D; Hu, X; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Hülsing, T A; Hurwitz, M; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Ideal, E; Iengo, P; Igonkina, O; Iizawa, T; Ikegami, Y; Ikematsu, K; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Inamaru, Y; Ince, T; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Irles Quiles, A; Isaksson, C; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Iturbe Ponce, J M; Iuppa, R; Ivarsson, J; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jackson, B; Jackson, M; Jackson, P; Jaekel, M R; Jain, V; Jakobs, K; Jakobsen, S; Jakoubek, T; Jakubek, J; Jamin, D O; Jana, D K; Jansen, E; Jansen, H; Janssen, J; Janus, M; Jarlskog, G; Javadov, N; Javůrek, T; Jeanty, L; Jejelava, J; Jeng, G-Y; Jennens, D; Jenni, P; Jentzsch, J; Jeske, C; Jézéquel, S; Ji, H; Jia, J; Jiang, Y; Jimenez Belenguer, M; Jin, S; Jinaru, A; Jinnouchi, O; Joergensen, M D; Johansson, K E; Johansson, P; Johns, K A; Jon-And, K; Jones, G; Jones, R W L; Jones, T J; Jongmanns, J; Jorge, P M; Joshi, K D; Jovicevic, J; Ju, X; Jung, C A; Jungst, R M; Jussel, P; Juste Rozas, A; Kaci, M; Kaczmarska, A; Kado, M; Kagan, H; Kagan, M; Kajomovitz, E; Kalderon, C W; Kama, S; Kamenshchikov, A; Kanaya, N; Kaneda, M; Kaneti, S; Kantserov, V A; Kanzaki, J; Kaplan, B; Kapliy, A; Kar, D; Karakostas, K; Karastathis, N; Kareem, M J; Karnevskiy, M; Karpov, S N; Karpova, Z M; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kashif, L; Kasieczka, G; Kass, R D; Kastanas, 
A; Kataoka, Y; Katre, A; Katzy, J; Kaushik, V; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazama, S; Kazanin, V F; Kazarinov, M Y; Keeler, R; Kehoe, R; Keil, M; Keller, J S; Kempster, J J; Keoshkerian, H; Kepka, O; Kerševan, B P; Kersten, S; Kessoku, K; Keung, J; Khalil-Zada, F; Khandanyan, H; Khanov, A; Khodinov, A; Khomich, A; Khoo, T J; Khoriauli, G; Khoroshilov, A; Khovanskiy, V; Khramov, E; Khubua, J; Kim, H Y; Kim, H; Kim, S H; Kimura, N; Kind, O; King, B T; King, M; King, R S B; King, S B; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kiss, F; Kittelmann, T; Kiuchi, K; Kladiva, E; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klinger, J A; Klioutchnikova, T; Klok, P F; Kluge, E-E; Kluit, P; Kluth, S; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, D; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koevesarki, P; Koffas, T; Koffeman, E; Kogan, L A; Kohlmann, S; Kohout, Z; Kohriki, T; Koi, T; Kolanoski, H; Koletsou, I; Koll, J; Komar, A A; Komori, Y; Kondo, T; Kondrashova, N; Köneke, K; König, A C; König, S; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Korotkov, V A; Kortner, O; Kortner, S; Kostyukhin, V V; Kotov, V M; Kotwal, A; Kourkoumelis, C; Kouskoura, V; Koutsman, A; Kowalewski, R; Kowalski, T Z; Kozanecki, W; Kozhin, A S; Kral, V; Kramarenko, V A; Kramberger, G; Krasnopevtsev, D; Krasny, M W; Krasznahorkay, A; Kraus, J K; Kravchenko, A; Kreiss, S; Kretz, M; Kretzschmar, J; Kreutzfeldt, K; Krieger, P; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Kruker, T; Krumnack, N; Krumshteyn, Z V; Kruse, A; Kruse, M C; Kruskal, M; Kubota, T; Kuday, S; Kuehn, S; Kugel, A; Kuhl, A; Kuhl, T; Kukhtin, V; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunkle, J; Kupco, A; Kurashige, H; Kurochkin, Y A; Kurumida, R; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; La Rosa, A; La Rotonda, L; Lacasta, C; 
Lacava, F; Lacey, J; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Laier, H; Lambourne, L; Lammers, S; Lampen, C L; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lang, V S; Lankford, A J; Lanni, F; Lantzsch, K; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Manghi, F Lasagni; Lassnig, M; Laurelli, P; Lavrijsen, W; Law, A T; Laycock, P; Le Dortz, O; Le Guirriec, E; Le Menedeu, E; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, H; Lee, J S H; Lee, S C; Lee, L; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Lehmacher, M; Lehmann Miotto, G; Lei, X; Leight, W A; Leisos, A; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Leney, K J C; Lenz, T; Lenzen, G; Lenzi, B; Leone, R; Leone, S; Leonidopoulos, C; Leontsinis, S; Leroy, C; Lester, C G; Lester, C M; Levchenko, M; Levêque, J; Levin, D; Levinson, L J; Levy, M; Lewis, A; Lewis, G H; Leyko, A M; Leyton, M; Li, B; Li, B; Li, H; Li, H L; Li, L; Li, L; Li, S; Li, Y; Liang, Z; Liao, H; Liberti, B; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limbach, C; Limosani, A; Lin, S C; Lin, T H; Linde, F; Lindquist, B E; Linnemann, J T; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lissauer, D; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, M; Liu, Y; Livan, M; Livermore, S S A; Lleres, A; Llorente Merino, J; Lloyd, S L; Lo Sterzo, F; Lobodzinska, E; Loch, P; Lockman, W S; Loddenkoetter, T; Loebinger, F K; Loevschall-Jensen, A E; Loginov, A; Lohse, T; Lohwasser, K; Lokajicek, M; Lombardo, V P; Long, B A; Long, J D; Long, R E; Lopes, L; Lopez Mateos, D; Lopez Paredes, B; Lopez Paz, I; Lorenz, J; Lorenzo Martinez, N; Losada, M; Loscutoff, P; Lou, X; Lounis, A; Love, J; Love, P A; Lowe, A J; Lu, F; Lu, N; Lubatti, H J; Luci, C; Lucotte, A; Luehring, F; Lukas, W; Luminari, L; Lundberg, O; Lund-Jensen, B; Lungwitz, M; Lynn, D; Lysak, R; Lytken, E; Ma, H; Ma, L L; Maccarrone, G; Macchiolo, A; Machado Miguens, J; Macina, D; Madaffari, D; 
Madar, R; Maddocks, H J; Mader, W F; Madsen, A; Maeno, M; Maeno, T; Maevskiy, A; Magradze, E; Mahboubi, K; Mahlstedt, J; Mahmoud, S; Maiani, C; Maidantchik, C; Maier, A A; Maio, A; Majewski, S; Makida, Y; Makovec, N; Mal, P; Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyshev, V M; Malyukov, S; Mamuzic, J; Mandelli, B; Mandelli, L; Mandić, I; Mandrysch, R; Maneira, J; Manfredini, A; Manhaes de Andrade Filho, L; Manjarres Ramos, J A; Mann, A; Manning, P M; Manousakis-Katsikakis, A; Mansoulie, B; Mantifel, R; Mapelli, L; March, L; Marchand, J F; Marchiori, G; Marcisovsky, M; Marino, C P; Marjanovic, M; Marques, C N; Marroquim, F; Marsden, S P; Marshall, Z; Marti, L F; Marti-Garcia, S; Martin, B; Martin, B; Martin, T A; Martin, V J; Martin Dit Latour, B; Martinez, H; Martinez, M; Martin-Haugh, S; Martyniuk, A C; Marx, M; Marzano, F; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massa, L; Massol, N; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mättig, P; Mattmann, J; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Mazzaferro, L; Mc Goldrick, G; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McCubbin, N A; McFarlane, K W; Mcfayden, J A; Mchedlidze, G; McMahon, S J; McPherson, R A; Mechnich, J; Medinnis, M; Meehan, S; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Melachrinos, C; Mellado Garcia, B R; Meloni, F; Mengarelli, A; Menke, S; Meoni, E; Mercurio, K M; Mergelmeyer, S; Meric, N; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Merritt, H; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Middleton, R P; Migas, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Milic, A; Miller, D W; Mills, C; Milov, A; Milstead, D A; Milstein, D; Minaenko, A A; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mirabelli, G; Mitani, T; Mitrevski, J; Mitsou, V A; Mitsui, S; Miucci, A; Miyagawa, P S; Mjörnmark, J U; 
Moa, T; Mochizuki, K; Mohapatra, S; Mohr, W; Molander, S; Moles-Valls, R; Mönig, K; Monini, C; Monk, J; Monnier, E; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Morange, N; Moreno, D; Moreno Llácer, M; Morettini, P; Morgenstern, M; Morii, M; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Morvaj, L; Moser, H G; Mosidze, M; Moss, J; Motohashi, K; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Muanza, S; Mudd, R D; Mueller, F; Mueller, J; Mueller, K; Mueller, T; Mueller, T; Muenstermann, D; Munwes, Y; Murillo Quijada, J A; Murray, W J; Musheghyan, H; Musto, E; Myagkov, A G; Myska, M; Nackenhorst, O; Nadal, J; Nagai, K; Nagai, R; Nagai, Y; Nagano, K; Nagarkar, A; Nagasaka, Y; Nagel, M; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Namasivayam, H; Nanava, G; Narayan, R; Nattermann, T; Naumann, T; Navarro, G; Nayyar, R; Neal, H A; Nechaeva, P Yu; Neep, T J; Nef, P D; Negri, A; Negri, G; Negrini, M; Nektarijevic, S; Nelson, A; Nelson, T K; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neves, R M; Nevski, P; Newman, P R; Nguyen, D H; Nickerson, R B; Nicolaidou, R; Nicquevert, B; Nielsen, J; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolics, K; Nikolopoulos, K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nodulman, L; Nomachi, M; Nomidis, I; Norberg, S; Nordberg, M; Novgorodova, O; Nowak, S; Nozaki, M; Nozka, L; Ntekas, K; Nunes Hanninger, G; Nunnemann, T; Nurse, E; Nuti, F; O'Brien, B J; O'grady, F; O'Neil, D C; O'Shea, V; Oakham, F G; Oberlack, H; Obermann, T; Ocariz, J; Ochi, A; Ochoa, M I; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohman, H; Okamura, W; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olchevski, A G; Olivares Pino, S A; Oliveira Damazio, D; Oliver Garcia, E; Olszewski, A; Olszowska, J; Onofre, A; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Otero Y Garzon, G; 
Otono, H; Ouchrif, M; Ouellette, E A; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Ovcharova, A; Owen, M; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; Padilla Aranda, C; Pagáčová, M; Pagan Griso, S; Paganis, E; Pahl, C; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palestini, S; Palka, M; Pallin, D; Palma, A; Palmer, J D; Pan, Y B; Panagiotopoulou, E; Panduro Vazquez, J G; Pani, P; Panikashvili, N; Panitkin, S; Pantea, D; Paolozzi, L; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, M A; Parodi, F; Parsons, J A; Parzefall, U; Pasqualucci, E; Passaggio, S; Passeri, A; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N D; Pater, J R; Patricelli, S; Pauly, T; Pearce, J; Pedersen, L E; Pedersen, M; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Pelikan, D; Peng, H; Penning, B; Penwell, J; Perepelitsa, D V; Perez Codina, E; Pérez García-Estañ, M T; Perez Reale, V; Perini, L; Pernegger, H; Perrella, S; Perrino, R; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petrolo, E; Petrucci, F; Pettersson, N E; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pina, J; Pinamonti, M; Pinder, A; Pinfold, J L; Pingel, A; Pinto, B; Pires, S; Pitt, M; Pizio, C; Plazak, L; Pleier, M-A; Pleskot, V; Plotnikova, E; Plucinski, P; Poddar, S; Podlyski, F; Poettgen, R; Poggioli, L; Pohl, D; Pohl, M; Polesello, G; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Portell Bueso, X; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Pralavorio, P; Pranko, A; Prasad, S; Pravahan, R; Prell, S; Price, D; Price, J; Price, L E; Prieur, D; Primavera, M; Proissl, M; Prokofiev, K; Prokoshin, F; Protopapadaki, E; Protopopescu, S; 
Proudfoot, J; Przybycien, M; Przysiezniak, H; Ptacek, E; Puddu, D; Pueschel, E; Puldon, D; Purohit, M; Puzo, P; Qian, J; Qin, G; Qin, Y; Quadt, A; Quarrie, D R; Quayle, W B; Queitsch-Maitland, M; Quilty, D; Qureshi, A; Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Rados, P; Ragusa, F; Rahal, G; Rajagopalan, S; Rammensee, M; Randle-Conde, A S; Rangel-Smith, C; Rao, K; Rauscher, F; Rave, T C; Ravenscroft, T; Raymond, M; Read, A L; Readioff, N P; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Rehnisch, L; Reisin, H; Relich, M; Rembser, C; Ren, H; Ren, Z L; Renaud, A; Rescigno, M; Resconi, S; Rezanova, O L; Reznicek, P; Rezvani, R; Richter, R; Ridel, M; Rieck, P; Rieger, J; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Ritsch, E; Riu, I; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Roda, C; Rodrigues, L; Roe, S; Røhne, O; Rolli, S; Romaniouk, A; Romano, M; Romero Adam, E; Rompotis, N; Ronzani, M; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, M; Rose, P; Rosendahl, P L; Rosenthal, O; Rossetti, V; Rossi, E; Rossi, L P; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rubinskiy, I; Rud, V I; Rudolph, C; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rurikova, Z; Rusakovich, N A; Ruschke, A; Rutherfoord, J P; Ruthmann, N; Ryabov, Y F; Rybar, M; Rybkin, G; Ryder, N C; Saavedra, A F; Sacerdoti, S; Saddique, A; Sadeh, I; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Saleem, M; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sampsonidis, D; Sanchez, A; Sánchez, J; Sanchez Martinez, V; Sandaker, H; Sandbach, R L; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, T; Sandoval, C; Sandstroem, R; Sankey, D P C; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; Sapp, K; Sapronov, A; Saraiva, J G; 
Sarrazin, B; Sartisohn, G; Sasaki, O; Sasaki, Y; Sauvage, G; Sauvan, E; Savard, P; Savu, D O; Sawyer, C; Sawyer, L; Saxon, D H; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Scarfone, V; Schaarschmidt, J; Schacht, P; Schaefer, D; Schaefer, R; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Scherzer, M I; Schiavi, C; Schieck, J; Schillo, C; Schioppa, M; Schlenker, S; Schmidt, E; Schmieden, K; Schmitt, C; Schmitt, S; Schneider, B; Schnellbach, Y J; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schorlemmer, A L S; Schott, M; Schouten, D; Schovancova, J; Schramm, S; Schreyer, M; Schroeder, C; Schuh, N; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwanenberger, C; Schwartzman, A; Schwarz, T A; Schwegler, Ph; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Schwoerer, M; Sciacca, F G; Scifo, E; Sciolla, G; Scott, W G; Scuri, F; Scutti, F; Searcy, J; Sedov, G; Sedykh, E; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekula, S J; Selbach, K E; Seliverstov, D M; Sellers, G; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Serre, T; Seuster, R; Severini, H; Sfiligoj, T; Sforza, F; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; Shang, R; Shank, J T; Shapiro, M; Shatalov, P B; Shaw, K; Shehu, C Y; Sherwood, P; Shi, L; Shimizu, S; Shimmin, C O; Shimojima, M; Shiyakova, M; Shmeleva, A; Shochet, M J; Short, D; Shrestha, S; Shulga, E; Shupe, M A; Shushkevich, S; Sicho, P; Sidiropoulou, O; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silva, J; Silver, Y; Silverstein, D; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simoniello, R; Simonyan, M; Sinervo, P; Sinev, N B; Sipica, V; Siragusa, G; Sircar, A; Sisakyan, A N; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skottowe, H P; Skovpen, K Yu; Skubic, P; Slater, M; Slavicek, T; Sliwa, K; 
Smakhtin, V; Smart, B H; Smestad, L; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, K M; Smizanska, M; Smolek, K; Snesarev, A A; Snidero, G; Snyder, S; Sobie, R; Socher, F; Soffer, A; Soh, D A; Solans, C A; Solar, M; Solc, J; Soldatov, E Yu; Soldevila, U; Solodkov, A A; Soloshenko, A; Solovyanov, O V; Solovyev, V; Sommer, P; Song, H Y; Soni, N; Sood, A; Sopczak, A; Sopko, B; Sopko, V; Sorin, V; Sosebee, M; Soualah, R; Soueid, P; Soukharev, A M; South, D; Spagnolo, S; Spanò, F; Spearman, W R; Spettel, F; Spighi, R; Spigo, G; Spiller, L A; Spousta, M; Spreitzer, T; Spurlock, B; Denis, R D St; Staerz, S; Stahlman, J; Stamen, R; Stamm, S; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staszewski, R; Stavina, P; Steinberg, P; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stern, S; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoicea, G; Stolte, P; Stonjek, S; Stradling, A R; Straessner, A; Stramaglia, M E; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Struebig, A; Stucci, S A; Stugu, B; Styles, N A; Su, D; Su, J; Subramaniam, R; Succurro, A; Sugaya, Y; Suhr, C; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, S; Sun, X; Sundermann, J E; Suruliz, K; Susinno, G; Sutton, M R; Suzuki, Y; Svatos, M; Swedish, S; Swiatlowski, M; Sykora, I; Sykora, T; Ta, D; Taccini, C; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tam, J Y C; Tan, K G; Tanaka, J; Tanaka, R; Tanaka, S; Tanaka, S; Tanasijczuk, A J; Tannenwald, B B; Tannoury, N; Tapprogge, S; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, F E; Taylor, G N; Taylor, W; Teischinger, F A; Teixeira Dias Castanheira, M; Teixeira-Dias, P; Temming, K K; Ten 
Kate, H; Teng, P K; Teoh, J J; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; Therhaag, J; Theveneaux-Pelzer, T; Thomas, J P; Thomas-Wilsker, J; Thompson, E N; Thompson, P D; Thompson, P D; Thompson, R J; Thompson, A S; Thomsen, L A; Thomson, E; Thomson, M; Thong, W M; Thun, R P; Tian, F; Tibbetts, M J; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tiouchichine, E; Tipton, P; Tisserant, S; Todorov, T; Todorova-Nova, S; Toggerson, B; Tojo, J; Tokár, S; Tokushuku, K; Tollefson, K; Tolley, E; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Topilin, N D; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Tran, H L; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Trischuk, W; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trovatelli, M; True, P; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsukerman, I I; Tsulaia, V; Tsuno, S; Tsybychev, D; Tudorache, A; Tudorache, V; Tuna, A N; Tupputi, S A; Turchikhin, S; Turecek, D; Turk Cakir, I; Turra, R; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Uchida, K; Ueda, I; Ueno, R; Ughetto, M; Ugland, M; Uhlenbrock, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Unverdorben, C; Urbaniec, D; Urquijo, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; Van Den Wollenberg, W; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; Van Der Leeuw, R; van der Ster, D; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vanguri, R; Vaniachine, A; Vankov, P; Vannucci, F; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veloso, F; Veneziano, S; Ventura, A; 
Ventura, D; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigne, R; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Virzi, J; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vogel, A; Vogel, M; Vokac, P; Volpi, G; Volpi, M; von der Schmitt, H; von Radziewski, H; von Toerne, E; Vorobel, V; Vorobev, K; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, P; Wagner, W; Wahlberg, H; Wahrmund, S; Wakabayashi, J; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Walsh, B; Wang, C; Wang, C; Wang, F; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, X; Wanotayaroj, C; Warburton, A; Ward, C P; Wardrope, D R; Warsinsky, M; Washbrook, A; Wasicki, C; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weigell, P; Weinert, B; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wendland, D; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Wessels, M; Wetter, J; Whalen, K; White, A; White, M J; White, R; White, S; Whiteson, D; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilkens, H G; Will, J Z; Williams, H H; Williams, S; Willis, C; Willocq, S; Wilson, A; Wilson, J A; Wingerter-Seez, I; Winklmeier, F; Winter, B T; Wittgen, M; Wittig, T; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wright, M; Wu, M; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wyatt, T R; Wynne, B M; Xella, S; Xiao, M; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yakabe, R; Yamada, M; Yamaguchi, H; Yamaguchi, Y; Yamamoto, A; Yamamoto, 
K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, U K; Yang, Y; Yanush, S; Yao, L; Yao, W-M; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yeletskikh, I; Yen, A L; Yildirim, E; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J M; Yu, J; Yuan, L; Yurkewicz, A; Yusuff, I; Zabinski, B; Zaidan, R; Zaitsev, A M; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zeitnitz, C; Zeman, M; Zemla, A; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zevi Della Porta, G; Zhang, D; Zhang, F; Zhang, H; Zhang, J; Zhang, L; Zhang, X; Zhang, Z; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, L; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhukov, K; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, R; Zimmermann, S; Zimmermann, S; Zinonos, Z; Ziolkowski, M; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zurzolo, G; Zutshi, V; Zwalinski, L

    A measurement of charged-particle distributions sensitive to the properties of the underlying event is presented for an inclusive sample of events containing a [Formula: see text]-boson, decaying to an electron or muon pair. The measurement is based on data collected using the ATLAS detector at the LHC in proton-proton collisions at a centre-of-mass energy of [Formula: see text] TeV with an integrated luminosity of [Formula: see text] fb[Formula: see text]. Distributions of the charged particle multiplicity and of the charged particle transverse momentum are measured in regions of azimuthal angle defined with respect to the [Formula: see text]-boson direction. The measured distributions are compared to similar distributions measured in jet events, and to the predictions of various Monte Carlo generators implementing different underlying event models.

  3. Implementation of a Publish-Subscribe Protocol in Microgrid Islanding and Resynchronization with Self-Discovery

    DOE PAGES

    Starke, M.; Herron, A.; King, D.; ...

    2017-08-24

    Communications systems and protocols are becoming second nature to utilities operating distribution systems. Traditionally, centralized communication approaches have been used, while in recent microgrid applications distributed communication and control schemas have emerged, offering several advantages such as improved system reliability, plug-and-play operation, and distributed intelligence. Still, the operation and control of microgrids with distributed communication schemas have received little attention in the literature. To address the challenge of multiple-inverter microgrid synchronization, this paper proposes a publish-subscribe communication schema for microgrids based on the Data Distribution Service (DDS) protocol. The schema is discussed in detail for individual devices such as generators, photovoltaic systems, energy storage systems, the microgrid point-of-common-coupling switch, and supporting applications. Finally, islanding and resynchronization of a microgrid are demonstrated on a test bed utilizing this schema.
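
    The topic-based publish-subscribe pattern underlying DDS can be sketched with a minimal in-process message bus. The topic name, device fields, and the 60 Hz resynchronization check below are illustrative assumptions, not the paper's actual schema:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    device_id: str
    breaker_closed: bool
    frequency_hz: float

class Bus:
    """Minimal in-process topic bus standing in for a DDS middleware."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

# A supervisory application subscribes to device status messages and
# checks whether all devices are close enough to 60 Hz to resynchronize.
bus = Bus()
seen = {}
bus.subscribe("microgrid/status", lambda m: seen.__setitem__(m.device_id, m))

bus.publish("microgrid/status", DeviceStatus("pv1", True, 59.98))
bus.publish("microgrid/status", DeviceStatus("ess1", True, 60.01))

ready_to_resync = all(
    s.breaker_closed and abs(s.frequency_hz - 60.0) < 0.05 for s in seen.values()
)
```

    In a real deployment each device would be a separate DDS participant discovering topics over the network; the single-process bus only illustrates the decoupling between publishers and subscribers.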

  4. All fiber-coupled, long-term stable timing distribution for free-electron lasers with few-femtosecond jitter

    PubMed Central

    Şafak, K.; Xin, M.; Callahan, P. T.; Peng, M. Y.; Kärtner, F. X.

    2015-01-01

    We report recent progress made in a complete fiber-optic, high-precision, long-term stable timing distribution system for synchronization of next generation X-ray free-electron lasers. Timing jitter characterization of the master laser shows less than 170-as RMS integrated jitter for frequencies above 10 kHz, limited by the detection noise floor. Timing stabilization of a 3.5-km polarization-maintaining fiber link is successfully achieved with an RMS drift of 3.3 fs over 200 h of operation using all fiber-coupled elements. This all fiber-optic implementation will greatly reduce the complexity of optical alignment in timing distribution systems and improve the overall mechanical and timing stability of the system. PMID:26798814
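
    The quoted integrated jitter figure comes from integrating a timing-jitter spectral density over a frequency band. A minimal numerical sketch of that computation, using a purely hypothetical flat noise floor rather than the measured spectrum:

```python
import math

def rms_jitter(psd, f_lo, f_hi, n=4000):
    """Trapezoidal integration of a one-sided timing-jitter PSD S(f),
    in s^2/Hz, over [f_lo, f_hi] on a log-spaced grid; returns RMS in s."""
    fs = [f_lo * (f_hi / f_lo) ** (i / n) for i in range(n + 1)]
    area = sum(0.5 * (psd(a) + psd(b)) * (b - a) for a, b in zip(fs, fs[1:]))
    return math.sqrt(area)

# Hypothetical flat detection-noise floor of 1e-42 s^2/Hz from 10 kHz to 1 MHz:
jitter_s = rms_jitter(lambda f: 1e-42, 1e4, 1e6)
```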

  5. Early Benchmarks of Product Generation Capabilities of the GOES-R Ground System for Operational Weather Prediction

    NASA Astrophysics Data System (ADS)

    Kalluri, S. N.; Haman, B.; Vititoe, D.

    2014-12-01

    The ground system under development for the Geostationary Operational Environmental Satellite-R (GOES-R) series of weather satellites has completed a key milestone in implementing the science algorithms that process raw sensor data into higher-level products in preparation for launch. Real-time observations from GOES-R are expected to make significant contributions to Earth and space weather prediction, and there are stringent requirements to produce weather products at very low latency to meet NOAA's operational needs. Simulated test data from all six GOES-R sensors are being processed by the system to test and verify the performance of the fielded system. Early results show that system development is on track to meet the functional and performance requirements for processing science data. Comparison of science products generated by the ground system from simulated data with those generated by the algorithm developers shows close agreement among the data sets, demonstrating that the algorithms are implemented correctly. Successful delivery of products from the core system to AWIPS and the Product Distribution and Access (PDA) system demonstrates that the external interfaces are working.

  6. Electric Water Heater Modeling and Control Strategies for Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diao, Ruisheng; Lu, Shuai; Elizondo, Marcelo A.

    2012-07-22

    Demand response (DR) has great potential to provide balancing services under normal operating conditions and emergency support when a power system is subject to disturbances. Effective control strategies can significantly relieve the balancing burden on conventional generators and reduce investment in generation and transmission expansion. This paper models electric water heaters (EWH) in households and tests their response to control strategies implementing DR. The open-loop response of EWHs to a centralized signal is studied by adjusting temperature settings to provide regulation services, and two types of decentralized controllers are tested to provide frequency support following generator trips. The EWH models are included in a DIgSILENT simulation platform for electromechanical simulation, covering 147 households on a distribution feeder. Simulation results show the dependence of EWH response on water heater usage and suggest the control strategies needed to achieve better performance in demand response implementation. Index terms: centralized control, decentralized control, demand response, electric water heater, smart grid.
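
    A bang-bang thermostat model is the usual starting point for EWH demand-response studies. The sketch below uses illustrative tank parameters (not the paper's model) and shows the basic DR lever: lowering the temperature setpoint reduces energy drawn over the day.

```python
def simulate_ewh(steps, dt_h, setpoint_c, deadband_c=2.0, t_amb_c=20.0,
                 ua_kw_per_c=0.01, p_heat_kw=4.5, c_kwh_per_c=0.25, t0_c=55.0):
    """Bang-bang (thermostat) electric water heater model.
    Returns final tank temperature [C] and energy used [kWh].
    All parameter values are illustrative, not taken from the paper."""
    t, on, energy = t0_c, False, 0.0
    for _ in range(steps):
        if t <= setpoint_c - deadband_c:
            on = True                      # too cold: element switches on
        elif t >= setpoint_c:
            on = False                     # at setpoint: element switches off
        q_kw = (p_heat_kw if on else 0.0) - ua_kw_per_c * (t - t_amb_c)
        t += q_kw * dt_h / c_kwh_per_c     # lumped thermal-mass update
        energy += (p_heat_kw if on else 0.0) * dt_h
    return t, energy

# A DR event that lowers the setpoint reduces the energy drawn over 24 h.
t_base, e_base = simulate_ewh(steps=240, dt_h=0.1, setpoint_c=55.0)
t_dr, e_dr = simulate_ewh(steps=240, dt_h=0.1, setpoint_c=50.0)
```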

  7. Monte Carlo generator ELRADGEN 2.0 for simulation of radiative events in elastic ep-scattering of polarized particles

    NASA Astrophysics Data System (ADS)

    Akushevich, I.; Filoti, O. F.; Ilyichev, A.; Shumeiko, N.

    2012-07-01

    The structure and algorithms of the Monte Carlo generator ELRADGEN 2.0, designed to simulate radiative events in polarized ep-scattering, are presented. The full set of analytical expressions for the QED radiative corrections is given and discussed in detail. Algorithmic improvements implemented to provide faster simulation of hard real-photon events are described. Numerical tests show high-quality generation of photonic variables and of the radiatively corrected cross section. A comparison of the elastic radiative tail simulated within the kinematic conditions of the BLAST experiment at MIT-Bates shows good agreement with experimental data. Catalogue identifier: AELO_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELO_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1299. No. of bytes in distributed program, including test data, etc.: 11348. Distribution format: tar.gz. Programming language: FORTRAN 77. Computer: all. Operating system: any. RAM: 1 MB. Classification: 11.2, 11.4. Nature of problem: simulation of radiative events in polarized ep-scattering. Solution method: Monte Carlo simulation according to the distributions of the real-photon kinematic variables, calculated by the covariant method of QED radiative-correction estimation; the approach provides fast and accurate generation. Running time: the simulation of 10^8 radiative events for itest=1 takes up to 52 seconds on a Pentium(R) Dual-Core 2.00 GHz processor.
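
    Generation "according to the distributions of the kinematic variables" is typically done by tabulating the cumulative distribution and inverting it. A generic sketch of that technique; the 1/x spectrum is only an illustrative stand-in, not an ELRADGEN formula:

```python
import bisect
import random

def sample_from_pdf(pdf, lo, hi, n, bins=2000, seed=1):
    """Tabulate the CDF of an (unnormalized) pdf on a grid, then draw
    samples by inverting it with binary search + linear interpolation."""
    xs = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    cdf = [0.0]
    for a, b in zip(xs, xs[1:]):
        cdf.append(cdf[-1] + 0.5 * (pdf(a) + pdf(b)) * (b - a))
    cdf = [c / cdf[-1] for c in cdf]       # normalize to [0, 1]
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u = rng.random()
        j = max(1, bisect.bisect_left(cdf, u))
        frac = (u - cdf[j - 1]) / (cdf[j] - cdf[j - 1])
        out.append(xs[j - 1] + frac * (xs[j] - xs[j - 1]))
    return out

# Illustrative 1/x (bremsstrahlung-like) photon energy spectrum on [0.01, 1]:
samples = sample_from_pdf(lambda x: 1.0 / x, 0.01, 1.0, 10000)
```

    Inverse-CDF sampling needs only one uniform random number per event, which is why distribution-based generators of this kind are fast.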

  8. Towards Implementation of Green Technology in Sabah Construction Industry

    NASA Astrophysics Data System (ADS)

    Azland Jainudin, Noor; Jugah, Ivy; Nasrizal Awang Ali, Awang; Tawie, Rudy

    2017-12-01

    The construction industry in Sabah plays a major role in the development of social and economic infrastructure and buildings, generating wealth for the state alongside the tourism sector. With the increasing number of construction projects, particularly in the rapidly developing city of Kota Kinabalu, green technology as a whole is becoming more significant, as it helps develop effective solutions to global environmental issues. The objective of this research is to identify the awareness and implementation of green technology in the construction industry in Kota Kinabalu, Sabah. The methodology consisted of distributing a questionnaire to contractors, developers, consultants, architects, and state government agencies in the Kota Kinabalu area only; the questionnaires were then analysed to find mean values. Of the 100 questionnaires distributed, 85 were collected and analysed. Based on the findings, 83.5% of organisations were aware of the concept of green technology in construction projects, but only 64.7% had implemented it in their organisations. More than 50% of the major players, including contractors, consultants, developers, architects, and state government agencies, were aware of the six green technology concepts examined. In conclusion, awareness of the green policy concept in the construction industry is satisfactory, while the number of organisations implementing green technology in construction needs to increase.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrisochoides, N.; Sukup, F.

    In this paper we present a parallel implementation of the Bowyer-Watson (BW) algorithm using the task-parallel programming model. The BW algorithm constitutes an ideal mesh refinement strategy for implementing a large class of unstructured mesh generation techniques on both sequential and parallel computers, because it avoids the need for global mesh refinement. Its implementation on distributed-memory multicomputers using the traditional data-parallel model has proven very inefficient due to the excessive synchronization needed among processors. In this paper we demonstrate that with the task-parallel model we can tolerate the synchronization costs inherent in data-parallel methods by exploiting concurrency at the processor level. Our preliminary performance data indicate that the task-parallel approach: (i) is almost four times faster than existing data-parallel methods, (ii) scales linearly, and (iii) introduces minimal overhead compared to the "best" sequential implementation of the BW algorithm.
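
    The sequential core on which the parallel versions build can be sketched in a few dozen lines: for each inserted point, delete the triangles whose circumcircles contain it and re-triangulate the resulting cavity. This is a generic textbook sketch of Bowyer-Watson, not the authors' code:

```python
def _ccw(tri, pts):
    # reorder triangle vertices counter-clockwise
    a, b, c = tri
    (ax, ay), (bx, by), (cx, cy) = pts[a], pts[b], pts[c]
    return tri if (bx - ax) * (cy - ay) - (by - ay) * (cx - ax) > 0 else (a, c, b)

def _in_circumcircle(tri, p, pts):
    # point-in-circumcircle test for a counter-clockwise triangle
    (ax, ay), (bx, by), (cx, cy) = (pts[i] for i in tri)
    px, py = p
    ax -= px; ay -= py; bx -= px; by -= py; cx -= px; cy -= py
    return ((ax * ax + ay * ay) * (bx * cy - cx * by)
            - (bx * bx + by * by) * (ax * cy - cx * ay)
            + (cx * cx + cy * cy) * (ax * by - bx * ay)) > 0

def bowyer_watson(points):
    pts = list(points)
    n = len(pts)
    m = 10 * max(max(abs(x), abs(y)) for x, y in pts) + 1
    pts += [(-3 * m, -m), (3 * m, -m), (0, 3 * m)]        # super-triangle
    tris = [_ccw((n, n + 1, n + 2), pts)]
    for i in range(n):
        # cavity = triangles whose circumcircle contains the new point
        bad = [t for t in tris if _in_circumcircle(t, pts[i], pts)]
        edges = {}
        for t in bad:
            for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
                k = tuple(sorted(e))
                edges[k] = edges.get(k, 0) + 1
        tris = [t for t in tris if t not in bad]
        # re-triangulate: connect each boundary edge (count == 1) to the point
        tris += [_ccw((a, b, i), pts) for (a, b), c in edges.items() if c == 1]
    return [t for t in tris if max(t) < n]                # drop super-triangle

triangles = bowyer_watson([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.5, 0.5)])
```

    The parallel difficulty discussed in the paper arises exactly here: concurrent insertions whose cavities overlap must be synchronized.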

  10. Optimized and parallelized implementation of the electronegativity equalization method and the atom-bond electronegativity equalization method.

    PubMed

    Vareková, R Svobodová; Koca, J

    2006-02-01

    The most common way to calculate charge distribution in a molecule is ab initio quantum mechanics (QM). Some faster alternatives to QM have also been developed, the so-called "equalization methods" EEM and ABEEM, which are based on DFT. We have implemented and optimized the EEM and ABEEM methods and created the EEM SOLVER and ABEEM SOLVER programs. It has been found that the most time-consuming part of equalization methods is the reduction of the matrix belonging to the equation system generated by the method. Therefore, for both methods this part was replaced by the parallel algorithm WIRS and implemented within the PVM environment. The parallelized versions of the programs EEM SOLVER and ABEEM SOLVER showed promising results, especially on a single computer with several processors (compact PVM). The implemented programs are available through the Web page http://ncbr.chemi.muni.cz/~n19n/eem_abeem.
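
    The "equation system generated by the method" is, for EEM, a dense linear system: every atom's electronegativity is equalized to a common value, plus one charge-conservation constraint. A sketch with placeholder parameters (the kappa, A, and B values are illustrative, not fitted EEM parameters):

```python
import math

def solve(M, b):
    # Gaussian elimination with partial pivoting, for small dense systems
    n = len(b)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def eem_charges(coords, A, B, kappa=0.44, total_charge=0.0):
    """EEM system: A_i + B_i*q_i + kappa*sum_j q_j/r_ij = chi_bar for each
    atom, plus sum_i q_i = Q; unknowns are the charges q and chi_bar."""
    n = len(coords)
    M = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] * (n + 1)
    for i in range(n):
        M[i][i] = B[i]
        for j in range(n):
            if j != i:
                M[i][j] = kappa / math.dist(coords[i], coords[j])
        M[i][n] = -1.0            # column for the unknown chi_bar
        rhs[i] = -A[i]
        M[n][i] = 1.0             # charge-conservation row
    rhs[n] = total_charge
    sol = solve(M, rhs)
    return sol[:n], sol[n]

# Toy diatomic "molecule": the atom with the larger A (more electronegative)
# ends up with the negative partial charge.
q, chi_bar = eem_charges([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], A=[2.0, 3.0], B=[9.0, 8.0])
```

    The paper's observation that matrix reduction dominates the run time corresponds to the `solve` step, which is O(n^3) in the atom count; this is the part the authors parallelized.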

  11. Quantum key distribution session with 16-dimensional photonic states.

    PubMed

    Etcheverry, S; Cañas, G; Gómez, E S; Nogueira, W A T; Saavedra, C; Xavier, G B; Lima, G

    2013-01-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, which can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations have been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.
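
    The sifting step of (generalized) BB84 can be illustrated with a toy simulation in which the two mutually unbiased bases are modeled abstractly: matching bases reproduce Alice's D-ary symbol, mismatched bases yield a uniformly random outcome. This is a didactic sketch, not the experimental setup:

```python
import random

def bb84_sift(n_symbols, dim=16, seed=7):
    """Toy D-dimensional BB84 sifting with two abstract bases.
    Returns Alice's and Bob's sifted (matching-basis) symbol strings."""
    rng = random.Random(seed)
    alice_key, bob_key = [], []
    for _ in range(n_symbols):
        symbol = rng.randrange(dim)        # Alice's D-ary symbol
        a_basis = rng.randrange(2)
        b_basis = rng.randrange(2)
        # mutually unbiased bases: a wrong-basis measurement is uniform
        outcome = symbol if a_basis == b_basis else rng.randrange(dim)
        if a_basis == b_basis:             # sifting keeps matching-basis rounds
            alice_key.append(symbol)
            bob_key.append(outcome)
    return alice_key, bob_key

alice, bob = bb84_sift(2000, dim=16)
# each kept symbol carries log2(16) = 4 bits; about half the rounds survive
```

    The dimensional advantage is visible here: one sifted 16-ary symbol carries as much raw key as four qubit-based BB84 rounds.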

  12. The application of connectionism to query planning/scheduling in intelligent user interfaces

    NASA Technical Reports Server (NTRS)

    Short, Nicholas, Jr.; Shastri, Lokendra

    1990-01-01

    In the mid-nineties, the Earth Observing System (EOS) will generate an estimated 10 terabytes of data per day. This enormous amount of data will require sophisticated technologies from real-time distributed Artificial Intelligence (AI) and data management. Setting aside the broader problems of distributed AI, efficient models were developed for query planning and/or scheduling in intelligent user interfaces that reside in a network environment. Before intelligent query planning can be done, a model for real-time AI planning and/or scheduling must be developed. As Connectionist Models (CM) have shown promise in improving run times, a connectionist approach to AI planning and/or scheduling is proposed. The solution merges a CM rule-based system with a general spreading-activation model for the generation and selection of plans. The system was implemented in the Rochester Connectionist Simulator and runs on a Sun 3/260.
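
    A spreading-activation pass can be sketched as repeated propagation of decayed activation along network links, with the most activated candidate selected as the plan. The network, decay factor, and node names below are invented for illustration, not taken from the paper:

```python
def spread_activation(adj, seeds, decay=0.5, steps=3):
    """Toy spreading-activation pass: each step, every node passes a decayed,
    fan-out-normalized share of its activation to its neighbours."""
    act = {node: 0.0 for node in adj}
    for s in seeds:
        act[s] = 1.0
    for _ in range(steps):
        new = dict(act)
        for node, nbrs in adj.items():
            for m in nbrs:
                new[m] += decay * act[node] / max(len(nbrs), 1)
        act = new
    return act

# A query node activates candidate plans, which in turn activate data sources;
# the most strongly activated plan or source wins.
network = {
    "query": ["plan_a", "plan_b"],
    "plan_a": ["db1"],
    "plan_b": ["db1", "db2"],
    "db1": [],
    "db2": [],
}
activation = spread_activation(network, ["query"])
```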

  13. Data-driven modeling of solar-powered urban microgrids

    PubMed Central

    Halu, Arda; Scala, Antonio; Khiyami, Abdulaziz; González, Marta C.

    2016-01-01

    Distributed generation takes center stage in today’s rapidly changing energy landscape. Particularly, locally matching demand and generation in the form of microgrids is becoming a promising alternative to the central distribution paradigm. Infrastructure networks have long been a major focus of complex networks research with their spatial considerations. We present a systemic study of solar-powered microgrids in the urban context, obeying real hourly consumption patterns and spatial constraints of the city. We propose a microgrid model and study its citywide implementation, identifying the self-sufficiency and temporal properties of microgrids. Using a simple optimization scheme, we find microgrid configurations that result in increased resilience under cost constraints. We characterize load-related failures solving power flows in the networks, and we show the robustness behavior of urban microgrids with respect to optimization using percolation methods. Our findings hint at the existence of an optimal balance between cost and robustness in urban microgrids. PMID:26824071

  14. Connecting Scientists, College Students, Middle School Students & Elementary Students through Intergenerational Afterschool STEM Programming

    NASA Astrophysics Data System (ADS)

    Ali, N. A.; Paglierani, R.; Raftery, C. L.; Romero, V.; Harper, M. R.; Chilcott, C.; Peticolas, L. M.; Hauck, K.; Yan, D.; Ruderman, I.; Frappier, R.

    2015-12-01

    The Multiverse education group at UC Berkeley's Space Sciences Lab created the NASA-funded "Five Stars Pathway" model in which five "generations" of girls and women engage in science together in an afterschool setting, with each generation representing one stage in the pathway of pursuing a career in science, technology, engineering, or math (STEM). The five stages are: elementary-age students, middle-school-age students, undergraduate-level college students, graduate-level college students and professional scientists. This model was field-tested at two Girls Inc. afterschool locations in the San Francisco Bay Area and distributed to Girls Inc. affiliates and other afterschool program coordinators nationwide. This presentation will explore some of the challenges and success of implementing a multigenerational STEM model as well as distributing the free curriculum for interested scientists and college students to use with afterschool programs.

  15. Quantum key distribution session with 16-dimensional photonic states

    NASA Astrophysics Data System (ADS)

    Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.

    2013-07-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, which can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations have been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.

  16. Data-driven modeling of solar-powered urban microgrids.

    PubMed

    Halu, Arda; Scala, Antonio; Khiyami, Abdulaziz; González, Marta C

    2016-01-01

    Distributed generation takes center stage in today's rapidly changing energy landscape. Particularly, locally matching demand and generation in the form of microgrids is becoming a promising alternative to the central distribution paradigm. Infrastructure networks have long been a major focus of complex networks research with their spatial considerations. We present a systemic study of solar-powered microgrids in the urban context, obeying real hourly consumption patterns and spatial constraints of the city. We propose a microgrid model and study its citywide implementation, identifying the self-sufficiency and temporal properties of microgrids. Using a simple optimization scheme, we find microgrid configurations that result in increased resilience under cost constraints. We characterize load-related failures solving power flows in the networks, and we show the robustness behavior of urban microgrids with respect to optimization using percolation methods. Our findings hint at the existence of an optimal balance between cost and robustness in urban microgrids.

  17. Disaster management and mitigation: the telecommunications infrastructure.

    PubMed

    Patricelli, Frédéric; Beakley, James E; Carnevale, Angelo; Tarabochia, Marcello; von Lubitz, Dag K J E

    2009-03-01

    Among the most typical consequences of disasters is the near or complete collapse of terrestrial telecommunications infrastructures (especially the distribution network, the 'last mile') and their concomitant unavailability to rescuers and the higher echelons of mitigation teams. Even when such damage does not take place, the communications overload and congestion resulting from the significantly elevated traffic generated by affected residents can be highly disruptive. The paper proposes innovative remedies to the telecommunications difficulties in disaster-struck regions. The offered solutions are network-centric operations-capable and can be employed in the management of disasters of any magnitude (local to national or international). Their implementation provides ground rescue teams (such as law enforcement, firefighters, healthcare personnel, and civilian authorities) with tactical connectivity among themselves and, through the Next Generation Network backbone, ensures the essential bidirectional free flow of information and distribution of Actionable Knowledge among ground units, command/control centres, and the civilian and military agencies participating in the rescue effort.

  18. Entangled quantum key distribution over two free-space optical links.

    PubMed

    Erven, C; Couteau, C; Laflamme, R; Weihs, G

    2008-10-13

    We report on the first real-time implementation of a quantum key distribution (QKD) system using entangled photon pairs that are sent over two free-space optical telescope links. The entangled photon pairs are produced with a type-II spontaneous parametric down-conversion source placed in a central, potentially untrusted, location. The two free-space links cover a distance of 435 m and 1,325 m respectively, producing a total separation of 1,575 m. The system relies on passive polarization analysis units, GPS timing receivers for synchronization, and custom written software to perform the complete QKD protocol including error correction and privacy amplification. Over 6.5 hours during the night, we observed an average raw key generation rate of 565 bits/s, an average quantum bit error rate (QBER) of 4.92%, and an average secure key generation rate of 85 bits/s.

  19. Quantum key distribution session with 16-dimensional photonic states

    PubMed Central

    Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.

    2013-01-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations have been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD. PMID:23897033

  20. The NGEE Arctic Data Archive -- Portal for Archiving and Distributing Data and Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boden, Thomas A; Palanisamy, Giri; Devarakonda, Ranjeet

    2014-01-01

    The Next-Generation Ecosystem Experiments (NGEE Arctic) project is committed to implementing a rigorous and high-quality data management program. The goal is to implement innovative and cost-effective guidelines and tools for collecting, archiving, and sharing data within the project, the larger scientific community, and the public. The NGEE Arctic web site is the framework for implementing these data management and data sharing tools. The open sharing of NGEE Arctic data among project researchers, the broader scientific community, and the public is critical to meeting the scientific goals and objectives of the NGEE Arctic project and to advancing the mission of the Department of Energy (DOE), Office of Science, Biological and Environmental Research (BER) Terrestrial Ecosystem Science (TES) program.

  1. Electrical/electronics working group summary

    NASA Technical Reports Server (NTRS)

    Schoenfeld, A. D.

    1984-01-01

    The electrical/electronics technology area was considered. It was found that there are no foreseeable circuit or component problems to hinder the implementation of the flywheel energy storage concept. The definition of the major component or technology developments required to permit a technology ready date of 1987 was addressed. Recommendations cover motor/generators, suspension electronics, power transfer, power conditioning and distribution, and modeling. An introduction to the area of system engineering is also included.

  2. Implementation of the Next Generation Attenuation (NGA) ground-motion prediction equations in Fortran and R

    USGS Publications Warehouse

    Kaklamanos, James; Boore, David M.; Thompson, Eric M.; Campbell, Kenneth W.

    2010-01-01

    Although these programs have been used by the U.S. Geological Survey (USGS), Tufts University, and others, no warranty, expressed or implied, is made by Tufts or the USGS as to the accuracy or functioning of the programs and related material, nor shall the fact of distribution constitute any such warranty, and no responsibility is assumed by Tufts or the USGS in connection therewith.

  3. Algebraic grid generation with corner singularities

    NASA Technical Reports Server (NTRS)

    Vinokur, M.; Lombard, C. K.

    1983-01-01

    A simple noniterative algebraic procedure is presented for generating smooth computational meshes on a quadrilateral topology. Coordinate distribution and normal derivative are provided on all boundaries, one of which may include a slope discontinuity. The boundary conditions are sufficient to guarantee continuity of global meshes formed of joined patches generated by the procedure. The method extends to 3-D. The procedure involves a synthesis of prior techniques - stretching functions, cubic blending functions, and transfinite interpolation - to which is added the functional form of the corner solution. The procedure introduces the concept of generalized blending, which is implemented as an automatic scaling of the boundary derivatives for effective interpolation. Some implications of the treatment at boundaries for techniques solving elliptic PDE's are discussed in an Appendix.
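
    The transfinite interpolation at the core of this synthesis can be sketched as a bilinearly blended Coons patch. This is a minimal, position-only illustration on the unit square; the paper's procedure additionally blends boundary normal derivatives, stretching functions, and the corner-singularity term, and the function name here is illustrative.

```python
import numpy as np

def coons_patch(bottom, top, left, right):
    """Bilinearly blended transfinite interpolation (Coons patch).
    bottom/top: boundary curves of shape (ni, 2); left/right: (nj, 2).
    The four curves must share corner points. Returns an (ni, nj, 2) mesh."""
    ni, nj = bottom.shape[0], left.shape[0]
    u = np.linspace(0.0, 1.0, ni)[:, None, None]   # parameter along bottom/top
    v = np.linspace(0.0, 1.0, nj)[None, :, None]   # parameter along left/right
    # two ruled surfaces minus the bilinear correction built from the corners
    grid = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
               + (1 - u) * v * top[0] + u * v * top[-1]))
    return grid
```

    On a square with straight edges this reproduces a uniform Cartesian mesh; patches joined edge-to-edge remain continuous because the interior depends only on the shared boundary data, consistent with the continuity property stated in the abstract.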

  4. An Incentive-based Online Optimization Framework for Distribution Grids

    DOE PAGES

    Zhou, Xinyang; Dall'Anese, Emiliano; Chen, Lijun; ...

    2017-10-09

    This article formulates a time-varying social-welfare maximization problem for distribution grids with distributed energy resources (DERs) and develops online distributed algorithms to identify (and track) its solutions. In the considered setting, network operator and DER-owners pursue given operational and economic objectives, while concurrently ensuring that voltages are within prescribed limits. The proposed algorithm affords an online implementation to enable tracking of the solutions in the presence of time-varying operational conditions and changing optimization objectives. It involves a strategy where the network operator collects voltage measurements throughout the feeder to build incentive signals for the DER-owners in real time; DERs then adjust the generated/consumed powers in order to avoid the violation of the voltage constraints while maximizing given objectives. Stability of the proposed schemes is analytically established and numerically corroborated.

  5. An Incentive-based Online Optimization Framework for Distribution Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Xinyang; Dall'Anese, Emiliano; Chen, Lijun

    This article formulates a time-varying social-welfare maximization problem for distribution grids with distributed energy resources (DERs) and develops online distributed algorithms to identify (and track) its solutions. In the considered setting, network operator and DER-owners pursue given operational and economic objectives, while concurrently ensuring that voltages are within prescribed limits. The proposed algorithm affords an online implementation to enable tracking of the solutions in the presence of time-varying operational conditions and changing optimization objectives. It involves a strategy where the network operator collects voltage measurements throughout the feeder to build incentive signals for the DER-owners in real time; DERs then adjust the generated/consumed powers in order to avoid the violation of the voltage constraints while maximizing given objectives. Stability of the proposed schemes is analytically established and numerically corroborated.

  6. Grid generation methodology and CFD simulations in sliding vane compressors and expanders

    NASA Astrophysics Data System (ADS)

    Bianchi, Giuseppe; Rane, Sham; Kovacevic, Ahmed; Cipollone, Roberto; Murgia, Stefano; Contaldi, Giulio

    2017-08-01

    The limiting factor for the employment of advanced 3D CFD tools in the analysis and design of rotary vane machines is the unavailability of methods for generating computational grids suitable for fast and reliable numerical analysis. The paper addresses this challenge by presenting the development of an analytical grid generation method for vane machines based on user-defined nodal displacement. In particular, mesh boundaries are defined as parametric curves generated using trigonometrical modelling of the axial cross section of the machine, while the distribution of computational nodes is performed using algebraic algorithms with transfinite interpolation, post-orthogonalisation and smoothing. Algebraic control functions are introduced for the distribution of nodes on the rotor and casing boundaries in order to achieve good grid quality in terms of cell size and expansion. In this way, the moving and deforming fluid domain of the sliding vane machine is discretized, and the conservation of intrinsic quantities is ensured by maintaining the cell connectivity and structure. For validation of the generated grids, a mid-size air compressor and a small-scale expander for Organic Rankine Cycle applications have been investigated in this paper. Remarks on the implementation of the mesh motion algorithm, on the stability and robustness experienced with the ANSYS CFX solver, and on the obtained flow results are presented.

  7. Simulation of Ectopic Pacemakers in the Heart: Multiple Ectopic Beats Generated by Reentry inside Fibrotic Regions

    PubMed Central

    Gouvêa de Barros, Bruno; Weber dos Santos, Rodrigo; Alonso, Sergio

    2015-01-01

    The inclusion of nonconducting media, mimicking cardiac fibrosis, in two models of cardiac tissue produces the formation of ectopic beats. The fraction of nonconducting media in comparison with the fraction of healthy myocytes, together with the topological distribution of cells, determines the probability of ectopic beat generation. First, a detailed subcellular microscopic model that accounts for the microstructure of the cardiac tissue is constructed and employed for the numerical simulation of action potential propagation. Next, an equivalent discrete model is implemented, which permits a faster integration of the equations. This discrete model is a simplified version of the microscopic model that maintains the distribution of connections between cells. Both models produce similar results when describing action potential propagation in homogeneous tissue; however, they slightly differ in the generation of ectopic beats in heterogeneous tissue. Nevertheless, both models present the generation of reentry inside fibrotic tissues. This kind of reentry restricted to microfibrosis regions can result in the formation of ectopic pacemakers, that is, regions that will generate a series of ectopic stimuli at a fast pacing rate. In turn, such activity has been linked to the triggering of fibrillation in the atria and in the ventricles in clinical and animal studies. PMID:26583127

  8. Estimation of synthetic flood design hydrographs using a distributed rainfall-runoff model coupled with a copula-based single storm rainfall generator

    NASA Astrophysics Data System (ADS)

    Candela, A.; Brigandì, G.; Aronica, G. T.

    2014-07-01

    In this paper a procedure to derive synthetic flood design hydrographs (SFDH) is presented. It couples a distributed rainfall-runoff model with a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describe and model the correlation between two variables independently of the marginal laws involved. Rainfall-runoff (R-R) modelling for estimating the hydrological response at the outlet of a catchment was performed using a conceptual, fully distributed procedure based on the Soil Conservation Service Curve Number method as the excess-rainfall model and on a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the distributed unit hydrograph definition, was performed by implementing a procedure based on flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the primary return period of the SFDH, which gives the probability of occurrence of a flood hydrograph, the peaks and flow volumes obtained through R-R modelling were treated statistically using copulas. Finally, the shapes of the hydrographs were generated on the basis of historically significant flood events via cluster analysis. An application of the described procedure is presented for the case study of the Imera catchment in Sicily, Italy.
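
    The excess-rainfall step named above can be illustrated with the textbook SCS Curve Number equations (a generic sketch, not the paper's distributed implementation; the function name and the 0.2 initial-abstraction ratio are the conventional defaults).

```python
def scs_cn_runoff(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
    """SCS-CN excess rainfall (runoff depth in mm) for a storm depth p_mm.
    S is the potential maximum retention; Ia = ia_ratio * S is the initial
    abstraction. Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0."""
    s = 25400.0 / cn - 254.0          # retention S in mm
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# e.g. a 60 mm storm on a soil/land-use class with CN = 80
q = scs_cn_runoff(60.0, 80.0)   # about 20.2 mm of excess rainfall
```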

  9. Geochemical baseline distribution of harmful elements in the surface soils of Campania region.

    NASA Astrophysics Data System (ADS)

    Albanese, Stefano; Lima, Annamaria; Qu, Chengkai; Cicchella, Domenico; Buccianti, Antonella; De Vivo, Benedetto

    2015-04-01

    Environmental geochemical mapping has assumed an increasing relevance, and the separation of values to discriminate between anthropogenic pollution and natural (geogenic) sources has become crucial to addressing environmental problems affecting the quality of life of human beings. In the last decade, a number of geochemical prospecting projects, mostly focused on surface soils (topsoils), were carried out at different scales (from regional to local) across the whole Campania region (Italy) to characterize the distribution of both harmful elements and persistent organic pollutants (POPs) in the environment and to generate a valuable database to serve as a reference in developing geomedical studies. During 2014, a database reporting the distribution of 53 chemical elements in 3536 topsoil samples, collected across the whole region, was completed. The geochemical data, after the necessary quality controls, were georeferenced and processed in a geochemistry-dedicated GIS software named GEODAS. For each considered element a complete set of maps was generated to depict both the discrete and the spatially continuous (interpolated) distribution of elemental concentrations across the region. The interpolated maps were generated using the Multifractal Inverse Distance Weighted (MIDW) algorithm. Subsequently, the S-A method, also implemented in GEODAS, was applied to the MIDW maps to eliminate spatially limited anomalies from the original grid and to generate the distribution patterns of geochemical baselines for each element. For a selected group of elements the geochemical data were also treated by means of a Compositional Data Analysis (CoDA) aimed at investigating the regionalised structure of the data by considering the joint behaviour of the several elements constituting, for each sample, its whole composition. A regional environmental risk assessment was run on the basis of the regional distribution of heavy metals in soil, land use types and population. The risk assessment produced a ranking of priorities and located areas of the regional territory where human health risk is most relevant and follow-up activities are required.
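
    The interpolation step underlying the MIDW maps can be illustrated with plain inverse distance weighting (standard IDW only; the multifractal modification and the subsequent S-A filtering are not reproduced, and the function name is illustrative).

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Standard inverse distance weighted interpolation.
    xy_known: (n, 2) sample coordinates; values: (n,) concentrations;
    xy_query: (m, 2) grid points. Returns the (m,) interpolated values."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, eps) ** power      # eps keeps exact hits finite
    w /= w.sum(axis=1, keepdims=True)          # normalize weights per query point
    return w @ values
```

    A query at a sampled location reproduces the sampled value, and a query midway between two samples returns their average, which is the behaviour expected of any distance-weighted baseline map.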

  10. Magnetoacoustic microscopic imaging of conductive objects and nanoparticles distribution

    NASA Astrophysics Data System (ADS)

    Liu, Siyu; Zhang, Ruochong; Luo, Yunqi; Zheng, Yuanjin

    2017-09-01

    Magnetoacoustic tomography has been demonstrated as a powerful and low-cost multi-wave imaging modality. However, due to the limited spatial resolution and detection efficiency of the magnetoacoustic signal, the full potential of magnetoacoustic imaging remains untapped. Here we report a high-resolution magnetoacoustic microscopy method, where magnetic stimulation is provided by a compact solenoid resonance coil connected with a matching network, and acoustic reception is realized by using a high-frequency focused ultrasound transducer. Scanning the magnetoacoustic microscopy system perpendicularly to the acoustic axis of the focused transducer generates a two-dimensional microscopic image with acoustically determined lateral resolution. It is analyzed theoretically and demonstrated experimentally that magnetoacoustic generation in this microscopic system depends on the conductivity profile of conductive objects and the localized distribution of superparamagnetic iron magnetic nanoparticles, based on two different but related implementations. The lateral resolution is characterized. The directional nature of magnetoacoustic vibration and the imaging sensitivity for mapping magnetic nanoparticles are also discussed. The proposed microscopy system offers a high-resolution method that could potentially map intrinsic conductivity distribution in biological tissue and extraneous magnetic nanoparticles.

  11. Efficiency degradation due to tracking errors for point focusing solar collectors

    NASA Technical Reports Server (NTRS)

    Hughes, R. O.

    1978-01-01

    An important parameter in the design of point focusing solar collectors is the intercept factor, which is a measure of efficiency and of the energy available for use in the receiver. Using statistical methods, an expression for the expected value of the intercept factor is derived for various configurations and control law implementations. The analysis assumes that a radially symmetric flux distribution (not necessarily Gaussian) is generated at the focal plane due to the sun's finite image and various reflector errors. The time-varying tracking errors are assumed to be uniformly distributed within the threshold limits, which allows calculation of the expected value.
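
    The expected intercept factor under uniformly distributed tracking errors can also be estimated numerically. The sketch below is a Monte Carlo analogue of the statistical derivation, assuming for concreteness a Gaussian flux profile (the paper's analysis admits any radially symmetric profile) and a circular receiver; all names and parameter choices are illustrative.

```python
import random

def intercept_factor(offset, sigma, radius, n=50_000, seed=1):
    """Monte Carlo estimate of the intercept factor: the fraction of a
    radially symmetric Gaussian flux (std sigma per axis) landing inside
    a circular receiver of the given radius when the flux centroid is
    displaced by `offset` due to a tracking error."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(offset, sigma)
        y = rng.gauss(0.0, sigma)
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n

def expected_intercept(threshold, sigma, radius, n_err=21, n=20_000):
    """Expected intercept factor for a tracking error uniformly
    distributed on [-threshold, threshold] (simple quadrature over it)."""
    offsets = [-threshold + 2.0 * threshold * i / (n_err - 1) for i in range(n_err)]
    return sum(intercept_factor(d, sigma, radius, n=n, seed=i)
               for i, d in enumerate(offsets)) / n_err
```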

  12. Mathematical model for the analysis of structure and optimal operational parameters of a solid oxide fuel cell generator

    NASA Astrophysics Data System (ADS)

    Coralli, Alberto; Villela de Miranda, Hugo; Espiúca Monteiro, Carlos Felipe; Resende da Silva, José Francisco; Valadão de Miranda, Paulo Emílio

    2014-12-01

    Solid oxide fuel cells are globally recognized as a very promising technology for highly efficient electricity generation with a low environmental impact. This technology can be advantageously implemented in many situations in Brazil and is well suited to the use of ethanol as a primary energy source, an important feature given the highly developed Brazilian ethanol industry. In this perspective, a simplified mathematical model of a fuel cell and its balance of plant is developed in order to identify the optimal system structure and the most convenient values of the operational parameters, with the aim of maximizing the global electric efficiency. In this way the best operational configuration for the desired application is identified, namely distributed generation in the concession area of the electricity distribution company Elektro. The data regarding this configuration are required for the continuation of the research project, i.e., the development of a prototype, a cost analysis of the developed system and a detailed assessment of the market opportunities in Brazil.

  13. The impact of runoff generation mechanisms on the location of critical source areas

    USGS Publications Warehouse

    Lyon, S.W.; McHale, M.R.; Walter, M.T.; Steenhuis, T.S.

    2006-01-01

    Identifying phosphorus (P) source areas and transport pathways is a key step in decreasing P loading to natural water systems. This study compared the effects of two modeled runoff generation processes - saturation excess and infiltration excess - on total phosphorus (TP) and soluble reactive phosphorus (SRP) concentrations in 10 catchment streams of a Catskill mountain watershed in southeastern New York. The spatial distribution of runoff from forested land and agricultural land was generated for both runoff processes; results of both distributions were consistent with Soil Conservation Service-Curve Number (SCS-CN) theory. These spatial runoff distributions were then used to simulate stream concentrations of TP and SRP through a simple equation derived from an observed relation between P concentration and land use; empirical results indicate that TP and SRP concentrations increased with increasing percentage of agricultural land. Simulated TP and SRP stream concentrations predicted for the 10 catchments were strongly affected by the assumed runoff mechanism. The modeled TP and SRP concentrations produced by saturation excess distribution averaged 31 percent higher and 42 percent higher, respectively, than those produced by the infiltration excess distribution. Misrepresenting the primary runoff mechanism could not only produce erroneous concentrations, it could fail to correctly locate critical source areas for implementation of best management practices. Thus, identification of the primary runoff mechanism is critical in selection of appropriate models in the mitigation of nonpoint source pollution. Correct representation of runoff processes is also critical in the future development of biogeochemical transport models, especially those that address nutrient fluxes.

  14. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

    Pearlman, M. R.; Frey, H. V.; Gross, R. S.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry J. F.; Merkowitz, S. M.; Noll, C. E.; Pavilis, E. C.; hide

    2012-01-01

    Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. 
Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  15. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

    Merkowitz, S. M.; Desai, S. D.; Gross, R. S.; Hillard, L. M.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Murphy, D.; Noll, C. E.; hide

    2012-01-01

    Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. 
Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  16. Particle simulation of electromagnetic emissions from electrostatic instability driven by an electron ring beam on the density gradient

    NASA Astrophysics Data System (ADS)

    Horký, Miroslav; Omura, Yoshiharu; Santolík, Ondřej

    2018-04-01

    This paper presents the wave mode conversion between electrostatic and electromagnetic waves on the plasma density gradient. We use 2-D electromagnetic code KEMPO2 implemented with the generation of density gradient to simulate such a conversion process. In the dense region, we use ring beam instability to generate electron Bernstein waves and we study the temporal evolution of wave spectra, velocity distributions, Poynting flux, and electric and magnetic energies to observe the wave mode conversion. Such a conversion process can be a source of electromagnetic emissions which are routinely measured by spacecraft on the plasmapause density gradient.

  17. Encryption key distribution via chaos synchronization

    PubMed Central

    Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; Van der Sande, Guy

    2017-01-01

    We present a novel encryption scheme, wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on photonic, optoelectronic or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions as defined by the National Institute of Standards and Technology (NIST) test suite. We demonstrate the feasibility of our concept on an electronic delay oscillator circuit and test the robustness against attacks using a state-of-the-art system identification method. PMID:28233876

  18. CMOS-based Stochastically Spiking Neural Network for Optimization under Uncertainties

    DTIC Science & Technology

    2017-03-01

    ...inverse tangent characteristics at varying input voltage (VIN) [Fig. 3]; thereby it is suitable for kernel function implementation. By varying bias... cost function/constraint variables are generated based on an inverse transform of the CDF. In Fig. 5, F⁻¹(u) for a uniformly distributed random number u ∈ [0, 1]... extracts random samples of x distributed with CDF F(x). In Fig. 6, we present a successive approximation (SA) circuit to evaluate the inverse...
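
    The inverse-transform step described in the snippet (drawing u uniformly on [0, 1] and evaluating F⁻¹(u)) has a simple software analogue, shown here for an exponential CDF with a closed-form inverse. This is a minimal illustration of the sampling principle, not the report's analog CMOS successive-approximation circuit.

```python
import math, random

def inverse_transform_exponential(lam: float, u: float) -> float:
    """Map u ~ Uniform(0, 1) through the inverse CDF of Exp(lam):
    F(x) = 1 - exp(-lam * x)  =>  F^-1(u) = -ln(1 - u) / lam."""
    return -math.log(1.0 - u) / lam

rng = random.Random(42)
samples = [inverse_transform_exponential(2.0, rng.random()) for _ in range(100_000)]
mean = sum(samples) / len(samples)   # approaches 1 / lam = 0.5
```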

  19. A Blueprint for Demonstrating Quantum Supremacy with Superconducting Qubits

    NASA Technical Reports Server (NTRS)

    Kechedzhi, Kostyantyn

    2018-01-01

    Long coherence times and the high-fidelity control recently achieved in scalable superconducting circuits have paved the way for a growing number of experimental studies of many-qubit quantum coherent phenomena in these devices. Although full implementation of quantum error correction and fault-tolerant quantum computation remains a challenge, near-term pre-error-correction devices could allow new fundamental experiments despite the inevitable accumulation of errors. One such open question, foundational for quantum computing, is achieving so-called quantum supremacy: an experimental demonstration of a computational task that takes polynomial time on a quantum computer, whereas the best classical algorithm would require exponential time and/or resources. It is possible to formulate such a task for a quantum computer consisting of fewer than 100 qubits. The computational task we consider is to provide approximate samples from a non-trivial quantum distribution. This is a generalization, to the case of superconducting circuits, of the ideas behind the boson sampling protocol for quantum optics introduced by Aaronson and Arkhipov. In this presentation we discuss a proof-of-principle demonstration of such a sampling task on a 9-qubit chain of superconducting gmon qubits developed by Google. We discuss a theoretical analysis of the driven evolution of the device, resulting in output approximating samples from a uniform distribution in the Hilbert space, a quantum chaotic state. We analyze the quantum chaotic characteristics of the output of the circuit and the time required to generate a sufficiently complex quantum distribution. We demonstrate that classical simulation of the sampling output requires exponential resources by connecting the task of calculating the output amplitudes to the sign problem of the quantum Monte Carlo method. 
We also discuss the detailed theoretical modeling required to achieve high fidelity control and calibration of the multi-qubit unitary evolution in the device. We use a novel cross-entropy statistical metric as a figure of merit to verify the output and calibrate the device controls. Finally, we demonstrate the statistics of the wave function amplitudes generated on the 9-gmon chain and verify the quantum chaotic nature of the generated quantum distribution. This verifies the implementation of the quantum supremacy protocol.

  20. An implementation of a chemical and thermal nonequilibrium flow solver on unstructured meshes and application to blunt bodies

    NASA Technical Reports Server (NTRS)

    Prabhu, Ramadas K.

    1994-01-01

    This paper presents a nonequilibrium flow solver, the implementation of the algorithm on unstructured meshes, and its application to hypersonic flow past blunt bodies. Air is modeled as a mixture of five chemical species, namely O2, N2, O, NO, and N, with two temperatures, translational and vibrational. The solution algorithm is a cell-centered, point-implicit upwind scheme that employs Roe's flux difference splitting technique. Implementation of this algorithm on unstructured meshes is described. The computer code is applied to solve Mach 15 flow with and without a Type IV shock interference on a cylindrical body of 2.5 mm radius representing a cowl lip. Adaptively generated meshes are employed and refined several times until the solution exhibits detailed flow features and surface pressure and heat flux distributions. The effects of a catalytic wall on the surface heat flux distribution are studied. For the Mach 15 Type IV shock interference flow, the present results showed a peak heat flux of 544 MW/m² for a fully catalytic wall and 431 MW/m² for a noncatalytic wall. Some of the results are compared with available computational data.

  1. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    NASA Astrophysics Data System (ADS)

    Tokareva, Victoria

    2018-04-01

    New-generation medicine demands better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. It therefore becomes urgent not only to develop advanced modern hardware, but also to implement the special software infrastructure needed to use it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing distributed PACS is a challenging task in present-day medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and adapted to the needs of end users as possible.
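    The map/reduce split behind such server-side processing can be sketched in a few lines. The example below is a hypothetical toy (per-tile histograms instead of real image reconstruction, plain Python instead of Hadoop), shown only to illustrate the shape of the computation that Hadoop would distribute across nodes.

```python
from functools import reduce

# Toy "image" split into pixel tiles, as a distributed store would hold it.
tiles = [[0, 1, 1, 2], [2, 2, 3, 0], [1, 3, 3, 3]]

def map_tile(tile):
    # Map step: each node computes a partial histogram of its own tile.
    hist = {}
    for px in tile:
        hist[px] = hist.get(px, 0) + 1
    return hist

def reduce_hists(a, b):
    # Reduce step: partial histograms are merged into the global result.
    for k, v in b.items():
        a[k] = a.get(k, 0) + v
    return a

full_hist = reduce(reduce_hists, map(map_tile, tiles), {})
print(full_hist)  # pixel-value counts over the whole image
```

In a real deployment the map tasks would run on the Hadoop nodes holding each tile, and only the small partial results would travel to the reducer.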

  2. Forced underwater laminar flows with active magnetohydrodynamic metamaterials

    NASA Astrophysics Data System (ADS)

    Culver, Dean; Urzhumov, Yaroslav

    2017-12-01

    Theory and practical implementations for wake-free propulsion systems are proposed and proven with computational fluid dynamic modeling. Introduced earlier, the concept of active hydrodynamic metamaterials is advanced by introducing magnetohydrodynamic metamaterials, structures with a custom-designed volumetric distribution of Lorentz forces acting on a conducting fluid. Distributions of volume forces leading to wake-free, laminar flows are designed using multivariate optimization. Theoretical indications are presented that such flows can be sustained at arbitrarily high Reynolds numbers. Moreover, it is shown that in the limit Re ≫ 10^2, a fixed volume force distribution may lead to a forced laminar flow across a wide range of Re numbers, without the need to reconfigure the force-generating metamaterial. Power requirements for such a device are studied as a function of the fluid conductivity. Implications for the design of distributed propulsion systems underwater and in space are discussed.

  3. Measurement of distributions sensitive to the underlying event in inclusive Z-boson production in pp collisions at √s = 7 TeV with the ATLAS detector

    DOE PAGES

    Aad, G.

    2014-12-10

    A measurement of charged-particle distributions sensitive to the properties of the underlying event is presented for an inclusive sample of events containing a Z-boson decaying to an electron or muon pair. The measurement is based on data collected using the ATLAS detector at the LHC in proton–proton collisions at a centre-of-mass energy of 7 TeV with an integrated luminosity of 4.6 fb^-1. Distributions of the charged-particle multiplicity and of the charged-particle transverse momentum are measured in regions of azimuthal angle defined with respect to the Z-boson direction. Finally, the measured distributions are compared to similar distributions measured in jet events, and to the predictions of various Monte Carlo generators implementing different underlying-event models.

  4. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  5. Smart grids: A paradigm shift on energy generation and distribution with the emergence of a new energy management business model

    NASA Astrophysics Data System (ADS)

    Cardenas, Jesus Alvaro

    An energy and environmental crisis will emerge throughout the world if we continue our current practices of generation and distribution of electricity. A possible solution to this problem is based on the Smart Grid concept, which is heavily influenced by Information and Communication Technology (ICT). Although the electricity industry is mostly regulated, there are global models used as roadmaps for Smart Grid implementation focusing on technologies and the basic generation-distribution-transmission model. This project aims to further enhance a business model for a future global deployment. It takes into consideration the many factors interacting in this energy provision process, based on the diffusion of technologies and literature surveys of documents available on the Internet as well as peer-reviewed publications. Tariffs and regulations, distributed energy generation, integration of service providers, consumers becoming producers, self-healing devices, and many other elements are shifting this industry towards liberalization and deregulation of a sector that has been heavily protected by the government because of the importance of electricity for consumers. We propose an Energy Management Business Model composed of four basic elements: Supply Chain, Information and Communication Technology (ICT), Stakeholder Response, and the resulting Green Efficient Energy (GEE). We support the model with the literature survey, with a diffusion analysis of these elements, and with two surveys: one for peers and professionals and the other for experts in the field, based on the Carnegie Mellon Smart Grid Maturity Model (CMU SEI SGMM). The contribution of this model is a simple path to follow for entities that want to achieve environmentally friendly energy with the involvement of technology and all stakeholders.

  6. Real-Time Data Streaming and Storing Structure for the LHD's Fusion Plasma Experiments

    NASA Astrophysics Data System (ADS)

    Nakanishi, Hideya; Ohsuna, Masaki; Kojima, Mamoru; Imazu, Setsuo; Nonomura, Miki; Emoto, Masahiko; Yoshida, Masanobu; Iwata, Chie; Ida, Katsumi

    2016-02-01

    The LHD data acquisition and archiving system, i.e., the LABCOM system, has been fully equipped with high-speed real-time acquisition, streaming, and storage capabilities. To deal with more than 100 MB/s of continuously generated data at each data acquisition (DAQ) node, DAQ tasks have been implemented as multitasking, multithreaded processes in which shared memory plays the central role in fast, high-volume inter-process data handling. By introducing a 10-second time chunk named a “subshot,” endless data streams can be stored as a consecutive series of fixed-length data blocks, so that completed blocks become readable by other processes even while the write process continues. Real-time device and environmental monitoring is implemented in the same way, with further sparse resampling. The central data storage has been separated into two layers to be capable of receiving multiple 100 MB/s inflows in parallel. For the front-end layer, high-speed SSD arrays are used with the GlusterFS distributed filesystem, which can provide a maximum throughput of 2 GB/s. These design optimizations should be informative for implementing next-generation data archiving systems in big physics, such as ITER.
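    The subshot mechanism can be sketched as a chunking generator. The code below is an illustrative analogy rather than LABCOM code (chunk length and data are made up): an endless sample stream is cut into fixed-length blocks, and each completed block is handed off, and thus readable, while acquisition continues.

```python
# Cut a continuous stream into fixed-length "subshot" blocks.
def subshots(stream, chunk_len):
    buf = []
    for sample in stream:
        buf.append(sample)
        if len(buf) == chunk_len:
            yield list(buf)   # completed block: immediately readable downstream
            buf.clear()
    if buf:
        yield list(buf)       # final partial block at the end of the shot

blocks = list(subshots(range(10), 4))
print(blocks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

The key property is that consumers never wait for the end of the (potentially endless) stream; they only wait for the next fixed-length block.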

  7. BigWig and BigBed: enabling browsing of large distributed datasets.

    PubMed

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols, Linux and UNIX file handling, R trees, and various indexing and compression techniques. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
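    The multi-resolution idea behind these formats can be sketched as precomputed zoom-level summaries. The snippet below is a simplified illustration (per-bin means over a toy signal), not the actual BigWig encoding, which also stores min/max/coverage statistics and an R-tree index for range queries.

```python
# Summarize a signal at a coarser resolution: one mean per bin.
def summarize(values, bin_size):
    return [sum(values[i:i + bin_size]) / len(values[i:i + bin_size])
            for i in range(0, len(values), bin_size)]

signal = [1, 3, 2, 6, 4, 4, 8, 0]

# Precompute several zoom levels; a browser fetches only the level whose
# resolution matches the current view, never the full-resolution data.
levels = {b: summarize(signal, b) for b in (2, 4)}
print(levels)  # {2: [2.0, 4.0, 4.0, 4.0], 4: [3.0, 4.0]}
```

Serving the coarse level for a zoomed-out view is what keeps the transmitted data proportional to the screen, not to the file.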

  8. Decoy-state quantum key distribution with a leaky source

    NASA Astrophysics Data System (ADS)

    Tamaki, Kiyoshi; Curty, Marcos; Lucamarini, Marco

    2016-06-01

    In recent years, there has been a great effort to prove the security of quantum key distribution (QKD) with a minimum number of assumptions. Besides its intrinsic theoretical interest, this would allow for greater tolerance of device imperfections in actual implementations. However, even in this device-independent scenario, one assumption seems unavoidable: the presence of a protected space, devoid of any unwanted information leakage, in which the legitimate parties can privately generate, process, and store their classical data. In this paper we relax this unrealistic and hardly feasible assumption and introduce a general formalism to tackle the information leakage problem in most existing QKD systems. More specifically, we prove the security of optical QKD systems using phase and intensity modulators in their transmitters, which leak the setting information in an arbitrary manner. We apply our security proof to cases of practical interest and show key rates similar to those obtained in a perfectly shielded environment. Our work constitutes a fundamental step forward in guaranteeing the implementation security of quantum communication systems.

  9. A Simple Simulation Technique for Nonnormal Data with Prespecified Skewness, Kurtosis, and Covariance Matrix.

    PubMed

    Foldnes, Njål; Olsson, Ulf Henning

    2016-01-01

    We present and investigate a simple way to generate nonnormal data using linear combinations of independent generator (IG) variables. The simulated data have prespecified univariate skewness and kurtosis and a given covariance matrix. In contrast to the widely used Vale-Maurelli (VM) transform, the obtained data are shown to have a non-Gaussian copula. We analytically obtain asymptotic robustness conditions for the IG distribution. We show empirically that popular test statistics in covariance analysis tend to reject true models more often under the IG transform than under the VM transform. This implies that overly optimistic evaluations of estimators and fit statistics in covariance structure analysis may be tempered by including the IG transform for nonnormal data generation. We provide an implementation of the IG transform in the R environment.
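    The covariance-matching step common to such generator-variable methods can be sketched as follows. This is a hedged illustration, not the authors' IG implementation: independent standardized non-normal generators V are mixed through the Cholesky factor L of a target covariance Σ, so X = V·Lᵀ reproduces Σ. The actual IG method additionally tunes the generators' moments to hit prespecified skewness and kurtosis, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])          # target covariance matrix

L = np.linalg.cholesky(sigma)

# Independent, standardized, skewed generators (centered chi-square here;
# the real IG method chooses generator moments for target skew/kurtosis).
v = (rng.chisquare(4, size=(200_000, 2)) - 4) / np.sqrt(8)

x = v @ L.T                              # mixed, nonnormal sample
print(np.round(np.cov(x, rowvar=False), 2))
```

Because the generators are independent with unit variance, cov(X) = L·Lᵀ = Σ holds regardless of how non-Gaussian the generators are.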

  10. A Framework to Describe, Analyze and Generate Interactive Motor Behaviors

    PubMed Central

    Jarrassé, Nathanaël; Charalambous, Themistoklis; Burdet, Etienne

    2012-01-01

    While motor interaction between a robot and a human, or between humans, has important implications for society as well as promising applications, little research has been devoted to its investigation. In particular, it is important to understand the different ways two agents can interact and generate suitable interactive behaviors. Towards this end, this paper introduces a framework for the description and implementation of interactive behaviors of two agents performing a joint motor task. A taxonomy of interactive behaviors is introduced, which can classify tasks and cost functions that represent the way each agent interacts. The role of an agent interacting during a motor task can be directly explained from the cost function this agent is minimizing and the task constraints. The novel framework is used to interpret and classify previous works on human-robot motor interaction. Its implementation power is demonstrated by simulating representative interactions of two humans. It also enables us to interpret and explain the role distribution and switching between roles when performing joint motor tasks. PMID:23226231

  11. Design and implementation of flexible TWDM-PON with PtP WDM overlay based on WSS for next-generation optical access networks

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Yin, Hongxi; Qin, Jie; Liu, Chang; Liu, Anliang; Shao, Qi; Xu, Xiaoguang

    2016-09-01

    Aiming at the increasing demand of the diversification services and flexible bandwidth allocation of the future access networks, a flexible passive optical network (PON) scheme combining time and wavelength division multiplexing (TWDM) with point-to-point wavelength division multiplexing (PtP WDM) overlay is proposed for the next-generation optical access networks in this paper. A novel software-defined optical distribution network (ODN) structure is designed based on wavelength selective switches (WSS), which can implement wavelength and bandwidth dynamical allocations and suits for the bursty traffic. The experimental results reveal that the TWDM-PON can provide 40 Gb/s downstream and 10 Gb/s upstream data transmission, while the PtP WDM-PON can support 10 GHz point-to-point dedicated bandwidth as the overlay complement system. The wavelengths of the TWDM-PON and PtP WDM-PON are allocated dynamically based on WSS, which verifies the feasibility of the proposed structure.

  12. A framework to describe, analyze and generate interactive motor behaviors.

    PubMed

    Jarrassé, Nathanaël; Charalambous, Themistoklis; Burdet, Etienne

    2012-01-01

    While motor interaction between a robot and a human, or between humans, has important implications for society as well as promising applications, little research has been devoted to its investigation. In particular, it is important to understand the different ways two agents can interact and generate suitable interactive behaviors. Towards this end, this paper introduces a framework for the description and implementation of interactive behaviors of two agents performing a joint motor task. A taxonomy of interactive behaviors is introduced, which can classify tasks and cost functions that represent the way each agent interacts. The role of an agent interacting during a motor task can be directly explained from the cost function this agent is minimizing and the task constraints. The novel framework is used to interpret and classify previous works on human-robot motor interaction. Its implementation power is demonstrated by simulating representative interactions of two humans. It also enables us to interpret and explain the role distribution and switching between roles when performing joint motor tasks.

  13. An overview of the EOSDIS V0 information management system: Lessons learned from the implementation of a distributed data system

    NASA Technical Reports Server (NTRS)

    Ryan, Patrick M.

    1994-01-01

    The EOSDIS Version 0 system, released in July, 1994, is a working prototype of a distributed data system. One of the purposes of the V0 project is to take several existing data systems and coordinate them into one system while maintaining the independent nature of the original systems. The project is a learning experience and the lessons are being passed on to the architects of the system which will distribute the data received from the planned EOS satellites. In the V0 system, the data resides on heterogeneous systems across the globe but users are presented with a single, integrated interface. This interface allows users to query the participating data centers based on a wide set of criteria. Because this system is a prototype, we used many novel approaches in trying to connect a diverse group of users with the huge amount of available data. Some of these methods worked and others did not. Now that V0 has been released to the public, we can look back at the design and implementation of the system and also consider some possible future directions for the next generation of EOSDIS.

  14. Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.

    2018-04-01

    One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We show how to estimate the statistical uncertainty given the output of just a single radiative-transfer simulation in which the number of photon packets follows a Poisson distribution and the weight (e.g. energy or luminosity) of a single packet may follow an arbitrary distribution. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for large numbers of packets show that we generalise existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations including particle physics and importance sampling. It is particularly powerful in extracting information when the available data are sparse or quantities are small.

  15. Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.

    2018-07-01

    One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We consider simulations in which the number of photon packets is Poisson distributed, while the weight assigned to a single photon packet follows any distribution of choice. We show how to estimate the statistical uncertainty of the sum of weights in each bin from the output of a single radiative-transfer simulation. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for large numbers of packets show that we generalize existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations including particle physics and importance sampling. It is particularly powerful in extracting information when the available data are sparse or quantities are small.
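    The statistical setting can be mimicked with a compound-Poisson toy model (all parameters invented for illustration, not taken from the paper): the per-bin packet count is Poisson, each packet carries a random weight, and the single-run estimator sum(w_i^2) approximates the true variance lam·E[w^2] of the summed weight.

```python
import numpy as np

rng = np.random.default_rng(7)

# Compound-Poisson toy: packets per bin ~ Poisson(lam), each packet carries
# a random weight. For the bin sum S = sum(w_i), Var(S) = lam * E[w^2], which
# a single simulation can estimate by sum(w_i**2).
lam, runs = 50, 20_000

sums, single_run_var = [], None
for r in range(runs):
    n = rng.poisson(lam)
    w = rng.exponential(1.0, size=n)       # arbitrary weight distribution
    sums.append(w.sum())
    if r == 0:
        single_run_var = np.sum(w ** 2)    # one-run variance estimate

# Empirical variance over many runs vs. the one-run estimate (both near 100
# for lam=50, E[w^2]=2 with Exponential(1) weights).
print(round(float(np.var(sums)), 1), round(float(single_run_var), 1))
```

The point of the toy is that the one-run estimate needs no repetition of the expensive simulation, which is exactly the regime the paper's posterior formalizes (including empty bins).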

  16. NREL Smart Grid Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hambrick, J.

    2012-01-01

    Although implementing Smart Grid projects at the distribution level provides many advantages and opportunities for advanced operation and control, a number of significant challenges must be overcome to maintain the high level of safety and reliability that the modern grid must provide. For example, while distributed generation (DG) promises to provide opportunities to increase reliability and efficiency and may provide grid support services such as volt/var control, the presence of DG can impact distribution operation and protection schemes. Additionally, the intermittent nature of many DG energy sources such as photovoltaics (PV) can present a number of challenges to voltage regulation, etc. This presentation provides an overview of a number of Smart Grid projects being performed by the National Renewable Energy Laboratory (NREL) along with utility, industry, and academic partners. These projects include modeling and analysis of high-penetration PV scenarios (with and without energy storage), development and testing of interconnection and microgrid equipment, as well as the development and implementation of advanced instrumentation and data acquisition used to analyze the impacts of intermittent renewable resources. Additionally, standards development associated with DG interconnection and analysis as well as Smart Grid interoperability will be discussed.

  17. Integrated topology for an aircraft electric power distribution system using MATLAB and ILP optimization technique and its implementation

    NASA Astrophysics Data System (ADS)

    Madhikar, Pratik Ravindra

    The most important and crucial design feature when designing an Aircraft Electric Power Distribution System (EPDS) is reliability. In an EPDS, power is distributed from top-level generators to bottom-level loads through various sensors, actuators, and rectifiers with the help of AC and DC buses and control switches. As consumer demands keep growing and safety is of utmost importance, loads increase and, as a result, so does the power-management burden. Therefore, the design of an EPDS should be optimized for maximum efficiency. This thesis discusses an integrated tool, based on a Need Based Design method and Fault Tree Analysis (FTA), to achieve an optimal EPDS design that provides maximum reliability in terms of continuous connectivity, power management, and minimum cost. If the EPDS design is formulated as an optimization problem with connectivity, cost, and power constraints, it can be solved using a linear solver to obtain the desired output of maximum reliability at minimum cost. Furthermore, the thesis also discusses the viability and implementation of the resulting topology for typical large-aircraft specifications.
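    The optimization formulation can be sketched with a toy 0/1 problem (hypothetical numbers, not from the thesis): binary variables select which redundant links to build, a redundancy constraint stands in for the connectivity requirement, and exhaustive enumeration stands in for the ILP solver.

```python
from itertools import product

# Toy EPDS design choice: which of 3 candidate links to build (0/1 variables)
# to minimize cost while keeping at least two independent paths (redundancy).
costs = [4, 6, 3]

best = None
for x in product((0, 1), repeat=3):        # enumerate 0/1 decision vectors
    if sum(x) >= 2:                        # connectivity/redundancy constraint
        cost = sum(c * xi for c, xi in zip(costs, x))
        if best is None or cost < best[0]:
            best = (cost, x)

print(best)  # (7, (1, 0, 1)): build links 1 and 3 at total cost 7
```

A real EPDS problem has far too many variables for enumeration, which is why an ILP solver over the same binary formulation is used instead.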

  18. A Framework for Parallel Unstructured Grid Generation for Complex Aerodynamic Simulations

    NASA Technical Reports Server (NTRS)

    Zagaris, George; Pirzadeh, Shahyar Z.; Chrisochoides, Nikos

    2009-01-01

    A framework for parallel unstructured grid generation targeting both shared memory multi-processors and distributed memory architectures is presented. The two fundamental building-blocks of the framework consist of: (1) the Advancing-Partition (AP) method used for domain decomposition and (2) the Advancing Front (AF) method used for mesh generation. Starting from the surface mesh of the computational domain, the AP method is applied recursively to generate a set of sub-domains. Next, the sub-domains are meshed in parallel using the AF method. The recursive nature of domain decomposition naturally maps to a divide-and-conquer algorithm which exhibits inherent parallelism. For the parallel implementation, the Master/Worker pattern is employed to dynamically balance the varying workloads of each task on the set of available CPUs. Performance results by this approach are presented and discussed in detail as well as future work and improvements.
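    The recursive divide-and-conquer structure can be sketched as follows (a schematic analogy, not the actual AP/AF implementation): a domain is split until sub-domains fall below a size threshold, and each leaf is then "meshed" independently, which is the step distributed to the workers.

```python
# Advancing-Partition analogue: recursively split the domain into sub-domains.
def partition(domain, max_size):
    if len(domain) <= max_size:
        return [domain]                        # leaf sub-domain
    mid = len(domain) // 2
    return (partition(domain[:mid], max_size)
            + partition(domain[mid:], max_size))

# Advancing-Front stand-in: "mesh" a sub-domain into toy elements.
def mesh(sub):
    return [(p, p + 1) for p in sub]

subdomains = partition(list(range(8)), 2)
elements = [mesh(s) for s in subdomains]       # embarrassingly parallel step
print(len(subdomains), sum(len(e) for e in elements))
```

In the framework described above, the per-leaf meshing calls would be farmed out to workers by the master, with dynamic load balancing across CPUs.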

  19. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute for pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distributions is done offline to automatically raise alerts and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used to trigger a halt of the linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow, where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distributions. In this way it was possible to generate a trigger that can stop the irradiation within less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large-scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.

  20. Sailfish: A flexible multi-GPU implementation of the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Januszewski, M.; Kostur, M.

    2014-09-01

    We present Sailfish, an open source fluid simulation package implementing the lattice Boltzmann method (LBM) on modern Graphics Processing Units (GPUs) using CUDA/OpenCL. We take a novel approach to GPU code implementation and use run-time code generation techniques and a high level programming language (Python) to achieve state of the art performance, while allowing easy experimentation with different LBM models and tuning for various types of hardware. We discuss the general design principles of the code, scaling to multiple GPUs in a distributed environment, as well as the GPU implementation and optimization of many different LBM models, both single component (BGK, MRT, ELBM) and multicomponent (Shan-Chen, free energy). The paper also presents results of performance benchmarks spanning the last three NVIDIA GPU generations (Tesla, Fermi, Kepler), which we hope will be useful for researchers working with this type of hardware and similar codes.
    Catalogue identifier: AETA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AETA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Lesser General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 225864
    No. of bytes in distributed program, including test data, etc.: 46861049
    Distribution format: tar.gz
    Programming language: Python, CUDA C, OpenCL
    Computer: Any with an OpenCL or CUDA-compliant GPU
    Operating system: No limits (tested on Linux and Mac OS X)
    RAM: Hundreds of megabytes to tens of gigabytes for typical cases
    Classification: 12, 6.5
    External routines: PyCUDA/PyOpenCL, Numpy, Mako, ZeroMQ (for multi-GPU simulations), scipy, sympy
    Nature of problem: GPU-accelerated simulation of single- and multi-component fluid flows.
    Solution method: A wide range of relaxation models (LBGK, MRT, regularized LB, ELBM, Shan-Chen, free energy, free surface) and boundary conditions within the lattice Boltzmann method framework. Simulations can be run in single or double precision using one or more GPUs.
    Restrictions: The lattice Boltzmann method works for low Mach number flows only.
    Unusual features: The actual numerical calculations run exclusively on GPUs. The numerical code is built dynamically at run-time in CUDA C or OpenCL, using templates and symbolic formulas. The high-level control of the simulation is maintained by a Python process.
    Additional comments: The distribution file for this program is over 45 MB and is therefore not delivered directly when Download or Email is requested; an HTML file giving details of how to obtain the program is sent instead.
    Running time: Problem-dependent, typically minutes (for small cases or short simulations) to hours (large cases or long simulations).
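    The run-time code generation approach can be sketched in miniature. The snippet below uses Python string formatting and exec in place of Mako templates and the CUDA/OpenCL compiler, purely to illustrate specializing a kernel with compile-time constants; the function and parameter names are made up, not Sailfish's API.

```python
# Kernel source built at run time, with the relaxation parameter baked in as
# a compile-time constant (Sailfish does this with Mako templates and sympy,
# emitting CUDA C or OpenCL instead of Python).
template = """
def relax(f, f_eq):
    # BGK-style relaxation toward equilibrium with fixed omega
    return [fi + {omega} * (ei - fi) for fi, ei in zip(f, f_eq)]
"""

namespace = {}
exec(template.format(omega=1.0), namespace)   # "compile" the specialized kernel

result = namespace["relax"]([0.2, 0.4], [0.3, 0.3])
print(result)
```

Baking constants into the generated source lets the compiler fold and optimize them, which is a large part of why run-time generated GPU kernels can outperform generic ones.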

  1. Estimating the Cost of Providing Foundational Public Health Services.

    PubMed

    Mamaril, Cezar Brian C; Mays, Glen P; Branham, Douglas Keith; Bekemeier, Betty; Marlowe, Justin; Timsina, Lava

    2017-12-28

    To estimate the cost of resources required to implement a set of Foundational Public Health Services (FPHS) as recommended by the Institute of Medicine. A stochastic simulation model was used to generate probability distributions of input and output costs across 11 FPHS domains. We used an implementation attainment scale to estimate costs of fully implementing FPHS. We use data collected from a diverse cohort of 19 public health agencies located in three states that implemented the FPHS cost estimation methodology in their agencies during 2014-2015. The average agency incurred costs of $48 per capita implementing FPHS at their current attainment levels with a coefficient of variation (CV) of 16 percent. Achieving full FPHS implementation would require $82 per capita (CV=19 percent), indicating an estimated resource gap of $34 per capita. Substantial variation in costs exists across communities in resources currently devoted to implementing FPHS, with even larger variation in resources needed for full attainment. Reducing geographic inequities in FPHS may require novel financing mechanisms and delivery models that allow health agencies to have robust roles within the health system and realize a minimum package of public health services for the nation. © Health Research and Educational Trust.
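    The stochastic simulation approach can be sketched with a toy Monte Carlo (distributions and parameters invented for illustration, not the study's calibrated inputs): per-capita costs are drawn for each of the 11 FPHS domains, summed, and summarized by a mean and coefficient of variation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stochastic cost model: each of 11 FPHS domains has an uncertain
# per-capita cost (lognormal here, purely as an assumed input distribution).
n_domains, n_sims = 11, 100_000
domain_costs = rng.lognormal(mean=1.2, sigma=0.3, size=(n_sims, n_domains))
total = domain_costs.sum(axis=1)          # total per-capita cost per draw

mean = total.mean()
cv = total.std() / mean                   # coefficient of variation
print(round(float(mean), 1), round(float(100 * cv), 1))
```

Repeating such a simulation at "current" versus "full" attainment levels is how a per-capita resource gap with an uncertainty band can be reported.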

  2. De novo design of molecular architectures by evolutionary assembly of drug-derived building blocks.

    PubMed

    Schneider, G; Lee, M L; Stahl, M; Schneider, P

    2000-07-01

    An evolutionary algorithm was developed for fragment-based de novo design of molecules (TOPAS, TOPology-Assigning System). This stochastic method aims at generating a novel molecular structure mimicking a template structure. A set of approximately 25,000 fragment structures serves as the building block supply, which were obtained by a straightforward fragmentation procedure applied to 36,000 known drugs. Eleven reaction schemes were implemented for both fragmentation and building block assembly. This combination of drug-derived building blocks and a restricted set of reaction schemes proved to be a key for the automatic development of novel, synthetically tractable structures. In a cyclic optimization process, molecular architectures were generated from a parent structure by virtual synthesis, and the best structure of a generation was selected as the parent for the subsequent TOPAS cycle. Similarity measures were used to define 'fitness', based on 2D-structural similarity or topological pharmacophore distance between the template molecule and the variants. The concept of varying library 'diversity' during a design process was consequently implemented by using adaptive variant distributions. The efficiency of the design algorithm was demonstrated for the de novo construction of potential thrombin inhibitors mimicking peptide and non-peptide template structures.

  3. Distribution and Fate of Energetics on DoD Test and Training Ranges: Final Report

    DTIC Science & Technology

    2006-11-01

    …tests for unconfined charges … Table 5-3. Mass (g) of residue generated by BIP of unfuzed anti… Lands Withdrawal Act (Public Law 106-65). As a portion of this EIS, the Army has pledged to implement a program to identify possible munitions… containing Tritonal, PBXN-109, Composition H-6, and Composition B (Baker et al. 2000). Included in the list of simulated UXOs was the 155-mm, 105-mm, and 8…

  4. Nonperturbative methods in HZE ion transport

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Costen, Robert C.; Shinn, Judy L.

    1993-01-01

    A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport. The code is established to operate on the Langley Research Center nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code is highly efficient and compares well with the perturbation approximations.

  5. REMORA: a pilot in the ocean of BioMoby web-services.

    PubMed

    Carrere, Sébastien; Gouzy, Jérôme

    2006-04-01

    Emerging web-services technology allows interoperability between multiple distributed architectures. Here, we present REMORA, a web server implemented according to the BioMoby web-service specifications, providing life science researchers with an easy-to-use workflow generator and launcher, a repository of predefined workflows and a survey system. The REMORA web server is freely available at http://bioinfo.genopole-toulouse.prd.fr/remora; sources are available upon request from the authors. Contact: Jerome.Gouzy@toulouse.inra.fr

  6. Practical implementation of multilevel quantum cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulik, S. P.; Maslennikov, G. A.; Moreva, E. V.

    2006-05-15

    The physical principles of a quantum key distribution protocol using four-level optical systems are discussed. Quantum information is encoded into polarization states created by frequency-nondegenerate spontaneous parametric down-conversion in collinear geometry. In the scheme under analysis, the required nonorthogonal states are generated in a single nonlinear crystal. All states in the selected basis are measured deterministically. The results of initial experiments on transformation of the basis polarization states of a four-level optical system are discussed.

  7. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments in view, such as those at the LHC, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  8. Modeling uncertainty and correlation in soil properties using Restricted Pairing and implications for ensemble-based hillslope-scale soil moisture and temperature estimation

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Entekhabi, D.; Bras, R. L.

    2007-12-01

    Soil hydraulic and thermal properties (SHTPs) affect both the rate of moisture redistribution in the soil column and the volumetric soil water capacity. Adequately constraining these properties through field and lab analysis to parameterize spatially-distributed hydrology models is often prohibitively expensive. Because SHTPs vary significantly at small spatial scales, individual soil samples are only reliably indicative of local conditions, and these properties remain a significant source of uncertainty in soil moisture and temperature estimation. In ensemble-based soil moisture data assimilation, uncertainty in the model-produced prior estimate due to uncertainty in SHTPs must be taken into account to avoid under-dispersive ensembles. To treat SHTP uncertainty when supplying inputs to a distributed watershed model, we use the restricted pairing (RP) algorithm, an extension of Latin Hypercube (LH) sampling. The RP algorithm generates an arbitrary number of SHTP combinations by sampling the appropriate marginal distributions of the individual soil properties using the LH approach, while imposing a target rank correlation among the properties. A previously-published meta-database of 1309 soils representing 12 textural classes is used to fit marginal distributions to the properties and to compute the target rank correlation structure, conditioned on soil texture. Given categorical soil textures, our implementation of the RP algorithm generates an arbitrarily-sized ensemble of realizations of the SHTPs required as input to the TIN-based Real-time Integrated Basin Simulator with vegetation dynamics (tRIBS+VEGGIE) distributed-parameter ecohydrology model. Soil moisture ensembles simulated with RP-generated SHTPs exhibit less variance than ensembles simulated with SHTPs generated by a scheme that neglects correlation among properties. Neglecting correlation among SHTPs can lead to physically unrealistic parameter combinations that exhibit implausible hydrologic behavior when input to the tRIBS+VEGGIE model.

  9. The performance of residential micro-cogeneration coupled with thermal and electrical storage

    NASA Astrophysics Data System (ADS)

    Kopf, John

    Over 80% of residential secondary energy consumption in Canada and Ontario is used for space and water heating. The peak electricity demands resulting from residential energy consumption increase reliance on fossil-fuel generating stations. Distributed energy resources can help to decrease reliance on central generating stations; presently, distributed energy resources such as solar photovoltaic, wind and biomass generation are subsidized in Ontario. Micro-cogeneration is an emerging technology that can be implemented as a distributed energy resource within residential or commercial buildings. It has the potential to reduce a building's energy consumption by simultaneously generating thermal and electrical power on-site, and coupling a micro-cogeneration device with electrical storage can improve the system's ability to reduce peak electricity demands. The performance potential of micro-cogeneration devices has yet to be fully realized. This research addresses the performance of a residential micro-cogeneration device and its ability to meet peak occupant electrical loads when coupled with electrical storage. An integrated building energy model of a residential micro-cogeneration system was developed, comprising the house, the micro-cogeneration device, all balance-of-plant and space heating components, a thermal storage device, an electrical storage device, and the occupant electrical and hot water demands. This model simulated the performance of a micro-cogeneration device coupled to an electrical storage system within a Canadian household. A customized controller was created in ESP-r to examine the impact of various system control strategies. The economic performance of the system was assessed from the perspectives of a local energy distribution company and an end user under hypothetical electricity export purchase price scenarios. With certain control strategies, the micro-cogeneration system was found to improve the economic performance for both the end user and the local distribution company.

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes; the resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information, and distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  11. Gene network inference by fusing data from diverse distributions

    PubMed Central

    Žitnik, Marinka; Zupan, Blaž

    2015-01-01

    Motivation: Markov networks are undirected graphical models that are widely used to infer relations between genes from experimental data. Their state-of-the-art inference procedures assume the data arise from a Gaussian distribution. High-throughput omics data, such as that from next generation sequencing, often violates this assumption. Furthermore, when collected data arise from multiple related but otherwise nonidentical distributions, their underlying networks are likely to have common features. New principled statistical approaches are needed that can deal with different data distributions and jointly consider collections of datasets. Results: We present FuseNet, a Markov network formulation that infers networks from a collection of nonidentically distributed datasets. Our approach is computationally efficient and general: given any number of distributions from an exponential family, FuseNet represents model parameters through shared latent factors that define neighborhoods of network nodes. In a simulation study, we demonstrate good predictive performance of FuseNet in comparison to several popular graphical models. We show its effectiveness in an application to breast cancer RNA-sequencing and somatic mutation data, a novel application of graphical models. Fusion of datasets offers substantial gains relative to inference of separate networks for each dataset. Our results demonstrate that network inference methods for non-Gaussian data can help in accurate modeling of the data generated by emergent high-throughput technologies. Availability and implementation: Source code is at https://github.com/marinkaz/fusenet. Contact: blaz.zupan@fri.uni-lj.si Supplementary information: Supplementary information is available at Bioinformatics online. PMID:26072487

  12. A Learning Framework for Winner-Take-All Networks with Stochastic Synapses.

    PubMed

    Mostafa, Hesham; Cauwenberghs, Gert

    2018-06-01

    Many recent generative models make use of neural networks to transform the probability distribution of a simple low-dimensional noise process into the complex distribution of the data. This raises the question of whether biological networks operate along similar principles to implement a probabilistic model of the environment through transformations of intrinsic noise processes. The intrinsic neural and synaptic noise processes in biological networks, however, are quite different from the noise processes used in current abstract generative networks. This, together with the discrete nature of spikes and local circuit interactions among the neurons, raises several difficulties when using recent generative modeling frameworks to train biologically motivated models. In this letter, we show that a biologically motivated model based on multilayer winner-take-all circuits and stochastic synapses admits an approximate analytical description. This allows us to use the proposed networks in a variational learning setting where stochastic backpropagation is used to optimize a lower bound on the data log likelihood, thereby learning a generative model of the data. We illustrate the generality of the proposed networks and learning technique by using them in a structured output prediction task and a semisupervised learning task. Our results extend the domain of application of modern stochastic network architectures to networks where synaptic transmission failure is the principal noise mechanism.

  13. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  14. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.

  15. Optimal condition for employing an axicon-generated Bessel beam to fabricate cylindrical microlens arrays

    NASA Astrophysics Data System (ADS)

    Luo, Zhi; Yin, Kai; Dong, Xinran; Duan, Ji’an

    2018-05-01

    A numerical algorithm modelling the transformation of a Gaussian beam into a Bessel beam is presented, with the aim of studying the optimal conditions for employing an axicon-generated Bessel beam to fabricate cylindrical microlens arrays (CMLAs). By applying the algorithm to simulate the spatial intensity distribution behind the axicon for different rotund-apex defects and different ratios of incident-beam diameter to axicon diameter, we find that the diffraction effects formed by the axicon edge can be almost eliminated when the diameter ratio is less than 1:2, but that the spatial intensity distribution is disturbed dramatically by even a few tens of microns of apex deviation, especially in the front part of the axicon-generated Bessel beam. Fortunately, the lateral intensity profile in the rear part still maintains a desirable Bessel curve. Therefore, the rear part of the Bessel region and a diameter ratio of less than 1:2 are the optimal choices for employing an axicon-generated Bessel beam to fabricate surface microstructures. Furthermore, by applying these optimal conditions to direct-writing microstructures on fused silica with a femtosecond (fs) laser, a large-area close-packed CMLA is fabricated. The CMLA exhibits high quality and uniformity, and its optical performance is also demonstrated.

  16. Web-based segmentation and display of three-dimensional radiologic image data.

    PubMed

    Silverstein, J; Rubenstein, J; Millman, A; Panko, W

    1998-01-01

    In many clinical circumstances, viewing sequential radiological image data as three-dimensional models is proving beneficial. However, designing customized computer-generated radiological models is beyond the scope of most physicians, due to specialized hardware and software requirements. We have created a simple method for Internet users to remotely construct and locally display three-dimensional radiological models using only a standard web browser. Rapid model construction is achieved by distributing the hardware intensive steps to a remote server. Once created, the model is automatically displayed on the requesting browser and is accessible to multiple geographically distributed users. Implementation of our server software on large scale systems could be of great service to the worldwide medical community.

  17. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing, at lower cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids give researchers, scientists and engineers their first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  18. Field demonstration of a continuous-variable quantum key distribution network.

    PubMed

    Huang, Duan; Huang, Peng; Li, Huasheng; Wang, Tao; Zhou, Yingming; Zeng, Guihua

    2016-08-01

    We report on what we believe is the first field implementation of a continuous-variable quantum key distribution (CV-QKD) network with point-to-point configuration. Four QKD nodes are deployed on standard communication infrastructures connected with commercial telecom optical fiber. Reliable key exchange is achieved in the wavelength-division-multiplexing CV-QKD network. The impact of a complex and volatile field environment on the excess noise is investigated, since excess noise controlling and reduction is arguably the major issue pertaining to distance and the secure key rate. We confirm the applicability and verify the maturity of the CV-QKD network in a metropolitan area, thus paving the way for a next-generation global secure communication network.

  19. Experimental realization of equiangular three-state quantum key distribution

    PubMed Central

    Schiavon, Matteo; Vallone, Giuseppe; Villoresi, Paolo

    2016-01-01

    Quantum key distribution using three states in equiangular configuration combines a security threshold comparable with the one of the Bennett-Brassard 1984 protocol and a quantum bit error rate (QBER) estimation that does not need to reveal part of the key. We implement an entanglement-based version of the Renes 2004 protocol, using only passive optic elements in a linear scheme for the positive-operator valued measure (POVM), generating an asymptotic secure key rate of more than 10 kbit/s, with a mean QBER of 1.6%. We then demonstrate its security in the case of finite key and evaluate the key rate for both collective and general attacks. PMID:27465643

  20. Advantages and difficulties of implementation of flat-panel multimedia monitoring system in a surgical MRI suite

    NASA Astrophysics Data System (ADS)

    Deckard, Michael; Ratib, Osman M.; Rubino, Gregory

    2002-05-01

    Our project was to design and implement a ceiling-mounted, multi-monitor display unit for use in a high-field MRI surgical suite. The system is designed to simultaneously display images and data from four different digital and/or analog sources with minimal interference from the adjacent high magnetic field, minimal signal-to-noise/artifact contribution to the MRI images, and compliance with codes and regulations for the sterile neurosurgical environment. Provisions were also made to accommodate the importing and exporting of video information via PACS and remote processing/display for clinical and educational uses. Commercial fiber-optic receivers/transmitters were implemented, along with supporting video processing and distribution equipment, to solve the video communication problem. A new generation of high-resolution color flat-panel displays was selected for the project, and a custom-made monitor mount and in-suite electronics enclosure was designed and constructed at UCLA. Difficulties with implementing an isolated AC power system are discussed and a work-around solution presented.

  1. Recent enhancements to the GRIDGEN structured grid generation system

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Chawner, John R.

    1992-01-01

    Significant enhancements are being implemented into the GRIDGEN3D, multiple block, structured grid generation software. Automatic, point-to-point, interblock connectivity will be possible through the addition of the domain entity to GRIDBLOCK's block construction process. Also, the unification of GRIDGEN2D and GRIDBLOCK has begun with the addition of edge grid point distribution capability to GRIDBLOCK. The geometric accuracy of surface grids and the ease with which databases may be obtained is being improved by adding support for standard computer-aided design formats (e.g., PATRAN Neutral and IGES files). Finally, volume grid quality was improved through addition of new SOR algorithm features and the new hybrid control function type to GRIDGEN3D.

  2. Benefit transfer and spatial heterogeneity of preferences for water quality improvements.

    PubMed

    Martin-Ortega, J; Brouwer, R; Ojea, E; Berbel, J

    2012-09-15

    The improvement in water quality resulting from implementation of the EU Water Framework Directive is expected to generate substantial non-market benefits. Widespread estimation of these benefits across Europe will require the application of benefit transfer. We use a spatially explicit valuation design that accounts for the spatial heterogeneity of preferences to help generate lower transfer errors. A map-based choice experiment is applied in the Guadalquivir River Basin (Spain), accounting simultaneously for the spatial distribution of water quality improvements and of beneficiaries. Our results show that accounting for the spatial heterogeneity of preferences generally produces lower transfer errors. Copyright © 2012 Elsevier Ltd. All rights reserved.
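    For context, the transfer error such designs aim to reduce is conventionally computed as the relative deviation of a value estimate transferred from a study site against the value observed at the policy site. The willingness-to-pay figures below are hypothetical.

```python
def transfer_error(transferred, observed):
    """Percentage transfer error: relative deviation of the transferred
    value estimate from the value observed at the policy site."""
    return abs(transferred - observed) / observed * 100

# Hypothetical WTP estimates (euros per household per year):
err = transfer_error(transferred=38.0, observed=45.0)
print(round(err, 1))  # 15.6
```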

  3. RTDS implementation of an improved sliding mode based inverter controller for PV system.

    PubMed

    Islam, Gazi; Muyeen, S M; Al-Durra, Ahmed; Hasanien, Hany M

    2016-05-01

    This paper proposes a novel approach for testing the dynamics and control aspects of a large-scale photovoltaic (PV) system in real time, and for resolving design hindrances in the controller parameters, using a Real Time Digital Simulator (RTDS). In general, the harmonic profile of a fast controller is widely distributed because of the controller's large bandwidth. The major contribution of this paper is that the proposed control strategy gives an improved voltage harmonic profile, distributing harmonics more closely around the switching frequency, together with fast transient response; filter design thus becomes easier. Implementing a high-bandwidth control strategy in the small time steps of an RTDS is not straightforward, and this paper presents a practical methodology for implementing such a control scheme in RTDS. As a part of the industrial process, the controller parameters are optimized using the particle swarm optimization (PSO) technique to improve low-voltage ride-through (LVRT) performance under network disturbance. The response surface methodology (RSM) is well adapted to building analytical models for the recovery time (Rt), maximum percentage overshoot (MPOS), settling time (Ts), and steady-state error (Ess) of the voltage profile immediately after the inverter under disturbance. A systematic approach to controller parameter optimization is detailed. The transient performance of the PSO-based optimization method applied to the proposed sliding-mode-controlled PV inverter is compared with results from a genetic algorithm (GA) based optimization technique. The reported real-time implementation challenges and controller optimization procedure are applicable to other control applications in the field of renewable and distributed generation systems. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
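    The parameter-tuning step can be sketched with a minimal global-best PSO. The quadratic surrogate below stands in for the simulation-derived objective (a weighted combination of Rt, MPOS, Ts and Ess); the gain names and bounds are hypothetical.

```python
import random

def pso(objective, bounds, n_particles=30, iters=100, seed=0):
    """Minimal particle swarm optimizer (global-best topology) with
    standard inertia/cognitive/social weights."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]         # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Hypothetical: tune two sliding-mode gains (k1, k2) against a convex
# surrogate whose minimum sits at k1 = 2.0, k2 = 0.5.
surrogate = lambda p: (p[0] - 2.0) ** 2 + (p[1] - 0.5) ** 2
best, best_f = pso(surrogate, bounds=[(0, 10), (0, 5)])
```

    In the paper's setting, each objective evaluation would be a full RTDS disturbance run (or its RSM model) rather than a closed-form function.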

  4. Hadoop-BAM: directly manipulating next generation sequencing data in the cloud

    PubMed Central

    Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo

    2012-01-01

    Summary: Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps. Availability: Available under the open-source MIT license at http://sourceforge.net/projects/hadoop-bam/ Contact: matti.niemenmaa@aalto.fi Supplementary information: Supplementary material is available at Bioinformatics online. PMID:22302568
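    The coverage-summarizing use case can be illustrated with a plain-Python map/reduce of the same shape: map emits one key per covered base from each aligned read, reduce sums the counts. The (chrom, start, length) tuples stand in for BAM records; Hadoop, Picard and real BAM parsing are deliberately omitted.

```python
from collections import Counter

# Stand-in aligned reads: (chromosome, 0-based start, aligned length).
reads = [("chr1", 100, 4), ("chr1", 102, 4), ("chr2", 7, 2)]

def map_read(read):
    """Map step: emit ((chrom, position), 1) for every base the read covers."""
    chrom, start, length = read
    for pos in range(start, start + length):
        yield (chrom, pos), 1

def reduce_counts(pairs):
    """Reduce step: sum the counts per (chrom, position) key."""
    coverage = Counter()
    for key, count in pairs:
        coverage[key] += count
    return coverage

coverage = reduce_counts(kv for read in reads for kv in map_read(read))
print(coverage[("chr1", 102)])  # both chr1 reads overlap base 102 -> 2
```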

  5. Research on the effects of wind power grid to the distribution network of Henan province

    NASA Astrophysics Data System (ADS)

    Liu, Yunfeng; Zhang, Jian

    2018-04-01

    As traditional energy sources are depleted, regions across the nation are implementing policies, under favorable national policy, to develop new energy sources for electricity generation. Wind power is pollution-free and renewable, and has become the most popular of the new-energy generation technologies. Wind power development in Henan province started relatively late, but it is developing quickly and has broad prospects. Wind power is, however, volatile and random: connecting it to the grid affects the stability and power quality of the distribution network, and some areas have had to curtail (abandon) wind generation. Studying the grid integration of wind power and identifying improvement measures is therefore urgent. Energy storage, which can shift energy in space and time, can stabilize the operation of the power grid and improve power quality.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.

    Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In our paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a “locally” generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.

  7. SCOS 2: A distributed architecture for ground system control

    NASA Astrophysics Data System (ADS)

    Keyte, Karl P.

    The current generation of spacecraft ground control systems in use at the European Space Agency/European Space Operations Centre (ESA/ESOC) is based on SCOS 1. Such systems have become difficult to manage in both functional and financial terms, and the next generation of spacecraft demands more flexibility in the use, configuration and distribution of control facilities, as well as functional capabilities matching those planned for future missions. SCOS 2 is more than a successor to SCOS 1: many of the shortcomings of the existing system were carefully analyzed by the user and technical communities, and a complete redesign was made. Different technologies were adopted in many areas, including the hardware platform, network architecture, user interfaces, and implementation techniques, methodologies and language. As far as possible, a flexible design approach was taken, using popular industry standards to provide vendor independence in both hardware and software. This paper describes many of the new approaches made in the architectural design of SCOS 2.

  8. Communication library for run-time visualization of distributed, asynchronous data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowlan, J.; Wightman, B.T.

    1994-04-01

    In this paper we present a method for collecting and visualizing data generated by a parallel computational simulation during run time. Data distributed across multiple processes is sent across parallel communication lines to a remote workstation, which sorts and queues the data for visualization. We have implemented our method in a set of tools called PORTAL (for Parallel aRchitecture data-TrAnsfer Library). The tools comprise generic routines for sending data from a parallel program (callable from either C or FORTRAN), a semi-parallel communication scheme currently built upon Unix Sockets, and a real-time connection to the scientific visualization program AVS. Our method is most valuable when used to examine large datasets that can be efficiently generated and do not need to be stored on disk. The PORTAL source libraries, detailed documentation, and a working example can be obtained by anonymous ftp from info.mcs.anl.gov, file portal.tar.Z, directory pub/portal.

  9. Joint Real-Time Energy and Demand-Response Management using a Hybrid Coalitional-Noncooperative Game

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Fulin; Gu, Yi; Hao, Jun

    In order to model the interactions among utility companies, building demands and renewable energy generators (REGs), a hybrid coalitional-noncooperative game framework is proposed. We formulate a dynamic noncooperative game to study the energy dispatch among multiple utility companies, while we take a coalitional perspective on REGs and building demands through a hedonic coalition formation game. In this setting, building demands request different power supplies from the REGs, and the building demands can be organized into an ultimate coalition structure through a distributed hedonic shift algorithm. At the same time, utility companies can also obtain a stable power generation profile. In addition, the interaction between the utility companies and the building demands that cannot be supplied by REGs is implemented by distributed game-theoretic algorithms. Numerical results illustrate that the proposed hybrid coalitional-noncooperative game scheme reduces the costs of both building demands and utility companies compared with the initial scenario.

  10. A data distributed parallel algorithm for ray-traced volume rendering

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu; Painter, James S.; Hansen, Charles D.; Krogh, Michael F.

    1993-01-01

    This paper presents a divide-and-conquer ray-traced volume rendering algorithm and a parallel image compositing method, along with their implementation and performance on the Connection Machine CM-5 and networked workstations. This algorithm distributes both the data and the computations to individual processing units to achieve fast, high-quality rendering of high-resolution data. The volume data, once distributed, is left intact. The processing nodes perform local ray tracing of their subvolume concurrently. No communication between processing units is needed during this local ray-tracing process. A subimage is generated by each processing unit and the final image is obtained by compositing subimages in the proper order, which can be determined a priori. Test results on both the CM-5 and a group of networked workstations demonstrate the practicality of our rendering algorithm and compositing method.
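
    The compositing step described above can be sketched as a back-to-front "over" operation on per-node subimages; this is a generic illustration using premultiplied-alpha RGBA pixels, not the paper's CM-5 code.

```python
def composite_over(subimages):
    """Back-to-front 'over' compositing of per-node subimages.
    Each subimage is a list of (r, g, b, a) pixels with premultiplied
    alpha; subimages must be ordered back to front, an order the paper
    notes can be determined a priori from the data distribution."""
    width = len(subimages[0])
    out = [(0.0, 0.0, 0.0, 0.0)] * width  # empty background
    for img in subimages:  # each later subimage lies in front
        new = []
        for (br, bg, bb, ba), (fr, fg, fb, fa) in zip(out, img):
            # front-over-back with premultiplied alpha: c = cf + (1 - af) * cb
            new.append((fr + (1 - fa) * br,
                        fg + (1 - fa) * bg,
                        fb + (1 - fa) * bb,
                        fa + (1 - fa) * ba))
        out = new
    return out
```

    Because "over" is associative, the subimages can also be combined pairwise in a tree, which is what makes the compositing step parallelizable across processing units.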

  11. Integration of Heterogenous Digital Surface Models

    NASA Astrophysics Data System (ADS)

    Boesch, R.; Ginzler, C.

    2011-08-01

    The application of extended digital surface models often reveals that, despite an acceptable global accuracy for a given dataset, the local accuracy of the model can vary in a wide range. For high resolution applications which cover the spatial extent of a whole country, this can be a major drawback. Within the Swiss National Forest Inventory (NFI), two digital surface models are available, one derived from LiDAR point data and the other from aerial images. Automatic photogrammetric image matching with ADS80 aerial infrared images with 25 cm and 50 cm resolution is used to generate a surface model (ADS-DSM) with 1 m resolution covering the whole of Switzerland (approx. 41,000 km²). The spatially corresponding LiDAR dataset has a global point density of 0.5 points per m² and is mainly used in applications as an interpolated grid with 2 m resolution (LiDAR-DSM). Although both surface models seem to offer a comparable accuracy from a global view, local analysis shows significant differences. Both datasets have been acquired over several years. Concerning the LiDAR-DSM, different flight patterns and inconsistent quality control result in a significantly varying point density. The image acquisition of the ADS-DSM is also stretched over several years, and the model generation is hampered by clouds, varying illumination and shadow effects. Nevertheless, many classification and feature extraction applications requiring high resolution data depend on the local accuracy of the used surface model; therefore, precise knowledge of the local data quality is essential. The commercial photogrammetric software NGATE (part of SOCET SET) generates the image-based surface model (ADS-DSM) and also delivers a map with figures of merit (FOM) of the matching process for each calculated height pixel. The FOM-map contains matching codes like high slope, excessive shift or low correlation. For the generation of the LiDAR-DSM, only first- and last-pulse data was available. 
Therefore only the point distribution can be used to derive a local accuracy measure. For the calculation of a robust point distribution measure, a constrained triangulation of local points (within an area of 100 m²) has been implemented using the Open Source project CGAL. The area of each triangle is a measure for the spatial distribution of raw points in this local area. Combining the FOM-map with the local evaluation of LiDAR points allows an appropriate local accuracy evaluation of both surface models. The currently implemented strategy ("partial replacement") uses the hypothesis that the ADS-DSM is superior due to its better global accuracy of 1 m. If the local analysis of the FOM-map within the 100 m² area shows significant matching errors, the corresponding area of the triangulated LiDAR points is analyzed. If the point density and distribution is sufficient, the LiDAR-DSM will be used in favor of the ADS-DSM at this location. If the local triangulation reflects low point density or the variance of triangle areas exceeds a threshold, the investigated location will be marked as NODATA area. In a future implementation ("anisotropic fusion") an anisotropic inverse distance weighting (IDW) will be used, which merges both surface models in the point data space by using the FOM-map and the local triangulation to derive a quality weight for each of the interpolation points. The "partial replacement" implementation and the "fusion" prototype for the anisotropic IDW make use of the Open Source projects CGAL (Computational Geometry Algorithms Library), GDAL (Geospatial Data Abstraction Library) and OpenCV (Open Source Computer Vision).
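
    The triangle-area quality measure described above can be sketched as follows. The thresholds and the "ok"/"nodata" labels are illustrative placeholders, and the triangulation itself (done with CGAL in the paper) is assumed to be given as index triples.

```python
import statistics

def tri_area(p, q, r):
    """Area of a 2D triangle via the cross product (shoelace formula)."""
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])) / 2.0

def lidar_quality(points, triangles, min_points=10, max_area_var=4.0):
    """Classify a local 100 m^2 tile: 'ok' if the LiDAR point count and
    the spread of triangle areas are acceptable, else 'nodata'.
    `triangles` is a list of index triples into `points`; the threshold
    values are illustrative, not taken from the paper."""
    if len(points) < min_points or not triangles:
        return "nodata"
    areas = [tri_area(points[a], points[b], points[c]) for a, b, c in triangles]
    # a high variance of triangle areas indicates clustered, uneven coverage
    if statistics.pvariance(areas) > max_area_var:
        return "nodata"
    return "ok"
```

    A regular grid of points yields near-equal triangle areas (variance close to zero), while gaps in the point cloud produce a few very large triangles and hence a high variance, which is exactly what the measure is meant to flag.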

  12. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    PubMed

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet-based control systems and various fieldbus systems have been designed to meet these requirements. This paper describes an Internet-based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the remote connection of the distributed control devices at the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet-based control system able to meet the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and implemented to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The hardware comprises a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP); the software comprises the WinCC flexible program used for the SCADA (supervisory control and data acquisition) screens and the SIMATIC MANAGER package ("STEP7") used for the hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Enterprise-wide worklist management.

    PubMed

    Locko, Roberta C; Blume, Hartwig; Goble, John C

    2002-01-01

    Radiologists in multi-facility health care delivery networks must serve not only their own departments but also departments of associated clinical facilities. We describe our experience with a picture archiving and communication system (PACS) implementation that provides a dynamic view of relevant radiological workload across multiple facilities. We implemented a distributed query system that permits management of enterprise worklists based on modality, body part, exam status, and other criteria that span multiple compatible PACSs. Dynamic worklists, with less flexibility, can be constructed if the incompatible PACSs support specific DICOM functionality. Enterprise-wide worklists were implemented across the Generations Plus/Northern Manhattan Health Network, linking the radiology departments of three hospitals (Harlem, Lincoln, and Metropolitan) with 1465 beds and 4260 ambulatory patients per day. Enterprise-wide, dynamic worklist management improves utilization of radiologists and enhances the quality of care across large multi-facility health care delivery organizations. Integration of other workflow-related components remains a significant challenge.

  14. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary. Program title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 2 520 49 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. 
Solution method: Use of a physical quantum random number generator and an on-line service providing access to a source of true random numbers generated by a quantum random number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the used source. This increases the speed of the random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of functions for generating pseudo-random numbers provided in Mathematica. Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. 
The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers. Running time: Depends on the used source of randomness and the amount of random data used in the experiment. References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters, Vol. 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
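
    As a rough illustration of what "generating random quantum states" means here, the sketch below draws a density matrix from the Ginibre ensemble. In TRQS the underlying normal variates would be derived from a quantum randomness source (the Quantis hardware or the on-line service); Python's `random` module merely stands in for that source in this sketch.

```python
import random

def random_density_matrix(d, rng=random):
    """Random density matrix from the Ginibre ensemble:
    rho = G G^dagger / Tr(G G^dagger), where G is a d x d matrix of
    complex standard normals. Any object with a gauss(mu, sigma)
    method can serve as the randomness source `rng`."""
    G = [[complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(d)]
         for _ in range(d)]
    # M = G G^dagger is Hermitian and positive semidefinite by construction
    M = [[sum(G[i][k] * G[j][k].conjugate() for k in range(d))
          for j in range(d)] for i in range(d)]
    tr = sum(M[i][i].real for i in range(d))
    # normalizing by the trace yields a valid density matrix
    return [[M[i][j] / tr for j in range(d)] for i in range(d)]
```

    The resulting matrix is Hermitian, positive semidefinite, and has unit trace, i.e. it is a valid quantum state regardless of which randomness source supplied the normals.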

  15. A matrix-algebraic formulation of distributed-memory maximal cardinality matching algorithms in bipartite graphs

    DOE PAGES

    Azad, Ariful; Buluç, Aydın

    2016-05-16

    We describe parallel algorithms for computing maximal cardinality matching in a bipartite graph on distributed-memory systems. Unlike traditional algorithms that match one vertex at a time, our algorithms process many unmatched vertices simultaneously using a matrix-algebraic formulation of maximal matching. This generic matrix-algebraic framework is used to develop three efficient maximal matching algorithms with minimal changes. The newly developed algorithms have two benefits over existing graph-based algorithms. First, unlike existing parallel algorithms, the cardinality of the matching obtained by the new algorithms stays constant with increasing processor counts, which is important for predictable and reproducible performance. Second, relying on bulk-synchronous matrix operations, these algorithms expose a higher degree of parallelism on distributed-memory platforms than existing graph-based algorithms. We report high-performance implementations of three maximal matching algorithms using hybrid OpenMP-MPI and evaluate the performance of these algorithms using more than 35 real and randomly generated graphs. On real instances, our algorithms achieve up to 200× speedup on 2048 cores of a Cray XC30 supercomputer. Even higher speedups are obtained on larger synthetically generated graphs, where our algorithms show good scaling on up to 16,384 cores.
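
    At a very high level, the "many vertices at once" idea can be sketched with explicit proposal and grant phases per bulk-synchronous round. This toy version uses Python dictionaries rather than the paper's distributed sparse-matrix primitives, and the names are illustrative.

```python
def maximal_matching(adj):
    """Round-based maximal matching in a bipartite graph.
    `adj` maps each row vertex to its set of column neighbours. In each
    bulk-synchronous round, every unmatched row proposes to one unmatched
    neighbour and every column grants exactly one proposal, loosely
    mimicking the matrix-algebraic (SpMV-style) formulation."""
    match_row, match_col = {}, {}
    while True:
        # proposal phase: all unmatched rows act "simultaneously"
        proposals = {}
        for u, nbrs in adj.items():
            if u in match_row:
                continue
            free = sorted(v for v in nbrs if v not in match_col)
            if free:
                proposals.setdefault(free[0], []).append(u)
        if not proposals:
            # no unmatched row has an unmatched neighbour: maximal
            return match_row
        # grant phase: each column accepts one proposer
        for v, us in proposals.items():
            u = min(us)  # deterministic tie-break
            match_row[u] = v
            match_col[v] = u
```

    The loop terminates only when no unmatched row vertex has an unmatched neighbour, which is precisely the maximality condition (though, as with any maximal matching, the result need not have maximum cardinality).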

  16. SP2Bench: A SPARQL Performance Benchmark

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  17. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-01

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz-matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^-5. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
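
    Toeplitz-matrix hashing, the extractor named above, multiplies the raw bit string by a random Toeplitz matrix over GF(2). A bit-level sketch at tiny sizes (pure Python, nothing like the 80 Gb × 45.6 Mb matrix of the experiment):

```python
def toeplitz_hash(bits, seed, m):
    """Multiply the raw bit string by an m x n Toeplitz matrix over GF(2).
    A Toeplitz matrix is constant along each diagonal, so it is fully
    defined by its first column and first row, packed into `seed`
    (length m + n - 1): entry T[i][j] = seed[i - j + n - 1]."""
    n = len(bits)
    assert len(seed) == m + n - 1
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            acc ^= seed[i - j + n - 1] & bits[j]  # GF(2): AND then XOR
        out.append(acc)
    return out
```

    Compressing n raw bits down to m < n output bits with a seeded Toeplitz matrix is a standard two-universal hashing construction; the leftover hash lemma is what links the compression ratio to the certified min-entropy and the failure probability.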

  18. Bayesian tomography by interacting Markov chains

    NASA Astrophysics Data System (ADS)

    Romary, T.

    2017-12-01

    In seismic tomography, we seek to determine the velocity of the underground from noisy first-arrival travel time observations. In most situations, this is an ill-posed inverse problem that admits several imperfect solutions. Given an a priori distribution over the parameters of the velocity model, the Bayesian formulation allows us to state this problem as a probabilistic one, with a solution in the form of a posterior distribution. The posterior distribution is generally high dimensional and may exhibit multimodality. Moreover, as it is known only up to a constant, the only sensible way to address this problem is to try to generate simulations from the posterior. The natural tools to perform these simulations are Markov chain Monte Carlo (MCMC) methods. Classical implementations of MCMC algorithms generally suffer from slow mixing: the generated states are slow to enter the stationary regime, that is, to fit the observations, and when one mode of the posterior is eventually identified, it may become difficult to visit others. Using a varying temperature parameter that relaxes the constraint on the data may help to enter the stationary regime. Besides, the sequential nature of MCMC makes it ill suited to parallel implementation. Running a large number of chains in parallel may be suboptimal, as the information gathered by each chain is not mutualized. Parallel tempering (PT) can be seen as a first attempt to make parallel chains at different temperatures communicate, but they only exchange information between current states. In this talk, I will show that PT actually belongs to a general class of interacting Markov chain algorithms. I will also show that this class makes it possible to design interacting schemes that can take advantage of the whole history of the chain, by authorizing exchanges toward already visited states. The algorithms will be illustrated with toy examples and an application to first-arrival traveltime tomography.
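
    A minimal interacting-chains sketch in the PT spirit is shown below on a toy bimodal target. The temperatures, proposal scale, and target are illustrative, and this is the classical current-state swap move, not the history-aware schemes discussed in the talk.

```python
import math
import random

def parallel_tempering(logpost, temps, steps, x0, rng):
    """Toy parallel tempering: one Metropolis chain per temperature,
    with a neighbour swap proposal after every sweep. Chain k targets
    the tempered density proportional to exp(logpost(x) / temps[k])."""
    xs = [x0] * len(temps)
    cold_samples = []
    for _ in range(steps):
        # local Metropolis move in each chain
        for k, T in enumerate(temps):
            prop = xs[k] + rng.gauss(0, 1)
            if math.log(rng.random()) < (logpost(prop) - logpost(xs[k])) / T:
                xs[k] = prop
        # swap proposal between adjacent temperatures:
        # log-acceptance = (beta_k - beta_{k+1}) * (logpost(x_{k+1}) - logpost(x_k))
        for k in range(len(temps) - 1):
            a = (logpost(xs[k + 1]) - logpost(xs[k])) * (1 / temps[k] - 1 / temps[k + 1])
            if math.log(rng.random()) < a:
                xs[k], xs[k + 1] = xs[k + 1], xs[k]
        cold_samples.append(xs[0])
    return cold_samples

# Toy bimodal posterior: mixture of unit Gaussians centred at -5 and +5.
def logpost(x):
    return math.log(math.exp(-0.5 * (x - 5) ** 2) + math.exp(-0.5 * (x + 5) ** 2))
```

    The hot chains flatten the target enough to cross the valley between the modes, and the swap moves then feed those crossings down to the cold chain, which is exactly the slow-mixing remedy the abstract alludes to.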

  19. Auction-based distributed efficient economic operations of microgrid systems

    NASA Astrophysics Data System (ADS)

    Zou, Suli; Ma, Zhongjing; Liu, Xiangdong

    2014-12-01

    This paper studies the economic operation of a microgrid in a distributed way, such that the operational schedule of each unit (generators, load units, storage units, etc.) in the microgrid system is implemented by an autonomous agent. We apply and generalise the progressive second price (PSP) auction mechanism, proposed by Lazar and Semret to efficiently allocate divisible network resources. In the economic operation of microgrid systems, generators act as sellers supplying energy and load units act as buyers consuming energy, while a storage unit (battery, supercapacitor, etc.) may transition between buyer and seller: it is a buyer when it charges and becomes a seller when it discharges. Furthermore, in a connected mode, each individual unit competes not only against the other individual units in the microgrid but also against the exogenous main grid, which has a fixed electricity price and infinite trade capacity; that is, the auctioneer assigns electricity among all individual units and the main grid with respect to the submitted bid strategies of all individual units in the microgrid in an economic way. Due to these distinct characteristics, the underlying auction games are distinct from those studied in the literature. We show that, under mild conditions, the efficient economic operation strategy is a Nash equilibrium (NE) of the PSP auction games, and we propose a distributed algorithm under which the system converges to an NE. We also show that the performance of the worst NE can be bounded with respect to the system parameters, such as the energy trading price with the main grid, and that, based on this bound, the implemented NE is unique and efficient under some conditions.

  20. Design of fast signal processing readout front-end electronics implemented in CMOS 40 nm technology

    NASA Astrophysics Data System (ADS)

    Kleczek, Rafal

    2016-12-01

    The author presents considerations on the design of fast readout front-end electronics implemented in a CMOS 40 nm technology with an emphasis on the system dead time, noise performance and power dissipation. The designed processing channel consists of a charge sensitive amplifier with different feedback types (Krummenacher, resistive and constant current blocks), a threshold setting block, a discriminator and a counter with logic circuitry. The results of schematic and post-layout simulations with randomly generated input pulses in a time domain according to the Poisson distribution are presented and analyzed. Dead time below 20 ns is possible while keeping noise ENC ≈ 90 e- for a detector capacitance CDET = 160 fF.
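
    The Poisson pulse train used in the simulations can be reproduced generically: exponential inter-arrival times give a homogeneous Poisson process, and a simple non-paralyzable dead-time model shows why sub-20 ns dead time matters at high hit rates. The rates and the loss model here are illustrative, not the paper's simulation setup.

```python
import random

def poisson_arrivals(rate_hz, duration_s, rng):
    """Hit arrival times with exponential inter-arrival gaps, i.e. a
    homogeneous Poisson process of the given mean rate."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t > duration_s:
            return times
        times.append(t)

def count_lost(times, dead_time_s):
    """Count hits falling inside the non-paralyzable dead time that
    follows each registered hit (an illustrative pile-up model, not
    the simulated front-end itself)."""
    lost, last = 0, None
    for t in times:
        if last is not None and t - last < dead_time_s:
            lost += 1          # arrives while the channel is still busy
        else:
            last = t           # registered; dead time restarts
    return lost
```

    For a non-paralyzable model, the expected loss fraction is roughly rate × dead_time, so a 20 ns dead time at a 1 MHz hit rate loses about 2% of hits, while a 200 ns dead time would lose about 20%.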

  1. Examining Troughs in the Mass Distribution of All Theoretically Possible Tryptic Peptides

    PubMed Central

    Nefedov, Alexey V.; Mitra, Indranil; Brasier, Allan R.; Sadygov, Rovshan G.

    2011-01-01

    This work describes the mass distribution of all theoretically possible tryptic peptides made of the 20 amino acids, up to a mass of 3 kDa, with a resolution of 0.001 Da. We characterize regions between the peaks of the distribution, including gaps (forbidden zones) and low-populated areas (quiet zones). We show how the gaps shrink over the mass range, and when they completely disappear. We demonstrate that peptide compositions in quiet zones are less diverse than those in the peaks of the distribution, and that by eliminating certain types of unrealistic compositions the gaps in the distribution may be increased. The mass distribution is generated using a parallel implementation of a recursive procedure that enumerates all amino acid compositions. It allows us to enumerate all compositions of tryptic peptides below 3 kDa in 48 minutes using a computer cluster with 12 Intel Xeon X5650 CPUs (72 cores). The results of this work can be used to facilitate protein identification and mass defect labeling in mass spectrometry-based proteomics experiments. PMID:21780838
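
    The recursive enumeration can be sketched in miniature. Here only two residue masses are used (the glycine and alanine monoisotopic residue masses) with a small mass cap, instead of the full 20-amino-acid alphabet and the 3 kDa limit; the function name is illustrative.

```python
def enumerate_masses(residues, max_mass):
    """Recursively enumerate all amino acid compositions with total
    residue mass <= max_mass, returning (composition, mass) pairs.
    `residues` is a list of (symbol, residue mass in Da) pairs; the
    recursion branches on the count of each residue in turn."""
    out = []
    def recurse(idx, counts, mass):
        if idx == len(residues):
            out.append((tuple(counts), mass))
            return
        _, m = residues[idx]
        k = 0
        while mass + k * m <= max_mass:  # prune whole subtrees by mass
            recurse(idx + 1, counts + [k], mass + k * m)
            k += 1
    recurse(0, [], 0.0)
    return out

# Toy alphabet: glycine and alanine monoisotopic residue masses (Da).
compositions = enumerate_masses([("G", 57.02146), ("A", 71.03711)], 200.0)
```

    Binning the resulting masses at 0.001 Da resolution then yields the peak-and-gap structure the paper studies; the mass-bound pruning is what keeps the full 20-residue enumeration tractable, and the independent branches of the recursion are what the paper parallelizes across cores.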

  2. Multi-user distribution of polarization entangled photon pairs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trapateau, J.; Orieux, A.; Diamanti, E.

    We experimentally demonstrate multi-user distribution of polarization entanglement using commercial telecom wavelength division demultiplexers. The entangled photon pairs are generated from a broadband source based on spontaneous parametric down conversion in a periodically poled lithium niobate crystal using a double-path setup employing a Michelson interferometer and active phase stabilisation. We test and compare demultiplexers based on various technologies and analyze the effect of their characteristics, such as losses and polarization dependence, on the quality of the distributed entanglement for three channel pairs of each demultiplexer. In all cases, we obtain a Bell inequality violation, whose value depends on the demultiplexer features. This demonstrates that entanglement can be distributed to at least three user pairs of a network from a single source. Additionally, we verify for the best demultiplexer that the violation is maintained when the pairs are distributed over a total channel attenuation corresponding to 20 km of optical fiber. These techniques are therefore suitable for resource-efficient practical implementations of entanglement-based quantum key distribution and other quantum communication network applications.

  3. Next Generation Solvent (NGS): Development for Caustic-Side Solvent Extraction of Cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, Bruce A.; Birdwell, Jr, Joseph F.; Bonnesen, Peter V.

    This report summarizes the FY 2010 and 2011 accomplishments at Oak Ridge National Laboratory (ORNL) in developing the Next Generation Caustic-Side Solvent Extraction (NG-CSSX) process, referred to commonly as the Next Generation Solvent (NGS), under funding from the U.S. Department of Energy, Office of Environmental Management (DOE-EM), Office of Technology Innovation and Development. The primary product of this effort is a process solvent and preliminary flowsheet capable of meeting a target decontamination factor (DF) of 40,000 for worst-case Savannah River Site (SRS) waste with a concentration factor of 15 or higher in the 18-stage equipment configuration of the SRS Modular Caustic-Side Solvent Extraction Unit (MCU). In addition, the NG-CSSX process may be readily adapted for use in the SRS Salt Waste Processing Facility (SWPF) or in supplemental tank-waste treatment at Hanford upon appropriate solvent or flowsheet modifications. Efforts in FY 2010 focused on developing a solvent composition and process flowsheet for MCU implementation. In FY 2011 accomplishments at ORNL involved a wide array of chemical-development activities and testing up through single-stage hydraulic and mass-transfer tests in 5-cm centrifugal contactors. Under subcontract from ORNL, Argonne National Laboratory (ANL) designed a preliminary flowsheet using ORNL cesium distribution data, and Tennessee Technological University confirmed a chemical model for cesium distribution ratios (DCs) as a function of feed composition. Interlaboratory efforts were coordinated with complementary engineering tests carried out (and reported separately) by personnel at Savannah River National Laboratory (SRNL) and Savannah River Remediation (SRR) with helpful advice by Parsons Engineering and General Atomics on aspects of possible SWPF implementation.

  4. Next Generation Solvent Development for Caustic-Side Solvent Extraction of Cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, Bruce A.; Birdwell, Joseph F.; Bonnesen, Peter V.

    This report summarizes the FY 2010 and 2011 accomplishments at Oak Ridge National Laboratory (ORNL) in developing the Next Generation Caustic-Side Solvent Extraction (NG-CSSX) process, referred to commonly as the Next Generation Solvent (NGS), under funding from the U.S. Department of Energy, Office of Environmental Management (DOE-EM), Office of Technology Innovation and Development. The primary product of this effort is a process solvent and preliminary flowsheet capable of meeting a target decontamination factor (DF) of 40,000 for worst-case Savannah River Site (SRS) waste with a concentration factor of 15 or higher in the 18-stage equipment configuration of the SRS Modular Caustic-Side Solvent Extraction Unit (MCU). In addition, the NG-CSSX process may be readily adapted for use in the SRS Salt Waste Processing Facility (SWPF) or in supplemental tank-waste treatment at Hanford upon appropriate solvent or flowsheet modifications. Efforts in FY 2010 focused on developing a solvent composition and process flowsheet for MCU implementation. In FY 2011 accomplishments at ORNL involved a wide array of chemical-development activities and testing up through single-stage hydraulic and mass-transfer tests in 5-cm centrifugal contactors. Under subcontract from ORNL, Argonne National Laboratory (ANL) designed a preliminary flowsheet using ORNL cesium distribution data, and Tennessee Technological University confirmed a chemical model for cesium distribution ratios (DCs) as a function of feed composition. Interlaboratory efforts were coordinated with complementary engineering tests carried out (and reported separately) by personnel at Savannah River National Laboratory (SRNL) and Savannah River Remediation (SRR) with helpful advice by Parsons Engineering and General Atomics on aspects of possible SWPF implementation.

  5. The Effects of Implementing TopModel Concepts in the Noah Model

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, C. D.; Houser, Paul R. (Technical Monitor)

    2002-01-01

    Topographic effects on runoff generation have been documented observationally (e.g., Dunne and Black, 1970) and are the subject of the physically based rainfall-runoff model TOPMODEL (Beven and Kirkby, 1979; Beven, 1986a,b) and its extensions, which incorporate variable soil transmissivity effects (Sivapalan et al., 1987; Wood et al., 1988; 1990). These effects have been shown to exert significant control over the spatial distribution of runoff, soil moisture and evapotranspiration, and by extension, the latent and sensible heat fluxes.
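    The topographic control described above can be illustrated with TOPMODEL's topographic wetness index ln(a / tan β); the sketch below (plain Python with illustrative values, not part of the cited study) shows how a large contributing area and a gentle slope mark the saturation-prone cells that dominate runoff generation.

```python
import math

def topographic_wetness_index(upslope_area, slope_rad):
    """TOPMODEL's topographic index ln(a / tan(beta)): cells with a large
    specific upslope contributing area a and a gentle local slope beta
    saturate first. A sketch of the index only, not of the full model."""
    return [math.log(a / math.tan(b)) for a, b in zip(upslope_area, slope_rad)]

# a hollow (large area, gentle slope) versus a ridge (small area, steep slope)
twi = topographic_wetness_index([500.0, 5.0], [0.05, 0.5])
```

The hollow gets a much larger index, consistent with saturation-excess runoff appearing there first.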

  6. Quantum cryptography with entangled photons

    PubMed

    Jennewein; Simon; Weihs; Weinfurter; Zeilinger

    2000-05-15

    By realizing a quantum cryptography system based on polarization entangled photon pairs we establish highly secure keys, because a single photon source is approximated and the inherent randomness of quantum measurements is exploited. We implement a novel key distribution scheme using Wigner's inequality to test the security of the quantum channel, and, alternatively, realize a variant of the BB84 protocol. Our system has two completely independent users separated by 360 m, and generates raw keys at rates of 400-800 bits/s with bit error rates around 3%.
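    The key-sifting step common to BB84-style schemes can be sketched as a toy simulation (not the entangled-photon system described above; no eavesdropper or channel noise is modeled). Roughly half the transmitted bits survive sifting, since the bases agree with probability 1/2:

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84-style sifting: Alice sends random bits in random bases,
    Bob measures in random bases, and only matching-basis rounds
    contribute to the sifted key (ideal, noise-free channel)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    sifted_a, sifted_b = [], []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:        # bases agree: Bob's outcome equals Alice's bit
            sifted_a.append(bit)
            sifted_b.append(bit)
    return sifted_a, sifted_b

key_a, key_b = bb84_sift(1000)
```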

  7. Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2007-01-01

    This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and the secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both terrestrial and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.

  8. Soil Bacteria and Fungi Respond on Different Spatial Scales to Invasion by the Legume Lespedeza cuneata

    DTIC Science & Technology

    2011-05-24

    …of community similarity (Legendre and Legendre 1998). Permutational Multivariate Analysis of Variance (PerMANOVA) (McArdle… null hypothesis can be rejected with a type I error rate of α. We used an implementation of PerMANOVA that involved sequential removal… TEXTURE, and HABITAT. The null distribution for PerMANOVA tests for site-scale effects was generated using a restricted…

  9. Approximate Green's function methods for HZE transport in multilayered materials

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Shinn, Judy L.; Costen, Robert C.

    1993-01-01

    A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport in multilayered materials. The code is established to operate on the Langley nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code was found to be highly efficient and compared well with the perturbation approximation.

  10. MODIS. Volume 1: MODIS level 1A software baseline requirements

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  11. Method to implement the CCD timing generator based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Song, Qian; He, Chun; Jin, Jianhui; He, Lin

    2010-07-01

    With the advance of FPGA technology, the design methodology of digital systems is changing. In recent years we have developed a method to implement the CCD timing generator based on FPGA and VHDL. This paper presents the principles and implementation skills of the method. Taking a developed camera as an example, we introduce the structure and the input and output clocks/signals of a timing generator implemented in the camera. The generator is composed of a top module and a bottom module. The bottom one is made up of 4 sub-modules which correspond to 4 different operation modes. The modules are implemented by 5 VHDL programs. Frame charts of the architecture of these programs are shown in the paper. We also describe the implementation steps of the timing generator in Quartus II, and the interconnections between the generator and a Nios soft-core processor which is the controller of this generator. Some test results are presented at the end.

  12. The challenge of transferring an implementation strategy from academia to the field: a process evaluation of local quality improvement collaboratives in Dutch primary care using the normalization process theory.

    PubMed

    Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy

    2014-12-01

    A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.

  13. A Comprehensive Comparison of Multiparty Secure Additions with Differential Privacy

    PubMed Central

    Goryczka, Slawomir; Xiong, Li

    2016-01-01

    This paper considers the problem of secure data aggregation (mainly summation) in a distributed setting, while ensuring differential privacy of the result. We study secure multiparty addition protocols using well known security schemes: Shamir’s secret sharing, perturbation-based, and various encryptions. We supplement our study with our new enhanced encryption scheme EFT, which is efficient and fault tolerant. Differential privacy of the final result is achieved by either distributed Laplace or Geometric mechanism (respectively DLPA or DGPA), while approximated differential privacy is achieved by diluted mechanisms. Distributed random noise is generated collectively by all participants, which draw random variables from one of several distributions: Gamma, Gauss, Geometric, or their diluted versions. We introduce a new distributed privacy mechanism with noise drawn from the Laplace distribution, which achieves smaller redundant noise with efficiency. We compare complexity and security characteristics of the protocols with different differential privacy mechanisms and security schemes. More importantly, we implemented all protocols and present an experimental comparison on their performance and scalability in a real distributed environment. Based on the evaluations, we identify our security scheme and Laplace DLPA as the most efficient for secure distributed data aggregation with privacy. PMID:28919841

  14. A Comprehensive Comparison of Multiparty Secure Additions with Differential Privacy.

    PubMed

    Goryczka, Slawomir; Xiong, Li

    2017-01-01

    This paper considers the problem of secure data aggregation (mainly summation) in a distributed setting, while ensuring differential privacy of the result. We study secure multiparty addition protocols using well known security schemes: Shamir's secret sharing, perturbation-based, and various encryptions. We supplement our study with our new enhanced encryption scheme EFT, which is efficient and fault tolerant. Differential privacy of the final result is achieved by either distributed Laplace or Geometric mechanism (respectively DLPA or DGPA), while approximated differential privacy is achieved by diluted mechanisms. Distributed random noise is generated collectively by all participants, which draw random variables from one of several distributions: Gamma, Gauss, Geometric, or their diluted versions. We introduce a new distributed privacy mechanism with noise drawn from the Laplace distribution, which achieves smaller redundant noise with efficiency. We compare complexity and security characteristics of the protocols with different differential privacy mechanisms and security schemes. More importantly, we implemented all protocols and present an experimental comparison on their performance and scalability in a real distributed environment. Based on the evaluations, we identify our security scheme and Laplace DLPA as the most efficient for secure distributed data aggregation with privacy.

  15. Design and implementation of co-operative control strategy for hybrid AC/DC microgrids

    NASA Astrophysics Data System (ADS)

    Mahmud, Rasel

    This thesis is mainly divided into two major sections: 1) Modeling and control of AC microgrid, DC microgrid, and hybrid AC/DC microgrid using distributed co-operative control, and 2) Development of a four bus laboratory prototype of an AC microgrid system. At first, a distributed cooperative control (DCC) for a DC microgrid considering the state-of-charge (SoC) of the batteries in a typical plug-in-electric-vehicle (PEV) is developed. In DC microgrids, this methodology assists load sharing amongst the distributed generation units (DGs) according to their ratings, with improved voltage regulation. Subsequently, a DCC based control algorithm for AC microgrid is also investigated to improve the performance of AC microgrid in terms of power sharing among the DGs, voltage regulation and frequency deviation. The results validate the advantages of the proposed methodology as compared to traditional droop control of AC microgrid. The DCC-based control methodologies for the AC and DC microgrids are further expanded to develop a DCC-based power management algorithm for hybrid AC/DC microgrid. The developed algorithm for hybrid microgrid controls the power flow through the interfacing converter (IC) between the AC and DC microgrids. This will facilitate the power sharing between the DGs according to their power ratings. Moreover, it enables the fixed scheduled power delivery at different operating conditions, while maintaining good voltage regulation and improved frequency profile. The second section provides a detailed explanation and step-by-step design and development of an AC/DC microgrid testbed. Controllers for the three-phase inverters are designed and tested on different generation units along with their corresponding inductor-capacitor-inductor (LCL) filters to eliminate the switching frequency harmonics. Electric power distribution line models are developed to form the microgrid network topology.
Voltage and current sensors are placed in the proper positions to achieve a full visibility over the microgrid. A running average filter (RAF) based enhanced phase-locked-loop (EPLL) is designed and implemented to extract frequency and phase angle information. A PLL-based synchronizing scheme is also developed to synchronize the DGs to the microgrid. The developed laboratory prototype runs on dSpace platform for real time data acquisition, communication and controller implementation.
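    The traditional droop baseline that the thesis compares against can be sketched as a steady-state calculation (assumed droop coefficient and ratings, illustrative only): all units settle at a common frequency, so load splits in proportion to their ratings.

```python
def droop_dispatch(load_kw, ratings_kw, f_nom=60.0, droop=0.05):
    """Conventional P-f droop: each DG i follows
    f = f_nom - droop * f_nom * P_i / P_rated_i. At steady state every
    unit sees the same frequency, so P_i is proportional to its rating.
    Returns (steady-state frequency, per-DG outputs in kW)."""
    total_rating = sum(ratings_kw)
    f = f_nom - droop * f_nom * load_kw / total_rating
    outputs = [r * (f_nom - f) / (droop * f_nom) for r in ratings_kw]
    return f, outputs

f, p = droop_dispatch(90.0, [100.0, 50.0])
# a 100 kW and a 50 kW unit share a 90 kW load 2:1, at a slightly drooped frequency
```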

  16. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  17. GOTHIC: Gravitational oct-tree code accelerated by hierarchical time step controlling

    NASA Astrophysics Data System (ADS)

    Miki, Yohei; Umemura, Masayuki

    2017-04-01

    The tree method is a widely implemented algorithm for collisionless N-body simulations in astrophysics, well suited for GPUs. Adopting hierarchical time stepping can accelerate N-body simulations; however, it is infrequently implemented and its potential remains untested in GPU implementations. We have developed a Gravitational Oct-Tree code accelerated by HIerarchical time step Controlling named GOTHIC, which adopts both the tree method and the hierarchical time step. The code adopts some adaptive optimizations by monitoring the execution time of each function on-the-fly and minimizes the time-to-solution by balancing the measured time of multiple functions. Results of performance measurements with realistic particle distribution performed on NVIDIA Tesla M2090, K20X, and GeForce GTX TITAN X, which are representative GPUs of the Fermi, Kepler, and Maxwell generations of GPUs, show that the hierarchical time step achieves a speedup by a factor of around 3-5 times compared to the shared time step. The measured elapsed time per step of GOTHIC is 0.30 s or 0.44 s on GTX TITAN X when the particle distribution represents the Andromeda galaxy or the NFW sphere, respectively, with 2^24 = 16,777,216 particles. The averaged performance of the code corresponds to 10-30% of the theoretical single precision peak performance of the GPU.
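    The hierarchical (block) time-step idea can be sketched generically: each particle is assigned the largest power-of-two subdivision of the maximum step that does not exceed its locally required step, so slow particles are not forced onto the smallest step. This is a sketch of the scheme itself, not GOTHIC's GPU implementation:

```python
def block_timestep_levels(dt_req, dt_max, n_levels=8):
    """Assign each particle the shallowest power-of-two level L such that
    dt_max / 2**L <= its required time step (capped at n_levels - 1).
    Particles at level L are advanced with step dt_max / 2**L."""
    levels = []
    for dt in dt_req:
        lvl = 0
        while dt_max / 2 ** lvl > dt and lvl < n_levels - 1:
            lvl += 1
        levels.append(lvl)
    return levels

# one slow, one intermediate, and one fast particle
levels = block_timestep_levels([0.9, 0.3, 0.01], dt_max=1.0)
```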

  18. Analytical modelling of Halbach linear generator incorporating pole shifting and piece-wise spring for ocean wave energy harvesting

    NASA Astrophysics Data System (ADS)

    Tan, Yimin; Lin, Kejian; Zu, Jean W.

    2018-05-01

    Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators for its unique properties. This paper has proposed a generalized analytical model for linear generators. The slotted stator pole-shifting and implementation of Halbach array have been combined for the first time. Initially, the magnetization components of the Halbach array have been determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution has been derived employing specially treated boundary conditions. FEM analysis has been conducted to verify the analytical model. A slotted linear PM generator with Halbach PM has been constructed to validate the model and further improved using piece-wise springs to trigger full range reciprocating motion. A dynamic model has been developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool in development and optimization of Halbach PM generator. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.

  19. The Canadian National EMS Research Agenda: Impact and Feasibility of Implementation of Previously Generated Recommendations.

    PubMed

    Jensen, J L; Blanchard, I E; Bigham, B L; Carter, Aje; Brown, R; Socha, D; Brown, L H; Travers, A H; Craig, A M; Morrison, L J

    2015-09-01

    A recent mixed-methods study on the state of emergency medical services (EMS) research in Canada led to the generation of nineteen actionable recommendations. As part of the dissemination plan, a survey was distributed to EMS stakeholders to determine the anticipated impact and feasibility of implementing these recommendations in Canadian systems. An online survey explored both the implementation impact and feasibility for each recommendation using a five-point scale. The sample consisted of participants from the Canadian National EMS Research Agenda study (published in 2013) and additional EMS research stakeholders identified through snowball sampling. Responses were analysed descriptively using medians and plotted on a matrix. Participants reported any planned or ongoing initiatives related to the recommendations, and required or anticipated resources. Free text responses were analysed with simple content analysis, collated by recommendation. The survey was sent to 131 people, 94 (71.8%) of whom responded: 30 EMS managers/regulators (31.9%), 22 researchers (23.4%), 15 physicians (16.0%), 13 educators (13.8%), and 5 EMS providers (5.3%). Two recommendations (11%) had a median impact score of 4 (of 5) and feasibility score of 4 (of 5). Eight recommendations (42%) had an impact score of 5, with a feasibility score of 3. Nine recommendations (47%) had an impact score of 4 and a feasibility score of 3. For most recommendations, participants scored the anticipated impact higher than the feasibility to implement. Ongoing or planned initiatives exist pertaining to all recommendations except one. All of the recommendations will require additional resources to implement.

  20. Generating moment matching scenarios using optimization techniques

    DOE PAGES

    Mehrotra, Sanjay; Papp, Dávid

    2013-05-16

    An optimization based method is proposed to generate moment matching scenarios for numerical integration and its use in stochastic programming. The main advantage of the method is its flexibility: it can generate scenarios matching any prescribed set of moments of the underlying distribution rather than matching all moments up to a certain order, and the distribution can be defined over an arbitrary set. This allows for a reduction in the number of scenarios and allows the scenarios to be better tailored to the problem at hand. The method is based on a semi-infinite linear programming formulation of the problem that is shown to be solvable with polynomial iteration complexity. A practical column generation method is implemented. The column generation subproblems are polynomial optimization problems; however, they need not be solved to optimality. It is found that the columns in the column generation approach can be efficiently generated by random sampling. The number of scenarios generated matches a lower bound of Tchakaloff's. The rate of convergence of the approximation error is established for continuous integrands, and an improved bound is given for smooth integrands. Extensive numerical experiments are presented in which variants of the proposed method are compared to Monte Carlo and quasi-Monte Carlo methods on both numerical integration problems and stochastic optimization problems. The benefits of being able to match any prescribed set of moments, rather than all moments up to a certain order, are also demonstrated using optimization problems with 100-dimensional random vectors. Here, empirical results show that the proposed approach outperforms Monte Carlo and quasi-Monte Carlo based approaches on the tested problems.
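    A minimal moment-matching sketch, for intuition only: sampled scenarios are rescaled by an affine map so the sample mean and standard deviation exactly equal prescribed values. This matches only the first two moments; the paper's method matches an arbitrary prescribed set of moments via a semi-infinite LP with column generation.

```python
import random

def moment_matched_scenarios(n, target_mean, target_std, seed=0):
    """Draw n scenarios, then rescale so that the *sample* mean and
    (population) standard deviation exactly equal the prescribed values.
    First-two-moments matching only; illustrative, not the paper's method."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    m = sum(xs) / n
    s = (sum((x - m) ** 2 for x in xs) / n) ** 0.5
    return [target_mean + target_std * (x - m) / s for x in xs]

scen = moment_matched_scenarios(50, target_mean=5.0, target_std=2.0)
```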

  1. Implementation notes on bdes(1) [data encryption implementation]

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1991-01-01

    This note describes the implementation of bdes, the file encryption program being distributed in the 4.4 release of the Berkeley Software Distribution. It implements all modes of operation of the Data Encryption Standard (DES).
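    The "modes of operation" a tool like bdes wraps around the DES primitive can be illustrated with CBC chaining. The toy cipher below is a single XOR with the key (emphatically not DES); the chaining plumbing is the point, and it makes identical plaintext blocks encrypt differently:

```python
def xor_block(block, key):
    # toy "block cipher": XOR with the key (stand-in for DES, for illustration)
    return bytes(a ^ b for a, b in zip(block, key))

def cbc_encrypt(blocks, key, iv):
    """CBC mode: each plaintext block is XORed with the previous
    ciphertext block (the IV for the first) before 'encryption'."""
    out, prev = [], iv
    for b in blocks:
        c = xor_block(xor_block(b, prev), key)
        out.append(c)
        prev = c
    return out

def cbc_decrypt(cipher_blocks, key, iv):
    out, prev = [], iv
    for c in cipher_blocks:
        out.append(xor_block(xor_block(c, key), prev))
        prev = c
    return out

key, iv = b"toy-key!", b"initvec!"
plaintext = [b"ABCDEFGH", b"ABCDEFGH"]   # two identical blocks
ciphertext = cbc_encrypt(plaintext, key, iv)
```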

  2. Analysis on Voltage Profile of Distribution Network with Distributed Generation

    NASA Astrophysics Data System (ADS)

    Shao, Hua; Shi, Yujie; Yuan, Jianpu; An, Jiakun; Yang, Jianhua

    2018-02-01

    Penetration of distributed generation has some impacts on a distribution network in load flow, voltage profile, reliability, power loss and so on. After the impacts and the typical structures of the grid-connected distributed generation are analyzed, the back/forward sweep method of the load flow calculation of the distribution network is modelled including distributed generation. The voltage profiles of the distribution network affected by the installation location and the capacity of distributed generation are thoroughly investigated and simulated. The impacts on the voltage profiles are summarized and some suggestions to the installation location and the capacity of distributed generation are given correspondingly.
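    The back/forward sweep used above can be sketched for a single radial feeder with a DG injection at one bus (per-unit values, constant-PQ loads, unity-power-factor DG; a minimal sketch under these assumptions, not the paper's model). It also reproduces the qualitative finding: a DG near the feeder end raises the local voltage.

```python
def backward_forward_sweep(v_source, line_z, loads_pq, dg_p, iters=50):
    """Minimal backward/forward sweep on one radial feeder: buses 0..n-1
    hang in a chain off the source. line_z[k] feeds bus k, loads_pq[k] is
    the (P, Q) constant-PQ load, dg_p[k] the DG active injection.
    Returns the complex per-unit bus voltages."""
    n = len(loads_pq)
    v = [complex(v_source)] * n
    for _ in range(iters):
        # backward sweep: nodal injection currents I = conj(S / V),
        # then accumulate branch currents from the feeder end
        i_node = [((complex(*loads_pq[k]) - complex(dg_p[k], 0)) / v[k]).conjugate()
                  for k in range(n)]
        i_branch, acc = [0j] * n, 0j
        for k in range(n - 1, -1, -1):
            acc += i_node[k]
            i_branch[k] = acc
        # forward sweep: update voltages from the source outwards
        v_up = complex(v_source)
        for k in range(n):
            v[k] = v_up - line_z[k] * i_branch[k]
            v_up = v[k]
    return v

z = [0.01 + 0.02j] * 3
v_no_dg = backward_forward_sweep(1.0, z, [(0.10, 0.05)] * 3, [0.0, 0.0, 0.0])
v_dg    = backward_forward_sweep(1.0, z, [(0.10, 0.05)] * 3, [0.0, 0.0, 0.15])
```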

  3. Testing Pairwise Association between Spatially Autocorrelated Variables: A New Approach Using Surrogate Lattice Data

    PubMed Central

    Deblauwe, Vincent; Kennel, Pol; Couteron, Pierre

    2012-01-01

    Background Independence between observations is a standard prerequisite of traditional statistical tests of association. This condition is, however, violated when autocorrelation is present within the data. In the case of variables that are regularly sampled in space (i.e. lattice data or images), such as those provided by remote-sensing or geographical databases, this problem is particularly acute. Because analytic derivation of the null probability distribution of the test statistic (e.g. Pearson's r) is not always possible when autocorrelation is present, we propose instead the use of a Monte Carlo simulation with surrogate data. Methodology/Principal Findings The null hypothesis that two observed mapped variables are the result of independent pattern generating processes is tested here by generating sets of random image data while preserving the autocorrelation function of the original images. Surrogates are generated by matching the dual-tree complex wavelet spectra (and hence the autocorrelation functions) of white noise images with the spectra of the original images. The generated images can then be used to build the probability distribution function of any statistic of association under the null hypothesis. We demonstrate the validity of a statistical test of association based on these surrogates with both actual and synthetic data and compare it with a corrected parametric test and three existing methods that generate surrogates (randomization, random rotations and shifts, and iterative amplitude adjusted Fourier transform). Type I error control was excellent, even with strong and long-range autocorrelation, which is not the case for alternative methods. Conclusions/Significance The wavelet-based surrogates are particularly appropriate in cases where autocorrelation appears at all scales or is direction-dependent (anisotropy). 
We explore the potential of the method for association tests involving a lattice of binary data and discuss its potential for validation of species distribution models. An implementation of the method in Java for the generation of wavelet-based surrogates is available online as supporting material. PMID:23144961
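    A close one-dimensional relative of the surrogate idea can be sketched with Fourier phase randomization, which preserves the amplitude spectrum (and hence the autocorrelation function) while destroying any pairwise association; the paper performs the analogous operation with dual-tree complex wavelets on 2-D lattices, so this is only an analogous sketch:

```python
import cmath
import random

def phase_randomized_surrogate(x, seed=0):
    """Keep the amplitude spectrum of the real series x, randomize the
    phases (conjugate-symmetrically, so the result is real). O(n^2) DFT
    for clarity; fine for a sketch."""
    n = len(x)
    rng = random.Random(seed)
    X = [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
         for k in range(n)]
    Y = X[:]
    for k in range(1, (n + 1) // 2):
        rot = cmath.exp(1j * rng.uniform(0, 2 * cmath.pi))
        Y[k] = abs(X[k]) * rot
        Y[n - k] = abs(X[k]) * rot.conjugate()
    return [sum(Y[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

_r = random.Random(1)
x = [_r.gauss(0.0, 1.0) for _ in range(32)]
y = phase_randomized_surrogate(x)
```

Preserving the amplitude spectrum means the surrogate keeps the original mean (DC term) and total energy, while the values themselves are rearranged.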

  4. High-Precision Differential Predictions for Top-Quark Pairs at the LHC

    NASA Astrophysics Data System (ADS)

    Czakon, Michal; Heymes, David; Mitov, Alexander

    2016-02-01

    We present the first complete next-to-next-to-leading order (NNLO) QCD predictions for differential distributions in the top-quark pair production process at the LHC. Our results are derived from a fully differential partonic Monte Carlo calculation with stable top quarks which involves no approximations beyond the fixed-order truncation of the perturbation series. The NNLO corrections improve the agreement between existing LHC measurements [V. Khachatryan et al. (CMS Collaboration), Eur. Phys. J. C 75, 542 (2015)] and standard model predictions for the top-quark transverse momentum distribution, thus helping alleviate one long-standing discrepancy. The shape of the top-quark pair invariant mass distribution turns out to be stable with respect to radiative corrections beyond NLO which increases the value of this observable as a place to search for physics beyond the standard model. The results presented here provide essential input for parton distribution function fits, implementation of higher-order effects in Monte Carlo generators, as well as top-quark mass and strong coupling determination.

  5. High-Precision Differential Predictions for Top-Quark Pairs at the LHC.

    PubMed

    Czakon, Michal; Heymes, David; Mitov, Alexander

    2016-02-26

    We present the first complete next-to-next-to-leading order (NNLO) QCD predictions for differential distributions in the top-quark pair production process at the LHC. Our results are derived from a fully differential partonic Monte Carlo calculation with stable top quarks which involves no approximations beyond the fixed-order truncation of the perturbation series. The NNLO corrections improve the agreement between existing LHC measurements [V. Khachatryan et al. (CMS Collaboration), Eur. Phys. J. C 75, 542 (2015)] and standard model predictions for the top-quark transverse momentum distribution, thus helping alleviate one long-standing discrepancy. The shape of the top-quark pair invariant mass distribution turns out to be stable with respect to radiative corrections beyond NLO which increases the value of this observable as a place to search for physics beyond the standard model. The results presented here provide essential input for parton distribution function fits, implementation of higher-order effects in Monte Carlo generators, as well as top-quark mass and strong coupling determination.

  6. Kmerind: A Flexible Parallel Library for K-mer Indexing of Biological Sequences on Distributed Memory Systems.

    PubMed

    Pan, Tony; Flick, Patrick; Jain, Chirag; Liu, Yongchao; Aluru, Srinivas

    2017-10-09

    Counting and indexing fixed length substrings, or k-mers, in biological sequences is a key step in many bioinformatics tasks including genome alignment and mapping, genome assembly, and error correction. While advances in next generation sequencing technologies have dramatically reduced the cost and improved latency and throughput, few bioinformatics tools can efficiently process the datasets at the current generation rate of 1.8 terabases every 3 days. We present Kmerind, a high performance parallel k-mer indexing library for distributed memory environments. The Kmerind library provides a set of simple and consistent APIs with sequential semantics and parallel implementations that are designed to be flexible and extensible. Kmerind's k-mer counter performs similarly or better than the best existing k-mer counting tools even on shared memory systems. In a distributed memory environment, Kmerind counts k-mers in a 120 GB sequence read dataset in less than 13 seconds on 1024 Xeon CPU cores, and fully indexes their positions in approximately 17 seconds. Querying for 1% of the k-mers in these indices can be completed in 0.23 seconds and 28 seconds, respectively. Kmerind is the first k-mer indexing library for distributed memory environments, and the first extensible library for general k-mer indexing and counting. Kmerind is available at https://github.com/ParBLiSS/kmerind.
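    The sequential k-mer counting semantics that Kmerind parallelizes across distributed-memory nodes can be sketched in a few lines (illustrative Python, not the Kmerind API):

```python
from collections import Counter

def count_kmers(seq, k):
    """Count all overlapping length-k substrings of seq: the sequential
    reference computation that a distributed k-mer index partitions
    across nodes."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("GATTACAGATTACA", 3)
```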

  7. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.

    PubMed

    Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
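    The underlying sampling task, drawing from a distribution over binary random variables, can be sketched with an ordinary Gibbs sampler over a Boltzmann distribution; the paper's contribution is realizing this computation with networks of LIF neurons, so the code below is only the abstract reference computation:

```python
import math
import random

def gibbs_sample_binary(weights, biases, n_steps, rng):
    """Gibbs sampling from p(z) ∝ exp(sum_i b_i z_i + sum_{i<j} W_ij z_i z_j)
    over binary z. weights must be symmetric with zero diagonal. Each sweep
    resamples every unit from its conditional sigmoid probability."""
    n = len(biases)
    z = [rng.randint(0, 1) for _ in range(n)]
    samples = []
    for _ in range(n_steps):
        for i in range(n):
            u = biases[i] + sum(weights[i][j] * z[j] for j in range(n) if j != i)
            z[i] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-u)) else 0
        samples.append(tuple(z))
    return samples

# with zero couplings and zero biases each marginal is exactly 1/2
samples = gibbs_sample_binary([[0.0, 0.0], [0.0, 0.0]], [0.0, 0.0], 20000,
                              random.Random(3))
freq = sum(s[0] for s in samples) / len(samples)
```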

  8. Compiling probabilistic, bio-inspired circuits on a field programmable analog array

    PubMed Central

    Marr, Bo; Hasler, Jennifer

    2014-01-01

    A field programmable analog array (FPAA) is presented as an energy and computational efficiency engine: a mixed mode processor for which functions can be compiled at significantly less energy costs using probabilistic computing circuits. More specifically, it will be shown that the core computation of any dynamical system can be computed on the FPAA at significantly less energy per operation than a digital implementation. A stochastic system that is dynamically controllable via voltage controlled amplifier and comparator thresholds is implemented, which computes Bernoulli random variables. From Bernoulli variables it is shown exponentially distributed random variables, and random variables of an arbitrary distribution can be computed. The Gillespie algorithm is simulated to show the utility of this system by calculating the trajectory of a biological system computed stochastically with this probabilistic hardware where over a 127X performance improvement over current software approaches is shown. The relevance of this approach is extended to any dynamical system. The initial circuits and ideas for this work were generated at the 2008 Telluride Neuromorphic Workshop. PMID:24847199
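    Building richer distributions from Bernoulli primitives, as the FPAA does in analog hardware, can be mimicked in software: assemble a uniform variate from i.i.d. fair Bernoulli bits (its binary expansion), then apply the inverse CDF to obtain an exponential variate. A hedged software sketch, not the hardware circuit:

```python
import math
import random

def uniform_from_bernoulli(bits):
    """Assemble a uniform(0, 1) variate from fair Bernoulli bits
    interpreted as its binary expansion."""
    return sum(b / 2.0 ** (i + 1) for i, b in enumerate(bits))

def exponential_from_bernoulli(rng, rate=1.0, n_bits=40):
    """Bernoulli bits -> uniform u -> inverse CDF -ln(1 - u) / rate
    gives an Exponential(rate) variate."""
    bits = [rng.randint(0, 1) for _ in range(n_bits)]
    u = uniform_from_bernoulli(bits)
    return -math.log(1.0 - u) / rate

rng = random.Random(7)
xs = [exponential_from_bernoulli(rng) for _ in range(20000)]
```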

  9. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons

    PubMed Central

    Probst, Dimitri; Petrovici, Mihai A.; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems. PMID:25729361
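The abstract computation that such networks approximate — sampling from a distribution over binary random variables — can be sketched with a plain Gibbs sampler over a Boltzmann distribution p(z) ∝ exp(Σ b_i z_i + Σ_{i<j} W_ij z_i z_j). This is a software caricature of the sampling dynamics, not the LIF neuron model of the paper:

```python
import itertools
import math
import random

def gibbs_sample(W, b, n_sweeps, rng):
    """Gibbs sampler for p(z) ∝ exp(sum_i b_i z_i + sum_{i<j} W_ij z_i z_j)
    with z_i in {0,1}; returns visit counts per state."""
    n = len(b)
    z = [0] * n
    counts = {}
    for _ in range(n_sweeps):
        for i in range(n):
            u = b[i] + sum(W[i][j] * z[j] for j in range(n) if j != i)
            z[i] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-u)) else 0
        counts[tuple(z)] = counts.get(tuple(z), 0) + 1
    return counts

rng = random.Random(0)
W = [[0.0, 1.0], [1.0, 0.0]]   # symmetric coupling
b = [-0.5, -0.5]               # biases
counts = gibbs_sample(W, b, 50000, rng)

# Exact target distribution for comparison (enumerable for 2 variables)
states = list(itertools.product([0, 1], repeat=2))
def log_p(z):
    return sum(b[i] * z[i] for i in range(2)) + W[0][1] * z[0] * z[1]
Z = sum(math.exp(log_p(s)) for s in states)
exact = {s: math.exp(log_p(s)) / Z for s in states}
```

With enough sweeps the empirical state frequencies converge to the target probabilities, which is the behavior the LIF network is engineered to reproduce in continuous time.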

  10. Protection of autonomous microgrids using agent-based distributed communication

    DOE PAGES

    Cintuglu, Mehmet H.; Ma, Tan; Mohammed, Osama A.

    2016-04-06

This study presents a real-time implementation of autonomous microgrid protection using agent-based distributed communication. Protection of an autonomous microgrid requires special considerations compared to large-scale distribution networks due to the presence of power converters and relatively low inertia. In this work, we introduce a practical overcurrent and a frequency selectivity method to overcome conventional limitations. The proposed overcurrent scheme defines a selectivity mechanism considering the remedial action scheme (RAS) of the microgrid after a fault instant based on feeder characteristics and the location of the intelligent electronic devices (IEDs). A synchrophasor-based online frequency selectivity approach is proposed to avoid pulse loading effects in low-inertia microgrids. Experimental results are presented for verification of the proposed schemes using a laboratory-based microgrid. The setup was composed of actual generation units and IEDs using the IEC 61850 protocol. The experimental results were in excellent agreement with the proposed protection scheme.

  11. Implementation of jump-diffusion algorithms for understanding FLIR scenes

    NASA Astrophysics Data System (ADS)

    Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.

    1995-07-01

Our pattern theoretic approach to the automated understanding of forward-looking infrared (FLIR) images brings the traditionally separate endeavors of detection, tracking, and recognition together into a unified jump-diffusion process. New objects are detected and object types are recognized through discrete jump moves. Between jumps, the location and orientation of objects are estimated via continuous diffusions. A hypothesized scene, simulated from the emissive characteristics of the hypothesized scene elements, is compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. The jump-diffusion process empirically generates the posterior distribution. Both the diffusion and jump operations involve the simulation of a scene produced by a hypothesized configuration. Scene simulation is most effectively accomplished by pipelined rendering engines such as Silicon Graphics hardware. We demonstrate the execution of our algorithm on a Silicon Graphics Onyx/RealityEngine.

  12. Characterising RNA secondary structure space using information entropy

    PubMed Central

    2013-01-01

    Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
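The quantity computed here is the Shannon entropy of the probability distribution over secondary structures. A toy sketch of the definition (PPfold computes this efficiently over the full phylo-SCFG ensemble via dynamic programming, so the explicit enumeration below is purely illustrative):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution, e.g. over
    candidate secondary structures for an alignment."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

# A peaked distribution (one dominant structure) has low entropy,
# signalling that a single predicted structure is comparatively reliable;
# a flat distribution has high entropy and low prediction reliability.
h_flat = entropy_bits([0.25] * 4)
h_peaked = entropy_bits([0.97, 0.01, 0.01, 0.01])
```

A uniform distribution over four structures yields exactly 2 bits, the maximum for four outcomes; high entropy for an alignment is one way of explaining a low reliability score.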

  13. Synchrony detection and amplification by silicon neurons with STDP synapses.

    PubMed

Bofill-i-Petit, Adria; Murray, Alan F.

    2004-09-01

    Spike-timing dependent synaptic plasticity (STDP) is a form of plasticity driven by precise spike-timing differences between presynaptic and postsynaptic spikes. Thus, the learning rules underlying STDP are suitable for learning neuronal temporal phenomena such as spike-timing synchrony. It is well known that weight-independent STDP creates unstable learning processes resulting in balanced bimodal weight distributions. In this paper, we present a neuromorphic analog very large scale integration (VLSI) circuit that contains a feedforward network of silicon neurons with STDP synapses. The learning rule implemented can be tuned to have a moderate level of weight dependence. This helps stabilise the learning process and still generates binary weight distributions. From on-chip learning experiments we show that the chip can detect and amplify hierarchical spike-timing synchrony structures embedded in noisy spike trains. The weight distributions of the network emerging from learning are bimodal.
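The moderate weight dependence described above can be sketched with a standard STDP pair rule whose weight dependence is controlled by an exponent: at one extreme the rule is additive (weight-independent), at the other fully multiplicative. Parameter values below are illustrative, not those of the VLSI chip:

```python
import math

def stdp_dw(dt, w, w_max=1.0, a_plus=0.01, a_minus=0.012,
            tau=20.0, mu=0.5):
    """Weight change for a single pre/post spike pair.
    dt = t_post - t_pre in ms; mu tunes weight dependence
    (mu=0: additive, mu=1: multiplicative; mu=0.5: moderate)."""
    if dt > 0:   # pre before post: potentiation
        return a_plus * (1.0 - w / w_max) ** mu * math.exp(-dt / tau)
    else:        # post before pre: depression
        return -a_minus * (w / w_max) ** mu * math.exp(dt / tau)

dw_pot = stdp_dw(+5.0, w=0.5)   # positive (potentiation)
dw_dep = stdp_dw(-5.0, w=0.5)   # negative (depression)
```

With mu > 0, potentiation weakens and depression strengthens as w approaches its bound, which is the stabilizing effect the paper attributes to moderate weight dependence.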

  14. Predictive and Reactive Distribution of Vaccines and Antivirals during Cross-Regional Pandemic Outbreaks

    PubMed Central

    Uribe-Sánchez, Andrés; Savachkin, Alex

    2011-01-01

As recently pointed out by the Institute of Medicine, the existing pandemic mitigation models lack the dynamic decision support capability. We develop a large-scale simulation-driven optimization model for generating dynamic predictive distribution of vaccines and antivirals over a network of regional pandemic outbreaks. The model incorporates measures of morbidity, mortality, and social distancing, translated into the cost of lost productivity and medical expenses. The performance of the strategy is compared to that of the reactive myopic policy, using a sample outbreak in Florida, USA, with an affected population of over four million. The comparison is implemented at different levels of vaccine and antiviral availability and administration capacity. Sensitivity analysis is performed to assess the impact of variability of some critical factors on policy performance. The model is intended to support public health policy making for effective distribution of limited mitigation resources. PMID:23074658

  15. Protection of autonomous microgrids using agent-based distributed communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet H.; Ma, Tan; Mohammed, Osama A.

This study presents a real-time implementation of autonomous microgrid protection using agent-based distributed communication. Protection of an autonomous microgrid requires special considerations compared to large-scale distribution networks due to the presence of power converters and relatively low inertia. In this work, we introduce a practical overcurrent and a frequency selectivity method to overcome conventional limitations. The proposed overcurrent scheme defines a selectivity mechanism considering the remedial action scheme (RAS) of the microgrid after a fault instant based on feeder characteristics and the location of the intelligent electronic devices (IEDs). A synchrophasor-based online frequency selectivity approach is proposed to avoid pulse loading effects in low-inertia microgrids. Experimental results are presented for verification of the proposed schemes using a laboratory-based microgrid. The setup was composed of actual generation units and IEDs using the IEC 61850 protocol. The experimental results were in excellent agreement with the proposed protection scheme.

  16. Continuous high speed coherent one-way quantum key distribution.

    PubMed

    Stucki, Damien; Barreiro, Claudio; Fasel, Sylvain; Gautier, Jean-Daniel; Gay, Olivier; Gisin, Nicolas; Thew, Rob; Thoma, Yann; Trinkler, Patrick; Vannel, Fabien; Zbinden, Hugo

    2009-08-03

Quantum key distribution (QKD) is the first commercial quantum technology operating at the level of single quanta and is a leading light for quantum-enabled photonic technologies. However, controlling these quantum optical systems in real-world environments presents significant challenges. For the first time, we have brought together three key concepts for future QKD systems: a simple high-speed protocol; high-performance detection; and integration, both at the component level and for standard fibre network connectivity. The QKD system is capable of continuous and autonomous operation, generating secret keys in real time. Laboratory and field tests were performed and comparisons made with robust InGaAs avalanche photodiodes and superconducting detectors. We report the first real-world implementation of a fully functional QKD system over a 43 dB-loss (150 km) transmission line in the Swisscom fibre optic network, where we obtained average real-time distribution rates of 2.5 bps over 3 hours.

  17. Practical comparison of distributed ledger technologies for IoT

    NASA Astrophysics Data System (ADS)

    Red, Val A.

    2017-05-01

    Existing distributed ledger implementations - specifically, several blockchain implementations - embody a cacophony of divergent capabilities augmenting innovations of cryptographic hashes, consensus mechanisms, and asymmetric cryptography in a wide variety of applications. Whether specifically designed for cryptocurrency or otherwise, several distributed ledgers rely upon modular mechanisms such as consensus or smart contracts. These components, however, can vary substantially among implementations; differences involving proof-of-work, practical byzantine fault tolerance, and other consensus approaches exemplify distinct distributed ledger variations. Such divergence results in unique combinations of modules, performance, latency, and fault tolerance. As implementations continue to develop rapidly due to the emerging nature of blockchain technologies, this paper encapsulates a snapshot of sensor and internet of things (IoT) specific implementations of blockchain as of the end of 2016. Several technical risks and divergent approaches preclude standardization of a blockchain for sensors and IoT in the foreseeable future; such issues will be assessed alongside the practicality of IoT applications among Hyperledger, Iota, and Ethereum distributed ledger implementations suggested for IoT. This paper contributes a comparison of existing distributed ledger implementations intended for practical sensor and IoT utilization. A baseline for characterizing distributed ledger implementations in the context of IoT and sensors is proposed. Technical approaches and performance are compared considering IoT size, weight, and power limitations. Consensus and smart contracts, if applied, are also analyzed for the respective implementations' practicality and security. Overall, the maturity of distributed ledgers with respect to sensor and IoT applicability will be analyzed for enterprise interoperability.

  18. Integrated Field Testing of Fuel Cells and Micro-Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jerome R. Temchin; Stephen J. Steffel

A technical and economic evaluation of the prospects for the deployment of distributed generation on Long Beach Island, New Jersey concluded that properly sited DG would defer upgrading of the electric power grid for 10 years. This included the deployment of fuel cells or microturbines as well as reciprocating engines. The implementation phase of this project focused on the installation of a 120 kW CHP microturbine system at the Harvey Cedars Bible Conference in Harvey Cedars, NJ. A 1.1 MW generator powered by a gas-fired reciprocating engine for additional grid support was also installed at a local substation. This report contains installation and operation issues as well as the utility perspective on DG deployment.

  19. Distributed numerical controllers

    NASA Astrophysics Data System (ADS)

    Orban, Peter E.

    2001-12-01

While the basic principles of Numerical Controllers (NCs) have not changed much over the years, their implementation has changed tremendously. NC equipment has evolved from yesterday's hard-wired specialty control apparatus to today's graphics-intensive, networked, increasingly PC-based open systems, controlling a wide variety of industrial equipment with positioning needs. One of the newest trends in NC technology is the distributed implementation of the controllers. Distributed implementation promises robustness, lower implementation costs, and a scalable architecture. Historically, partitioning has been done along hierarchical levels, moving individual modules into self-contained units. The paper discusses various NC architectures, the underlying technology for distributed implementation, and relevant design issues. First, the functional requirements of individual NC modules are analyzed: module functionality, cycle times, and data requirements are examined. Next, the infrastructure for distributed node implementation is reviewed; various communication protocols and distributed real-time operating system issues are investigated and compared. Finally, a different, vertical system partitioning, offering true scalability and reconfigurability, is presented.

  20. Absorption cooling sources atmospheric emissions decrease by implementation of simple algorithm for limiting temperature of cooling water

    NASA Astrophysics Data System (ADS)

    Wojdyga, Krzysztof; Malicki, Marcin

    2017-11-01

The constant drive to improve energy efficiency compels activities aimed at reducing energy consumption and, in turn, the amount of pollutant emissions to the atmosphere. Cooling demand, both for air-conditioning and process cooling, plays an increasingly important role in the summer balance of the Polish electricity generation and distribution system. In recent years, demand for electricity during the summer months has been increasing steadily and significantly, leading to deficits of available energy during particularly hot periods. This drives growing importance of, and interest in, trigeneration sources and heat-recovery systems producing chilled water. The key component of such a system is a thermally driven chiller, most often an absorption chiller based on a lithium bromide-water mixture. Absorption cooling systems also exist in Poland as stand-alone systems, supplied with heat from various sources, either generated solely for them or recovered as waste energy. The publication presents a simple algorithm designed to reduce the heat supplied to absorption chillers producing chilled water for air conditioning by lowering the temperature of the cooling water, and its impact on decreasing emissions of harmful substances into the atmosphere. The scale of the environmental benefits has been rated for specific sources, which enabled evaluation and estimation of the effect of implementing the simple algorithm at sources existing nationally.

  1. Incorporating Wind Power Forecast Uncertainties Into Stochastic Unit Commitment Using Neural Network-Based Prediction Intervals.

    PubMed

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2015-09-01

    Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, the nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single level PI, wind power forecast uncertainties are represented in a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. Then the wind power scenarios are incorporated into a stochastic security-constrained unit commitment (SCUC) model. The heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporated with interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs, and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and, thus, decreases the risk in system operations of smart grids.
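The scenario generation step described above can be sketched as inverse-transform sampling over a piecewise-linear CDF fitted to the forecast quantile points: draw u ~ U(0,1) and invert the ECDF. The quantile levels and wind power values below are hypothetical placeholders, not the paper's data:

```python
import bisect
import random

def sample_from_quantiles(levels, values, n, rng):
    """Monte Carlo scenario generation: invert a piecewise-linear ECDF
    defined by (level, value) quantile points; draws outside the
    outermost levels are clamped to the end values."""
    out = []
    for _ in range(n):
        u = rng.random()
        i = bisect.bisect_left(levels, u)
        if i == 0:
            out.append(values[0])
        elif i == len(levels):
            out.append(values[-1])
        else:
            lo_u, hi_u = levels[i - 1], levels[i]
            lo_v, hi_v = values[i - 1], values[i]
            out.append(lo_v + (u - lo_u) / (hi_u - lo_u) * (hi_v - lo_v))
    return out

rng = random.Random(1)
levels = [0.05, 0.25, 0.50, 0.75, 0.95]   # quantile levels from the PIs
values = [10.0, 18.0, 25.0, 33.0, 45.0]   # hypothetical wind power (MW)
scenarios = sample_from_quantiles(levels, values, 1000, rng)
```

Each hourly list of PI-derived quantiles yields one such ECDF; the sampled scenarios then feed the stochastic SCUC model.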

2. Phenomenology of the Higgs effective Lagrangian via FeynRules

    NASA Astrophysics Data System (ADS)

    Alloul, Adam; Fuks, Benjamin; Sanz, Verónica

    2014-04-01

The Higgs discovery and the lack of any other hint for new physics favor a description of non-standard Higgs physics in terms of an effective field theory. We present an implementation of a general Higgs effective Lagrangian containing operators up to dimension six in the framework of FeynRules and provide details on the translation between the mass and interaction bases, in particular for three- and four-point interaction vertices involving Higgs and gauge bosons. We illustrate the strengths of this implementation by using the UFO interface of FeynRules, capable of generating model files that can be understood by the MadGraph 5 event generator and that have the specificity of containing all interaction vertices, without any restriction on the number of external legs or on the complexity of the Lorentz structures. We then investigate several new physics effects in total rates and differential distributions for different Higgs production modes, including gluon fusion, associated production with a gauge boson and di-Higgs production. We finally study contact interactions of gauge and Higgs bosons to fermions.

  3. Optimal and fast E/B separation with a dual messenger field

    NASA Astrophysics Data System (ADS)

    Kodi Ramanah, Doogesh; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-05-01

    We adapt our recently proposed dual messenger algorithm for spin field reconstruction and showcase its efficiency and effectiveness in Wiener filtering polarized cosmic microwave background (CMB) maps. Unlike conventional preconditioned conjugate gradient (PCG) solvers, our preconditioner-free technique can deal with high-resolution joint temperature and polarization maps with inhomogeneous noise distributions and arbitrary mask geometries with relative ease. Various convergence diagnostics illustrate the high quality of the dual messenger reconstruction. In contrast, the PCG implementation fails to converge to a reasonable solution for the specific problem considered. The implementation of the dual messenger method is straightforward and guarantees numerical stability and convergence. We show how the algorithm can be modified to generate fluctuation maps, which, combined with the Wiener filter solution, yield unbiased constrained signal realizations, consistent with observed data. This algorithm presents a pathway to exact global analyses of high-resolution and high-sensitivity CMB data for a statistically optimal separation of E and B modes. It is therefore relevant for current and next-generation CMB experiments, in the quest for the elusive primordial B-mode signal.

  4. Robust stochastic optimization for reservoir operation

    NASA Astrophysics Data System (ADS)

    Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin

    2015-01-01

    Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited due to computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sample inflow data accurately represents the true probability distribution, which may be invalid and the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, Iterative Linear Decision Rule (ILDR), so as to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also provides users with the flexibility of choosing the accuracy of ILDR approximations by assigning a desired number of piecewise linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both the single and multireservoir systems efficiently. The single reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and it outperforms SSDP policy when tested on generated inflows with the same mean and covariance matrix as those in history. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single reservoir case study in terms of optimal value and distributional robustness.

  5. The natural mathematics of behavior analysis.

    PubMed

    Li, Don; Hautus, Michael J; Elliffe, Douglas

    2018-04-19

    Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
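The general idea behind the authors' Approximate Bayesian Computation scheme can be illustrated with its simplest variant, rejection ABC: keep parameter draws whose simulated summary statistic lands close to the observed one. The toy model, prior, and summary statistic below are assumptions for illustration, not the paper's MCMC scheme or behavioral model:

```python
import random

def rejection_abc(observed, simulate, prior_sample, eps, n_trials, rng):
    """Minimal rejection ABC: accept theta if the simulated summary
    statistic is within eps of the observed one."""
    accepted = []
    for _ in range(n_trials):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed) < eps:
            accepted.append(theta)
    return accepted

rng = random.Random(7)

# Toy model: exponential inter-response times with rate lam;
# summary statistic is the mean of 50 simulated times.
def simulate(lam, rng):
    return sum(rng.expovariate(lam) for _ in range(50)) / 50.0

def prior(rng):
    return rng.uniform(0.1, 2.0)

observed_mean = 2.0          # implies a true rate near 1/2.0 = 0.5
posterior = rejection_abc(observed_mean, simulate, prior,
                          eps=0.2, n_trials=5000, rng=rng)
post_mean = sum(posterior) / len(posterior)
```

MCMC-ABC replaces the independent prior draws with a Markov chain proposal, improving efficiency when acceptance is rare, but the accept-if-close logic is the same.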

  6. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.

    PubMed

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-05

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80  Gb×45.6  Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114  bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
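Toeplitz-matrix hashing, the extractor named in the record, is a linear map over GF(2): an m×n binary Toeplitz matrix, fully determined by m+n-1 seed bits, multiplies the raw bit vector modulo 2. A toy-scale sketch (the experiment's matrix was on the order of 80 Gb × 45.6 Mb and is implemented with FFT-based tricks in practice):

```python
def toeplitz_hash(raw_bits, seed_bits, m):
    """Multiply an m x n binary Toeplitz matrix by raw_bits over GF(2).
    The matrix entry (i, j) equals seed_bits[i - j + n - 1], so the
    m + n - 1 seed bits define every diagonal."""
    n = len(raw_bits)
    assert len(seed_bits) == m + n - 1
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            acc ^= seed_bits[i - j + n - 1] & raw_bits[j]
        out.append(acc)
    return out

raw = [1, 0, 1, 1, 0, 1, 0, 0]                  # n = 8 raw bits
seed = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]        # m + n - 1 = 11 seed bits
key = toeplitz_hash(raw, seed, m=4)             # m = 4 extracted bits
```

Because the map is linear over GF(2), hashing the XOR of two inputs equals the XOR of their hashes, which is a convenient correctness check.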

  7. Distributed cerebellar plasticity implements adaptable gain control in a manipulation task: a closed-loop robotic simulation

    PubMed Central

    Garrido, Jesús A.; Luque, Niceto R.; D'Angelo, Egidio; Ros, Eduardo

    2013-01-01

Adaptable gain regulation is at the core of the forward controller operation performed by the cerebro-cerebellar loops and it allows the intensity of motor acts to be finely tuned in a predictive manner. In order to learn and store information about body-object dynamics and to generate an internal model of movement, the cerebellum is thought to employ long-term synaptic plasticity. LTD at the PF-PC synapse has classically been assumed to subserve this function (Marr, 1969). However, this plasticity alone cannot account for the broad dynamic ranges and time scales of cerebellar adaptation. We therefore tested the role of plasticity distributed over multiple synaptic sites (Hansel et al., 2001; Gao et al., 2012) by generating an analog cerebellar model embedded into a control loop connected to a robotic simulator. The robot used a three-joint arm and performed repetitive fast manipulations with different masses along an 8-shaped trajectory. In accordance with biological evidence, the cerebellum model was endowed with both LTD and LTP at the PF-PC, MF-DCN and PC-DCN synapses. This resulted in a network scheme whose effectiveness was extended considerably compared to one including just PF-PC synaptic plasticity. Indeed, the system including distributed plasticity reliably self-adapted to manipulate different masses and to learn the arm-object dynamics over a time course that included fast learning and consolidation, along the lines of what has been observed in behavioral tests. In particular, PF-PC plasticity operated as a time correlator between the actual input state and the system error, while MF-DCN and PC-DCN plasticity played a key role in generating the gain controller. This model suggests that distributed synaptic plasticity allows generation of the complex learning properties of the cerebellum.
The incorporation of further plasticity mechanisms and of spiking signal processing will allow this concept to be extended in a more realistic computational scenario. PMID:24130518

  8. Characterization of industrial waste from a natural gas distribution company and management strategies: a case study of the East Azerbaijan Gas Company (Iran).

    PubMed

    Taghipour, Hassan; Aslhashemi, Ahmad; Assadi, Mohammad; Khodaei, Firoz; Mardangahi, Baharak; Mosaferi, Mohammad; Roshani, Babak

    2012-10-01

    Although a fundamental prerequisite for the successful implementation of any waste management plan is the availability of sufficient and accurate data, there are few available studies regarding the characterization and management of gas distribution company waste (GDCW). This study aimed to characterize the industrial waste generated by the East Azerbaijan Gas Distribution Company (EAGDC) and to present environmental management strategies. The EAGDC serves 57 cities and 821 villages with a total population of more than 2.5 million as well as numerous industrial units. The methodology of this study was based on a checklist of data collected from each zone of the company, site visits (observation), and quantity and quality analysis according to the formal data available from different zones. The results indicate that more than 35 different kinds of industrial solid waste are generated in different industrial installations. The most important types of generated waste include empty barrels (including mercaptans, diesel fuel, deionized waters and oil), faulty gas meters and regulators, a variety of industrial oils, sleeves, filter elements and faulty pipes, valves and fittings. The results indicated that, currently, GDCW is generally handled and disposed of with domestic waste, deposited in companies' installation yards and stores or, sometimes, recycled through non-scientific approaches that can create health risks to the public and the environment, even though most of the GDCW was determined to be recyclable or reusable materials. This study concludes that gas distribution companies must pay more attention to source reduction, recycling and reusing of waste to preserve natural resources, landfill space and the environment.

  9. Simulation of load-sharing in standalone distributed generation system

    NASA Astrophysics Data System (ADS)

    Ajewole, Titus O.; Craven, Robert P. M.; Kayode, Olakunle; Babalola, Olufisayo S.

    2018-05-01

This paper presents a study on load-sharing among the component generating units of a multi-source electric microgrid operated as an autonomous AC supply-mode system. An emerging trend in power system development permits deployment of microgrids for standalone or stand-by applications, thereby requiring active- and reactive-power sharing among the discrete generating units contained in hybrid-source microgrids. In this study, therefore, a laboratory-scale model of a microgrid energized by three renewable energy-based sources is employed as a simulation platform to investigate power sharing among the power-generating units. Each source is represented by a source emulator that captures the real operational characteristics of the mimicked generating unit and, with the implementation of real-life weather data and load profiles on the model, the sharing of the load among the generating units is investigated. There is a proportionate generation of power by the three source emulators, with their frequencies perfectly synchronized at the point of common coupling as a result of a balanced flow of power among them. This hybrid topology of renewable energy-based microgrid could therefore be seamlessly adapted into the national energy mix by indigenous electric utility providers in Nigeria.
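Proportionate sharing with a common frequency at the point of common coupling is the textbook steady-state outcome of equal per-unit P-f droop control. A minimal sketch with illustrative numbers, not the emulator setup of the paper:

```python
def droop_share(total_load, ratings, droop_pu=0.05, f0=50.0):
    """Steady-state P-f droop sharing: each unit follows
    f = f0 - droop_pu * f0 * (P / P_rated). With equal per-unit droop,
    all units settle at one common frequency, which forces each unit's
    per-unit loading to be equal, i.e. shares proportional to ratings.
    Values are illustrative placeholders."""
    k = total_load / sum(ratings)          # common per-unit loading
    shares = [k * r for r in ratings]      # kW per unit
    f = f0 - droop_pu * f0 * k             # common settled frequency, Hz
    return shares, f

# Three units rated 5, 10, and 15 kW serving a 12 kW load
shares, f = droop_share(12.0, [5.0, 10.0, 15.0])
# shares ≈ [2.0, 4.0, 6.0] kW, f ≈ 49.0 Hz
```

Unequal per-unit droop coefficients would instead skew the shares toward the units with shallower droop, which is one way such systems prioritize particular sources.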

  10. Intelligent and robust optimization frameworks for smart grids

    NASA Astrophysics Data System (ADS)

    Dhansri, Naren Reddy

    A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many of the contemporary problems, they give rise to new control and optimization problems with the growing role of renewable energy sources such as wind or solar energy. Under highly dynamic nature of distributed power generation and the varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met by giving a higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be optimized while minimizing the generation from non renewable energy sources. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on the renewable energy sources, the intelligent and robust control frameworks optimize the power generation by tracking the consumer demand in a closed-loop control framework, yielding superior economic and ecological benefits and circumvent nonlinear model complexities and handles uncertainties for superior real-time operations. The proposed intelligent system framework optimizes the smart grid power generation for maximum economical and ecological benefits under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrate various energy sources for real-time smart grid implementations. The robust optimization framework results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economical and ecological performance objectives. 
Therefore, the proposed framework offers a new worst-case deterministic optimization algorithm for smart grid automatic generation control.
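    The renewables-first, demand-tracking dispatch described above can be sketched as follows; the function name and numeric limits are illustrative assumptions, not taken from the paper, and a real controller would re-solve this allocation each control step as wind availability and demand change.

```python
def dispatch(demand, renewable_available, conventional_max):
    """Meet the load demand, giving renewable sources priority over
    conventional generation, as in the closed-loop framework above."""
    renewable = min(demand, renewable_available)        # use renewables first
    conventional = min(max(demand - renewable, 0.0),    # top up with conventional
                       conventional_max)
    shortfall = demand - renewable - conventional       # unmet demand, if any
    return renewable, conventional, shortfall

# Example: 120 MW demand, 80 MW of wind available, 100 MW conventional capacity
r, c, s = dispatch(demand=120.0, renewable_available=80.0, conventional_max=100.0)
```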

  11. Differences in tsunami generation between the December 26, 2004 and March 28, 2005 Sumatra earthquakes

    USGS Publications Warehouse

    Geist, E.L.; Bilek, S.L.; Arcas, D.; Titov, V.V.

    2006-01-01

    Source parameters affecting tsunami generation and propagation for the Mw > 9.0 December 26, 2004 and the Mw = 8.6 March 28, 2005 earthquakes are examined to explain the dramatic difference in tsunami observations. We evaluate both scalar measures (seismic moment, maximum slip, potential energy) and finite-source representations (distributed slip and far-field beaming from finite source dimensions) of tsunami generation potential. There exists significant variability in local tsunami runup with respect to the most readily available measure, seismic moment. The local tsunami intensity for the December 2004 earthquake is similar to other tsunamigenic earthquakes of comparable magnitude. In contrast, the March 2005 local tsunami was deficient relative to its earthquake magnitude. Tsunami potential energy calculations more accurately reflect the difference in tsunami severity, although these calculations are dependent on knowledge of the slip distribution and therefore difficult to implement in a real-time system. A significant factor affecting tsunami generation unaccounted for in these scalar measures is the location of regions of seafloor displacement relative to the overlying water depth. The deficiency of the March 2005 tsunami seems to be related to concentration of slip in the down-dip part of the rupture zone and the fact that a substantial portion of the vertical displacement field occurred in shallow water or on land. The comparison of the December 2004 and March 2005 Sumatra earthquakes presented in this study is analogous to previous studies comparing the 1952 and 2003 Tokachi-Oki earthquakes and tsunamis, in terms of the effect slip distribution has on local tsunamis. Results from these studies indicate the difficulty in rapidly assessing local tsunami runup from magnitude and epicentral location information alone.
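    The tsunami potential energy measure discussed above integrates the square of the sea-surface displacement over the source area, which is why it depends on the slip distribution while the scalar moment does not. A minimal sketch under simplifying assumptions (uniform grid cells, displacement already mapped to the sea surface; all values illustrative):

```python
RHO_SEAWATER = 1025.0  # kg/m^3, nominal seawater density
G = 9.81               # m/s^2

def tsunami_potential_energy(displacement, cell_area):
    """E = (1/2) * rho * g * sum(eta_i^2) * dA for sea-surface
    displacements eta_i (m) on uniform grid cells of area dA (m^2)."""
    return 0.5 * RHO_SEAWATER * G * sum(eta * eta for eta in displacement) * cell_area
```

    Displacement occurring on land contributes nothing to this sum, which is one way the measure captures the March 2005 deficiency that seismic moment misses.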

  12. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
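    The "Separation of Concerns" idea above — the domain scientist writes only a per-pair kernel, while the framework owns the parallel loop — can be illustrated with a toy executor. The real system generates optimized C/GPU code from the Python specification; this plain-Python version, with invented names, only shows the division of responsibilities:

```python
import itertools

def pairwise_execute(positions, kernel):
    """Framework side: visit every unordered pair once and accumulate the
    kernel's antisymmetric contribution (Newton's third law)."""
    acc = [0.0] * len(positions)
    for i, j in itertools.combinations(range(len(positions)), 2):
        f = kernel(positions[i], positions[j])
        acc[i] += f
        acc[j] -= f
    return acc

# Science side: a one-dimensional harmonic "kernel", written at high level
# with no knowledge of how the loop is parallelised.
spring = lambda xi, xj: xj - xi

forces = pairwise_execute([0.0, 1.0, 3.0], spring)
```

    Analysis algorithms such as the local-structure classifiers mentioned in the abstract fit the same pattern: they are just different kernels handed to the same executor.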

  13. Rules implementing Sections 201 and 210 of the Public Utility Regulatory Policies Act of 1978: a regulatory history

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danziger, R.N.; Caples, P.W.; Huning, J.R.

    1980-09-15

    An analysis is made of the rules implementing Sections 201 and 210 of the Public Utility Regulatory Policies Act of 1978 (PURPA). The act provides that utilities must purchase power from qualifying producers of electricity at nondiscriminatory rates, and it exempts private generators from virtually all state and Federal utility regulations. Most of the analysis presented is taken from the perspective of photovoltaics (PV) and solar thermal electric point-focusing distributed receivers (pfdr). It is felt, however, that the analysis is applicable both to cogeneration and other emerging technologies. Chapters presented are: The FERC Response to Oral Comments on the Proposed Rules Implementing Sections 201 and 210 of PURPA; Additional Changes Made or Not Made That Were Addressed in Other Than Oral Testimony; View on the Proposed Rules Implementing Sections 201 and 210 of PURPA; Response to Comments on the Proposed 201 and 210 Rules; and Summary Analysis of the Environmental Assessment of the Rules. Pertinent reference material is provided in the Appendices, including the text of the rules. (MCW)

  14. Voltage management of distribution networks with high penetration of distributed photovoltaic generation sources

    NASA Astrophysics Data System (ADS)

    Alyami, Saeed

    Installation of photovoltaic (PV) units could lead to great challenges to the existing electrical systems. Issues such as voltage rise, protection coordination, islanding detection, harmonics, increased or changed short-circuit levels, etc., need to be carefully addressed before we can see a wide adoption of this environmentally friendly technology. Voltage rise or overvoltage issues are of particular importance to be addressed for deploying more PV systems to distribution networks. This dissertation proposes a comprehensive solution to deal with the voltage violations in distribution networks, from controlling PV power outputs and electricity consumption of smart appliances in real time to optimal placement of PVs at the planning stage. The dissertation is composed of three parts: the literature review, the work that has already been done and the future research tasks. An overview of renewable energy generation and its challenges is given in Chapter 1. The overall literature survey, motivation and the scope of study are also outlined in the chapter. Detailed literature reviews are given in the remaining chapters. The overvoltage and undervoltage phenomena in typical distribution networks with integration of PVs are further explained in Chapter 2. Possible approaches for voltage quality control are also discussed in this chapter, followed by a discussion of the importance of load management for PHEVs and appliances and its benefits to electric utilities and end users. A new real power capping method is presented in Chapter 3 to prevent overvoltage by adaptively setting the power caps for PV inverters in real time. The proposed method can maintain voltage profiles below a pre-set upper limit while maximizing the PV generation and fairly distributing the real power curtailments among all the PV systems in the network.
As a result, each of the PV systems in the network has an equal opportunity to generate electricity and shares the responsibility of voltage regulation. The method does not require global information and can be implemented either under a centralized supervisory control scheme or in a distributed way via consensus control. Chapter 4 investigates autonomous operation schedules for three types of intelligent appliances (or residential controllable loads) that operate without receiving external signals, for cost saving and for assisting the management of possible photovoltaic generation systems installed in the same distribution network. The three types of controllable loads studied in the chapter are electric water heaters, refrigerator deicing loads, and dishwashers. Chapter 5 investigates a method to mitigate overvoltage issues at the planning stage. A probabilistic method is presented in the chapter to evaluate the overvoltage risk in a distribution network with different PV capacity sizes under different load levels. The Kolmogorov-Smirnov (K-S) test is used to identify the most appropriate probability distributions for solar irradiance in different months. To increase accuracy, an iterative process is used to obtain the maximum allowable injection of active power from PVs. Conclusions and discussion of future work are given in Chapter 6.
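    The fairness property of the power capping method in Chapter 3 can be sketched in a few lines, under the simplifying assumption that a single aggregate injection limit stands in for the per-bus voltage constraint; in the dissertation the caps are instead set adaptively from measured voltages, possibly via consensus. All names and numbers here are illustrative:

```python
def fair_power_caps(available, aggregate_limit):
    """Scale every PV unit's available output by the same factor so the
    total injection respects the limit and curtailment is shared fairly."""
    total = sum(available)
    if total <= aggregate_limit:
        return list(available)           # no curtailment needed
    scale = aggregate_limit / total      # identical curtailment ratio for all
    return [p * scale for p in available]
```

    Because every unit is scaled by the same ratio, each PV system keeps an equal opportunity to generate, which is the fairness notion described above.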

  15. Passive state preparation in the Gaussian-modulated coherent-states quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Bing; Evans, Philip G.; Grice, Warren P.

    In the Gaussian-modulated coherent-states (GMCS) quantum key distribution (QKD) protocol, Alice prepares quantum states actively: For each transmission, Alice generates a pair of Gaussian-distributed random numbers, encodes them on a weak coherent pulse using optical amplitude and phase modulators, and then transmits the Gaussian-modulated weak coherent pulse to Bob. Here we propose a passive state preparation scheme using a thermal source. In our scheme, Alice splits the output of a thermal source into two spatial modes using a beam splitter. She measures one mode locally using conjugate optical homodyne detectors, and transmits the other mode to Bob after applying appropriate optical attenuation. Under normal conditions, Alice's measurement results are correlated to Bob's, and they can work out a secure key, as in the active state preparation scheme. Provided that the initial thermal state generated by the source is strong enough, this scheme can tolerate high detector noise at Alice's side. Furthermore, the output of the source does not need to be single mode, since an optical homodyne detector can selectively measure a single mode determined by the local oscillator. Preliminary experimental results suggest that the proposed scheme could be implemented using an off-the-shelf amplified spontaneous emission source.
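    A toy classical-noise model of the passive scheme — one quadrature of a thermal mode split on a beam splitter, with vacuum entering the empty port — shows the correlation between Alice's retained mode and the transmitted mode. Variances are in shot-noise units and all parameters are illustrative, not the experiment's values:

```python
import random

random.seed(7)  # deterministic for illustration

def split_thermal_quadratures(n, thermal_var, transmittance=0.5):
    """Sample n quadrature pairs: Alice keeps x1, x2 is sent to Bob.
    A vacuum mode of unit variance enters the unused port."""
    t = transmittance
    pairs = []
    for _ in range(n):
        x_th = random.gauss(0.0, thermal_var ** 0.5)   # bright thermal input
        x_vac = random.gauss(0.0, 1.0)                 # vacuum port
        x1 = t ** 0.5 * x_th + (1 - t) ** 0.5 * x_vac
        x2 = (1 - t) ** 0.5 * x_th - t ** 0.5 * x_vac
        pairs.append((x1, x2))
    return pairs

def covariance(pairs):
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    return sum((x - mx) * (y - my) for x, y in pairs) / n
```

    For a 50/50 splitter the covariance is (V - 1)/2 for input variance V, so a bright thermal state (V much greater than 1) yields strongly correlated data from which, as in the abstract, a key can be distilled.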

  16. Passive state preparation in the Gaussian-modulated coherent-states quantum key distribution

    DOE PAGES

    Qi, Bing; Evans, Philip G.; Grice, Warren P.

    2018-01-01

    In the Gaussian-modulated coherent-states (GMCS) quantum key distribution (QKD) protocol, Alice prepares quantum states actively: For each transmission, Alice generates a pair of Gaussian-distributed random numbers, encodes them on a weak coherent pulse using optical amplitude and phase modulators, and then transmits the Gaussian-modulated weak coherent pulse to Bob. Here we propose a passive state preparation scheme using a thermal source. In our scheme, Alice splits the output of a thermal source into two spatial modes using a beam splitter. She measures one mode locally using conjugate optical homodyne detectors, and transmits the other mode to Bob after applying appropriate optical attenuation. Under normal conditions, Alice's measurement results are correlated to Bob's, and they can work out a secure key, as in the active state preparation scheme. Provided that the initial thermal state generated by the source is strong enough, this scheme can tolerate high detector noise at Alice's side. Furthermore, the output of the source does not need to be single mode, since an optical homodyne detector can selectively measure a single mode determined by the local oscillator. Preliminary experimental results suggest that the proposed scheme could be implemented using an off-the-shelf amplified spontaneous emission source.

  17. Prospects of second generation artificial intelligence tools in calibration of chemical sensors.

    PubMed

    Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Ramam, Veluri Anantha; Rao, Gollapalli Nageswara; Rao, Vaddadi Venkata Panakala

    2005-05-01

    Multivariate data driven calibration models with neural networks (NNs) are developed for binary (Cu++ and Ca++) and quaternary (K+, Ca++, NO3- and Cl-) ion-selective electrode (ISE) data. The response profiles of ISEs with concentrations are non-linear and sub-Nernstian. This task represents function approximation of multi-variate, multi-response, correlated, non-linear data with unknown noise structure, i.e., multi-component calibration/prediction in chemometric parlance. Radial basis function (RBF) and Fuzzy-ARTMAP-NN models implemented in the software packages TRAJAN and Professional II are employed for the calibration. The optimum NN models reported are based on residuals in concentration space. Being a data driven information technology, NN does not require a model, prior- or posterior- distribution of data or noise structure. Missing information, spikes or newer trends in different concentration ranges can be modeled through novelty detection. Two simulated data sets generated from mathematical functions are modeled as a function of number of data points and network parameters like number of neurons and nearest neighbors. The success of RBF and Fuzzy-ARTMAP-NNs to develop adequate calibration models for experimental data and function approximation models for more complex simulated data sets ensures AI2 (artificial intelligence, 2nd generation) as a promising technology in quantitation.
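    As a sketch of the RBF-network idea used here for calibration (not the TRAJAN implementation): with a Gaussian basis function centred on each training point, fitting the weights reduces to solving a linear system, done below by plain Gaussian elimination. Data, widths, and function names are invented for illustration:

```python
import math

def gaussian(r2, width=1.0):
    return math.exp(-r2 / (2.0 * width * width))

def solve(a, b):
    """Gaussian elimination with partial pivoting for the square system a w = b."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (m[r][n] - sum(m[r][c] * w[c] for c in range(r + 1, n))) / m[r][r]
    return w

def rbf_fit(xs, ys):
    """Interpolating RBF network: one basis function per calibration point."""
    a = [[gaussian((xi - xj) ** 2) for xj in xs] for xi in xs]
    return solve(a, ys)

def rbf_predict(xs, w, x):
    return sum(wi * gaussian((x - xi) ** 2) for wi, xi in zip(w, xs))
```

    A real calibration would choose basis widths and regularisation by validation rather than interpolate the training points exactly, but the exact-interpolation case shows the mechanism.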

  18. High-capacity quantum key distribution via hyperentangled degrees of freedom

    NASA Astrophysics Data System (ADS)

    Simon, David S.; Sergienko, Alexander V.

    2014-06-01

    Quantum key distribution (QKD) has long been a promising area for the application of quantum effects in solving real-world problems. However, two major obstacles have stood in the way of its widespread application: low secure key generation rates and short achievable operating distances. In this paper, a new physical mechanism for dealing with the first of these problems is proposed: the interplay between different degrees of freedom in a hyperentangled system (parametric down-conversion) is used to increase the Hilbert space dimension available for key generation while maintaining security. Polarization-based Bell tests provide security checking, while orbital angular momentum (OAM) and total angular momentum (TAM) provide a higher key generation rate. Whether to measure TAM or OAM is decided randomly in each trial. The concurrent noncommutativity of TAM with OAM and polarization provides the physical basis for quantum security. TAM measurements link polarization to OAM, so that if the legitimate participants measure OAM while the eavesdropper measures TAM (or vice versa), then polarization entanglement is lost, revealing the eavesdropper. In contrast to other OAM-based QKD methods, complex active switching between OAM bases is not required; instead, passive switching by beam splitters combined with much simpler active switching between polarization bases makes implementation at high OAM more practical.

  19. Measurement-device-independent quantum key distribution for Scarani-Acin-Ribordy-Gisin 04 protocol

    PubMed Central

    Mizutani, Akihiro; Tamaki, Kiyoshi; Ikuta, Rikizo; Yamamoto, Takashi; Imoto, Nobuyuki

    2014-01-01

    The measurement-device-independent quantum key distribution (MDI QKD) was proposed to make BB84 completely free from any side-channel in detectors. As in prepare & measure QKD, the use of other protocols in the MDI setting would be advantageous in some practical situations. In this paper, we consider the SARG04 protocol in the MDI setting. The prepare & measure SARG04 is proven to be able to generate a key up to two-photon emission events. In the MDI setting we show that key generation is possible from events with single- or two-photon emission by one party and single-photon emission by the other party, but the two-photon emission event by both parties cannot contribute to the key generation. In contrast to the prepare & measure SARG04 protocol, where the experimental setup is exactly the same as for BB84, the measurement setup for SARG04 in the MDI setting cannot be the same as that for BB84, since the measurement setup for BB84 in the MDI setting induces too many bit errors. To overcome this problem, we propose two alternative experimental setups, and we simulate the resulting key rate. Our study highlights the requirements that MDI QKD poses on us regarding the implementation of a variety of QKD protocols. PMID:24913431

  20. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi

    2014-12-01

    Based on a phase retrieval algorithm and the QR code, a new optical encryption technique that needs to record only one intensity distribution is proposed. In the encryption process, a QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks, and suitable for harsh transmission conditions.
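    The double random phase encryption step in the 4-f system can be sketched in one dimension with a discrete Fourier transform. Note the hedge: this toy decrypts with the full complex ciphertext, whereas the proposed scheme records only its intensity and recovers the QR code by phase retrieval with the QR structure as a support constraint; all data below are stand-ins:

```python
import cmath
import random

def dft(x, sign=-1):
    """Unitary DFT (sign=-1) or its inverse (sign=+1); O(n^2) but tiny here."""
    n = len(x)
    return [sum(xk * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k, xk in enumerate(x)) / (n ** 0.5)
            for j in range(n)]

def drpe_encrypt(signal, phase1, phase2):
    """Input-plane random phase mask, Fourier-plane mask, back to output plane."""
    masked = [s * cmath.exp(1j * p) for s, p in zip(signal, phase1)]
    spectrum = dft(masked, sign=-1)
    masked2 = [s * cmath.exp(1j * p) for s, p in zip(spectrum, phase2)]
    return dft(masked2, sign=+1)

def drpe_decrypt(cipher, phase1, phase2):
    """Reverse the two transforms and strip both phase masks."""
    spectrum = dft(cipher, sign=-1)
    unmasked = [s * cmath.exp(-1j * p) for s, p in zip(spectrum, phase2)]
    field = dft(unmasked, sign=+1)
    return [f * cmath.exp(-1j * p) for f, p in zip(field, phase1)]
```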

  1. Reducing the Volume of NASA Earth-Science Data

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Braverman, Amy J.; Guillaume, Alexandre

    2010-01-01

    A computer program reduces data generated by NASA Earth-science missions into representative clusters characterized by centroids and membership information, thereby reducing the large volume of data to a level more amenable to analysis. The program effects an autonomous data-reduction/clustering process to produce a representative distribution and joint relationships of the data, without assuming a specific type of distribution and relationship and without resorting to domain-specific knowledge about the data. The program implements a combination of a data-reduction algorithm known as the entropy-constrained vector quantization (ECVQ) and an optimization algorithm known as the differential evolution (DE). The combination of algorithms generates the Pareto front of clustering solutions that presents the compromise between the quality of the reduced data and the degree of reduction. Similar prior data-reduction computer programs utilize only a clustering algorithm, the parameters of which are tuned manually by users. In the present program, autonomous optimization of the parameters by means of the DE supplants the manual tuning of the parameters. Thus, the program determines the best set of clustering solutions without human intervention.
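    The entropy-constrained assignment at the heart of ECVQ trades distortion against code length: each point is assigned to the centroid minimising the squared distance plus lambda times the codeword length implied by the centroid's probability. A one-dimensional sketch of that single step, with illustrative centroids and lambda (the DE search over parameters described above is omitted):

```python
import math

def ecvq_assign(points, centroids, probs, lam):
    """Assign each point to the centroid minimising
    ||x - c||^2 + lam * (-log2 p_c)."""
    labels = []
    for x in points:
        costs = [(x - c) ** 2 + lam * (-math.log2(p))
                 for c, p in zip(centroids, probs)]
        labels.append(min(range(len(centroids)), key=costs.__getitem__))
    return labels
```

    With lambda = 0 this is ordinary nearest-centroid clustering; raising lambda pulls points toward high-probability (cheap) codewords, tracing out the rate-distortion trade-off that forms the Pareto front mentioned in the abstract.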

  2. Liberalization of the Spanish electricity sector: An advanced model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unda, J.I.

    1998-06-01

    Spain's electricity industry is being restructured to provide a competitive generation market, a regulated, open access transmission and distribution system, and phased-in customer choice. But while the reform is radical in its objectives, it will be gradual in its implementation. This article briefly describes the current state of affairs within the Spanish electricity sector and details the reform plans set out in the act, focusing on the adopted institutional design and the established transition period. It also offers an overview of the role that the regulatory authority will play throughout the process.

  3. Adaptive Transmission Planning: Implementing a New Paradigm for Managing Economic Risks in Grid Expansion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, Benjamin F.; Xu, Qingyu; Ho, Jonathan

    The problem of whether, where, when, and what types of transmission facilities to build in terms of minimizing costs and maximizing net economic benefits has been a challenge for the power industry from the beginning, ever since Thomas Edison debated whether to create longer dc distribution lines (with their high losses) or build new power stations in expanding his urban markets. Today's planning decisions are far more complex, as grids cover the continent and new transmission, generation, and demand-side technologies emerge.

  4. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.

  5. Acousto-Optical Vector Matrix Product Processor: Implementation Issues

    DTIC Science & Technology

    1989-04-25

    power by a factor of 3.8. The acoustic velocity in longitudinal TeO2 is 4200 m/s, almost the same as the 4100 m/s acoustic velocity in dense flint glass ... field via an Interaction Model AOD150 dense flint glass Bragg Cell. The cell's specifications are listed in the table below. BRAGG CELL SPECIFICATIONS ... 39 ns intervals). Since the speed of sound in dense flint glass is 4100 m/s, the acoustic field generated in a 10 μs interval is distributed over a 4.1

  6. Next Generation Transport Phenomenology Model

    NASA Technical Reports Server (NTRS)

    Strickland, Douglas J.; Knight, Harold; Evans, J. Scott

    2004-01-01

    This report describes the progress made in Quarter 3 of Contract Year 3 on the development of Aeronomy Phenomenology Modeling Tool (APMT), an open-source, component-based, client-server architecture for distributed modeling, analysis, and simulation activities focused on electron and photon transport for general atmospheres. In the past quarter, column emission rate computations were implemented in Java, preexisting Fortran programs for computing synthetic spectra were embedded into APMT through Java wrappers, and work began on a web-based user interface for setting input parameters and running the photoelectron and auroral electron transport models.

  7. Distributed Coordination for Optimal Energy Generation and Distribution in Cyber-Physical Energy Networks.

    PubMed

    Ahn, Hyo-Sung; Kim, Byeong-Yeon; Lim, Young-Hun; Lee, Byung-Hun; Oh, Kwang-Kyo

    2018-03-01

    This paper proposes three coordination laws for optimal energy generation and distribution in an energy network composed of a physical flow layer and a cyber communication layer. Physical energy flows through the physical layer, but its generation and flow are coordinated by distributed algorithms on the basis of communication information. First, distributed energy generation and energy distribution laws are proposed in a decoupled manner, without considering the interactive characteristics between energy generation and energy distribution. Second, a joint coordination law that treats energy generation and energy distribution in a coupled manner, taking account of the interactive characteristics, is designed. Third, to handle over- or under-generation cases, an energy distribution law for networks with batteries is designed. The coordination laws proposed in this paper are fully distributed in the sense that they are decided optimally using only relative information among neighboring nodes. Through numerical simulations, the validity of the proposed distributed coordination laws is illustrated.
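    The "fully distributed" property above — each node updating only from differences with its neighbours — is the structure of a standard average-consensus iteration. A minimal sketch in which nodes agree on a common generation set-point; the graph, step size, and values are illustrative, not the paper's coordination laws:

```python
def consensus_step(x, neighbors, eps=0.2):
    """One synchronous update: each node moves toward its neighbours,
    using only relative information x[j] - x[i]."""
    return [xi + eps * sum(x[j] - xi for j in neighbors[i])
            for i, xi in enumerate(x)]

def run_consensus(x, neighbors, steps=200):
    for _ in range(steps):
        x = consensus_step(x, neighbors)
    return x
```

    Because each edge's contribution cancels in the sum, total scheduled generation is conserved while the nodes equalise, which is the mechanism that lets such laws balance generation against demand without any central coordinator.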

  8. Future impacts of distributed power generation on ambient ozone and particulate matter concentrations in the San Joaquin Valley of California.

    PubMed

    Vutukuru, Satish; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald

    2011-12-01

    Distributed power generation-electricity generation that is produced by many small stationary power generators distributed throughout an urban air basin-has the potential to supply a significant portion of electricity in future years. As a result, distributed generation may lead to increased pollutant emissions within an urban air basin, which could adversely affect air quality. However, the use of combined heating and power with distributed generation may reduce the energy consumption for space heating and air conditioning, resulting in a net decrease of pollutant and greenhouse gas emissions. This work used a systematic approach based on land-use geographical information system data to determine the spatial and temporal distribution of distributed generation emissions in the San Joaquin Valley Air Basin of California and simulated the potential air quality impacts using state-of-the-art three-dimensional computer models. The evaluation of the potential market penetration of distributed generation focuses on the year 2023. In general, the air quality impacts of distributed generation were found to be small due to the restrictive 2007 California Air Resources Board air emission standards applied to all distributed generation units and due to the use of combined heating and power. Results suggest that if distributed generation units were allowed to emit at the current Best Available Control Technology standards (which are less restrictive than the 2007 California Air Resources Board standards), air quality impacts of distributed generation could compromise compliance with the federal 8-hr average ozone standard in the region.

  9. Future Impacts of Distributed Power Generation on Ambient Ozone and Particulate Matter Concentrations in the San Joaquin Valley of California.

    PubMed

    Vutukuru, Satish; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald

    2011-12-01

    Distributed power generation-electricity generation that is produced by many small stationary power generators distributed throughout an urban air basin-has the potential to supply a significant portion of electricity in future years. As a result, distributed generation may lead to increased pollutant emissions within an urban air basin, which could adversely affect air quality. However, the use of combined heating and power with distributed generation may reduce the energy consumption for space heating and air conditioning, resulting in a net decrease of pollutant and greenhouse gas emissions. This work used a systematic approach based on land-use geographical information system data to determine the spatial and temporal distribution of distributed generation emissions in the San Joaquin Valley Air Basin of California and simulated the potential air quality impacts using state-of-the-art three-dimensional computer models. The evaluation of the potential market penetration of distributed generation focuses on the year 2023. In general, the air quality impacts of distributed generation were found to be small due to the restrictive 2007 California Air Resources Board air emission standards applied to all distributed generation units and due to the use of combined heating and power. Results suggest that if distributed generation units were allowed to emit at the current Best Available Control Technology standards (which are less restrictive than the 2007 California Air Resources Board standards), air quality impacts of distributed generation could compromise compliance with the federal 8-hr average ozone standard in the region.

  10. Toward a theory of distributed word expert natural language parsing

    NASA Technical Reports Server (NTRS)

    Rieger, C.; Small, S.

    1981-01-01

    An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
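    A toy rendering of the word-expert idea — each word carries a procedure that inspects its context to diagnose intended usage. The vocabulary, senses, and expert logic here are invented for illustration, and the sketch shows only the diagnostic step; in the actual parser, experts run as coroutines that suspend until other experts post the information they need:

```python
def expert_bank(context):
    """Expert for the ambiguous word 'bank': examines neighbouring
    words to decide between its senses."""
    if {"river", "water", "shore"} & context:
        return "bank/terrain"
    if {"money", "loan", "deposit"} & context:
        return "bank/institution"
    return "bank/unknown"

EXPERTS = {"bank": expert_bank}

def parse(words):
    """Consult each word's expert with the rest of the sentence as context;
    words without experts pass through unchanged."""
    return [EXPERTS[w](set(words) - {w}) if w in EXPERTS else w
            for w in words]
```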

  11. Using Bayesian networks to support decision-focused information retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehner, P.; Elsaesser, C.; Seligman, L.

    This paper describes an approach to controlling the process of pulling data and information from distributed databases in a way that is specific to a person's decision-making context. Our prototype implementation of this approach uses a knowledge-based planner to generate a plan; an automatically constructed Bayesian network to evaluate the plan; specialized processing of the network to derive key information items that would substantially impact the evaluation of the plan (e.g., determine that replanning is needed); and automated construction of Standing Requests for Information (SRIs), which are automated functions that monitor changes and trends in distributed databases that are relevant to the key information items. The emphasis of this paper is on how Bayesian networks are used.
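    In the simplest case, using a Bayesian network to decide whether incoming evidence should trigger replanning reduces to a posterior update and a threshold test. A two-node sketch with invented probabilities (the prototype's networks are automatically constructed and far larger):

```python
def posterior_threat(prior, p_report_given_threat, p_report_given_no_threat):
    """P(threat | report) by Bayes' rule for the two-node network
    threat -> report."""
    joint_t = prior * p_report_given_threat
    joint_n = (1.0 - prior) * p_report_given_no_threat
    return joint_t / (joint_t + joint_n)

def needs_replanning(prior, p_tp, p_fp, threshold=0.5):
    """Flag replanning when the posterior crosses a decision threshold,
    mimicking a Standing Request for Information firing."""
    return posterior_threat(prior, p_tp, p_fp) >= threshold
```

    An SRI in this framing is the monitored quantity plus the threshold: the database is watched, and only evidence that moves the posterior across the threshold surfaces to the decision maker.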

  12. Buffered coscheduling for parallel programming and enhanced fault tolerance

    DOEpatents

    Petrini, Fabrizio [Los Alamos, NM; Feng, Wu-chun [Los Alamos, NM

    2006-01-31

    A computer implemented method schedules processor jobs on a network of parallel machine processors or distributed system processors. Control information communications generated by each process performed by each processor during a defined time interval is accumulated in buffers, where adjacent time intervals are separated by strobe intervals for a global exchange of control information. A global exchange of the control information communications at the end of each defined time interval is performed during an intervening strobe interval so that each processor is informed by all of the other processors of the number of incoming jobs to be received by each processor in a subsequent time interval. The buffered coscheduling method of this invention also enhances the fault tolerance of a network of parallel machine processors or distributed system processors.
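    The mechanism described — buffer communication during each time interval, then use the strobe to tell every processor how much incoming traffic to expect — can be sketched with a serial simulation. Function names are illustrative; the patent covers the scheduling method, not this code:

```python
from collections import Counter

def buffer_interval(messages):
    """Accumulate the messages generated during one time interval.
    Each message is a (source, destination) processor pair."""
    return list(messages)

def strobe_exchange(buffered, n_procs):
    """Global exchange at the strobe: every processor learns how many
    incoming messages it will receive in the next interval."""
    incoming = Counter(dst for _, dst in buffered)
    return [incoming.get(p, 0) for p in range(n_procs)]
```

    Knowing the incoming counts ahead of time is what lets each processor pre-allocate buffers and schedule the next interval without blocking, which is also the hook for the fault-tolerance claim: a missing contribution at the strobe identifies a failed processor.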

  13. Modeling, Monitoring and Fault Diagnosis of Spacecraft Air Contaminants

    NASA Technical Reports Server (NTRS)

    Ramirez, W. Fred; Skliar, Mikhail; Narayan, Anand; Morgenthaler, George W.; Smith, Gerald J.

    1996-01-01

    Progress and results in the development of an integrated air quality modeling, monitoring, fault detection, and isolation system are presented. The focus was on the development of distributed models of air contaminant transport, the study of air quality monitoring techniques based on the model of the transport process and on-line contaminant concentration measurements, and sensor placement. Different approaches to the modeling of spacecraft air contamination are discussed, and a three-dimensional distributed-parameter air contaminant dispersion model applicable to both laminar and turbulent transport is proposed. A two-dimensional approximation of the full-scale transport model is also proposed, based on spatial averaging of the three-dimensional model over the least important space coordinate. A computer implementation of the transport model is considered, and a detailed development of two- and three-dimensional models, illustrated by contaminant transport simulation results, is presented. The well-established Kalman filtering approach is suggested as a method for generating on-line contaminant concentration estimates based on both real-time measurements and the model of the contaminant transport process. It is shown that the high computational requirements of the traditional Kalman filter can make its real-time implementation difficult for high-dimensional transport models, and a novel implicit Kalman filtering algorithm is proposed that is shown to lead to an order-of-magnitude faster computer implementation in the case of air quality monitoring.
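The paper's implicit filter is not given, but the standard (explicit) Kalman recursion it accelerates can be sketched for a scalar concentration state; the dynamics coefficient and noise variances below are illustrative, not taken from the paper:

```python
def kalman_1d(zs, a=0.95, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter: x_{k+1} = a*x_k + w (var q), z_k = x_k + v (var r)."""
    x, p, estimates = x0, p0, []
    for z in zs:
        # Predict step: propagate state and error covariance
        x, p = a * x, a * a * p + q
        # Update step: blend prediction with the new measurement
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings around a true concentration of ~1.0 (hypothetical units)
readings = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
est = kalman_1d(readings)
print(est)
```

For a distributed transport model the scalars become large matrices, which is exactly where the cost the paper addresses comes from: the covariance update is cubic in the state dimension.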

  14. CIF2Cell: Generating geometries for electronic structure programs

    NASA Astrophysics Data System (ADS)

    Björkman, Torbjörn

    2011-05-01

    The CIF2Cell program generates the geometrical setup for a number of electronic structure programs based on the crystallographic information in a Crystallographic Information Framework (CIF) file. The program will retrieve the space group number, Wyckoff positions and crystallographic parameters, make a sensible choice for Bravais lattice vectors (primitive or principal cell) and generate all atomic positions. Supercells can be generated and alloys are handled gracefully. The code currently has output interfaces to the electronic structure programs ABINIT, CASTEP, CPMD, Crystal, Elk, Exciting, EMTO, Fleur, RSPt, Siesta and VASP.
    Program summary
    Program title: CIF2Cell
    Catalogue identifier: AEIM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU GPL version 3
    No. of lines in distributed program, including test data, etc.: 12 691
    No. of bytes in distributed program, including test data, etc.: 74 933
    Distribution format: tar.gz
    Programming language: Python (versions 2.4-2.7)
    Computer: Any computer that can run Python (versions 2.4-2.7)
    Operating system: Any operating system that can run Python (versions 2.4-2.7)
    Classification: 7.3, 7.8, 8
    External routines: PyCIFRW [1]
    Nature of problem: Generate the geometrical setup of a crystallographic cell for a variety of electronic structure programs from data contained in a CIF file.
    Solution method: The CIF file is parsed using routines contained in the library PyCIFRW [1], and crystallographic as well as bibliographic information is extracted. The program then generates the principal cell from symmetry information, crystal parameters, space group number and Wyckoff sites. Reduction to a primitive cell is then performed, and the resulting cell is output to suitably named files along with documentation of the information source generated from any bibliographic information contained in the CIF file. If the space group symmetries are not present in the CIF file, the program falls back on internal tables, so only the minimal input of space group, crystal parameters and Wyckoff positions is required. Additional key features are handling of alloys and supercell generation.
    Additional comments: Currently implements support for the following general-purpose electronic structure programs: ABINIT [2,3], CASTEP [4], CPMD [5], Crystal [6], Elk [7], exciting [8], EMTO [9], Fleur [10], RSPt [11], Siesta [12] and VASP [13-16].
    Running time: The examples provided in the distribution take only seconds to run.
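The generation of all atomic positions from symmetry information can be mimicked schematically (this is not CIF2Cell's actual code): apply each symmetry operation, given as a rotation matrix plus translation in fractional coordinates, to a representative Wyckoff position and keep the distinct images modulo lattice translations:

```python
def apply_symmetry(ops, position, tol=1e-6):
    """Expand one representative fractional coordinate into its full orbit.

    ops: list of (3x3 rotation matrix, translation vector), fractional coords.
    """
    orbit = []
    for rot, trans in ops:
        new = [
            (sum(rot[i][j] * position[j] for j in range(3)) + trans[i]) % 1.0
            for i in range(3)
        ]
        # Deduplicate modulo lattice translations (wrap differences into [-0.5, 0.5))
        dup = any(
            all(abs((a - b + 0.5) % 1.0 - 0.5) < tol for a, b in zip(new, seen))
            for seen in orbit
        )
        if not dup:
            orbit.append(new)
    return orbit

# Space group P-1 (no. 2): identity and inversion (a minimal, hand-written op set)
identity = ([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0, 0, 0])
inversion = ([[-1, 0, 0], [0, -1, 0], [0, 0, -1]], [0, 0, 0])
orbit = apply_symmetry([identity, inversion], [0.3, 0.1, 0.25])
print(orbit)
```

A position sitting on the inversion centre (e.g. the origin) maps onto itself and yields an orbit of one, which is how special Wyckoff sites fall out of the same routine.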

  15. Merging W⁺W⁻ and W⁺W⁻ + jet with Minlo'

    DOE PAGES

    Hamilton, Keith; Melia, Tom; Monni, Pier Francesco; ...

    2016-09-12

    We present a simulation program for the production of a pair of W bosons in association with a jet, which can be used in conjunction with general-purpose shower Monte Carlo generators, according to the Powheg method. We have further adapted and implemented the Minlo' method on top of the NLO calculation underlying our W⁺W⁻ + jet generator. Thus, the resulting simulation achieves NLO accuracy not only for inclusive distributions in W⁺W⁻ + jet production but also in W⁺W⁻ production, i.e. when the associated jet is not resolved, without the introduction of any unphysical merging scale. This work represents the first extension of the Minlo' method, in its original form, to the case of a genuine underlying 2 → 2 process, with non-trivial virtual corrections.

  16. Test Results from a High Power Linear Alternator Test Rig

    NASA Technical Reports Server (NTRS)

    Birchenough, Arthur G.; Hervol, David S.; Gardner, Brent G.

    2010-01-01

    Stirling cycle power conversion is an enabling technology that provides high thermodynamic efficiency but also presents unique challenges with regard to electrical power generation, management, and distribution. The High Power Linear Alternator Test Rig (HPLATR) located at the NASA Glenn Research Center (GRC) in Cleveland, OH is a demonstration test bed that simulates electrical power generation from a Stirling engine driven alternator. It implements the high power electronics necessary to provide a well regulated DC user load bus. These power electronics use a novel design solution that includes active rectification and power factor control, active ripple suppression, along with a unique building block approach that permits the use of high voltage or high current alternator designs. This presentation describes the HPLATR, the test program, and the operational results.

  17. The Next Generation Science Standards: A potential revolution for geoscience education

    NASA Astrophysics Data System (ADS)

    Wysession, Michael E.

    2014-05-01

    The first and only set of U.S. nationally distributed K-12 science education standards has been adopted by many states across America, with the potential to be adopted by many more. Earth and space science plays a prominent role in the new standards, with particular emphasis on critical Earth issues such as climate change, sustainability, and human impacts on Earth systems. In the states that choose to adopt the Next Generation Science Standards (NGSS), American youth will have a rigorous practice-based formal education in these important areas. Much work needs to be done to ensure the adoption and adequate implementation of the NGSS by a majority of American states, however, and there are many things that Earth and space scientists can do to help facilitate the process.

  18. Test Results From a High Power Linear Alternator Test Rig

    NASA Technical Reports Server (NTRS)

    Birchenough, Arthur G.; Hervol, David S.; Gardner, Brent G.

    2010-01-01

    Stirling cycle power conversion is an enabling technology that provides high thermodynamic efficiency but also presents unique challenges with regard to electrical power generation, management, and distribution. The High Power Linear Alternator Test Rig (HPLATR) located at the NASA Glenn Research Center (GRC) in Cleveland, Ohio is a demonstration test bed that simulates electrical power generation from a Stirling engine driven alternator. It implements the high power electronics necessary to provide a well regulated DC user load bus. These power electronics use a novel design solution that includes active rectification and power factor control, active ripple suppression, along with a unique building block approach that permits the use of high voltage or high current alternator designs. This report describes the HPLATR, the test program, and the operational results.

  19. Implementation of personalized medicine in Central-Eastern Europe: pitfalls and potentials based on citizen's attitude.

    PubMed

    Balicza, Peter; Terebessy, Andras; Grosz, Zoltan; Varga, Noemi Agnes; Gal, Aniko; Fekete, Balint Andras; Molnar, Maria Judit

    2018-03-01

    Next-generation sequencing is increasingly utilized worldwide as a research and diagnostic tool and is anticipated to be implemented into everyday clinical practice. Since the Central-Eastern European attitude toward genetic testing, especially broad genetic testing, is not well known, we performed a survey on this issue among Hungarian participants. A self-administered questionnaire was distributed among patients and patient relatives at our neurogenetic outpatient clinic. Members of the general population were also recruited via public media. We used chi-square testing and binary logistic regression to examine factors influencing attitude. We identified a mixed attitude toward genetic testing. Access to physician consultation positively influenced attitude. A higher self-determined genetic familiarity score was associated with a higher perceived genetic influence score, which in turn was associated with greater willingness to participate in genetic testing. Medical professionals constituted a skeptical group. We think that, given the controversies and complexities of the next-generation sequencing field, the optimal clinical translation of NGS data should be performed in institutions which have the unique capability to provide interprofessional health education, transformative biomedical research, and crucial patient care. With optimization of the clinical translation process, improvement of genetic literacy may increase patient engagement and empowerment. The paper highlights that in countries with relatively low genetic literacy, a special strategy is needed to enhance the implementation of personalized medicine.

  20. Focal-Plane Sensing-Processing: A Power-Efficient Approach for the Implementation of Privacy-Aware Networked Visual Sensors

    PubMed Central

    Fernández-Berni, Jorge; Carmona-Galán, Ricardo; del Río, Rocío; Kleihorst, Richard; Philips, Wilfried; Rodríguez-Vázquez, Ángel

    2014-01-01

    The capture, processing and distribution of visual information is one of the major challenges for the paradigm of the Internet of Things. Privacy emerges as a fundamental barrier to overcome. The idea of networked image sensors pervasively collecting data generates social rejection in the face of sensitive information being tampered by hackers or misused by legitimate users. Power consumption also constitutes a crucial aspect. Images contain a massive amount of data to be processed under strict timing requirements, demanding high-performance vision systems. In this paper, we describe a hardware-based strategy to concurrently address these two key issues. By conveying processing capabilities to the focal plane in addition to sensing, we can implement privacy protection measures just at the point where sensitive data are generated. Furthermore, such measures can be tailored for efficiently reducing the computational load of subsequent processing stages. As a proof of concept, a full-custom QVGA vision sensor chip is presented. It incorporates a mixed-signal focal-plane sensing-processing array providing programmable pixelation of multiple image regions in parallel. In addition to this functionality, the sensor exploits reconfigurability to implement other processing primitives, namely block-wise dynamic range adaptation, integral image computation and multi-resolution filtering. The proposed circuitry is also suitable to build a granular space, becoming the raw material for subsequent feature extraction and recognition of categorized objects. PMID:25195849
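Focal-plane pixelation itself is simple to state in software terms, although the chip performs it in mixed-signal hardware before readout. A toy sketch of block-wise averaging over designated privacy regions (the grid size and region coordinates are invented):

```python
def pixelate(image, block, regions):
    """Replace each listed region with per-block averages (privacy masking).

    image: 2D list of grey levels; regions: list of (r0, c0, r1, c1) rectangles.
    """
    out = [row[:] for row in image]
    for r0, c0, r1, c1 in regions:
        for br in range(r0, r1, block):
            for bc in range(c0, c1, block):
                cells = [(r, c) for r in range(br, min(br + block, r1))
                                for c in range(bc, min(bc + block, c1))]
                avg = sum(image[r][c] for r, c in cells) // len(cells)
                for r, c in cells:
                    out[r][c] = avg
    return out

# 4x4 test "image" with distinct grey levels, fully masked with 2x2 blocks
img = [[r * 4 + c for c in range(4)] for r in range(4)]
masked = pixelate(img, 2, [(0, 0, 4, 4)])
print(masked)
```

Doing this at the sensor means only the averaged values ever leave the focal plane, which is the privacy argument, and the averaged blocks also reduce the data volume for later stages.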

  1. Focal-plane sensing-processing: a power-efficient approach for the implementation of privacy-aware networked visual sensors.

    PubMed

    Fernández-Berni, Jorge; Carmona-Galán, Ricardo; del Río, Rocío; Kleihorst, Richard; Philips, Wilfried; Rodríguez-Vázquez, Ángel

    2014-08-19

    The capture, processing and distribution of visual information is one of the major challenges for the paradigm of the Internet of Things. Privacy emerges as a fundamental barrier to overcome. The idea of networked image sensors pervasively collecting data generates social rejection in the face of sensitive information being tampered by hackers or misused by legitimate users. Power consumption also constitutes a crucial aspect. Images contain a massive amount of data to be processed under strict timing requirements, demanding high-performance vision systems. In this paper, we describe a hardware-based strategy to concurrently address these two key issues. By conveying processing capabilities to the focal plane in addition to sensing, we can implement privacy protection measures just at the point where sensitive data are generated. Furthermore, such measures can be tailored for efficiently reducing the computational load of subsequent processing stages. As a proof of concept, a full-custom QVGA vision sensor chip is presented. It incorporates a mixed-signal focal-plane sensing-processing array providing programmable pixelation of multiple image regions in parallel. In addition to this functionality, the sensor exploits reconfigurability to implement other processing primitives, namely block-wise dynamic range adaptation, integral image computation and multi-resolution filtering. The proposed circuitry is also suitable to build a granular space, becoming the raw material for subsequent feature extraction and recognition of categorized objects.

  2. MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data

    PubMed Central

    Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong

    2015-01-01

    Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and there does not exist any SNP caller that produces p-values for calling SNPs in a frequentist framework. To fill this gap, we develop a new method, MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations, the involved parameter is very close to the boundary of the parameter space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters in the mixture distribution are estimated, p-values can easily be calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of FDR than the existing SNP callers. Through application to two real datasets, MAFsnp is also shown to outperform the existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
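The p-value computation under a zero-inflated mixture can be sketched as follows; this is a simplified stand-in, assuming the continuous component is a plain chi-square with 1 degree of freedom rather than MAFsnp's fitted two-parameter family:

```python
import math

def chi2_1_sf(x):
    """Survival function of chi-square with 1 df: P(X >= x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

def mixture_pvalue(t, pi0):
    """P-value of an observed LRT statistic t under the mixture
    T ~ pi0 * (point mass at 0) + (1 - pi0) * chi2_1.

    pi0 is the estimated weight of the zero component.
    """
    if t <= 0.0:
        return 1.0  # every draw from the mixture is >= 0, so P(T >= 0) = 1
    return (1.0 - pi0) * chi2_1_sf(t)

print(mixture_pvalue(0.0, 0.5))
print(mixture_pvalue(3.84, 0.5))  # roughly half the usual chi2_1 p of ~0.05
```

The point of the mixture is visible here: for the same statistic, the zero mass deflates the p-value relative to a pure chi-square reference, tightening calls at sites with little evidence.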

  3. Water Data Infrastructure for Next-Generation e-Water-Services in Flanders

    NASA Astrophysics Data System (ADS)

    Smets, Steven; Pannemans, Bart; Minnema, Bennie; Weerts, Albrech H.; de Rooij, Erik; Natschke, Michael; Stiers, Walter; Wolfs, Vincenct; Willems, Patrick; Vansteenkiste, Thomas; Cauwenberghs, Kris

    2017-04-01

    Efficient sharing of water data and services (e.g. models, tools) is a challenging task. Several EU projects (e.g. DRIHM) have already investigated some of the bottlenecks. In a new project, we investigated several issues in establishing a Water Data Infrastructure (WDI) for e-Water-Services in Flanders. Important features of such a WDI are:
    - institutional arrangements
    - agreement on technology and standards
    - agreement on the dissemination of water-related data and tools
    The goal of the WDI is to arrive at one (distributed) environment with models, data and tools for professionals, scientists and citizens to analyse data and run (the latest state-of-the-art) models without (direct) interaction with the providers and developers of these data, models and tools. In the project, a WDI architecture was developed and proposed based on the developed WDI principles. The WDI principles and architecture were tested and demonstrated with three proofs of concept (in which the execution, running and visualisation of lumped and distributed hydrological models and hydraulic models were distributed over the infrastructure of the different project partners). We will present the WDI principles and architecture and their implementation for three use cases (operational, policy, and on-the-fly modelling of accidents, e.g. spills). Results of the proofs of concept will be shown. It was found that institutional arrangements are the biggest hurdle for the implementation of such a WDI.

  4. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public-domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  5. IFIS Model-Plus: A Web-Based GUI for Visualization, Comparison and Evaluation of Distributed Flood Forecasts and Hindcasts

    NASA Astrophysics Data System (ADS)

    Krajewski, W. F.; Della Libera Zanchetta, A.; Mantilla, R.; Demir, I.

    2017-12-01

    This work explores the use of hydroinformatics tools to provide a user-friendly and accessible interface for executing and assessing the output of real-time flood forecasts using distributed hydrological models. The main result is the implementation of a web system that uses an Iowa Flood Information System (IFIS)-based environment for graphical displays of rainfall-runoff simulation results for both real-time and past storm events. It communicates with the ASYNCH ODE solver to perform large-scale distributed hydrological modeling based on segmentation of the terrain into hillslope-link hydrologic units. The cyber-platform also allows hindcasting of model performance by testing multiple model configurations and assumptions about vertical flows in the soils. The scope of the currently implemented system is the entire set of contributing watersheds for the territory of the state of Iowa. The interface provides resources for visualization of animated maps of different water-related modeled states of the environment, including flood-wave propagation with classification of flood magnitude, runoff generation, surface soil moisture and total water column in the soil. Additional tools for comparing different model configurations and performing model evaluation by comparing to observed variables at monitored sites are also available. The user-friendly interface has been published to the web under the URL http://ifis.iowafloodcenter.org/ifis/sc/modelplus/.

  6. Poster - 18: New features in EGSnrc for photon cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Elsayed; Mainegra-Hing, Ernesto; Rogers, Davi

    2016-08-15

    Purpose: To implement two new features in the EGSnrc Monte Carlo system. The first is an option to account for photonuclear attenuation, which can contribute a few percent to the total cross section at the higher end of the energy range of interest to medical physics. The second is an option to use exact NIST XCOM photon cross sections. Methods: For the first feature, the photonuclear total cross sections are generated from the IAEA evaluated data. In the current, first-order implementation, after a photonuclear event there is no energy deposition or secondary particle generation. The implementation is validated against deterministic calculations and experimental measurements of transmission signals. For the second feature, before this work, if the user explicitly requested XCOM photon cross sections, EGSnrc still used its own internal incoherent scattering cross sections. These differ by up to 2% from XCOM data between 30 keV and 40 MeV. After this work, exact XCOM incoherent scattering cross sections are an available option. Minor interpolation artifacts in pair and triplet XCOM cross sections are also addressed. The default photon cross sections in EGSnrc are XCOM except for the new incoherent scattering cross sections, which have to be explicitly requested. The photonuclear, incoherent, pair and triplet data from this work are available for elements and compounds for photon energies from 1 keV to 100 GeV. Results: Both features are implemented and validated in EGSnrc. Conclusions: The two features are part of the standard EGSnrc distribution as of version 4.2.3.2.
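The size of the photonuclear effect on a transmission signal can be illustrated with the narrow-beam attenuation law I/I0 = exp(-mu*t); the coefficients below are invented round numbers, with the photonuclear part taken as a ~2% addition to the total attenuation coefficient:

```python
import math

def transmission(mu_per_cm, thickness_cm):
    """Narrow-beam transmission I/I0 = exp(-mu * t)."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative numbers only: attenuation with and without a ~2%
# photonuclear contribution near the giant-dipole-resonance energies.
mu_base = 0.30             # cm^-1, without photonuclear (invented)
mu_total = mu_base * 1.02  # +2% photonuclear (invented)
t = 10.0                   # cm of absorber

t_without = transmission(mu_base, t)
t_with = transmission(mu_total, t)
print(t_without, t_with, 1.0 - t_with / t_without)
```

Because attenuation is exponential, a 2% bump in the coefficient becomes a ~6% change in the transmitted signal through 10 cm here, which is why transmission measurements are a sensitive validation target.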

  7. SU-E-J-127: Implementation of An Online Replanning Tool for VMAT Using Flattening Filter-Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ates, O; Ahunbay, E; Li, X

    2015-06-15

    Purpose: To report the implementation of an online replanning tool based on segment aperture morphing (SAM) for VMAT with flattening filter free (FFF) beams. Methods: The previously reported SAM algorithm, modified to accommodate VMAT with FFF beams, was implemented in a tool that was interfaced with a treatment planning system (Monaco, Elekta). The tool allows (1) output of the beam parameters of the original VMAT plan from Monaco, and (2) input of the apertures generated by the SAM algorithm into Monaco for dose calculation on daily CT/CBCT/MRI, in the following steps: (1) quickly generating the target contour based on the image of the day, using an auto-segmentation tool (ADMIRE, Elekta) with manual editing if necessary; (2) morphing the apertures of the original VMAT plan based on SAM to account for the interfractional change of the target from the planning to the daily images; (3) calculating the dose distribution for the new apertures with the same numbers of MU as in the original plan; (4) transferring the new plan into a record & verify system (MOSAIQ, Elekta); (5) performing a software-based pre-delivery QA; (6) delivering the adaptive plan for the fraction. This workflow was implemented on 16-CPU (2.6 GHz dual-core) hardware with GPU and was tested for sample cases of prostate, pancreas and lung tumors. Results: The online replanning process can be completed within 10 minutes. The adaptive plans generally improved the plan quality when compared to the IGRT repositioning plans. The adaptive plans with FFF beams have better normal tissue sparing as compared with those of FF beams. Conclusion: The online replanning tool based on SAM can quickly generate adaptive VMAT plans using FFF beams, with improved plan quality compared to IGRT repositioning plans based on daily CT/CBCT/MRI, and can be used clinically. This research was supported by Elekta Inc. (Crawley, UK)

  8. Network-based Arbitrated Quantum Signature Scheme with Graph State

    NASA Astrophysics Data System (ADS)

    Ma, Hongling; Li, Fei; Mao, Ningyi; Wang, Yijun; Guo, Ying

    2017-08-01

    Implementing an arbitrated quantum signature (QAS) through complex networks is an interesting cryptography technology in the literature. In this paper, we propose an arbitrated quantum signature for multi-user-involved networks whose topological structures are established by the encoded graph state. The determinative transmission of the shared keys is enabled by the appropriate stabilizers performed on the graph state. The implementation of this scheme depends on the deterministic distribution of the multi-user-shared graph state, on which the encoded message can be processed in the signing and verifying phases. There are four parties involved: the signatory Alice, the verifier Bob, the arbitrator Trent, and the dealer who assists the legal participants in signature generation and verification. The security is guaranteed by the entanglement of the encoded graph state, which is cooperatively prepared by legal participants in complex quantum networks.

  9. Performance, implementation and network management techniques for a European CDMA-based land-mobile satellite system

    NASA Astrophysics Data System (ADS)

    Arenaccio, S.; Vernucci, A.; Padovani, R.; Arcidiacono, A.

    Results of a detailed comparative performance assessment between two candidate access solutions, FDMA and CDMA, for the provision of European Land-Mobile Satellite Services (LMSS) are presented. The design of the CDMA access system and the network architecture, system procedures, network control, operation in fading environments, and implementation aspects of the system are described. The CDMA system is shown to yield superior traffic capability, despite the absence of polarization reuse due to payload design, especially in the second-generation era (multiple spot beams). In this case, the advantage was found to be largely dependent on the traffic distribution across spot beams. Power control techniques are proposed to cope with the geographical disadvantage suffered by mobile stations located at the beam borders and to compensate for fading.

  10. Electron beam therapy with coil-generated magnetic fields.

    PubMed

    Nardi, Eran; Barnea, Gideon; Ma, Chang-Ming

    2004-06-01

    This paper presents an initial study on the issues involved in the practical implementation of the use of transverse magnetic fields in electron beam therapy. By using such magnetic fields the dose delivered to the tumor region can increase significantly relative to that deposited to the healthy tissue. Initially we calculated the magnetic fields produced by the Helmholtz coil and modified Helmholtz coil configurations. These configurations, which can readily be used to generate high intensity magnetic fields, approximate the idealized magnetic fields studied in our previous publications. It was therefore of interest to perform a detailed study of the fields produced by these configurations. Electron beam dose distributions for 15 MeV electrons were calculated using the ACCEPTM code for a 3T transverse magnetic field produced by the modified Helmholtz configuration. The dose distribution was compared to those obtained with no magnetic field. The results were similar to those obtained in our previous work, where an idealized step function magnetic field was used and a 3T field was shown to be the optimal field strength. A simpler configuration was also studied in which a single external coil was used to generate the field. Electron dose distributions are also presented for a given geometry and given magnetic field strength using this configuration. The results indicate that this method is more difficult to apply to radiotherapy due to its lack of symmetry and its irregularity. For the various configurations dealt with here, a major problem is the need to shield the magnetic field in the beam propagation volume, a topic that must be studied in detail.
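The on-axis field of the Helmholtz configuration follows from superposing two circular-coil fields; at the midpoint this reduces to the closed form B = (4/5)^(3/2) * mu0 * N * I / R. A short numerical check with hypothetical coil parameters (not the paper's 3 T setup):

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def loop_axis_field(current, radius, n_turns, z):
    """On-axis field of a single circular coil at distance z from its plane."""
    return MU0 * n_turns * current * radius**2 / (2.0 * (radius**2 + z**2) ** 1.5)

def helmholtz_center_field(current, radius, n_turns):
    """Two coaxial coils separated by one coil radius; field at the midpoint."""
    return 2.0 * loop_axis_field(current, radius, n_turns, radius / 2.0)

# Hypothetical coil: 500 turns per coil, 100 A, 10 cm radius
I, R, N = 100.0, 0.1, 500
b = helmholtz_center_field(I, R, N)
print(b, (0.8 ** 1.5) * MU0 * N * I / R)  # numeric vs closed form
```

The R-separation is what makes the field nearly uniform at the centre (the second derivative of B along the axis vanishes there), which is why this geometry approximates the idealized step-function field of the earlier work.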

  11. Evaluating the validity of multiple imputation for missing physiological data in the national trauma data bank.

    PubMed

    Moore, Lynne; Hanley, James A; Lavoie, André; Turgeon, Alexis

    2009-05-01

    The National Trauma Data Bank (NTDB) is plagued by the problem of missing physiological data. The Glasgow Coma Scale score, Respiratory Rate, and Systolic Blood Pressure are an essential part of risk adjustment strategies for trauma system evaluation and clinical research. Missing data on these variables may compromise the feasibility and the validity of trauma group comparisons. Our objective was to evaluate the validity of Multiple Imputation (MI) for completing missing physiological data in the NTDB by assessing the impact of MI on (1) frequency distributions, (2) associations with mortality, and (3) risk adjustment. Analyses were based on 170,956 NTDB observations with complete physiological data (observed data set). Missing physiological data were artificially imposed on this data set and then imputed using MI (MI data set). To assess the impact of MI on risk adjustment, 100 pairs of hospitals were randomly selected with replacement and compared using adjusted Odds Ratios (OR) of mortality. ORs generated by the observed data set were then compared to those generated by the MI data set. Frequency distributions and associations with mortality were preserved following MI. The median absolute difference between adjusted ORs of mortality generated by the observed data set and by the MI data set was 3.6% (inter-quartile range: 2.4%-6.1%). This study suggests that, provided it is implemented with care, MI of missing physiological data in the NTDB leads to valid frequency distributions, preserves associations with mortality, and does not compromise risk adjustment in inter-hospital comparisons of mortality.
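The impute-many-times-then-pool workflow can be caricatured in a few lines; the sketch below uses a deliberately crude imputation model (draws from the observed values rather than a fitted regression) and pools point estimates by simple averaging, with invented physiological readings:

```python
import random
import statistics

def multiply_impute_mean(values, n_imputations=20, seed=1):
    """Estimate a mean from data with missing entries (None) via MI.

    Each imputation fills gaps by drawing from the observed values
    (a crude stand-in for a proper imputation model); the per-imputation
    estimates are then pooled by averaging, and their spread measures
    the extra uncertainty due to imputation.
    """
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    estimates = []
    for _ in range(n_imputations):
        filled = [v if v is not None else rng.choice(observed) for v in values]
        estimates.append(statistics.mean(filled))
    pooled = statistics.mean(estimates)
    between_var = statistics.variance(estimates)  # between-imputation variance
    return pooled, between_var

# Invented systolic blood pressure readings with two missing entries
data = [120, 95, None, 110, None, 130, 100]
pooled, bvar = multiply_impute_mean(data)
print(pooled, bvar)
```

The between-imputation variance is the quantity a full Rubin's-rules analysis would add to the within-imputation variance; if it dominates, the missingness itself is driving the uncertainty.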

  12. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Ravindra; Reilly, James T.; Wang, Jianhui

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing a DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  13. SU-E-T-626: Accuracy of Dose Calculation Algorithms in MultiPlan Treatment Planning System in Presence of Heterogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moignier, C; Huet, C; Barraux, V

    Purpose: Advanced stereotactic radiotherapy (SRT) treatments require accurate dose calculation for treatment planning, especially for treatment sites involving heterogeneous patient anatomy. The purpose of this study was to evaluate the accuracy of the dose calculation algorithms, Raytracing and Monte Carlo (MC), implemented in the MultiPlan treatment planning system (TPS) in the presence of heterogeneities. Methods: First, the LINAC of a CyberKnife radiotherapy facility was modeled with the PENELOPE MC code. A protocol for the measurement of dose distributions with EBT3 films was established and validated through comparison of experimental dose distributions with dose distributions calculated by the MultiPlan Raytracing and MC algorithms, as well as by the PENELOPE MC model, for treatments planned with the homogeneous Easycube phantom. Finally, bone and lung inserts were used to set up a heterogeneous Easycube phantom. Treatment plans with the 10, 7.5, or 5 mm field sizes were generated in the MultiPlan TPS with different tumor localizations (in the lung and at the lung/bone/soft tissue interface). Experimental dose distributions were compared to the PENELOPE MC and MultiPlan calculations using the gamma index method. Results: Regarding the experiment in the homogeneous phantom, 100% of the points passed the 3%/3mm tolerance criteria. These criteria include the global error of the method (CT-scan resolution, EBT3 dosimetry, LINAC positioning …), and were used afterwards to estimate the accuracy of the MultiPlan algorithms in heterogeneous media. Comparison of the dose distributions obtained in the heterogeneous phantom is in progress. Conclusion: This work has led to the development of numerical and experimental dosimetric tools for small-beam dosimetry. The Raytracing and MC algorithms implemented in the MultiPlan TPS were evaluated in heterogeneous media.

  14. Low-cost high performance distributed data storage for multi-channel observations

    NASA Astrophysics Data System (ADS)

    Liu, Ying-bo; Wang, Feng; Deng, Hui; Ji, Kai-fan; Dai, Wei; Wei, Shou-lin; Liang, Bo; Zhang, Xiao-li

    2015-10-01

    The New Vacuum Solar Telescope (NVST) is a 1-m solar telescope that aims to observe fine structures in both the photosphere and the chromosphere of the Sun. The observational data acquired simultaneously from one channel for the chromosphere and two channels for the photosphere bring great challenges to the data storage of NVST. The multi-channel instruments of NVST, including scientific cameras and multi-band spectrometers, generate at least 3 terabytes of data per day and require high access performance while storing massive numbers of short-exposure images. It is worth studying and implementing a storage system for NVST that balances data availability, access performance, and development cost. In this paper, we build a distributed data storage system (DDSS) for NVST and evaluate in depth the availability of real-time data storage in a distributed computing environment. The experimental results show that two factors, the number of concurrent reads/writes and the file size, are critically important for improving the performance of data access in a distributed environment. Based on these two factors, three strategies for storing FITS files are presented and implemented to ensure the access performance of the DDSS under conditions of simultaneous multi-host writes and reads. Real-world application of the DDSS proves that the system is capable of meeting the requirements of NVST real-time, high-performance observational data storage. Our study of the DDSS is the first attempt for modern astronomical telescope systems to store real-time observational data on a low-cost distributed system. The research results and corresponding techniques of the DDSS provide a new option for designing real-time massive astronomical data storage systems and will be a reference for future astronomical data storage.

  15. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive, since they require multiple tsunami numerical simulations, which limits their versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using response-surface methodology; we expanded their study to develop a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the response variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that earthquake occurrence follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation depth, and vulnerability, and based on these we conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting multiple tsunami numerical simulations.
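    The risk calculation described above can be sketched directly from the reported response surface. Only the coefficients a, b, and c come from the abstract; the fault-parameter ranges and the log-normal fragility parameters in this example are assumptions for illustration, so the resulting damage probability will not match the 22.5% reported for the Tokyo Bay building.

    ```python
    from math import erf, log, sqrt

    import numpy as np

    rng = np.random.default_rng(42)

    # Response surface fitted in the study: inundation depth y [m] as a
    # linear function of fault depth x1 and fault slip x2 (coefficients as
    # reported in the abstract)
    a, b, c = 0.2615, 3.1763, -1.1802

    # Assumed (illustrative) distributions for the fault parameters
    n = 200_000
    x1 = rng.uniform(0.0, 10.0, n)   # fault depth (assumed range)
    x2 = rng.uniform(0.5, 1.5, n)    # fault slip [m] (assumed range)
    y = np.maximum(a * x1 + b * x2 + c, 0.0)   # inundation depth, floored at 0

    # Log-normal fragility curve: damage probability given inundation depth
    median, beta = 3.0, 0.5          # assumed fragility parameters

    def fragility(depth):
        if depth <= 0.0:
            return 0.0
        return 0.5 * (1.0 + erf(log(depth / median) / (beta * sqrt(2.0))))

    # Expected damage probability conditional on an earthquake occurring;
    # layering a Poisson occurrence model on top converts this into an
    # annualized risk over the simulated time horizon
    p_damage = float(np.mean([fragility(d) for d in y]))
    print(round(p_damage, 3))
    ```

    The key saving the method offers is visible here: once the response surface is fitted from a small set of tsunami simulations, each Monte Carlo sample is a single arithmetic evaluation rather than a full numerical simulation.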

  16. Parallelization of a Monte Carlo particle transport simulation code

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.

    2010-05-01

    We have developed a high-performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes with 2 dual-core AMD Opteron processors each, and a 200-node dual-processor HP cluster. For large problem sizes, limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow for the study of higher particle energies with more accurate physical models, and improve statistics, as more particle tracks can be simulated in a short response time.
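    The parallel random-number strategy described above (independent streams per worker, the role played by SPRNG and DCMT) can be sketched with NumPy's SeedSequence spawning. The toy "transport" kernel below is a hypothetical stand-in for particle histories, and the per-worker batches are run serially here rather than over MPI.

    ```python
    import numpy as np

    def track_batch(seed_seq, n):
        # Each worker gets its own generator built from an independent
        # SeedSequence child, so streams do not overlap across workers.
        rng = np.random.default_rng(seed_seq)
        pts = rng.random((n, 2))
        # Stand-in for particle histories: hit test inside the unit circle
        return int(np.count_nonzero((pts ** 2).sum(axis=1) < 1.0))

    n_workers, n_per = 4, 250_000
    streams = np.random.SeedSequence(2010).spawn(n_workers)  # one per rank
    # Executed serially here; in a real MPI code each batch would run on a
    # separate rank and the hit counts would be reduced across ranks.
    hits = sum(track_batch(s, n_per) for s in streams)
    pi_est = 4.0 * hits / (n_workers * n_per)
    print(round(pi_est, 2))
    ```

    Because each stream is reproducible from its seed, the combined result is independent of how the batches are distributed over workers, which mirrors how the MC4 validation could compare serial and parallel runs using the same generator.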

  17. Control strategy of an electrically actuated morphing flap for the next generation green regional aircraft

    NASA Astrophysics Data System (ADS)

    Arena, Maurizio; Noviello, Maria Chiara; Rea, Francesco; Amoroso, Francesco; Pecora, Rosario

    2018-03-01

    The design and application of adaptive devices are currently ambitious targets in aviation research addressed at new-generation aircraft. The development of intelligent structures involves aspects of a multidisciplinary nature: the combination of compact architectures, embedded electrical systems, and smart materials allows for developing a highly innovative device. This paper presents the control system design of an innovative morphing flap tailored for the next generation regional aircraft, within the Clean Sky 2 - Airgreen 2 European Research Scenario. A distributed system of electromechanical actuators (EMAs) has been sized to enable up to three operating modes of a structure arranged in four blocks along the chord-wise direction: overall camber morphing; upwards/downwards deflection of the final tip segment; and twisting of the final tip segment. A state-of-the-art feedback logic based on a decentralized control strategy for shape control is outlined, including the results of a dynamic stability analysis based on a rational schematization of the blocks within the Matlab/Simulink® environment. This study was performed by implementing a state-space model that also considers design parameters such as the torsional stiffness and damping of the actuation chain. The design process is moving towards an increasingly "robotized" system, which can be externally controlled to perform certain operations. Future developments will include the implementation of the control laws as well as functionality tests on a real flap prototype.

  18. The Monte Carlo event generator AcerMC versions 2.0 to 3.8 with interfaces to PYTHIA 6.4, HERWIG 6.5 and ARIADNE 4.1

    NASA Astrophysics Data System (ADS)

    Kersevan, Borut Paul; Richter-Waş, Elzbieta

    2013-03-01

    The AcerMC Monte Carlo generator is dedicated to the generation of Standard Model background processes recognised as critical for searches at the LHC, whose generation was previously either unavailable or not straightforward. The program provides a library of massive matrix elements (coded by MADGRAPH) and native phase-space modules for the generation of a set of selected processes. The hard-process event can be completed by initial- and final-state radiation, hadronisation and decays through the existing interfaces with the PYTHIA, HERWIG or ARIADNE event generators and (optionally) TAUOLA and PHOTOS. Interfaces to all these packages are provided in the distribution version. The phase-space generation is based on a multi-channel self-optimising approach using the modified Kajantie-Byckling formalism for phase-space construction; further smoothing of the phase space is obtained with a modified ac-VEGAS algorithm. An additional improvement in recent versions is the inclusion of a consistent prescription for matching matrix-element calculations with parton showering for a select list of processes. Catalogue identifier: ADQQ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADQQ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3853309 No. of bytes in distributed program, including test data, etc.: 68045728 Distribution format: tar.gz Programming language: FORTRAN 77 with popular extensions (g77, gfortran). Computer: All running Linux. Operating system: Linux. Classification: 11.2, 11.6. External routines: CERNLIB (http://cernlib.web.cern.ch/cernlib/), LHAPDF (http://lhapdf.hepforge.org/) Catalogue identifier of previous version: ADQQ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 149 (2003) 142 Does the new version supersede the previous version?: Yes Nature of problem: Despite the large repertoire of processes implemented for generation in event generators like PYTHIA [1] or HERWIG [2], a number of background processes crucial for studying the expected physics of the LHC experiments are missing. For some of these processes the matrix-element expressions are rather lengthy, and/or achieving a reasonable generation efficiency requires tailoring the phase-space selection procedure to the dynamics of the process. It is therefore not practical to expect that any of the above general-purpose generators will contain every possible, or even every observable, process occurring in LHC collisions. A more practical solution is a library of dedicated matrix-element-based generators, with standardised interfaces like that proposed in [3], to the more universal generator which is used to complete the event generation. Solution method: The AcerMC event generator provides a library of matrix-element-based generators for several processes. The initial- and final-state showers, beam remnants and underlying events, fragmentation and remaining decays are performed by the other, universal generator to which this one is interfaced; we call it the supervising generator. Interfaces to PYTHIA 6.4, ARIADNE 4.1 and HERWIG 6.5 as such generators are provided, as is an interface to the TAUOLA [4] and PHOTOS [5] packages for τ-lepton decays (including spin-correlation treatment) and QED radiation in particle decays. At present, the following matrix-element-based processes have been implemented: gg,qq¯→tt¯bb¯; qq¯→W(→ℓν)bb¯; qq¯→W(→ℓν)tt¯; gg,qq¯→Z/γ∗(→ℓℓ)bb¯; gg,qq¯→Z/γ∗(→ℓℓ,νν,bb¯)tt¯; complete EW gg,qq¯→(Z/W/γ∗→)tt¯bb¯; gg,qq¯→tt¯tt¯; gg,qq¯→(tt¯→)ff¯bff¯b¯; gg,qq¯→(WWbb→)ff¯ff¯bb¯. Both interfaces allow the use of the LHAPDF/LHAGLUE library of parton density functions.
    Also provided is a set of control processes: qq¯→W→ℓν; qq¯→Z/γ∗→ℓℓ; gg,qq¯→tt¯; and gg→(tt¯→)WbWb¯. Reasons for new version: Implementation of several new processes and methods. Summary of revisions: Each version added new processes or functionalities; a detailed list is given in the section “Changes since AcerMC 1.0”. Restrictions: The package is optimised for 14 TeV pp collisions simulated in the LHC environment and also works at the achieved LHC energies of 7 TeV and 8 TeV. The consistency between results of the complete generation using the PYTHIA 6.4 or HERWIG 6.5 interfaces is technically limited by the different approaches taken in these two generators for evaluating the αQCD and αQED couplings and by their different fragmentation/hadronisation models. For consistency checks, the AcerMC library contains natively coded definitions of αQCD and αQED; using these native definitions leads to the same total cross-sections with both the PYTHIA 6.4 and HERWIG 6.5 interfaces.

  19. A data grid for imaging-based clinical trials

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.

    2007-03-01

    Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) a well-defined, rigorous clinical trial protocol; 2) a radiology core with a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administer and quickly distribute information to participating radiologists and clinicians worldwide. The Data Grid can satisfy these requirements of imaging-based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate its performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

  20. Design notes for the next generation persistent object manager for CAP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isely, M.; Fischler, M.; Galli, M.

    1995-05-01

    The CAP query system software at Fermilab has several major components, including SQS (for managing the query), the retrieval system (for fetching auxiliary data), and the query software itself. The central query software in particular is essentially a modified version of the `ptool` product created at UIC (University of Illinois at Chicago) as part of the PASS project under Bob Grossman. The original UIC version was designed for use in a single-user, non-distributed Unix environment; the Fermi modifications were an attempt to permit multi-user access to a data set distributed over a set of storage nodes. (The hardware is an IBM SP-x system - a cluster of AIX POWER2 nodes with an IBM-proprietary high-speed switch interconnect.) Since the implementation work on the Fermi-ized ptool, the CAP members have learned quite a bit about the nature of queries and where the current performance bottlenecks exist. This has led them to design a persistent object manager that will overcome these problems. For backwards compatibility with ptool, the ptool persistent object API will largely be retained, but the implementation will be entirely different.

  1. vhMentor: An Ontology Supported Mobile Agent System for Pervasive Health Care Monitoring.

    PubMed

    Christopoulou, Stella C; Kotsilieris, Theodore; Anagnostopoulos, Ioannis; Anagnostopoulos, Christos-Nikolaos; Mylonas, Phivos

    2017-01-01

    Healthcare provision is a set of activities that demands the collaboration of several stakeholders (e.g. physicians, nurses, managers, patients) who hold distinct expertise and responsibilities. In addition, medical knowledge is diversely located and often shared under no central coordination or supervision authority, while medical data flows remain mostly passive in the way data is delivered to both clinicians and patients. In this paper, we propose the implementation of a virtual health Mentor (vhMentor), built as a dedicated ontology schema and a FIPA-compliant agent system. Agent technology proves to be ideal for developing healthcare applications due to its distributed operation over systems and data sources of high heterogeneity. Agents perform their tasks by acting pro-actively to assist individuals in overcoming limitations posed when accessing medical data and executing non-automatic, error-prone processes. vhMentor further comprises the Jess rules engine to implement its reasoning logic. Thus, on the one hand vhMentor is a prototype that fills the gap between healthcare systems and the care provision community, while on the other hand it allows the blending of next generation distributed services in the healthcare domain.

  2. Modelling heavy metals build-up on urban road surfaces for effective stormwater reuse strategy implementation.

    PubMed

    Hong, Nian; Zhu, Panfeng; Liu, An

    2017-12-01

    Urban road stormwater is an alternative water resource for mitigating water shortages worldwide. Heavy metals deposited (build-up) on urban road surfaces can enter road stormwater runoff, undermining stormwater reuse safety. As heavy metal build-up loads exhibit high spatial variability and are strongly influenced by surrounding land uses, it is essential to develop an approach to identify hot-spots where stormwater runoff could contain high heavy metal concentrations and hence cannot be reused unless properly treated. This study developed a robust modelling approach for estimating heavy metal build-up loads on urban roads from land use fractions (the percentages of land uses within a given area) using an artificial neural network (ANN) model. Based on the modelling results, a series of heavy metal load spatial distribution maps and a comprehensive ecological risk map were generated. These maps provide a visualization platform for identifying priority areas where stormwater can be safely reused. Additionally, they can be utilized as an urban land use planning tool in the context of effective stormwater reuse strategy implementation. Copyright © 2017 Elsevier Ltd. All rights reserved.
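    The modelling idea above, mapping land-use fractions at a site to a heavy-metal build-up load with a neural network, can be sketched minimally as follows. The data, network size, units, and training settings are all invented for illustration; the study's actual ANN architecture and field data are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic training set: each row is a road site, each column a land-use
    # fraction (e.g. residential / commercial / industrial share); the target
    # is a heavy-metal build-up load in arbitrary units.
    n = 400
    raw = rng.random((n, 3))
    X = raw / raw.sum(axis=1, keepdims=True)   # fractions sum to 1 per site
    true_w = np.array([0.5, 1.5, 3.0])         # assume industry drives build-up
    y = X @ true_w + rng.normal(0.0, 0.05, n)

    # Minimal one-hidden-layer ANN trained by batch gradient descent
    h, lr = 8, 0.05
    W1, b1 = rng.normal(0, 0.5, (3, h)), np.zeros(h)
    W2, b2 = rng.normal(0, 0.5, h), 0.0

    def forward(X):
        return np.tanh(X @ W1 + b1) @ W2 + b2

    mse_init = float(np.mean((forward(X) - y) ** 2))
    for _ in range(3000):
        z = np.tanh(X @ W1 + b1)
        err = z @ W2 + b2 - y                    # prediction error per site
        W2 -= lr * (z.T @ err) / n
        b2 -= lr * err.mean()
        dz = np.outer(err, W2) * (1.0 - z ** 2)  # backprop through tanh
        W1 -= lr * (X.T @ dz) / n
        b1 -= lr * dz.mean(axis=0)

    mse_final = float(np.mean((forward(X) - y) ** 2))
    print(mse_final < mse_init)  # training should reduce the fit error
    ```

    A trained model of this kind, applied over a grid of areas with known land-use fractions, is what would back the spatial distribution maps described in the abstract.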

  3. Design and implementation of handheld and desktop software for the structured reporting of hepatic masses using the LI-RADS schema.

    PubMed

    Clark, Toshimasa J; McNeeley, Michael F; Maki, Jeffrey H

    2014-04-01

    The Liver Imaging Reporting and Data System (LI-RADS) can enhance communication between radiologists and clinicians if applied consistently. We identified an institutional need to improve liver imaging report standardization and developed handheld and desktop software to serve this purpose. We developed two complementary applications that implement the LI-RADS schema. A mobile application for iOS devices, written in the Objective-C language, allows for rapid characterization of hepatic observations under a variety of circumstances. A desktop application written in the Java language allows for comprehensive observation characterization and standardized report text generation. We chose the applications' languages and feature sets based on the computing resources of the target platforms, anticipated usage scenarios, and ease of application installation, deployment, and updating. Our primary results are the publication of the core source code implementing the LI-RADS algorithm and the availability of the applications for use worldwide via our website, http://www.liradsapp.com/. The Java application is free open-source software that can be integrated into nearly any vendor's reporting system; the iOS application is distributed through Apple's iTunes App Store. The observation categorizations of both programs have been manually validated to be correct. The iOS application has been used to characterize liver tumors during multidisciplinary conferences at our institution, and several faculty members, fellows, and residents have adopted the generated text of the Java application into their diagnostic reports. Although these two applications were developed for the specific reporting requirements of our liver tumor service, we intend to apply this development model to other diseases as well. Through semi-automated structured report generation and observation characterization, we aim to improve patient care while increasing radiologist efficiency. Published by Elsevier Inc.

  4. Two-sided Topp-Leone Weibull distribution

    NASA Astrophysics Data System (ADS)

    Podeang, Krittaya; Bodhisuwan, Winai

    2017-11-01

    In this paper, we introduce a general class of lifetime distributions, called the two-sided Topp-Leone generated family of distributions. A special case of the new family is the two-sided Topp-Leone Weibull distribution, which uses the two-sided Topp-Leone distribution as a generator for the Weibull distribution. The two-sided Topp-Leone Weibull distribution takes several shapes, such as decreasing, unimodal, and bimodal, which makes it more flexible than the Weibull distribution. Its quantile function is presented, and parameter estimation by maximum likelihood is discussed. The proposed distribution is applied to a strength data set, a data set of remission times of bladder cancer patients, and a time-to-failure data set for turbochargers. We compare the proposed distribution to the Topp-Leone generated Weibull distribution. In conclusion, the two-sided Topp-Leone Weibull distribution performs similarly to the Topp-Leone generated Weibull distribution on the first and second data sets, but provides a better fit than the Topp-Leone generated Weibull distribution on the third.

  5. Distributed Generation of Electricity and its Environmental Impacts

    EPA Pesticide Factsheets

    Distributed generation refers to technologies that generate electricity at or near where it will be used. Learn about how distributed energy generation can support the delivery of clean, reliable power to additional customers.

  6. autokonf - A Configuration Script Generator Implemented in Perl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reus, J F

    This paper discusses configuration scripts in general and the scripting-language issues involved. A brief description of GNU autoconf is provided, along with a contrasting overview of autokonf, a configuration script generator implemented in Perl, whose macros are implemented in Perl and which generates a configuration script in Perl. It is very portable, easily extensible, and readily mastered.

  7. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    PubMed

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

    Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detected against background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators, and their sensitivity to the spatial extent of such generators, when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels, and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. The detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  8. MEG Source Localization of Spatially Extended Generators of Epileptic Activity: Comparing Entropic and Hierarchical Bayesian Approaches

    PubMed Central

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

    Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detected against background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators, and their sensitivity to the spatial extent of such generators, when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels, and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. The detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered. PMID:23418485

  9. High-Efficiency Food Production in a Renewable Energy Based Micro-Grid Power System

    NASA Technical Reports Server (NTRS)

    Bubenheim, David; Meiners, Dennis

    2016-01-01

    Controlled Environment Agriculture (CEA) systems can be used to produce high-quality, desirable food year round, and the fresh produce can positively contribute to the health and well-being of residents in communities with difficult supply logistics. While CEA has many positive outcomes for a remote community, the associated high electric demands have prohibited widespread implementation in what is typically already a fully subscribed power generation and distribution system. Recent advances in CEA technologies as well as renewable power generation, storage, and micro-grid management are increasing system efficiency and expanding the possibilities for enhancing community supporting infrastructure without increasing demands for outside supplied fuels. We will present examples of how new lighting, nutrient delivery, and energy management and control systems can enable significant increases in food production efficiency while maintaining high yields in CEA. Examples from Alaskan communities where initial incorporation of renewable power generation, energy storage and grid management techniques have already reduced diesel fuel consumption for electric generation by more than 40% and expanded grid capacity will be presented. We will discuss how renewable power generation, efficient grid management to extract maximum community service per kW, and novel energy storage approaches can expand the food production, water supply, waste treatment, sanitation and other community support services without traditional increases of consumable fuels supplied from outside the community. These capabilities offer communities a range of choices for enhancement. The examples represent a synergy of technology advancement efforts to develop sustainable community support systems for future space-based human habitats and practical implementation of infrastructure components to increase efficiency and enhance health and well-being in remote communities today and tomorrow.

  10. High-Efficiency Food Production in a Renewable Energy Based Micro-Grid

    NASA Technical Reports Server (NTRS)

    Bubenheim, David L.

    2017-01-01

    Controlled Environment Agriculture (CEA) systems can be used to produce high-quality, desirable food year-round, and the fresh produce can positively contribute to the health and well-being of residents in communities with difficult supply logistics. While CEA has many positive outcomes for a remote community, the associated high electric demands have prohibited widespread implementation in what is typically an already fully subscribed power generation and distribution system. Recent advances in CEA technologies as well as renewable power generation, storage, and micro-grid management are increasing system efficiency and expanding the possibilities for enhancing community-supporting infrastructure without increasing demands for outside-supplied fuels. We will present examples of how new lighting, nutrient delivery, and energy management and control systems can enable significant increases in food production efficiency while maintaining high yields in CEA. Examples from Alaskan communities where initial incorporation of renewable power generation, energy storage, and grid management techniques have already reduced diesel fuel consumption for electric generation by more than 40% and expanded grid capacity will be presented. We will discuss how renewable power generation, efficient grid management to extract maximum community service per kW, and novel energy storage approaches can expand food production, water supply, waste treatment, sanitation, and other community support services without the traditional increases in consumable fuels supplied from outside the community. These capabilities offer communities a range of choices to enhance their infrastructure. The examples represent a synergy of technology advancement efforts to develop sustainable community support systems for future space-based human habitats and practical implementation of infrastructure components to increase efficiency and enhance health and well-being in remote communities today and tomorrow.

  11. PPLN-waveguide-based polarization entangled QKD simulator

    NASA Astrophysics Data System (ADS)

    Gariano, John; Djordjevic, Ivan B.

    2017-08-01

    We have developed a comprehensive simulator to study the polarization entangled quantum key distribution (QKD) system, which takes various imperfections into account. We assume that a type-II SPDC source using a PPLN-based nonlinear optical waveguide is used to generate entangled photon pairs and implements the BB84 protocol, using two mutually unbiased bases with two orthogonal polarizations in each basis. The entangled photon pairs are then simulated to be transmitted to both parties, Alice and Bob, through the optical channel and imperfect optical elements and onto the imperfect detectors. It is assumed that Eve has no control over the detectors and can only gain information from the public channel and the intercept-resend attack. The secure key rate (SKR) is calculated using an upper bound and by using actual code rates of LDPC codes implementable in FPGA hardware. After verifying the simulation results available in the literature for the ideal scenario, such as the pair generation rate and the number of errors due to multiple pairs, we introduce various imperfections. The results are then compared to previously reported experimental results where a BBO nonlinear crystal is used, and the improvements in SKRs are determined for the case where a PPLN waveguide is used instead.
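
    As a rough illustration of the secure-key-rate bookkeeping such a simulator performs, the sketch below computes the asymptotic BB84 secret-key fraction from the quantum bit error rate. The function names and the simple 1 - 2*h2(Q) bound are illustrative assumptions, not the paper's actual SKR model (which uses LDPC code rates and device imperfections):

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bb84_key_fraction(qber):
    """Asymptotic BB84 secret-key fraction, 1 - 2*h2(Q), clamped at 0.
    Vanishes near the well-known ~11% QBER threshold."""
    return max(0.0, 1.0 - 2.0 * h2(qber))
```

With this toy bound, a 5% QBER still yields a positive key fraction, while 12% yields none.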

  12. In situ electrochemical high-energy X-ray diffraction using a capillary working electrode cell geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Matthias J.; Bedford, Nicholas M.; Jiang, Naisheng

    The ability to generate new electrochemically active materials for energy generation and storage with improved properties will likely be derived from an understanding of atomic-scale structure/function relationships during electrochemical events. Here, the design and implementation of a new capillary electrochemical cell designed specifically for in situ high-energy X-ray diffraction measurements is described. By increasing the amount of electrochemically active material in the X-ray path while implementing low-Z cell materials with anisotropic scattering profiles, an order of magnitude enhancement in diffracted X-ray signal over traditional cell geometries for multiple electrochemically active materials is demonstrated. This signal improvement is crucial for high-energy X-ray diffraction measurements and subsequent Fourier transformation into atomic pair distribution functions for atomic-scale structural analysis. As an example, clear structural changes in LiCoO2 under reductive and oxidative conditions using the capillary cell are demonstrated, which agree with prior studies. Accurate modeling of the LiCoO2 diffraction data using reverse Monte Carlo simulations further verifies accurate background subtraction and strong signal from the electrochemically active material, enabled by the capillary working electrode geometry.

  13. Distributed Prognostic Health Management with Gaussian Process Regression

    NASA Technical Reports Server (NTRS)

    Saha, Sankalita; Saha, Bhaskar; Saxena, Abhinav; Goebel, Kai Frank

    2010-01-01

    Distributed prognostics architecture design is an enabling step for efficient implementation of health management systems. A major challenge encountered in such design is the formulation of optimal distributed prognostics algorithms. In this paper, we present a distributed GPR-based prognostics algorithm whose target platform is a wireless sensor network. In addition to the challenges encountered in a distributed implementation, a wireless network poses constraints on communication patterns, thereby making the problem more challenging. The prognostics application used to demonstrate our new algorithms is battery prognostics. In order to present trade-offs between different prognostic approaches, we present a comparison with the distributed implementation of a particle-filter-based prognostics algorithm for the same battery data.
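
    To illustrate the GPR building block that such an algorithm distributes, here is a minimal centralized Gaussian process regression sketch in NumPy. The squared-exponential kernel, hyperparameters, and function names are hypothetical and say nothing about the paper's wireless-network partitioning:

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel matrix between 1D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gpr_predict(x_train, y_train, x_test, noise=1e-2, length=1.0):
    """Posterior mean and variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train, length)
    Kss = rbf_kernel(x_test, x_test, length)
    alpha = np.linalg.solve(K, y_train)       # K^-1 y without explicit inverse
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)
```

Trained on samples of a smooth signal, the posterior mean interpolates between observations and the posterior variance quantifies prediction uncertainty, which is what makes GPR attractive for prognostics.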

  14. Topsoil pollution forecasting using artificial neural networks on the example of the abnormally distributed heavy metal at Russian subarctic

    NASA Astrophysics Data System (ADS)

    Tarasov, D. A.; Buevich, A. G.; Sergeev, A. P.; Shichkin, A. V.; Baglaeva, E. M.

    2017-06-01

    Forecasting soil pollution is a considerable field of study in the light of the general concern for environmental protection issues. Due to the variation of content and spatial heterogeneity of pollutant distributions in urban areas, the conventional spatial interpolation models implemented in many GIS packages mostly cannot provide adequate interpolation accuracy. Moreover, predicting the distribution of an element whose concentration varies strongly across the study site is particularly difficult. This work presents two neural network models forecasting the spatial content of an abnormally distributed soil pollutant (Cr) at a particular location in subarctic Novy Urengoy, Russia. A generalized regression neural network (GRNN) was compared to a common multilayer perceptron (MLP) model. The proposed techniques were built, implemented, and tested using ArcGIS and MATLAB. To verify the models' performance, 150 scattered input data points (pollutant concentrations) were selected from an 8.5 km2 area and then split into an independent training data set (105 points) and a validation data set (45 points). The training data set was used for interpolation by ordinary kriging, while the validation data set was used to test the accuracies. The network structures were chosen during a computer simulation based on minimization of the RMSE. The predictive accuracy of both models was confirmed to be significantly higher than that achieved by the geostatistical approach (kriging). It is shown that MLP can achieve better accuracy than both kriging and even GRNN for interpolating surfaces.
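
    A GRNN of the kind compared here is essentially a Nadaraya-Watson kernel smoother: each prediction is a Gaussian-weighted average of the training targets. The following minimal sketch (hypothetical names and smoothing parameter, not the paper's MATLAB setup) shows the core prediction step:

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    """Generalized regression neural network prediction: for each query
    point, return the Gaussian-weighted average of training targets,
    with kernel width sigma (the single GRNN hyperparameter)."""
    # squared Euclidean distances, shape (n_query, n_train)
    d2 = np.sum((x_query[:, None, :] - x_train[None, :, :]) ** 2, axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / np.sum(w, axis=1)
```

Because the prediction is a convex combination of observed values, a GRNN never extrapolates beyond the training range, which suits spatial interpolation of concentrations.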

  15. Experimental on-demand recovery of entanglement by local operations within non-Markovian dynamics

    PubMed Central

    Orieux, Adeline; D'Arrigo, Antonio; Ferranti, Giacomo; Franco, Rosario Lo; Benenti, Giuliano; Paladino, Elisabetta; Falci, Giuseppe; Sciarrino, Fabio; Mataloni, Paolo

    2015-01-01

    In many applications entanglement must be distributed through noisy communication channels that unavoidably degrade it. Entanglement cannot be generated by local operations and classical communication (LOCC), implying that once it has been distributed it is not possible to recreate it by LOCC. Recovery of entanglement by purely local control is, however, not forbidden in the presence of non-Markovian dynamics, and here we demonstrate in two all-optical experiments that such entanglement restoration can even be achieved on-demand. First, we implement an open-loop control scheme based on a purely local operation, without acquiring any information on the environment; then, we use a closed-loop scheme in which the environment is measured, the outcome controlling the local operations on the system. The restored entanglement is a manifestation of “hidden” quantum correlations recovered by the local control. Relying on local control, both schemes improve the efficiency of entanglement sharing in distributed quantum networks. PMID:25712406

  16. Computational strategies for three-dimensional flow simulations on distributed computer systems. Ph.D. Thesis Semiannual Status Report, 15 Aug. 1993 - 15 Feb. 1994

    NASA Technical Reports Server (NTRS)

    Weed, Richard Allen; Sankar, L. N.

    1994-01-01

    An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has motivated research into procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.

  17. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
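
    As a toy stand-in for the kind of localized indicator such a pipeline generates (the framework described above uses XArray and Dask over netCDF in the cloud), the sketch below counts threshold-exceedance days per year from a plain NumPy array. The function name and the 35-degree threshold are assumptions for illustration only:

```python
import numpy as np

def hot_days_per_year(daily_tmax, years, threshold=35.0):
    """Count days per calendar year with daily maximum temperature above
    a threshold -- a simple site-level climate-risk indicator."""
    out = {}
    for yr in np.unique(years):
        sel = years == yr
        out[int(yr)] = int(np.sum(daily_tmax[sel] > threshold))
    return out
```

In a distributed setting, the same per-year reduction would be mapped over chunked arrays so each worker handles one slice of the time axis.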

  18. HIGHLIGHTING DIFFERENCES BETWEEN CONDITIONAL AND UNCONDITIONAL QUANTILE REGRESSION APPROACHES THROUGH AN APPLICATION TO ASSESS MEDICATION ADHERENCE

    PubMed Central

    BORAH, BIJAN J.; BASU, ANIRBAN

    2014-01-01

    The quantile regression (QR) framework provides a pragmatic approach in understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method. It is used to assess the impact of a covariate on a quantile of the outcome conditional on specific values of other covariates. In most cases, conditional quantile regression may generate results that are often not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results as it marginalizes the effect over the distributions of other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer’s disease. PMID:23616446
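
    The conditional-QR branch of this comparison reduces to minimizing the pinball (quantile) loss. The sketch below fits a linear conditional quantile by subgradient descent; it is an illustrative toy with hypothetical names, not the estimators used in the paper:

```python
import numpy as np

def pinball_loss(y, yhat, tau):
    """Quantile (pinball) loss averaged over observations: underestimates
    are penalized with weight tau, overestimates with weight 1 - tau."""
    e = y - yhat
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def fit_linear_quantile(X, y, tau, lr=0.05, steps=2000):
    """Fit [intercept, slopes] for the tau-th conditional quantile by
    subgradient descent on the pinball loss."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(steps):
        e = y - Xb @ beta
        # subgradient of the pinball loss w.r.t. beta
        grad = -Xb.T @ (tau * (e > 0) - (1 - tau) * (e <= 0)) / len(y)
        beta -= lr * grad
    return beta
```

With tau = 0.5 and no covariates this recovers the median, which (unlike the mean) ignores the outlier in the example below.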

  19. Pacific Northwest GridWise™ Testbed Demonstration Projects; Part I. Olympic Peninsula Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammerstrom, Donald J.; Ambrosio, Ron; Carlon, Teresa A.

    2008-01-09

    This report describes the implementation and results of a field demonstration wherein residential electric water heaters and thermostats, commercial building space conditioning, municipal water pump loads, and several distributed generators were coordinated to manage constrained feeder electrical distribution through the two-way communication of load status and electric price signals. The field demonstration took place in Washington and Oregon and was paid for by the U.S. Department of Energy and several northwest utilities. Price is found to be an effective control signal for managing transmission or distribution congestion. Real-time signals at 5-minute intervals are shown to shift controlled load in time. The behaviors of customers and their responses under fixed, time-of-use, and real-time price contracts are compared. Peak loads are effectively reduced on the experimental feeder. A novel application of portfolio theory is applied to the selection of an optimal mix of customer contract types.

  20. Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Hughes, Richard

    2004-05-01

    Quantum key distribution (QKD) uses single-photon communications to generate the shared, secret random number sequences that are used to encrypt and decrypt secret communications. The unconditional security of QKD is based on the interplay between fundamental principles of quantum physics and information theory. An adversary can neither successfully tap the transmissions, nor evade detection (eavesdropping raises the key error rate above a threshold value). QKD could be particularly attractive for free-space optical communications, both ground-based and for satellites. I will describe a QKD experiment performed over multi-kilometer line-of-sight paths, which serves as a model for a satellite-to-ground key distribution system. The system uses single-photon polarization states, without active polarization switching, and for the first time implements the complete BB84 QKD protocol including reconciliation, privacy amplification, and the all-important authentication stage. It is capable of continuous operation throughout the day and night, achieving the self-sustaining production of error-free, shared, secret bits. I will also report on the results of satellite-to-ground QKD modeling.
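
    The sifting stage of the BB84 protocol described above can be sketched in a few lines. This toy simulation is idealized (lossless channel, no eavesdropper, hypothetical function name); it only illustrates why roughly half of the raw bits survive basis reconciliation:

```python
import random

def bb84_sift(n, seed=7):
    """Simulate ideal BB84 sifting: Alice sends random bits in random
    bases, Bob measures in random bases, and both keep only the rounds
    where their basis choices match (about half, on average)."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases = [rng.randint(0, 1) for _ in range(n)]
    # in a noiseless channel, matching-basis rounds give identical bits
    return [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sift(1000)
```

The real protocol then estimates the error rate on a sacrificed subset, reconciles, and applies privacy amplification before the key is used.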

  1. Unsteady Probabilistic Analysis of a Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.
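
    The probabilistic-design idea of combining statistical distributions of design variables can be illustrated with a minimal Monte Carlo margin analysis. The load and strength distributions, parameter values, and function name below are hypothetical stand-ins, not the engine model in the paper:

```python
import random
import statistics

def monte_carlo_margin(n=20000, seed=1):
    """Propagate statistical scatter in two design variables through a
    simple margin = strength - load model; return the estimated failure
    probability and the mean margin."""
    rng = random.Random(seed)
    failures = 0
    margins = []
    for _ in range(n):
        load = rng.gauss(100.0, 10.0)      # hypothetical load distribution
        strength = rng.gauss(150.0, 12.0)  # hypothetical strength distribution
        m = strength - load
        margins.append(m)
        failures += m < 0
    return failures / n, statistics.mean(margins)

pfail, mean_margin = monte_carlo_margin()
```

Sweeping the input distributions in such a model is how a designer trades off weight, life, and performance against a quantified failure risk.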

  2. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.

  3. Method and apparatus for anti-islanding protection of distributed generations

    DOEpatents

    Ye, Zhihong; John, Vinod; Wang, Changyong; Garces, Luis Jose; Zhou, Rui; Li, Lei; Walling, Reigh Allen; Premerlani, William James; Sanza, Peter Claudius; Liu, Yan; Dame, Mark Edward

    2006-03-21

    An apparatus for anti-islanding protection of a distributed generation with respect to a feeder connected to an electrical grid is disclosed. The apparatus includes a sensor adapted to generate a voltage signal representative of an output voltage and/or a current signal representative of an output current at the distributed generation, and a controller responsive to the signals from the sensor. The controller is productive of a control signal directed to the distributed generation to drive an operating characteristic of the distributed generation out of a nominal range in response to the electrical grid being disconnected from the feeder.

  4. SMUD Community Renewable Energy Deployment Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sison-Lebrilla, Elaine; Tiangco, Valentino; Lemes, Marco

    2015-06-08

    This report summarizes the completion of four renewable energy installations supported by California Energy Commission (CEC) grant PIR-11-005, US Department of Energy (DOE) Assistance Agreement DE-EE0003070, and the Sacramento Municipal Utility District (SMUD) Community Renewable Energy Deployment (CRED) program. The funding from the DOE, combined with funding from the CEC, supported the construction of a solar power system, biogas generation from waste systems, and anaerobic digestion systems at dairy facilities, all for electricity generation and delivery to SMUD’s distribution system. The deployment of CRED projects shows that solar projects and anaerobic digesters can be successfully implemented under favorable economic conditions and business models and through collaborative partnerships, in addition to reducing GHG emissions. This work helps other communities learn how to assess, overcome barriers to, utilize, and benefit from renewable resources for electricity generation in their region.

  5. Ultrashort vortex from a Gaussian pulse - An achromatic-interferometric approach.

    PubMed

    Naik, Dinesh N; Saad, Nabil A; Rao, D Narayana; Viswanathan, Nirmal K

    2017-05-24

    The more than a century old Sagnac interferometer is put to a first-of-its-kind use to generate an achromatic single-charge vortex equivalent to a Laguerre-Gaussian beam possessing orbital angular momentum (OAM). The interference of counter-propagating polychromatic Gaussian beams of beam waist ω_λ with correlated linear phase (ϕ_0 ≥ 0.025 λ) and lateral shear (y_0 ≥ 0.05 ω_λ) in orthogonal directions is shown to create a vortex phase distribution around the null interference. Using a wavelength-tunable continuous-wave laser, the entire range of visible wavelengths is shown to satisfy the condition for vortex generation, achieving a highly stable white-light vortex with excellent propagation integrity. The application capability of the proposed scheme is demonstrated by generating ultrashort optical vortex pulses, converting their frequency nonlinearly, and transforming them into vector pulses. We believe that our scheme for generating robust achromatic vortex pulses (implemented with only mirrors and a beam-splitter) in the femtosecond regime, with no conceivable spectral-temporal range or peak-power limitations, can have significant advantages for a variety of applications.

  6. Reflexive reasoning for distributed real-time systems

    NASA Technical Reports Server (NTRS)

    Goldstein, David

    1994-01-01

    This paper discusses the implementation and use of reflexive reasoning in real-time, distributed knowledge-based applications. Recently there has been a great deal of interest in agent-oriented systems. Implementing such systems implies a mechanism for sharing knowledge, goals and other state information among the agents. Our techniques facilitate an agent examining both state information about other agents and the parameters of the knowledge-based system shell implementing its reasoning algorithms. The shell implementing the reasoning is the Distributed Artificial Intelligence Toolkit, which is a derivative of CLIPS.

  7. A Review of Microgrid Architectures and Control Strategy

    NASA Astrophysics Data System (ADS)

    Jadav, Krishnarajsinh A.; Karkar, Hitesh M.; Trivedi, I. N.

    2017-12-01

    In this paper, microgrid architectures and various converter control strategies are reviewed. A microgrid is defined as an interconnected network of distributed energy resources, loads, and energy storage systems. This emerging concept realizes the potential of distributed generators. An AC microgrid interconnects AC distributed generators such as wind turbines directly, and DC distributed generators such as PV and fuel cells through inverters. In a DC microgrid, the output of an AC distributed generator must be converted to DC using rectifiers, while DC distributed generators can be connected directly. A hybrid microgrid avoids the multiple reverse conversions (AC-DC-AC and DC-AC-DC) that occur in the individual AC and DC microgrids: all AC distributed generators are connected to the AC microgrid and all DC distributed generators to the DC microgrid. An interlinking converter is used for power balance between the two microgrids, transferring power from one microgrid to the other when either is overloaded. At the end, a review of interlinking converter control strategies is presented.
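
    The interlinking-converter power-balance idea can be sketched as a normalized droop comparison between the AC-side frequency and the DC-side bus voltage. All setpoints, deviation bands, and the averaging rule below are illustrative assumptions, not a specific published control law:

```python
def interlink_power(f_ac, v_dc, f_nom=50.0, f_dev=0.5,
                    v_nom=400.0, v_dev=20.0, p_rated=10.0):
    """Normalized droop sketch for a hybrid-microgrid interlinking
    converter: compare per-unit loading signals of the AC side (frequency
    sag) and DC side (bus-voltage sag) and command a power transfer
    toward the more heavily loaded side, clamped to converter rating.
    Positive return value = AC-to-DC transfer (in kW, hypothetical)."""
    ac_load = (f_nom - f_ac) / f_dev   # > 0 when AC side is overloaded
    dc_load = (v_nom - v_dc) / v_dev   # > 0 when DC side is overloaded
    cmd = (dc_load - ac_load) / 2.0
    return p_rated * max(-1.0, min(1.0, cmd))
```

When both per-unit signals are equal, the command is zero and the two sub-grids carry their own loads.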

  8. Implementation of Dryden Continuous Turbulence Model into Simulink for LSA-02 Flight Test Simulation

    NASA Astrophysics Data System (ADS)

    Ichwanul Hakim, Teuku Mohd; Arifianto, Ony

    2018-04-01

    Turbulence is a small-scale movement of air in the atmosphere caused by instabilities in the pressure and temperature distribution. A turbulence model is integrated into a flight mechanical model as an atmospheric disturbance. Common turbulence models used in flight mechanical models are the Dryden and Von Karman models. In this minor research, only the Dryden continuous turbulence model was implemented, following the military specification MIL-HDBK-1797. The model was implemented in Matlab Simulink and will be integrated with a flight mechanical model to observe the response of the aircraft as it flies through a turbulence field. The turbulence model is characterized by multiplying filters generated from the power spectral density with a band-limited Gaussian white noise input. To ensure that the model provides good results, model verification was done by comparing the implemented model with the similar model provided in the Aerospace Blockset. The results show some differences for the two linear velocities (vg and wg) and the three angular rates (pg, qg and rg). The difference is caused by a different determination of the turbulence scale length used in the Aerospace Blockset. With an adjustment of the turbulence scale length in the implemented model, both models produce similar outputs.
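
    For the longitudinal gust component, the Dryden model reduces to a first-order shaping filter driven by Gaussian white noise (MIL-HDBK-1797 gives the exact transfer functions). The sketch below uses a simple Euler discretization; the parameter values and function name are assumptions, and the scaling is chosen so the output variance approaches sigma squared:

```python
import random
import math

def dryden_u(sigma=1.5, L=200.0, V=50.0, dt=0.01, n=200000, seed=3):
    """First-order discrete Dryden filter for the longitudinal gust
    velocity: shape unit Gaussian white noise so that the stationary
    output variance is approximately sigma**2 (Euler discretization).
    sigma: turbulence intensity, L: scale length, V: airspeed."""
    rng = random.Random(seed)
    a = 1.0 - V * dt / L                     # pole of the discrete filter
    b = sigma * math.sqrt(2.0 * V * dt / L)  # noise gain for unit variance match
    u, out = 0.0, []
    for _ in range(n):
        u = a * u + b * rng.gauss(0.0, 1.0)
        out.append(u)
    return out

gust = dryden_u()
```

The lateral and vertical components use second-order filters of the same flavor; verifying the output variance against the prescribed intensity is a standard sanity check, as done in the test here.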

  9. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  10. Quantum key distribution with prepare-and-measure Bell test

    PubMed Central

    Tan, Yong-gang

    2016-01-01

    The prepare-and-measure quantum key distribution (QKD) has the merits of fast speed, high key generation rate, and easy implementation. However, detector side channel attacks greatly undermine the security of the key bits. The eavesdropper, Eve, exploits the flaws of the detectors to obtain illegal information without violating quantum principles. This means that she can intervene in the communication without being detected. A prepare-and-measure Bell test protocol is proposed. By randomly carrying out the Bell test on the side of the information receiver, Bob, Eve’s illegal information gain within the detector side channel attack can be well bounded. This protocol does not require any improvement of the detectors used in available prepare-and-measure QKD. Though we only illustrate its application in the BB84 protocol, it is applicable to any prepare-and-measure QKD. PMID:27733771
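
    Although the protocol details differ, the Bell-test ingredient can be illustrated by evaluating the CHSH statistic at the standard measurement angles. The correlation function E(a, b) = -cos 2(a - b) assumes an ideal singlet-like polarization state, so this is a textbook sketch rather than the paper's protocol:

```python
import math

def chsh_value(theta_a=(0.0, math.pi / 4),
               theta_b=(math.pi / 8, -math.pi / 8)):
    """CHSH statistic S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1)
    for an ideal singlet-like state, where E(a,b) = -cos 2(a-b).
    |S| > 2 violates the local-realist bound and certifies entanglement."""
    def E(a, b):
        return -math.cos(2.0 * (a - b))
    a0, a1 = theta_a
    b0, b1 = theta_b
    return E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
```

At the canonical angles used above, |S| reaches the quantum maximum of 2*sqrt(2), the Tsirelson bound.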

  11. Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?

    PubMed

    Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend

    2011-10-11

    In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well posed for implementation in parallel on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture) using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of the GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.
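
    The waste-recycling idea, letting rejected trial states contribute to averages weighted by their acceptance probability, can be shown on a toy one-dimensional problem. The harmonic potential and serial CPU implementation below are purely illustrative, not the GPU zeolite code:

```python
import random
import math

def wrmc_x2(n=200000, step=1.0, seed=11):
    """Waste-recycling Metropolis sampling of exp(-x^2/2): every trial
    state contributes to the <x^2> estimate, weighted by its Metropolis
    acceptance probability, instead of only the accepted states.
    The exact answer for this distribution is <x^2> = 1."""
    rng = random.Random(seed)
    x, acc_sum = 0.0, 0.0
    for _ in range(n):
        y = x + rng.uniform(-step, step)
        p = min(1.0, math.exp(0.5 * (x * x - y * y)))
        acc_sum += p * y * y + (1.0 - p) * x * x  # recycle the rejected trial
        if rng.random() < p:
            x = y
    return acc_sum / n

est = wrmc_x2()
```

Because both outcomes of every move enter the estimator with their correct probabilities, the average converges to the same limit as plain Metropolis but with reduced variance.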

  12. Analysis and Prediction of Weather Impacted Ground Stop Operations

    NASA Technical Reports Server (NTRS)

    Wang, Yao Xun

    2014-01-01

    When the air traffic demand is expected to exceed an airport's available capacity for a short period of time, Ground Stop (GS) operations are implemented by Federal Aviation Administration (FAA) Traffic Flow Management (TFM). The GS requires departing aircraft meeting specific criteria to remain on the ground to achieve reduced demand at the constrained destination airport until the end of the GS. This paper provides a high-level overview of the statistical distributions as well as causal factors for GSs at the major airports in the United States. The characteristics of GSs, the weather impact on GSs, GS variations with delays, and the interaction between GSs and Ground Delay Programs (GDPs) at Newark Liberty International Airport (EWR) are investigated. Machine learning methods are used to generate classification models that map the historical airport weather forecast, scheduled traffic, and other airport conditions to implemented GS/GDP operations, and the models are evaluated using cross-validation. This modeling approach produced promising results, as it yielded an 85% overall classification accuracy in distinguishing the implemented GS days from normal days without GS and GDP operations and a 71% accuracy in differentiating the GS and GDP implemented days from the GDP-only days.

  13. HELAC-Onia 2.0: An upgraded matrix-element and event generator for heavy quarkonium physics

    NASA Astrophysics Data System (ADS)

    Shao, Hua-Sheng

    2016-01-01

    We present an upgraded version (denoted as version 2.0) of the program HELAC-ONIA for the automated computation of heavy-quarkonium helicity amplitudes within the non-relativistic QCD framework. The new code has been designed to include many new and useful features for practical phenomenological simulations. It supports job submission in cluster environments for parallel computations via PYTHON scripts. We have interfaced HELAC-ONIA to the parton shower Monte Carlo programs PYTHIA 8 and QEDPS to take into account parton-shower effects. Moreover, the decay module guarantees that the program can perform the spin-entangled (cascade-)decay of heavy quarkonium after its generation. We have also implemented a reweighting method to automatically estimate the uncertainties from renormalization and/or factorization scales as well as parton-distribution functions for weighted or unweighted events. A further update is the possibility to generate one-dimensional or two-dimensional plots encoded in the analysis files on the fly. Some dedicated examples are given at the end of the write-up.

  14. Fast and accurate mock catalogue generation for low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Koda, Jun; Blake, Chris; Beutler, Florian; Kazin, Eyal; Marin, Felipe

    2016-06-01

    We present an accurate and fast framework for generating mock catalogues including low-mass haloes, based on an implementation of the COmoving Lagrangian Acceleration (COLA) technique. Multiple realisations of mock catalogues are crucial for analyses of large-scale structure, but conventional N-body simulations are too computationally expensive for the production of thousands of realisations. We show that COLA simulations can produce accurate mock catalogues with moderate computational resources for low- to intermediate-mass galaxies in 10^12 M⊙ haloes, both in real and redshift space. COLA simulations have accurate peculiar velocities, without systematic errors in the velocity power spectra for k ≤ 0.15 h Mpc^-1, and with only 3 per cent error for k ≤ 0.2 h Mpc^-1. We use COLA with 10 time steps and a Halo Occupation Distribution to produce 600 mock galaxy catalogues of the WiggleZ Dark Energy Survey. Our parallelized code for efficient generation of accurate halo catalogues is publicly available at github.com/junkoda/cola_halo.
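The Halo Occupation Distribution step can be illustrated with a standard parametrization, a smoothed step in log halo mass for central galaxies and a Poisson-distributed power law for satellites. The parameter values below are illustrative, not those calibrated for WiggleZ.

```python
import math
import random

random.seed(0)

LOG_MMIN, SIGMA = 12.0, 0.3          # central occupation parameters (illustrative)
M0, M1, ALPHA = 1e12, 2e13, 1.0      # satellite occupation parameters (illustrative)

def mean_centrals(m):
    """Mean central occupation: smoothed step in log halo mass."""
    return 0.5 * (1.0 + math.erf((math.log10(m) - LOG_MMIN) / SIGMA))

def mean_satellites(m):
    """Mean satellite occupation: power law above a cutoff mass."""
    return ((m - M0) / M1) ** ALPHA if m > M0 else 0.0

def poisson(lam):
    """Knuth's algorithm; adequate for the small means used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

halo_masses = [10 ** (11.5 + 0.1 * i) for i in range(20)]  # Msun/h
ngal = []
for m in halo_masses:
    ncen = 1 if random.random() < mean_centrals(m) else 0
    nsat = poisson(mean_satellites(m)) if ncen else 0  # satellites require a central
    ngal.append(ncen + nsat)
print(ngal)
```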

  15. A method to generate small-scale, high-resolution sedimentary bedform architecture models representing realistic geologic facies

    DOE PAGES

    Meckel, T. A.; Trevisan, L.; Krishnamurthy, P. G.

    2017-08-23

    Small-scale (mm to m) sedimentary structures (e.g. ripple lamination, cross-bedding) have received a great deal of attention in sedimentary geology. The influence of depositional heterogeneity on subsurface fluid flow is now widely recognized, but incorporating these features in physically-rational bedform models at various scales remains problematic. The current investigation expands the capability of an existing set of open-source codes, allowing generation of high-resolution 3D bedform architecture models. The implemented modifications enable the generation of 3D digital models consisting of laminae and matrix (binary field) with characteristic depositional architecture. The binary model is then populated with petrophysical properties using a textural approach for additional analysis such as statistical characterization, property upscaling, and single and multiphase fluid flow simulation. One example binary model with corresponding threshold capillary pressure field and the scripts used to generate them are provided, but the approach can be used to generate dozens of previously documented common facies models and a variety of property assignments. An application using the example model is presented simulating buoyant fluid (CO2) migration and resulting saturation distribution.

  16. Quantity, Quality, and Availability of Waste Heat from United States Thermal Power Generation.

    PubMed

    Gingerich, Daniel B; Mauter, Meagan S

    2015-07-21

    Secondary application of unconverted heat produced during electric power generation has the potential to improve the life-cycle fuel efficiency of the electric power industry and the sectors it serves. This work quantifies the residual heat (also known as waste heat) generated by U.S. thermal power plants and assesses the intermittency and transport issues that must be considered when planning to utilize this heat. Combining Energy Information Administration plant-level data with literature-reported process efficiency data, we develop estimates of the unconverted heat flux from individual U.S. thermal power plants in 2012. Together these power plants discharged an estimated 18.9 billion GJ(th) of residual heat in 2012, 4% of which was discharged at temperatures greater than 90 °C. We also characterize the temperature, spatial distribution, and temporal availability of this residual heat at the plant level and model the implications for the technical and economic feasibility of its end use. Increased implementation of flue gas desulfurization technologies at coal-fired facilities and the higher-quality heat generated in the exhaust of natural gas fuel cycles are expected to increase the residual heat generated by 10.6% by 2040.
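The plant-level bookkeeping behind such estimates reduces to a simple energy balance: residual heat is fuel input minus electricity generated. A minimal sketch, with a hypothetical plant and an illustrative thermal efficiency (not values from the paper):

```python
def residual_heat_gj(net_generation_mwh: float, efficiency: float) -> float:
    """Residual heat in GJ(th) implied by net generation and thermal efficiency:
    residual = fuel input - electric output = generation * (1/eta - 1)."""
    MWH_TO_GJ = 3.6  # 1 MWh = 3.6 GJ
    fuel_input_gj = net_generation_mwh * MWH_TO_GJ / efficiency
    return fuel_input_gj - net_generation_mwh * MWH_TO_GJ

# Hypothetical example: a 500 MW plant at 33% efficiency running 8000 hours.
heat = residual_heat_gj(500 * 8000, 0.33)
print(f"{heat / 1e6:.1f} million GJ(th) of residual heat")
```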

  17. Small gas-turbine units for the power industry: Ways for improving the efficiency and the scale of implementation

    NASA Astrophysics Data System (ADS)

    Kosoi, A. S.; Popel', O. S.; Beschastnykh, V. N.; Zeigarnik, Yu. A.; Sinkevich, M. V.

    2017-10-01

    Small power units (<1 MW) see increasing application due to the rapid growth of distributed power generation and smart power supply systems. They are usually used for feeding facilities whose connection to centralized networks involves certain problems of an engineering or economic nature. Small power generation is based on a wide range of processes and primary sources, including renewable and local ones, such as nonconventional hydrocarbon fuels comprising associated gas, biogas, coal-mine methane, etc. Characteristics of the small gas-turbine units (GTUs) most widely available on the world market are reviewed. The most promising lines of development for the new generation of small GTUs are examined. Special emphasis is placed on three lines selected for improving the efficiency of small GTUs: increasing fuel efficiency, cutting maintenance costs, and integration with local or renewable power sources. It is demonstrated that, in terms of specific fuel consumption, small GTUs of the new generation can have an efficiency 20-25% higher than those of the previous generation, require no maintenance between overhauls, and are capable of efficient integration into intelligent electrical networks with power facilities operating on renewable or local power sources.

  18. Radiometric Block Adjustment and Digital Radiometric Model Generation

    NASA Astrophysics Data System (ADS)

    Pros, A.; Colomina, I.; Navarro, J. A.; Antequera, R.; Andrinal, P.

    2013-05-01

    In this paper we present a radiometric block adjustment method that is related to geometric block adjustment and to the concept of a terrain Digital Radiometric Model (DRM) as a complement to the terrain digital elevation and surface models. A DRM, in our concept, is a function that for each ground point returns a reflectance value and a Bidirectional Reflectance Distribution Function (BRDF). In a similar way to the terrain geometric reconstruction procedure, given an image block of some terrain area, we split DRM generation into two phases: radiometric block adjustment and DRM generation. In the paper we concentrate on the radiometric block adjustment step, but we also describe a preliminary DRM generator. In the block adjustment step, after a radiometric pre-calibration step, local atmosphere radiative transfer parameters and ground reflectances and BRDFs at the radiometric tie points are estimated. This radiometric block adjustment is based on atmospheric radiative transfer (ART) models, pre-selected BRDF models, and radiometric ground control points. The proposed concept is implemented and applied in an experimental campaign, and the obtained results are presented. The DRM and orthophoto mosaics are generated showing no radiometric differences at the seam lines.

  19. A method to generate small-scale, high-resolution sedimentary bedform architecture models representing realistic geologic facies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meckel, T. A.; Trevisan, L.; Krishnamurthy, P. G.

    Small-scale (mm to m) sedimentary structures (e.g. ripple lamination, cross-bedding) have received a great deal of attention in sedimentary geology. The influence of depositional heterogeneity on subsurface fluid flow is now widely recognized, but incorporating these features in physically-rational bedform models at various scales remains problematic. The current investigation expands the capability of an existing set of open-source codes, allowing generation of high-resolution 3D bedform architecture models. The implemented modifications enable the generation of 3D digital models consisting of laminae and matrix (binary field) with characteristic depositional architecture. The binary model is then populated with petrophysical properties using a textural approach for additional analysis such as statistical characterization, property upscaling, and single and multiphase fluid flow simulation. One example binary model with corresponding threshold capillary pressure field and the scripts used to generate them are provided, but the approach can be used to generate dozens of previously documented common facies models and a variety of property assignments. An application using the example model is presented simulating buoyant fluid (CO2) migration and resulting saturation distribution.

  20. World Ocean Circulation Experiment

    NASA Technical Reports Server (NTRS)

    Clarke, R. Allyn

    1992-01-01

    The oceans are an equal partner with the atmosphere in the global climate system. The World Ocean Circulation Experiment is presently being implemented to improve ocean models that are useful for climate prediction both by encouraging more model development but more importantly by providing quality data sets that can be used to force or to validate such models. WOCE is the first oceanographic experiment that plans to generate and to use multiparameter global ocean data sets. In order for WOCE to succeed, oceanographers must establish and learn to use more effective methods of assembling, quality controlling, manipulating and distributing oceanographic data.

  1. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan

    Many challenges exist with regard to understanding and representing complex physical processes involved with ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.

  3. An Examination of Hypercube Implementations of Genetic Algorithms

    DTIC Science & Technology

    1992-03-01

    Approved for public release; distribution unlimited. If the string is of length n and the building-block size is r, all combinations of the n loci taken r at a time must be generated.

  4. Distributed control topologies for deep space formation flying spacecraft

    NASA Technical Reports Server (NTRS)

    Hadaegh, F. Y.; Smith, R. S.

    2002-01-01

    A formation of satellites flying in deep space can be specified in terms of the relative satellite positions and absolute satellite orientations. The redundancy in the relative position specification generates a family of control topologies with equivalent stability and reference tracking performance, one of which can be implemented without requiring communication between the spacecraft. A relative position design formulation is inherently unobservable, and a methodology for circumventing this problem is presented. Additional redundancy in the control actuation space can be exploited for feed-forward control of the formation centroid's location in space, or for minimization of total fuel consumption.

  5. Noise properties in the ideal Kirchhoff-Law-Johnson-Noise secure communication system.

    PubMed

    Gingl, Zoltan; Mingesz, Robert

    2014-01-01

    In this paper we determine the noise properties needed for unconditional security in the ideal Kirchhoff-Law-Johnson-Noise (KLJN) secure key distribution system using simple statistical analysis. It has already been shown from physical laws that resistors and Johnson-like noise sources provide unconditional security. However, real implementations use artificial noise generators, raising the question of whether other kinds of noise sources and resistor values could be used as well. We answer this question and at the same time provide a theoretical basis for analyzing real systems as well.
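The Johnson-noise scaling at the heart of the scheme, mean-square voltage proportional to 4kTRB, is easy to illustrate numerically. The sketch below is a simplified statistical illustration rather than a circuit model of the KLJN loop; the resistor and bandwidth values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each party's (possibly artificial) noise source must mimic Johnson noise,
# with variance <U^2> = 4 k T R B for its chosen resistor.
k_B, T, B = 1.380649e-23, 300.0, 1.0e3   # J/K, K, Hz (illustrative T and B)
R_LOW, R_HIGH = 1.0e3, 1.0e5             # ohms (illustrative resistor pair)

def johnson_noise(R, n):
    """Gaussian voltage samples with variance 4 k T R B."""
    return rng.normal(scale=np.sqrt(4 * k_B * T * R * B), size=n)

n = 200_000
var_low = johnson_noise(R_LOW, n).var()
var_high = johnson_noise(R_HIGH, n).var()

# The measurable noise level scales with R: the LL, HH, and mixed (secure)
# resistor states are distinguishable as levels, while the LH and HL mixed
# states remain statistically identical to an eavesdropper.
print(var_high / var_low)
```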

  6. Strategy for an Extensible Microcomputer-Based Mumps System for Private Practice

    PubMed Central

    Walters, Richard F.; Johnson, Stephen L.

    1979-01-01

    A macro expander technique has been adopted to generate a machine independent single user version of ANSI Standard MUMPS running on an 8080 Microcomputer. This approach makes it possible to have the medically oriented MUMPS language available on inexpensive systems suitable for small group practice settings. Substitution of another macro expansion set allows the same interpreter to be implemented on another computer, thereby providing compatibility with comparable or larger scale systems. Furthermore, since the global file handler can be separated from the interpreter, this approach permits development of a distributed MUMPS system with no change in applications software.

  7. Design of a Mission Data Storage and Retrieval System for NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Lux, Jessica; Downing, Bob; Sheldon, Jack

    2007-01-01

    The Western Aeronautical Test Range (WATR) at the NASA Dryden Flight Research Center (DFRC) employs the WATR Integrated Next Generation System (WINGS) for the processing and display of aeronautical flight data. This report discusses the post-mission segment of the WINGS architecture. A team designed and implemented a system for the near- and long-term storage and distribution of mission data for flight projects at DFRC, providing the user with intelligent access to data. Discussed are the legacy system, an industry survey, system operational concept, high-level system features, and initial design efforts.

  8. Efficient implementation of multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2012-01-10

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
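The transform sequence claimed above (local 1-D FFTs, an all-to-all redistribution, then local 1-D FFTs in the second dimension) can be sketched on a single machine, where the all-to-all collapses to a transpose; the random-order aspect of the patented redistribution, which matters only for network utilization, is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))

# Step 1: each "node" holds a block of rows and FFTs its local rows.
step1 = np.fft.fft(a, axis=1)

# Step 2: all-to-all redistribution -- globally a transpose, after which the
# second dimension is local to each node.
step2 = step1.T

# Step 3: FFT along the now-local dimension.
step3 = np.fft.fft(step2, axis=1)

# Transposing back yields the full 2-D FFT.
result = step3.T
assert np.allclose(result, np.fft.fft2(a))
```

The same transpose-based pattern extends dimension by dimension to higher-rank arrays, which is why the all-to-all step dominates communication cost on a real distributed-memory machine.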

  9. Efficient implementation of a multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2008-01-01

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.

  10. Distributed Generation to Support Development-Focused Climate Action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie; Gagnon, Pieter; Stout, Sherry

    2016-09-01

    This paper explores the role of distributed generation, with a high renewable energy contribution, in supporting low-emission climate-resilient development. The paper presents potential impacts on development (via energy access), greenhouse gas emission mitigation, and climate resilience directly associated with distributed generation, as well as specific actions that may enhance or increase the likelihood of climate and development benefits. This paper also seeks to provide practical and timely insights to support distributed generation policymaking and planning within the context of common climate and development goals as the distributed generation landscape rapidly evolves globally. Country-specific distributed generation policy and program examples, as well as analytical tools that can inform efforts internationally, are also highlighted throughout the paper.

  11. Hydrogen Fuel Cell Analysis: Lessons Learned from Stationary Power Generation Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott E. Grasman; John W. Sheffield; Fatih Dogan

    2010-04-30

    This study considered opportunities for hydrogen in stationary applications in order to make recommendations related to RD&D strategies that incorporate lessons learned and best practices from relevant national and international stationary power efforts, as well as cost and environmental modeling of pathways. The study analyzed the different strategies utilized in power generation systems and identified the different challenges and opportunities for producing and using hydrogen as an energy carrier. Specific objectives included both a synopsis/critical analysis of lessons learned from previous stationary power programs and recommendations for a strategy for hydrogen infrastructure deployment. This strategy incorporates all hydrogen pathways and a combination of distributed power generating stations, and provides an overview of stationary power markets, benefits of hydrogen-based stationary power systems, and competitive and technological challenges. The motivation for this project was to identify the lessons learned from prior stationary power programs, including the most significant obstacles, how these obstacles have been approached, outcomes of the programs, and how this information can be used by the Hydrogen, Fuel Cells & Infrastructure Technologies Program to meet program objectives primarily related to hydrogen pathway technologies (production, storage, and delivery) and implementation of fuel cell technologies for distributed stationary power. In addition, the lessons learned address environmental and safety concerns, including codes and standards, and education of key stakeholders.

  12. Derivation of flood frequency curves in poorly gauged Mediterranean catchments using a simple stochastic hydrological rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Aronica, G. T.; Candela, A.

    2007-12-01

    In this paper a Monte Carlo procedure for deriving frequency distributions of peak flows using a semi-distributed stochastic rainfall-runoff model is presented. The rainfall-runoff model used here is a very simple one, with a limited number of parameters, and requires practically no calibration, making it a robust tool for catchments that are partially or poorly gauged. The procedure is based on three modules: a stochastic rainfall generator module, a hydrologic loss module, and a flood routing module. In the rainfall generator module the rainfall storm, i.e. the maximum rainfall depth for a fixed duration, is assumed to follow the two-component extreme value (TCEV) distribution, whose parameters have been estimated at regional scale for Sicily. The catchment response has been modelled using the Soil Conservation Service-Curve Number (SCS-CN) method, in a semi-distributed form, for the transformation of total rainfall to effective rainfall, and a simple form of IUH for the flood routing. Here, the SCS-CN method is implemented in probabilistic form with respect to prior-to-storm conditions, allowing relaxation of the classical iso-frequency assumption between rainfall and peak flow. The procedure is tested on six practical case studies in which synthetic FFCs (flood frequency curves) were obtained from the model variable distributions by simulating 5000 flood events, combining 5000 values of total rainfall depth for the storm duration with AMC (antecedent moisture condition) states. The application of this procedure showed how the Monte Carlo simulation technique can reproduce the observed flood frequency curves with reasonable accuracy over a wide range of return periods, using a simple and parsimonious approach, limited data input, and no calibration of the rainfall-runoff model.
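The Monte Carlo chain (storm depth, SCS-CN effective rainfall, routed peak, empirical quantiles) can be sketched as follows. A Gumbel depth model, arbitrary curve numbers, and a linear peak factor stand in for the regional TCEV distribution, the AMC-dependent CN values, and the IUH used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def scs_cn_runoff(P, CN):
    """SCS-CN effective rainfall (mm) for storm depth P (mm)."""
    S = 25400.0 / CN - 254.0          # potential retention (mm)
    Ia = 0.2 * S                      # initial abstraction
    return np.where(P > Ia, (P - Ia) ** 2 / (P + 0.8 * S), 0.0)

n = 5000
P = np.clip(rng.gumbel(loc=60.0, scale=20.0, size=n), 0.0, None)  # storm depth (mm)
CN = rng.choice([65.0, 75.0, 85.0], size=n)   # AMC-dependent curve numbers
Q = scs_cn_runoff(P, CN)                      # effective rainfall (mm)

peak = 0.9 * Q                  # illustrative linear routing (peak per mm of runoff)
T = np.array([2, 10, 50, 100])  # return periods (years)
ffc = np.quantile(peak, 1.0 - 1.0 / T)        # synthetic flood frequency curve
print(dict(zip(T.tolist(), np.round(ffc, 1))))
```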

  13. A Reserve-based Method for Mitigating the Impact of Renewable Energy

    NASA Astrophysics Data System (ADS)

    Krad, Ibrahim

    The fundamental operating paradigm of today's power systems is undergoing a significant shift. This is partially motivated by the increased desire for incorporating variable renewable energy resources into generation portfolios. While these generating technologies offer clean energy at zero marginal cost, i.e., no fuel costs, they also offer unique operating challenges for system operators. Perhaps the biggest operating challenge these resources introduce is accommodating their intermittent fuel source availability. For this reason, these generators increase the system-wide variability and uncertainty. As a result, system operators are revisiting traditional operating strategies to more efficiently incorporate these generation resources to maximize the benefit they provide while minimizing the challenges they introduce. One way system operators have accounted for system variability and uncertainty is through the use of operating reserves. Operating reserves can be simplified as excess capacity kept online during real-time operations to help accommodate unforeseen fluctuations in demand. With new generation resources, a new class of operating reserves has emerged that is generally known as flexibility, or ramping, reserves. This new reserve class is meant to better position systems to mitigate severe ramping in the net load profile. The best way to define this new requirement is still under investigation. Typical requirement definitions focus on the additional uncertainty introduced by variable generation, and there is room for improvement regarding explicit consideration of the variability they introduce. An exogenous reserve modification method is introduced in this report that can improve system reliability with minimal impact on total system-wide production costs. Another potential solution to this problem is to formulate the problem as a stochastic programming problem.
The unit commitment and economic dispatch problems are typically formulated as deterministic problems due to fast solution times and the solutions being sufficient for operations. Improvements in technical computing hardware have reignited interest in stochastic modeling. The variability of wind and solar naturally lends itself to stochastic modeling. The use of explicit reserve requirements in stochastic models is an area of interest for power system researchers. This report introduces a new reserve modification implementation based on previous results to be used in a stochastic modeling framework. With technological improvements in distributed generation technologies, microgrids are currently being researched and implemented. Microgrids are small power systems that have the ability to serve their demand with their own generation resources and may have a connection to a larger power system. As battery technologies improve, they are becoming a more viable option in these distributed power systems and research is necessary to determine the most efficient way to utilize them. This report will investigate several unique operating strategies for batteries in small power systems and analyze their benefits. These new operating strategies will help reduce operating costs and improve system reliability.

  14. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement has been developed called the Profile Indentation Pressure Evaluation (PIPE) method that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, geometry effect is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. 
Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.

  15. A Conditions Data Management System for HEP Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laycock, P. J.; Dykstra, D.; Formica, A.

    Conditions data infrastructure for both ATLAS and CMS has to deal with the management of several terabytes of data. Distributed computing access to these data requires particular care and attention to manage request rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management with the aim of using this design for data-taking in Run 3. In the meantime other experiments, including NA62, have expressed an interest in this cross-experiment initiative. For experiments with a smaller payload volume and complexity, there is particular interest in simplifying the payload storage. The conditions data management model is implemented in a small set of relational database tables. A prototype access toolkit consisting of an intermediate web server has been implemented, using standard technologies available in the Java community. Access is provided through a set of REST services whose API has been described in a generic way using standard OpenAPI specifications, implemented in Swagger. Such a solution allows the automatic generation of client code and server stubs, and further allows the backend technology to be changed transparently. An important advantage of using a REST API for conditions access is the possibility of caching identical URLs, addressing one of the biggest challenges that large distributed computing solutions impose on conditions data access: avoiding direct DB access by means of standard web proxy solutions.

  16. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
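    The marginalization over hidden events can be illustrated with a deliberately tiny toy model (invented segments and event distributions, not the inferred human beta-chain statistics): the generation probability of a sequence is the sum, over every (V choice, insertions, J choice) scenario that can produce it, of that scenario's probability.

```python
# Invented toy distributions; a real model also has deletion and D-gene events.
V_SEGMENTS = {"CAT": 0.6, "CA": 0.4}   # V gene choice (post-deletion ends)
J_SEGMENTS = {"GA": 0.7, "TGA": 0.3}   # J gene choice
P_N_INSERT = {0: 0.5, 1: 0.3, 2: 0.2}  # number of inserted nucleotides
# Each inserted nucleotide is drawn uniformly from ACGT: probability 1/4.

def generation_probability(seq):
    """P(seq) = sum over hidden (V, insertions, J) scenarios with V+ins+J == seq."""
    total = 0.0
    for v, p_v in V_SEGMENTS.items():
        for j, p_j in J_SEGMENTS.items():
            n_ins = len(seq) - len(v) - len(j)
            if n_ins in P_N_INSERT and seq.startswith(v) and seq.endswith(j):
                # The inserted bases are fixed by seq; the specific insertion
                # observed has probability (1/4)^n_ins.
                total += p_v * p_j * P_N_INSERT[n_ins] * 0.25 ** n_ins
    return total
```

Even in this toy, a sequence such as "CATGA" is reachable by three distinct scenarios, which is exactly why the hidden events cannot be read off the observed sequence and a likelihood-based method is needed.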

  17. Ergodicity, hidden bias and the growth rate gain

    NASA Astrophysics Data System (ADS)

    Rochman, Nash D.; Popescu, Dan M.; Sun, Sean X.

    2018-05-01

    Many single-cell observables are highly heterogeneous. A part of this heterogeneity stems from age-related phenomena: the fact that there is a nonuniform distribution of cells with different ages. This has led to a renewed interest in analytic methodologies including use of the ‘von Foerster equation’ for predicting population growth and cell age distributions. Here we discuss how some of the most popular implementations of this machinery assume a strong condition on the ergodicity of the cell cycle duration ensemble. We show that one common definition of the term ergodicity, ‘a single individual observed over many generations recapitulates the behavior of the entire ensemble’, is implied by the other, ‘the probability of observing any state is conserved across time and over all individuals’, in an ensemble with a fixed number of individuals, but that this is not true when the ensemble is growing. We further explore the impact of generational correlations between cell cycle durations on the population growth rate. Finally, we explore the ‘growth rate gain’, the phenomenon that variations in the cell cycle duration lead to an improved population-level growth rate, in this context. We highlight that, fundamentally, this effect is due to asymmetric division.
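    A minimal way to see the growth rate gain is the Euler-Lotka relation for binary division, 2 E[exp(-lambda tau)] = 1, solved numerically for the population growth rate lambda. This sketch assumes uncorrelated, ergodically sampled division times and is not the paper's full treatment: with the mean cycle duration held fixed, adding variability raises the root above ln 2 / mean.

```python
import math

def growth_rate(durations, probs, lo=1e-9, hi=10.0, tol=1e-12):
    """Bisection solve of 2 * sum_i p_i * exp(-lam * tau_i) = 1 for lam."""
    def f(lam):
        return 2.0 * sum(p * math.exp(-lam * t)
                         for t, p in zip(durations, probs)) - 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:   # f > 0 means lam still underestimates the rate
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam_fixed = growth_rate([1.0], [1.0])            # clock-like divider: ln 2
lam_noisy = growth_rate([0.5, 1.5], [0.5, 0.5])  # same mean cycle, variable
```

By Jensen's inequality E[exp(-lam tau)] > exp(-lam E[tau]) for any nondegenerate duration distribution, so the noisy ensemble's root exceeds ln 2 even though the mean cycle time is unchanged.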

  18. A technique for generating phase-space-based Monte Carlo beamlets in radiotherapy applications.

    PubMed

    Bush, K; Popescu, I A; Zavgorodni, S

    2008-09-21

    As radiotherapy treatment planning moves toward Monte Carlo (MC) based dose calculation methods, the MC beamlet is becoming an increasingly common optimization entity. At present, methods used to produce MC beamlets have utilized a particle source model (PSM) approach. In this work we outline the implementation of a phase-space-based approach to MC beamlet generation that is expected to provide greater accuracy in beamlet dose distributions. In this approach a standard BEAMnrc phase space is sorted and divided into beamlets with particles labeled using the inheritable particle history variable. This is achieved with the use of an efficient sorting algorithm, capable of sorting a phase space of any size into the required number of beamlets in only two passes. Sorting a phase space of five million particles can be achieved in less than 8 s on a single-core 2.2 GHz CPU. The beamlets can then be transported separately into a patient CT dataset, producing separate dose distributions (doselets). Methods for doselet normalization and conversion of dose to absolute units of Gy for use in intensity modulated radiation therapy (IMRT) plan optimization are also described.
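    The two-pass sort described above is essentially a counting sort keyed on beamlet index: one pass histograms the particles per beamlet, a prefix sum turns the histogram into block offsets, and a second pass scatters each particle into its block. The particle representation and the 2x2 beamlet grid below are illustrative, not the BEAMnrc phase-space record layout.

```python
def sort_into_beamlets(particles, beamlet_of, n_beamlets):
    counts = [0] * n_beamlets
    for p in particles:                  # pass 1: particles per beamlet
        counts[beamlet_of(p)] += 1
    offsets = [0] * n_beamlets           # prefix sum: start of each block
    for b in range(1, n_beamlets):
        offsets[b] = offsets[b - 1] + counts[b - 1]
    out = [None] * len(particles)
    cursor = list(offsets)
    for p in particles:                  # pass 2: scatter into place
        b = beamlet_of(p)
        out[cursor[b]] = p
        cursor[b] += 1
    return out, offsets, counts

def beamlet_index(p, half_width=5.0, n_side=2):
    # Map an (x, y) position on the phase-space plane to a grid cell index.
    x, y = p
    ix = min(int((x + half_width) / (2 * half_width) * n_side), n_side - 1)
    iy = min(int((y + half_width) / (2 * half_width) * n_side), n_side - 1)
    return iy * n_side + ix

particles = [(-3.0, -3.0), (3.0, 3.0), (3.0, -3.0), (-3.0, 3.0)]
ordered, offsets, counts = sort_into_beamlets(particles, beamlet_index, 4)
```

Both passes are linear in the number of particles, which is why a multi-million-particle phase space can be divided in seconds without ever comparing particles pairwise.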

  19. Security of six-state quantum key distribution protocol with threshold detectors

    PubMed Central

    Kato, Go; Tamaki, Kiyoshi

    2016-01-01

    The security of quantum key distribution (QKD) is established by a security proof, and the security proof puts some assumptions on the devices constituting a QKD system. Among such assumptions, security proofs of the six-state protocol assume the use of a photon-number-resolving (PNR) detector, and as a result the bit error rate threshold for secure key generation for the six-state protocol is higher than that for the BB84 protocol. Unfortunately, however, this type of detector is demanding in terms of technological level compared to the standard threshold detector, and removing the necessity of such a detector enhances the feasibility of implementing the six-state protocol. Here, we develop a security proof for the six-state protocol and show that the threshold detector can be used for the six-state protocol. Importantly, the bit error rate threshold for key generation for the six-state protocol (12.611%) remains almost the same as the one (12.619%) derived from the existing security proofs assuming the use of PNR detectors. This clearly demonstrates the feasibility of the six-state protocol with practical devices. PMID:27443610

  20. A spatially distributed model for the dynamic prediction of sediment erosion and transport in mountainous forested watersheds

    NASA Astrophysics Data System (ADS)

    Doten, Colleen O.; Bowling, Laura C.; Lanini, Jordan S.; Maurer, Edwin P.; Lettenmaier, Dennis P.

    2006-04-01

    Erosion and sediment transport in a temperate forested watershed are predicted with a new sediment model that represents the main sources of sediment generation in forested environments (mass wasting, hillslope erosion, and road surface erosion) within the distributed hydrology-soil-vegetation model (DHSVM) environment. The model produces slope failures on the basis of a factor-of-safety analysis with the infinite slope model through use of stochastically generated soil and vegetation parameters. Failed material is routed downslope with a rule-based scheme that determines sediment delivery to streams. Sediment from hillslopes and road surfaces is also transported to the channel network. A simple channel routing scheme is implemented to predict basin sediment yield. We demonstrate through an initial application of this model to the Rainy Creek catchment, a tributary of the Wenatchee River, which drains the east slopes of the Cascade Mountains, that the model produces plausible sediment yield and ratios of landsliding and surface erosion when compared to published rates for similar catchments in the Pacific Northwest. A road removal scenario and a basin-wide fire scenario are both evaluated with the model.
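    The factor-of-safety test at the core of the mass-wasting component can be sketched with the standard infinite-slope formula (resisting shear stress over driving shear stress). The parameter values and the simple saturation treatment below are illustrative, whereas DHSVM draws soil and vegetation parameters stochastically.

```python
import math

def factor_of_safety(cohesion, phi_deg, theta_deg, soil_depth,
                     unit_weight=18.0e3, water_unit_weight=9.81e3,
                     saturation=0.0):
    """FS = resisting / driving shear stress on an infinite planar slope.

    cohesion [Pa], soil_depth [m], unit weights [N/m^3], slope angle theta
    and friction angle phi in degrees, saturation = saturated fraction of
    the soil column (pore pressure reduces the effective normal stress).
    """
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    effective_normal = ((unit_weight - saturation * water_unit_weight)
                        * soil_depth * math.cos(theta) ** 2)
    resisting = cohesion + effective_normal * math.tan(phi)
    driving = unit_weight * soil_depth * math.sin(theta) * math.cos(theta)
    return resisting / driving
```

For dry cohesionless soil the expression collapses to tan(phi)/tan(theta), so a slope at the friction angle sits exactly at FS = 1; wetting the column pushes FS below 1, which is how modeled storms trigger slope failures.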

  1. 78 FR 46621 - Status of the Office of New Reactors' Implementation of Electronic Distribution of Advanced...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-01

    ... of Electronic Distribution of Advanced Reactor Correspondence AGENCY: Nuclear Regulatory Commission. ACTION: Implementation of electronic distribution of advanced reactor correspondence; issuance. SUMMARY... public that, in the future, publicly available correspondence originating from the Division of Advanced...

  2. Control of dispatch dynamics for lowering the cost of distributed generation in the built environment

    NASA Astrophysics Data System (ADS)

    Flores, Robert Joseph

    Distributed generation can provide many benefits over traditional central generation such as increased reliability and efficiency while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and the parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction.
As load factor increases, lower operating cost generators are desired due to a larger portion of the building load being met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from DG reduces cost only if the building's thermal demand coincides with the electrical demand. Capacity limits exist where annual savings from operation of distributed generation decrease if further generation is installed. For low operating cost generators, the approximate limit is the average building load. This limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if positive economic performance is desired.
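    A minimal version of such an economic dispatch rule, with invented tariff and generator numbers rather than the SCE/SoCalGas structures used in the thesis, can be sketched as follows: run the generator whenever its marginal cost beats the energy price, then compare the resulting energy-plus-demand bill against the no-generator baseline.

```python
def dispatch_savings(loads_kw, energy_price, demand_charge,
                     gen_capacity_kw, gen_cost_per_kwh):
    """Greedy dispatch over hourly loads: run the generator whenever its
    marginal cost beats the energy price; savings = baseline bill minus
    (new energy bill + new demand charge + generation cost)."""
    base_bill = sum(loads_kw) * energy_price + max(loads_kw) * demand_charge
    net_loads, gen_cost = [], 0.0
    for load in loads_kw:                       # one entry per hour
        gen = min(gen_capacity_kw, load) if gen_cost_per_kwh < energy_price else 0.0
        gen_cost += gen * gen_cost_per_kwh      # 1 h steps: kW == kWh
        net_loads.append(load - gen)
    new_bill = (sum(net_loads) * energy_price
                + max(net_loads) * demand_charge + gen_cost)
    return base_bill - new_bill

savings = dispatch_savings([100.0, 200.0, 300.0],   # hourly loads, kW
                           energy_price=0.15,       # $/kWh
                           demand_charge=20.0,      # $/kW of peak
                           gen_cost_per_kwh=0.10,
                           gen_capacity_kw=100.0)
```

Note how the demand-charge term dominates the savings in this low-load-factor example; that is the effect the abstract describes, where peak shaving makes even a relatively costly generator worthwhile.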

  3. Self Healing Percolation

    NASA Astrophysics Data System (ADS)

    Scala, Antonio

    2015-03-01

    We introduce the concept of self-healing in the field of complex network modelling; in particular, self-healing capabilities are implemented through distributed communication protocols that exploit redundant links to recover the connectivity of the system. Self-healing is crucial in implementing the next generation of smart grids, allowing a high quality of service to be ensured for users. We then map our self-healing procedure onto a percolation problem and analyse the interplay between redundancies and topology in improving the resilience of networked infrastructures to multiple failures. We find exact results both for planar lattices and for random lattices, hinting at the role of duality in the design of resilient networks. Finally, we introduce a cavity method approach to study the recovery of connectivity after damage in self-healing networks. CNR-PNR National Project ``Crisis-Lab,'' EU HOME/2013/CIPS/AG/4000005013 project CI2C and EU FET project MULTIPLEX nr.317532.

  4. Implementation of a SVWP-based laser beam shaping technique for generation of 100-mJ-level picosecond pulses.

    PubMed

    Adamonis, J; Aleknavičius, A; Michailovas, K; Balickas, S; Petrauskienė, V; Gertus, T; Michailovas, A

    2016-10-01

    We present the implementation of an energy-efficient and flexible laser beam shaping technique in a high-power, high-energy laser amplifier system. The beam shaping is based on a spatially variable wave plate (SVWP) fabricated by femtosecond laser nanostructuring of glass. We reshaped the initially Gaussian beam into a super-Gaussian (SG) of the 12th order with an efficiency of about 50%. The 12th order of the SG beam provided the best compromise between large fill factor, low diffraction on the edges of the active media, and moderate intensity distribution modification during free-space propagation. We obtained 150 mJ pulses of 532 nm radiation. The high pulse energy, 85 ps pulse duration and nearly flat-top spatial profile make the beam ideal for pumping optical parametric chirped pulse amplification systems.
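    The flat-top character of a 12th-order super-Gaussian is easy to quantify under the common convention I(r) = exp(-2 (r/w)^n), an assumption here since the abstract does not state its normalization: compared with a Gaussian of the same width it stays near peak intensity across most of the aperture, which is what drives the fill factor up.

```python
import math

def super_gaussian(r, w, order):
    return math.exp(-2.0 * (r / w) ** order)

def fill_factor(order, w=1.0, n=10_000):
    # Energy inside the aperture r <= w relative to a perfect flat-top disc,
    # via midpoint integration of I(r) * 2*pi*r dr.
    dr = w / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += super_gaussian(r, w, order) * 2.0 * math.pi * r * dr
    return total / (math.pi * w * w)

flat_sg = super_gaussian(0.5, 1.0, 12)  # ~1.0: still flat at half the radius
flat_g = super_gaussian(0.5, 1.0, 2)    # ordinary Gaussian has fallen to ~0.61
```

Pushing the order higher than 12 would flatten the profile further, but at the cost of steeper edges, which is the diffraction trade-off the abstract mentions.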

  5. A Comparison of Three Programming Models for Adaptive Applications

    NASA Technical Reports Server (NTRS)

    Shan, Hong-Zhang; Singh, Jaswinder Pal; Oliker, Leonid; Biswa, Rupak; Kwak, Dochan (Technical Monitor)

    2000-01-01

    We study the performance and programming effort for two major classes of adaptive applications under three leading parallel programming models. We find that all three models can achieve scalable performance on state-of-the-art multiprocessor machines. The basic parallel algorithms needed for the different programming models to deliver their best performance are similar, but the implementations differ greatly, far beyond the fact of using explicit messages versus implicit loads/stores. Compared with MPI and SHMEM, CC-SAS (cache-coherent shared address space) provides substantial ease of programming at the conceptual and program orchestration level, which often leads to performance gains. However, it may also suffer from the poor spatial locality of physically distributed shared data on a large number of processors. Our CC-SAS implementation of the PARMETIS partitioner itself runs faster than in the other two programming models, and generates a more balanced result for our application.

  6. Photonic beamforming network for multibeam satellite-on-board phased-array antennas

    NASA Astrophysics Data System (ADS)

    Piqueras, M. A.; Cuesta-Soto, F.; Villalba, P.; Martí, A.; Hakansson, A.; Perdigués, J.; Caille, G.

    2017-11-01

    The implementation of a beamforming unit based on integrated photonic technologies is addressed in this work. This integrated photonic solution for multibeam coverage is compared with the digital and the RF solutions. Photonic devices show unique characteristics that match the critical requirements of space-oriented devices, such as low mass and size, low power consumption, and easy scalability to large systems. An experimental proof-of-concept of the photonic beamforming structure based on 4x4 and 8x8 Butler matrices is presented. The proof-of-concept is based on the heterodyne generation of multiple phase-engineered RF signals for the conformation of 8-4 different beams in an antenna array. Results show the feasibility of this technology for the implementation of optical beamforming with phase distribution errors below σ = 10°, with big savings in the required mass and size of the beamforming unit.

  7. NGDS Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blackman, Harold; Moore, Joseph

    2014-06-30

    The ultimate goal of the National Geothermal Data System (NGDS) is to support the discovery and generation of geothermal sources of energy. The NGDS was designed and has been implemented to provide online access to important geothermal-related data from a network of data providers in order to: • Increase the efficiency of exploration, development and usage of geothermal energy by providing a basis for financial risk analysis of potential sites • Assist state and federal agencies in making land and resource management assessments • Foster the discovery of new geothermal resources by supporting ongoing and future geothermal-related research • Increase public awareness of geothermal energy It is through the implementation of this distributed data system and its subsequent use that substantial increases in the general access to and understanding of geothermal-related data will result. NGDS provides a mechanism for the sharing of data, thereby fostering the discovery of new resources and supporting ongoing geothermal research.

  8. The Construction of 3-d Neutral Density for Arbitrary Data Sets

    NASA Astrophysics Data System (ADS)

    Riha, S.; McDougall, T. J.; Barker, P. M.

    2014-12-01

    The Neutral Density variable allows inference of water pathways from thermodynamic properties in the global ocean, and is therefore an essential component of global ocean circulation analysis. The widely used algorithm for the computation of Neutral Density yields accurate results for data sets which are close to the observed climatological ocean. Long-term numerical climate simulations, however, often generate a significant drift from present-day climate, which renders the existing algorithm inaccurate. To remedy this problem, new algorithms which operate on arbitrary data have been developed, which may potentially be used to compute Neutral Density during runtime of a numerical model. We review existing approaches for the construction of Neutral Density in arbitrary data sets, detail their algorithmic structure, and present an analysis of the computational cost for implementations on a single-CPU computer. We discuss possible strategies for the implementation in state-of-the-art numerical models, with a focus on distributed computing environments.

  9. SeaWiFS Science Algorithm Flow Chart

    NASA Technical Reports Server (NTRS)

    Darzi, Michael

    1998-01-01

    This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases however, the chart may deviate from the details of the software implementation so as to simplify the presentation.

  10. Preventive chemotherapy in human helminthiasis: theoretical and operational aspects

    PubMed Central

    Chitsulo, L.; Engels, D.; Savioli, L.

    2017-01-01

    Preventive chemotherapy (PC), the large-scale distribution of anthelminthic drugs to population groups at risk, is the core intervention recommended by the WHO for reducing morbidity and transmission of the four main helminth infections, namely lymphatic filariasis, onchocerciasis, schistosomiasis and soil-transmitted helminthiasis. The strategy is widely implemented worldwide but its general theoretical foundations have not been described so far in a comprehensive and cohesive manner. Starting from the information available on the biological and epidemiological characteristics of helminth infections, as well as from the experience generated by disease control and elimination interventions across the world, we extrapolate the fundamentals and synthesise the principles that regulate PC and justify its implementation as a sound and essential public health intervention. The outline of the theoretical aspects of PC contributes to a thorough understanding of the different facets of this strategy and helps comprehend opportunities and limits of control and elimination interventions directed against helminth infections. PMID:22040463

  11. Experimental plug and play quantum coin flipping.

    PubMed

    Pappa, Anna; Jouguet, Paul; Lawson, Thomas; Chailloux, André; Legré, Matthieu; Trinkler, Patrick; Kerenidis, Iordanis; Diamanti, Eleni

    2014-04-24

    Performing complex cryptographic tasks will be an essential element in future quantum communication networks. These tasks are based on a handful of fundamental primitives, such as coin flipping, where two distrustful parties wish to agree on a randomly generated bit. Although it is known that quantum versions of these primitives can offer information-theoretic security advantages with respect to classical protocols, a demonstration of such an advantage in a practical communication scenario has remained elusive. Here we experimentally implement a quantum coin flipping protocol that performs strictly better than classically possible over a distance suitable for communication over metropolitan area optical networks. The implementation is based on a practical plug and play system, developed by significantly enhancing a commercial quantum key distribution device. Moreover, we provide combined quantum coin flipping protocols that are almost perfectly secure against bounded adversaries. Our results offer a useful toolbox for future secure quantum communications.

  12. United Space Alliance LLC Parachute Refurbishment Facility Model

    NASA Technical Reports Server (NTRS)

    Esser, Valerie; Pessaro, Martha; Young, Angela

    2007-01-01

    The Parachute Refurbishment Facility Model was created to reflect the flow of hardware through the facility using anticipated start and delivery times from a project level IV schedule. Distributions for task times were built using historical build data for SFOC work and new data generated for CLV/ARES task times. The model currently processes 633 line items from 14 SFOC builds for flight readiness, 16 SFOC builds returning from flight for defoul, wash, and dry operations, 12 builds for CLV manufacturing operations, and 1 ARES 1X build. Modeling the planned workflow through the PRF is providing a reliable way to predict the capability of the facility as well as the manpower resource need. Creating a real world process allows for real world problems to be identified and potential workarounds to be implemented in a safe, simulated world before taking it to the next step, implementation in the real world.

  13. Medical education in India: current challenges and the way forward.

    PubMed

    Solanki, Anjali; Kashyap, Surender

    2014-12-01

    Medical education in India is suffering from various shortcomings at the conceptual as well as the implementation level. With the expansion of medical education, the doctor-to-patient ratio has increased, but these numbers do not align well with the overall quality of medical care in the country. To address this issue, a comprehensive analysis of the various associated factors is essential. Indian medical education suffers from a maldistribution of resources, unregulated growth in the private sector, a lack of uniform admission procedures, and traditional curricula lacking innovative approaches. To achieve higher standards of medical education, our goal should be to re-evaluate each and every aspect: create an efficient accreditation system, promote an equal distribution of resources, and redesign curricula with stricter implementation and improved assessment methodologies, all of which will generate efficient medical graduates and consequently better health care delivery, resulting in the desired change within the system.

  14. Generating multiplex gradients of biomolecules for controlling cellular adhesion in parallel microfluidic channels.

    PubMed

    Didar, Tohid Fatanat; Tabrizian, Maryam

    2012-11-07

    Here we present a microfluidic platform to generate multiplex gradients of biomolecules within parallel microfluidic channels, in which a range of multiplex concentration gradients with different profile shapes are simultaneously produced. Nonlinear polynomial gradients were also generated using this device. The gradient generation principle is based on implementing parallel channels, each providing a different hydrodynamic resistance. The generated biomolecule gradients were then covalently functionalized onto the microchannel surfaces. Surface gradients along the channel width were a result of covalent attachment of biomolecules to the surface, which remained functional under high shear stresses (50 dyn/cm(2)). An IgG antibody conjugated to three different fluorescence dyes (FITC, Cy5 and Cy3) was used to demonstrate the resulting multiplex concentration gradients of biomolecules. The device enabled generation of gradients with up to three different biomolecules in each channel with varying concentration profiles. We were also able to produce 2-dimensional gradients in which biomolecules were distributed along both the length and the width of the channel. To demonstrate the applicability of the developed design, three different multiplex concentration gradients of REDV and KRSR peptides were patterned along the width of three parallel channels, and adhesion of primary human umbilical vein endothelial cells (HUVEC) in each channel was subsequently investigated using a single chip.
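    The resistance-based splitting principle can be sketched with the wide-rectangular-channel approximation R = 12·mu·L/(w·h^3): parallel channels fed from a common inlet carry flow fractions proportional to their conductances. The channel dimensions below are illustrative, not the device's actual geometry.

```python
def resistance(mu, length, width, height):
    # Wide-rectangle approximation for a channel of width w >> height h.
    return 12.0 * mu * length / (width * height ** 3)

def flow_fractions(resistances):
    # Same pressure drop across parallel channels: Q_i proportional to 1/R_i.
    conductances = [1.0 / r for r in resistances]
    total = sum(conductances)
    return [g / total for g in conductances]

mu = 1.0e-3                              # water, Pa*s
lengths = [5e-3, 10e-3, 20e-3]           # three channels, same cross-section
rs = [resistance(mu, L, 100e-6, 50e-6) for L in lengths]
fractions = flow_fractions(rs)           # -> [4/7, 2/7, 1/7]
```

Choosing the channel lengths (or cross-sections) therefore sets the relative flows, and with them the concentration profile delivered to each parallel channel.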

  15. Development of asymmetric stent for treatment of eccentric plaque.

    PubMed

    Syaifudin, Achmad; Takeda, Ryo; Sasaki, Katsuhiko

    2018-01-01

    The selection of stent and balloon type is decisive in the stenting process. In the treatment of an eccentric plaque obstruction, a symmetric expansion from stent dilatation generates a nonuniform stress distribution, which may aggravate a fibrous cap prone to rupture. This paper develops a new stent design to treat eccentric plaque using structural transient dynamic analysis in ANSYS. A non-symmetric structural geometry of the stent is generated to obtain a reasonable stress distribution that is safe for the arterial layer surrounding the stent. To derive the novel structural geometry, a Sinusoidal stent type is modified by varying strut length and width, adding bridges, and varying the curvature width of struts. An end ring of stent struts was also modified to eliminate the dogboning phenomenon and to reduce the Ectropion angle. Two balloon types were used to deploy the stent, an ordinary cylindrical and an offset balloon. Positive modification results were used to construct the final non-symmetric stent design, called an Asymmetric stent. Analyses of the deformation characteristics, changes in surface roughness and induced stresses within the intact arterial layer were subsequently examined. Interaction between the stent and vessel wall was assessed by means of changes in surface roughness and stress distribution analyses. The Palmaz and the Sinusoidal stent were used for a comparative study. This study indicated that the Asymmetric stent type reduced the central radial recoiling and the dogboning phenomenon. In terms of changes in surface roughness and induced stresses, the Asymmetric stent has an effect comparable to that of the Sinusoidal stent. In addition, it could enhance the distribution of surface roughening when expanded by an offset balloon.

  16. Distributed Leadership and Organizational Change: Implementation of a Teaching Performance Measure

    ERIC Educational Resources Information Center

    Sloan, Tine

    2013-01-01

    This article explores leadership practice and change as evidenced in multiple data sources gathered during a self-study implementation of a teaching performance assessment. It offers promising models of distributed leadership and organizational change that can inform future program implementers and the field in general. Our experiences suggest…

  17. Real-Time Optimization and Control of Next-Generation Distribution Infrastructure

    Science.gov Websites

    Grid Modernization | NREL: This project develops innovative, real-time optimization and control methods for next-generation distribution infrastructure.

  18. Hybrid MPI/OpenMP Implementation of the ORAC Molecular Dynamics Program for Generalized Ensemble and Fast Switching Alchemical Simulations.

    PubMed

    Procacci, Piero

    2016-06-27

    We present a new release (6.0β) of the ORAC program [Marsili et al. J. Comput. Chem. 2010, 31, 1106-1116] with a hybrid OpenMP/MPI (open multiprocessing/message passing interface) multilevel parallelism tailored for generalized ensemble (GE) and fast switching double annihilation (FS-DAM) nonequilibrium technology, aimed at evaluating the binding free energy in drug-receptor systems on high performance computing platforms. The production of the GE or FS-DAM trajectories is handled using a weak scaling parallel approach on the MPI level only, while a strong scaling force decomposition scheme is implemented for intranode computations with shared memory access at the OpenMP level. The efficiency, simplicity, and inherent parallel nature of the ORAC implementation of the FS-DAM algorithm project the code as a possible effective tool for second-generation high-throughput virtual screening in drug discovery and design. The code, along with documentation, testing, and ancillary tools, is distributed under the provisions of the General Public License and can be freely downloaded at www.chim.unifi.it/orac .

  19. GOES-R GS Product Generation Infrastructure Operations

    NASA Astrophysics Data System (ADS)

    Blanton, M.; Gundy, J.

    2012-12-01

    GOES-R GS Product Generation Infrastructure Operations: The GOES-R Ground System (GS) will produce a much larger set of products with higher data density than previous GOES systems. This requires considerably greater compute and memory resources to achieve the necessary latency and availability for these products. Over time, new algorithms could be added and existing ones removed or updated, but the GOES-R GS cannot go down during this time. To meet these GOES-R GS processing needs, the Harris Corporation will implement a Product Generation (PG) infrastructure that is scalable, extensible, modular and reliable. The primary part of the PG infrastructure is the Service Based Architecture (SBA), which includes the Distributed Data Fabric (DDF). The SBA is the middleware that encapsulates and manages the science algorithms that generate products. The SBA is divided into three parts: the Executive, which manages and configures the algorithm as a service; the Dispatcher, which provides data to the algorithm; and the Strategy, which determines when the algorithm can execute with the available data. The SBA is a distributed architecture, with services connected to each other over a compute grid, and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so scalable and reliable messaging is necessary. The SBA uses the DDF to provide this data communication layer between algorithms. The DDF provides an abstract interface over a distributed and persistent multi-layered storage system (memory-based caching above disk-based storage) and an event system that allows algorithm services to know when data is available and to get the data they need to begin processing when they need it. Together, the SBA and the DDF provide a flexible, high performance architecture that can meet the needs of product processing now and as they grow in the future.
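    The Executive/Dispatcher/Strategy split can be sketched as follows, with the class roles taken from the description above but all names, the algorithm, and the event mechanism simplified for illustration: the Strategy accumulates offered inputs and releases them once the algorithm can run.

```python
class Strategy:
    """Decides when the algorithm may run: here, once every required
    input product has been offered."""
    def __init__(self, required_inputs):
        self.required = set(required_inputs)
        self.available = {}

    def offer(self, name, data):
        self.available[name] = data
        if self.required <= set(self.available):
            ready = {k: self.available[k] for k in self.required}
            self.available = {}      # consume the inputs
            return ready
        return None

class Executive:
    """Wraps a science algorithm as a service; the Dispatcher feeds it."""
    def __init__(self, algorithm, strategy):
        self.algorithm = algorithm
        self.strategy = strategy
        self.products = []

    def on_data(self, name, data):   # called by the Dispatcher on data events
        ready = self.strategy.offer(name, data)
        if ready is not None:
            self.products.append(self.algorithm(ready))

# A derived product needing two upstream products before it can run:
svc = Executive(lambda d: d["radiance"] + d["geolocation"],
                Strategy({"radiance", "geolocation"}))
svc.on_data("radiance", 10)     # not ready yet
svc.on_data("geolocation", 1)   # both inputs present, algorithm executes
```

Because the readiness logic lives in the Strategy and the data delivery in the Dispatcher, a wrapped algorithm can be swapped or updated without touching either, which is the plug-and-play property the abstract emphasizes.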

  20. Hierarchical storage of large volumes of multidetector CT data using distributed servers

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris; Bandon, David

    2006-03-01

    Multidetector scanners and hybrid multimodality scanners can generate large numbers of high-resolution images, resulting in very large datasets. In most cases, these datasets are generated for the sole purpose of producing secondary processed images, 3D renderings, and oblique and curved multiplanar reformatted images. It is therefore not essential to archive the original images after they have been processed. We have developed an architecture of distributed archive servers for temporary storage of large image datasets for 3D rendering and image processing, without the need for long-term storage in a PACS archive. Given the relatively low cost of storage devices, these servers can be configured to hold several months or even years of data, long enough to allow subsequent reprocessing if required by specific clinical situations. We tested the latest generation of RAID servers provided by Apple with a capacity of 5 TB. We implemented peer-to-peer data access software based on our open-source image management software OsiriX, allowing remote workstations to directly access DICOM image files located on the server through the "Bonjour" service-discovery technology. This architecture offers seamless integration of multiple servers and workstations without the need for a central database or complex workflow management tools. It allows efficient access to image data from multiple workstations for image analysis and visualization without the need for image data transfer. It provides a convenient alternative to a centralized PACS architecture while avoiding complex and time-consuming data transfer and storage.

  1. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    2000-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and it affects overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs to improve data traffic. We use some of these methods to translate data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods, and we show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distributions implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms with complexity O(n^4) in the worst case.

  2. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    1999-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and it affects overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs to improve data traffic. We use some of these methods to translate data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods, and we show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distributions implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms with complexity O(n^4) in the worst case.
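
    Data distribution directives encode an index-to-processor mapping. The sketch below illustrates the two classic mappings such tools target (HPF-style BLOCK and CYCLIC); it is a toy illustration, not ADAPT itself.

```python
# Toy index-to-processor mappings implied by distribution directives:
# BLOCK gives each processor one contiguous chunk; CYCLIC deals indices
# round-robin. Names and semantics follow HPF conventions, not ADAPT.

def block_owner(i, n, p):
    """Processor owning index i of an n-element array over p processors (BLOCK)."""
    chunk = -(-n // p)          # ceil(n / p): chunk size per processor
    return i // chunk

def cyclic_owner(i, p):
    """Processor owning index i under a CYCLIC distribution."""
    return i % p

n, p = 10, 4
print([block_owner(i, n, p) for i in range(n)])   # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3]
print([cyclic_owner(i, p) for i in range(n)])     # [0, 1, 2, 3, 0, 1, 2, 3, 0, 1]
```

    BLOCK favors stencil-style locality (neighbors usually live on the same processor), while CYCLIC balances load for triangular or irregular work, which is exactly the traffic trade-off a distribution tool must weigh.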

  3. Foundational Report Series. Advanced Distribution management Systems for Grid Modernization (Importance of DMS for Distribution Grid Modernization)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui

    2015-09-01

    Grid modernization is transforming the operation and management of electric distribution systems from manual, paper-driven business processes to electronic, computer-assisted decision-making. At the center of this business transformation is the distribution management system (DMS), which provides a foundation from which optimal levels of performance can be achieved in an increasingly complex business and operating environment. Electric distribution utilities are facing many new challenges that are dramatically increasing the complexity of operating and managing the electric distribution system: growing customer expectations for service reliability and power quality, pressure to achieve better efficiency and utilization of existing distribution system assets, and reduction of greenhouse gas emissions by accommodating high penetration levels of distributed generating resources powered by renewable energy sources (wind, solar, etc.). Recent “storm of the century” events in the northeastern United States, and the lengthy power outages and customer hardships that followed, have greatly elevated the need to make power delivery systems more resilient to major storm events and to provide a more effective electric utility response during such regional power grid emergencies. Despite these newly emerging challenges for electric distribution system operators, only a small percentage of electric utilities have actually implemented a DMS. This paper discusses reasons why a DMS is needed and why the DMS may emerge as a mission-critical system that will soon be considered essential as electric utilities roll out their grid modernization strategies.

  4. IEEE 1588 Time Synchronization Board in MTCA.4 Form Factor

    NASA Astrophysics Data System (ADS)

    Jabłoński, G.; Makowski, D.; Mielczarek, A.; Orlikowski, M.; Perek, P.; Napieralski, A.; Makijarvi, P.; Simrock, S.

    2015-06-01

    Distributed data acquisition and control systems in large-scale scientific experiments, such as ITER, require time synchronization with nanosecond precision. A protocol commonly used for this purpose is the Precision Time Protocol (PTP), also known as the IEEE 1588 standard. It uses standard Ethernet signalling and protocols and achieves timing accuracy on the order of tens of nanoseconds. MTCA.4 is gradually becoming the platform of choice for building such systems, yet there is currently no commercially available PTP receiver for that platform. In this paper, we present a module in the MTCA.4 form factor supporting this standard. The module may be used as a timing receiver providing reference clocks in an MTCA.4 chassis, generating a Pulse Per Second (PPS) signal, and allowing generation of triggers and timestamping of events on 8 configurable backplane lines and two front-panel connectors. The module is based on a Xilinx Spartan-6 FPGA and a thermally stabilized voltage-controlled oscillator steered by a digital-to-analog converter. The board supports standalone operation, without support from the host operating system, as the entire control algorithm runs on a MicroBlaze CPU implemented in the FPGA. The software support for the card includes a low-level API in the form of a Linux driver and user-mode library, and a high-level API consisting of an ITER Nominal Device Support and an EPICS IOC. The device has been tested in the ITER timing distribution network (TCN) with three cascaded PTP-enabled Hirschmann switches and a GPS reference clock source. An RMS synchronization accuracy better than 20 ns, measured by direct comparison of the PPS signals, has been obtained.
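
    The slave-side arithmetic at the heart of PTP can be sketched from the four timestamps of a Sync/Delay_Req exchange, assuming a symmetric path. The nanosecond values below are invented for illustration, not measurements from the board.

```python
# Core IEEE 1588 offset/delay computation on the slave, assuming the
# one-way path delay is the same in both directions. Timestamps are
# illustrative nanosecond counts.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync; t2: slave receives it;
       t3: slave sends Delay_Req; t4: master receives it."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # mean one-way path delay
    return offset, delay

# Example: slave runs 150 ns ahead of the master; true path delay is 500 ns.
t1 = 1_000_000
t2 = t1 + 500 + 150      # path delay plus slave's offset
t3 = t2 + 2_000          # slave waits, then sends Delay_Req
t4 = t3 + 500 - 150      # master timestamps arrival with its own clock
print(ptp_offset_and_delay(t1, t2, t3, t4))   # (150.0, 500.0)
```

    The slave then steers its oscillator to drive the estimated offset to zero; asymmetric path delay is the main residual error source, which is why cascaded PTP-aware switches matter in the test setup above.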

  5. Forward and Inverse Modeling of Self-potential. A Tomography of Groundwater Flow and Comparison Between Deterministic and Stochastic Inversion Methods

    NASA Astrophysics Data System (ADS)

    Quintero-Chavarria, E.; Ochoa Gutierrez, L. H.

    2016-12-01

    Applications of the self-potential method in hydrogeology and environmental science have developed significantly during the last two decades, with strong use in identifying groundwater flows. Although few authors deal with the forward problem's solution (especially in the geophysics literature), different inversion procedures are currently being developed; in most cases, however, they are compared with unconventional groundwater velocity fields and restricted to structured meshes. This research solves the forward problem with the finite element method, using St. Venant's principle to transform a point dipole, the field generated by a single vector, into a distribution of electrical monopoles. Two simple aquifer models were then generated with specific boundary conditions, and the head potentials, velocity fields, and electric potentials in the medium were computed. From the model's surface electric potential, the inverse problem is solved to retrieve the source of electric potential (the vector field associated with groundwater flow) using deterministic and stochastic approaches. The first approach was carried out by implementing Tikhonov regularization with a stabilizing operator adapted to the finite element mesh, while for the second a hierarchical Bayesian model based on Markov chain Monte Carlo (MCMC) and Markov random fields (MRF) was constructed. For all implemented methods, the direct and inverse models were contrasted in two ways: 1) the shape and distribution of the vector field, and 2) the histogram of magnitudes. It was concluded that inversion procedures improve when the velocity field's behaviour is considered; thus, the deterministic method is more suitable for unconfined aquifers than for confined ones. MCMC has restricted applications and requires a lot of information (particularly on potential fields), while MRF responds remarkably well, especially when dealing with confined aquifers.
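
    The deterministic approach named above can be sketched in a few lines: Tikhonov regularization solves min ||Gm - d||^2 + a^2 ||m||^2 through the normal equations. The tiny random operator below stands in for the study's finite-element operator; it is an assumption for illustration only.

```python
# Minimal Tikhonov-regularized inversion: (G^T G + a^2 I) m = G^T d.
# G is a small synthetic forward operator, not the finite-element mesh
# operator of the study.
import numpy as np

def tikhonov_inverse(G, d, alpha):
    """Regularized least squares via the normal equations."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ d)

rng = np.random.default_rng(0)
m_true = np.array([1.0, -2.0, 0.5])            # "true" source strengths
G = rng.normal(size=(20, 3))                   # synthetic forward operator
d = G @ m_true + 0.01 * rng.normal(size=20)    # noisy "surface potentials"
m_est = tikhonov_inverse(G, d, alpha=0.1)
print(np.round(m_est, 2))                      # close to [ 1.  -2.   0.5]
```

    The regularization parameter trades data fit against model norm; adapting the stabilizing operator to the mesh, as the authors do, replaces the identity above with a mesh-aware smoothing matrix.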

  6. Automatic generation of computable implementation guides from clinical information models.

    PubMed

    Boscá, Diego; Maldonado, José Alberto; Moner, David; Robles, Montserrat

    2015-06-01

    Clinical information models are increasingly used to describe the contents of Electronic Health Records. Implementation guides are a common specification mechanism used to define such models. They contain, among other reference materials, all the constraints and rules that clinical information must obey. However, these implementation guides are typically oriented toward human readability and thus cannot be processed by computers. As a consequence, they must be manually reinterpreted and transformed into an executable language such as Schematron or the Object Constraint Language (OCL). This task can be difficult and error-prone owing to the large gap between the two representations. The challenge is to develop a methodology for specifying implementation guides in such a way that humans can easily read and understand them while computers can still process them. In this paper, we propose and describe a novel methodology that uses archetypes as the basis for the generation of implementation guides. We use archetypes to generate formal rules expressed in the Natural Rule Language (NRL) as well as other reference materials usually included in implementation guides, such as sample XML instances. We also generate Schematron rules from NRL rules to be used for the validation of data instances. We have implemented these methods in LinkEHR, an archetype editing platform, and we exemplify our approach by generating NRL rules and implementation guides from EN ISO 13606, openEHR, and HL7 CDA archetypes.
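
    The gap the authors close, from a human-readable constraint to an executable rule, can be illustrated with a toy generator that emits a Schematron-style assertion from a cardinality constraint. The constraint shape and XPath here are invented for illustration and are far simpler than LinkEHR's archetype-to-NRL-to-Schematron pipeline.

```python
# Toy rule generator: turn a declarative cardinality constraint into a
# Schematron-style <assert>. The path and message format are invented
# for illustration, not LinkEHR output.

def to_schematron(path, min_occurs, max_occurs):
    test = f"count({path}) >= {min_occurs} and count({path}) <= {max_occurs}"
    return (f'<assert test="{test}">'
            f'{path} must occur between {min_occurs} and {max_occurs} times'
            f'</assert>')

rule = to_schematron("entry/observation", 1, 2)
print(rule)
```

    The point of generating such rules from a single archetype source is that the human-readable guide and the machine-executable validator can never drift apart, because both are derived from the same model.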

  7. Planning for a Distributed Disruption: Innovative Practices for Incorporating Distributed Solar into Utility Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Andrew D.; Barbose, Galen L.; Seel, Joachim

    The rapid growth of distributed solar photovoltaics (DPV) has critical implications for U.S. utility planning processes. This report informs utility planning through a comparative analysis of roughly 30 recent utility integrated resource plans or other generation planning studies, transmission planning studies, and distribution system plans. It reveals a spectrum of approaches to incorporating DPV across nine key planning areas, and it identifies areas where even the best current practices might be enhanced. (1) Forecasting DPV deployment: Because it explicitly captures several predictive factors, customer-adoption modeling is the most comprehensive forecasting approach. It could be combined with other forecasting methods to generate a range of potential futures. (2) Ensuring robustness of decisions to uncertain DPV quantities: Using a capacity-expansion model to develop least-cost plans for various scenarios accounts for changes in net load and the generation portfolio; an innovative variation of this approach combines multiple per-scenario plans with trigger events, which indicate when conditions have changed sufficiently from the expected to trigger modifications in resource-acquisition strategy. (3) Characterizing DPV as a resource option: Today's most comprehensive plans account for all of DPV's monetary costs and benefits. An enhanced approach would address non-monetary and societal impacts as well. (4) Incorporating the non-dispatchability of DPV into planning: Rather than having a distinct innovative practice, innovation in this area is represented by evolving methods for capturing this important aspect of DPV. (5) Accounting for DPV's location-specific factors: The innovative propensity-to-adopt method employs several factors to predict future DPV locations. Another emerging utility innovation is locating DPV strategically to enhance its benefits.
(6) Estimating DPV's impact on transmission and distribution investments: Innovative practices are being implemented to evaluate system needs, hosting capacities, and system investments needed to accommodate DPV deployment. (7) Estimating avoided losses associated with DPV: A time-differentiated marginal loss rate provides the most comprehensive estimate of avoided losses due to DPV, but no studies appear to use it. (8) Considering changes in DPV's value with higher solar penetration: Innovative methods for addressing the value changes at high solar penetrations are lacking among the studies we evaluate. (9) Integrating DPV in planning across generation, transmission, and distribution: A few states and regions have started to develop more comprehensive processes that link planning forums, but there are still many issues to address.

  8. Planning for a Distributed Disruption: Innovative Practices for Incorporating Distributed Solar into Utility Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Andrew; Barbose, Galen; Seel, Joachim

    The rapid growth of distributed solar photovoltaics (DPV) has critical implications for U.S. utility planning processes. This report informs utility planning through a comparative analysis of roughly 30 recent utility integrated resource plans or other generation planning studies, transmission planning studies, and distribution system plans. It reveals a spectrum of approaches to incorporating DPV across nine key planning areas, and it identifies areas where even the best current practices might be enhanced. 1) Forecasting DPV deployment: Because it explicitly captures several predictive factors, customer-adoption modeling is the most comprehensive forecasting approach. It could be combined with other forecasting methods to generate a range of potential futures. 2) Ensuring robustness of decisions to uncertain DPV quantities: Using a capacity-expansion model to develop least-cost plans for various scenarios accounts for changes in net load and the generation portfolio; an innovative variation of this approach combines multiple per-scenario plans with trigger events, which indicate when conditions have changed sufficiently from the expected to trigger modifications in resource-acquisition strategy. 3) Characterizing DPV as a resource option: Today’s most comprehensive plans account for all of DPV’s monetary costs and benefits. An enhanced approach would address non-monetary and societal impacts as well. 4) Incorporating the non-dispatchability of DPV into planning: Rather than having a distinct innovative practice, innovation in this area is represented by evolving methods for capturing this important aspect of DPV. 5) Accounting for DPV’s location-specific factors: The innovative propensity-to-adopt method employs several factors to predict future DPV locations. Another emerging utility innovation is locating DPV strategically to enhance its benefits.
6) Estimating DPV’s impact on transmission and distribution investments: Innovative practices are being implemented to evaluate system needs, hosting capacities, and system investments needed to accommodate DPV deployment. 7) Estimating avoided losses associated with DPV: A time-differentiated marginal loss rate provides the most comprehensive estimate of avoided losses due to DPV, but no studies appear to use it. 8) Considering changes in DPV’s value with higher solar penetration: Innovative methods for addressing the value changes at high solar penetrations are lacking among the studies we evaluate. 9) Integrating DPV in planning across generation, transmission, and distribution: A few states and regions have started to develop more comprehensive processes that link planning forums, but there are still many issues to address.

  9. Implementing and Investigating Distributed Leadership in a National University Network--SaMnet

    ERIC Educational Resources Information Center

    Sharma, Manjula D.; Rifkin, Will; Tzioumis, Vicky; Hill, Matthew; Johnson, Elizabeth; Varsavsky, Cristina; Jones, Susan; Beames, Stephanie; Crampton, Andrea; Zadnik, Marjan; Pyke, Simon

    2017-01-01

    The literature suggests that collaborative approaches to leadership, such as distributed leadership, are essential for supporting educational innovators in leading change in teaching in universities. This paper briefly describes the array of activities, processes and resources to support distributed leadership in the implementation of a network,…

  10. Technology Solutions | Distributed Generation Interconnection Collaborative

    Science.gov Websites

    Technologies, both hardware and software, can support the wider adoption of distributed generation on the grid. As the penetration of distributed-generation photovoltaics (DGPV) has risen rapidly in recent years, these technologies can help address the challenges posed by high penetrations of distributed PV. Other promising technologies include new utility software.

  11. Discolouration in potable water distribution systems: a review.

    PubMed

    Vreeburg, J H G; Boxall, J B

    2007-02-01

    A large proportion of the customer contacts that drinking water supply companies receive stem from the occurrence of discoloured water. Currently, such complaints are dealt with in a reactive manner. However, water companies are being driven to implement planned activities to control discolouration before contacts occur. Hence, improved understanding of the dominant processes, together with predictive and management tools, is needed. The material responsible for discolouration has a variety of origins, and a range of processes and mechanisms may be associated with its accumulation within distribution systems. Irrespective of material origins, accumulation processes, and mechanisms, discolouration events occur when changes in the system mobilise the accumulations within the network. Despite this conceptual understanding, very few published practicable tools and techniques are available to aid water companies in the planned management and control of discolouration problems. Two recently developed and published, but different, approaches to address this are reviewed here: the PODDS model, which was developed to predict levels of turbidity resulting from a change in hydraulic conditions, but which is semi-empirical and requires calibration; and the resuspension potential method, which was developed to directly measure discolouration resulting from a controlled change in hydraulic conditions, providing a direct assessment of discolouration risk, although intrinsically requiring the limited generation of discoloured water within a live network. Both methods support decision-making on the need for maintenance operations. While risk evaluation and appropriate maintenance can be implemented to control discolouration risk, new material will continue to accumulate, and hence an ongoing programme of maintenance is required.
One sustainable measure to prevent such re-accumulation of material is the adoption of a self-cleaning threshold: a hydraulic force which a pipe experiences on a regular basis that effectively prevents the accumulation of material. This concept has been employed effectively in the design of new networks in the Netherlands. Alternatively, measures could be implemented to limit or prevent particles from entering or being generated within the network, such as improving treatment or preventing the formation of corrosion by-products by lining or replacing ferrous pipes. The cost-benefit of such capex investment or ongoing opex is uncertain, as the quantification and relative significance of the factors that may lead to material accumulation are poorly understood. Hence, this is an area in need of significant further practical research and development.

  12. Scalable hybrid computation with spikes.

    PubMed

    Sarpeshkar, Rahul; O'Halloran, Micah

    2002-09-01

    We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moderate-precision analog units to collectively compute a precise answer to a computation. Second, frequent discrete signal restoration of the analog information prevents analog noise and offset from degrading the computation. And, third, a state machine enables complex computations to be created using a sequence of elementary computations. A natural choice for implementing this hybrid scheme is one based on spikes because spike-count codes are digital, while spike-time codes are analog. We illustrate how spikes afford easy ways to implement all three components of scalable hybrid computation. First, as an important example of distributed analog computation, we show how spikes can create a distributed modular representation of an analog number by implementing digital carry interactions between spiking analog neurons. Second, we show how signal restoration may be performed by recursive spike-count quantization of spike-time codes. And, third, we use spikes from an analog dynamical system to trigger state transitions in a digital dynamical system, which reconfigures the analog dynamical system using a binary control vector; such feedback interactions between analog and digital dynamical systems create a hybrid state machine (HSM). The HSM extends and expands the concept of a digital finite-state-machine to the hybrid domain. We present experimental data from a two-neuron HSM on a chip that implements error-correcting analog-to-digital conversion with the concurrent use of spike-time and spike-count codes. We also present experimental data from silicon circuits that implement HSM-based pattern recognition using spike-time synchrony. 
We outline how HSMs may be used to perform learning, vector quantization, spike pattern recognition and generation, and how they may be reconfigured.

  13. Feedback mechanisms including real-time electronic alerts to achieve near 100% timely prophylactic antibiotic administration in surgical cases.

    PubMed

    Nair, Bala G; Newman, Shu-Fang; Peterson, Gene N; Wu, Wei-Ying; Schwid, Howard A

    2010-11-01

    Administration of prophylactic antibiotics during surgery is generally performed by the anesthesia providers. Timely antibiotic administration within the optimal time window before incision is critical for prevention of surgical site infections. However, this often becomes a difficult task for the anesthesia team during the busy part of a case when the patient is being anesthetized. Starting with the implementation of an anesthesia information management system (AIMS), we designed and implemented several feedback mechanisms to improve compliance with proper antibiotic delivery and documentation. These included generating e-mail feedback on missed documentation, distributing monthly summary reports, and generating real-time electronic alerts with a decision support system. In 20,974 surgical cases for the period June 2008 to January 2010, the interventions of the AIMS installation, e-mail feedback, summary reports, and real-time alerts changed antibiotic compliance by -1.5%, 2.3%, 4.9%, and 9.3%, respectively, compared with the baseline value of 90.0% ± 2.9% when paper anesthesia records were used. The highest antibiotic compliance was achieved with real-time alerts: monthly compliance exceeded 99% for every month between June 2009 and January 2010. Installation of the AIMS itself did not improve antibiotic compliance over that achieved with paper anesthesia records. However, real-time guidance and reminders through electronic messages generated by a computerized decision support system (Smart Anesthesia Messenger, or SAM) improved compliance significantly. With such a system, a consistent compliance of >99% was achieved.
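
    The kind of real-time check such a decision-support messenger performs can be sketched as a timing-window test. The 60-minute window, function name, and message strings below are illustrative assumptions, not SAM's actual logic.

```python
# Illustrative timing-window check for prophylactic antibiotics: alert if
# no dose is documented, or if the dose falls outside the pre-incision
# window. Window length and messages are assumptions for this sketch.
from datetime import datetime, timedelta

def antibiotic_alert(incision_time, antibiotic_time, window_minutes=60):
    """Return an alert string, or None if administration was timely."""
    if antibiotic_time is None:
        return "ALERT: no prophylactic antibiotic documented"
    if not (timedelta(0) <= incision_time - antibiotic_time
            <= timedelta(minutes=window_minutes)):
        return "ALERT: antibiotic given outside the pre-incision window"
    return None

incision = datetime(2010, 1, 15, 9, 30)
print(antibiotic_alert(incision, incision - timedelta(minutes=25)))  # None
print(antibiotic_alert(incision, None))
```

    Evaluating this check continuously against the electronic record, rather than in a retrospective report, is what distinguishes the real-time alert intervention from the e-mail and monthly-summary feedback loops.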

  14. TH-A-19A-04: Latent Uncertainties and Performance of a GPU-Implemented Pre-Calculated Track Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, M; Seuntjens, J; Roberge, D

    Purpose: To assess the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPUs). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implemented on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes under the same conditions. A latent uncertainty analysis was performed by comparing GPU-PMC dose values to a “ground truth” benchmark while varying the track bank size and the number of primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE), and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed gains of 937× and 508× over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve optimal efficiency can be tuned based on the desired uncertainty.
Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360).
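
    The latent-uncertainty idea can be illustrated numerically: when every history recycles tracks from a finite bank, the dose inherits the bank's own sampling error, which shrinks roughly as 1/sqrt(bank size) no matter how many histories are run. The exponential "dose" model below is purely synthetic, not the EGSnrc/GEANT4 banks of the abstract.

```python
# Synthetic illustration of latent uncertainty in a pre-calculated track
# method: the bank-averaged dose can never be closer to the true mean than
# the bank's own sampling error, which falls roughly as 1/sqrt(bank size).
import numpy as np

def latent_error(bank_size, trials=200):
    """Mean absolute error of a bank-averaged 'dose' vs the true mean."""
    rng = np.random.default_rng(42)
    true_mean = 1.0
    errors = []
    for _ in range(trials):
        bank = rng.exponential(true_mean, bank_size)  # pre-generated "tracks"
        errors.append(abs(bank.mean() - true_mean))   # recycling cannot fix this
    return float(np.mean(errors))

small, large = latent_error(100), latent_error(10_000)
print(small > large)   # larger banks -> smaller latent uncertainty: True
```

    This is why the abstract can tune bank size to a target uncertainty: past the point where latent error matches the desired statistical uncertainty, enlarging the bank buys accuracy but costs pre-generation time and GPU memory.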

  15. Required spatial resolution of hydrological models to evaluate urban flood resilience measures

    NASA Astrophysics Data System (ADS)

    Gires, A.; Giangola-Murzyn, A.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

    During a flood in an urban area, several non-linear processes (rainfall, surface runoff, sewer flow, and sub-surface flow) interact. Fully distributed hydrological models are a useful tool for better understanding these complex interactions between natural processes and the built environment, and developing an efficient model is a first step toward improving the understanding of flood resilience in urban areas. Given that the underlying physical phenomena exhibit different relevant scales, determining the required spatial resolution of such a model is a tricky but necessary issue. For instance, the model should be able to properly represent the large-scale effects of local-scale flood resilience measures such as stop logs. The model should also be as simple as possible without being simplistic. In this paper we test two types of model. First, we use an operational semi-distributed model over a 3400 ha peri-urban area located in Seine-Saint-Denis (north-east of Paris). In this model, the area is divided into sub-catchments of average size 17 ha that are considered homogeneous, and only the sewer discharge is modelled. The rainfall data, with a resolution of 1 km in space and 5 min in time, come from the C-band radar of Trappes, located west of Paris and operated by Météo-France. It was shown that the spatial resolution of both the model and the rainfall field did not make it possible to fully capture the small-scale rainfall variability. To achieve this, an ensemble of realistic rainfall fields downscaled to a resolution of 100 m is first generated with the help of multifractal space-time cascades whose characteristic exponents are estimated from the available radar data. The corresponding ensemble of sewer hydrographs is then simulated by inputting each rainfall realization to the model. It appears that the probability distribution of the simulated peak flow exhibits power-law behaviour.
This indicates that there is great uncertainty associated with small-scale rainfall. Second, we focus on a 50 ha catchment of this area and implement Multi-Hydro, a fully distributed urban hydrological model currently being developed at Ecole des Ponts ParisTech (El Tabach et al., 2009). The version used in this paper consists of an interactive coupling between a 2D model representing infiltration and surface runoff (TREX, the Two-dimensional Runoff, Erosion and eXport model; Velleux et al., 2011) and a 1D model of sewer networks (SWMM, the Storm Water Management Model; Rossman, 2007). Spatial resolutions ranging from 2 m to 50 m for land use, topography, and rainfall are tested, with special attention to the impact of small-scale rainfall. To achieve this, the previously mentioned downscaling methodology is applied with rainfall fields downscaled to 10 m in space and 20 s in time. Finally, we discuss the gains generated by the implementation of the fully distributed model.
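
    The downscaling step can be illustrated with a discrete multiplicative cascade: each coarse cell is repeatedly split and multiplied by random weights with unit mean, so mass is conserved on average while small-scale variability is generated. The uniform weight distribution below is an arbitrary stand-in for the calibrated multifractal exponents of the study.

```python
# Toy 1-D multiplicative cascade: refine one coarse rainfall value into
# 2**levels fine cells. Weights average to 1, so the coarse total is
# conserved in expectation. The weight law is illustrative only.
import numpy as np

def cascade_1d(coarse_value, levels, rng):
    field = np.array([coarse_value], dtype=float)
    for _ in range(levels):
        # Each cell splits in two, modulated by unit-mean random weights.
        w = rng.uniform(0.3, 1.7, size=(field.size, 2))
        field = (field[:, None] * w).ravel()
    return field

rng = np.random.default_rng(1)
fine = cascade_1d(10.0, levels=6, rng=rng)   # 2**6 = 64 fine cells
print(fine.size, round(float(fine.mean()), 2))
```

    Generating an ensemble of such realizations and feeding each one to the hydrological model is what produces the distribution of simulated peak flows discussed above.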

  16. Smart grid as a service: a discussion on design issues.

    PubMed

    Chao, Hung-Lin; Tsai, Chen-Chou; Hsiung, Pao-Ann; Chou, I-Hsin

    2014-01-01

    A smart grid allows the integration of distributed renewable energy resources into the conventional electricity distribution grid so that the goals of reducing power cost and environmental pollution can be met through intelligent and efficient matching between power generators and power loads. Currently, this rapidly developing infrastructure is not as "smart" as it should be because it lacks a flexible, scalable, and adaptive structure. As a solution, this work proposes smart grid as a service (SGaaS), which not only allows a smart grid to be composed of basic services, but also allows power users to choose between different services based on their own requirements. The two important issues of service-level agreements and composition of services are also addressed. Finally, we give the details of how SGaaS can be implemented using a FIPA-compliant JADE multiagent system.

  17. Measurement-device-independent quantum key distribution.

    PubMed

    Lo, Hoi-Kwong; Curty, Marcos; Qi, Bing

    2012-03-30

Removing detector side channel attacks has been a notoriously hard problem in quantum cryptography. Here, we propose a simple solution to this problem: measurement-device-independent quantum key distribution (QKD). It not only removes all detector side channels, but also doubles the secure distance achievable with conventional lasers. Our proposal can be implemented with standard optical components with low detection efficiency and highly lossy channels. In contrast to the previous solution of fully device-independent QKD, the realization of our idea does not require detectors of near-unity detection efficiency in combination with a qubit amplifier (based on teleportation) or a quantum nondemolition measurement of the number of photons in a pulse. Furthermore, its key generation rate is many orders of magnitude higher than that based on fully device-independent QKD. The results show that long-distance quantum cryptography over, say, 200 km will remain secure even with seriously flawed detectors.

  18. Construction and Resource Utilization Explorer (CRUX): Implementing Instrument Suite Data Fusion to Characterize Regolith Hydrogen Resources

    NASA Technical Reports Server (NTRS)

    Haldemann, Albert F. C.; Johnson, Jerome B.; Elphic, Richard C.; Boynton, William V.; Wetzel, John

    2006-01-01

    CRUX is a modular suite of geophysical and borehole instruments combined with display and decision support system (MapperDSS) tools to characterize regolith resources, surface conditions, and geotechnical properties. CRUX is a NASA-funded Technology Maturation Program effort to provide enabling technology for Lunar and Planetary Surface Operations (LPSO). The MapperDSS uses data fusion methods with CRUX instruments, and other available data and models, to provide regolith properties information needed for LPSO that cannot be determined otherwise. We demonstrate the data fusion method by showing how it might be applied to characterize the distribution and form of hydrogen using a selection of CRUX instruments: Borehole Neutron Probe and Thermal Evolved Gas Analyzer data as a function of depth help interpret Surface Neutron Probe data to generate 3D information. Secondary information from other instruments along with physical models improves the hydrogen distribution characterization, enabling information products for operational decision-making.

  19. Fully Quantum Fluctuation Theorems

    NASA Astrophysics Data System (ADS)

    Åberg, Johan

    2018-02-01

Systems that are driven out of thermal equilibrium typically dissipate random quantities of energy on microscopic scales. The Crooks fluctuation theorem relates the distribution of these random work costs to the corresponding distribution for the reverse process. By an analysis that explicitly incorporates the energy reservoir that donates the energy and the control system that implements the dynamics, we obtain a quantum generalization of Crooks' theorem that not only includes the energy changes in the reservoir but also the full description of its evolution, including coherences. Moreover, this approach opens up the possibility for generalizations of the concept of fluctuation relations. Here, we introduce "conditional" fluctuation relations that are applicable to nonequilibrium systems, as well as approximate fluctuation relations that allow for the analysis of autonomous evolution generated by global time-independent Hamiltonians. We furthermore extend these notions to Markovian master equations, implicitly modeling the influence of the heat bath.

  20. An approach for generating trajectory-based dynamics which conserves the canonical distribution in the phase space formulation of quantum mechanics. II. Thermal correlation functions.

    PubMed

    Liu, Jian; Miller, William H

    2011-03-14

We show the exact expression of the quantum mechanical time correlation function in the phase space formulation of quantum mechanics. The trajectory-based dynamics that conserves the quantum canonical distribution, equilibrium Liouville dynamics (ELD) proposed in Paper I, is then used to approximately evaluate the exact expression. It gives exact thermal correlation functions (even of nonlinear operators, i.e., nonlinear functions of position or momentum operators) in the classical, high-temperature, and harmonic limits. Various methods have been presented for the implementation of ELD. Numerical tests of the ELD approach in the Wigner or Husimi phase space have been made for a harmonic oscillator and two strongly anharmonic model problems; for each potential, autocorrelation functions of both linear and nonlinear operators have been calculated. The results suggest that ELD can be a potentially useful approach for describing quantum effects in complex condensed-phase systems.

  1. Soft evolution of multi-jet final states

    DOE PAGES

    Gerwick, Erik; Schumann, Steffen; Höche, Stefan; ...

    2015-02-16

We present a new framework for computing resummed and matched distributions in processes with many hard QCD jets. The intricate color structure of soft gluon emission at large angles renders resummed calculations highly non-trivial in this case. We automate all ingredients necessary for the color evolution of the soft function at next-to-leading-logarithmic accuracy, namely the selection of the color bases and the projections of color operators and Born amplitudes onto those bases. Explicit results for all QCD processes with up to 2 → 5 partons are given. We also devise a new tree-level matching scheme for resummed calculations which exploits a quasi-local subtraction based on the Catani-Seymour dipole formalism. We implement both resummation and matching in the Sherpa event generator. As a proof of concept, we compute the resummed and matched transverse-thrust distribution for hadronic collisions.

  2. Smart Grid as a Service: A Discussion on Design Issues

    PubMed Central

    Tsai, Chen-Chou; Chou, I-Hsin

    2014-01-01

Smart grid allows the integration of distributed renewable energy resources into the conventional electricity distribution grid so that the goals of reducing power cost and environmental pollution can be met through an intelligent and efficient matching between power generators and power loads. Currently, this rapidly developing infrastructure is not as “smart” as it should be because it lacks a flexible, scalable, and adaptive structure. As a solution, this work proposes smart grid as a service (SGaaS), which not only allows a smart grid to be composed out of basic services, but also allows power users to choose between different services based on their own requirements. The two important issues of service-level agreements and composition of services are also addressed in this work. Finally, we give the details of how SGaaS can be implemented using a FIPA-compliant JADE multiagent system. PMID:25243214

  3. Compact Spreader Schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Placidi, M.; Jung, J. -Y.; Ratti, A.

    2014-07-25

This paper describes beam distribution schemes adopting a novel implementation based on low amplitude vertical deflections combined with horizontal ones generated by Lambertson-type septum magnets. This scheme offers substantial compactness in the longitudinal layouts of the beam lines and increased flexibility for beam delivery of multiple beam lines on a shot-to-shot basis. Fast kickers (FK) or transverse electric field RF deflectors (RFD) provide the low amplitude deflections. Initially proposed at the Stanford Linear Accelerator Center (SLAC) as tools for beam diagnostics and more recently adopted for multiline beam pattern schemes, RFDs offer repetition capabilities and a likely better amplitude reproducibility when compared to FKs, which, in turn, involve more modest costs in both construction and operation. Both solutions represent an ideal approach for the design of compact beam distribution systems resulting in space and cost savings while preserving flexibility and beam quality.

  4. Laser Calibration of an Impact Disdrometer

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Kasparis, Takis; Metzger, Philip T.; Jones, W. Linwood

    2014-01-01

    A practical approach to developing an operational low-cost disdrometer hinges on implementing an effective in situ adaptive calibration strategy. This calibration strategy lowers the cost of the device and provides a method to guarantee continued automatic calibration. In previous work, a collocated tipping bucket rain gauge was utilized to provide a calibration signal to the disdrometer's digital signal processing software. Rainfall rate is proportional to the 11/3 moment of the drop size distribution (a 7/2 moment can also be assumed, depending on the choice of terminal velocity relationship). In the previous case, the disdrometer calibration was characterized and weighted to the 11/3 moment of the drop size distribution (DSD). Optical extinction by rainfall is proportional to the 2nd moment of the DSD. Using visible laser light as a means to focus and generate an auxiliary calibration signal, the adaptive calibration processing is significantly improved.
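
    The moment relationships above lend themselves to a quick numerical check. The sketch below computes the 2nd and 11/3 moments of a drop size distribution, which the optical-extinction and rainfall-rate signals are respectively proportional to. The exponential (Marshall-Palmer-style) DSD and its N0 and Lambda values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def dsd_moment(diameters, number_density, n):
    """Approximate the n-th moment M_n = integral of N(D) * D^n dD of a
    drop size distribution with a rectangle rule (uniform grid assumed)."""
    dD = diameters[1] - diameters[0]
    return float(np.sum(number_density * diameters**n) * dD)

# Illustrative exponential DSD: N(D) = N0 * exp(-Lambda * D)
D = np.linspace(0.1, 8.0, 2000)      # drop diameter, mm
N0, lam = 8000.0, 2.3                # assumed intercept (m^-3 mm^-1) and slope (mm^-1)
N = N0 * np.exp(-lam * D)

M2 = dsd_moment(D, N, 2.0)           # optical extinction scales with M_2
M113 = dsd_moment(D, N, 11.0 / 3.0)  # rainfall rate scales with M_{11/3}
```

    For an exponential DSD the moments have the closed form M_n = N0 * Gamma(n+1) / Lambda^(n+1), which provides a convenient cross-check of the numerical integration.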

  5. Computer Generated Holography with Intensity-Graded Patterns

    PubMed Central

    Conti, Rossella; Assayag, Osnath; de Sars, Vincent; Guillon, Marc; Emiliani, Valentina

    2016-01-01

Computer Generated Holography achieves patterned illumination at the sample plane through phase modulation of the laser beam at the objective back aperture. This is obtained by using liquid crystal-based spatial light modulators (LC-SLMs), which modulate the spatial phase of the incident laser beam. A variety of algorithms is employed to calculate the phase modulation masks addressed to the LC-SLM. These algorithms range from simple gratings-and-lenses approaches that generate multiple diffraction-limited spots, to iterative Fourier-transform algorithms capable of generating arbitrary illumination shapes tailored to the target contour. Applications of holographic light patterning include multi-trap optical tweezers, patterned voltage imaging, and optical control of neuronal excitation using uncaging or optogenetics. Past implementations of computer generated holography used binary input profiles to generate binary light distributions at the sample plane. Here we demonstrate that using graded input sources enables the generation of intensity-graded light patterns and extends the range of application of holographic light illumination. First, we use intensity-graded holograms to compensate for the position-dependent diffraction efficiency of the LC-SLM or for sample fluorescence inhomogeneity. Finally, we show that intensity-graded holography can be used to equalize photo-evoked currents from cells expressing different levels of channelrhodopsin-2 (ChR2), one of the most commonly used optogenetic light-gated channels, taking into account the non-linear dependence of channel opening on incident light. PMID:27799896
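
    The iterative Fourier-transform approach mentioned above is commonly realized with the Gerchberg-Saxton algorithm. A minimal sketch follows; it is a generic illustration with a hypothetical two-spot target, not the authors' implementation:

```python
import numpy as np

def gerchberg_saxton(target_intensity, iterations=50, seed=0):
    """Gerchberg-Saxton iterative Fourier-transform algorithm: find a
    pure-phase mask whose far-field intensity approximates the target."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_intensity.shape)
    for _ in range(iterations):
        slm_field = np.exp(1j * phase)                       # unit amplitude at the SLM
        sample = np.fft.fft2(slm_field)                      # propagate to sample plane
        sample = target_amp * np.exp(1j * np.angle(sample))  # impose target amplitude
        phase = np.angle(np.fft.ifft2(sample))               # back-propagate, keep phase
    return phase

# Hypothetical target: two diffraction-limited spots (e.g. two optical traps).
target = np.zeros((64, 64))
target[16, 16] = 1.0
target[48, 40] = 1.0
mask = gerchberg_saxton(target)
reconstruction = np.abs(np.fft.fft2(np.exp(1j * mask))) ** 2
```

    Intensity-graded patterns, as in the paper, correspond to target arrays with non-binary values; the same amplitude-constraint step then shapes the relative brightness of each spot.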

  6. 78 FR 16825 - Approval of Air Quality Implementation Plans; Navajo Nation; Regional Haze Requirements for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-19

    ... Quality Implementation Plans; Navajo Nation; Regional Haze Requirements for Navajo Generating Station... source-specific federal implementation plan (FIP) requiring the Navajo Generating Station (NGS), located... . SUPPLEMENTARY INFORMATION: Throughout this document, ``we'', ``us'', and ``our'' refer to EPA. Table of Contents...

  7. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme growth of next-generation sequencing output has created a shortage of efficient alignment approaches for ultra-large sets of biological sequences of different types. Distributed and parallel computing is a crucial technique for accelerating ultra-large (e.g., files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient tool, HAlign-II, to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets with files larger than 1 GB showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure; its open-source code and datasets are available at http://lab.malab.cn/soft/halign.

  8. Eddington's demon: inferring galaxy mass functions and other distributions from uncertain data

    NASA Astrophysics Data System (ADS)

    Obreschkow, D.; Murray, S. G.; Robotham, A. S. G.; Westmeier, T.

    2018-03-01

    We present a general modified maximum likelihood (MML) method for inferring generative distribution functions from uncertain and biased data. The MML estimator is identical to, but easier and many orders of magnitude faster to compute than the solution of the exact Bayesian hierarchical modelling of all measurement errors. As a key application, this method can accurately recover the mass function (MF) of galaxies, while simultaneously dealing with observational uncertainties (Eddington bias), complex selection functions and unknown cosmic large-scale structure. The MML method is free of binning and natively accounts for small number statistics and non-detections. Its fast implementation in the R-package dftools is equally applicable to other objects, such as haloes, groups, and clusters, as well as observables other than mass. The formalism readily extends to multidimensional distribution functions, e.g. a Choloniewski function for the galaxy mass-angular momentum distribution, also handled by dftools. The code provides uncertainties and covariances for the fitted model parameters and approximate Bayesian evidences. We use numerous mock surveys to illustrate and test the MML method, as well as to emphasize the necessity of accounting for observational uncertainties in MFs of modern galaxy surveys.
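
    The paper's MML estimator is implemented in the R package dftools. As a much simpler illustration of the binning-free, likelihood-based fitting it builds on, the sketch below uses the closed-form maximum-likelihood estimate of a continuous power-law slope on mock samples; the sampling setup and parameter values are illustrative, not the paper's method or data:

```python
import numpy as np

def powerlaw_mle_slope(x, xmin):
    """Closed-form MLE of alpha for p(x) proportional to x^(-alpha), x >= xmin.
    No binning is involved: the estimate uses every sample directly."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

# Mock data with a known slope, drawn by inverse-transform sampling.
rng = np.random.default_rng(1)
alpha_true, xmin = 2.5, 1.0
u = rng.uniform(size=20000)
samples = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = powerlaw_mle_slope(samples, xmin)
```

    The MML method goes well beyond this toy example by folding measurement errors and selection functions into the likelihood, but the underlying principle of fitting the generative distribution directly to unbinned data is the same.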

  9. A Simulation Framework for Battery Cell Impact Safety Modeling Using LS-DYNA

    DOE PAGES

    Marcicki, James; Zhu, Min; Bartlett, Alexander; ...

    2017-02-04

The development process of electrified vehicles can benefit significantly from computer-aided engineering tools that predict the multiphysics response of batteries during abusive events. A coupled structural, electrical, electrochemical, and thermal model framework has been developed within the commercially available LS-DYNA software. The finite element model leverages a three-dimensional mesh structure that fully resolves the unit cell components. The mechanical solver predicts the distributed stress and strain response with failure thresholds leading to the onset of an internal short circuit. In this implementation, an arbitrary compressive strain criterion is applied locally to each unit cell. A spatially distributed equivalent circuit model provides an empirical representation of the electrochemical response with minimal computational complexity. The thermal model provides state information to index the electrical model parameters, while simultaneously accepting irreversible and reversible sources of heat generation. The spatially distributed models of the electrical and thermal dynamics allow for the localization of current density and corresponding temperature response. The ability to predict the distributed thermal response of the cell as its stored energy is completely discharged through the short circuit enables an engineering safety assessment. A parametric analysis of an exemplary model is used to demonstrate the simulation capabilities.

  10. Next-generation purex flowsheets with acetohydroxamic acid as complexant for FBR and thermal-fuel reprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Shekhar; Koganti, S.B.

    2008-07-01

Acetohydroxamic acid (AHA) is a novel complexant for the recycle of nuclear-fuel materials. It can be used in ordinary centrifugal extractors, eliminating the need for electro-redox equipment or complex maintenance requirements in a remotely maintained hot cell. In this work, the effect of AHA on Pu(IV) distribution ratios in a 30% TBP system was quantified, modeled, and integrated in the SIMPSEX code. Two sets of batch experiments involving macro Pu concentrations (conducted at IGCAR) and one high-Pu flowsheet (literature) were simulated for AHA-based U-Pu separation. Based on the simulation and validation results, AHA-based next-generation reprocessing flowsheets are proposed for co-processing-based FBR and thermal-fuel reprocessing as well as the evaporator-less macro-level Pu concentration process required for MOX fuel fabrication. Utilization of AHA results in significant simplification in plant design and simpler technology implementations with significant cost savings. (authors)

  11. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
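
    The decentralized-generation, centralized-coordination pattern described above can be sketched in miniature. This is a structural illustration only (with deterministic stand-in event times), not GATE's actual factory classes:

```python
import heapq
from concurrent.futures import ThreadPoolExecutor

def generate_events(worker_id, n_events):
    """Each worker generates its own time-stamped event stream independently,
    a stand-in for decentralized Monte Carlo event generation."""
    return [(0.1 * i + 0.01 * worker_id, worker_id) for i in range(n_events)]

# Decentralized generation: four workers produce event streams in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(generate_events, w, 100) for w in range(4)]
    chunks = [f.result() for f in futures]

# Centralized time coordinator: merge the per-worker streams into one
# globally time-ordered stream, as coincidence processing requires.
events = list(heapq.merge(*chunks))
```

    The key point the model exploits is that only the merge step needs global time ordering; event generation itself carries no cross-worker dependencies, so it parallelizes with minimal communication overhead.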

  12. Digital diffraction analysis enables low-cost molecular diagnostics on a smartphone

    PubMed Central

    Im, Hyungsoon; Castro, Cesar M.; Shao, Huilin; Liong, Monty; Song, Jun; Pathania, Divya; Fexon, Lioubov; Min, Changwook; Avila-Wallace, Maria; Zurkiya, Omar; Rho, Junsung; Magaoay, Brady; Tambouret, Rosemary H.; Pivovarov, Misha; Weissleder, Ralph; Lee, Hakho

    2015-01-01

    The widespread distribution of smartphones, with their integrated sensors and communication capabilities, makes them an ideal platform for point-of-care (POC) diagnosis, especially in resource-limited settings. Molecular diagnostics, however, have been difficult to implement in smartphones. We herein report a diffraction-based approach that enables molecular and cellular diagnostics. The D3 (digital diffraction diagnosis) system uses microbeads to generate unique diffraction patterns which can be acquired by smartphones and processed by a remote server. We applied the D3 platform to screen for precancerous or cancerous cells in cervical specimens and to detect human papillomavirus (HPV) DNA. The D3 assay generated readouts within 45 min and showed excellent agreement with gold-standard pathology or HPV testing, respectively. This approach could have favorable global health applications where medical access is limited or when pathology bottlenecks challenge prompt diagnostic readouts. PMID:25870273

  13. Constraining anomalous Higgs boson couplings to the heavy-flavor fermions using matrix element techniques

    NASA Astrophysics Data System (ADS)

    Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng

    2016-09-01

In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes pp → tt̄+H, bb̄+H, tq+H, and pp → H → τ⁺τ⁻, and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the pp → tt̄+H process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.

  14. Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu

    2015-01-01

The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation, and property prediction in a fast and efficient manner. It also provides bioinformatics functionalities including sequence alignment, active site pose prediction, and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction, and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem-solving cloud environment for virtual screening studies that also provides workflow management, better usability, and interaction with end users using container-based virtualization (OpenVZ).

  15. Realization of multiple orbital angular momentum modes simultaneously through four-dimensional antenna arrays.

    PubMed

    Sun, Chao; Yang, Shiwen; Chen, Yikai; Guo, Jixin; Qu, Shiwei

    2018-01-09

Electromagnetic waves carrying orbital angular momentum (OAM) in the radio frequency range have drawn great attention owing to their potential applications in increasing communication capacity. In this paper, both single-pole single-throw (SPST) switches and single-pole double-throw (SPDT) switches are designed and implemented. An optimal time sequence allows a four-dimensional (4-D) circular antenna array to generate multiple OAM-carrying waves as well as enhance the field intensity of each OAM-carrying wave. A novel experimental platform is developed to measure the phase distribution when the transmitting antenna and the receiving antenna operate at different frequencies. The good agreement between the measurement and simulation results demonstrates that a 4-D circular antenna array is able to generate multiple OAM modes simultaneously. Furthermore, the superiority of the 4-D circular antenna array in receiving and demodulating multiple OAM-carrying signals is validated through filter and bit error rate (BER) simulations.
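
    For a uniform circular array, the excitation rule behind OAM generation is simple: to radiate mode l, the element at azimuth φ receives phase l·φ, so the phase front wraps l times around the axis. A minimal sketch of that rule (an idealized phased-array view, not the paper's switch-based 4-D time-modulated implementation):

```python
import numpy as np

def oam_element_phases(n_elements, mode):
    """Feed phases for a uniform circular array radiating OAM mode l:
    element k sits at azimuth phi_k = 2*pi*k/N and is excited with
    phase l * phi_k (wrapped to [0, 2*pi))."""
    phi = 2.0 * np.pi * np.arange(n_elements) / n_elements
    return np.mod(mode * phi, 2.0 * np.pi)

phases_l1 = oam_element_phases(8, 1)  # one full 0..2*pi ramp around the ring
phases_l2 = oam_element_phases(8, 2)  # phase front wraps twice for l = 2
```

    Sampling theory imposes |l| < N/2 for an N-element ring; the time sequences in the paper serve to realize several such phase ramps simultaneously with switched elements.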

  16. HEALPix: A Framework for High-Resolution Discretization and Fast Analysis of Data Distributed on the Sphere

    NASA Technical Reports Server (NTRS)

Gorski, K. M.; Hivon, Eric; Banday, A. J.; Wandelt, Benjamin D.; Hansen, Frode K.; Reinecke, Martin; Bartelmann, Matthias

    2005-01-01

HEALPix, the Hierarchical Equal Area isoLatitude Pixelization, is a versatile structure for the pixelization of data on the sphere. An associated library of computational algorithms and visualization software supports fast scientific applications executable directly on discretized spherical maps generated from very large volumes of astronomical data. Originally developed to address the data processing and analysis needs of the present generation of cosmic microwave background experiments (e.g., BOOMERANG, WMAP), HEALPix can be expanded to meet many of the profound challenges that will arise in confrontation with the observational output of future missions and experiments, including, e.g., Planck, Herschel, SAFIR, and the Beyond Einstein inflation probe. In this paper we consider the requirements and implementation constraints on a framework that simultaneously enables an efficient discretization with associated hierarchical indexation and fast analysis/synthesis of functions defined on the sphere. We demonstrate how these are explicitly satisfied by HEALPix.
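
    Two defining properties of HEALPix are easy to state: a resolution parameter Nside yields Npix = 12 · Nside² pixels, and every pixel covers exactly the same solid angle, 4π / Npix steradians. A minimal sketch of this arithmetic (full pixel indexing and spherical-harmonic analysis live in libraries such as healpy):

```python
import math

def healpix_npix(nside):
    """Number of pixels in a HEALPix tessellation at resolution Nside."""
    return 12 * nside * nside

def healpix_pixel_area_sr(nside):
    """Equal-area property: every pixel covers 4*pi / Npix steradians."""
    return 4.0 * math.pi / healpix_npix(nside)

npix = healpix_npix(512)            # a commonly used CMB map resolution
area = healpix_pixel_area_sr(512)   # identical for all 3,145,728 pixels
```

    The equal-area property is what makes operations like computing map moments or white-noise covariances trivially uniform across the sphere, while the isolatitude layout is what enables the fast spherical-harmonic transforms mentioned above.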

  17. Ultracapacitors for fuel saving in small size hybrid vehicles

    NASA Astrophysics Data System (ADS)

    Solero, L.; Lidozzi, A.; Serrao, V.; Martellucci, L.; Rossi, E.

The main purpose of the paper is to describe a small size hybrid vehicle having ultracapacitors as its on-board storage unit. The vehicle's on-board main power supply is a genset formed by a 250 cm³ internal combustion engine and a permanent magnet synchronous electric generator, whereas four 16 V, 500 F ultracapacitor modules are connected in series in order to supply and store the power peaks during vehicle acceleration and braking, respectively. The traction power is provided by a permanent magnet synchronous electric motor, and a distributed power electronic interface is in charge of all the required electronic conversions as well as of controlling the operating conditions for each power unit. The paper discusses the implemented control strategy and shows experimental results on the modes of operation of both the generation unit and the storage unit.
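
    The series ultracapacitor stack described above can be checked with elementary capacitor arithmetic. The sketch assumes identical, fully charged modules at their rated values, which is an idealization of the hardware in the paper:

```python
def series_capacitance(c_module, n_modules):
    """Identical capacitors in series: total capacitance is C / n."""
    return c_module / n_modules

def stored_energy_joules(c_total, voltage):
    """Energy stored in a capacitor: E = 1/2 * C * V^2."""
    return 0.5 * c_total * voltage**2

c_stack = series_capacitance(500.0, 4)  # four 500 F modules in series -> 125 F
v_stack = 4 * 16.0                      # series voltages add -> 64 V
energy = stored_energy_joules(c_stack, v_stack)
```

    At the rated 64 V the stack holds 256 kJ, and since usable energy scales with V², discharging down to half voltage still releases three quarters of it, which is why ultracapacitors suit short acceleration and regenerative-braking peaks rather than bulk energy storage.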

  18. Digital diffraction analysis enables low-cost molecular diagnostics on a smartphone.

    PubMed

    Im, Hyungsoon; Castro, Cesar M; Shao, Huilin; Liong, Monty; Song, Jun; Pathania, Divya; Fexon, Lioubov; Min, Changwook; Avila-Wallace, Maria; Zurkiya, Omar; Rho, Junsung; Magaoay, Brady; Tambouret, Rosemary H; Pivovarov, Misha; Weissleder, Ralph; Lee, Hakho

    2015-05-05

    The widespread distribution of smartphones, with their integrated sensors and communication capabilities, makes them an ideal platform for point-of-care (POC) diagnosis, especially in resource-limited settings. Molecular diagnostics, however, have been difficult to implement in smartphones. We herein report a diffraction-based approach that enables molecular and cellular diagnostics. The D3 (digital diffraction diagnosis) system uses microbeads to generate unique diffraction patterns which can be acquired by smartphones and processed by a remote server. We applied the D3 platform to screen for precancerous or cancerous cells in cervical specimens and to detect human papillomavirus (HPV) DNA. The D3 assay generated readouts within 45 min and showed excellent agreement with gold-standard pathology or HPV testing, respectively. This approach could have favorable global health applications where medical access is limited or when pathology bottlenecks challenge prompt diagnostic readouts.

  19. Distributed Energy Implementation Options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Chandralata N

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  20. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

A tuple-space-based, object-oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed, depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms, as in the Linda model, as well as using more widely known message-passing mechanisms. An implementation of the model is presented, describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
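
    The Linda-style tuple space communication mentioned above can be sketched minimally: agents coordinate by writing tuples and pattern-matching them, never addressing each other directly. This is an illustrative toy with hypothetical field names, not the database-backed implementation of the paper:

```python
import threading

class TupleSpace:
    """Minimal Linda-style tuple space with out (write), rd (read),
    and take (destructive read, Linda's 'in')."""

    def __init__(self):
        self._tuples = []
        self._lock = threading.Lock()

    def out(self, tup):
        """Write a tuple into the space."""
        with self._lock:
            self._tuples.append(tup)

    @staticmethod
    def _match(tup, pattern):
        # None in a pattern position acts as a wildcard field.
        return len(tup) == len(pattern) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):
        """Non-destructive read of the first matching tuple, or None."""
        with self._lock:
            return next((t for t in self._tuples if self._match(t, pattern)), None)

    def take(self, pattern):
        """Destructive read: remove and return the first match, or None."""
        with self._lock:
            for i, t in enumerate(self._tuples):
                if self._match(t, pattern):
                    return self._tuples.pop(i)
            return None

space = TupleSpace()
space.out(("sensor", "bus_a", 120.5))       # hypothetical power-bus reading
reading = space.rd(("sensor", "bus_a", None))
```

    Because agents only ever see tuples, the same code works whether the space sits on one database or is distributed across several, which is the decoupling the architecture above relies on.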
