Sample records for halving-based hash function

  1. Hash function based on chaotic map lattices.

    PubMed

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating-point computation of chaos with some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, which gives the system the desired statistical properties and strong collision resistance. The chaos-based hash function has advantages in high security and fast performance, and it serves as one of the most competitive candidates for practical hash function applications in software realization and secure information communication in computer networks.

  2. Hash function based on chaotic map lattices

    NASA Astrophysics Data System (ADS)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating-point computation of chaos with some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, which gives the system the desired statistical properties and strong collision resistance. The chaos-based hash function has advantages in high security and fast performance, and it serves as one of the most competitive candidates for practical hash function applications in software realization and secure information communication in computer networks.

  3. Collision analysis of one kind of chaos-based hash function

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Peng, Wenbing; Liao, Xiaofeng; Xiang, Tao

    2010-02-01

    In the last decade, various chaos-based hash functions have been proposed. Nevertheless, the corresponding analyses of them lag far behind. In this Letter, we first take a chaos-based hash function proposed very recently in Amin, Faragallah and Abd El-Latif (2009) [11] as an example to analyze its computational collision problem, and then generalize the construction method of one kind of chaos-based hash function and summarize some precautions for avoiding the collision problem. This is beneficial to future chaos-based hash function design.

  4. Perceptual Audio Hashing Functions

    NASA Astrophysics Data System (ADS)

    Özer, Hamza; Sankur, Bülent; Memon, Nasir; Anarım, Emin

    2005-12-01

    Perceptual hash functions provide a tool for fast and reliable identification of content. We present new audio hash functions based on summarization of the time-frequency spectral characteristics of an audio document. The proposed hash functions are based on the periodicity series of the fundamental frequency and on singular-value description of the cepstral frequencies. They are found, on one hand, to perform very satisfactorily in identification and verification tests, and on the other hand, to be very resilient to a large variety of attacks. Moreover, we address the issue of security of hashes and propose a keying technique, and thereby a key-dependent hash function.

  5. A cryptographic hash function based on chaotic network automata

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Bruno, Odemir M.

    2017-12-01

    Chaos theory has been used to develop several cryptographic methods relying on the pseudo-random properties extracted from simple nonlinear systems such as cellular automata (CA). Cryptographic hash functions (CHF) are commonly used to check data integrity. CHF "compress" arbitrarily long messages (input) into much smaller representations called hash values or message digests (output), designed to prevent recovery of the original message from the hash value. This paper proposes a chaos-based CHF inspired by an encryption method based on chaotic CA rule B1357-S2468. We propose a hybrid model that combines CA and networks, called chaotic network automata (CNA), whose chaotic spatio-temporal outputs are used to compute a hash value. Following the Merkle–Damgård model of construction, a portion of the message is entered as the initial condition of the network automata, and the remaining parts of the message are iteratively entered to perturb the system. The chaotic network automata shuffle the message using flexible control parameters, so that the generated hash value is highly sensitive to the message. As demonstrated in our experiments, the proposed model has excellent pseudo-randomness and sensitivity properties with acceptable performance when compared to conventional hash functions.
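
    The iteration scheme described above can be sketched with a toy stand-in: a minimal Merkle–Damgård-style hash driven by the logistic map instead of the paper's network automata (the function names, round counts and bit-extraction rule below are illustrative assumptions, not the authors' implementation):

```python
def logistic(x, r=3.99):
    # Fully chaotic regime of the logistic map for r close to 4.
    return r * x * (1.0 - x)

def chaotic_compress(state, block, rounds=32):
    # Mix an 8-byte block into a float state by perturbing a chaotic orbit.
    x = state
    for byte in block:
        x = ((x + byte / 256.0) % 1.0) or 0.5   # keep x inside (0, 1)
        for _ in range(rounds):
            x = logistic(x)
    return x

def chaotic_hash(message: bytes, digest_bits=64) -> int:
    # Merkle–Damgård-style chaining: each block perturbs the orbit
    # left by the previous block, as in the paper's outline.
    padded = message + b"\x80" + b"\x00" * (-(len(message) + 1) % 8)
    state = 0.5
    for i in range(0, len(padded), 8):
        state = chaotic_compress(state, padded[i:i + 8])
    # Extract digest bits by thresholding further iterates of the orbit.
    digest = 0
    for _ in range(digest_bits):
        state = logistic(state)
        digest = (digest << 1) | (1 if state > 0.5 else 0)
    return digest
```

    A single-byte change in the message perturbs the orbit early, and the chaotic iteration amplifies the difference into an unrelated digest.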

  6. Gene function prediction based on Gene Ontology Hierarchy Preserving Hashing.

    PubMed

    Zhao, Yingwen; Fu, Guangyuan; Wang, Jun; Guo, Maozu; Yu, Guoxian

    2018-02-23

    Gene Ontology (GO) uses structured vocabularies (or terms) to describe the molecular functions, biological roles, and cellular locations of gene products in a hierarchical ontology. GO annotations associate genes with GO terms and indicate that the given gene products carry out the biological functions described by the relevant terms. However, predicting correct GO annotations for genes from the massive set of GO terms defined by GO is a difficult challenge. To address this challenge, we introduce a Gene Ontology Hierarchy Preserving Hashing (HPHash) based semantic method for gene function prediction. HPHash first measures the taxonomic similarity between GO terms. It then uses a hierarchy-preserving hashing technique to keep the hierarchical order between GO terms, and to optimize a series of hashing functions to encode massive GO terms via compact binary codes. After that, HPHash utilizes these hashing functions to project the gene-term association matrix into a low-dimensional one and performs semantic similarity based gene function prediction in the low-dimensional space. Experimental results on three model species (Homo sapiens, Mus musculus and Rattus norvegicus) for interspecies gene function prediction show that HPHash performs better than other related approaches and is robust to the number of hash functions. In addition, we also take HPHash as a plugin for BLAST-based gene function prediction. From the experimental results, HPHash again significantly improves the prediction performance. The code for HPHash is available at: http://mlda.swu.edu.cn/codes.php?name=HPHash.

  7. A more secure parallel keyed hash function based on chaotic neural network

    NASA Astrophysics Data System (ADS)

    Huang, Zhongquan

    2011-08-01

    Although various hash functions based on chaos or chaotic neural networks have been proposed, most of them cannot work efficiently in a parallel computing environment. Recently, an algorithm for parallel keyed hash function construction based on a chaotic neural network was proposed [13]. However, there is a strict limitation in this scheme: its secret keys must be nonces. In other words, if the keys are used more than once in this scheme, there will be potential security flaws. In this paper, we analyze the cause of the original scheme's vulnerability in detail, and then propose corresponding enhancement measures, which remove the limitation on the secret keys. Theoretical analysis and computer simulation indicate that the modified hash function is more secure and practical than the original one. At the same time, it keeps the parallel merit and satisfies the other performance requirements of a hash function, such as good statistical properties, high message and key sensitivity, and strong collision resistance.

  8. Collision attack against Tav-128 hash function

    NASA Astrophysics Data System (ADS)

    Hariyanto, Fajar; Hayat Susanti, Bety

    2017-10-01

    Tav-128 is a hash function designed for Radio Frequency Identification (RFID) authentication protocols. Tav-128 is expected to be a cryptographically secure hash function that meets collision resistance properties. In this research, a collision attack is performed to test whether Tav-128 is a collision resistant hash function. The results show that collisions can be obtained in the Tav-128 hash function, which means that Tav-128 is not a collision resistant hash function.

  9. A Simple Secure Hash Function Scheme Using Multiple Chaotic Maps

    NASA Astrophysics Data System (ADS)

    Ahmad, Musheer; Khurana, Shruti; Singh, Sushmita; AlSharari, Hamed D.

    2017-06-01

    Chaotic maps possess high parameter sensitivity, random-like behavior and one-way computations, which favor the construction of cryptographic hash functions. In this paper, we present a novel hash function scheme that uses multiple chaotic maps to generate efficient variable-sized hash values. The message is divided into four parts, and each part is processed by a different 1D chaotic map unit, yielding an intermediate hash code. The four codes are concatenated into two blocks, then each block is processed through a 2D chaotic map unit separately. The final hash value is generated by combining the two partial hash codes. Simulation analyses such as the distribution of hashes, statistical properties of confusion and diffusion, message and key sensitivity, collision resistance and flexibility are performed. The results reveal that the proposed hash scheme is simple and efficient and holds comparable capabilities when compared with some recent chaos-based hash algorithms.

  10. Image encryption algorithm based on multiple mixed hash functions and cyclic shift

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Zhu, Xiaoqiang; Wu, Xiangjun; Zhang, Yingqian

    2018-08-01

    This paper proposes a new one-time-pad scheme for chaotic image encryption that is based on multiple mixed hash functions and a cyclic-shift function. The initial value is generated using both information from the plaintext image and chaotic sequences, which are calculated from the SHA1 and MD5 hash algorithms. The scrambling sequences are generated by nonlinear equations and the logistic map. This paper aims to remedy the deficiencies of the traditional Baptista algorithm and its improved variants. We employ the cyclic-shift function and piecewise linear chaotic maps (PWLCM), which give each shift number the characteristics of chaos, to diffuse the image. Experimental results and security analysis show that the new scheme has better security and can resist common attacks.
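
    The general idea of deriving a plaintext-dependent chaotic seed from standard hash digests can be sketched as follows; the mixing rule and function names are assumptions for illustration, not the paper's exact construction:

```python
import hashlib

def initial_value_from_plaintext(image_bytes: bytes) -> float:
    # Derive a logistic-map seed in (0, 1) from both the SHA-1 and MD5
    # digests of the plaintext, so the keystream depends on the image
    # (a one-time-pad-style design in the spirit of the scheme above).
    h1 = hashlib.sha1(image_bytes).digest()
    h2 = hashlib.md5(image_bytes).digest()
    mixed = bytes(a ^ b for a, b in zip(h1, h2))   # XOR of first 16 bytes
    n = int.from_bytes(mixed, "big")
    return ((n % (10**15)) / 10**15) or 0.5        # map into (0, 1)

def logistic_keystream(x0: float, length: int, r=3.99) -> bytes:
    # Iterate the logistic map and quantize each state to one byte.
    x, out = x0, bytearray()
    for _ in range(length):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)
```

    XORing such a keystream with the pixel stream gives a plaintext-dependent diffusion layer; the paper additionally applies cyclic shifts and PWLCM, which are omitted here.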

  11. SGFSC: speeding the gene functional similarity calculation based on hash tables.

    PubMed

    Tian, Zhen; Wang, Chunyu; Guo, Maozu; Liu, Xiaoyan; Teng, Zhixia

    2016-11-04

    In recent years, many measures of gene functional similarity have been proposed and widely used in a wide range of essential research. These methods are mainly divided into two categories: pairwise approaches and group-wise approaches. However, a common problem with these methods is their time consumption, especially when measuring the gene functional similarities of a large number of gene pairs. The problem of computational efficiency for pairwise approaches is even more prominent because they depend on combinations of semantic similarities. Therefore, the efficient measurement of gene functional similarity remains a challenging problem. To speed up current gene functional similarity calculation methods, a novel two-step computing strategy is proposed: (1) establish a hash table for each method to store essential information obtained from the Gene Ontology (GO) graph and (2) measure gene functional similarity based on the corresponding hash table. There is no need to traverse the GO graph repeatedly for each method with the help of the hash table. The analysis of time complexity shows that the computational efficiency of these methods is significantly improved. We also implement a novel Speeding Gene Functional Similarity Calculation tool, namely SGFSC, which is bundled with seven typical measures using our proposed strategy. Further experiments show the great advantage of SGFSC in measuring gene functional similarity on the whole genomic scale. The proposed strategy is successful in speeding up current gene functional similarity calculation methods. SGFSC is an efficient tool that is freely available at http://nclab.hit.edu.cn/SGFSC . The source code of SGFSC can be downloaded from http://pan.baidu.com/s/1dFFmvpZ .
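
    The two-step strategy can be illustrated with a toy hash-table cache over a small DAG: one traversal fills the table, and every later similarity query is pure lookups. The ancestor-set Jaccard measure here is a simple stand-in for the seven measures bundled in SGFSC:

```python
def build_ancestor_table(parents):
    # Step 1: cache each term's ancestor set in a hash table so that
    # similarity queries never re-walk the GO-style DAG.
    table = {}
    def ancestors(term):
        if term not in table:
            acc = set()
            for p in parents.get(term, ()):
                acc.add(p)
                acc |= ancestors(p)
            table[term] = acc
        return table[term]
    for term in parents:
        ancestors(term)
    return table

def term_similarity(t1, t2, table):
    # Step 2: Jaccard overlap of the cached ancestor sets, computed
    # entirely from the table (no graph traversal).
    a = table[t1] | {t1}
    b = table[t2] | {t2}
    return len(a & b) / len(a | b)
```

    Building the table is a one-time cost, after which each of the many gene-pair queries avoids the repeated graph traversals that dominate the pairwise approaches.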

  12. Text image authenticating algorithm based on MD5-hash function and Henon map

    NASA Astrophysics Data System (ADS)

    Wei, Jinqiao; Wang, Ying; Ma, Xiaoxue

    2017-07-01

    In order to meet the evidentiary requirements of text images, this paper proposes a fragile watermarking algorithm based on a hash function and the Henon map. The algorithm divides a text image into blocks, identifies the flippable and non-flippable pixels of every block according to PSD, generates a watermark from the non-flippable pixels with MD5-Hash, encrypts the watermark with the Henon map, and selects the embedded blocks. The simulation results show that the algorithm, with a good ability in tamper localization, can be used to authenticate and verify the authenticity and integrity of text images.

  13. Implementation of cryptographic hash function SHA256 in C++

    NASA Astrophysics Data System (ADS)

    Shrivastava, Akash

    2012-02-01

    This abstract explains the implementation of SHA-256 (Secure Hash Algorithm 256) using C++. SHA-2 is a strong hashing algorithm used in almost all kinds of security applications. The algorithm consists of two phases: preprocessing and hash computation. Preprocessing involves padding a message, parsing the padded message into m-bit blocks, and setting initialization values to be used in the hash computation. The computation generates a message schedule from the padded message and uses that schedule, along with functions, constants, and word operations, to iteratively generate a series of hash values. The final hash value generated by the computation is used to determine the message digest. SHA-2 includes a significant number of changes from its predecessor, SHA-1. SHA-2 consists of a set of four hash functions with digests of 224, 256, 384 or 512 bits. The algorithm outputs a 256-bit message digest, with an internal state of 256 bits and a block size of 512 bits. The maximum message length is 2^64 - 1 bits, and the hash is computed over a series of 64 rounds consisting of several operations such as And, Or, Xor, Shr, and Rotr. The code provides a clear understanding of the hash algorithm and generates hash values to retrieve the message digest.
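
    The two phases can be exercised directly with a standard library implementation of SHA-256 (Python's hashlib is used here for brevity; the padding arithmetic restates the rule from the abstract):

```python
import hashlib

message = b"hello world"

# Hash computation (preprocessing is handled internally by hashlib).
digest = hashlib.sha256(message).hexdigest()
print(digest)          # 256-bit digest, printed as 64 hex characters

# The preprocessing rule: append the bit 0x80, then zeros, then the
# 64-bit message length, so the padded length is a multiple of 512 bits.
bit_len = len(message) * 8
padded_bits = bit_len + 1 + (-(bit_len + 1 + 64) % 512) + 64
assert padded_bits % 512 == 0
```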

  14. Spherical hashing: binary code embedding with hyperspheres.

    PubMed

    Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui

    2015-11-01

    Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search and compact data representations suitable for handling large-scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one to 75 million GIST, BoW and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement.
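
    The two ingredients named above can be sketched in a few lines, assuming the sphere centers and radii are already given (in the paper they are obtained by the iterative optimization):

```python
import math

def spherical_code(x, centers, radii):
    # One bit per hypersphere: 1 if the point lies inside the sphere
    # (distance to center <= radius), else 0.
    return [1 if math.dist(x, c) <= r else 0
            for c, r in zip(centers, radii)]

def spherical_hamming(a, b):
    # Spherical Hamming distance: the XOR count normalized by the
    # number of spheres that contain both points (the AND count).
    xor = sum(x != y for x, y in zip(a, b))
    common = sum(x and y for x, y in zip(a, b))
    return xor / common if common else float("inf")
```

    Normalizing by the common-ones count rewards pairs of points that fall inside many of the same spheres, which is what makes the distance fit hypersphere codes better than plain Hamming distance.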

  15. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol.

    PubMed

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags have been used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags should be overcome for future use. To overcome these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of the whole tag, which includes an antenna, an analog front end, and a digital processing block, running such authentication protocols has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol with a reader-tag distance of 10 cm. Similarly, when the standard hash function is used, the tag completes the protocol at a distance of 8.5 cm. We discuss the impact of the hash function on the tag's peak power consumption and hence on the achievable distance.

  16. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol

    PubMed Central

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags have been used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags should be overcome for future use. To overcome these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of the whole tag, which includes an antenna, an analog front end, and a digital processing block, running such authentication protocols has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol with a reader-tag distance of 10 cm. Similarly, when the standard hash function is used, the tag completes the protocol at a distance of 8.5 cm. We discuss the impact of the hash function on the tag's peak power consumption and hence on the achievable distance. PMID:26491714

  17. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed on the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm, invariant under translation, scale, and 3D rotations of the target, hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and is thus potentially implementable in real time.
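
    Geometric hashing's two stages (offline table construction, online voting) can be sketched for 2D point features; the similarity-invariant frame and quantization step below are illustrative choices, not the SAR-specific implementation:

```python
from collections import defaultdict

def frame_coords(p, b0, b1):
    # Express point p in the frame defined by the ordered basis pair
    # (b0, b1); this complex ratio is invariant to translation,
    # rotation, and scale.
    z, z0, z1 = complex(*p), complex(*b0), complex(*b1)
    return (z - z0) / (z1 - z0)

def quantize(z, step=0.25):
    return (round(z.real / step), round(z.imag / step))

def build_table(models):
    # Offline: for every model and every ordered basis pair, hash the
    # quantized frame coordinates of the remaining points.
    table = defaultdict(list)
    for name, pts in models.items():
        for i, b0 in enumerate(pts):
            for j, b1 in enumerate(pts):
                if i == j:
                    continue
                for k, p in enumerate(pts):
                    if k not in (i, j):
                        key = quantize(frame_coords(p, b0, b1))
                        table[key].append((name, i, j))
    return table

def recognize(scene, table):
    # Online: pick one scene basis pair and vote; a model whose points
    # keep hitting the same (model, basis) entry wins, even when some
    # model points are occluded in the scene.
    votes = defaultdict(int)
    b0, b1 = scene[0], scene[1]
    for p in scene[2:]:
        for entry in table.get(quantize(frame_coords(p, b0, b1)), ()):
            votes[entry] += 1
    return max(votes, key=votes.get) if votes else None
```

    Because each scene point votes independently, missing (obscured) points merely lower a model's vote count rather than breaking the match, which is the robustness property the abstract highlights.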

  18. Fast perceptual image hash based on cascade algorithm

    NASA Astrophysics Data System (ADS)

    Ruchay, Alexey; Kober, Vitaly; Yavtushenko, Evgeniya

    2017-09-01

    In this paper, we propose a perceptual image hash algorithm based on a cascade algorithm, which can be applied in image authentication, retrieval, and indexing. A perceptual image hash is used for image retrieval in the sense of human perception, robust to distortions caused by compression, noise, common signal processing, and geometric modifications. The main disadvantage of perceptual hashing is its high time cost. The proposed cascade algorithm initializes image retrieval with short hashes, and then a full hash is applied to the processed results. Computer simulation results show that the proposed hash algorithm yields good performance in terms of robustness, discriminability, and time cost.
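
    The cascade idea (a cheap short hash filters the database, and the full hash is computed only for survivors) can be sketched with a simple average-hash stand-in for the paper's perceptual hash; the sizes and thresholds below are illustrative:

```python
def ahash(pixels, side, bits_side):
    # Average hash: block-average the side x side image down to
    # bits_side x bits_side cells, then one bit per cell
    # (above the global mean -> 1).
    step = side // bits_side
    cells = []
    for by in range(bits_side):
        for bx in range(bits_side):
            block = [pixels[(by * step + y) * side + bx * step + x]
                     for y in range(step) for x in range(step)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if c > mean else 0 for c in cells]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cascade_match(query, database, side=16, short=2, full=4, t1=1, t2=2):
    # Stage 1: a cheap 4-bit hash prunes the database.
    qs = ahash(query, side, short)
    survivors = [img for img in database
                 if hamming(ahash(img, side, short), qs) <= t1]
    # Stage 2: the 16-bit hash is computed only for survivors.
    qf = ahash(query, side, full)
    return [img for img in survivors
            if hamming(ahash(img, side, full), qf) <= t2]
```

    Most candidates are rejected by the short hash, so the expensive full hash runs on only a small fraction of the database, which is where the cascade saves time.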

  19. Linear Subspace Ranking Hashing for Cross-Modal Retrieval.

    PubMed

    Li, Kai; Qi, Guo-Jun; Ye, Jun; Hua, Kien A

    2017-09-01

    Hashing has attracted a great deal of research in recent years due to its effectiveness for the retrieval and indexing of large-scale high-dimensional multimedia data. In this paper, we propose a novel ranking-based hashing framework that maps data from different modalities into a common Hamming space where the cross-modal similarity can be measured using Hamming distance. Unlike existing cross-modal hashing algorithms where the learned hash functions are binary space partitioning functions, such as the sign and threshold functions, the proposed hashing scheme takes advantage of a new class of hash functions closely related to rank correlation measures, which are known to be scale-invariant, numerically stable, and highly nonlinear. Specifically, we jointly learn two groups of linear subspaces, one for each modality, so that features' ranking orders in different linear subspaces maximally preserve the cross-modal similarities. We show that the ranking-based hash function has a natural probabilistic approximation which transforms the original highly discontinuous optimization problem into one that can be efficiently solved using simple gradient descent algorithms. The proposed hashing framework is also flexible in the sense that the optimization procedures are not tied to any specific form of loss function, as is typical for existing cross-modal hashing methods; rather, we can flexibly accommodate different loss functions with minimal changes to the learning steps. We demonstrate through extensive experiments on four widely used real-world multimodal datasets that the proposed cross-modal hashing method can achieve competitive performance against several state-of-the-art methods with only moderate training and testing time.
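
    A rank-correlation-style hash of the kind described above can be sketched with a Winner-Take-All-style rule: each hash symbol is the argmax of the point's projections onto one small subspace. The fixed directions here stand in for the learned subspaces, which are assumptions for illustration:

```python
def ranking_hash(x, subspaces):
    # Each symbol is the index of the largest projection within one
    # subspace; because only the ranking of projections matters, the
    # code is invariant to positive rescaling of x.
    code = []
    for dirs in subspaces:
        projections = [sum(d_i * x_i for d_i, x_i in zip(d, x))
                       for d in dirs]
        code.append(max(range(len(projections)),
                        key=projections.__getitem__))
    return code
```

    Scale-invariance falls out directly: multiplying x by any positive constant multiplies every projection by the same constant, leaving the argmax unchanged.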

  20. Efficient computation of hashes

    NASA Astrophysics Data System (ADS)

    Lopes, Raul H. C.; Franqueira, Virginia N. L.; Hobson, Peter R.

    2014-06-01

    The sequential computation of hashes, which lies at the core of many distributed storage systems and is found, for example, in grid services, can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle–Damgård engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
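
    The contrast between sequential chaining and a parallelizable tree mode can be sketched with a binary Merkle tree over SHA-256 leaves; the chunk size and the duplicate-last-node rule for odd levels are illustrative choices:

```python
import hashlib

def merkle_root(data: bytes, chunk=1024) -> bytes:
    # Leaf hashes are mutually independent, so this loop is exactly
    # what a parallel tree mode distributes across cores, unlike
    # sequential Merkle–Damgård chaining where block i waits on
    # block i-1.
    leaves = [hashlib.sha256(data[i:i + chunk]).digest()
              for i in range(0, max(len(data), 1), chunk)]
    # Pairwise reduction up to the root (log-depth, also parallel).
    level = leaves
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])      # duplicate the last node
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]
```

    Only the final log-depth reduction is inherently serial, which is why tree modes scale with core count while a chained hash cannot.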

  1. Neighborhood Discriminant Hashing for Large-Scale Image Retrieval.

    PubMed

    Tang, Jinhui; Li, Zechao; Wang, Meng; Zhao, Ruizhen

    2015-09-01

    With the proliferation of large-scale community-contributed images, hashing-based approximate nearest neighbor search in huge databases has aroused considerable interest from the fields of computer vision and multimedia in recent years because of its computational and memory efficiency. In this paper, we propose a novel hashing method named neighborhood discriminant hashing (NDH) to implement approximate similarity search. Different from previous work, we propose to learn a discriminant hashing function by exploiting local discriminative information, i.e., the labels of a sample can be inherited from the neighbor samples it selects. The hashing function is expected to be orthogonal to avoid redundancy in the learned hashing bits as much as possible, while an information-theoretic regularization is jointly exploited using the maximum entropy principle. As a consequence, the learned hashing function is compact and nonredundant among bits, while each bit is highly informative. Extensive experiments are carried out on four publicly available data sets, and the comparison results demonstrate that the proposed NDH method outperforms state-of-the-art hashing techniques.

  2. Multiview alignment hashing for efficient image search.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2015-03-01

    Hashing is a popular and efficient method for nearest neighbor search in large-scale data spaces by embedding high-dimensional feature descriptors into a similarity-preserving Hamming space with a low dimension. For most hashing methods, the performance of retrieval heavily depends on the choice of the high-dimensional feature descriptor. Furthermore, a single type of feature cannot be descriptive enough for different images when it is used for hashing. Thus, how to combine multiple representations for learning effective hashing functions is a pressing task. In this paper, we present a novel unsupervised multiview alignment hashing approach based on regularized kernel nonnegative matrix factorization, which can find a compact representation uncovering the hidden semantics and simultaneously respecting the joint probability distribution of data. In particular, we aim to seek a matrix factorization that effectively fuses the multiple information sources while discarding the feature redundancy. Since the resulting problem is nonconvex and discrete, our objective function is optimized in an alternating manner with relaxation and converges to a locally optimal solution. After finding the low-dimensional representation, the hashing functions are finally obtained through multivariable logistic regression. The proposed method is systematically evaluated on three data sets: 1) Caltech-256; 2) CIFAR-10; and 3) CIFAR-20, and the results show that our method significantly outperforms the state-of-the-art multiview hashing techniques.

  3. On the concept of cryptographic quantum hashing

    NASA Astrophysics Data System (ADS)

    Ablayev, F.; Ablayev, M.

    2015-12-01

    In this letter we define the notion of a quantum-resistant ((ε, δ)-resistant) hash function, which consists of a combination of pre-image (one-way) resistance (ε-resistance) and collision resistance (δ-resistance) properties. We present examples and discussion that support the idea of quantum hashing. We present an explicit quantum hash function which is ‘balanced’, one-way resistant and collision resistant, and demonstrate how to build a large family of quantum hash functions. Balanced quantum hash functions need a high degree of entanglement between the qubits. We use a phase transformation technique to express quantum hashing constructions, which is an effective way of mapping hash states to coherent states in a superposition of time-bin modes. The phase transformation technique is ready to be implemented with current optical technology.

  4. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  5. On the balanced quantum hashing

    NASA Astrophysics Data System (ADS)

    Ablayev, F.; Ablayev, M.; Vasiliev, A.

    2016-02-01

    In this paper we define a notion of a resistant quantum hash function which combines the notion of pre-image (one-way) resistance and the notion of collision resistance. In the quantum setting, the one-way resistance property and the collision resistance property are correlated: the “more” a quantum function is one-way resistant, the “less” it is collision resistant, and vice versa. We present an explicit quantum hash function which is “balanced”, one-way resistant and collision resistant, and demonstrate how to build a large family of balanced quantum hash functions.

  6. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  7. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information. PMID:26823196

  8. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  9. Hash Functions and Information Theoretic Security

    NASA Astrophysics Data System (ADS)

    Bagheri, Nasour; Knudsen, Lars R.; Naderi, Majid; Thomsen, Søren S.

    Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic attack costs and real costs. We show that in the information theoretic model, many well-known and commonly used hash functions such as MD5 and SHA-256 fail to be preimage resistant.

  10. Robust hashing with local models for approximate similarity search.

    PubMed

    Song, Jingkuan; Yang, Yi; Li, Xuelong; Huang, Zi; Yang, Yang

    2014-07-01

    Similarity search plays an important role in many applications involving high-dimensional data. Due to the well-known curse of dimensionality, the performance of most existing indexing structures degrades quickly as the feature dimensionality increases. Hashing methods, such as locality sensitive hashing (LSH) and its variants, have been widely used to achieve fast approximate similarity search by trading search quality for efficiency. However, most existing hashing methods use randomized algorithms to generate hash codes without considering the specific structural information in the data. In this paper, we propose a novel hashing method, namely, robust hashing with local models (RHLM), which learns a set of robust hash functions to map the high-dimensional data points into binary hash codes by effectively utilizing local structural information. In RHLM, for each individual data point in the training dataset, a local hashing model is learned and used to predict the hash codes of its neighboring data points. The local models from all the data points are globally aligned so that an optimal hash code can be assigned to each data point. After obtaining the hash codes of all the training data points, we design a robust method by employing l2,1-norm minimization on the loss function to learn effective hash functions, which are then used to map each database point into its hash code. Given a query data point, the search process first maps it into the query hash code by the hash functions and then explores the buckets that have hash codes similar to the query hash code. Extensive experimental results conducted on real-life datasets show that the proposed RHLM outperforms the state-of-the-art methods in terms of search quality and efficiency.
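
    The locality sensitive hashing this entry builds on can be illustrated with the classic random-hyperplane LSH family (an illustrative Python sketch, not the RHLM method itself; the dimension, bit count, and test points are arbitrary choices):

```python
import random

def make_lsh(dim, n_bits, seed=0):
    # Random-hyperplane LSH: each output bit is the sign of the dot
    # product with a random Gaussian vector (a classic LSH family for
    # cosine similarity; illustrative only, not the RHLM method).
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
    def hash_point(x):
        return tuple(1 if sum(p * v for p, v in zip(plane, x)) >= 0 else 0
                     for plane in planes)
    return hash_point

h = make_lsh(dim=4, n_bits=8)
a = [1.0, 0.2, 0.0, 0.5]
print(h(a))  # an 8-bit binary code for the point
```

Near-identical points agree on most bits with high probability, so candidate neighbors can be found by bucket lookup instead of a linear scan.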

  11. Study of the similarity function in Indexing-First-One hashing

    NASA Astrophysics Data System (ADS)

    Lai, Y.-L.; Jin, Z.; Goi, B.-M.; Chai, T.-Y.

    2017-06-01

    The recently proposed Indexing-First-One (IFO) hashing is a technique particularly adopted for eye iris template protection, i.e. IrisCode. However, the Jaccard Similarity (JS) measure that IFO inherits from Min-hashing has not yet been adequately discussed. In this paper, we explore the nature of JS in the binary domain and further propose a mathematical formulation to generalize the usage of JS, which is subsequently verified using the CASIA v3-Interval iris database. Our study reveals that the JS applied in IFO hashing is a generalized version of the Min-hashing measure of two input objects, in which the coefficient of JS is equal to one. With this understanding, IFO hashing can inherit the useful properties of Min-hashing, i.e. similarity preservation, and is thus favorable for similarity searching or recognition in binary space.
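
    The relationship between Jaccard Similarity and Min-hashing discussed in this entry can be made concrete: the fraction of matching MinHash values estimates JS (a minimal Python sketch; the salted linear hash stands in for random permutations and is an assumption, not the IFO construction):

```python
import random

def jaccard(a, b):
    # Exact Jaccard similarity of two sets: |A ∩ B| / |A ∪ B|.
    return len(a & b) / len(a | b)

def minhash_signature(s, n_hashes, seed=0):
    # One MinHash value per "permutation", simulated here by a salted
    # linear hash of each element.
    rng = random.Random(seed)
    salts = [rng.randrange(1 << 32) for _ in range(n_hashes)]
    return [min((salt * x + 1) % 2147483647 for x in s) for salt in salts]

A = {1, 2, 3, 4, 5}
B = {3, 4, 5, 6}
sig_a = minhash_signature(A, 128)
sig_b = minhash_signature(B, 128)
est = sum(x == y for x, y in zip(sig_a, sig_b)) / 128
print(jaccard(A, B), round(est, 2))  # exact JS is 0.5; est approximates it
```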

  12. Query-Adaptive Reciprocal Hash Tables for Nearest Neighbor Search.

    PubMed

    Liu, Xianglong; Deng, Cheng; Lang, Bo; Tao, Dacheng; Li, Xuelong

    2016-02-01

    Recent years have witnessed the success of binary hashing techniques in approximate nearest neighbor search. In practice, multiple hash tables are usually built using hashing to cover more desired results in the hit buckets of each table. However, little work has studied a unified approach to constructing multiple informative hash tables using any type of hashing algorithm. Meanwhile, multiple-table search also lacks a generic query-adaptive and fine-grained ranking scheme that can alleviate the binary quantization loss suffered in standard hashing techniques. To solve the above problems, in this paper, we first regard table construction as a selection problem over a set of candidate hash functions. With the graph representation of the function set, we propose an efficient solution that sequentially applies the normalized dominant set to find the most informative and independent hash functions for each table. To further reduce the redundancy between tables, we explore reciprocal hash tables in a boosting manner, where the hash function graph is updated with high weights emphasized on the misclassified neighbor pairs of previous hash tables. To refine the ranking of the retrieved buckets within a certain Hamming radius from the query, we propose a query-adaptive bitwise weighting scheme to enable fine-grained bucket ranking in each hash table, exploiting the discriminative power of its hash functions and their complement for nearest neighbor search. Moreover, we integrate this scheme into multiple-table search using a fast yet reciprocal table lookup algorithm within the adaptive weighted Hamming radius. Both the construction method and the query-adaptive search method are general and compatible with different types of hashing algorithms using different feature spaces and/or parameter settings. Our extensive experiments on several large-scale benchmarks demonstrate that the proposed techniques can significantly outperform both

  13. Secure Image Hash Comparison for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.

  14. FSH: fast spaced seed hashing exploiting adjacent hashes.

    PubMed

    Girotto, Samuele; Comin, Matteo; Pizzi, Cinzia

    2018-01-01

    Patterns with wildcards in specified positions, namely spaced seeds, are increasingly used instead of k-mers in many bioinformatics applications that require indexing, querying and rapid similarity search, as they can provide better sensitivity. Many of these applications require computing the hash of each position in the input sequences with respect to the given spaced seed, or to multiple spaced seeds. While the hashing of k-mers can be rapidly computed by exploiting the large overlap between consecutive k-mers, spaced seed hashing is usually computed from scratch for each position in the input sequence, resulting in slower processing. The method proposed in this paper, fast spaced-seed hashing (FSH), exploits the similarity of the hash values of spaced seeds computed at adjacent positions in the input sequence. In our experiments we compute the hash for each position of metagenomics reads from several datasets, with respect to different spaced seeds. We also propose a generalized version of the algorithm for the simultaneous computation of multiple spaced seed hashes. In the experiments, our algorithm can compute the hash values of spaced seeds with a speedup, with respect to the traditional approach, between 1.6× and 5.3×, depending on the structure of the spaced seed. Spaced seed hashing is a routine task for several bioinformatics applications. FSH performs this task efficiently and raises the question of whether other hash computations can be exploited to further improve the speedup. This has the potential of major impact in the field, making spaced seed applications not only accurate, but also faster and more efficient. The software FSH is freely available for academic use at: https://bitbucket.org/samu661/fsh/overview.
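
    A spaced seed hash as described above selects only the '1' positions of the pattern at each offset; the following minimal Python sketch computes it from scratch at every position, which is exactly the per-position cost that FSH avoids by reusing adjacent hashes (the 2-bit DNA encoding is a common convention, assumed here):

```python
def spaced_seed_hash(seq, pattern, pos):
    # Hash of the symbols selected by the '1' positions of the spaced
    # seed, packed 2 bits per base. Recomputed from scratch here; FSH
    # instead reuses the bits shared with the hash at pos - 1.
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    h = 0
    for i, p in enumerate(pattern):
        if p == "1":
            h = (h << 2) | code[seq[pos + i]]
    return h

seq = "ACGTACGT"
# hashes of the seed "1101" (wildcard at position 2) at every valid offset
hashes = [spaced_seed_hash(seq, "1101", p) for p in range(len(seq) - 3)]
print(hashes)  # positions 0 and 4 select the same symbols, so hashes repeat
```

FSH's observation is that consecutive positions share most of their selected symbols, so large parts of `h` can be carried over instead of recomputed.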

  15. Feature hashing for fast image retrieval

    NASA Astrophysics Data System (ADS)

    Yan, Lingyu; Fu, Jiarun; Zhang, Hongxin; Yuan, Lu; Xu, Hui

    2018-03-01

    Currently, research on content-based image retrieval mainly focuses on robust feature extraction. However, due to the exponential growth of online images, it is necessary to consider searching among large-scale image collections, which is very time-consuming and unscalable. Hence, we need to pay much attention to the efficiency of image retrieval. In this paper, we propose a feature hashing method for image retrieval which not only generates a compact fingerprint for image representation, but also prevents large semantic loss during the process of hashing. To generate the fingerprint, an objective function of semantic loss is constructed and minimized, which combines the influence of both the neighborhood structure of the feature data and the mapping error. Since the machine-learning-based hashing effectively preserves the neighborhood structure of the data, it yields visual words with strong discriminability. Furthermore, the generated binary codes keep the image representation low-complexity, making it efficient and scalable to large-scale databases. Experimental results show good performance of our approach.
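
    The general idea of mapping features to a compact fingerprint can be illustrated with the classic "hashing trick" (an illustrative Python sketch; it is not the learned, semantics-preserving hashing proposed in the paper, and the feature names are invented for the example):

```python
import hashlib

def feature_hash(features, n_buckets=16):
    # The "hashing trick": map sparse named features into a fixed-size
    # vector by hashing each feature name to a bucket index.
    vec = [0.0] * n_buckets
    for name, value in features.items():
        digest = hashlib.md5(name.encode()).digest()
        bucket = int.from_bytes(digest[:4], "big") % n_buckets
        vec[bucket] += value
    return vec

# hypothetical image features, for illustration only
v = feature_hash({"edge_density": 0.7, "mean_hue": 0.3})
print(len(v), sum(v))  # fixed-length vector; values are only added, so the mass is preserved
```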

  16. G-Hash: Towards Fast Kernel-based Similarity Search in Large Graph Databases.

    PubMed

    Wang, Xiaohong; Smalter, Aaron; Huan, Jun; Lushington, Gerald H

    2009-01-01

    Structured data, including sets, sequences, trees and graphs, pose significant challenges to fundamental aspects of data management such as efficient storage, indexing, and similarity search. With the fast accumulation of graph databases, similarity search in graph databases has emerged as an important research topic. Graph similarity search has applications in a wide range of domains including cheminformatics, bioinformatics, sensor network management, social network management, and XML documents, among others. Most of the current graph indexing methods focus on subgraph query processing, i.e. determining the set of database graphs that contains the query graph, and hence do not directly support similarity search. In data mining and machine learning, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models for supervised learning, graph kernel functions (i) have high computational complexity and (ii) are non-trivial to index in a graph database. Our objective is to bridge graph kernel functions and similarity search in graph databases by proposing (i) a novel kernel-based similarity measurement and (ii) an efficient indexing structure for graph data management. Our similarity measurement builds upon local features extracted from each node and its neighboring nodes in graphs. A hash table is utilized to support efficient storage and fast search of the extracted local features. Using the hash table, a graph kernel function is defined to capture the intrinsic similarity of graphs and to support fast similarity query processing. We have implemented our method, which we have named G-hash, and have demonstrated its utility on large chemical graph databases. Our results show that the G-hash method achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. 
Most importantly, the new similarity measurement and the index

  17. A Novel Approach for Constructing One-Way Hash Function Based on a Message Block Controlled 8D Hyperchaotic Map

    NASA Astrophysics Data System (ADS)

    Lin, Zhuosheng; Yu, Simin; Lü, Jinhu

    2017-06-01

    In this paper, a novel approach for constructing a one-way hash function based on an 8D hyperchaotic map is presented. First, two nominal matrices, one with constant and one with variable parameters, are adopted for designing 8D discrete-time hyperchaotic systems. Then each input plaintext message block is transformed into an 8 × 8 matrix following the order of left to right and top to bottom, which is used as a control matrix for switching between the nominal matrix elements with constant parameters and those with variable parameters. Through this switching control, a new nominal matrix mixed with the constant and variable parameters is obtained for the 8D hyperchaotic map. Finally, the hash value is constructed from multiple low-8-bit outputs of the iterated hyperchaotic system after rounding down, and its security analysis results are also given, validating the feasibility and reliability of the proposed approach. Compared with the existing schemes, the main feature of the proposed method is that it has a large number of key parameters with avalanche effect, resulting in difficulty for estimating or predicting the key parameters via various attacks.
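
    The general construction pattern of chaos-based hashes (perturb a chaotic map with successive message blocks, then emit low-order bits of the state) can be sketched with a toy one-dimensional logistic map; this is only an illustration of the pattern, not the 8D hyperchaotic map or the parameter-switching scheme of the paper:

```python
def chaotic_hash(message, rounds=8):
    # Toy chaos-based hash: a logistic map is iterated while its state
    # is perturbed by successive message bytes, and low-order bits of
    # the state are folded into a 64-bit digest.
    x, r = 0.5, 3.99
    digest = 0
    for b in message:
        x = (x + b / 255.0) % 1.0 or 0.3   # message perturbation; avoid x == 0
        for _ in range(rounds):
            x = r * x * (1.0 - x)          # logistic map iteration
        digest = ((digest << 8) ^ int(x * 255)) & (2**64 - 1)
    return digest

h1 = chaotic_hash(b"hello world")
h2 = chaotic_hash(b"hello worle")   # one-byte change
print(hex(h1), hex(h2))
```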

  18. Computing quantum hashing in the model of quantum branching programs

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Ablayev, Marat; Vasiliev, Alexander

    2018-02-01

    We investigate the branching program complexity of quantum hashing. We consider a quantum hash function that maps elements of a finite field into quantum states. We require that this function is preimage-resistant and collision-resistant. We consider two complexity measures for Quantum Branching Programs (QBP): the number of qubits and the number of computational steps. We show that the quantum hash function can be computed efficiently. Moreover, we prove that such a QBP construction is optimal. That is, we prove lower bounds that match the constructed quantum hash function computation.

  19. A novel class sensitive hashing technique for large-scale content-based remote sensing image retrieval

    NASA Astrophysics Data System (ADS)

    Reato, Thomas; Demir, Begüm; Bruzzone, Lorenzo

    2017-10-01

    This paper presents a novel class sensitive hashing technique in the framework of large-scale content-based remote sensing (RS) image retrieval. The proposed technique aims at representing each image with multi-hash codes, each of which corresponds to a primitive (i.e., land cover class) present in the image. To this end, the proposed method consists of a three-step algorithm. The first step is devoted to characterizing each image by primitive class descriptors. These descriptors are obtained through a supervised approach, which initially extracts the image regions and their descriptors that are then associated with the primitives present in the images. This step requires a set of annotated training regions to define the primitive classes. A correspondence between the regions of an image and the primitive classes is built based on the probability of each primitive class being present at each region. All the regions belonging to a specific primitive class with a probability higher than a given threshold are highly representative of that class. Thus, the average value of the descriptors of these regions is used to characterize that primitive. In the second step, the descriptors of the primitive classes are transformed into multi-hash codes to represent each image. This is achieved by adapting the kernel-based supervised locality sensitive hashing method to multi-code hashing problems. The first two steps of the proposed technique, unlike the standard hashing methods, allow one to represent each image by a set of primitive class sensitive descriptors and their hash codes. Then, in the last step, the images in the archive that are very similar to a query image are retrieved based on a multi-hash-code-matching scheme. Experimental results obtained on an archive of aerial images confirm the effectiveness of the proposed technique in terms of retrieval accuracy when compared to the standard hashing methods.

  20. Deep Constrained Siamese Hash Coding Network and Load-Balanced Locality-Sensitive Hashing for Near Duplicate Image Detection.

    PubMed

    Hu, Weiming; Fan, Yabo; Xing, Junliang; Sun, Liang; Cai, Zhaoquan; Maybank, Stephen

    2018-09-01

    We construct a new efficient near duplicate image detection method using a hierarchical hash code learning neural network and load-balanced locality-sensitive hashing (LSH) indexing. We propose a deep constrained siamese hash coding neural network combined with deep feature learning. Our neural network is able to extract effective features for near duplicate image detection. The extracted features are used to construct a LSH-based index. We propose a load-balanced LSH method to produce load-balanced buckets in the hashing process. The load-balanced LSH significantly reduces the query time. Based on the proposed load-balanced LSH, we design an effective and feasible algorithm for near duplicate image detection. Extensive experiments on three benchmark data sets demonstrate the effectiveness of our deep siamese hash encoding network and load-balanced LSH.

  1. Binary Multidimensional Scaling for Hashing.

    PubMed

    Huang, Yameng; Lin, Zhouchen

    2017-10-04

    Hashing is a useful technique for fast nearest neighbor search due to its low storage cost and fast query speed. Unsupervised hashing aims at learning binary hash codes for the original features so that the pairwise distances can be best preserved. While several works have targeted this task, the results are not satisfactory, mainly due to oversimplified models. In this paper, we propose a unified and concise unsupervised hashing framework, called Binary Multidimensional Scaling (BMDS), which is able to learn the hash code for distance preservation in both batch and online modes. In the batch mode, unlike most existing hashing methods, we do not need to simplify the model by predefining the form of the hash map. Instead, we learn the binary codes directly based on the pairwise distances among the normalized original features by Alternating Minimization. This enables a stronger expressive power of the hash map. In the online mode, we consider the holistic distance relationship between the current query example and those we have already learned, rather than only focusing on the current data chunk. This is useful when the data come in a streaming fashion. Empirical results show that while being efficient to train, our algorithm outperforms state-of-the-art methods by a large margin in terms of distance preservation, which is practical for real-world applications.

  2. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input and output can only be displayed on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm-1 (SHA-1), the Message Digest-5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate in the software confirmation tool we developed. This paper gives a brief overview of SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps of the AES to reduce a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), which is a mathematically unique representation or signature of the former that could be displayed on a computer's monitor. This paper starts with a simple definition and example to illustrate the use of a hash function. It concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
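
    The overall reduction step described here, compressing arbitrary software or memory contents into a short, displayable digest, looks like the following in outline; Python's standard library has no AES primitive, so SHA-256 stands in for the AES-based compression in this sketch:

```python
import hashlib

def software_digest(data, chunk_size=1 << 16):
    # Reduce an arbitrary amount of software/memory content to a short,
    # displayable digest by feeding it through a hash in fixed-size
    # chunks. The paper builds this step from the AES; SHA-256 stands
    # in for that compression function in this sketch.
    h = hashlib.sha256()
    for i in range(0, len(data), chunk_size):
        h.update(data[i:i + chunk_size])
    return h.hexdigest()

firmware = b"example program image" * 1000   # hypothetical software contents
print(software_digest(firmware)[:16])        # short form an operator could read off a monitor
```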

  3. Novel Duplicate Address Detection with Hash Function

    PubMed Central

    Song, GuangJia; Ji, ZhenZhou

    2016-01-01

    Duplicate address detection (DAD) is an important component of the address resolution protocol (ARP) and the neighbor discovery protocol (NDP). DAD determines whether an IP address is in conflict with other nodes. In traditional DAD, the target address to be detected is broadcast through the network, which gives malicious nodes a convenient opening to attack. A malicious node can send a spoofing reply to prevent the address configuration of a normal node, and thus a denial-of-service attack is launched. This study proposes a hash method to hide the target address in DAD, which prevents an attack node from launching destination attacks. If the address of a normal node is identical to the detection address, then its hash value should be the same as the “Hash_64” field in the neighbor solicitation message. Consequently, DAD can be successfully completed. This process is called DAD-h. Simulation results indicate that address configuration using DAD-h has a considerably higher success rate when under attack compared with traditional DAD. Comparative analysis shows that DAD-h does not require third-party devices or considerable computing resources; it also provides a lightweight security solution. PMID:26991901
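
    The core DAD-h idea, publishing a hash of the target address rather than the address itself, can be sketched as follows (the hash choice, nonce handling, and truncation to 64 bits are assumptions for illustration; only the "Hash_64" field name comes from the abstract):

```python
import hashlib

def hash_64(address, nonce):
    # Publish a 64-bit hash of the target address instead of the
    # address itself, so an eavesdropper cannot read which address is
    # being probed. SHA-256 truncated to 8 bytes is an assumption.
    data = (address + nonce).encode()
    return hashlib.sha256(data).digest()[:8]

probe = hash_64("fe80::1c2a:3b4c:5d6e:7f80", nonce="r1")
# A node that owns the address recomputes the hash and compares it
# against the Hash_64 field of the solicitation:
print(probe.hex())
```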

  4. Double hashing technique in closed hashing search process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra

    2017-09-01

    The search process is used in many activities, performed both online and offline, and many algorithms can carry it out; one of them is the hash search algorithm. The search process in this study uses the hash search algorithm with a double hashing technique, in which the data are arranged into tables of equal length and then searched. The results of this study indicate that searching with the double hashing technique is faster than the usual search techniques: dividing the values between a main table and an overflow table is expected to make the search faster than when the data are stacked in a single table, and collisions can be avoided.
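
    The double hashing technique described above can be sketched in a few lines of Python: a second hash function determines the probe step, so keys that collide on the first hash follow different probe sequences (table size and hash choices here are illustrative assumptions):

```python
def double_hash_insert(table, key):
    # Closed hashing (open addressing) with double hashing: the probe
    # step is derived from a second hash function, so keys that collide
    # on h1 follow different probe sequences.
    m = len(table)
    h1 = key % m
    h2 = 1 + (key % (m - 1))   # step size, never 0
    for i in range(m):
        slot = (h1 + i * h2) % m
        if table[slot] is None:
            table[slot] = key
            return slot
    raise RuntimeError("table full")

table = [None] * 11
slots = [double_hash_insert(table, k) for k in (5, 16, 27)]  # all collide on h1 = 5
print(slots)
```

Lookup follows the same probe sequence, stopping at a match or an empty slot, so a successful search rarely scans more than a few slots.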

  5. Efficient hash tables for network applications.

    PubMed

    Zink, Thomas; Waldvogel, Marcel

    2015-01-01

    Hashing has yet to be widely accepted as a component of hard real-time systems and hardware implementations, due to still existing prejudices concerning the unpredictability of space and time requirements resulting from collisions. While in theory perfect hashing can provide optimal mapping, in practice, finding a perfect hash function is too expensive, especially in the context of high-speed applications. The introduction of hashing with multiple choices, d-left hashing and probabilistic table summaries, has caused a shift towards deterministic DRAM access. However, high amounts of rare and expensive high-speed SRAM need to be traded off for predictability, which is infeasible for many applications. In this paper we show that previous suggestions suffer from the false precondition of full generality. Our approach exploits four individual degrees of freedom available in many practical applications, especially hardware and high-speed lookups. This reduces the requirement of on-chip memory up to an order of magnitude and guarantees constant lookup and update time at the cost of only minute amounts of additional hardware. Our design makes efficient hash table implementations cheaper, more predictable, and more practical.
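
    The "hashing with multiple choices" that this entry builds on can be sketched with the two-choice rule: hash each key to two candidate buckets and insert into the less loaded one (an illustrative Python sketch; the real d-left scheme splits the table into subtables and is tuned for hardware, which is not shown here):

```python
import hashlib

def two_choice_insert(buckets, key):
    # Hashing with two choices (the idea behind d-left hashing): derive
    # two candidate buckets from one digest and place the key in the
    # less loaded one, which sharply reduces worst-case bucket size.
    n = len(buckets)
    h = hashlib.blake2b(key.encode()).digest()
    b1 = int.from_bytes(h[:4], "big") % n
    b2 = int.from_bytes(h[4:8], "big") % n
    target = b1 if len(buckets[b1]) <= len(buckets[b2]) else b2
    buckets[target].append(key)

buckets = [[] for _ in range(8)]
for i in range(100):
    two_choice_insert(buckets, f"flow-{i}")   # hypothetical lookup keys
print(max(len(b) for b in buckets))           # buckets stay close to the average load
```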

  6. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.

    PubMed

    Lu, Xiaoqiang; Chen, Yaxiong; Li, Xuelong

    Hashing has been an important and effective technology in image retrieval due to its computational efficiency and fast search speed. Traditional hashing methods usually learn hash functions to obtain binary codes by exploiting hand-crafted features, which cannot optimally represent the information in the sample. Recently, deep learning methods have achieved better performance, since deep learning architectures can learn more effective image representation features. However, these methods only use semantic features to generate hash codes by shallow projection and ignore texture details. In this paper, we propose a novel hashing method, namely hierarchical recurrent neural hashing (HRNH), which exploits a hierarchical recurrent neural network to generate effective hash codes. There are three contributions in this paper. First, a deep hashing method is proposed to extensively exploit both spatial details and semantic information; we leverage hierarchical convolutional features to construct an image pyramid representation. Second, our proposed deep network can directly exploit convolutional feature maps as input to preserve the spatial structure of convolutional feature maps. Finally, we propose a new loss function that considers the quantization error of binarizing the continuous embeddings into discrete binary codes, and simultaneously maintains the semantic similarity and the balance property of hash codes. Experimental results on four widely used data sets demonstrate that the proposed HRNH can achieve superior performance over other state-of-the-art hashing methods.

  7. Image Hashes as Templates for Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out of hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and for assessing the security, sensitivity, and robustness of verification using such templates, is needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations while providing strong classified data security and maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise.

  8. Secure Minutiae-Based Fingerprint Templates Using Random Triangle Hashing

    NASA Astrophysics Data System (ADS)

    Jin, Zhe; Jin Teoh, Andrew Beng; Ong, Thian Song; Tee, Connie

    Due to privacy concerns over the widespread use of biometric authentication systems, biometric template protection has gained great attention in biometric research recently. It is a challenging task to design a biometric template protection scheme which is anonymous, revocable and noninvertible while maintaining acceptable performance. Many methods have been proposed to resolve this problem, and cancelable biometrics is one of them. In this paper, we propose a scheme coined Random Triangle Hashing which follows the concept of cancelable biometrics in the fingerprint domain. In this method, re-alignment of fingerprints is not required, as all the minutiae are translated into a pre-defined two-dimensional space based on a reference minutia. After that, the proposed Random Triangle hashing method is used to enforce the one-way property (non-invertibility) of the biometric template. The proposed method is resistant to minor translation error and rotation distortion. Finally, the hash vectors are converted into bit-strings to be stored in the database. The proposed method is evaluated using the public database FVC2004 DB1. An EER of less than 1% is achieved by using the proposed method.

  9. Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search.

    PubMed

    Liu, Xianglong; Huang, Lei; Deng, Cheng; Lang, Bo; Tao, Dacheng

    2016-10-01

    Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degenerates the discriminative power when using Hamming distance ranking. Besides, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, while the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost the search performance. To address these problems, this paper proposes a novel and generic approach to building multiple hash tables with multiple views and generating fine-grained ranking results at the bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of hash functions and their complement for nearest neighbor search. From the tablewise aspect, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusing in a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains on single and multiple table search over the state-of-the-art methods.

  10. A Hash Based Remote User Authentication and Authenticated Key Agreement Scheme for the Integrated EPR Information System.

    PubMed

    Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng

    2015-11-01

    To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash-based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we show that Das's authentication scheme is still vulnerable to modification and user duplication attacks. We therefore propose a secure and efficient authentication scheme for the integrated EPR information system based on a lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show that our new scheme is well-suited for adoption in remote medical healthcare services.
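A lightweight scheme built only from a one-way hash and XOR typically composes these two primitives as below. This is a minimal sketch of the general hash-and-XOR pattern, assuming SHA-256 as the hash; the message layout, field names, and key handling are illustrative, not the paper's actual protocol.

```python
# Minimal sketch of a hash + XOR registration/verification step
# (illustrative field names; not the paper's protocol).
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Registration: server stores h(password) masked with its secret key.
server_key = secrets.token_bytes(32)
password = b"patient-pw"
stored = xor(h(password), server_key)

# Login: user sends h(password) XOR nonce; server unmasks and checks.
nonce = secrets.token_bytes(32)
login_msg = xor(h(password), nonce)
recovered = xor(xor(stored, server_key), nonce)  # equals login_msg iff pw matches
print(recovered == login_msg)  # True
```

The appeal of this style of scheme is that every operation is a hash or an XOR, which keeps the computational overhead far below public-key alternatives.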

  11. Range image registration based on hash map and moth-flame optimization

    NASA Astrophysics Data System (ADS)

    Zou, Li; Ge, Baozhen; Chen, Lei

    2018-03-01

    Over the past decade, evolutionary algorithms (EAs) have been introduced to solve range image registration problems because of their robustness and high precision. However, EA-based range image registration algorithms are time-consuming. To reduce the computational time, an EA-based range image registration algorithm using a hash map and moth-flame optimization is proposed. In this registration algorithm, a hash map is used to avoid over-exploitation in the registration process. Additionally, we present a search equation with better exploration and a restart mechanism to avoid being trapped in local minima. We compare the proposed registration algorithm with registration algorithms using moth-flame optimization and several state-of-the-art EA-based registration algorithms. The experimental results show that the proposed algorithm has a lower computational cost than the other algorithms while achieving similar registration precision.
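One common way a hash map curbs over-exploitation in an EA is by caching fitness evaluations, so candidate transforms that have already been scored are never re-evaluated. The objective function and rounding granularity below are placeholder assumptions, not the paper's registration error.

```python
# Sketch: hash-map cache of fitness evaluations in an EA loop
# (placeholder objective; real cost would be the registration error).

fitness_cache = {}
evaluations = 0

def fitness(params):
    """Expensive cost of a candidate transform (toy stand-in)."""
    global evaluations
    key = tuple(round(p, 3) for p in params)   # quantize to make params hashable
    if key in fitness_cache:
        return fitness_cache[key]
    evaluations += 1
    value = sum((p - 1.0) ** 2 for p in params)  # placeholder objective
    fitness_cache[key] = value
    return value

population = [[1.0, 2.0], [1.0, 2.0], [0.5, 1.5]]  # contains a duplicate
scores = [fitness(ind) for ind in population]
print(evaluations)  # -> 2: the duplicate hit the cache
```

Since near-identical candidates cluster around exploited regions, the cache both saves evaluations and signals when the search is revisiting the same area.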

  12. Forensic hash for multimedia information

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Varna, Avinash L.; Wu, Min

    2010-01-01

    Digital multimedia such as images and videos are prevalent on today's internet and cause significant social impact, which can be evidenced by the proliferation of social networking sites with user generated contents. Due to the ease of generating and modifying images and videos, it is critical to establish trustworthiness for online multimedia information. In this paper, we propose novel approaches to perform multimedia forensics using compact side information to reconstruct the processing history of a document. We refer to this as FASHION, standing for Forensic hASH for informatION assurance. Based on the Radon transform and scale space theory, the proposed forensic hash is compact and can effectively estimate the parameters of geometric transforms and detect local tampering that an image may have undergone. Forensic hash is designed to answer a broader range of questions regarding the processing history of multimedia data than the simple binary decision from traditional robust image hashing, and also offers more efficient and accurate forensic analysis than multimedia forensic techniques that do not use any side information.

  13. Implementation of 4-way Superscalar Hash MIPS Processor Using FPGA

    NASA Astrophysics Data System (ADS)

    Sahib Omran, Safaa; Fouad Jumma, Laith

    2018-05-01

    Due to rapid advances in personal and wireless communications systems, data security has become an increasingly important subject. Security becomes even more complicated when next-generation system requirements and real-time computation speed are taken into account. Hash functions are among the most essential cryptographic primitives and are used in signature authentication and communication integrity. They produce a fixed-size unique fingerprint, or hash value, from a message of arbitrary length. In this paper, Secure Hash Algorithms (SHA) of types SHA-1, SHA-2 (SHA-224, SHA-256) and SHA-3 (BLAKE) are implemented on a Field-Programmable Gate Array (FPGA) in a processor structure. The design is described and implemented using a hardware description language, namely VHSIC ("Very High Speed Integrated Circuit") Hardware Description Language (VHDL). Since the logical operations of the hash types (SHA-1, SHA-224, SHA-256 and SHA-3) are 32-bit, a superscalar Hash Microprocessor without Interlocked Pipeline Stages (MIPS) processor is designed with only the few instructions required to invoke the desired hash algorithms. When the four types of hash algorithms are executed sequentially on the designed processor, the total time required is approximately 342 us, with a throughput of 4.8 Mbps, while the time required to execute the same four hash algorithms on the designed four-way superscalar processor is reduced to 237 us, improving the throughput to 5.1 Mbps.

  14. The Speech multi features fusion perceptual hash algorithm based on tensor decomposition

    NASA Astrophysics Data System (ADS)

    Huang, Y. B.; Fan, M. H.; Zhang, Q. Y.

    2018-03-01

    With constant progress in modern speech communication technologies, speech data are prone to corruption by noise or malicious tampering. To give a speech perceptual hash algorithm strong robustness and high efficiency, this paper puts forward a speech perceptual hash algorithm based on tensor decomposition and multiple features. The algorithm applies wavelet packet decomposition to obtain each speech component, and the LPCC, LSP and ISP features of each component are extracted to constitute a speech feature tensor. Speech authentication is done by generating hash values through feature-matrix quantization using the mid-value. Experimental results show that the proposed algorithm is robust to content-preserving operations compared with similar algorithms, and is able to resist common background noise. The algorithm is also computationally efficient, and is able to meet the real-time requirements of speech communication and complete speech authentication quickly.
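The mid-value quantization step can be sketched as binarizing each feature against the median of its row: values above the row median become 1-bits, the rest 0-bits. The feature matrix below is a toy stand-in for the LPCC/LSP/ISP feature tensor.

```python
# Sketch of mid-value (median) quantization of a feature matrix to hash bits.
import statistics

def mid_value_hash(feature_matrix):
    """One bit per entry: 1 if the value exceeds its row median, else 0."""
    bits = []
    for row in feature_matrix:
        med = statistics.median(row)
        bits.extend(1 if v > med else 0 for v in row)
    return bits

features = [[0.2, 0.9, 0.5],
            [1.4, 0.1, 0.7]]
print(mid_value_hash(features))  # -> [0, 1, 0, 1, 0, 0]
```

Thresholding at the median makes the bits invariant to monotone scaling of the features, which is one reason median-style quantizers are robust to content-preserving operations.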

  15. FBC: a flat binary code scheme for fast Manhattan hash retrieval

    NASA Astrophysics Data System (ADS)

    Kong, Yan; Wu, Fuzhang; Gao, Lifa; Wu, Yanjun

    2018-04-01

    Hash coding is a widely used technique in approximate nearest neighbor (ANN) search, especially in document search and multimedia (such as image and video) retrieval. Based on the distance measure used, hash methods are generally classified into two categories: Hamming hashing and Manhattan hashing. Benefitting from better neighborhood structure preservation, Manhattan hashing methods outperform earlier methods in search effectiveness. However, because they use decimal arithmetic operations instead of bit operations, Manhattan hashing is a more time-consuming process, which significantly decreases overall search efficiency. To solve this problem, we present an intuitive hash scheme which uses a Flat Binary Code (FBC) to encode the data points. As a result, the decimal arithmetic used in previous Manhattan hashing can be replaced by the more efficient XOR operation. The final experiments show that, with a reasonable growth in memory space, our FBC achieves an average speedup of more than 80% without any loss of search accuracy compared to state-of-the-art Manhattan hashing methods.
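The reason XOR can stand in for decimal arithmetic is easiest to see with a thermometer-style flat code: encoding quantization level q as q ones followed by zeros makes the Hamming distance between two codes equal the Manhattan distance between their levels. This illustrates the general principle, not necessarily the exact FBC layout from the paper.

```python
# Thermometer (unary) codes: Hamming distance == Manhattan distance of levels.

def thermometer(level, num_levels):
    """Encode an integer level as `level` ones followed by zeros."""
    return [1] * level + [0] * (num_levels - level)

def hamming(a, b):
    """Hamming distance via XOR, the cheap bit operation FBC exploits."""
    return sum(x ^ y for x, y in zip(a, b))

LEVELS = 7
for a in range(LEVELS + 1):
    for b in range(LEVELS + 1):
        assert hamming(thermometer(a, LEVELS), thermometer(b, LEVELS)) == abs(a - b)
print("Hamming on thermometer codes equals Manhattan on levels")
```

The trade-off visible here matches the abstract's claim: the code is longer than a compact binary encoding (memory grows), but distance computation reduces to XOR plus popcount.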

  16. Bin-Hash Indexing: A Parallel Method for Fast Query Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Edward W; Gosink, Luke J.; Wu, Kesheng

    2008-06-27

    This paper presents a new parallel indexing data structure for answering queries. The index, called Bin-Hash, offers extremely high levels of concurrency, and is therefore well-suited for emerging commodity parallel processors, such as multi-cores, cell processors, and general-purpose graphics processing units (GPUs). The Bin-Hash approach first bins the base data, and then partitions and separately stores the values in each bin as a perfect spatial hash table. To answer a query, we first determine whether or not a record satisfies the query conditions based on the bin boundaries. For the bins with records that cannot be resolved, we examine the spatial hash tables. The procedures for examining the bin numbers and the spatial hash tables offer the maximum possible level of concurrency; all records can be evaluated by our procedure independently in parallel. Additionally, our Bin-Hash procedures access much smaller amounts of data than similar parallel methods, such as the projection index. This smaller data footprint is critical for certain parallel processors, like GPUs, where memory resources are limited. To demonstrate the effectiveness of Bin-Hash, we implement it on a GPU using the data-parallel programming language CUDA. The concurrency offered by the Bin-Hash index allows us to fully utilize the GPU's massive parallelism in our work; over 12,000 records can be simultaneously evaluated at any one time. We show that our new query processing method is an order of magnitude faster than current state-of-the-art CPU-based indexing technologies. Additionally, we compare our performance to existing GPU-based projection index strategies.
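The two-stage bin-then-refine query can be sketched as follows: bin boundaries resolve most records outright, and only records in the boundary bin fall back to their exact values. A plain dict stands in for the per-bin perfect spatial hash tables, and the data and boundaries are toy values.

```python
# Sketch of a Bin-Hash-style range query: bins first, exact values only
# for the unresolved boundary bin (toy data; dict in place of a perfect hash).
import bisect

boundaries = [0, 10, 20, 30, 40]          # bin edges
def bin_of(v):
    return bisect.bisect_right(boundaries, v) - 1

data = [3, 12, 17, 25, 33, 38]
exact = {i: v for i, v in enumerate(data)}  # stand-in for per-bin hash tables

def query_less_than(threshold):
    tbin = bin_of(threshold)
    hits, candidates = [], []
    for i, v in enumerate(data):
        b = bin_of(v)
        if b < tbin:        # whole bin below the threshold: resolved by bins alone
            hits.append(i)
        elif b == tbin:     # boundary bin: needs the exact value
            candidates.append(i)
    hits.extend(i for i in candidates if exact[i] < threshold)
    return sorted(hits)

print(query_less_than(26))  # -> [0, 1, 2, 3]
```

Because each record's bin test is independent, the first stage maps naturally onto one GPU thread per record, which is the concurrency the paper exploits.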

  17. Speaker Linking and Applications using Non-Parametric Hashing Methods

    DTIC Science & Technology

    2016-09-08

    clustering method based on hashing—canopy-clustering. We apply this method to a large corpus of speaker recordings, demonstrate performance tradeoffs...and compare to other hashing methods. Index Terms: speaker recognition, clustering, hashing, locality sensitive hashing. 1. Introduction We assume...speaker in our corpus. Second, given a QBE method, how can we perform speaker clustering—each clustering should be a single speaker, and a cluster should

  18. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    PubMed Central

    Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication, a login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient, because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, which is always performed locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and the features it provides. PMID:24892078

  19. A robust and effective smart-card-based remote user authentication mechanism using hash function.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication, a login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient, because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, which is always performed locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and the features it provides.

  20. Robust hashing for 3D models

    NASA Astrophysics Data System (ADS)

    Berchtold, Waldemar; Schäfer, Marcel; Rettig, Michael; Steinebach, Martin

    2014-02-01

    3D models and applications are of utmost interest in both science and industry. As their usage increases, so do their number and thereby the challenge of correctly identifying them. Content identification is commonly done with cryptographic hashes. However, these fail in application scenarios such as computer-aided design (CAD), scientific visualization or video games, because even the smallest alteration of the 3D model, e.g. a conversion or compression operation, massively changes the cryptographic hash as well. Therefore, this work presents a robust hashing algorithm for 3D mesh data. The algorithm applies several different bit extraction methods. They are built to resist desired alterations of the model as well as malicious attacks intended to prevent correct allocation. The different bit extraction methods are tested against each other and, as far as possible, the hashing algorithm is compared to the state of the art. The parameters tested are robustness, security and runtime performance as well as the False Acceptance Rate (FAR) and False Rejection Rate (FRR); a probability calculation for hash collisions is also included. The introduced hashing algorithm is kept adaptive, e.g. in hash length, to serve as a proper tool for all applications in practice.

  1. Unsupervised Deep Hashing With Pseudo Labels for Scalable Image Retrieval.

    PubMed

    Zhang, Haofeng; Liu, Li; Long, Yang; Shao, Ling

    2018-04-01

    In order to achieve efficient similarity search, hash functions are designed to encode images into low-dimensional binary codes with the constraint that similar features will have a short distance in the projected Hamming space. Recently, deep learning-based methods have become more popular and outperform traditional non-deep methods. However, without label information, most state-of-the-art unsupervised deep hashing (DH) algorithms suffer from severe performance degradation. One of the main reasons is that the ad-hoc encoding process cannot properly capture the visual feature distribution. In this paper, we propose a novel unsupervised framework that has two main contributions: 1) we convert the unsupervised DH model into a supervised one by discovering pseudo labels; 2) the framework unifies likelihood maximization, mutual information maximization, and quantization error minimization so that the pseudo labels can maximally preserve the distribution of visual features. Extensive experiments on three popular data sets demonstrate the advantages of the proposed method, which leads to significant performance improvement over the state-of-the-art unsupervised hashing algorithms.

  2. Hash Bit Selection for Nearest Neighbor Search.

    PubMed

    Xianglong Liu; Junfeng He; Shih-Fu Chang

    2017-11-01

    To overcome the barrier of storage and computation when dealing with gigantic-scale data sets, compact hashing has been studied extensively to approximate nearest neighbor search. Despite the recent advances, critical design issues remain open regarding how to select the right features, hashing algorithms, and/or parameter settings. In this paper, we address these issues by posing an optimal hash bit selection problem, in which an optimal subset of hash bits is selected from a pool of candidate bits generated by different features, algorithms, or parameters. Inspired by the optimization criteria used in existing hashing algorithms, we adopt bit reliability and complementarity as the selection criteria, which can be carefully tailored for hashing performance in different tasks. Then, the bit selection solution is discovered by finding the best tradeoff between search accuracy and time using a modified dynamic programming method. To further reduce the computational complexity, we employ the pairwise relationship among hash bits to approximate the high-order independence property, and formulate it as an efficient quadratic programming method that is theoretically equivalent to the normalized dominant set problem in a vertex- and edge-weighted graph. Extensive large-scale experiments have been conducted under several important application scenarios of hash techniques, where our bit selection framework achieves superior performance over both naive selection methods and state-of-the-art hashing algorithms, with significant relative accuracy gains ranging from 10% to 50%.
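A greatly simplified greedy stand-in for the selection idea (reliability plus complementarity) might look like this: repeatedly pick the bit with the best reliability, penalized by its correlation with bits already chosen. The scores, correlation matrix, and penalty weight are illustrative assumptions, not the paper's quadratic-programming formulation.

```python
# Greedy sketch of hash bit selection by reliability and complementarity
# (toy scores; the paper uses a QP / dominant-set formulation instead).

def select_bits(reliability, correlation, k, penalty=1.0):
    """Pick k bits, each maximizing reliability minus redundancy with picks so far."""
    chosen = []
    while len(chosen) < k:
        best, best_score = None, float("-inf")
        for i in range(len(reliability)):
            if i in chosen:
                continue
            score = reliability[i] - penalty * sum(correlation[i][j] for j in chosen)
            if score > best_score:
                best, best_score = i, score
        chosen.append(best)
    return chosen

reliability = [0.9, 0.85, 0.3]
correlation = [[1.0, 0.8, 0.1],
               [0.8, 1.0, 0.1],
               [0.1, 0.1, 1.0]]
# bit 1 is reliable but redundant with bit 0, so the complementary bit 2 wins
print(select_bits(reliability, correlation, k=2))  # -> [0, 2]
```

The example shows why complementarity matters: pure reliability ranking would pick bits 0 and 1, which carry nearly the same information.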

  3. Deep classification hashing for person re-identification

    NASA Astrophysics Data System (ADS)

    Wang, Jiabao; Li, Yang; Zhang, Xiancai; Miao, Zhuang; Tao, Gang

    2018-04-01

    With the development of surveillance in public spaces, person re-identification becomes more and more important. Large-scale databases call for efficient computation and storage, and hashing is one of the most important techniques. In this paper, we propose a new deep classification hashing network that introduces a binary approximation layer into traditional ImageNet pre-trained CNN models. It outputs binary-approximate features, which can be easily quantized into binary hash codes for Hamming similarity comparison. Experiments show that our deep hashing method can outperform the state-of-the-art methods on the public CUHK03 and Market1501 datasets.

  4. Self-Supervised Video Hashing With Hierarchical Binary Auto-Encoder.

    PubMed

    Song, Jingkuan; Zhang, Hanwang; Li, Xiangpeng; Gao, Lianli; Wang, Meng; Hong, Richang

    2018-07-01

    Existing video hash functions are built on three isolated stages: frame pooling, relaxed learning, and binarization, which have not adequately explored the temporal order of video frames in a joint binary optimization model, resulting in severe information loss. In this paper, we propose a novel unsupervised video hashing framework dubbed self-supervised video hashing (SSVH), which is able to capture the temporal nature of videos in an end-to-end learning to hash fashion. We specifically address two central problems: 1) how to design an encoder-decoder architecture to generate binary codes for videos and 2) how to equip the binary codes with the ability of accurate video retrieval. We design a hierarchical binary auto-encoder to model the temporal dependencies in videos with multiple granularities, and embed the videos into binary codes with less computations than the stacked architecture. Then, we encourage the binary codes to simultaneously reconstruct the visual content and neighborhood structure of the videos. Experiments on two real-world data sets show that our SSVH method can significantly outperform the state-of-the-art methods and achieve the current best performance on the task of unsupervised video retrieval.

  5. Self-Supervised Video Hashing With Hierarchical Binary Auto-Encoder

    NASA Astrophysics Data System (ADS)

    Song, Jingkuan; Zhang, Hanwang; Li, Xiangpeng; Gao, Lianli; Wang, Meng; Hong, Richang

    2018-07-01

    Existing video hash functions are built on three isolated stages: frame pooling, relaxed learning, and binarization, which have not adequately explored the temporal order of video frames in a joint binary optimization model, resulting in severe information loss. In this paper, we propose a novel unsupervised video hashing framework dubbed Self-Supervised Video Hashing (SSVH), that is able to capture the temporal nature of videos in an end-to-end learning-to-hash fashion. We specifically address two central problems: 1) how to design an encoder-decoder architecture to generate binary codes for videos; and 2) how to equip the binary codes with the ability of accurate video retrieval. We design a hierarchical binary autoencoder to model the temporal dependencies in videos with multiple granularities, and embed the videos into binary codes with less computations than the stacked architecture. Then, we encourage the binary codes to simultaneously reconstruct the visual content and neighborhood structure of the videos. Experiments on two real-world datasets (FCVID and YFCC) show that our SSVH method can significantly outperform the state-of-the-art methods and achieve the currently best performance on the task of unsupervised video retrieval.

  6. Object-Location-Aware Hashing for Multi-Label Image Retrieval via Automatic Mask Learning.

    PubMed

    Huang, Chang-Qin; Yang, Shang-Ming; Pan, Yan; Lai, Han-Jiang

    2018-09-01

    Learning-based hashing is a leading approach of approximate nearest neighbor search for large-scale image retrieval. In this paper, we develop a deep supervised hashing method for multi-label image retrieval, in which we propose to learn a binary "mask" map that can identify the approximate locations of objects in an image, and then use this binary "mask" map to obtain length-limited hash codes that mainly focus on an image's objects while ignoring the background. The proposed deep architecture consists of four parts: 1) a convolutional sub-network to generate effective image features; 2) a binary "mask" sub-network to identify image objects' approximate locations; 3) a weighted average pooling operation based on the binary "mask" to obtain feature representations and hash codes that pay most attention to foreground objects but ignore the background; and 4) the combination of a triplet ranking loss designed to preserve relative similarities among images and a cross entropy loss defined on image labels. We conduct comprehensive evaluations on four multi-label image data sets. The results indicate that the proposed hashing method achieves superior performance gains over the state-of-the-art supervised or unsupervised hashing baselines.
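The mask-weighted pooling step (part 3) can be sketched as averaging feature responses only where the binary mask marks foreground. Shapes and values below are toy assumptions; in the real model both the features and the mask are learned end-to-end.

```python
# Sketch of binary-mask weighted average pooling (toy 1-channel feature map).

def masked_average_pool(feature_map, mask):
    """feature_map: C x H x W nested lists; mask: H x W of 0/1. Returns a C-vector."""
    total = sum(sum(row) for row in mask)
    pooled = []
    for channel in feature_map:
        s = sum(channel[i][j] * mask[i][j]
                for i in range(len(mask)) for j in range(len(mask[0])))
        pooled.append(s / total if total else 0.0)
    return pooled

feature_map = [[[1.0, 5.0],
                [3.0, 7.0]]]          # one channel, 2x2
mask = [[0, 1],
        [0, 1]]                       # foreground = right column
print(masked_average_pool(feature_map, mask))  # -> [6.0]
```

Compared with global average pooling (which would give 4.0 here), the mask keeps background activations out of the representation that gets hashed.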

  7. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and

  8. Compact binary hashing for music retrieval

    NASA Astrophysics Data System (ADS)

    Seo, Jin S.

    2014-03-01

    With the huge volume of music clips available for protection, browsing, and indexing, increased attention is being paid to retrieving the information content of music archives. Music-similarity computation is an essential building block for browsing, retrieval, and indexing of digital music archives. In practice, as the number of songs available for searching and indexing increases, the storage cost in retrieval systems becomes a serious problem. This paper deals with the storage problem by extending the supervector concept with binary hashing. We utilize similarity-preserving binary embedding to generate a hash code from the supervector of each music clip. In particular, we compare the performance of various binary hashing methods on music retrieval tasks over the widely used genre dataset and an in-house singer dataset. Through the evaluation, we find an effective way of generating hash codes for music similarity estimation that improves retrieval performance.
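One standard way to obtain a similarity-preserving binary embedding of a supervector is random-hyperplane (sign) hashing, where nearby vectors are likely to fall on the same side of each random plane and thus share bits. This is a generic sketch, not necessarily one of the specific methods the paper compares; the supervectors are toy values.

```python
# Sketch of random-hyperplane (sign) hashing of supervectors (toy data).
import random

def random_hyperplane_hash(vec, num_bits, seed=0):
    """One bit per random hyperplane: the sign of the projection."""
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in vec] for _ in range(num_bits)]
    return [1 if sum(p * v for p, v in zip(plane, vec)) >= 0 else 0
            for plane in planes]

song_a = [0.9, 0.1, 0.4]     # toy supervector
song_b = [0.8, 0.2, 0.5]     # similar clip
song_c = [-0.9, -0.1, -0.4]  # dissimilar clip

ha, hb, hc = (random_hyperplane_hash(v, 16) for v in (song_a, song_b, song_c))
dist = lambda x, y: sum(a != b for a, b in zip(x, y))
print(dist(ha, hb) <= dist(ha, hc))
```

Because the same seed is used for all clips, every supervector is projected onto the same planes, so Hamming distance between codes approximates the angle between supervectors.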

  9. Algorithm That Synthesizes Other Algorithms for Hashing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2010-01-01

    An algorithm that includes a collection of several subalgorithms has been devised as a means of synthesizing still other algorithms (which could include computer code) that utilize hashing to determine whether an element (typically, a number or other datum) is a member of a set (typically, a list of numbers). Each subalgorithm synthesizes an algorithm (e.g., a block of code) that maps a static set of key hashes to a somewhat linear monotonically increasing sequence of integers. The goal in formulating this mapping is to cause the length of the sequence thus generated to be as close as practicable to the original length of the set and thus to minimize gaps between the elements. The advantage of the approach embodied in this algorithm is that it completely avoids the traditional approach of hash-key look-ups that involve either secondary hash generation and look-up or further searching of a hash table for a desired key in the event of collisions. This algorithm guarantees that it will never be necessary to perform a search or to generate a secondary key in order to determine whether an element is a member of a set. This algorithm further guarantees that any algorithm that it synthesizes can be executed in constant time. To enforce these guarantees, the subalgorithms are formulated to employ a set of techniques, each of which works very effectively covering a certain class of hash-key values. These subalgorithms are of two types, summarized as follows: Given a list of numbers, try to find one or more solutions in which, if each number is shifted to the right by a constant number of bits and then masked with a rotating mask that isolates a set of bits, a unique number is thereby generated. In a variant of the foregoing procedure, omit the masking. Try various combinations of shifting, masking, and/or offsets until the solutions are found. From the set of solutions, select the one that provides the greatest compression for the representation and is executable in the
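The shift-and-mask subalgorithm described above can be sketched as a brute-force search for parameters under which every key maps to a distinct small integer, yielding a constant-time membership test with no collision handling. The search ranges and keys below are illustrative, not the synthesized code the article refers to.

```python
# Sketch: search for a right-shift and mask that map a static key set to
# unique values, then use that mapping for constant-time membership tests.

def find_shift_mask(keys, max_shift=16, max_mask_bits=8):
    """Try shift/mask combinations until every key maps to a distinct value."""
    for shift in range(max_shift):
        for bits in range(1, max_mask_bits + 1):
            mask = (1 << bits) - 1
            mapped = {(k >> shift) & mask for k in keys}
            if len(mapped) == len(keys):      # all unique: a valid solution
                return shift, mask
    return None

keys = [0x10, 0x24, 0x38, 0x4C]
shift, mask = find_shift_mask(keys)
table = {(k >> shift) & mask: k for k in keys}

def member(x):
    """Constant time: one shift, one mask, one lookup, no secondary hashing."""
    return table.get((x >> shift) & mask) == x

print(member(0x24), member(0x25))  # -> True False
```

Storing the original key in the table and comparing it on lookup is what lets non-members that collide with a member's slot be rejected without any search.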

  10. Meet-in-the-Middle Preimage Attacks on Hash Modes of Generalized Feistel and Misty Schemes with SP Round Function

    NASA Astrophysics Data System (ADS)

    Moon, Dukjae; Hong, Deukjo; Kwon, Daesung; Hong, Seokhie

    We assume that the domain extender is the Merkle-Damgård (MD) scheme and the message is padded with a '1' and a minimum number of '0's, followed by fixed-size length information, so that the length of the padded message is a multiple of the block length. Under this assumption, we analyze the security of the hash mode when the compression function follows the Davies-Meyer (DM) scheme and the underlying block cipher is one of the plain Feistel or Misty schemes or the generalized Feistel or Misty schemes with a Substitution-Permutation (SP) round function. We base this work on Meet-in-the-Middle (MitM) preimage attack techniques, and develop several useful initial structures.

  11. Simultaneous binary hash and features learning for image retrieval

    NASA Astrophysics Data System (ADS)

    Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.

    2016-05-01

    Content-based image retrieval systems have plenty of applications in the modern world. The most important one is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique, which is the main reason this kind of automatic image processing has attracted so much attention in recent years. Despite considerable progress in the field, semantically meaningful image retrieval still remains a challenging task. The main issue is the need to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach provides a mapping from pixel-based image representations to the hash-value space while trying to preserve as much semantic image content as possible. We use deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing ones is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework for data-dependent image hashing presented in this paper is based on two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results compared to other state-of-the-art methods.

  12. Large-scale Cross-modality Search via Collective Matrix Factorization Hashing.

    PubMed

    Ding, Guiguang; Guo, Yuchen; Zhou, Jile; Gao, Yue

    2016-09-08

    By transforming data into binary representations, i.e., Hashing, we can perform high-speed search with low storage cost, and thus Hashing has attracted increasing research interest in recent years. Recently, how to generate Hashcodes for multimodal data (e.g., images with textual tags, documents with photos, etc.) for large-scale cross-modality search (e.g., searching semantically related images in a database for a document query) has become an important research issue because of the fast growth of multimodal data on the Web. To address this issue, a novel framework for multimodal Hashing is proposed, termed Collective Matrix Factorization Hashing (CMFH). The key idea of CMFH is to learn unified Hashcodes for the different modalities of one multimodal instance in a shared latent semantic space in which different modalities can be effectively connected. Therefore, accurate cross-modality search is supported. Based on the general framework, we extend it to the unsupervised scenario, where it tries to preserve the Euclidean structure, and to the supervised scenario, where it fully exploits the label information of the data. The corresponding theoretical analysis and optimization algorithms are given. We conducted comprehensive experiments on three benchmark datasets for cross-modality search. The experimental results demonstrate that CMFH can significantly outperform several state-of-the-art cross-modality Hashing methods, which validates the effectiveness of the proposed CMFH.

  13. Practical security and privacy attacks against biometric hashing using sparse recovery

    NASA Astrophysics Data System (ADS)

    Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan

    2016-12-01

    Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password, in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash, and one method which can find the closest biometric data (i.e., face image) in a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction, for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that compressed sensing recovery techniques enable more severe security threats. In addition, we present privacy attacks which reconstruct a biometric image that resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leakage under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.
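
    The two-factor construction described above can be sketched with the generic BioHashing recipe (key-seeded random projection followed by 1-bit quantization); the function name, dimensions, and quantization rule here are illustrative assumptions, not the exact scheme attacked in the paper:

```python
import numpy as np

def biohash(feature, secret_key, n_bits=32):
    """Toy BioHash: project the biometric feature with a matrix seeded by
    the user's secret key, then binarize by sign (1-bit quantization)."""
    rng = np.random.default_rng(secret_key)              # secret key seeds the projection
    proj = rng.standard_normal((n_bits, feature.size))
    return (proj @ feature > 0).astype(np.uint8)

rng = np.random.default_rng(0)
face_feature = rng.standard_normal(128)                  # stand-in for a real face feature
template = biohash(face_feature, secret_key=42)

# With the secret key known, each template bit is one 1-bit linear measurement
# of the feature -- exactly the 1-bit compressed-sensing setting the attacks use.
print(template.size, "bits; deterministic:", (biohash(face_feature, 42) == template).all())
```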

  14. Toward Optimal Manifold Hashing via Discrete Locally Linear Embedding.

    PubMed

    Rongrong Ji; Hong Liu; Liujuan Cao; Di Liu; Yongjian Wu; Feiyue Huang

    2017-11-01

    Binary code learning, also known as hashing, has received increasing attention in large-scale visual search. By transforming high-dimensional features to binary codes, the original Euclidean distance is approximated via the Hamming distance. More recently, it has been advocated that it is the manifold distance, rather than the Euclidean distance, that should be preserved in the Hamming space. However, it remains an open problem to directly preserve the manifold structure by hashing. In particular, one first needs to build the locally linear embedding in the original feature space and then quantize such an embedding to binary codes. Such two-step coding is problematic and suboptimal. Besides, the off-line learning is extremely time and memory consuming, as it needs to calculate the similarity matrix of the original data. In this paper, we propose a novel hashing algorithm, termed discrete locally linear embedding hashing (DLLH), which well addresses the above challenges. DLLH directly reconstructs the manifold structure in the Hamming space, learning optimal hash codes that maintain the local linear relationships of data points. To learn discrete locally linear embedding codes, we further propose a discrete optimization algorithm with an iterative parameter-updating scheme. Moreover, an anchor-based acceleration scheme, termed Anchor-DLLH, is introduced, which approximates the large similarity matrix by the product of two low-rank matrices. Experimental results on three widely used benchmark data sets, i.e., CIFAR10, NUS-WIDE, and YouTube Face, show the superior performance of the proposed DLLH over state-of-the-art approaches.

  15. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  16. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  17. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  18. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  19. SnapDock—template-based docking by Geometric Hashing

    PubMed Central

    Estrin, Michael; Wolfson, Haim J.

    2017-01-01

    Abstract Motivation: A highly efficient template-based protein–protein docking algorithm, nicknamed SnapDock, is presented. It employs a Geometric Hashing-based structural alignment scheme to align the target proteins to the interfaces of non-redundant protein–protein interface libraries. Docking of a pair of proteins utilizing the 22 600-interface PIFACE library is performed in < 2 min on average. A flexible version of the algorithm allowing hinge motion in one of the proteins is presented as well. Results: To evaluate the performance of the algorithm, a blind re-modelling of 3547 PDB complexes uploaded after the PIFACE publication was performed, with a success ratio of about 35%. Interestingly, a similar experiment with the template-free PatchDock docking algorithm yielded a success rate of about 23%, with roughly 1/3 of the solutions different from those of SnapDock. Consequently, the combination of the two methods gave a 42% success ratio. Availability and implementation: A web server of the application is under development. Contact: michaelestrin@gmail.com or wolfson@tau.ac.il PMID:28881968

  20. Deeply learnt hashing forests for content based image retrieval in prostate MR images

    NASA Astrophysics Data System (ADS)

    Shah, Amit; Conjeti, Sailesh; Navab, Nassir; Katouzian, Amin

    2016-03-01

    The deluge in the size and heterogeneity of medical image databases necessitates content-based retrieval systems for their efficient organization. In this paper, we propose such a system to retrieve prostate MR images which share similarities in appearance and content with a query image. We introduce deeply learnt hashing forests (DL-HF) for this image retrieval task. DL-HF effectively leverages the semantic descriptiveness of deep-learnt convolutional neural networks, used in conjunction with hashing forests, which are unsupervised random forests. DL-HF hierarchically parses the deep-learnt feature space to encode subspaces with compact binary code words. We propose a similarity-preserving feature descriptor called the Parts Histogram, which is derived from DL-HF. Correlation defined on this descriptor is used as a similarity metric for retrieval from the database. Validation on a publicly available multi-center prostate MR image database established the validity of the proposed approach. The proposed method is fully automated without any user interaction and does not depend on any external image standardization such as image normalization and registration. This image retrieval method is generalizable and is well-suited for retrieval in heterogeneous databases, other imaging modalities, and other anatomies.

  1. A hash based mutual RFID tag authentication protocol in telecare medicine information system.

    PubMed

    Srivastava, Keerti; Awasthi, Amit K; Kaul, Sonam D; Mittal, R C

    2015-01-01

    Radio Frequency Identification (RFID) is a technology with multidimensional applications that reduce the complexity of daily life. RFID is in enormous use everywhere: access control, transportation, real-time inventory, asset management, automated payment systems, etc. Recently, this technology has been spreading into healthcare environments, where potential applications include patient monitoring, object traceability, and drug administration systems. In this paper, we propose a secure RFID-based protocol for the medical sector. This protocol is based on a hash operation with a synchronized secret. The protocol is safe against active and passive attacks such as forgery, traceability, replay, and de-synchronization attacks.

  2. Performance of hashed cache data migration schemes on multicomputers

    NASA Technical Reports Server (NTRS)

    Hiranandani, Seema; Saltz, Joel; Mehrotra, Piyush; Berryman, Harry

    1991-01-01

    After conducting an examination of several data-migration mechanisms which permit an explicit and controlled mapping of data to memory, a set of schemes for storage and retrieval of off-processor array elements is experimentally evaluated and modeled. All schemes considered have their basis in the use of hash tables for efficient access of nonlocal data. The techniques in question are those of hashed cache, partial enumeration, and full enumeration; in these, nonlocal data are stored in hash tables, so that the operative difference lies in the amount of memory used by each scheme and in the retrieval mechanism used for nonlocal data.

  3. Optimal hash arrangement of tentacles in jellyfish

    NASA Astrophysics Data System (ADS)

    Okabe, Takuya; Yoshimura, Jin

    2016-06-01

    At first glance, the trailing tentacles of a jellyfish appear to be randomly arranged. However, close examination of medusae has revealed that the arrangement and developmental order of the tentacles obey a mathematical rule. Here, we show that medusa jellyfish adopt the best strategy to achieve the most uniform distribution of a variable number of tentacles. The observed order of tentacles is a real-world example of an optimal hashing algorithm known as Fibonacci hashing in computer science.
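
    In computer science terms, the rule described above corresponds to multiplicative hashing with the golden ratio. A minimal sketch using Knuth's 64-bit constant (the helper name is ours):

```python
def fibonacci_hash(key, bits):
    """Fibonacci hashing: multiply by 2**64/phi (phi = golden ratio) and
    keep the top `bits` bits. Consecutive keys land maximally spread out,
    the same golden-angle property as the tentacle arrangement."""
    GOLDEN = 11400714819323198485            # Knuth's constant, round(2**64 / phi)
    return ((key * GOLDEN) & 0xFFFFFFFFFFFFFFFF) >> (64 - bits)

# Consecutive keys scatter evenly over the 16 slots instead of clustering:
print([fibonacci_hash(k, 4) for k in range(8)])  # → [0, 9, 3, 13, 7, 1, 11, 5]
```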

  4. Exploiting the HASH Planetary Nebula Research Platform

    NASA Astrophysics Data System (ADS)

    Parker, Quentin A.; Bojičić, Ivan; Frew, David J.

    2017-10-01

    The HASH (Hong Kong/ AAO/ Strasbourg/ Hα) planetary nebula research platform is a unique data repository with a graphical interface and SQL capability that offers the community powerful, new ways to undertake Galactic PN studies. HASH currently contains multi-wavelength images, spectra, positions, sizes, morphologies and other data whenever available for 2401 true, 447 likely, and 692 possible Galactic PNe, for a total of 3540 objects. An additional 620 Galactic post-AGB stars, pre-PNe, and PPN candidates are included. All objects were classified and evaluated following the precepts and procedures established and developed by our group over the last 15 years. The complete database contains over 6,700 Galactic objects including the many mimics and related phenomena previously mistaken or confused with PNe. Curation and updating currently occurs on a weekly basis to keep the repository as up to date as possible until the official release of HASH v1 planned in the near future.

  5. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    PubMed Central

    Jung, Jaewook; Sohn, Gunho; Bang, Kiin; Wichmann, Andreas; Armenakis, Costas; Kada, Martin

    2016-01-01

    A city is a dynamic entity whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response, and autonomous navigation. The concept of continuous city modeling is to progressively reconstruct city models by accommodating changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using a context-based geometric hashing (CGH) method to align a single image with existing 3D building models. This model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measurement and matching; and (3) estimating the exterior orientation parameters (EOPs) of a single image. For feature extraction, we propose two types of matching cues: edged corner features representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then determined by maximizing a matching cost encoding the contextual similarity between matching candidates. The final matched corners are used for adjusting the EOPs of the single airborne image by the least-squares method based on collinearity equations. The results show that acceptable accuracy in the EOPs of a single image is achievable using the proposed registration approach as an alternative to a labor-intensive manual registration process. PMID:27338410

  6. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.

    PubMed

    Zeng, Ping; Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on, all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications.

  7. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol

    PubMed Central

    Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on—all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications. PMID:28399157
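
    The core idea shared by HEM and MH, matching patterns against a fixed starting position by comparing fixed-length byte blocks as integers in a hash table, can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and it omits the binary-table stage of MH:

```python
def build_index(patterns):
    """Group patterns by length; each bucket stores the patterns of that
    length as integers, so a candidate match is a numeric hash lookup
    (the symbol-space to digital-space transformation) rather than a
    character-by-character comparison."""
    index = {}
    for p in patterns:
        index.setdefault(len(p), set()).add(int.from_bytes(p.encode(), "big"))
    return index

def match_prefix(index, text):
    """Return the length of the longest pattern matching `text` at position 0."""
    best = 0
    for length, bucket in index.items():
        if length <= len(text) and int.from_bytes(text[:length].encode(), "big") in bucket:
            best = max(best, length)
    return best

idx = build_index(["/api/", "/api/v2/", "/static/"])
print(match_prefix(idx, "/api/v2/users"))  # → 8 ("/api/v2/" is the longest match)
print(match_prefix(idx, "/img/logo.png"))  # → 0 (no pattern matches)
```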

  8. Discrete cosine transform and hash functions toward implementing a (robust-fragile) watermarking scheme

    NASA Astrophysics Data System (ADS)

    Al-Mansoori, Saeed; Kunhu, Alavi

    2013-10-01

    This paper proposes a blind multi-watermarking scheme based on designing two back-to-back encoders. The first encoder is implemented to embed a robust watermark into remote sensing imagery by applying a Discrete Cosine Transform (DCT) approach. Such a watermark is used in many applications to protect the copyright of the image. The second encoder embeds a fragile watermark using the SHA-1 hash function. The purpose of embedding a fragile watermark is to prove the authenticity of the image (i.e., that it is tamper-proof). The proposed technique was developed in response to new challenges with piracy of remote sensing imagery ownership, which led researchers to look for different means to secure the ownership of satellite imagery and prevent the illegal use of these resources. Therefore, the Emirates Institution for Advanced Science and Technology (EIAST) proposed utilizing an existing data security concept by embedding a digital signature, or "watermark", into DubaiSat-1 satellite imagery. In this study, DubaiSat-1 images with 2.5 meter resolution are used as a cover and a colored EIAST logo is used as a watermark. In order to evaluate the robustness of the proposed technique, several attacks are applied, such as JPEG compression, rotation and synchronization attacks. Furthermore, tampering attacks are applied to prove image authenticity.

  9. Matching CCD images to a stellar catalog using locality-sensitive hashing

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Yu, Jia-Zong; Peng, Qing-Yu

    2018-02-01

    Using a subset of observed stars in a CCD image to find their corresponding matched stars in a stellar catalog is an important issue in astronomical research. Subgraph isomorphism-based algorithms are the most widely used methods in star catalog matching. When more subgraph features are provided, the CCD images are recognized better. However, when the navigation feature database is large, the method requires more time to match the observing model. To solve this problem, this study investigates and improves subgraph isomorphism matching algorithms. We present an algorithm based on a locality-sensitive hashing technique, which allocates quadrilateral models in the navigation feature database into different hash buckets and reduces the search range to the bucket in which the observed quadrilateral model is located. Experimental results indicate the effectiveness of our method.
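
    The bucketing step the paper relies on can be illustrated with a generic random-hyperplane LSH sketch (the descriptor dimensionality and helper names are assumptions; the paper hashes quadrilateral star models rather than random vectors):

```python
import numpy as np

def lsh_buckets(features, n_planes=8, seed=0):
    """Random-hyperplane LSH: bucket vectors by the sign pattern of a few
    random projections, so nearby vectors tend to share a bucket and a
    query searches only its own bucket instead of the whole database."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_planes, features.shape[1]))
    bits = features @ planes.T > 0                   # (n, n_planes) sign pattern
    keys = bits @ (1 << np.arange(n_planes))         # pack the bits into an integer key
    buckets = {}
    for i, k in enumerate(keys):
        buckets.setdefault(int(k), []).append(i)
    return buckets, planes

def query_bucket(buckets, planes, q):
    """Return the indices stored in the query's bucket (the candidate set)."""
    key = int((q @ planes.T > 0) @ (1 << np.arange(planes.shape[0])))
    return buckets.get(key, [])

rng = np.random.default_rng(1)
db = rng.standard_normal((1000, 4))      # stand-ins for quadrilateral descriptors
buckets, planes = lsh_buckets(db)
hits = query_bucket(buckets, planes, db[17])
print(17 in hits)                        # → True: the model hashes into its own bucket
```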

  10. 7 CFR 51.1433 - U.S. Commercial Halves.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Halves. 51.1433 Section 51.1433 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1433 U.S. Commercial Halves. The...

  11. Internet traffic load balancing using dynamic hashing with flow volume

    NASA Astrophysics Data System (ADS)

    Jo, Ju-Yeon; Kim, Yoohwan; Chao, H. Jonathan; Merat, Francis L.

    2002-07-01

    Sending IP packets over multiple parallel links is in extensive use in today's Internet and its use is growing due to its scalability, reliability and cost-effectiveness. To maximize the efficiency of parallel links, load balancing is necessary among the links, but it may cause the problem of packet reordering. Since packet reordering impairs TCP performance, it is important to reduce the amount of reordering. Hashing offers a simple solution to keep the packet order by sending a flow over a unique link, but static hashing does not guarantee an even distribution of the traffic amount among the links, which could lead to packet loss under heavy load. Dynamic hashing offers some degree of load balancing but suffers from load fluctuations and excessive packet reordering. To overcome these shortcomings, we have enhanced the dynamic hashing algorithm to utilize the flow volume information in order to reassign only the appropriate flows. This new method, called dynamic hashing with flow volume (DHFV), eliminates unnecessary flow reassignments of small flows and achieves load balancing very quickly without load fluctuation by accurately predicting the amount of transferred load between the links. In this paper we provide the general framework of DHFV and address the challenges in implementing DHFV. We then introduce two algorithms of DHFV with different flow selection strategies and show their performances through simulation.
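
    The static hashing baseline that DHFV improves upon can be sketched in a few lines: every packet of a flow hashes to the same link, preserving order, while the load split across links is left to chance (the address/port tuple format and modulo rule here are conventional choices, not taken from the paper):

```python
import hashlib

def link_for_flow(src, dst, sport, dport, n_links):
    """Static hashing: hash the flow's address/port tuple and take it modulo
    the number of links. Every packet of the flow gets the same link, so
    per-flow packet order is preserved; balancing quality is whatever the
    hash happens to give, which is what DHFV's volume-aware reassignment
    of selected heavy flows corrects."""
    key = f"{src}:{sport}-{dst}:{dport}".encode()
    return int.from_bytes(hashlib.sha1(key).digest()[:4], "big") % n_links

# Every packet of this TCP flow is sent over the same link:
link = link_for_flow("10.0.0.1", "198.51.100.7", 5555, 80, n_links=4)
print(link)
```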

  12. Improved One-Way Hash Chain and Revocation Polynomial-Based Self-Healing Group Key Distribution Schemes in Resource-Constrained Wireless Networks

    PubMed Central

    Chen, Huifang; Xie, Lei

    2014-01-01

    Self-healing group key distribution (SGKD) aims to deal with the key distribution problem over an unreliable wireless network. In this paper, we investigate the SGKD issue in resource-constrained wireless networks. We propose two improved SGKD schemes using the one-way hash chain (OHC) and the revocation polynomial (RP), the OHC&RP-SGKD schemes. In the proposed OHC&RP-SGKD schemes, by introducing the unique session identifier and binding the joining time with the capability of recovering previous session keys, the problem of the collusion attack between revoked users and new joined users in existing hash chain-based SGKD schemes is resolved. Moreover, novel methods for utilizing the one-way hash chain and constructing the personal secret, the revocation polynomial and the key updating broadcast packet are presented. Hence, the proposed OHC&RP-SGKD schemes eliminate the limitation of the maximum allowed number of revoked users on the maximum allowed number of sessions, increase the maximum allowed number of revoked/colluding users, and reduce the redundancy in the key updating broadcast packet. Performance analysis and simulation results show that the proposed OHC&RP-SGKD schemes are practical for resource-constrained wireless networks in bad environments, where a strong collusion attack resistance is required and many users could be revoked. PMID:25529204
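
    The one-way hash chain primitive used by these schemes is simple to illustrate: the chain is generated backwards from a seed, its last element is published as an anchor, and later elements are revealed one per session and verified by re-hashing (a generic sketch with SHA-256; the paper's construction adds session identifiers and revocation polynomials on top):

```python
import hashlib

def make_chain(seed, length):
    """Generate the chain forward from the seed, then reverse it: the first
    element is the public anchor, and each later element hashes to the one
    before it, so it can be verified after release but never predicted."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]

chain = make_chain(b"group-key-seed", 5)
anchor = chain[0]                                   # published in advance
# Session 1 reveals chain[1]; anyone re-hashes it to check it against the anchor:
print(hashlib.sha256(chain[1]).digest() == anchor)  # → True
```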

  13. 7 CFR 51.1437 - Size classifications for halves.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... halves per pound shall be based upon the weight of half-kernels after all pieces, particles and dust... specified range. (d) Tolerances for pieces, particles, and dust. In order to allow for variations incident..., particles, and dust: Provided, That not more than one-third of this amount, or 5 percent, shall be allowed...

  14. 7 CFR 51.1437 - Size classifications for halves.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... halves per pound shall be based upon the weight of half-kernels after all pieces, particles and dust... specified range. (d) Tolerances for pieces, particles, and dust. In order to allow for variations incident..., particles, and dust: Provided, That not more than one-third of this amount, or 5 percent, shall be allowed...

  15. Supervised graph hashing for histopathology image retrieval and classification.

    PubMed

    Shi, Xiaoshuang; Xing, Fuyong; Xu, KaiDi; Xie, Yuanpu; Su, Hai; Yang, Lin

    2017-12-01

    In pathology image analysis, the morphological characteristics of cells are critical for grading many diseases. With the development of cell detection and segmentation techniques, it is possible to extract cell-level information for further analysis in pathology images. However, it is challenging to conduct efficient analysis of cell-level information on a large-scale image dataset because each image usually contains hundreds or thousands of cells. In this paper, we propose a novel image-retrieval-based framework for large-scale pathology image analysis. For each image, we encode each cell into binary codes to generate an image representation using a novel graph-based hashing model, and then conduct image retrieval by applying a group-to-group matching method for similarity measurement. In order to improve both computational efficiency and memory requirements, we further introduce matrix factorization into the hashing model for scalable image retrieval. The proposed framework is extensively validated with thousands of lung cancer images, and it achieves 97.98% classification accuracy and 97.50% retrieval precision with all cells of each query image used. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. 9 CFR 319.302 - Hash.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CERTIFICATION DEFINITIONS AND STANDARDS OF IDENTITY OR COMPOSITION Canned, Frozen, or Dehydrated Meat Food... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hash. 319.302 Section 319.302 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION...

  17. 9 CFR 319.302 - Hash.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CERTIFICATION DEFINITIONS AND STANDARDS OF IDENTITY OR COMPOSITION Canned, Frozen, or Dehydrated Meat Food... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hash. 319.302 Section 319.302 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION...

  18. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient, as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that their scheme has several drawbacks: (1) incorrect password change phase, (2) failure to preserve the user anonymity property, (3) failure to establish a secret session key between a legal user and the server, (4) failure to protect against the strong replay attack, and (5) lack of rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including replay and man-in-the-middle attacks. Our scheme is also efficient compared to Awasthi-Srivastava's scheme.

  19. System using data compression and hashing adapted for use for multimedia encryption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffland, Douglas R

    2011-07-12

    A system and method is disclosed for multimedia encryption. Within the system of the present invention, a data compression module receives and compresses a media signal into a compressed data stream. A data acquisition module receives and selects a set of data from the compressed data stream. And, a hashing module receives and hashes the set of data into a keyword. The method of the present invention includes the steps of compressing a media signal into a compressed data stream; selecting a set of data from the compressed data stream; and hashing the set of data into a keyword.
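
    The three-module pipeline disclosed above (compress, select, hash) can be sketched generically; the compression codec, selection rule, and keyword length below are placeholder assumptions, since the abstract does not specify them:

```python
import hashlib
import zlib

def derive_keyword(media_bytes, offsets, length=16):
    """Compress the media signal, select a set of bytes from the compressed
    stream, and hash the selection into a keyword."""
    stream = zlib.compress(media_bytes)                         # data compression module
    selected = bytes(stream[i % len(stream)] for i in offsets)  # data acquisition module
    return hashlib.sha256(selected).hexdigest()[:length]        # hashing module

keyword = derive_keyword(b"raw audio or video frames..." * 64, offsets=range(0, 256, 8))
print(keyword)  # the same media and selection rule always yield the same keyword
```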

  20. Improving Sector Hash Carving with Rule-Based and Entropy-Based Non-Probative Block Filters

    DTIC Science & Technology

    2015-03-01

    0x20 exceeds the histogram rule's threshold of 256 instances of a single 4-byte value. The 0x20 bytes are part of an Extensible Metadata Platform (XMP) ... block consists of data separated by NULL bytes of padding. The histogram rule is triggered for the block because the block contains more than 256 4 ... sdash can reduce the rate of false positive matches. After characteristic features have been selected, the features are hashed using SHA-1, which creates

  1. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... combination, are salt, sugar (sucrose or dextrose), spice, and flavoring, including essential oils, oleoresins, and other spice extractives. (b) Corned beef hash may contain one or more of the following optional...

  2. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation

    PubMed Central

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David

    2017-01-01

    Background As one of several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored, and only GUIDs and hash codes are allowed. The quality of PII entry is critical to the GUID system. Objective The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of PII while registering a subject. Methods According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the study-subject registration system. Results There are 127,700 error-planted subjects, of which 114,464 (89.64%) can still be identified as the same subject, while the remaining 13,236 (10.36%, 13,236/127,700) are misidentified as new subjects. As expected, 100% of non-identified subjects had errors within the required PII fields. The possibility that a subject is identified is related to the count and the type of incorrect PII fields. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of the incorrect PII fields. The best situation is to precisely find the exact incorrect PII fields, and the worst situation is to shrink the questionable scope only to a set of 13 PII fields. In the application, the proposed algorithm can

  3. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used in various search processes such as search engines, sorting, machine learning, neural networks, and so on. In the search process, data collisions may occur, and collisions can be prevented in several ways, one of which is to use the overflow technique. This technique was applied with data of varying lengths and can prevent the occurrence of data collisions.
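
    A minimal sketch of closed hashing with a separate overflow area, the collision-handling idea described above (the table size and eviction-free policy are illustrative choices):

```python
class OverflowHashTable:
    """Closed hashing with a separate overflow area: a colliding key is
    placed in the overflow list instead of probing, so it never evicts or
    blocks the key already occupying its home slot."""

    def __init__(self, size=8):
        self.slots = [None] * size
        self.overflow = []                      # collision victims go here

    def insert(self, key, value):
        h = hash(key) % len(self.slots)
        if self.slots[h] is None or self.slots[h][0] == key:
            self.slots[h] = (key, value)
        else:
            self.overflow.append((key, value))  # collision: use the overflow area

    def lookup(self, key):
        h = hash(key) % len(self.slots)
        if self.slots[h] and self.slots[h][0] == key:
            return self.slots[h][1]
        for k, v in self.overflow:              # fall back to a scan of the overflow area
            if k == key:
                return v
        return None

t = OverflowHashTable(size=4)
for word in ["hash", "search", "engine", "sort", "learn"]:
    t.insert(word, len(word))
print(t.lookup("engine"), t.lookup("missing"))  # → 6 None
```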

  4. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
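The LSH bucketing idea can be sketched with bit sampling, a standard LSH family for Hamming distance: patterns that agree on a few sampled positions land in the same bucket, so a query only examines its own bucket. The pattern encoding and sampled positions below are illustrative, and the RLE speed-up is omitted.

```python
import random

def make_lsh_key(positions):
    """Return a hash function that samples fixed bit positions of a pattern."""
    def key(bits):
        return tuple(bits[p] for p in positions)
    return key

random.seed(42)
pattern_len = 16
positions = random.sample(range(pattern_len), 6)   # sampled bit positions
lsh_key = make_lsh_key(positions)

# Index a set of binary patterns into buckets keyed by the sampled bits.
patterns = [tuple(random.randint(0, 1) for _ in range(pattern_len))
            for _ in range(200)]
buckets = {}
for p in patterns:
    buckets.setdefault(lsh_key(p), []).append(p)

# A query examines only the patterns sharing its bucket, then ranks those
# candidates by exact Hamming distance.
query = patterns[0]
candidates = buckets[lsh_key(query)]
best = min(candidates, key=lambda p: sum(a != b for a, b in zip(query, p)))
assert best == query   # the query is in its own bucket at distance 0
```

Using several independent position samples (several tables) raises the chance that a genuinely similar pattern shares at least one bucket with the query.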

  5. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation.

    PubMed

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David; Yang, Rong

    2017-02-17

    As one of several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored; only the GUID and hash codes are allowed. The quality of PII entry is critical to the GUID system. The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of PII while registering a subject. According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the registering system of study subjects. There were 127,700 error-planted subjects, of whom 114,464 (89.64%) could still be identified as the previously registered subject; the remaining 13,236 (10.36%) were treated as new subjects. As expected, 100% of nonidentified subjects had errors within the required PII fields. The probability that a subject is identified is related to the count and the type of incorrect PII fields. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of the incorrect PII fields. The best situation is to precisely find the exact incorrect PII fields, and the worst is to narrow the questionable scope to a set of 13 PII fields. In the application, the proposed algorithm can give a hint of questionable PII entry
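The hash-code matching idea can be sketched as follows: only one-way hash codes of PII field combinations leave the client, and two registrations are compared by intersecting their hash-code sets. The field names, the three combination patterns, the use of SHA-256, and the normalization are illustrative assumptions, not the study's actual configuration.

```python
import hashlib

def hash_code(fields, record):
    """One-way hash of a named combination of PII fields (normalized)."""
    material = "|".join(record[f].strip().lower() for f in fields)
    return hashlib.sha256(material.encode("utf-8")).hexdigest()

# Illustrative combination patterns of PII fields.
COMBINATIONS = [
    ("first_name", "last_name", "birth_date"),
    ("first_name", "birth_date", "zip"),
    ("last_name", "birth_date", "zip"),
]

def codes(record):
    return {hash_code(c, record) for c in COMBINATIONS}

alice = {"first_name": "Alice", "last_name": "Smith",
         "birth_date": "1980-01-02", "zip": "12345"}
alice_typo = dict(alice, last_name="Smyth")   # one incorrect PII field

# The server stores only codes(alice); a re-registration with a typo still
# shares every combination that avoids the bad field.
shared = codes(alice) & codes(alice_typo)
assert len(shared) == 1   # only the combination without last_name matches
```

The set of combinations that fail to match localizes the questionable fields: here the two missing codes both involve `last_name`, pointing at that field without revealing any PII.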

  6. Improving the efficiency of quantum hash function by dense coding of coin operators in discrete-time quantum walk

    NASA Astrophysics Data System (ADS)

    Yang, YuGuang; Zhang, YuChen; Xu, Gang; Chen, XiuBo; Zhou, Yi-Hua; Shi, WeiMin

    2018-03-01

    Li et al. first proposed a quantum hash function (QHF) in a quantum-walk architecture. In their scheme, two two-particle interactions, i.e., the I interaction and the π-phase interaction, are introduced, and the choice of the I or π-phase interaction at each iteration depends on a message bit. In this paper, we propose an efficient QHF by dense coding of coin operators in discrete-time quantum walk. Compared with existing QHFs, our protocol has the following advantages: the efficiency of the QHF can be doubled or more, and only one particle is enough, with no two-particle interactions needed, so that quantum resources are saved. This points to applying the dense coding technique in quantum cryptographic protocols, especially in applications with restricted quantum resources.

  7. 7 CFR 51.1431 - U.S. No. 1 Halves and Pieces.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. No. 1 Halves and Pieces. 51.1431 Section 51.1431... MARKETING ACT OF 1946 FRESH FRUITS, VEGETABLES AND OTHER PRODUCTS 1,2 (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1431 U.S. No. 1 Halves and...

  8. 7 CFR 51.1431 - U.S. No. 1 Halves and Pieces.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false U.S. No. 1 Halves and Pieces. 51.1431 Section 51.1431... MARKETING ACT OF 1946 FRESH FRUITS, VEGETABLES AND OTHER PRODUCTS 1,2 (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1431 U.S. No. 1 Halves and...

  9. Planetary Nebula Candidates Uncovered with the HASH Research Platform

    NASA Astrophysics Data System (ADS)

    Fragkou, Vasiliki; Bojičić, Ivan; Frew, David; Parker, Quentin

    2017-10-01

    A detailed examination of new high quality radio catalogues (e.g. Cornish) in combination with available mid-infrared (MIR) satellite imagery (e.g. Glimpse) has allowed us to find 70 new planetary nebula (PN) candidates based on existing knowledge of their typical colors and fluxes. To further examine the nature of these sources, multiple diagnostic tools have been applied to these candidates based on published data and on available imagery in the HASH (Hong Kong/ AAO/ Strasbourg Hα planetary nebula) research platform. Some candidates have previously-missed optical counterparts allowing for spectroscopic follow-up. Indeed, the single object spectroscopically observed so far has turned out to be a bona fide PN.

  10. Adoption of the Hash algorithm in a conceptual model for the civil registry of Ecuador

    NASA Astrophysics Data System (ADS)

    Toapanta, Moisés; Mafla, Enrique; Orizaga, Antonio

    2018-04-01

    The Hash security algorithm was analyzed in order to mitigate information security risks in a distributed architecture. The objective of this research is to develop a prototype for the adoption of the Hash algorithm in a conceptual model for the Civil Registry of Ecuador. The deductive method was used to analyze published articles that have a direct relation to the research project "Algorithms and Security Protocols for the Civil Registry of Ecuador" and articles related to the Hash security algorithm. This research found that the SHA-1 security algorithm is appropriate for use in Ecuador's civil registry; we adopted the SHA-1 algorithm using the flowchart technique and finally obtained the adoption of the hash algorithm in a conceptual model. From the comparison of the MD5 and SHA-1 algorithms, it is concluded that in the case of an implementation the SHA-1 algorithm should be taken, due to the amount of information and data available from the Civil Registry of Ecuador. The SHA-1 algorithm that was defined using the flowchart technique can be modified according to the requirements of each institution, and the model for adopting the hash algorithm in a conceptual model is a prototype that can be modified according to all the actors that make up each organization.
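The MD5/SHA-1 comparison can be illustrated with Python's standard hashlib: SHA-1 produces a 160-bit digest versus MD5's 128 bits. The sample record string is hypothetical, and this is purely illustrative (SHA-1 is itself no longer considered collision resistant).

```python
import hashlib

# Hypothetical civil-registry record serialized as a string.
record = "0912345678|QUITO|1990-05-17".encode("utf-8")

md5_digest = hashlib.md5(record).hexdigest()
sha1_digest = hashlib.sha1(record).hexdigest()

assert len(md5_digest) == 32    # 128 bits rendered as hex
assert len(sha1_digest) == 40   # 160 bits rendered as hex
```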

  11. Handwriting: Feature Correlation Analysis for Biometric Hashes

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  12. Random multispace quantization as an analytic mechanism for BioHashing of biometric and random identity inputs.

    PubMed

    Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L

    2006-12-01

    Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.

  13. Implementation analysis of RC5 algorithm on Preneel-Govaerts-Vandewalle (PGV) hashing schemes using length extension attack

    NASA Astrophysics Data System (ADS)

    Siswantyo, Sepha; Susanti, Bety Hayat

    2016-02-01

    Preneel-Govaerts-Vandewalle (PGV) schemes consist of 64 possible single-block-length schemes that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack to those 4 secure PGV schemes, using the RC5 algorithm in their basic construction, to test their collision resistance. The attack result shows that collisions occurred in all 4 secure PGV schemes. Based on the analysis, we indicate that the Feistel structure and the data-dependent rotation operation in the RC5 algorithm, the XOR operations in the scheme, and the selection of the additional message block value all contribute to the occurrence of collisions.
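One of the four PGV schemes Preneel classified as secure, Matyas-Meyer-Oseas, iterates Hi = E(Hi-1, mi) XOR mi over fixed-size message blocks, with the chaining value used as the cipher key. The sketch below uses a tiny Feistel "cipher" as an illustrative stand-in for a real block cipher such as RC5; it has no security value and is only meant to show the construction's shape.

```python
MASK32 = 0xFFFFFFFF

def toy_encrypt(key, block, rounds=8):
    """A toy 64-bit Feistel 'cipher' keyed by a 64-bit integer (not RC5)."""
    left, right = block >> 32, block & MASK32
    for r in range(rounds):
        round_key = (key >> (8 * (r % 8))) & MASK32
        left, right = right, left ^ ((right * 0x9E3779B9 + round_key + r) & MASK32)
    return (left << 32) | right

def mmo_hash(message, iv=0x0123456789ABCDEF):
    """Matyas-Meyer-Oseas over 8-byte blocks with zero padding (toy only)."""
    h = iv
    data = message + b"\x00" * (-len(message) % 8)   # pad to the block size
    for i in range(0, len(data), 8):
        m = int.from_bytes(data[i:i + 8], "big")
        h = toy_encrypt(h, m) ^ m   # the PGV feed-forward step
    return h

assert mmo_hash(b"abc") == mmo_hash(b"abc")   # deterministic
assert mmo_hash(b"abc") != mmo_hash(b"abd")   # a small change diffuses
```

The feed-forward XOR is what makes the compression function hard to invert even when the underlying cipher is invertible; length extension attacks such as the one above target the iterated chaining structure rather than this per-block step.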

  14. Side-information-dependent correlation channel estimation in hash-based distributed video coding.

    PubMed

    Deligiannis, Nikos; Barbarien, Joeri; Jacobs, Marc; Munteanu, Adrian; Skodras, Athanassios; Schelkens, Peter

    2012-04-01

    In the context of low-cost video encoding, distributed video coding (DVC) has recently emerged as a potential candidate for uplink-oriented applications. This paper builds on a concept of correlation channel (CC) modeling, which expresses the correlation noise as being statistically dependent on the side information (SI). Compared with classical side-information-independent (SII) noise modeling adopted in current DVC solutions, it is theoretically proven that side-information-dependent (SID) modeling improves the Wyner-Ziv coding performance. Anchored in this finding, this paper proposes a novel algorithm for online estimation of the SID CC parameters based on already decoded information. The proposed algorithm enables bit-plane-by-bit-plane successive refinement of the channel estimation leading to progressively improved accuracy. Additionally, the proposed algorithm is included in a novel DVC architecture that employs a competitive hash-based motion estimation technique to generate high-quality SI at the decoder. Experimental results corroborate our theoretical gains and validate the accuracy of the channel estimation algorithm. The performance assessment of the proposed architecture shows remarkable and consistent coding gains over a germane group of state-of-the-art distributed and standard video codecs, even under strenuous conditions, i.e., large groups of pictures and highly irregular motion content.

  15. Multichromosomal median and halving problems under different genomic distances

    PubMed Central

    Tannier, Eric; Zheng, Chunfang; Sankoff, David

    2009-01-01

    Background Genome median and genome halving are combinatorial optimization problems that aim at reconstructing ancestral genomes as well as the evolutionary events leading from the ancestor to extant species. Exploring complexity issues is a first step towards devising efficient algorithms. The complexity of the median problem for unichromosomal genomes (permutations) has been settled for both the breakpoint distance and the reversal distance. Although the multichromosomal case has often been assumed to be a simple generalization of the unichromosomal case, it is also a relaxation so that complexity in this context does not follow from existing results, and is open for all distances. Results We settle here the complexity of several genome median and halving problems, including a surprising polynomial result for the breakpoint median and guided halving problems in genomes with circular and linear chromosomes, showing that the multichromosomal problem is actually easier than the unichromosomal problem. Still other variants of these problems are NP-complete, including the DCJ double distance problem, previously mentioned as an open question. We list the remaining open problems. Conclusion This theoretical study clears up a wide swathe of the algorithmical study of genome rearrangements with multiple multichromosomal genomes. PMID:19386099

  16. Rotation invariant deep binary hashing for fast image retrieval

    NASA Astrophysics Data System (ADS)

    Dai, Lai; Liu, Jianming; Jiang, Aiwen

    2017-07-01

    In this paper, we study how to compactly represent image's characteristics for fast image retrieval. We propose supervised rotation invariant compact discriminative binary descriptors through combining convolutional neural network with hashing. In the proposed network, binary codes are learned by employing a hidden layer for representing latent concepts that dominate on class labels. A loss function is proposed to minimize the difference between binary descriptors that describe reference image and the rotated one. Compared with some other supervised methods, the proposed network doesn't have to require pair-wised inputs for binary code learning. Experimental results show that our method is effective and achieves state-of-the-art results on the CIFAR-10 and MNIST datasets.

  17. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    NASA Astrophysics Data System (ADS)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
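The MinHash side of such a pipeline can be sketched as follows: a read and a reference window are reduced to k-mer sets, and per-seed minima of a salted hash approximate their Jaccard similarity. The k-mer length, the number of hash functions, and the use of salted BLAKE2 as the hash family are illustrative assumptions, not Balaur's actual parameters.

```python
import hashlib

def kmers(seq, k=4):
    """The set of k-length substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def minhash_signature(kmer_set, num_hashes=32):
    """One minimum per salted hash function; equal minima estimate Jaccard."""
    sig = []
    for salt in range(num_hashes):
        sig.append(min(
            int.from_bytes(hashlib.blake2b(
                kmer.encode(), salt=salt.to_bytes(4, "big"), digest_size=8
            ).digest(), "big")
            for kmer in kmer_set))
    return sig

def estimated_jaccard(sig_a, sig_b):
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

read = "ACGTACGGACTT"
ref_window = "ACGTACGGACTA"   # similar region: differs in the last base
unrelated = "TTTTGGGGCCCC"

s_read, s_ref, s_far = (minhash_signature(kmers(s))
                        for s in (read, ref_window, unrelated))
assert estimated_jaccard(s_read, s_ref) > estimated_jaccard(s_read, s_far)
```

Because the signature is computed from hashes rather than the sequence itself, it is the kind of fingerprint that can be shipped to an untrusted party for candidate matching while the raw reads stay private.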

  18. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    PubMed Central

    Popic, Victoria; Batzoglou, Serafim

    2017-01-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party. PMID:28508884

  19. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research

    PubMed Central

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D

    2015-01-01

    Background A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective This paper introduces a novel method called "Randomized N-gram Hashing (NHash)," for generating unique study identifiers in a distributed and validatable fashion, in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk for identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output for Step 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted

  20. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research.

    PubMed

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong

    2015-11-10

    A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)," for generating unique study identifiers in a distributed and validatable fashion, in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk for identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output for Step 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as shift cipher. 
The result
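The two-phase construction described above can be sketched directly from the definitions: each component fi(ri, ni, mi) is the ni-gram of mi starting at si = ri mod |mi|. The wrap-around for n-grams that run past the end of m, the sample inputs, and the shift amount are illustrative assumptions not specified in the abstract.

```python
def ngram_hash(r, n, m):
    """fi(r, n, m): the n-gram of m starting at position s = r mod |m|."""
    s = r % len(m)
    doubled = m + m              # assumption: wrap around past the end of m
    return doubled[s:s + n]

def nhash(components, shift=3):
    # Phase 1: concatenate the N-gram hashes f1, f2, ..., fk.
    intermediate = "".join(ngram_hash(r, n, m) for r, n, m in components)
    # Phase 2: encrypt; here a toy shift cipher over uppercase letters.
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) for c in intermediate)

# Hypothetical (ri, ni, mi) triples; mi would come from study metadata.
components = [(7, 3, "SMITH"), (12, 2, "OHIO"), (5, 4, "NEUROLOGY")]
identifier = nhash(components)
assert len(identifier) == 3 + 2 + 4   # sum of the ni values
```

Because the identifier is assembled from fixed-position fragments and then enciphered, a single transcription error falls outside the small set of valid strings with high probability, which is the validatability property the method relies on.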

  1. Fast Exact Search in Hamming Space With Multi-Index Hashing.

    PubMed

    Norouzi, Mohammad; Punjani, Ali; Fleet, David J

    2014-06-01

    There is growing interest in representing image data and feature descriptors using compact binary codes for fast near neighbor search. Although binary codes are motivated by their use as direct indices (addresses) into a hash table, codes longer than 32 bits are not used this way, as doing so was thought to be ineffective. We introduce a rigorous way to build multiple hash tables on binary code substrings that enables exact k-nearest neighbor search in Hamming space. The approach is storage efficient and straightforward to implement. Theoretical analysis shows that the algorithm exhibits sub-linear run-time behavior for uniformly distributed codes. Empirical results show dramatic speedups over a linear scan baseline for datasets of up to one billion codes of 64, 128, or 256 bits.
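The substring-indexing idea can be sketched as follows: split each code into disjoint substrings and index each substring in its own table. By the pigeonhole principle, any code within Hamming radius r of the query must match the query exactly on at least one substring whenever r is smaller than the number of tables, so only those buckets need scanning. The table count, code width, and dictionary buckets are illustrative; the full algorithm also probes nearby substring buckets for larger radii.

```python
NUM_TABLES = 4          # disjoint substrings per code (8 bits each)
BITS = 32

def substrings(code):
    width = BITS // NUM_TABLES
    return [(code >> (i * width)) & ((1 << width) - 1)
            for i in range(NUM_TABLES)]

def build_index(codes):
    tables = [dict() for _ in range(NUM_TABLES)]
    for idx, code in enumerate(codes):
        for t, sub in enumerate(substrings(code)):
            tables[t].setdefault(sub, []).append(idx)
    return tables

def query(tables, codes, q, radius):
    candidates = set()
    for t, sub in enumerate(substrings(q)):
        candidates.update(tables[t].get(sub, []))   # exact substring matches
    # Verify candidates with the true Hamming distance.
    return sorted(i for i in candidates
                  if bin(codes[i] ^ q).count("1") <= radius)

codes = [0b10101010_11110000_00001111_01010101,
         0b10101010_11110000_00001111_01010100,   # distance 1 from codes[0]
         0b01010101_00001111_11110000_10101010]
tables = build_index(codes)
assert query(tables, codes, codes[0], radius=1) == [0, 1]
```

The gain comes from replacing a scan of all N codes with lookups in a few buckets plus exact-distance verification of a small candidate set.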

  2. HASH: the Hong Kong/AAO/Strasbourg Hα planetary nebula database

    NASA Astrophysics Data System (ADS)

    Parker, Quentin A.; Bojičić, Ivan S.; Frew, David J.

    2016-07-01

    By incorporating our major recent discoveries with re-measured and verified contents of existing catalogues we provide, for the first time, an accessible, reliable, on-line SQL database for essential, up-to date information for all known Galactic planetary nebulae (PNe). We have attempted to: i) reliably remove PN mimics/false ID's that have biased previous studies and ii) provide accurate positions, sizes, morphologies, multi-wavelength imagery and spectroscopy. We also provide a link to CDS/Vizier for the archival history of each object and other valuable links to external data. With the HASH interface, users can sift, select, browse, collate, investigate, download and visualise the entire currently known Galactic PNe diversity. HASH provides the community with the most complete and reliable data with which to undertake new science.

  3. Halving Student Loan Interest Rates Is Unaffordable and Ineffective. WebMemo No. 1308

    ERIC Educational Resources Information Center

    Riedl, Brian M.

    2007-01-01

    The House of Representatives will likely vote this week on a proposal to halve the 6.8 percent interest rate on subsidized student loans as part of the new congressional majority's 100-Hour agenda. This document presents six problems with halving student loan interest rates and argues that, rather than providing billions in new federal subsidies,…

  4. Application of kernel functions for accurate similarity search in large chemical databases.

    PubMed

    Wang, Xiaohong; Huan, Jun; Smalter, Aaron; Lushington, Gerald H

    2010-04-29

    Similarity search in chemical structure databases is an important problem with many applications in chemical genomics, drug design, and efficient chemical probe screening, among others. It is widely believed that structure-based methods provide an efficient way to perform such queries. Recently, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models, graph kernel functions cannot be applied to large chemical compound databases due to their high computational complexity and the difficulty of indexing similarity search for large databases. To bridge graph kernel functions and similarity search in chemical databases, we applied a novel kernel-based similarity measurement, developed in our team, to measure the similarity of graph-represented chemicals. In our method, we utilize a hash table to support the new graph kernel function definition, efficient storage, and fast search. We have applied our method, named G-hash, to large chemical databases. Our results show that the G-hash method achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. Moreover, the similarity measurement and the index structure are scalable to large chemical databases, with smaller index size and faster query processing time compared to state-of-the-art indexing methods such as Daylight fingerprints, C-tree and GraphGrep. Efficient similarity query processing for large chemical databases is challenging since we need to balance running-time efficiency and similarity search accuracy. Our previous similarity search method, G-hash, provides a new way to perform similarity search in chemical databases. An experimental study validates the utility of G-hash in chemical databases.

  5. OCO-2 Fairing Bi-Sector Halves Transport

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Both halves of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, arrive at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations have begun to hoist the sections of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the pad's tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  6. OCO-2 Fairing Bi-Sector Halves Transport

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Both halves of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, are towed from the Building 836 hangar to Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations have begun to hoist the sections of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the pad's tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  7. OCO-2 Fairing Bi-Sector Halves Transport

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Both halves of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, are delivered to Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations have begun to hoist the sections of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the pad's tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  8. 7 CFR 51.1430 - U.S. No. 1 Halves.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Standards for Grades of Shelled Pecans Grades § 51.1430 U.S. No. 1 Halves. “U.S. No. 1 Halves” consists of pecan half-kernels which meet the following requirements: (a) For quality: (1) Well dried; (2) Fairly...

  9. Comparative genomics meets topology: a novel view on genome median and halving problems.

    PubMed

    Alexeev, Nikita; Avdeyev, Pavel; Alekseyev, Max A

    2016-11-11

    Genome median and genome halving are combinatorial optimization problems that aim at reconstruction of ancestral genomes by minimizing the number of evolutionary events between them and genomes of the extant species. While these problems have been widely studied in past decades, their solutions are often either not efficient or not biologically adequate. These shortcomings have been recently addressed by restricting the problems solution space. We show that the restricted variants of genome median and halving problems are, in fact, closely related. We demonstrate that these problems have a neat topological interpretation in terms of embedded graphs and polygon gluings. We illustrate how such interpretation can lead to solutions to these problems in particular cases. This study provides an unexpected link between comparative genomics and topology, and demonstrates advantages of solving genome median and halving problems within the topological framework.

  10. Assessing the Equivalence of Paper, Mobile Phone, and Tablet Survey Responses at a Community Mental Health Center Using Equivalent Halves of a 'Gold-Standard' Depression Item Bank.

    PubMed

    Brodey, Benjamin B; Gonzalez, Nicole L; Elkin, Kathryn Ann; Sasiela, W Jordan; Brodey, Inger S

    2017-09-06

    halves, with the potential to simplify future experimental methodologies. Among community mental health care recipients, the PROMIS items function similarly whether administered via paper, tablet, or mobile phone. User satisfaction across modalities was also similar. Because paper, tablet, and mobile phone administrations yielded similar results, the choice of technology should be based on factors such as convenience and can even be changed during a study without adversely affecting the comparability of results. ©Benjamin B Brodey, Nicole L Gonzalez, Kathryn Ann Elkin, W Jordan Sasiela, Inger S Brodey. Originally published in JMIR Mental Health (http://mental.jmir.org), 06.09.2017.

  11. Non-symbolic halving in an Amazonian indigene group

    PubMed Central

    McCrink, Koleen; Spelke, Elizabeth S.; Dehaene, Stanislas; Pica, Pierre

    2014-01-01

    Much research supports the existence of an Approximate Number System (ANS) that is recruited by infants, children, adults, and non-human animals to generate coarse, non-symbolic representations of number. This system supports simple arithmetic operations such as addition, subtraction, and ordering of amounts. The current study tests whether an intuition of a more complex calculation, division, exists in an indigene group in the Amazon, the Mundurucu, whose language includes no words for large numbers. Mundurucu children were presented with a video event depicting a division transformation of halving, in which pairs of objects turned into single objects, reducing the array's numerical magnitude. Then they were tested on their ability to calculate the outcome of this division transformation with other large-number arrays. The Mundurucu children effected this transformation even when non-numerical variables were controlled, performed above chance levels on the very first set of test trials, and exhibited performance similar to urban children who had access to precise number words and a surrounding symbolic culture. We conclude that a halving calculation is part of the suite of intuitive operations supported by the ANS. PMID:23587042

  12. Delving Deeper: One Cut, Two Halves, Three Questions

    ERIC Educational Resources Information Center

    Ren, Guanshen

    2009-01-01

    A square can be divided into two equal parts with any cut through the center. The first question that arises is, Would any cut through the center of a regular polygon divide it into two equal parts? If not, the second question is, What kind of lines through the center of the polygon would cut it into two halves? However, many objects are not…

  13. Deep Hashing for Scalable Image Search.

    PubMed

    Lu, Jiwen; Liong, Venice Erin; Zhou, Jie

    2017-05-01

    In this paper, we propose a new deep hashing (DH) approach to learn compact binary codes for scalable image search. Unlike most existing binary codes learning methods, which usually seek a single linear projection to map each sample into a binary feature vector, we develop a deep neural network to seek multiple hierarchical non-linear transformations to learn these binary codes, so that the non-linear relationship of samples can be well exploited. Our model is learned under three constraints at the top layer of the developed deep network: 1) the loss between the compact real-valued code and the learned binary vector is minimized, 2) the binary codes distribute evenly on each bit, and 3) different bits are as independent as possible. To further improve the discriminative power of the learned binary codes, we extend DH into supervised DH (SDH) and multi-label SDH by including a discriminative term into the objective function of DH, which simultaneously maximizes the inter-class variations and minimizes the intra-class variations of the learned binary codes with the single-label and multi-label settings, respectively. Extensive experimental results on eight widely used image search data sets show that our proposed methods achieve very competitive results compared with the state of the art.
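    The three top-layer constraints can be made concrete with a small numerical sketch. The code below is an illustration of the idea only, not the authors' implementation; it assumes NumPy and a toy matrix of real-valued codes:

```python
import numpy as np

def dh_penalties(H):
    """Toy versions of the three top-layer constraints on real-valued
    codes H (n samples x b bits): quantization loss, bit balance,
    and bit independence."""
    B = np.sign(H)                                   # binarized codes
    quant = np.mean((H - B) ** 2)                    # 1) code vs. binary vector
    balance = np.mean(np.mean(B, axis=0) ** 2)       # 2) each bit ~50/50
    C = (B.T @ B) / len(B)                           # bit correlation matrix
    indep = np.mean((C - np.eye(C.shape[0])) ** 2)   # 3) bits independent
    return quant, balance, indep

rng = np.random.default_rng(0)
q, bal, ind = dh_penalties(rng.standard_normal((100, 16)))
```

In a real network these three terms would be summed into the training objective and minimized jointly with the hierarchical transformations.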

  14. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Both halves of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, are moved into position in the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  15. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Both halves of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, have arrived in the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  16. Guided genome halving: hardness, heuristics and the history of the Hemiascomycetes.

    PubMed

    Zheng, Chunfang; Zhu, Qian; Adam, Zaky; Sankoff, David

    2008-07-01

    Some present day species have incurred a whole genome doubling event in their evolutionary history, and this is reflected today in patterns of duplicated segments scattered throughout their chromosomes. These duplications may be used as data to 'halve' the genome, i.e. to reconstruct the ancestral genome at the moment of doubling, but the solution is often highly nonunique. To resolve this problem, we take account of outgroups, external reference genomes, to guide and narrow down the search. We improve on a previous, computationally costly, 'brute force' method by adapting the genome halving algorithm of El-Mabrouk and Sankoff so that it rapidly and accurately constructs an ancestor close to the outgroups, prior to a local optimization heuristic. We apply this to reconstruct the predoubling ancestor of Saccharomyces cerevisiae and Candida glabrata, guided by the genomes of three other yeasts that diverged before the genome doubling event. We analyze the results in terms of (1) the minimum evolution criterion, (2) how close the genome halving result is to the final (local) minimum and (3) how close the final result is to an ancestor manually constructed by an expert with access to additional information. We also visualize the set of reconstructed ancestors using classic multidimensional scaling to see what aspects of the two doubled and three unduplicated genomes influence the differences among the reconstructions. The experimental software is available on request.

  17. Semantic Segmentation of Building Elements Using Point Cloud Hashing

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.

    2018-05-01

    For the interpretation of point clouds, the semantic definition of extracted segments from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, buildings can be classified quite well and simply by a human operator (e.g. an architect) into different building types and structural elements (dome, nave, transept etc.), including particular building parts which are visually detected. The key part of the procedure is a novel method based on hashing where point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is suitable for other buildings and objects characterized by a particular typology in their construction (e.g. industrial objects in standardized environments with strict component design allowing clear semantic modelling).

  18. A regional composite-face effect for species-specific recognition: Upper and lower halves play different roles in holistic processing of monkey faces.

    PubMed

    Wang, Zhe; Quinn, Paul C; Jin, Haiyang; Sun, Yu-Hao P; Tanaka, James W; Pascalis, Olivier; Lee, Kang

    2018-04-25

    Using a composite-face paradigm, we examined the holistic processing induced by Asian faces, Caucasian faces, and monkey faces with human Asian participants in two experiments. In Experiment 1, participants were asked to judge whether the upper halves of two faces successively presented were the same or different. A composite-face effect was found for Asian faces and Caucasian faces, but not for monkey faces. In Experiment 2, participants were asked to judge whether the lower halves of the two faces successively presented were the same or different. A composite-face effect was found for monkey faces as well as for Asian faces and Caucasian faces. Collectively, these results reveal that own-species (i.e., own-race and other-race) faces engage holistic processing in both upper and lower halves of the face, but other-species (i.e., monkey) faces engage holistic processing only when participants are asked to match the lower halves of the face. The findings are discussed in the context of a region-based holistic processing account for the species-specific effect in face recognition. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. A Tree Locality-Sensitive Hash for Secure Software Testing

    DTIC Science & Technology

    2017-09-14

    errors, or to look for vulnerabilities that could allow a nefarious actor to use our software against us. Ultimately, all testing is designed to find...and an equivalent number of feasible paths discovered by Klee. 1.5 Summary This document presents the Tree Locality-Sensitive Hash (TLSH), a locality-sensitive...performing two groups of tests that verify the accuracy and usefulness of TLSH. Chapter 5 summarizes the contents of the dissertation and lists avenues

  20. ProGeRF: Proteome and Genome Repeat Finder Utilizing a Fast Parallel Hash Function

    PubMed Central

    Moraes, Walas Jhony Lopes; Rodrigues, Thiago de Souza; Bartholomeu, Daniella Castanheira

    2015-01-01

    Repetitive element sequences are adjacent, repeating patterns, also called motifs, and can be of different lengths; repetitions can involve their exact or approximate copies. They have been widely used as molecular markers in population biology. Given the sizes of sequenced genomes, various bioinformatics tools have been developed for the extraction of repetitive elements from DNA sequences. However, currently available tools do not provide options for identifying repetitive elements in both the genome and the proteome, displaying a user-friendly web interface, and performing exhaustive searches. ProGeRF is a web site for extracting repetitive regions from genome and proteome sequences. It was designed to be an efficient, fast, accurate, and user-friendly web tool allowing many ways to view and analyse the results. ProGeRF (Proteome and Genome Repeat Finder) is freely available as a stand-alone program, from which users can download the source code, and as a web tool. It was developed using the hash table approach to extract perfect and imperfect repetitive regions from a (multi)FASTA file in linear time. PMID:25811026
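    The hash-table approach behind this kind of repeat finder can be sketched in a few lines (an illustration of the general technique, not ProGeRF's actual code): index every k-mer of the sequence in a hash table, then report any bucket holding more than one start position as a perfectly repeated motif, in time linear in the sequence length.

```python
from collections import defaultdict

def perfect_repeats(seq, k):
    """Index every k-mer in a hash table; any bucket with 2+ start
    positions is a perfectly repeated motif (linear in len(seq))."""
    table = defaultdict(list)
    for i in range(len(seq) - k + 1):
        table[seq[i:i + k]].append(i)
    return {kmer: pos for kmer, pos in table.items() if len(pos) > 1}

reps = perfect_repeats("ACGTACGTTTACGT", 4)
# "ACGT" occurs at positions 0, 4 and 10; "TACG" at 3 and 9.
```

Handling imperfect (approximate) repeats requires extra machinery on top of this, e.g. merging buckets of similar k-mers or post-filtering with a tolerance.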

  1. Design, fabrication, and high-gradient testing of an X -band, traveling-wave accelerating structure milled from copper halves

    NASA Astrophysics Data System (ADS)

    Argyropoulos, Theodoros; Catalan-Lasheras, Nuria; Grudiev, Alexej; Mcmonagle, Gerard; Rodriguez-Castro, Enrique; Syrachev, Igor; Wegner, Rolf; Woolley, Ben; Wuensch, Walter; Zha, Hao; Dolgashev, Valery; Bowden, Gorden; Haase, Andrew; Lucas, Thomas Geoffrey; Volpi, Matteo; Esperante-Pereira, Daniel; Rajamäki, Robin

    2018-06-01

    A prototype 11.994 GHz, traveling-wave accelerating structure for the Compact Linear Collider has been built, using the novel technique of assembling the structure from milled halves. The use of milled halves has many advantages when compared to a structure made from individual disks. These include the potential for a reduction in cost, because there are fewer parts, as well as a greater freedom in choice of joining technology because there are no rf currents across the halves' joint. Here we present the rf design and fabrication of the prototype structure, followed by the results of the high-power test and post-test surface analysis. During high-power testing the structure reached an unloaded gradient of 100 MV/m at an rf breakdown rate of less than 1.5 × 10⁻⁵ breakdowns/pulse/m with a 200 ns pulse. This structure has been designed for the CLIC testing program but construction from halves can be advantageous in a wide variety of applications.

  2. MOSAIK: a hash-based algorithm for accurate next-generation sequencing short-read mapping.

    PubMed

    Lee, Wan-Ping; Stromberg, Michael P; Ward, Alistair; Stewart, Chip; Garrison, Erik P; Marth, Gabor T

    2014-01-01

    MOSAIK is a stable, sensitive and open-source program for mapping second and third-generation sequencing reads to a reference genome. Uniquely among current mapping tools, MOSAIK can align reads generated by all the major sequencing technologies, including Illumina, Applied Biosystems SOLiD, Roche 454, Ion Torrent and Pacific BioSciences SMRT. Indeed, MOSAIK was the only aligner to provide consistent mappings for all the generated data (sequencing technologies, low-coverage and exome) in the 1000 Genomes Project. To provide highly accurate alignments, MOSAIK employs a hash clustering strategy coupled with the Smith-Waterman algorithm. This method is well-suited to capture mismatches as well as short insertions and deletions. To support the growing interest in larger structural variant (SV) discovery, MOSAIK provides explicit support for handling known-sequence SVs, e.g. mobile element insertions (MEIs) as well as generating outputs tailored to aid in SV discovery. All variant discovery benefits from an accurate description of the read placement confidence. To this end, MOSAIK uses a neural-network based training scheme to provide well-calibrated mapping quality scores, demonstrated by a correlation coefficient between MOSAIK assigned and actual mapping qualities greater than 0.98. In order to ensure that studies of any genome are supported, a training pipeline is provided to ensure optimal mapping quality scores for the genome under investigation. MOSAIK is multi-threaded, open source, and incorporated into our command and pipeline launcher system GKNO (http://gkno.me).

  3. MOSAIK: A Hash-Based Algorithm for Accurate Next-Generation Sequencing Short-Read Mapping

    PubMed Central

    Lee, Wan-Ping; Stromberg, Michael P.; Ward, Alistair; Stewart, Chip; Garrison, Erik P.; Marth, Gabor T.

    2014-01-01

    MOSAIK is a stable, sensitive and open-source program for mapping second and third-generation sequencing reads to a reference genome. Uniquely among current mapping tools, MOSAIK can align reads generated by all the major sequencing technologies, including Illumina, Applied Biosystems SOLiD, Roche 454, Ion Torrent and Pacific BioSciences SMRT. Indeed, MOSAIK was the only aligner to provide consistent mappings for all the generated data (sequencing technologies, low-coverage and exome) in the 1000 Genomes Project. To provide highly accurate alignments, MOSAIK employs a hash clustering strategy coupled with the Smith-Waterman algorithm. This method is well-suited to capture mismatches as well as short insertions and deletions. To support the growing interest in larger structural variant (SV) discovery, MOSAIK provides explicit support for handling known-sequence SVs, e.g. mobile element insertions (MEIs) as well as generating outputs tailored to aid in SV discovery. All variant discovery benefits from an accurate description of the read placement confidence. To this end, MOSAIK uses a neural-network based training scheme to provide well-calibrated mapping quality scores, demonstrated by a correlation coefficient between MOSAIK assigned and actual mapping qualities greater than 0.98. In order to ensure that studies of any genome are supported, a training pipeline is provided to ensure optimal mapping quality scores for the genome under investigation. MOSAIK is multi-threaded, open source, and incorporated into our command and pipeline launcher system GKNO (http://gkno.me). PMID:24599324
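    The hash clustering strategy mentioned in the abstract can be sketched as k-mer seeding (an illustrative simplification, not MOSAIK's actual code): reference k-mers are indexed in a hash table, read k-mers vote for candidate alignment offsets, and the best-supported region would then be refined with a banded Smith-Waterman alignment.

```python
from collections import defaultdict

def seed_hits(reference, read, k=5):
    """Hash-based seeding: index reference k-mers, then let read
    k-mers vote for candidate alignment offsets. Returns the offset
    with the most supporting k-mer matches (or None)."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    votes = defaultdict(int)
    for j in range(len(read) - k + 1):
        for i in index.get(read[j:j + k], []):
            votes[i - j] += 1        # offset = implied read start in reference
    return max(votes, key=votes.get) if votes else None

ref = "TTTTACGTACGAAGGTT"
off = seed_hits(ref, "ACGTACG")      # the read occurs at offset 4
```

Seeding keeps the expensive dynamic-programming step confined to a narrow band around each well-supported offset, which is how hash-based mappers stay fast while still capturing mismatches and short indels.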

  4. Comparison of Various Similarity Measures for Average Image Hash in Mobile Phone Application

    NASA Astrophysics Data System (ADS)

    Farisa Chaerul Haviana, Sam; Taufik, Muhammad

    2017-04-01

    One of the main issues in Content-Based Image Retrieval (CBIR) is the choice of similarity measure for the resulting image hashes. The key challenge is to find the most beneficial distance or similarity measure for calculating similarity in terms of speed and computing cost, especially on devices with limited computing capability such as mobile phones. In this study we compare twelve of the most common and popular distance or similarity measures, implemented in a mobile phone application. The results show that all similarity measures implemented in this study performed equally well in the mobile phone application. This opens more possibilities for method combinations to be implemented for image retrieval.
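    For concreteness, an average image hash and one simple candidate similarity measure can be sketched as follows (illustrative only; the study's twelve measures are not reproduced here). The nested list stands in for an already down-sampled grayscale image:

```python
def average_hash(pixels):
    """Average image hash: threshold each pixel of a small grayscale
    image at the mean intensity, yielding one bit per pixel."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_similarity(h1, h2):
    """One of the simplest candidate measures: fraction of equal bits."""
    same = sum(a == b for a, b in zip(h1, h2))
    return same / len(h1)

a = average_hash([[200, 200], [10, 10]])
b = average_hash([[190, 210], [5, 20]])   # slightly perturbed image
sim = hamming_similarity(a, b)            # 1.0: hashes agree bit-for-bit
```

Because every candidate measure ultimately compares short fixed-length bit vectors like these, the per-comparison cost differences on a phone come down to constant factors, consistent with the study's finding that the measures perform equally.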

  5. Jobs, Skills and Incomes in Ghana: How Was Poverty Halved?

    ERIC Educational Resources Information Center

    Nsowah-Nuamah, Nicholas; Teal, Francis; Awoonor-Williams, Moses

    2012-01-01

    On the basis of official statistics, poverty has halved in Ghana over the period from 1991 to 2005. Our objective in this paper is to assess how far this fall was linked to the creation of better paying jobs and the increase in education. We find that earnings rose rapidly in the period from 1998 to 2005, by 64% for men and by 55% for women. While…

  6. Matching Real and Synthetic Panoramic Images Using a Variant of Geometric Hashing

    NASA Astrophysics Data System (ADS)

    Li-Chee-Ming, J.; Armenakis, C.

    2017-05-01

    This work demonstrates an approach to automatically initialize a visual model-based tracker, and recover from lost tracking, without prior camera pose information. These approaches are commonly referred to as tracking-by-detection. Previous tracking-by-detection techniques used either fiducials (i.e. landmarks or markers) or the object's texture. The main contribution of this work is the development of a tracking-by-detection algorithm that is based solely on natural geometric features. A variant of geometric hashing, a model-to-image registration algorithm, is proposed that searches for a matching panoramic image from a database of synthetic panoramic images captured in a 3D virtual environment. The approach identifies corresponding features between the matched panoramic images. The corresponding features are to be used in a photogrammetric space resection to estimate the camera pose. The experiments apply this algorithm to initialize a model-based tracker in an indoor environment using the 3D CAD model of the building.
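    The core of geometric hashing can be sketched in 2D (a minimal illustration of the classic scheme, not the paper's panoramic variant): every ordered pair of model points defines a basis, the remaining points are expressed in that basis's similarity-invariant coordinates, and the quantized coordinates key a hash table; at query time, scene bases vote for compatible model bases.

```python
from collections import defaultdict
from itertools import permutations

def frame_coords(p, o, b):
    # Coordinates of p in the frame with origin o and x-axis o->b;
    # invariant under translation, rotation and uniform scaling.
    ux, uy = b[0] - o[0], b[1] - o[1]
    d2 = ux * ux + uy * uy
    vx, vy = p[0] - o[0], p[1] - o[1]
    return ((vx * ux + vy * uy) / d2, (vy * ux - vx * uy) / d2)

def quantize(c, q=0.05):
    return (round(c[0] / q), round(c[1] / q))

def build_table(model):
    # Hash each remaining point's invariant coordinates under every basis.
    table = defaultdict(list)
    for o, b in permutations(model, 2):
        for p in model:
            if p != o and p != b:
                table[quantize(frame_coords(p, o, b))].append((o, b))
    return table

def vote(table, scene):
    # Scene bases accumulate votes for compatible model bases.
    votes = defaultdict(int)
    for o, b in permutations(scene, 2):
        for p in scene:
            if p != o and p != b:
                for basis in table.get(quantize(frame_coords(p, o, b)), []):
                    votes[basis] += 1
    return votes

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
scene = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]   # model scaled by 2
votes = vote(build_table(model), scene)
```

A basis with a strong vote count yields the point correspondences that would then feed the photogrammetric space resection described above.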

  7. GSHR-Tree: a spatial index tree based on dynamic spatial slot and hash table in grid environments

    NASA Astrophysics Data System (ADS)

    Chen, Zhanlong; Wu, Xin-cai; Wu, Liang

    2008-12-01

    Computation Grids enable the coordinated sharing of large-scale distributed heterogeneous computing resources that can be used to solve computationally intensive problems in science, engineering, and commerce. Grid spatial applications are made possible by high-speed networks and a new generation of Grid middleware that resides between networks and traditional GIS applications. The integration of multi-source, heterogeneous spatial information, the management of distributed spatial resources, and the sharing and cooperative use of spatial data and Grid services are the key problems to resolve in the development of Grid GIS. The spatial index mechanism is a key technology of Grid GIS and spatial databases, and its performance affects the overall performance of the GIS in Grid environments. In order to improve the efficiency of parallel processing of massive spatial data in a distributed parallel computing grid environment, this paper presents GSHR-Tree, a new grid-slot hash parallel spatial index structure. Based on a hash table and dynamic spatial slots, it improves the structure of the classical parallel R-tree index, making full use of the good qualities of both the R-tree and the hash data structure, and can meet the needs of parallel grid computing over massive spatial data in a distributed network. The algorithm splits space into multiple slots and maps these slots to sites in the distributed, parallel system. Each site builds the spatial objects in its slot into an R-tree. On the basis of this tree structure, the index data are distributed among multiple nodes in the grid network using the large-node R-tree method. Load imbalance arising during processing can be quickly corrected by a dynamic adjusting algorithm. This tree structure has considered the

  8. Anatomy of a hash-based long read sequence mapping algorithm for next generation DNA sequencing.

    PubMed

    Misra, Sanchit; Agrawal, Ankit; Liao, Wei-keng; Choudhary, Alok

    2011-01-15

    Recently, a number of programs have been proposed for mapping short reads to a reference genome. Many of them are heavily optimized for short-read mapping and hence are very efficient for shorter queries, but that makes them inefficient or not applicable for reads longer than 200 bp. However, many sequencers are already generating longer reads and more are expected to follow. For long read sequence mapping, there are limited options; BLAT, SSAHA2, FANGS and BWA-SW are among the popular ones. However, resequencing and personalized medicine need much faster software to map these long sequencing reads to a reference genome to identify SNPs or rare transcripts. We present AGILE (AliGnIng Long rEads), a hash table based high-throughput sequence mapping algorithm for longer 454 reads that uses diagonal multiple seed-match criteria, customized q-gram filtering and a dynamic incremental search approach among other heuristics to optimize every step of the mapping process. In our experiments, we observe that AGILE is more accurate than BLAT, and comparable to BWA-SW and SSAHA2. For practical error rates (< 5%) and read lengths (200-1000 bp), AGILE is significantly faster than BLAT, SSAHA2 and BWA-SW. Even for the other cases, AGILE is comparable to BWA-SW and several times faster than BLAT and SSAHA2. http://www.ece.northwestern.edu/~smi539/agile.html.
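    The q-gram filtering step mentioned above can be illustrated with the classical q-gram lemma (a sketch of the general technique, not AGILE's customized filter): a reference window containing an occurrence of the read with at most e edits must share at least |read| − q + 1 − q·e q-grams with it, so windows below that threshold can be discarded before any expensive alignment.

```python
def qgrams(s, q):
    return {s[i:i + q] for i in range(len(s) - q + 1)}

def passes_qgram_filter(read, window, q=4, max_errors=2):
    """q-gram lemma: each edit destroys at most q of the read's
    len(read)-q+1 q-grams, so a window matching with <= max_errors
    edits must share at least the threshold below; cheaper-to-check
    windows failing it are skipped before alignment."""
    threshold = len(read) - q + 1 - q * max_errors
    shared = len(qgrams(read, q) & qgrams(window, q))
    return shared >= threshold

ok = passes_qgram_filter("ACGTACGTACGT", "ACGTACGAACGT")  # one substitution
```

Filtering of this kind is what lets a hash-based mapper handle the stated error rates (< 5%) on long reads without aligning the read against every candidate window.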

  9. Limitations and requirements of content-based multimedia authentication systems

    NASA Astrophysics Data System (ADS)

    Wu, Chai W.

    2001-08-01

    Recently, a number of authentication schemes have been proposed for multimedia data such as images and sound data. They include both label-based systems and semifragile watermarks. The main requirement for such authentication systems is that minor modifications such as lossy compression which do not alter the content of the data preserve the authenticity of the data, whereas modifications which do modify the content render the data not authentic. These schemes can be classified into two main classes depending on the model of image authentication they are based on. One of the purposes of this paper is to look at some of the advantages and disadvantages of these image authentication schemes and their relationship with fundamental limitations of the underlying model of image authentication. In particular, we study feature-based algorithms which generate an authentication tag based on some inherent features in the image such as the location of edges. The main disadvantage of most proposed feature-based algorithms is that similar images generate similar features, and therefore it is possible for a forger to generate dissimilar images that have the same features. On the other hand, the class of hash-based algorithms utilizes a cryptographic hash function or a digital signature scheme to reduce the data and generate an authentication tag. It inherits the security of digital signatures to thwart forgery attacks. The main disadvantage of hash-based algorithms is that the image needs to be modified in order to be made authenticatable. The amount of modification is on the order of the noise the image can tolerate before it is rendered inauthentic. The other purpose of this paper is to propose a multimedia authentication scheme which combines some of the best features of both classes of algorithms. The proposed scheme utilizes cryptographic hash functions and digital signature schemes and the data does not need to be modified in order to be made authenticatable. Several
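    The hash-based class of algorithms described above can be sketched in a few lines (illustrative only; HMAC stands in here for a full digital signature scheme). Because a cryptographic hash changes completely under any bit change, even an imperceptible lossy re-compression invalidates the tag, which is exactly the brittleness this class trades for its forgery resistance:

```python
import hashlib
import hmac

def make_tag(data: bytes, key: bytes) -> bytes:
    """Hash-based authentication tag: hash the media bytes, then
    authenticate the digest (HMAC stands in for a digital signature)."""
    digest = hashlib.sha256(data).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def verify(data: bytes, key: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(make_tag(data, key), tag)

key = b"secret-key"
img = b"\x00\x01\x02..."        # stand-in for image bytes
tag = make_tag(img, key)
```

Any single-byte change to `img` makes `verify` fail, whereas the feature-based class would accept it as long as the extracted features (e.g. edge locations) were preserved.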

  10. Drying kinetics of apricot halves in a microwave-hot air hybrid oven

    NASA Astrophysics Data System (ADS)

    Horuz, Erhan; Bozkurt, Hüseyin; Karataş, Haluk; Maskan, Medeni

    2017-06-01

    Drying behavior and kinetics of apricot halves were investigated in a microwave-hot air domestic hybrid oven at 120, 150 and 180 W microwave power and 50, 60 and 70 °C air temperature. The drying operation was stopped when the moisture content reached 25% (wet basis) from an initial 77% (w.b.). Increases in microwave power and air temperature increased drying rates and reduced drying time. Only a falling-rate period was observed in the drying of apricot halves in the hybrid oven. Eleven mathematical models were used to describe the drying kinetics of the apricots. The modified logistic model gave the best fit to the experimental data; this model had not previously been used to describe the drying behavior of any food material. Fick's second law was used to determine both the effective moisture diffusivity and the thermal diffusivity. Activation energy values of the dried apricots were calculated from the Arrhenius equation; those obtained from the effective moisture diffusivity, thermal diffusivity and drying rate constant values ranged from 31.10 to 39.4 kJ/mol, 29.56 to 35.19 kJ/mol, and 26.02 to 32.36 kJ/mol, respectively.
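    The Arrhenius calculation referenced above can be sketched with a two-temperature estimate. The diffusivity values below are hypothetical placeholders, not data from the paper:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def activation_energy(D1, T1, D2, T2):
    # Arrhenius law D = D0 * exp(-Ea / (R * T)) gives, for two
    # (diffusivity, temperature) points,
    # Ea = R * ln(D2 / D1) / (1/T1 - 1/T2).
    return R * math.log(D2 / D1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical effective moisture diffusivities at 50 and 70 degrees C:
Ea = activation_energy(1.0e-9, 323.15, 2.0e-9, 343.15)  # about 3.2e4 J/mol
```

With more than two temperatures (as in the paper), Ea would instead come from the slope of a linear fit of ln(D) against 1/T.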

  11. The Hong Kong/AAO/Strasbourg Hα (HASH) Planetary Nebula Database

    NASA Astrophysics Data System (ADS)

    Bojičić, Ivan S.; Parker, Quentin A.; Frew, David J.

    2017-10-01

    The Hong Kong/AAO/Strasbourg Hα (HASH) planetary nebula database is an online research platform providing free and easy access to the largest and most comprehensive catalogue of known Galactic PNe and a repository of observational data (imaging and spectroscopy) for these and related astronomical objects. The main motivation for creating this system is to resolve some long-standing problems in the field, e.g. mimics and dubious and/or misidentifications, errors in observational data, and consolidation of the widely scattered data sets. This facility allows researchers quick and easy access to archived and new observational data, and supports the creation and sharing of non-redundant PN samples and catalogues.

  12. Functional interaction between the two halves of the photoreceptor-specific ATP binding cassette protein ABCR (ABCA4). Evidence for a non-exchangeable ADP in the first nucleotide binding domain.

    PubMed

    Ahn, Jinhi; Beharry, Seelochan; Molday, Laurie L; Molday, Robert S

    2003-10-10

    ABCR, also known as ABCA4, is a member of the superfamily of ATP binding cassette transporters that is believed to transport retinal or retinylidene-phosphatidylethanolamine across photoreceptor disk membranes. Mutations in the ABCR gene are responsible for Stargardt macular dystrophy and related retinal dystrophies that cause severe loss in vision. ABCR consists of two tandemly arranged halves each containing a membrane spanning segment followed by a large extracellular/lumen domain, a multi-spanning membrane domain, and a nucleotide binding domain (NBD). To define the role of each NBD, we examined the nucleotide binding and ATPase activities of the N and C halves of ABCR individually and co-expressed in COS-1 cells and derived from trypsin-cleaved ABCR in disk membranes. When disk membranes or membranes from co-transfected cells were photoaffinity labeled with 8-azido-ATP and 8-azido-ADP, only the NBD2 in the C-half bound and trapped the nucleotide. Co-expressed half-molecules displayed basal and retinal-stimulated ATPase activity similar to full-length ABCR. The individually expressed N-half displayed weak 8-azido-ATP labeling and low basal ATPase activity that was not stimulated by retinal, whereas the C-half did not bind ATP and exhibited little if any ATPase activity. Purified ABCR contained one tightly bound ADP, presumably in NBD1. Our results indicate that only NBD2 of ABCR binds and hydrolyzes ATP in the presence or absence of retinal. NBD1, containing a bound ADP, associates with NBD2 to play a crucial, non-catalytic role in ABCR function.

  13. A repeated halving approach to fabricate ultrathin single-walled carbon nanotube films for transparent supercapacitors.

    PubMed

    Niu, Zhiqiang; Zhou, Weiya; Chen, Jun; Feng, Guoxing; Li, Hong; Hu, Yongsheng; Ma, Wenjun; Dong, Haibo; Li, Jinzhu; Xie, Sishen

    2013-02-25

    Ultrathin transparent, conductive SWCNT films on flexible and transparent substrates were prepared by repeatedly halving directly grown SWCNT films, and flexible, transparent supercapacitors with excellent performance were fabricated from them. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  15. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  16. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  17. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  18. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  19. Effects of halving pesticide use on wheat production

    PubMed Central

    Hossard, L.; Philibert, A.; Bertrand, M.; Colnenne-David, C.; Debaeke, P.; Munier-Jolain, N.; Jeuffroy, M. H.; Richard, G.; Makowski, D.

    2014-01-01

    Pesticides pose serious threats to both human health and the environment. In Europe, farmers are encouraged to reduce their use, and in France a recent environmental policy fixed a target of halving pesticide use by 2018. Organic and integrated cropping systems have been proposed as possible solutions for reducing pesticide use, but the effect of reducing pesticide use on crop yield remains unclear. Here we use a set of cropping system experiments to quantify the yield losses resulting from a reduction of pesticide use for winter wheat in France. Our estimated yield losses resulting from a 50% reduction in pesticide use ranged from 5 to 13% of the yield obtained with the current pesticide use. At the scale of the whole country, these losses would decrease French wheat production by about 2 to 3 million tons, which represents about 15% of French wheat exports. PMID:24651597

  20. Effects of halving pesticide use on wheat production

    NASA Astrophysics Data System (ADS)

    Hossard, L.; Philibert, A.; Bertrand, M.; Colnenne-David, C.; Debaeke, P.; Munier-Jolain, N.; Jeuffroy, M. H.; Richard, G.; Makowski, D.

    2014-03-01

    Pesticides pose serious threats to both human health and the environment. In Europe, farmers are encouraged to reduce their use, and in France a recent environmental policy fixed a target of halving pesticide use by 2018. Organic and integrated cropping systems have been proposed as possible solutions for reducing pesticide use, but the effect of reducing pesticide use on crop yield remains unclear. Here we use a set of cropping system experiments to quantify the yield losses resulting from a reduction of pesticide use for winter wheat in France. Our estimated yield losses resulting from a 50% reduction in pesticide use ranged from 5 to 13% of the yield obtained with the current pesticide use. At the scale of the whole country, these losses would decrease French wheat production by about 2 to 3 million tons, which represents about 15% of French wheat exports.

  1. Effects of halving pesticide use on wheat production.

    PubMed

    Hossard, L; Philibert, A; Bertrand, M; Colnenne-David, C; Debaeke, P; Munier-Jolain, N; Jeuffroy, M H; Richard, G; Makowski, D

    2014-03-20

    Pesticides pose serious threats to both human health and the environment. In Europe, farmers are encouraged to reduce their use, and in France a recent environmental policy fixed a target of halving pesticide use by 2018. Organic and integrated cropping systems have been proposed as possible solutions for reducing pesticide use, but the effect of reducing pesticide use on crop yield remains unclear. Here we use a set of cropping system experiments to quantify the yield losses resulting from a reduction of pesticide use for winter wheat in France. Our estimated yield losses resulting from a 50% reduction in pesticide use ranged from 5 to 13% of the yield obtained with the current pesticide use. At the scale of the whole country, these losses would decrease French wheat production by about 2 to 3 million tons, which represents about 15% of French wheat exports.

  2. Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.

    PubMed

    Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong

    2017-05-01

    Hashing based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that construct the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions to preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes, and compromise to solve a relaxed problem with quantization to obtain the approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative to different classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash function and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.

  3. Intrusion detection using secure signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Trent Darnel; Haile, Jedediah

    A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value is generated from the captured network data using a first hash function employing at least one one-way function. The presence of a search hash value match in a secure signature table comprising search hash values and encrypted rules is determined. After determining a search hash value match, a decryption key is generated from the captured network data using a second hash function, different from the first hash function. One or more of the encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The one or more decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
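
    The two-hash scheme described above can be sketched in a few lines of Python. This is an illustrative reading of the abstract, not the patented implementation: the domain-separation salts, the toy XOR stream cipher, and the rule text are all assumptions.

```python
import hashlib

def search_hash(data: bytes) -> bytes:
    # first hash function: locates the table entry
    return hashlib.sha256(b"search|" + data).digest()

def key_hash(data: bytes) -> bytes:
    # second, different hash function: derives the decryption key
    return hashlib.sha256(b"key|" + data).digest()

def xor_cipher(key: bytes, msg: bytes) -> bytes:
    # toy XOR keystream for illustration only -- not production cryptography
    stream = b""
    counter = 0
    while len(stream) < len(msg):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(m ^ s for m, s in zip(msg, stream))

# Publisher side: store search hash -> rule encrypted under the key hash
signature = b"GET /etc/passwd"
rule = b"alert: path traversal attempt"   # hypothetical rule text
table = {search_hash(signature): xor_cipher(key_hash(signature), rule)}

# Sensor side: only traffic matching the signature can decrypt the rule
captured = b"GET /etc/passwd"
entry = table.get(search_hash(captured))
if entry is not None:
    print(xor_cipher(key_hash(captured), entry).decode())
```

    The point of the construction is that the signature table leaks neither the signatures nor the rules: without traffic that actually matches, a sensor cannot recover either.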

  4. Data Recovery of Distributed Hash Table with Distributed-to-Distributed Data Copy

    NASA Astrophysics Data System (ADS)

    Doi, Yusuke; Wakayama, Shirou; Ozaki, Satoshi

    To realize huge-scale information services, many Distributed Hash Table (DHT) based systems have been proposed. For example, there are some proposals to manage item-level product traceability information with DHTs. In such an application, each entry of a huge number of item-level IDs needs to be available on a DHT. To ensure data availability, the soft-state approach has been employed in previous works. However, this does not scale well against the number of entries on a DHT. As we expect 10^10 products in the traceability case, the soft-state approach is unacceptable. In this paper, we propose Distributed-to-Distributed Data Copy (D3C). With D3C, users can reconstruct the data as they detect data loss, or even migrate to another DHT system. We show why it scales well against the number of entries on a DHT. We have confirmed our approach with a prototype. Evaluation shows our approach fits well on a DHT with a low rate of failure and a huge number of data entries.
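
    The abstract does not give D3C's internals, but the underlying DHT mapping it builds on is standard consistent hashing: keys and nodes share one hash ring, and an item belongs to the first node at or after its ring position. A minimal background sketch (node and item names are invented):

```python
import bisect
import hashlib

def ring_position(key: str) -> int:
    # place both node names and item IDs on the same 64-bit hash ring
    return int.from_bytes(hashlib.sha1(key.encode()).digest()[:8], "big")

class HashRing:
    def __init__(self, nodes):
        self.ring = sorted((ring_position(n), n) for n in nodes)
        self.points = [p for p, _ in self.ring]

    def node_for(self, item_id: str) -> str:
        # an item belongs to the first node clockwise from its position
        i = bisect.bisect(self.points, ring_position(item_id)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
owner = ring.node_for("item-0001")   # deterministic on every replica
```

    Because the mapping is deterministic, any party holding a copy of the data can recompute where each entry belongs, which is what makes reconstruction after data loss, or migration to another DHT, possible without per-entry coordination.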

  5. UQlust: combining profile hashing with linear-time ranking for efficient clustering and analysis of big macromolecular data.

    PubMed

    Adamczak, Rafal; Meller, Jarek

    2016-12-28

    Advances in computing have enabled current protein and RNA structure prediction and molecular simulation methods to dramatically increase their sampling of conformational spaces. The quickly growing number of experimentally resolved structures, and databases such as the Protein Data Bank, also implies large scale structural similarity analyses to retrieve and classify macromolecular data. Consequently, the computational cost of structure comparison and clustering for large sets of macromolecular structures has become a bottleneck that necessitates further algorithmic improvements and development of efficient software solutions. uQlust is a versatile and easy-to-use tool for ultrafast ranking and clustering of macromolecular structures. uQlust makes use of structural profiles of proteins and nucleic acids, while combining a linear-time algorithm for implicit comparison of all pairs of models with profile hashing to enable efficient clustering of large data sets with a low memory footprint. In addition to ranking and clustering of large sets of models of the same protein or RNA molecule, uQlust can also be used in conjunction with fragment-based profiles in order to cluster structures of arbitrary length. For example, hierarchical clustering of the entire PDB using profile hashing can be performed on a typical laptop, thus opening an avenue for structural explorations previously limited to dedicated resources. The uQlust package is freely available under the GNU General Public License at https://github.com/uQlust . uQlust represents a drastic reduction in the computational complexity and memory requirements with respect to existing clustering and model quality assessment methods for macromolecular structure analysis, while yielding results on par with traditional approaches for both proteins and RNAs.
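
    The profile-hashing idea above, bucketing models whose coarse structural profiles collide instead of comparing all pairs, can be sketched as follows. The profile strings here are invented stand-ins (e.g. secondary-structure strings), not uQlust's actual profile encoding.

```python
import hashlib
from collections import defaultdict

def profile_key(profile: str) -> str:
    # stable hash of a coarse structural profile string
    return hashlib.md5(profile.encode()).hexdigest()

def cluster_by_profile(models: dict) -> list:
    # models with identical profiles land in the same bucket in O(n),
    # avoiding the O(n^2) all-pairs structure comparison
    buckets = defaultdict(list)
    for name, profile in models.items():
        buckets[profile_key(profile)].append(name)
    return sorted(buckets.values())

models = {                      # hypothetical secondary-structure profiles
    "model1": "HHHEEECCC",
    "model2": "HHHEEECCC",
    "model3": "CCCHHHEEE",
}
clusters = cluster_by_profile(models)   # [['model1', 'model2'], ['model3']]
```

    The memory footprint stays low because only the hash keys and member lists are held, never a pairwise distance matrix.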

  6. Dual-tail approach to discovery of novel carbonic anhydrase IX inhibitors by simultaneously matching the hydrophobic and hydrophilic halves of the active site.

    PubMed

    Hou, Zhuang; Lin, Bin; Bao, Yu; Yan, Hai-Ning; Zhang, Miao; Chang, Xiao-Wei; Zhang, Xin-Xin; Wang, Zi-Jie; Wei, Gao-Fei; Cheng, Mao-Sheng; Liu, Yang; Guo, Chun

    2017-05-26

    A dual-tail approach was employed to design novel carbonic anhydrase (CA) IX inhibitors by simultaneously matching the hydrophobic and hydrophilic halves of the active site, which also contains a zinc ion as part of the catalytic center. The classic sulfanilamide moiety was used as the zinc binding group. An amino glucosamine fragment was chosen as the hydrophilic part and a cinnamamide fragment as the hydrophobic part in order to draw favorable interactions with the corresponding halves of the active site. In comparison with sulfanilamide, which is largely devoid of the hydrophilic and hydrophobic interactions with the two halves of the active site, the compounds designed and synthesized in this study showed a 1000-fold improvement in binding affinity. Most of the compounds inhibited the CA effectively, with IC50 values in the range of 7-152 nM. Compound 14e (IC50: 7 nM) was more effective than the reference drug acetazolamide (IC50: 30 nM). The results proved that the dual-tail approach of simultaneously matching the hydrophobic and hydrophilic halves of the active site by linking hydrophobic and hydrophilic fragments is useful for designing novel CA inhibitors. The effectiveness of these compounds was elucidated by both the experimental data and molecular docking simulations. This work lays a solid foundation for further development of novel CA IX inhibitors for cancer treatment. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  7. Gencrypt: one-way cryptographic hashes to detect overlapping individuals across samples

    PubMed Central

    Turchin, Michael C.; Hirschhorn, Joel N.

    2012-01-01

    Summary: Meta-analysis across genome-wide association studies is a common approach for discovering genetic associations. However, in some meta-analysis efforts, individual-level data cannot be broadly shared by study investigators due to privacy and Institutional Review Board concerns. In such cases, researchers cannot confirm that each study represents a unique group of people, leading to potentially inflated test statistics and false positives. To resolve this problem, we created a software tool, Gencrypt, which utilizes a security protocol known as one-way cryptographic hashes to allow overlapping participants to be identified without sharing individual-level data. Availability: Gencrypt is freely available under the GNU General Public License v3 at http://www.broadinstitute.org/software/gencrypt/. Contact: joelh@broadinstitute.org. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22302573
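
    The overlap check reduces to intersecting salted one-way hashes rather than raw identifiers. A minimal sketch under stated assumptions: the shared salt and subject IDs are hypothetical, and Gencrypt itself derives its codes from genotype data rather than from plain subject labels.

```python
import hashlib

def one_way_code(identifier: str, shared_salt: str) -> str:
    # the same salt across studies makes codes comparable,
    # but the raw identifier itself is never exchanged
    return hashlib.sha256((shared_salt + "|" + identifier).encode()).hexdigest()

SALT = "consortium-2014"    # hypothetical value agreed out of band

study_a = {one_way_code(s, SALT) for s in ("subj-001", "subj-002", "subj-003")}
study_b = {one_way_code(s, SALT) for s in ("subj-003", "subj-004")}

overlap = study_a & study_b       # exactly the shared participant
print(len(overlap))               # 1
```

    Each site publishes only its set of hash codes; intersecting the sets reveals how many participants overlap without revealing who they are to anyone lacking the original identifiers.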

  8. A dynamic re-partitioning strategy based on the distribution of key in Spark

    NASA Astrophysics Data System (ADS)

    Zhang, Tianyu; Lian, Xin

    2018-05-01

    Spark, a memory-based distributed data-processing framework, can process massive data and has become a focus in Big Data research. However, the performance of Spark Shuffle depends on the distribution of the data: the naive hash partition function of Spark cannot guarantee load balancing when data are skewed, and job completion time is dominated by the node with the most data to process. To handle this problem, dynamic sampling is used. During task execution, a histogram counts the key frequency distribution on each node, and the global key frequency distribution is then generated. After analyzing the distribution of keys, load-balanced data partitioning is achieved. Results show that the Dynamic Re-Partitioning function outperforms the default hash partition, Fine Partition, and the Balanced-Schedule strategy; it reduces task execution time and improves the efficiency of the whole cluster.
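
    Once the global key-frequency histogram is known, one simple balancing rule is to assign the heaviest keys first, each to the partition with the least load so far. This greedy sketch illustrates the idea only; it is not the paper's actual Dynamic Re-Partitioning code.

```python
import heapq
from collections import Counter

def balanced_partitioner(key_counts: Counter, num_partitions: int) -> dict:
    # heaviest keys first, each to the currently lightest partition
    heap = [(0, p) for p in range(num_partitions)]
    heapq.heapify(heap)
    assignment = {}
    for key, count in key_counts.most_common():
        load, part = heapq.heappop(heap)
        assignment[key] = part
        heapq.heappush(heap, (load + count, part))
    return assignment

# skewed key histogram, as a naive hash partitioner would see it
counts = Counter({"a": 1000, "b": 50, "c": 40, "d": 30, "e": 20})
parts = balanced_partitioner(counts, 2)
# the hot key "a" gets a partition to itself; the light keys share the other
```

    A plain `hash(key) % num_partitions` could just as easily co-locate "a" with other keys, leaving one task doing almost all the work, which is exactly the skew problem the abstract describes.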

  9. Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search.

    PubMed

    Xianglong Liu; Zhujin Li; Cheng Deng; Dacheng Tao

    2017-11-01

    Hashing has proved to be an attractive technique for fast nearest neighbor search over big data. Compared with projection-based hashing methods, prototype-based ones have stronger power to generate discriminative binary codes for data with complex intrinsic structure. However, existing prototype-based methods, such as spherical hashing and K-means hashing, still suffer from ineffective coding that utilizes the complete binary codes in a hypercube. To address this problem, we propose an adaptive binary quantization (ABQ) method that learns a discriminative hash function with prototypes associated with small unique binary codes. Our alternating optimization adaptively discovers the prototype set and the code set of a varying size in an efficient way, which together robustly approximate the data relations. Our method can be naturally generalized to the product space for long hash codes, and enjoys fast training, linear in the number of training data. We further devise a distributed framework for large-scale learning, which can significantly speed up the training of ABQ in the distributed environments that are now widely deployed in many areas. The extensive experiments on four large-scale (up to 80 million) data sets demonstrate that our method significantly outperforms state-of-the-art hashing methods, with up to 58.84% relative performance gains.
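
    For contrast with the prototype-based coding above, the projection-based baseline it improves on is easy to sketch: sign random projections assign one bit per random hyperplane, so nearby points tend to share codes. Dimensions, bit count, and the sample vector below are arbitrary choices for illustration.

```python
import random

random.seed(0)
DIM, BITS = 8, 16
# one random Gaussian hyperplane per output bit
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def srp_hash(x):
    # each bit records which side of a hyperplane the point falls on
    return tuple(int(sum(w * xi for w, xi in zip(p, x)) >= 0) for p in planes)

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

x = [0.5, -1.2, 0.3, 2.0, -0.7, 0.1, 1.5, -0.4]
code = srp_hash(x)          # 16-bit binary code for the vector
```

    The hypercube-coverage weakness the abstract points at is visible here: every one of the 2^16 codes is in principle reachable, whether or not the data occupies that region of space.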

  10. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Jia, C. J.; Wang, Y.; Mendl, C. B.; Moritz, B.; Devereaux, T. P.

    2018-03-01

    We describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian, sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the "checkerboard" decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.
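
    The "position of an element in an ordered list" above is a combinatorial ranking: for occupation bitstrings with a fixed particle number, the index follows directly from binomial coefficients, so the integer list never needs to be stored. A sketch of that standard ranking (not necessarily the paper's exact formulation):

```python
from itertools import combinations
from math import comb

def state_index(state: int) -> int:
    """Rank of an occupation bitstring among all bitstrings with the
    same number of set bits, ordered by integer value (0-indexed)."""
    index, ones_seen, position = 0, 0, 0
    while state:
        if state & 1:
            ones_seen += 1
            # the i-th set bit (1-indexed) at position p contributes C(p, i)
            index += comb(position, ones_seen)
        state >>= 1
        position += 1
    return index

# all 4-site states with 2 particles, in increasing integer order
states = sorted(sum(1 << b for b in bits) for bits in combinations(range(4), 2))
print([state_index(s) for s in states])   # [0, 1, 2, 3, 4, 5]
```

    Because the index is computed on the fly in O(number of sites), lookup matches binary search asymptotically while eliminating the stored list, which is the aggregate-memory saving the abstract describes.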

  11. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    DOE PAGES

    Jia, C. J.; Wang, Y.; Mendl, C. B.; ...

    2017-12-02

    Here, we describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian, sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the “checkerboard” decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.

  12. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, C. J.; Wang, Y.; Mendl, C. B.

    Here, we describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian, sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the “checkerboard” decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.

  13. An incremental community detection method for social tagging systems using locality-sensitive hashing.

    PubMed

    Wu, Zhenyu; Zou, Ming

    2014-10-01

    An increasing number of users interact, collaborate, and share information through social networks. Unprecedented growth in social networks is generating a significant amount of unstructured social data. From such data, distilling communities where users have common interests and tracking variations of users' interests over time are important research tracks in fields such as opinion mining, trend prediction, and personalized services. However, these tasks are extremely difficult considering the highly dynamic characteristics of the data. Existing community detection methods are time consuming, making it difficult to process data in real time. In this paper, dynamic unstructured data is modeled as a stream. Tag assignments stream clustering (TASC), an incremental scalable community detection method, is proposed based on locality-sensitive hashing. Both tags and latent interactions among users are incorporated in the method. In our experiments, the social dynamic behaviors of users are first analyzed. The proposed TASC method is then compared with state-of-the-art clustering methods such as StreamKmeans and incremental k-clique; results indicate that TASC can detect communities more efficiently and effectively. Copyright © 2014 Elsevier Ltd. All rights reserved.
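
    Locality-sensitive hashing is what makes the stream clustering incremental: similar tag sets collide under the hash, so a new tag assignment only needs comparing against its bucket. A MinHash sketch of that idea (the tag sets are invented, and TASC's actual LSH scheme may differ):

```python
import hashlib

def minhash_signature(items, num_hashes=64):
    # one minimum over a keyed hash per row -> a compact set signature
    return [
        min(int.from_bytes(hashlib.sha1(f"{i}|{x}".encode()).digest()[:8], "big")
            for x in items)
        for i in range(num_hashes)
    ]

def estimated_jaccard(sig_a, sig_b):
    # the fraction of agreeing rows estimates the true Jaccard similarity
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

alice = {"python", "hashing", "clustering", "lsh"}
bob = {"python", "hashing", "clustering", "spark"}
similarity = estimated_jaccard(minhash_signature(alice), minhash_signature(bob))
# true Jaccard is 3/5 = 0.6; the estimate should land near it
```

    Signatures are fixed-size regardless of how many tags a user accumulates, which is why this style of sketch suits unbounded streams.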

  14. Building Application-Related Patient Identifiers: What Solution for a European Country?

    PubMed Central

    Quantin, Catherine; Allaert, François-André; Avillach, Paul; Fassa, Maniane; Riandey, Benoît; Trouessin, Gilles; Cohen, Olivier

    2008-01-01

    We propose a method utilizing a derived social security number with the same reliability as the social security number. We show that anonymization techniques classically based on unidirectional hash functions (such as the secure hash algorithm SHA-2) can guarantee the security, quality, and reliability of information when applied to the social security number. Hashing produces a strictly anonymous code that is always the same for a given individual, and thus enables patient data to be linked. Different solutions are developed and proposed in this article. Hashing the social security number will make it possible to link the information in the personal medical file to other national health information sources with the aim of completing or validating the personal medical record or conducting epidemiological and clinical research. This data linkage would meet the anonymous data requirements of the European directive on data protection. PMID:18401447
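
    The linkage property above is simply determinism of a one-way hash: the same identifier always yields the same anonymous code. A minimal sketch; the identifier and the secret "pepper" value are hypothetical, and real deployments add such keyed secrets precisely to resist dictionary attacks on the small SSN space.

```python
import hashlib

def anonymous_code(ssn: str, pepper: str) -> str:
    # one-way: the code cannot be inverted to recover the SSN,
    # yet it is stable, so records for the same person link up
    return hashlib.sha256((pepper + ssn).encode()).hexdigest()

PEPPER = "national-secret"                # hypothetical secret value
hospital_record = anonymous_code("185057512345678", PEPPER)
registry_record = anonymous_code("185057512345678", PEPPER)
print(hospital_record == registry_record)   # True: same person, codes link
```

    Two databases that each store only these codes can be joined on the code column without either side ever handling the raw social security numbers.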

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draelos, Timothy John; Dautenhahn, Nathan; Schroeppel, Richard Crabtree

    The security of the widely-used cryptographic hash function SHA1 has been impugned. We have developed two replacement hash functions. The first, SHA1X, is a drop-in replacement for SHA1. The second, SANDstorm, has been submitted as a candidate to the NIST-sponsored SHA3 Hash Function competition.

  16. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and [InlineEquation not available: see fulltext.] Discretization

    NASA Astrophysics Data System (ADS)

    Wai Kuan, Yip; Teoh, Andrew B. J.; Ngo, David C. L.

    2006-12-01

    We introduce a novel method for secure computation of biometric hash on dynamic hand signatures using BioPhasor mixing and [InlineEquation not available: see fulltext.] discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific [InlineEquation not available: see fulltext.] discretization acts both as an error correction step as well as a real-to-binary space converter. We also propose a new method of extracting compressed representation of dynamic hand signatures using discrete wavelet transform (DWT) and discrete Fourier transform (DFT). Without the conventional use of dynamic time warping, the proposed method avoids storage of user's hand signature template. This is an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method could produce stable and distinguishable bit strings with equal error rates (EERs) of [InlineEquation not available: see fulltext.] and [InlineEquation not available: see fulltext.] for random and skilled forgeries for the stolen token (worst case) scenario, and [InlineEquation not available: see fulltext.] for both forgeries in the genuine token (optimal) scenario.

  17. [Linking anonymous databases for national and international multicenter epidemiological studies: a cryptographic algorithm].

    PubMed

    Quantin, C; Fassa, M; Coatrieux, G; Riandey, B; Trouessin, G; Allaert, F A

    2009-02-01

    Compiling individual records which come from different sources remains very important for multicenter epidemiological studies, but at the same time European directives or other national legislation concerning nominal data processing have to be respected. These legal aspects can be satisfied by implementing mechanisms that allow anonymization of patient data (such as hashing techniques). Moreover, for security reasons, official recommendations suggest using different cryptographic keys in combination with a cryptographic hash function for each study. Unfortunately, such an anonymization procedure is in contradiction with the common requirement in public health and biomedical research as it becomes almost impossible to link records from separate data collections where the same entity is not referenced in the same way. Solving this paradox by using methodology based on the combination of hashing and enciphering techniques is the main aim of this article. The method relies on one of the best known hashing functions (the secure hash algorithm) to ensure the anonymity of personal information while providing greater resistance to dictionary attacks, combined with encryption techniques. The originality of the method relies on the way the combination of hashing and enciphering techniques is performed: like in asymmetric encryption, two keys are used but the private key depends on the patient's identity. The combination of hashing and enciphering techniques provides a great improvement in the overall security of the proposed scheme. This methodology makes the stored data available for use in the field of public health for the benefit of patients, while respecting legal security requirements.

  18. Multimodal Discriminative Binary Embedding for Large-Scale Cross-Modal Retrieval.

    PubMed

    Wang, Di; Gao, Xinbo; Wang, Xiumei; He, Lihuo; Yuan, Bo

    2016-10-01

    Multimodal hashing, which conducts effective and efficient nearest neighbor search across heterogeneous data on large-scale multimedia databases, has been attracting increasing interest, given the explosive growth of multimedia content on the Internet. Recent multimodal hashing research mainly aims at learning compact binary codes that preserve semantic information given by labels. The overwhelming majority of these methods are similarity-preserving approaches which approximate the pairwise similarity matrix with Hamming distances between the to-be-learnt binary hash codes. However, these methods ignore the discriminative property in the hash learning process, which leaves hash codes from different classes indistinguishable and therefore reduces the accuracy and robustness of the nearest neighbor search. To this end, we present a novel multimodal hashing method, named multimodal discriminative binary embedding (MDBE), which focuses on learning discriminative hash codes. First, the proposed method formulates hash function learning in terms of classification, where the binary codes generated by the learned hash functions are expected to be discriminative. Then, it exploits the label information to discover the shared structures inside heterogeneous data. Finally, the learned structures are preserved for hash codes to produce similar binary codes in the same class. Hence, the proposed MDBE can preserve both discriminability and similarity for hash codes, and will enhance retrieval accuracy. Thorough experiments on benchmark data sets demonstrate that the proposed method achieves excellent accuracy and competitive computational efficiency compared with the state-of-the-art methods for the large-scale cross-modal retrieval task.

  19. Stability of Dihydroartemisinin-Piperaquine Tablet Halves During Prolonged Storage Under Tropical Conditions.

    PubMed

    Hodel, Eva Maria; Kaur, Harparkash; Terlouw, Dianne J

    2017-02-08

    Dihydroartemisinin-piperaquine (DP) is recommended for the treatment of uncomplicated malaria, used in efforts to contain artemisinin resistance, and increasingly considered for mass drug administration. Because of the narrow therapeutic dose range and available tablet strengths, the manufacturer's and World Health Organization's recommended regimens involve breaking tablets into halves to dose children accurately according to body weight. Use of tablet fractions in programmatic settings under tropical conditions requires a highly stable product; however, the stability of DP tablet fractions is unknown. We aged full and half DP (Eurartesim®) tablets in a stability chamber at 30°C and a 70% humidity level. The active pharmaceutical ingredients dihydroartemisinin and piperaquine remained at ≥95% over the 3-month ageing period, in both light and darkness. These findings are reassuring for DP, but highlight the need to assess drug stability under real-life settings during the drug development process, particularly for key drugs of global disease control programs. © The American Society of Tropical Medicine and Hygiene.

  20. Development of a cellulose-based insulating composite material for green buildings: Case of treated organic waste (paper, cardboard, hash)

    NASA Astrophysics Data System (ADS)

    Ouargui, Ahmed; Belouaggadia, Naoual; Elbouari, Abdeslam; Ezzine, Mohammed

    2018-05-01

    Buildings are responsible for 36% of final energy consumption in Morocco [1-2], and reducing the energy consumption of buildings is a priority for the kingdom in order to reach its energy-saving goals. One of the most effective actions to reduce energy consumption is the selection and development of innovative and efficient building materials [3]. In this work, we present an experimental study of the effect of adding treated organic waste (paper, cardboard, hash) on the mechanical and thermal properties of cement and clay bricks. Thermal conductivity, specific heat, and mechanical resistance were investigated as functions of additive content and size; soaking time and drying temperature were also taken into account. The results reveal that thermal conductivity decreases for both the paper-cement and the paper-clay mixtures and seems to stabilize at around 40%. For the paper-cement composite, an additive quantity exceeding 15% gives a compressive strength exceeding the standard for hollow non-load-bearing masonry. The paper-clay mixture, however, seems to give more interesting compressive-strength results for a mass composition of 15% paper. Given these positive results, it seems possible to use these composites in the walls, ceilings, and roofs of housing while minimizing the energy consumption of the building.

  1. Sex hormone-dependent tRNA halves enhance cell proliferation in breast and prostate cancers.

    PubMed

    Honda, Shozo; Loher, Phillipe; Shigematsu, Megumi; Palazzo, Juan P; Suzuki, Ryusuke; Imoto, Issei; Rigoutsos, Isidore; Kirino, Yohei

    2015-07-21

    Sex hormones and their receptors play critical roles in the development and progression of the breast and prostate cancers. Here we report that a novel type of transfer RNA (tRNA)-derived small RNA, termed Sex HOrmone-dependent TRNA-derived RNAs (SHOT-RNAs), are specifically and abundantly expressed in estrogen receptor (ER)-positive breast cancer and androgen receptor (AR)-positive prostate cancer cell lines. SHOT-RNAs are not abundantly present in ER(-) breast cancer, AR(-) prostate cancer, or other examined cancer cell lines from other tissues. ER-dependent accumulation of SHOT-RNAs is not limited to a cell culture system, but it also occurs in luminal-type breast cancer patient tissues. SHOT-RNAs are produced from aminoacylated mature tRNAs by angiogenin-mediated anticodon cleavage, which is promoted by sex hormones and their receptors. Resultant 5'- and 3'-SHOT-RNAs, corresponding to 5'- and 3'-tRNA halves, bear a cyclic phosphate (cP) and an amino acid at the 3'-end, respectively. By devising a "cP-RNA-seq" method that is able to exclusively amplify and sequence cP-containing RNAs, we identified the complete repertoire of 5'-SHOT-RNAs. Furthermore, 5'-SHOT-RNA, but not 3'-SHOT-RNA, has significant functional involvement in cell proliferation. These results have unveiled a novel tRNA-engaged pathway in tumorigenesis of hormone-dependent cancers and implicate SHOT-RNAs as potential candidates for biomarkers and therapeutic targets.

  2. Sex hormone-dependent tRNA halves enhance cell proliferation in breast and prostate cancers

    PubMed Central

    Honda, Shozo; Loher, Phillipe; Shigematsu, Megumi; Palazzo, Juan P.; Suzuki, Ryusuke; Imoto, Issei; Rigoutsos, Isidore; Kirino, Yohei

    2015-01-01

    Sex hormones and their receptors play critical roles in the development and progression of the breast and prostate cancers. Here we report that a novel type of transfer RNA (tRNA)-derived small RNA, termed Sex HOrmone-dependent TRNA-derived RNAs (SHOT-RNAs), are specifically and abundantly expressed in estrogen receptor (ER)-positive breast cancer and androgen receptor (AR)-positive prostate cancer cell lines. SHOT-RNAs are not abundantly present in ER− breast cancer, AR− prostate cancer, or other examined cancer cell lines from other tissues. ER-dependent accumulation of SHOT-RNAs is not limited to a cell culture system, but it also occurs in luminal-type breast cancer patient tissues. SHOT-RNAs are produced from aminoacylated mature tRNAs by angiogenin-mediated anticodon cleavage, which is promoted by sex hormones and their receptors. Resultant 5′- and 3′-SHOT-RNAs, corresponding to 5′- and 3′-tRNA halves, bear a cyclic phosphate (cP) and an amino acid at the 3′-end, respectively. By devising a “cP-RNA-seq” method that is able to exclusively amplify and sequence cP-containing RNAs, we identified the complete repertoire of 5′-SHOT-RNAs. Furthermore, 5′-SHOT-RNA, but not 3′-SHOT-RNA, has significant functional involvement in cell proliferation. These results have unveiled a novel tRNA-engaged pathway in tumorigenesis of hormone-dependent cancers and implicate SHOT-RNAs as potential candidates for biomarkers and therapeutic targets. PMID:26124144

  3. Non-enzymatic browning due to storage is reduced by using clarified lemon juice as acidifier in industrial-scale production of canned peach halves.

    PubMed

    Saura, Domingo; Vegara, Salud; Martí, Nuria; Valero, Manuel; Laencina, José

    2017-06-01

    Non-enzymatic browning (NEB) in canned peach halves in syrup during storage was investigated. Absorbance at 420 nm (A420), colorimetric parameters (CIE Lab, TCD and La/b), fructose, glucose and sucrose, total sugar, organic acids, ascorbic acid (AA), dehydroascorbic acid, and 2,3-diketogulonic acid were used to estimate the extent of NEB during 1 year of storage at 30 °C, and the relationship between each of these parameters and A420 was established. The investigation was carried out to explore the possibility of replacing the E330 commonly used as acidifier by turbid or clarified lemon juice (TLJ or CLJ) to obtain a product having good nutrition with better retention of quality. The a, La/b, glucose and fructose were positively correlated with A420 and all proved to be good indicators of browning development. Overall results showed that replacement of acidifier E330 with CLJ for controlling pH in canned peach halves in syrup had some advantages.

  4. Folding units in calcium vector protein of amphioxus: Structural and functional properties of its amino- and carboxy-terminal halves.

    PubMed

    Baladi, S; Tsvetkov, P O; Petrova, T V; Takagi, T; Sakamoto, H; Lobachov, V M; Makarov, A A; Cox, J A

    2001-04-01

    Muscle of amphioxus contains large amounts of a four EF-hand Ca2+-binding protein, CaVP, and its target, CaVPT. To study the domain structure of CaVP and assess the structurally important determinants for its interaction with CaVPT, we expressed CaVP and its amino (N-CaVP) and carboxy-terminal halves (C-CaVP). The interactive properties of recombinant and wild-type CaVP are very similar, despite three post-translational modifications in the wild-type protein. N-CaVP does not bind Ca2+, shows a well-formed hydrophobic core, and melts at 44 degrees C. C-CaVP binds two Ca2+ with intrinsic dissociation constants of 0.22 and 140 microM (i.e., very similar to the entire CaVP). The metal-free domain in CaVP and C-CaVP shows no distinct melting transition, whereas its 1Ca2+ and 2Ca2+ forms melt in the 111-123 degrees C range, suggesting that C-CaVP and the carboxy-domain of CaVP are natively unfolded in the metal-free state and progressively gain structure upon binding of 1Ca2+ and 2Ca2+. Thermal denaturation studies provide evidence for interdomain interaction: the apo, 1Ca2+ and 2Ca2+ states of the carboxy-domain destabilize to different degrees the amino-domain. Only C-CaVP forms a Ca2+-dependent 1:1 complex with CaVPT. Our results suggest that the carboxy-terminal domain of CaVP interacts with CaVPT and that the amino-terminal lobe modulates this interaction.

  5. Folding units in calcium vector protein of amphioxus: Structural and functional properties of its amino- and carboxy-terminal halves

    PubMed Central

    Baladi, Sibyl; Tsvetkov, Philipp O.; Petrova, Tatiana V.; Takagi, Takashi; Sakamoto, Hiroshi; Lobachov, Vladimir M.; Makarov, Alexander A.; Cox, Jos A.

    2001-01-01

    Muscle of amphioxus contains large amounts of a four EF-hand Ca2+-binding protein, CaVP, and its target, CaVPT. To study the domain structure of CaVP and assess the structurally important determinants for its interaction with CaVPT, we expressed CaVP and its amino (N-CaVP) and carboxy-terminal halves (C-CaVP). The interactive properties of recombinant and wild-type CaVP are very similar, despite three post-translational modifications in the wild-type protein. N-CaVP does not bind Ca2+, shows a well-formed hydrophobic core, and melts at 44°C. C-CaVP binds two Ca2+ with intrinsic dissociation constants of 0.22 and 140 μM (i.e., very similar to the entire CaVP). The metal-free domain in CaVP and C-CaVP shows no distinct melting transition, whereas its 1Ca2+ and 2Ca2+ forms melt in the 111°–123°C range, suggesting that C-CaVP and the carboxy- domain of CaVP are natively unfolded in the metal-free state and progressively gain structure upon binding of 1Ca2+ and 2Ca2+. Thermal denaturation studies provide evidence for interdomain interaction: the apo, 1Ca2+ and 2Ca2+ states of the carboxy-domain destabilize to different degrees the amino-domain. Only C-CaVP forms a Ca2+-dependent 1:1 complex with CaVPT. Our results suggest that the carboxy-terminal domain of CaVP interacts with CaVPT and that the amino-terminal lobe modulates this interaction. PMID:11274468

  6. Self-Organized Link State Aware Routing for Multiple Mobile Agents in Wireless Network

    NASA Astrophysics Data System (ADS)

    Oda, Akihiro; Nishi, Hiroaki

    Recently, the importance of data sharing structures in autonomous distributed networks has been increasing, and wireless sensor networks are used for managing distributed data. This type of distributed network requires effective information exchanging methods for data sharing. To reduce the traffic of broadcast messages, reducing the amount of redundant information is indispensable, and QoS-sensitive routing algorithms have been frequently discussed as a way to reduce packet loss in mobile ad-hoc networks. The topology of a wireless network is likely to change frequently according to the movement of mobile nodes, radio disturbance, or fading due to continuous changes in the environment; a packet routing algorithm should therefore guarantee QoS by using quality indicators of the wireless network. In this paper, a novel information exchanging algorithm based on a hash function and a Boolean operation is proposed. The algorithm achieves efficient information exchange by reducing the overhead of broadcast messages, can guarantee QoS in a wireless network environment, and can be applied to routing in a mobile ad-hoc network. In the proposed routing algorithm, a routing table is constructed by using the received signal strength indicator (RSSI), and neighborhood information is periodically broadcast based on this table. The proposed hash-based routing entry management using an extended MAC address eliminates the overhead of message flooding. An analysis of hash-value collisions determines the minimally required length of the hash values; based on a mathematical derivation, an optimum hash function for that length can be given. Simulations are carried out to evaluate the effectiveness of the proposed algorithm and to validate the theory in a general wireless network routing algorithm.
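    The record does not give the exact hash-and-Boolean construction, but the idea of exchanging a neighbor set as a fixed-size bit vector can be sketched with a Bloom-filter-style summary (a minimal sketch under that assumption; `NeighborSummary` and its parameters are hypothetical names, not the authors' API):

```python
import hashlib

def _hash_bit(item: str, salt: int, m: int) -> int:
    """Map an item (e.g. an extended MAC address) to a bit position in [0, m)."""
    digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
    return int.from_bytes(digest[:4], "big") % m

class NeighborSummary:
    """Fixed-size Boolean summary of a node's neighbor set.

    Membership uses k independent hash positions; OR-ing two summaries
    merges neighbor sets, keeping each broadcast at a constant m bits
    regardless of how many neighbors a node has.
    """
    def __init__(self, m: int = 256, k: int = 3):
        self.m, self.k, self.bits = m, k, 0

    def add(self, mac: str) -> None:
        for salt in range(self.k):
            self.bits |= 1 << _hash_bit(mac, salt, self.m)

    def might_contain(self, mac: str) -> bool:
        # False positives are possible; their rate is what the paper's
        # collision analysis would bound when choosing the hash length.
        return all((self.bits >> _hash_bit(mac, salt, self.m)) & 1
                   for salt in range(self.k))

    def merge(self, other: "NeighborSummary") -> None:
        self.bits |= other.bits   # Boolean OR combines two nodes' views
```

Longer bit vectors (larger m) trade message size against the false-positive rate, which mirrors the record's point that collision analysis fixes the minimally required hash length.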

  7. The Amordad database engine for metagenomics.

    PubMed

    Behnam, Ehsan; Smith, Andrew D

    2014-10-15

    Several technical challenges in metagenomic data analysis, including assembling metagenomic sequence data or identifying operational taxonomic units, are both significant and well known. These forms of analysis are increasingly cited as conceptually flawed, given the extreme variation within traditionally defined species and rampant horizontal gene transfer. Furthermore, computational requirements of such analysis have hindered content-based organization of metagenomic data at large scale. In this article, we introduce the Amordad database engine for alignment-free, content-based indexing of metagenomic datasets. Amordad places the metagenome comparison problem in a geometric context, and uses an indexing strategy that combines random hashing with a regular nearest neighbor graph. This framework allows refinement of the database over time by continual application of random hash functions, with the effect of each hash function encoded in the nearest neighbor graph. This eliminates the need to explicitly maintain the hash functions in order for query efficiency to benefit from the accumulated randomness. Results on real and simulated data show that Amordad can support logarithmic query time for identifying similar metagenomes even as the database size reaches into the millions. Source code, licensed under the GNU general public license (version 3) is freely available for download from http://smithlabresearch.org/amordad andrewds@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
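    The "random hashing in a geometric context" idea can be illustrated with sign-random-projection hashing, where each random hyperplane contributes one signature bit and similar vectors (e.g. k-mer frequency profiles) agree on most bits. This is a generic sketch of that family, not Amordad's exact hash functions:

```python
import random

def random_hyperplanes(dim, n_bits, seed=0):
    """n_bits random directions; each one contributes one signature bit."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

def signature(vec, planes):
    """One bit per hyperplane: which side of the plane the vector falls on."""
    return tuple(int(sum(p * v for p, v in zip(plane, vec)) >= 0)
                 for plane in planes)

def hamming(a, b):
    """Disagreeing bits approximate the angle between the original vectors."""
    return sum(x != y for x, y in zip(a, b))
```

Applying further random hash functions over time appends more signature bits, which is one way to read the record's claim that accumulated randomness sharpens queries without the hash functions being stored explicitly.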

  8. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
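    The core trick that distinguishes multi-probe LSH from "vanilla" LSH is that a query inspects not only its own hash bucket but also buckets whose keys differ in a few positions, recovering recall without extra hash tables. A minimal sketch of such a probing sequence over binary bucket keys (the function name and bit-flip ordering are illustrative assumptions, not the evaluated variants' exact scheme):

```python
from itertools import combinations

def probe_sequence(key, n_flips=1):
    """Multi-probe order for a binary bucket key: visit the exact bucket
    first, then every bucket whose key differs in up to n_flips bits."""
    yield key                       # the bucket vanilla LSH would check
    n = len(key)
    for r in range(1, n_flips + 1):
        for idx in combinations(range(n), r):
            probed = list(key)
            for i in idx:
                probed[i] ^= 1      # flip the chosen bit(s)
            yield tuple(probed)
```

With k-bit keys and n_flips=1 this visits k+1 buckets per table, trading a little query time for recall instead of storing additional tables.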

  9. Ontology-Based Peer Exchange Network (OPEN)

    ERIC Educational Resources Information Center

    Dong, Hui

    2010-01-01

    In current Peer-to-Peer networks, distributed and semantic-free indexing is widely used by systems adopting "Distributed Hash Table" ("DHT") mechanisms. Although such systems typically solve a user query rather fast in a deterministic way, they only support a very narrow search scheme, namely the exact hash key match. Furthermore, DHT systems put…

  10. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410

  11. A Study on the Secure User Profiling Structure and Procedure for Home Healthcare Systems.

    PubMed

    Ko, Hoon; Song, MoonBae

    2016-01-01

    Despite various benefits such as convenience and efficiency, home healthcare systems have inherent security risks that may cause a serious leak of personal health information. This work presents a Secure User Profiling Structure that holds patient information, including health information. A patient and a hospital each keep a copy and share the updated data; while they share the data and communicate, the data can be leaked. To solve these security problems, a secure communication channel based on a hash function and a One-Time Password (OTP) between a client and a hospital should be established, and to generate the input value of the OTP, a dual hash function is used. This work presents a dual hash function-based approach to generating the One-Time Password, ensuring a secure communication channel with a secured key. As a result, attackers are unable to decrypt the leaked information because of the secured key; in addition, the proposed method outperforms existing methods in terms of computation cost.
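    The record does not spell out the dual hash construction, but chaining two different hash functions to derive an OTP from a shared key and a moving factor can be sketched as follows (a hypothetical construction for illustration; the function name, hash pair, and 6-digit truncation are assumptions, not the paper's scheme):

```python
import hashlib

def dual_hash_otp(shared_key: bytes, counter: int) -> str:
    """Hypothetical dual-hash OTP: an inner SHA-256 over key||counter is
    re-hashed with a second function (SHA-512), then truncated to a
    6-digit code, so a leaked code reveals neither hash input."""
    inner = hashlib.sha256(shared_key + counter.to_bytes(8, "big")).digest()
    outer = hashlib.sha512(inner).digest()
    code = int.from_bytes(outer[:4], "big") % 1_000_000
    return f"{code:06d}"
```

Client and hospital holding the same shared key and counter derive the same code, while an eavesdropper on the channel sees only short-lived 6-digit values.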

  12. Non-Black-Box Simulation from One-Way Functions and Applications to Resettable Security

    DTIC Science & Technology

    2012-11-05

    from 2001, Barak (FOCS'01) introduced a novel non-black-box simulation technique. This technique enabled the construction of new cryptographic...primitives, such as resettably-sound zero-knowledge arguments, that cannot be proven secure using just black-box simulation techniques. The work of Barak ... Barak requires the existence of collision-resistant hash functions, and a very recent result by Bitansky and Paneth (FOCS'12) instead requires the

  13. Tag Content Access Control with Identity-based Key Exchange

    NASA Astrophysics Data System (ADS)

    Yan, Liang; Rong, Chunming

    2010-09-01

    Radio Frequency Identification (RFID) technology, used to identify objects and users, has recently been applied to many applications such as retail and supply chain. How to prevent tag content from unauthorized readout is a core problem of RFID privacy. The hash-lock access control protocol can make a tag release its content only to a reader who knows the secret key shared between them. However, in order to get the shared secret key required by this protocol, the reader needs to communicate with a back-end database. In this paper, we propose to use an identity-based secret key exchange approach to generate the secret key required for the hash-lock access control protocol. With this approach, not only is the back-end database connection no longer needed, but the tag cloning problem can also be eliminated at the same time.
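    The hash-lock mechanism itself is simple: the tag stores only a metaID = h(key) and stays locked until a reader presents a key whose hash matches. A minimal sketch of that tag-side logic (class and field names are illustrative; the paper's contribution is obtaining `key` via identity-based key exchange rather than a database lookup):

```python
import hashlib

def h(data: bytes) -> bytes:
    """One-way hash used by the hash-lock scheme (SHA-256 here)."""
    return hashlib.sha256(data).digest()

class Tag:
    """RFID tag under hash-lock: it stores metaID = h(key), never the key."""
    def __init__(self, key: bytes):
        self.meta_id = h(key)
        self.locked = True
        self._content = "tag content"

    def query(self) -> bytes:
        # A locked tag answers queries with its metaID only.
        return self.meta_id

    def unlock(self, key: bytes):
        # The tag releases its content only if the presented key hashes
        # to the stored metaID; anything else leaves it locked.
        if h(key) == self.meta_id:
            self.locked = False
            return self._content
        return None
```

Because only h(key) is stored, reading the tag's memory does not reveal the key, which is what makes the key-distribution step (database lookup or, here, identity-based exchange) the interesting part of the protocol.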

  14. HIA: a genome mapper using hybrid index-based sequence alignment.

    PubMed

    Choi, Jongpill; Park, Kiejung; Cho, Seong Beom; Chung, Myungguen

    2015-01-01

    A number of alignment tools have been developed to align sequencing reads to the human reference genome, but the scale of information from next-generation sequencing (NGS) experiments is increasing rapidly. Recent studies based on NGS technology have routinely produced exome or whole-genome sequences from several hundreds or thousands of samples. To accommodate the increasing need to analyze very large NGS data sets, it is necessary to develop faster, more sensitive and more accurate mapping tools. HIA uses two indices, a hash table index and a suffix array index: the hash table performs direct lookup of a q-gram, and the suffix array performs very fast lookup of variable-length strings by exploiting binary search. We observed that combining a hash table and a suffix array (hybrid index) is much faster than the suffix array alone for finding a substring in the reference sequence. We define a matching region (MR) as a longest common substring between a reference and a read, and the candidate alignment regions (CARs) as a list of MRs that are close to each other. The hybrid index is used to find candidate alignment regions (CARs) between a reference and a read, and we found that aligning only the unmatched regions in a CAR is much faster than aligning the whole CAR. In benchmark analysis, HIA outperformed the other aligners in mapping speed without significant loss of mapping accuracy. Our experiments show that the hybrid of hash table and suffix array is useful, in terms of speed, for mapping NGS reads to the human reference genome sequence. In conclusion, our tool is appropriate for aligning the massive data sets generated by NGS sequencing.
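    The hybrid-index idea can be sketched in a few lines: a hash table jumps straight to the suffix-array range sharing the pattern's first q-gram, and binary search finishes the lookup within that range. A toy illustration of the combination (function names and the naive suffix-array construction are my own; HIA's implementation is of course far more engineered):

```python
from bisect import bisect_left

def build_index(ref: str, q: int = 3):
    """Hybrid index: a suffix array of the reference plus a hash table
    mapping each q-gram to its (sorted) suffix-array ranks."""
    sa = sorted(range(len(ref)), key=lambda i: ref[i:])   # naive O(n^2 log n)
    qgrams = {}
    for rank, pos in enumerate(sa):
        key = ref[pos:pos + q]
        if len(key) == q:
            qgrams.setdefault(key, []).append(rank)
    return sa, qgrams

def find(ref, sa, qgrams, pattern, q=3):
    """Direct q-gram lookup narrows the search to one bucket of suffixes,
    then binary search (bisect) locates the pattern inside it."""
    ranks = qgrams.get(pattern[:q], [])
    suffixes = [ref[sa[r]:] for r in ranks]     # already in sorted order
    i = bisect_left(suffixes, pattern)
    if i < len(suffixes) and suffixes[i].startswith(pattern):
        return sa[ranks[i]]                     # one match position
    return -1
```

The q-gram bucket replaces the first ~2q comparisons of a full binary search over all suffixes, which is where the speedup over a plain suffix array comes from.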

  15. Learning binary code via PCA of angle projection for image retrieval

    NASA Astrophysics Data System (ADS)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. Principal component analysis (PCA) is widely used in compact hashing methods: most such methods adopt PCA projection functions to project the original data into several dimensions of real values, and each projected dimension is then quantized into one bit by thresholding. However, the variances of the projected dimensions differ, and real-valued projection produces large quantization error. To avoid this, in this paper we propose to use cosine similarity projection for each dimension; the angle projection keeps the original structure and is more compact with the cosine values. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
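    The project-then-threshold pipeline the paragraph describes can be shown in a few lines: each projection produces one real value, quantized to one bit at zero, and retrieval compares codes by Hamming distance. A minimal sketch (fixed projection directions stand in for learned PCA axes; this is the generic pipeline, not the paper's cosine variant):

```python
def project(vec, directions):
    """Project a feature vector onto each direction (in PCA hashing these
    would be principal axes; fixed directions here for illustration)."""
    return [sum(d * v for d, v in zip(direction, vec))
            for direction in directions]

def binarize(projected):
    """Quantize each real-valued projection to one bit by thresholding at 0.
    This step is where the quantization error discussed above arises."""
    return tuple(int(p >= 0) for p in projected)

def hamming(a, b):
    """Retrieval compares compact codes by Hamming distance."""
    return sum(x != y for x, y in zip(a, b))
```

Everything between projection and thresholding (variance balancing, ITQ's rotation, the paper's angle projection) exists to make the sign bits lose less of the real-valued structure.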

  16. Is rule of halves still an occurrence in South India: Findings from community-based survey in a selected urban area of Puducherry.

    PubMed

    Kar, S S; Kalaiselvi, S; Archana, R; Saya, G K; Premarajan, K C

    2017-01-01

    The objective of the present study was to assess the applicability of the rule of halves in an urban population of Puducherry, South India. We also aimed to find the correlates associated with undiagnosed hypertension to facilitate targeted screening. We derive our observations from a community-based cross-sectional study conducted using the World Health Organization STEPwise approach to surveillance in an urban slum of Puducherry during 2014-15. Blood pressure (BP) was measured for all study subjects (n = 2399), and subjects were classified as hypertensive using Joint National Committee 8 criteria: systolic BP (SBP) ≥140 mmHg and/or diastolic BP (DBP) ≥90 mmHg and/or known hypertension and/or treatment with antihypertensive drugs. Controlled hypertension was defined as SBP <140 mmHg and DBP <90 mmHg. Of the 2399 adults, 799 (33.3%; 95% confidence interval [CI]: 31.4%-35.2%) were found to have raised BP by any means (known and unknown hypertensives), and 367 (15.3% of participants; 95% CI: 13.9%-16.8%) were known hypertensives. Of the known hypertensives, 74.7% (274/367) were put on treatment (drugs and/or lifestyle modification), and 80% (218/274) were on regular treatment. A higher proportion of men than women had undiagnosed hypertension (26.1 vs. 19.8%, P < 0.001). Similarly, adults from below the poverty line (23.8 vs. 20%, P < 0.001), unskilled laborers (26.6 vs. 20%, P < 0.001), and those with literacy less than middle school (12.3 vs. 23%, P < 0.001) had more undiagnosed hypertension. In the selected urban area of Puducherry, around one-third of the adult population has hypertension, and 54% of those with hypertension are undiagnosed. Adults from vulnerable subgroups, such as those with lower literacy, below the poverty line, and in unskilled work, were found to have higher proportions of undiagnosed hypertension.
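    The reported cascade can be checked arithmetically from the abstract's own counts; a quick recomputation of the headline percentages:

```python
# Counts reported in the abstract.
surveyed = 2399
hypertensive = 799     # raised BP by any means (known and unknown)
known = 367            # previously diagnosed
treated = 274          # on drugs and/or lifestyle modification
regular = 218          # on regular treatment

undiagnosed = hypertensive - known   # 432 adults

print(round(100 * hypertensive / surveyed, 1))   # 33.3 -> prevalence
print(round(100 * undiagnosed / hypertensive))   # 54   -> undiagnosed share
print(round(100 * treated / known, 1))           # 74.7 -> known on treatment
```

Note the departure from the classical rule of halves: roughly half of hypertensives are indeed undiagnosed (54%), but about three-quarters of those diagnosed are treated rather than half.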

  17. Achaete-Scute Homolog 1 Expression Controls Cellular Differentiation of Neuroblastoma

    PubMed Central

    Kasim, Mumtaz; Heß, Vicky; Scholz, Holger; Persson, Pontus B.; Fähling, Michael

    2016-01-01

    Neuroblastoma, the major cause of infant cancer deaths, results from fast proliferation of undifferentiated neuroblasts. Treatment of high-risk neuroblastoma includes differentiation with retinoic acid (RA); however, the resistance of many of these tumors to RA-induced differentiation poses a considerable challenge. Human achaete-scute homolog 1 (hASH1) is a proneural basic helix-loop-helix transcription factor essential for neurogenesis and is often upregulated in neuroblastoma. Here, we identified a novel function for hASH1 in regulating the differentiation phenotype of neuroblastoma cells. Global analysis of 986 human neuroblastoma datasets revealed a negative correlation between hASH1 and neuron differentiation that was independent of the N-myc (MYCN) oncogene. Using RA to induce neuron differentiation in two neuroblastoma cell lines displaying high and low levels of hASH1 expression, we confirmed the link between hASH1 expression and the differentiation defective phenotype, which was reversed by silencing hASH1 or by hypoxic preconditioning. We further show that hASH1 suppresses neuronal differentiation by inhibiting transcription at the RA receptor element. Collectively, our data indicate hASH1 to be key for understanding neuroblastoma resistance to differentiation therapy and pave the way for hASH1-targeted therapies for augmenting the response of neuroblastoma to differentiation therapy. PMID:28066180

  18. Research on target tracking algorithm based on spatio-temporal context

    NASA Astrophysics Data System (ADS)

    Li, Baiping; Xu, Sanmei; Kang, Hongjuan

    2017-07-01

    In this paper, a novel target tracking algorithm based on spatio-temporal context is proposed. During tracking, camera shaking or occlusion may lead to tracking failure; the proposed algorithm can solve this problem effectively. The method uses the spatio-temporal context algorithm as its core. The target region in the first frame is selected with the mouse, and the spatio-temporal context algorithm then tracks the target through the sequence of frames. During this process, a similarity measure function based on a perceptual hash algorithm is used to judge the tracking results; if tracking fails, the initial value of the Mean Shift algorithm is reset for subsequent target tracking. Experimental results show that the proposed algorithm can achieve real-time and stable tracking under camera shaking or target occlusion.
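    A perceptual-hash similarity check of the kind described can be sketched with the simplest variant, average hashing: each bit records whether a pixel is brighter than the patch mean, and two patches are judged the same target when their hashes differ in few bits (the record does not say which perceptual hash is used, so this is an illustrative stand-in):

```python
def average_hash(gray):
    """Perceptual hash of a grayscale patch (2-D list of pixel values):
    one bit per pixel, set when the pixel exceeds the patch mean.
    Real aHash first downsamples to a small fixed grid, e.g. 8x8."""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    return tuple(int(p > mean) for p in pixels)

def similar(h1, h2, max_bits=5):
    """Tracking result accepted when the hashes differ in few bits."""
    return sum(a != b for a, b in zip(h1, h2)) <= max_bits
```

Because the bits encode only coarse brightness structure, the hash is stable under small illumination and position changes but diverges sharply when the tracked window drifts onto the background, which is what makes it usable as a failure detector.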

  19. Associations between butane hash oil use and cannabis-related problems.

    PubMed

    Meier, Madeline H

    2017-10-01

    High-potency cannabis concentrates are increasingly popular in the United States, and there is concern that use of high-potency cannabis might increase risk for cannabis-related problems. However, little is known about the potential negative consequences of concentrate use. This study reports on associations between past-year use of a high-potency cannabis concentrate, known as butane hash oil (BHO), and cannabis-related problems. A sample of 821 college students were recruited to complete a survey about their health and behavior. Participants who had used cannabis in the past year (33%, n=273) completed questions about their cannabis use, including their use of BHO and cannabis-related problems in eight domains: physical dependence, impaired control, academic-occupational problems, social-interpersonal problems, self-care problems, self-perception, risk behavior, and blackouts. Approximately 44% (n=121) of past-year cannabis users had used BHO in the past year. More frequent BHO use was associated with higher levels of physical dependence (RR=1.8, p<0.001), impaired control (RR=1.3, p<0.001), cannabis-related academic/occupational problems (RR=1.5, p=0.004), poor self-care (RR=1.3, p=0.002), and cannabis-related risk behavior (RR=1.2, p=0.001). After accounting for sociodemographic factors, age of onset of cannabis use, sensation seeking, overall frequency of cannabis use, and frequency of other substance use, BHO use was still associated with higher levels of physical dependence (RR=1.2, p=0.014). BHO use is associated with greater physiological dependence on cannabis, even after accounting for potential confounders. Longitudinal research is needed to determine if cannabis users with higher levels of physiological dependence seek out BHO and/or if BHO use increases risk for physiological dependence. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even a one-bit change in the image file will cause the recalculated image hash to be totally different from the secure hash.
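    The hash-then-sign flow described above can be sketched with SHA-256 and textbook RSA using deliberately tiny toy parameters (the classic p=61, q=53 example; completely insecure, for illustration only, and not the patent's actual algorithm choices):

```python
import hashlib

# Toy RSA parameters: n = 61*53 = 3233, e*d = 1 mod phi(n).
p, q = 61, 53
n = p * q
e = 17
d = 2753

def sign(image_bytes: bytes) -> int:
    """Camera side: hash the image file, then 'encrypt' the hash with
    the private exponent d to produce the digital signature."""
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(h, d, n)

def verify(image_bytes: bytes, signature: int) -> bool:
    """Verifier side: decrypt the signature with the public exponent e
    and compare with a freshly computed hash of the image file."""
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(signature, e, n) == h
```

Any alteration of the image bytes changes the recomputed hash, so the decrypted signature no longer matches and verification fails.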

  1. Speeding Up Sigmatropic Shifts-To Halve or to Hold.

    PubMed

    Tantillo, Dean J

    2016-04-19

    Catalysis is common. Rational catalyst design, however, is at the frontier of chemical science. Although the histories of physical organic and synthetic organic chemistry boast key chapters involving [3s,3s] sigmatropic shifts, catalysis of these reactions is much less common than catalysis of ostensibly more complex processes. The comparative dearth of catalysts for sigmatropic shifts is perhaps a result of the perception that transition state structures for these reactions, like their reactants, are nonpolar and therefore not amenable to selective stabilization and its associated barrier lowering. However, as demonstrated in this Account, transition state structures for [3s,3s] sigmatropic shifts can in fact have charge distributions that differ significantly from those of reactants, even for hydrocarbon substrates, allowing for barriers to be decreased and rates increased. In some cases, differences in charge distribution result from the inclusion of heteroatoms at specific positions in reactants, but in other cases differences are actually induced by catalysts. Perhaps surprisingly, strategies for complexation of transition state structures that remain nonpolar are also possible. In general, the strategies for catalysis employed can be characterized as involving either mechanistic intervention, where a catalyst induces a change from the concerted mechanism expected for a [3s,3s] sigmatropic shift to a multistep process (cutting the transformation into halves or smaller pieces) whose overall barrier is decreased relative to the concerted process, or transition state complexation, where a catalyst simply binds (holds) more tightly to the transition state structure for a [3s,3s] sigmatropic shift than to the reactant, leading to a lower barrier in the presence of the catalyst. 
    Both of these strategies can be considered to be biomimetic in that enzymes frequently induce multistep processes and utilize selective transition state stabilization for the steps involved.

  2. A one-time pad color image cryptosystem based on SHA-3 and multiple chaotic systems

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Wang, Siwei; Zhang, Yingqian; Luo, Chao

    2018-04-01

    A novel image encryption algorithm is proposed that combines the SHA-3 hash function and two chaotic systems: the hyper-chaotic Lorenz and Chen systems. First, a 384-bit keystream hash value is obtained by applying SHA-3 to the plaintext. The sensitivity of the SHA-3 algorithm and the chaotic systems ensures the effect of a one-time pad. Second, the color image is expanded into three-dimensional space. During permutation, it undergoes plane-plane displacements in the x, y and z dimensions. During diffusion, we use the adjacent pixel dataset and the corresponding chaotic value to encrypt each pixel. Finally, the structure of alternating between permutation and diffusion is applied to enhance the level of security. Furthermore, we design techniques to improve the algorithm's encryption speed. Our experimental simulations show that the proposed cryptosystem achieves excellent encryption performance and can resist brute-force, statistical, and chosen-plaintext attacks.
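    The first step, deriving a 384-bit plaintext-dependent value with SHA-3, is directly available in the Python standard library; a minimal sketch of that step alone (in the scheme this value would seed the chaotic systems' initial conditions, which is what produces the one-time-pad effect):

```python
import hashlib

def keystream_seed(plaintext: bytes) -> str:
    """384-bit SHA-3 hash of the plaintext; because SHA-3 is highly
    sensitive to its input, a one-bit change in the image yields an
    entirely different seed, hence an entirely different keystream."""
    return hashlib.sha3_384(plaintext).hexdigest()
```

The 96 hex characters (384 bits) would then be partitioned into the parameters and initial values of the Lorenz and Chen systems.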

  3. A Proposal for Kelly Criterion-Based Lossy Network Compression

    DTIC Science & Technology

    2016-03-01

    warehousing and data mining techniques for cyber security. New York (NY): Springer; 2007. p. 83–108. 34. Münz G, Li S, Carle G. Traffic anomaly...p. 188–196. 48. Kim NU, Park MW, Park SH, Jung SM, Eom JH, Chung TM. A study on effective hash-based load balancing scheme for parallel NIDS. In

  4. Semi-Supervised Geographical Feature Detection

    NASA Astrophysics Data System (ADS)

    Yu, H.; Yu, L.; Kuo, K. S.

    2016-12-01

    Extracting and tracking geographical features is a fundamental requirement in many geoscience fields, but it has become an increasingly challenging task for domain scientists tackling large amounts of geoscience data. Although domain scientists may have a relatively clear definition of a feature, it is difficult to capture its presence accurately and efficiently. We propose a semi-supervised approach to large-scale geographical feature detection. Our approach has two main components. First, we represent heterogeneous geoscience data in a unified high-dimensional space, which lets us evaluate the similarity of data points with respect to geolocation, time, and variable values; we characterize the data with these measures and use a set of hash functions to parameterize the initial knowledge of the data. Second, for any user query, our approach can automatically extract initial results based on the hash functions. To improve query accuracy, our approach provides a visualization interface that displays the query results and allows users to interactively explore and refine them; this user feedback is used to enhance our knowledge base iteratively. In our implementation, we use high-performance computing techniques to accelerate the construction of the hash functions. Our design facilitates a parallelization scheme for feature detection and extraction, traditionally a challenging problem for large-scale data. We evaluate our approach and demonstrate its effectiveness using both synthetic and real-world datasets.
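    One simple way hash functions can encode similarity in a unified (geolocation, time, value) space is quantization: points are bucketed by cell, so candidate matches for a query come from a single lookup. This is a hypothetical illustration of the idea, not the authors' hash family; `grid_hash` and its cell sizes are invented for the sketch:

```python
def grid_hash(lat, lon, t, value, cell=(1.0, 1.0, 3600.0, 0.5)):
    """Hypothetical quantizing hash over a unified high-dimensional point:
    nearby points in space, time, and variable value share a bucket key.
    cell = (degrees lat, degrees lon, seconds, value units)."""
    return tuple(int(x // c) for x, c in zip((lat, lon, t, value), cell))
```

Several such functions with offset or rescaled grids would be used together in practice, since a single grid splits near neighbors that straddle a cell boundary.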

  5. A Novel Fast and Secure Approach for Voice Encryption Based on DNA Computing

    NASA Astrophysics Data System (ADS)

    Kakaei Kate, Hamidreza; Razmara, Jafar; Isazadeh, Ayaz

    2018-06-01

    Today, in the world of information communication, voice information has particular importance. One way to protect voice data from attacks is voice encryption. Encryption algorithms use various techniques such as hashing, chaotic maps, mixing, and many others. In this paper, an algorithm is proposed for voice encryption based on three different schemes to increase the flexibility and strength of the algorithm. The proposed algorithm uses an innovative encoding scheme, the DNA encryption technique, and a permutation function to provide a secure and fast solution for voice encryption. The algorithm is evaluated on various measures including signal-to-noise ratio, peak signal-to-noise ratio, correlation coefficient, signal similarity, and signal frequency content. The results demonstrate the applicability of the proposed method for secure and fast encryption of voice files.

  6. Cryptographic framework for document-objects resulting from multiparty collaborative transactions.

    PubMed

    Goh, A

    2000-01-01

    Multiparty transactional frameworks--i.e. Electronic Data Interchange (EDI) or Health Level (HL) 7--often result in composite documents which can be accurately modelled using hyperlinked document-objects. The structural complexity arising from multiauthor involvement and transaction-specific sequencing would be poorly handled by conventional digital signature schemes based on a single evaluation of a one-way hash function and asymmetric cryptography. In this paper we outline the generation of structure-specific authentication hash-trees for the authentication of transactional document-objects, followed by asymmetric signature generation on the hash-tree value. Server-side multi-client signature verification would probably constitute the single most compute-intensive task, hence the motivation for our usage of the Rabin signature protocol which results in significantly reduced verification workloads compared to the more commonly applied Rivest-Shamir-Adleman (RSA) protocol. Data privacy is handled via symmetric encryption of message traffic using session-specific keys obtained through key-negotiation mechanisms based on discrete-logarithm cryptography. Individual client-to-server channels can be secured using a double key-pair variation of Diffie-Hellman (DH) key negotiation, usage of which also enables bidirectional node authentication. The reciprocal server-to-client multicast channel is secured through Burmester-Desmedt (BD) key-negotiation which enjoys significant advantages over the usual multiparty extensions to the DH protocol. The implementation of hash-tree signatures and bi/multidirectional key negotiation results in a comprehensive cryptographic framework for multiparty document-objects satisfying both authentication and data privacy requirements.
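    As a rough illustration of the hash-tree signing idea described above, here is a minimal sketch in Python, with SHA-256 standing in for the unspecified one-way hash; the structure is a plain binary Merkle tree, not the paper's structure-specific variant:

```python
import hashlib

def h(data: bytes) -> bytes:
    """One-way hash for leaves and internal nodes (SHA-256 as a stand-in)."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold document fragments into a single root value; only the root is signed."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"header", b"author-A section", b"author-B section"])
```

    Changing any one fragment changes the root, so a single asymmetric signature over the root authenticates the whole composite document.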

  7. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a biometric authentication algorithm for ID cardholders previously presented, namely the security of the embedded reference data and the aging process of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and a document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card will be scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging process of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and we illustrate the watermarking embedding, retrieval and dispute protocols, analyzing their requisites, advantages and disadvantages in relation to security requirements.

  8. Optimization of incremental structure from motion combining a random k-d forest and pHash for unordered images in a complex scene

    NASA Astrophysics Data System (ADS)

    Zhan, Zongqian; Wang, Chendong; Wang, Xin; Liu, Yi

    2018-01-01

    On the basis of today's popular virtual reality and scientific visualization, three-dimensional (3-D) reconstruction is widely used in disaster relief, virtual shopping, reconstruction of cultural relics, etc. In the traditional incremental structure from motion (incremental SFM) method, the time cost of the matching is one of the main factors restricting the popularization of this method. To make the whole matching process more efficient, we propose a preprocessing method before the matching process: (1) we first construct a random k-d forest with the large-scale scale-invariant feature transform features in the images and combine this with the pHash method to obtain a value of relatedness, (2) we then construct a connected weighted graph based on the relatedness value, and (3) we finally obtain a planned sequence of adding images according to the principle of the minimum spanning tree. On this basis, we attempt to thin the minimum spanning tree to reduce the number of matchings and ensure that the images are well distributed. The experimental results show a great reduction in the number of matchings with enough object points, with only a small influence on the inner stability, which proves that this method can quickly and reliably improve the efficiency of the SFM method with unordered multiview images in complex scenes.

  9. A novel image retrieval algorithm based on PHOG and LSH

    NASA Astrophysics Data System (ADS)

    Wu, Hongliang; Wu, Weimin; Peng, Jiajin; Zhang, Junyuan

    2017-08-01

    PHOG can describe the local shape of an image and its spatial relationships. The use of the PHOG algorithm to extract image features has achieved good results in image recognition, retrieval, and other applications. In recent years, the locality-sensitive hashing (LSH) algorithm has outperformed traditional algorithms on large-scale data when solving near-neighbor problems. This paper presents a novel image retrieval algorithm based on PHOG and LSH. First, we use PHOG to extract the feature vector of the image; then we use L different LSH hash tables to reduce the dimension of the PHOG feature to index values and map them to different buckets; finally, we extract the corresponding values of the images in the bucket for a second-stage retrieval using the Manhattan distance. This algorithm can scale to massive image retrieval, ensuring high retrieval accuracy while reducing the time complexity of the retrieval, and is therefore of practical significance.
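    A minimal sketch of the bucket-and-rerank pipeline the abstract describes, assuming random-hyperplane LSH as the hash family (the paper's exact hash construction and the parameters L and bits are not specified here and are illustrative):

```python
import random

def make_table(dim: int, bits: int):
    """One LSH table: `bits` random hyperplanes map a vector to a bucket key."""
    planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(bits)]
    def key(v):
        return tuple(sum(p * x for p, x in zip(plane, v)) >= 0 for plane in planes)
    return key

def build_index(vectors, L=4, bits=8):
    """L independent tables; each vector lands in one bucket per table."""
    tables = [(make_table(len(vectors[0]), bits), {}) for _ in range(L)]
    for i, v in enumerate(vectors):
        for key, buckets in tables:
            buckets.setdefault(key(v), []).append(i)
    return tables

def query(tables, vectors, q):
    """Union the candidate buckets, then re-rank by Manhattan distance."""
    cands = {i for key, buckets in tables for i in buckets.get(key(q), [])}
    return sorted(cands, key=lambda i: sum(abs(a - b) for a, b in zip(vectors[i], q)))
```

    Only the vectors that share a bucket with the query in at least one table are ranked, which is what keeps the retrieval sublinear in the collection size.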

  10. Blood and milk polymorphonuclear leukocyte and monocyte/macrophage functions in naturally caprine arthritis encephalitis virus infection in dairy goats.

    PubMed

    Santos, Bruna Parapinski; Souza, Fernando Nogueira; Blagitz, Maiara Garcia; Batista, Camila Freitas; Bertagnon, Heloísa Godoi; Diniz, Soraia Araújo; Silva, Marcos Xavier; Haddad, João Paulo Amaral; Della Libera, Alice Maria Melville Paiva

    2017-06-01

    The exact influence of caprine arthritis encephalitis virus (CAEV) infection on blood and milk polymorphonuclear leukocytes (PMNLs) and monocytes/macrophages of goats remains unclear. Thus, the present study sought to explore blood and milk PMNL and monocyte/macrophage functions in naturally CAEV-infected goats. The present study used 18 healthy Saanen goats that were segregated according to sera test outcomes into serologically CAEV-negative (n=8; 14 halves) and -positive (n=10; 14 halves) groups. All milk samples from mammary halves with bacteriologically positive milk outcomes, somatic cell count ≥2×10⁶ cells mL⁻¹, or abnormal secretions in the strip cup test were excluded. We evaluated the percentage of blood and milk PMNLs and monocytes/macrophages, the viability of PMNLs and monocytes/macrophages, the levels of intracellular reactive oxygen species (ROS), and the nonopsonized phagocytosis of Staphylococcus aureus and Escherichia coli by flow cytometry. In the present study, a higher percentage of milk macrophages (CD14⁺) and milk polymorphonuclear leukocytes undergoing late apoptosis or necrosis (Annexin-V⁺/propidium iodide⁺) was observed in CAEV-infected goats; we did not find any further alterations in blood and milk PMNL and monocyte/macrophage functions. Thus, regarding our results, goats naturally infected with CAEV did not reveal pronounced dysfunctions in blood and milk polymorphonuclear leukocytes and monocytes/macrophages. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Spectrally and angularly resolved measurements of three-halves harmonic emission from laser-produced plasmas

    NASA Astrophysics Data System (ADS)

    Kang, N.; Liu, H.; Lin, Z.; Lei, A.; Zhou, S.; Fang, Z.; An, H.; Li, K.; Fan, W.

    2017-10-01

    Spectra of three-halves harmonic emission (3ω₀/2) from laser-produced plasmas were measured at different angles, on both the forward and backward sides of the direction of the incident laser beams. The 3ω₀/2 emitted from carbon-hydrogen (CH) targets was observed to be larger than that from aluminum (Al) targets at the same incident laser intensity, which supports the argument that the two-plasmon decay (TPD) instability could be inhibited by using a medium-Z ablator instead of a CH ablator in direct-drive inertial confinement fusion. In addition, the measured 3ω₀/2-versus-incident-intensity curves for both materials suggest a lower TPD threshold than the calculated values. In experiments with thin Al targets, the angular distribution of the blue- and red-shifted peaks of the 3ω₀/2 spectra was obtained, which shows that the most intense blue- and red-shifted peaks may not be produced by paired plasmons; the spectra produced by their ‘twin’ plasmons were not observed. Because the 3ω₀/2 emission may have been influenced by other physical processes during propagation from its birthplace to the detectors, the mismatches in emission angle, wavelength shift, and threshold may be qualitatively explained by assuming that small-scale light filaments widely existed in the corona of the laser-produced plasmas.

  12. Git as an Encrypted Distributed Version Control System

    DTIC Science & Technology

    2015-03-01

    options. The algorithm uses AES-256 counter mode with an IV derived from SHA-1-HMAC hash (this is nearly identical to the GCM mode discussed earlier)...built into the internal structure of Git. Every file in a Git repository is checksummed with a SHA-1 hash, a one-way function with arbitrarily long...implementation. Git-encrypt calls OpenSSL cryptography library command line functions. The default cipher used is AES-256 Electronic Code Book (ECB), which is
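    The content addressing mentioned above can be reproduced directly: Git's SHA-1 checksum of a file (a "blob" object) is the hash of a small typed header followed by the file bytes:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """SHA-1 over Git's object header plus content, as `git hash-object` computes."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

print(git_blob_hash(b"hello\n"))  # same id as `echo hello | git hash-object --stdin`
```

    Because the header includes the object type and length, two different objects with overlapping bytes still get distinct hashes.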

  13. Recognition of functional sites in protein structures.

    PubMed

    Shulman-Peleg, Alexandra; Nussinov, Ruth; Wolfson, Haim J

    2004-06-04

    Recognition of regions on the surface of one protein that are similar to a binding site of another is crucial for the prediction of molecular interactions and for functional classification. We first describe a novel method, SiteEngine, that assumes no sequence or fold similarities and is able to recognize proteins that have similar binding sites and may perform similar functions. We achieve high efficiency and speed by introducing a low-resolution surface representation via chemically important surface points, by hashing triangles of physico-chemical properties, and by applying hierarchical scoring schemes for a thorough exploration of global and local similarities. We proceed to rigorously apply this method to functional site recognition in three possible ways: first, we search for a given functional site on a large set of complete protein structures. Second, a potential functional site on a protein of interest is compared with known binding sites, to recognize similar features. Third, a complete protein structure is searched for the presence of an a priori unknown functional site, similar to known sites. Our method is robust and efficient enough to allow computationally demanding applications such as the first and the third. From the biological standpoint, the first application may identify secondary binding sites of drugs that may lead to side effects. The third application finds new potential sites on the protein that may provide targets for drug design. Each of the three applications may aid in assigning a function and in classifying binding patterns. We highlight the advantages and disadvantages of each type of search, provide examples of large-scale searches of the entire Protein Data Bank, and make functional predictions.

  14. QKD-Based Secured Burst Integrity Design for Optical Burst Switched Networks

    NASA Astrophysics Data System (ADS)

    Balamurugan, A. M.; Sivasubramanian, A.; Parvathavarthini, B.

    2016-03-01

    The field of optical transmission has undergone numerous advancements and is still being researched, mainly because optical data transmission can be done at enormous speeds. It is quite evident that people prefer optical communication when transmitting large amounts of data. The concept of switching in networks has matured enormously through research on architectures and implementation methods, from optical circuit switching to optical burst switching. Optical burst switching is regarded as a viable solution for switching bursts over networks but has several security vulnerabilities. This work addresses the security issues associated with optical burst switching with respect to the integrity of the burst. The proposed Quantum Key based Secure Hash Algorithm (QKBSHA-512), with an enhanced compression function design, provides a better avalanche effect than conventional integrity algorithms.

  15. An Intelligent Web-Based System for Diagnosing Student Learning Problems Using Concept Maps

    ERIC Educational Resources Information Center

    Acharya, Anal; Sinha, Devadatta

    2017-01-01

    The aim of this article is to propose a method for development of concept map in web-based environment for identifying concepts a student is deficient in after learning using traditional methods. Direct Hashing and Pruning algorithm was used to construct concept map. Redundancies within the concept map were removed to generate a learning sequence.…

  16. A secure and robust password-based remote user authentication scheme using smart cards for the integrated EPR information system.

    PubMed

    Das, Ashok Kumar

    2015-03-01

    An integrated EPR (Electronic Patient Record) information system of all the patients provides the medical institutions and the academia with most of the patients' information in detail for them to make corrective decisions and clinical decisions in order to maintain and analyze patients' health. In such a system, illegal access must be restricted and information theft during transmission over the insecure Internet must be prevented. Lee et al. proposed an efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. Their scheme is very efficient due to its usage of one-way hash functions and bitwise exclusive-or (XOR) operations. However, in this paper, we show that though their scheme is very efficient, it has three security weaknesses: (1) it has design flaws in the password change phase, (2) it fails to protect against privileged insider attacks and (3) it lacks formal security verification. We also find that another recently proposed scheme, Wen's scheme, has the same security drawbacks as Lee et al.'s scheme. In order to remedy the security weaknesses found in Lee et al.'s scheme and Wen's scheme, we propose a secure and efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. We show that our scheme is also efficient as compared to Lee et al.'s scheme and Wen's scheme, as our scheme only uses one-way hash functions and bitwise exclusive-or (XOR) operations. Through the security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme for formal security verification using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks.
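    To illustrate the general flavor of the hash-and-XOR constructions such schemes are built from (this is a deliberately simplified toy, not Lee et al.'s, Wen's, or the proposed scheme):

```python
import hashlib, os

def H(*parts: bytes) -> bytes:
    """The one-way hash such schemes rely on (SHA-256 as a stand-in)."""
    return hashlib.sha256(b"".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Registration: the smart card stores the hashed password masked by a server secret.
server_key = os.urandom(32)
password = b"correct horse"
card_value = xor(H(password), server_key)

# Login: re-entering the password unmasks the secret, which answers a fresh nonce.
nonce = os.urandom(32)
response = H(xor(card_value, H(password)), nonce)   # equals H(server_key, nonce)
```

    Only hashes and XORs are needed at login time, which is why such schemes are cheap; their security, as the paper shows, hinges on details like the password-change phase and insider access.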

  17. The 1.3 A resolution structure of the RNA tridecamer r(GCGUUUGAAACGC): metal ion binding correlates with base unstacking and groove contraction.

    PubMed

    Timsit, Youri; Bombard, Sophie

    2007-12-01

    Metal ions play a key role in RNA folding and activity. Elucidating the rules that govern the binding of metal ions is therefore an essential step toward better understanding RNA functions. High-resolution data are a prerequisite for a detailed structural analysis of ion binding on RNA and, in particular, for the observation of monovalent cations. Here, the high-resolution crystal structures of the tridecamer duplex r(GCGUUUGAAACGC) crystallized under different conditions provide new structural insights into ion binding on GAAA/UUU sequences, which exhibit both unusual structural and functional properties in RNA. The present study extends the repertory of RNA ion binding sites by showing that the first two bases of UUU triplets constitute a specific site for sodium ions. A striking asymmetric pattern of metal ion binding in the two equivalent halves of the palindromic sequence demonstrates that the sequence and its environment act together to bind metal ions. A highly ionophilic half that binds six metal ions allows, for the first time, the observation of a disodium cluster in RNA. The comparison of the equivalent halves of the duplex provides experimental evidence that ion binding correlates with structural alterations and groove contraction.

  18. Loudness function derives from data on electrical discharge rates in auditory nerve fibers

    NASA Technical Reports Server (NTRS)

    Howes, W. L.

    1973-01-01

    Judgements of the loudness of pure-tone sound stimuli yield a loudness function which relates perceived loudness to stimulus amplitude. Here a loudness function is derived from physical evidence alone, without regard to human judgments. The resultant loudness function is L = K(q - q0), where L is loudness, q is effective sound pressure (q0 at the loudness threshold), and K is generally a weak function of the number of stimulated auditory nerve fibers. The predicted function is in agreement with loudness judgment data reported by Warren, which imply that, in the suprathreshold loudness regime, decreasing the sound-pressure level by 6 dB results in halving the loudness.
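    The 6 dB observation can be checked numerically against L = K(q - q0): a 6 dB drop scales the sound pressure by 10^(-6/20) ≈ 0.501, so well above threshold the predicted loudness is roughly halved (the constants below are arbitrary):

```python
# Check the 6 dB claim against L = K*(q - q0); constants are illustrative.
K, q0 = 1.0, 1e-4
q = 1.0                                   # effective pressure well above threshold
L_before = K * (q - q0)
L_after = K * (q * 10 ** (-6 / 20) - q0)  # a 6 dB drop scales q by ~0.501
print(round(L_after / L_before, 3))       # 0.501
```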

  19. Security analysis of boolean algebra based on Zhang-Wang digital signature scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jinbin, E-mail: jbzheng518@163.com

    2014-10-06

    In 2005, Zhang and Wang proposed an improved signature scheme that uses neither a one-way hash function nor message redundancy. In this paper, we show that this scheme exhibits potential security weaknesses through an analysis based on Boolean algebra, such as bitwise exclusive-or. We also point out, by analyzing the output of an assembly program segment, that the mapping between assembly instructions and machine code is not actually one-to-one, which may cause security problems unknown to the software.

  20. An image retrieval framework for real-time endoscopic image retargeting.

    PubMed

    Ye, Menglong; Johns, Edward; Walter, Benjamin; Meining, Alexander; Yang, Guang-Zhong

    2017-08-01

    Serial endoscopic examinations of a patient are important for early diagnosis of malignancies in the gastrointestinal tract. However, retargeting for optical biopsy is challenging due to extensive tissue variations between examinations, requiring the method to be tolerant to these changes whilst enabling real-time retargeting. This work presents an image retrieval framework for inter-examination retargeting. We propose both a novel image descriptor tolerant of long-term tissue changes and a novel descriptor matching method in real time. The descriptor is based on histograms generated from regional intensity comparisons over multiple scales, offering stability over long-term appearance changes at the higher levels, whilst remaining discriminative at the lower levels. The matching method then learns a hashing function using random forests, to compress the string and allow for fast image comparison by a simple Hamming distance metric. A dataset that contains 13 in vivo gastrointestinal videos was collected from six patients, representing serial examinations of each patient, which includes videos captured with significant time intervals. Precision-recall for retargeting shows that our new descriptor outperforms a number of alternative descriptors, whilst our hashing method outperforms a number of alternative hashing approaches. We have proposed a novel framework for optical biopsy in serial endoscopic examinations. A new descriptor, combined with a novel hashing method, achieves state-of-the-art retargeting, with validation on in vivo videos from six patients. Real-time performance also allows for practical integration without disturbing the existing clinical workflow.
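    The fast comparison step the abstract attributes to the Hamming metric relies on nothing more than bit counting; a sketch of the Hamming distance between two binary hash codes (the code values here are arbitrary toys, not outputs of the paper's random-forest hashing):

```python
def hamming(a: int, b: int) -> int:
    """Bits that differ between two binary hash codes: XOR, then popcount."""
    return bin(a ^ b).count("1")

# Toy 8-bit codes; a learned hashing function would emit longer codes (e.g. 64-bit).
print(hamming(0b1011_0010, 0b1001_0110))  # 2
```

    Because XOR and popcount map to single machine instructions, comparing compressed codes this way is fast enough for real-time retargeting.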

  1. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    PubMed Central

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation. PMID:26981584

  2. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    PubMed

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
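    A minimal sketch of the Bloom-filter idea used above for duplicate avoidance (the array size, hash count, and SHA-256-derived bit positions are illustrative choices, not the paper's):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hashed bit positions per item in an m-bit array."""
    def __init__(self, m: int = 1024, k: int = 4):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item: bytes):
        # Derive k pseudo-independent positions by salting one hash function.
        for i in range(self.k):
            d = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(d[:8], "big") % self.m

    def add(self, item: bytes) -> None:
        for p in self._positions(item):
            self.bits |= 1 << p

    def maybe_contains(self, item: bytes) -> bool:
        # False means definitely absent; True may be a false positive.
        return all(self.bits >> p & 1 for p in self._positions(item))
```

    A lookup answering False proves the content was never stored, so only the rare True answers need an expensive exact check, which is what makes the structure attractive for server-side deduplication.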

  3. Best-First Heuristic Search for Multicore Machines

    DTIC Science & Technology

    2010-01-01

    Otto, 1998) to implement an asynchronous version of PRA* that they call Hash Distributed A* (HDA*). HDA* distributes nodes using a hash function in...nodes which are being communicated between peers are in transit. In contact with the authors of HDA*, we have created an implementation of HDA* for...Also, our implementation of HDA* allows us to make a fair comparison between algorithms by sharing common data structures such as priority queues and
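    The node-distribution rule at the heart of HDA* can be sketched in a few lines: a hash of the search state deterministically assigns each node a home worker, so every duplicate of a state lands on the same worker and duplicate detection stays local (CRC32 here is an arbitrary stand-in for the hash function):

```python
import zlib

def owner(state: tuple, num_workers: int) -> int:
    """HDA*-style distribution: the state's hash picks its home worker, so all
    duplicates of that state arrive at one worker and are detected locally."""
    return zlib.crc32(repr(state).encode()) % num_workers

# Every generated successor is sent (asynchronously) to its owning worker:
destination = owner((3, 1, 2), num_workers=8)
```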

  4. User characteristics and effect profile of Butane Hash Oil: An extremely high-potency cannabis concentrate.

    PubMed

    Chan, Gary C K; Hall, Wayne; Freeman, Tom P; Ferris, Jason; Kelly, Adrian B; Winstock, Adam

    2017-09-01

    Recent reports suggest an increase in use of extremely potent cannabis concentrates such as Butane Hash Oil (BHO) in some developed countries. The aims of this study were to examine the characteristics of BHO users and the effect profiles of BHO. Anonymous online survey in over 20 countries in 2014 and 2015. Participants aged 18 years or older were recruited through onward promotion and online social networks. The overall sample size was 181,870. In this sample, 46% (N=83,867) reported using some form of cannabis in the past year, and 3% reported BHO use (n=5922). Participants reported their use of 7 types of cannabis in the past 12 months, the source of their cannabis, reasons for use, use of other illegal substances, and lifetime diagnosis for depression, anxiety and psychosis. Participants were asked to rate subjective effects of BHO and high-potency herbal cannabis. Participants who reported a lifetime diagnosis of depression (OR=1.15, p=0.003) or anxiety (OR=1.72, p<0.001), and those who used a larger number of substances (OR=1.29, p<0.001), were more likely to use BHO than to use only high-potency herbal cannabis. BHO users also reported stronger negative effects and weaker positive effects when using BHO than high-potency herbal cannabis (p<0.001). CONCLUSION: Mental health problems and other illicit drug use were associated with use of BHO. BHO was reported to have stronger negative and weaker positive effects than high-potency herbal cannabis. Copyright © 2017. Published by Elsevier B.V.

  5. Enhanced K-means clustering with encryption on cloud

    NASA Astrophysics Data System (ADS)

    Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.

    2017-11-01

    This paper addresses the problem of storing and managing big files on the cloud by implementing hashing on Hadoop for big data, while ensuring security when uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates sharing infrastructure and resources.[10] Hadoop is open-source software that lets us store and manage big files on the cloud according to our needs. The K-means clustering algorithm calculates the distance between each cluster centroid and the data points. Hashing stores and retrieves data with hash keys; the hashing algorithm, called a hash function, maps the original data and later fetches the data stored at the specific key.[17] Encryption is a process that transforms electronic data into a non-readable form known as ciphertext. Decryption is the opposite process: it transforms the ciphertext into plaintext that the end user can read and understand. For encryption and decryption we use a symmetric-key cryptographic algorithm; specifically, the DES algorithm is used for secure storage of the files.[3]
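    The K-means step the abstract describes, computing distances between centroids and data points, reduces to an assign-then-update loop; a minimal sketch in plain Python with toy data:

```python
def assign(points, centroids):
    """Label each point with the index of its nearest centroid (squared Euclidean)."""
    def d2(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(centroids)), key=lambda j: d2(p, centroids[j]))
            for p in points]

def update(points, labels, k):
    """Move each centroid to the mean of the points assigned to it."""
    cents = []
    for j in range(k):
        members = [p for p, l in zip(points, labels) if l == j]
        cents.append([sum(col) / len(members) for col in zip(*members)])
    return cents

pts = [[0, 0], [0, 1], [10, 10], [10, 11]]
labels = assign(pts, [[0, 0], [10, 10]])
print(labels)  # [0, 0, 1, 1]
```

    Iterating assign and update until the labels stop changing is the whole algorithm; distributed versions (e.g. on Hadoop) parallelize the assignment step over data partitions.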

  6. A Complete and Accurate Ab Initio Repeat Finding Algorithm.

    PubMed

    Lian, Shuaibin; Chen, Xinwu; Wang, Peng; Zhang, Xiaoli; Dai, Xianhua

    2016-03-01

    It has become clear that repetitive sequences have played multiple roles in eukaryotic genome evolution, including increasing genetic diversity through mutation, changing gene expression, and facilitating the generation of novel genes. However, identification of repetitive elements can be difficult in the ab initio manner. Several classical ab initio repeat-finding tools have already been presented and compared, but their completeness and accuracy in detecting repeats are rather poor. To this end, we propose a new ab initio repeat-finding tool, named HashRepeatFinder, which is based on a hash index and word counting. Furthermore, we assessed the performance of HashRepeatFinder against two well-known tools, RepeatScout and Repeatfinder, on human genome data hg19. The results indicate the following three conclusions: (1) the completeness of HashRepeatFinder is the best among the three compared tools in almost all chromosomes, especially in chr9 (8 times that of RepeatScout, 10 times that of Repeatfinder); (2) in terms of detecting large repeats, HashRepeatFinder also performed best in all chromosomes, especially in chr3 (24 times that of RepeatScout and 250 times that of Repeatfinder) and chr19 (12 times that of RepeatScout and 60 times that of Repeatfinder); (3) in terms of accuracy, HashRepeatFinder can merge abundant repeats with high accuracy.
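    The hash-index-and-word-counting core can be sketched as k-mer counting in a dictionary (a simplification for illustration, not HashRepeatFinder's actual index):

```python
from collections import Counter

def repeated_kmers(seq: str, k: int, min_count: int = 2) -> dict:
    """Hash-indexed word counting: every k-mer is a dict key; repeats exceed min_count."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return {kmer: n for kmer, n in counts.items() if n >= min_count}

print(repeated_kmers("ACGTACGTTT", 4))  # {'ACGT': 2}
```

    Seeds found this way are then typically extended and merged into full-length repeat families.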

  7. Ordered versus Unordered Map for Primitive Data Types

    DTIC Science & Technology

    2015-09-01

    mapped to some element. C++ provides two types of map containers within the standard template library, the std::map and the std::unordered_map...classes. As the name implies, the containers' main functional difference is that the elements in the std::map are ordered by the key, and the std::unordered_map elements are not ordered based on their key. The std::unordered_map elements are placed into "buckets" based on a hash value computed for their key

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    AISL-CRYPTO is a library of cryptography functions supporting other AISL software. It provides various crypto functions for Common Lisp, including Digital Signature Algorithm, Data Encryption Standard, Secure Hash Algorithm, and public-key cryptography.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Amrish, E-mail: amrish99@gmail.com; Kaur, Sandeep, E-mail: sipusukhn@gmail.com; Mudahar, Isha, E-mail: isha@pbi.ac.in

    We have investigated the structural and electronic properties of a carbon nanotube with small fullerene halves Cₙ (n ≤ 40) covalently bonded to the side wall of an armchair single-wall carbon nanotube (SWCNT), using a first-principles method based on density functional theory. The fullerene size results in weak bonding between the fullerene halves and the carbon nanotube (CNT). Further, it was found that the C-C bond that attaches the fullerene half to the CNT is of the order of 1.60 Å. The calculated binding energies indicate the stability of the complexes formed. The HOMO-LUMO gaps and electron density of states plots point towards the metallicity of the complexes formed. Our calculations on charge transfer reveal that a very small amount of charge is transferred from the CNT to the fullerene halves.

  10. The Backscattering Phase Function for a Sphere with a Two-Scale Relief of Rough Surface

    NASA Astrophysics Data System (ADS)

    Klass, E. V.

    2017-12-01

    The backscattering of light from spherical surfaces characterized by one- and two-scale roughness reliefs has been investigated. The analysis is performed using the three-dimensional Monte-Carlo program POKS-RG (geometrical-optics approximation), which makes it possible to take into account the roughness of objects under study by introducing local geometries of different levels. The geometric module of the program describes objects by equations of second-order surfaces. One-scale roughness is set as an ensemble of geometric figures (convex or concave halves of ellipsoids or cones). Two-scale roughness is modeled by convex halves of ellipsoids whose surfaces contain ellipsoidal pores. It is shown that a spherical surface with one-scale convex inhomogeneities has a flatter backscattering phase function than a surface with concave inhomogeneities (pores). For a sphere with two-scale roughness, the backscattering intensity is found to be determined mostly by the lower-level inhomogeneities. The influence of roughness on the backscattering from different spatial regions of the spherical surface is also analyzed.

  11. Nonlinear spline wavefront reconstruction through moment-based Shack-Hartmann sensor measurements.

    PubMed

    Viegers, M; Brunner, E; Soloviev, O; de Visser, C C; Verhaegen, M

    2017-05-15

    We propose a spline-based aberration reconstruction method through moment measurements (SABRE-M). The method uses first and second moment information from the focal spots of the SH sensor to reconstruct the wavefront with bivariate simplex B-spline basis functions. Because it provides higher order local wavefront estimates with quadratic and cubic basis functions, the proposed method can achieve the same accuracy on SH arrays with a reduced number of subapertures and, correspondingly, larger lenses, which can be beneficial for application in low light conditions. In numerical experiments the performance of SABRE-M is compared to that of the first moment method SABRE for aberrations of different spatial orders and for different sizes of the SH array. The results show that SABRE-M is superior to SABRE, in particular for the higher order aberrations, and that SABRE-M can give performance equal to SABRE on a SH grid of halved sampling.

  12. Resource optimized TTSH-URA for multimedia stream authentication in swallowable-capsule-based wireless body sensor networks.

    PubMed

    Wang, Wei; Wang, Chunqiu; Zhao, Min

    2014-03-01

    To ease the burden on hospitalization capacity, an emerging swallowable-capsule technology has evolved to serve as a remote gastrointestinal (GI) disease examination technique with the aid of the wireless body sensor network (WBSN). Secure multimedia transmission in such a swallowable-capsule-based WBSN faces critical challenges, including energy efficiency and content quality guarantees. In this paper, we propose a joint resource allocation and stream authentication scheme to maintain the best possible video quality while ensuring security and energy efficiency in GI-WBSNs. The contribution of this research is twofold. First, we establish a unique signature-hash (S-H) diversity approach in the authentication domain to optimize video authentication robustness and the authentication bit rate overhead over a wireless channel. Based on the full exploration of S-H authentication diversity, we propose a new two-tier signature-hash (TTSH) stream authentication scheme to improve the video quality by reducing authentication dependence overhead while protecting its integrity. Second, we propose to combine this authentication scheme with a unique S-H oriented unequal resource allocation (URA) scheme to improve the energy-distortion-authentication performance of wireless video delivery in GI-WBSN. Our analysis and simulation results demonstrate that the proposed TTSH with URA scheme achieves considerable gain in both authenticated video quality and energy efficiency.

  13. Provably Secure Password-based Authentication in TLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdalla, Michel; Emmanuel, Bresson; Chevassut, Olivier

    2005-12-20

    In this paper, we show how to design an efficient, provably secure password-based authenticated key exchange mechanism specifically for the TLS (Transport Layer Security) protocol. The goal is to provide a technique that allows users to employ (short) passwords to securely identify themselves to servers. As our main contribution, we describe a new password-based technique for user authentication in TLS, called Simple Open Key Exchange (SOKE). Loosely speaking, the SOKE ciphersuites are unauthenticated Diffie-Hellman ciphersuites in which the client's Diffie-Hellman ephemeral public value is encrypted using a simple mask generation function. The mask is simply a constant value raised to the power of (a hash of) the password. The SOKE ciphersuites, which improve on previous password-based authentication ciphersuites for TLS, combine the following features. First, SOKE has formal security arguments; the proof of security, based on the computational Diffie-Hellman assumption, is in the random oracle model, and holds for concurrent executions and for arbitrarily large password dictionaries. Second, SOKE is computationally efficient; in particular, it only needs operations in a sufficiently large prime-order subgroup for its Diffie-Hellman computations (no safe primes). Third, SOKE provides good protocol flexibility because the user identity and password are only required once a SOKE ciphersuite has actually been negotiated, and after the server has sent a server identity.
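
    The masking step described here — a constant raised to the power of a hash of the password, multiplied into the client's Diffie-Hellman ephemeral value — can be sketched as follows (Python; the tiny Mersenne-prime group, generator, mask base, and names are illustrative toys, not SOKE's actual parameters):

    ```python
    import hashlib
    import secrets

    # Toy group: the Mersenne prime 2**127 - 1. Real SOKE works in a large
    # prime-order subgroup with properly vetted parameters.
    p = 2**127 - 1
    g, M = 3, 5                      # generator and the fixed mask base

    def h_pwd(password: bytes) -> int:
        """Hash the password into an exponent."""
        return int.from_bytes(hashlib.sha256(password).digest(), "big") % (p - 1)

    pwd = b"correct horse"
    x = secrets.randbelow(p - 1)     # client's ephemeral DH exponent
    X = pow(g, x, p)                 # ephemeral public value
    mask = pow(M, h_pwd(pwd), p)     # mask = constant ** hash(password)
    X_star = (X * mask) % p          # what the client actually sends

    # A server that knows the password strips the mask off again:
    recovered = (X_star * pow(mask, -1, p)) % p
    print(recovered == X)  # True
    ```

    The point of the construction is that without the password an eavesdropper cannot tell `X_star` apart from a random group element, which is what rules out offline dictionary attacks.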

  14. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed

    Ito, Hiromu; Katsumata, Yuki; Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player's current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated.

  15. Providing Cryptographic Security and Evidentiary Chain-of-Custody with the Advanced Forensic Format, Library, and Tools

    DTIC Science & Technology

    2008-08-19

    1 hash of the page page%d sha256 The segment for the SHA256 hash of the page Bad Sector Management: badsectors The number of sectors in the image...written, AFFLIB can automatically compute the page’s MD5, SHA-1, and/or SHA256 hash and write an associated segment containing the hash value. The...are written into segments themselves, with the segment name being name/ sha256 where name is the original segment name sha256 is the hash algorithm used

  16. Halving Titan

    NASA Image and Video Library

    2010-01-11

    Titan's seasonal hemispheric dichotomy is chronicled in black and white, with the moon's northern half appearing slightly lighter than the dark southern half in this image taken by NASA's Cassini spacecraft.

  17. Joint image encryption and compression scheme based on IWT and SPIHT

    NASA Astrophysics Data System (ADS)

    Zhang, Miao; Tong, Xiaojun

    2017-03-01

    A joint lossless image encryption and compression scheme based on integer wavelet transform (IWT) and set partitioning in hierarchical trees (SPIHT) is proposed to achieve lossless image encryption and compression simultaneously. Making use of the properties of IWT and SPIHT, encryption and compression are combined. Moreover, the proposed secure set partitioning in hierarchical trees (SSPIHT), via the addition of encryption in the SPIHT coding process, has no effect on compression performance. A hyper-chaotic system, nonlinear inverse operation, Secure Hash Algorithm-256 (SHA-256), and plaintext-based keystream are all used to enhance the security. The test results indicate that the proposed methods have high security and good lossless compression performance.

  18. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed Central

    Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player’s current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated. PMID:27487194

  19. A Secure and Robust User Authenticated Key Agreement Scheme for Hierarchical Multi-medical Server Environment in TMIS.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2015-09-01

    The telecare medicine information system (TMIS) helps patients gain health monitoring at home and access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart-card-based user authentication and key agreement security protocol for TMIS using the cryptographic one-way hash function and biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to its usage of a one-way hash function, we show that it has several security pitfalls and design flaws: (1) it fails to protect against privileged-insider attack, (2) it fails to protect against strong replay attack, (3) it fails to protect against strong man-in-the-middle attack, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks support for a biometric update phase, and (8) it has flaws in its formal security analysis. In order to withstand these security pitfalls and design flaws, we propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable for TMIS using the cryptographic one-way hash function and fuzzy extractor. Through rigorous security analysis, including formal security analysis using the widely-accepted Burrows-Abadi-Needham (BAN) logic, formal security analysis under the random oracle model, and informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the most widely accepted and used Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The simulation results show that our scheme is also secure. Our scheme is more efficient in computation and communication as compared to Amin-Biswas's scheme and other related schemes. In addition, our scheme supports extra functionality features as compared to

  20. Robust and efficient biometrics based password authentication scheme for telecare medicine information systems using extended chaotic maps.

    PubMed

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Xie, Dong; Yang, Yixian

    2015-06-01

    Telecare Medicine Information Systems (TMISs) provide an efficient communication platform supporting patients' access to health-care delivery services via the internet or mobile networks. Authentication becomes an essential need when a remote patient logs into the telecare server. Recently, many extended-chaotic-maps-based authentication schemes using smart cards for TMISs have been proposed. Li et al. proposed a secure smart-card-based authentication scheme for TMISs using extended chaotic maps, building on Lee's and Jiang et al.'s schemes. In this study, we show that Li et al.'s scheme still has some weaknesses, such as violation of session key security, vulnerability to user impersonation attack, and lack of local verification. To conquer these flaws, we propose a chaotic-maps and smart-cards based password authentication scheme by applying a biometrics technique and hash function operations. Through informal and formal security analyses, we demonstrate that our scheme is resilient to possible known attacks, including the attacks found in Li et al.'s scheme. As compared with previous authentication schemes, the proposed scheme is more secure and efficient and hence more practical for telemedical environments.

  1. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads electronic information, analyzes it according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations, which is used for reviewing each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find the electronic information token in the hash table, that token is added to a missing-terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
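
    The lookup-and-propagate flow described above can be sketched as follows (Python; the rule representation, concept names, and table contents are invented for illustration and are not taken from the patent):

    ```python
    def classify(tokens, concept_table, rules):
        """Look each token up in the concept hash table; unknown tokens go to
        a missing-terms list, and the input is classified if any rule fires."""
        missing, concepts = [], set()
        for tok in tokens:
            concept = concept_table.get(tok)     # hash-table lookup
            if concept is None:
                missing.append(tok)              # not seen before
            else:
                concepts.add(concept)
        # "Propagation" is reduced here to evaluating each rule on the
        # accumulated concept set.
        classified = any(rule(concepts) for rule in rules)
        return classified, missing

    table = {"reactor": "NUCLEAR", "coolant": "NUCLEAR", "budget": "FINANCE"}
    rules = [lambda c: "NUCLEAR" in c]           # fire on any nuclear concept
    print(classify(["reactor", "budget", "zzz"], table, rules))  # (True, ['zzz'])
    ```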

  2. A new pre-classification method based on associative matching method

    NASA Astrophysics Data System (ADS)

    Katsuyama, Yutaka; Minagawa, Akihiro; Hotta, Yoshinobu; Omachi, Shinichiro; Kato, Nei

    2010-01-01

    Reducing the time complexity of character matching is critical to the development of efficient Japanese Optical Character Recognition (OCR) systems. To shorten processing time, recognition is usually split into separate pre-classification and recognition stages. For high overall recognition performance, the pre-classification stage must both have very high classification accuracy and return only a small number of putative character categories for further processing. Furthermore, for any practical system, the speed of the pre-classification stage is also critical. The associative matching (AM) method has often been used for fast pre-classification, because its use of a hash table and reliance solely on logical bit operations to select categories makes it highly efficient. However, a certain level of redundancy exists in the hash table because it is constructed using only the minimum and maximum values of the data on each axis and therefore does not take account of the distribution of the data. We propose a modified associative matching method that satisfies the performance criteria described above but in a fraction of the time, by modifying the hash table to reflect the underlying distribution of training characters. Furthermore, we show that our approach outperforms pre-classification by clustering, ANN and conventional AM in terms of classification accuracy, discriminative power and speed. Compared to conventional associative matching, the proposed approach results in a 47% reduction in total processing time across an evaluation test set comprising 116,528 Japanese character images.

  3. Genomics-Based Security Protocols: From Plaintext to Cipherprotein

    NASA Technical Reports Server (NTRS)

    Shaw, Harry; Hussein, Sayed; Helgert, Hermann

    2011-01-01

    The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
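
    Independent of the DNA-inspired construction itself, the keyed-Hash Message Authentication Code (HMAC) mentioned here can be computed with the standard HMAC construction (a generic sketch using SHA-256; the key and message are placeholders, not the paper's genomic scheme):

    ```python
    import hashlib
    import hmac

    key = b"shared-secret"            # illustrative pre-shared key
    msg = b"telemetry frame 42"       # illustrative message

    # HMAC-SHA256 tag computed by the sender.
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

    # The receiver recomputes the tag and compares in constant time,
    # which avoids timing side channels.
    ok = hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
    print(ok)  # True
    ```

    This is exactly the setting the abstract targets: both parties share a key, so authentication needs no public key infrastructure.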

  4. HECLIB. Volume 2: HECDSS Subroutines Programmer’s Manual

    DTIC Science & Technology

    1991-05-01

    algorithm and hierarchical design for database accesses. This algorithm provides quick access to data sets and an efficient means of adding new data set...Description of How DSS Works DSS version 6 utilizes a modified hash algorithm based upon the pathname to store and retrieve data. This structure allows...balancing disk space and record access times. A variation in this algorithm is for "stable" files. In a stable file, a hash table is not utilized
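
    The pathname-keyed hashing described for DSS can be approximated by a toy bucket computation (Python; the multiplier, table size, and function name are illustrative assumptions, not HECDSS's actual modified hash):

    ```python
    def pathname_bucket(pathname: str, table_size: int = 1024) -> int:
        """Map a DSS-style pathname (e.g. /BASIN/LOC/FLOW//1DAY/OBS/) to a
        hash-table bucket used to locate the stored record. Case-insensitive,
        like a lookup that normalizes pathnames before hashing."""
        h = 0
        for ch in pathname.upper():
            h = (h * 31 + ord(ch)) % table_size
        return h

    print(pathname_bucket("/BASIN/LOC/FLOW//1DAY/OBS/"))
    ```

    The "stable file" variant the snippet mentions skips the hash table entirely and scans records directly, trading lookup speed for less bookkeeping.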

  5. An improved biometrics-based remote user authentication scheme with user anonymity.

    PubMed

    Khan, Muhammad Khurram; Kumari, Saru

    2013-01-01

    The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental to its security. Therefore the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented, followed by its efficiency comparison. The proposed scheme not only withstands the security problems found in An's scheme but also provides some extra features with the mere addition of only two hash operations. The proposed scheme allows the user to freely change his password and also provides user anonymity with untraceability.

  6. Sensor-based laser ablation for tissue specific cutting: an experimental study.

    PubMed

    Rupprecht, Stephan; Tangermann-Gerk, Katja; Wiltfang, Joerg; Neukam, Friedrich Wilhelm; Schlegel, Andreas

    2004-01-01

    The interaction of laser light and tissue causes measurable phenomena. These phenomena can be quantified and used to control laser drilling within a feedback system. Ten halves of dissected minipig jaws were treated with an Er:YAG laser system controlled via a feedback system. Sensor outputs were recorded and analyzed while the osteotomy was performed. The relative depth of laser ablation was calculated by 3D computed tomography and evaluated histologically. The signals caused by the laser-tissue interaction changed their character dramatically after passing the cortical bone layer. The radiological evaluation of 98 laser-ablated holes in the ten halves showed no ablation deeper than the cortical layer (mean value: 97.8%). Histologically, no physical damage to the alveolar nerve bundle was found. The feedback system controlling the laser drilling worked precisely for cortical ablation of the bone, based on the evaluation of the detected and quantified phenomena related to the laser-tissue interaction.

  7. A DRM based on renewable broadcast encryption

    NASA Astrophysics Data System (ADS)

    Ramkumar, Mahalingam; Memon, Nasir

    2005-07-01

    We propose an architecture for digital rights management based on a renewable, random key pre-distribution (KPD) scheme, HARPS (hashed random preloaded subsets). The proposed architecture caters for broadcast encryption by a trusted authority (TA) and by "parent" devices (devices used by vendors who manufacture compliant devices) for periodic revocation of devices. The KPD also facilitates broadcast encryption by peer devices, which permits peers to distribute content, and efficiently control access to the content encryption secret using subscription secrets. The underlying KPD also caters for broadcast authentication and mutual authentication of any two devices, irrespective of the vendors manufacturing the device, and thus provides a comprehensive solution for securing interactions between devices taking part in a DRM system.

  8. Manticore and CS mode : parallelizable encryption with joint cipher-state authentication.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree

    2004-10-01

    We describe a new mode of encryption with inexpensive authentication, which uses information from the internal state of the cipher to provide the authentication. Our algorithms have a number of benefits: (1) the encryption has properties similar to CBC mode, yet the encipherment and authentication can be parallelized and/or pipelined, (2) the authentication overhead is minimal, and (3) the authentication process remains resistant against some IV reuse. We offer a Manticore class of authenticated encryption algorithms based on cryptographic hash functions, which support variable block sizes up to twice the hash output length and variable key lengths. A proof of security is presented for the MTC4 and Pepper algorithms. We then generalize the construction to create the Cipher-State (CS) mode of encryption that uses the internal state of any round-based block cipher as an authenticator. We provide hardware and software performance estimates for all of our constructions and give a concrete example of the CS mode of encryption that uses AES as the encryption primitive and adds a small speed overhead (10-15%) compared to AES alone.

  9. Dual-tail arylsulfone-based benzenesulfonamides differently match the hydrophobic and hydrophilic halves of human carbonic anhydrases active sites: Selective inhibitors for the tumor-associated hCA IX isoform.

    PubMed

    Ibrahim, Hany S; Allam, Heba Abdelrasheed; Mahmoud, Walaa R; Bonardi, Alessandro; Nocentini, Alessio; Gratteri, Paola; Ibrahim, Eslam S; Abdel-Aziz, Hatem A; Supuran, Claudiu T

    2018-05-25

    The synthesis and characterization of two new sets of arylsulfonehydrazone benzenesulfonamides (4a-4i with a phenyl tail and 4j-4q with a tolyl tail) are reported. The compounds were designed according to a dual-tails approach to modulate the interactions of the ligand portions at the outer rim of both hydrophobic and hydrophilic active site halves of human isoforms of carbonic anhydrase (CA, EC 4.2.1.1). The synthesized sulfonamides were evaluated in vitro for their inhibitory activity against the following human (h) isoforms: hCA I, II, IV and IX. With the latter being a validated anticancer drug target and a marker of tumor hypoxia, attractive results arose from the compounds' inhibitory screening in terms of potency and selectivity. Indeed, whereas the first subset of compounds 4a-4i exhibited great efficacy in inhibiting both the ubiquitous, off-target hCA II (K I values 9.5-172.0 nM) and hCA IX (K I values 7.5-131.5 nM), the second subset of tolyl-bearing derivatives 4j-4q was shown to possess a selective hCA IX inhibitory action over isoforms I, II and IV. The most selective compounds, 4l and 4n, were further screened for their in vitro cytotoxic activity against MCF-7 and MDA-MB-231 cancer cell lines under hypoxic conditions. The selective IX/II inhibitory trend of 4j-4q compared to that of compounds 4a-4i was unveiled by docking studies. Further exploration of these molecules could be useful for the development of novel antitumor agents with a selective CA inhibitory mechanism. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  10. The effect of cryoprotectant on kangaroo sperm ultrastructure and mitochondrial function.

    PubMed

    McClean, Rhett; Holt, William V; Zee, Yeng Peng; Lisle, Allan; Johnston, Stephen D

    2008-12-01

    This study examined the effect of cryoprotectants (20% DMSO, a 10% DMSO/10% glycerol mixture, 20% glycerol and 1M sucrose solution) on kangaroo sperm structure and function, along with the effect of varying concentrations of glycerol on sperm mitochondrial function. Eastern grey kangaroo cauda epididymidal spermatozoa were incubated for 10 min at 35 degrees C in each cryoprotectant and the plasma membrane integrity (PMI) and motility assessed using light microscopy. The same samples were fixed for TEM and the ultrastructural integrity of the spermatozoa examined. To investigate the effect of glycerol on the kangaroo sperm mitochondrial function, epididymidal spermatozoa were incubated with JC-1 in Tris-citrate media at 35 degrees C for 20 min in a range of glycerol concentrations (0%, 5%, 10%, 15% and 20%) and the mitochondrial membrane potential (MMP) and plasma membrane integrity determined. As expected, incubation of spermatozoa in 20% glycerol for 10 min resulted in a significant reduction in motility, PMI and ultrastructural integrity. Interestingly, incubation in 20% DMSO resulted in no significant reduction in motility or PMI but a significant loss of structural integrity when compared to the control spermatozoa (0% cryoprotectant). However, 20% DMSO was overall less damaging to sperm ultrastructure than glycerol, a combination of 10% glycerol and 10% DMSO, and sucrose. While all glycerol concentrations had an adverse effect on mitochondrial function, the statistical models presented for the relationship between MMP and glycerol predicted that spermatozoa, when added to 20% glycerol, would lose half of their initial MMP immediately at 35 degrees C and MMP would halve after 19.4 min at 4 degrees C. Models for the relationship between PMI and glycerol predicted that spermatozoa would lose half of their initial PMI after 1.8 min at 35 degrees C and PMI would halve after 21.1 min at 4 degrees C. These results suggest that if glycerol is to be used as a

  11. Learning Short Binary Codes for Large-scale Image Retrieval.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms have proved to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of the data, MCR can generate a one-bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only the top-ranked bits with minimum cost values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve performance comparable to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
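
    Whatever learning procedure produces the bits, retrieval with short binary codes reduces to Hamming-distance ranking, as in this sketch (Python; the random hyperplane projection stands in for MCR's learned, cost-ranked bits, and all names and data are illustrative):

    ```python
    import random

    random.seed(0)  # fixed seed so the illustrative hyperplanes are stable

    def binary_code(vec, hyperplanes):
        """One bit per hyperplane: the sign of the projection. A learned
        method such as MCR would instead select discriminative bits."""
        bits = 0
        for h in hyperplanes:
            proj = sum(v * w for v, w in zip(vec, h))
            bits = (bits << 1) | (proj >= 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two codes."""
        return bin(a ^ b).count("1")

    dim, n_bits = 4, 16
    planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
    db = {"x": [1, 0, 0, 0], "y": [0.9, 0.1, 0, 0], "z": [-1, 0, 2, 0]}
    codes = {name: binary_code(v, planes) for name, v in db.items()}

    query = binary_code([1, 0.05, 0, 0], planes)
    ranked = sorted(codes, key=lambda name: hamming(query, codes[name]))
    print(ranked[0])  # nearest database item by Hamming distance
    ```

    With 16-bit codes, each comparison is a single XOR plus a popcount, which is why short codes are attractive on resource-limited devices.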

  12. Generating region proposals for histopathological whole slide image retrieval.

    PubMed

    Ma, Yibing; Jiang, Zhiguo; Zhang, Haopeng; Xie, Fengying; Zheng, Yushan; Shi, Huaqiang; Zhao, Yu; Shi, Jun

    2018-06-01

    Content-based image retrieval is an effective method for histopathological image analysis. However, given a database of huge whole slide images (WSIs), acquiring appropriate region-of-interests (ROIs) for training is significant and difficult. Moreover, histopathological images can only be annotated by pathologists, resulting in the lack of labeling information. Therefore, it is an important and challenging task to generate ROIs from WSI and retrieve image with few labels. This paper presents a novel unsupervised region proposing method for histopathological WSI based on Selective Search. Specifically, the WSI is over-segmented into regions which are hierarchically merged until the WSI becomes a single region. Nucleus-oriented similarity measures for region mergence and Nucleus-Cytoplasm color space for histopathological image are specially defined to generate accurate region proposals. Additionally, we propose a new semi-supervised hashing method for image retrieval. The semantic features of images are extracted with Latent Dirichlet Allocation and transformed into binary hashing codes with Supervised Hashing. The methods are tested on a large-scale multi-class database of breast histopathological WSIs. The results demonstrate that for one WSI, our region proposing method can generate 7.3 thousand contoured regions which fit well with 95.8% of the ROIs annotated by pathologists. The proposed hashing method can retrieve a query image among 136 thousand images in 0.29 s and reach precision of 91% with only 10% of images labeled. The unsupervised region proposing method can generate regions as predictions of lesions in histopathological WSI. The region proposals can also serve as the training samples to train machine-learning models for image retrieval. The proposed hashing method can achieve fast and precise image retrieval with small amount of labels. Furthermore, the proposed methods can be potentially applied in online computer-aided-diagnosis systems. 

  13. Transitioning Client Based NALCOMIS to a Multi Function Web Based Application

    DTIC Science & Technology

    2016-09-23

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Transitioning Client-Based NALCOMIS to a Multi-Function Web-Based Application, by Aaron P. Schnetzler. NALCOMIS has two configurations that are used by organizational and intermediate level maintenance activities, Optimized Organizational

  14. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.

  15. Use of a Y-tube conduit after facial nerve injury reduces collateral axonal branching at the lesion site but neither reduces polyinnervation of motor endplates nor improves functional recovery.

    PubMed

    Hizay, Arzu; Ozsoy, Umut; Demirel, Bahadir Murat; Ozsoy, Ozlem; Angelova, Srebrina K; Ankerne, Janina; Sarikcioglu, Sureyya Bilmen; Dunlop, Sarah A; Angelov, Doychin N; Sarikcioglu, Levent

    2012-06-01

    Despite increased understanding of peripheral nerve regeneration, functional recovery after surgical repair remains disappointing. A major contributing factor is the extensive collateral branching at the lesion site, which leads to inaccurate axonal navigation and aberrant reinnervation of targets. To determine whether Y-tube reconstruction improved axonal regrowth and whether this was associated with improved function, we used a Y-tube conduit with the aim of improving navigation of regenerating axons after facial nerve transection in rats. Retrograde labeling from the zygomatic and buccal branches showed a halving in the number of double-labeled facial motor neurons (15% vs 8%; P < .05) after Y-tube reconstruction compared with facial-facial anastomosis coaptation. However, in both surgical groups, the proportion of polyinnervated motor endplates was similar (≈ 30%; P > .05), and video-based motion analysis of whisking revealed similarly poor function. Although Y-tube reconstruction decreases axonal branching at the lesion site and improves axonal navigation compared with facial-facial anastomosis coaptation, it fails to promote monoinnervation of motor endplates and confers no functional benefit.

  16. Multi-Party Quantum Private Comparison Protocol Based on Entanglement Swapping of Bell Entangled States

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Yu

    2016-09-01

    Recently, Liu et al. proposed a two-party quantum private comparison (QPC) protocol using entanglement swapping of the Bell entangled state (Commun. Theor. Phys. 57 (2012) 583). Subsequently, Liu et al. pointed out that in Liu et al.'s protocol, the TP can extract the two users' secret inputs without being detected by launching the Bell-basis measurement attack, and suggested the corresponding improvement to mend this loophole (Commun. Theor. Phys. 62 (2014) 210). In this paper, we first point out the information leakage problem toward TP existing in both of the above two protocols, and then suggest the corresponding improvement by using a one-way hash function to encrypt the two users' secret inputs. We further put forward a three-party QPC protocol also based on entanglement swapping of the Bell entangled state, and then validate its output correctness and its security in detail. Finally, we generalize the three-party QPC protocol into the multi-party case, which can accomplish the comparison of equality for arbitrary pairs among K users within one execution. Supported by the National Natural Science Foundation of China under Grant No. 61402407
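    The suggested fix, encrypting each user's secret input with a one-way hash function before TP sees it, can be sketched classically as follows. This is a minimal illustration only: the keyed-hash construction and the key shared between the two users are assumptions, not the quantum protocol itself.

    ```python
    import hashlib
    import hmac

    def commit(secret: bytes, shared_key: bytes) -> bytes:
        """One-way hash of a user's secret input under a key the two users
        share (but TP does not), so TP can test equality without learning
        the inputs."""
        return hmac.new(shared_key, secret, hashlib.sha256).digest()

    def tp_compare(c1: bytes, c2: bytes) -> bool:
        """TP learns only whether the two commitments are equal."""
        return hmac.compare_digest(c1, c2)

    key = b"key shared by the two users"
    assert tp_compare(commit(b"42", key), commit(b"42", key))    # equal inputs
    assert not tp_compare(commit(b"42", key), commit(b"7", key)) # unequal inputs
    ```

    Hashing before comparison is what blocks the information leakage toward TP: TP only ever handles digests, never the raw inputs.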

  17. GO-based functional dissimilarity of gene sets.

    PubMed

    Díaz-Díaz, Norberto; Aguilar-Ruiz, Jesús S

    2011-09-01

    The Gene Ontology (GO) provides a controlled vocabulary for describing the functions of genes and can be used to evaluate the functional coherence of gene sets. Many functional coherence measures consider each pair of gene functions in a set and produce an output based on all pairwise distances. A single gene can encode multiple proteins that may differ in function. For each functionality, other proteins that exhibit the same activity may also participate. Therefore, an identification of the most common function for all of the genes involved in a biological process is important in evaluating the functional similarity of groups of genes, and a quantification of functional coherence can help to clarify the role of a group of genes working together. To implement this approach to functional assessment, we present GFD (GO-based Functional Dissimilarity), a novel dissimilarity measure for evaluating groups of genes based on the most relevant functions of the whole set. The measure assigns a numerical value to the gene set for each of the three GO sub-ontologies. Results show that GFD performs robustly when applied to gene sets of known functionality (extracted from KEGG). It performs particularly well on randomly generated gene sets. An ROC analysis reveals that the performance of GFD in evaluating the functional dissimilarity of gene sets is very satisfactory. A comparative analysis against other functional measures, such as GS2 and those presented by Resnik and Wang, also demonstrates the robustness of GFD.

  18. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
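    The first-hash/second-hash comparison in the claim can be sketched as follows. This is a minimal Python illustration; the canonical row encoding and the choice of SHA-256 are assumptions, not details from the patent.

    ```python
    import hashlib

    def snapshot_hash(rows) -> str:
        """Provide a hash from the digital data stored within a portion of a
        dynamic database, encoding rows in a canonical (sorted) order."""
        h = hashlib.sha256()
        for row in sorted(rows):
            h.update(repr(row).encode())
        return h.hexdigest()

    table = [("alice", 100), ("bob", 200)]
    first = snapshot_hash(table)        # first hash, initial moment in time
    table[1] = ("bob", 250)             # the dynamic database changes
    second = snapshot_hash(table)       # second hash, subsequent moment in time
    assert first != second              # comparing the hashes detects the change
    ```

    Sorting the rows before hashing makes the digest depend only on the data, not on row order, which is one reasonable way to hash a "portion" of a dynamic database.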

  19. In vitro quantification of the performance of model-based mono-planar and bi-planar fluoroscopy for 3D joint kinematics estimation.

    PubMed

    Tersi, Luca; Barré, Arnaud; Fantozzi, Silvia; Stagni, Rita

    2013-03-01

    Model-based mono-planar and bi-planar 3D fluoroscopy methods can quantify intact joint kinematics with a performance/cost trade-off. The aim of this study was to compare the performances of mono- and bi-planar setups to a marker-based gold standard during dynamic phantom knee acquisitions. Absolute pose errors for in-plane parameters were lower than 0.6 mm or 0.6° for both mono- and bi-planar setups. Mono-planar setups proved critical in quantifying the out-of-plane translation (error < 6.5 mm), and bi-planar setups in quantifying the rotation about the bone's longitudinal axis (error < 1.3°). These errors propagated to joint angles and translations differently depending on the alignment of the anatomical axes and the fluoroscopic reference frames. Internal-external rotation was the least accurate angle both with mono- (error < 4.4°) and bi-planar (error < 1.7°) setups, due to bone longitudinal symmetries. Results highlighted that the accuracy of mono-planar in-plane pose parameters is comparable to bi-planar, but with halved computational costs, halved segmentation time and halved ionizing radiation dose. Bi-planar analysis better compensated for the out-of-plane uncertainty, which is propagated to relative kinematics differently depending on the setup. To take full advantage of bi-planar analysis, the motion task to be investigated should be designed to maintain the joint inside the visible volume, introducing constraints with respect to mono-planar analysis.

  20. Clamshell microwave cavities having a superconductive coating

    DOEpatents

    Cooke, D. Wayne; Arendt, Paul N.; Piel, Helmut

    1994-01-01

    A microwave cavity including a pair of opposing clamshell halves, such halves comprised of a metal selected from the group consisting of silver, copper, or a silver-based alloy, wherein the cavity is further characterized as exhibiting a dominant TE011 mode, is provided together with an embodiment wherein the interior concave surfaces of the clamshell halves are coated with a superconductive material. In the case of copper clamshell halves, the microwave cavity has a Q-value of about 1.2×10^5 as measured at a temperature of 10 K and a frequency of 10 GHz.

  1. A Key Establishment Protocol for RFID User in IPTV Environment

    NASA Astrophysics Data System (ADS)

    Jeong, Yoon-Su; Kim, Yong-Tae; Sohn, Jae-Min; Park, Gil-Cheol; Lee, Sang-Ho

    In recent years, the usage of IPTV (Internet Protocol Television) has increased. The reason is a technological convergence of broadcasting and telecommunication delivering interactive applications and multimedia content through high speed Internet connections. The main critical point of IPTV security requirements is subscriber authentication. That is, an IPTV service should have the capability to identify its subscribers to prohibit illegal access. Currently, IPTV service does not provide a sound authentication mechanism to verify the identity of its wireless users (or devices). This paper focuses on a lightweight authentication and key establishment protocol based on the use of hash functions. The proposed approach provides effective authentication for a mobile user with an RFID tag whose authentication information is communicated back and forth with the IPTV authentication server via the IPTV set-top box (STB). That is, the proposed protocol generates the user's authentication information as a bundle of two public keys derived from hashing the user's private keys and the RFID tag's session identifier, and adds 1 bit to this bundled information for subscriber information confidentiality before passing it to the authentication server.

  2. Adaptive Bloom Filter: A Space-Efficient Counting Algorithm for Unpredictable Network Traffic

    NASA Astrophysics Data System (ADS)

    Matsumoto, Yoshihide; Hazeyama, Hiroaki; Kadobayashi, Youki

    The Bloom Filter (BF), a space-and-time-efficient hash-coding method, is used as one of the fundamental modules in several network processing algorithms and applications such as route lookups, cache hits, packet classification, per-flow state management or network monitoring. BF is a simple space-efficient randomized data structure used to represent a data set in order to support membership queries. However, BF generates false positives, and cannot count the number of distinct elements. A counting Bloom Filter (CBF) can count the number of distinct elements, but CBF needs more space than BF. We propose an alternative data structure to CBF, which we call an Adaptive Bloom Filter (ABF). Although ABF uses the same-sized bit-vector used in BF, the number of hash functions employed by ABF is dynamically changed to record the number of appearances of each key element. Considering the hash collisions, the multiplicity of each key element on ABF can be estimated from the number of hash functions used to decode the membership of that key element. Although ABF can realize the same functionality as CBF, ABF requires the same memory size as BF. We describe the construction of ABF and IABF (Improved ABF), and provide a mathematical analysis and simulation using Zipf's distribution. Finally, we show that ABF can be used for an unpredictable data set such as real network traffic.
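    A plain Bloom filter, the structure ABF builds on, can be sketched as follows. This is an illustrative Python version; deriving the k bit positions by slicing one SHA-256 digest is an assumption, not the paper's hash family. ABF would additionally vary the number of hash functions applied per key to record multiplicities in the same bit-vector.

    ```python
    import hashlib

    class BloomFilter:
        def __init__(self, m: int, k: int):
            self.m, self.k = m, k
            self.bits = bytearray(m)      # same-sized bit-vector as in BF

        def _positions(self, key: bytes):
            # Derive k positions from a single digest (illustrative choice;
            # works for k <= 8 with a 32-byte SHA-256 digest).
            digest = hashlib.sha256(key).digest()
            for i in range(self.k):
                yield int.from_bytes(digest[4 * i: 4 * i + 4], "big") % self.m

        def add(self, key: bytes):
            for p in self._positions(key):
                self.bits[p] = 1

        def might_contain(self, key: bytes) -> bool:
            # True can be a false positive; False is always correct.
            return all(self.bits[p] for p in self._positions(key))

    bf = BloomFilter(m=1024, k=4)
    bf.add(b"10.0.0.1")
    assert bf.might_contain(b"10.0.0.1")
    ```

    Membership queries never produce false negatives, only false positives, which is exactly the trade-off the abstract describes for BF.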

  3. An Improved Biometrics-Based Remote User Authentication Scheme with User Anonymity

    PubMed Central

    Kumari, Saru

    2013-01-01

    The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental to its security. Therefore, the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented, followed by its efficiency comparison. The proposed scheme not only withstands the security problems found in An's scheme but also provides some extra features with the mere addition of only two hash operations. The proposed scheme allows the user to freely change his password and also provides user anonymity with untraceability. PMID:24350272

  4. Functional materials based on nanocrystalline cellulose

    NASA Astrophysics Data System (ADS)

    Surov, O. V.; Voronova, M. I.; Zakharov, A. G.

    2017-10-01

    The data on the synthesis of functional materials based on nanocrystalline cellulose (NCC) published over the past 10 years are analyzed. The liquid-crystal properties of NCC suspensions, methods of investigation of NCC suspensions and films, conditions for preserving chiral nematic structure in the NCC films after removal of the solvent and features of templated sol-gel synthesis of functional materials based on NCC are considered. The bibliography includes 106 references.

  5. Robust watermarking scheme for binary images using a slice-based large-cluster algorithm with a Hamming Code

    NASA Astrophysics Data System (ADS)

    Chen, Wen-Yuan; Liu, Chen-Chung

    2006-01-01

    The problems with binary watermarking schemes are that they have only a small amount of embeddable space and are not robust enough. We develop a slice-based large-cluster algorithm (SBLCA) to construct a robust watermarking scheme for binary images. In SBLCA, a small-amount cluster selection (SACS) strategy is used to search for a feasible slice in a large-cluster flappable-pixel decision (LCFPD) method, which is used to search for the best location for concealing a secret bit from a selected slice. This method has four major advantages over the others: (a) SBLCA has a simple and effective decision function to select appropriate concealment locations, (b) SBLCA utilizes a blind watermarking scheme without the original image in the watermark extracting process, (c) SBLCA uses slice-based shuffling capability to transfer the regular image into a hash state without remembering the state before shuffling, and finally, (d) SBLCA has enough embeddable space that every 64 pixels could accommodate a secret bit of the binary image. Furthermore, empirical results on test images reveal that our approach is a robust watermarking scheme for binary images.

  6. Functional MRI registration with tissue-specific patch-based functional correlation tensors.

    PubMed

    Zhou, Yujia; Zhang, Han; Zhang, Lichi; Cao, Xiaohuan; Yang, Ru; Feng, Qianjin; Yap, Pew-Thian; Shen, Dinggang

    2018-06-01

    Population studies of brain function with resting-state functional magnetic resonance imaging (rs-fMRI) rely on accurate intersubject registration of functional areas. This is typically achieved through registration using high-resolution structural images with more spatial details and better tissue contrast. However, accumulating evidence has suggested that such strategy cannot align functional regions well because functional areas are not necessarily consistent with anatomical structures. To alleviate this problem, a number of registration algorithms based directly on rs-fMRI data have been developed, most of which utilize functional connectivity (FC) features for registration. However, most of these methods usually extract functional features only from the thin and highly curved cortical grey matter (GM), posing great challenges to accurate estimation of whole-brain deformation fields. In this article, we demonstrate that additional useful functional features can also be extracted from the whole brain, not restricted to the GM, particularly the white-matter (WM), for improving the overall functional registration. Specifically, we quantify local anisotropic correlation patterns of the blood oxygenation level-dependent (BOLD) signals using tissue-specific patch-based functional correlation tensors (ts-PFCTs) in both GM and WM. Functional registration is then performed by integrating the features from different tissues using the multi-channel large deformation diffeomorphic metric mapping (mLDDMM) algorithm. Experimental results show that our method achieves superior functional registration performance, compared with conventional registration methods. © 2018 Wiley Periodicals, Inc.

  7. OCO-2 Fairing Bi-Sector Halves Transport

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, is towed from the Building 836 hangar to Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations have begun to hoist the sections of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the pad's tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  8. Titan Two Halves

    NASA Image and Video Library

    2010-05-13

    Two different seasons on Titan in different hemispheres can be seen in this image. The moon's northern half appears slightly darker than the southern half in this view, taken in visible blue light by NASA's Cassini spacecraft.

  9. Resistivity of Carbon-Carbon Composites Halved

    NASA Technical Reports Server (NTRS)

    Gaier, James R.

    2004-01-01

    Carbon-carbon composites have become the material of choice for applications requiring strength and stiffness at very high temperatures (above 2000 C). These composites comprise carbon or graphite fibers embedded in a carbonized or graphitized matrix. In some applications, such as shielding sensitive electronics in very high temperature environments, the performance of these materials would be improved by lowering their electrical resistivity. One method to lower the resistivity of the composites is to lower the resistivity of the graphite fibers, and a proven method to accomplish that is intercalation. Intercalation is the insertion of guest atoms or molecules into a host lattice. In this study the host fibers were highly graphitic pitch-based graphite fibers, or vapor-grown carbon fibers (VGCF), and the intercalate was bromine. Intercalation compounds of graphite are generally thought of as being only metastable, but it has been shown that the residual bromine graphite fiber intercalation compound is remarkably stable, resisting decomposition even at temperatures at least as high as 1000 C. The focus of this work was to fabricate composite preforms, determine whether the fibers they were made from were still intercalated with bromine after processing, and determine the effect on composite resistivity. It was not expected that the resistivity would be lowered as dramatically as with graphite polymer composites because the matrix itself would be much more conductive, but it was hoped that the gains would be substantial enough to warrant its use in high-performance applications. In a collaborative effort supporting a Space Act Agreement between the NASA Glenn Research Center and Applied Sciences, Inc. (Cedarville, OH), laminar preforms were fabricated with pristine and bromine-intercalated pitch-based fibers (P100 and P100-Br) and VGCF (Pyro I and Pyro I-Br). The green preforms were carbonized at 1000 C and then heat treated to 3000 C. To determine whether the

  10. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
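    The sign-then-verify flow above can be sketched as follows. One hedge is needed: the patent uses an asymmetric private/public key pair, for which the keyed hash (HMAC) below is only a symmetric stand-in so that the sketch stays self-contained.

    ```python
    import hashlib
    import hmac

    def sign_image(image_bytes: bytes, key: bytes) -> bytes:
        """Calculate a hash of the image file, then 'sign' it. HMAC stands in
        for encrypting the hash with the camera's embedded private key."""
        image_hash = hashlib.sha256(image_bytes).digest()
        return hmac.new(key, image_hash, hashlib.sha256).digest()

    def verify_image(image_bytes: bytes, signature: bytes, key: bytes) -> bool:
        """Recompute the image hash with the same algorithm and compare it
        with the hash recovered from the stored digital signature."""
        return hmac.compare_digest(sign_image(image_bytes, key), signature)

    key = b"camera-unique secret"
    image = b"raw image file bytes"
    sig = sign_image(image, key)                     # stored with the image file
    assert verify_image(image, sig, key)             # unaltered file authenticates
    assert not verify_image(image + b"!", sig, key)  # any alteration is detected
    ```

    With a real key pair, verification would decrypt the signature with the public key instead of recomputing it, so the verifier never needs the camera's secret.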

  11. Tar, CO and delta 9THC delivery from the 1st and 2nd halves of a marijuana cigarette.

    PubMed

    Tashkin, D P; Gliederer, F; Rose, J; Chang, P; Hui, K K; Yu, J L; Wu, T C

    1991-11-01

    Previous in vitro studies suggest that, with successive puffs from a marijuana cigarette, delta-9-THC becomes concentrated in the remaining uncombusted portion of the cigarette. These observations are consistent with the common practice of smoking marijuana cigarettes to a smaller butt length than that to which tobacco cigarettes are smoked. The purpose of the present study was to compare the delivery of delta-9-THC, as well as total insoluble smoke particulates (tar) and carbon monoxide, from the distal ("first") versus the proximal ("second") halves of a standard marijuana cigarette during "natural" smoking of marijuana. On 4 separate days, ten habitual marijuana users smoked nearly all or approximately 1/2 of a standard marijuana cigarette (83 mm length; 800-900 mg; 1.24% THC), as follows: day 1, "whole" cigarette (60 mm smoked, leaving a 23-mm butt); day 2, "first" half (first 30 mm); day 3, "second" half (second 30 mm) after the "first" half was presmoked with a syringe; and day 4, "second" half after the "first" half was excised. A previously described smoking apparatus (20) was used for measurement of puff volume and inhaled tar. Puff volume and number were allowed to vary spontaneously (provided that the specified length of cigarette was consumed), while inhaled volume (1.5 liters), breathholding time (14 s) and interpuff interval (30 s) were held constant. Blood samples were withdrawn prior to smoking and serially after completion of smoking for analysis of blood carboxyhemoglobin (COHb) and serum delta-9-THC. Heart rate was measured before and 5 min after smoking. Subjects rated their level of "high" 20 min after completion of smoking.(ABSTRACT TRUNCATED AT 250 WORDS)

  12. Perl Modules for Constructing Iterators

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2009-01-01

    The Iterator Perl Module provides a general-purpose framework for constructing iterator objects within Perl, and a standard API for interacting with those objects. Iterators are an object-oriented design pattern where a description of a series of values is used in a constructor. Subsequent queries can request values in that series. These Perl modules build on the standard Iterator framework and provide iterators for some other types of values. Iterator::DateTime constructs iterators from DateTime objects or Date::Parse descriptions and iCal/RFC 2445 style recurrence descriptions. It supports a variety of input parameters, including a start to the sequence, an end to the sequence, an iCal/RFC 2445 recurrence describing the frequency of the values in the series, and a format description that can refine the presentation manner of the DateTime. Iterator::String constructs iterators from string representations. This module is useful in contexts where the API consists of supplying a string and getting back an iterator where the specific iteration desired is opaque to the caller. It is of particular value to the Iterator::Hash module which provides nested iterations. Iterator::Hash constructs iterators from Perl hashes that can include multiple iterators. The constructed iterators will return all the permutations of the iterations of the hash by nested iteration of embedded iterators. A hash simply includes a set of keys mapped to values. It is a very common data structure used throughout Perl programming. The Iterator::Hash module allows a hash to include strings defining iterators (parsed and dispatched with Iterator::String) that are used to construct an overall series of hash values.

  13. Single-pixel non-imaging object recognition by means of Fourier spectrum acquisition

    NASA Astrophysics Data System (ADS)

    Chen, Huichao; Shi, Jianhong; Liu, Xialin; Niu, Zhouzhou; Zeng, Guihua

    2018-04-01

    Single-pixel imaging has emerged over recent years as a novel imaging technique, which has significant application prospects. In this paper, we propose and experimentally demonstrate a scheme that can achieve single-pixel non-imaging object recognition by acquiring the Fourier spectrum. In an experiment, a four-step phase-shifting sinusoid illumination light is used to irradiate the object image, the value of the light intensity is measured with a single-pixel detection unit, and the Fourier coefficients of the object image are obtained by a differential measurement. The Fourier coefficients are first cast into binary numbers to obtain the hash value. We propose a new method of perceptual hashing algorithm, which is combined with a discrete Fourier transform to calculate the hash value. The hash distance is obtained by calculating the difference of the hash value between the object image and the contrast images. By setting an appropriate threshold, the object image can be quickly and accurately recognized. The proposed scheme realizes single-pixel non-imaging perceptual hashing object recognition by using fewer measurements. Our result might open a new path for realizing object recognition with non-imaging.
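    The binarize-the-spectrum idea can be sketched in one dimension as follows. This is an illustrative Python version using a naive DFT; thresholding the magnitudes at their mean is an assumption, and the paper's exact binarization rule may differ.

    ```python
    import cmath

    def dft_magnitudes(signal):
        """Naive O(n^2) discrete Fourier transform magnitudes."""
        n = len(signal)
        return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                        for t, x in enumerate(signal)))
                for k in range(n)]

    def perceptual_hash(signal):
        """Cast each Fourier magnitude into a bit by comparing to the mean."""
        mags = dft_magnitudes(signal)
        mean = sum(mags) / len(mags)
        return [1 if m > mean else 0 for m in mags]

    def hamming(h1, h2):
        """Hash distance: number of differing bits."""
        return sum(a != b for a, b in zip(h1, h2))

    a = [0, 1, 2, 3, 4, 3, 2, 1]     # "object" signal
    b = [0, 1, 2, 3, 4, 3, 2, 2]     # slightly perturbed copy
    c = [5, 0, 5, 0, 5, 0, 5, 0]     # very different signal
    assert hamming(perceptual_hash(a), perceptual_hash(b)) <= \
           hamming(perceptual_hash(a), perceptual_hash(c))
    ```

    Recognition then amounts to comparing the hash distance against a chosen threshold, as the scheme above does between the object image and the contrast images.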

  14. A Comparison of Nutrient Intakes between a Ft. Riley Contractor-Operated and a Ft. Lewis Military-Operated Garrison Dining Facility

    DTIC Science & Technology

    1987-10-01

  15. Multicollision attack on CBC-MAC, EMAC, and XCBC-MAC of AES-128 algorithm

    NASA Astrophysics Data System (ADS)

    Brolin Sihite, Alfonso; Hayat Susanti, Bety

    2017-10-01

    A Message Authentication Code (MAC) can be constructed based on a block cipher algorithm. CBC-MAC, EMAC, and XCBC-MAC are some of the MAC constructions used in hash functions. In this paper, we perform a multicollision attack on the CBC-MAC, EMAC, and XCBC-MAC constructions, which use the AES-128 block cipher algorithm as the basic construction. The multicollision attack method utilizes the concept of existential forgery on CBC-MAC. The results show that multicollisions can be obtained easily in the CBC-MAC, EMAC, and XCBC-MAC constructions.
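    The existential-forgery property that such attacks exploit can be demonstrated on plain CBC-MAC. The sketch below substitutes a keyed SHA-256 truncation for AES-128 (an assumption, purely to stay self-contained); the algebra of the forgery is unchanged: given the tag t of a one-block message m, the two-block message m || (m XOR t) has the same tag t.

    ```python
    import hashlib

    BLOCK = 16

    def toy_block_cipher(key: bytes, block: bytes) -> bytes:
        """Stand-in for AES-128: a keyed SHA-256 truncation, used only to
        illustrate the CBC-MAC structure (an assumption, not a real cipher)."""
        return hashlib.sha256(key + block).digest()[:BLOCK]

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def cbc_mac(key: bytes, message: bytes) -> bytes:
        """Plain CBC-MAC with a zero IV over 16-byte blocks."""
        state = bytes(BLOCK)
        for i in range(0, len(message), BLOCK):
            state = toy_block_cipher(key, xor(state, message[i:i + BLOCK]))
        return state

    # Existential forgery: the tag of m || (m XOR t) is E(t XOR (m XOR t)) = E(m) = t.
    key = b"0123456789abcdef"
    m = b"one-block msg..."          # exactly one 16-byte block
    t = cbc_mac(key, m)
    forged = m + xor(m, t)           # a new message with a valid tag, no key needed
    assert cbc_mac(key, forged) == t
    ```

    EMAC and XCBC-MAC add a final transformation precisely to close this hole, which is why the paper's attack has to work harder than this textbook forgery.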

  16. Secured Hash Based Burst Header Authentication Design for Optical Burst Switched Networks

    NASA Astrophysics Data System (ADS)

    Balamurugan, A. M.; Sivasubramanian, A.; Parvathavarthini, B.

    2017-12-01

    The optical burst switching (OBS) is a promising technology that could meet the fast growing network demand. It is featured with the ability to meet the bandwidth requirement of applications that demand intensive bandwidth. OBS proves to be a satisfactory technology to tackle the huge bandwidth constraints, but suffers from security vulnerabilities. The objective of this proposed work is to design a faster and more efficient burst header authentication algorithm for core nodes. There are two important key features in this work, viz., header encryption and authentication. Since the burst header is an important component of an optical burst switched network, it has to be encrypted; otherwise it is prone to attack. The proposed MD5&RC4-4S based burst header authentication algorithm runs 20.75 ns faster than the conventional algorithms. The modification suggested in the proposed RC4-4S algorithm gives better security and solves the correlation problems between the publicly known outputs during the key generation phase. The modified MD5 recommended in this work provides a 7.81 % better avalanche effect than the conventional algorithm. The device utilization result also shows the suitability of the proposed algorithm for header authentication in real-time applications.

  17. Functional Assessment-Based Interventions: Focusing on the Environment and Considering Function

    ERIC Educational Resources Information Center

    Oakes, Wendy Peia; Lane, Kathleen Lynne; Hirsch, Shanna Eisner

    2018-01-01

    It can be challenging for educators to select intervention tactics based on the function of the student's behavior. In this article, authors offer practical information on behavioral function and environmental-focused intervention ideas for educators developing behavior intervention plans. Ideas are organized according to the hypothesized function…

  18. Protection of Health Imagery by Region Based Lossless Reversible Watermarking Scheme

    PubMed Central

    Priya, R. Lakshmi; Sadasivam, V.

    2015-01-01

    Providing authentication and integrity in medical images is a problem, and this work proposes a new blind fragile region based lossless reversible watermarking technique to improve the trustworthiness of medical images. The proposed technique embeds the watermark using a reversible least significant bit embedding scheme. The scheme combines hashing, compression, and digital signature techniques to create a content dependent watermark, making use of the compressed region of interest (ROI) for recovery of the ROI as reported in literature. The experiments were carried out to prove the performance of the scheme, and its assessment reveals that the ROI is extracted intact and that the PSNR values obtained show that the presented scheme offers greater protection for health imagery. PMID:26649328
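    The reversible LSB embedding at the core of such schemes can be sketched as follows. This is a minimal Python illustration; real schemes, including the one above, store the recovery data inside the image itself (e.g. via a compressed ROI) rather than out-of-band as here, and the hash-derived payload below is only an example of a content-dependent watermark.

    ```python
    import hashlib

    def embed_lsb(pixels, payload_bits):
        """Embed watermark bits into pixel LSBs, returning the watermarked
        pixels plus the original LSBs needed to reverse the embedding."""
        saved = [p & 1 for p in pixels[:len(payload_bits)]]
        marked = list(pixels)
        for i, bit in enumerate(payload_bits):
            marked[i] = (marked[i] & ~1) | bit
        return marked, saved

    def extract_and_restore(marked, saved):
        """Read the watermark back out and losslessly restore the pixels."""
        bits = [p & 1 for p in marked[:len(saved)]]
        restored = list(marked)
        for i, lsb in enumerate(saved):
            restored[i] = (restored[i] & ~1) | lsb
        return bits, restored

    pixels = [120, 33, 250, 7, 90, 64, 18, 201]
    digest = hashlib.sha256(bytes(pixels)).digest()
    watermark = [(digest[0] >> i) & 1 for i in range(8)]  # content-dependent bits
    marked, saved = embed_lsb(pixels, watermark)
    bits, restored = extract_and_restore(marked, saved)
    assert bits == watermark and restored == pixels       # lossless recovery
    ```

    The "reversible" property is exactly the `restored == pixels` check: after extraction the cover image is recovered bit-for-bit, which matters for diagnostic imagery.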

  19. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NASA Astrophysics Data System (ADS)

    Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.

  20. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).

  1. Trial-Based Functional Analysis and Functional Communication Training in an Early Childhood Setting

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Bloom, Sarah E.; Irvin, Jennifer

    2012-01-01

    Problem behavior is common in early childhood special education classrooms. Functional communication training (FCT; Carr & Durand, 1985) may reduce problem behavior but requires identification of its function. The trial-based functional analysis (FA) is a method that can be used to identify problem behavior function in schools. We conducted…

  2. Tactile functions after cerebral hemispherectomy.

    PubMed

    Backlund, H; Morin, C; Ptito, A; Bushnell, M C; Olausson, H

    2005-01-01

    Patients who were hemispherectomized due to brain lesions early in life sometimes have remarkably well-preserved tactile functions on their paretic body half. This has been attributed to developmental neuroplasticity. However, the tactile examinations have generally been fairly crude, and subtle deficits may not have been revealed. We investigated monofilament detection and three types of tactile directional sensibility in four hemispherectomized patients and six healthy controls. Patients were examined bilaterally on the face, forearm and lower leg. Normal subjects were examined unilaterally. Following each test of directional sensibility, subjects were asked to rate the intensity of the stimulation. On the nonparetic side, results were almost always in the normal range. On the paretic side, the patients' capacity for monofilament detection was less impaired than their directional sensibility. Despite the disturbed directional sensibility on their paretic side, the patients rated the tactile sensations evoked by the stimuli, on both their paretic and nonparetic body halves, as more intense than the normal subjects did. Thus, mechanisms of plasticity seem adequate for tactile detection and intensity coding but not for more complex tactile functions such as directional sensibility. The reason for the high vulnerability of tactile directional sensibility may be that it depends on spatially and temporally precise afferent information processed in a distributed cortical network.

  3. Big words, halved brains and small worlds: complex brain networks of figurative language comprehension.

    PubMed

    Arzouan, Yossi; Solomon, Sorin; Faust, Miriam; Goldstein, Abraham

    2011-04-27

    Language comprehension is a complex task that involves a wide network of brain regions. We used topological measures to qualify and quantify the functional connectivity of the networks used under various comprehension conditions. To that aim we developed a technique to represent functional networks based on EEG recordings, taking advantage of their excellent time resolution in order to capture the fast processes that occur during language comprehension. Networks were created by searching for a specific causal relation between areas, the negative feedback loop, which is ubiquitous in many systems. This method is a simple way to construct directed graphs using event-related activity, which can then be analyzed topologically. Brain activity was recorded while subjects read expressions of various types and indicated whether they found them meaningful. Slightly different functional networks were obtained for event-related activity evoked by each expression type. The differences reflect the special contribution of specific regions in each condition and the balance of hemispheric activity involved in comprehending different types of expressions and are consistent with the literature in the field. Our results indicate that representing event-related brain activity as a network using a simple temporal relation, such as the negative feedback loop, to indicate directional connectivity is a viable option for investigation which also derives new information about aspects not reflected in the classical methods for investigating brain activity.

  4. Microbiological quality of five potato products obtained at retail markets.

    PubMed Central

    Duran, A P; Swartzentruber, A; Lanier, J M; Wentz, B A; Schwab, A H; Barnard, R J; Read, R B

    1982-01-01

    The microbiological quality of frozen hash brown potatoes, dried hash brown potatoes with onions, frozen french fried potatoes, dried instant mashed potatoes, and potato salad was determined by a national sampling at the retail level. A wide range of results was obtained, with most sampling units of each product having excellent microbiological quality. Geometric mean aerobic plate counts were as follows: dried hash brown potatoes, 270/g; frozen hash brown potatoes with onions, 580/g; frozen french fried potatoes, 78/g; dried instant mashed potatoes, 1.1 x 10(3)/g; and potato salad, 3.6 x 10(3)/g. Mean values of coliforms, Escherichia coli, and Staphylococcus aureus were less than 10/g. PMID:6758695

  5. CP function: an alpha spending function based on conditional power.

    PubMed

    Jiang, Zhiwei; Wang, Ling; Li, Chanjuan; Xia, Jielai; Wang, William

    2014-11-20

    The alpha spending function and stochastic curtailment are two frequently used methods in group sequential design. In the stochastic curtailment approach, the actual type I error probability cannot be well controlled within the specified significance level, but conditional power (CP) in stochastic curtailment is easier for clinicians to accept and understand. In this paper, we develop a spending function based on the concept of conditional power, named the CP function, which combines desirable features of alpha spending and stochastic curtailment. Like other two-parameter functions, the CP function is flexible enough to fit the needs of the trial. A simulation study is conducted to explore the choice of CP boundary in the CP function that maximizes the trial power. It is equivalent to, or even better than, the classical Pocock, O'Brien-Fleming, and quadratic spending functions as long as a proper ρ0, the pre-specified CP threshold for efficacy, is given. It also controls the overall type I error rate well and overcomes the disadvantage of stochastic curtailment. Copyright © 2014 John Wiley & Sons, Ltd.
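    For context, one classical member of the spending-function family that the CP function is compared against can be written down directly. A sketch of the O'Brien-Fleming-type alpha spending function (the standard formula, not the paper's two-parameter CP function), using only the Python standard library:

```python
from statistics import NormalDist

_N = NormalDist()

def obf_spending(t: float, alpha: float = 0.05) -> float:
    """O'Brien-Fleming-type spending function: cumulative type I error
    probability spent by information fraction t in (0, 1]."""
    z = _N.inv_cdf(1 - alpha / 2)
    return 2 * (1 - _N.cdf(z / t ** 0.5))
```

    At t = 1 the whole alpha is spent (obf_spending(1.0) equals alpha up to rounding), while early looks spend very little, which is the conservative behavior usually contrasted with Pocock-style boundaries.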

  6. Dissociating error-based and reinforcement-based loss functions during sensorimotor learning

    PubMed Central

    McGregor, Heather R.; Mohatarem, Ayman

    2017-01-01

    It has been proposed that the sensorimotor system uses a loss (cost) function to evaluate potential movements in the presence of random noise. Here we test this idea in the context of both error-based and reinforcement-based learning. In a reaching task, we laterally shifted a cursor relative to true hand position using a skewed probability distribution. This skewed probability distribution had its mean and mode separated, allowing us to dissociate the optimal predictions of an error-based loss function (corresponding to the mean of the lateral shifts) and a reinforcement-based loss function (corresponding to the mode). We then examined how the sensorimotor system uses error feedback and reinforcement feedback, in isolation and combination, when deciding where to aim the hand during a reach. We found that participants compensated differently to the same skewed lateral shift distribution depending on the form of feedback they received. When provided with error feedback, participants compensated based on the mean of the skewed noise. When provided with reinforcement feedback, participants compensated based on the mode. Participants receiving both error and reinforcement feedback continued to compensate based on the mean while repeatedly missing the target, despite receiving auditory, visual and monetary reinforcement feedback that rewarded hitting the target. Our work shows that reinforcement-based and error-based learning are separable and can occur independently. Further, when error and reinforcement feedback are in conflict, the sensorimotor system heavily weights error feedback over reinforcement feedback. PMID:28753634

  7. Dissociating error-based and reinforcement-based loss functions during sensorimotor learning.

    PubMed

    Cashaback, Joshua G A; McGregor, Heather R; Mohatarem, Ayman; Gribble, Paul L

    2017-07-01

    It has been proposed that the sensorimotor system uses a loss (cost) function to evaluate potential movements in the presence of random noise. Here we test this idea in the context of both error-based and reinforcement-based learning. In a reaching task, we laterally shifted a cursor relative to true hand position using a skewed probability distribution. This skewed probability distribution had its mean and mode separated, allowing us to dissociate the optimal predictions of an error-based loss function (corresponding to the mean of the lateral shifts) and a reinforcement-based loss function (corresponding to the mode). We then examined how the sensorimotor system uses error feedback and reinforcement feedback, in isolation and combination, when deciding where to aim the hand during a reach. We found that participants compensated differently to the same skewed lateral shift distribution depending on the form of feedback they received. When provided with error feedback, participants compensated based on the mean of the skewed noise. When provided with reinforcement feedback, participants compensated based on the mode. Participants receiving both error and reinforcement feedback continued to compensate based on the mean while repeatedly missing the target, despite receiving auditory, visual and monetary reinforcement feedback that rewarded hitting the target. Our work shows that reinforcement-based and error-based learning are separable and can occur independently. Further, when error and reinforcement feedback are in conflict, the sensorimotor system heavily weights error feedback over reinforcement feedback.

  8. Design and implementation of encrypted and decrypted file system based on USBKey and hardware code

    NASA Astrophysics Data System (ADS)

    Wu, Kehe; Zhang, Yakun; Cui, Wenchao; Jiang, Ting

    2017-05-01

    To protect the privacy of sensitive data, an encryption and decryption file system based on a USBKey and hardware code is designed and implemented in this paper. The system uses the USBKey and hardware code to authenticate a user. A random key encrypts each file with a symmetric encryption algorithm, and the USBKey encrypts that random key with an asymmetric encryption algorithm. At the same time, the MD5 algorithm computes a hash of the file to verify its integrity. Experimental results show that large files can be encrypted and decrypted in a very short time. The system has high efficiency and ensures the security of documents.
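    A minimal structural sketch of the scheme the abstract describes: a fresh random session key encrypts the file symmetrically, and an MD5 digest verifies integrity. The USBKey's asymmetric wrapping of the session key is elided, a toy SHA-256 counter keystream stands in for a real symmetric cipher such as AES, and all function names are illustrative:

```python
import hashlib
import os

def _keystream(key: bytes, n: int) -> bytes:
    # Toy counter-mode keystream from SHA-256; a real system would use AES.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_file(data: bytes):
    key = os.urandom(32)                    # random symmetric session key
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))
    digest = hashlib.md5(data).hexdigest()  # integrity hash, as in the abstract
    # In the described system, `key` would now be wrapped with the USBKey's
    # asymmetric (public) key; that step is elided here.
    return ct, key, digest

def decrypt_file(ct: bytes, key: bytes, digest: str) -> bytes:
    data = bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))
    if hashlib.md5(data).hexdigest() != digest:
        raise ValueError("integrity check failed")
    return data
```

    Note that MD5 is collision-broken; a modern design would prefer SHA-256 or an authenticated encryption mode over a separate integrity hash.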

  9. Unified Communications: Simplifying DoD Communication Methods

    DTIC Science & Technology

    2013-04-18

    private key to encrypt the hash. The encrypted hash, together with some other information, such as the hashing algorithm, is known as a digital...virtual private network (VPN). The use of a VPN would allow users to access corporate data while encrypting traffic. Another layer of protection would...sign and encrypt emails as well as controlling access to restricted sites. PKI uses a combination of public and private keys for encryption and

  10. Using Distinct Sectors in Media Sampling and Full Media Analysis to Detect Presence of Documents from a Corpus

    DTIC Science & Technology

    2012-09-01

    relative performance of several conventional SQL and NoSQL databases with a set of one billion file block hashes. Digital Forensics, Sector Hashing, Full...

  11. A Fast Optimization Method for General Binary Code Learning.

    PubMed

    Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng

    2016-09-22

    Hashing, or binary code learning, has been recognized to accomplish efficient near-neighbor search, and has thus attracted broad interest in recent retrieval, vision and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth and nonconvex) problem is reformulated as minimizing the sum of a smooth loss term and a nonsmooth indicator function. The obtained problem is then efficiently solved by an iterative procedure with each iteration admitting an analytical discrete solution, and it is thus shown to converge very fast. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both supervised and unsupervised hashing losses, together with the bit uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.
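    The key move in DPLM-style optimization is that each iteration stays in the discrete domain. A hedged sketch of the general idea, a gradient step on a smooth surrogate loss followed by projection back onto {-1, +1} (the exact DPLM subproblem, losses, and constraints differ):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 16))   # toy data points
W = rng.standard_normal((16, 8))     # toy hash projection
B = np.sign(X @ W)                   # initial 8-bit binary codes in {-1, +1}

step = 0.1
for _ in range(20):
    grad = B - X @ W                 # gradient of 0.5 * ||B - X W||^2 w.r.t. B
    B = np.sign(B - step * grad)     # projection keeps the codes discrete
B[B == 0] = 1                        # resolve (unlikely) exact zeros
```

    The point of the sketch is that no continuous relaxation is ever rounded at the end: the codes are binary after every iteration, which is what avoids the accumulated quantization error the abstract mentions.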

  12. Classroom Application of a Trial-Based Functional Analysis

    ERIC Educational Resources Information Center

    Bloom, Sarah E.; Iwata, Brian A.; Fritz, Jennifer N.; Roscoe, Eileen M.; Carreau, Abbey B.

    2011-01-01

    We evaluated a trial-based approach to conducting functional analyses in classroom settings. Ten students referred for problem behavior were exposed to a series of assessment trials, which were interspersed among classroom activities throughout the day. Results of these trial-based functional analyses were compared to those of more traditional…

  13. Left-right asymmetry of the gnathostome skull: its evolutionary, developmental, and functional aspects.

    PubMed

    Compagnucci, Claudia; Fish, Jennifer; Depew, Michael J

    2014-06-01

    Much of the gnathostome (jawed vertebrate) evolutionary radiation was dependent on the ability to sense and interpret the environment and subsequently act upon this information through utilization of a specialized mode of feeding involving the jaws. While the gnathostome skull, reflective of the vertebrate bauplan, typically is bilaterally symmetric with right (dextral) and left (sinistral) halves essentially representing mirror images along the midline, both adaptive and abnormal asymmetries have appeared. Herein we provide a basic primer on studies of the asymmetric development of the gnathostome skull, touching briefly on asymmetry as a field of study, then describing the nature of cranial development and finally underscoring evolutionary and functional aspects of left-right asymmetric cephalic development. © 2014 Wiley Periodicals, Inc.

  14. Evolution-Based Functional Decomposition of Proteins

    PubMed Central

    Rivoire, Olivier; Reynolds, Kimberly A.; Ranganathan, Rama

    2016-01-01

    The essential biological properties of proteins—folding, biochemical activities, and the capacity to adapt—arise from the global pattern of interactions between amino acid residues. The statistical coupling analysis (SCA) is an approach to defining this pattern that involves the study of amino acid coevolution in an ensemble of sequences comprising a protein family. This approach indicates a functional architecture within proteins in which the basic units are coupled networks of amino acids termed sectors. This evolution-based decomposition has potential for new understandings of the structural basis for protein function. To facilitate its usage, we present here the principles and practice of the SCA and introduce new methods for sector analysis in a python-based software package (pySCA). We show that the pattern of amino acid interactions within sectors is linked to the divergence of functional lineages in a multiple sequence alignment—a model for how sector properties might be differentially tuned in members of a protein family. This work provides new tools for studying proteins and for generally testing the concept of sectors as the principal units of function and adaptive variation. PMID:27254668

  15. AgBase: supporting functional modeling in agricultural organisms

    PubMed Central

    McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.

    2011-01-01

    AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website is redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795

  16. Security enhancement of a biometric based authentication scheme for telecare medicine information systems with nonce.

    PubMed

    Mishra, Dheerendra; Mukhopadhyay, Sourav; Kumari, Saru; Khan, Muhammad Khurram; Chaturvedi, Ankita

    2014-05-01

    Telecare medicine information systems (TMIS) provide a platform to deliver clinical services door to door. The technological advances in mobile computing are enhancing the quality of healthcare, and a user can access these services using a mobile device. However, the user and the Telecare system communicate via public channels in these online services, which increases the security risk. Therefore, it is necessary to ensure that only an authorized user accesses the system and that the user interacts with the correct system; mutual authentication provides the way to achieve this. However, existing schemes are either vulnerable to attacks or have a high computational cost, whereas a scalable authentication scheme for mobile devices should be both secure and efficient. Recently, Awasthi and Srivastava presented a biometric-based authentication scheme for TMIS with nonce. Their scheme only requires the computation of hash and XOR functions, so it fits TMIS. However, we observe that Awasthi and Srivastava's scheme does not achieve an efficient password change phase. Moreover, their scheme does not resist the off-line password guessing attack. Further, we propose an improvement of Awasthi and Srivastava's scheme with the aim of removing these drawbacks.
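    Schemes of this family build authentication from only hash and XOR primitives plus a fresh nonce. A generic sketch of that style (not Awasthi and Srivastava's actual protocol; all names are hypothetical, and SHA-256 stands in for whichever hash the scheme specifies):

```python
import hashlib
import os

def h(*parts: bytes) -> bytes:
    """Hash primitive (SHA-256 here; such schemes are hash-agnostic)."""
    return hashlib.sha256(b"|".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def register(user_id: bytes, pw: bytes) -> bytes:
    # Server stores the verifier v = h(id, pw), never the raw password.
    return h(user_id, pw)

def login_message(user_id: bytes, pw: bytes, nonce: bytes) -> bytes:
    # The fresh nonce binds the message to this session, resisting replay.
    return xor(h(user_id, pw), h(nonce))

def verify(v: bytes, nonce: bytes, msg: bytes) -> bool:
    return xor(msg, h(nonce)) == v
```

    The appeal for TMIS is exactly what the abstract notes: only hash evaluations and XORs, so the client-side cost suits constrained mobile devices; the flip side is that such constructions are fragile, as the off-line guessing attack found here illustrates.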

  17. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    PubMed

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  18. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
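    The second phase the abstract describes, refining a coarse partition with a hash table, can be sketched as Moore-style partition refinement keyed on each state's transition signature. This is a simplified illustration of hash-table refinement, not the paper's O(n) algorithm (which first coarsens by backward depth):

```python
def minimize(states, alphabet, delta, accepting):
    """Return the number of states of the minimal DFA.
    delta maps (state, symbol) -> state."""
    # Coarse initial partition: accepting vs. non-accepting.
    block = {s: int(s in accepting) for s in states}
    n_blocks = len(set(block.values()))
    while True:
        sig_to_block, new_block = {}, {}
        for s in states:
            # Signature: own block plus the blocks its transitions land in.
            sig = (block[s],) + tuple(block[delta[s, a]] for a in alphabet)
            if sig not in sig_to_block:          # hash-table refinement
                sig_to_block[sig] = len(sig_to_block)
            new_block[s] = sig_to_block[sig]
        block = new_block
        if len(sig_to_block) == n_blocks:        # no block split: stable
            return n_blocks
        n_blocks = len(sig_to_block)
```

    Each pass only ever splits blocks, so the loop terminates once the block count stops growing; equivalent states end up sharing a signature and hence a hash-table entry.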

  19. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft’s algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102

  20. 3D animation of facial plastic surgery based on computer graphics

    NASA Astrophysics Data System (ADS)

    Zhang, Zonghua; Zhao, Yan

    2013-12-01

    More and more people, especially women, desire to look more beautiful than ever. To some extent this has become possible because plastic surgery of the face was already practiced in the early 20th century and even earlier, when doctors dealt with war injuries of the face. However, the outcome of an operation is not always satisfying, since no animation of the result can be shown to the patient beforehand. In this paper, by combining facial plastic surgery and computer graphics, a novel method of simulating the post-operative appearance is given to demonstrate the modified face from different viewpoints. The 3D human face data are obtained using 3D fringe-pattern imaging systems and CT imaging systems and then converted into the STL (STereo Lithography) file format. An STL file is made up of small 3D triangular primitives. The triangular mesh can be reconstructed by using a hash function. The topmost triangles in depth, out of the full set of triangles, are picked out by a ray-casting technique. Mesh deformation is based on the front triangular mesh in the simulation, which deforms the area of interest instead of control points. Experiments on a face model show that the proposed 3D animation of facial plastic surgery can effectively demonstrate the simulated post-operative appearance.
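    Reconstructing shared-vertex connectivity from an STL triangle soup with a hash function can be sketched by keying a dictionary on rounded vertex coordinates. This is a common generic technique, offered here as a plausible reading of the abstract's hash step; the names and the rounding tolerance are assumptions:

```python
def index_mesh(triangles, decimals=6):
    """triangles: list of faces, each a tuple of three (x, y, z) vertices.
    Returns (vertices, faces) with shared vertices deduplicated."""
    vertex_id = {}            # hash table: rounded coordinate -> vertex index
    vertices, faces = [], []
    for tri in triangles:
        face = []
        for v in tri:
            key = tuple(round(c, decimals) for c in v)
            if key not in vertex_id:
                vertex_id[key] = len(vertices)
                vertices.append(v)
            face.append(vertex_id[key])
        faces.append(tuple(face))
    return vertices, faces
```

    Once faces reference shared vertex indices instead of duplicated coordinates, neighborhood queries and mesh deformation become straightforward.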

  1. Training Teachers to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Kunnavatana, S. Shanun; Bloom, Sarah E.; Samaha, Andrew L.; Dayton, Elizabeth

    2013-01-01

    The trial-based functional analysis (FA) is a promising approach to identification of behavioral function and is especially suited for use in educational settings. Not all studies on trial-based FA have included teachers as therapists, and those studies that have, included minimal information on teacher training. The purpose of this study was to…

  2. Security enhanced multi-factor biometric authentication scheme using bio-hash function

    PubMed Central

    Lee, Youngsook; Moon, Jongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An’s scheme was susceptible to a replay attack where an adversary masquerades as a legal server and a user masquerading attack where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user’s ID during login. Cao and Ge improved upon Younghwa An’s scheme, but various security problems remained. This study demonstrates that Cao and Ge’s scheme is susceptible to a biometric recognition error, slow wrong password detection, off-line password attack, user impersonation attack, ID guessing attack, a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge’s scheme, this study proposes a security enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly higher computational cost. PMID:28459867

  3. Security enhanced multi-factor biometric authentication scheme using bio-hash function.

    PubMed

    Choi, Younsung; Lee, Youngsook; Moon, Jongho; Won, Dongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An's scheme was susceptible to a replay attack where an adversary masquerades as a legal server and a user masquerading attack where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user's ID during login. Cao and Ge improved upon Younghwa An's scheme, but various security problems remained. This study demonstrates that Cao and Ge's scheme is susceptible to a biometric recognition error, slow wrong password detection, off-line password attack, user impersonation attack, ID guessing attack, a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge's scheme, this study proposes a security enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly higher computational cost.

  4. The self-crosslinking smart hyaluronic acid hydrogels as injectable three-dimensional scaffolds for cells culture.

    PubMed

    Bian, Shaoquan; He, Mengmeng; Sui, Junhui; Cai, Hanxu; Sun, Yong; Liang, Jie; Fan, Yujiang; Zhang, Xingdong

    2016-04-01

    Although disulfide-bond-crosslinked hyaluronic acid hydrogels have been reported by many research groups, most research has focused on effectively forming hydrogels. Few researchers have paid attention to the potential significance of controlling hydrogel formation and degradation, improving biocompatibility, reducing exogenous toxicity and providing convenience in later clinical operations. In this research, a novel controllable self-crosslinking smart hydrogel with in-situ gelation was prepared from a single component, a thiolated hyaluronic acid derivative (HA-SH), and applied as a three-dimensional scaffold mimicking the native extracellular matrix (ECM) for the culture of fibroblast cells (L929) and chondrocytes. A series of HA-SH hydrogels was prepared with different degrees of thiol substitution (ranging from 10 to 60%) and molecular weights of HA (0.1, 0.3 and 1.0 MDa). The gelation time, swelling property and smart degradation behavior of the HA-SH hydrogels were evaluated. The results showed that the gelation and degradation times of the hydrogels could be controlled by adjusting the composition of the HA-SH polymers. The storage modulus of HA-SH hydrogels obtained by dynamic modulus analysis (DMA) could be up to 44.6 kPa. In addition, HA-SH hydrogels were investigated as a three-dimensional scaffold for the culture of fibroblast cells (L929) and chondrocyte cells in vitro and as an injectable hydrogel for delivering chondrocyte cells in vivo. These results illustrate that HA-SH hydrogels, with a controllable gelation process, intelligent degradation behavior, excellent biocompatibility and convenient operational characteristics, have potential for clinical application in tissue engineering and regenerative medicine. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Provably secure Rabin-p cryptosystem in hybrid setting

    NASA Astrophysics Data System (ADS)

    Asbullah, Muhammad Asyraf; Ariffin, Muhammad Rezal Kamel

    2016-06-01

    In this work, we design an efficient and provably secure hybrid cryptosystem, a combination of the Rabin-p cryptosystem with an appropriate symmetric encryption scheme. We set up a hybrid structure that is proven secure in the sense of indistinguishability against the chosen-ciphertext attack. We assume that the integer factorization problem is hard and that the hash function is modeled as a random function.
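    For orientation, the trapdoor underlying such constructions is textbook Rabin encryption: squaring modulo n = pq, with decryption via square roots mod p and q combined by the CRT. Rabin-p itself uses a modulus of the form p²q with a refined decryption that avoids the four-root ambiguity, which this hedged sketch elides:

```python
def rabin_encrypt(m: int, n: int) -> int:
    return pow(m, 2, n)

def rabin_decrypt(c: int, p: int, q: int) -> set:
    # Square roots of c mod p and mod q (valid because p, q ≡ 3 mod 4),
    # combined into the four candidate plaintexts via the CRT.
    n = p * q
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    yq = pow(q, -1, p) * q % n   # ≡ 1 mod p, ≡ 0 mod q
    yp = pow(p, -1, q) * p % n   # ≡ 0 mod p, ≡ 1 mod q
    r = (yq * mp + yp * mq) % n
    s = (yq * mp - yp * mq) % n
    return {r, n - r, s, n - s}
```

    Breaking this is as hard as factoring n, which is the hardness assumption the abstract invokes; the hybrid setting then delegates bulk encryption to the symmetric scheme.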

  6. A network function-based definition of communities in complex networks.

    PubMed

    Chauhan, Sanjeev; Girvan, Michelle; Ott, Edward

    2012-09-01

    We consider an alternate definition of community structure that is functionally motivated. We define network community structure based on the function the network system is intended to perform. In particular, as a specific example of this approach, we consider communities whose function is enhanced by the ability to synchronize and/or by resilience to node failures. Previous work has shown that, in many cases, the largest eigenvalue of the network's adjacency matrix controls the onset of both synchronization and percolation processes. Thus, for networks whose functional performance is dependent on these processes, we propose a method that divides a given network into communities based on maximizing a function of the largest eigenvalues of the adjacency matrices of the resulting communities. We also explore the differences between the partitions obtained by our method and the modularity approach (which is based solely on consideration of network structure). We do this for several different classes of networks. We find that, in many cases, modularity-based partitions do almost as well as our function-based method in finding functional communities, even though modularity does not specifically incorporate consideration of function.
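
    Since the method partitions a network by maximizing a function of the largest eigenvalues of the communities' adjacency matrices, the core primitive is computing that leading eigenvalue per candidate community. A minimal, illustrative power-iteration sketch (pure Python, not the authors' code):

    ```python
    def largest_eigenvalue(adj, iters=200):
        # Power iteration for the leading eigenvalue of a (symmetric,
        # non-negative) adjacency matrix given as a list of lists.
        n = len(adj)
        v = [1.0] * n
        lam = 0.0
        for _ in range(iters):
            w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
            norm = max(abs(x) for x in w) or 1.0
            v = [x / norm for x in w]
            lam = norm
        return lam

    # A triangle (K3) has leading eigenvalue 2; a single edge has 1.
    k3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
    edge = [[0, 1], [1, 0]]
    print(largest_eigenvalue(k3), largest_eigenvalue(edge))
    ```

    A partition score could then combine these per-community values, with denser, more synchronizable communities yielding larger leading eigenvalues.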

  7. Halving It All: How Equally Shared Parenting Works.

    ERIC Educational Resources Information Center

    Deutsch, Francine M.

    Noting that details of everyday life contribute to parental equality or inequality, this qualitative study focused on how couples transformed parental roles to create truly equal families. Participating in the study were 88 couples in 4 categories, based on division of parental responsibilities: equal sharers, 60-40 couples, 75-25 couples, and…

  8. Teaching Paraprofessionals to Implement Function-Based Interventions

    ERIC Educational Resources Information Center

    Walker, Virginia L.; Snell, Martha E.

    2017-01-01

    The purpose of this study was to evaluate the effects of workshops and coaching on paraprofessional implementation of function-based interventions. The results of indirect and direct functional behavior assessment guided the development of intervention strategies for three students with autism and intellectual disability. Following intervention,…

  9. High-performance sparse matrix-matrix products on Intel KNL and multicore architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagasaka, Y; Matsuoka, S; Azad, A

    Sparse matrix-matrix multiplication (SpGEMM) is a computational primitive that is widely used in areas ranging from traditional numerical applications to recent big data analysis and machine learning. Although many SpGEMM algorithms have been proposed, hardware-specific optimizations for multi- and many-core processors are lacking, and a detailed analysis of their performance under various use cases and matrices is not available. We first identify and mitigate multiple bottlenecks with memory management and thread scheduling on Intel Xeon Phi (Knights Landing, or KNL). Specifically targeting multi- and many-core processors, we develop a hash-table-based algorithm and optimize a heap-based shared-memory SpGEMM algorithm. We examine their performance together with other publicly available codes. Unlike prior literature, our evaluation also includes use cases that are representative of real graph algorithms, such as multi-source breadth-first search and triangle counting. Our hash-table- and heap-based algorithms show significant speedups over existing libraries in the majority of cases, while different algorithms dominate the remaining scenarios depending on matrix size, sparsity, compression factor and operation type. We distill these in-depth evaluation results into a recipe for choosing the best SpGEMM algorithm for a target scenario. A critical finding is that hash-table-based SpGEMM gets a significant performance boost if the nonzeros are not required to be sorted within each row of the output matrix.
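
    The hash-table-based approach the abstract refers to is, at its core, Gustavson's row-by-row algorithm with a hash table accumulating each output row. A simplified sketch using Python dicts as the hash tables (an illustration of the technique, not the paper's optimized kernel):

    ```python
    def spgemm(A, B):
        # Hash-accumulator SpGEMM: for each row i of A, scatter the scaled
        # rows of B into a per-row hash table (here a dict).
        # Sparse matrices are represented as {row: {col: value}}.
        C = {}
        for i, arow in A.items():
            acc = {}
            for k, a_ik in arow.items():
                for j, b_kj in B.get(k, {}).items():
                    acc[j] = acc.get(j, 0) + a_ik * b_kj
            if acc:
                C[i] = acc
        return C

    A = {0: {0: 1, 1: 2}, 1: {1: 3}}
    B = {0: {1: 4}, 1: {0: 5}}
    print(spgemm(A, B))
    ```

    Note that the dict leaves column indices within each row unordered, which mirrors the paper's finding: skipping the per-row sort is exactly where the hash-table variant gains its edge.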

  10. ACMES: fast multiple-genome searches for short repeat sequences with concurrent cross-species information retrieval

    PubMed Central

    Reneker, Jeff; Shyu, Chi-Ren; Zeng, Peiyu; Polacco, Joseph C.; Gassmann, Walter

    2004-01-01

    We have developed a web server for the life sciences community to use to search for short repeats of DNA sequence of length between 3 and 10 000 bases within multiple species. This search employs a unique and fast hash function approach. Our system also applies information retrieval algorithms to discover knowledge of cross-species conservation of repeat sequences. Furthermore, we have incorporated a part of the Gene Ontology database into our information retrieval algorithms to broaden the coverage of the search. Our web server and tutorial can be found at http://acmes.rnet.missouri.edu. PMID:15215469
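
    The abstract does not describe the server's hash function in detail; the sketch below is only a simplified illustration of the general idea of hash-based repeat lookup, indexing every length-k window of a sequence by its hash (here a Python dict) so repeats are found in a single pass.

    ```python
    def find_repeats(seq, k):
        # Hash each length-k window to the list of positions where it
        # occurs; windows seen more than once are short repeats.
        positions = {}
        for i in range(len(seq) - k + 1):
            positions.setdefault(seq[i:i + k], []).append(i)
        return {kmer: pos for kmer, pos in positions.items() if len(pos) > 1}

    print(find_repeats("GATTACAGATTA", 5))
    ```

    Extending this to multiple species amounts to tagging each position with its genome of origin before querying the shared index.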

  11. Web-based Electronic Sharing and RE-allocation of Assets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leverett, Dave; Miller, Robert A.; Berlin, Gary J.

    2002-09-09

    The Electronic Asset Sharing Program is a web-based application that provides the capability for complex-wide sharing and reallocation of assets that are excess, underutilized, or unutilized. Through a web-based front-end and a supporting hash database with a search engine, users can search for assets that they need, search for assets needed by others, enter assets they need, and enter assets they have available for reallocation. In addition, entire listings of available assets and needed assets can be viewed. The application is written in Java; the hash database and search engine are implemented in Object-Oriented Java Database Management (OJDBM). The application will be hosted on an SRS-managed server outside the firewall, and access will be controlled via a protected realm. An example of the application can be viewed at the following (temporary) URL: http://idgdev.srs.gov/servlet/srs.weshare.WeShare

  12. Dominance-based ranking functions for interval-valued intuitionistic fuzzy sets.

    PubMed

    Chen, Liang-Hsuan; Tu, Chien-Cheng

    2014-08-01

    The ranking of interval-valued intuitionistic fuzzy sets (IvIFSs) is difficult since they include the interval values of membership and nonmembership. This paper proposes ranking functions for IvIFSs based on the dominance concept. The proposed ranking functions consider the degree to which an IvIFS dominates and is not dominated by other IvIFSs. Based on the bivariate framework and the dominance concept, the functions incorporate not only the boundary values of membership and nonmembership, but also the relative relations among IvIFSs in comparisons. The dominance-based ranking functions include bipolar evaluations with a parameter that allows the decision-maker to reflect his actual attitude in allocating the various kinds of dominance. The relationship for two IvIFSs that satisfy the dual couple is defined based on four proposed ranking functions. Importantly, the proposed ranking functions can achieve a full ranking for all IvIFSs. Two examples are used to demonstrate the applicability and distinctiveness of the proposed ranking functions.

  13. A new therapeutic effect of simvastatin revealed by functional improvement in muscular dystrophy.

    PubMed

    Whitehead, Nicholas P; Kim, Min Jeong; Bible, Kenneth L; Adams, Marvin E; Froehner, Stanley C

    2015-10-13

    Duchenne muscular dystrophy (DMD) is a lethal, degenerative muscle disease with no effective treatment. DMD muscle pathogenesis is characterized by chronic inflammation, oxidative stress, and fibrosis. Statins, cholesterol-lowering drugs, inhibit these deleterious processes in ischemic diseases affecting skeletal muscle, and therefore have potential to improve DMD. However, statins have not been considered for DMD, or other muscular dystrophies, principally because skeletal-muscle-related symptoms are rare, but widely publicized, side effects of these drugs. Here we show positive effects of statins in dystrophic skeletal muscle. Simvastatin dramatically reduced damage and enhanced muscle function in dystrophic (mdx) mice. Long-term simvastatin treatment vastly improved overall muscle health in mdx mice, reducing plasma creatine kinase activity, an established measure of muscle damage, to near-normal levels. This reduction was accompanied by reduced inflammation, more oxidative muscle fibers, and improved strength of the weak diaphragm muscle. Shorter-term treatment protected against muscle fatigue and increased mdx hindlimb muscle force by 40%, a value comparable to current dystrophin gene-based therapies. Increased force correlated with reduced NADPH Oxidase 2 protein expression, the major source of oxidative stress in dystrophic muscle. Finally, in old mdx mice with severe muscle degeneration, simvastatin enhanced diaphragm force and halved fibrosis, a major cause of functional decline in DMD. These improvements were accompanied by autophagy activation, a recent therapeutic target for DMD, and less oxidative stress. Together, our findings highlight that simvastatin substantially improves the overall health and function of dystrophic skeletal muscles and may provide an unexpected, novel therapy for DMD and related neuromuscular diseases.

  14. A new therapeutic effect of simvastatin revealed by functional improvement in muscular dystrophy

    PubMed Central

    Whitehead, Nicholas P.; Kim, Min Jeong; Bible, Kenneth L.; Adams, Marvin E.; Froehner, Stanley C.

    2015-01-01

    Duchenne muscular dystrophy (DMD) is a lethal, degenerative muscle disease with no effective treatment. DMD muscle pathogenesis is characterized by chronic inflammation, oxidative stress, and fibrosis. Statins, cholesterol-lowering drugs, inhibit these deleterious processes in ischemic diseases affecting skeletal muscle, and therefore have potential to improve DMD. However, statins have not been considered for DMD, or other muscular dystrophies, principally because skeletal-muscle-related symptoms are rare, but widely publicized, side effects of these drugs. Here we show positive effects of statins in dystrophic skeletal muscle. Simvastatin dramatically reduced damage and enhanced muscle function in dystrophic (mdx) mice. Long-term simvastatin treatment vastly improved overall muscle health in mdx mice, reducing plasma creatine kinase activity, an established measure of muscle damage, to near-normal levels. This reduction was accompanied by reduced inflammation, more oxidative muscle fibers, and improved strength of the weak diaphragm muscle. Shorter-term treatment protected against muscle fatigue and increased mdx hindlimb muscle force by 40%, a value comparable to current dystrophin gene-based therapies. Increased force correlated with reduced NADPH Oxidase 2 protein expression, the major source of oxidative stress in dystrophic muscle. Finally, in old mdx mice with severe muscle degeneration, simvastatin enhanced diaphragm force and halved fibrosis, a major cause of functional decline in DMD. These improvements were accompanied by autophagy activation, a recent therapeutic target for DMD, and less oxidative stress. Together, our findings highlight that simvastatin substantially improves the overall health and function of dystrophic skeletal muscles and may provide an unexpected, novel therapy for DMD and related neuromuscular diseases. PMID:26417069

  15. Functional Enzyme-Based Approach for Linking Microbial Community Functions with Biogeochemical Process Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Minjing; Qian, Wei-jun; Gao, Yuqian

    The kinetics of biogeochemical processes in natural and engineered environmental systems are typically described using Monod-type or modified Monod-type models. These models rely on biomass as a surrogate for the functional enzymes in a microbial community that catalyze biogeochemical reactions. A major challenge in applying such models is the difficulty of quantitatively measuring functional biomass to constrain and validate them. On the other hand, omics-based approaches have been increasingly used to characterize microbial community structure, functions, and metabolites. Here we propose an enzyme-based model that can incorporate omics data to link microbial community functions with biogeochemical process kinetics. The model treats enzymes as time-variable catalysts for biogeochemical reactions and applies a biogeochemical reaction network to incorporate intermediate metabolites. The sequences of genes and proteins from metagenomes, as well as those from the UniProt database, were used for targeted enzyme quantification and to provide insights into the dynamic linkage among functional genes, enzymes, and metabolites that needs to be incorporated in the model. The application of the model was demonstrated using denitrification as an example, by comparing model-simulated with measured functional enzymes, genes, and denitrification substrates and intermediates.
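
    The substitution the abstract describes can be made concrete: in a Monod-type rate law, the biomass surrogate is replaced by a measured, time-variable enzyme concentration. A schematic sketch under that assumption (parameter names `vmax`, `k_cat`, `Km` are illustrative, not from the paper):

    ```python
    def monod_rate(vmax, Km, S):
        # Classic Monod kinetics: the reaction rate saturates in the
        # substrate concentration S, reaching vmax/2 at S == Km.
        return vmax * S / (Km + S)

    def enzyme_based_rate(k_cat, E, Km, S):
        # Enzyme-based variant: the biomass surrogate is replaced by the
        # measured enzyme concentration E, which can change over time.
        return k_cat * E * S / (Km + S)

    print(monod_rate(2.0, 1.0, 1.0))  # half-saturation point: vmax/2 = 1.0
    ```

    In a full reaction-network model, one such rate term would be written per enzyme-catalyzed step, with intermediates (e.g. nitrite in denitrification) appearing as both products and substrates.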

  16. Questionnaire-based assessment of executive functioning: Case studies.

    PubMed

    Kronenberger, William G; Castellanos, Irina; Pisoni, David B

    2018-01-01

    Delays in the development of executive functioning skills are frequently observed in pediatric neuropsychology populations and can have a broad and significant impact on quality of life. As a result, assessment of executive functioning is often relevant for the development of formulations and recommendations in pediatric neuropsychology clinical work. Questionnaire-based measures of executive functioning behaviors in everyday life have unique advantages and complement traditional neuropsychological measures of executive functioning. Two case studies of children with spina bifida are presented to illustrate the clinical use of a new questionnaire measure of executive and learning-related functioning, the Learning, Executive, and Attention Functioning Scale (LEAF). The LEAF emphasizes clinical utility in assessment by incorporating four characteristics: brevity in administration, breadth of additional relevant content, efficiency of scoring and interpretation, and ease of availability for use. LEAF results were consistent with another executive functioning checklist in documenting everyday behavior problems related to working memory, planning, and organization while offering additional breadth of assessment of domains such as attention, processing speed, and novel problem-solving. These case study results demonstrate the clinical utility of questionnaire-based measurement of executive functioning in pediatric neuropsychology and provide a new measure for accomplishing this goal.

  17. Regression-Based Estimates of Observed Functional Status in Centenarians

    PubMed Central

    Mitchell, Meghan B.; Miller, L. Stephen; Woodard, John L.; Davey, Adam; Martin, Peter; Burgess, Molly; Poon, Leonard W.

    2011-01-01

    Purpose of the Study: There is lack of consensus on the best method of functional assessment, and there is a paucity of studies on daily functioning in centenarians. We sought to compare associations between performance-based, self-report, and proxy report of functional status in centenarians. We expected the strongest relationships between proxy reports and observed performance of basic activities of daily living (BADLs) and instrumental activities of daily living (IADLs). We hypothesized that the discrepancy between self-report and observed daily functioning would be modified by cognitive status. We additionally sought to provide clinicians with estimates of centenarians’ observed daily functioning based on their mental status in combination with subjective measures of activities of daily living (ADLs). Design and Methods: Two hundred and forty-four centenarians from the Georgia Centenarian Study were included in this cross-sectional population-based study. Measures included the Direct Assessment of Functional Status, self-report and proxy report of functional status, and the Mini-Mental State Examination (MMSE). Results: Associations between observed and proxy reports were stronger than between observed and self-report across BADL and IADL measures. A significant MMSE by type of report interaction was found, indicating that lower MMSE performance is associated with a greater discrepancy between subjective and objective ADL measures. Implications: Results demonstrate associations between 3 methods of assessing functional status and suggest proxy reports are generally more accurate than self-report measures. Cognitive status accounted for some of the discrepancy between observed and self-reports, and we provide clinicians with tables to estimate centenarians’ performance on observed functional measures based on MMSE and subjective report of functional status. PMID:20974657

  18. Transposon based functional characterization of soybean genes

    USDA-ARS?s Scientific Manuscript database

    Type II transposable elements that use cut and paste mechanism for jumping from one genomic region to another is ideal in tagging and cloning genes. Precise excision from an insertion site in a mutant gene leads to regaining the wild-type function. Thus, function of a gene can be established based o...

  19. The Effect of the Extinction Procedure in Function-Based Intervention

    ERIC Educational Resources Information Center

    Janney, Donna M.; Umbreit, John; Ferro, Jolenea B.; Liaupsin, Carl J.; Lane, Kathleen L.

    2013-01-01

    In this study, we examined the contribution of the extinction procedure in function-based interventions implemented in the general education classrooms of three at-risk elementary-aged students. Function-based interventions included antecedent adjustments, reinforcement procedures, and function-matched extinction procedures. Using a combined ABC…

  20. [Standardization of the terms for Chinese herbal functions based on functional targeting].

    PubMed

    Xiao, Bin; Tao, Ou; Gu, Hao; Wang, Yun; Qiao, Yan-Jiang

    2011-03-01

    Functional analysis concisely summarizes and concentrates on the therapeutic characteristics and features of Chinese herbal medicine. Standardization of the terms for Chinese herbal functions not only plays a key role in modern research and development of Chinese herbal medicine, but also has far-reaching clinical applications. In this paper, a new method for standardizing the terms for Chinese herbal function was proposed. Firstly, functional targets were collected. Secondly, the pathological conditions and the mode of action of every functional target were determined by analyzing the references. Thirdly, the relationships between the pathological condition and the mode of action were determined based on Chinese medicine theory and data. This three-step approach allows for standardization of the terms for Chinese herbal functions. Promoting the standardization of Chinese medicine terms will benefit the overall clinical application of Chinese herbal medicine.

  1. Molecular dissection of botulinum neurotoxin reveals interdomain chaperone function.

    PubMed

    Fischer, Audrey; Montal, Mauricio

    2013-12-01

    Clostridium botulinum neurotoxin (BoNT) is a multi-domain protein made up of the approximately 100 kDa heavy chain (HC) and the approximately 50 kDa light chain (LC). The HC can be further subdivided into two halves: the N-terminal translocation domain (TD) and the C-terminal Receptor Binding Domain (RBD). We have investigated the minimal requirements for channel activity and LC translocation. We utilize a cellular protection assay and a single channel/single molecule LC translocation assay to characterize in real time the channel and chaperone activities of BoNT/A truncation constructs in Neuro 2A cells. The unstructured, elongated belt region of the TD is demonstrated to be dispensable for channel activity, although may be required for productive LC translocation. We show that the RBD is not necessary for channel activity or LC translocation, however it dictates the pH threshold of channel insertion into the membrane. These findings indicate that each domain functions as a chaperone for the others in addition to their individual functions, working in concert to achieve productive intoxication. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Resting-State Functional Connectivity-Based Biomarkers and Functional MRI-Based Neurofeedback for Psychiatric Disorders: A Challenge for Developing Theranostic Biomarkers.

    PubMed

    Yamada, Takashi; Hashimoto, Ryu-Ichiro; Yahata, Noriaki; Ichikawa, Naho; Yoshihara, Yujiro; Okamoto, Yasumasa; Kato, Nobumasa; Takahashi, Hidehiko; Kawato, Mitsuo

    2017-10-01

    Psychiatric research has been hampered by an explanatory gap between psychiatric symptoms and their neural underpinnings, which has resulted in poor treatment outcomes. This situation has prompted us to shift from symptom-based diagnosis to data-driven diagnosis, aiming to redefine psychiatric disorders as disorders of neural circuitry. Promising candidates for data-driven diagnosis include resting-state functional connectivity MRI (rs-fcMRI)-based biomarkers. Although biomarkers have been developed with the aim of diagnosing patients and predicting the efficacy of therapy, the focus has shifted to the identification of biomarkers that represent therapeutic targets, which would allow for more personalized treatment approaches. This type of biomarker (i.e., "theranostic biomarker") is expected to elucidate the disease mechanism of psychiatric conditions and to offer an individualized neural circuit-based therapeutic target based on the neural cause of a condition. To this end, researchers have developed rs-fcMRI-based biomarkers and investigated a causal relationship between potential biomarkers and disease-specific behavior using functional MRI (fMRI)-based neurofeedback on functional connectivity. In this review, we introduce a recent approach for creating a theranostic biomarker, which consists mainly of 2 parts: (1) developing an rs-fcMRI-based biomarker that can predict diagnosis and/or symptoms with high accuracy, and (2) the introduction of a proof-of-concept study investigating the relationship between normalizing the biomarker and symptom changes using fMRI-based neurofeedback. In parallel with the introduction of recent studies, we review rs-fcMRI-based biomarker and fMRI-based neurofeedback, focusing on the technological improvements and limitations associated with clinical use. © The Author 2017. Published by Oxford University Press on behalf of CINP.

  3. Resting-State Functional Connectivity-Based Biomarkers and Functional MRI-Based Neurofeedback for Psychiatric Disorders: A Challenge for Developing Theranostic Biomarkers

    PubMed Central

    Yamada, Takashi; Hashimoto, Ryu-ichiro; Yahata, Noriaki; Ichikawa, Naho; Yoshihara, Yujiro; Okamoto, Yasumasa; Kato, Nobumasa; Takahashi, Hidehiko

    2017-01-01

    Psychiatric research has been hampered by an explanatory gap between psychiatric symptoms and their neural underpinnings, which has resulted in poor treatment outcomes. This situation has prompted us to shift from symptom-based diagnosis to data-driven diagnosis, aiming to redefine psychiatric disorders as disorders of neural circuitry. Promising candidates for data-driven diagnosis include resting-state functional connectivity MRI (rs-fcMRI)-based biomarkers. Although biomarkers have been developed with the aim of diagnosing patients and predicting the efficacy of therapy, the focus has shifted to the identification of biomarkers that represent therapeutic targets, which would allow for more personalized treatment approaches. This type of biomarker (i.e., “theranostic biomarker”) is expected to elucidate the disease mechanism of psychiatric conditions and to offer an individualized neural circuit-based therapeutic target based on the neural cause of a condition. To this end, researchers have developed rs-fcMRI-based biomarkers and investigated a causal relationship between potential biomarkers and disease-specific behavior using functional MRI (fMRI)-based neurofeedback on functional connectivity. In this review, we introduce a recent approach for creating a theranostic biomarker, which consists mainly of 2 parts: (1) developing an rs-fcMRI-based biomarker that can predict diagnosis and/or symptoms with high accuracy, and (2) the introduction of a proof-of-concept study investigating the relationship between normalizing the biomarker and symptom changes using fMRI-based neurofeedback. In parallel with the introduction of recent studies, we review rs-fcMRI-based biomarker and fMRI-based neurofeedback, focusing on the technological improvements and limitations associated with clinical use. PMID:28977523

  4. Atlas-based functional radiosurgery: Early results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stancanello, J.; Romanelli, P.; Pantelis, E.

    2009-02-15

    Functional disorders of the brain, such as dystonia and neuropathic pain, may respond poorly to medical therapy. Deep brain stimulation (DBS) of the globus pallidus pars interna (GPi) and the centromedian nucleus of the thalamus (CMN) may alleviate dystonia and neuropathic pain, respectively. A noninvasive alternative to DBS is radiosurgical ablation [internal pallidotomy (IP) and medial thalamotomy (MT)]. The main technical limitation of radiosurgery is that targets are selected only on the basis of MRI anatomy, without electrophysiological confirmation. This means that, to be feasible, image-based targeting must be highly accurate and reproducible. Here, we report on the feasibility of an atlas-based approach to targeting for functional radiosurgery. In this method, masks of the GPi, CMN, and medio-dorsal nucleus were nonrigidly registered to patients' T1-weighted MRI (T1w-MRI) and superimposed on patients' T2-weighted MRI (T2w-MRI). Radiosurgical targets were identified on the T2w-MRI registered to the planning CT by an expert functional neurosurgeon. To assess its feasibility, two patients were treated with the CyberKnife using this method of targeting; a patient with dystonia received an IP (120 Gy prescribed to the 65% isodose) and a patient with neuropathic pain received a MT (120 Gy to the 77% isodose). Six months after treatment, T2w-MRIs and contrast-enhanced T1w-MRIs showed edematous regions around the lesions; target placements were reevaluated by DW-MRIs. At 12 months post-treatment, steroids for radiation-induced edema and medications for dystonia and neuropathic pain were suppressed. Both patients experienced significant relief from pain and dystonia-related problems. Fifteen months after treatment, edema had disappeared. Thus, this work shows promising feasibility of atlas-based functional radiosurgery to improve patient condition. Further investigations are indicated for optimizing treatment dose.

  5. Functionalized graphene hydrogel-based high-performance supercapacitors.

    PubMed

    Xu, Yuxi; Lin, Zhaoyang; Huang, Xiaoqing; Wang, Yang; Huang, Yu; Duan, Xiangfeng

    2013-10-25

    Functionalized graphene hydrogels are prepared by a one-step low-temperature reduction process and exhibit ultrahigh specific capacitances and excellent cycling stability in the aqueous electrolyte. Flexible solid-state supercapacitors based on functionalized graphene hydrogels are demonstrated with superior capacitive performances and extraordinary mechanical flexibility. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Improving the Rainbow Attack by Reusing Colours

    NASA Astrophysics Data System (ADS)

    Ågren, Martin; Johansson, Thomas; Hell, Martin

    Hashing or encrypting a key or a password is a vital part of most network security protocols. The most practical generic attack on such schemes is a time-memory trade-off attack. Such an attack inverts any one-way function using a trade-off between memory and execution time. Existing techniques include the Hellman attack and the rainbow attack, where the latter uses different reduction functions ("colours") within a table.
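
    The defining feature of a rainbow table is the step-dependent reduction function, the "colour": each chain column maps digests back into the password space with a different reduction, so merging chains become far less likely than in Hellman tables. A toy sketch of chain construction (the alphabet, chain length, and reduction are illustrative choices, not from the paper):

    ```python
    import hashlib

    def H(pw: str) -> bytes:
        # The one-way function to invert (SHA-256 here as an example).
        return hashlib.sha256(pw.encode()).digest()

    def R(step: int, digest: bytes, length: int = 4) -> str:
        # Step-dependent reduction ("colour"): fold the digest, offset by
        # the column index, back into the 4-letter lowercase password space.
        n = int.from_bytes(digest, "big") + step
        alphabet = "abcdefghijklmnopqrstuvwxyz"
        return "".join(alphabet[(n >> (5 * i)) % 26] for i in range(length))

    def chain_end(start: str, t: int) -> str:
        # Walk t hash/reduce steps; only (start, end) pairs are stored
        # in the table, trading recomputation time for memory.
        pw = start
        for step in range(t):
            pw = R(step, H(pw))
        return pw

    print(chain_end("aaaa", 100))
    ```

    Lookup then reruns the chain from a candidate digest under each possible colour suffix and checks the resulting endpoints against the stored table.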

  7. Functional outcomes in community-based adults with borderline personality disorder.

    PubMed

    Javaras, Kristin N; Zanarini, Mary C; Hudson, James I; Greenfield, Shelly F; Gunderson, John G

    2017-06-01

    Many individuals in clinical samples with borderline personality disorder (BPD) experience high levels of functional impairment. However, little is known about the levels of functional impairment experienced by individuals with BPD in the general community. To address this issue, we compared overall and domain-specific (educational/occupational; social; recreational) functioning in a sample of community-based individuals with BPD (n = 164); community-based individuals without BPD (n = 901); and clinically-ascertained individuals with BPD (n = 61). BPD diagnoses and functional outcomes were based on well-accepted, semi-structured interviews. Community-based individuals with BPD were significantly less likely to experience good overall functioning (steady, consistent employment and ≥1 good relationship) compared to community-based individuals without BPD (BPD: 47.4%; Non-BPD: 74.5%; risk difference -27.1%; p < 0.001), even when compared directly to their own non-BPD siblings (risk difference -35.5%; p < 0.001). Community-based individuals with BPD versus those without BPD did not differ significantly on most domain-specific outcomes, but the former group experienced poorer educational/occupational performance and lower quality relationships with parents, partners, and friends. However, community-based individuals with BPD were significantly more likely to experience good overall functioning than clinically-based individuals with BPD (risk difference -35.2%; p < 0.001), with the latter group more likely to experience reduced employment status, very poor quality relationships with partners, and social isolation. In conclusion, community-based individuals with BPD experienced marked functional impairment, especially in the social domain, but were less likely to experience the more extreme occupational and social impairments seen among patients with BPD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Functional identity and diversity of animals predict ecosystem functioning better than species-based indices

    PubMed Central

    Gagic, Vesna; Bartomeus, Ignasi; Jonsson, Tomas; Taylor, Astrid; Winqvist, Camilla; Fischer, Christina; Slade, Eleanor M.; Steffan-Dewenter, Ingolf; Emmerson, Mark; Potts, Simon G.; Tscharntke, Teja; Weisser, Wolfgang; Bommarco, Riccardo

    2015-01-01

    Drastic biodiversity declines have raised concerns about the deterioration of ecosystem functions and have motivated much recent research on the relationship between species diversity and ecosystem functioning. A functional trait framework has been proposed to improve the mechanistic understanding of this relationship, but this has rarely been tested for organisms other than plants. We analysed eight datasets, including five animal groups, to examine how well a trait-based approach, compared with a more traditional taxonomic approach, predicts seven ecosystem functions below- and above-ground. Trait-based indices consistently provided greater explanatory power than species richness or abundance. The frequency distributions of single or multiple traits in the community were the best predictors of ecosystem functioning. This implies that the ecosystem functions we investigated were underpinned by the combination of trait identities (i.e. single-trait indices) and trait complementarity (i.e. multi-trait indices) in the communities. Our study provides new insights into the general mechanisms that link biodiversity to ecosystem functioning in natural animal communities and suggests that the observed responses were due to the identity and dominance patterns of the trait composition rather than the number or abundance of species per se. PMID:25567651

  9. Training Residential Staff to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  10. On-Board Event-Based State Estimation for Trajectory Approaching and Tracking of a Vehicle

    PubMed Central

    Martínez-Rey, Miguel; Espinosa, Felipe; Gardel, Alfredo; Santos, Carlos

    2015-01-01

    For the problem of pose estimation of an autonomous vehicle using networked external sensors, the processing capacity and battery consumption of these sensors, as well as the communication channel load should be optimized. Here, we report an event-based state estimator (EBSE) consisting of an unscented Kalman filter that uses a triggering mechanism based on the estimation error covariance matrix to request measurements from the external sensors. This EBSE generates the events of the estimator module on-board the vehicle and, thus, allows the sensors to remain in stand-by mode until an event is generated. The proposed algorithm requests a measurement every time the estimation distance root mean squared error (DRMS) value, obtained from the estimator's covariance matrix, exceeds a threshold value. This triggering threshold can be adapted to the vehicle's working conditions rendering the estimator even more efficient. An example of the use of the proposed EBSE is given, where the autonomous vehicle must approach and follow a reference trajectory. By making the threshold a function of the distance to the reference location, the estimator can halve the use of the sensors with a negligible deterioration in the performance of the approaching maneuver. PMID:26102489
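    The covariance-based trigger described in this record can be sketched in a few lines of Python; the 2x2 position covariance, threshold and gain values below are illustrative assumptions, not parameters from the paper.

```python
import math

def drms(P):
    """DRMS from the position block of a covariance matrix P
    (here a 2x2 list of lists for the x/y position states)."""
    return math.sqrt(P[0][0] + P[1][1])

def should_request_measurement(P, dist_to_reference, base_threshold=0.5, gain=0.1):
    """Event trigger: ask the external sensors for a measurement only
    when the estimation DRMS exceeds a threshold that grows with the
    distance to the reference location, so estimation accuracy is spent
    where the maneuver needs it. Threshold and gain are illustrative."""
    threshold = base_threshold + gain * dist_to_reference
    return drms(P) > threshold

P = [[0.4, 0.0], [0.0, 0.5]]  # position covariance, DRMS ~ 0.95
print(should_request_measurement(P, dist_to_reference=1.0))   # near the target: request
print(should_request_measurement(P, dist_to_reference=10.0))  # far away: stay in stand-by
```

    Making the threshold distance-dependent is what lets the sensors idle during the long approach phase while the estimator stays accurate close to the reference trajectory.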

  11. Network-based function prediction and interactomics: the case for metabolic enzymes.

    PubMed

    Janga, S C; Díaz-Mejía, J Javier; Moreno-Hagelsieb, G

    2011-01-01

    As sequencing technologies increase in power, determining the functions of unknown proteins encoded by the DNA sequences so produced becomes a major challenge. Functional annotation is commonly done on the basis of amino-acid sequence similarity alone. Long after sequence similarity becomes undetectable by pair-wise comparison, profile-based identification of homologs can often succeed due to the conservation of position-specific patterns, important for a protein's three-dimensional folding and function. Nevertheless, prediction of protein function from homology-driven approaches is not without problems. Homologous proteins might evolve different functions and the power of homology detection has already started to reach its maximum. Computational methods for inferring protein function, which exploit the context of a protein in cellular networks, have come to be built on top of homology-based approaches. These network-based functional inference techniques provide a first-hand hint of a protein's functional role and offer insights complementary to traditional methods for understanding the function of uncharacterized proteins. Most recent network-based approaches aim to integrate diverse kinds of functional interactions to boost both coverage and confidence level. These techniques not only promise to solve the moonlighting aspect of proteins by annotating proteins with multiple functions, but also increase our understanding of the interplay between different functional classes in a cell. In this article we review the state of the art in network-based function prediction and describe some of the underlying difficulties and successes. Given the volume of high-throughput data that is being reported, the time is ripe to employ these network-based approaches, which can be used to unravel the functions of the uncharacterized proteins accumulating in the genomic databases. © 2010 Elsevier Inc. All rights reserved.

  12. Optimal inverse functions created via population-based optimization.

    PubMed

    Jennings, Alan L; Ordóñez, Raúl

    2014-06-01

    Finding optimal inputs for a multiple-input, single-output system is taxing for a system operator. Population-based optimization is used to create sets of functions that produce a locally optimal input based on a desired output. An operator or higher level planner could use one of the functions in real time. For the optimization, each agent in the population uses the cost and output gradients to take steps lowering the cost while maintaining their current output. When an agent reaches an optimal input for its current output, additional agents are generated in the output gradient directions. The new agents then settle to the local optima for the new output values. The set of associated optimal points forms an inverse function, via spline interpolation, from a desired output to an optimal input. In this manner, multiple locally optimal functions can be created. These functions are naturally clustered in input and output spaces allowing for a continuous inverse function. The operator selects the best cluster over the anticipated range of desired outputs and adjusts the set point (desired output) while maintaining optimality. This reduces the demand from controlling multiple inputs, to controlling a single set point with no loss in performance. Results are demonstrated on a sample set of functions and on a robot control problem.
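    A toy one-dimensional version of this idea can be sketched as follows; the two-input plant, quadratic cost and its closed-form optimum are invented for illustration, and plain piecewise-linear interpolation stands in for the spline interpolation described above.

```python
def output(u1, u2):        # plant output (illustrative)
    return u1 + u2

def cost(u1, u2):          # control effort to minimize at a fixed output
    return u1**2 + 2*u2**2

# For each desired output y, the cheapest inputs on the constraint
# u1 + u2 = y have the closed form u1 = 2y/3, u2 = y/3 (in the paper,
# a population of agents finds these optima numerically via gradients).
samples = [(y, (2*y/3, y/3)) for y in range(0, 11)]

def inverse(y_desired):
    """Map a desired output to locally optimal inputs by interpolating
    between the sampled optima."""
    for (y0, ua), (y1, ub) in zip(samples, samples[1:]):
        if y0 <= y_desired <= y1:
            t = (y_desired - y0) / (y1 - y0)
            return tuple(a + t * (b - a) for a, b in zip(ua, ub))
    raise ValueError("desired output outside the sampled range")

u = inverse(4.5)                      # adjust one set point, get all inputs
assert abs(output(*u) - 4.5) < 1e-9   # the set point is met exactly
```

    The operator now controls the single scalar set point while the precomputed inverse function supplies all plant inputs, which is the reduction in demand the abstract describes.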

  13. Quantum Authencryption with Two-Photon Entangled States for Off-Line Communicants

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Yu

    2016-02-01

    In this paper, a quantum authencryption protocol is proposed by using the two-photon entangled states as the quantum resource. Two communicants Alice and Bob share two private keys in advance, which determine the generation of two-photon entangled states. The sender Alice sends the two-photon entangled state sequence encoded with her classical bits to the receiver Bob in the manner of one-step quantum transmission. Upon receiving the encoded quantum state sequence, Bob decodes Alice's classical bits with the two-photon joint measurements and authenticates the integrity of Alice's secret with the help of a one-way hash function. The proposed protocol only uses the one-step quantum transmission and needs neither a public discussion nor a trusted third party. As a result, the proposed protocol can be adapted to the case where the receiver is off-line, such as the quantum E-mail systems. Moreover, the proposed protocol provides the message authentication to one bit level with the help of a one-way hash function and has an information-theoretical efficiency equal to 100 %.

  14. A strategy to load balancing for non-connectivity MapReduce job

    NASA Astrophysics Data System (ADS)

    Zhou, Huaping; Liu, Guangzong; Gui, Haixia

    2017-09-01

    MapReduce has been widely used in large scale and complex datasets as a kind of distributed programming model. The original hash partitioning function in MapReduce often results in data skew when the data distribution is uneven. To solve this imbalance of data partitioning, we propose a strategy that changes the remaining partitioning index when data are skewed. In the Map phase, we count the amount of data that will be distributed to each reducer; the JobTracker then monitors the global partitioning information and dynamically modifies the original partitioning function according to the data skew model, so the Partitioner can redirect the partitions that would cause data skew to other reducers with less load in the next partitioning process, eventually balancing the load of each node. Finally, we experimentally compare our method with existing methods on both synthetic and real datasets; the results show our strategy solves the problem of data skew with better stability and efficiency than the Hash and Sampling methods for non-connectivity MapReduce tasks.
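    The repartitioning idea can be illustrated with a much-simplified, single-process sketch; as assumptions, the map-side counting and JobTracker coordination collapse into one function, and keys of overloaded hash partitions are redirected to the least-loaded reducer.

```python
from collections import Counter

def build_partitioner(keys, num_reducers):
    """Count per-key load (the Map-phase counting step), then assign
    keys to reducers: keep the original hash placement unless it is
    more loaded than the least-loaded reducer, in which case redirect.
    This is an illustrative sketch, not the paper's exact algorithm."""
    counts = Counter(keys)
    load = [0] * num_reducers
    assignment = {}
    # place the heaviest keys first, since they dominate the skew
    for key, n in counts.most_common():
        default = hash(key) % num_reducers     # original Hash partitioning
        target = min(range(num_reducers), key=lambda r: load[r])
        if load[default] <= load[target]:      # default reducer is fine
            target = default
        assignment[key] = target
        load[target] += n
    return assignment, load

# One hot key plus two small ones: the small keys end up on the
# reducer that did not receive the hot key.
assignment, load = build_partitioner(['a'] * 100 + ['b'] * 10 + ['c'] * 10, 2)
print(sorted(load))  # [20, 100]
```

    With plain hash partitioning all three keys could land on one reducer; the redirect step guarantees the remaining keys fill the idle reducer instead.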

  15. Pre-hospital electrocardiogram triage with telemedicine near halves time to treatment in STEMI: A meta-analysis and meta-regression analysis of non-randomized studies.

    PubMed

    Brunetti, Natale Daniele; De Gennaro, Luisa; Correale, Michele; Santoro, Francesco; Caldarola, Pasquale; Gaglione, Antonio; Di Biase, Matteo

    2017-04-01

    A shorter time to treatment has been shown to be associated with lower mortality rates in acute myocardial infarction (AMI). Several strategies have been adopted with the aim to reduce any delay in diagnosis of AMI: pre-hospital triage with telemedicine is one of such strategies. We therefore aimed to measure the real effect of pre-hospital triage with telemedicine in case of AMI in a meta-analysis study. We performed a meta-analysis of non-randomized studies with the aim to quantify the exact reduction of time to treatment achieved by pre-hospital triage with telemedicine. Data were pooled and compared by relative time reduction and 95% C.I.s. A meta-regression analysis was performed in order to find possible predictors of shorter time to treatment. Eleven studies were selected and finally evaluated in the study. The overall relative reduction of time to treatment with pre-hospital triage and telemedicine was -38/-40% (p<0.001). Absolute time reduction was significantly correlated to time to treatment in the control groups (p<0.001), while relative time reduction was independent. A non-significant trend toward shorter relative time reductions was observed over years. Pre-hospital triage with telemedicine is associated with a near halved time to treatment in AMI. The benefit is larger in terms of absolute time to treatment reduction in populations with larger delays to treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. An effective and secure key-management scheme for hierarchical access control in E-medicine system.

    PubMed

    Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit

    2013-04-01

    Recently, several hierarchical access control schemes have been proposed in the literature to provide security of e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from large storage space for public parameters in the public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. In order to remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs huge computational cost for providing verification of public information in the public domain, as it uses an ECC digital signature, which is costly compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in a user hierarchy which is based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both public and private domains, and the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against various attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are also solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
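    The one-way hash function at the heart of such hierarchical schemes can be illustrated with a generic key-derivation sketch; this is the common hash-chain idea, not the specific construction proposed in this record, and all identifiers are hypothetical.

```python
import hashlib

def derive_key(parent_key: bytes, child_id: str) -> bytes:
    """Each class key is a one-way hash of its parent's key and the
    child identifier: an ancestor can recompute every descendant key,
    but a descendant cannot invert the hash to climb the hierarchy."""
    return hashlib.sha256(parent_key + child_id.encode()).digest()

# Hypothetical three-level e-medicine hierarchy.
root = b"hospital-root-secret"
cardiology = derive_key(root, "cardiology")
ward_3 = derive_key(cardiology, "ward-3")

# The root holder re-derives the ward key on demand; ward staff cannot
# recover the cardiology or root keys from theirs.
assert derive_key(derive_key(root, "cardiology"), "ward-3") == ward_3
```

    Because no per-pair keys need to be published, storage stays small, which is the kind of saving over ECC-signature-based public parameters the abstract claims.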

  17. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    PubMed

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purpose of this study was to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  18. Functional identity and diversity of animals predict ecosystem functioning better than species-based indices.

    PubMed

    Gagic, Vesna; Bartomeus, Ignasi; Jonsson, Tomas; Taylor, Astrid; Winqvist, Camilla; Fischer, Christina; Slade, Eleanor M; Steffan-Dewenter, Ingolf; Emmerson, Mark; Potts, Simon G; Tscharntke, Teja; Weisser, Wolfgang; Bommarco, Riccardo

    2015-02-22

    Drastic biodiversity declines have raised concerns about the deterioration of ecosystem functions and have motivated much recent research on the relationship between species diversity and ecosystem functioning. A functional trait framework has been proposed to improve the mechanistic understanding of this relationship, but this has rarely been tested for organisms other than plants. We analysed eight datasets, including five animal groups, to examine how well a trait-based approach, compared with a more traditional taxonomic approach, predicts seven ecosystem functions below- and above-ground. Trait-based indices consistently provided greater explanatory power than species richness or abundance. The frequency distributions of single or multiple traits in the community were the best predictors of ecosystem functioning. This implies that the ecosystem functions we investigated were underpinned by the combination of trait identities (i.e. single-trait indices) and trait complementarity (i.e. multi-trait indices) in the communities. Our study provides new insights into the general mechanisms that link biodiversity to ecosystem functioning in natural animal communities and suggests that the observed responses were due to the identity and dominance patterns of the trait composition rather than the number or abundance of species per se. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Modeling and Simulation of the Economics of Mining in the Bitcoin Market.

    PubMed

    Cocco, Luisanna; Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of the Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real-time price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network.
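    The increasing difficulty driving this hardware arms race comes from Bitcoin's proof-of-work; the following is a deliberately simplified sketch, using a single SHA-256 and a hex-zero prefix target where real mining double-hashes an 80-byte header against a 256-bit target.

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash of (block_data + nonce) starts
    with `difficulty` hex zeros; each extra zero multiplies the
    expected work by 16, which is why hardware had to keep evolving."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine(b"block header", 3)  # ~4096 hashes expected on average
```

    A miner's revenue is proportional to its share of this brute-force hash rate, which is what ties the model's hashing capability, power consumption, and hardware expenditure together.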

  20. Modeling and Simulation of the Economics of Mining in the Bitcoin Market

    PubMed Central

    Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the “Bitcoin Blockchain”, creating the first block of the chain by hashing on his computer’s central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of the Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some “stylized facts” found in real-time price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network. PMID:27768691

  1. The Extent to Which Collaborative Teams of Educators Link the Results of Functional Assessment to Function-Based Interventions

    ERIC Educational Resources Information Center

    de Courcy-Bower, Laurie

    2010-01-01

    A promising approach to addressing challenging behavior in schools is to develop and implement "function-based interventions" (Dunlap et al., 2006; Hanley, Iwata, & McCord, 2003). Function-based interventions are individualized interventions in which five key outcomes of functional assessment (i.e., identification of challenging behavior,…

  2. Dissociations between behavioural and functional magnetic resonance imaging-based evaluations of cognitive function after brain injury

    PubMed Central

    Bardin, Jonathan C.; Fins, Joseph J.; Katz, Douglas I.; Hersh, Jennifer; Heier, Linda A.; Tabelow, Karsten; Dyke, Jonathan P.; Ballon, Douglas J.; Schiff, Nicholas D.

    2011-01-01

    Functional neuroimaging methods hold promise for the identification of cognitive function and communication capacity in some severely brain-injured patients who may not retain sufficient motor function to demonstrate their abilities. We studied seven severely brain-injured patients and a control group of 14 subjects using a novel hierarchical functional magnetic resonance imaging assessment utilizing mental imagery responses. Whereas the control group showed consistent and accurate (for communication) blood-oxygen-level-dependent responses without exception, the brain-injured subjects showed a wide variation in the correlation of blood-oxygen-level-dependent responses and overt behavioural responses. Specifically, the brain-injured subjects dissociated bedside and functional magnetic resonance imaging-based command following and communication capabilities. These observations reveal significant challenges in developing validated functional magnetic resonance imaging-based methods for clinical use and raise interesting questions about underlying brain function assayed using these methods in brain-injured subjects. PMID:21354974

  3. Using the Hill Cipher to Teach Cryptographic Principles

    ERIC Educational Resources Information Center

    McAndrew, Alasdair

    2008-01-01

    The Hill cipher is the simplest example of a "block cipher," which takes a block of plaintext as input, and returns a block of ciphertext as output. Although it is insecure by modern standards, its simplicity means that it is well suited for the teaching of such concepts as encryption modes, and properties of cryptographic hash functions. Although…
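    A minimal, encryption-only sketch of the Hill cipher with the standard textbook 2x2 key illustrates the block-cipher idea (decryption would multiply by the key's inverse modulo 26):

```python
KEY = [[3, 3], [2, 5]]  # textbook key; det = 9 is coprime with 26, so invertible

def hill_encrypt(plaintext: str, key=KEY) -> str:
    """Encrypt letter pairs as vectors multiplied by the key matrix mod 26."""
    nums = [ord(c) - ord('A') for c in plaintext.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))  # pad an odd-length message
    out = []
    for i in range(0, len(nums), 2):
        p = nums[i:i + 2]
        out.append((key[0][0] * p[0] + key[0][1] * p[1]) % 26)
        out.append((key[1][0] * p[0] + key[1][1] * p[1]) % 26)
    return "".join(chr(n + ord('A')) for n in out)

print(hill_encrypt("HELP"))  # -> HIAT
```

    Identical plaintext blocks always encrypt to identical ciphertext blocks here, which is exactly the weakness that motivates the encryption-mode and hashing discussions the record mentions.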

  4. Network-Aware Mechanisms for Tolerating Byzantine Failures in Distributed Systems

    DTIC Science & Technology

    2012-01-01

    In Digest, we use SHA-256 as the collision-resistant hash function. We use the realization of SHA-256 from the OpenSSL toolkit (http://www.openssl.org/).

  5. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point that autonomous systems can face some uncertain and faulty situations with success. The next step toward more autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. After faults occur, given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  6. Interactions between colour and synaesthetic colour: an effect of simultaneous colour contrast on synaesthetic colours.

    PubMed

    Nijboer, Tanja C W; Gebuis, Titia; te Pas, Susan F; van der Smagt, Maarten J

    2011-01-01

    We investigated whether simultaneous colour contrast affects the synaesthetic colour experience and normal colour percept in a similar manner. We simultaneously presented a target stimulus (i.e. grapheme) and a reference stimulus (i.e. hash). Either the grapheme or the hash was presented on a saturated background of the same or opposite colour category as the synaesthetic colour and the other stimulus on a grey background. In both conditions, grapheme-colour synaesthetes were asked to colour the hash in a colour similar to the synaesthetic colour of the grapheme. Controls that were pair-matched to the synaesthetes performed the same experiment, but for them, the grapheme was presented in the colour induced by the grapheme in synaesthetes. When graphemes were presented on a grey and the hash on a coloured background, a traditional simultaneous colour-contrast effect was found for controls as well as synaesthetes. When graphemes were presented on colour and the hash on grey, the controls again showed a traditional simultaneous colour-contrast effect, whereas the synaesthetes showed the opposite effect. Our results show that synaesthetic colour experiences differ from normal colour perception; both are susceptible to different surrounding colours, but not in a comparable manner. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Evidence-based Assessment of Cognitive Functioning in Pediatric Psychology

    PubMed Central

    Brown, Ronald T.; Cavanagh, Sarah E.; Vess, Sarah F.; Segall, Mathew J.

    2008-01-01

    Objective To review the evidence base for measures of cognitive functioning frequently used within the field of pediatric psychology. Methods From a list of 47 measures identified by the Society of Pediatric Psychology (Division 54) Evidence-Based Assessment Task Force Workgroup, 27 measures were included in the review. Measures were organized, reviewed, and evaluated according to general domains of functioning (e.g., attention/executive functioning, memory). Results Twenty-two of 27 measures reviewed demonstrated psychometric properties that met “Well-established” criteria as set forth by the Assessment Task Force. Psychometric properties were strongest for measures of general cognitive ability and weakest for measures of visual-motor functioning and attention. Conclusions We report use of “Well-established” measures of overall cognitive functioning, nonverbal intelligence, academic achievement, language, and memory and learning. For several specific tests in the domains of visual-motor functioning and attention, additional psychometric data are needed for measures to meet criteria as “Well established.” PMID:18194973

  8. Modeling of Glycerol-3-Phosphate Transporter Suggests a Potential ‘Tilt’ Mechanism involved in its Function

    PubMed Central

    Tsigelny, Igor F.; Greenberg, Jerry; Kouznetsova, Valentina; Nigam, Sanjay K.

    2009-01-01

    Many major facilitator superfamily (MFS) transporters have similar 12-transmembrane α-helical topologies with two six-helix halves connected by a long loop. In humans, these transporters participate in key physiological processes and are also, as in the case of members of the organic anion transporter (OAT) family, of pharmaceutical interest. Recently, crystal structures of two bacterial representatives of the MFS family — the glycerol-3-phosphate transporter (GlpT) and lac-permease (LacY) — have been solved and, because of assumptions regarding the high structural conservation of this family, there is hope that the results can be applied to mammalian transporters as well. Based on crystallography, it has been suggested that a major conformational “switching” mechanism accounts for ligand transport by MFS proteins. This conformational switch would then allow periodic changes in the overall transporter configuration, resulting in its cyclic opening to the periplasm or cytoplasm. Following this lead, we have modeled a possible “switch” mechanism in GlpT, using the concept of rotation of protein domains as in the DynDom program [17] and membranephilic constraints predicted by the MAPAS program [23]. We found that the minima of energies of intersubunit interactions support two alternate positions consistent with their transport properties. Thus, for GlpT, a “tilt” of 9°–10° rotation had the most favorable energetics of electrostatic interaction between the two halves of the transporter; moreover, this conformation was sufficient to suggest transport of the ligand across the membrane. We conducted steered molecular dynamics simulations of the GlpT-ligand system to explore how glycerol-3-phosphate would be handled by the “tilted” structure, and obtained results generally consistent with experimental mutagenesis data. While biochemical data remain most consistent with a single-site alternating access model, our results raise the possibility that, while

  9. Privacy Protection for Telecare Medicine Information Systems Using a Chaotic Map-Based Three-Factor Authenticated Key Agreement Scheme.

    PubMed

    Zhang, Liping; Zhu, Shaohui; Tang, Shanyu

    2017-03-01

    Telecare medicine information systems (TMIS) provide flexible and convenient e-health care. However, the medical records transmitted in TMIS are exposed to unsecured public networks, so TMIS are more vulnerable to various types of security threats and attacks. To provide privacy protection for TMIS, a secure and efficient authenticated key agreement scheme is urgently needed to protect the sensitive medical data. Recently, Mishra et al. proposed a biometrics-based authenticated key agreement scheme for TMIS by using a hash function and nonces; they claimed that their scheme could eliminate the security weaknesses of Yan et al.'s scheme and provide dynamic identity protection and user anonymity. In this paper, however, we demonstrate that Mishra et al.'s scheme suffers from replay attacks and man-in-the-middle attacks, and fails to provide perfect forward secrecy. To overcome the weaknesses of Mishra et al.'s scheme, we then propose a three-factor authenticated key agreement scheme to enable the patient to enjoy the remote healthcare services via TMIS with privacy protection. Chaotic map-based cryptography is employed in the proposed scheme to achieve a delicate balance of security and performance. Security analysis demonstrates that the proposed scheme resists various attacks and provides several attractive security properties. Performance evaluation shows that the proposed scheme increases efficiency in comparison with other related schemes.

  10. Developing Novel Protein-based Materials using Ultrabithorax: Production, Characterization, and Functionalization

    NASA Astrophysics Data System (ADS)

    Huang, Zhao

    2011-12-01

    Compared to 'conventional' materials made from metal, glass, or ceramics, protein-based materials have unique mechanical properties. Furthermore, the morphology, mechanical properties, and functionality of protein-based materials may be optimized via sequence engineering for use in a variety of applications, including textile materials, biosensors, and tissue engineering scaffolds. The development of recombinant DNA technology has enabled the production and engineering of protein-based materials ex vivo. However, harsh production conditions can compromise the mechanical properties of protein-based materials and diminish their ability to incorporate functional proteins. Developing a new generation of protein-based materials is crucial to (i) improve materials assembly conditions, (ii) create novel mechanical properties, and (iii) expand the capacity to carry functional protein/peptide sequences. This thesis describes development of novel protein-based materials using Ultrabithorax, a member of the Hox family of proteins that regulate developmental pathways in Drosophila melanogaster. The experiments presented (i) establish the conditions required for the assembly of Ubx-based materials, (ii) generate a wide range of Ubx morphologies, (iii) examine the mechanical properties of Ubx fibers, (iv) incorporate protein functions to Ubx-based materials via gene fusion, (v) pattern protein functions within the Ubx materials, and (vi) examine the biocompatibility of Ubx materials in vitro. Ubx-based materials assemble at mild conditions compatible with protein folding and activity, which enables Ubx chimeric materials to retain the function of appended proteins in spatial patterns determined by materials assembly. Ubx-based materials also display mechanical properties comparable to existing protein-based materials and demonstrate good biocompatibility with living cells in vitro. 
Taken together, this research demonstrates the unique features and future potential of novel Ubx-based materials.

  11. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Workers transfer half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, into the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  12. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, is transferred through the portal into the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  13. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, arrives at the portal of the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  14. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, is positioned into the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  15. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, arrives at the portal to the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  16. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, is lifted toward the environmental enclosure, or clean room, at the top of the Delta II launcher at Space Launch Complex 2 on Vandenberg Air Force Base in California. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  17. Protein-protein docking using region-based 3D Zernike descriptors

    PubMed Central

    2009-01-01

    Background Protein-protein interactions are a pivotal component of many biological processes and mediate a variety of functions. Knowing the tertiary structure of a protein complex is therefore essential for understanding the interaction mechanism. However, experimental techniques to solve the structure of the complex are often found to be difficult. To this end, computational protein-protein docking approaches can provide a useful alternative to address this issue. Prediction of docking conformations relies on methods that effectively capture shape features of the participating proteins while giving due consideration to conformational changes that may occur. Results We present a novel protein docking algorithm based on the use of 3D Zernike descriptors as regional features of molecular shape. The key motivation of using these descriptors is their invariance to transformation, in addition to a compact representation of local surface shape characteristics. Docking decoys are generated using geometric hashing, which are then ranked by a scoring function that incorporates a buried surface area and a novel geometric complementarity term based on normals associated with the 3D Zernike shape description. Our docking algorithm was tested on both bound and unbound cases in the ZDOCK benchmark 2.0 dataset. In 74% of the bound docking predictions, our method was able to find a near-native solution (interface C-αRMSD ≤ 2.5 Å) within the top 1000 ranks. For unbound docking, among the 60 complexes for which our algorithm returned at least one hit, 60% of the cases were ranked within the top 2000. Comparison with existing shape-based docking algorithms shows that our method has a better performance than the others in unbound docking while remaining competitive for bound docking cases. Conclusion We show for the first time that the 3D Zernike descriptors are adept in capturing shape complementarity at the protein-protein interface and useful for protein docking prediction

  18. Protein-protein docking using region-based 3D Zernike descriptors.

    PubMed

    Venkatraman, Vishwesh; Yang, Yifeng D; Sael, Lee; Kihara, Daisuke

    2009-12-09

    Protein-protein interactions are a pivotal component of many biological processes and mediate a variety of functions. Knowing the tertiary structure of a protein complex is therefore essential for understanding the interaction mechanism. However, experimental techniques to solve the structure of the complex are often found to be difficult. To this end, computational protein-protein docking approaches can provide a useful alternative to address this issue. Prediction of docking conformations relies on methods that effectively capture shape features of the participating proteins while giving due consideration to conformational changes that may occur. We present a novel protein docking algorithm based on the use of 3D Zernike descriptors as regional features of molecular shape. The key motivation of using these descriptors is their invariance to transformation, in addition to a compact representation of local surface shape characteristics. Docking decoys are generated using geometric hashing, which are then ranked by a scoring function that incorporates a buried surface area and a novel geometric complementarity term based on normals associated with the 3D Zernike shape description. Our docking algorithm was tested on both bound and unbound cases in the ZDOCK benchmark 2.0 dataset. In 74% of the bound docking predictions, our method was able to find a near-native solution (interface C-alphaRMSD < or = 2.5 A) within the top 1000 ranks. For unbound docking, among the 60 complexes for which our algorithm returned at least one hit, 60% of the cases were ranked within the top 2000. Comparison with existing shape-based docking algorithms shows that our method has a better performance than the others in unbound docking while remaining competitive for bound docking cases. We show for the first time that the 3D Zernike descriptors are adept in capturing shape complementarity at the protein-protein interface and useful for protein docking prediction. 
Rigorous benchmark studies
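    The geometric-hashing stage mentioned in both docking records can be illustrated with a toy 2D point-matching sketch. This is a generic illustration of the technique, not the authors' 3D Zernike pipeline; all function names and the quantization step are invented for the example.

```python
from collections import defaultdict
import itertools

def quantize(x, y, step=0.25):
    # Snap invariant coordinates onto a grid so nearby values share a hash key.
    return (round(x / step), round(y / step))

def local_frame(p, q, r):
    # Express point r in the frame defined by basis pair (p, q); the resulting
    # (u, v) coordinates are invariant to translation, rotation, and scale.
    bx, by = (q[0] - p[0], q[1] - p[1])
    norm2 = bx * bx + by * by
    rx, ry = (r[0] - p[0], r[1] - p[1])
    u = (rx * bx + ry * by) / norm2
    v = (-rx * by + ry * bx) / norm2
    return u, v

def build_table(model):
    # Preprocessing: hash every (basis pair, third point) triple of the model.
    table = defaultdict(list)
    for p, q in itertools.permutations(model, 2):
        for r in model:
            if r is p or r is q:
                continue
            table[quantize(*local_frame(p, q, r))].append((p, q))
    return table

def vote(table, scene):
    # Recognition: scene triples that hash to stored keys cast votes for
    # candidate model bases; a high vote count suggests a match.
    votes = defaultdict(int)
    for p, q in itertools.permutations(scene, 2):
        for r in scene:
            if r is p or r is q:
                continue
            for basis in table.get(quantize(*local_frame(p, q, r)), ()):
                votes[basis] += 1
    return max(votes.values()) if votes else 0
```

Because the hashed coordinates are similarity-invariant, a translated or rotated copy of the model still votes for the correct basis, which is what makes the table lookup useful for generating docking decoys.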

  19. Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation

    DTIC Science & Technology

    2015-03-26

    Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of ... correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of “big data” daily. When attempting to process ... open source implementation of Google’s MapReduce programming paradigm [13], which has been used for many different things. Using Apache Hadoop, Yahoo
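    The kernelized LSH in the report operates in a kernel-induced feature space; the underlying idea is easier to see in the plain random-hyperplane variant for cosine similarity, sketched below. This is a generic illustration with invented names, not code from the report.

```python
import random

def make_hash(dim, n_bits, seed=0):
    # One random Gaussian hyperplane per output bit; each bit records
    # which side of the hyperplane the input vector falls on.
    rng = random.Random(seed)
    planes = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n_bits)]
    def h(vec):
        bits = 0
        for plane in planes:
            dot = sum(p * v for p, v in zip(plane, vec))
            bits = (bits << 1) | (1 if dot >= 0 else 0)
        return bits
    return h

def build_index(vectors, h):
    # Bucket vectors by hash code; vectors with small angular distance
    # tend to share buckets, so retrieval only scans one bucket.
    index = {}
    for i, v in enumerate(vectors):
        index.setdefault(h(v), []).append(i)
    return index
```

Distributing this over MapReduce amounts to sharding the buckets across workers, which is the scaling strategy the report pursues with Apache Hadoop.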

  20. Exact density functional and wave function embedding schemes based on orbital localization

    NASA Astrophysics Data System (ADS)

    Hégely, Bence; Nagy, Péter R.; Ferenczy, György G.; Kállay, Mihály

    2016-08-01

    Exact schemes for the embedding of density functional theory (DFT) and wave function theory (WFT) methods into lower-level DFT or WFT approaches are introduced utilizing orbital localization. First, a simple modification of the projector-based embedding scheme of Manby and co-workers [J. Chem. Phys. 140, 18A507 (2014)] is proposed. We also use localized orbitals to partition the system, but instead of augmenting the Fock operator with a somewhat arbitrary level-shift projector we solve the Huzinaga-equation, which strictly enforces the Pauli exclusion principle. Second, the embedding of WFT methods in local correlation approaches is studied. Since the latter methods split up the system into local domains, very simple embedding theories can be defined if the domains of the active subsystem and the environment are treated at a different level. The considered embedding schemes are benchmarked for reaction energies and compared to quantum mechanics (QM)/molecular mechanics (MM) and vacuum embedding. We conclude that for DFT-in-DFT embedding, the Huzinaga-equation-based scheme is more efficient than the other approaches, but QM/MM or even simple vacuum embedding is still competitive in particular cases. Concerning the embedding of wave function methods, the clear winner is the embedding of WFT into low-level local correlation approaches, and WFT-in-DFT embedding can only be more advantageous if a non-hybrid density functional is employed.
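    The two projection techniques contrasted above can be written schematically (operator notation assumed here, not quoted from the paper). The level-shift scheme augments the Fock operator with a projector onto the environment orbitals, while the Huzinaga equation enforces orthogonality exactly:

```latex
% Projector onto the occupied orbitals of environment B:
\hat{P}_B = \sum_{j \in B} |\phi_j\rangle\langle\phi_j|

% Level-shift projector embedding (shift parameter \mu chosen large):
\big( \hat{F} + \mu \hat{P}_B \big)\, \phi_i^A = \varepsilon_i\, \phi_i^A

% Huzinaga-equation-based embedding (no adjustable parameter; the
% Pauli exclusion principle is enforced strictly):
\big( \hat{F} - \hat{F}\hat{P}_B - \hat{P}_B\hat{F} \big)\, \phi_i^A
  = \varepsilon_i\, \phi_i^A
```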

  1. PanFP: pangenome-based functional profiles for microbial communities.

    PubMed

    Jun, Se-Ran; Robeson, Michael S; Hauser, Loren J; Schadt, Christopher W; Gorin, Andrey A

    2015-09-26

    For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer functional composition of environmental samples based on 16S rRNA gene survey data. We present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here.
But, our method is unique
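    The abundance-weighting step described above reduces to a weighted sum of per-OTU functional profiles. A minimal sketch under assumed data shapes follows; the KO-style identifiers and dictionary layout are illustrative, not PanFP's actual format.

```python
def community_profile(otu_abundance, pangenome_profiles):
    """Abundance-weighted sum of per-OTU pangenome function profiles,
    normalized to relative functional abundances."""
    community = {}
    for otu, abundance in otu_abundance.items():
        for function, weight in pangenome_profiles[otu].items():
            community[function] = community.get(function, 0.0) + abundance * weight
    total = sum(community.values())
    return {f: w / total for f, w in community.items()}
```

For example, an OTU twice as abundant as another contributes twice the weight of its pangenome's functions to the community profile, which is the core of the inference.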

  2. Comparison of Grouping Methods for Template Extraction from VA Medical Record Text.

    PubMed

    Redd, Andrew M; Gundlapalli, Adi V; Divita, Guy; Tran, Le-Thuy; Pettey, Warren B P; Samore, Matthew H

    2017-01-01

    We investigate options for grouping templates for the purpose of template identification and extraction from electronic medical records. We sampled a corpus of 1000 documents originating from the Veterans Health Administration (VA) electronic medical record. We grouped documents through hashing and binning tokens (Hashed) as well as by the top 5% of tokens identified as important through the term frequency-inverse document frequency metric (TF-IDF). We then compared the approaches on the number of groups with 3 or more documents and the resulting longest common subsequences (LCSs) common to all documents in a group. We found that the Hashed method had a higher success rate for finding LCSs, and found longer LCSs, than the TF-IDF method; however, the TF-IDF approach found more groups than the Hashed method, and consequently more long sequences, although the average length of the LCSs was lower. In conclusion, each algorithm appears to be superior in certain areas.
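    The TF-IDF grouping key and the LCS comparison described above can be sketched generically. Tokenization and the 5% threshold are simplified, and the function names are invented for the illustration.

```python
from collections import Counter
import math

def top_tfidf_tokens(docs, fraction=0.05):
    """Per-document grouping key: the top `fraction` of tokens by TF-IDF."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency per token
    n = len(docs)
    keys = []
    for doc in docs:
        tf = Counter(doc)
        scored = sorted(tf, key=lambda t: tf[t] * math.log(n / df[t]), reverse=True)
        k = max(1, int(len(scored) * fraction))
        keys.append(frozenset(scored[:k]))  # documents sharing a key form a group
    return keys

def lcs(a, b):
    """Length of the longest common subsequence of two token lists (classic DP)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]
```

Within each group, the LCS over all member documents approximates the shared boilerplate template that the study aims to extract.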

  3. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – A crane is employed to lift half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, into a vertical position at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  4. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Workers remove the protective wrap from half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, newly arrived at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  5. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, is lifted into a vertical position at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  6. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Preparations are underway to lift half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, into a vertical position at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  7. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, arrives at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  8. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – The second half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, arrives at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower where the other half already is in position. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  9. Changes in Benthos Associated with Mussel (Mytilus edulis L.) Farms on the West-Coast of Scotland

    PubMed Central

    Wilding, Thomas A.; Nickell, Thomas D.

    2013-01-01

    Aquaculture, as a means of food production, is growing rapidly in response to an increasing demand for protein and the over-exploitation of wild fisheries. This expansion includes mussels (family Mytilidae), where production currently stands at 1.5 million tonnes per annum. Mussel culture is frequently perceived as having little environmental impact, yet mussel biodeposits and shell debris accumulate around the production site and are linked to changes in the benthos. To assess the extent and nature of changes in benthos associated with mussel farming, grab and video sampling was conducted around seven mussel farms. Grab samples were analysed for macrofauna and shell-hash content whilst starfish were counted and the shell-hash cover estimated from video imaging. Shell-hash was patchily distributed and occasionally dominated sediments (maximum of 2116 g per 0.1 m2 grab). Mean shell-hash content decreased rapidly at distances >5 m from the line and, over the distance 1–64 m, decreased by three orders of magnitude. The presence of shell-hash and the distance-from-line influenced macrofaunal assemblages but this effect differed between sites. There was no evidence that mussel farming was associated with changes in macrobenthic diversity, species count or feeding strategy. However, total macrofaunal count was estimated to be 2.5 times higher in close proximity to the lines, compared with 64 m distance, and there was evidence that this effect was conditional on the presence of shell-hash. Starfish density varied considerably between sites but, overall, they were approximately 10 times as abundant close to the mussel-lines compared with 64 m distance. There was no evidence that starfish were more abundant in the presence of shell-hash visible on the sediment surface. In terms of farm-scale benthic impacts, these data suggest that mussel farming is a relatively benign way of producing food, compared with intensive fish-farming, in similar environments. PMID:23874583

  10. Directional output distance functions: endogenous directions based on exogenous normalization constraints

    USDA-ARS?s Scientific Manuscript database

    In this paper we develop a model for computing directional output distance functions with endogenously determined direction vectors. We show how this model is related to the slacks-based directional distance function introduced by Fare and Grosskopf and show how to use the slacks-based function to e...

  11. SAR Processing Based On Two-Dimensional Transfer Function

    NASA Technical Reports Server (NTRS)

    Chang, Chi-Yung; Jin, Michael Y.; Curlander, John C.

    1994-01-01

    The exact transfer function (ETF) is a two-dimensional transfer function that constitutes the basis of an improved frequency-domain-convolution algorithm for processing synthetic-aperture-radar (SAR) data. The ETF incorporates terms that account for the Doppler effect of the motion of the radar relative to the scanned ground area and for the antenna squint angle. An algorithm based on the ETF outperforms others.

  12. Function-Based Approach to Designing an Instructional Environment

    ERIC Educational Resources Information Center

    Park, Kristy; Pinkelman, Sarah

    2017-01-01

    Teachers are faced with the challenge of selecting interventions that are most likely to be effective and best matched to the function of problem behavior. This article will define aspects of the instructional environment and describe a decision-making logic to select environmental variables. A summary of commonly used function-based interventions…

  13. Stochastic optimal operation of reservoirs based on copula functions

    NASA Astrophysics Data System (ADS)

    Lei, Xiao-hui; Tan, Qiao-feng; Wang, Xu; Wang, Hao; Wen, Xin; Wang, Chao; Zhang, Jing-wen

    2018-02-01

    Stochastic dynamic programming (SDP) has been widely used to derive operating policies for reservoirs considering streamflow uncertainties. In SDP, there is a need to calculate the transition probability matrix more accurately and efficiently in order to improve the economic benefit of reservoir operation. In this study, we proposed a stochastic optimization model for hydropower generation reservoirs, in which 1) the transition probability matrix was calculated based on copula functions; and 2) the value function of the last period was calculated by stepwise iteration. Firstly, the marginal distribution of stochastic inflow in each period was built and the joint distributions of adjacent periods were obtained using the three members of the Archimedean copulas, based on which the conditional probability formula was derived. Then, the value in the last period was calculated by a simple recursive equation with the proposed stepwise iteration method and the value function was fitted with a linear regression model. These improvements were incorporated into the classic SDP and applied to the case study in Ertan reservoir, China. The results show that the transition probability matrix can be more easily and accurately obtained by the proposed copula function based method than conventional methods based on the observed or synthetic streamflow series, and the reservoir operation benefit can also be increased.
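    The copula-based transition probabilities in step 1) follow the standard conditional-copula identity; the notation below is the textbook convention, not necessarily the paper's:

```latex
% Joint CDF of adjacent-period inflows via Sklar's theorem:
F(q_t, q_{t+1}) = C\big( F_t(q_t),\, F_{t+1}(q_{t+1}) \big)

% Conditional CDF, obtained by differentiating the copula
% with respect to its first argument:
F(q_{t+1} \mid q_t)
  = \left. \frac{\partial C(u, v)}{\partial u}
    \right|_{u = F_t(q_t),\; v = F_{t+1}(q_{t+1})}

% Transition probability into a discretized inflow class [a_j, b_j],
% given current inflow q_t = x:
p_j(x) = F(b_j \mid x) - F(a_j \mid x)
```

Evaluating these expressions for each pair of inflow classes fills the transition probability matrix directly from the fitted marginals and Archimedean copula, avoiding the counting noise of matrices estimated from observed or synthetic streamflow series.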

  14. Cereal based functional food of Indian subcontinent: a review.

    PubMed

    Das, Arpita; Raychaudhuri, Utpal; Chakraborty, Runu

    2012-12-01

    Due to growing health awareness and readily available information on the usefulness of different diets and their direct link with health, the demand for functional food is increasing day by day. The concept of functional foods includes foods or food ingredients that exert a beneficial effect on host health and/or reduce the risk of chronic disease beyond basic nutritional functions. Increasing awareness of consumer health and interest in functional foods to achieve a healthy lifestyle has resulted in the need for food products with versatile health-benefiting properties. Cereal- and cereal component-based food products offer opportunities to include probiotics, prebiotics, and fibers in the human diet. Various growth studies using probiotic Lactic acid bacteria on cereal-based substrates and utilization of whole grain or components as high-fiber foods in developing novel food products lend support to the idea that cereal-based media may well be good probiotic carriers. It is essential that science and traditional knowledge go together to find mutually beneficial results. In the Indian subcontinent, making use of fermented food and beverages using local food crops and other biological resources is very common. But the nature of the products and the base material vary from region to region.

  15. Teacher-Conducted Trial-Based Functional Analyses as the Basis for Intervention

    ERIC Educational Resources Information Center

    Bloom, Sarah E.; Lambert, Joseph M.; Dayton, Elizabeth; Samaha, Andrew L.

    2013-01-01

    Previous studies have focused on whether a trial-based functional analysis (FA) yields the same outcomes as more traditional FAs, and whether interventions based on trial-based FAs can reduce socially maintained problem behavior. We included a full range of behavior functions and taught 3 teachers to conduct a trial-based FA with 3 boys with…

  16. Modified current follower-based immittance function simulators

    NASA Astrophysics Data System (ADS)

    Alpaslan, Halil; Yuce, Erkan

    2017-12-01

    In this paper, four immittance function simulators consisting of a single modified current follower with a single Z- terminal and a minimum number of passive components are proposed. The first proposed circuit can provide +L parallel with +R and the second proposed one can realise -L parallel with -R. The third proposed structure can provide +L series with +R and the fourth proposed one can realise -L series with -R. However, all the proposed immittance function simulators need a single resistive matching constraint. Parasitic impedance effects on all the proposed immittance function simulators are investigated. A second-order current-mode (CM) high-pass filter derived from the first proposed immittance function simulator is given as an application example. Also, a second-order CM low-pass filter derived from the third proposed immittance function simulator is given as an application example. A number of simulation results based on the SPICE programme and an experimental test result are given to verify the theory.

  17. GeNemo: a search engine for web-based functional genomic data.

    PubMed

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example, a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundred bases to hundred thousand bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Heteromultimerization modulates P2X receptor functions through participating extracellular and C-terminal subdomains.

    PubMed

    Koshimizu, Taka-aki; Ueno, Susumu; Tanoue, Akito; Yanagihara, Nobuyuki; Stojilkovic, Stanko S; Tsujimoto, Gozoh

    2002-12-06

    P2X purinergic receptors (P2XRs) differ among themselves with respect to their ligand preferences and channel kinetics during activation, desensitization, and recovery. However, the contributions of distinct receptor subdomains to the subtype-specific behavior have been incompletely characterized. Here we show that homomeric receptors having the extracellular domain of the P2X(3) subunit in the P2X(2a)-based backbone (P2X(2a)/X(3)ex) mimicked two intrinsic functions of P2X(3)R, sensitivity to αβ-methylene ATP and ecto-ATPase-dependent recovery from endogenous desensitization; these two functions were localized to the N- and C-terminal halves of the P2X(3) extracellular loop, respectively. The chimeric P2X(2a)R/X(3)ex receptors also desensitized with accelerated rates compared with native P2X(2a)R, and the introduction of P2X(2) C-terminal splicing into the chimeric subunit (P2X(2b)/X(3)ex) further increased the rate of desensitization. Physical and functional heteromerization of native P2X(2a) and P2X(2b) subunits was also demonstrated. In heteromeric receptors, the ectodomain of P2X(3) was a structural determinant for ligand selectivity and recovery from desensitization, and the C terminus of P2X(2) was an important factor for the desensitization rate. Furthermore, [γ-(32)P]8-azido ATP, a photoreactive agonist, was effectively cross-linked to the P2X(3) subunit in homomeric receptors but not in heteromeric P2X(2) + P2X(3)Rs. These results indicate that heteromeric receptors formed by distinct P2XR subunits develop new functions resulting from integrative effects of the participating extracellular and C-terminal subdomains.

  19. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Workers maneuver half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, newly arrived at Space Launch Complex 2 on Vandenberg Air Force Base in California, into position underneath the crane. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower where the other half already is in position. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  20. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – Workers attach a crane onto half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, newly arrived at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower where the other half already is in position. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  1. OCO-2: Hoisting the Fairing Halves up the MST

    NASA Image and Video Library

    2014-03-24

    VANDENBERG AIR FORCE BASE, Calif. – A crane lifts half of the fairing for NASA's Orbiting Carbon Observatory-2 mission, or OCO-2, up the side of the mobile service tower at Space Launch Complex 2 on Vandenberg Air Force Base in California. Operations are underway to hoist this section of the fairing into the Delta II launcher's environmental enclosure, or clean room, at the top of the tower where the other half is already in position. The fairing will protect OCO-2 during launch aboard a United Launch Alliance Delta II rocket from Space Launch Complex 2 in July. The observatory will collect precise global measurements of carbon dioxide in the Earth's atmosphere and provide scientists with a better idea of the chemical compound's impacts on climate change. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. To learn more about OCO-2, visit http://oco.jpl.nasa.gov. Photo credit: NASA/Randy Beaudoin

  2. Sprint-based exercise and cognitive function in adolescents.

    PubMed

    Cooper, Simon B; Bandelow, Stephan; Nute, Maria L; Dring, Karah J; Stannard, Rebecca L; Morris, John G; Nevill, Mary E

    2016-12-01

    Moderate intensity exercise has been shown to enhance cognition in an adolescent population, yet the effect of high-intensity sprint-based exercise remains unknown and was therefore examined in the present study. Following ethical approval and familiarisation, 44 adolescents (12.6 ± 0.6 y) completed an exercise (E) and resting (R) trial in a counter-balanced, randomised crossover design. The exercise trial comprised 10 × 10 s running sprints, interspersed by 50 s active recovery (walking). A battery of cognitive function tests (Stroop, Digit Symbol Substitution (DSST) and Corsi blocks tests) was completed 30 min pre-exercise, immediately post-exercise and 45 min post-exercise. Data were analysed using mixed effect models with repeated measures. Response times on the simple level of the Stroop test were significantly quicker 45 min following sprint-based exercise (R: 818 ± 33 ms, E: 772 ± 26 ms; p = 0.027) and response times on the complex level of the Stroop test were quicker immediately following the sprint-based exercise (R: 1095 ± 36 ms, E: 1043 ± 37 ms; p = 0.038), while accuracy was maintained. Sprint-based exercise had no immediate or delayed effects on the number of items recalled on the Corsi blocks test (p = 0.289) or substitutions made during the DSST (p = 0.689). The effect of high-intensity sprint-based exercise on adolescents' cognitive function was dependent on the component of cognitive function examined. Executive function was enhanced following exercise, demonstrated by improved response times on the Stroop test, whilst visuo-spatial memory and general psycho-motor speed were unaffected. These data support the inclusion of high-intensity sprint-based exercise for adolescents during the school day to enhance cognition.

  3. A domain-based approach for analyzing the function of aluminum-activated malate transporters from wheat (Triticum aestivum) and Arabidopsis thaliana in Xenopus oocytes.

    PubMed

    Sasaki, Takayuki; Tsuchiya, Yoshiyuki; Ariyoshi, Michiyo; Ryan, Peter R; Furuichi, Takuya; Yamamoto, Yoko

    2014-12-01

    Wheat and Arabidopsis plants respond to aluminum (Al) ions by releasing malate from their root apices via Al-activated malate transporter. Malate anions bind with the toxic Al ions and contribute to the Al tolerance of these species. The genes encoding the transporters in wheat and Arabidopsis, TaALMT1 and AtALMT1, respectively, were expressed in Xenopus laevis oocytes and characterized electrophysiologically using the two-electrode voltage clamp system. The Al-activated currents generated by malate efflux were detected for TaALMT1 but not for AtALMT1. Chimeric proteins were generated by swapping the N- and C-terminal halves of TaALMT1 and AtALMT1 (Ta::At and At::Ta). When these chimeras were characterized in oocytes, Al-activated malate efflux was detected for the Ta::At chimera but not for At::Ta, suggesting that the N-terminal half of TaALMT1 is necessary for function in oocytes. An additional chimera, Ta(48)::At, generated by swapping 17 residues from the N-terminus of AtALMT1 with the equivalent 48 residues from TaALMT1, was sufficient to support transport activity. This 48 residue region includes a helical region with a putative transmembrane domain which is absent in AtALMT1. The deletion of this domain from Ta(48)::At led to the complete loss of transport activity. Furthermore, truncations and a deletion at the C-terminal end of TaALMT1 indicated that a putative helical structure in this region was also required for transport function. This study provides insights into the structure-function relationships of Al-activated ALMT proteins by identifying specific domains on the N- and C-termini of TaALMT1 that are critical for basal transport function and Al responsiveness in oocytes. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. Study on Capturing Functional Requirements of the New Product Based on Evolution

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng

    To survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products are developed based on the design of existing products. In product design, capturing functional requirements is a key step. Function continuously evolves, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecast from the functions of an existing product. Eight laws of function evolution are put forward in this paper. A process model for capturing the functional requirements of a new product based on function evolution is proposed. An example illustrates the design process.

  5. PanFP: Pangenome-based functional profiles for microbial communities

    DOE PAGES

    Jun, Se -Ran; Hauser, Loren John; Schadt, Christopher Warren; ...

    2015-09-26

    For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer the functional composition of environmental samples based on 16S rRNA gene survey data. As a result, we present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. But

  6. SKL algorithm based fabric image matching and retrieval

    NASA Astrophysics Data System (ADS)

    Cao, Yichen; Zhang, Xueqin; Ma, Guojian; Sun, Rongqing; Dong, Deping

    2017-07-01

    Intelligent computer image processing technology provides convenience and possibility for designers to carry out designs. Shape analysis can be achieved by extracting SURF features. However, the high dimension of SURF features lowers matching speed. To solve this problem, this paper proposes a fast fabric image matching algorithm based on SURF, K-means and the LSH algorithm. By constructing a bag of visual words with the K-means algorithm and forming a feature histogram for each image, the dimension of the SURF features is reduced in a first step. Then, with the help of the LSH algorithm, the features are encoded and the dimension is further reduced. In addition, indexes for each image and each class of image are created, and the number of matching images is decreased by the LSH hash bucket. Experiments on a fabric image database show that this algorithm can speed up the matching and retrieval process; the result can satisfy the requirements of dress designers with accuracy and speed.
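
The LSH step described above can be sketched with random-hyperplane hashing, one common LSH family for feature histograms (a minimal illustration only: the dimensions, bit count, and random data below are hypothetical, and the paper's exact encoding may differ):

```python
import random

random.seed(0)
DIM, NBITS = 16, 8

# Random hyperplanes: each hash bit is the sign of a dot product with one plane.
planes = [[random.gauss(0.0, 1.0) for _ in range(DIM)] for _ in range(NBITS)]

def lsh_code(vec):
    """Map a DIM-dimensional feature histogram to an NBITS-bit bucket id."""
    code = 0
    for plane in planes:
        dot = sum(p * v for p, v in zip(plane, vec))
        code = (code << 1) | (1 if dot >= 0 else 0)
    return code

# Index a small synthetic "database" of histograms by bucket; a query is then
# compared exactly only against the candidates sharing its bucket, which is
# how the hash bucket decreases the number of matching images.
database = [[random.random() for _ in range(DIM)] for _ in range(100)]
buckets = {}
for idx, vec in enumerate(database):
    buckets.setdefault(lsh_code(vec), []).append(idx)

query = database[7]
candidates = buckets[lsh_code(query)]
assert 7 in candidates  # the identical image always lands in its own bucket
```

Because each bit depends only on the sign of a dot product, the code is invariant under positive scaling of the histogram, and near-duplicate histograms tend to collide into the same bucket.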

  7. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  8. Control design based on a linear state function observer

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1992-01-01

    An approach to the design of low-order controllers for large scale systems is proposed. The method is derived from the theory of linear state function observers. First, the realization of a state feedback control law is interpreted as the observation of a linear function of the state vector. The linear state function to be reconstructed is the given control law. Then, based on the derivation for linear state function observers, the observer design is formulated as a parameter optimization problem. The optimization objective is to generate a matrix that is close to the given feedback gain matrix. Based on that matrix, the form of the observer and a new control law can be determined. A four-disk system and a lightly damped beam are presented as examples to demonstrate the applicability and efficacy of the proposed method.

  9. A Polynomial Subset-Based Efficient Multi-Party Key Management System for Lightweight Device Networks.

    PubMed

    Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah

    2017-03-24

    Wireless Sensor Networks (WSNs) consist of lightweight devices to measure sensitive data that are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) are facing severe security and privacy issues because of the direct accessibility of devices due to their connection to the internet. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored the polynomial distribution-based key establishment schemes and identified an issue: the resultant polynomial value is either storage-intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, and the group head acts as a responsible node. The polynomial generation method uses security credentials and a secure hash function. Symmetric cryptographic parameters are efficient in computation, communication, and the storage required. The security justification of the proposed scheme has been completed by using Rubin logic, which guarantees that the protocol attains mutual validation and the session key agreement property strongly among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure.
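
The symmetric-polynomial idea underlying such key establishment schemes can be sketched with a Blundo-style bivariate symmetric polynomial, where each node holds a univariate share and any two nodes derive the same pairwise key (a minimal illustration: the prime modulus, threshold, and node IDs below are assumptions for the example, not EKM's actual parameters):

```python
import random

P = 2**31 - 1  # public prime modulus (illustrative choice)
T = 3          # polynomial degree; collusion of up to T nodes reveals nothing more

def gen_symmetric_coeffs(t, rng):
    """Random coefficient matrix A with A[i][j] == A[j][i], so f(x,y) == f(y,x)."""
    A = [[0] * (t + 1) for _ in range(t + 1)]
    for i in range(t + 1):
        for j in range(i, t + 1):
            A[i][j] = A[j][i] = rng.randrange(P)
    return A

def share_for(A, node_id):
    """Node's share g_i(y) = f(node_id, y): coefficients of y^j after fixing x."""
    n = len(A)
    return [sum(A[i][j] * pow(node_id, i, P) for i in range(n)) % P
            for j in range(n)]

def pairwise_key(share, other_id):
    """Evaluate the stored share at the peer's id: g_i(j) == g_j(i)."""
    return sum(c * pow(other_id, j, P) for j, c in enumerate(share)) % P

rng = random.Random(42)
A = gen_symmetric_coeffs(T, rng)   # held by the trusted setup / group head
alice, bob = 17, 99                # illustrative node ids
k_ab = pairwise_key(share_for(A, alice), bob)
k_ba = pairwise_key(share_for(A, bob), alice)
assert k_ab == k_ba  # both ends derive the same session key without interaction
```

Each node stores only T + 1 coefficients rather than the full matrix, which is the storage/computation trade-off such schemes tune.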

  10. Coral reef habitats as surrogates of species, ecological functions, and ecosystem services.

    PubMed

    Mumby, Peter J; Broad, Kenneth; Brumbaugh, Daniel R; Dahlgren, Craig P; Harborne, Alastair R; Hastings, Alan; Holmes, Katherine E; Kappel, Carrie V; Micheli, Fiorenza; Sanchirico, James N

    2008-08-01

    Habitat maps are often the core spatially consistent data set on which marine reserve networks are designed, but their efficacy as surrogates for species richness and applicability to other conservation measures is poorly understood. Combining an analysis of field survey data, literature review, and expert assessment by a multidisciplinary working group, we examined the degree to which Caribbean coastal habitats provide useful planning information on 4 conservation measures: species richness, the ecological functions of fish species, ecosystem processes, and ecosystem services. Approximately one-quarter to one-third of benthic invertebrate species and fish species (disaggregated by life phase; hereafter fish species) occurred in a single habitat, and Montastraea-dominated forereefs consistently had the highest richness of all species, processes, and services. All 11 habitats were needed to represent all 277 fish species in the seascape, although reducing the conservation target to 95% of species approximately halved the number of habitats required to ensure representation. Species accumulation indices (SAIs) were used to compare the efficacy of surrogates and revealed that fish species were a more appropriate surrogate of benthic species (SAI = 71%) than benthic species were for fishes (SAI = 42%). Species of reef fishes were also distributed more widely across the seascape than invertebrates and therefore their use as a surrogate simultaneously included mangroves, sea grass, and coral reef habitats. Functional classes of fishes served as effective surrogates of fish and benthic species which, given their ease to survey, makes them a particularly useful measure for conservation planning. Ecosystem processes and services exhibited great redundancy among habitats and were ineffective as surrogates of species. Therefore, processes and services in this case were generally unsuitable for a complementarity-based approach to reserve design. In contrast, the representation

  11. OTIS Basic Index Access System (OBIAS); A System for Retrieval of Information From the ERIC and CIJE Data Bases Utilizing a Direct Access Inverted Index of Descriptors and a Reformatted Direct Access ERIC-CIJE File.

    ERIC Educational Resources Information Center

    Bracken, Paula

    The OTIS Basic Index Access System (OBIAS) for searching the ERIC data base is described. This system offers two advantages over the previous system. First, search time has been halved, reducing the cost per search to an estimated $10 on a batch basis. Second, the "OTIS ERIC Descriptor Catalog," which contains all descriptors used in the…

  12. Partial correlation-based functional connectivity analysis for functional near-infrared spectroscopy signals

    NASA Astrophysics Data System (ADS)

    Akın, Ata

    2017-12-01

    A theoretical framework, a partial correlation-based functional connectivity (PC-FC) analysis for functional near-infrared spectroscopy (fNIRS) data, is proposed. It is based on generating a common background signal, from a high-pass-filtered version of the fNIRS data averaged over all channels, as the regressor in computing the PC between pairs of channels. This approach has been applied to real data collected during a Stroop task. The results show a strong significance in the global efficiency (GE) metric computed by the PC-FC analysis for neutral, congruent, and incongruent stimuli (NS, CS, IcS; GE_N = 0.10 ± 0.009, GE_C = 0.11 ± 0.01, GE_IC = 0.13 ± 0.015, p = 0.0073). A positive correlation (r = 0.729, p = 0.0259) is observed between the interference of reaction times (incongruent - neutral) and the interference of GE values (GE_IC - GE_N) computed from [HbO] signals.
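
The partial correlation described above, correlating two channels after regressing the common background signal out of each, can be sketched as follows (a minimal illustration on synthetic data; the channel model and noise levels are assumptions, not the study's fNIRS recordings):

```python
import random
from math import sqrt

def mean(xs):
    return sum(xs) / len(xs)

def pearson(x, y):
    """Ordinary Pearson correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def residual(x, z):
    """Residual of x after least-squares regression on the regressor z."""
    mx, mz = mean(x), mean(z)
    beta = (sum((a - mx) * (c - mz) for a, c in zip(x, z))
            / sum((c - mz) ** 2 for c in z))
    return [a - (mx + beta * (c - mz)) for a, c in zip(x, z)]

def partial_corr(x, y, z):
    """Correlation between x and y with the common signal z regressed out."""
    return pearson(residual(x, z), residual(y, z))

# Synthetic check: two channels driven by a shared background plus
# independent noise correlate strongly, but the partial correlation that
# controls for the background collapses toward zero.
rng = random.Random(1)
z = [rng.gauss(0, 1) for _ in range(500)]      # shared background signal
x = [c + rng.gauss(0, 0.3) for c in z]         # channel 1
y = [c + rng.gauss(0, 0.3) for c in z]         # channel 2
```

This is the sense in which PC-FC suppresses globally shared fluctuations so that the remaining channel-pair correlations reflect connectivity rather than the common background.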

  13. Totally S-protected hyaluronic acid: Evaluation of stability and mucoadhesive properties as liquid dosage form.

    PubMed

    Pereira de Sousa, Irene; Suchaoin, Wongsakorn; Zupančič, Ožbej; Leichner, Christina; Bernkop-Schnürch, Andreas

    2016-11-05

    It is the aim of this study to synthesize hyaluronic acid (HA) derivatives bearing mucoadhesive properties and showing prolonged stability at pH 7.4 and under oxidative conditions as a liquid dosage form. HA was modified by thiolation with l-cysteine (HA-SH) and by conjugation with a 2-mercaptonicotinic acid-l-cysteine ligand to obtain an S-protected derivative (HA-MNA). The polymers were characterized by determination of thiol group content and mercaptonicotinic acid content. Cytotoxicity, stability and mucoadhesive properties (rheological evaluation and tensile test) of the polymers were evaluated. HA-SH and HA-MNA could be successfully synthesized with a degree of modification of 5% and 9% of the total moles of carboxylic acid groups, respectively. MTT assay revealed no toxicity for the polymers. HA-SH proved to be unstable both at pH 7.4 and under oxidative conditions, whereas HA-MNA was stable under both conditions. Rheological assessment showed a 52-fold and a 3-fold increase in viscosity for HA-MNA incubated with mucus compared to unmodified HA and HA-SH, respectively. Tensile evaluation carried out with intestinal and conjunctival mucosa confirmed the higher mucoadhesive properties of HA-MNA compared to HA-SH. According to the presented results, HA-MNA appears to be a potent excipient for the formulation of stable liquid dosage forms showing comparatively high mucoadhesive properties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A comparative study of Message Digest 5(MD5) and SHA256 algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Tarigan, J. T.; Ginting, A. B. C.

    2018-03-01

    A document is a collection of written or printed data containing information. As technology advances more rapidly, the integrity of a document must be protected. Because an open document can, by its nature, be read and modified by many parties, the integrity of the information it contains is not automatically preserved. To maintain data integrity, a mechanism called a digital signature is needed. A digital signature is a specific code generated by a signature-producing function. One class of algorithms used to create digital signatures is the hash function. There are many hash functions; two of them are Message Digest 5 (MD5) and SHA256. Both algorithms have their own advantages and disadvantages. The purpose of this research is to determine which algorithm is better. The parameters used to compare the two algorithms are running time and complexity. The research results show that the complexity of the MD5 and SHA256 algorithms is the same, i.e., Θ(N), but regarding speed, MD5 performs better than SHA256.

  15. Questionnaire-based assessment of executive functioning: Psychometrics.

    PubMed

    Castellanos, Irina; Kronenberger, William G; Pisoni, David B

    2018-01-01

    The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.

  16. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    As large-scale complex engineering includes various functions, and each function is realized through the completion of one or more projects, the combinations of projects that affect each function should be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio-project techniques based on functional objectives were introduced; we then studied and proposed the principles of portfolio-project techniques based on the functional objectives of projects. In addition, the process of combining projects was also constructed. With the help of portfolio-project techniques based on the functional objectives of projects, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  17. Connectotyping: Model Based Fingerprinting of the Functional Connectome

    PubMed Central

    Miranda-Dominguez, Oscar; Mills, Brian D.; Carpenter, Samuel D.; Grant, Kathleen A.; Kroenke, Christopher D.; Nigg, Joel T.; Fair, Damien A.

    2014-01-01

    A better characterization of how an individual’s brain is functionally organized will likely bring dramatic advances to many fields of study. Here we show a model-based approach toward characterizing resting state functional connectivity MRI (rs-fcMRI) that is capable of identifying a so-called “connectotype”, or functional fingerprint in individual participants. The approach rests on a simple linear model that proposes the activity of a given brain region can be described by the weighted sum of its functional neighboring regions. The resulting coefficients correspond to a personalized model-based connectivity matrix that is capable of predicting the timeseries of each subject. Importantly, the model itself is subject specific and has the ability to predict an individual at a later date using a limited number of non-sequential frames. While we show that there is a significant amount of shared variance between models across subjects, the model’s ability to discriminate an individual is driven by unique connections in higher order control regions in frontal and parietal cortices. Furthermore, we show that the connectotype is present in non-human primates as well, highlighting the translational potential of the approach. PMID:25386919

  18. Hydrodynamics-based functional forms of activity metabolism: a case for the power-law polynomial function in animal swimming energetics.

    PubMed

    Papadopoulos, Anthony

    2009-01-01

    The first-degree power-law polynomial function is frequently used to describe activity metabolism for steady swimming animals. This function has been used in hydrodynamics-based metabolic studies to evaluate important parameters of energetic costs, such as the standard metabolic rate and the drag power indices. In theory, however, the power-law polynomial function of any degree greater than one can be used to describe activity metabolism for steady swimming animals. In fact, activity metabolism has been described by the conventional exponential function and the cubic polynomial function, although only the power-law polynomial function models drag power since it conforms to hydrodynamic laws. Consequently, the first-degree power-law polynomial function yields incorrect parameter values of energetic costs if activity metabolism is governed by the power-law polynomial function of any degree greater than one. This issue is important in bioenergetics because correct comparisons of energetic costs among different steady swimming animals cannot be made unless the degree of the power-law polynomial function derives from activity metabolism. In other words, a hydrodynamics-based functional form of activity metabolism is a power-law polynomial function of any degree greater than or equal to one. Therefore, the degree of the power-law polynomial function should be treated as a parameter, not as a constant. This new treatment not only conforms to hydrodynamic laws, but also ensures correct comparisons of energetic costs among different steady swimming animals. Furthermore, the exponential power-law function, which is a new hydrodynamics-based functional form of activity metabolism, is a special case of the power-law polynomial function. Hence, the link between the hydrodynamics of steady swimming and the exponential-based metabolic model is defined.
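
The functional forms discussed can be written compactly (notation assumed here for illustration: M-dot for metabolic rate, U for steady swimming speed, M-dot_s for the standard metabolic rate; the abstract does not fix symbols):

```latex
% Power-law polynomial form of activity metabolism; the degree c is treated
% as a free parameter to be estimated, not fixed at the conventional
% first-degree value c = 1:
\dot{M}(U) = \dot{M}_s + b\,U^{c}, \qquad c \ge 1
```

Fitting with c fixed at 1 when the data follow some c > 1 is exactly the situation the abstract warns about: the intercept (standard metabolic rate) and the drag power indices come out wrong.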

  19. Early Childhood Practitioner Involvement in Functional Behavioral Assessment and Function-Based Interventions: A Literature Review

    ERIC Educational Resources Information Center

    Wood, Brenna K.; Drogan, Robin R.; Janney, Donna M.

    2014-01-01

    Reviewers analyzed studies published from 1990 to 2012 to determine early childhood practitioner involvement in functional behavioral assessment (FBA) and function-based behavioral intervention plans (BIP) for children with challenging behavior, age 6 and younger. Coding of 30 studies included practitioner involvement in FBA and BIP processes,…

  20. Role of the parameters involved in the plan optimization based on the generalized equivalent uniform dose and radiobiological implications

    NASA Astrophysics Data System (ADS)

    Widesott, L.; Strigari, L.; Pressello, M. C.; Benassi, M.; Landoni, V.

    2008-03-01

    We investigated the role and the weight of the parameters involved in intensity modulated radiation therapy (IMRT) optimization based on the generalized equivalent uniform dose (gEUD) method, for prostate and head-and-neck plans. We systematically varied the parameters (gEUDmax and weight) involved in the gEUD-based optimization of the rectal wall and parotid glands. We found that the proper value of the weight factor, while still guaranteeing planning treatment volume coverage, produced similar organ-at-risk dose-volume (DV) histograms for different gEUDmax with a fixed a = 1. Most importantly, we formulated a simple relation that links the reference gEUDmax and the associated weight factor. As a secondary objective, we evaluated plans obtained with gEUD-based optimization against ones based on DV criteria, using normal tissue complication probability (NTCP) models. gEUD criteria seemed to improve sparing of the rectum and parotid glands with respect to DV-based optimization: the mean dose, V40 and V50 values to the rectal wall decreased by about 10%, and the mean dose to the parotids decreased by about 20-30%. Beyond the OAR sparing, we underline the halving of the OAR optimization time with the implementation of the gEUD-based cost function. Using NTCP models we found differences between the two optimization criteria for the parotid glands, but not for the rectal wall.

  1. Functional relationship-based alarm processing system

    DOEpatents

    Corsberg, D.R.

    1988-04-22

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the functional relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated or deactivated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary. 12 figs.

  2. Functional relationship-based alarm processing system

    DOEpatents

    Corsberg, Daniel R.

    1989-01-01

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the functional relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated or deactivated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary.

  3. [PREPARATION AND BIOCOMPATIBILITY OF IN SITU CROSSLINKING HYALURONIC ACID HYDROGEL].

    PubMed

    Liang, Jiabi; Li, Jun; Wang, Ting; Liang, Yuhong; Zou, Xuenong; Zhou, Guangqian; Zhou, Zhiyu

    2016-06-08

    To fabricate an in situ crosslinking hyaluronic acid hydrogel and evaluate its biocompatibility in vitro. Acrylic acid chloride and polyethylene glycol were used to prepare the crosslinking agent polyethylene glycol diacrylate (PEGDA), and the molecular structure of PEGDA was analyzed by Fourier transform infrared spectroscopy and 1H nuclear magnetic resonance spectroscopy. Hyaluronic acid was chemically modified to prepare thiolated hyaluronic acid (HA-SH), and the degree of thiolation was analyzed qualitatively and quantitatively by the Ellman method. HA-SH solutions in concentrations (W/V) of 0.5%, 1.0%, and 1.5% and PEGDA solutions in concentrations (W/V) of 2%, 4%, and 6% were prepared with PBS. The two solutions were mixed in different ratios to obtain in situ crosslinking hyaluronic acid hydrogels, and the crosslinking time was recorded. The cellular toxicity of the in situ crosslinking hyaluronic acid hydrogel (1.5% HA-SH mixed with 4% PEGDA) was tested with L929 cells. Meanwhile, the biocompatibility of the hydrogel was tested by co-culture with human bone mesenchymal stem cells (hBMSCs). Fourier transform infrared spectroscopy showed that most hydroxyl groups were replaced by acrylate groups; 1H nuclear magnetic resonance spectroscopy showed three characteristic hydrogen peaks, representing the acrylate and olefinic bonds, at 5-7 ppm. The thiolation yield of HA-SH was 65.4%. The in situ crosslinking time of the hyaluronic acid hydrogel was 2 to 70 minutes at PEGDA concentrations of 2%-6% and HA-SH concentrations of 0.5%-1.5%. The hyaluronic acid hydrogel appeared transparent. The toxicity grade of the leaching solution of the hydrogel was grade 1. hBMSCs grew well and distributed evenly in the hydrogel with very high viability. In situ crosslinking hyaluronic acid hydrogel has low cytotoxicity, good biocompatibility, and controllable crosslinking time, so it could be used as a potential tissue engineering scaffold or repair material for tissue regeneration.

  4. The biogenesis pathway of tRNA-derived piRNAs in Bombyx germ cells

    PubMed Central

    Honda, Shozo; Kawamura, Takuya; Loher, Phillipe; Morichika, Keisuke; Rigoutsos, Isidore

    2017-01-01

    Abstract Transfer RNAs (tRNAs) function in the translational machinery and further serve as a source of short non-coding RNAs (ncRNAs). tRNA-derived ncRNAs show differential expression profiles and play roles in many biological processes beyond translation. The molecular mechanisms that shape and regulate their expression profiles are largely unknown. Here, we report the mechanism of biogenesis for tRNA-derived Piwi-interacting RNAs (td-piRNAs) expressed in Bombyx BmN4 cells. In these cells, two cytoplasmic tRNA species, tRNA-Asp(GUC) and tRNA-His(GUG), served as the major sources for td-piRNAs, which were derived from the 5′ part of the respective tRNAs. cP-RNA-seq identified the two tRNAs as major substrates for the 5′-tRNA halves as well, suggesting a previously uncharacterized link between 5′-tRNA halves and td-piRNAs. An increase in the levels of the 5′-tRNA halves, induced by BmNSun2 knockdown, enhanced td-piRNA expression levels without quantitative changes in mature tRNAs, indicating that 5′-tRNA halves, not mature tRNAs, are the direct precursors of td-piRNAs. For the generation of tRNA-His(GUG)-derived piRNAs, BmThg1l-mediated nucleotide addition to the −1 position of tRNA-His(GUG) was required, revealing an important function of BmThg1l in piRNA biogenesis. Our study advances the understanding of the biogenesis mechanisms and the genesis of specific expression profiles for tRNA-derived ncRNAs. PMID:28645172

  5. In Vitro and Ex Vivo Evaluation of Novel Curcumin-Loaded Excipient for Buccal Delivery.

    PubMed

    Laffleur, Flavia; Schmelzle, Franziska; Ganner, Ariane; Vanicek, Stefan

    2017-08-01

    This study aimed to develop a mucoadhesive polymeric excipient comprising curcumin for buccal delivery. Curcumin offers a broad range of benefits, such as antioxidant, anti-inflammatory, and chemotherapeutic activity. Hyaluronic acid (HA) as the polymeric excipient was modified by immobilization of thiol-bearing ligands: L-Cysteine (SH) ethyl ester was covalently attached via amide bond formation between cysteine and the carboxylic moiety of hyaluronic acid. Successful synthesis was confirmed by H-NMR and IR spectra. The obtained thiolated polymer, hyaluronic acid ethyl ester (HA-SH), was evaluated in terms of stability, safety, mucoadhesiveness, drug release, and permeation-enhancing properties. HA-SH showed a 2.75-fold higher swelling capacity over time in comparison to the unmodified polymer. Furthermore, mucoadhesion increased 3.4-fold in the case of HA-SH, and drug release increased 1.6-fold versus the HA control. Curcumin-loaded HA-SH exhibited a 4.4-fold higher permeation compared with the respective HA. Taking these outcomes into consideration, the novel curcumin-loaded excipient, thiolated hyaluronic acid ethyl ester, appears to be a promising tool for pharyngeal diseases.

  6. Cost-Sensitive Local Binary Feature Learning for Facial Age Estimation.

    PubMed

    Lu, Jiwen; Liong, Venice Erin; Zhou, Jie

    2015-12-01

    In this paper, we propose a cost-sensitive local binary feature learning (CS-LBFL) method for facial age estimation. Unlike the conventional facial age estimation methods that employ hand-crafted descriptors or holistically learned descriptors for feature representation, our CS-LBFL method learns discriminative local features directly from raw pixels for face representation. Motivated by the fact that facial age estimation is a cost-sensitive computer vision problem and local binary features are more robust to illumination and expression variations than holistic features, we learn a series of hashing functions to project raw pixel values extracted from face patches into low-dimensional binary codes, where binary codes with similar chronological ages are projected as close as possible, and those with dissimilar chronological ages are projected as far as possible. Then, we pool and encode these local binary codes within each face image as a real-valued histogram feature for face representation. Moreover, we propose a cost-sensitive local binary multi-feature learning method to jointly learn multiple sets of hashing functions using face patches extracted from different scales to exploit complementary information. Our methods achieve competitive performance on four widely used face aging data sets.

  7. Running Clubs--A Combinatorial Investigation.

    ERIC Educational Resources Information Center

    Nissen, Phillip; Taylor, John

    1991-01-01

    Presented is a combinatorial problem based on the Hash House Harriers rule which states that the route of the run should not have previously been traversed by the club. Discovered is how many weeks the club can meet before the rule has to be broken. (KR)

  8. Does Temporal Integration of Face Parts Reflect Holistic Processing?

    PubMed Central

    Cheung, Olivia S.; Richler, Jennifer J.; Phillips, W. Stewart; Gauthier, Isabel

    2011-01-01

    We examined whether temporal integration of face parts reflects holistic processing or response interference. Participants learned to name two faces “Fred” and two “Bob”. At test, top and bottom halves of different faces formed composites and were presented briefly separated in time. Replicating prior findings (Singer & Sheinberg, 2006), naming of the target halves for aligned composites was slowed when the irrelevant halves were from faces with a different name compared to that from the original face. However, no interference was observed when the irrelevant halves had identical names as the target halves but came from different learned faces, arguing against a true holistic effect. Instead, response interference was obtained when the target halves briefly preceded the irrelevant halves. Experiment 2 confirmed a double-dissociation between holistic processing vs. response interference for intact faces vs. temporally separated face halves, suggesting that simultaneous presentation of facial information is critical for holistic processing. PMID:21327378

  9. Data Sharing in DHT Based P2P Systems

    NASA Astrophysics Data System (ADS)

    Roncancio, Claudia; Del Pilar Villamil, María; Labbé, Cyril; Serrano-Alvarado, Patricia

    The evolution of peer-to-peer (P2P) systems triggered the building of large-scale distributed applications. The main application domain is data sharing across a very large number of highly autonomous participants. Building such data sharing systems is particularly challenging because of the “extreme” characteristics of P2P infrastructures: massive distribution, high churn rate, no global control, potentially untrusted participants... This article focuses on declarative querying support, query optimization and data privacy on a major class of P2P systems, the one based on Distributed Hash Tables (P2P DHT). The usual approaches and the algorithms used by classic distributed systems and databases for providing data privacy and querying services are not well suited to P2P DHT systems. A considerable amount of work was required to adapt them for the new challenges such systems present. This paper describes the most important solutions found. It also identifies important future research trends in data management in P2P DHT systems.
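
    Under the hood, lookup in a P2P DHT rests on consistent hashing: nodes and keys are mapped onto the same identifier circle, and each key is owned by the first node that follows its hash clockwise. The following is an illustrative sketch of that idea (Chord-style; `DHTRing` and the node names are hypothetical, not from any system surveyed in the article):

```python
import hashlib
from bisect import bisect_right

def ring_hash(s):
    # nodes and keys share one identifier circle (here: the SHA-1 space)
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

class DHTRing:
    """Minimal consistent-hashing ring: a key is owned by the first node
    whose identifier follows the key's hash clockwise on the circle."""
    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def lookup(self, key):
        i = bisect_right(self.ring, (ring_hash(key), ""))
        return self.ring[i % len(self.ring)][1]  # wrap past the top of the circle
```

    Because only the arc between a departing node and its predecessor changes owner, churn remaps a small fraction of keys instead of rehashing everything.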

  10. Effects of gross motor function and manual function levels on performance-based ADL motor skills of children with spastic cerebral palsy.

    PubMed

    Park, Myoung-Ok

    2017-02-01

    [Purpose] The purpose of this study was to determine effects of Gross Motor Function Classification System and Manual Ability Classification System levels on performance-based motor skills of children with spastic cerebral palsy. [Subjects and Methods] Twenty-three children with cerebral palsy were included. The Assessment of Motor and Process Skills was used to evaluate performance-based motor skills in daily life. Gross motor function was assessed using Gross Motor Function Classification Systems, and manual function was measured using the Manual Ability Classification System. [Results] Motor skills in daily activities were significantly different on Gross Motor Function Classification System level and Manual Ability Classification System level. According to the results of multiple regression analysis, children categorized as Gross Motor Function Classification System level III scored lower in terms of performance based motor skills than Gross Motor Function Classification System level I children. Also, when analyzed with respect to Manual Ability Classification System level, level II was lower than level I, and level III was lower than level II in terms of performance based motor skills. [Conclusion] The results of this study indicate that performance-based motor skills differ among children categorized based on Gross Motor Function Classification System and Manual Ability Classification System levels of cerebral palsy.

  11. Tuple spaces in hardware for accelerated implicit routing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Zachary Kent; Tripp, Justin

    2010-12-01

    Organizing and optimizing data objects on networks with support for data migration and failing nodes is a complicated problem to handle as systems grow. The goal of this work is to demonstrate that high levels of speedup can be achieved by moving responsibility for finding, fetching, and staging data into an FPGA-based network card. We present a system for implicit routing of data via FPGA-based network cards. In this system, data structures are requested by name, and the network of FPGAs finds the data within the network and relays the structure to the requester. This is achieved through successive examination of hardware hash tables implemented in the FPGA. By avoiding software stacks between nodes, the data is quickly fetched entirely through FPGA-FPGA interaction. The performance of this system is orders of magnitude faster than software implementations due to the improved speed of the hash tables and lowered latency between the network nodes.

  12. A Study of Gaps in Cyber Defense Automation

    DTIC Science & Technology

    2015-10-18

    to lowercase. Next, the normalized file is tokenized using an n-length window. These n-tokens are then hashed into a Bloom filter for that file. In the...readily available to most developers. However, with a complexity of O(N²), where N is the number of files (or the number of functions, if using function...that measure the impact of each feature on the website's chances of becoming compromised, and the top N features are submitted to the classification
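
    The per-file signature pipeline described in the fragment above (normalize to lowercase, slide an n-length window over the text, hash each n-token into that file's Bloom filter) can be sketched as follows; the concrete parameters (n = 4, 1024 bits, 3 hash positions) are illustrative assumptions, not values from the report:

```python
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1024, n_hashes=3):
        self.n_bits = n_bits
        self.n_hashes = n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, token):
        # derive the k bit positions from one SHA-256 digest of the token
        digest = hashlib.sha256(token.encode()).digest()
        for i in range(self.n_hashes):
            yield int.from_bytes(digest[4 * i:4 * i + 4], "big") % self.n_bits

    def add(self, token):
        for p in self._positions(token):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, token):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(token))

def file_signature(text, n=4):
    """Normalize to lowercase, tokenize with an n-length sliding window,
    and hash every n-token into the file's Bloom filter."""
    text = text.lower()
    bf = BloomFilter()
    for i in range(len(text) - n + 1):
        bf.add(text[i:i + n])
    return bf
```

    Two files can then be compared by the overlap of their filters' set bits, with false positives possible but false negatives excluded by construction.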

  13. Boron-based nanostructures: Synthesis, functionalization, and characterization

    NASA Astrophysics Data System (ADS)

    Bedasso, Eyrusalam Kifyalew

    Boron-based nanostructures have not been explored in detail; however, these structures have the potential to revolutionize many fields, including electronics and biomedicine. The research discussed in this dissertation focuses on the synthesis, functionalization, and characterization of boron-based zero-dimensional nanostructures (core/shell and nanoparticles) and one-dimensional nanostructures (nanorods). The first project investigates the synthesis and functionalization of boron-based core/shell nanoparticles. Two boron-containing core/shell nanoparticles, namely boron/iron oxide and boron/silica, were synthesized. Initially, boron nanoparticles with a diameter between 10-100 nm were prepared by decomposition of nido-decaborane (B10H14), followed by formation of a core/shell structure. The core/shell structures were prepared using the appropriate precursor for the shell, an iron source or a silica source, in the presence of boron nanoparticles. The formation of core/shell nanostructures was confirmed using high-resolution TEM. The core/shell nanoparticles then underwent surface modification. Boron/iron oxide core/shell nanoparticles were functionalized with oleic acid, citric acid, amine-terminated polyethylene glycol, folic acid, and dopamine, and boron/silica core/shell nanoparticles were modified with 3-(aminopropyl)triethoxysilane, 3-(2-aminoethylamino)propyltrimethoxysilane, citric acid, folic acid, amine-terminated polyethylene glycol, and O-(2-carboxyethyl)polyethylene glycol. UV-Vis and ATR-FTIR analyses established the success of the surface modification. The cytotoxicity of the water-soluble core/shell nanoparticles was studied in the triple-negative breast cancer cell line MDA-MB-231, and the results showed the compounds are not toxic. The second project highlights optimization of reaction conditions for the synthesis of boron nanorods. This synthesis, done via reduction of boron oxide with molten lithium, was studied to produce boron nanorods without any

  14. Information filtering via a scaling-based function.

    PubMed

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of the algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL) independent of recommendation list length based on a hybrid algorithm of heat conduction and mass diffusion, by finding out the scaling function for the tunable parameter and object average degree. The optimal value of the tunable parameter can be abstracted from the scaling function, which is heterogeneous for the individual object. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably promotes the personalized recommendation in three other aspects: solving the accuracy-diversity dilemma, presenting a high novelty, and solving the key challenge of cold start problem.

  15. Information Filtering via a Scaling-Based Function

    PubMed Central

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of the algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL) independent of recommendation list length based on a hybrid algorithm of heat conduction and mass diffusion, by finding out the scaling function for the tunable parameter and object average degree. The optimal value of the tunable parameter can be abstracted from the scaling function, which is heterogeneous for the individual object. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably promotes the personalized recommendation in three other aspects: solving the accuracy-diversity dilemma, presenting a high novelty, and solving the key challenge of cold start problem. PMID:23696829

  16. InSight Atlas V Boattail Halves Arrival, Offload, Mate

    NASA Image and Video Library

    2018-02-19

    At Vandenberg Air Force Base in California, the boattail adaptor interface that will connect the Centaur upper stage to the payload fairing arrives for NASA's upcoming Interior Exploration using Seismic Investigations, Geodesy and Heat Transport, or InSight, mission to land on Mars. InSight will liftoff atop a United Launch Alliance Atlas V rocket to send the spacecraft on the first mission to explore the Red Planet's deep interior. It will investigate processes that shaped the rocky planets of the inner solar system including Earth. Liftoff from Vandenberg is scheduled for May 5, 2018.

  17. A machine-learned computational functional genomics-based approach to drug classification.

    PubMed

    Lötsch, Jörn; Ultsch, Alfred

    2016-12-01

    The public accessibility of "big data" about the molecular targets of drugs and the biological functions of genes allows novel data science-based approaches to pharmacology that link drugs directly with their effects on pathophysiologic processes. This provides a phenotypic path to drug discovery and repurposing. This paper compares the performance of a functional genomics-based criterion to the traditional drug target-based classification. Knowledge discovery in the DrugBank and Gene Ontology databases allowed the construction of a "drug target versus biological process" matrix as a combination of "drug versus genes" and "genes versus biological processes" matrices. As a canonical example, such matrices were constructed for classical analgesic drugs. These matrices were projected onto a toroid grid of 50 × 82 artificial neurons using a self-organizing map (SOM). The distance and cluster structure of the high-dimensional feature space of the matrices was visualized on top of this SOM using a U-matrix. The cluster structure emerging on the U-matrix provided a correct classification of the analgesics into the two main classes of opioid and non-opioid analgesics. The classification was flawless with both the functional genomics and the traditional target-based criterion. The functional genomics approach inherently included the drugs' modulatory effects on biological processes. The main pharmacological actions known from pharmacological science were captured, e.g., actions on lipid signaling for non-opioid analgesics, which comprised many NSAIDs, and actions on neuronal signal transmission for opioid analgesics. Using machine-learned techniques for computational drug classification in a comparative assessment, a functional genomics-based criterion was found to be similarly suitable for drug classification as the traditional target-based criterion. This supports the utility of functional genomics-based approaches to computational system pharmacology for drug

  18. Quantification of soil structure based on Minkowski functions

    NASA Astrophysics Data System (ADS)

    Vogel, H.-J.; Weller, U.; Schlüter, S.

    2010-10-01

    The structure of soils and other geologic media is a complex three-dimensional object. Most of the physical material properties including mechanical and hydraulic characteristics are immediately linked to the structure given by the pore space and its spatial distribution. It is an old dream and still a formidable challenge to relate structural features of porous media to their functional properties. Using tomographic techniques, soil structure can be directly observed at a range of spatial scales. In this paper we present a scale-invariant concept to quantify complex structures based on a limited set of meaningful morphological functions. They are based on d+1 Minkowski functionals as defined for d-dimensional bodies. These basic quantities are determined as a function of pore size or aggregate size obtained by filter procedures using mathematical morphology. The resulting Minkowski functions provide valuable information on the size of pores and aggregates, the pore surface area and the pore topology having the potential to be linked to physical properties. The theoretical background and the related algorithms are presented and the approach is demonstrated for the pore structure of an arable soil and the pore structure of a sand both obtained by X-ray micro-tomography. We also analyze the fundamental problem of limited resolution which is critical for any attempt to quantify structural features at any scale using samples of different size recorded at different resolutions. The results demonstrate that objects smaller than 5 voxels are critical for quantitative analysis.
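
    In two dimensions the d+1 = 3 Minkowski functionals are the area, the boundary length, and the Euler characteristic. As a hedged miniature of the idea (the paper itself works on 3D tomograms and evaluates the functionals as functions of pore or aggregate size after morphological filtering), treating each foreground pixel as a closed unit square:

```python
def minkowski_2d(img):
    """Area, boundary length, and Euler characteristic of a binary image,
    with each foreground pixel treated as a closed unit square."""
    pixels = {(r, c) for r, row in enumerate(img)
              for c, v in enumerate(row) if v}
    verts, edge_count = set(), {}
    for (r, c) in pixels:
        corners = [(r, c), (r, c + 1), (r + 1, c + 1), (r + 1, c)]
        verts.update(corners)
        for i in range(4):  # the four sides of the square, endpoint-sorted
            e = tuple(sorted((corners[i], corners[(i + 1) % 4])))
            edge_count[e] = edge_count.get(e, 0) + 1
    area = len(pixels)
    boundary = sum(1 for n in edge_count.values() if n == 1)  # unshared sides
    euler = len(verts) - len(edge_count) + area                # V - E + F
    return area, boundary, euler
```

    For a 3×3 ring of pixels this yields an Euler characteristic of 0, reflecting the single hole; applying the same counts after successive morphological openings gives the Minkowski functions of pore size described in the paper.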

  19. Kernelized Locality-Sensitive Hashing for Fast Image Landmark Association

    DTIC Science & Technology

    2011-03-24

    based Simultaneous Localization and Mapping (SLAM). The problem, however, is that vision-based navigation techniques can require excessive amounts of...up and optimizing the data association process in vision-based SLAM. Specifically, this work studies the current methods that algorithms use to...required for location identification than that of other methods. This work can then be extended into a vision-SLAM implementation to subsequently
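
    As a hedged sketch of the plain (non-kernelized) locality-sensitive hashing idea this work builds on: random Gaussian hyperplanes quantize descriptor vectors into short binary codes, and vectors with a small angle between them agree in most bits. The kernelized variant replaces the explicit dot products below with kernel evaluations, which is omitted here; all names and parameters are illustrative.

```python
import random

def make_lsh(dim, n_bits, seed=0):
    """Build a cosine-LSH hash: one random Gaussian hyperplane per bit;
    each bit is the sign of the projection onto that hyperplane."""
    rng = random.Random(seed)
    planes = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n_bits)]
    def code(v):
        return tuple(int(sum(p * x for p, x in zip(plane, v)) >= 0.0)
                     for plane in planes)
    return code

def hamming(a, b):
    # fewer differing bits -> smaller angle between the original vectors
    return sum(x != y for x, y in zip(a, b))
```

    Landmark descriptors hashed this way can be compared by Hamming distance on short codes instead of exhaustively against every stored landmark, which is the source of the speedup in data association.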

  20. Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications

    PubMed Central

    Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui

    2017-01-01

    Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to biological or clinical problems that fail to be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require developing a new analytical framework to extract new biology. In this review article, we highlight a few biological and clinical applications in which microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interactions by creating a well-controlled on-chip microenvironment, capturing high-resolution snapshots of immune system function in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819

  1. Diagrammatic expansion for positive spectral functions beyond GW: Application to vertex corrections in the electron gas

    NASA Astrophysics Data System (ADS)

    Stefanucci, G.; Pavlyukh, Y.; Uimonen, A.-M.; van Leeuwen, R.

    2014-09-01

    We present a diagrammatic approach to construct self-energy approximations within many-body perturbation theory with positive spectral properties. The method cures the problem of negative spectral functions which arises from a straightforward inclusion of vertex diagrams beyond the GW approximation. Our approach consists of a two-step procedure: We first express the approximate many-body self-energy as a product of half-diagrams and then identify the minimal number of half-diagrams to add in order to form a perfect square. The resulting self-energy is an unconventional sum of self-energy diagrams in which the internal lines of half a diagram are time-ordered Green's functions, whereas those of the other half are anti-time-ordered Green's functions, and the lines joining the two halves are either lesser or greater Green's functions. The theory is developed using noninteracting Green's functions and subsequently extended to self-consistent Green's functions. Issues related to the conserving properties of diagrammatic approximations with positive spectral functions are also addressed. As a major application of the formalism we derive the minimal set of additional diagrams to make positive the spectral function of the GW approximation with lowest-order vertex corrections and screened interactions. The method is then applied to vertex corrections in the three-dimensional homogeneous electron gas by using a combination of analytical frequency integrations and numerical Monte Carlo momentum integrations to evaluate the diagrams.

  2. Free Web-based personal health records: an analysis of functionality.

    PubMed

    Fernández-Alemán, José Luis; Seva-Llor, Carlos Luis; Toval, Ambrosio; Ouhbi, Sofia; Fernández-Luque, Luis

    2013-12-01

    This paper analyzes and assesses the functionality of free Web-based PHRs as regards health information, user actions and connection with other tools. A systematic literature review in Medline, ACM Digital Library, IEEE Digital Library and ScienceDirect was used to select 19 free Web-based PHRs from the 47 PHRs identified. The results show that none of the PHRs selected met 100% of the 28 functions presented in this paper. Two free Web-based PHRs target a particular public. Around 90% of the PHRs identified allow users throughout the world to create their own profiles without any geographical restrictions. Only half of the PHRs selected provide physicians with user actions. Few PHRs can connect with other tools. There was considerable variability in the types of data included in free Web-based PHRs. Functionality may have implications for PHR use and adoption, particularly as regards patients with chronic illnesses or disabilities. Support for standard medical document formats and protocols is required to enable data to be exchanged with other stakeholders in the health care domain. The results of our study may assist users in selecting the PHR that best fits their needs, since no significant connection exists between the number of functions of the PHRs identified and their popularity.

  3. InSight Atlas V Boattail Halves Arrival, Offload, Mate

    NASA Image and Video Library

    2018-02-19

    At Vandenberg Air Force Base in California, the boattail adaptor interface that will connect the Centaur upper stage to the payload fairing is offloaded for NASA's upcoming Interior Exploration using Seismic Investigations, Geodesy and Heat Transport, or InSight, mission to land on Mars. InSight will liftoff atop a United Launch Alliance Atlas V rocket to send the spacecraft on the first mission to explore the Red Planet's deep interior. It will investigate processes that shaped the rocky planets of the inner solar system including Earth. Liftoff from Vandenberg is scheduled for May 5, 2018.

  4. Wavelet-based reversible watermarking for authentication

    NASA Astrophysics Data System (ADS)

    Tian, Jun

    2002-04-01

    In the digital information age, digital content (audio, image, and video) can be easily copied, manipulated, and distributed. Copyright protection and content authentication of digital content have become urgent problems for content owners and distributors. Digital watermarking has provided a valuable solution to this problem. Based on the application scenario, most digital watermarking methods can be divided into two categories: robust watermarking and fragile watermarking. As a special subset of fragile watermarking, a reversible watermark (also called a lossless, invertible, or erasable watermark) enables the recovery of the original, unwatermarked content after the watermarked content has been verified to be authentic. Such reversibility to get back the unwatermarked content is highly desired in sensitive imagery, such as military data and medical data. In this paper we present a reversible watermarking method based on an integer wavelet transform. We look into the binary representation of each wavelet coefficient and embed an extra bit into each expandable wavelet coefficient. The location map of all expanded coefficients is coded by JBIG2 compression, and these coefficient values are losslessly compressed by arithmetic coding. Besides these two compressed bit streams, an SHA-256 hash of the original image is also embedded for authentication purposes.
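
    The expansion step described above admits a compact illustration. The sketch below (illustrative Python, not the authors' implementation; it omits the expandability test and the JBIG2/arithmetic-coding stages) shows how appending one bit to a coefficient remains exactly invertible, and how an SHA-256 digest can serve as the authentication payload:

```python
import hashlib

def embed_bit(coeff: int, bit: int) -> int:
    # Shift the coefficient's binary representation left by one
    # and append the payload bit (Tian-style bit expansion).
    return (coeff << 1) | bit

def extract_bit(expanded: int) -> tuple:
    # Recover the payload bit and the original coefficient exactly.
    return expanded & 1, expanded >> 1

coeffs, payload = [5, -3, 12, 0], [1, 0, 1, 1]
marked = [embed_bit(c, b) for c, b in zip(coeffs, payload)]
assert [extract_bit(m)[1] for m in marked] == coeffs   # lossless recovery
assert [extract_bit(m)[0] for m in marked] == payload  # payload intact

# An SHA-256 digest of the original data serves as the authentication
# tag embedded alongside the compressed streams.
tag = hashlib.sha256(repr(coeffs).encode()).hexdigest()
```

    Python's arithmetic right shift makes the inverse exact for negative coefficients as well.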

  5. Effects of Bright Light Therapy on Sleep, Cognition, Brain Function, and Neurochemistry in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2014-01-30

    34poppers").      Cannabis:   marijuana , hashish ("hash"), THC, "pot", "grass", "weed", "reefer".        Tranquilizers:  Quaalude, Seconal ("reds"), Valium...How many times in the past year have you used marijuana ? _______________ Have you ever used marijuana at other times in your life? YES NO...If YES, at what age did you begin smoking marijuana ? ______________ On approximately how many occasions have you used marijuana ? __________ Do

  6. Effects of Bright Light Therapy on Sleep, Cognition, Brain Function, and Neurochemistry in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2015-01-01

    [Excerpt from the study's drug-use questionnaire.] Inhalants: "rush", nitrous oxide ("laughing gas"), amyl or butyl nitrate ("poppers"). Cannabis: marijuana, hashish ("hash"), THC, "pot", "grass", "weed". Items ask how many times in the past year the respondent has used marijuana, whether they have ever used it, at what age they began smoking it, on how many occasions they have used it, and whether they currently use other street drugs.

  7. A Spatially-Registered, Massively Parallelised Data Structure for Interacting with Large, Integrated Geodatasets

    NASA Astrophysics Data System (ADS)

    Irving, D. H.; Rasheed, M.; O'Doherty, N.

    2010-12-01

    The efficient storage, retrieval and interactive use of subsurface data present great challenges in geodata management. Data volumes are typically massive, complex and poorly indexed, with inadequate metadata. Derived geomodels and interpretations are often tightly bound in application-centric and proprietary formats; open standards for long-term stewardship are poorly developed. Consequently, current data storage is a combination of: complex Logical Data Models (LDMs) based on file storage formats; 2D GIS tree-based indexing of spatial data; and translations of serialised memory-based storage techniques into disk-based storage. Whilst adequate for working at the mesoscale over short timeframes, these approaches all possess technical and operational shortcomings: data model complexity; anisotropy of access; poor scalability to large and complex datasets; and weak implementation and integration of metadata. High-performance hardware such as parallelised storage and Relational Database Management Systems (RDBMS) has long been exploited in many solutions, but the underlying data structure must provide commensurate efficiencies to allow multi-user, multi-application and near-realtime data interaction. We present an open Spatially-Registered Data Structure (SRDS) built on a Massively Parallel Processing (MPP) database architecture implemented by an ANSI SQL:2008-compliant RDBMS. We propose an LDM comprising a 3D Earth model that is decomposed such that each increasing Level of Detail (LoD) is achieved by recursively halving the bin size until it is less than the error in each spatial dimension for that data point. The value of an attribute at that point is stored as a property of that point and at that LoD. It is key to the numerical efficiency of the SRDS that it is underpinned by a power-of-two relationship, thus precluding the need for computationally intensive floating point arithmetic. Our approach employed a tightly clustered MPP array with small clusters of storage
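
    The recursive bin-halving rule can be sketched in a few lines. This is a hypothetical illustration of the idea as stated in the abstract, not the SRDS implementation; the function name and parameters are invented for the example:

```python
def level_of_detail(extent: float, error: float) -> int:
    # Halve the bin size repeatedly until it falls below the
    # positional error of the data point; the number of halvings
    # is the Level of Detail at which the point is stored. Because
    # every bin size is the extent divided by a power of two,
    # spatial indexing reduces to integer shifts rather than
    # floating point arithmetic.
    lod, bin_size = 0, extent
    while bin_size >= error:
        bin_size /= 2.0
        lod += 1
    return lod

# A 1024 m model extent and a point located to within 1 m:
assert level_of_detail(1024.0, 1.0) == 11  # final bin size 0.5 m < 1 m
```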

  8. GOMA: functional enrichment analysis tool based on GO modules

    PubMed Central

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213

  9. AmpliVar: mutation detection in high-throughput sequence from amplicon-based libraries.

    PubMed

    Hsu, Arthur L; Kondrashova, Olga; Lunke, Sebastian; Love, Clare J; Meldrum, Cliff; Marquis-Nicholson, Renate; Corboy, Greg; Pham, Kym; Wakefield, Matthew; Waring, Paul M; Taylor, Graham R

    2015-04-01

    Conventional means of identifying variants in high-throughput sequencing align each read against a reference sequence, and then call variants at each position. Here, we demonstrate an orthogonal means of identifying sequence variation by grouping the reads as amplicons prior to any alignment. We used AmpliVar to make key-value hashes of sequence reads and group reads as individual amplicons using a table of flanking sequences. Low-abundance reads were removed according to a selectable threshold, and reads above this threshold were aligned as groups, rather than as individual reads, permitting the use of sensitive alignment tools. We show that this approach is more sensitive, more specific, and more computationally efficient than comparable methods for the analysis of amplicon-based high-throughput sequencing data. The method can be extended to enable alignment-free confirmation of variants seen in hybridization capture target-enrichment data. © 2015 WILEY PERIODICALS, INC.
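
    The grouping idea — reads keyed by their flanking sequences and collapsed before any alignment — can be sketched as follows (a toy illustration, not AmpliVar itself; the flank table and threshold are invented):

```python
from collections import defaultdict

def group_reads(reads, flanks):
    # flanks: amplicon name -> (left flank, right flank).
    # Reads whose ends match a flank pair are hashed under that
    # amplicon key; identical sequences collapse into one count.
    groups = defaultdict(lambda: defaultdict(int))
    for read in reads:
        for name, (left, right) in flanks.items():
            if read.startswith(left) and read.endswith(right):
                groups[name][read] += 1
                break
    return groups

flanks = {"amp1": ("ACGT", "TTGA")}
reads = ["ACGTCCTTGA", "ACGTCCTTGA", "ACGTCGTTGA", "GGGG"]
groups = group_reads(reads, flanks)

# Low-abundance sequences are dropped before the grouped alignment
# step; here a threshold of 2 keeps only the recurrent sequence.
abundant = {s: n for s, n in groups["amp1"].items() if n >= 2}
assert abundant == {"ACGTCCTTGA": 2}
```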

  10. Functional relationship-based alarm processing

    DOEpatents

    Corsberg, D.R.

    1987-04-13

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). 11 figs.

  11. Standardized reporting of functioning information on ICF-based common metrics.

    PubMed

    Prodinger, Birgit; Tennant, Alan; Stucki, Gerold

    2018-02-01

    In clinical practice, research, and national health information systems, a variety of clinical data collection tools are used to collect information on people's functioning. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores from two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected with the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of standardized reporting based on common metrics is demonstrated. A subset of items from the three tools linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life) were entered as "super items" into the Rasch model. Good fit was achieved with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric.
Being able to report functioning information collected with commonly used clinical data collection tools with ICF-based common metrics enables clinicians

  12. Weighted functional linear regression models for gene-based association analysis.

    PubMed

    Belonogova, Nadezhda M; Svishcheva, Gulnara R; Wilson, James F; Campbell, Harry; Axenovich, Tatiana I

    2018-01-01

    Functional linear regression models are effectively used in gene-based association analysis of complex traits. These models combine information about individual genetic variants, taking into account their positions and reducing the influence of noise and/or observation errors. To increase the power of methods in which several differently informative components are combined, weights are introduced to give the advantage to more informative components. Allele-specific weights have been introduced to collapsing and kernel-based approaches to gene-based association analysis. Here we have for the first time introduced weights to functional linear regression models adapted for both independent and family samples. Using data simulated on the basis of GAW17 genotypes and weights defined by allele frequencies via the beta distribution, we demonstrated that type I errors correspond to declared values and that increasing the weights of causal variants allows the power of functional linear models to be increased. We applied the new method to real data on blood pressure from the ORCADES sample. Five of the six known genes with P < 0.1 in at least one analysis had lower P values with weighted models. Moreover, we found an association between diastolic blood pressure and the VMP1 gene (P = 8.18 × 10⁻⁶) when we used a weighted functional model. For this gene, the unweighted functional and weighted kernel-based models had P = 0.004 and 0.006, respectively. The new method has been implemented in the program package FREGAT, which is freely available at https://cran.r-project.org/web/packages/FREGAT/index.html.
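
    Allele-frequency weights of the kind described can be computed from a beta density. The sketch below is illustrative only; the Beta(1, 25) parameters are a common choice in the kernel-association literature, not necessarily those used by the authors:

```python
from math import gamma

def beta_weight(maf: float, a: float = 1.0, b: float = 25.0) -> float:
    # Beta(a, b) density evaluated at the minor allele frequency;
    # with b > a this up-weights rare variants, which are often
    # assumed to be the more informative components.
    const = gamma(a + b) / (gamma(a) * gamma(b))
    return const * maf ** (a - 1) * (1.0 - maf) ** (b - 1)

rare, common = beta_weight(0.005), beta_weight(0.20)
assert rare > common  # rare variants receive larger weights
```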

  13. Structural and functional analysis of an anchorless fibronectin-binding protein FBPS from Gram-positive bacterium Streptococcus suis.

    PubMed

    Musyoki, Abednego Moki; Shi, Zhongyu; Xuan, Chunling; Lu, Guangwen; Qi, Jianxun; Gao, Feng; Zheng, Beiwen; Zhang, Qiangmin; Li, Yan; Haywood, Joel; Liu, Cuihua; Yan, Jinghua; Shi, Yi; Gao, George F

    2016-11-29

    The anchorless fibronectin-binding proteins (FnBPs) are a group of important virulence factors for which the structures are not available and the functions are not well defined. In this study we performed comprehensive studies on a prototypic member of this group: the fibronectin-/fibrinogen-binding protein from Streptococcus suis (FBPS). The structures of the N- and C-terminal halves (FBPS-N and FBPS-C), which together cover the full-length protein in sequence, were solved at resolutions of 2.1 and 2.6 Å, respectively, and each was found to be composed of two domains with unique folds. Furthermore, we have elucidated the organization of these domains by small-angle X-ray scattering. We further showed that the fibronectin-binding site is located in FBPS-C and that FBPS promotes the adherence of S. suis to host cells by attaching the bacteria via FBPS-N. Finally, we demonstrated that FBPS functions both as an adhesin, promoting S. suis attachment to host cells, and as a bacterial factor, activating signaling pathways via β1 integrin receptors to induce chemokine production.

  14. Breast Histopathological Image Retrieval Based on Latent Dirichlet Allocation.

    PubMed

    Ma, Yibing; Jiang, Zhiguo; Zhang, Haopeng; Xie, Fengying; Zheng, Yushan; Shi, Huaqiang; Zhao, Yu

    2017-07-01

    In the field of pathology, the whole slide image (WSI) has become the major carrier of visual and diagnostic information. Content-based image retrieval among WSIs can aid the diagnosis of an unknown pathological image by finding its similar regions in WSIs with diagnostic information. However, the huge size and complex content of WSIs pose several challenges for retrieval. In this paper, we propose an unsupervised, accurate, and fast retrieval method for breast histopathological images. Specifically, the method introduces a local statistical feature capturing the morphology and spatial distribution of nuclei, and employs the Gabor feature to describe the texture information. The latent Dirichlet allocation model is utilized for high-level semantic mining. Locality-sensitive hashing is used to speed up the search. Experiments on a WSI database with more than 8000 images from 15 types of breast histopathology demonstrate that our method achieves about 0.9 retrieval precision as well as promising efficiency. Based on the proposed framework, we are developing a search engine for an online digital slide browsing and retrieval platform, which can be applied in computer-aided diagnosis, pathology education, and WSI archiving and management.
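
    Locality-sensitive hashing of feature vectors can be illustrated with the classic random-hyperplane family (a generic sketch, not the paper's configuration; the dimensions and bit counts are arbitrary):

```python
import random

def lsh_signature(vec, hyperplanes):
    # Random-hyperplane LSH: each hyperplane contributes one bit,
    # the sign of the dot product. Similar feature vectors tend to
    # land in the same hash bucket, so search only compares
    # candidates that collide instead of scanning the database.
    bits = 0
    for h in hyperplanes:
        dot = sum(v * w for v, w in zip(vec, h))
        bits = (bits << 1) | (1 if dot >= 0 else 0)
    return bits

random.seed(0)
dim, n_bits = 8, 16
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

a = [1.0] * dim
b = [1.0] * 7 + [0.9]   # near-duplicate of a: signatures differ in few bits
c = [-1.0] * dim        # opposite vector: every signature bit flips
near = lsh_signature(b, planes)
assert lsh_signature(a, planes) != lsh_signature(c, planes)
```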

  15. A Polynomial Subset-Based Efficient Multi-Party Key Management System for Lightweight Device Networks

    PubMed Central

    Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah

    2017-01-01

    Wireless Sensor Networks (WSNs) consist of lightweight devices that measure sensitive data and are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) face severe security and privacy issues because of the direct accessibility of devices connected to the internet. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored the polynomial distribution-based key establishment schemes and identified the issue that the resultant polynomial values are either storage-intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, and the group head acts as a responsible node. The polynomial generation method uses security credentials and a secure hash function. Symmetric cryptographic parameters are efficient in computation, communication, and the storage required. The security justification of the proposed scheme has been completed by using Rubin logic, which guarantees that the protocol strongly attains mutual validation and the session key agreement property among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure. PMID:28338632
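
    The symmetric-polynomial session key idea can be sketched briefly. This is a generic Blundo-style illustration under invented parameters, not the EKM protocol itself:

```python
P = 2**31 - 1  # a public prime modulus (illustrative)

# Symmetric coefficient matrix, A[i][j] == A[j][i] (the group
# head's secret in this sketch).
A = [[5, 7, 11],
     [7, 3, 2],
     [11, 2, 9]]

def f(x, y):
    # Bivariate symmetric polynomial f(x, y) = sum A[i][j] x^i y^j mod P.
    return sum(A[i][j] * pow(x, i, P) * pow(y, j, P)
               for i in range(3) for j in range(3)) % P

def share(node_id):
    # Each member stores only the univariate share g_u(y) = f(u, y),
    # represented here as a closure.
    return lambda y: f(node_id, y)

g_u, g_v = share(17), share(42)
# Symmetry f(u, v) == f(v, u) gives both parties the same pairwise key
# without transmitting any key material.
assert g_u(42) == g_v(17)
```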

  16. Potential Functions and the Characterization of Economics-Based Information

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel

    2015-10-01

    The formulation of quantum mechanics as a diffusion process by Nelson (Phys Rev 150:1079-1085, 1966) provides for an interesting approach on how we may transit from classical mechanics into quantum mechanics. Besides the presence of the real potential function, another type of potential function (often denoted as `quantum potential') forms an intrinsic part of this theory. In this paper we attempt to show how both types of potential functions can have a use in a resolutely macroscopic context like financial asset pricing. We are particularly interested in uncovering how the `quantum potential' can add to the economics-based relevant information which is already supplied by the real potential function.

  17. Two Improved Access Methods on Compact Binary (CB) Trees.

    ERIC Educational Resources Information Center

    Shishibori, Masami; Koyama, Masafumi; Okada, Makoto; Aoe, Jun-ichi

    2000-01-01

    Discusses information retrieval and the use of binary trees as a fast access method for search strategies such as hashing. Proposes new methods based on compact binary trees that provide faster access and more compact storage, explains the theoretical basis, and confirms the validity of the methods through empirical observations. (LRW)

  18. Fabrication of functional PLGA-based electrospun scaffolds and their applications in biomedical engineering.

    PubMed

    Zhao, Wen; Li, Jiaojiao; Jin, Kaixiang; Liu, Wenlong; Qiu, Xuefeng; Li, Chenrui

    2016-02-01

    Electrospun PLGA-based scaffolds have been applied extensively in biomedical engineering, such as tissue engineering and drug delivery systems. Due to the lack of cell-recognition sites, hydrophobicity, and single functionality, the applications of PLGA fibrous scaffolds are limited. In order to tackle these issues, much work has been done to obtain functional PLGA-based scaffolds, including surface modifications and the fabrication of PLGA-based composite scaffolds and drug-loaded scaffolds. The functional PLGA-based scaffolds have significantly improved cell adhesion, attachment and proliferation. Moreover, the current study summarizes the applications of functional PLGA-based scaffolds in wound dressing, vascular and bone tissue engineering, as well as drug delivery systems. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computational functional genomics-based approaches in analgesic drug discovery and repurposing.

    PubMed

    Lippmann, Catharina; Kringel, Dario; Ultsch, Alfred; Lötsch, Jörn

    2018-06-01

    Persistent pain is a major healthcare problem affecting a fifth of adults worldwide with still limited treatment options. The search for new analgesics increasingly includes the novel research area of functional genomics, which combines data derived from various processes related to DNA sequence, gene expression or protein function and uses advanced methods of data mining and knowledge discovery with the goal of understanding the relationship between the genome and the phenotype. Its use in drug discovery and repurposing for analgesic indications has so far been performed using knowledge discovery in gene function and drug target-related databases; next-generation sequencing; and functional proteomics-based approaches. Here, we discuss recent efforts in functional genomics-based approaches to analgesic drug discovery and repurposing and highlight the potential of computational functional genomics in this field including a demonstration of the workflow using a novel R library 'dbtORA'.

  20. ARNetMiT R Package: association rules based gene co-expression networks of miRNA targets.

    PubMed

    Özgür Cingiz, M; Biricik, G; Diri, B

    2017-03-31

    miRNAs are key regulators that bind to target genes to suppress their gene expression level. The relations between miRNAs and target genes enable users to derive co-expressed genes that may be involved in similar biological processes and functions in cells. We hypothesize that target genes of miRNAs are co-expressed when they are regulated by multiple miRNAs. Using these co-expressed genes, we can in principle construct gene co-expression networks (GCNs) related to 152 diseases. In this study, we introduce ARNetMiT, which utilizes a hash-based association rule algorithm in a novel way to infer GCNs from miRNA-target gene data. We also present the ARNetMiT R package, which infers and visualizes GCNs of diseases selected by users. Our approach treats miRNAs as transactions and target genes as their items. Support and confidence values are used to prune association rules on miRNA-target gene data to construct support-based GCNs (sGCNs) along with support- and confidence-based GCNs (scGCNs). We use overlap analysis and topological features for the performance analysis of GCNs. We also infer GCNs with popular GNI algorithms for comparison with the GCNs of ARNetMiT. Overlap analysis results show that ARNetMiT outperforms the compared GNI algorithms. We see that using high confidence values in scGCNs increases the ratio of overlapped gene-gene interactions between the compared methods. According to the evaluation of the topological features of ARNetMiT-based GCNs, the node degrees follow a power-law distribution. The hub genes discovered by ARNetMiT-based GCNs are consistent with the literature.
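
    Treating miRNAs as transactions and their targets as items, the support/confidence pruning of gene pairs can be sketched as follows (toy data and thresholds, not ARNetMiT's actual algorithm):

```python
from itertools import combinations

# miRNAs as transactions, their target genes as items (toy data).
targets = {
    "miR-1": {"g1", "g2", "g3"},
    "miR-2": {"g1", "g2"},
    "miR-3": {"g2", "g3"},
    "miR-4": {"g1", "g2"},
}

def coexpression_edges(targets, min_support, min_confidence):
    # A gene pair becomes a co-expression edge when enough miRNAs
    # co-target it (support) and the rule a -> b is reliable
    # (confidence = support(a, b) / support(a)).
    n = len(targets)
    genes = set().union(*targets.values())
    edges = set()
    for a, b in combinations(sorted(genes), 2):
        both = sum(1 for t in targets.values() if a in t and b in t)
        only_a = sum(1 for t in targets.values() if a in t)
        if only_a and both / n >= min_support and both / only_a >= min_confidence:
            edges.add((a, b))
    return edges

# g1 and g2 are co-targeted by three of four miRNAs, so only that
# pair survives these thresholds.
edges = coexpression_edges(targets, min_support=0.5, min_confidence=0.9)
assert edges == {("g1", "g2")}
```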

  1. Tangible interactive system for document browsing and visualisation of multimedia data

    NASA Astrophysics Data System (ADS)

    Rytsar, Yuriy; Voloshynovskiy, Sviatoslav; Koval, Oleksiy; Deguillaume, Frederic; Topak, Emre; Startchik, Sergei; Pun, Thierry

    2006-01-01

    In this paper we introduce and develop a framework for interactive document navigation in multimodal databases. First, we analyze the main open issues of existing multimodal interfaces and then discuss two applications that include interaction with documents in several human environments, i.e., the so-called smart rooms. Second, we propose a system set-up dedicated to efficient navigation in printed documents. This set-up is based on the fusion of data from several modalities that include images and text. Both modalities can be used as cover data for hidden indexes using data-hiding technologies, as well as source data for robust visual hashing. The particularities of the proposed robust visual hashing are described in the paper. Finally, we address two practical applications of smart rooms for tourism and education and demonstrate the advantages of the proposed solution.

  2. A new method of cannabis ingestion: the dangers of dabs?

    PubMed

    Loflin, Mallory; Earleywine, Mitch

    2014-10-01

    A new method for administering cannabinoids, called butane hash oil ("dabs"), is gaining popularity among marijuana users. Despite press reports that suggest that "dabbing" is riskier than smoking flower cannabis, no data address whether dabs users experience more problems from use than those who prefer flower cannabis. The present study aimed to gather preliminary information on dabs users and test whether dabs use is associated with more problems than using flower cannabis. Participants (n=357) reported on their history of cannabis use, their experience with hash oil and the process of "dabbing," reasons for choosing "dabs" over other methods, and any problems related to both flower cannabis and butane hash oil. Analyses revealed that using "dabs" created no more problems or accidents than using flower cannabis. Participants did report that "dabs" led to higher tolerance and withdrawal (as defined by the participants), suggesting that the practice might be more likely to lead to symptoms of addiction or dependence. The use of butane hash oil has spread outside of the medical marijuana community, and users view it as significantly more dangerous than other forms of cannabis use. Published by Elsevier Ltd.

  3. Associations of SF-36 mental health functioning and work and family related factors with intentions to retire early among employees.

    PubMed

    Harkonmäki, K; Rahkonen, O; Martikainen, P; Silventoinen, K; Lahelma, E

    2006-08-01

    To examine the associations of mental health functioning (SF-36) and work and family related psychosocial factors with intentions to retire early. Cross sectional survey data (n = 5037) from the Helsinki Health Study occupational cohort in 2001 and 2002 were used. Intentions to retire early were elicited with the question: "Have you considered retiring before normal retirement age?" Mental health functioning was measured by the Short Form 36 (SF-36) mental component summary (MCS). Work and family related psychosocial factors included job demands and job control, procedural and relational justice, conflicts between work and family, and social network size. Multinomial regression models were used to analyse the data. Poor mental health functioning, unfavourable psychosocial working conditions, and conflicts between work and family were individually related to intentions to retire early. After adjustments for all work and family related factors the odds ratio for low mental health functioning was halved (from OR = 6.05 to 3.67), but the association between poor mental health functioning and strong intentions to retire early nevertheless remained strong. These findings highlight not only the importance of low mental health and unfavourable working conditions but also the simultaneous impact of conflicts between work and family on employees' intentions to retire early.

  4. Carbon-Based Functional Materials Derived from Waste for Water Remediation and Energy Storage.

    PubMed

    Ma, Qinglang; Yu, Yifu; Sindoro, Melinda; Fane, Anthony G; Wang, Rong; Zhang, Hua

    2017-04-01

    Carbon-based functional materials hold the key for solving global challenges in the areas of water scarcity and the energy crisis. Although carbon nanotubes (CNTs) and graphene have shown promising results in various fields of application, their high preparation cost and low production yield still dramatically hinder their wide practical applications. Therefore, there is an urgent call for preparing carbon-based functional materials from low-cost, abundant, and sustainable sources. Recent innovative strategies have been developed to convert various waste materials into valuable carbon-based functional materials. These waste-derived carbon-based functional materials have shown great potential in many applications, especially as sorbents for water remediation and electrodes for energy storage. Here, the research progress in the preparation of waste-derived carbon-based functional materials is summarized, along with their applications in water remediation and energy storage; challenges and future research directions in this emerging research field are also discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Concepts and practices used to develop functional PLGA-based nanoparticulate systems.

    PubMed

    Sah, Hongkee; Thoma, Laura A; Desu, Hari R; Sah, Edel; Wood, George C

    2013-01-01

    The functionality of bare polylactide-co-glycolide (PLGA) nanoparticles is limited to drug depot or drug solubilization in their hard cores. They have inherent weaknesses as a drug-delivery system. For instance, when administered intravenously, the nanoparticles undergo rapid clearance from systemic circulation before reaching the site of action. Furthermore, plain PLGA nanoparticles cannot distinguish between different cell types. Recent research shows that surface functionalization of nanoparticles and development of new nanoparticulate dosage forms help overcome these delivery challenges and improve in vivo performance. Immense research efforts have propelled the development of diverse functional PLGA-based nanoparticulate delivery systems. Representative examples include PEGylated micelles/nanoparticles (PEG, polyethylene glycol), polyplexes, polymersomes, core-shell-type lipid-PLGA hybrids, cell-PLGA hybrids, receptor-specific ligand-PLGA conjugates, and theranostics. Each PLGA-based nanoparticulate dosage form has specific features that distinguish it from other nanoparticulate systems. This review focuses on fundamental concepts and practices that are used in the development of various functional nanoparticulate dosage forms. We describe how the attributes of these functional nanoparticulate forms might contribute to achievement of desired therapeutic effects that are not attainable using conventional therapies. Functional PLGA-based nanoparticulate systems are expected to deliver chemotherapeutic, diagnostic, and imaging agents in a highly selective and effective manner.

  6. Concepts and practices used to develop functional PLGA-based nanoparticulate systems

    PubMed Central

    Sah, Hongkee; Thoma, Laura A; Desu, Hari R; Sah, Edel; Wood, George C

    2013-01-01

    The functionality of bare polylactide-co-glycolide (PLGA) nanoparticles is limited to drug depot or drug solubilization in their hard cores. They have inherent weaknesses as a drug-delivery system. For instance, when administered intravenously, the nanoparticles undergo rapid clearance from systemic circulation before reaching the site of action. Furthermore, plain PLGA nanoparticles cannot distinguish between different cell types. Recent research shows that surface functionalization of nanoparticles and development of new nanoparticulate dosage forms help overcome these delivery challenges and improve in vivo performance. Immense research efforts have propelled the development of diverse functional PLGA-based nanoparticulate delivery systems. Representative examples include PEGylated micelles/nanoparticles (PEG, polyethylene glycol), polyplexes, polymersomes, core-shell–type lipid-PLGA hybrids, cell-PLGA hybrids, receptor-specific ligand-PLGA conjugates, and theranostics. Each PLGA-based nanoparticulate dosage form has specific features that distinguish it from other nanoparticulate systems. This review focuses on fundamental concepts and practices that are used in the development of various functional nanoparticulate dosage forms. We describe how the attributes of these functional nanoparticulate forms might contribute to achievement of desired therapeutic effects that are not attainable using conventional therapies. Functional PLGA-based nanoparticulate systems are expected to deliver chemotherapeutic, diagnostic, and imaging agents in a highly selective and effective manner. PMID:23459088

  7. An ecologically valid performance-based social functioning assessment battery for schizophrenia.

    PubMed

    Shi, Chuan; He, Yi; Cheung, Eric F C; Yu, Xin; Chan, Raymond C K

    2013-12-30

    Psychiatrists nowadays pay increasing attention to the social functioning outcomes of schizophrenia. Evaluating real-world functioning in schizophrenia is a challenging task, and because of cultural differences no such instrument existed for the Chinese setting. This study aimed to report the validation of an ecologically valid performance-based everyday functioning assessment for schizophrenia, namely the Beijing Performance-based Functional Ecological Test (BJ-PERFECT). Fifty community-dwelling adults with schizophrenia and 37 healthy controls were recruited. Fifteen of the healthy controls were re-tested one week later. All participants were administered the University of California, San Diego, Performance-based Skill Assessment-Brief version (UPSA-B) and the MATRICS Consensus Cognitive Battery (MCCB). The finalized assessment included three subdomains: transportation, financial management, and work ability. The test-retest and inter-rater reliabilities were good. The total score correlated significantly with the UPSA-B. The performance of individuals with schizophrenia was significantly more impaired than that of healthy controls, especially in the domain of work ability. Among individuals with schizophrenia, functional outcome was influenced by premorbid functioning, negative symptoms, and neurocognition such as processing speed, visual learning, and attention/vigilance. © 2013 Elsevier Ireland Ltd. All rights reserved.

  8. Quantification of Soil Pore Structure Based on Minkowski-Functions

    NASA Astrophysics Data System (ADS)

    Vogel, H.; Weller, U.; Schlüter, S.

    2009-05-01

    The porous structure of soils and other geologic media is typically a complex three-dimensional object. Most physical material properties, including mechanical and hydraulic characteristics, are immediately linked to this structure, which can be observed directly using non-invasive techniques such as X-ray tomography. It is an old dream, and still a formidable challenge, to relate structural features of porous media to their physical properties. In this contribution we present a scale-invariant concept to quantify pore structure based on a limited set of meaningful morphological functions. They are based on the d+1 Minkowski functionals defined for d-dimensional bodies. These basic quantities are determined as a function of pore size obtained by filter procedures using mathematical morphology. The resulting Minkowski functions provide valuable information on pore size, pore surface area, and pore topology, and have the potential to be linked to physical properties. The theoretical background and the related algorithms are presented, and the approach is demonstrated on the structure of an arable topsoil obtained by X-ray microtomography. We also discuss the fundamental problem of limited resolution, which is critical for any attempt to quantify structural features at any scale.
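    For a 2-D binary image, the d+1 = 3 Minkowski functionals mentioned above are the area, the boundary length, and the Euler characteristic. A minimal sketch (not the authors' code; the function name and the 2x2 configuration-counting approach, Gray's algorithm, are my own illustrative choices):

```python
import numpy as np

def minkowski_2d(img):
    """Estimate the three 2-D Minkowski functionals of a binary image:
    area, boundary length (pixel-edge count), and Euler characteristic
    of the 4-connected foreground, via 2x2 configuration counting."""
    img = (np.asarray(img) != 0).astype(int)
    p = np.pad(img, 1)                       # zero-pad so the image border is handled
    area = int(img.sum())
    # boundary length: foreground/background transitions along both axes
    perim = int(np.abs(np.diff(p, axis=0)).sum() + np.abs(np.diff(p, axis=1)).sum())
    # count 2x2 neighbourhood configurations (Gray's algorithm)
    q = p[:-1, :-1] + p[:-1, 1:] + p[1:, :-1] + p[1:, 1:]   # foreground pixels per quad
    diag = ((p[:-1, :-1] == p[1:, 1:]) & (p[:-1, 1:] == p[1:, :-1])
            & (p[:-1, :-1] != p[:-1, 1:]))                  # checkerboard quads
    q1, q3, qd = int((q == 1).sum()), int((q == 3).sum()), int(diag.sum())
    euler = (q1 - q3 + 2 * qd) // 4          # Euler number, 4-connected foreground
    return area, perim, euler
```

    On a single isolated pore pixel this yields area 1, boundary length 4, and Euler number 1; on a ring-shaped pore, the Euler number drops to 0 because of the enclosed hole, which is the kind of topological information the abstract refers to.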

  9. Functional morphology of the luminescence system of Siphamia versicolor (Perciformes: Apogonidae), a bacterially luminous coral reef fish.

    PubMed

    Dunlap, Paul V; Nakamura, Masaru

    2011-08-01

    Previous studies of the luminescence system of Siphamia versicolor (Perciformes: Apogonidae) identified a ventral light organ, reflector, lens, duct, and a ventral diffuser extending from the throat to the caudal peduncle. The control and function of luminescence in this and other species of Siphamia, however, have not been defined. Morphological examination of fresh and preserved specimens identified additional components of the luminescence system involved in control and ventral emission of luminescence, including a retractable shutter over the ventral face of the light organ, contiguity of the ventral diffuser from the caudal peduncle to near the chin, and transparency of the bones and other tissues of the lower jaw. The shutter halves retract laterally, allowing the ventral release of light, and relax medially, blocking ventral light emission; topical application of norepinephrine to the exposed light organ resulted in retraction of the shutter halves, which suggests that operation of the shutter is under neuromuscular control. The extension of the diffuser to near the chin and transparency of the lower jaw allow a uniform emission of luminescence over the entire ventrum of the fish. The live aquarium-held fish were found to readily and consistently display ventral luminescence. At twilight, the fish left the protective association with their longspine sea urchin, Diadema setosum, and began to emit ventral luminescence and to feed on zooplankton. Ventral luminescence illuminated a zone below and around the fish, which typically swam close to the substrate. Shortly after complete darkness, the fish stopped feeding and emitting luminescence. These observations suggest that S. versicolor uses ventral luminescence to attract and feed on zooplankton from the reef benthos at twilight. Ventral luminescence may allow S. versicolor to exploit for feeding the gap at twilight in the presence of potential predators as the reef transitions from diurnally active to

  10. A function-based approach to cockpit procedure aids

    NASA Technical Reports Server (NTRS)

    Phatak, Anil V.; Jain, Parveen; Palmer, Everett

    1990-01-01

    The objective of this research is to develop and test a cockpit procedural aid that can compose and present procedures appropriate for the given flight situation, as defined by the status of the aircraft engineering systems and the environmental conditions. Prescribed procedures already exist for normal situations as well as for a number of non-normal and emergency situations, and these can be presented to the crew using an interactive cockpit display. However, no procedures are prescribed or recommended for a host of plausible flight situations involving multiple malfunctions compounded by adverse environmental conditions. Under these circumstances, the cockpit procedural aid must review the prescribed procedures for the individual malfunctions (when available), evaluate the alternatives or options, and present one or more composite procedures (prioritized or unprioritized) in response to the given situation. A top-down, function-based conceptual approach to composing and presenting cockpit procedures is being investigated. This approach is based upon the thought process that an operating crew must go through while attempting to meet the flight objectives in the current flight situation. In order to accomplish the flight objectives, certain critical functions must be maintained during each phase of the flight, using the appropriate procedures or success paths. The viability of these procedures depends upon the availability of required resources. If the available resources are not sufficient to meet the requirements, alternative procedures (success paths) using the available resources must be constructed to maintain the critical functions and the corresponding objectives. If no success path exists that can satisfy the critical functions/objectives, then the next level of critical functions/objectives must be selected and the process repeated. Information is given in viewgraph form.

  11. Architectural design of an Algol interpreter

    NASA Technical Reports Server (NTRS)

    Jackson, C. K.

    1971-01-01

    The design of a syntax-directed interpreter for a subset of Algol is described. It is a conceptual design with sufficient details and completeness but as much independence of implementation as possible. The design includes a detailed description of a scanner, an analyzer described in the Floyd-Evans productions, a hash-coded symbol table, and an executor. Interpretation of sample programs is also provided to show how the interpreter functions.
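    A hash-coded symbol table of the kind mentioned here maps identifiers to their attributes through a hash of the name. A minimal chained-bucket sketch in Python (the class and method names are illustrative; the report's actual design is not reproduced here):

```python
class SymbolTable:
    """Minimal hash-coded symbol table with chained buckets, of the kind
    an interpreter might use. Names and attributes are illustrative."""

    def __init__(self, size=64):
        self.buckets = [[] for _ in range(size)]

    def _index(self, name):
        # Python's built-in hash stands in for the interpreter's hash code
        return hash(name) % len(self.buckets)

    def declare(self, name, attributes):
        self.buckets[self._index(name)].append((name, attributes))

    def lookup(self, name):
        # later declarations shadow earlier ones, as in block-structured Algol
        for n, attrs in reversed(self.buckets[self._index(name)]):
            if n == name:
                return attrs
        raise KeyError(name)
```

    Hashing makes the average lookup cost independent of the number of declared identifiers, which is why hash-coded tables appear in interpreter designs of this era.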

  12. Functional Evolution of PLP-dependent Enzymes based on Active-Site Structural Similarities

    PubMed Central

    Catazaro, Jonathan; Caprez, Adam; Guru, Ashu; Swanson, David; Powers, Robert

    2014-01-01

    Families of distantly related proteins typically have very low sequence identity, which hinders evolutionary analysis and functional annotation. Slowly evolving features of proteins, such as an active site, are therefore valuable for annotating putative and distantly related proteins. To date, a complete evolutionary analysis of the functional relationship of an entire enzyme family based on active-site structural similarities has not yet been undertaken. Pyridoxal-5’-phosphate (PLP) dependent enzymes are primordial enzymes that diversified in the last universal ancestor. Using the Comparison of Protein Active Site Structures (CPASS) software and database, we show that the active site structures of PLP-dependent enzymes can be used to infer evolutionary relationships based on functional similarity. The enzymes successfully clustered together based on substrate specificity, function, and three-dimensional fold. This study demonstrates the value of using active site structures for functional evolutionary analysis and the effectiveness of CPASS. PMID:24920327

  13. Trait-based approaches for understanding microbial biodiversity and ecosystem functioning

    PubMed Central

    Krause, Sascha; Le Roux, Xavier; Niklaus, Pascal A.; Van Bodegom, Peter M.; Lennon, Jay T.; Bertilsson, Stefan; Grossart, Hans-Peter; Philippot, Laurent; Bodelier, Paul L. E.

    2014-01-01

    In ecology, biodiversity-ecosystem functioning (BEF) research has seen a shift in perspective from taxonomy to function in the last two decades, with successful application of trait-based approaches. This shift offers opportunities for a deeper mechanistic understanding of the role of biodiversity in maintaining multiple ecosystem processes and services. In this paper, we highlight studies that have focused on the BEF of microbial communities, with an emphasis on integrating trait-based approaches into microbial ecology. In doing so, we explore some of the inherent challenges and opportunities of understanding BEF using microbial systems. For example, microbial biologists characterize communities using gene phylogenies that are often unable to resolve functional traits. Additionally, the experimental designs of existing microbial BEF studies are often inadequate to unravel BEF relationships. We argue that combining eco-physiological studies with contemporary molecular tools in a trait-based framework can reinforce our ability to link microbial diversity to ecosystem processes. We conclude that such trait-based approaches are a promising framework to increase the understanding of microbial BEF relationships and thus to generate systematic principles in microbial ecology and, more generally, in ecology. PMID:24904563

  14. Significance of cannabis use to dental practice.

    PubMed

    Maloney, William James

    2011-04-01

    The illicit use of the three main forms of cannabis (marijuana, hash, and hash oil) poses certain obstacles and challenges to the dental professional. A number of systemic as well as oral/head and neck manifestations are associated with cannabis use. Dentists need to be aware of these manifestations so that any necessary precautions and/or modifications to the proposed treatment can be made.

  15. Portable Language-Independent Adaptive Translation from OCR. Phase 1

    DTIC Science & Technology

    2009-04-01

    including brute-force k-Nearest Neighbors (kNN), fast approximate kNN using hashed k-d trees, classification and regression trees, and locality… achieved by refinements in ground-truthing protocols. Recent algorithmic improvements to our approximate kNN classifier using hashed k-d trees allows… recent years discriminative training has been shown to outperform phonetic HMMs estimated using ML for speech recognition. Standard ML estimation

  16. Validation of the Female Sexual Function Index (FSFI) for web-based administration.

    PubMed

    Crisp, Catrina C; Fellner, Angela N; Pauls, Rachel N

    2015-02-01

    Web-based questionnaires are becoming increasingly valuable for clinical research. The Female Sexual Function Index (FSFI) is the gold standard for evaluating female sexual function; yet, it has not been validated in this format. We sought to validate the Female Sexual Function Index (FSFI) for web-based administration. Subjects enrolled in a web-based research survey of sexual function from the general population were invited to participate in this validation study. The first 151 respondents were included. Validation participants completed the web-based version of the FSFI followed by a mailed paper-based version. Demographic data were collected for all subjects. Scores were compared using the paired t test and the intraclass correlation coefficient. One hundred fifty-one subjects completed both web- and paper-based versions of the FSFI. Those subjects participating in the validation study did not differ in demographics or FSFI scores from the remaining subjects in the general population study. Total web-based and paper-based FSFI scores were not significantly different (mean 20.31 and 20.29 respectively, p = 0.931). The six domains or subscales of the FSFI were similar when comparing web and paper scores. Finally, intraclass correlation analysis revealed a high degree of correlation between total and subscale scores, r = 0.848-0.943, p < 0.001. Web-based administration of the FSFI is a valid alternative to the paper-based version.

  17. Adult Roles & Functions. Objective Based Evaluation System.

    ERIC Educational Resources Information Center

    West Virginia State Vocational Curriculum Lab., Cedar Lakes.

    This book of objective-based test items is designed to be used with the Adult Roles and Functions curriculum for a non-laboratory home economics course for grades eleven and twelve. It contains item banks for each cognitive objective in the curriculum. In addition, there is a form for the table of specifications to be developed for each unit. This…

  18. PCM-Based Durable Write Cache for Fast Disk I/O

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhuo; Wang, Bin; Carpenter, Patrick

    2012-01-01

    Flash-based solid-state devices (FSSDs) have been adopted within the memory hierarchy to improve the performance of hard disk drive (HDD) based storage systems. However, with the fast development of storage-class memories, new storage technologies with better performance and higher write endurance than FSSDs are emerging, e.g., phase-change memory (PCM). Understanding how to leverage these state-of-the-art storage technologies for modern computing systems is important to solve challenging data-intensive computing problems. In this paper, we propose to leverage PCM for a hybrid PCM-HDD storage architecture. We identify the limitations of traditional LRU caching algorithms for PCM-based caches, and develop a novel hash-based write caching scheme called HALO to improve random write performance of hard disks. To address the limited durability of PCM devices and solve the degraded spatial locality in traditional wear-leveling techniques, we further propose novel PCM management algorithms that provide effective wear-leveling while maximizing access parallelism. We have evaluated this PCM-based hybrid storage architecture using applications with a diverse set of I/O access patterns. Our experimental results demonstrate that the HALO caching scheme leads to an average reduction of 36.8% in execution time compared to the LRU caching scheme, and that the SFC wear leveling extends the lifetime of PCM by a factor of 21.6.
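    HALO's exact algorithm is described in the paper; as a generic illustration of the hash-based write caching idea, the sketch below (class and parameter names are hypothetical) buffers random block writes in hash-selected sets and flushes each full set in ascending block order, so the disk sees near-sequential batches instead of scattered writes:

```python
class HashWriteCache:
    """Illustrative hash-indexed write cache: random writes are buffered in a
    set-associative table keyed by block number, and a full set is flushed as
    one sorted batch. A generic sketch, not the HALO algorithm itself."""

    def __init__(self, nsets=4, set_capacity=4):
        self.nsets, self.cap = nsets, set_capacity
        self.sets = [dict() for _ in range(nsets)]
        self.flushed = []                     # stands in for the disk

    def write(self, block, data):
        s = self.sets[hash(block) % self.nsets]
        s[block] = data                       # absorb rewrites of the same block
        if len(s) >= self.cap:
            self._flush(s)

    def _flush(self, s):
        # flushing a whole set in ascending block order turns random
        # writes into a near-sequential pass over the disk
        self.flushed.extend(sorted(s.items()))
        s.clear()
```

    The two wins of such a scheme are write absorption (repeated writes to one block cost one flush) and sequentialized flush batches, which is where the reported speedup over plain LRU would come from.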

  19. Efficient fractal-based mutation in evolutionary algorithms from iterated function systems

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.; Aybar-Ruíz, A.; Camacho-Gómez, C.; Pereira, E.

    2018-03-01

    In this paper we present a new mutation procedure for Evolutionary Programming (EP) approaches, based on Iterated Function Systems (IFSs). The new mutation procedure proposed consists of considering a set of IFSs which are able to generate fractal structures in a two-dimensional phase space, and using them to modify a current individual of the EP algorithm, instead of using random numbers from different probability density functions. We test this new proposal on a set of benchmark functions for continuous optimization problems. In this case, we compare the proposed mutation against classical Evolutionary Programming approaches, with mutations based on Gaussian, Cauchy and chaotic maps. We also include a discussion of the IFS-based mutation in a real application of Tuned Mass Damper (TMD) location and optimization for vibration cancellation in buildings. In both practical cases, the proposed EP with the IFS-based mutation obtained extremely competitive results compared to alternative classical mutation operators.
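    The mutation idea can be sketched with the chaos game on a simple IFS. The maps below generate the Sierpinski-triangle attractor; using its points as mutation offsets is an illustrative stand-in for the paper's IFS choices and offset scheme, not the published operator:

```python
import random

# Sierpinski-triangle IFS: three contraction maps on the unit square
IFS = [lambda x, y: (0.5 * x, 0.5 * y),
       lambda x, y: (0.5 * x + 0.5, 0.5 * y),
       lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5)]

def ifs_point(n_iter=20, rng=random):
    """Generate one point of the IFS attractor via the chaos game:
    start anywhere in the unit square and iterate randomly chosen maps."""
    x, y = rng.random(), rng.random()
    for _ in range(n_iter):
        x, y = rng.choice(IFS)(x, y)
    return x, y

def ifs_mutate(individual, scale=1.0, rng=random):
    """Perturb each gene with an offset drawn from the fractal attractor
    instead of a Gaussian or Cauchy sample. Centring by 0.5 so the offset
    straddles zero is an illustrative choice, not the paper's."""
    out = []
    for gene in individual:
        dx, _ = ifs_point(rng=rng)
        out.append(gene + scale * (dx - 0.5))  # one attractor coordinate per gene
    return out
```

    The point of the construction is that the offsets follow the attractor's self-similar, non-uniform distribution, which is precisely what distinguishes this operator from Gaussian or Cauchy mutation.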

  20. Dentine Tubule Occlusion by Novel Bioactive Glass-Based Toothpastes

    PubMed Central

    Hill, Robert G.; Chen, Xiaojing

    2018-01-01

    There are numerous over-the-counter (OTC) and professionally applied (in-office) products and techniques currently available for the treatment of dentine hypersensitivity (DH), but more recently, the use of bioactive glasses in toothpaste formulations has been advocated as a possible solution to managing DH. Aim. The aim of the present study, therefore, was to compare several bioactive glass formulations to investigate their effectiveness in an established in vitro model. Materials and Methods. A 45S5 glass was synthesized in the laboratory together with several other glass formulations: (1) a mixed glass (fluoride and chloride), (2) BioMinF, (3) a chloride glass, and (4) an amorphous chloride glass. The glass powders were formulated into five different toothpaste formulations. Dentine discs were sectioned from extracted human teeth and prepared for the investigation by removing the cutting debris (smear layer) following sectioning using a 6% citric acid solution for 2 minutes. Each disc was halved to provide test and control halves for comparison, the five toothpaste formulations being brushed onto the test halves for each toothpaste group. Following the toothpaste application, the test discs were immersed in either artificial saliva or exposed to an acid challenge. Results. The dentine samples were analyzed using scanning electron microscopy (SEM), and observation of the SEM images indicated that there was good surface coverage following artificial saliva immersion. Furthermore, although the acid challenge removed the hydroxyapatite layer on the dentine surface for most of the samples, except for the amorphous chloride glass, there was evidence of tubular occlusion in the dentine tubules. Conclusions. The conclusions from the study would suggest that the inclusion of bioactive glass into a toothpaste formulation may be an effective approach to treat DH. PMID:29849637

  1. Growth Points in Linking Representations of Function: A Research-Based Framework

    ERIC Educational Resources Information Center

    Ronda, Erlina

    2015-01-01

    This paper describes five growth points in linking representations of function developed from a study of secondary school learners. Framed within the cognitivist perspective and process-object conception of function, the growth points were identified and described based on linear and quadratic function tasks learners can do and their strategies…

  2. Classifying Different Emotional States by Means of EEG-Based Functional Connectivity Patterns

    PubMed Central

    Lee, You-Yun; Hsieh, Shulan

    2014-01-01

    This study aimed to classify different emotional states by means of EEG-based functional connectivity patterns. Forty young participants viewed film clips that evoked the following emotional states: neutral, positive, or negative. Three connectivity indices, including correlation, coherence, and phase synchronization, were used to estimate brain functional connectivity in EEG signals. Following each film clip, participants were asked to report on their subjective affect. The results indicated that the EEG-based functional connectivity change was significantly different among emotional states. Furthermore, the connectivity pattern was detected by pattern classification analysis using Quadratic Discriminant Analysis. The results indicated that the classification rate was better than chance. We conclude that estimating EEG-based functional connectivity provides a useful tool for studying the relationship between brain activity and emotional states. PMID:24743695
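    Two of the three connectivity indices used here can be sketched in a few lines of NumPy; this is a generic illustration (the function names and the FFT-based analytic signal are my choices), not the authors' pipeline:

```python
import numpy as np

def connectivity_correlation(eeg):
    """eeg: channels x samples array. Returns the |Pearson r| matrix,
    a simple functional-connectivity estimate between channel pairs."""
    return np.abs(np.corrcoef(eeg))

def plv(eeg):
    """Phase-locking value between channel pairs, a phase-synchronization
    index. Phases come from an FFT-based analytic signal (a minimal
    stand-in for a Hilbert transform)."""
    n = eeg.shape[1]
    spec = np.fft.fft(eeg, axis=1)
    h = np.zeros(n)                      # analytic-signal filter weights
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    phase = np.angle(np.fft.ifft(spec * h, axis=1))
    z = np.exp(1j * phase)
    # PLV[a, b] = |mean_t exp(i (phi_a(t) - phi_b(t)))|
    return np.abs(z @ z.conj().T) / n
```

    Two channels with a fixed phase lag score a PLV near 1 even when their amplitudes differ, which is why phase synchronization complements plain correlation in such studies.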

  3. Functional Behavioral Assessment-Based Interventions. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2016

    2016-01-01

    This intervention report presents findings from a systematic review of "functional behavioral assessment-based interventions" conducted using the WWC Procedures and Standards Handbook, version 3.0, and the Children Identified With or At Risk for an Emotional Disturbance review protocol, version 3.0. Functional behavioral assessment (FBA)…

  4. An Integrative Object-Based Image Analysis Workflow for Uav Images

    NASA Astrophysics Data System (ADS)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm, applied after geometric calibration and radiometric correction, which performs fast feature extraction and matching by combining the local difference binary descriptor with locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.
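    Locality-sensitive hashing for binary descriptors can be as simple as bit sampling in Hamming space: each table keys a descriptor by a random subset of its bits, so near-duplicate descriptors tend to collide. A hedged sketch (parameter names and the bit-sampling family are my illustrative choices; the abstract does not specify the exact LSH variant used):

```python
import random
from collections import defaultdict

def build_lsh_tables(descriptors, n_tables=4, bits_per_key=12, n_bits=256, seed=0):
    """Bit-sampling LSH for binary descriptors represented as n_bits-bit ints.
    Each table hashes a descriptor by a random subset of its bit positions."""
    rng = random.Random(seed)
    masks = [rng.sample(range(n_bits), bits_per_key) for _ in range(n_tables)]
    tables = [defaultdict(list) for _ in range(n_tables)]
    for idx, d in enumerate(descriptors):
        for mask, table in zip(masks, tables):
            key = tuple((d >> b) & 1 for b in mask)   # sampled bits form the key
            table[key].append(idx)
    return masks, tables

def query(d, masks, tables):
    """Candidate matches: indices whose sampled bits agree in any table."""
    cands = set()
    for mask, table in zip(masks, tables):
        cands.update(table.get(tuple((d >> b) & 1 for b in mask), []))
    return cands
```

    Matching then only compares a query descriptor against the returned candidates rather than all descriptors, which is what makes feature matching fast during stitching.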

  5. Functional evolution of PLP-dependent enzymes based on active-site structural similarities.

    PubMed

    Catazaro, Jonathan; Caprez, Adam; Guru, Ashu; Swanson, David; Powers, Robert

    2014-10-01

    Families of distantly related proteins typically have very low sequence identity, which hinders evolutionary analysis and functional annotation. Slowly evolving features of proteins, such as an active site, are therefore valuable for annotating putative and distantly related proteins. To date, a complete evolutionary analysis of the functional relationship of an entire enzyme family based on active-site structural similarities has not yet been undertaken. Pyridoxal-5'-phosphate (PLP) dependent enzymes are primordial enzymes that diversified in the last universal ancestor. Using the comparison of protein active site structures (CPASS) software and database, we show that the active site structures of PLP-dependent enzymes can be used to infer evolutionary relationships based on functional similarity. The enzymes successfully clustered together based on substrate specificity, function, and three-dimensional fold. This study demonstrates the value of using active site structures for functional evolutionary analysis and the effectiveness of CPASS. © 2014 Wiley Periodicals, Inc.

  6. Hemispheric asymmetry of electroencephalography-based functional brain networks.

    PubMed

    Jalili, Mahdi

    2014-11-12

    Electroencephalography (EEG)-based functional brain networks have been investigated frequently in health and disease. It has been shown that a number of graph theory metrics are disrupted in brain disorders. EEG-based brain networks are often studied in a whole-brain framework, where all the nodes are grouped into a single network. In this study, we studied the brain networks of the two hemispheres separately and assessed whether there are any hemisphere-specific patterns in the properties of the networks. To this end, resting-state closed-eyes EEGs from 44 healthy individuals were processed and the network structures were extracted separately for each hemisphere. We examined neurophysiologically meaningful graph theory metrics: global and local efficiency measures. The global efficiency did not show any hemispheric asymmetry, whereas the local connectivity showed rightward asymmetry for a range of intermediate density values of the constructed networks. Furthermore, the age of the participants correlated significantly with the global efficiency of the left hemisphere, but in the right hemisphere only with local connectivity. These results suggest that only the local connectivity of EEG-based functional networks shows hemispheric asymmetry.

  7. Flight Crew Workload Evaluation Based on the Workload Function Distribution Method.

    PubMed

    Zheng, Yiyuan; Lu, Yanyu; Jie, Yuwen; Fu, Shan

    2017-05-01

    The minimum flight crew on the flight deck should be established according to the workload for individual crewmembers. Typical workload measures are of three types: subjective rating scales, task performance, and psychophysiological measures. However, all these measures have their own limitations. To reflect flight crew workload more specifically and comprehensively within the flight environment, and to comply more directly with airworthiness regulations, the Workload Function Distribution Method, which combines six basic workload functions, was proposed. The analysis was based on conditions with different numbers of workload functions. Each condition was analyzed from two aspects: the overall proportion and the effective proportion. Three types of approach task were used in this study, and the NASA-TLX scale was administered for comparison. Neither the one-function nor the two-function condition agreed with NASA-TLX, whereas both the three-function and the four- to six-function conditions did. Furthermore, in the four- to six-function conditions the overall proportion showed no significant differences, while the effective proportions did. The results show that conditions with one or two functions seemed to have no influence on workload, while executing three functions or four to six functions had an impact on workload. In addition, the effective proportions of workload functions indicated workload more precisely than the overall proportions, especially in conditions with multiple functions. Zheng Y, Lu Y, Jie Y, Fu S. Flight crew workload evaluation based on the workload function distribution method. Aerosp Med Hum Perform. 2017; 88(5):481-486.

  8. LMI-based stability analysis of fuzzy-model-based control systems using approximated polynomial membership functions.

    PubMed

    Narimani, Mohammand; Lam, H K; Dilmaghani, R; Wolfe, Charles

    2011-06-01

    Relaxed linear-matrix-inequality-based stability conditions for fuzzy-model-based control systems with imperfect premise matching are proposed. First, the derivative of the Lyapunov function, containing the product terms of the fuzzy model and fuzzy controller membership functions, is derived. Then, in the partitioned operating domain of the membership functions, the relations between the state variables and the mentioned product terms are represented by approximated polynomials in each subregion. Next, the stability conditions containing the information of all subsystems and the approximated polynomials are derived. In addition, the concept of the S-procedure is utilized to release the conservativeness caused by considering the whole operating region for approximated polynomials. It is shown that the well-known stability conditions can be special cases of the proposed stability conditions. Simulation examples are given to illustrate the validity of the proposed approach.

  9. Raman Optical Activity Spectra from Density Functional Perturbation Theory and Density-Functional-Theory-Based Molecular Dynamics.

    PubMed

    Luber, Sandra

    2017-03-14

    We describe the calculation of Raman optical activity (ROA) tensors from density functional perturbation theory, which has been implemented into the CP2K software package. Using the mixed Gaussian and plane waves method, ROA spectra are evaluated in the double-harmonic approximation. Moreover, an approach for the calculation of ROA spectra by means of density functional theory-based molecular dynamics is derived and used to obtain an ROA spectrum via time correlation functions, which paves the way for the calculation of ROA spectra taking into account anharmonicities and dynamic effects at ambient conditions.

  10. Nutritional approach for designing meat-based functional food products with nuts.

    PubMed

    Olmedilla-Alonso, B; Granado-Lorencio, F; Herrero-Barbudo, C; Blanco-Navarro, I

    2006-01-01

    Meat and meat products are essential components of diets in developed countries and despite the convincing evidence that relate them to an increased risk for CVD, a growing consumption of meat products is foreseen. Epidemiological studies show that regular consumption of nuts, in general, and walnuts in particular, correlates inversely with myocardial infarction and ischaemic vascular disease. We assess the nutritional basis for and technological approach to the development of functional meat-based products potentially relevant in cardiovascular disease (CVD) risk reduction. Using the available strategies in the meat industry (reformulation processes) and a food-based approach, we address the design and development of restructured beef steak with added walnuts, potentially functional for CVD risk reduction. Its adequacy as a vehicle for active nutrients is confirmed by a pharmacokinetic pilot study in humans using gamma-tocopherol as an exposure biomarker in chylomicrons during the post-prandial state. Effect and potential "functionality" is being assessed by a dietary intervention study in subjects at risk and markers and indicators related to CVD are being evaluated. Within the conceptual framework of evidence-based medicine, development of meat-based functional products may become a useful approach for specific applications, with a potential market and health benefits of great importance at a population level.

  11. The relationship between nature-based tourism and autonomic nervous system function among older adults.

    PubMed

    Chang, Liang-Chih

    2014-01-01

    Nature-based tourism has recently become a topic of interest in health research. This study was aimed at examining relationships among nature-based tourism, stress, and the function of the autonomic nervous system (ANS). Three hundred and twenty-two older adults living in Taichung City, Taiwan, were selected as participants. Data were collected by a face-to-face survey that included measures of the frequency of participation in domestic and international nature-based tourism and the stress and ANS function of these participants. The data were analyzed using a path analysis. The results demonstrated that the frequency of participation in domestic nature-based tourism directly contributed to ANS function and that it also indirectly contributed to ANS function through stress reduction. Domestic nature-based tourism can directly and indirectly contribute to ANS function among older adults. Increasing the frequency of participation in domestic nature-based tourism should be considered a critical element of health programs for older adults. © 2014 International Society of Travel Medicine.

  12. Non-covalently functionalized carbon nanostructures for synthesizing carbon-based hybrid nanomaterials.

    PubMed

    Li, Haiqing; Song, Sing I; Song, Ga Young; Kim, Il

    2014-02-01

    Carbon nanostructures (CNSs) such as carbon nanotubes, graphene sheets, and nanodiamonds provide an important type of substrate for constructing a variety of hybrid nanomaterials. However, their chemically inert surfaces make it indispensable to pre-functionalize them prior to immobilizing additional components onto their surfaces. Currently developed strategies for functionalizing CNSs include covalent and non-covalent approaches. Conventional covalent treatments often damage the structural integrity of carbon surfaces and adversely affect their physical properties. In contrast, the non-covalent approach offers a non-destructive way to modify CNSs with desired functional surfaces while preserving their intrinsic properties. Thus far, a number of surface modifiers, including aromatic compounds, small-molecule surfactants, amphiphilic polymers, and biomacromolecules, have been developed to non-covalently functionalize CNS surfaces. Mediated by these surface modifiers, various functional components such as organic species and inorganic nanoparticles have been further decorated onto their surfaces, resulting in versatile carbon-based hybrid nanomaterials with broad applications in chemical engineering and biomedical areas. This review covers the recent advances in the generation of such hybrid nanostructures based on non-covalently functionalized CNSs.

  13. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
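    The core idea described above — sampling a waveform over at least one half cycle and sorting the samples into a histogram that approximates the waveform's PDF — can be sketched as follows. This is an illustrative Python sketch (here over one full cycle of a 50 Hz tone); all function and parameter names are ours, not from the report.

    ```python
    import numpy as np

    def waveform_pdf(freq_hz, amplitude, duration_s, n_samples=20000, n_bins=32):
        """Sample a sinusoid over duration_s and build a normalized histogram
        approximating the PDF of its amplitude values during that interval."""
        t = np.linspace(0.0, duration_s, n_samples, endpoint=False)
        samples = amplitude * np.sin(2 * np.pi * freq_hz * t)
        counts, edges = np.histogram(samples, bins=n_bins,
                                     range=(-amplitude, amplitude))
        pdf = counts / (n_samples * np.diff(edges))  # normalize to a density
        return pdf, edges
    ```

    For a sinusoid sampled over a full cycle the histogram takes the characteristic U-shaped arcsine form, with the highest densities near the signal extremes, which is the signature the modulation scheme exploits.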

  14. MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts

    ERIC Educational Resources Information Center

    Jovanovic Dolecek, G.

    2012-01-01

    An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…
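    The quantity the tool demonstrates — the autocorrelation function of a wide-sense stationary sequence — can be estimated directly from samples. A minimal Python sketch of the biased ACF estimator (our own illustration, not part of the MATLAB tool):

    ```python
    import numpy as np

    def sample_acf(x, max_lag):
        """Biased sample autocorrelation estimate R[k] = (1/N) * sum x[n] x[n+k]
        for a zero-mean wide-sense stationary sequence."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        return np.array([np.dot(x[:n - k], x[k:]) / n
                         for k in range(max_lag + 1)])
    ```

    For white noise the estimate shows the textbook behavior the tool highlights: a peak at lag 0 equal to the variance and values near zero at all other lags.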

  15. Prostate-Specific Antigen Halving Time While on Neoadjuvant Androgen Deprivation Therapy Is Associated With Biochemical Control in Men Treated With Radiation Therapy for Localized Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, Renuka; Jani, Ashesh B.; Liauw, Stanley L., E-mail: sliauw@radonc.uchicago.ed

    2011-03-15

    Purpose: To assess whether the PSA response to neoadjuvant androgen deprivation therapy (ADT) is associated with biochemical control in men treated with radiation therapy (RT) for prostate cancer. Methods and Materials: In a cohort of men treated with curative-intent RT for localized prostate cancer between 1988 and 2005, 117 men had PSA values after the first and second months of neoadjuvant ADT. Most men had intermediate-risk (45%) or high-risk (44%) disease. PSA halving time (PSAHT) was calculated by first-order kinetics. Median RT dose was 76 Gy and median total duration of ADT was 4 months. Freedom from biochemical failure (FFBF, nadir + 2 definition) was analyzed by PSAHT and absolute PSA nadir before the start of RT. Results: Median follow-up was 45 months. Four-year FFBF was 89%. Median PSAHT was 2 weeks. A faster PSA decline (PSAHT ≤2 weeks) was associated with greater FFBF (96% vs. 81% for a PSAHT >2 weeks, p = 0.0110). Those within the fastest quartile of PSAHTs (≤10 days) achieved a FFBF of 100%. Among high-risk patients, a PSAHT ≤2 weeks achieved a 4-year FFBF of 93% vs. 70% for those with PSAHT >2 weeks (p = 0.0508). Absolute PSA nadir was not associated with FFBF. On multivariable analysis, PSAHT (p = 0.0093) and Gleason score (p = 0.0320) were associated with FFBF, whereas T-stage (p = 0.7363) and initial PSA level (p = 0.9614) were not. Conclusions: For men treated with combined ADT and RT, PSA response to the first month of ADT may be a useful criterion for prognosis and treatment modification.
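    The halving time above follows from first-order kinetics: if PSA decays as PSA(t) = PSA(0)·e^(−λt), then PSAHT = ln 2 / λ. A short sketch of the calculation (function name and example values are ours, for illustration only):

    ```python
    import math

    def psa_halving_time(psa_initial, psa_later, interval_days):
        """PSA halving time in days, assuming first-order (exponential) decay
        between two measurements taken interval_days apart."""
        if psa_later >= psa_initial:
            raise ValueError("PSA did not decline over the interval")
        decay_rate = math.log(psa_initial / psa_later) / interval_days
        return math.log(2) / decay_rate
    ```

    For example, a PSA falling from 20 to 5 ng/mL over 28 days has undergone two halvings, giving a halving time of 14 days.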

  16. Body Dysmorphic Symptoms, Functional Impairment, and Depression: The Role of Appearance-Based Teasing.

    PubMed

    Weingarden, Hilary; Renshaw, Keith D

    2016-01-01

    Body dysmorphic disorder is associated with elevated social and occupational impairment and comorbid depression, but research on risk factors for body dysmorphic symptoms and associated outcomes is limited. Appearance-based teasing may be a potential risk factor. To examine the specificity of this factor, the authors assessed self-reported appearance-based teasing, body dysmorphic, and obsessive-compulsive symptom severity, functional impairment (i.e., social, occupational, family impairment), and depression in a nonclinical sample of undergraduates. As hypothesized, appearance-based teasing was positively correlated with body dysmorphic symptoms. The correlation between teasing and body dysmorphic symptoms was stronger than that between teasing and obsessive-compulsive symptom severity. Last, body dysmorphic symptom severity and appearance-based teasing interacted in predicting functional impairment and depression. Specifically, appearance-based teasing was positively associated with depression and functional impairment only in those with elevated body dysmorphic symptoms. When a similar moderation was tested with obsessive-compulsive, in place of body dysmorphic, symptom severity, the interaction was nonsignificant. Findings support theory that appearance-based teasing is a specific risk factor for body dysmorphic symptoms and associated functional impairment.

  17. Distance-Based Functional Diversity Measures and Their Decomposition: A Framework Based on Hill Numbers

    PubMed Central

    Chiu, Chun-Huo; Chao, Anne

    2014-01-01

    Hill numbers (or the “effective number of species”) are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify “the effective number of equally abundant and (functionally) equally distinct species” in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional generalizations of

  18. Distance-based functional diversity measures and their decomposition: a framework based on Hill numbers.

    PubMed

    Chiu, Chun-Huo; Chao, Anne

    2014-01-01

    Hill numbers (or the "effective number of species") are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify "the effective number of equally abundant and (functionally) equally distinct species" in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional generalizations of the
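    The functional Hill number described above combines abundances with pairwise functional distances. The sketch below follows our reading of the Chiu–Chao formula for orders q ≠ 1, namely ^qD(Q) = [ Σ_ij (d_ij/Q)(p_i p_j)^q ]^(1/(2(1−q))) with Q the Rao quadratic entropy; treat it as an illustrative sketch rather than a reference implementation.

    ```python
    import numpy as np

    def functional_hill_number(p, d, q):
        """Functional Hill number of order q (q != 1): the effective number of
        equally abundant and functionally equally distinct species.
        p: relative abundances (summing to 1); d: pairwise distance matrix."""
        p = np.asarray(p, dtype=float)
        d = np.asarray(d, dtype=float)
        Q = float(p @ d @ p)            # Rao's quadratic entropy
        pp = np.outer(p, p)
        s = np.sum(d * pp ** q) / Q
        return s ** (1.0 / (2.0 * (1.0 - q)))
    ```

    Consistent with the reduction stated in the abstract, two equally abundant and equally distinct species yield an effective count of 2 at any order q.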

  19. Functional networks inference from rule-based machine learning models.

    PubMed

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, or complementary to, those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods, and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The

  20. A novel chaos-based image encryption algorithm using DNA sequence operations

    NASA Astrophysics Data System (ADS)

    Chai, Xiuli; Chen, Yiran; Broyde, Lucie

    2017-01-01

    An image encryption algorithm based on a chaotic system and deoxyribonucleic acid (DNA) sequence operations is proposed in this paper. First, the plain image is encoded into a DNA matrix, and then a new wave-based permutation scheme is performed on it. The chaotic sequences produced by a 2D Logistic chaotic map are employed for row circular permutation (RCP) and column circular permutation (CCP). Initial values and parameters of the chaotic system are calculated from the SHA-256 hash of the plain image and the given values. Then, a row-by-row image diffusion method at the DNA level is applied. A key matrix generated from the chaotic map is used to fuse the confused DNA matrix; the initial values and system parameters of the chaotic system are also renewed by the Hamming distance of the plain image. Finally, after decoding the diffused DNA matrix, we obtain the cipher image. The DNA encoding/decoding rules of the plain image and the key matrix are determined by the plain image. Experimental results and security analyses both confirm that the proposed algorithm not only achieves an excellent encryption result but also resists various typical attacks.
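    The plaintext-dependent keying step — deriving chaotic-map initial values and parameters from the SHA-256 hash of the plain image, so that any change to the image changes the key stream — can be illustrated as below. This is our own sketch of the general technique; the paper's exact derivation formula is not reproduced here.

    ```python
    import hashlib

    def chaos_params_from_image(image_bytes):
        """Derive a logistic-map initial value x0 in (0, 1) and parameter mu in
        the chaotic regime [3.57, 4.0) from the SHA-256 digest of the image."""
        digest = hashlib.sha256(image_bytes).digest()
        h1 = int.from_bytes(digest[:16], "big") / 2 ** 128   # in [0, 1)
        h2 = int.from_bytes(digest[16:], "big") / 2 ** 128   # in [0, 1)
        x0 = h1 if 0.0 < h1 < 1.0 else 0.5   # guard the degenerate endpoints
        mu = 3.57 + 0.43 * h2
        return x0, mu

    def logistic_stream(x0, mu, n):
        """First n iterates of the logistic map x <- mu * x * (1 - x),
        usable as a key-stream source for permutation/diffusion."""
        xs, x = [], x0
        for _ in range(n):
            x = mu * x * (1.0 - x)
            xs.append(x)
        return xs
    ```

    Because SHA-256 is collision resistant, two different plain images yield different seeds with overwhelming probability, which is what ties the cipher to the plaintext.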

  1. Quantum key management

    DOEpatents

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
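    The Merkle hash trees mentioned above let many one-time public keys be authenticated under a single root value. A minimal sketch of the root computation (our own illustration, not the patented implementation; unpaired nodes are simply promoted to the next level):

    ```python
    import hashlib

    def _h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        """Merkle tree root over a list of leaf values (e.g. one-time public
        keys in a Merkle signature scheme)."""
        level = [_h(leaf) for leaf in leaves]
        while len(level) > 1:
            nxt = [_h(level[i] + level[i + 1])
                   for i in range(0, len(level) - 1, 2)]
            if len(level) % 2:          # promote an unpaired node
                nxt.append(level[-1])
            level = nxt
        return level[0]
    ```

    A verifier who trusts only the root can then authenticate any single leaf from a logarithmic-length path of sibling hashes.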

  2. Functional relationship-based alarm processing

    DOEpatents

    Corsberg, Daniel R.

    1988-01-01

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). The alarm processing system and method is sensitive to the dynamic nature of the process being monitored and is capable of changing the relative importance of each alarm as necessary.

  3. Performance-based versus patient-reported physical function: what are the underlying predictors?

    PubMed

    Bean, Jonathan F; Olveczky, Daniele D; Kiely, Dan K; LaRose, Sharon I; Jette, Alan M

    2011-12-01

    Functional limitations have been operationally defined for studies of rehabilitation science through measures of physical performance and patient-reported function. Although conceived as representing similar concepts, differences between these 2 modes of measuring physical functioning have not been adequately characterized scientifically. The purpose of this study was to compare the Short Physical Performance Battery (SPPB) with the function component of the Late-Life Function and Disability Instrument (LLFDI) with respect to their association with physiologic factors and other psychosocial and health factors potentially influencing rehabilitative care. This study was a cross-sectional analysis of baseline data from a sample of community-dwelling older adults (N=137) with mobility limitations enrolled in a randomized controlled trial of exercise. A performance-based measure of function (the SPPB) and a self-report measure of function (the LLFDI) served as functional outcomes. Physiologic factors included measures of leg strength, leg velocity, and exercise tolerance test (ETT) duration, which served as a surrogate measure of aerobic capacity. Psychosocial and health factors included age, sex, height, body mass index, number of chronic conditions, depression, and falls efficacy. Separate multivariable regression models predicting SPPB and LLFDI scores described 33% and 42% of the variance in each outcome (R(2)), respectively. Leg velocity and ETT duration were positively associated with both performance-based and patient-reported functional measures. Leg strength and age were positively associated with SPPB scores, whereas number of chronic conditions, sex, and falls efficacy were associated with the LLFDI scores. This study included older adults with mobility limitations and may not generalize to other populations. Performance-based and patient-reported measures of physical function appear to assess different aspects of an older person's functioning. The SPPB was

  4. Status of MAPA (Modular Accelerator Physics Analysis) and the Tech-X Object-Oriented Accelerator Library

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.

    1998-04-01

    The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.
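    The parameter-storage pattern described above — each element keeps its relevant parameters in a hash table that a generic GUI can enumerate and edit without knowing the element type — might look like this in outline. MAPA itself is C++; this is a hypothetical Python sketch with names of our own invention.

    ```python
    class Element:
        """An accelerator element whose parameters live in a hash table
        (a dict here), so generic code can list and edit them."""

        def __init__(self, name, **params):
            self.name = name
            self.params = dict(params)   # parameter name -> value

        def set_param(self, key, value):
            """Update a parameter; reject names the element does not define."""
            if key not in self.params:
                raise KeyError(f"{self.name} has no parameter {key!r}")
            self.params[key] = value

    # A GUI needs only the dict's keys to build an editable parameter window.
    quad = Element("QF1", length=0.5, gradient=12.0)
    ```

    The payoff of the design is that a single parameter-window routine serves every element type, since all of them expose the same hash-table interface.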

  5. Fun cube based brain gym cognitive function assessment system.

    PubMed

    Zhang, Tao; Lin, Chung-Chih; Yu, Tsang-Chu; Sun, Jing; Hsu, Wen-Chuin; Wong, Alice May-Kuen

    2017-05-01

    The aim of this study is to design and develop a fun cube (FC) based brain gym (BG) cognitive function assessment system using the wireless sensor network and multimedia technologies. The system comprised (1) interaction devices, FCs and a workstation used as interactive tools for collecting and transferring data to the server, (2) a BG information management system responsible for managing the cognitive games and storing test results, and (3) a feedback system used for conducting the analysis of cognitive functions to assist caregivers in screening high risk groups with mild cognitive impairment. Three kinds of experiments were performed to evaluate the developed FC-based BG cognitive function assessment system. The experimental results showed that the Pearson correlation coefficient between the system's evaluation outcomes and the traditional Montreal Cognitive Assessment scores was 0.83. The average Technology Acceptance Model 2 score was close to six for 31 elderly subjects. Most subjects considered that the brain games are interesting and the FC human-machine interface is easy to learn and operate. The control group and the cognitive impairment group had statistically significant difference with respect to the accuracy of and the time taken for the brain cognitive function assessment games, including Animal Naming, Color Search, Trail Making Test, Change Blindness, and Forward / Backward Digit Span. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Effects of computer-based training on procedural modifications to standard functional analyses.

    PubMed

    Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.

  7. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    ERIC Educational Resources Information Center

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  8. Functional-Based Assessment of Social Behavior: Introduction and Overview.

    ERIC Educational Resources Information Center

    Lewis, Timothy J.; Sugai, George

    1994-01-01

    This introduction to and overview of a special issue on social behavior assessment within schools discusses the impact of function-based methodologies on assessment and intervention practices in identification and remediation of challenging social behaviors. (JDD)

  9. Distinct hippocampal functional networks revealed by tractography-based parcellation.

    PubMed

    Adnan, Areeba; Barnett, Alexander; Moayedi, Massieh; McCormick, Cornelia; Cohn, Melanie; McAndrews, Mary Pat

    2016-07-01

    Recent research suggests the anterior and posterior hippocampus form part of two distinct functional neural networks. Here we investigate the structural underpinnings of this functional connectivity difference using diffusion-weighted imaging-based parcellation. Using this technique, we substantiated that the hippocampus can be parcellated into distinct anterior and posterior segments. These structurally defined segments did indeed show different patterns of resting state functional connectivity, in that the anterior segment showed greater connectivity with temporal and orbitofrontal cortex, whereas the posterior segment was more highly connected to medial and lateral parietal cortex. Furthermore, we showed that the posterior hippocampal connectivity to memory processing regions, including the dorsolateral prefrontal cortex, parahippocampal, inferior temporal and fusiform gyri and the precuneus, predicted interindividual relational memory performance. These findings provide important support for the integration of structural and functional connectivity in understanding the brain networks underlying episodic memory.

  10. Towards fully automated structure-based function prediction in structural genomics: a case study.

    PubMed

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.

  11. A T Matrix Method Based upon Scalar Basis Functions

    NASA Technical Reports Server (NTRS)

    Mackowski, D.W.; Kahnert, F. M.; Mishchenko, Michael I.

    2013-01-01

    A surface integral formulation is developed for the T matrix of a homogeneous and isotropic particle of arbitrary shape, which employs scalar basis functions represented by the translation matrix elements of the vector spherical wave functions (VSWF). The formulation begins with the volume integral equation for scattering by the particle, which is transformed so that the vector and dyadic components in the equation are replaced with associated dipole and multipole level scalar harmonic wave functions. The approach leads to a volume integral formulation for the T matrix, which can be extended, by use of Green's identities, to the surface integral formulation. The result is shown to be equivalent to the traditional surface integral formulas based on the VSWF basis.

  12. The Creative Brain.

    ERIC Educational Resources Information Center

    Herrmann, Ned

    1982-01-01

    Outlines the differences between left-brain and right-brain functioning and between left-brain and right-brain dominant individuals, and concludes that creativity uses both halves of the brain. Discusses how both students and curriculum can become more "whole-brained." (Author/JM)

  13. MaMBA - a functional Moon and Mars Base Analog

    NASA Astrophysics Data System (ADS)

    Heinicke, C.; Foing, B.

    2017-09-01

    Despite impressive progress in robotic exploration of celestial bodies, robots are believed to never reach the effectiveness and efficiency of a trained human. Consequently, ESA proposes to build an international Moon Village in roughly 15 years and NASA plans for the first manned mission to Mars shortly after. One of the challenges still remaining is the need for a shelter, a habitat which allows human spacefarers to safely live and work on the surface of a celestial body. Although a number of prototype habitats have been built during the last decades and inhabited for various durations (e.g. MDRS, FMARS, HI-SEAS, M.A.R.S.), these habitats are typically equipped for studies on human factors and would not function in an extraterrestrial environment. Project MaMBA (Moon and Mars Base Analog) aims to build the first functional habitat based on the lessons learned from intermediate and long duration missions at the mentioned habitats. The habitat will serve for testing technologies like life support, power systems, and interplanetary communication. Special attention will be given to the development of the geoscience laboratory module. Crews will live and work inside the habitat to ensure its functionality.

  14. Graph-based network analysis of resting-state functional MRI.

    PubMed

    Wang, Jinhui; Zuo, Xinian; He, Yong

    2010-01-01

    In the past decade, resting-state functional MRI (R-fMRI) measures of brain activity have attracted considerable attention. Based on changes in the blood oxygen level-dependent signal, R-fMRI offers a novel way to assess the brain's spontaneous or intrinsic (i.e., task-free) activity with both high spatial and temporal resolutions. The properties of both the intra- and inter-regional connectivity of resting-state brain activity have been well documented, promoting our understanding of the brain as a complex network. Specifically, the topological organization of brain networks has been recently studied with graph theory. In this review, we will summarize the recent advances in graph-based brain network analyses of R-fMRI signals, both in typical and atypical populations. Application of these approaches to R-fMRI data has demonstrated non-trivial topological properties of functional networks in the human brain. Among these is the knowledge that the brain's intrinsic activity is organized as a small-world, highly efficient network, with significant modularity and highly connected hub regions. These network properties have also been found to change throughout normal development, aging, and in various pathological conditions. The literature reviewed here suggests that graph-based network analyses are capable of uncovering system-level changes associated with different processes in the resting brain, which could provide novel insights into the understanding of the underlying physiological mechanisms of brain function. We also highlight several potential research topics in the future.
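    Two of the graph-theoretic quantities behind the small-world characterization mentioned above — the clustering coefficient and the shortest path length — can be computed directly from an adjacency structure. A self-contained sketch (our own illustration; R-fMRI pipelines typically use dedicated graph libraries):

    ```python
    from collections import deque

    def clustering_coefficient(adj, node):
        """Fraction of a node's neighbor pairs that are themselves connected.
        adj maps each node to the set of its neighbors."""
        nbrs = adj[node]
        k = len(nbrs)
        if k < 2:
            return 0.0
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        return 2.0 * links / (k * (k - 1))

    def shortest_path_length(adj, src, dst):
        """Unweighted shortest path length via breadth-first search."""
        seen, queue = {src: 0}, deque([src])
        while queue:
            u = queue.popleft()
            if u == dst:
                return seen[u]
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    queue.append(v)
        return float("inf")
    ```

    A small-world network combines a high average clustering coefficient with a short characteristic path length relative to a random graph of the same size.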

  15. Simulation of random road microprofile based on specified correlation function

    NASA Astrophysics Data System (ADS)

    Rykov, S. P.; Rykova, O. A.; Koval, V. S.; Vlasov, V. G.; Fedotov, K. V.

    2018-03-01

    The paper aims to develop a numerical simulation method and an algorithm for a random microprofile of special roads based on a specified correlation function. The paper uses methods of correlation, spectral and numerical analysis. It proves that the transfer function of the generating filter, for known expressions of the filter's input and output spectral characteristics, can be calculated using a theorem on nonnegative fractional-rational factorization and integral transformation. The model of the random function equivalent to the real road surface microprofile enables us to assess suspension system parameters and identify their ranges of variation.
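    The generating-filter idea in this abstract can be illustrated with the simplest case: to produce a profile with the exponential correlation function R(τ) = σ²e^(−α|τ|), white noise is passed through the first-order recursion x[k+1] = a·x[k] + b·w[k], where a = e^(−αΔ) and b = σ√(1 − a²) for sampling step Δ. The parameter values below are illustrative, not taken from the paper.

    ```python
    import math
    import random

    def road_microprofile(n, sigma=1.0, alpha=0.5, dt=0.1, seed=0):
        """Generate a road microprofile whose correlation function is
        R(tau) = sigma^2 * exp(-alpha * |tau|), by driving a first-order
        shaping filter with Gaussian white noise."""
        rng = random.Random(seed)
        a = math.exp(-alpha * dt)           # filter pole set by correlation decay
        b = sigma * math.sqrt(1.0 - a * a)  # scales output variance to sigma^2
        x = rng.gauss(0.0, sigma)           # start in the stationary distribution
        profile = []
        for _ in range(n):
            profile.append(x)
            x = a * x + b * rng.gauss(0.0, 1.0)
        return profile

    profile = road_microprofile(50000)
    var = sum(h * h for h in profile) / len(profile)
    # The sample variance approaches sigma^2 = 1 for long profiles.
    ```

    Higher-order shaping filters, obtained by the spectral factorization the paper describes, extend the same construction to more realistic road spectra.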

  16. Validating Trial-Based Functional Analyses in Mainstream Primary School Classrooms

    ERIC Educational Resources Information Center

    Austin, Jennifer L.; Groves, Emily A.; Reynish, Lisa C.; Francis, Laura L.

    2015-01-01

    There is growing evidence to support the use of trial-based functional analyses, particularly in classroom settings. However, there currently are no evaluations of this procedure with typically developing children. Furthermore, it is possible that refinements may be needed to adapt trial-based analyses to mainstream classrooms. This study was…

  17. Atlas-based system for functional neurosurgery

    NASA Astrophysics Data System (ADS)

    Nowinski, Wieslaw L.; Yeo, Tseng T.; Yang, Guo L.; Dow, Douglas E.

    1997-05-01

    This paper addresses the development of an atlas-based system for preoperative functional neurosurgery planning and training, intraoperative support, and postoperative analysis. The system is based on the Atlas of Stereotaxy of the Human Brain by Schaltenbrand and Wahren, used for interactive segmentation and labeling of clinical data in 2D/3D and for assisting stereotactic targeting. The atlas microseries are digitized, enhanced, segmented, labeled, aligned, and organized into mutually preregistered atlas volumes; 3D models of the structures are also constructed. The atlas may be interactively registered with the actual patient's data. Several other features are also provided, including data reformatting, visualization, navigation, mensuration, and stereotactic path display and editing in 2D/3D. The system increases the accuracy of target definition and reduces both planning time and the duration of the procedure itself. It also constitutes a research platform for the construction of more advanced neurosurgery support tools and brain atlases.

  18. Performance-based Physical Functioning and Peripheral Neuropathy in a Population-based Cohort of Women at Midlife

    PubMed Central

    Ylitalo, Kelly R.; Herman, William H.; Harlow, Siobán D.

    2013-01-01

    Peripheral neuropathy is underappreciated as a potential cause of functional limitations. In the present article, we assessed the cross-sectional association between peripheral neuropathy and physical functioning and how the longitudinal association between age and functioning differed by neuropathy status. Physical functioning was measured in 1996–2008 using timed performances on stair-climb, walking, sit-to-stand, and balance tests at the Michigan site of the Study of Women's Health Across the Nation, a population-based cohort study of women at midlife (n = 396). Peripheral neuropathy was measured in 2008 and defined as having an abnormal monofilament test result or 4 or more symptoms. We used linear mixed models to determine whether trajectories of physical functioning differed by prevalent neuropathy status. Overall, 27.8% of the women had neuropathy. Stair-climb time differed by neuropathy status (P = 0.04), and for every 1-year increase in age, women with neuropathy had a 1.82% (95% confidence interval: 1.42, 2.21) increase compared with a 0.95% (95% confidence interval: 0.71, 1.20) increase for women without neuropathy. Sit-to-stand time differed by neuropathy status (P = 0.01), but the rate of change did not differ. No differences between neuropathy groups were observed for the walk test. For some performance-based tasks, poor functioning was maintained or exacerbated for women who had prevalent neuropathy. Peripheral neuropathy may play a role in physical functioning limitations and future disability. PMID:23524038

  19. A Robust and Efficient Quantum Private Comparison of Equality Based on the Entangled Swapping of GHZ-like State and χ + State

    NASA Astrophysics Data System (ADS)

    Xu, Ling; Zhao, Zhiwen

    2017-08-01

    A new quantum protocol with the assistance of a semi-honest third party (TP) is proposed, which allows the participants to compare the equality of their private information without disclosing it. Different from previous protocols, this protocol transmits the classical information using quantum key distribution that resists collective-dephasing noise and collective-rotation noise, making it more robust while discarding few samples. In addition, this protocol utilizes the GHZ-like state and the χ + state to produce the entanglement swapping. The Bell basis and the dual basis are used to measure the particle pairs, so that 3 bits of each participant's private information can be compared per round, which is more efficient and requires fewer comparison rounds. Meanwhile, neither unitary operations nor hash functions are needed in this protocol. Finally, various kinds of outside attacks and participant attacks are discussed and shown to be invalid, so the protocol can complete the comparison securely.

  20. Proof of cipher text ownership based on convergence encryption

    NASA Astrophysics Data System (ADS)

    Zhong, Weiwei; Liu, Zhusong

    2017-08-01

    Cloud storage systems save disk space and bandwidth through deduplication technology, but this technology has become the target of security attacks: an attacker who knows only a file's hash value can deceive the server into granting ownership of the original file. In order to solve this security problem, and to address the differing security requirements of files in a cloud storage system, an efficient information-theoretically secure proof-of-ownership scheme is proposed. The scheme protects the data through convergent encryption and uses an improved block-level proof-of-ownership protocol, enabling block-level client-side deduplication and thus an efficient and secure cloud storage deduplication scheme.
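    Convergent encryption, on which the scheme above builds, derives the encryption key deterministically from the file content itself (typically key = H(file)), so identical plaintexts encrypt to identical ciphertexts and the server can deduplicate without ever reading the data. The sketch below is a minimal stdlib-only illustration of that property; a real system would use AES in counter mode rather than the toy SHA-256 keystream used here.

    ```python
    import hashlib

    def convergent_encrypt(data):
        """Encrypt with a key derived from the content itself.
        Returns (key, ciphertext); equal plaintexts yield equal ciphertexts."""
        key = hashlib.sha256(data).digest()  # content-derived key
        out = bytearray()
        for block in range(0, len(data), 32):
            # Keystream block: hash of key + counter (toy stand-in for AES-CTR).
            ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
            chunk = data[block:block + 32]
            out.extend(c ^ k for c, k in zip(chunk, ks))
        return key, bytes(out)

    def convergent_decrypt(key, ct):
        # The XOR keystream is its own inverse, so decryption mirrors encryption.
        out = bytearray()
        for block in range(0, len(ct), 32):
            ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
            chunk = ct[block:block + 32]
            out.extend(c ^ k for c, k in zip(chunk, ks))
        return bytes(out)

    file_a = b"the same backup file uploaded by two different users"
    key1, ct1 = convergent_encrypt(file_a)
    key2, ct2 = convergent_encrypt(file_a)
    assert ct1 == ct2                            # server can deduplicate ciphertexts
    assert convergent_decrypt(key1, ct1) == file_a
    ```

    The attack the abstract describes follows from this design: if ownership is proved by the hash alone, anyone holding the hash can claim the file, which is why the scheme adds a block-level proof-of-ownership challenge on top of convergent encryption.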