Sample records for cryptographic hash function

  1. A cryptographic hash function based on chaotic network automata

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Bruno, Odemir M.

    2017-12-01

    Chaos theory has been used to develop several cryptographic methods relying on the pseudo-random properties extracted from simple nonlinear systems such as cellular automata (CA). Cryptographic hash functions (CHF) are commonly used to check data integrity. A CHF “compresses” an arbitrarily long message (input) into a much smaller representation called a hash value or message digest (output), designed to make it infeasible to recover the original message from the hash value. This paper proposes a chaos-based CHF inspired by an encryption method based on the chaotic CA rule B1357-S2468. We propose a hybrid model that combines CA and networks, called a chaotic network automaton (CNA), whose chaotic spatio-temporal output is used to compute a hash value. Following the Merkle–Damgård model of construction, a portion of the message is entered as the initial condition of the network automaton, and the remaining parts of the message are iteratively entered to perturb the system. The chaotic network automaton shuffles the message using flexible control parameters, so that the generated hash value is highly sensitive to the message. As demonstrated in our experiments, the proposed model has excellent pseudo-randomness and sensitivity properties with acceptable performance when compared to conventional hash functions.
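
    The construction just described follows the classic iterate-and-perturb pattern. Below is a minimal, hypothetical Python sketch of that Merkle–Damgård-style iteration, using a logistic map as a stand-in "compression" step; it is not the authors' CA rule B1357-S2468, and it makes no claim of collision resistance.

```python
# Toy Merkle-Damgard-style iteration with a chaotic map as the absorbing step.
# The logistic map is a hypothetical stand-in for the paper's chaotic network
# automaton; it only illustrates the structure: the first block seeds the
# state, the remaining blocks perturb the iteration.

def toy_chaotic_compress(state: float, block: bytes) -> float:
    """Absorb one message block by perturbing a logistic-map orbit."""
    for byte in block:
        state = (state + byte / 255.0) % 1.0 or 0.3141  # perturb with message byte
        for _ in range(8):                              # iterate the chaotic map
            state = 3.99 * state * (1.0 - state)
    return state

def toy_chaos_hash(message: bytes, block_size: int = 16) -> bytes:
    blocks = [message[i:i + block_size]
              for i in range(0, len(message), block_size)] or [b""]
    state = toy_chaotic_compress(0.5, blocks[0])        # first block: initial condition
    for block in blocks[1:]:                            # later blocks: perturbations
        state = toy_chaotic_compress(state, block)
    out = bytearray()                                   # squeeze 16 output bytes
    for _ in range(16):
        state = 3.99 * state * (1.0 - state)
        out.append(int(state * 256) % 256)
    return bytes(out)

print(toy_chaos_hash(b"hello world").hex())
print(toy_chaos_hash(b"hello worlds").hex())            # small change, very different digest
```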

  2. Collision attack against Tav-128 hash function

    NASA Astrophysics Data System (ADS)

    Hariyanto, Fajar; Hayat Susanti, Bety

    2017-10-01

    Tav-128 is a hash function designed for Radio Frequency Identification (RFID) authentication protocols. Tav-128 is expected to be a cryptographically secure hash function that meets the collision resistance property. In this research, a collision attack is performed to determine whether Tav-128 is a collision-resistant hash function. The results show that collisions can be obtained for Tav-128; in other words, Tav-128 is not a collision-resistant hash function.
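
    For intuition, a collision attack against any short-digest hash can proceed by a birthday search: with an n-bit digest, two colliding inputs are expected after roughly 2^(n/2) random trials. The sketch below runs such a search against a deliberately truncated SHA-256 as a stand-in weak hash; the Tav-128 internals are not reproduced here.

```python
# Generic birthday-style collision search against a truncated hash.
import hashlib, os

def weak_hash(msg: bytes, bits: int = 32) -> bytes:
    """Truncate SHA-256 to `bits` bits to simulate a weak hash function."""
    return hashlib.sha256(msg).digest()[:bits // 8]

def birthday_collision(bits: int = 32):
    seen = {}                               # digest -> message that produced it
    while True:
        msg = os.urandom(8)                 # random trial message
        d = weak_hash(msg, bits)
        if d in seen and seen[d] != msg:
            return seen[d], msg, d          # two distinct preimages, same digest
        seen[d] = msg

m1, m2, d = birthday_collision()            # expected after ~2**16 trials for 32 bits
print(m1.hex(), m2.hex(), "->", d.hex())
```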

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draelos, Timothy John; Dautenhahn, Nathan; Schroeppel, Richard Crabtree

    The security of the widely-used cryptographic hash function SHA1 has been impugned. We have developed two replacement hash functions. The first, SHA1X, is a drop-in replacement for SHA1. The second, SANDstorm, has been submitted as a candidate to the NIST-sponsored SHA3 Hash Function competition.

  4. [Linking anonymous databases for national and international multicenter epidemiological studies: a cryptographic algorithm].

    PubMed

    Quantin, C; Fassa, M; Coatrieux, G; Riandey, B; Trouessin, G; Allaert, F A

    2009-02-01

    Compiling individual records which come from different sources remains very important for multicenter epidemiological studies, but at the same time European directives or other national legislation concerning nominal data processing have to be respected. These legal aspects can be satisfied by implementing mechanisms that allow anonymization of patient data (such as hashing techniques). Moreover, for security reasons, official recommendations suggest using different cryptographic keys in combination with a cryptographic hash function for each study. Unfortunately, such an anonymization procedure is in contradiction with the common requirement in public health and biomedical research as it becomes almost impossible to link records from separate data collections where the same entity is not referenced in the same way. Solving this paradox by using methodology based on the combination of hashing and enciphering techniques is the main aim of this article. The method relies on one of the best known hashing functions (the secure hash algorithm) to ensure the anonymity of personal information while providing greater resistance to dictionary attacks, combined with encryption techniques. The originality of the method relies on the way the combination of hashing and enciphering techniques is performed: like in asymmetric encryption, two keys are used but the private key depends on the patient's identity. The combination of hashing and enciphering techniques provides a great improvement in the overall security of the proposed scheme. This methodology makes the stored data available for use in the field of public health for the benefit of patients, while respecting legal security requirements.
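
    The keying trade-off discussed above can be illustrated with a standard keyed hash; HMAC-SHA256 stands in here for the article's combined hashing and enciphering scheme. Distinct per-study keys make pseudonyms unlinkable, while a shared linkage key lets separate collections reference the same patient identically.

```python
# Keyed anonymization for record linkage, sketched with HMAC-SHA256 as a
# stand-in for the article's combined hashing/enciphering construction.
import hmac, hashlib

def pseudonym(identity: str, key: bytes) -> str:
    return hmac.new(key, identity.encode("utf-8"), hashlib.sha256).hexdigest()

patient = "DOE*JOHN*1970-01-01"                 # illustrative identity string
study_a_key, study_b_key = b"key-A", b"key-B"   # per-study keys (recommended practice)
linkage_key = b"shared-linkage-key"             # shared key enabling linkage

# Per-study keys: the same patient gets unlinkable pseudonyms.
print(pseudonym(patient, study_a_key) == pseudonym(patient, study_b_key))  # False

# Shared key: both collections derive the same pseudonym independently.
p_site_a = pseudonym(patient, linkage_key)      # computed at site A
p_site_b = pseudonym(patient, linkage_key)      # computed at site B
print(p_site_a == p_site_b)                     # True
```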

  5. On the concept of cryptographic quantum hashing

    NASA Astrophysics Data System (ADS)

    Ablayev, F.; Ablayev, M.

    2015-12-01

    In this letter we define the notion of a quantum resistant ((ε, δ)-resistant) hash function, which consists of a combination of the pre-image (one-way) resistance (ε-resistance) and collision resistance (δ-resistance) properties. We present examples and discussion that support the idea of quantum hashing. We present an explicit quantum hash function which is ‘balanced’, one-way resistant and collision resistant, and demonstrate how to build a large family of quantum hash functions. Balanced quantum hash functions need a high degree of entanglement between the qubits. We use a phase transformation technique to express quantum hashing constructions, which is an effective way of mapping hash states to coherent states in a superposition of time-bin modes. The phase transformation technique is ready to be implemented with current optical technology.

  6. A Simple Secure Hash Function Scheme Using Multiple Chaotic Maps

    NASA Astrophysics Data System (ADS)

    Ahmad, Musheer; Khurana, Shruti; Singh, Sushmita; AlSharari, Hamed D.

    2017-06-01

    Chaotic maps possess high parameter sensitivity, random-like behavior, and one-way computation, which favor the construction of cryptographic hash functions. In this paper, we present a novel hash function scheme which uses multiple chaotic maps to generate efficient variable-sized hash values. The message is divided into four parts; each part is processed by a different 1D chaotic map unit, yielding an intermediate hash code. The four codes are concatenated into two blocks, and each block is then processed through a 2D chaotic map unit separately. The final hash value is generated by combining the two partial hash codes. Simulation analyses such as the distribution of hashes, the statistical properties of confusion and diffusion, message and key sensitivity, collision resistance, and flexibility are performed. The results reveal that the proposed hash scheme is simple and efficient and holds capabilities comparable to some recent chaos-based hash algorithms.

  7. Secure Image Hash Comparison for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.

  8. Using the Hill Cipher to Teach Cryptographic Principles

    ERIC Educational Resources Information Center

    McAndrew, Alasdair

    2008-01-01

    The Hill cipher is the simplest example of a "block cipher," which takes a block of plaintext as input, and returns a block of ciphertext as output. Although it is insecure by modern standards, its simplicity means that it is well suited for the teaching of such concepts as encryption modes, and properties of cryptographic hash functions. Although…

  9. Implementation of cryptographic hash function SHA256 in C++

    NASA Astrophysics Data System (ADS)

    Shrivastava, Akash

    2012-02-01

    This abstract describes an implementation of the Secure Hash Algorithm SHA-256 in C++. SHA-2 is a strong hashing algorithm used in almost all kinds of security applications. The algorithm consists of two phases: preprocessing and hash computation. Preprocessing involves padding the message, parsing the padded message into m-bit blocks, and setting the initialization values to be used in the hash computation. The computation generates a message schedule from the padded message and uses that schedule, along with functions, constants, and word operations, to iteratively generate a series of hash values; the final hash value determines the message digest. SHA-2 includes a significant number of changes from its predecessor, SHA-1, and consists of a set of four hash functions with digests of 224, 256, 384, or 512 bits. SHA-256 outputs a 256-bit digest, with an internal state of 256 bits and a block size of 512 bits. The maximum message length is 2^64 − 1 bits, processed over a series of 64 rounds consisting of operations such as AND, OR, XOR, SHR, and ROTR. The code provides a clear understanding of the hash algorithm and generates hash values to retrieve the message digest.
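
    The preprocessing phase can be made concrete with a short sketch (in Python here, rather than the record's C++): pad with a 0x80 byte, zero-fill to 56 mod 64 bytes, append the 64-bit message length, and split into 512-bit blocks. hashlib supplies the reference digest; the 64-round compression loop itself is omitted.

```python
# SHA-256 preprocessing: pad the message, then parse it into 512-bit blocks.
import hashlib, struct

def sha256_pad(message: bytes) -> bytes:
    bit_len = 8 * len(message)
    padded = message + b"\x80"                     # append the single 1 bit
    padded += b"\x00" * ((56 - len(padded)) % 64)  # zero-fill to 56 mod 64 bytes
    padded += struct.pack(">Q", bit_len)           # 64-bit big-endian bit length
    return padded

msg = b"abc"
padded = sha256_pad(msg)
blocks = [padded[i:i + 64] for i in range(0, len(padded), 64)]
print(len(padded) % 64 == 0, len(blocks), "block(s)")   # True, 1 block
print(hashlib.sha256(msg).hexdigest())                  # reference digest
```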

  10. Robust hashing for 3D models

    NASA Astrophysics Data System (ADS)

    Berchtold, Waldemar; Schäfer, Marcel; Rettig, Michael; Steinebach, Martin

    2014-02-01

    3D models and applications are of utmost interest in both science and industry. As their use increases, so do their number and thereby the challenge of correctly identifying them. Content identification is commonly done with cryptographic hashes. However, these fail in application scenarios such as computer-aided design (CAD), scientific visualization, or video games, because even the smallest alteration of the 3D model, e.g. a conversion or compression operation, massively changes the cryptographic hash as well. Therefore, this work presents a robust hashing algorithm for 3D mesh data. The algorithm applies several different bit extraction methods, built to withstand desired alterations of the model as well as malicious attacks intended to prevent correct allocation. The different bit extraction methods are tested against each other and, as far as possible, the hashing algorithm is compared to the state of the art. The parameters tested are robustness, security, and runtime performance, as well as the False Acceptance Rate (FAR) and False Rejection Rate (FRR); a calculation of the probability of hash collision is also included. The introduced hashing algorithm is kept adaptive, e.g. in hash length, to serve as a proper tool for all applications in practice.
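
    Matching with a robust hash then reduces to a Hamming-distance threshold test, where moving the threshold trades the FAR against the FRR. A small illustration with made-up bit strings and an illustrative threshold:

```python
# Robust-hash matching as a Hamming-distance threshold decision.
def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

original   = 0b1011_0110_1100_0011
compressed = 0b1011_0110_1100_1011   # content-preserving change: 1 bit flipped
other      = 0b0100_1001_0011_1100   # unrelated model

THRESHOLD = 3                        # tighter -> lower FAR, higher FRR
for name, code in [("compressed copy", compressed), ("other model", other)]:
    verdict = "match" if hamming(original, code) <= THRESHOLD else "no match"
    print(f"{name}: distance {hamming(original, code)} -> {verdict}")
```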

  11. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol.

    PubMed

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags have been used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags must be overcome for future use. To address these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of a whole tag running authentication protocols, including its antenna, analog front end, and digital processing block, has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol at a reader-tag distance of 10 cm; with the standard hash function, the tag completes the protocol at a distance of 8.5 cm. We discuss the impact of the hash function on the tag's peak power consumption and thus on its operating distance.
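
    Hash-based mutual authentication protocols of this kind follow a familiar challenge-response shape, sketched below as a generic textbook exchange (not the authors' exact protocol): reader and tag share a secret, and each proves knowledge of it by hashing it with fresh nonces.

```python
# Generic hash-based challenge-response mutual authentication sketch.
import hashlib, os

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

K = os.urandom(16)                      # shared secret held by reader DB and tag

# 1. Reader -> Tag: challenge nonce
r_reader = os.urandom(8)
# 2. Tag -> Reader: its own nonce plus a response binding both nonces to K
r_tag = os.urandom(8)
tag_resp = h(K, r_reader, r_tag)
# 3. Reader checks the tag, then proves itself with a differently-ordered hash
assert tag_resp == h(K, r_reader, r_tag)        # reader authenticates tag
reader_resp = h(K, r_tag, r_reader)
# 4. Tag checks the reader
assert reader_resp == h(K, r_tag, r_reader)     # tag authenticates reader
print("mutual authentication complete")
```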

  12. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol

    PubMed Central

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags have been used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags must be overcome for future use. To address these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of a whole tag running authentication protocols, including its antenna, analog front end, and digital processing block, has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol at a reader-tag distance of 10 cm; with the standard hash function, the tag completes the protocol at a distance of 8.5 cm. We discuss the impact of the hash function on the tag's peak power consumption and thus on its operating distance. PMID:26491714

  13. Providing Cryptographic Security and Evidentiary Chain-of-Custody with the Advanced Forensic Format, Library, and Tools

    DTIC Science & Technology

    2008-08-19

    page%d/sha256: the segment for the SHA256 hash of the page. Bad sector management: badsectors: the number of bad sectors in the image. … As each page is written, AFFLIB can automatically compute the page’s MD5, SHA-1, and/or SHA256 hash and write an associated segment containing the hash value. … Hashes are written into segments themselves, with the segment name being name/sha256, where name is the original segment name and sha256 is the hash algorithm used.

  14. Implementation of 4-way Superscalar Hash MIPS Processor Using FPGA

    NASA Astrophysics Data System (ADS)

    Sahib Omran, Safaa; Fouad Jumma, Laith

    2018-05-01

    Due to rapid advancements in personal communications systems and wireless communications, providing data security has become an increasingly essential subject. This security concern becomes more complicated when next-generation system requirements and real-time computation speed are considered. Hash functions are among the most essential cryptographic primitives and are utilized in many fields for signature authentication and communication integrity. These functions produce a fixed-size unique fingerprint, or hash value, from a message of arbitrary length. In this paper, Secure Hash Algorithms (SHA) of types SHA-1, SHA-2 (SHA-224, SHA-256), and SHA-3 (BLAKE) are implemented on a Field-Programmable Gate Array (FPGA) in a processor structure. The design is described and implemented using a hardware description language, namely VHSIC “Very High Speed Integrated Circuit” Hardware Description Language (VHDL). Since the logical operations of the hash types (SHA-1, SHA-224, SHA-256, and SHA-3) are 32-bit, a superscalar Hash Microprocessor without Interlocked Pipelines (MIPS) processor is designed with only the few instructions required to invoke the desired hash algorithms. When the four hash algorithms are executed sequentially on the designed processor, the total time required is approximately 342 us, with a throughput of 4.8 Mbps; executing the same four hash algorithms on the designed four-way superscalar reduces this to 237 us and improves the throughput to 5.1 Mbps.

  15. Cryptographic framework for document-objects resulting from multiparty collaborative transactions.

    PubMed

    Goh, A

    2000-01-01

    Multiparty transactional frameworks--i.e. Electronic Data Interchange (EDI) or Health Level (HL) 7--often result in composite documents which can be accurately modelled using hyperlinked document-objects. The structural complexity arising from multiauthor involvement and transaction-specific sequencing would be poorly handled by conventional digital signature schemes based on a single evaluation of a one-way hash function and asymmetric cryptography. In this paper we outline the generation of structure-specific authentication hash-trees for the authentication of transactional document-objects, followed by asymmetric signature generation on the hash-tree value. Server-side multi-client signature verification would probably constitute the single most compute-intensive task, hence the motivation for our usage of the Rabin signature protocol, which results in significantly reduced verification workloads compared to the more commonly applied Rivest-Shamir-Adleman (RSA) protocol. Data privacy is handled via symmetric encryption of message traffic using session-specific keys obtained through key-negotiation mechanisms based on discrete-logarithm cryptography. Individual client-to-server channels can be secured using a double key-pair variation of Diffie-Hellman (DH) key negotiation, usage of which also enables bidirectional node authentication. The reciprocal server-to-client multicast channel is secured through Burmester-Desmedt (BD) key-negotiation, which enjoys significant advantages over the usual multiparty extensions to the DH protocol. The implementation of hash-tree signatures and bi/multidirectional key negotiation results in a comprehensive cryptographic framework for multiparty document-objects satisfying both authentication and data privacy requirements.
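
    The hash-tree idea can be sketched directly: leaf hashes of the component document-objects are combined pairwise up to a single root, and only the root value needs an asymmetric signature. The sketch below uses SHA-256 and a print placeholder where the paper's Rabin signature would be applied.

```python
# Hash-tree over document-objects: only the root needs an asymmetric signature.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                    # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

parts = [b"lab-report", b"radiology-note", b"billing-record", b"signature-page"]
root = merkle_root(parts)
print("sign this root value:", root.hex())    # e.g. with Rabin (as in the paper) or RSA
```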

  16. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval, and dispute protocols, analyzing their prerequisites, advantages, and disadvantages in relation to security requirements.

  17. Non-Black-Box Simulation from One-Way Functions and Applications to Resettable Security

    DTIC Science & Technology

    2012-11-05

    … from 2001, Barak (FOCS’01) introduced a novel non-black-box simulation technique. This technique enabled the construction of new cryptographic primitives, such as resettably-sound zero-knowledge arguments, that cannot be proven secure using just black-box simulation techniques. The work of Barak requires the existence of collision-resistant hash functions, and a very recent result by Bitansky and Paneth (FOCS’12) instead requires the …

  18. Enhanced K-means clustering with encryption on cloud

    NASA Astrophysics Data System (ADS)

    Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.

    2017-11-01

    This paper tries to solve the problem of storing and managing big files in the cloud by implementing hashing on Hadoop for big data, while ensuring security when uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates sharing infrastructure and resources [10]. Hadoop is open source software that gives us access to store and manage big files according to our needs in the cloud. The K-means clustering algorithm calculates the distance between the centroid of a cluster and the data points. Hashing is a technique in which we store and retrieve data with hash keys. The hashing algorithm, called a hash function, is used to map the original data and later to fetch the data stored at the specific key [17]. Encryption is a process that transforms electronic data into a non-readable form known as ciphertext. Decryption is the opposite process of encryption: it transforms the ciphertext into plain text that the end user can read and understand. For encryption and decryption we use a symmetric key cryptographic algorithm; in particular, we use the DES algorithm for secure storage of the files [3]

  19. Gencrypt: one-way cryptographic hashes to detect overlapping individuals across samples

    PubMed Central

    Turchin, Michael C.; Hirschhorn, Joel N.

    2012-01-01

    Summary: Meta-analysis across genome-wide association studies is a common approach for discovering genetic associations. However, in some meta-analysis efforts, individual-level data cannot be broadly shared by study investigators due to privacy and Institutional Review Board concerns. In such cases, researchers cannot confirm that each study represents a unique group of people, leading to potentially inflated test statistics and false positives. To resolve this problem, we created a software tool, Gencrypt, which utilizes a security protocol known as one-way cryptographic hashes to allow overlapping participants to be identified without sharing individual-level data. Availability: Gencrypt is freely available under the GNU general public license v3 at http://www.broadinstitute.org/software/gencrypt/ Contact: joelh@broadinstitute.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22302573
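
    The core idea can be sketched as follows (an illustration of the approach, not Gencrypt's actual code): each site applies the same keyed one-way hash to participant-level values and shares only the hashes, so overlap appears as a set intersection without individual-level data leaving either site.

```python
# Overlap detection via keyed one-way hashes, Gencrypt-style sketch.
import hmac, hashlib

def one_way(record: str, shared_key: bytes) -> str:
    return hmac.new(shared_key, record.encode(), hashlib.sha256).hexdigest()

key = b"study-consortium-key"                   # agreed upon by both studies
study1 = {"subj-A-genotypes", "subj-B-genotypes", "subj-C-genotypes"}
study2 = {"subj-C-genotypes", "subj-D-genotypes"}

hashes1 = {one_way(r, key) for r in study1}     # only these sets are exchanged
hashes2 = {one_way(r, key) for r in study2}
print(len(hashes1 & hashes2), "overlapping participant(s) detected")  # -> 1
```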

  20. Limitations and requirements of content-based multimedia authentication systems

    NASA Astrophysics Data System (ADS)

    Wu, Chai W.

    2001-08-01

    Recently, a number of authentication schemes have been proposed for multimedia data such as images and sound data. They include both label based systems and semifragile watermarks. The main requirement for such authentication systems is that minor modifications such as lossy compression which do not alter the content of the data preserve the authenticity of the data, whereas modifications which do modify the content render the data not authentic. These schemes can be classified into two main classes depending on the model of image authentication they are based on. One of the purposes of this paper is to look at some of the advantages and disadvantages of these image authentication schemes and their relationship with fundamental limitations of the underlying model of image authentication. In particular, we study feature-based algorithms which generate an authentication tag based on some inherent features in the image such as the location of edges. The main disadvantage of most proposed feature-based algorithms is that similar images generate similar features, and therefore it is possible for a forger to generate dissimilar images that have the same features. On the other hand, the class of hash-based algorithms utilizes a cryptographic hash function or a digital signature scheme to reduce the data and generate an authentication tag. It inherits the security of digital signatures to thwart forgery attacks. The main disadvantage of hash-based algorithms is that the image needs to be modified in order to be made authenticatable. The amount of modification is on the order of the noise the image can tolerate before it is rendered inauthentic. The other purpose of this paper is to propose a multimedia authentication scheme which combines some of the best features of both classes of algorithms. The proposed scheme utilizes cryptographic hash functions and digital signature schemes and the data does not need to be modified in order to be made authenticatable. Several applications including the authentication of images on CD-ROM and handwritten documents will be discussed.

  1. Improving the efficiency of quantum hash function by dense coding of coin operators in discrete-time quantum walk

    NASA Astrophysics Data System (ADS)

    Yang, YuGuang; Zhang, YuChen; Xu, Gang; Chen, XiuBo; Zhou, Yi-Hua; Shi, WeiMin

    2018-03-01

    Li et al. first proposed a quantum hash function (QHF) in a quantum-walk architecture. In their scheme, two two-particle interactions, i.e., the I interaction and the π-phase interaction, are introduced, and the choice of the I or π-phase interaction at each iteration depends on a message bit. In this paper, we propose an efficient QHF by dense coding of coin operators in discrete-time quantum walk. Compared with existing QHFs, our protocol has the following advantages: the efficiency of the QHF can be doubled and even more; and only one particle is needed, with no two-particle interactions, so that quantum resources are saved. This suggests applying the dense coding technique to quantum cryptographic protocols, especially applications with restricted quantum resources.

  2. Quantum-secured blockchain

    NASA Astrophysics Data System (ADS)

    Kiktenko, E. O.; Pozhar, N. O.; Anufriev, M. N.; Trushechkin, A. S.; Yunusov, R. R.; Kurochkin, Y. V.; Lvovsky, A. I.; Fedorov, A. K.

    2018-07-01

    Blockchain is a distributed database which is cryptographically protected against malicious modifications. While promising for a wide range of applications, current blockchain platforms rely on digital signatures, which are vulnerable to attacks by means of quantum computers. The same, albeit to a lesser extent, applies to cryptographic hash functions that are used in preparing new blocks, so parties with access to quantum computation would have unfair advantage in procuring mining rewards. Here we propose a possible solution to the quantum era blockchain challenge and report an experimental realization of a quantum-safe blockchain platform that utilizes quantum key distribution across an urban fiber network for information-theoretically secure authentication. These results address important questions about realizability and scalability of quantum-safe blockchains for commercial and governmental applications.

  3. Random multispace quantization as an analytic mechanism for BioHashing of biometric and random identity inputs.

    PubMed

    Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L

    2006-12-01

    Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.
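
    The Random Multispace Quantization step can be pictured as a token-seeded random projection followed by thresholding, so refreshing the token refreshes the BioHash. A minimal sketch with illustrative dimensions and seeds (the published scheme additionally orthonormalizes the projection basis):

```python
# Token-seeded random projection + thresholding (BioHash-style sketch).
import numpy as np

def biohash(features: np.ndarray, token_seed: int, n_bits: int = 16) -> np.ndarray:
    rng = np.random.default_rng(token_seed)           # token-derived randomness
    basis = rng.standard_normal((n_bits, features.size))
    return (basis @ features > 0).astype(np.uint8)    # one bit per projection

feature_vec = np.array([0.8, -1.2, 0.3, 2.1, -0.5])   # stand-in biometric features
print(biohash(feature_vec, token_seed=42))            # user's current BioHash
print(biohash(feature_vec, token_seed=99))            # reissued after compromise
```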

  4. A Secure and Robust User Authenticated Key Agreement Scheme for Hierarchical Multi-medical Server Environment in TMIS.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2015-09-01

    The telecare medicine information system (TMIS) helps patients gain health monitoring facilities at home and access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart-card-based user authentication and key agreement security protocol for TMIS using a cryptographic one-way hash function and a biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to its usage of a one-way hash function, we show that it has several security pitfalls and design flaws: (1) it fails to protect against privileged-insider attacks, (2) it fails to protect against strong replay attacks, (3) it fails to protect against strong man-in-the-middle attacks, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks support for a biometric update phase, and (8) it has flaws in its formal security analysis. To withstand these security pitfalls and design flaws, we propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable for TMIS, using a cryptographic one-way hash function and a fuzzy extractor. Through rigorous security analysis, including formal security analysis using the widely accepted Burrows-Abadi-Needham (BAN) logic, formal security analysis under the random oracle model, and informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool; the simulation results show that our scheme is also secure. Our scheme is more efficient in computation and communication than Amin-Biswas's scheme and other related schemes, and it supports extra functionality features compared to other related schemes. As a result, our scheme is very appropriate for practical applications in TMIS.

  5. 76 FR 11433 - Federal Transition To Secure Hash Algorithm (SHA)-256

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... generating digital signatures. Current information systems, Web servers, applications and workstation operating systems were designed to process, and use SHA-1 generated signatures. National Institute of... cryptographic keys, and more robust algorithms by December 2013. Government systems may begin to encounter...

  6. Authenticity techniques for PACS images and records

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Abundo, Marco; Huang, H. K.

    1995-05-01

    Along with the digital radiology environment supported by picture archiving and communication systems (PACS) comes a new problem: How to establish trust in multimedia medical data that exist only in the easily altered memory of a computer. Trust is characterized in terms of integrity and privacy of digital data. Two major self-enforcing techniques can be used to assure the authenticity of electronic images and text -- key-based cryptography and digital time stamping. Key-based cryptography associates the content of an image with the originator using one or two distinct keys and prevents alteration of the document by anyone other than the originator. A digital time stamping algorithm generates a characteristic `digital fingerprint' for the original document using a mathematical hash function, and checks that it has not been modified. This paper discusses these cryptographic algorithms and their appropriateness for a PACS environment. It also presents experimental results of cryptographic algorithms on several imaging modalities.

  7. Image Hashes as Templates for Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available out-of-IB) cannot be used to extract sensitive information about the imaged object is of primary concern. Thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity and robustness on the basis of a methodical and mathematically precise framework.
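
    For a concrete sense of "perceptual" as opposed to cryptographic hashing, the sketch below implements a toy average hash: downsample the image, then emit one bit per block by comparing to the mean intensity. Content-preserving noise flips few bits, while a different object yields a distant code. This is a textbook construction for illustration only, not the secured technique proposed in this record.

```python
# Toy average-hash: a simple perceptual hash over a 2D intensity array.
import numpy as np

def average_hash(img: np.ndarray, size: int = 8) -> np.ndarray:
    h, w = img.shape
    small = img[: h - h % size, : w - w % size]                 # crop to a multiple
    small = small.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (small > small.mean()).astype(np.uint8).ravel()      # 64-bit code

rng = np.random.default_rng(0)
image = rng.random((64, 64))
noisy = image + 0.01 * rng.standard_normal(image.shape)         # mild distortion
other = rng.random((64, 64))                                    # different object

print((average_hash(image) != average_hash(noisy)).sum(), "bits differ (noisy copy)")
print((average_hash(image) != average_hash(other)).sum(), "bits differ (other object)")
```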

  8. Manticore and CS mode : parallelizable encryption with joint cipher-state authentication.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree

    2004-10-01

    We describe a new mode of encryption with inexpensive authentication, which uses information from the internal state of the cipher to provide the authentication. Our algorithms have a number of benefits: (1) the encryption has properties similar to CBC mode, yet the encipherment and authentication can be parallelized and/or pipelined, (2) the authentication overhead is minimal, and (3) the authentication process remains resistant against some IV reuse. We offer a Manticore class of authenticated encryption algorithms based on cryptographic hash functions, which support variable block sizes up to twice the hash output length and variable key lengths. A proof of security is presented for the MTC4 and Pepper algorithms. We then generalize the construction to create the Cipher-State (CS) mode of encryption that uses the internal state of any round-based block cipher as an authenticator. We provide hardware and software performance estimates for all of our constructions and give a concrete example of the CS mode of encryption that uses AES as the encryption primitive and adds a small speed overhead (10-15%) compared to AES alone.

  9. SHAMROCK: A Synthesizable High Assurance Cryptography and Key Management Coprocessor

    DTIC Science & Technology

    2016-11-01

    … and excluding devices from a communicating group as they become trusted, or untrusted. An example of using rekeying to dynamically adjust group … algorithms, such as the Elliptic Curve Digital Signature Algorithm (ECDSA), work by computing a cryptographic hash of a message using, for example, the … material is based upon work supported by the Assistant Secretary of Defense for Research and Engineering under Air Force Contract No. FA8721-05-C

  10. Secure Hierarchical Multicast Routing and Multicast Internet Anonymity

    DTIC Science & Technology

    1998-06-01

    Multimedia, Summer 94, pages 76-79, 94. [15] David Chaum. Blind signatures for untraceable payments. In Proc. Crypto ’82, pages 199-203, 1982. [16] David L… use of digital signatures, which consist of a cryptographic hash of the message encrypted with the private key of the signer. Digitally-signed messages… signature on the request and on the certificate it contains. Notice that the location service need not retrieve the initiator’s public key as it is contained

  11. Protecting privacy in a clinical data warehouse.

    PubMed

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.

  12. Physical cryptographic verification of nuclear warheads

    PubMed Central

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-01-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times. PMID:27432959

  13. Physical cryptographic verification of nuclear warheads

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-08-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  14. Physical cryptographic verification of nuclear warheads.

    PubMed

    Kemp, R Scott; Danagoulian, Areg; Macdonald, Ruaridh R; Vavrek, Jayson R

    2016-08-02

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  15. Quantum key management

    DOEpatents

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
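
    The hash-based signature component can be pictured with the simplest member of that family. The sketch below implements a Lamport one-time signature; the Winternitz scheme named above is an optimization of the same idea, and a Merkle hash tree over many such one-time public keys yields a multi-use public key. This is an illustrative sketch, not the patented implementation.

```python
# Lamport one-time signature: reveal one of two hash preimages per digest bit.
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits: int = 256):
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]  # secret pairs
    pk = [(H(a), H(b)) for a, b in sk]                            # their hashes
    return sk, pk

def sign(message: bytes, sk) -> list:
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(sk))]
    return [pair[bit] for pair, bit in zip(sk, bits)]   # one preimage per bit

def verify(message: bytes, sig, pk) -> bool:
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(pk))]
    return all(H(s) == pair[bit] for s, pair, bit in zip(sig, pk, bits))

sk, pk = keygen()                       # each key pair must sign only ONE message
sig = sign(b"session key fingerprint", sk)
print(verify(b"session key fingerprint", sig, pk))   # True
print(verify(b"tampered message", sig, pk))          # False
```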

  16. Physical cryptographic verification of nuclear warheads

    DOE PAGES

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; ...

    2016-07-18

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. Finally, these techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  17. Enabling search over encrypted multimedia databases

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Swaminathan, Ashwin; Varna, Avinash L.; Wu, Min

    2009-02-01

    Performing information retrieval tasks while preserving data confidentiality is a desirable capability when a database is stored on a server maintained by a third-party service provider. This paper addresses the problem of enabling content-based retrieval over encrypted multimedia databases. Search indexes, along with multimedia documents, are first encrypted by the content owner and then stored onto the server. Through jointly applying cryptographic techniques, such as order preserving encryption and randomized hash functions, with image processing and information retrieval techniques, secure indexing schemes are designed to provide both privacy protection and rank-ordered search capability. Retrieval results on an encrypted color image database and security analysis of the secure indexing schemes under different attack models show that data confidentiality can be preserved while retaining very good retrieval performance. This work has promising applications in secure multimedia management.
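
    As an illustration of the keyed-hash side of such a design (a simplification, not the paper's full secure indexing scheme), the sketch below stores keyed-hash trapdoors of index terms so a server can answer queries without learning the underlying keywords.

```python
# Keyed-hash search index sketch: the server matches trapdoors blindly.
import hmac, hashlib

def trapdoor(term: str, key: bytes) -> str:
    return hmac.new(key, term.encode(), hashlib.sha256).hexdigest()

owner_key = b"content-owner-secret"
# Owner-side indexing: keyed hashes of index terms, uploaded with the data.
index = {trapdoor(t, owner_key): doc
         for t, doc in [("sunset", "img_001.enc"), ("beach", "img_002.enc")]}

# Query: the owner sends only the trapdoor; the server never sees "sunset".
query = trapdoor("sunset", owner_key)
print(index.get(query, "no match"))      # -> img_001.enc
```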

  18. Intrusion detection using secure signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Trent Darnel; Haile, Jedediah

    A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value, employing at least one one-way function, is generated from the captured network data using a first hash function. The presence of a matching search hash value in a secure signature table comprising search hash values and encrypted rules is determined. After a search hash value match is found, a decryption key is generated from the captured network data using a second hash function, different from the first hash function. One or more of the encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The one or more decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
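
    The two-hash mechanism described above can be sketched directly: the table stores only a search hash and an encrypted rule, and matching traffic itself regenerates the decryption key through a second, independent hash, so rules stay opaque unless triggering data is observed. The XOR keystream below is a toy stand-in for a real cipher, used only to keep the sketch self-contained.

```python
# Two-hash secure-signature lookup: one hash indexes, the other derives the key.
import hashlib

def h1(data: bytes) -> bytes:                     # first hash: table lookup key
    return hashlib.sha256(b"search|" + data).digest()

def h2(data: bytes) -> bytes:                     # second hash: decryption key
    return hashlib.sha256(b"key|" + data).digest()

def xor_cipher(key: bytes, text: bytes) -> bytes: # toy stand-in for a real cipher
    stream = b"".join(hashlib.sha256(key + bytes([i])).digest() for i in range(256))
    return bytes(a ^ b for a, b in zip(text, stream))

signature_data = b"GET /evil-payload"             # pattern the rule targets
rule = b"alert: known exploit attempt"
table = {h1(signature_data): xor_cipher(h2(signature_data), rule)}

captured = b"GET /evil-payload"                   # captured network data
if h1(captured) in table:                         # match regenerates the key
    print(xor_cipher(h2(captured), table[h1(captured)]).decode())
```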

  19. Robust hashing with local models for approximate similarity search.

    PubMed

    Song, Jingkuan; Yang, Yi; Li, Xuelong; Huang, Zi; Yang, Yang

    2014-07-01

    Similarity search plays an important role in many applications involving high-dimensional data. Due to the well-known curse of dimensionality, the performance of most existing indexing structures degrades quickly as the feature dimensionality increases. Hashing methods, such as locality sensitive hashing (LSH) and its variants, have been widely used to achieve fast approximate similarity search by trading search quality for efficiency. However, most existing hashing methods make use of randomized algorithms to generate hash codes without considering the specific structural information in the data. In this paper, we propose a novel hashing method, namely, robust hashing with local models (RHLM), which learns a set of robust hash functions to map the high-dimensional data points into binary hash codes by effectively utilizing local structural information. In RHLM, for each individual data point in the training dataset, a local hashing model is learned and used to predict the hash codes of its neighboring data points. The local models from all the data points are globally aligned so that an optimal hash code can be assigned to each data point. After obtaining the hash codes of all the training data points, we design a robust method by employing l2,1-norm minimization on the loss function to learn effective hash functions, which are then used to map each database point into its hash code. Given a query data point, the search process first maps it into the query hash code by the hash functions and then explores the buckets which have similar hash codes to the query hash code. Extensive experimental results conducted on real-life datasets show that the proposed RHLM outperforms the state-of-the-art methods in terms of search quality and efficiency.

  20. Perceptual Audio Hashing Functions

    NASA Astrophysics Data System (ADS)

    Özer, Hamza; Sankur, Bülent; Memon, Nasir; Anarım, Emin

    2005-12-01

    Perceptual hash functions provide a tool for fast and reliable identification of content. We present new audio hash functions based on summarization of the time-frequency spectral characteristics of an audio document. The proposed hash functions are based on the periodicity series of the fundamental frequency and on singular-value description of the cepstral frequencies. They are found, on one hand, to perform very satisfactorily in identification and verification tests, and on the other hand, to be very resilient to a large variety of attacks. Moreover, we address the issue of security of hashes and propose a keying technique, and thereby a key-dependent hash function.

  1. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption, in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information.

  2. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption, in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information. PMID:26823196

  3. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption, in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information.

  4. Collision analysis of one kind of chaos-based hash function

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Peng, Wenbing; Liao, Xiaofeng; Xiang, Tao

    2010-02-01

    In the last decade, various chaos-based hash functions have been proposed. Nevertheless, the corresponding analyses of them lag far behind. In this Letter, we first take a chaos-based hash function proposed very recently in Amin, Faragallah and Abd El-Latif (2009) [11] as a sample to analyze its computational collision problem, and then generalize the construction method of this kind of chaos-based hash function and summarize some points of attention for avoiding the collision problem. This is beneficial to future hash function design based on chaos.

  5. Query-Adaptive Reciprocal Hash Tables for Nearest Neighbor Search.

    PubMed

    Liu, Xianglong; Deng, Cheng; Lang, Bo; Tao, Dacheng; Li, Xuelong

    2016-02-01

    Recent years have witnessed the success of binary hashing techniques in approximate nearest neighbor search. In practice, multiple hash tables are usually built to cover more of the desired results in the hit buckets of each table. However, little work has studied a unified approach to constructing multiple informative hash tables with arbitrary hashing algorithms, and multiple-table search also lacks a generic query-adaptive and fine-grained ranking scheme that can alleviate the binary quantization loss suffered by standard hashing techniques. To solve these problems, in this paper we first regard table construction as a selection problem over a set of candidate hash functions. With the graph representation of the function set, we propose an efficient solution that sequentially applies normalized dominant set to finding the most informative and independent hash functions for each table. To further reduce the redundancy between tables, we explore reciprocal hash tables in a boosting manner, where the hash function graph is updated with high weights emphasized on the misclassified neighbor pairs of previous hash tables. To refine the ranking of the retrieved buckets within a certain Hamming radius from the query, we propose a query-adaptive bitwise weighting scheme that enables fine-grained bucket ranking in each hash table, exploiting the discriminative power of its hash functions and their complement for nearest neighbor search. Moreover, we integrate this scheme into multiple-table search using a fast yet reciprocal table lookup algorithm within the adaptive weighted Hamming radius. Both the construction method and the query-adaptive search method are general and compatible with different types of hashing algorithms using different feature spaces and/or parameter settings. Our extensive experiments on several large-scale benchmarks demonstrate that the proposed techniques significantly outperform both naive construction methods and state-of-the-art hashing algorithms.

  6. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research.

    PubMed

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong

    2015-11-10

    A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output of Phase 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash.
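
    Since the abstract spells out both phases, the following is a minimal Python sketch of them. It assumes a single shared random number r across fields and wrap-around n-gram extraction (both assumptions; the paper allows per-field ri), plus an illustrative alphabet and shift amount.

    ```python
    # Sketch of NHash's two phases as described in the abstract.
    # Assumptions: one shared random number r, wrap-around n-grams,
    # an uppercase+digit alphabet, and shift amount 7 for the shift cipher.
    import random
    import string

    ALPHABET = string.ascii_uppercase + string.digits

    def ngram(r: int, n: int, m: str) -> str:
        s = r % len(m)               # starting position s = (r mod |m|)
        return (m + m)[s:s + n]      # wrap around if the n-gram overruns m

    def shift_cipher(text: str, shift: int) -> str:
        return ''.join(ALPHABET[(ALPHABET.index(c) + shift) % len(ALPHABET)]
                       for c in text)

    def nhash(fields, shift=7) -> str:
        r = random.randrange(1, 10 ** 6)
        # Phase 1: concatenate the N-gram hashes f1(r, n1, m1) ... fk(r, nk, mk).
        intermediate = ''.join(ngram(r, n, m) for n, m in fields)
        # Phase 2: encrypt, then append the random number r.
        return shift_cipher(intermediate, shift) + str(r)

    # Hypothetical input fields (n, m): site code and cohort label.
    print(nhash([(3, "SITE04OHIO"), (4, "EPILEPSYCOHORT")]))
    ```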

  7. Spherical hashing: binary code embedding with hyperspheres.

    PubMed

    Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui

    2015-11-01

    Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search and compact data representations suitable for handling large-scale image databases. Existing binary code embedding techniques encode high-dimensional data using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, the spherical Hamming distance, tailored to our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks, with sizes ranging from one to 75 million GIST, BoW, and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second-best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement.
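
    The encoding itself is easy to sketch: bit i of a code is set when the point falls inside the i-th hypersphere. The Python/NumPy sketch below uses randomly drawn pivots with median-distance radii as a crude stand-in for the paper's iterative optimization, and implements the spherical Hamming distance as the Hamming distance normalized by the number of common set bits (our reading of the paper; treat the normalization as an assumption).

    ```python
    # Hypersphere-based binary encoding, with crude pivot selection standing in
    # for the paper's balanced/independent optimization of centers and radii.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 32))     # toy descriptors (e.g., GIST-like)
    k = 16                              # number of hyperspheres = code length

    pivots = X[rng.choice(len(X), size=k, replace=False)]
    # Median distance to each pivot -> each sphere holds ~half the points,
    # which approximates the balanced-partitioning goal.
    dists = np.linalg.norm(X[:, None, :] - pivots[None, :, :], axis=2)
    radii = np.median(dists, axis=0)

    def encode(v: np.ndarray) -> np.ndarray:
        return (np.linalg.norm(v - pivots, axis=1) <= radii).astype(np.uint8)

    def spherical_hamming(a: np.ndarray, b: np.ndarray) -> float:
        common = int(np.sum(a & b))     # spheres containing both points
        return float('inf') if common == 0 else int(np.sum(a ^ b)) / common

    a, b = encode(X[0]), encode(X[1])
    print(a, b, spherical_hamming(a, b))
    ```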

  8. Linear Subspace Ranking Hashing for Cross-Modal Retrieval.

    PubMed

    Li, Kai; Qi, Guo-Jun; Ye, Jun; Hua, Kien A

    2017-09-01

    Hashing has attracted a great deal of research in recent years due to its effectiveness for the retrieval and indexing of large-scale high-dimensional multimedia data. In this paper, we propose a novel ranking-based hashing framework that maps data from different modalities into a common Hamming space where cross-modal similarity can be measured using the Hamming distance. Unlike existing cross-modal hashing algorithms, where the learned hash functions are binary space partitioning functions such as the sign and threshold functions, the proposed hashing scheme takes advantage of a new class of hash functions closely related to rank correlation measures, which are known to be scale-invariant, numerically stable, and highly nonlinear. Specifically, we jointly learn two groups of linear subspaces, one for each modality, so that the features' ranking orders in different linear subspaces maximally preserve the cross-modal similarities. We show that the ranking-based hash function has a natural probabilistic approximation which transforms the original highly discontinuous optimization problem into one that can be efficiently solved using simple gradient descent algorithms. The proposed hashing framework is also flexible in the sense that the optimization procedures are not tied to any specific form of loss function, as is typical for existing cross-modal hashing methods; rather, we can flexibly accommodate different loss functions with minimal changes to the learning steps. We demonstrate through extensive experiments on four widely used real-world multimodal datasets that the proposed cross-modal hashing method achieves competitive performance against several state-of-the-art methods with only moderate training and testing time.

  9. Computing quantum hashing in the model of quantum branching programs

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Ablayev, Marat; Vasiliev, Alexander

    2018-02-01

    We investigate the branching program complexity of quantum hashing. We consider a quantum hash function that maps elements of a finite field into quantum states. We require that this function be preimage-resistant and collision-resistant. We consider two complexity measures for Quantum Branching Programs (QBP): the number of qubits and the number of computational steps. We show that the quantum hash function can be computed efficiently. Moreover, we prove that this QBP construction is optimal; that is, we prove lower bounds that match the complexity of the constructed quantum hash function computation.

  10. Neighborhood Discriminant Hashing for Large-Scale Image Retrieval.

    PubMed

    Tang, Jinhui; Li, Zechao; Wang, Meng; Zhao, Ruizhen

    2015-09-01

    With the proliferation of large-scale community-contributed images, hashing-based approximate nearest neighbor search in huge databases has aroused considerable interest in the computer vision and multimedia fields in recent years because of its computational and memory efficiency. In this paper, we propose a novel hashing method named neighborhood discriminant hashing (NDH) to implement approximate similarity search. Different from previous work, we propose to learn a discriminant hashing function by exploiting local discriminative information, i.e., the labels of a sample can be inherited from the neighbor samples it selects. The hashing function is expected to be orthogonal so as to avoid redundancy among the learned hashing bits as much as possible, while an information-theoretic regularization based on the maximum entropy principle is jointly exploited. As a consequence, the learned hashing function is compact and nonredundant among bits, while each bit is highly informative. Extensive experiments are carried out on four publicly available data sets, and the comparison results demonstrate the superior performance of the proposed NDH method over state-of-the-art hashing techniques.

  11. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g., fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template with a cryptographic key, so that the key is protected and can be accessed only through fingerprint verification. To accommodate the intrinsic fuzziness of varying fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template, which is then bound with the key after fingerprint registration and extraction of the fingerprint's global ridge pattern. The key itself is secure because only its hash value is stored, and it is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our approach.

  12. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.

    PubMed

    Lu, Xiaoqiang; Chen, Yaxiong; Li, Xuelong

    Hashing has been an important and effective technology in image retrieval due to its computational efficiency and fast search speed. Traditional hashing methods usually learn hash functions to obtain binary codes from hand-crafted features, which cannot optimally represent the information in the sample. Recently, deep learning methods have achieved better performance, since deep architectures can learn more effective image representation features. However, these methods use only semantic features to generate hash codes by shallow projection and ignore texture details. In this paper, we propose a novel hashing method, namely hierarchical recurrent neural hashing (HRNH), which exploits a hierarchical recurrent neural network to generate effective hash codes. This paper makes three contributions. First, a deep hashing method is proposed to extensively exploit both spatial details and semantic information, in which we leverage hierarchical convolutional features to construct an image pyramid representation. Second, our proposed deep network can directly exploit convolutional feature maps as input to preserve their spatial structure. Finally, we propose a new loss function that considers the quantization error of binarizing the continuous embeddings into discrete binary codes, while simultaneously maintaining the semantic similarity and balance property of the hash codes. Experimental results on four widely used data sets demonstrate that the proposed HRNH achieves superior performance over other state-of-the-art hashing methods.

  13. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research

    PubMed Central

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D

    2015-01-01

    Background A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output of Phase 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. Results We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). Conclusions The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash. PMID:26554419

  14. Hash function based on chaotic map lattices.

    PubMed

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating-point computation of chaos with some simple algebraic operations, the system achieves very high bit confusion and diffusion rates, which give it the desired statistical properties and strong collision resistance. The chaos-based hash function has advantages in high security and fast performance, and it is one of the most competitive candidates for practical hash function applications in software realization and secure information communication in computer networks.

  15. Hash function based on chaotic map lattices

    NASA Astrophysics Data System (ADS)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating-point computation of chaos with some simple algebraic operations, the system achieves very high bit confusion and diffusion rates, which give it the desired statistical properties and strong collision resistance. The chaos-based hash function has advantages in high security and fast performance, and it is one of the most competitive candidates for practical hash function applications in software realization and secure information communication in computer networks.

  16. On the balanced quantum hashing

    NASA Astrophysics Data System (ADS)

    Ablayev, F.; Ablayev, M.; Vasiliev, A.

    2016-02-01

    In this paper we define the notion of a resistant quantum hash function, which combines the notions of pre-image (one-way) resistance and collision resistance. In the quantum setting, the one-way resistance property and the collision resistance property are correlated: the “more” a quantum function is one-way resistant, the “less” it is collision resistant, and vice versa. We present an explicit quantum hash function that balances one-way resistance and collision resistance, and demonstrate how to build a large family of such balanced quantum hash functions.

  17. Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.

    PubMed

    Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong

    2017-05-01

    Hashing-based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that capture the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions to preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes and compromise by solving a relaxed problem with quantization to obtain an approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative across classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash functions and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.

  18. Apparatus, system and method for providing cryptographic key information with physically unclonable function circuitry

    DOEpatents

    Areno, Matthew

    2015-12-08

    Techniques and mechanisms for providing a value from physically unclonable function (PUF) circuitry for a cryptographic operation of a security module. In an embodiment, a cryptographic engine receives a value from PUF circuitry and, based on the value, outputs a result of a cryptographic operation to a bus of the security module. The bus couples the cryptographic engine to control logic or interface logic of the security module. In another embodiment, the value is provided to the cryptographic engine from the PUF circuitry via a signal line which is distinct from the bus, where any exchange of the value by either the cryptographic engine or the PUF circuitry occurs independently of the bus.

  19. Multimodal Discriminative Binary Embedding for Large-Scale Cross-Modal Retrieval.

    PubMed

    Wang, Di; Gao, Xinbo; Wang, Xiumei; He, Lihuo; Yuan, Bo

    2016-10-01

    Multimodal hashing, which conducts effective and efficient nearest neighbor search across heterogeneous data in large-scale multimedia databases, has been attracting increasing interest given the explosive growth of multimedia content on the Internet. Recent multimodal hashing research mainly aims at learning compact binary codes that preserve semantic information given by labels. The overwhelming majority of these methods are similarity-preserving approaches, which approximate a pairwise similarity matrix with Hamming distances between the to-be-learned binary hash codes. However, these methods ignore the discriminative property in the hash learning process, which leaves hash codes from different classes undistinguished and therefore reduces the accuracy and robustness of nearest neighbor search. To this end, we present a novel multimodal hashing method, named multimodal discriminative binary embedding (MDBE), which focuses on learning discriminative hash codes. First, the proposed method formulates hash function learning in terms of classification, where the binary codes generated by the learned hash functions are expected to be discriminative. Then, it exploits the label information to discover the shared structures inside heterogeneous data. Finally, the learned structures are preserved in the hash codes so as to produce similar binary codes within the same class. Hence, the proposed MDBE can preserve both discriminability and similarity in the hash codes, enhancing retrieval accuracy. Thorough experiments on benchmark data sets demonstrate that the proposed method achieves excellent accuracy and competitive computational efficiency compared with state-of-the-art methods for large-scale cross-modal retrieval tasks.

  20. A more secure parallel keyed hash function based on chaotic neural network

    NASA Astrophysics Data System (ADS)

    Huang, Zhongquan

    2011-08-01

    Although various hash functions based on chaos or chaotic neural networks have been proposed, most of them cannot work efficiently in parallel computing environments. Recently, an algorithm for parallel keyed hash function construction based on a chaotic neural network was proposed [13]. However, this scheme has a strict limitation: its secret keys must be nonces. In other words, if a key is used more than once in this scheme, there is a potential security flaw. In this paper, we analyze the cause of this vulnerability in detail and then propose corresponding enhancement measures that remove the limitation on the secret keys. Theoretical analysis and computer simulation indicate that the modified hash function is more secure and practical than the original one. At the same time, it retains the merit of parallelism and satisfies the other performance requirements of a hash function, such as good statistical properties, high message and key sensitivity, and strong collision resistance.

  1. Multiview alignment hashing for efficient image search.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2015-03-01

    Hashing is a popular and efficient method for nearest neighbor search in large-scale data spaces; it embeds high-dimensional feature descriptors into a similarity-preserving Hamming space of low dimension. For most hashing methods, retrieval performance depends heavily on the choice of the high-dimensional feature descriptor. Furthermore, a single type of feature cannot be descriptive enough for different images when it is used for hashing. Thus, how to combine multiple representations for learning effective hashing functions is a pressing problem. In this paper, we present a novel unsupervised multiview alignment hashing approach based on regularized kernel nonnegative matrix factorization, which can find a compact representation uncovering the hidden semantics while respecting the joint probability distribution of the data. In particular, we seek a matrix factorization that effectively fuses the multiple information sources while discarding feature redundancy. Since the resulting problem is nonconvex and discrete, our objective function is optimized via an alternating procedure with relaxation and converges to a locally optimal solution. After finding the low-dimensional representation, the hashing functions are obtained through multivariable logistic regression. The proposed method is systematically evaluated on three data sets: 1) Caltech-256; 2) CIFAR-10; and 3) CIFAR-20, and the results show that our method significantly outperforms state-of-the-art multiview hashing techniques.

  2. Gene function prediction based on Gene Ontology Hierarchy Preserving Hashing.

    PubMed

    Zhao, Yingwen; Fu, Guangyuan; Wang, Jun; Guo, Maozu; Yu, Guoxian

    2018-02-23

    Gene Ontology (GO) uses structured vocabularies (or terms) to describe the molecular functions, biological roles, and cellular locations of gene products in a hierarchical ontology. GO annotations associate genes with GO terms and indicate that the given gene products carry out the biological functions described by the relevant terms. However, predicting correct GO annotations for genes from the massive set of GO terms defined by GO is a difficult challenge. To address this challenge, we introduce a Gene Ontology Hierarchy Preserving Hashing (HPHash) based semantic method for gene function prediction. HPHash first measures the taxonomic similarity between GO terms. It then uses a hierarchy-preserving hashing technique to keep the hierarchical order between GO terms and to optimize a series of hashing functions that encode the massive set of GO terms via compact binary codes. After that, HPHash utilizes these hashing functions to project the gene-term association matrix into a low-dimensional one and performs semantic similarity based gene function prediction in the low-dimensional space. Experimental results on three model species (Homo sapiens, Mus musculus and Rattus norvegicus) for interspecies gene function prediction show that HPHash performs better than other related approaches and is robust to the number of hash functions. In addition, we also use HPHash as a plugin for BLAST-based gene function prediction; the experimental results show that HPHash again significantly improves the prediction performance. The code for HPHash is available at: http://mlda.swu.edu.cn/codes.php?name=HPHash.

  3. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input, with output displayed only on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm - 1 (SHA-1), the Message Digest - 5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate into the software confirmation tool we developed. This paper gives a brief overview of SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps by which the AES reduces a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), a mathematically unique representation or signature of the former that can be displayed on a computer's monitor. The paper starts with a simple definition and example to illustrate the use of a hash function, and concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
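
    The report does not spell out its exact construction, but the classic way to turn a block cipher such as the AES into a hash is a Davies-Meyer-style compression function, sketched below in Python. The padding rule and all-zero IV are illustrative assumptions, and the example relies on the third-party pycryptodome package for AES.

    ```python
    # Davies-Meyer-style hashing with AES: H_{i+1} = AES_{M_i}(H_i) XOR H_i,
    # using each 16-byte message block as the AES-128 key. Sketch only.
    from Crypto.Cipher import AES   # pip install pycryptodome

    BLOCK = 16

    def aes_hash(data: bytes, iv: bytes = b'\x00' * BLOCK) -> bytes:
        if len(data) % BLOCK:       # simple 0x80 00..00 padding (an assumption)
            data += b'\x80' + b'\x00' * (BLOCK - 1 - len(data) % BLOCK)
        h = iv
        for i in range(0, len(data), BLOCK):
            key = data[i:i + BLOCK]                      # message block as key
            enc = AES.new(key, AES.MODE_ECB).encrypt(h)
            h = bytes(x ^ y for x, y in zip(enc, h))     # feed-forward XOR
        return h

    print(aes_hash(b'software image under verification').hex())
    ```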

  4. Study of the similarity function in Indexing-First-One hashing

    NASA Astrophysics Data System (ADS)

    Lai, Y.-L.; Jin, Z.; Goi, B.-M.; Chai, T.-Y.

    2017-06-01

    The recently proposed Indexing-First-One (IFO) hashing is a technique particularly adapted to eye iris template protection, i.e., protection of the IrisCode. However, the measure of Jaccard similarity (JS) that IFO employs, which originates from min-hashing, has not yet been adequately discussed. In this paper, we explore the nature of JS in the binary domain and further propose a mathematical formulation that generalizes the usage of JS, which is subsequently verified using the CASIA v3-Interval iris database. Our study reveals that the JS applied in IFO hashing is a generalized version of the measure between two input objects used in min-hashing, where the coefficient of JS is equal to one. With this understanding, IFO hashing can inherit the useful properties of min-hashing, i.e., similarity preservation, and is thus favorable for similarity searching and recognition in binary space.
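
    The link between JS and min-hashing that the paper builds on can be shown in a few lines of Python: the fraction of matching min-hash values across random permutations is an unbiased estimate of the Jaccard similarity of two bit sets. The universe size, permutation count, and the toy noise model are illustrative assumptions.

    ```python
    # Min-hash estimation of Jaccard similarity over toy IrisCode-like bit sets.
    import random

    random.seed(1)
    UNIVERSE = 512                      # bit positions (an assumption)
    N_PERMS = 128                       # permutations = signature length

    perms = [random.sample(range(UNIVERSE), UNIVERSE) for _ in range(N_PERMS)]

    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b)

    def signature(items: set) -> list:
        # Min-hash: the smallest permuted index of any set member.
        return [min(p[i] for i in items) for p in perms]

    a = set(random.sample(range(UNIVERSE), 200))
    b = (a - set(random.sample(sorted(a), 40))) | \
        set(random.sample(range(UNIVERSE), 40))   # noisy copy of a

    sa, sb = signature(a), signature(b)
    est = sum(x == y for x, y in zip(sa, sb)) / N_PERMS
    print(f"true Jaccard {jaccard(a, b):.3f}, min-hash estimate {est:.3f}")
    ```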

  5. Efficient computation of hashes

    NASA Astrophysics Data System (ADS)

    Lopes, Raul H. C.; Franqueira, Virginia N. L.; Hobson, Peter R.

    2014-06-01

    The sequential computation of hashes at the core of many distributed storage systems, found for example in grid services, can hinder service quality and even pose security challenges that can only be addressed by the use of parallel hash-tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgård engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype of a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
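
    The tree mode the paper advocates is easy to illustrate: leaves can be hashed in parallel and then reduced pairwise to a single root. Below is a minimal Python sketch using SHA3-256 (Keccak, matching the paper's prototype function) with an illustrative chunk size, domain-separation tags, and a duplicate-last-node rule for odd levels, all assumptions of this sketch.

    ```python
    # Binary hash tree: parallel leaf hashing, then pairwise reduction.
    # Chunk size, domain-separation bytes, and odd-node rule are assumptions.
    import hashlib
    from concurrent.futures import ProcessPoolExecutor

    CHUNK = 1 << 16

    def leaf(chunk: bytes) -> bytes:
        return hashlib.sha3_256(b'\x00' + chunk).digest()    # leaf domain tag

    def node(left: bytes, right: bytes) -> bytes:
        return hashlib.sha3_256(b'\x01' + left + right).digest()

    def tree_hash(data: bytes) -> bytes:
        chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)] or [b'']
        with ProcessPoolExecutor() as pool:
            level = list(pool.map(leaf, chunks))             # parallel leaves
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])                      # duplicate odd tail
            level = [node(level[i], level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

    if __name__ == '__main__':
        print(tree_hash(b'x' * 10_000_000).hex())
    ```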

  6. Live chat alternative security protocol

    NASA Astrophysics Data System (ADS)

    Rahman, J. P. R.; Nugraha, E.; Febriany, A.

    2018-05-01

    Indonesia is one of the largest e-commerce markets in Southeast Asia; as many as 5 million people transact in e-commerce, and therefore more and more people use live chat services to communicate with customer service. In live chat, customer service agents often ask for customers' data, such as full name, address, e-mail, and transaction ID, in order to verify the purchase of a product. One of the attendant risks is sniffing, which can lead to the theft of confidential information and cause huge losses to the customer. Our countermeasure is to build an alternative security protocol for user interaction in live chat using cryptographic algorithms that protect confidential messages. Live chat requires confidentiality and data integrity, provided here by encryption and hash functions. The algorithms used are 256-bit Rijndael, RSA, and SHA-256. To increase complexity, the Rijndael algorithm is modified in the S-box and ShiftRow sections based on the Shannon principle; the results show that all variants pass the randomness tests, but the ShiftRow modification exhibits a better avalanche effect. As a result, messages become difficult to steal or alter.

  7. Crypto-Watermarking of Transmitted Medical Images.

    PubMed

    Al-Haj, Ali; Mohammad, Ahmad; Amer, Alaa'

    2017-02-01

    Telemedicine is a booming healthcare practice that has facilitated the exchange of medical data and expertise between healthcare entities. However, the widespread use of telemedicine applications requires a secured scheme to guarantee confidentiality and verify authenticity and integrity of exchanged medical data. In this paper, we describe a region-based, crypto-watermarking algorithm capable of providing confidentiality, authenticity, and integrity for medical images of different modalities. The proposed algorithm provides authenticity by embedding robust watermarks in images' region of non-interest using SVD in the DWT domain. Integrity is provided in two levels: strict integrity implemented by a cryptographic hash watermark, and content-based integrity implemented by a symmetric encryption-based tamper localization scheme. Confidentiality is achieved as a byproduct of hiding patient's data in the image. Performance of the algorithm was evaluated with respect to imperceptibility, robustness, capacity, and tamper localization, using different medical images. The results showed the effectiveness of the algorithm in providing security for telemedicine applications.

  8. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    PubMed

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extensions (SGX) based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with a state-of-the-art homomorphic encryption scheme, as well as a plaintext implementation. The experimental results demonstrated a significant performance advantage over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud, using a hybrid framework that combines secure hardware with multiple cryptographic protocols.

  9. Achaete-Scute Homolog 1 Expression Controls Cellular Differentiation of Neuroblastoma

    PubMed Central

    Kasim, Mumtaz; Heß, Vicky; Scholz, Holger; Persson, Pontus B.; Fähling, Michael

    2016-01-01

    Neuroblastoma, the major cause of infant cancer deaths, results from fast proliferation of undifferentiated neuroblasts. Treatment of high-risk neuroblastoma includes differentiation with retinoic acid (RA); however, the resistance of many of these tumors to RA-induced differentiation poses a considerable challenge. Human achaete-scute homolog 1 (hASH1) is a proneural basic helix-loop-helix transcription factor essential for neurogenesis and is often upregulated in neuroblastoma. Here, we identified a novel function for hASH1 in regulating the differentiation phenotype of neuroblastoma cells. Global analysis of 986 human neuroblastoma datasets revealed a negative correlation between hASH1 and neuron differentiation that was independent of the N-myc (MYCN) oncogene. Using RA to induce neuron differentiation in two neuroblastoma cell lines displaying high and low levels of hASH1 expression, we confirmed the link between hASH1 expression and the differentiation defective phenotype, which was reversed by silencing hASH1 or by hypoxic preconditioning. We further show that hASH1 suppresses neuronal differentiation by inhibiting transcription at the RA receptor element. Collectively, our data indicate hASH1 to be key for understanding neuroblastoma resistance to differentiation therapy and pave the way for hASH1-targeted therapies for augmenting the response of neuroblastoma to differentiation therapy. PMID:28066180

  10. Efficient hash tables for network applications.

    PubMed

    Zink, Thomas; Waldvogel, Marcel

    2015-01-01

    Hashing has yet to be widely accepted as a component of hard real-time systems and hardware implementations, due to persisting prejudices concerning the unpredictability of the space and time requirements that result from collisions. While in theory perfect hashing can provide optimal mapping, in practice finding a perfect hash function is too expensive, especially in the context of high-speed applications. The introduction of hashing with multiple choices, d-left hashing, and probabilistic table summaries has caused a shift towards deterministic DRAM access. However, large amounts of rare and expensive high-speed SRAM must be traded off for predictability, which is infeasible for many applications. In this paper we show that previous proposals suffer from the false precondition of full generality. Our approach exploits four individual degrees of freedom available in many practical applications, especially hardware and high-speed lookups. This reduces the on-chip memory requirement by up to an order of magnitude and guarantees constant lookup and update time at the cost of only minute amounts of additional hardware. Our design makes efficient hash table implementations cheaper, more predictable, and more practical.
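
    One building block mentioned above, hashing with multiple choices, is simple to sketch: each key is offered d candidate buckets and goes to the least loaded, which keeps the worst-case bucket depth small and bounds lookups to d probes. The sketch below uses d = 2 and derives both candidates from one SHA-256 digest, an illustrative shortcut rather than truly independent hash functions.

    ```python
    # Two-choice hashing: insert into the less loaded of two candidate buckets.
    import hashlib

    BUCKETS = 64
    table = [[] for _ in range(BUCKETS)]

    def candidates(key: bytes):
        d = hashlib.sha256(key).digest()
        # Two bucket indices carved from one digest (illustrative shortcut).
        return d[0] % BUCKETS, d[1] % BUCKETS

    def insert(key: bytes, value):
        i, j = candidates(key)
        bucket = table[i] if len(table[i]) <= len(table[j]) else table[j]
        bucket.append((key, value))

    def lookup(key: bytes):
        i, j = candidates(key)            # bounded work: probe two buckets
        for k, v in table[i] + table[j]:
            if k == key:
                return v
        return None

    for n in range(1000):
        insert(str(n).encode(), n)
    print("max bucket depth:", max(map(len, table)),
          "| lookup('42') ->", lookup(b'42'))
    ```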

  11. Text image authenticating algorithm based on MD5-hash function and Henon map

    NASA Astrophysics Data System (ADS)

    Wei, Jinqiao; Wang, Ying; Ma, Xiaoxue

    2017-07-01

    In order to meet the evidentiary requirements of text images, this paper proposes a fragile watermarking algorithm based on a hash function and the Henon map. The algorithm divides a text image into blocks, determines the flippable and non-flippable pixels of every block according to PSD, generates a watermark from the non-flippable pixels with the MD5 hash, encrypts the watermark with the Henon map, and selects the embedding blocks. The simulation results show that the algorithm, with its good tamper-localization ability, can be used to authenticate and provide forensic evidence of the authenticity and integrity of text images.

  12. Learning binary code via PCA of angle projection for image retrieval

    NASA Astrophysics Data System (ADS)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. Principal component analysis (PCA) is widely used in compact hashing methods: most of these methods adopt PCA projection functions to project the original data onto several dimensions of real values, and each projected dimension is then quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection introduces quantization error. To avoid a real-valued projection with large quantization error, in this paper we propose to use a cosine similarity projection for each dimension; the angle projection preserves the original structure and is more compact with cosine values. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
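
    For context, the PCA-projection-plus-thresholding baseline that this work refines can be sketched in a few lines of NumPy; the comment marks where a cosine-valued (angle) projection would replace the raw projection before thresholding. The data sizes and 32-bit code length are illustrative.

    ```python
    # PCA hashing baseline: center, project onto top principal directions,
    # threshold at zero. Sketch with toy data; 32-bit codes assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 128))          # toy image descriptors
    mean = X.mean(axis=0)

    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    W = Vt[:32].T                             # top-32 principal directions

    def pca_hash(v: np.ndarray) -> np.ndarray:
        proj = (v - mean) @ W
        # The paper's refinement would normalize here, thresholding the cosine
        # of the angle between (v - mean) and each direction instead.
        return (proj > 0).astype(np.uint8)

    print(pca_hash(X[0]))
    ```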

  13. Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search.

    PubMed

    Liu, Xianglong; Huang, Lei; Deng, Cheng; Lang, Bo; Tao, Dacheng

    2016-10-01

    Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degrades the discriminative power of Hamming-distance ranking. Moreover, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost search performance. To address these problems, this paper proposes a novel and generic approach to building multiple hash tables over multiple views and generating fine-grained ranking results at the bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of its hash functions and their complementarity for nearest neighbor search. At the tablewise level, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion reranks all results from the bitwise ranking by diffusing over a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains for single- and multiple-table search over state-of-the-art methods.

  14. Image encryption algorithm based on multiple mixed hash functions and cyclic shift

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Zhu, Xiaoqiang; Wu, Xiangjun; Zhang, Yingqian

    2018-08-01

    This paper proposes a new one-time-pad scheme for chaotic image encryption based on multiple mixed hash functions and a cyclic-shift function. The initial value is generated from both the plaintext image and chaotic sequences, which are calculated using the SHA-1 and MD5 hash algorithms. The scrambling sequences are generated by nonlinear equations and a logistic map. This paper aims to remedy the deficiencies of the traditional Baptista algorithm and its improved variants. We employ the cyclic-shift function and piecewise linear chaotic maps (PWLCM), which give each shift amount the characteristics of chaos, to diffuse the image. Experimental results and security analysis show that the new scheme has better security and can resist common attacks.
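
    The key idea of deriving per-image chaotic parameters from plaintext digests can be sketched as follows; the seed-mixing rule, the logistic-map keystream, and the per-row shift rule are illustrative assumptions rather than the paper's exact equations.

    ```python
    # Plaintext-dependent chaotic encryption sketch: SHA-1 and MD5 digests of
    # the image seed a logistic map; the keystream is XORed in, then each row
    # is cyclically shifted by a chaos-derived amount.
    import hashlib
    import numpy as np

    def encrypt(img: np.ndarray) -> np.ndarray:
        data = img.tobytes()
        seed = hashlib.sha1(data).digest() + hashlib.md5(data).digest()
        # Map the first 8 digest bytes into an initial condition in (0, 1).
        x = int.from_bytes(seed[:8], 'big') / 2 ** 64 * 0.9 + 0.05
        flat = img.astype(np.uint8).flatten()
        ks = np.empty(flat.size, dtype=np.uint8)
        for i in range(flat.size):            # logistic-map keystream
            x = 3.9999 * x * (1 - x)
            ks[i] = int(x * 256) & 0xFF
        out = (flat ^ ks).reshape(img.shape)
        for r in range(out.shape[0]):         # chaos-driven row cyclic shift
            out[r] = np.roll(out[r], int(ks[r]) % out.shape[1])
        return out

    img = (np.arange(64 * 64) % 256).astype(np.uint8).reshape(64, 64)
    print(encrypt(img)[:2, :8])
    ```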

  15. Building Application-Related Patient Identifiers: What Solution for a European Country?

    PubMed Central

    Quantin, Catherine; Allaert, François-André; Avillach, Paul; Fassa, Maniane; Riandey, Benoît; Trouessin, Gilles; Cohen, Olivier

    2008-01-01

    We propose a method utilizing a derived social security number with the same reliability as the social security number itself. We show that anonymization techniques classically based on one-way hash functions (such as the Secure Hash Algorithm, SHA-2) can guarantee the security, quality, and reliability of information when applied to the social security number. Hashing produces a strictly anonymous code that is always the same for a given individual, and thus enables patient data to be linked. Different solutions are developed and proposed in this article. Hashing the social security number makes it possible to link the information in the personal medical file to other national health information sources, with the aim of completing or validating the personal medical record or conducting epidemiological and clinical research. This data linkage would meet the anonymous-data requirements of the European directive on data protection. PMID:18401447
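
    A minimal sketch of the anonymization step described here, using Python's standard library: a keyed SHA-256 hash (HMAC) of the identifier yields a stable pseudonym for record linkage. The key value and identifier format are illustrative; using a different secret key per study limits linkage across unrelated projects.

    ```python
    # Keyed one-way hashing (HMAC-SHA-256) of an identifier to produce a
    # stable anonymous linkage code. Key and identifier are illustrative.
    import hashlib
    import hmac

    STUDY_KEY = b'per-study secret key'   # a different key per study limits
                                          # linkage across unrelated projects

    def pseudonym(identifier: str) -> str:
        return hmac.new(STUDY_KEY, identifier.encode(),
                        hashlib.sha256).hexdigest()

    # The same person always maps to the same code within one study.
    assert pseudonym("1850363123456") == pseudonym("1850363123456")
    print(pseudonym("1850363123456"))
    ```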

  16. Lightweight Privacy-Preserving Authentication Protocols Secure against Active Attack in an Asymmetric Way

    NASA Astrophysics Data System (ADS)

    Cui, Yank; Kobara, Kazukuni; Matsuura, Kanta; Imai, Hideki

    As pervasive computing technologies develop rapidly, privacy protection becomes a crucial issue that must be handled very carefully. Typically, it is difficult to efficiently identify and manage the many low-cost pervasive devices, such as Radio Frequency Identification Devices (RFID), without leaking private information. In particular, an attacker may not only eavesdrop on the communication passively, but also mount an active attack by asking queries adaptively, which is obviously more dangerous. To address this problem, we propose two lightweight authentication protocols that are privacy-preserving against active attack, in an asymmetric way. This asymmetric style, with privacy-oriented simplification, reduces the load on the low-cost devices and drastically decreases the computation cost of management at the server. This is because, unlike the usual management of identities, our approach requires neither synchronization nor exhaustive search in the database, which is highly convenient for large-scale systems. The protocols are based on a fast asymmetric encryption with specialized simplifications and only one cryptographic hash function, which assigns only light work to the pervasive devices. Moreover, our results do not require the strong assumption of the random oracle.

  17. Security for decentralized health information systems.

    PubMed

    Bleumer, G

    1994-02-01

    Health care information systems must reflect at least two basic characteristics of the health care community: the increasing mobility of patients and the personal liability of everyone giving medical treatment. Open distributed information systems have the potential to meet these requirements, but the market for open information systems and operating systems hardly provides secure products today. This 'missing link' is addressed by the prototype SECURE Talk, which provides secure transmission and archiving of files on top of an existing operating system. Its services may be utilized by existing medical applications. SECURE Talk demonstrates secure communication utilizing only standard hardware. Its message is that cryptography (and in particular asymmetric cryptography) is practical for many medical applications even if implemented in software. All mechanisms are implemented in software in order to be executable on standard hardware. One can investigate more or less decentralized forms of public key management and the performance of many different cryptographic mechanisms. The throughput of, e.g., hybrid encryption and decryption (RSA+DES-PCBC) is about 300 kbit/s; that of signing and verifying is approximately the same, using RSA with a DES hash function. The internal speed, without disk accesses etc., is about 1.1 Mbit/s (Apple Quadra 950: MC 68040, 33 MHz, 20 MB RAM, 80 ns; RSA modulus length 512 bits).

  18. A Lightweight Data Integrity Scheme for Sensor Networks

    PubMed Central

    Kamel, Ibrahim; Juma, Hussam

    2011-01-01

    Limited energy is the most critical constraint on the capabilities of wireless sensor networks (WSNs). Most sensors operate on batteries with limited power, and battery recharging or replacement may be impossible. Security mechanisms that are based on public key cryptographic algorithms, such as RSA and digital signatures, are prohibitively expensive in terms of energy consumption and storage requirements, and thus unsuitable for WSN applications. This paper proposes a new fragile watermarking technique to detect unauthorized alterations in WSN data streams. We propose the FWC-D scheme, which uses group delimiters to keep the sender and receivers synchronized and to help them avoid ambiguity in the event of data insertion or deletion. The watermark, which is computed using a hash function, is stored in the previous group in a linked-list fashion to ensure data freshness and mitigate replay attacks. FWC-D also generates a serial number SN that is attached to each group to help the receiver determine how many group insertions or deletions occurred. A detailed security analysis comparing the proposed FWC-D scheme with SGW, one of the latest integrity schemes for WSNs, shows that FWC-D is more robust than SGW. Simulation results further show that the proposed scheme is much faster than SGW. PMID:22163840
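
    The linked-list watermarking idea is concrete enough to sketch: each transmitted group carries a serial number plus the hash-based watermark of an adjacent group, so a receiver can detect both modification and insertion/deletion. The field layout, truncated SHA-256 watermark, and the choice to carry the next group's watermark are illustrative assumptions, not the FWC-D wire format.

    ```python
    # Chained group watermarks: each packet carries (SN, payload, watermark of
    # the neighboring group), letting the receiver spot tampering and count
    # inserted or deleted groups via SN gaps.
    import hashlib

    def watermark(sn: int, body: bytes) -> bytes:
        return hashlib.sha256(sn.to_bytes(4, 'big') + body).digest()[:8]

    def make_stream(groups):
        bodies = [b'|'.join(str(v).encode() for v in g) for g in groups]
        packets = []
        for sn, body in enumerate(bodies):
            nxt = watermark(sn + 1, bodies[sn + 1]) if sn + 1 < len(bodies) else b''
            packets.append((sn, body, nxt))   # carry neighbor's watermark
        return packets

    def verify(packets):
        for (sn, _, nxt), (sn2, body2, _) in zip(packets, packets[1:]):
            if sn2 != sn + 1:
                print(f"group(s) inserted/deleted after SN {sn}")
            elif nxt != watermark(sn2, body2):
                print(f"group SN {sn2} was modified")

    pkts = make_stream([[21.5, 21.7], [21.6, 21.9], [22.0, 22.1]])
    pkts[1] = (1, b'99.9|21.9', pkts[1][2])   # tamper with one group's data
    verify(pkts)                              # -> "group SN 1 was modified"
    ```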

  19. Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search.

    PubMed

    Liu, Xianglong; Li, Zhujin; Deng, Cheng; Tao, Dacheng

    2017-11-01

    Hashing has proved to be an attractive technique for fast nearest neighbor search over big data. Compared with projection-based hashing methods, prototype-based ones have stronger power to generate discriminative binary codes for data with complex intrinsic structure. However, existing prototype-based methods, such as spherical hashing and K-means hashing, still suffer from ineffective coding that utilizes the complete set of binary codes in a hypercube. To address this problem, we propose an adaptive binary quantization (ABQ) method that learns a discriminative hash function with prototypes associated with small unique binary codes. Our alternating optimization adaptively discovers the prototype set and a code set of varying size in an efficient way, which together robustly approximate the data relations. Our method can be naturally generalized to the product space for long hash codes and enjoys fast training, linear in the number of training samples. We further devise a distributed framework for large-scale learning, which can significantly speed up the training of ABQ in the distributed environments that are now widely deployed in many areas. Extensive experiments on four large-scale (up to 80 million) data sets demonstrate that our method significantly outperforms state-of-the-art hashing methods, with relative performance gains of up to 58.84%.

  20. Cryptographic Boolean Functions with Biased Inputs

    DTIC Science & Technology

    2015-07-31

    ...theory of random graphs developed by Erdős and Rényi [2]. The graph properties in a random graph expressed as such Boolean functions are used by... distributed Bernoulli variates with the parameter p. Since our scope is within the area of cryptography, we initiate an analysis of cryptographic... Boolean functions with biased inputs, which we refer to as µp-Boolean functions, is a common generalization of Boolean functions which stems from the...

  1. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and reproducibility for scientists working in the hydrological sciences. The HashDist documentation is available from: http://hashdist.readthedocs.org/en/latest/ HashDist is currently hosted at: https://github.com/hashdist/hashdist

  2. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity of video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature representing the video object and the background, respectively. ECC and cryptographic hashing are applied to the selected coefficients to generate a robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG-4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG-4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling, while securely preventing malicious object modifications. The proposed solution can further be incorporated into a public key infrastructure (PKI).

  3. Application of kernel functions for accurate similarity search in large chemical databases.

    PubMed

    Wang, Xiaohong; Huan, Jun; Smalter, Aaron; Lushington, Gerald H

    2010-04-29

    Similarity search in chemical structure databases is an important problem with many applications in chemical genomics, drug design, and efficient chemical probe screening, among others. It is widely believed that structure-based methods provide an efficient way to perform such queries. Recently, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models, graph kernel functions cannot be applied to large chemical compound databases due to their high computational complexity and the difficulty of indexing similarity search in large databases. To bridge graph kernel functions and similarity search in chemical databases, we applied a novel kernel-based similarity measurement, developed by our team, to measure the similarity of graph-represented chemicals. In our method, we utilize a hash table to support the new graph kernel function definition, efficient storage, and fast search. We have applied our method, named G-hash, to large chemical databases. Our results show that G-hash achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. Moreover, the similarity measurement and the index structure scale to large chemical databases, with a smaller index size and faster query processing time compared to state-of-the-art indexing methods such as Daylight fingerprints, C-tree and GraphGrep. Efficient similarity query processing for large chemical databases is challenging, since running-time efficiency must be balanced against similarity search accuracy. G-hash provides a new way to perform similarity search in chemical databases, and our experimental study validates its utility.

  4. SIMPL Systems, or: Can We Design Cryptographic Hardware without Secret Key Information?

    NASA Astrophysics Data System (ADS)

    Rührmair, Ulrich

    This paper discusses a new cryptographic primitive termed a SIMPL system. Roughly speaking, a SIMPL system is a special type of Physical Unclonable Function (PUF) which possesses a binary description that allows its (slow) public simulation and prediction. Besides this public-key-like functionality, SIMPL systems have another advantage: no secret information is, or needs to be, contained in SIMPL systems in order to enable cryptographic protocols - neither in the form of a standard binary key, nor as secret information hidden in random, analog features, as is the case for PUFs. The cryptographic security of SIMPLs instead rests on (i) a physical assumption on their unclonability, and (ii) a computational assumption regarding the complexity of simulating their output. This novel property makes SIMPL systems potentially immune to many known hardware and software attacks, including malware, side-channel, invasive, and modeling attacks.

  5. Hardware device binding and mutual authentication

    DOEpatents

    Hamlet, Jason R; Pierson, Lyndon G

    2014-03-04

    Detection and deterrence of device tampering and subversion by substitution may be achieved by including a cryptographic unit within a computing device for binding multiple hardware devices and mutually authenticating the devices. The cryptographic unit includes a physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a binding PUF value. The cryptographic unit uses the binding PUF value during an enrollment phase and subsequent authentication phases. During a subsequent authentication phase, the cryptographic unit uses the binding PUF values of the multiple hardware devices to generate a challenge to send to the other device, and to verify a challenge received from the other device to mutually authenticate the hardware devices.

  6. Security Criteria for Distributed Systems: Functional Requirements.

    DTIC Science & Technology

    1995-09-01

    Indexed excerpt (fragmentary): cites Ziv, J. and Lempel, A. (1977), "A Universal Algorithm for Sequential Data Compression," IEEE Transactions on Information Theory. ... SCF-5, DCF-7. Configurable Cryptographic Algorithms: (a) It shall be possible to configure the system such that the data confidentiality functions ... use different cryptographic algorithms for different protocols (e.g., mail or interprocess communication data). (b) The modes of encryption ...

  7. An Analysis of Cryptographically Significant Boolean Functions With High Correlation Immunity by Reconfigurable Computer

    DTIC Science & Technology

    2010-12-01

    Indexed excerpt (fragmentary): ... with high correlation immunity, and then evaluate these functions for other desirable cryptographic features. C. METHOD: The only known primary methods ...

  8. SGFSC: speeding the gene functional similarity calculation based on hash tables.

    PubMed

    Tian, Zhen; Wang, Chunyu; Guo, Maozu; Liu, Xiaoyan; Teng, Zhixia

    2016-11-04

    In recent years, many measures of gene functional similarity have been proposed and widely used in all kinds of essential research. These methods are mainly divided into two categories: pairwise approaches and group-wise approaches. However, a common problem with these methods is their time consumption, especially when measuring the gene functional similarities of a large number of gene pairs. The problem of computational efficiency is even more prominent for pairwise approaches because they depend on combinations of semantic similarities. Therefore, the efficient measurement of gene functional similarity remains a challenging problem. To speed up current gene functional similarity calculation methods, a novel two-step computing strategy is proposed: (1) establish a hash table for each method to store essential information obtained from the Gene Ontology (GO) graph and (2) measure gene functional similarity based on the corresponding hash table. With the help of the hash table, there is no need to traverse the GO graph repeatedly for each method. The analysis of time complexity shows that the computational efficiency of these methods is significantly improved. We also implement a novel Speeding Gene Functional Similarity Calculation tool, namely SGFSC, which is bundled with seven typical measures using our proposed strategy. Further experiments show the great advantage of SGFSC in measuring gene functional similarity at the whole-genome scale. The proposed strategy is successful in speeding up current gene functional similarity calculation methods. SGFSC is an efficient tool that is freely available at http://nclab.hit.edu.cn/SGFSC . The source code of SGFSC can be downloaded from http://pan.baidu.com/s/1dFFmvpZ .
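
    The two-step strategy is essentially precomputation plus constant-time lookup. A minimal sketch of the idea, with a placeholder term-similarity function rather than SGFSC's actual tables:

    ```python
    # Step 1: one pass over the GO graph fills a hash table with the
    # term-level quantities a measure needs (here: pairwise term similarity).
    def build_term_table(go_terms, term_sim):
        table = {}
        for i, t1 in enumerate(go_terms):
            for t2 in go_terms[i:]:
                table[frozenset((t1, t2))] = term_sim(t1, t2)  # expensive, done once
        return table

    # Step 2: gene-level similarity uses O(1) lookups only, e.g. an
    # average best-match over the two genes' annotation sets.
    def gene_similarity(annot1, annot2, table):
        best = [max(table[frozenset((a, b))] for b in annot2) for a in annot1]
        return sum(best) / len(best)
    ```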

  9. Hash Functions and Information Theoretic Security

    NASA Astrophysics Data System (ADS)

    Bagheri, Nasour; Knudsen, Lars R.; Naderi, Majid; Thomsen, Søren S.

    Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic attack costs and real costs. We show that in the information theoretic model, many well-known and commonly used hash functions such as MD5 and SHA-256 fail to be preimage resistant.

  10. Small Private Key MQPKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result of this, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but generating random numbers is much more expensive than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme, using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, over the previous results from CHES2012. PMID:24651722

  11. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result of this, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but generating random numbers is much more expensive than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme, using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, over the previous results from CHES2012.

  12. Hardware device to physical structure binding and authentication

    DOEpatents

    Hamlet, Jason R.; Stein, David J.; Bauer, Todd M.

    2013-08-20

    Detection and deterrence of device tampering and subversion may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a binding of the hardware device and a physical structure. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates an internal PUF value. Binding logic is coupled to receive the internal PUF value, as well as an external PUF value associated with the physical structure, and generates a binding PUF value, which represents the binding of the hardware device and the physical structure. The cryptographic fingerprint unit also includes a cryptographic unit that uses the binding PUF value to allow a challenger to authenticate the binding.

  13. Git as an Encrypted Distributed Version Control System

    DTIC Science & Technology

    2015-03-01

    Indexed excerpt (fragmentary): ... options. The algorithm uses AES-256 counter mode with an IV derived from a SHA-1-HMAC hash (this is nearly identical to the GCM mode discussed earlier) ... built into the internal structure of Git. Every file in a Git repository is checksummed with a SHA-1 hash, a one-way function with arbitrarily long ... implementation. Git-encrypt calls OpenSSL cryptography library command-line functions. The default cipher used is AES-256 Electronic Code Book (ECB), which is ...
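
    The Git content checksumming mentioned in the excerpt is easy to reproduce: a blob's object id is the SHA-1 of a short header plus the file contents. A minimal sketch:

    ```python
    import hashlib

    def git_blob_id(content: bytes) -> str:
        """SHA-1 object id Git assigns to file contents:
        sha1(b"blob <size>\\0" + content)."""
        header = b"blob %d\x00" % len(content)
        return hashlib.sha1(header + content).hexdigest()

    # Matches `git hash-object` on the same bytes:
    print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
    ```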

  14. Self-Organized Link State Aware Routing for Multiple Mobile Agents in Wireless Network

    NASA Astrophysics Data System (ADS)

    Oda, Akihiro; Nishi, Hiroaki

    Recently, the importance of data-sharing structures in autonomous distributed networks has been increasing. A wireless sensor network is used for managing distributed data. This type of distributed network requires effective information-exchange methods for data sharing. To reduce the traffic of broadcast messages, the amount of redundant information must be reduced. QoS-sensitive routing algorithms have been frequently discussed as a way to reduce packet loss in mobile ad-hoc networks. The topology of a wireless network is likely to change frequently according to the movement of mobile nodes, radio disturbance, or fading due to continuous changes in the environment. Therefore, a packet routing algorithm should guarantee QoS by using quality indicators of the wireless network. In this paper, a novel information-exchange algorithm based on a hash function and a Boolean operation is proposed. This algorithm achieves efficient information exchange by reducing the overhead of broadcast messages, and it can guarantee QoS in a wireless network environment. It can be applied to a routing algorithm in a mobile ad-hoc network. In the proposed routing algorithm, a routing table is constructed by using the received signal strength indicator (RSSI), and the neighborhood information is periodically broadcast according to this table. The proposed hash-based management of routing entries, keyed on an extended MAC address, eliminates the overhead of message flooding. An analysis of hash-value collisions determines the minimum required length of the hash values, and based on this mathematical analysis an optimum hash function for that length can be chosen. Simulations are carried out to evaluate the effectiveness of the proposed algorithm and to validate the theory in a general wireless network routing algorithm.
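
    The collision analysis that fixes the minimum hash length follows the standard birthday bound; the sketch below is the generic calculation, not the paper's exact derivation.

    ```python
    import math

    def collision_probability(n_keys: int, hash_bits: int) -> float:
        """Approximate probability that n uniformly hashed keys collide in a
        b-bit space: 1 - exp(-n(n-1) / 2^(b+1))."""
        return 1.0 - math.exp(-n_keys * (n_keys - 1) / 2.0 ** (hash_bits + 1))

    def min_hash_bits(n_keys: int, p_max: float) -> int:
        """Smallest hash length keeping the collision probability below p_max."""
        b = 1
        while collision_probability(n_keys, b) > p_max:
            b += 1
        return b

    print(min_hash_bits(1000, 0.01))  # 26 bits suffice for 1000 nodes at 1% risk
    ```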

  15. Secure method for biometric-based recognition with integrated cryptographic functions.

    PubMed

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the certification ratio in biometric systems need not reach 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  16. A Study on the Secure User Profiling Structure and Procedure for Home Healthcare Systems.

    PubMed

    Ko, Hoon; Song, MoonBae

    2016-01-01

    Despite various benefits such as convenience and efficiency, home healthcare systems carry inherent security risks that may cause serious leaks of personal health information. This work presents a Secure User Profiling Structure that holds patient information, including health records. The patient and the hospital each keep a copy, and they share the updated data. While the data are shared and communicated, they can be leaked. To solve these security problems, a secure communication channel based on a hash function and a One-Time Password (OTP) should be established between the client and the hospital; a dual hash function is used to generate the input value to the OTP. This work presents this dual hash function-based approach to generating the One-Time Password, ensuring a secure communication channel with a secured key. As a result, attackers are unable to decrypt leaked information because of the secured key; in addition, the proposed method outperforms existing methods in terms of computation cost.
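
    A minimal sketch of the dual-hash OTP idea, assuming a shared secret and a synchronized counter; the hash choice and field layout are illustrative, not the paper's exact construction.

    ```python
    import hashlib

    def dual_hash_otp(shared_secret: bytes, counter: int, digits: int = 6) -> str:
        """Two hash passes: the inner digest forms the OTP input value,
        the outer digest yields the one-time password itself."""
        inner = hashlib.sha256(shared_secret + counter.to_bytes(8, "big")).digest()
        outer = hashlib.sha256(inner).digest()
        return str(int.from_bytes(outer[:8], "big") % 10 ** digits).zfill(digits)

    # Client and hospital derive the same value from the shared state:
    print(dual_hash_otp(b"patient-profile-key", counter=42))
    ```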

  17. Defence against Black Hole and Selective Forwarding Attacks for Medical WSNs in the IoT †

    PubMed Central

    Mathur, Avijit; Newe, Thomas; Rao, Muzaffar

    2016-01-01

    Wireless sensor networks (WSNs) are being used to facilitate monitoring of patients in hospital and home environments. These systems consist of a variety of different components/sensors and many processes like clustering, routing, security, and self-organization. Routing is necessary for medical-based WSNs because it allows remote data delivery and it facilitates network scalability in large hospitals. However, routing entails several problems, mainly due to the open nature of wireless networks, and these need to be addressed. This paper looks at two of the problems that arise due to wireless routing between the nodes and access points of a medical WSN (for IoT use): black hole and selective forwarding (SF) attacks. A solution to the former can readily be provided through the use of cryptographic hashes, while the latter makes use of a neighbourhood watch and threshold-based analysis to detect and correct SF attacks. The scheme proposed here is capable of detecting a selective forwarding attack with over 96% accuracy and successfully identifying the malicious node with 83% accuracy. PMID:26797620

  18. Defence against Black Hole and Selective Forwarding Attacks for Medical WSNs in the IoT.

    PubMed

    Mathur, Avijit; Newe, Thomas; Rao, Muzaffar

    2016-01-19

    Wireless sensor networks (WSNs) are being used to facilitate monitoring of patients in hospital and home environments. These systems consist of a variety of different components/sensors and many processes like clustering, routing, security, and self-organization. Routing is necessary for medical-based WSNs because it allows remote data delivery and it facilitates network scalability in large hospitals. However, routing entails several problems, mainly due to the open nature of wireless networks, and these need to be addressed. This paper looks at two of the problems that arise due to wireless routing between the nodes and access points of a medical WSN (for IoT use): black hole and selective forwarding (SF) attacks. A solution to the former can readily be provided through the use of cryptographic hashes, while the latter makes use of a neighbourhood watch and threshold-based analysis to detect and correct SF attacks. The scheme proposed here is capable of detecting a selective forwarding attack with over 96% accuracy and successfully identifying the malicious node with 83% accuracy.

  19. Self-Supervised Video Hashing With Hierarchical Binary Auto-Encoder.

    PubMed

    Song, Jingkuan; Zhang, Hanwang; Li, Xiangpeng; Gao, Lianli; Wang, Meng; Hong, Richang

    2018-07-01

    Existing video hash functions are built on three isolated stages: frame pooling, relaxed learning, and binarization, which have not adequately explored the temporal order of video frames in a joint binary optimization model, resulting in severe information loss. In this paper, we propose a novel unsupervised video hashing framework dubbed self-supervised video hashing (SSVH), which is able to capture the temporal nature of videos in an end-to-end learning-to-hash fashion. We specifically address two central problems: 1) how to design an encoder-decoder architecture to generate binary codes for videos and 2) how to equip the binary codes with the ability of accurate video retrieval. We design a hierarchical binary auto-encoder to model the temporal dependencies in videos with multiple granularities, and embed the videos into binary codes with less computations than the stacked architecture. Then, we encourage the binary codes to simultaneously reconstruct the visual content and neighborhood structure of the videos. Experiments on two real-world data sets show that our SSVH method can significantly outperform the state-of-the-art methods and achieve the current best performance on the task of unsupervised video retrieval.

  20. Self-Supervised Video Hashing With Hierarchical Binary Auto-Encoder

    NASA Astrophysics Data System (ADS)

    Song, Jingkuan; Zhang, Hanwang; Li, Xiangpeng; Gao, Lianli; Wang, Meng; Hong, Richang

    2018-07-01

    Existing video hash functions are built on three isolated stages: frame pooling, relaxed learning, and binarization, which have not adequately explored the temporal order of video frames in a joint binary optimization model, resulting in severe information loss. In this paper, we propose a novel unsupervised video hashing framework dubbed Self-Supervised Video Hashing (SSVH), that is able to capture the temporal nature of videos in an end-to-end learning-to-hash fashion. We specifically address two central problems: 1) how to design an encoder-decoder architecture to generate binary codes for videos; and 2) how to equip the binary codes with the ability of accurate video retrieval. We design a hierarchical binary autoencoder to model the temporal dependencies in videos with multiple granularities, and embed the videos into binary codes with less computations than the stacked architecture. Then, we encourage the binary codes to simultaneously reconstruct the visual content and neighborhood structure of the videos. Experiments on two real-world datasets (FCVID and YFCC) show that our SSVH method can significantly outperform the state-of-the-art methods and achieve the currently best performance on the task of unsupervised video retrieval.

  1. The Amordad database engine for metagenomics.

    PubMed

    Behnam, Ehsan; Smith, Andrew D

    2014-10-15

    Several technical challenges in metagenomic data analysis, including assembling metagenomic sequence data or identifying operational taxonomic units, are both significant and well known. These forms of analysis are increasingly cited as conceptually flawed, given the extreme variation within traditionally defined species and rampant horizontal gene transfer. Furthermore, computational requirements of such analysis have hindered content-based organization of metagenomic data at large scale. In this article, we introduce the Amordad database engine for alignment-free, content-based indexing of metagenomic datasets. Amordad places the metagenome comparison problem in a geometric context, and uses an indexing strategy that combines random hashing with a regular nearest neighbor graph. This framework allows refinement of the database over time by continual application of random hash functions, with the effect of each hash function encoded in the nearest neighbor graph. This eliminates the need to explicitly maintain the hash functions in order for query efficiency to benefit from the accumulated randomness. Results on real and simulated data show that Amordad can support logarithmic query time for identifying similar metagenomes even as the database size reaches into the millions. Source code, licensed under the GNU General Public License (version 3), is freely available for download from http://smithlabresearch.org/amordad. Contact: andrewds@usc.edu. Supplementary data are available at Bioinformatics online.
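
    The "continual application of random hash functions" can be pictured with the classic random-hyperplane LSH family, used here as a standard stand-in rather than Amordad's exact hash:

    ```python
    import random

    def random_hyperplane_hash(dim: int, n_bits: int, seed: int):
        """One LSH function: n_bits random hyperplanes; a vector's code is the
        sign pattern of its projections. Vectors at a small angle agree on
        most bits with high probability."""
        rng = random.Random(seed)
        planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
        def h(vec):
            return tuple(int(sum(p * v for p, v in zip(plane, vec)) >= 0)
                         for plane in planes)
        return h

    h0 = random_hyperplane_hash(dim=4, n_bits=8, seed=0)
    print(h0([0.9, 0.1, 0.0, 0.2]))
    print(h0([0.8, 0.2, 0.1, 0.2]))  # mostly the same bits as above
    ```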

  2. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    PubMed Central

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the certification ratio in biometric systems need not reach 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied. PMID:23762851

  3. Unsupervised Deep Hashing With Pseudo Labels for Scalable Image Retrieval.

    PubMed

    Zhang, Haofeng; Liu, Li; Long, Yang; Shao, Ling

    2018-04-01

    In order to achieve efficient similarity searching, hash functions are designed to encode images into low-dimensional binary codes with the constraint that similar features will have a short distance in the projected Hamming space. Recently, deep learning-based methods have become more popular, and outperform traditional non-deep methods. However, without label information, most state-of-the-art unsupervised deep hashing (DH) algorithms suffer from severe performance degradation in unsupervised scenarios. One of the main reasons is that the ad-hoc encoding process cannot properly capture the visual feature distribution. In this paper, we propose a novel unsupervised framework that has two main contributions: 1) we convert the unsupervised DH model into a supervised one by discovering pseudo labels; 2) the framework unifies likelihood maximization, mutual information maximization, and quantization error minimization so that the pseudo labels can maximally preserve the distribution of visual features. Extensive experiments on three popular data sets demonstrate the advantages of the proposed method, which leads to significant performance improvement over the state-of-the-art unsupervised hashing algorithms.

  4. Feature hashing for fast image retrieval

    NASA Astrophysics Data System (ADS)

    Yan, Lingyu; Fu, Jiarun; Zhang, Hongxin; Yuan, Lu; Xu, Hui

    2018-03-01

    Currently, research on content-based image retrieval mainly focuses on robust feature extraction. However, due to the exponential growth of online images, it is necessary to consider searching among large-scale image collections, which is very time-consuming and unscalable. Hence, much attention must be paid to the efficiency of image retrieval. In this paper, we propose a feature hashing method for image retrieval which not only generates a compact fingerprint for image representation, but also prevents large semantic loss during the hashing process. To generate the fingerprint, an objective function of semantic loss is constructed and minimized, combining the influence of both the neighborhood structure of the feature data and the mapping error. Since machine-learning-based hashing effectively preserves the neighborhood structure of the data, it yields visual words with strong discriminability. Furthermore, the generated binary codes make image representation low-complexity, efficient, and scalable to large-scale databases. Experimental results show the good performance of our approach.

  5. Attacks on quantum key distribution protocols that employ non-ITS authentication

    NASA Astrophysics Data System (ADS)

    Pacher, C.; Abidin, A.; Lorünser, T.; Peev, M.; Ursin, R.; Zeilinger, A.; Larsson, J.-Å.

    2016-01-01

    We demonstrate how adversaries with large computing resources can break quantum key distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not information-theoretically secure (ITS) since, for each message she has intercepted, the eavesdropper is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward man-in-the-middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols, we describe every single action taken by the adversary. For all protocols, the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods which allow colliding messages to be calculated, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures that increase the eavesdropper's demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.

  6. FBC: a flat binary code scheme for fast Manhattan hash retrieval

    NASA Astrophysics Data System (ADS)

    Kong, Yan; Wu, Fuzhang; Gao, Lifa; Wu, Yanjun

    2018-04-01

    Hash coding is a widely used technique in approximate nearest neighbor (ANN) search, especially in document search and multimedia (such as image and video) retrieval. Based on the difference in distance measurement, hash methods are generally classified into two categories: Hamming hashing and Manhattan hashing. Benefitting from better neighborhood-structure preservation, Manhattan hashing methods outperform earlier methods in search effectiveness. However, because they use decimal arithmetic operations instead of bit operations, Manhattan hashing is more time-consuming, which significantly decreases overall search efficiency. To solve this problem, we present an intuitive hash scheme which uses a Flat Binary Code (FBC) to encode the data points. As a result, the decimal arithmetic used in previous Manhattan hashing can be replaced by the more efficient XOR operator. Our experiments show that, at the cost of a reasonable growth in memory, FBC achieves an average speedup of more than 80% with no loss of search accuracy compared to state-of-the-art Manhattan hashing methods.
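
    One encoding with exactly this property is the thermometer (unary) code: the Hamming distance between two thermometer codes equals the Manhattan distance between the underlying integers, so XOR plus popcount replaces decimal subtraction. The sketch below shows the principle; the paper's precise FBC layout may differ.

    ```python
    def thermometer(value: int, levels: int) -> int:
        """Encode an integer in [0, levels] as `value` one-bits: 3 -> 0b000111."""
        return (1 << value) - 1

    def manhattan_via_xor(codes_a, codes_b):
        """Manhattan distance between quantized vectors via XOR + popcount."""
        return sum(bin(a ^ b).count("1") for a, b in zip(codes_a, codes_b))

    # |3-1| + |0-2| = 4, recovered purely with bit operations:
    a = [thermometer(3, 7), thermometer(0, 7)]
    b = [thermometer(1, 7), thermometer(2, 7)]
    print(manhattan_via_xor(a, b))  # 4
    ```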

  7. Fast perceptual image hash based on cascade algorithm

    NASA Astrophysics Data System (ADS)

    Ruchay, Alexey; Kober, Vitaly; Yavtushenko, Evgeniya

    2017-09-01

    In this paper, we propose a perceptual image hash algorithm based on a cascade algorithm, which can be applied to image authentication, retrieval, and indexing. A perceptual image hash is used for image retrieval in the sense of human perception, robust to distortions caused by compression, noise, common signal processing, and geometric modifications. The main disadvantage of perceptual hashing is its high computational cost. The proposed cascade algorithm initializes image retrieval with short hashes, and a full hash is then applied to the surviving results. Computer simulation results show that the proposed hash algorithm yields good performance in terms of robustness, discriminability, and running time.

  8. Best-First Heuristic Search for Multicore Machines

    DTIC Science & Technology

    2010-01-01

    Indexed excerpt (fragmentary): ... (Otto, 1998) to implement an asynchronous version of PRA* that they call Hash Distributed A* (HDA*). HDA* distributes nodes using a hash function in ... nodes which are being communicated between peers are in transit. In contact with the authors of HDA*, we have created an implementation of HDA* for ... Also, our implementation of HDA* allows us to make a fair comparison between algorithms by sharing common data structures such as priority queues and ...

  9. G-Hash: Towards Fast Kernel-based Similarity Search in Large Graph Databases.

    PubMed

    Wang, Xiaohong; Smalter, Aaron; Huan, Jun; Lushington, Gerald H

    2009-01-01

    Structured data, including sets, sequences, trees, and graphs, pose significant challenges to fundamental aspects of data management such as efficient storage, indexing, and similarity search. With the fast accumulation of graph databases, similarity search in graph databases has emerged as an important research topic. Graph similarity search has applications in a wide range of domains including cheminformatics, bioinformatics, sensor network management, social network management, and XML documents, among others. Most of the current graph indexing methods focus on subgraph query processing, i.e. determining the set of database graphs that contain the query graph, and hence do not directly support similarity search. In data mining and machine learning, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models for supervised learning, graph kernel functions have (i) high computational complexity and (ii) non-trivial difficulty being indexed in a graph database. Our objective is to bridge graph kernel functions and similarity search in graph databases by proposing (i) a novel kernel-based similarity measurement and (ii) an efficient indexing structure for graph data management. Our method of similarity measurement builds upon local features extracted from each node and its neighboring nodes in the graph. A hash table is utilized to support efficient storage and fast search of the extracted local features. Using the hash table, a graph kernel function is defined to capture the intrinsic similarity of graphs and to support fast similarity query processing. We have implemented our method, which we have named G-hash, and have demonstrated its utility on large chemical graph databases. Our results show that the G-hash method achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. Most importantly, the new similarity measurement and the index structure are scalable to large databases, with a smaller index size, faster index construction time, and faster query processing time compared to state-of-the-art indexing methods such as C-tree, gIndex, and GraphGrep.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    AISL-CRYPTO is a library of cryptography functions supporting other AISL software. It provides various crypto functions for Common Lisp, including Digital Signature Algorithm, Data Encryption Standard, Secure Hash Algorithm, and public-key cryptography.

  11. On Various Nonlinearity Measures for Boolean Functions*

    PubMed Central

    Boyar, Joan; Find, Magnus Gausdal; Peralta, René

    2016-01-01

    A necessary condition for the security of cryptographic functions is to be “sufficiently distant” from linear, and cryptographers have proposed several measures for this distance. In this paper, we show that six common measures, nonlinearity, algebraic degree, annihilator immunity, algebraic thickness, normality, and multiplicative complexity, are incomparable in the sense that for each pair of measures, μ1, μ2, there exist functions f1, f2 with f1 being more nonlinear than f2 according to μ1, but less nonlinear according to μ2. We also present new connections between two of these measures. Additionally, we give a lower bound on the multiplicative complexity of collision-free functions. PMID:27458499
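
    Of these measures, nonlinearity has a particularly direct computation via the Walsh-Hadamard transform: NL(f) = 2^(n-1) - max_a |W_f(a)| / 2. A brute-force sketch, exponential in n but fine for small functions:

    ```python
    def walsh_hadamard_nonlinearity(f, n: int) -> int:
        """Nonlinearity of a Boolean function f: {0,1}^n -> {0,1}, where
        W_f(a) = sum_x (-1)^(f(x) XOR a.x)."""
        max_abs = 0
        for a in range(2 ** n):
            w = 0
            for x in range(2 ** n):
                dot = bin(a & x).count("1") & 1   # a . x = parity of bitwise AND
                w += 1 if (f(x) ^ dot) == 0 else -1
            max_abs = max(max_abs, abs(w))
        return 2 ** (n - 1) - max_abs // 2

    # Example: f(x1, x2, x3) = x1*x2 XOR x3 on n = 3 bits
    f = lambda x: (((x >> 2) & 1) & ((x >> 1) & 1)) ^ (x & 1)
    print(walsh_hadamard_nonlinearity(f, 3))  # prints 2
    ```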

  12. A Novel Approach for Constructing One-Way Hash Function Based on a Message Block Controlled 8D Hyperchaotic Map

    NASA Astrophysics Data System (ADS)

    Lin, Zhuosheng; Yu, Simin; Lü, Jinhu

    2017-06-01

    In this paper, a novel approach for constructing a one-way hash function based on an 8D hyperchaotic map is presented. First, two nominal matrices, one with constant and one with variable parameters, are adopted for designing 8D discrete-time hyperchaotic systems. Then each input plaintext message block is transformed into an 8 × 8 matrix following the order of left to right and top to bottom, which is used as a control matrix for switching between the nominal matrix elements with constant parameters and those with variable parameters. Through this switching control, a new nominal matrix mixed with the constant and variable parameters is obtained for the 8D hyperchaotic map. Finally, the hash value is constructed from multiple low-8-bit outputs of the iterated hyperchaotic system after rounding down, and its security analysis results are also given, validating the feasibility and reliability of the proposed approach. Compared with existing schemes, the main feature of the proposed method is that it has a large number of key parameters with an avalanche effect, making it difficult to estimate or predict the key parameters via various attacks.

  13. A Lightweight Protocol for Secure Video Streaming

    PubMed Central

    Morkevicius, Nerijus; Bagdonas, Kazimieras

    2018-01-01

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing “Fog Node-End Device” layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard. PMID:29757988

  14. Providing integrity, authenticity, and confidentiality for header and pixel data of DICOM images.

    PubMed

    Al-Haj, Ali

    2015-04-01

    Exchange of medical images over public networks is subject to different types of security threats. This has triggered persistent demands for secure telemedicine implementations that provide confidentiality, authenticity, and integrity for the transmitted images. The medical image exchange standard (DICOM) offers mechanisms to provide confidentiality for the header data of the image but not for the pixel data. On the other hand, it offers mechanisms to achieve authenticity and integrity for the pixel data but not for the header data. In this paper, we propose a crypto-based algorithm that provides confidentiality, authenticity, and integrity for the pixel data as well as for the header data. This is achieved by applying strong cryptographic primitives utilizing internally generated security data, such as encryption keys, hashing codes, and digital signatures. The security data are generated internally from the header and the pixel data, thus establishing a strong bond between the DICOM data and the corresponding security data. The proposed algorithm has been evaluated extensively using DICOM images of different modalities. Simulation experiments show that confidentiality, authenticity, and integrity have been achieved, as reflected by the results we obtained for normalized correlation, entropy, PSNR, histogram analysis, and robustness.

  15. A Lightweight Protocol for Secure Video Streaming.

    PubMed

    Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis

    2018-05-14

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
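
    The core mechanism, appending a (truncated) HMAC to each datagram so the receiver can verify source authenticity and integrity, can be sketched with the standard library. The frame layout and key handling below are illustrative, not the protocol's exact format, and confidentiality would additionally require a symmetric cipher over the payload.

    ```python
    import hmac, hashlib

    KEY = b"session-key-established-out-of-band"  # placeholder key
    TAG_LEN = 16                                  # truncated HMAC-SHA256 tag

    def seal(payload: bytes, seq: int) -> bytes:
        """Frame = seq || payload || HMAC(seq || payload), one UDP datagram."""
        body = seq.to_bytes(4, "big") + payload
        tag = hmac.new(KEY, body, hashlib.sha256).digest()[:TAG_LEN]
        return body + tag

    def open_frame(frame: bytes):
        body, tag = frame[:-TAG_LEN], frame[-TAG_LEN:]
        expect = hmac.new(KEY, body, hashlib.sha256).digest()[:TAG_LEN]
        if not hmac.compare_digest(tag, expect):
            return None                 # drop forged or corrupted datagram
        return int.from_bytes(body[:4], "big"), body[4:]

    print(open_frame(seal(b"frame-0-video-bytes", seq=0)))
    ```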

  16. FSH: fast spaced seed hashing exploiting adjacent hashes.

    PubMed

    Girotto, Samuele; Comin, Matteo; Pizzi, Cinzia

    2018-01-01

    Patterns with wildcards in specified positions, namely spaced seeds, are increasingly used instead of k-mers in many bioinformatics applications that require indexing, querying, and rapid similarity search, as they can provide better sensitivity. Many of these applications require computing the hash of each position in the input sequence with respect to the given spaced seed, or to multiple spaced seeds. While the hashing of k-mers can be computed rapidly by exploiting the large overlap between consecutive k-mers, spaced-seed hashing is usually computed from scratch for each position in the input sequence, resulting in slower processing. The method proposed in this paper, fast spaced-seed hashing (FSH), exploits the similarity of the hash values of spaced seeds computed at adjacent positions in the input sequence. In our experiments we compute the hash for each position of metagenomics reads from several datasets, with respect to different spaced seeds. We also propose a generalized version of the algorithm for the simultaneous computation of multiple spaced-seed hashes. In the experiments, our algorithm computes the hash values of spaced seeds with a speedup over the traditional approach of between 1.6× and 5.3×, depending on the structure of the spaced seed. Spaced-seed hashing is a routine task for several bioinformatics applications. FSH performs this task efficiently and raises the question of whether other hashing tasks can be exploited to further improve the speedup. This has the potential of major impact in the field, making spaced-seed applications not only accurate, but also faster and more efficient. The software FSH is freely available for academic use at: https://bitbucket.org/samu661/fsh/overview.
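
    For contiguous k-mers, the overlap reuse that FSH generalizes to spaced seeds looks as follows: each position's hash is obtained from the previous one with a shift, a mask, and one new character. This sketch (2-bit DNA encoding assumed) is the baseline that FSH extends, not FSH itself.

    ```python
    ENC = {"A": 0, "C": 1, "G": 2, "T": 3}  # 2-bit encoding of nucleotides

    def kmer_hashes(seq: str, k: int):
        """Yield the hash of every k-mer, updated incrementally in O(1) per
        position instead of O(k), by reusing the previous hash."""
        mask = (1 << (2 * k)) - 1
        h = 0
        for i, ch in enumerate(seq):
            h = ((h << 2) | ENC[ch]) & mask  # shift in the new character
            if i >= k - 1:
                yield h

    print(list(kmer_hashes("ACGTAC", 3)))  # [6, 27, 44, 49] = ACG, CGT, GTA, TAC
    ```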

  17. A Hash Based Remote User Authentication and Authenticated Key Agreement Scheme for the Integrated EPR Information System.

    PubMed

    Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng

    2015-11-01

    To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we found that Das's authentication scheme is still vulnerable to modification and user duplication attacks. Thereafter we propose a secure and efficient authentication scheme for the integrated EPR information system based on lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show our new scheme is well-suited to adoption in remote medical healthcare services.

  18. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and is thus potentially implementable in real time.

  19. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, the authenticity of the image file is established if they match, since even a one-bit change in the image file will cause the recomputed image hash to be totally different from the secure hash.
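
    The hash-then-sign flow described in the patent maps directly onto standard primitives. A sketch using the third-party `cryptography` package, with Ed25519 chosen purely for brevity; the patent does not specify this algorithm.

    ```python
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    camera_key = ed25519.Ed25519PrivateKey.generate()  # embedded in the camera
    public_key = camera_key.public_key()               # printed on the housing

    def sign_image(image_file: bytes) -> bytes:
        digest = hashlib.sha256(image_file).digest()   # hash of the image file
        return camera_key.sign(digest)                 # stored with the image

    def verify_image(image_file: bytes, signature: bytes) -> bool:
        digest = hashlib.sha256(image_file).digest()   # recompute the hash
        try:
            public_key.verify(signature, digest)       # check via public key
            return True
        except InvalidSignature:
            return False

    sig = sign_image(b"raw-pixel-data")
    print(verify_image(b"raw-pixel-data", sig))   # True
    print(verify_image(b"raw-pixel-dat4", sig))   # False: one changed byte
    ```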

  20. Hash Bit Selection for Nearest Neighbor Search.

    PubMed

    Xianglong Liu; Junfeng He; Shih-Fu Chang

    2017-11-01

    To overcome the barrier of storage and computation when dealing with gigantic-scale data sets, compact hashing has been studied extensively to approximate the nearest neighbor search. Despite the recent advances, critical design issues remain open in how to select the right features, hashing algorithms, and/or parameter settings. In this paper, we address these by posing an optimal hash bit selection problem, in which an optimal subset of hash bits are selected from a pool of candidate bits generated by different features, algorithms, or parameters. Inspired by the optimization criteria used in existing hashing algorithms, we adopt the bit reliability and their complementarity as the selection criteria that can be carefully tailored for hashing performance in different tasks. Then, the bit selection solution is discovered by finding the best tradeoff between search accuracy and time using a modified dynamic programming method. To further reduce the computational complexity, we employ the pairwise relationship among hash bits to approximate the high-order independence property, and formulate it as an efficient quadratic programming method that is theoretically equivalent to the normalized dominant set problem in a vertex- and edge-weighted graph. Extensive large-scale experiments have been conducted under several important application scenarios of hash techniques, where our bit selection framework can achieve superior performance over both the naive selection methods and the state-of-the-art hashing algorithms, with significant relative accuracy gains ranging from 10% to 50%.

  1. Deep Constrained Siamese Hash Coding Network and Load-Balanced Locality-Sensitive Hashing for Near Duplicate Image Detection.

    PubMed

    Hu, Weiming; Fan, Yabo; Xing, Junliang; Sun, Liang; Cai, Zhaoquan; Maybank, Stephen

    2018-09-01

    We construct a new efficient near duplicate image detection method using a hierarchical hash code learning neural network and load-balanced locality-sensitive hashing (LSH) indexing. We propose a deep constrained siamese hash coding neural network combined with deep feature learning. Our neural network is able to extract effective features for near duplicate image detection. The extracted features are used to construct a LSH-based index. We propose a load-balanced LSH method to produce load-balanced buckets in the hashing process. The load-balanced LSH significantly reduces the query time. Based on the proposed load-balanced LSH, we design an effective and feasible algorithm for near duplicate image detection. Extensive experiments on three benchmark data sets demonstrate the effectiveness of our deep siamese hash encoding network and load-balanced LSH.

  2. Novel Duplicate Address Detection with Hash Function

    PubMed Central

    Song, GuangJia; Ji, ZhenZhou

    2016-01-01

    Duplicate address detection (DAD) is an important component of the address resolution protocol (ARP) and the neighbor discovery protocol (NDP). DAD determines whether an IP address is in conflict with other nodes. In traditional DAD, the target address to be detected is broadcast through the network, which makes it easy for malicious nodes to attack. A malicious node can send a spoofed reply to prevent the address configuration of a normal node, and thus a denial-of-service attack is launched. This study proposes a hash method to hide the target address in DAD, which prevents an attacking node from launching destination attacks. If the address of a normal node is identical to the detected address, then its hash value will be the same as the "Hash_64" field in the neighbor solicitation message, and DAD can be successfully completed. This process is called DAD-h. Simulation results indicate that address configuration using DAD-h has a considerably higher success rate under attack compared with traditional DAD. Comparative analysis shows that DAD-h does not require third-party devices or considerable computing resources; it also provides a lightweight security solution. PMID:26991901
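
    The essential trick, publishing a hash of the target address instead of the address itself, fits in a few lines. The "Hash_64" derivation below is an assumption for illustration; the paper defines its own field layout.

    ```python
    import hashlib, ipaddress

    def hash_64(addr: str) -> bytes:
        """64-bit digest of an IPv6 address, carried in the NS message
        instead of the target address itself."""
        packed = ipaddress.IPv6Address(addr).packed
        return hashlib.sha256(packed).digest()[:8]

    # The detecting node broadcasts only the hash of its tentative address:
    probe = hash_64("fe80::1")

    # A node that actually owns fe80::1 recognizes itself and replies;
    # an attacker observing `probe` cannot recover the target address.
    my_addr = "fe80::1"
    print(hash_64(my_addr) == probe)  # True -> duplicate detected
    ```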

  3. Binary Multidimensional Scaling for Hashing.

    PubMed

    Huang, Yameng; Lin, Zhouchen

    2017-10-04

    Hashing is a useful technique for fast nearest neighbor search due to its low storage cost and fast query speed. Unsupervised hashing aims at learning binary hash codes for the original features so that the pairwise distances can be best preserved. While several works have targeted this task, the results are not satisfactory, mainly due to the oversimplified model. In this paper, we propose a unified and concise unsupervised hashing framework, called Binary Multidimensional Scaling (BMDS), which is able to learn the hash code for distance preservation in both batch and online mode. In the batch mode, unlike most existing hashing methods, we do not need to simplify the model by predefining the form of the hash map. Instead, we learn the binary codes directly based on the pairwise distances among the normalized original features by Alternating Minimization. This enables a stronger expressive power of the hash map. In the online mode, we consider the holistic distance relationship between the current query example and those we have already learned, rather than only focusing on the current data chunk. It is useful when the data come in a streaming fashion. Empirical results show that while being efficient for training, our algorithm outperforms state-of-the-art methods by a large margin in terms of distance preservation, which is practical for real-world applications.

  4. Meet-in-the-Middle Preimage Attacks on Hash Modes of Generalized Feistel and Misty Schemes with SP Round Function

    NASA Astrophysics Data System (ADS)

    Moon, Dukjae; Hong, Deukjo; Kwon, Daesung; Hong, Seokhie

    We assume that the domain extender is the Merkle-Damgård (MD) scheme and that the message is padded with a '1' and the minimum number of '0's, followed by fixed-size length information, so that the length of the padded message is a multiple of the block length. Under this assumption, we analyze the security of the hash modes when the compression function follows the Davies-Meyer (DM) scheme and the underlying block cipher is one of the plain Feistel or Misty schemes or the generalized Feistel or Misty schemes with a Substitution-Permutation (SP) round function. We base this work on Meet-in-the-Middle (MitM) preimage attack techniques, and develop several useful initial structures.
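
    The padding rule assumed here is ordinary Merkle-Damgård strengthening. A sketch in the SHA-256 parameterization (64-byte blocks, 8-byte length field):

    ```python
    def md_strengthen(message: bytes, block_size: int = 64) -> bytes:
        """Append a single '1' bit, then the minimum number of '0' bits, then
        the message length in bits as a fixed-size big-endian field, so the
        padded length is a multiple of the block size."""
        bit_len = 8 * len(message)
        padded = message + b"\x80"                        # '1' bit + seven '0' bits
        padded += b"\x00" * ((-len(padded) - 8) % block_size)
        padded += bit_len.to_bytes(8, "big")              # fixed-size length field
        assert len(padded) % block_size == 0
        return padded

    print(len(md_strengthen(b"abc")))  # 64: one block for a short message
    ```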

  5. Double hashing technique in closed hashing search process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra

    2017-09-01

    The search process is used in various activities performed both online and offline, and many algorithms can be used to carry it out; one of them is the hash search algorithm. In this study, the search process uses a double hashing technique, in which the data are first placed into tables of equal length and then searched. The results of this study indicate that searching with the double hashing technique is faster than the usual search techniques. The approach divides the values between a main table and an overflow table, so the search process is expected to be faster than when the data are stacked in a single table, and collisions can be avoided.
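
    For reference, the probe sequence that defines double hashing: the i-th probe is (h1(key) + i * h2(key)) mod m. The two SHA-256-derived hash functions below are illustrative stand-ins, not the paper's.

    ```python
    import hashlib

    def _h(data: bytes, salt: bytes) -> int:
        return int.from_bytes(hashlib.sha256(salt + data).digest()[:4], "big")

    def probe_sequence(key: bytes, m: int):
        """With m a power of two and h2 forced odd, the probe sequence
        visits every slot before repeating."""
        h1 = _h(key, b"first") % m
        h2 = (_h(key, b"second") % m) | 1  # odd step => full cycle when m = 2^k
        for i in range(m):
            yield (h1 + i * h2) % m

    # Closed-hashing insert: follow the probe sequence to the first empty slot.
    table = [None] * 8
    for k in (b"alpha", b"beta", b"gamma"):
        for slot in probe_sequence(k, len(table)):
            if table[slot] is None:
                table[slot] = k
                break
    print(table)
    ```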

  6. Deep classification hashing for person re-identification

    NASA Astrophysics Data System (ADS)

    Wang, Jiabao; Li, Yang; Zhang, Xiancai; Miao, Zhuang; Tao, Gang

    2018-04-01

    With the development of public surveillance, person re-identification becomes more and more important. Large-scale databases call for efficient computation and storage, and hashing is one of the most important techniques. In this paper, we propose a new deep classification hashing network, introducing a new binary appropriation layer into traditional ImageNet pre-trained CNN models. It outputs binary appropriate features, which can easily be quantized into binary hash codes for Hamming similarity comparison. Experiments show that our deep hashing method can outperform the state-of-the-art methods on the public CUHK03 and Market1501 datasets.

  7. Learning Short Binary Codes for Large-scale Image Retrieval.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms prove to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually the code length shorter than 100 b) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of data, MCR can generate one bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost-values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve comparative performance as the state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.

  8. Speaker Linking and Applications using Non-Parametric Hashing Methods

    DTIC Science & Technology

    2016-09-08

    clustering method based on hashing—canopy-clustering. We apply this method to a large corpus of speaker recordings, demonstrate performance tradeoffs...and compare to other hashing methods. Index terms: speaker recognition, clustering, hashing, locality sensitive hashing. 1. Introduction: We assume...speaker in our corpus. Second, given a QBE method, how can we perform speaker clustering—each cluster should be a single speaker, and a cluster should

  9. An adaptive cryptographic accelerator for network storage security on dynamically reconfigurable platform

    NASA Astrophysics Data System (ADS)

    Tang, Li; Liu, Jing-Ning; Feng, Dan; Tong, Wei

    2008-12-01

    Existing security solutions in network storage environments perform poorly because cryptographic operations (encryption and decryption) implemented in software can dramatically reduce system performance. In this paper we propose a cryptographic hardware accelerator on a dynamically reconfigurable platform for the security of high-performance network storage systems. We employ a dynamically reconfigurable platform based on an FPGA to implement a PowerPC-based embedded system, which executes the cryptographic algorithms. To reduce the reconfiguration latency, we apply prefetch scheduling. Moreover, the processing elements can be dynamically configured to support different cryptographic algorithms according to the requests received by the accelerator. In our experiments, we implemented the AES (Rijndael) and 3DES cryptographic algorithms in the reconfigurable accelerator. The proposed reconfigurable cryptographic accelerator can dramatically increase performance compared with traditional software-based network storage systems.

  10. Adaptive Bloom Filter: A Space-Efficient Counting Algorithm for Unpredictable Network Traffic

    NASA Astrophysics Data System (ADS)

    Matsumoto, Yoshihide; Hazeyama, Hiroaki; Kadobayashi, Youki

    The Bloom Filter (BF), a space- and time-efficient hash-coding method, is used as one of the fundamental modules in several network processing algorithms and applications such as route lookups, cache hits, packet classification, per-flow state management and network monitoring. BF is a simple space-efficient randomized data structure used to represent a data set in order to support membership queries. However, BF generates false positives and cannot count the number of distinct elements. A counting Bloom Filter (CBF) can count the number of distinct elements, but CBF needs more space than BF. We propose an alternative data structure to CBF, which we call the Adaptive Bloom Filter (ABF). Although ABF uses the same-sized bit-vector as BF, the number of hash functions employed by ABF is dynamically changed to record the number of appearances of each key element. Accounting for hash collisions, the multiplicity of each key element in ABF can be estimated from the number of hash functions used to decode the membership of that key element. Although ABF realizes the same functionality as CBF, ABF requires only the same memory size as BF. We describe the construction of ABF and IABF (Improved ABF), and provide a mathematical analysis and simulation using Zipf's distribution. Finally, we show that ABF can be used for unpredictable data sets such as real network traffic.
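
    A minimal sketch of the adaptive idea (our own reading of the abstract, not the paper's exact construction; names and parameters are invented): the c-th insertion of a key sets the bits of one extra hash function beyond the base set, so multiplicity can be estimated by counting how many extra hash functions still map to set bits.

        import hashlib

        class AdaptiveBloomFilter:
            def __init__(self, m=1 << 16, base_k=4):
                self.m, self.base_k = m, base_k
                self.bits = bytearray(m // 8)

            def _hash(self, key: bytes, i: int) -> int:
                digest = hashlib.sha256(i.to_bytes(4, "big") + key).digest()
                return int.from_bytes(digest[:8], "big") % self.m

            def _get(self, pos):  return (self.bits[pos // 8] >> (pos % 8)) & 1
            def _set(self, pos):  self.bits[pos // 8] |= 1 << (pos % 8)

            def insert(self, key: bytes):
                new_count = self.count(key) + 1
                for i in range(self.base_k + new_count):     # base bits plus one per appearance
                    self._set(self._hash(key, i))

            def count(self, key: bytes) -> int:
                # Multiplicity = number of extra hash functions, beyond the base
                # set, that all map to set bits (ignoring hash collisions).
                if any(not self._get(self._hash(key, i)) for i in range(self.base_k)):
                    return 0
                n = 0
                while n < 64 and self._get(self._hash(key, self.base_k + n)):
                    n += 1
                return n

        abf = AdaptiveBloomFilter()
        abf.insert(b"flow-42"); abf.insert(b"flow-42")
        print(abf.count(b"flow-42"))                         # estimated multiplicity: 2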

  11. Semi-Supervised Geographical Feature Detection

    NASA Astrophysics Data System (ADS)

    Yu, H.; Yu, L.; Kuo, K. S.

    2016-12-01

    Extracting and tracking geographical features is a fundamental requirement in many geoscience fields. However, this operation has become an increasingly challenging task for domain scientists when tackling large amounts of geoscience data. Although domain scientists may have a relatively clear definition of a feature, it is difficult to capture the presence of features in an accurate and efficient fashion. We propose a semi-supervised approach to address large-scale geographical feature detection. Our approach has two main components. First, we represent heterogeneous geoscience data in a unified high-dimensional space, which facilitates evaluating the similarity of data points with respect to geolocation, time, and variable values. We characterize the data with these measures and use a set of hash functions to parameterize the initial knowledge of the data. Second, for any user query, our approach can automatically extract initial results based on the hash functions. To improve querying accuracy, our approach provides a visualization interface to display the query results and allow users to interactively explore and refine them. The user feedback is used to enhance our knowledge base in an iterative manner. In our implementation, we use high-performance computing techniques to accelerate the construction of hash functions. Our design facilitates a parallelization scheme for feature detection and extraction, which is a traditionally challenging problem for large-scale data. We evaluate our approach and demonstrate its effectiveness using both synthetic and real-world datasets.
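
    One standard way to realize "a set of hash functions" for similarity queries over high-dimensional points is random-hyperplane locality-sensitive hashing; the sketch below is our own generic illustration (not the authors' system; all names and the 16-dimensional toy data are assumptions): similar points tend to share a hash code, so a query only has to examine one bucket of candidates.

        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(0)

        def make_hasher(dim, n_bits):
            planes = rng.standard_normal((n_bits, dim))
            def hash_fn(x):
                return tuple((planes @ x) > 0)       # one bit per hyperplane
            return hash_fn

        points = rng.standard_normal((10000, 16))    # e.g. (lat, lon, time, variables, ...)
        hasher = make_hasher(16, 12)
        buckets = defaultdict(list)
        for i, p in enumerate(points):
            buckets[hasher(p)].append(i)

        query = points[42] + 0.01 * rng.standard_normal(16)
        candidates = buckets[hasher(query)]          # coarse result, to be refined interactively
        print(42 in candidates)                      # usually True for a small perturbation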

  12. Algorithm That Synthesizes Other Algorithms for Hashing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2010-01-01

    An algorithm that includes a collection of several subalgorithms has been devised as a means of synthesizing still other algorithms (which could include computer code) that utilize hashing to determine whether an element (typically, a number or other datum) is a member of a set (typically, a list of numbers). Each subalgorithm synthesizes an algorithm (e.g., a block of code) that maps a static set of key hashes to a somewhat linear, monotonically increasing sequence of integers. The goal in formulating this mapping is to make the length of the generated sequence as close as practicable to the original length of the set, and thus to minimize gaps between the elements. The advantage of this approach is that it completely avoids the traditional hash-key look-ups that involve either secondary hash generation and look-up or further searching of a hash table for a desired key in the event of collisions. The algorithm guarantees that it will never be necessary to perform a search or to generate a secondary key in order to determine whether an element is a member of a set, and further guarantees that any algorithm it synthesizes can be executed in constant time. To enforce these guarantees, the subalgorithms employ a set of techniques, each of which works very effectively for a certain class of hash-key values. These subalgorithms are of two types, summarized as follows. Given a list of numbers, try to find one or more solutions in which, if each number is shifted to the right by a constant number of bits and then masked with a rotating mask that isolates a set of bits, a unique number is thereby generated (see the sketch below); in a variant of this procedure, the masking is omitted. Various combinations of shifting, masking, and/or offsets are tried until solutions are found, and from the set of solutions the one that provides the greatest compression of the representation and executes in the minimum amount of time is selected. Alternatively, given a list of numbers, try to find one or more solutions in which, if each number is compressed by use of the modulo function with some value, a unique value is generated.
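
    The shift-and-mask subalgorithm can be sketched directly (our own minimal Python illustration of the described search, with invented names and limits): enumerate shifts and contiguous masks until every key in the set maps to a distinct value, which yields a collision-free, constant-time membership test.

        def find_shift_mask(keys, max_mask_bits=16):
            n = len(keys)
            for shift in range(32):
                for mask_bits in range(n.bit_length(), max_mask_bits + 1):
                    mask = (1 << mask_bits) - 1
                    mapped = {(k >> shift) & mask for k in keys}
                    if len(mapped) == n:          # all unique => perfect hash found
                        return shift, mask
            return None

        keys = [1024, 2051, 4078, 8117, 16204]
        print(find_shift_mask(keys))              # e.g. (0, 7): smaller masks compress better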

  13. A Polynomial Subset-Based Efficient Multi-Party Key Management System for Lightweight Device Networks.

    PubMed

    Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah

    2017-03-24

    Wireless Sensor Networks (WSNs) consist of lightweight devices that measure sensitive data and are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) face severe security and privacy issues because of the direct accessibility of devices connected to the internet. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored polynomial distribution-based key establishment schemes and identified the issue that the resultant polynomial value is either storage intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, with the group head acting as the responsible node. The polynomial generation method uses security credentials and a secure hash function. The symmetric cryptographic parameters are efficient in computation, communication, and storage. The security justification of the proposed scheme has been completed using Rubin logic, which guarantees that the protocol strongly attains mutual validation and the session key agreement property among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during the authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure.
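
    The symmetric-polynomial technique the scheme builds on can be shown in a few lines (our own toy sketch, not the EKM protocol itself; the prime, coefficients, and node IDs are invented): a trusted party picks a symmetric bivariate polynomial f(x, y) = f(y, x) mod p and gives node u the univariate share f(u, y); any two nodes can then compute the same pairwise key f(u, v) without further interaction.

        p = 2**31 - 1                                    # public prime modulus

        # Symmetric coefficient matrix: coef[i][j] == coef[j][i]
        coef = [[5, 7, 11],
                [7, 3, 2],
                [11, 2, 9]]

        def share(u):
            # Node u stores g_u(y) = f(u, y) mod p, one coefficient per power of y.
            return [sum(coef[i][j] * pow(u, i, p) for i in range(3)) % p
                    for j in range(3)]

        def pair_key(my_share, peer_id):
            return sum(c * pow(peer_id, j, p) for j, c in enumerate(my_share)) % p

        alice, bob = share(17), share(42)
        assert pair_key(alice, 42) == pair_key(bob, 17)  # both derive f(17, 42)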

  15. A novel class sensitive hashing technique for large-scale content-based remote sensing image retrieval

    NASA Astrophysics Data System (ADS)

    Reato, Thomas; Demir, Begüm; Bruzzone, Lorenzo

    2017-10-01

    This paper presents a novel class sensitive hashing technique in the framework of large-scale content-based remote sensing (RS) image retrieval. The proposed technique aims at representing each image with multi-hash codes, each of which corresponds to a primitive (i.e., land cover class) present in the image. To this end, the proposed method consists of a three-step algorithm. The first step characterizes each image by primitive class descriptors. These descriptors are obtained through a supervised approach, which initially extracts the image regions and their descriptors, which are then associated with the primitives present in the images. This step requires a set of annotated training regions to define the primitive classes. A correspondence between the regions of an image and the primitive classes is built based on the probability of each primitive class being present in each region. All regions belonging to a specific primitive class with a probability higher than a given threshold are highly representative of that class; thus, the average of the descriptors of these regions is used to characterize that primitive. In the second step, the descriptors of the primitive classes are transformed into multi-hash codes to represent each image. This is achieved by adapting the kernel-based supervised locality sensitive hashing method to multi-code hashing problems. The first two steps of the proposed technique, unlike standard hashing methods, allow one to represent each image by a set of primitive class sensitive descriptors and their hash codes. In the last step, the images in the archive that are most similar to a query image are retrieved based on a multi-hash-code-matching scheme. Experimental results obtained on an archive of aerial images confirm the effectiveness of the proposed technique in terms of retrieval accuracy when compared to standard hashing methods.

  16. Object-Location-Aware Hashing for Multi-Label Image Retrieval via Automatic Mask Learning.

    PubMed

    Huang, Chang-Qin; Yang, Shang-Ming; Pan, Yan; Lai, Han-Jiang

    2018-09-01

    Learning-based hashing is a leading approach to approximate nearest neighbor search for large-scale image retrieval. In this paper, we develop a deep supervised hashing method for multi-label image retrieval, in which we propose to learn a binary "mask" map that can identify the approximate locations of objects in an image, and then use this binary "mask" map to obtain length-limited hash codes that focus mainly on an image's objects while ignoring the background. The proposed deep architecture consists of four parts: 1) a convolutional sub-network to generate effective image features; 2) a binary "mask" sub-network to identify the approximate locations of image objects; 3) a weighted average pooling operation based on the binary "mask" to obtain feature representations and hash codes that pay most attention to foreground objects while ignoring the background; and 4) the combination of a triplet ranking loss designed to preserve relative similarities among images and a cross-entropy loss defined on image labels. We conduct comprehensive evaluations on four multi-label image data sets. The results indicate that the proposed hashing method achieves superior performance gains over state-of-the-art supervised and unsupervised hashing baselines.

  17. Bin-Hash Indexing: A Parallel Method for Fast Query Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Edward W; Gosink, Luke J.; Wu, Kesheng

    2008-06-27

    This paper presents a new parallel indexing data structure for answering queries. The index, called Bin-Hash, offers extremely high levels of concurrency, and is therefore well suited for emerging commodity parallel processors, such as multi-cores, cell processors, and general-purpose graphics processing units (GPUs). The Bin-Hash approach first bins the base data, and then partitions and separately stores the values in each bin as a perfect spatial hash table. To answer a query, we first determine whether or not a record satisfies the query conditions based on the bin boundaries. For the bins with records that cannot be resolved, we examine the spatial hash tables. The procedures for examining the bin numbers and the spatial hash tables offer the maximum possible level of concurrency; all records can be evaluated by our procedure independently in parallel. Additionally, our Bin-Hash procedures access much smaller amounts of data than similar parallel methods, such as the projection index. This smaller data footprint is critical for certain parallel processors, like GPUs, where memory resources are limited. To demonstrate the effectiveness of Bin-Hash, we implement it on a GPU using the data-parallel programming language CUDA. The concurrency offered by the Bin-Hash index allows us to fully utilize the GPU's massive parallelism; over 12,000 records can be evaluated simultaneously at any one time. We show that our new query processing method is an order of magnitude faster than current state-of-the-art CPU-based indexing technologies. Additionally, we compare our performance to existing GPU-based projection index strategies.

  18. Quantum random oracle model for quantum digital signature

    NASA Astrophysics Data System (ADS)

    Shang, Tao; Lei, Qi; Liu, Jianwei

    2016-10-01

    The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.

  19. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
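
    The verification flow described in the claim can be illustrated with a minimal Python sketch (our own illustration, not the patented system; the record format and function names are invented): hash a portion of the database at one moment, hash it again later, and compare the two digests to detect modification.

        import hashlib

        def portion_hash(records):
            h = hashlib.sha256()
            for rec in records:
                h.update(rec.encode("utf-8"))
            return h.hexdigest()

        database = ["row1:alice", "row2:bob", "row3:carol"]
        first = portion_hash(database[:2])      # initial moment in time
        # ... time passes; the dynamic database may change ...
        second = portion_hash(database[:2])     # subsequent moment in time
        print("unmodified" if first == second else "data changed")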

  20. Perl Modules for Constructing Iterators

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2009-01-01

    The Iterator Perl module provides a general-purpose framework for constructing iterator objects within Perl, and a standard API for interacting with those objects. Iterators are an object-oriented design pattern in which a description of a series of values is used in a constructor; subsequent queries can request values in that series. These Perl modules build on the standard Iterator framework and provide iterators for some other types of values. Iterator::DateTime constructs iterators from DateTime objects or Date::Parse descriptions and iCal/RFC 2445 style recurrence descriptions. It supports a variety of input parameters, including a start to the sequence, an end to the sequence, an iCal/RFC 2445 recurrence describing the frequency of the values in the series, and a format description that can refine the presentation of the DateTime. Iterator::String constructs iterators from string representations. This module is useful in contexts where the API consists of supplying a string and getting back an iterator, where the specific iteration desired is opaque to the caller. It is of particular value to the Iterator::Hash module, which provides nested iterations. Iterator::Hash constructs iterators from Perl hashes that can include multiple iterators. The constructed iterators return all the permutations of the iterations of the hash by nested iteration of embedded iterators. A hash simply includes a set of keys mapped to values; it is a very common data structure used throughout Perl programming. The Iterator::Hash module allows a hash to include strings defining iterators (parsed and dispatched with Iterator::String) that are used to construct an overall series of hash values.

  1. System using data compression and hashing adapted for use for multimedia encryption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffland, Douglas R

    2011-07-12

    A system and method is disclosed for multimedia encryption. Within the system of the present invention, a data compression module receives and compresses a media signal into a compressed data stream. A data acquisition module receives and selects a set of data from the compressed data stream. And, a hashing module receives and hashes the set of data into a keyword. The method of the present invention includes the steps of compressing a media signal into a compressed data stream; selecting a set of data from the compressed data stream; and hashing the set of data into a keyword.
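
    The disclosed three-module flow can be sketched in a few lines of Python (our own illustration, not the actual system; the offset and length parameters are invented): compress a media signal, select a subset of the compressed stream, and hash that subset into a keyword.

        import hashlib, zlib

        def keyword_from_media(media: bytes, offset=8, length=32) -> str:
            compressed = zlib.compress(media)               # data compression module
            selected = compressed[offset:offset + length]   # data acquisition module
            return hashlib.sha256(selected).hexdigest()     # hashing module

        print(keyword_from_media(b"\x00\x01" * 4096))       # keyword derived from the stream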

  2. Practical security and privacy attacks against biometric hashing using sparse recovery

    NASA Astrophysics Data System (ADS)

    Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan

    2016-12-01

    Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered a two-factor authentication method that combines a personal password (or secret key) with a biometric to obtain a secure binary template used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password, in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash, and one method that can find the closest biometric data (i.e., face image) in a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction, for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that more severe security threats can be mounted using compressed sensing recovery techniques. In addition, we present privacy attacks that reconstruct a biometric image resembling the original. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from severe security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.

  3. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
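
    The sign-then-verify flow described above can be sketched with the third-party pyca/cryptography package (our own illustration of the general technique, not the camera's firmware; in the patent the private key is embedded in the camera and the public key is printed on the housing):

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import padding, rsa
        from cryptography.exceptions import InvalidSignature

        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        public_key = private_key.public_key()           # distributed for verification

        image_file = b"...raw image bytes..."
        pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                          salt_length=padding.PSS.MAX_LENGTH)
        # sign() hashes the image file and encrypts the hash with the private key.
        signature = private_key.sign(image_file, pss, hashes.SHA256())

        try:
            public_key.verify(signature, image_file, pss, hashes.SHA256())
            print("image authentic")
        except InvalidSignature:
            print("image altered")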

  4. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  5. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  6. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  7. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Corned beef hash. 319.303 Section 319.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY... Products § 319.303 Corned beef hash. (a) “Corned Beef Hash” is the semi-solid food product in the form of a...

  8. An image retrieval framework for real-time endoscopic image retargeting.

    PubMed

    Ye, Menglong; Johns, Edward; Walter, Benjamin; Meining, Alexander; Yang, Guang-Zhong

    2017-08-01

    Serial endoscopic examinations of a patient are important for early diagnosis of malignancies in the gastrointestinal tract. However, retargeting for optical biopsy is challenging due to extensive tissue variations between examinations, requiring the method to be tolerant to these changes whilst enabling real-time retargeting. This work presents an image retrieval framework for inter-examination retargeting. We propose both a novel image descriptor that is tolerant of long-term tissue changes and a novel descriptor matching method that runs in real time. The descriptor is based on histograms generated from regional intensity comparisons over multiple scales, offering stability over long-term appearance changes at the higher levels whilst remaining discriminative at the lower levels. The matching method then learns a hashing function using random forests to compress the descriptor string and allow fast image comparison by a simple Hamming distance metric. A dataset containing 13 in vivo gastrointestinal videos was collected from six patients, representing serial examinations of each patient, including videos captured with significant time intervals. Precision-recall for retargeting shows that our new descriptor outperforms a number of alternative descriptors, whilst our hashing method outperforms a number of alternative hashing approaches. We have proposed a novel framework for optical biopsy in serial endoscopic examinations. A new descriptor, combined with a novel hashing method, achieves state-of-the-art retargeting, with validation on in vivo videos from six patients. Real-time performance also allows for practical integration without disturbing the existing clinical workflow.

  9. Rotation invariant deep binary hashing for fast image retrieval

    NASA Astrophysics Data System (ADS)

    Dai, Lai; Liu, Jianming; Jiang, Aiwen

    2017-07-01

    In this paper, we study how to compactly represent an image's characteristics for fast image retrieval. We propose supervised, rotation-invariant, compact discriminative binary descriptors obtained by combining a convolutional neural network with hashing. In the proposed network, binary codes are learned by employing a hidden layer that represents the latent concepts dominating the class labels. A loss function is proposed to minimize the difference between the binary descriptors of a reference image and its rotated version. Compared with some other supervised methods, the proposed network does not require pair-wise inputs for binary code learning. Experimental results show that our method is effective and achieves state-of-the-art results on the CIFAR-10 and MNIST datasets.

  10. Compact binary hashing for music retrieval

    NASA Astrophysics Data System (ADS)

    Seo, Jin S.

    2014-03-01

    With the huge volume of music clips available for protection, browsing, and indexing, increasing attention is being paid to retrieving the information content of music archives. Music-similarity computation is an essential building block for browsing, retrieval, and indexing of digital music archives. In practice, as the number of songs available for searching and indexing increases, the storage cost in retrieval systems becomes a serious problem. This paper addresses the storage problem by extending the supervector concept with binary hashing. We utilize similarity-preserving binary embedding to generate a hash code from the supervector of each music clip. In particular, we compare the performance of various binary hashing methods for music retrieval tasks on the widely used genre dataset and an in-house singer dataset. Through the evaluation, we find an effective way of generating hash codes for music similarity estimation that improves retrieval performance.

  11. Performance of hashed cache data migration schemes on multicomputers

    NASA Technical Reports Server (NTRS)

    Hiranandani, Seema; Saltz, Joel; Mehrotra, Piyush; Berryman, Harry

    1991-01-01

    After conducting an examination of several data-migration mechanisms which permit an explicit and controlled mapping of data to memory, a set of schemes for storage and retrieval of off-processor array elements is experimentally evaluated and modeled. All schemes considered have their basis in the use of hash tables for efficient access of nonlocal data. The techniques in question are those of hashed cache, partial enumeration, and full enumeration; in these, nonlocal data are stored in hash tables, so that the operative difference lies in the amount of memory used by each scheme and in the retrieval mechanism used for nonlocal data.

  12. Single-pixel non-imaging object recognition by means of Fourier spectrum acquisition

    NASA Astrophysics Data System (ADS)

    Chen, Huichao; Shi, Jianhong; Liu, Xialin; Niu, Zhouzhou; Zeng, Guihua

    2018-04-01

    Single-pixel imaging has emerged over recent years as a novel imaging technique, which has significant application prospects. In this paper, we propose and experimentally demonstrate a scheme that can achieve single-pixel non-imaging object recognition by acquiring the Fourier spectrum. In an experiment, a four-step phase-shifting sinusoid illumination light is used to irradiate the object image, the value of the light intensity is measured with a single-pixel detection unit, and the Fourier coefficients of the object image are obtained by a differential measurement. The Fourier coefficients are first cast into binary numbers to obtain the hash value. We propose a new method of perceptual hashing algorithm, which is combined with a discrete Fourier transform to calculate the hash value. The hash distance is obtained by calculating the difference of the hash value between the object image and the contrast images. By setting an appropriate threshold, the object image can be quickly and accurately recognized. The proposed scheme realizes single-pixel non-imaging perceptual hashing object recognition by using fewer measurements. Our result might open a new path for realizing object recognition with non-imaging.
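
    The flavor of a DFT-based perceptual hash can be sketched as follows (our own generic illustration, not the authors' algorithm; the block size and threshold rule are invented): keep the low-frequency Fourier coefficients, binarize their magnitudes against the median, and compare hashes by Hamming distance against a threshold.

        import numpy as np

        def fourier_phash(image: np.ndarray, k: int = 8) -> np.ndarray:
            spectrum = np.abs(np.fft.fft2(image))[:k, :k]   # low-frequency block
            bits = (spectrum > np.median(spectrum)).flatten()
            return bits.astype(np.uint8)

        def hash_distance(a, b) -> int:
            return int(np.count_nonzero(a != b))            # Hamming distance

        rng = np.random.default_rng(1)
        obj = rng.random((64, 64))
        noisy = obj + 0.05 * rng.random((64, 64))           # same object, perturbed
        other = rng.random((64, 64))                        # different object
        print(hash_distance(fourier_phash(obj), fourier_phash(noisy)))   # small
        print(hash_distance(fourier_phash(obj), fourier_phash(other)))   # larger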

  13. A dynamic re-partitioning strategy based on the distribution of key in Spark

    NASA Astrophysics Data System (ADS)

    Zhang, Tianyu; Lian, Xin

    2018-05-01

    Spark is a memory-based distributed data processing framework that can process massive data and has become a focus in Big Data. However, the performance of the Spark shuffle depends on the distribution of the data. The naive hash partition function of Spark cannot guarantee load balancing when the data are skewed, and the job completion time is dominated by the node with the most data to process. To handle this problem, dynamic sampling is used. During task execution, a histogram is used to count the key frequency distribution of each node, from which the global key frequency distribution is generated. After analyzing the distribution of keys, load balance across data partitions is achieved. Results show that the dynamic re-partitioning function performs better than the default hash partition, fine partition, and the Balanced-Schedule strategy; it reduces task execution time and improves the efficiency of the whole cluster.
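
    The idea can be sketched outside of Spark in plain Python (our own illustration, not Spark's API; the greedy assignment and fallback are assumptions): sample a key-frequency histogram, then assign the heaviest keys to the least-loaded partitions instead of hashing them blindly, so skewed keys do not pile onto one node.

        from collections import Counter
        import heapq

        def build_partitioner(sampled_keys, n_partitions):
            freq = Counter(sampled_keys)                   # per-key histogram from sampling
            heap = [(0, p) for p in range(n_partitions)]   # (current load, partition id)
            heapq.heapify(heap)
            assignment = {}
            for key, count in freq.most_common():          # heaviest keys first
                load, part = heapq.heappop(heap)
                assignment[key] = part
                heapq.heappush(heap, (load + count, part))
            # Unseen keys fall back to plain hash partitioning.
            return lambda key: assignment.get(key, hash(key) % n_partitions)

        keys = ["a"] * 1000 + ["b"] * 10 + ["c"] * 990 + ["d"] * 12
        partition_of = build_partitioner(keys, 2)
        print(partition_of("a"), partition_of("c"))        # heavy keys land on different partitions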

  14. Microbiological quality of five potato products obtained at retail markets.

    PubMed Central

    Duran, A P; Swartzentruber, A; Lanier, J M; Wentz, B A; Schwab, A H; Barnard, R J; Read, R B

    1982-01-01

    The microbiological quality of frozen hash brown potatoes, dried hash brown potatoes with onions, frozen french fried potatoes, dried instant mashed potatoes, and potato salad was determined by a national sampling at the retail level. A wide range of results was obtained, with most sampling units of each product having excellent microbiological quality. Geometric mean aerobic plate counts were as follows: dried hash brown potatoes, 270/g; frozen hash brown potatoes with onions, 580/g; frozen french fried potatoes, 78/g; dried instant mashed potatoes, 1.1 x 10(3)/g; and potato salad, 3.6 x 10(3)/g. Mean values of coliforms, Escherichia coli, and Staphylococcus aureus were less than 10/g. PMID:6758695

  15. Forensic hash for multimedia information

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Varna, Avinash L.; Wu, Min

    2010-01-01

    Digital multimedia such as images and videos are prevalent on today's internet and cause significant social impact, which can be evidenced by the proliferation of social networking sites with user generated contents. Due to the ease of generating and modifying images and videos, it is critical to establish trustworthiness for online multimedia information. In this paper, we propose novel approaches to perform multimedia forensics using compact side information to reconstruct the processing history of a document. We refer to this as FASHION, standing for Forensic hASH for informatION assurance. Based on the Radon transform and scale space theory, the proposed forensic hash is compact and can effectively estimate the parameters of geometric transforms and detect local tampering that an image may have undergone. Forensic hash is designed to answer a broader range of questions regarding the processing history of multimedia data than the simple binary decision from traditional robust image hashing, and also offers more efficient and accurate forensic analysis than multimedia forensic techniques that do not use any side information.

  16. A Complete and Accurate Ab Initio Repeat Finding Algorithm.

    PubMed

    Lian, Shuaibin; Chen, Xinwu; Wang, Peng; Zhang, Xiaoli; Dai, Xianhua

    2016-03-01

    It has become clear that repetitive sequences have played multiple roles in eukaryotic genome evolution, including increasing genetic diversity through mutation, changing gene expression, and facilitating the generation of novel genes. However, identification of repetitive elements can be difficult in the ab initio manner. Several classical ab initio repeat-finding tools have already been presented and compared, but the completeness and accuracy of their repeat detection are rather poor. To this end, we propose a new ab initio repeat finding tool, named HashRepeatFinder, which is based on hash indexing and word counting. Furthermore, we assessed the performance of HashRepeatFinder against two other well-known tools, RepeatScout and Repeatfinder, on the human genome data hg19. The results indicated the following three conclusions: (1) the completeness of HashRepeatFinder is the best among the three compared tools in almost all chromosomes, especially in chr9 (8 times that of RepeatScout, 10 times that of Repeatfinder); (2) in terms of detecting large repeats, HashRepeatFinder also performed best in all chromosomes, especially in chr3 (24 times that of RepeatScout and 250 times that of Repeatfinder) and chr19 (12 times that of RepeatScout and 60 times that of Repeatfinder); (3) in terms of accuracy, HashRepeatFinder can merge abundant repeats with high accuracy.

  17. Cryptographic Properties of the Hidden Weighted Bit Function

    DTIC Science & Technology

    2013-12-23


  18. Unified Communications: Simplifying DoD Communication Methods

    DTIC Science & Technology

    2013-04-18

    private key to encrypt the hash. The encrypted hash, together with some other information, such as the hashing algorithm, is known as a digital...virtual private network (VPN). The use of a VPN would allow users to access corporate data while encrypting traffic. Another layer of protection would...sign and encrypt emails as well as controlling access to restricted sites. PKI uses a combination of public and private keys for encryption and

  19. A Fast Optimization Method for General Binary Code Learning.

    PubMed

    Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng

    2016-09-22

    Hashing or binary code learning has been recognized as a way to accomplish efficient near-neighbor search, and has thus attracted broad interest in recent retrieval, vision, and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth and nonconvex) problem is reformulated as minimizing the sum of a smooth loss term and a nonsmooth indicator function. The resulting problem is then efficiently solved by an iterative procedure in which each iteration admits an analytical discrete solution, and which is thus shown to converge very fast. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both a supervised and an unsupervised hashing loss, together with bit-uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets, and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.

  20. Deterrence of device counterfeiting, cloning, and subversion by substitution using hardware fingerprinting

    DOEpatents

    Hamlet, Jason R; Bauer, Todd M; Pierson, Lyndon G

    2014-09-30

    Deterrence of device subversion by substitution may be achieved by including a cryptographic fingerprint unit within a computing device for authenticating a hardware platform of the computing device. The cryptographic fingerprint unit includes a physically unclonable function ("PUF") circuit disposed in or on the hardware platform. The PUF circuit is used to generate a PUF value. A key generator is coupled to generate a private key and a public key based on the PUF value while a decryptor is coupled to receive an authentication challenge posed to the computing device and encrypted with the public key and coupled to output a response to the authentication challenge decrypted with the private key.

  1. The self-crosslinking smart hyaluronic acid hydrogels as injectable three-dimensional scaffolds for cells culture.

    PubMed

    Bian, Shaoquan; He, Mengmeng; Sui, Junhui; Cai, Hanxu; Sun, Yong; Liang, Jie; Fan, Yujiang; Zhang, Xingdong

    2016-04-01

    Although disulfide bond crosslinked hyaluronic acid hydrogels have been reported by many research groups, most research has focused on forming the hydrogels effectively. Few researchers have paid attention to the potential significance of controlling hydrogel formation and degradation, improving biocompatibility, reducing the toxicity of exogenous agents, and providing convenience in later clinical operations. In this research, a novel controllable self-crosslinking smart hydrogel with in-situ gelation was prepared from a single component, a thiolated hyaluronic acid derivative (HA-SH), and applied as a three-dimensional scaffold to mimic the native extracellular matrix (ECM) for the culture of fibroblast cells (L929) and chondrocytes. A series of HA-SH hydrogels was prepared with different degrees of thiol substitution (ranging from 10 to 60%) and molecular weights of HA (0.1, 0.3 and 1.0 MDa). The gelation time, swelling property and smart degradation behavior of the HA-SH hydrogels were evaluated. The results showed that the gelation and degradation times of the hydrogels could be controlled by adjusting the composition of the HA-SH polymers. The storage modulus of the HA-SH hydrogels obtained by dynamic modulus analysis (DMA) could be up to 44.6 kPa. In addition, HA-SH hydrogels were investigated as a three-dimensional scaffold for the culture of fibroblast (L929) and chondrocyte cells in vitro and as an injectable hydrogel for delivering chondrocyte cells in vivo. These results illustrate that HA-SH hydrogels, with their controllable gelation process, intelligent degradation behavior, excellent biocompatibility and convenient operational characteristics, have potential for clinical application in tissue engineering and regenerative medicine. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. A Comparison of Nutrient Intakes between a Ft. Riley Contractor-Operated and a Ft. Lewis Military-Operated Garrison Dining Facility

    DTIC Science & Technology

    1987-10-01

    Meatsauce Rissole Potatoes Turkey Nuggets (I/o) Hash Browned Potatoes (I/o) Mashed Potatoes Buttered Mixed Vegetables Toasted Garlic Bread Brussels...Chicken Curry Baked Ham/P/A Sauce Parsley Buttered Potatoes Brown Gravy Hash Browned Potatoes (I/o) Steamed Rice Steamed Carrots Mashed Potatoes...Steak Mashed Potatoes Mashed Potatoes Rissole Potatoes Steamed Rice Hash Browned Potatoes (I/o) Green Beans Steamed Carrots Broccoli w/Cheese sauce

  3. Optimal hash arrangement of tentacles in jellyfish

    NASA Astrophysics Data System (ADS)

    Okabe, Takuya; Yoshimura, Jin

    2016-06-01

    At first glance, the trailing tentacles of a jellyfish appear to be randomly arranged. However, close examination of medusae has revealed that the arrangement and developmental order of the tentacles obey a mathematical rule. Here, we show that medusa jellyfish adopt the best strategy to achieve the most uniform distribution of a variable number of tentacles. The observed order of tentacles is a real-world example of an optimal hashing algorithm known as Fibonacci hashing in computer science.
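
    Fibonacci hashing, the scheme the tentacle arrangement mirrors, fits in a few lines (our own sketch; the function name is invented): multiply the key by 2^32 divided by the golden ratio and keep the top bits, so that successive keys spread as uniformly as possible over a power-of-two table.

        GOLDEN = 2654435769          # floor(2^32 / phi), phi = (1 + sqrt(5)) / 2

        def fib_hash(key: int, bits: int) -> int:
            return ((key * GOLDEN) & 0xFFFFFFFF) >> (32 - bits)

        # Successive "tentacles" 1..8 land nearly uniformly in a 16-slot table:
        print([fib_hash(k, 4) for k in range(1, 9)])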

  4. Large-scale Cross-modality Search via Collective Matrix Factorization Hashing.

    PubMed

    Ding, Guiguang; Guo, Yuchen; Zhou, Jile; Gao, Yue

    2016-09-08

    By transforming data into binary representations, i.e., hashing, we can perform high-speed search with low storage cost, and thus hashing has attracted increasing research interest in recent years. Recently, how to generate hash codes for multimodal data (e.g., images with textual tags, documents with photos, etc.) for large-scale cross-modality search (e.g., searching a database for images semantically related to a document query) has become an important research issue because of the fast growth of multimodal data on the Web. To address this issue, a novel framework for multimodal hashing is proposed, termed Collective Matrix Factorization Hashing (CMFH). The key idea of CMFH is to learn unified hash codes for the different modalities of one multimodal instance in a shared latent semantic space in which different modalities can be effectively connected. Therefore, accurate cross-modality search is supported. Based on this general framework, we extend it to the unsupervised scenario, where it tries to preserve the Euclidean structure, and to the supervised scenario, where it fully exploits the label information of the data. The corresponding theoretical analysis and optimization algorithms are given. We conducted comprehensive experiments on three benchmark datasets for cross-modality search. The experimental results demonstrate that CMFH can significantly outperform several state-of-the-art cross-modality hashing methods, which validates its effectiveness.

  5. Internet traffic load balancing using dynamic hashing with flow volume

    NASA Astrophysics Data System (ADS)

    Jo, Ju-Yeon; Kim, Yoohwan; Chao, H. Jonathan; Merat, Francis L.

    2002-07-01

    Sending IP packets over multiple parallel links is in extensive use in today's Internet and its use is growing due to its scalability, reliability and cost-effectiveness. To maximize the efficiency of parallel links, load balancing is necessary among the links, but it may cause the problem of packet reordering. Since packet reordering impairs TCP performance, it is important to reduce the amount of reordering. Hashing offers a simple solution to keep the packet order by sending a flow over a unique link, but static hashing does not guarantee an even distribution of the traffic amount among the links, which could lead to packet loss under heavy load. Dynamic hashing offers some degree of load balancing but suffers from load fluctuations and excessive packet reordering. To overcome these shortcomings, we have enhanced the dynamic hashing algorithm to utilize the flow volume information in order to reassign only the appropriate flows. This new method, called dynamic hashing with flow volume (DHFV), eliminates unnecessary flow reassignments of small flows and achieves load balancing very quickly without load fluctuation by accurately predicting the amount of transferred load between the links. In this paper we provide the general framework of DHFV and address the challenges in implementing DHFV. We then introduce two algorithms of DHFV with different flow selection strategies and show their performances through simulation.
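
    Our own reading of the idea can be sketched as follows (a simplification, not the paper's algorithm; the threshold, accounting, and names are invented): flows are hashed to links, per-flow volume is tracked, and when a link is overloaded only the largest flow needed to restore balance is reassigned, limiting packet reordering.

        from collections import Counter

        N_LINKS = 4
        link_of = {}                 # current flow -> link assignment
        volume = Counter()           # bytes seen per flow
        load = Counter()             # bytes per link

        def route(flow_id, nbytes):
            link = link_of.setdefault(flow_id, hash(flow_id) % N_LINKS)
            volume[flow_id] += nbytes
            load[link] += nbytes
            rebalance()
            return link_of[flow_id]

        def rebalance(threshold=1.5):
            avg = sum(load.values()) / N_LINKS
            hot = max(range(N_LINKS), key=lambda l: load[l])
            cold = min(range(N_LINKS), key=lambda l: load[l])
            if load[hot] <= threshold * max(avg, 1):
                return                                  # load fluctuation tolerated
            flows = [f for f, l in link_of.items() if l == hot]
            if len(flows) < 2:
                return
            # Moving one large flow balances faster and reorders fewer
            # packets than reassigning many small flows.
            big = max(flows, key=lambda f: volume[f])
            link_of[big] = cold
            load[hot] -= volume[big]
            load[cold] += volume[big]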

  6. Adoption of the Hash algorithm in a conceptual model for the civil registry of Ecuador

    NASA Astrophysics Data System (ADS)

    Toapanta, Moisés; Mafla, Enrique; Orizaga, Antonio

    2018-04-01

    The Hash security algorithm was analyzed in order to mitigate information security risks in a distributed architecture. The objective of this research is to develop a prototype for the adoption of a hash algorithm in a conceptual model for the Civil Registry of Ecuador. The deductive method was used to analyze published articles directly related to the research project "Algorithms and Security Protocols for the Civil Registry of Ecuador" and articles related to hash security algorithms. This research found that the SHA-1 security algorithm is appropriate for use in Ecuador's civil registry; the SHA-1 algorithm was adopted and described using the flowchart technique; and the adoption of the hash algorithm was captured in a conceptual model. From the comparison of the MD5 and SHA-1 algorithms, it is concluded that, in the case of an implementation, SHA-1 should be chosen, given the amount of information and data held by the Civil Registry of Ecuador. The SHA-1 flowchart defined with this technique can be modified according to the requirements of each institution, and the model for adopting the hash algorithm in a conceptual model is a prototype that can be modified according to all the actors that make up each organization.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright

    Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, a heuristic method for determining the likelihood that a given function contains a cryptographic algorithm is discussed, and the results of applying this method in various environments are shown. The algorithm is based on frequency analysis of the opcodes that make up each function within a binary.
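
    The heuristic can be sketched as follows (our own illustration; the opcode set and score are invented stand-ins, not the paper's tuned parameters): cryptographic code is dominated by arithmetic and bitwise instructions, so a high ratio of such opcodes within a function suggests a cipher or hash implementation.

        from collections import Counter

        CRYPTO_OPS = {"xor", "rol", "ror", "shl", "shr", "add", "mul", "and", "or", "not"}

        def crypto_likelihood(opcodes):
            counts = Counter(op.lower() for op in opcodes)
            total = sum(counts.values())
            return sum(counts[op] for op in CRYPTO_OPS) / total if total else 0.0

        # e.g. a disassembled function body reduced to its opcode mnemonics:
        body = ["mov", "xor", "rol", "add", "xor", "mov", "shr", "xor", "ror", "cmp"]
        print(crypto_likelihood(body))   # 0.7 -- likely cryptographic by this heuristic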

  8. A Construction of Boolean Functions with Good Cryptographic Properties

    DTIC Science & Technology

    2014-01-01


  9. Implementation analysis of RC5 algorithm on Preneel-Govaerts-Vandewalle (PGV) hashing schemes using length extension attack

    NASA Astrophysics Data System (ADS)

    Siswantyo, Sepha; Susanti, Bety Hayat

    2016-02-01

    Preneel-Govaerts-Vandewalle (PGV) schemes consist of 64 possible single-block-length schemes that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack to those 4 secure PGV schemes, instantiated with the RC5 algorithm in their basic construction, to test their collision resistance. The attack results show that collisions occur in all 4 secure PGV schemes. Based on the analysis, we indicate that the Feistel structure and data-dependent rotation in the RC5 algorithm, the XOR operations in the schemes, and the selection of the additional message block value all contribute to the occurrence of collisions.
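
    For context, one of the secure single-block-length PGV forms is the Davies-Meyer construction, h_i = E(m_i, h_{i-1}) XOR h_{i-1}, where the message block is used as the cipher key. Below is our own toy sketch (the "cipher" is a stand-in keyed permutation, explicitly not RC5 and not secure):

        MASK = (1 << 64) - 1

        def toy_block_cipher(key: int, block: int) -> int:
            # Stand-in 64-bit keyed mixing, NOT a real cipher such as RC5.
            x = block
            for r in range(8):
                x = (x + (key ^ (r * 0x9E3779B97F4A7C15))) & MASK
                x = ((x << 13) | (x >> 51)) & MASK        # 64-bit rotate left by 13
                x ^= x >> 7
            return x

        def davies_meyer_hash(blocks, iv=0x0123456789ABCDEF):
            h = iv
            for m in blocks:                              # message block acts as the key
                h = toy_block_cipher(m, h) ^ h            # h_i = E(m_i, h_{i-1}) XOR h_{i-1}
            return h

        print(hex(davies_meyer_hash([0xAAAA, 0xBBBB, 0xCCCC])))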

  10. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used in various search processes such as search engines, sorting, machine learning, neural networks, and so on. During the search process, data collisions may occur, and collisions can be prevented in several ways, one of which is the overflow technique. We apply this technique with data of varying lengths and show that it can prevent data collisions.

  11. Using Distinct Sectors in Media Sampling and Full Media Analysis to Detect Presence of Documents from a Corpus

    DTIC Science & Technology

    2012-09-01

    relative performance of several conventional SQL and NoSQL databases with a set of one billion file block hashes. Index terms: digital forensics, sector hashing, full media analysis; NoSQL: "Not only SQL" model for non-relational database management.

  12. Shannon: Theory and cryptography

    NASA Astrophysics Data System (ADS)

    Roefs, H. F. A.

    1982-11-01

    The ideas of Shannon as a theoretical basis for cryptography are discussed. The notion of mutual information is introduced to provide a deeper understanding of the functioning of cryptographic systems. Shannon's absolute secure cryptosystem and his notion of unicity distance are explained.

  13. 9 CFR 319.303 - Corned beef hash.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... combination, are salt, sugar (sucrose or dextrose), spice, and flavoring, including essential oils, oleoresins, and other spice extractives. (b) Corned beef hash may contain one or more of the following optional...

  14. Toward Optimal Manifold Hashing via Discrete Locally Linear Embedding.

    PubMed

    Rongrong Ji; Hong Liu; Liujuan Cao; Di Liu; Yongjian Wu; Feiyue Huang

    2017-11-01

    Binary code learning, also known as hashing, has received increasing attention in large-scale visual search. By transforming high-dimensional features into binary codes, the original Euclidean distance is approximated via the Hamming distance. More recently, it has been advocated that it is the manifold distance, rather than the Euclidean distance, that should be preserved in the Hamming space. However, it remains an open problem to directly preserve the manifold structure by hashing. In particular, one first needs to build the local linear embedding in the original feature space and then quantize such an embedding into binary codes. Such two-step coding is problematic and less optimized. Besides, the off-line learning is extremely time and memory consuming, since it needs to calculate the similarity matrix of the original data. In this paper, we propose a novel hashing algorithm, termed discrete locally linear embedding hashing (DLLH), which addresses the above challenges. DLLH directly reconstructs the manifold structure in the Hamming space, learning optimal hash codes that maintain the local linear relationships of data points. To learn discrete locally linear embedding codes, we further propose a discrete optimization algorithm with an iterative parameter-updating scheme. Moreover, an anchor-based acceleration scheme, termed Anchor-DLLH, is introduced, which approximates the large similarity matrix by the product of two low-rank matrices. Experimental results on three widely used benchmark data sets, i.e., CIFAR10, NUS-WIDE, and YouTube Face, show the superior performance of the proposed DLLH over state-of-the-art approaches.

  15. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  16. Provably secure Rabin-p cryptosystem in hybrid setting

    NASA Astrophysics Data System (ADS)

    Asbullah, Muhammad Asyraf; Ariffin, Muhammad Rezal Kamel

    2016-06-01

    In this work, we design an efficient and provably secure hybrid cryptosystem, constructed by combining the Rabin-p cryptosystem with an appropriate symmetric encryption scheme. We set up a hybrid structure that is proven secure in the sense of indistinguishability against the chosen-ciphertext attack. We presume that the integer factorization problem is hard and that the hash function is modeled as a random function.
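
    The general hybrid structure can be illustrated with a toy Rabin-style key encapsulation (our own sketch with tiny primes; the Rabin-p scheme itself differs in details such as the modulus form and how the correct square root is identified): encapsulate a random value r as c = r^2 mod n, derive the symmetric key as a hash of r, and recover r on the other side via square roots modulo p and q combined with the CRT.

        import hashlib, secrets

        p, q = 1000003, 1000039          # toy primes, both congruent to 3 mod 4
        n = p * q

        def encapsulate():
            r = secrets.randbelow(n - 2) + 2
            c = pow(r, 2, n)                               # Rabin: c = r^2 mod n
            key = hashlib.sha256(str(r).encode()).digest() # symmetric key = H(r)
            return c, key

        def decapsulate(c):
            # Square roots modulo p and q (valid since p, q = 3 mod 4), then CRT.
            rp, rq = pow(c, (p + 1) // 4, p), pow(c, (q + 1) // 4, q)
            candidates = []
            for sp in (rp, p - rp):
                for sq in (rq, q - rq):
                    r = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
                    candidates.append(hashlib.sha256(str(r).encode()).digest())
            return candidates            # the real scheme disambiguates the root

        c, key = encapsulate()
        assert key in decapsulate(c)     # the sender's key is among the four candidates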

  17. Changes in Benthos Associated with Mussel (Mytilus edulis L.) Farms on the West-Coast of Scotland

    PubMed Central

    Wilding, Thomas A.; Nickell, Thomas D.

    2013-01-01

    Aquaculture, as a means of food production, is growing rapidly in response to an increasing demand for protein and the over-exploitation of wild fisheries. This expansion includes mussels (family Mytilidae), where production currently stands at 1.5 million tonnes per annum. Mussel culture is frequently perceived as having little environmental impact, yet mussel biodeposits and shell debris accumulate around the production site and are linked to changes in the benthos. To assess the extent and nature of changes in benthos associated with mussel farming, grab and video sampling were conducted around seven mussel farms. Grab samples were analysed for macrofauna and shell-hash content, whilst starfish were counted and the shell-hash cover estimated from video imaging. Shell-hash was patchily distributed and occasionally dominated sediments (maximum of 2116 g per 0.1 m2 grab). Mean shell-hash content decreased rapidly at distances >5 m from the line and, over the distance 1–64 m, decreased by three orders of magnitude. The presence of shell-hash and the distance-from-line influenced macrofaunal assemblages, but this effect differed between sites. There was no evidence that mussel farming was associated with changes in macrobenthic diversity, species count or feeding strategy. However, total macrofaunal count was estimated to be 2.5 times higher in close proximity to the lines, compared with 64 m distance, and there was evidence that this effect was conditional on the presence of shell-hash. Starfish density varied considerably between sites but, overall, they were approximately 10 times as abundant close to the mussel-lines compared with 64 m distance. There was no evidence that starfish were more abundant in the presence of shell-hash visible on the sediment surface. In terms of farm-scale benthic impacts, these data suggest that mussel farming is a relatively benign way of producing food, compared with intensive fish-farming, in similar environments. PMID:23874583

  18. Exploiting the HASH Planetary Nebula Research Platform

    NASA Astrophysics Data System (ADS)

    Parker, Quentin A.; Bojičić, Ivan; Frew, David J.

    2017-10-01

    The HASH (Hong Kong/ AAO/ Strasbourg/ Hα) planetary nebula research platform is a unique data repository with a graphical interface and SQL capability that offers the community powerful, new ways to undertake Galactic PN studies. HASH currently contains multi-wavelength images, spectra, positions, sizes, morphologies and other data whenever available for 2401 true, 447 likely, and 692 possible Galactic PNe, for a total of 3540 objects. An additional 620 Galactic post-AGB stars, pre-PNe, and PPN candidates are included. All objects were classified and evaluated following the precepts and procedures established and developed by our group over the last 15 years. The complete database contains over 6,700 Galactic objects including the many mimics and related phenomena previously mistaken or confused with PNe. Curation and updating currently occurs on a weekly basis to keep the repository as up to date as possible until the official release of HASH v1 planned in the near future.

  19. Comparison of Grouping Methods for Template Extraction from VA Medical Record Text.

    PubMed

    Redd, Andrew M; Gundlapalli, Adi V; Divita, Guy; Tran, Le-Thuy; Pettey, Warren B P; Samore, Matthew H

    2017-01-01

    We investigate options for grouping templates for the purpose of template identification and extraction from electronic medical records. We sampled a corpus of 1000 documents originating from the Veterans Health Administration (VA) electronic medical record system. We grouped documents by hashing and binning tokens (Hashed), as well as by the top 5% of tokens ranked by the term frequency-inverse document frequency metric (TF-IDF). We then compared the approaches on the number of groups with 3 or more documents and on the resulting longest common subsequences (LCSs) common to all documents in each group. We found that the Hashed method had a higher success rate for finding LCSs, and found longer LCSs, than the TF-IDF method; however, the TF-IDF approach found more groups, and subsequently more long sequences, although the average LCS length was lower. In conclusion, each algorithm appears to have areas where it is superior.
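
    A minimal sketch of the hashing-and-binning idea, assuming a toy corpus and token choice (the paper's exact grouping procedure is not reproduced here):

      import hashlib
      from collections import defaultdict

      docs = {                                  # stand-in corpus of note snippets
          "d1": "chief complaint : headache since morning",
          "d2": "chief complaint : dizziness since morning",
          "d3": "medication list reviewed no changes",
      }

      def group_key(text: str, keep: int = 3) -> str:
          # bin documents by a hash over a canonical subset of their tokens
          tokens = sorted(set(text.lower().split()))[:keep]
          return hashlib.sha1(" ".join(tokens).encode()).hexdigest()[:12]

      groups = defaultdict(list)
      for doc_id, text in docs.items():
          groups[group_key(text)].append(doc_id)
      print(dict(groups))    # documents that share a bin are template candidates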

  20. A cryptographic key management solution for HIPAA privacy/security regulations.

    PubMed

    Lee, W-B; Lee, C-D

    2008-01-01

    The Health Insurance Portability and Accountability Act (HIPAA) privacy and security regulations are two crucial provisions in the protection of healthcare privacy. Privacy regulations create a principle to assure that patients have more control over their health information and set limits on the use and disclosure of health information. The security regulations stipulate the provisions implemented to guard data integrity, confidentiality, and availability. Undoubtedly, the cryptographic mechanisms are well defined to provide suitable solutions. In this paper, to comply with the HIPAA regulations, a flexible cryptographic key management solution is proposed to facilitate interoperations among the applied cryptographic mechanisms. In addition, case of consent exceptions intended to facilitate emergency applications and other possible exceptions can also be handled easily.

  1. Multi-factor authentication

    DOEpatents

    Hamlet, Jason R; Pierson, Lyndon G

    2014-10-21

    Detection and deterrence of spoofing of user authentication may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a user of the hardware device. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a PUF value. Combining logic receives the PUF value and combines it with one or more other authentication factors to generate a multi-factor authentication value. A key generator generates a private key and a public key based on the multi-factor authentication value, while a decryptor receives an authentication challenge posed to the hardware device and encrypted with the public key, and outputs a response to the authentication challenge decrypted with the private key.
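
    A rough sketch of the key-from-factors idea using the pyca/cryptography package; the PUF response and password below are stand-ins, and a signed challenge replaces the patent's encrypted-challenge response:

      import hashlib
      from cryptography.hazmat.primitives.asymmetric import ed25519

      puf_value = bytes.fromhex("aa55" * 16)      # stand-in for the PUF circuit's response
      password = b"second-authentication-factor"  # an additional factor

      # "combining logic": bind the factors into one multi-factor value
      seed = hashlib.sha256(puf_value + password).digest()

      # derive a key pair from the multi-factor authentication value
      private_key = ed25519.Ed25519PrivateKey.from_private_bytes(seed)
      public_key = private_key.public_key()

      # challenge-response: only a device with the same PUF + factor can answer
      signature = private_key.sign(b"verifier challenge")
      public_key.verify(signature, b"verifier challenge")  # raises on mismatch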

  2. Interactions between colour and synaesthetic colour: an effect of simultaneous colour contrast on synaesthetic colours.

    PubMed

    Nijboer, Tanja C W; Gebuis, Titia; te Pas, Susan F; van der Smagt, Maarten J

    2011-01-01

    We investigated whether simultaneous colour contrast affects the synaesthetic colour experience and normal colour percept in a similar manner. We simultaneously presented a target stimulus (i.e. grapheme) and a reference stimulus (i.e. hash). Either the grapheme or the hash was presented on a saturated background of the same or opposite colour category as the synaesthetic colour and the other stimulus on a grey background. In both conditions, grapheme-colour synaesthetes were asked to colour the hash in a colour similar to the synaesthetic colour of the grapheme. Controls that were pair-matched to the synaesthetes performed the same experiment, but for them, the grapheme was presented in the colour induced by the grapheme in synaesthetes. When graphemes were presented on a grey and the hash on a coloured background, a traditional simultaneous colour-contrast effect was found for controls as well as synaesthetes. When graphemes were presented on colour and the hash on grey, the controls again showed a traditional simultaneous colour-contrast effect, whereas the synaesthetes showed the opposite effect. Our results show that synaesthetic colour experiences differ from normal colour perception; both are susceptible to different surrounding colours, but not in a comparable manner. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Improving the Rainbow Attack by Reusing Colours

    NASA Astrophysics Data System (ADS)

    Ågren, Martin; Johansson, Thomas; Hell, Martin

    Hashing or encrypting a key or a password is a vital part of most network security protocols. The most practical generic attack on such schemes is a time-memory trade-off attack, which inverts any one-way function using a trade-off between memory and execution time. Existing techniques include the Hellman attack and the rainbow attack, where the latter uses different reduction functions ("colours") within a table.
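
    A toy sketch of such a trade-off attack with colour-dependent reduction functions, assuming a 4-digit key space and illustrative chain parameters:

      import hashlib
      import random

      KEYS = 10_000   # toy key space: 4-digit PINs
      T = 64          # chain length (number of colours)

      def H(key: str) -> bytes:
          return hashlib.sha256(key.encode()).digest()

      def R(colour: int, h: bytes) -> str:
          # reduction function: fold the hash back into the key space,
          # perturbed by the colour index so each column differs
          return f"{(int.from_bytes(h[:8], 'big') + colour) % KEYS:04d}"

      def build_table(n_chains: int) -> dict:
          table = {}
          for _ in range(n_chains):
              start = x = f"{random.randrange(KEYS):04d}"
              for colour in range(T):
                  x = R(colour, H(x))
              table[x] = start                 # store endpoint -> start only
          return table

      def invert(table: dict, target: bytes):
          for i in range(T - 1, -1, -1):       # guess the column of the target
              y = R(i, target)
              for k in range(i + 1, T):
                  y = R(k, H(y))
              if y in table:                   # candidate chain; walk it again
                  x = table[y]
                  for k in range(T):
                      if H(x) == target:
                          return x
                      x = R(k, H(x))
          return None                          # coverage is probabilistic

      table = build_table(2_000)
      print(invert(table, H("4221")))          # recovers "4221" if it is covered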

  4. Twenty Seven Years of Quantum Cryptography!

    NASA Astrophysics Data System (ADS)

    Hughes, Richard

    2011-03-01

    One of the fundamental goals of cryptographic research is to minimize the assumptions underlying the protocols that enable secure communications between pairs or groups of users. In 1984, building on earlier research by Stephen Wiesner, Charles Bennett and Gilles Brassard showed how quantum physics could be harnessed to provide information-theoretic security for protocols such as the distribution of cryptographic keys, which enables two parties to secure their conventional communications. Bennett and Brassard and colleagues performed a proof-of-principle quantum key distribution (QKD) experiment with single-photon quantum state transmission over a 32-cm air path in 1991. This seminal experiment led other researchers to explore QKD in optical fibers and over line-of-sight outdoor atmospheric paths ("free-space"), resulting in dramatic increases in range, bit rate and security. These advances have been enabled by improvements in sources and single-photon detectors. Also in 1991, Artur Ekert showed how the security of QKD could be related to quantum entanglement. This insight led to a deeper understanding and proof of QKD security with practical sources and detectors in the presence of transmission loss and channel noise. Today, QKD has been implemented over ranges much greater than 100 km in both fiber and free-space, multi-node network testbeds have been demonstrated, and satellite-based QKD is under study in several countries. "Quantum hacking" researchers have shown the importance of extending security considerations to the classical devices that produce and detect the photon quantum states. New quantum cryptographic protocols such as secure identification have been proposed, and others such as quantum secret splitting have been demonstrated. It is now possible to envision quantum cryptography providing a more secure alternative to present-day cryptographic methods for many secure communications functions. My talk will survey these remarkable developments.

  5. Quantum Authencryption with Two-Photon Entangled States for Off-Line Communicants

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Yu

    2016-02-01

    In this paper, a quantum authencryption protocol is proposed that uses two-photon entangled states as the quantum resource. Two communicants, Alice and Bob, share two private keys in advance, which determine the generation of the two-photon entangled states. The sender Alice sends the two-photon entangled state sequence, encoded with her classical bits, to the receiver Bob in one step of quantum transmission. Upon receiving the encoded quantum state sequence, Bob decodes Alice's classical bits with two-photon joint measurements and authenticates the integrity of Alice's secret with the help of a one-way hash function. The proposed protocol uses only one-step quantum transmission and needs neither a public discussion nor a trusted third party. As a result, it can be adapted to cases where the receiver is off-line, such as quantum e-mail systems. Moreover, the proposed protocol provides message authentication at the one-bit level with the help of a one-way hash function and has an information-theoretical efficiency of 100%.

  6. A strategy to load balancing for non-connectivity MapReduce job

    NASA Astrophysics Data System (ADS)

    Zhou, Huaping; Liu, Guangzong; Gui, Haixia

    2017-09-01

    MapReduce has been widely used on large-scale and complex datasets as a distributed programming model. The original hash partitioning function in MapReduce often causes data skew when the data distribution is uneven. To address this imbalance, we propose a strategy that changes the remaining partitioning indices when data are skewed. In the map phase, we count the amount of data that will be distributed to each reducer; the JobTracker then monitors the global partitioning information and dynamically modifies the original partitioning function according to the data-skew model, so that the partitioner can redirect partitions that would cause skew to reducers with less load in the next partitioning round, eventually balancing the load across nodes. Finally, we experimentally compare our method with existing methods on both synthetic and real datasets; the results show that our strategy solves the data-skew problem with better stability and efficiency than the hash and sampling methods for non-connectivity MapReduce tasks.
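
    The paper's partitioner is modified dynamically inside the framework; as a static approximation of the same load-balancing idea, a greedy sketch that assigns the heaviest keys to the least-loaded reducer:

      import heapq
      from collections import Counter

      def skew_aware_assignment(key_counts: Counter, n_reducers: int) -> dict:
          # place heaviest keys first on the currently least-loaded reducer
          loads = [(0, r) for r in range(n_reducers)]
          heapq.heapify(loads)
          assignment = {}
          for key, count in key_counts.most_common():
              load, r = heapq.heappop(loads)
              assignment[key] = r
              heapq.heappush(loads, (load + count, r))
          return assignment

      counts = Counter({"hot-key": 9_000, "warm": 600, "a": 90, "b": 80, "c": 70})
      print(skew_aware_assignment(counts, n_reducers=3))
      # plain hash partitioning could co-locate "hot-key" with other keys;
      # here it gets a reducer to itself while the light keys share the rest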

  7. A one-time pad color image cryptosystem based on SHA-3 and multiple chaotic systems

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Wang, Siwei; Zhang, Yingqian; Luo, Chao

    2018-04-01

    A novel image encryption algorithm is proposed that combines the SHA-3 hash function with two chaotic systems: the hyper-chaotic Lorenz and Chen systems. First, a 384-bit keystream hash value is obtained by applying SHA-3 to the plaintext. The sensitivity of the SHA-3 algorithm and of the chaotic systems ensures the effect of a one-time pad. Second, the color image is expanded into three-dimensional space. During permutation, it undergoes plane-plane displacements in the x, y, and z dimensions. During diffusion, we use the adjacent pixel dataset and the corresponding chaotic value to encrypt each pixel. Finally, alternating permutation and diffusion is applied to enhance the level of security. Furthermore, we design techniques to improve the algorithm's encryption speed. Our experimental simulations show that the proposed cryptosystem achieves excellent encryption performance and can resist brute-force, statistical, and chosen-plaintext attacks.
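
    The one-time-pad effect depends on the hash being highly sensitive to the plaintext; a quick check of that avalanche behaviour with Python's sha3_384 (the plaintext bytes are a stand-in):

      import hashlib

      def sha3_384_bits(data: bytes) -> int:
          return int.from_bytes(hashlib.sha3_384(data).digest(), "big")

      image = b"\x10\x20\x30" * 1000        # stand-in for plaintext image bytes
      tampered = b"\x11" + image[1:]        # change bits in the first byte only

      d1, d2 = sha3_384_bits(image), sha3_384_bits(tampered)
      flipped = (d1 ^ d2).bit_count()       # popcount (Python >= 3.10)
      print(f"{flipped}/384 digest bits differ")  # expect roughly half (~192)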

  8. 9 CFR 319.302 - Hash.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CERTIFICATION DEFINITIONS AND STANDARDS OF IDENTITY OR COMPOSITION Canned, Frozen, or Dehydrated Meat Food... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hash. 319.302 Section 319.302 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION...

  9. 9 CFR 319.302 - Hash.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CERTIFICATION DEFINITIONS AND STANDARDS OF IDENTITY OR COMPOSITION Canned, Frozen, or Dehydrated Meat Food... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hash. 319.302 Section 319.302 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION...

  10. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.

    PubMed

    Zeng, Ping; Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on, all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications.
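
    A loose sketch of the core trick, transforming symbol matching from a fixed starting position into numeric comparison resolved by binary search (the toy pattern set is an assumption; the actual MH combines this with hash tables):

      import bisect

      patterns = ["/api/v1/", "/login", "/static/img/"]   # assumed pattern set

      # group patterns by length; encode each as an integer for numeric comparison
      by_len = {}
      for p in patterns:
          by_len.setdefault(len(p), []).append(int.from_bytes(p.encode(), "big"))
      for values in by_len.values():
          values.sort()

      def match_from_start(url: str) -> list:
          hits = []
          for length, values in by_len.items():
              if len(url) < length:
                  continue
              key = int.from_bytes(url[:length].encode(), "big")
              i = bisect.bisect_left(values, key)      # binary search, not strcmp
              if i < len(values) and values[i] == key:
                  hits.append(url[:length])
          return hits

      print(match_from_start("/api/v1/users?id=7"))     # -> ['/api/v1/']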

  11. The Speech multi features fusion perceptual hash algorithm based on tensor decomposition

    NASA Astrophysics Data System (ADS)

    Huang, Y. B.; Fan, M. H.; Zhang, Q. Y.

    2018-03-01

    With constant progress in modern speech communication technologies, speech data are prone to noise and malicious tampering. To give the speech perceptual hash algorithm strong robustness and high efficiency, this paper proposes a speech perceptual hash algorithm based on tensor decomposition and multiple features. The algorithm obtains the speech components by wavelet packet decomposition, and extracts the LPCC, LSP, and ISP features of each speech component to constitute a speech feature tensor. Speech authentication is performed by generating hash values through quantification of the feature matrix using the median value. Experimental results show that the proposed algorithm is robust to content-preserving operations compared with similar algorithms and is able to resist common background-noise attacks. The algorithm is also computationally efficient, meeting the real-time requirements of speech communication and completing speech authentication quickly.

  12. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol

    PubMed Central

    Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on—all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications. PMID:28399157

  13. 21 CFR 1311.08 - Incorporation by reference.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the National Institute of Standards and Technology, Computer Security Division, Information Technology... Publication (FIPS PUB) 140-2, Change Notices (12-03-2002), Security Requirements for Cryptographic Modules... §§ 1311.30(b), 1311.55(b), 1311.115(b), 1311.120(b), 1311.205(b). (i) Annex A: Approved Security Functions...

  14. Totally S-protected hyaluronic acid: Evaluation of stability and mucoadhesive properties as liquid dosage form.

    PubMed

    Pereira de Sousa, Irene; Suchaoin, Wongsakorn; Zupančič, Ožbej; Leichner, Christina; Bernkop-Schnürch, Andreas

    2016-11-05

    The aim of this study was to synthesize hyaluronic acid (HA) derivatives bearing mucoadhesive properties and showing prolonged stability at pH 7.4 and under oxidative conditions as a liquid dosage form. HA was modified by thiolation with l-cysteine (HA-SH) and by conjugation with a 2-mercaptonicotinic acid-l-cysteine ligand to obtain an S-protected derivative (HA-MNA). The polymers were characterized by determination of thiol group content and mercaptonicotinic acid content. Cytotoxicity, stability, and mucoadhesive properties (rheological evaluation and tensile test) of the polymers were evaluated. HA-SH and HA-MNA were successfully synthesized with a degree of modification of 5% and 9% of the total moles of carboxylic acid groups, respectively. MTT assay revealed no toxicity for the polymers. HA-SH proved to be unstable both at pH 7.4 and under oxidative conditions, whereas HA-MNA was stable under both conditions. Rheological assessment showed a 52-fold and a 3-fold increase in viscosity for HA-MNA incubated with mucus compared with unmodified HA and HA-SH, respectively. Tensile evaluation carried out with intestinal and conjunctival mucosa confirmed the higher mucoadhesive properties of HA-MNA compared with HA-SH. According to the presented results, HA-MNA appears to be a potent excipient for the formulation of stable liquid dosage forms showing comparatively high mucoadhesive properties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Network-Aware Mechanisms for Tolerating Byzantine Failures in Distributed Systems

    DTIC Science & Technology

    2012-01-01

    In Digest, we use SHA-256 as the collision-resistant hash function, using the realization of SHA-256 from the OpenSSL toolkit. The results…

  16. Variable-bias coin tossing

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger; Kent, Adrian

    2006-03-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT.

  17. Heavy-Ion Microbeam Fault Injection into SRAM-Based FPGA Implementations of Cryptographic Circuits

    NASA Astrophysics Data System (ADS)

    Li, Huiyun; Du, Guanghua; Shao, Cuiping; Dai, Liang; Xu, Guoqing; Guo, Jinlong

    2015-06-01

    Transistors hit by heavy ions may conduct transiently, thereby introducing transient logic errors. Attackers can exploit these abnormal behaviors and extract sensitive information from the electronic devices. This paper demonstrates an ion irradiation fault injection attack experiment into a cryptographic field-programmable gate-array (FPGA) circuit. The experiment proved that the commercial FPGA chip is vulnerable to low-linear energy transfer carbon irradiation, and the attack can cause the leakage of secret key bits. A statistical model is established to estimate the possibility of an effective fault injection attack on cryptographic integrated circuits. The model incorporates the effects from temporal, spatial, and logical probability of an effective attack on the cryptographic circuits. The rate of successful attack calculated from the model conforms well to the experimental results. This quantitative success rate model can help evaluate security risk for designers as well as for the third-party assessment organizations.

  18. Practical comparison of distributed ledger technologies for IoT

    NASA Astrophysics Data System (ADS)

    Red, Val A.

    2017-05-01

    Existing distributed ledger implementations - specifically, several blockchain implementations - embody a cacophony of divergent capabilities augmenting innovations of cryptographic hashes, consensus mechanisms, and asymmetric cryptography in a wide variety of applications. Whether specifically designed for cryptocurrency or otherwise, several distributed ledgers rely upon modular mechanisms such as consensus or smart contracts. These components, however, can vary substantially among implementations; differences involving proof-of-work, practical byzantine fault tolerance, and other consensus approaches exemplify distinct distributed ledger variations. Such divergence results in unique combinations of modules, performance, latency, and fault tolerance. As implementations continue to develop rapidly due to the emerging nature of blockchain technologies, this paper encapsulates a snapshot of sensor and internet of things (IoT) specific implementations of blockchain as of the end of 2016. Several technical risks and divergent approaches preclude standardization of a blockchain for sensors and IoT in the foreseeable future; such issues will be assessed alongside the practicality of IoT applications among Hyperledger, Iota, and Ethereum distributed ledger implementations suggested for IoT. This paper contributes a comparison of existing distributed ledger implementations intended for practical sensor and IoT utilization. A baseline for characterizing distributed ledger implementations in the context of IoT and sensors is proposed. Technical approaches and performance are compared considering IoT size, weight, and power limitations. Consensus and smart contracts, if applied, are also analyzed for the respective implementations' practicality and security. Overall, the maturity of distributed ledgers with respect to sensor and IoT applicability will be analyzed for enterprise interoperability.

  19. A comparative study of Message Digest 5(MD5) and SHA256 algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Tarigan, J. T.; Ginting, A. B. C.

    2018-03-01

    A document is a collection of written or printed data containing information. As technology advances rapidly, the integrity of documents must be maintained. Because an open document can be read and modified by many parties, the integrity of the information it contains is not otherwise preserved. To maintain data integrity, a mechanism called a digital signature is needed. A digital signature is a specific code generated by a digital-signature function, and one class of algorithms used to create digital signatures is the hash function. There are many hash functions; two of them are Message Digest 5 (MD5) and SHA256. Both algorithms have their own advantages and disadvantages. The purpose of this research is to determine which algorithm is better, using running time and complexity as the comparison parameters. The results show that the complexity of MD5 and SHA256 is the same, i.e., Θ(N), but in terms of speed, MD5 performs better than SHA256.
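
    The comparison is easy to reproduce with Python's hashlib; absolute timings vary by platform, so this only sketches the measurement:

      import hashlib
      import timeit

      data = b"x" * 1_000_000        # 1 MB message

      md5_time = timeit.timeit(lambda: hashlib.md5(data).digest(), number=100)
      sha256_time = timeit.timeit(lambda: hashlib.sha256(data).digest(), number=100)

      print(f"MD5:    {md5_time:.3f} s for 100 hashes")
      print(f"SHA256: {sha256_time:.3f} s for 100 hashes")
      # both scale linearly in the input size (Theta(N)); MD5 is typically faster,
      # but MD5 is cryptographically broken and unsuitable for signatures today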

  20. Ontology-Based Peer Exchange Network (OPEN)

    ERIC Educational Resources Information Center

    Dong, Hui

    2010-01-01

    In current Peer-to-Peer networks, distributed and semantic-free indexing is widely used by systems adopting "Distributed Hash Table" ("DHT") mechanisms. Although such systems typically solve a user query rather fast in a deterministic way, they only support a very narrow search scheme, namely the exact hash key match. Furthermore, DHT systems put…

  1. Secure Minutiae-Based Fingerprint Templates Using Random Triangle Hashing

    NASA Astrophysics Data System (ADS)

    Jin, Zhe; Jin Teoh, Andrew Beng; Ong, Thian Song; Tee, Connie

    Due to privacy concerns about the widespread use of biometric authentication systems, biometric template protection has recently gained great attention in biometric research. It is a challenging task to design a biometric template protection scheme that is anonymous, revocable, and noninvertible while maintaining acceptable performance. Many methods have been proposed to resolve this problem, and cancelable biometrics is one of them. In this paper, we propose a scheme coined Random Triangle Hashing, which follows the concept of cancelable biometrics in the fingerprint domain. In this method, re-alignment of fingerprints is not required, as all the minutiae are translated into a pre-defined two-dimensional space based on a reference minutia. After that, the proposed Random Triangle hashing method is used to enforce the one-way property (non-invertibility) of the biometric template. The proposed method is resistant to minor translation errors and rotation distortion. Finally, the hash vectors are converted into bit-strings to be stored in the database. The proposed method is evaluated on the public database FVC2004 DB1, achieving an EER of less than 1%.

  2. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads electronic information, analyzes it according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry contains a concept corresponding to a word or phrase the system has previously encountered. The system creates an object model based on the user-defined logical associations, used to review each concept contained in the electronic information and determine whether the electronic information is classified. The system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find an electronic information token in the hash table, that token is added to a missing-terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
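
    A minimal sketch of that flow, with a dict as the hash table and simple counting rules standing in for the object model (all names are illustrative):

      # hash table: token -> concept (Python dicts are hash tables)
      concepts = {"patient": "MEDICAL", "dosage": "MEDICAL", "invoice": "FINANCE"}

      # user-defined logic standing in for the object model: a rule fires when
      # enough concepts of a class have been seen, classifying the document
      rules = {"MEDICAL": lambda seen: seen >= 2, "FINANCE": lambda seen: seen >= 1}

      def classify(text: str):
          counts = {}
          missing_terms = []
          for token in text.lower().split():
              concept = concepts.get(token)
              if concept is None:
                  missing_terms.append(token)   # token not yet in the hash table
              else:
                  counts[concept] = counts.get(concept, 0) + 1
          labels = [c for c, rule in rules.items() if rule(counts.get(c, 0))]
          return labels, missing_terms

      print(classify("patient reported dosage change"))
      # -> (['MEDICAL'], ['reported', 'change'])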

  3. In Vitro and Ex Vivo Evaluation of Novel Curcumin-Loaded Excipient for Buccal Delivery.

    PubMed

    Laffleur, Flavia; Schmelzle, Franziska; Ganner, Ariane; Vanicek, Stefan

    2017-08-01

    This study aimed to develop a mucoadhesive polymeric excipient comprising curcumin for buccal delivery. Curcumin offers a broad range of benefits such as antioxidant, anti-inflammatory, and chemotherapeutic activity. Hyaluronic acid (HA) as polymeric excipient was modified by immobilization of thiol-bearing ligands: l-cysteine (SH) ethyl ester was covalently attached via amide bond formation between cysteine and the carboxylic moiety of hyaluronic acid. Successful synthesis was confirmed by 1H-NMR and IR spectra. The obtained thiolated polymer, hyaluronic acid ethyl ester (HA-SH), was evaluated in terms of stability, safety, mucoadhesiveness, drug release, and permeation-enhancing properties. HA-SH showed 2.75-fold higher swelling capacity over time in comparison to the unmodified polymer. Furthermore, mucoadhesion increased 3.4-fold in the case of HA-SH, and drug release increased 1.6-fold versus the HA control. Curcumin-loaded HA-SH exhibited a 4.4-fold higher permeation compared with the respective HA. Taking these outcomes into consideration, the novel curcumin-loaded excipient, namely thiolated hyaluronic acid ethyl ester, appears to be a promising tool for pharyngeal diseases.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamlet, Jason; Pierson, Lyndon; Bauer, Todd

    Supply chain security to detect, deter, and prevent the counterfeiting of networked and stand-alone integrated circuits (ICs) is critical to cyber security. Sandia National Laboratory researchers have developed IC ID to leverage Physically Unclonable Functions (PUFs) and strong cryptographic authentication to create a unique fingerprint for each integrated circuit. IC ID assures the authenticity of ICs to prevent tampering or malicious substitution.

  5. Concatenations of the Hidden Weighted Bit Function and Their Cryptographic Properties

    DTIC Science & Technology

    2014-01-01

  6. A Foundational Proof Framework for Cryptography

    DTIC Science & Technology

    2015-05-01

  7. Cryptography for a High-Assurance Web-Based Enterprise

    DTIC Science & Technology

    2013-10-01

    Java provides many cryptographic services through the Java Cryptography Architecture (JCA) framework.

  8. Classification of cognitive systems dedicated to data sharing

    NASA Astrophysics Data System (ADS)

    Ogiela, Lidia; Ogiela, Marek R.

    2017-08-01

    This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve secure and efficient information management with such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is presented, and possible applications of cognitive approaches to visual information management and encryption are described.
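
    As one concrete splitting primitive, an n-of-n XOR split in Python; threshold schemes, where any k of n shares suffice, require e.g. Shamir's construction instead:

      import secrets

      def split(secret: bytes, n: int) -> list:
          # n-1 random shares; the last share is the XOR of the secret with all of them
          shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
          last = bytes(secret)
          for share in shares:
              last = bytes(a ^ b for a, b in zip(last, share))
          return shares + [last]

      def combine(shares: list) -> bytes:
          acc = bytes(len(shares[0]))            # all-zero accumulator
          for share in shares:
              acc = bytes(a ^ b for a, b in zip(acc, share))
          return acc

      shares = split(b"split this record", 3)
      assert combine(shares) == b"split this record"
      # any n-1 shares reveal nothing: each is uniformly random on its own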

  9. Cryptographic Securities Exchanges

    NASA Astrophysics Data System (ADS)

    Thorpe, Christopher; Parkes, David C.

    While transparency in financial markets should enhance liquidity, its exploitation by unethical and parasitic traders discourages others from fully embracing disclosure of their own information. Traders exploit both the private information in upstairs markets used to trade large orders outside traditional exchanges and the public information present in exchanges' quoted limit order books. Using homomorphic cryptographic protocols, market designers can create "partially transparent" markets in which every matched trade is provably correct and only beneficial information is revealed. In a cryptographic securities exchange, market operators can hide information to prevent its exploitation, and still prove facts about the hidden information such as bid/ask spread or market depth.

  10. [PREPARATION AND BIOCOMPATIBILITY OF IN SITU CROSSLINKING HYALURONIC ACID HYDROGEL].

    PubMed

    Liang, Jiabi; Li, Jun; Wang, Ting; Liang, Yuhong; Zou, Xuenong; Zhou, Guangqian; Zhou, Zhiyu

    2016-06-08

    To fabricate an in situ crosslinking hyaluronic acid hydrogel and evaluate its biocompatibility in vitro. Acrylic acid chloride and polyethylene glycol were used to prepare the crosslinking agent polyethylene glycol acrylate (PEGDA), and the molecular structure of PEGDA was analyzed by Fourier transform infrared spectroscopy and 1H nuclear magnetic resonance spectroscopy. Hyaluronic acid was chemically modified to prepare thiolated hyaluronic acid (HA-SH), and the degree of thiolation was analyzed qualitatively and quantitatively by the Ellman method. HA-SH solutions in concentrations (w/v) of 0.5%, 1.0%, and 1.5% and PEGDA solutions in concentrations (w/v) of 2%, 4%, and 6% were prepared with PBS. The two solutions were mixed in different ratios to obtain in situ crosslinking hyaluronic acid hydrogels, and the crosslinking time was recorded. The cellular toxicity of the in situ crosslinking hyaluronic acid hydrogel (1.5% HA-SH mixed with 4% PEGDA) was tested with L929 cells, and the biocompatibility of the hydrogel was tested by co-culture with human bone mesenchymal stem cells (hBMSCs). Fourier transform infrared spectroscopy showed that most hydroxyl groups were replaced by acrylate groups; 1H nuclear magnetic resonance spectroscopy showed 3 characteristic hydrogen peaks representing acrylate and olefinic bonds at 5-7 ppm. The thiolation yield of HA-SH was 65.4%. The in situ crosslinking time of the hydrogel was 2 to 70 minutes at PEGDA concentrations of 2%-6% and HA-SH concentrations of 0.5%-1.5%. The hyaluronic acid hydrogel appeared transparent. The toxicity of the hydrogel leaching solution was grade 1. hBMSCs grew well and distributed evenly in the hydrogel with very high viability. The in situ crosslinking hyaluronic acid hydrogel has low cytotoxicity, good biocompatibility, and controllable crosslinking time, so it could be used as a potential tissue-engineered scaffold or repair material for tissue regeneration.

  11. Secure Cryptographic Key Management System (CKMS) Considerations for Smart Grid Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Aldridge, Hal

    2011-01-01

    In this paper, we examine some unique challenges associated with key management in the Smart Grid and concomitant research initiatives: 1) effectively model security requirements and their implementations, and 2) manage keys and key distribution for very large scale deployments such as Smart Meters over a long period of performance. This will set the stage to: 3) develop innovative, low cost methods to protect keying material, and 4) provide high assurance authentication services. We will present our perspective on key management and will discuss some key issues within the life cycle of a cryptographic key designed to achieve the following: 1) control systems designed, installed, operated, and maintained to survive an intentional cyber assault with no loss of critical function, and 2) widespread implementation of methods for secure communication between remote access devices and control centers that are scalable and cost-effective to deploy.

  12. Cost-Sensitive Local Binary Feature Learning for Facial Age Estimation.

    PubMed

    Lu, Jiwen; Liong, Venice Erin; Zhou, Jie

    2015-12-01

    In this paper, we propose a cost-sensitive local binary feature learning (CS-LBFL) method for facial age estimation. Unlike the conventional facial age estimation methods that employ hand-crafted descriptors or holistically learned descriptors for feature representation, our CS-LBFL method learns discriminative local features directly from raw pixels for face representation. Motivated by the fact that facial age estimation is a cost-sensitive computer vision problem and local binary features are more robust to illumination and expression variations than holistic features, we learn a series of hashing functions to project raw pixel values extracted from face patches into low-dimensional binary codes, where binary codes with similar chronological ages are projected as close as possible, and those with dissimilar chronological ages are projected as far as possible. Then, we pool and encode these local binary codes within each face image as a real-valued histogram feature for face representation. Moreover, we propose a cost-sensitive local binary multi-feature learning method to jointly learn multiple sets of hashing functions using face patches extracted from different scales to exploit complementary information. Our methods achieve competitive performance on four widely used face aging data sets.

  13. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christy, J. E.; Nickless, W. K.; Thiede, D. R.

    The Transport version 3 (T3) system uses the Network News Transfer Protocol (NNTP) to move data from sources to a Data Repository (DR). Interested recipients subscribe to newsgroups to retrieve data. Data in transport are protected by AES-256 and RSA cryptographic services provided by the external OpenSSL cryptographic libraries.
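
    The same hybrid pattern, sketched with the pyca/cryptography package (itself an OpenSSL wrapper); the key sizes and payload are illustrative, not the T3 code:

      import os
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import padding, rsa
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

      # sender: encrypt the payload with a fresh AES-256 key, wrap the key with RSA
      aes_key = AESGCM.generate_key(bit_length=256)
      nonce = os.urandom(12)
      ciphertext = AESGCM(aes_key).encrypt(nonce, b"report for the repository", None)
      oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)
      wrapped_key = recipient_key.public_key().encrypt(aes_key, oaep)

      # recipient: unwrap the AES key with RSA, then decrypt the payload
      recovered_key = recipient_key.decrypt(wrapped_key, oaep)
      plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
      assert plaintext == b"report for the repository"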

  15. Report on Pairing-based Cryptography.

    PubMed

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST's position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed.

  16. Report on Pairing-based Cryptography

    PubMed Central

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST’s position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed. PMID:26958435

  17. A covert authentication and security solution for GMOs.

    PubMed

    Mueller, Siguna; Jafari, Farhad; Roth, Don

    2016-09-21

    The proliferation and expansion of security risks necessitate new measures to ensure the authenticity and validation of GMOs. Watermarking and other cryptographic methods are available that conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios, watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are required. Herein, we present a new crypto protocol, applicable in broader settings, that embeds the authentication string indistinguishably from a random element in the signature space, and the string is verified or denied without disclosing the actual signature. Results show that in a nucleotide string of length 1000, the algorithm gives a correlation of 0.98 or higher between the distribution of the codons and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty lies in the use of more complex cryptographic techniques, based on zero-knowledge proofs, to encode information.

  18. A Study of Gaps in Cyber Defense Automation

    DTIC Science & Technology

    2015-10-18

    The file is normalized to lowercase; the normalized file is then tokenized using an n-length window, and these n-tokens are hashed into a Bloom filter for that file.
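
    A compact sketch of that fingerprinting step, with an illustrative filter size and hash count (a real pipeline would go on to compare filters to score file similarity):

      import hashlib

      class BloomFilter:
          def __init__(self, m_bits: int = 1 << 16, k: int = 4):
              self.m, self.k, self.bits = m_bits, k, 0

          def _indices(self, item: bytes):
              for i in range(self.k):           # k independent hashes via prefixes
                  digest = hashlib.sha256(bytes([i]) + item).digest()
                  yield int.from_bytes(digest[:4], "big") % self.m

          def add(self, item: bytes):
              for i in self._indices(item):
                  self.bits |= 1 << i

          def __contains__(self, item: bytes) -> bool:
              return all(self.bits >> i & 1 for i in self._indices(item))

      def n_tokens(text: str, n: int = 5):
          t = text.lower()                      # normalization step
          return (t[i:i + n].encode() for i in range(len(t) - n + 1))

      bf = BloomFilter()
      for tok in n_tokens("Example File Contents"):
          bf.add(tok)
      print(b"le fi" in bf, b"zzzzz" in bf)     # True False (modulo false positives)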

  19. 75 FR 52798 - State-07, Cryptographic Clearance Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-27

    ... of records, Authority for maintenance of the system, Purpose, Safeguards and Retrievability as well... INDIVIDUALS COVERED BY THE SYSTEM: All current Civil Service and Foreign Service direct hire employees of the... well as those who have already received cryptographic clearance. CATEGORIES OF RECORDS IN THE SYSTEM...

  20. Simultaneous binary hash and features learning for image retrieval

    NASA Astrophysics Data System (ADS)

    Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.

    2016-05-01

    Content-based image retrieval systems have many applications in the modern world, the most important being image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique, which is the main reason this kind of automatic image processing has attracted so much attention in recent years. Despite considerable progress in the field, semantically meaningful image retrieval remains a challenging task. The main issue is the need to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for the simultaneous learning of global image features and binary hash codes. Our approach maps a pixel-based image representation to the hash-value space while preserving as much of the semantic image content as possible. We use deep learning to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing ones is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The presented framework for data-dependent image hashing is based on two kinds of neural networks: convolutional neural networks for image description and an autoencoder for the feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results compared with other state-of-the-art methods.

  1. A novel, privacy-preserving cryptographic approach for sharing sequencing data

    PubMed Central

    Cassa, Christopher A; Miller, Rachel A; Mandl, Kenneth D

    2013-01-01

    Objective: DNA samples are often processed and sequenced in facilities external to the point of collection. These samples are routinely labeled with patient identifiers or pseudonyms, allowing for potential linkage to identity and private clinical information if intercepted during transmission. We present a cryptographic scheme to securely transmit externally generated sequence data which does not require any patient identifiers, public key infrastructure, or the transmission of passwords. Materials and methods: This novel encryption scheme cryptographically protects participant sequence data using a shared secret key that is derived from a unique subset of an individual's genetic sequence. This scheme requires access to a subset of an individual's genetic sequence to acquire full access to the transmitted sequence data, which helps to prevent sample mismatch. Results: We validate that the proposed encryption scheme is robust to sequencing errors, population uniqueness, and sibling disambiguation, and provides sufficient cryptographic key space. Discussion: Access to a set of an individual's genotypes and a mutually agreed cryptographic seed is needed to unlock the full sequence, which provides additional sample authentication and authorization security. We present modest fixed and marginal costs to implement this transmission architecture. Conclusions: It is possible for genomics researchers who sequence participant samples externally to protect the transmission of sequence data using unique features of an individual's genetic sequence. PMID:23125421
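
    A rough sketch of deriving a symmetric key from a genotype subset with a standard KDF; the SNP string and seed are hypothetical, and the real scheme must additionally tolerate sequencing errors, which a plain KDF does not:

      import hashlib

      # hypothetical genotype subset agreed between sender and recipient
      genotype_subset = "rs123:AA;rs456:AG;rs789:GG;rs1011:CT"
      shared_seed = b"study-42-transfer-2013"   # mutually agreed cryptographic seed

      key = hashlib.pbkdf2_hmac("sha256", genotype_subset.encode(),
                                shared_seed, 200_000, dklen=32)
      print(key.hex())   # 256-bit key; only a holder of the genotypes can re-derive it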

  2. Developing a Standard Method for Link-Layer Security of CCSDS Space Communications

    NASA Technical Reports Server (NTRS)

    Biggerstaff, Craig

    2009-01-01

    Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, there has been no direct support within these protocols for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with an objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.

  3. A new method of cannabis ingestion: the dangers of dabs?

    PubMed

    Loflin, Mallory; Earleywine, Mitch

    2014-10-01

    A new method for administering cannabinoids, called butane hash oil ("dabs"), is gaining popularity among marijuana users. Despite press reports that suggest that "dabbing" is riskier than smoking flower cannabis, no data address whether dabs users experience more problems from use than those who prefer flower cannabis. The present study aimed to gather preliminary information on dabs users and test whether dabs use is associated with more problems than using flower cannabis. Participants (n=357) reported on their history of cannabis use, their experience with hash oil and the process of "dabbing," reasons for choosing "dabs" over other methods, and any problems related to both flower cannabis and butane hash oil. Analyses revealed that using "dabs" created no more problems or accidents than using flower cannabis. Participants did report that "dabs" led to higher tolerance and withdrawal (as defined by the participants), suggesting that the practice might be more likely to lead to symptoms of addiction or dependence. The use of butane hash oil has spread outside of the medical marijuana community, and users view it as significantly more dangerous than other forms of cannabis use. Published by Elsevier Ltd.

  4. Fast Exact Search in Hamming Space With Multi-Index Hashing.

    PubMed

    Norouzi, Mohammad; Punjani, Ali; Fleet, David J

    2014-06-01

    There is growing interest in representing image data and feature descriptors using compact binary codes for fast near-neighbor search. Although binary codes are motivated by their use as direct indices (addresses) into a hash table, codes longer than 32 bits are not used as such, since this was thought to be ineffective. We introduce a rigorous way to build multiple hash tables on binary code substrings that enables exact k-nearest-neighbor search in Hamming space. The approach is storage efficient and straightforward to implement. Theoretical analysis shows that the algorithm exhibits sub-linear run-time behavior for uniformly distributed codes. Empirical results show dramatic speedups over a linear-scan baseline for datasets of up to one billion codes of 64, 128, or 256 bits.
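
    A small sketch of the substring-table idea: with 64-bit codes split into four 16-bit substrings, any code within Hamming distance r < 4 of the query matches it exactly in at least one substring (pigeonhole); the full algorithm also probes substrings at small radii to support larger r:

      CODE_BITS, M = 64, 4
      SUB = CODE_BITS // M            # 16-bit substrings
      MASK = (1 << SUB) - 1

      def substrings(code: int) -> list:
          return [(code >> (i * SUB)) & MASK for i in range(M)]

      def build_tables(codes: list) -> list:
          tables = [dict() for _ in range(M)]
          for cid, code in enumerate(codes):
              for i, s in enumerate(substrings(code)):
                  tables[i].setdefault(s, []).append(cid)
          return tables

      def query(tables, codes, q: int, r: int) -> list:
          candidates = set()
          for i, s in enumerate(substrings(q)):
              candidates.update(tables[i].get(s, []))   # exact substring hits
          # verify candidates with a full Hamming-distance check
          return sorted(c for c in candidates if (codes[c] ^ q).bit_count() <= r)

      codes = [0x0123456789ABCDEF, 0xFFFF00000000FFFF, 0x0123456789ABCDEC]
      tables = build_tables(codes)
      print(query(tables, codes, 0x0123456789ABCDEF, r=3))   # -> [0, 2]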

  5. HASH: the Hong Kong/AAO/Strasbourg Hα planetary nebula database

    NASA Astrophysics Data System (ADS)

    Parker, Quentin A.; Bojičić, Ivan S.; Frew, David J.

    2016-07-01

    By incorporating our major recent discoveries with re-measured and verified contents of existing catalogues we provide, for the first time, an accessible, reliable, on-line SQL database of essential, up-to-date information for all known Galactic planetary nebulae (PNe). We have attempted to: i) reliably remove PN mimics/false IDs that have biased previous studies and ii) provide accurate positions, sizes, morphologies, multi-wavelength imagery and spectroscopy. We also provide a link to CDS/Vizier for the archival history of each object and other valuable links to external data. With the HASH interface, users can sift, select, browse, collate, investigate, download and visualise the entire currently known Galactic PNe diversity. HASH provides the community with the most complete and reliable data with which to undertake new science.

  6. Significance of cannabis use to dental practice.

    PubMed

    Maloney, William James

    2011-04-01

    The illicit use of the three main forms of cannabis (marijuana, hash, and hash oil) poses certain obstacles and challenges to the dental professional. There are a number of systemic as well as oral/head and neck manifestations associated with cannabis use. Dentists need to be aware of these manifestations in order to take the necessary precautions and/or modify the proposed treatment accordingly.

  7. Portable Language-Independent Adaptive Translation from OCR. Phase 1

    DTIC Science & Technology

    2009-04-01

    …including brute-force k-Nearest Neighbors (kNN), fast approximate kNN using hashed k-d trees, classification and regression trees, and locality… achieved by refinements in ground-truthing protocols. Recent algorithmic improvements to our approximate kNN classifier using hashed k-d trees allow… In recent years, discriminative training has been shown to outperform phonetic HMMs estimated using ML for speech recognition. Standard ML estimation…

  8. Elliptic net and its cryptographic application

    NASA Astrophysics Data System (ADS)

    Muslim, Norliana; Said, Mohamad Rushdan Md

    2017-11-01

    An elliptic net is a generalization of the elliptic divisibility sequence, and in the field of cryptography most elliptic-curve-based cryptographic pairings, such as the Tate pairing, can be improved by applying the elliptic net algorithm. An elliptic net is constructed from an n-dimensional array of rational values satisfying nonlinear recurrence relations that arise from elliptic divisibility sequences. The two main properties holding in the recurrence relations are, for all positive integers m > n, $h_{m+n}h_{m-n} = h_{m+1}h_{m-1}h_n^2 - h_{n+1}h_{n-1}h_m^2$, and that $h_n$ divides $h_m$ whenever n divides m. In this research, we discuss elliptic divisibility sequences associated with elliptic nets from a cryptographic perspective and possible research directions.
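
    The recurrence can be used directly to generate an elliptic divisibility sequence from its first four terms. Below is a minimal Python sketch (not the authors' elliptic net algorithm) using the n = 2 instance of the relation; the initial values correspond to the classical sequence attached to the curve y^2 + y = x^3 - x and the point (0, 0).

        from fractions import Fraction

        def eds_terms(h1, h2, h3, h4, count):
            # Generate h_1..h_count of an elliptic divisibility sequence from
            # the n = 2 case of the recurrence:
            #   h_{m+2} * h_{m-2} = h_{m+1} * h_{m-1} * h_2**2 - h_3 * h_1 * h_m**2
            # Exact rational arithmetic is used; for a genuine EDS every
            # division comes out exact (the h_n | h_m divisibility property).
            h = {1: Fraction(h1), 2: Fraction(h2), 3: Fraction(h3), 4: Fraction(h4)}
            for m in range(3, count - 1):
                h[m + 2] = (h[m + 1] * h[m - 1] * h[2] ** 2
                            - h[3] * h[1] * h[m] ** 2) / h[m - 2]
            return [h[i] for i in range(1, count + 1)]

        # Classical EDS for y^2 + y = x^3 - x with P = (0, 0):
        print([int(t) for t in eds_terms(1, 1, -1, 1, 10)])
        # -> [1, 1, -1, 1, 2, -1, -3, -5, 7, -4]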

  9. R&D100: IC ID

    ScienceCinema

    Hamlet, Jason; Pierson, Lyndon; Bauer, Todd

    2018-06-25

    Supply chain security to detect, deter, and prevent the counterfeiting of networked and stand-alone integrated circuits (ICs) is critical to cyber security. Sandia National Laboratories researchers have developed IC ID to leverage Physically Unclonable Functions (PUFs) and strong cryptographic authentication to create a unique fingerprint for each integrated circuit. IC ID assures the authenticity of ICs to prevent tampering or malicious substitution.

  10. Design of cryptographically secure AES like S-Box using second-order reversible cellular automata for wireless body area network applications.

    PubMed

    Gangadari, Bhoopal Rao; Rafi Ahamed, Shaik

    2016-09-01

    In biomedical applications, data security is the most expensive resource for wireless body area networks. Cryptographic algorithms are used in order to protect the information against unauthorised access. The Advanced Encryption Standard (AES) cryptographic algorithm plays a vital role in telemedicine applications. The authors propose a novel approach for the design of substitution bytes (S-Box) using second-order reversible one-dimensional cellular automata (RCA2) as a replacement for the classical look-up-table (LUT) based S-Box used in the AES algorithm. The performance of the proposed RCA2-based S-Box and the conventional LUT-based S-Box is evaluated in terms of security using cryptographic properties such as nonlinearity, correlation immunity bias, strict avalanche criteria and entropy. Moreover, it is also shown that RCA2-based S-Boxes are dynamic in nature, invertible and provide a high level of security. Further, it is also found that the RCA2-based S-Box has comparatively better performance than the conventional LUT-based S-Box.
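
    The abstract does not spell out the exact RCA2 update rule, so the Python sketch below only illustrates the generic second-order (Fredkin-style) reversible construction that such designs build on: the next configuration is the rule output XORed with the configuration two steps back, which makes the evolution invertible. The function names and the use of an elementary rule's Wolfram number are illustrative assumptions.

        def step(prev, cur, rule):
            # One second-order reversible CA step on a ring of cells:
            #   next[i] = f(cur[i-1], cur[i], cur[i+1]) XOR prev[i]
            # where f is an elementary CA rule given by its 8-bit Wolfram
            # number. Invertibility: prev[i] = f(...) XOR next[i], so the
            # evolution can be run backwards from two consecutive states.
            n = len(cur)
            nxt = []
            for i in range(n):
                neigh = (cur[(i - 1) % n] << 2) | (cur[i] << 1) | cur[(i + 1) % n]
                nxt.append(((rule >> neigh) & 1) ^ prev[i])
            return cur, nxt

        # Example: iterate rule 30 in second-order form on an 8-cell ring.
        prev = [0] * 8
        cur = [0, 0, 0, 1, 0, 0, 0, 0]
        for _ in range(4):
            prev, cur = step(prev, cur, 30)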

  11. Design of cryptographically secure AES like S-Box using second-order reversible cellular automata for wireless body area network applications

    PubMed Central

    Rafi Ahamed, Shaik

    2016-01-01

    In biomedical applications, data security is the most expensive resource for wireless body area networks. Cryptographic algorithms are used in order to protect the information against unauthorised access. The Advanced Encryption Standard (AES) cryptographic algorithm plays a vital role in telemedicine applications. The authors propose a novel approach for the design of substitution bytes (S-Box) using second-order reversible one-dimensional cellular automata (RCA2) as a replacement for the classical look-up-table (LUT) based S-Box used in the AES algorithm. The performance of the proposed RCA2-based S-Box and the conventional LUT-based S-Box is evaluated in terms of security using cryptographic properties such as nonlinearity, correlation immunity bias, strict avalanche criteria and entropy. Moreover, it is also shown that RCA2-based S-Boxes are dynamic in nature, invertible and provide a high level of security. Further, it is also found that the RCA2-based S-Box has comparatively better performance than the conventional LUT-based S-Box. PMID:27733924

  12. The Zagora cryptograph

    NASA Astrophysics Data System (ADS)

    Coucouzeli, A.

    A unique lead seal from the well-known eighth century B.C. settlement of Zagora on the island of Andros dramatically confirms and expands our knowledge of the town planning identified at the site and constituting the earliest example of an orthogonal grid plan in the Greek world. The seal in question is decorated with a symbolic design that constitutes a rare representation of the Dioskouroi as part of the constellation Gemini. This design appears to have acted as a cryptograph enciphering the basic mathematical and astronomical principles behind the planning of Zagora. Besides offering us new insights into early Greek settlement planning, the cryptograph seems to reveal an advanced practical competence in mathematics and celestial observation, which was hitherto unsuspected for such an early period. The Zagora cryptograph also suggests that astronomy and mathematics played a crucial role in the strengthening of the ruling elite's power at Zagora in the framework of the rising city-state or polis. The tight interweaving of astronomical, mathematical, architectural and social considerations in the planning of Zagora is an entirely new discovery for Greece, whose implications are far-reaching.

  13. Octadecyl Chains Immobilized onto Hyaluronic Acid Coatings by Thiol-ene "Click Chemistry" Increase the Surface Antimicrobial Properties and Prevent Platelet Adhesion and Activation to Polyurethane.

    PubMed

    Felgueiras, Helena P; Wang, L M; Ren, K F; Querido, M M; Jin, Q; Barbosa, M A; Ji, J; Martins, M C L

    2017-03-08

    Infection and thrombus formation are still the biggest challenges for the success of blood contact medical devices. This work aims at the development of an antimicrobial and hemocompatible biomaterial coating through which selective binding of albumin (passivant protein) from the bloodstream is promoted and, thus, adsorption of other proteins responsible for bacterial adhesion and thrombus formation can be prevented. Polyurethane (PU) films were coated with hyaluronic acid, an antifouling agent, that was previously modified with thiol groups (HA-SH), using polydopamine as the binding agent. Octadecyl acrylate (C18) was used to attract albumin since it resembles the circulating free fatty acids and albumin is a fatty acid transporter. Thiol-ene "click chemistry" was explored for C18 immobilization on HA-SH through a covalent bond between the thiol groups from the HA and the alkene groups from the C18 chains. Surfaces were prepared with different C18 concentrations (0, 5, 10, and 20%) and successful immobilization was demonstrated by scanning electron microscopy (SEM), water contact angle determinations, X-ray photoelectron spectroscopy (XPS) and Fourier transform infrared spectroscopy (FTIR). The ability of surfaces to bind albumin selectively was determined by quartz crystal microbalance with dissipation (QCM-D). Albumin adsorption increased in response to the hydrophobic nature of the surfaces, which augmented with C18 saturation. HA-SH coating reduced albumin adsorption to PU. C18 immobilized onto HA-SH at 5% promoted selective binding of albumin, decreased Staphylococcus aureus adhesion and prevented platelet adhesion and activation to PU in the presence of human plasma. C18/HA-SH coating was established as an innovative and promising strategy to improve the antimicrobial properties and hemocompatibility of any blood contact medical device.

  14. Range image registration based on hash map and moth-flame optimization

    NASA Astrophysics Data System (ADS)

    Zou, Li; Ge, Baozhen; Chen, Lei

    2018-03-01

    Over the past decade, evolutionary algorithms (EAs) have been introduced to solve range image registration problems because of their robustness and high precision. However, EA-based range image registration algorithms are time-consuming. To reduce the computational time, an EA-based range image registration algorithm using hash map and moth-flame optimization is proposed. In this registration algorithm, a hash map is used to avoid over-exploitation in the registration process. Additionally, we present a search equation that is better at exploration and a restart mechanism to avoid being trapped in local minima. We compare the proposed registration algorithm with the registration algorithms using moth-flame optimization and several state-of-the-art EA-based registration algorithms. The experimental results show that the proposed algorithm has a lower computational cost than other algorithms and achieves similar registration precision.
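
    The abstract does not detail how the hash map prevents over-exploitation; one plausible reading, sketched below in Python under that assumption (all names hypothetical), is to memoize fitness evaluations on quantized transform parameters so the optimizer never pays twice for the same region of the search space.

        def make_cached_objective(objective, resolution=1e-3):
            # Memoize fitness evaluations keyed on a quantized parameter
            # vector. A revisited cell is served from the hash map, so the
            # optimizer wastes no registration-error evaluations on it.
            cache = {}
            def cached(params):
                key = tuple(round(p / resolution) for p in params)
                if key not in cache:
                    cache[key] = objective(params)
                return cache[key]
            return cached

        # Usage: wrap the registration error before handing it to the
        # moth-flame (or any other EA) optimizer.
        # fitness = make_cached_objective(registration_error)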

  15. Matching CCD images to a stellar catalog using locality-sensitive hashing

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Yu, Jia-Zong; Peng, Qing-Yu

    2018-02-01

    Using a subset of observed stars in a CCD image to find their corresponding stars in a stellar catalog is an important issue in astronomical research. Subgraph-isomorphism-based algorithms are the most widely used methods in star catalog matching. When more subgraph features are provided, CCD images are recognized better. However, when the navigation feature database is large, the method requires more time to match the observing model. To solve this problem, this study investigates and improves subgraph-isomorphism matching algorithms. We present an algorithm based on a locality-sensitive hashing technique, which allocates quadrilateral models in the navigation feature database into different hash buckets and reduces the search range to the bucket in which the observed quadrilateral model is located. Experimental results indicate the effectiveness of our method.
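
    As a sketch of the bucketing idea, the Python fragment below uses the random-hyperplane LSH family (an illustrative choice; the paper's hash construction for quadrilateral models is not reproduced here) so that similar feature vectors tend to collide in one bucket, and a query probes only that bucket rather than scanning the whole database.

        import random

        def make_lsh(dim, n_planes=8, seed=0):
            # Random-hyperplane LSH: each plane contributes one sign bit,
            # so nearby feature vectors tend to land in the same bucket.
            rng = random.Random(seed)
            planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]
            def bucket(vec):
                bits = 0
                for plane in planes:
                    dot = sum(p * v for p, v in zip(plane, vec))
                    bits = (bits << 1) | (dot >= 0)
                return bits
            return bucket

        # Index catalog quadrilateral descriptors once, then probe a single
        # bucket per observed quadrilateral instead of the whole database.
        bucket_of = make_lsh(dim=4)
        catalog = {}  # bucket id -> list of catalog models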

  16. Architectural design of an Algol interpreter

    NASA Technical Reports Server (NTRS)

    Jackson, C. K.

    1971-01-01

    The design of a syntax-directed interpreter for a subset of Algol is described. It is a conceptual design with sufficient details and completeness but as much independence of implementation as possible. The design includes a detailed description of a scanner, an analyzer described in the Floyd-Evans productions, a hash-coded symbol table, and an executor. Interpretation of sample programs is also provided to show how the interpreter functions.

  17. Discrete cosine transform and hash functions toward implementing a (robust-fragile) watermarking scheme

    NASA Astrophysics Data System (ADS)

    Al-Mansoori, Saeed; Kunhu, Alavi

    2013-10-01

    This paper proposes a blind multi-watermarking scheme based on designing two back-to-back encoders. The first encoder is implemented to embed a robust watermark into remote sensing imagery by applying a Discrete Cosine Transform (DCT) approach. Such a watermark is used in many applications to protect the copyright of the image. The second encoder embeds a fragile watermark using the SHA-1 hash function. The purpose behind embedding a fragile watermark is to prove the authenticity of the image (i.e. tamper-proofing). Thus, the proposed technique was developed as a result of new challenges with piracy of remote sensing imagery ownership. This led researchers to look for different means to secure the ownership of satellite imagery and prevent the illegal use of these resources. Therefore, the Emirates Institution for Advanced Science and Technology (EIAST) proposed utilizing an existing data security concept by embedding a digital signature, a "watermark", into DubaiSat-1 satellite imagery. In this study, DubaiSat-1 images with 2.5 meter resolution are used as a cover and a colored EIAST logo is used as a watermark. In order to evaluate the robustness of the proposed technique, a couple of attacks are applied such as JPEG compression, rotation and synchronization attacks. Furthermore, tampering attacks are applied to prove image authenticity.
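
    A minimal sketch of the fragile-watermark principle with SHA-1 follows; the paper's actual embedding of the digest into the imagery is not described in the abstract, so this Python fragment only shows why a cryptographic hash detects tampering: any bit change in the cover data changes the stored digest.

        import hashlib

        def image_digest(pixel_bytes):
            # SHA-1 digest of the cover image's pixel data; any single-bit
            # change in the pixels changes the digest, flagging tampering.
            return hashlib.sha1(pixel_bytes).hexdigest()

        original = bytes(range(256)) * 64        # stand-in for pixel data
        stored_mark = image_digest(original)     # kept as the fragile mark

        tampered = bytearray(original)
        tampered[100] ^= 1                       # flip a single bit
        assert image_digest(bytes(tampered)) != stored_mark  # detected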

  18. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    PubMed Central

    Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and for mutual authentication the login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using the smart card. Our scheme is efficient, because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, always locally, without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and features provided. PMID:24892078

  19. A robust and effective smart-card-based remote user authentication mechanism using hash function.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and for mutual authentication the login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using the smart card. Our scheme is efficient, because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, always locally, without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and features provided.
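
    The scheme itself is not reproduced in the abstract; the Python sketch below (all names hypothetical) only illustrates the kind of one-way-hash-and-XOR primitives such smart-card schemes combine, here a server-derived verifier stored on the card XOR-blinded under a hash of the password.

        import hashlib, os

        H = lambda *parts: hashlib.sha256(b"|".join(parts)).digest()

        def xor(a, b):
            return bytes(x ^ y for x, y in zip(a, b))

        server_secret = os.urandom(32)

        def issue_card(identity, password):
            # Store only masked values on the card: the verifier is
            # XOR-blinded with a hash of the password, so the card alone
            # reveals neither the verifier nor the password.
            verifier = H(identity, server_secret)
            return {"id": identity, "masked": xor(verifier, H(identity, password))}

        def login(card, password):
            # Unmask with the entered password; only the correct password
            # reproduces the value the server can recompute and check.
            return xor(card["masked"], H(card["id"], password))

        card = issue_card(b"alice", b"pw")
        assert login(card, b"pw") == H(b"alice", server_secret)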

  20. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that their scheme has several drawbacks: (1) incorrect password change phase, (2) fails to preserve the user anonymity property, (3) fails to establish a secret session key between a legal user and the server, (4) fails to protect against strong replay attacks, and (5) lacks rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for the formal security verification using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including the replay and man-in-the-middle attacks. Our scheme is also efficient as compared to Awasthi-Srivastava's scheme.

  1. System and method for key generation in security tokens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.

    Functional randomness in security tokens (FRIST) may achieve improved security in two-factor authentication hardware tokens by improving on the algorithms used to securely generate random data. A system and method in one embodiment according to the present invention may allow for security of a token based on storage cost and computational security. This approach may enable communication where security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).
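
    For comparison, here is a minimal Python sketch of the classic Lamport-style baseline in which one-time passwords are generated from a single cryptographic function (SHA-256), the kind of single-function OTP generation the record says FRIST improves on; the names are illustrative.

        import hashlib

        def hash_chain(seed: bytes, n: int):
            # OTP_i = H^(n-i)(seed): passwords are released in reverse
            # order of generation, so observing one OTP never reveals the
            # not-yet-used ones (that would require inverting the hash).
            chain = [seed]
            for _ in range(n):
                chain.append(hashlib.sha256(chain[-1]).digest())
            return chain

        chain = hash_chain(b"device-seed", 1000)
        server_anchor = chain[-1]     # server stores only the chain tip

        otp = chain[-2]               # first one-time password released
        assert hashlib.sha256(otp).digest() == server_anchor  # verify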

  2. A Software Assurance Framework for Mitigating the Risks of Malicious Software in Embedded Systems Used in Aircraft

    DTIC Science & Technology

    2011-09-01

    to show cryptographic signature # generation on a UNIX system # SHA=/bin/sha256 CSDB=/tmp/csdb CODEBASE=. touch "$CSDB" find "$CODEBASE" -type f...artifacts generated earlier. 81 #! /bin/sh # # Demo program to show cryptographic signature # verification on a UNIX system # SHA=/bin/sha256 CSDB=/tmp

  3. Technical Analysis of SSP-21 Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bromberger, S.

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  4. Correlation Immunity, Avalanche Features, and Other Cryptographic Properties of Generalized Boolean Functions

    DTIC Science & Technology

    2017-09-01

  5. A Tree Locality-Sensitive Hash for Secure Software Testing

    DTIC Science & Technology

    2017-09-14

    errors, or to look for vulnerabilities that could allow a nefarious actor to use our software against us. Ultimately, all testing is designed to find...and an equivalent number of feasible paths discovered by Klee. 1.5 Summary This document presents the Tree Locality-Sensitive Hash (TLSH), a locality-sensitive...performing two groups of tests that verify the accuracy and usefulness of TLSH. Chapter 5 summarizes the contents of the dissertation and lists avenues

  6. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation

    PubMed Central

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David

    2017-01-01

    Background As one of the several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored, and only GUID and hash codes are allowed. The quality of PII entry is critical to the GUID system. Objective The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of PII while registering a subject. Methods According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly-planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the registering system of study subjects. Results There are 127,700 error-planted subjects, of which 114,464 (89.64%) can still be identified as the previous one and the remaining 13,236 (10.36%, 13,236/127,700) are discriminated as new subjects. As expected, 100% of nonidentified subjects had errors within the required PII fields. The possibility that a subject is identified is related to the count and the type of incorrect PII field. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of the incorrect PII field. The best situation is to precisely find the exact incorrect PII fields, and the worst situation is to shrink the questionable scope only to a set of 13 PII fields. In the application, the proposed algorithm can give a hint of questionable PII entry and perform as an effective tool. Conclusions The GUID system has high error tolerance and may correctly identify and associate a subject even with few PII field errors. Correct data entry, especially required PII fields, is critical to avoiding false splits. In the context of one-way hash transformation, the questionable input of PII may be identified by applying set theory operators based on the hash codes. The count and the type of incorrect PII fields play an important role in identifying a subject and locating questionable PII fields. PMID:28213343

  7. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation.

    PubMed

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David; Yang, Rong

    2017-02-17

    As one of the several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored, and only GUID and hash codes are allowed. The quality of PII entry is critical to the GUID system. The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of PII while registering a subject. According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly-planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the registering system of study subjects. There are 127,700 error-planted subjects, of which 114,464 (89.64%) can still be identified as the previous one and the remaining 13,236 (10.36%, 13,236/127,700) are discriminated as new subjects. As expected, 100% of nonidentified subjects had errors within the required PII fields. The possibility that a subject is identified is related to the count and the type of incorrect PII field. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of the incorrect PII field. The best situation is to precisely find the exact incorrect PII fields, and the worst situation is to shrink the questionable scope only to a set of 13 PII fields. In the application, the proposed algorithm can give a hint of questionable PII entry and perform as an effective tool. The GUID system has high error tolerance and may correctly identify and associate a subject even with few PII field errors. Correct data entry, especially required PII fields, is critical to avoiding false splits. In the context of one-way hash transformation, the questionable input of PII may be identified by applying set theory operators based on the hash codes. The count and the type of incorrect PII fields play an important role in identifying a subject and locating questionable PII fields. ©Xianlai Chen, Yang C Fann, Matthew McAuliffe, David Vismer, Rong Yang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 17.02.2017.
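
    A minimal Python sketch of the underlying one-way transformation follows; the specific field combinations, normalization, and choice of SHA-256 here are illustrative assumptions, since the real GUID system fixes its own combination patterns and matching rules.

        import hashlib
        from itertools import combinations

        def normalize(value):
            return value.strip().lower()

        def pii_hash_codes(pii, min_fields=2):
            # One hash code per combination of PII fields; only these
            # codes, never the PII itself, are sent to and stored on the
            # GUID server. Matching two registrations then reduces to
            # comparing their hash-code sets.
            codes = {}
            fields = sorted(pii)
            for r in range(min_fields, len(fields) + 1):
                for combo in combinations(fields, r):
                    material = "|".join(f"{f}={normalize(pii[f])}" for f in combo)
                    codes[combo] = hashlib.sha256(material.encode()).hexdigest()
            return codes

        codes = pii_hash_codes({"first": "Ann", "last": "Lee", "dob": "2001-02-03"})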

  8. Status of MAPA (Modular Accelerator Physics Analysis) and the Tech-X Object-Oriented Accelerator Library

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.

    1998-04-01

    The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.

  9. ProGeRF: Proteome and Genome Repeat Finder Utilizing a Fast Parallel Hash Function

    PubMed Central

    Moraes, Walas Jhony Lopes; Rodrigues, Thiago de Souza; Bartholomeu, Daniella Castanheira

    2015-01-01

    Repetitive element sequences are adjacent, repeating patterns, also called motifs, and can be of different lengths; repetitions can involve their exact or approximate copies. They have been widely used as molecular markers in population biology. Given the sizes of sequenced genomes, various bioinformatics tools have been developed for the extraction of repetitive elements from DNA sequences. However, currently available tools do not provide options for identifying repetitive elements in the genome or proteome, displaying a user-friendly web interface, and performing exhaustive searches. ProGeRF is a web site for extracting repetitive regions from genome and proteome sequences. It was designed to be an efficient, fast, accurate, and primarily user-friendly web tool allowing many ways to view and analyse the results. ProGeRF (Proteome and Genome Repeat Finder) is freely available as a stand-alone program, from which the users can download the source code, and as a web tool. It was developed using the hash table approach to extract perfect and imperfect repetitive regions in a (multi)FASTA file, while allowing a linear time complexity. PMID:25811026
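
    For the perfect-repeat case, the hash-table idea reduces to a few lines; the Python sketch below (illustrative only, not ProGeRF's parallel implementation, and without its imperfect-repeat handling) buckets every k-mer by its sequence in one linear pass and reports buckets hit more than once.

        from collections import defaultdict

        def perfect_repeats(seq, k):
            # Hash every k-mer to its start positions in one linear pass;
            # any bucket with two or more positions is a perfect repeat.
            buckets = defaultdict(list)
            for i in range(len(seq) - k + 1):
                buckets[seq[i:i + k]].append(i)
            return {kmer: pos for kmer, pos in buckets.items() if len(pos) > 1}

        print(perfect_repeats("ACGTACGTTTACGT", 4))
        # -> {'ACGT': [0, 4, 10], 'TACG': [3, 9]}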

  10. A Key Establishment Protocol for RFID User in IPTV Environment

    NASA Astrophysics Data System (ADS)

    Jeong, Yoon-Su; Kim, Yong-Tae; Sohn, Jae-Min; Park, Gil-Cheol; Lee, Sang-Ho

    In recent years, the usage of IPTV (Internet Protocol Television) has increased. The reason is the technological convergence of broadcasting and telecommunication, delivering interactive applications and multimedia content through high-speed Internet connections. The main critical point of IPTV security requirements is subscriber authentication. That is, the IPTV service should have the capability to identify subscribers to prohibit illegal access. Currently, IPTV service does not provide a sound authentication mechanism to verify the identity of its wireless users (or devices). This paper focuses on a lightweight authentication and key establishment protocol based on the use of hash functions. The proposed approach provides effective authentication for a mobile user with an RFID tag whose authentication information is communicated back and forth with the IPTV authentication server via the IPTV set-top box (STB). That is, the proposed protocol generates the user's authentication information as a bundle of two public keys derived from hashing the user's private keys and the RFID tag's session identifier, and adds 1 bit to this bundled information for subscriber information confidentiality before passing it to the authentication server.

  11. Using Compilers to Enhance Cryptographic Product Development

    NASA Astrophysics Data System (ADS)

    Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.

    Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources available. In this paper we present the motivation for the European-funded CACE (Computer Aided Cryptography Engineering) project. The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.

  12. HECLIB. Volume 2: HECDSS Subroutines Programmer’s Manual

    DTIC Science & Technology

    1991-05-01

    algorithm and hierarchical design for database accesses. This algorithm provides quick access to data sets and an efficient means of adding new data set...Description of How DSS Works DSS version 6 utilizes a modified hash algorithm based upon the pathname to store and retrieve data. This structure allows...balancing disk space and record access times. A variation in this algorithm is for "stable" files. In a stable file, a hash table is not utilized

  13. Stable carbon and oxygen isotope record of central Lake Erie sediments

    USGS Publications Warehouse

    Tevesz, M.J.S.; Spongberg, A.L.; Fuller, J.A.

    1998-01-01

    Stable carbon and oxygen isotope data from mollusc aragonite extracted from sediment cores provide new information on the origin and history of sedimentation in the southwestern area of the central basin of Lake Erie. Sediments infilling the Sandusky subbasin consist of three lithologic units overlying glacial deposits. The lowest of these is a soft gray mud overlain by a shell hash layer containing Sphaerium striatinum fragments. A fluid mud unit caps the shell hash layer and extends upwards to the sediment-water interface. New stable isotope data suggest that the soft gray mud unit is of postglacial, rather than proglacial, origin. These data also suggest that the shell hash layer was derived from erosional winnowing of the underlying soft gray mud layer. This winnowing event may have occurred as a result of the Nipissing flood. The Pelee-Lorain moraine, which forms the eastern boundary of the Sandusky subbasin, is an elevated area of till capped by a sand deposit that originated as a beach. The presence of both the shell hash layer and relict beach deposit strengthens the interpretation that the Nipissing flood was a critical event in the development of the southwestern area of the central basin of Lake Erie. This event, which returned drainage from the upper lakes to the Lake Erie basin, was a dominant influence on regional stratigraphy, bathymetry, and depositional setting.

  14. Improved One-Way Hash Chain and Revocation Polynomial-Based Self-Healing Group Key Distribution Schemes in Resource-Constrained Wireless Networks

    PubMed Central

    Chen, Huifang; Xie, Lei

    2014-01-01

    Self-healing group key distribution (SGKD) aims to deal with the key distribution problem over an unreliable wireless network. In this paper, we investigate the SGKD issue in resource-constrained wireless networks. We propose two improved SGKD schemes using the one-way hash chain (OHC) and the revocation polynomial (RP), the OHC&RP-SGKD schemes. In the proposed OHC&RP-SGKD schemes, by introducing the unique session identifier and binding the joining time with the capability of recovering previous session keys, the problem of the collusion attack between revoked users and new joined users in existing hash chain-based SGKD schemes is resolved. Moreover, novel methods for utilizing the one-way hash chain and constructing the personal secret, the revocation polynomial and the key updating broadcast packet are presented. Hence, the proposed OHC&RP-SGKD schemes eliminate the limitation of the maximum allowed number of revoked users on the maximum allowed number of sessions, increase the maximum allowed number of revoked/colluding users, and reduce the redundancy in the key updating broadcast packet. Performance analysis and simulation results show that the proposed OHC&RP-SGKD schemes are practical for resource-constrained wireless networks in bad environments, where a strong collusion attack resistance is required and many users could be revoked. PMID:25529204
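
    A minimal Python sketch of the one-way hash chain building block follows (the revocation polynomial and the full OHC&RP-SGKD construction are beyond a short fragment; names are illustrative). Chain values are disclosed from the end of the chain backwards, so a member can re-derive any earlier-disclosed key from a later one by re-hashing, while future keys stay unpredictable.

        import hashlib

        H = lambda x: hashlib.sha256(x).digest()

        def forward_chain(seed, n):
            # K_j = H^j(seed). The group manager discloses chain values in
            # reverse order of generation; re-hashing a freshly disclosed
            # value reproduces every previously disclosed one (self-healing),
            # but values not yet disclosed cannot be computed.
            chain = [seed]
            for _ in range(n):
                chain.append(H(chain[-1]))
            return chain

        chain = forward_chain(b"group-seed", 100)
        k_recent, k_missed = chain[10], chain[40]   # chain[40] disclosed first

        # Recover the missed (earlier-disclosed) key from the recent one.
        v = k_recent
        for _ in range(30):
            v = H(v)
        assert v == k_missed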

  15. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    DTIC Science & Technology

    2016-06-01

    within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees. We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need

  16. Investigation of Current State of Crytpography and Theoretical Implementation of a Cryptographic System for the Combat Service Support Control System.

    DTIC Science & Technology

    1987-05-01

    Advances in Cryptology: Proceedings of CRYPTO 84, ed. by G.R. Blakely and D. Chaum. [Wagn84b] Wagner, Neal R...in Distributed Computer Systems," IEEE Trans. on Computers, Vol. C-35, No. 7, Jul. 86, pp. 583-590. Gifford, David K., "Cryptographic Sealing for

  17. Generating region proposals for histopathological whole slide image retrieval.

    PubMed

    Ma, Yibing; Jiang, Zhiguo; Zhang, Haopeng; Xie, Fengying; Zheng, Yushan; Shi, Huaqiang; Zhao, Yu; Shi, Jun

    2018-06-01

    Content-based image retrieval is an effective method for histopathological image analysis. However, given a database of huge whole slide images (WSIs), acquiring appropriate regions of interest (ROIs) for training is significant and difficult. Moreover, histopathological images can only be annotated by pathologists, resulting in a lack of labeling information. Therefore, it is an important and challenging task to generate ROIs from WSIs and retrieve images with few labels. This paper presents a novel unsupervised region proposing method for histopathological WSIs based on Selective Search. Specifically, the WSI is over-segmented into regions which are hierarchically merged until the WSI becomes a single region. Nucleus-oriented similarity measures for region mergence and a Nucleus-Cytoplasm color space for histopathological images are specially defined to generate accurate region proposals. Additionally, we propose a new semi-supervised hashing method for image retrieval. The semantic features of images are extracted with Latent Dirichlet Allocation and transformed into binary hashing codes with Supervised Hashing. The methods are tested on a large-scale multi-class database of breast histopathological WSIs. The results demonstrate that for one WSI, our region proposing method can generate 7.3 thousand contoured regions which fit well with 95.8% of the ROIs annotated by pathologists. The proposed hashing method can retrieve a query image among 136 thousand images in 0.29 s and reach a precision of 91% with only 10% of images labeled. The unsupervised region proposing method can generate regions as predictions of lesions in histopathological WSIs. The region proposals can also serve as training samples to train machine-learning models for image retrieval. The proposed hashing method can achieve fast and precise image retrieval with a small amount of labels. Furthermore, the proposed methods can be potentially applied in online computer-aided-diagnosis systems. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Improving Sector Hash Carving with Rule-Based and Entropy-Based Non-Probative Block Filters

    DTIC Science & Technology

    2015-03-01

    0x20 exceeds the histogram rule’s threshold of 256 instances of a single 4-byte value. The 0x20 bytes are part of an Extensible Metadata Platform (XMP...block consists of data separated by NULL bytes of padding. The histogram rule is triggered for the block because the block contains more than 256 4...sdash can reduce the rate of false positive matches. After characteristic features have been selected, the features are hashed using SHA-1, which creates

  19. Design Considerations for a Computationally-Lightweight Authentication Mechanism for Passive RFID Tags

    DTIC Science & Technology

    2009-09-01

    suffer the power and complexity requirements of a public key system. 28 In [18], a simulation of the SHA-1 algorithm is performed on a Xilinx FPGA ... 256 bits. Thus, the construction of a hash table would need 2512 independent comparisons. It is known that hash collisions of the SHA-1 algorithm... SHA-1 algorithm for small-core FPGA design. Small-core FPGA design is the process by which a circuit is adapted to use the minimal amount of logic

  20. Ordered versus Unordered Map for Primitive Data Types

    DTIC Science & Technology

    2015-09-01

    mapped to some element. C++ provides two types of map containers within the standard template library, the std::map and the std::unordered_map classes. As the name implies, the containers' main functional difference is that the elements in the std::map are ordered by the key, and the std::unordered_map elements are not ordered based on their key. The std::unordered_map elements are placed into “buckets” based on a hash value computed for their key

  1. A 3D Split Manufacturing Approach to Trustworthy System Development

    DTIC Science & Technology

    2012-12-01

    addition of any cryptographic algorithm or implementation to be included in the system as a foundry-level option. Essentially, 3D security introduces...8192 bytes). We modeled our cryptographic process after the AES algorithm, which can occupy up to 4640 bytes with an enlarged T-Box implementation [4...Reconfigurable Systems and Algorithms (ERSA), Las Vegas, NV, July 2011. [10] Intelligence Advanced Research Projects Agency (IARPA). Trusted integrated

  2. Comparison of Spatiotemporal Mapping Techniques for Enormous Etl and Exploitation Patterns

    NASA Astrophysics Data System (ADS)

    Deiotte, R.; La Valley, R.

    2017-10-01

    The need to extract, transform, and exploit enormous volumes of spatiotemporal data has exploded with the rise of social media, advanced military sensors, wearables, automotive tracking, etc. However, current methods of spatiotemporal encoding and exploitation simultaneously limit the use of that information and increase computing complexity. Current spatiotemporal encoding methods from Niemeyer and Usher rely on a Z-order space-filling curve, a relative of Peano's 1890 space-filling curve, for spatial hashing, interleaving temporal hashes to generate a spatiotemporal encoding. However, there exist other space-filling curves that provide different manifold coverings, which could promote better hashing techniques for spatial data and have the potential to map spatiotemporal data without interleaving. The concatenation of Niemeyer's and Usher's techniques provides a highly efficient space-time index. However, other methods have advantages and disadvantages regarding computational cost, efficiency, and utility. This paper explores several methods using a range of data set sizes from 1K to 10M observations and provides a comparison of the methods.
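
    A minimal Python sketch of the Z-order (Morton) encoding behind such hashes follows; treating quantized time as a third interleaved dimension is an illustrative assumption here, whereas the Niemeyer/Usher-style encodings discussed above interleave a temporal hash with a two-dimensional spatial one.

        def interleave(values, bits):
            # Z-order / Morton encoding: take one bit from each dimension
            # in turn, most significant first, yielding a single key along
            # the space-filling curve; nearby points in all dimensions tend
            # to share long key prefixes.
            key = 0
            for b in range(bits - 1, -1, -1):
                for v in values:
                    key = (key << 1) | ((v >> b) & 1)
            return key

        # Quantized (lat, lon, time) -> one spatiotemporal key.
        key = interleave([0b1011, 0b0110, 0b1100], bits=4)
        print(f"{key:012b}")  # -> 101011110100, one bit per dimension per level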

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solis, John Hector

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement for computational security to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.

  4. Affine Equivalence and Constructions of Cryptographically Strong Boolean Functions

    DTIC Science & Technology

    2013-09-01

    manner is crucial for today’s global citizen. We want our financial transactions over the Internet to get processed without error. Cyber warfare between...encryption and decryption processes. An asymmetric cipher uses different keys to encrypt and decrypt a message, and the connection between the encryption and...Depending on how a symmetric cipher processes a message before encryption or decryption, a symmetric cipher can be further classified into a block or

  5. Secure Biometric E-Voting Scheme

    NASA Astrophysics Data System (ADS)

    Ahmed, Taha Kh.; Aborizka, Mohamed

    The implementation of e-voting becomes more substantial with the rapid growth of e-government. Recent advances in communications and cryptographic techniques facilitate the implementation of e-voting. Many countries have introduced e-voting systems; unfortunately, most of these systems are not fully functional. In this paper we present an e-voting scheme that covers most of the e-voting requirements; smart card and biometric recognition technology were implemented to guarantee voters' privacy and authentication.

  6. Practical Computer Security through Cryptography

    NASA Technical Reports Server (NTRS)

    McNab, David; Twetev, David (Technical Monitor)

    1998-01-01

    The core protocols upon which the Internet was built are insecure. Weak authentication and the lack of low level encryption services introduce vulnerabilities that propagate upwards in the network stack. Using statistics based on CERT/CC Internet security incident reports, the relative likelihood of attacks via these vulnerabilities is analyzed. The primary conclusion is that the standard UNIX BSD-based authentication system is by far the most commonly exploited weakness. Encryption of sensitive password data and the adoption of cryptographically-based authentication protocols can greatly reduce these vulnerabilities. Basic cryptographic terminology and techniques are presented, with attention focused on the ways in which technology such as encryption and digital signatures can be used to protect against the most commonly exploited vulnerabilities. A survey of contemporary security software demonstrates that tools based on cryptographic techniques, such as Kerberos, ssh, and PGP, are readily available and effectively close many of the most serious security holes. Nine practical recommendations for improving security are described.

  7. Reset Tree-Based Optical Fault Detection

    PubMed Central

    Lee, Dong-Geon; Choi, Dooho; Seo, Jungtaek; Kim, Howon

    2013-01-01

    In this paper, we present a new reset tree-based scheme to protect cryptographic hardware against optical fault injection attacks. As one of the most powerful invasive attacks on cryptographic hardware, optical fault attacks cause semiconductors to misbehave by injecting high-energy light into a decapped integrated circuit. The contaminated result from the affected chip is then used to reveal secret information, such as a key, from the cryptographic hardware. Since the advent of such attacks, various countermeasures have been proposed. Although most of these countermeasures are strong, there is still the possibility of attack. In this paper, we present a novel optical fault detection scheme that utilizes the buffers on a circuit's reset signal tree as a fault detection sensor. To evaluate our proposal, we model radiation-induced currents into circuit components and perform a SPICE simulation. The proposed scheme is expected to be used as a supplemental security tool. PMID:23698267

  8. Evaluation of Information Leakage from Cryptographic Hardware via Common-Mode Current

    NASA Astrophysics Data System (ADS)

    Hayashi, Yu-Ichi; Homma, Naofumi; Mizuki, Takaaki; Sugawara, Takeshi; Kayano, Yoshiki; Aoki, Takafumi; Minegishi, Shigeki; Satoh, Akashi; Sone, Hideaki; Inoue, Hiroshi

    This paper presents a possibility of Electromagnetic (EM) analysis against cryptographic modules outside their security boundaries. The mechanism behind the information leakage is explained from the viewpoint of Electromagnetic Compatibility: electric fluctuation released from cryptographic modules can conduct to peripheral circuits based on ground bounce, resulting in radiation. We demonstrate the consequence of the mechanism through experiments where the ISO/IEC standard block cipher AES (Advanced Encryption Standard) is implemented on an FPGA board and EM radiations from power and communication cables are measured. Correlation Electromagnetic Analysis (CEMA) is conducted in order to evaluate the information leakage. The experimental results show that secret keys are revealed even though there are various disturbing factors such as voltage regulators and AC/DC converters between the target module and the measurement points. We also discuss information-suppression techniques as electrical-level countermeasures against such CEMAs.

  9. Community Detection in Sparse Random Networks

    DTIC Science & Technology

    2013-08-13

    if (i, j) ∈ E, meaning there is an edge between nodes i, j ∈ V. Note that W is symmetric, and we assume that Wii = 0 for all i. Under the null... Wii = 0.) Our arguments are parallel to those we used under P0, the only difficulty being that Wi is not binomial anymore. Indeed, WSi ∼ Bin(n − 1, p1...Berlin: Springer. Alon, N. and S. Gutner (2010). Balanced families of perfect hash functions and their applications. ACM Trans. Algorithms 6 (3), Art

  10. Security analysis of boolean algebra based on Zhang-Wang digital signature scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jinbin, E-mail: jbzheng518@163.com

    2014-10-06

    In 2005, Zhang and Wang proposed an improved signature scheme without using a one-way hash function and message redundancy. In this paper, we show that this scheme exhibits potential safety concerns through the analysis of boolean algebra, such as bitwise exclusive-or, and point out, by analyzing the result of an assembly program segment, that the mapping between assembly instructions and machine code is actually not one to one, which may cause safety problems unknown to the software.

  11. An efficient (t,n) threshold quantum secret sharing without entanglement

    NASA Astrophysics Data System (ADS)

    Qin, Huawang; Dai, Yuewei

    2016-04-01

    An efficient (t,n) threshold quantum secret sharing (QSS) scheme is proposed. In our scheme, a hash function is used to check for eavesdropping, and no particles need to be published, so the utilization efficiency of the particles is a full 100%. No entanglement is used in our scheme. The dealer uses single particles to encode the secret information, and the participants obtain the secret by measuring the single particles. Compared to the existing schemes, our scheme is simpler and more efficient.

  12. Deep Hashing for Scalable Image Search.

    PubMed

    Lu, Jiwen; Liong, Venice Erin; Zhou, Jie

    2017-05-01

    In this paper, we propose a new deep hashing (DH) approach to learn compact binary codes for scalable image search. Unlike most existing binary codes learning methods, which usually seek a single linear projection to map each sample into a binary feature vector, we develop a deep neural network to seek multiple hierarchical non-linear transformations to learn these binary codes, so that the non-linear relationship of samples can be well exploited. Our model is learned under three constraints at the top layer of the developed deep network: 1) the loss between the compact real-valued code and the learned binary vector is minimized, 2) the binary codes distribute evenly on each bit, and 3) different bits are as independent as possible. To further improve the discriminative power of the learned binary codes, we extend DH into supervised DH (SDH) and multi-label SDH by including a discriminative term into the objective function of DH, which simultaneously maximizes the inter-class variations and minimizes the intra-class variations of the learned binary codes with the single-label and multi-label settings, respectively. Extensive experimental results on eight widely used image search data sets show that our proposed methods achieve very competitive results with the state of the art.
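
    A minimal NumPy sketch of the code-generation step follows; the random, untrained weights stand in for the learned network, and the three constraints from the abstract are noted in comments but not optimized here.

        import numpy as np

        rng = np.random.default_rng(0)
        W1, W2 = rng.normal(size=(128, 64)), rng.normal(size=(64, 32))

        def binary_codes(X):
            # Two non-linear layers followed by sign thresholding yield a
            # 32-bit code per sample. Training (omitted) would tune W1, W2
            # so that (1) quantization loss is small, (2) bits are balanced,
            # and (3) bits are mutually independent, as in the DH objective.
            h = np.tanh(X @ W1)
            return (np.tanh(h @ W2) >= 0).astype(np.uint8)

        X = rng.normal(size=(5, 128))            # five 128-d feature vectors
        codes = binary_codes(X)                  # shape (5, 32), entries 0/1
        hamming = (codes[0] != codes[1]).sum()   # search metric between codes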

  13. Centralized Cryptographic Key Management and Critical Risk Assessment - CRADA Final Report For CRADA Number NFE-11-03562

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, R. K.; Peters, Scott

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) Cyber Security for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359) entitled "Innovation for Increasing Cyber Security for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC as a result of that award entered into a CRADA (NFE-11-03562) between ORNL and Sypris Electronics, LLC. ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cyber security Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat to cause a component failure and the probability of a component failure to cause a security requirement violation. We applied this model to estimate the security of the AMI, by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 63351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system operation. To further address probabilities of threats, information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain from NESCOR WG1. From these five selected scenarios, we characterized them into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrated how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.

  14. Cryptographic Key Management and Critical Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359) entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC as a result of that award entered into a CRADA (NFE-11-03562) between ORNL and Sypris Electronics, LLC. ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cyber security Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat to cause a component failure and the probability of a component failure to cause a security requirement violation. We applied this model to estimate the security of the AMI, by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 63351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system operation. To further address probabilities of threats, information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain from NESCOR WG1. From these five selected scenarios, we characterized them into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrated how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.

  15. Cryptographic synchronization recovery by measuring randomness of decrypted data

    DOEpatents

    Maestas, Joseph H.; Pierson, Lyndon G.

    1990-01-01

    The invention relates to the synchronization of encrypted data communication systems: a method that looks for any lack of pattern or intelligent information in the received data and triggers a resynchronization signal based thereon. If the encrypter/decrypter pairs are out of cryptographic synchronization, the received (decrypted) data resembles pseudorandom data. A method and system are provided for detecting such pseudorandom binary data by, for example, measuring ones density. If the data is sufficiently random, the system is resynchronized.
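
    A minimal sketch of the ones-density idea in Python: in-sync decrypted output tends to be structured (for ASCII text, ones density sits well below one half), while out-of-sync output looks pseudorandom. The window size and thresholds below are illustrative assumptions, not values from the patent.

        import os

        def ones_density(data: bytes) -> float:
            # Fraction of 1 bits in the window.
            bits = sum(bin(byte).count("1") for byte in data)
            return bits / (8 * len(data))

        def needs_resync(window: bytes, lo: float = 0.45, hi: float = 0.55) -> bool:
            # Pseudorandom data sits near 0.5; structured plaintext usually does not.
            return lo <= ones_density(window) <= hi

        print(needs_resync(b"hello, plain ASCII text"))  # False: structured plaintext
        print(needs_resync(os.urandom(1024)))            # True (almost surely): random-looking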

  16. Using global unique identifiers to link autism collections.

    PubMed

    Johnson, Stephen B; Whitney, Glen; McAuliffe, Matthew; Wang, Hailong; McCreedy, Evan; Rozenblit, Leon; Evans, Clark C

    2010-01-01

    To propose a centralized method for generating global unique identifiers to link collections of research data and specimens. The work is a collaboration between the Simons Foundation Autism Research Initiative and the National Database for Autism Research. The system is implemented as a web service: an investigator inputs identifying information about a participant into a client application, which sends encrypted information to a server application, which returns a generated global unique identifier. The authors evaluated the system using a volume test of one million simulated individuals and a field test on 2000 families (over 8000 individual participants) in an autism study. The evaluation measures were: the inverse probability of hash codes; the rate of false identity of two individuals; the rate of false splits of a single individual; the percentage of subjects for whom identifying information could be collected; and the percentage of hash codes generated successfully. The large-volume simulation generated no false splits or false identities. Field testing in the Simons Foundation Autism Research Initiative Simplex Collection produced identifiers for 96% of children in the study and 77% of parents. On average, four out of five hash codes per subject were generated perfectly (only one perfect hash is required for subsequent matching). The system must achieve balance among the competing goals of distinguishing individuals, collecting accurate information for matching, and protecting confidentiality. Considerable effort is required to obtain approval from institutional review boards, to obtain consent from participants, and to achieve compliance from sites during a multicenter study. Global unique identifiers have the potential to link collections of research data, augment the amount and types of data available for individuals, support detection of overlap between collections, and facilitate replication of research findings.
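
    The flow the abstract describes (hash codes computed from identifying information on the client, matched to a GUID on the server) can be sketched as follows. The field subsets, normalization, and matching rule are assumptions for illustration, and the encryption of the client-server exchange is omitted; this is not the NDAR production scheme.

        import hashlib
        import uuid

        # Hypothetical sketch: several one-way codes from overlapping subsets of
        # identifiers, so a single data-entry error does not prevent matching.
        def hash_codes(first: str, last: str, dob: str, sex: str) -> list:
            subsets = [(first, last, dob), (first, dob, sex), (last, dob, sex)]
            norm = lambda parts: "|".join(p.strip().lower() for p in parts).encode()
            return [hashlib.sha256(norm(s)).hexdigest() for s in subsets]

        registry = {}  # server side: hash code -> GUID

        def get_guid(codes: list) -> str:
            for c in codes:              # one matching code suffices, as in the abstract
                if c in registry:
                    return registry[c]
            guid = str(uuid.uuid4())     # first sighting: mint a new GUID
            for c in codes:
                registry[c] = guid
            return guid

        print(get_guid(hash_codes("Ada", "Lovelace", "1815-12-10", "F")))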

  17. Twitter K-H networks in action: Advancing biomedical literature for drug search.

    PubMed

    Hamed, Ahmed Abdeen; Wu, Xindong; Erickson, Robert; Fandy, Tamer

    2015-08-01

    The importance of searching biomedical literature for drug interactions and side-effects is apparent. Current digital libraries (e.g., PubMed) suffer from infrequent tagging and metadata annotation updates. Such limitations leave the literature unlinked to new scientific evidence and pose a great many challenges for scientists searching biomedical repositories. In this paper, we present a network mining approach that provides a bridge for linking and searching drug-related literature. Our contributions here are twofold: (1) an efficient algorithm called HashPairMiner that addresses the run-time complexity issues demonstrated in its predecessor algorithm, HashnetMiner, and (2) a database of discoveries hosted on the web to facilitate literature search using the results produced by HashPairMiner. Though the K-H network model and the HashPairMiner algorithm are fairly young, their outcome is evidence of the considerable promise they offer to the biomedical science community in general and the drug research community in particular. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Controllable band structure and topological phase transition in two-dimensional hydrogenated arsenene

    PubMed Central

    Wang, Ya-ping; Ji, Wei-xiao; Zhang, Chang-wen; Li, Ping; Li, Feng; Ren, Miao-juan; Chen, Xin-Lian; Yuan, Min; Wang, Pei-ji

    2016-01-01

    Discovery of two-dimensional (2D) topological insulators such as group-V films initiates challenges in exploring exotic quantum states in low dimensions. Here, we perform first-principles calculations to study the geometric and electronic properties of a 2D arsenene monolayer with hydrogenation (HAsH). We predict a new σ-type Dirac cone related to the px,y orbitals of As atoms in HAsH, dependent on in-plane tensile strain. Noticeably, the spin-orbit coupling (SOC) opens a quantum spin Hall (QSH) gap of 193 meV at the Dirac cone. A single pair of topologically protected helical edge states is established for the edges, and its QSH phase is confirmed with topological invariant Z2 = 1. We also propose a 2D quantum well (QW) encapsulating HAsH with the h-BN sheet on each side, which harbors a nontrivial QSH state with the Dirac cone lying within the band gap of the cladding BN substrate. These findings provide a promising innovative platform for QSH device design and fabrication operating at room temperature. PMID:26839209

  19. Supervised graph hashing for histopathology image retrieval and classification.

    PubMed

    Shi, Xiaoshuang; Xing, Fuyong; Xu, KaiDi; Xie, Yuanpu; Su, Hai; Yang, Lin

    2017-12-01

    In pathology image analysis, the morphological characteristics of cells are critical for grading many diseases. With the development of cell detection and segmentation techniques, it is possible to extract cell-level information for further analysis of pathology images. However, it is challenging to conduct efficient analysis of cell-level information on a large-scale image dataset because each image usually contains hundreds or thousands of cells. In this paper, we propose a novel image-retrieval-based framework for large-scale pathology image analysis. For each image, we encode each cell into binary codes to generate an image representation using a novel graph-based hashing model, and then conduct image retrieval by applying a group-to-group matching method for similarity measurement. In order to improve both computational efficiency and memory requirements, we further introduce matrix factorization into the hashing model for scalable image retrieval. The proposed framework is extensively validated with thousands of lung cancer images, and it achieves 97.98% classification accuracy and 97.50% retrieval precision with all cells of each query image used. Copyright © 2017 Elsevier B.V. All rights reserved.
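
    The retrieval step rests on comparing compact binary codes. As a rough sketch of that final stage only (the graph-based hashing model itself is replaced by random codes, and group-to-group matching is collapsed to one code per image), ranking by Hamming distance looks like this in Python:

        import numpy as np

        rng = np.random.default_rng(0)
        # Stand-in binary codes: 1000 database images, 64 bits each.
        db_codes = rng.integers(0, 2, size=(1000, 64), dtype=np.uint8)
        query = rng.integers(0, 2, size=64, dtype=np.uint8)

        # Hamming distance = number of differing bits; smaller means more similar.
        hamming = np.count_nonzero(db_codes != query, axis=1)
        top10 = np.argsort(hamming)[:10]   # indices of the ten nearest images
        print(top10, hamming[top10])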

  20. Modeling and Simulation of the Economics of Mining in the Bitcoin Market.

    PubMed

    Cocco, Luisanna; Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA, and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real-time price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon, and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network.
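
    The "hash calculations to mine Bitcoin" referred to above are a proof-of-work search: find a nonce whose double SHA-256 digest falls below a difficulty target. A toy sketch in Python follows; the "header" is a plain string rather than Bitcoin's real 80-byte block header, and the difficulty is kept tiny so the loop finishes quickly.

        import hashlib

        def mine(header: str, difficulty_bits: int = 16) -> int:
            # Accept the first nonce whose double SHA-256 digest is below the target.
            target = 2 ** (256 - difficulty_bits)
            nonce = 0
            while True:
                payload = f"{header}{nonce}".encode()
                digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
                if int.from_bytes(digest, "big") < target:
                    return nonce
                nonce += 1

        print(mine("block 1: alice pays bob"))  # on average ~2**16 hash evaluations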

  1. Controllable band structure and topological phase transition in two-dimensional hydrogenated arsenene

    NASA Astrophysics Data System (ADS)

    Wang, Ya-Ping; Ji, Wei-Xiao; Zhang, Chang-Wen; Li, Ping; Li, Feng; Ren, Miao-Juan; Chen, Xin-Lian; Yuan, Min; Wang, Pei-Ji

    2016-02-01

    Discovery of two-dimensional (2D) topological insulators such as group-V films initiates challenges in exploring exotic quantum states in low dimensions. Here, we perform first-principles calculations to study the geometric and electronic properties of a 2D arsenene monolayer with hydrogenation (HAsH). We predict a new σ-type Dirac cone related to the px,y orbitals of As atoms in HAsH, dependent on in-plane tensile strain. Noticeably, the spin-orbit coupling (SOC) opens a quantum spin Hall (QSH) gap of 193 meV at the Dirac cone. A single pair of topologically protected helical edge states is established for the edges, and its QSH phase is confirmed with topological invariant Z2 = 1. We also propose a 2D quantum well (QW) encapsulating HAsH with the h-BN sheet on each side, which harbors a nontrivial QSH state with the Dirac cone lying within the band gap of the cladding BN substrate. These findings provide a promising innovative platform for QSH device design and fabrication operating at room temperature.

  2. Random sequences generation through optical measurements by phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    François, M.; Grosges, T.; Barchiesi, D.; Erra, R.; Cornet, A.

    2012-04-01

    The development of new techniques for producing random sequences with a high level of security is a challenging topic of research in modern cryptography. The proposed method is based on the measurement, by phase-shifting interferometry, of the speckle signals arising from the interaction between light and structures. We show how the combination of amplitude and phase distributions (maps) under a numerical process can produce random sequences. The produced sequences satisfy all the statistical requirements of randomness and can be used in cryptographic schemes.

  3. Using Temporal Logic to Specify and Verify Cryptographic Protocols (Progress Report)

    DTIC Science & Technology

    1995-01-01

    We have started work on a project to apply temporal logic to reason about cryptographic protocols (supported by grant HKUST 608/94E from the Hong Kong Research Grants Council). Some of the goals of the project are as follows: 1. Allow the user to state and prove that the penetrator cannot use logical or algebraic techniques (e.g., we are disregarding...

  4. Hybrid ququart-encoded quantum cryptography protected by Kochen-Specker contextuality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabello, Adan; Department of Physics, Stockholm University, S-10691 Stockholm; D'Ambrosio, Vincenzo

    2011-09-15

    Quantum cryptographic protocols based on complementarity are not secure against attacks in which complementarity is imitated with classical resources. The Kochen-Specker (KS) theorem provides protection against these attacks, without requiring entanglement or spatially separated composite systems. We analyze the maximum tolerated noise to guarantee the security of a KS-protected cryptographic scheme against these attacks and describe a photonic realization of this scheme using hybrid ququarts defined by the polarization and orbital angular momentum of single photons.

  5. Creating an Assured Joint DOD and Interagency Interoperable Net-Centric Enterprise. Report of the Defense Science Board Task Force on Achieving Interoperability in a Net-Centric Environment

    DTIC Science & Technology

    2009-03-01

    ... policy, elliptic curve public key cryptography using the 256-bit prime modulus elliptic curve as specified in FIPS-186-2 and SHA-256 are appropriate for... publications/fips/fips186-2/fips186-2-change1.pdf... Hashing via the Secure Hash Algorithm (using SHA-256 and... lithography and processing techniques. Field programmable gate arrays (FPGAs) are a chip design of interest. These devices are extensively used in...

  6. Biased decoy-state measurement-device-independent quantum cryptographic conferencing with finite resources.

    PubMed

    Chen, RuiKe; Bao, WanSu; Zhou, Chun; Li, Hongwei; Wang, Yang; Bao, HaiZe

    2016-03-21

    In recent years, a large quantity of work has been done to narrow the gap between theory and practice in quantum key distribution (QKD). However, most of it is focused on two-party protocols. Very recently, Yao Fu et al. proposed a measurement-device-independent quantum cryptographic conferencing (MDI-QCC) protocol and proved its security in the limit of infinitely long keys. As a step towards the practical application of MDI-QCC, we design a biased decoy-state measurement-device-independent quantum cryptographic conferencing protocol and analyze its performance in both the finite-key and infinite-key regimes. From numerical simulations, we show that our decoy-state analysis is tighter than that of Yao Fu et al.: we achieve a nonzero asymptotic secret key rate at long distances of approximately 200 km, and we demonstrate that with a finite size of data (say 10^11 to 10^13 signals) it is possible to perform secure MDI-QCC over reasonable distances.

  7. Shadow: Running Tor in a Box for Accurate and Efficient Experimentation

    DTIC Science & Technology

    2011-09-23

    Modeling the speed of a target CPU is done by running an OpenSSL [31] speed test on a real CPU of that type. This provides us with the raw CPU processing... rate, but we are also interested in the processing speed of an application. By running application benchmarks on the same CPU as the OpenSSL speed test... simulation, saving CPU cycles on our simulation host machine. Shadow removes cryptographic processing by preloading the main OpenSSL [31] functions used...

  8. Protecting Cryptographic Keys and Functions from Malware Attacks

    DTIC Science & Technology

    2010-12-01

    ... registers. ... modifies RSA private key signing in OpenSSL to use the technique. The resulting system has the following features: 1. No special hardware is... the above method based on OpenSSL, by exploiting the Streaming SIMD Extension (SSE) XMM registers of modern Intel and AMD x86-compatible CPUs [22]... one can store a 2048-bit exponent. Our prototype is based on OpenSSL 0.9.8e, the Ubuntu 6.06 Linux distribution with a 2.6.15 kernel, and SSE2, which...

  9. Embedded control system for computerized franking machine

    NASA Astrophysics Data System (ADS)

    Shi, W. M.; Zhang, L. B.; Xu, F.; Zhan, H. W.

    2007-12-01

    This paper presents a novel control system for a franking machine. A methodology for operating the franking machine through functional controls consisting of connection, configuration, and the franking electromechanical drive is studied. A set of enabling technologies for synthesizing postage-management software architectures driving microprocessor-based embedded systems is proposed. The cryptographic algorithm that calculates mail items is analyzed to enhance the accountability and security of the postal indicia. The study indicated that the franking machine achieves reliability, performance, and flexibility in printing mail items.

  10. A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications

    DTIC Science & Technology

    2001-05-15

    ... is based on a calculated test statistic value, which is a function of the data. If the test statistic value is S and the critical value is t, then... Defined in The Handbook of Applied Cryptography; A. Menezes, P. van Oorschot and S. Vanstone; CRC Press, 1997. The first 4... 3rd ed. Reading: Addison-Wesley, Inc., pp. 61-80. [4] A. J. Menezes, P. C. van Oorschot, and S. A. Vanstone (1997), Handbook of Applied Cryptography
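
    The excerpt's pass/fail rule (a test statistic computed from the data, compared against a critical value or significance level) is easiest to see in the suite's simplest test, the frequency (monobit) test. A sketch in Python; the sample bit string is arbitrary.

        import math

        def monobit_pvalue(bits: str) -> float:
            # NIST SP 800-22 frequency test: sum the bits as +1/-1, normalize,
            # and convert to a P-value with the complementary error function.
            n = len(bits)
            s = sum(1 if b == "1" else -1 for b in bits)
            s_obs = abs(s) / math.sqrt(n)
            return math.erfc(s_obs / math.sqrt(2))

        seq = "1100100100001111110110101010001000100001011010001100001000110100"
        p = monobit_pvalue(seq)
        print(p, "looks random" if p >= 0.01 else "non-random")  # alpha = 0.01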

  11. Threshold quantum cryptography

    NASA Astrophysics Data System (ADS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding.

  12. Cryptographic Properties of Monotone Boolean Functions

    DTIC Science & Technology

    2016-01-01

    ... Algebraic attacks on stream ciphers with linear feedback, in: Advances in Cryptology (Eurocrypt 2003), Lecture Notes in Comput. Sci. 2656, Springer, Berlin... spectrum, algebraic immunity. MSC 2010: 06E30, 94C10, 94A60, 11T71, 05E99. Communicated by: Carlo Blundo. 1. Introduction. Let F_2 be the prime field of... For the reader's convenience, we recall some basic notions below. Any f ∈ B_n can be expressed in algebraic normal form (ANF) as f(x_1, x_2...

  13. Physically Unclonable Cryptographic Primitives by Chemical Vapor Deposition of Layered MoS2.

    PubMed

    Alharbi, Abdullah; Armstrong, Darren; Alharbi, Somayah; Shahrjerdi, Davood

    2017-12-26

    Physically unclonable cryptographic primitives are promising for securing the rapidly growing number of electronic devices. Here, we introduce physically unclonable primitives from layered molybdenum disulfide (MoS2) by leveraging the natural randomness of their island growth during chemical vapor deposition (CVD). We synthesize a MoS2 monolayer film covered with speckles of multilayer islands, where the growth process is engineered for an optimal speckle density. Using the Clark-Evans test, we confirm that the distribution of islands on the film exhibits complete spatial randomness, hence indicating that the growth of multilayer speckles is a spatial Poisson process. Such a property is highly desirable for constructing unpredictable cryptographic primitives. The security primitive is an array of 2048 pixels fabricated from this film. The complex structure of the pixels makes the physical duplication of the array impossible (i.e., physically unclonable). A unique optical response is generated by applying an optical stimulus to the structure. The basis for this unique response is the dependence of the photoemission on the number of MoS2 layers, which by design is random throughout the film. Using a threshold value for the photoemission, we convert the optical response into binary cryptographic keys. We show that the proper selection of this threshold is crucial for maximizing combination randomness and that the optimal value of the threshold is linked directly to the growth process. This study reveals an opportunity for generating robust and versatile security primitives from layered transition metal dichalcogenides.
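
    The key-extraction step (thresholding each pixel's photoemission into a bit) can be sketched as follows. The intensities below are simulated random numbers standing in for the layer-dependent photoluminescence, and the median threshold is one plausible choice; the paper's point is that this threshold must be tuned for maximal randomness.

        import numpy as np

        rng = np.random.default_rng(7)
        # Simulated photoemission of a 2048-pixel array (placeholder for the
        # layer-dependent photoluminescence of the MoS2 film).
        photoemission = rng.gamma(shape=2.0, scale=1.0, size=2048)

        threshold = np.median(photoemission)          # balances zeros and ones
        key_bits = (photoemission > threshold).astype(np.uint8)

        print(key_bits[:32], key_bits.mean())  # first 32 key bits, fraction of ones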

  14. Multicollision attack on CBC-MAC, EMAC, and XCBC-MAC of AES-128 algorithm

    NASA Astrophysics Data System (ADS)

    Brolin Sihite, Alfonso; Hayat Susanti, Bety

    2017-10-01

    A Message Authentication Code (MAC) can be constructed based on a block cipher algorithm. The CBC-MAC, EMAC, and XCBC-MAC constructions are some of the MAC schemes used as keyed hash functions. In this paper, we carry out a multicollision attack on the CBC-MAC, EMAC, and XCBC-MAC constructions, using the AES-128 block cipher algorithm as the basic construction. The multicollision attack utilizes the concept of existential forgery on CBC-MAC. The results show that multicollisions can be obtained easily in the CBC-MAC, EMAC, and XCBC-MAC constructions.
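
    The existential-forgery idea the attack builds on can be shown concretely: with raw CBC-MAC (zero IV) over AES-128, the one-block message x and the two-block message m1 || (x XOR MAC(m1)) always collide. A sketch in Python (requires the pycryptodome package; the key and messages are arbitrary demo values):

        from Crypto.Cipher import AES  # pycryptodome

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(p ^ q for p, q in zip(a, b))

        def cbc_mac(key: bytes, msg: bytes) -> bytes:
            # Raw CBC-MAC with a zero IV over 16-byte blocks (insecure for
            # variable-length messages, which is exactly the point here).
            cipher = AES.new(key, AES.MODE_ECB)
            state = bytes(16)
            for i in range(0, len(msg), 16):
                state = cipher.encrypt(xor(state, msg[i:i + 16]))
            return state

        key = bytes(16)            # demo key only
        m1, x = b"A" * 16, b"B" * 16
        t1 = cbc_mac(key, m1)
        forged = m1 + xor(x, t1)   # two-block message that collides with x

        assert cbc_mac(key, forged) == cbc_mac(key, x)
        print("collision:", cbc_mac(key, x).hex())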

  15. HIA: a genome mapper using hybrid index-based sequence alignment.

    PubMed

    Choi, Jongpill; Park, Kiejung; Cho, Seong Beom; Chung, Myungguen

    2015-01-01

    A number of alignment tools have been developed to align sequencing reads to the human reference genome. However, the scale of information from next-generation sequencing (NGS) experiments is increasing rapidly. Recent studies based on NGS technology have routinely produced exome or whole-genome sequences from several hundreds or thousands of samples. To accommodate the increasing need to analyze very large NGS data sets, it is necessary to develop faster, more sensitive, and more accurate mapping tools. HIA uses two indices, a hash table index and a suffix array index. The hash table performs direct lookup of a q-gram, and the suffix array performs very fast lookup of variable-length strings by exploiting binary search. We observed that combining the hash table and the suffix array (a hybrid index) is much faster than the suffix array method alone for finding a substring in the reference sequence. Here, we define a matching region (MR) as a longest common substring between a reference and a read, and candidate alignment regions (CARs) as lists of MRs that are close to each other. The hybrid index is used to find CARs between a reference and a read. We found that aligning only the unmatched regions in a CAR is much faster than aligning the whole CAR. In benchmark analysis, HIA outperformed the other aligners in mapping speed, without significant loss of mapping accuracy. Our experiments show that the hybrid of hash table and suffix array is useful, in terms of speed, for mapping NGS sequencing reads to the human reference genome sequence. In conclusion, our tool is appropriate for aligning massive data sets generated by NGS sequencing.
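
    The two indices complement each other as follows: the hash table answers fixed-length q-gram lookups in one step, while the suffix array answers variable-length lookups by binary search. A deliberately naive sketch (toy reference, quadratic suffix-array construction; the key= form of bisect_left needs Python 3.10+), not HIA's actual implementation:

        from bisect import bisect_left

        ref = "ACGTACGGACGTT"
        q = 3

        # Hash table: q-gram -> positions, O(1) direct lookup.
        qgram_index = {}
        for i in range(len(ref) - q + 1):
            qgram_index.setdefault(ref[i:i + q], []).append(i)

        # Suffix array: sorted suffix start positions, binary-searchable.
        suffix_array = sorted(range(len(ref)), key=lambda i: ref[i:])

        def sa_find(pattern: str) -> int:
            # Binary search for any occurrence of a variable-length pattern.
            lo = bisect_left(suffix_array, pattern,
                             key=lambda i: ref[i:i + len(pattern)])
            if lo < len(suffix_array) and ref[suffix_array[lo]:].startswith(pattern):
                return suffix_array[lo]
            return -1

        print(qgram_index.get("ACG"))  # [0, 4, 8]: hash-table lookup
        print(sa_find("GACGT"))        # 7: suffix-array lookup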

  16. Design and implementation of a privacy preserving electronic health record linkage tool in Chicago

    PubMed Central

    Kho, Abel N; Cashy, John P; Jackson, Kathryn L; Pah, Adam R; Goel, Satyender; Boehnke, Jörn; Humphries, John Eric; Kominers, Scott Duke; Hota, Bala N; Sims, Shannon A; Malin, Bradley A; French, Dustin D; Walunas, Theresa L; Meltzer, David O; Kaleba, Erin O; Jones, Roderick C; Galanter, William L

    2015-01-01

    Objective To design and implement a tool that creates a secure, privacy preserving linkage of electronic health record (EHR) data across multiple sites in a large metropolitan area in the United States (Chicago, IL), for use in clinical research. Methods The authors developed and distributed a software application that performs standardized data cleaning, preprocessing, and hashing of patient identifiers to remove all protected health information. The application creates seeded hash code combinations of patient identifiers using a Health Insurance Portability and Accountability Act compliant SHA-512 algorithm that minimizes re-identification risk. The authors subsequently linked individual records using a central honest broker with an algorithm that assigns weights to hash combinations in order to generate high specificity matches. Results The software application successfully linked and de-duplicated 7 million records across 6 institutions, resulting in a cohort of 5 million unique records. Using a manually reconciled set of 11 292 patients as a gold standard, the software achieved a sensitivity of 96% and a specificity of 100%, with a majority of the missed matches accounted for by patients with both a missing social security number and last name change. Using 3 disease examples, it is demonstrated that the software can reduce duplication of patient records across sites by as much as 28%. Conclusions Software that standardizes the assignment of a unique seeded hash identifier merged through an agreed upon third-party honest broker can enable large-scale secure linkage of EHR data for epidemiologic and public health research. The software algorithm can improve future epidemiologic research by providing more comprehensive data given that patients may make use of multiple healthcare systems. PMID:26104741

  17. Design and implementation of a privacy preserving electronic health record linkage tool in Chicago.

    PubMed

    Kho, Abel N; Cashy, John P; Jackson, Kathryn L; Pah, Adam R; Goel, Satyender; Boehnke, Jörn; Humphries, John Eric; Kominers, Scott Duke; Hota, Bala N; Sims, Shannon A; Malin, Bradley A; French, Dustin D; Walunas, Theresa L; Meltzer, David O; Kaleba, Erin O; Jones, Roderick C; Galanter, William L

    2015-09-01

    To design and implement a tool that creates a secure, privacy preserving linkage of electronic health record (EHR) data across multiple sites in a large metropolitan area in the United States (Chicago, IL), for use in clinical research. The authors developed and distributed a software application that performs standardized data cleaning, preprocessing, and hashing of patient identifiers to remove all protected health information. The application creates seeded hash code combinations of patient identifiers using a Health Insurance Portability and Accountability Act compliant SHA-512 algorithm that minimizes re-identification risk. The authors subsequently linked individual records using a central honest broker with an algorithm that assigns weights to hash combinations in order to generate high specificity matches. The software application successfully linked and de-duplicated 7 million records across 6 institutions, resulting in a cohort of 5 million unique records. Using a manually reconciled set of 11 292 patients as a gold standard, the software achieved a sensitivity of 96% and a specificity of 100%, with a majority of the missed matches accounted for by patients with both a missing social security number and last name change. Using 3 disease examples, it is demonstrated that the software can reduce duplication of patient records across sites by as much as 28%. Software that standardizes the assignment of a unique seeded hash identifier merged through an agreed upon third-party honest broker can enable large-scale secure linkage of EHR data for epidemiologic and public health research. The software algorithm can improve future epidemiologic research by providing more comprehensive data given that patients may make use of multiple healthcare systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
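
    The hashing step both records describe can be sketched as below: identifiers are cleaned, combined in several overlapping subsets with a shared seed, and hashed with SHA-512, so that records match when any combination agrees. The seed value, the field subsets, and the omission of the honest broker's match weighting are all simplifying assumptions.

        import hashlib

        SEED = "consortium-shared-secret"  # hypothetical; agreed upon out of band

        def clean(s: str) -> str:
            # Standardized cleaning: lowercase, alphanumerics only.
            return "".join(ch for ch in s.lower() if ch.isalnum())

        def hash_combinations(first: str, last: str, dob: str, ssn: str) -> set:
            combos = [(first, last, dob), (last, dob, ssn), (first, dob, ssn)]
            return {hashlib.sha512((SEED + "|" + "|".join(map(clean, c))).encode()).hexdigest()
                    for c in combos}

        a = hash_combinations("Jane", "Doe", "1970-01-01", "123-45-6789")
        b = hash_combinations("JANE", "DOE ", "1970-01-01", "123456789")
        print(len(a & b))  # 3: every combination matches despite formatting noise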

  18. A hash based mutual RFID tag authentication protocol in telecare medicine information system.

    PubMed

    Srivastava, Keerti; Awasthi, Amit K; Kaul, Sonam D; Mittal, R C

    2015-01-01

    Radio Frequency Identification (RFID) is a technology with multidimensional applications that reduce the complexity of modern life. RFID is in widespread use in areas such as access control, transportation, real-time inventory, asset management, and automated payment systems. Recently, this technology has been spreading into healthcare environments, where potential applications include patient monitoring, object traceability, and drug administration systems. In this paper, we propose a secure RFID-based protocol for the medical sector. The protocol is based on a hash operation with a synchronized secret. The protocol is safe against active and passive attacks such as forgery, traceability, replay, and de-synchronization attacks.
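
    A minimal sketch of hash-based mutual authentication with a synchronized secret, in Python. The message layout, the domain-separation byte, and the update rule are illustrative simplifications, not the paper's exact protocol (which also keeps the previous secret so it can recover from de-synchronization):

        import hashlib
        import os

        def h(*parts: bytes) -> bytes:
            return hashlib.sha256(b"".join(parts)).digest()

        secret = os.urandom(16)               # synchronized secret
        tag_secret = server_secret = secret

        challenge = os.urandom(16)            # server -> tag
        tag_response = h(tag_secret, challenge)                      # tag -> server
        assert tag_response == h(server_secret, challenge)           # server authenticates tag

        server_response = h(server_secret, challenge, b"srv")        # server -> tag
        assert server_response == h(tag_secret, challenge, b"srv")   # tag authenticates server

        # Both sides roll the secret forward to hinder traceability.
        tag_secret = server_secret = h(secret, b"update")
        print("mutual authentication ok")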

  19. Graph Coarsening for Path Finding in Cybersecurity Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh

    2013-01-01

    In the pass-the-hash attack, hackers repeatedly steal password hashes and move through a computer network with the goal of reaching a computer with high-level administrative privileges. In this paper we apply graph coarsening to network graphs for the purpose of detecting hackers using this attack or assessing the risk level of the network's current state. We repeatedly take graph minors, which preserve the existence of paths in the graph, and take powers of the adjacency matrix to count the paths. This allows us to detect the existence of paths as well as find paths that have a high risk of being used by adversaries.
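
    The path-counting step is ordinary linear algebra: entry (i, j) of the k-th power of the adjacency matrix counts the walks of length k from node i to node j. A small sketch on an invented four-node network (0 = workstation, 3 = domain-admin host); the coarsening by graph minors is omitted:

        import numpy as np

        A = np.array([[0, 1, 1, 0],   # 0 -> 1, 0 -> 2
                      [0, 0, 1, 1],   # 1 -> 2, 1 -> 3
                      [0, 0, 0, 1],   # 2 -> 3
                      [0, 0, 0, 0]])

        for k in (1, 2, 3):
            walks = np.linalg.matrix_power(A, k)
            # A nonzero (0, 3) entry flags a potential pass-the-hash route.
            print(f"walks of length {k} from node 0 to node 3: {walks[0, 3]}")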

  20. Deeply learnt hashing forests for content based image retrieval in prostate MR images

    NASA Astrophysics Data System (ADS)

    Shah, Amit; Conjeti, Sailesh; Navab, Nassir; Katouzian, Amin

    2016-03-01

    The deluge in the size and heterogeneity of medical image databases necessitates content-based retrieval systems for their efficient organization. In this paper, we propose such a system to retrieve prostate MR images which share similarities in appearance and content with a query image. We introduce deeply learnt hashing forests (DL-HF) for this image retrieval task. DL-HF effectively leverages the semantic descriptiveness of deeply learnt convolutional neural networks, used in conjunction with hashing forests, which are unsupervised random forests. DL-HF hierarchically parses the deep-learnt feature space to encode subspaces with compact binary code words. We propose a similarity-preserving feature descriptor called the Parts Histogram, which is derived from DL-HF. Correlation defined on this descriptor is used as a similarity metric for retrieval from the database. Validation on a publicly available multi-center prostate MR image database established the validity of the proposed approach. The proposed method is fully automated without any user interaction and is not dependent on any external image standardization such as image normalization and registration. This image retrieval method is generalizable and is well suited for retrieval in heterogeneous databases of other imaging modalities and anatomies.

  1. Modeling and Simulation of the Economics of Mining in the Bitcoin Market

    PubMed Central

    Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the “Bitcoin Blockchain”, creating the first block of the chain by hashing on his computer’s central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA, and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some “stylized facts” found in real-time price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon, and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network. PMID:27768691

  2. Quantum cryptography using coherent states: Randomized encryption and key generation

    NASA Astrophysics Data System (ADS)

    Corndorf, Eric

    With the advent of the global optical-telecommunications infrastructure, an increasing number of individuals, companies, and agencies communicate information with one another over public networks or physically-insecure private networks. While the majority of the traffic flowing through these networks requires little or no assurance of secrecy, the same cannot be said for certain communications between banks, between government agencies, within the military, and between corporations. In these arenas, the need to specify some level of secrecy in communications is a high priority. While the current approaches to securing sensitive information (namely the public-key-cryptography infrastructure and deterministic private-key ciphers like AES and 3DES) seem to be cryptographically strong based on empirical evidence, there exist no mathematical proofs of secrecy for any widely deployed cryptosystem. As an example, the ubiquitous public-key cryptosystems infer all of their secrecy from the assumption that factoring of the product of two large primes is necessarily time consuming---something which has not, and perhaps cannot, be proven. Since the 1980s, the possibility of using quantum-mechanical features of light as a physical mechanism for satisfying particular cryptographic objectives has been explored. This research has been fueled by the hopes that cryptosystems based on quantum systems may provide provable levels of secrecy which are at least as valid as quantum mechanics itself. Unfortunately, the most widely considered quantum-cryptographic protocols (BB84 and the Ekert protocol) have serious implementation problems. Specifically, they require quantum-mechanical states which are not readily available, and they rely on unproven relations between intrusion-level detection and the information available to an attacker. As a result, the secrecy level provided by these experimental implementations is entirely unspecified. In an effort to provably satisfy the cryptographic objectives of key generation and direct data-encryption, a new quantum cryptographic principle is demonstrated wherein keyed coherent-state signal sets are employed. Taking advantage of the fundamental and irreducible quantum-measurement noise of coherent states, these schemes do not require the users to measure the influence of an attacker. Experimental key-generation and data encryption schemes based on these techniques, which are compatible with today's WDM fiber-optic telecommunications infrastructure, are implemented and analyzed.

  3. A cryptologic based trust center for medical images.

    PubMed

    Wong, S T

    1996-01-01

    To investigate practical solutions that can integrate cryptographic techniques and picture archiving and communication systems (PACS) to improve the security of medical images. The PACS at the University of California San Francisco Medical Center consolidate images and associated data from various scanners into a centralized data archive and transmit them to remote display stations for review and consultation purposes. The purpose of this study is to investigate the model of a digital trust center that integrates cryptographic algorithms and protocols seamlessly into such a digital radiology environment to improve the security of medical images. The timing performance of encryption, decryption, and transmission of the cryptographic protocols over 81 volumetric PACS datasets has been measured. Lossless data compression is also applied before the encryption. The transmission performance is measured against three types of networks of different bandwidths: narrow-band Integrated Services Digital Network, Ethernet, and OC-3c Asynchronous Transfer Mode. The proposed digital trust center provides a cryptosystem solution to protect the confidentiality and to determine the authenticity of digital images in hospitals. The results of this study indicate that diagnostic images such as x-rays and magnetic resonance images could be routinely encrypted in PACS. However, applying encryption in teleradiology and PACS is a tradeoff between communications performance and security measures. Many people are uncertain about how to integrate cryptographic algorithms coherently into existing operations of the clinical enterprise. This paper describes a centralized cryptosystem architecture to ensure image data authenticity in a digital radiology department. The system performance has been evaluated in a hospital-integrated PACS environment.

  4. A cryptologic based trust center for medical images.

    PubMed Central

    Wong, S T

    1996-01-01

    OBJECTIVE: To investigate practical solutions that can integrate cryptographic techniques and picture archiving and communication systems (PACS) to improve the security of medical images. DESIGN: The PACS at the University of California San Francisco Medical Center consolidate images and associated data from various scanners into a centralized data archive and transmit them to remote display stations for review and consultation purposes. The purpose of this study is to investigate the model of a digital trust center that integrates cryptographic algorithms and protocols seamlessly into such a digital radiology environment to improve the security of medical images. MEASUREMENTS: The timing performance of encryption, decryption, and transmission of the cryptographic protocols over 81 volumetric PACS datasets has been measured. Lossless data compression is also applied before the encryption. The transmission performance is measured against three types of networks of different bandwidths: narrow-band Integrated Services Digital Network, Ethernet, and OC-3c Asynchronous Transfer Mode. RESULTS: The proposed digital trust center provides a cryptosystem solution to protect the confidentiality and to determine the authenticity of digital images in hospitals. The results of this study indicate that diagnostic images such as x-rays and magnetic resonance images could be routinely encrypted in PACS. However, applying encryption in teleradiology and PACS is a tradeoff between communications performance and security measures. CONCLUSION: Many people are uncertain about how to integrate cryptographic algorithms coherently into existing operations of the clinical enterprise. This paper describes a centralized cryptosystem architecture to ensure image data authenticity in a digital radiology department. The system performance has been evaluated in a hospital-integrated PACS environment. PMID:8930857

  5. Token-based information security for commercial and federal information networks

    NASA Astrophysics Data System (ADS)

    Rohland, William S.

    1996-03-01

    The planning of cryptographic solutions for messaging and electronic commerce applications in the United States during the past few years has been motivated by a high level of interest in the technology on the part of potential users. It has been marked by a high level of controversy over algorithms, patent rights and escrow policy. The diverse needs of the government and commercial sectors have led to mutually exclusive solutions based on different algorithms and policy; this phenomenon is fairly unique to the United States. Because of the strong requirement to preserve the differences that make these solutions unique for the two environments, the near-term evolution of a single standard appears unlikely. Furthermore, the need on the part of some government agencies and some commercial establishments exists to operate in both environments. This paper deals with the technical definition and design approach to a dual-use cryptographic device and the migration paths to the dual-use device from both environments. Such a device is further considered as a component of a secure cryptographic translation facility.

  6. WLC Preface

    NASA Astrophysics Data System (ADS)

    Miret, Josep M.; Sebé, Francesc

    Low-cost devices are the key component of several applications: RFID tags permit an automated supply chain management while smart cards are a secure means of storing cryptographic keys required for remote and secure authentication in e-commerce and e-government applications. These devices must be cheap in order to permit their cost-effective massive manufacturing and deployment. Unfortunately, their low cost limits their computational power. Other devices such as nodes of sensor networks suffer from an additional constraint, namely, their limited battery life. Secure applications designed for these devices cannot make use of classical cryptographic primitives designed for full-fledged computers.

  7. Post-quantum cryptography.

    PubMed

    Bernstein, Daniel J; Lange, Tanja

    2017-09-13

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  8. Post-quantum cryptography

    NASA Astrophysics Data System (ADS)

    Bernstein, Daniel J.; Lange, Tanja

    2017-09-01

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  9. VIRTEX-5 Fpga Implementation of Advanced Encryption Standard Algorithm

    NASA Astrophysics Data System (ADS)

    Rais, Muhammad H.; Qasim, Syed M.

    2010-06-01

    In this paper, we present an implementation of Advanced Encryption Standard (AES) cryptographic algorithm using state-of-the-art Virtex-5 Field Programmable Gate Array (FPGA). The design is coded in Very High Speed Integrated Circuit Hardware Description Language (VHDL). Timing simulation is performed to verify the functionality of the designed circuit. Performance evaluation is also done in terms of throughput and area. The design implemented on Virtex-5 (XC5VLX50FFG676-3) FPGA achieves a maximum throughput of 4.34 Gbps utilizing a total of 399 slices.

  10. The Hong Kong/AAO/Strasbourg Hα (HASH) Planetary Nebula Database

    NASA Astrophysics Data System (ADS)

    Bojičić, Ivan S.; Parker, Quentin A.; Frew, David J.

    2017-10-01

    The Hong Kong/AAO/Strasbourg Hα (HASH) planetary nebula database is an online research platform providing free and easy access to the largest and most comprehensive catalogue of known Galactic PNe and a repository of observational data (imaging and spectroscopy) for these and related astronomical objects. The main motivation for creating this system is to resolve some long-standing problems in the field, e.g. problems with mimics and dubious and/or misidentified objects, errors in observational data, and the consolidation of widely scattered data sets. This facility gives researchers quick and easy access to archived and new observational data and allows them to create and share non-redundant PN samples and catalogues.

  11. Comparison of Various Similarity Measures for Average Image Hash in Mobile Phone Application

    NASA Astrophysics Data System (ADS)

    Farisa Chaerul Haviana, Sam; Taufik, Muhammad

    2017-04-01

    One of the main issues in Content-Based Image Retrieval (CBIR) is the similarity measure applied to the resulting image hashes. The key challenge is to find the distance or similarity measure that offers the most benefit in terms of speed and computing cost, especially on devices with limited computing capabilities such as mobile phones. In this study we compare twelve of the most common and popular distance or similarity measures, implemented in a mobile phone application. The results show that all similarity measures implemented in this study performed equally well in the mobile phone application. This opens up more possibilities for combining methods in image retrieval.
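
    For concreteness, the average-hash pipeline behind such comparisons, with Hamming distance as one of the candidate measures, can be sketched as below (requires the Pillow package; the file names are placeholders):

        from PIL import Image  # Pillow

        def average_hash(path: str, size: int = 8) -> int:
            # Shrink to 8x8 grayscale, threshold at the mean: a 64-bit hash.
            img = Image.open(path).convert("L").resize((size, size))
            pixels = list(img.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (p > mean)
            return bits

        def hamming(a: int, b: int) -> int:
            # One of the compared measures: count of differing bits (0..64).
            return bin(a ^ b).count("1")

        # Usage sketch: small distances indicate visually similar images.
        # print(hamming(average_hash("query.jpg"), average_hash("db.jpg")))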

  12. Performance-Oriented Privacy-Preserving Data Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pon, R K; Critchlow, T

    2004-09-15

    Current solutions to integrating private data with public data have provided useful privacy metrics, such as relative information gain, that can be used to evaluate alternative approaches. Unfortunately, they have not addressed critical performance issues, especially when the public database is very large. The use of hashes and noise yields better performance than existing techniques while still making it difficult for unauthorized entities to distinguish which data items truly exist in the private database. As we show here, leveraging the uncertainty introduced by collisions caused by hashing and the injection of noise, we present a technique for performing a relational join operation between a massive public table and a relatively smaller private one.
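
    The mechanism can be sketched as a bucketed join: the private side reveals only coarse hash buckets of its keys, where deliberate collisions and injected decoy buckets hide which items really exist. The bucket count and noise level below are illustrative assumptions; the actual technique tunes them against the privacy metric.

        import hashlib
        import random

        BUCKETS = 1 << 12  # small on purpose: collisions hide membership

        def bucket(key: str) -> int:
            digest = hashlib.sha256(key.encode()).digest()
            return int.from_bytes(digest, "big") % BUCKETS

        private_keys = ["alice", "bob"]
        noise = {random.randrange(BUCKETS) for _ in range(8)}  # decoy buckets
        request = sorted({bucket(k) for k in private_keys} | noise)

        # The public side returns rows whose key hashes into a requested bucket;
        # the private side filters the superset locally.
        public_table = {"alice": 1, "carol": 2, "dave": 3}
        candidates = {k: v for k, v in public_table.items() if bucket(k) in request}
        result = {k: v for k, v in candidates.items() if k in private_keys}
        print(result)  # {'alice': 1}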

  13. Planetary Nebula Candidates Uncovered with the HASH Research Platform

    NASA Astrophysics Data System (ADS)

    Fragkou, Vasiliki; Bojičić, Ivan; Frew, David; Parker, Quentin

    2017-10-01

    A detailed examination of new high-quality radio catalogues (e.g. CORNISH) in combination with available mid-infrared (MIR) satellite imagery (e.g. GLIMPSE) has allowed us to find 70 new planetary nebula (PN) candidates based on existing knowledge of their typical colors and fluxes. To further examine the nature of these sources, multiple diagnostic tools have been applied to the candidates based on published data and on available imagery in the HASH (Hong Kong/AAO/Strasbourg Hα planetary nebula) research platform. Some candidates have previously missed optical counterparts, allowing for spectroscopic follow-up. Indeed, the single object spectroscopically observed so far has turned out to be a bona fide PN.

  14. Using Purpose-Built Functions and Block Hashes to Enable Small Block and Sub-file Forensics

    DTIC Science & Technology

    2010-01-01

    ... JPEGs. We tested precarve using the nps-2009-canon2-gen6 (Garfinkel et al., 2009) disk image. The disk image was created with a 32 MB SD card and a... analysis of n-grams in the fragment. [Figure captions: Fig. 1, usage of a 160 GB iPod as reported by iTunes 8.2.1 (top), as reported by the file system (bottom center), and as computed with random sampling (bottom right); note that iTunes usage is actually in GiB, even though the program displays the "GB" label. Fig. 2, caption truncated.]

  15. Effects of Bright Light Therapy on Sleep, Cognition, Brain Function, and Neurochemistry in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2015-01-01

    ... "rush", nitrous oxide ("laughing gas"), amyl or butyl nitrate ("poppers"). Cannabis: marijuana, hashish ("hash"), THC, "pot", "grass", "weed"... have you used marijuana? ____ Have you ever used marijuana at other times in your life? YES / NO. If YES, at what age did you begin... smoking marijuana? ____ On approximately how many occasions have you used marijuana? ____ Do you use any other street drugs currently...

  16. Limits on efficient computation in the physical world

    NASA Astrophysics Data System (ADS)

    Aaronson, Scott Joel

    More than a speculative technology, quantum computing seems to challenge our most basic intuitions about how the physical world should behave. In this thesis I show that, while some intuitions from classical computer science must be jettisoned in the light of modern physics, many others emerge nearly unscathed; and I use powerful tools from computational complexity theory to help determine which are which. In the first part of the thesis, I attack the common belief that quantum computing resembles classical exponential parallelism, by showing that quantum computers would face serious limitations on a wider range of problems than was previously known. In particular, any quantum algorithm that solves the collision problem---that of deciding whether a sequence of n integers is one-to-one or two-to-one---must query the sequence Ω(n^(1/5)) times. This resolves a question that was open for years; previously no lower bound better than constant was known. A corollary is that there is no "black-box" quantum algorithm to break cryptographic hash functions or solve the Graph Isomorphism problem in polynomial time. I also show that relative to an oracle, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform "quantum advice states"; and that any quantum algorithm needs Ω(2^(n/4)/n) queries to find a local minimum of a black-box function on the n-dimensional hypercube. Surprisingly, the latter result also leads to new classical lower bounds for the local search problem. Finally, I give new lower bounds on quantum one-way communication complexity, and on the quantum query complexity of total Boolean functions and recursive Fourier sampling. The second part of the thesis studies the relationship of the quantum computing model to physical reality. I first examine the arguments of Leonid Levin, Stephen Wolfram, and others who believe quantum computing to be fundamentally impossible. I find their arguments unconvincing without a "Sure/Shor separator"---a criterion that separates the already-verified quantum states from those that appear in Shor's factoring algorithm. I argue that such a separator should be based on a complexity classification of quantum states, and go on to create such a classification. Next I ask what happens to the quantum computing model if we take into account that the speed of light is finite---and in particular, whether Grover's algorithm still yields a quadratic speedup for searching a database. Refuting a claim by Benioff, I show that the surprising answer is yes. Finally, I analyze hypothetical models of computation that go even beyond quantum computing. I show that many such models would be as powerful as the complexity class PP, and use this fact to give a simple, quantum computing based proof that PP is closed under intersection. On the other hand, I also present one model---wherein we could sample the entire history of a hidden variable---that appears to be more powerful than standard quantum computing, but only slightly so.

  17. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation

    PubMed Central

    Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-01-01

    Background Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Objective Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Methods Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Results Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Besides, there is no approximation error: computed model parameters are identical to plaintext results. Conclusions To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to this date. Our proposed framework ensures data security and computational efficiency at the same time. PMID:29506966

  18. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    PubMed

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Besides, there is no approximation error: computed model parameters are identical to plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to this date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  19. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and 2^N Discretization

    NASA Astrophysics Data System (ADS)

    Wai Kuan, Yip; Teoh, Andrew B. J.; Ngo, David C. L.

    2006-12-01

    We introduce a novel method for secure computation of a biometric hash on dynamic hand signatures using BioPhasor mixing and 2^N discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific 2^N discretization acts both as an error correction step and as a real-to-binary space converter. We also propose a new method of extracting a compressed representation of dynamic hand signatures using the discrete wavelet transform (DWT) and the discrete Fourier transform (DFT). Without the conventional use of dynamic time warping, the proposed method avoids storage of the user's hand signature template. This is an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method produces stable and distinguishable bit strings, with equal error rates (EERs) of [value not available: see full text] and [value not available: see full text] for random and skilled forgeries in the stolen-token (worst case) scenario, and [value not available: see full text] for both forgeries in the genuine-token (optimal) scenario.

  20. Studying Biological Rhythms of Person's Skin-galvanic Reaction and Dynamics of Light Transmission by Isomeric Substance in Space Flight Conditions

    NASA Technical Reports Server (NTRS)

    Glushko, Vladimir

    2004-01-01

    The intensity and amplitude of human functional systems and the most important organs are wavelike and rhythmic by nature. These waves have constant periodicity, phase, and amplitude. These characteristics can vary, but their variations show pronounced repetition over time. This indicates a hashing (superposition) of several wave processes and their interference. Stochastic changes in the wave-process characteristics of a human organism are explained either by 'pulsations' associated with the hashing of several wave processes and their interference, or by isolated influences of environmental physical factors on the organism. Human beings accordingly have periods of higher and lower efficiency, state of health, and so on, depending not only on environmental factors but also on an 'internal' rhythmic factor. Sometimes the periodicity of the peaks and falls of particular characteristics is broken. Disturbance of steady-state biological rhythms is usually accompanied by reduced stability in the activity of the most important systems of a human organism. This in turn affects the organism's adaptation to changing living conditions, as well as the general condition and efficiency of a human being. The latter factor is very important for space medicine. Biological rhythmology is a special branch of biology and medicine that studies the rhythmic activity mechanisms of organs, their systems, individuals, and species. Corresponding research has also been carried out in space medicine.

  1. Robust and Reusable Fuzzy Extractors

    NASA Astrophysics Data System (ADS)

    Boyen, Xavier

    The use of biometric features as key material in security protocols has often been suggested to relieve their owner from the need to remember long cryptographic secrets. The appeal of biometric data as cryptographic secrets stems from their high apparent entropy, their availability to their owner, and their relative immunity to loss. In particular, they constitute a very effective basis for user authentication, especially when combined with complementary credentials such as a short memorized password or a physical token. However, the use of biometrics in cryptography does not come without problems. Some difficulties are technical, such as the lack of uniformity and the imperfect reproducibility of biometrics, but some challenges are more fundamental.

  2. Combining Cryptography with EEG Biometrics

    PubMed Central

    Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin

    2018-01-01

    Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.

  3. Combining Cryptography with EEG Biometrics.

    PubMed

    Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin

    2018-01-01

    Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.

  4. Security of a sessional blind signature based on quantum cryptography

    NASA Astrophysics Data System (ADS)

    Wang, Tian-Yin; Cai, Xiao-Qiu; Zhang, Rui-Ling

    2014-08-01

    We analyze the security of a sessional blind signature protocol based on quantum cryptography and show that it has two security leaks. One is that the legal user Alice can change the signed message after she gets a valid blind signature from the signatory Bob; the other is that an external opponent Eve can also forge a valid blind signature by a special attack. Neither is permissible for blind signatures. Therefore, this protocol is not secure in the sense that it does not satisfy the non-forgeability of blind signatures. Finally, we discuss methods to prevent these attack strategies.

  5. Cryptographically secure biometrics

    NASA Astrophysics Data System (ADS)

    Stoianov, A.

    2010-04-01

    Biometric systems usually do not possess a cryptographic level of security: it has been deemed impossible to perform biometric authentication in the encrypted domain because of the natural variability of biometric samples and the cryptographic intolerance of even a single-bit error. Encrypted biometric data need to be decrypted on authentication, which creates privacy and security risks. On the other hand, the known solutions called "Biometric Encryption (BE)" or "Fuzzy Extractors" can be cracked by various attacks, for example, by running a database of images offline against the stored helper data in order to obtain a false match. In this paper, we present a novel approach which combines Biometric Encryption with the classical Blum-Goldwasser cryptosystem. In the "Client - Service Provider (SP)" or the "Client - Database - SP" architecture it is possible to keep the biometric data encrypted at all stages of storage and authentication, so that the SP never has access to unencrypted biometric data. It is shown that this approach is suitable for two of the most popular BE schemes, Fuzzy Commitment and Quantized Index Modulation (QIM). The approach has clear practical advantages over biometric systems using "homomorphic encryption". Future work will deal with the application of the proposed solution to one-to-many biometric systems.
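
    The Fuzzy Commitment scheme mentioned above can be sketched with a toy repetition code standing in for a production error-correcting code; the key length, code rate, and noise level here are illustrative assumptions, not the paper's construction:

      import hashlib
      import numpy as np

      REP = 5  # repetition code: each key bit appears 5 times; majority vote decodes

      def commit(biometric_bits, key_bits):
          codeword = np.repeat(key_bits, REP)                 # encode the key
          helper = codeword ^ biometric_bits                  # XOR-bind to the biometric
          return helper, hashlib.sha256(key_bits.tobytes()).hexdigest()

      def open_commitment(helper, key_hash, probe_bits):
          noisy = (helper ^ probe_bits).reshape(-1, REP)      # noisy codeword
          key_bits = (noisy.sum(axis=1) > REP // 2).astype(np.uint8)
          ok = hashlib.sha256(key_bits.tobytes()).hexdigest() == key_hash
          return key_bits if ok else None

      rng = np.random.default_rng(1)
      key = rng.integers(0, 2, 32, dtype=np.uint8)
      enroll = rng.integers(0, 2, 32 * REP, dtype=np.uint8)   # enrollment sample
      helper, tag = commit(enroll, key)

      probe = enroll.copy()                                   # fresh, noisy sample:
      probe[0::REP] ^= 1                                      # two flipped bits per
      probe[1::REP] ^= 1                                      # group still decode
      assert open_commitment(helper, tag, probe) is not None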

  6. Implementation of Rivest Shamir Adleman Algorithm (RSA) and Vigenere Cipher In Web Based Information System

    NASA Astrophysics Data System (ADS)

    Aryanti, Aryanti; Mekongga, Ikhthison

    2018-02-01

    Data security and confidentiality are among the most important aspects of information systems today. One way to secure data is cryptography. In this study, a data security system was developed by implementing the Rivest-Shamir-Adleman (RSA) and Vigenère cipher cryptographic algorithms. The research combined the RSA and Vigenère cipher algorithms to protect document files (Word, Excel, and PDF). The application covers both encryption and decryption of data and was built with PHP and MySQL. On the transmit side, data are first encrypted with RSA using the public key and then with the Vigenère cipher algorithm, which also uses a public key. On the receiving side, decryption proceeds with the Vigenère cipher algorithm, still using the public key, followed by RSA using the private key. Test results show that the system can encrypt, decrypt, and transmit files. Tests performed on encryption and decryption of files of different sizes show that file size affects processing time: the larger the file, the longer the encryption and decryption take.
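
    A toy sketch of layering a Vigenère pass with textbook RSA; the primes, keys, and byte-wise operation are illustrative only, and, unlike the paper's transmit-side ordering, this sketch applies the Vigenère layer before RSA:

      def vigenere(data: bytes, key: bytes, decrypt: bool = False) -> bytes:
          sign = -1 if decrypt else 1
          return bytes((b + sign * key[i % len(key)]) % 256 for i, b in enumerate(data))

      # Textbook RSA with tiny primes -- insecure, for illustration only.
      p, q = 61, 53
      n, phi = p * q, (p - 1) * (q - 1)
      e = 17
      d = pow(e, -1, phi)                          # private exponent (Python 3.8+)

      doc = b"patient record 42"
      vig_key = b"SECRET"

      mixed = vigenere(doc, vig_key)               # Vigenere layer
      cipher = [pow(b, e, n) for b in mixed]       # byte-wise textbook RSA layer

      plain = vigenere(bytes(pow(c, d, n) for c in cipher), vig_key, decrypt=True)
      assert plain == doc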

  7. An effective and secure key-management scheme for hierarchical access control in E-medicine system.

    PubMed

    Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit

    2013-04-01

    Recently, several hierarchical access control schemes have been proposed in the literature to provide security for e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from a large storage space for public parameters in the public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. In order to remedy this security weakness, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs a huge computational cost for verifying public information in the public domain, as it uses ECC digital signatures, which are costly compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in a user hierarchy based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both the public and private domains, as well as the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against different attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems are also solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
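
    A generic sketch of hash-based key derivation in an access hierarchy, in the spirit of a symmetric-key-plus-hash design; the identifiers and derivation rule are assumptions for illustration, not the authors' exact construction:

      import hashlib

      def derive_key(parent_key: bytes, child_id: str) -> bytes:
          # A holder of the parent key can recompute every descendant key,
          # while the one-way hash prevents climbing back up the hierarchy.
          return hashlib.sha256(parent_key + child_id.encode()).digest()

      root = hashlib.sha256(b"hospital-master-secret").digest()
      cardiology = derive_key(root, "dept:cardiology")
      ward_3 = derive_key(cardiology, "ward:3")
      print(ward_3.hex())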

  8. Audio signal encryption using chaotic Hénon map and lifting wavelet transforms

    NASA Astrophysics Data System (ADS)

    Roy, Animesh; Misra, A. P.

    2017-12-01

    We propose an audio signal encryption scheme based on the chaotic Hénon map. The scheme comprises two phases: a preprocessing stage, in which the audio signal is transformed into data by the lifting wavelet scheme, and an encryption stage, in which the transformed data are encrypted using a chaotic data set and hyperbolic functions. Furthermore, we use dynamic keys and consider the key space to be large enough to resist any kind of cryptographic attack. A statistical investigation is also made to test the security and efficiency of the proposed scheme.
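
    The encryption phase can be illustrated by turning the Hénon map into a keystream generator; the initial conditions (serving as the key), burn-in length, and byte-extraction rule below are illustrative assumptions rather than the paper's parameters:

      import numpy as np

      def henon_keystream(n, x=0.1, y=0.1, a=1.4, b=0.3, burn_in=100):
          out = np.empty(n, dtype=np.uint8)
          for i in range(-burn_in, n):
              x, y = 1.0 - a * x * x + y, b * x      # classic Henon iteration
              if i >= 0:
                  out[i] = int(abs(x) * 1e6) % 256   # extract one keystream byte
          return out

      audio = np.random.randint(0, 256, 1024, dtype=np.uint8)  # stand-in samples
      ks = henon_keystream(audio.size)                          # (x0, y0) = secret key
      cipher = audio ^ ks
      assert np.array_equal(cipher ^ ks, audio)                 # decryption restores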

  9. Develop 3G Application with The J2ME SATSA API

    NASA Astrophysics Data System (ADS)

    JunWu, Xu; JunLing, Liang

    This paper describes research in the use of the Security and Trust Services API for J2ME (SATSA) to develop mobile applications for 3G networks. SATSA defines a set of APIs that allows J2ME applications to communicate with and access functionality, secure storage, and cryptographic operations provided by security elements such as smart cards and Wireless Identification Modules (WIM). A Java Card application could also work as an authentication module in a J2ME-based e-bank application. The e-bank application would allow its users to access their bank accounts using their cell phones.

  10. Design and Implementation of KSP on the Next Generation Cryptography API

    NASA Astrophysics Data System (ADS)

    Lina, Zhang

    With good seamless connectivity and higher safety, KSP (Key Storage Providers) is the inexorable trend of security requirements and development, set to take the place of CSP (Cryptographic Service Provider). But the study of KSP has just started in our country, and almost no reports of its implementation can be found. Based on an analysis of the function modules and architecture of Cryptography API: Next Generation (CNG), this paper discusses the design and implementation of a smart-card-based KSP in detail, and an example is also presented to illustrate how to use KSP in Windows Vista.

  11. Silicon photonic transceiver circuit for high-speed polarization-based discrete variable quantum key distribution

    DOE PAGES

    Cai, Hong; Long, Christopher M.; DeRose, Christopher T.; ...

    2017-01-01

    We demonstrate a silicon photonic transceiver circuit for high-speed discrete variable quantum key distribution that employs a common structure for transmit and receive functions. The device is intended for use in polarization-based quantum cryptographic protocols, such as BB84. Our characterization indicates that the circuit can generate the four BB84 states (TE/TM/45°/135° linear polarizations) with >30 dB polarization extinction ratios and gigabit per second modulation speed, and is capable of decoding any polarization bases differing by 90° with high extinction ratios.

  12. Silicon photonic transceiver circuit for high-speed polarization-based discrete variable quantum key distribution.

    PubMed

    Cai, Hong; Long, Christopher M; DeRose, Christopher T; Boynton, Nicholas; Urayama, Junji; Camacho, Ryan; Pomerene, Andrew; Starbuck, Andrew L; Trotter, Douglas C; Davids, Paul S; Lentine, Anthony L

    2017-05-29

    We demonstrate a silicon photonic transceiver circuit for high-speed discrete variable quantum key distribution that employs a common structure for transmit and receive functions. The device is intended for use in polarization-based quantum cryptographic protocols, such as BB84. Our characterization indicates that the circuit can generate the four BB84 states (TE/TM/45°/135° linear polarizations) with >30 dB polarization extinction ratios and gigabit per second modulation speed, and is capable of decoding any polarization bases differing by 90° with high extinction ratios.

  13. Silicon photonic transceiver circuit for high-speed polarization-based discrete variable quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Hong; Long, Christopher M.; DeRose, Christopher T.

    We demonstrate a silicon photonic transceiver circuit for high-speed discrete variable quantum key distribution that employs a common structure for transmit and receive functions. The device is intended for use in polarization-based quantum cryptographic protocols, such as BB84. Our characterization indicates that the circuit can generate the four BB84 states (TE/TM/45°/135° linear polarizations) with >30 dB polarization extinction ratios and gigabit per second modulation speed, and is capable of decoding any polarization bases differing by 90° with high extinction ratios.

  14. ACMES: fast multiple-genome searches for short repeat sequences with concurrent cross-species information retrieval

    PubMed Central

    Reneker, Jeff; Shyu, Chi-Ren; Zeng, Peiyu; Polacco, Joseph C.; Gassmann, Walter

    2004-01-01

    We have developed a web server for the life sciences community to use to search for short repeats of DNA sequence of length between 3 and 10 000 bases within multiple species. This search employs a unique and fast hash function approach. Our system also applies information retrieval algorithms to discover knowledge of cross-species conservation of repeat sequences. Furthermore, we have incorporated a part of the Gene Ontology database into our information retrieval algorithms to broaden the coverage of the search. Our web server and tutorial can be found at http://acmes.rnet.missouri.edu. PMID:15215469
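
    The fast hash-based search can be illustrated with a Rabin-Karp style rolling hash that buckets every length-k window of a sequence; the server's actual algorithm and parameters are not specified in this abstract, so the following is only a generic sketch:

      from collections import defaultdict

      def find_repeats(seq: str, k: int):
          """Positions of every length-k substring occurring more than once,
          using a rolling hash to bucket candidate windows in one pass."""
          base, mod = 4, (1 << 61) - 1
          code = {"A": 0, "C": 1, "G": 2, "T": 3}
          top = pow(base, k - 1, mod)
          h, buckets = 0, defaultdict(list)
          for i, ch in enumerate(seq):
              h = (h * base + code[ch]) % mod
              if i >= k:                                   # slide the window
                  h = (h - code[seq[i - k]] * top * base) % mod
              if i >= k - 1:
                  buckets[h].append(i - k + 1)
          # with a 61-bit modulus, collisions are negligible for this sketch
          return {seq[ps[0]:ps[0] + k]: ps for ps in buckets.values() if len(ps) > 1}

      print(find_repeats("ACGTACGTGGACGT", 4))   # {'ACGT': [0, 4, 10]}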

  15. Effects of Bright Light Therapy on Sleep, Cognition, Brain Function, and Neurochemistry in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2014-01-30

    34poppers").      Cannabis:   marijuana , hashish ("hash"), THC, "pot", "grass", "weed", "reefer".        Tranquilizers:  Quaalude, Seconal ("reds"), Valium...How many times in the past year have you used marijuana ? _______________ Have you ever used marijuana at other times in your life? YES NO...If YES, at what age did you begin smoking marijuana ? ______________ On approximately how many occasions have you used marijuana ? __________ Do

  16. Defense frontier analysis of quantum cryptographic systems.

    PubMed

    Slutsky, B; Rao, R; Sun, P C; Tancevski, L; Fainman, S

    1998-05-10

    When a quantum cryptographic system operates in the presence of background noise, security of the key can be recovered by a procedure called key distillation. A key-distillation scheme effective against so-called individual (bitwise-independent) eavesdropping attacks involves sacrifice of some of the data through privacy amplification. We derive the amount of data sacrifice sufficient to defend against individual eavesdropping attacks in both BB84 and B92 protocols and show in what sense the communication becomes secure as a result. We also compare the secrecy capacity of various quantum cryptosystems, taking into account data sacrifice during key distillation, and conclude that the BB84 protocol may offer better performance characteristics than the B92.

  17. Local randomness: Examples and application

    NASA Astrophysics Data System (ADS)

    Fu, Honghao; Miller, Carl A.

    2018-03-01

    When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).

  18. A Scheme for Obtaining Secure S-Boxes Based on Chaotic Baker's Map

    NASA Astrophysics Data System (ADS)

    Gondal, Muhammad Asif; Abdul Raheem; Hussain, Iqtadar

    2014-09-01

    In this paper, a method for obtaining cryptographically strong 8 × 8 substitution boxes (S-boxes) is presented. The method is based on the chaotic Baker's map and a "mini version" of a new block cipher with a block size of 8 bits, and can be easily and efficiently performed on a computer. The cryptographic strength of some 8 × 8 S-boxes randomly produced by the method is analyzed. The results show that (1) all of them are bijective; (2) the nonlinearity of each output bit is usually about 100; (3) all of them approximately satisfy the strict avalanche criterion and the output bits independence criterion; (4) they all have an almost equiprobable input/output XOR distribution.
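
    A sketch of the general recipe, with the logistic map standing in for the paper's Baker's-map-plus-mini-cipher construction (an assumption): rank a chaotic trajectory to obtain a bijective S-box, then probe two of the criteria listed above:

      import numpy as np

      def chaotic_sbox(x0=0.41, r=3.99, n=256):
          xs, x = [], x0
          for _ in range(n):
              x = r * x * (1 - x)        # logistic map iteration
              xs.append(x)
          # ranking the trajectory yields a permutation of 0..255, i.e. a bijection
          return np.argsort(xs).astype(np.uint8)

      sbox = chaotic_sbox()
      assert sorted(sbox) == list(range(256))      # bijectivity check

      # crude avalanche probe: flipping one input bit should change about
      # half of the 8 output bits on average (~4) for a well-mixed S-box
      flips = [bin(int(sbox[x]) ^ int(sbox[x ^ 1])).count("1") for x in range(256)]
      print("mean output bits flipped:", sum(flips) / 256)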

  19. Cryptographically supported NFC tags in medication for better inpatient safety.

    PubMed

    Özcanhan, Mehmet Hilal; Dalkılıç, Gökhan; Utku, Semih

    2014-08-01

    Reliable sources report that errors in drug administration are increasing the number of inpatients harmed or killed during healthcare. This development is in contradiction to patient safety norms. A correctly designed hospital-wide ubiquitous system, using advanced inpatient identification and matching techniques, should provide the correct medicine and dosage at the right time. Researchers are still making grouping-proof protocol proposals based on EPC Global Class 1 Generation 2 ver. 1.2 standard tags for drug administration. Analyses show that such protocols make medication insecure and hence fail to guarantee inpatient safety. Thus, the original goal of patient safety still remains. In this paper, a very recent proposal (EKATE), upgraded by a cryptographic function, is shown to fall short of expectations. Then, an alternative proposal, IMS-NFC, which uses a more suitable and newer technology, namely Near Field Communication (NFC), is described. The proposed protocol has the additional support of stronger security primitives and is compliant with ISO communication and security standards. Unlike previous works, the proposal is a complete ubiquitous system that guarantees full patient safety; and it is based on off-the-shelf, new technology products available in every corner of the world. To prove the claims, the performance, cost, security, and scope of IMS-NFC are compared with previous proposals. Evaluation shows that the proposed system has stronger security, increased patient safety, and equal efficiency, at little extra cost.

  20. Designing and Operating Through Compromise: Architectural Analysis of CKMS for the Advanced Metering Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duren, Mike; Aldridge, Hal; Abercrombie, Robert K

    2013-01-01

    Compromises attributable to the Advanced Persistent Threat (APT) highlight the necessity for constant vigilance. The APT provides a new perspective on security metrics (e.g., statistics-based cyber security) and quantitative risk assessments. We consider design principles and models/tools that provide high assurance for energy delivery systems (EDS) operations regardless of the state of compromise. Cryptographic keys must be securely exchanged, then held and protected on either end of a communications link. This is challenging for a utility with numerous substations that must secure the intelligent electronic devices (IEDs) that may comprise a complex control system of systems. For example, distribution and management of keys among the millions of intelligent meters within the Advanced Metering Infrastructure (AMI) is being implemented as part of the National Smart Grid initiative. Without a means for a secure cryptographic key management system (CKMS), no cryptographic solution can be widely deployed to protect the EDS infrastructure from cyber-attack. We consider 1) how security modeling is applied to key management and cyber security concerns on a continuous basis from design through operation, 2) how trusted models and key management architectures greatly impact failure scenarios, and 3) how hardware-enabled trust is a critical element in detecting, surviving, and recovering from attack.

  1. A Secure Information Framework with APRQ Properties

    NASA Astrophysics Data System (ADS)

    Rupa, Ch.

    2017-08-01

    The Internet of Things is one of the most trending topics in the digital world, and security issues are rampant. In corporate or institutional settings, security risks are apparent from the outset, yet market leaders are often unable to use cryptographic techniques because of their complexity. Hence many pieces of private information, including IDs, are readily available for third parties to see and to utilize. There is a need to decrease the complexity and increase the robustness of cryptographic approaches. In view of this, a new cryptographic technique, a good encryption pact with adjacency, random-prime-number, and quantum-code properties, has been proposed. Here, encryption is performed using quantum photons with Gray code. This approach uses concepts from physics and mathematics, with no external key exchange, to improve the security of the data. It also reduces key attacks by generating the key at each party's side instead of sharing it. This makes the security more robust than in existing approaches. The important properties exploited are the adjacency property of Gray code and the mapping of different photons to a single bit (0 or 1), which can reduce the avalanche effect. Cryptanalysis of the proposed method shows that it is resistant to various attacks and stronger than existing approaches.
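
    The Gray code adjacency property invoked above, in a minimal generic demo: consecutive integers map to codewords differing in exactly one bit (standard conversion formulas, not the article's full scheme):

      def gray_encode(n: int) -> int:
          return n ^ (n >> 1)

      def gray_decode(g: int) -> int:
          n = 0
          while g:              # fold the bits back down
              n ^= g
              g >>= 1
          return n

      for i in range(7):
          a, b = gray_encode(i), gray_encode(i + 1)
          assert bin(a ^ b).count("1") == 1     # exactly one bit changes
          assert gray_decode(a) == i            # the mapping is invertible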

  2. Some methods for blindfolded record linkage.

    PubMed

    Churches, Tim; Christen, Peter

    2004-06-28

    The linkage of records which refer to the same entity in separate data collections is a common requirement in public health and biomedical research. Traditionally, record linkage techniques have required that all the identifying data in which links are sought be revealed to at least one party, often a third party. This necessarily invades personal privacy and requires complete trust in the intentions of that party and their ability to maintain security and confidentiality. Dusserre, Quantin, Bouzelat and colleagues have demonstrated that it is possible to use secure one-way hash transformations to carry out follow-up epidemiological studies without any party having to reveal identifying information about any of the subjects - a technique which we refer to as "blindfolded record linkage". A limitation of their method is that only exact comparisons of values are possible, although phonetic encoding of names and other strings can be used to allow for some types of typographical variation and data errors. A method is described which permits the calculation of a general similarity measure, the n-gram score, without having to reveal the data being compared, albeit at some cost in computation and data communication. This method can be combined with public key cryptography and automatic estimation of linkage model parameters to create an overall system for blindfolded record linkage. The system described offers good protection against misdeeds or security failures by any one party, but remains vulnerable to collusion between or simultaneous compromise of two or more parties involved in the linkage operation. In order to reduce the likelihood of this, the use of last-minute allocation of tasks to substitutable servers is proposed. Proof-of-concept computer programmes written in the Python programming language are provided to illustrate the similarity comparison protocol. Although the protocols described in this paper are not unconditionally secure, they do suggest the feasibility, with the aid of modern cryptographic techniques and high speed communication networks, of a general purpose probabilistic record linkage system which permits record linkage studies to be carried out with negligible risk of invasion of personal privacy.
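
    A simplified illustration of the blindfolded n-gram comparison, assuming a key agreed between the data custodians; the full protocol adds machinery (task allocation, linkage-model estimation) not shown here:

      import hashlib
      import hmac

      def hashed_bigrams(name: str, key: bytes) -> set:
          s = name.lower()
          grams = {s[i:i + 2] for i in range(len(s) - 1)}
          return {hmac.new(key, g.encode(), hashlib.sha256).hexdigest() for g in grams}

      def dice_score(a: set, b: set) -> float:
          return 2 * len(a & b) / (len(a) + len(b))

      shared_key = b"agreed out-of-band by the data custodians"
      a = hashed_bigrams("Katherine", shared_key)
      b = hashed_bigrams("Catherine", shared_key)
      print(round(dice_score(a, b), 2))   # 0.88: high despite the spelling variation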

  3. Some methods for blindfolded record linkage

    PubMed Central

    Churches, Tim; Christen, Peter

    2004-01-01

    Background The linkage of records which refer to the same entity in separate data collections is a common requirement in public health and biomedical research. Traditionally, record linkage techniques have required that all the identifying data in which links are sought be revealed to at least one party, often a third party. This necessarily invades personal privacy and requires complete trust in the intentions of that party and their ability to maintain security and confidentiality. Dusserre, Quantin, Bouzelat and colleagues have demonstrated that it is possible to use secure one-way hash transformations to carry out follow-up epidemiological studies without any party having to reveal identifying information about any of the subjects – a technique which we refer to as "blindfolded record linkage". A limitation of their method is that only exact comparisons of values are possible, although phonetic encoding of names and other strings can be used to allow for some types of typographical variation and data errors. Methods A method is described which permits the calculation of a general similarity measure, the n-gram score, without having to reveal the data being compared, albeit at some cost in computation and data communication. This method can be combined with public key cryptography and automatic estimation of linkage model parameters to create an overall system for blindfolded record linkage. Results The system described offers good protection against misdeeds or security failures by any one party, but remains vulnerable to collusion between or simultaneous compromise of two or more parties involved in the linkage operation. In order to reduce the likelihood of this, the use of last-minute allocation of tasks to substitutable servers is proposed. Proof-of-concept computer programmes written in the Python programming language are provided to illustrate the similarity comparison protocol. Conclusion Although the protocols described in this paper are not unconditionally secure, they do suggest the feasibility, with the aid of modern cryptographic techniques and high speed communication networks, of a general purpose probabilistic record linkage system which permits record linkage studies to be carried out with negligible risk of invasion of personal privacy. PMID:15222890

  4. Next generation DRM: cryptography or forensics?

    NASA Astrophysics Data System (ADS)

    Robert, Arnaud

    2009-02-01

    Current content protection systems rely primarily on applied cryptographic techniques but there is an increased use of forensic solutions in images, music and video distribution alike. The two approaches differ significantly, both in terms of technology and in terms of strategy, and thus it begs the question: will one approach take over in the long run, and if so which one? Discussing the evolution of both cryptographic and forensic solutions, we conclude that neither approach is ideal for all constituents, and that in the video space at least they will continue to co-exist for the foreseeable future - even if this may not be the case for other media types. We also analyze shortcomings of these approaches, and suggest that new solutions are necessary in this still emerging marketplace.

  5. Quantum Privacy Amplification and the Security of Quantum Cryptography over Noisy Channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deutsch, D.; Ekert, A.; Jozsa, R.

    1996-09-01

    Existing quantum cryptographic schemes are not, as they stand, operable in the presence of noise on the quantum communication channel. Although they become operable if they are supplemented by classical privacy-amplification techniques, the resulting schemes are difficult to analyze and have not been proved secure. We introduce the concept of quantum privacy amplification and a cryptographic scheme incorporating it which is provably secure over a noisy channel. The scheme uses an "entanglement purification" procedure which, because it requires only a few quantum controlled-not and single-qubit operations, could be implemented using technology that is currently being developed. © 1996 The American Physical Society.

  6. A generalized algorithm to design finite field normal basis multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1986-01-01

    Finite field arithmetic logic is central in the implementation of some error-correcting coders and some cryptographic devices. There is a need for good multiplication algorithms which can be easily realized. Massey and Omura recently developed a new multiplication algorithm for finite fields based on a normal basis representation. Using the normal basis representation, the design of the finite field multiplier is simple and regular. The fundamental design of the Massey-Omura multiplier is based on a design of a product function. In this article, a generalized algorithm to locate a normal basis in a field is first presented. Using this normal basis, an algorithm to construct the product function is then developed. This design does not depend on particular characteristics of the generator polynomial of the field.

  7. Tag Content Access Control with Identity-based Key Exchange

    NASA Astrophysics Data System (ADS)

    Yan, Liang; Rong, Chunming

    2010-09-01

    Radio Frequency Identification (RFID) technology, used to identify objects and users, has recently been applied in many areas such as retail and supply chains. Preventing tag content from unauthorized readout is a core RFID privacy problem. The hash-lock access control protocol can make a tag release its content only to a reader that knows the secret key shared between them. However, in order to obtain this shared secret key, the reader needs to communicate with a back-end database. In this paper, we propose an identity-based secret key exchange approach to generate the secret key required by the hash-lock access control protocol. With this approach, not only is the back-end database connection no longer needed, but the tag cloning problem can also be eliminated at the same time.
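
    The hash-lock protocol referenced above, in a minimal sketch: the tag stores only metaID = H(key) and unlocks when a reader presents the matching key. The identity-based key exchange that the paper adds to remove the back-end database is not shown:

      import hashlib

      def H(data: bytes) -> bytes:
          return hashlib.sha256(data).digest()

      class Tag:
          def __init__(self, key: bytes, content: str):
              self.meta_id = H(key)          # stored on the tag at personalization
              self._content = content

          def query(self):
              return self.meta_id            # the tag answers queries with metaID only

          def unlock(self, key: bytes):
              return self._content if H(key) == self.meta_id else None

      key = b"shared-secret-for-tag-42"
      tag = Tag(key, "pallet #42: 300 units")
      assert tag.unlock(b"wrong") is None
      print(tag.unlock(key))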

  8. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a, LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate our variants of LSH achieve the robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
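
    One member of the LSH family (random-hyperplane signatures for cosine similarity) in a small in-memory sketch; the multi-probe variants and Hadoop deployment evaluated in the paper are not reproduced, and all sizes are illustrative:

      import numpy as np

      rng = np.random.default_rng(0)
      planes = rng.standard_normal((32, 100))        # 32 hyperplanes, 100-dim data

      def signature(v: np.ndarray) -> np.ndarray:
          return (planes @ v > 0).astype(np.uint8)   # one bit per hyperplane

      def hamming(a, b) -> int:
          return int(np.count_nonzero(a != b))

      q = rng.standard_normal(100)
      near = q + 0.1 * rng.standard_normal(100)      # slightly perturbed neighbor
      far = rng.standard_normal(100)                 # unrelated item
      print(hamming(signature(q), signature(near)))  # few differing bits
      print(hamming(signature(q), signature(far)))   # ~16 of 32 differ on average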

  9. A fast exact simulation method for a class of Markov jump processes.

    PubMed

    Li, Yao; Hu, Lili

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.
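
    The bucket-sort idea at the core of the HLM, sketched with illustrative parameters; the complete method (rehashing across leaps, clock updates after each event) follows the paper and is not shown:

      import random
      from collections import defaultdict

      tau = 0.5
      clocks = [random.expovariate(1.0) for _ in range(10)]   # next firing times

      buckets = defaultdict(list)
      for t in clocks:
          buckets[int(t // tau)].append(t)     # O(1) hash-table placement per clock

      first = min(buckets)                      # leap to the first non-empty window
      next_event = min(buckets[first])          # short scan inside a single bucket
      print(round(next_event, 3))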

  10. Tangible interactive system for document browsing and visualisation of multimedia data

    NASA Astrophysics Data System (ADS)

    Rytsar, Yuriy; Voloshynovskiy, Sviatoslav; Koval, Oleksiy; Deguillaume, Frederic; Topak, Emre; Startchik, Sergei; Pun, Thierry

    2006-01-01

    In this paper we introduce and develop a framework for document interactive navigation in multimodal databases. First, we analyze the main open issues of existing multimodal interfaces and then discuss two applications that include interaction with documents in several human environments, i.e., the so-called smart rooms. Second, we propose a system set-up dedicated to the efficient navigation in the printed documents. This set-up is based on the fusion of data from several modalities that include images and text. Both modalities can be used as cover data for hidden indexes using data-hiding technologies as well as source data for robust visual hashing. The particularities of the proposed robust visual hashing are described in the paper. Finally, we address two practical applications of smart rooms for tourism and education and demonstrate the advantages of the proposed solution.

  11. A new pre-classification method based on associative matching method

    NASA Astrophysics Data System (ADS)

    Katsuyama, Yutaka; Minagawa, Akihiro; Hotta, Yoshinobu; Omachi, Shinichiro; Kato, Nei

    2010-01-01

    Reducing the time complexity of character matching is critical to the development of efficient Japanese Optical Character Recognition (OCR) systems. To shorten processing time, recognition is usually split into separate pre-classification and recognition stages. For high overall recognition performance, the pre-classification stage must both have very high classification accuracy and return only a small number of putative character categories for further processing. Furthermore, for any practical system, the speed of the pre-classification stage is also critical. The associative matching (AM) method has often been used for fast pre-classification, because its use of a hash table and reliance solely on logical bit operations to select categories makes it highly efficient. However, a certain level of redundancy exists in the hash table because it is constructed using only the minimum and maximum values of the data on each axis and therefore does not take account of the distribution of the data. We propose a modified associative matching method that satisfies the performance criteria described above but in a fraction of the time, by modifying the hash table to reflect the underlying distribution of training characters. Furthermore, we show that our approach outperforms pre-classification by clustering, ANN, and conventional AM in terms of classification accuracy, discriminative power, and speed. Compared to conventional associative matching, the proposed approach results in a 47% reduction in total processing time across an evaluation test set comprising 116,528 Japanese character images.

  12. Fault parameter constraints using relocated earthquakes: A validation of first-motion focal-mechanism data

    USGS Publications Warehouse

    Kilb, Debi; Hardebeck, J.L.

    2006-01-01

    We estimate the strike and dip of three California fault segments (Calaveras, Sargent, and a portion of the San Andreas near San Juan Bautista) based on principal component analysis of accurately located microearthquakes. We compare these fault orientations with two different first-motion focal mechanism catalogs: the Northern California Earthquake Data Center (NCEDC) catalog, calculated using the FPFIT algorithm (Reasenberg and Oppenheimer, 1985), and a catalog created using the HASH algorithm, which tests mechanism stability relative to seismic velocity model variations and earthquake location (Hardebeck and Shearer, 2002). We assume any disagreement (misfit >30° in strike, dip, or rake) indicates inaccurate focal mechanisms in the catalogs. With this assumption, we can quantify the parameters that identify the most optimally constrained focal mechanisms. For the NCEDC/FPFIT catalogs, we find that the best quantitative discriminator of quality focal mechanisms is the station distribution ratio (STDR) parameter, an indicator of how the stations are distributed about the focal sphere. Requiring STDR > 0.65 increases the acceptable mechanisms from 34%–37% to 63%–68%. This suggests stations should be uniformly distributed surrounding, rather than aligning with, known fault traces. For the HASH catalogs, the fault plane uncertainty (FPU) parameter is the best discriminator, increasing the percent of acceptable mechanisms from 63%–78% to 81%–83% when FPU ≤ 35°. The overall higher percentage of acceptable mechanisms and the usefulness of the formal uncertainty in identifying quality mechanisms validate the HASH approach of testing for mechanism stability.

  13. High-performance sparse matrix-matrix products on Intel KNL and multicore architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagasaka, Y; Matsuoka, S; Azad, A

    Sparse matrix-matrix multiplication (SpGEMM) is a computational primitive that is widely used in areas ranging from traditional numerical applications to recent big data analysis and machine learning. Although many SpGEMM algorithms have been proposed, hardware-specific optimizations for multi- and many-core processors are lacking and a detailed analysis of their performance under various use cases and matrices is not available. We first identify and mitigate multiple bottlenecks with memory management and thread scheduling on Intel Xeon Phi (Knights Landing or KNL). Specifically targeting multi- and many-core processors, we develop a hash-table-based algorithm and optimize a heap-based shared-memory SpGEMM algorithm. We examine their performance together with other publicly available codes. Different from the literature, our evaluation also includes use cases that are representative of real graph algorithms, such as multi-source breadth-first search or triangle counting. Our hash-table and heap-based algorithms show significant speedups over existing libraries in the majority of cases, while different algorithms dominate the other scenarios depending on matrix size, sparsity, compression factor, and operation type. We wrap up the in-depth evaluation results and provide a recipe for choosing the best SpGEMM algorithm for a target scenario. A critical finding is that hash-table-based SpGEMM gets a significant performance boost if the nonzeros are not required to be sorted within each row of the output matrix.
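
    The hash-table accumulator at the heart of row-wise SpGEMM, in a generic sketch using plain dictionaries; the paper's KNL-specific kernels are far more involved:

      def spgemm(A, B):
          """A and B are sparse matrices given as {row: {col: value}} dicts."""
          C = {}
          for i, a_row in A.items():
              acc = {}                                # hash-table accumulator
              for k, a_ik in a_row.items():
                  for j, b_kj in B.get(k, {}).items():
                      acc[j] = acc.get(j, 0.0) + a_ik * b_kj
              if acc:
                  C[i] = acc
          return C

      A = {0: {0: 1.0, 2: 2.0}, 1: {1: 3.0}}
      B = {0: {1: 4.0}, 1: {0: 5.0}, 2: {1: 6.0}}
      print(spgemm(A, B))    # {0: {1: 16.0}, 1: {0: 15.0}}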

  14. Cryptographic Protocol for Comparing Sets without Leaking Them: Applications in Astronomy

    NASA Astrophysics Data System (ADS)

    McCullough, Peter R.

    2011-09-01

    We describe a cryptographic protocol for two or more persons to compare individual lists of astronomical objects of interest without leaking them. Cryptographers have long known such protocols; astronomers and other scientists may benefit from them also. We describe some latent opportunities that would be enabled by this protocol. Consider the following scenario: Alice has a set of stars that are candidate hosts of transiting planets. Bob has a similar set. Alice and Bob have a mutual desire to know the intersection of their two lists without revealing them to each other. Alice and Bob can recruit a trusted third party, Josephine, to make the comparison, report the results, and then destroy each list. Limitations of that approach are that 1) Josephine must devote time to make each comparison, 2) Alice and Bob may not know a Josephine that they both can trust, especially if Alice and Bob are from different communities, 3) Josephine may not indeed be trustworthy, 4) a fourth person may wittingly or unwittingly intercept one or both of the lists in Josephine's care, and 5) anticipating those limitations, Alice and Bob may elect not to recruit a Josephine and hence not compare their lists. We describe a variant that overcomes those limitations by A) encrypting the lists prior to transmitting them to Josephine, and B) replacing a human Josephine with a computer website.
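
    A sketch of the encrypted-list comparison in this scenario, assuming Alice and Bob share a key that Josephine never sees; key agreement, dummy padding, and the website mechanics are omitted:

      import hashlib
      import hmac

      def blind(targets, key: bytes) -> set:
          return {hmac.new(key, t.encode(), hashlib.sha256).hexdigest() for t in targets}

      key = b"negotiated by Alice and Bob, never given to Josephine"
      alice = blind({"HD 209458", "Kepler-22", "TRAPPIST-1"}, key)
      bob = blind({"Kepler-22", "WASP-12", "TRAPPIST-1"}, key)

      overlap = alice & bob          # computable by an untrusted third party
      print(len(overlap))            # 2; Josephine sees only opaque digests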

  15. Network-Centric Quantum Communications

    NASA Astrophysics Data System (ADS)

    Hughes, Richard

    2014-03-01

    Single-photon quantum communications (QC) offers "future-proof" cryptographic security rooted in the laws of physics. Today's quantum-secured communications cannot be compromised by unanticipated future technological advances. But to date, QC has only existed in point-to-point instantiations that have limited ability to address the cyber security challenges of our increasingly networked world. In my talk I will describe a fundamentally new paradigm of network-centric quantum communications (NQC) that leverages the network to bring scalable, QC-based security to user groups that may have no direct user-to-user QC connectivity. With QC links only between each of N users and a trusted network node, NQC brings quantum security to N² user pairs, and to multi-user groups. I will describe a novel integrated photonics quantum smartcard ("QKarD") and its operation in a multi-node NQC test bed. The QKarDs are used to implement the quantum cryptographic protocols of quantum identification, quantum key distribution and quantum secret splitting. I will explain how these cryptographic primitives are used to provide key management for encryption, authentication, and non-repudiation for user-to-user communications. My talk will conclude with a description of a recent demonstration that QC can meet both the security and quality-of-service (latency) requirements for electric grid control commands and data. These requirements cannot be met simultaneously with present-day cryptography.

  16. Modular multiplication in GF(p) for public-key cryptography

    NASA Astrophysics Data System (ADS)

    Olszyna, Jakub

    Modular multiplication forms the basis of modular exponentiation, which is the core operation of the RSA cryptosystem. It is also present in many other cryptographic algorithms, including those based on ECC and HECC. Hence, an efficient implementation of PKC relies on efficient implementation of modular multiplication. The paper presents a survey of the most common algorithms for modular multiplication along with hardware architectures especially suitable for cryptographic applications in energy-constrained environments. The motivation for studying low-power and area-efficient modular multiplication algorithms comes from enabling public-key security for ultra-low-power devices that operate in constrained environments such as wireless sensor networks. Serial architectures for GF(p) are analyzed and presented. Finally, the proposed architectures are verified and compared according to the amount of power dissipated throughout the operation.
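
    Montgomery modular multiplication, one of the classic algorithms such surveys cover, in a short sketch with a toy modulus; word-serial hardware variants are beyond this illustration:

      def montgomery_setup(N: int, k: int):
          R = 1 << k                        # R = 2^k > N, gcd(R, N) = 1 for odd N
          N_inv = (-pow(N, -1, R)) % R      # N' = -N^{-1} mod R
          return R, N_inv

      def mont_mul(a, b, N, k, N_inv):
          """Return a * b * R^{-1} mod N using shifts and masks, no division by N."""
          t = a * b
          m = (t * N_inv) & ((1 << k) - 1)  # t * N' mod R via a bit mask
          u = (t + m * N) >> k              # exact division by R
          return u - N if u >= N else u     # at most one final subtraction

      N, k = 3233, 12                       # toy modulus 61 * 53; R = 4096 > N
      R, N_inv = montgomery_setup(N, k)
      a, b = 123, 456
      a_m, b_m = (a * R) % N, (b * R) % N   # convert operands to Montgomery form
      c_m = mont_mul(a_m, b_m, N, k, N_inv)
      assert (c_m * pow(R, -1, N)) % N == (a * b) % N   # matches the plain product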

  17. Cryptographic robustness of a quantum cryptography system using phase-time coding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N.

    2008-01-15

    A cryptographic analysis is presented of a new quantum key distribution protocol using phase-time coding. An upper bound is obtained for the error rate that guarantees secure key distribution. It is shown that the maximum tolerable error rate for this protocol depends on the counting rate in the control time slot. When no counts are detected in the control time slot, the protocol guarantees secure key distribution if the bit error rate in the sifted key does not exceed 50%. This protocol partially discriminates between errors due to system defects (e.g., imbalance of a fiber-optic interferometer) and eavesdropping. In the absence of eavesdropping, the counts detected in the control time slot are not caused by interferometer imbalance, which reduces the requirements for interferometer stability.

  18. Cryptography and the Internet: lessons and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCurley, K.S.

    1996-12-31

    The popularization of the Internet has brought fundamental changes to the world, because it allows a universal method of communication between computers. This carries enormous benefits with it, but also raises many security considerations. Cryptography is a fundamental technology used to provide security of computer networks, and there is currently a widespread engineering effort to incorporate cryptography into various aspects of the Internet. The system-level engineering required to provide security services for the Internet carries some important lessons for researchers whose study is focused on narrowly defined problems. It also offers challenges to the cryptographic research community by raising new questions not adequately addressed by the existing body of knowledge. This paper attempts to summarize some of these lessons and challenges for the cryptographic research community.

  19. Asymmetric distances for binary embeddings.

    PubMed

    Gordo, Albert; Perronnin, Florent; Gong, Yunchao; Lazebnik, Svetlana

    2014-01-01

    In large-scale query-by-example retrieval, embedding image signatures in a binary space offers two benefits: data compression and search efficiency. While most embedding algorithms binarize both query and database signatures, it has been noted that this is not strictly a requirement. Indeed, asymmetric schemes that binarize the database signatures but not the query still enjoy the same two benefits but may provide superior accuracy. In this work, we propose two general asymmetric distances that are applicable to a wide variety of embedding techniques including locality sensitive hashing (LSH), locality sensitive binary codes (LSBC), spectral hashing (SH), PCA embedding (PCAE), PCAE with random rotations (PCAE-RR), and PCAE with iterative quantization (PCAE-ITQ). We experiment on four public benchmarks containing up to 1M images and show that the proposed asymmetric distances consistently lead to large improvements over the symmetric Hamming distance for all binary embedding techniques.
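
    A sketch contrasting symmetric Hamming distance with one possible asymmetric distance for a sign-threshold embedding; the paper covers several embedding families and more refined distance estimates than this toy:

      import numpy as np

      rng = np.random.default_rng(3)
      db = rng.standard_normal((5, 16))
      db_codes = (db > 0).astype(np.int8)               # database side: binary codes

      query = db[2] + 0.2 * rng.standard_normal(16)     # noisy copy of item 2
      q_code = (query > 0).astype(np.int8)

      sym = np.count_nonzero(db_codes != q_code, axis=1)   # symmetric Hamming
      recon = 2 * db_codes - 1                              # code -> {-1,+1} prototype
      asym = ((recon - query) ** 2).sum(axis=1)             # query stays real-valued
      print(np.argmin(sym), np.argmin(asym))                # both should point to 2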

  20. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Jia, C. J.; Wang, Y.; Mendl, C. B.; Moritz, B.; Devereaux, T. P.

    2018-03-01

    We describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian, sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the "checkerboard" decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.
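
    The ranking idea behind such a perfect hash can be sketched with the combinatorial number system: the position of an occupation bitstring among all states with the same particle number is a sum of binomial coefficients. This is a textbook ranking; the paper's checkerboard decomposition is not shown:

      from itertools import combinations
      from math import comb

      def state_index(occ: int) -> int:
          """Rank of a bit pattern among all patterns with equal popcount."""
          rank, seen = 0, 0
          for pos in range(occ.bit_length()):
              if occ >> pos & 1:
                  seen += 1
                  rank += comb(pos, seen)   # combinatorial number system
          return rank

      # verify against an explicitly ordered 4-particle, 8-orbital basis
      basis = sorted(sum(1 << p for p in c) for c in combinations(range(8), 4))
      assert all(state_index(s) == i for i, s in enumerate(basis))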

  1. Handwriting: Feature Correlation Analysis for Biometric Hashes

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  2. A fast exact simulation method for a class of Markov jump processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yao, E-mail: yaoli@math.umass.edu; Hu, Lili, E-mail: lilyhu86@gmail.com

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze itsmore » properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.« less

  3. Genomics-Based Security Protocols: From Plaintext to Cipherprotein

    NASA Technical Reports Server (NTRS)

    Shaw, Harry; Hussein, Sayed; Helgert, Hermann

    2011-01-01

    The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
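
    The keyed-hash message authentication step, shown with Python's standard hmac module; the paper's DNA-inspired construction replaces the underlying primitives with genomics-based codes and is not reproduced here:

      import hashlib
      import hmac

      key = b"pairwise key established between two ad hoc nodes"
      msg = b"telemetry frame 0217"

      tag = hmac.new(key, msg, hashlib.sha256).digest()       # sender attaches tag

      def verify(key: bytes, msg: bytes, tag: bytes) -> bool:
          expected = hmac.new(key, msg, hashlib.sha256).digest()
          return hmac.compare_digest(expected, tag)           # constant-time check

      assert verify(key, msg, tag)
      assert not verify(key, b"telemetry frame 0218", tag)    # tampering detected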

  4. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    NASA Astrophysics Data System (ADS)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
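
    The MinHash ingredient can be sketched as follows: the k-mer sets of two reads are summarized by their minima under several seeded hashes, and the fraction of matching minima estimates their Jaccard similarity. The kmer voting and privacy machinery of Balaur are not shown, and the seeds and sizes are illustrative:

      import hashlib

      def kmers(seq: str, k: int = 8) -> set:
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def minhash(kmer_set: set, n_hashes: int = 64):
          sig = []
          for seed in range(n_hashes):
              salt = seed.to_bytes(8, "little")       # one seeded hash per slot
              sig.append(min(hashlib.blake2b(km.encode(), digest_size=8,
                                             salt=salt).digest()
                             for km in kmer_set))
          return sig

      a = "ACGTACGTTGACCTGAAGCTAGCGGAT"
      b = "ACGTACGTTGACCTGAAGCTAGCGGTT"     # one substitution near the end
      sa, sb = minhash(kmers(a)), minhash(kmers(b))
      similarity = sum(x == y for x, y in zip(sa, sb)) / len(sa)
      print(similarity)   # approximates the true Jaccard of the two k-mer sets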

  5. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    PubMed Central

    Popic, Victoria; Batzoglou, Serafim

    2017-01-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party. PMID:28508884

  6. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a, LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate our variants of LSH achieve the robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410

  7. A secure and robust password-based remote user authentication scheme using smart cards for the integrated EPR information system.

    PubMed

    Das, Ashok Kumar

    2015-03-01

    An integrated EPR (Electronic Patient Record) information system provides medical institutions and academia with detailed patient information for making corrective and clinical decisions in order to maintain and analyze patients' health. In such a system, illegal access must be restricted, and information must be protected from theft during transmission over the insecure Internet. Lee et al. proposed an efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. Their scheme is very efficient due to its use of one-way hash functions and bitwise exclusive-or (XOR) operations. However, in this paper, we show that though their scheme is very efficient, it has three security weaknesses: (1) design flaws in the password change phase, (2) failure to protect against privileged insider attacks and (3) lack of formal security verification. We also find that another recently proposed scheme, Wen's scheme, has the same security drawbacks as Lee et al.'s scheme. In order to remedy the security weaknesses found in Lee et al.'s scheme and Wen's scheme, we propose a secure and efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. We show that our scheme is as efficient as Lee et al.'s scheme and Wen's scheme, as it only uses one-way hash functions and bitwise XOR operations. Through the security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme for formal security verification using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks.
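
    A generic sketch of the hash-and-XOR style such schemes rely on (illustrative only, not Lee et al.'s or the proposed protocol), where the plain password is never stored:

        import hashlib
        import os

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        # Registration: the smart card stores a value masked with the server secret.
        server_secret = os.urandom(32)
        user_id, password = b"alice", b"correct horse"
        card_value = xor(h(user_id + server_secret), h(password))

        # Login: the card unmasks with the entered password; the server recomputes.
        def login(entered_password: bytes) -> bool:
            claim = xor(card_value, h(entered_password))
            return claim == h(user_id + server_secret)

        assert login(b"correct horse") and not login(b"wrong password")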

  8. Conventional Cryptography.

    ERIC Educational Resources Information Center

    Wright, Marie A.

    1993-01-01

    Cryptography is the science that renders data unintelligible to prevent its unauthorized disclosure or modification. Presents an application of matrices used in linear transformations to illustrate a cryptographic system. An example is provided. (17 references) (MDH)
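
    A classic example of the matrix-based (Hill) cipher that such lessons illustrate, as a toy sketch over the 26-letter alphabet (not secure for real use):

        # Encrypt letter pairs with an invertible 2x2 key matrix modulo 26.
        KEY = [[3, 3], [2, 5]]          # det = 9, invertible mod 26
        KEY_INV = [[15, 17], [20, 9]]   # inverse of KEY modulo 26

        def apply(matrix, pair):
            return [(matrix[r][0] * pair[0] + matrix[r][1] * pair[1]) % 26 for r in range(2)]

        def transform(text, matrix):
            nums = [ord(c) - ord('A') for c in text]
            out = []
            for i in range(0, len(nums), 2):
                out.extend(apply(matrix, nums[i:i + 2]))
            return "".join(chr(n + ord('A')) for n in out)

        cipher = transform("HELP", KEY)
        print(cipher, transform(cipher, KEY_INV))  # HIAT, then back to HELP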

  9. ARC-2007-ACD07-0184-003

    NASA Image and Video Library

    2007-09-26

    From left: Data Parallel Line Relaxation (DPLR) software team members Kerry Trumble, Deepak Bose and David Hash analyze and predict the extreme environments NASA's space shuttle experiences during its super high-speed reentry into Earth’s atmosphere.

  10. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-12-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring neither computations nor prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
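
    A minimal sketch of the first step (hash-based discretization of streaming sensor data into Markov states; an illustration of the idea, not the authors' code):

        from collections import defaultdict

        def state_of(measurement, resolution=0.5):
            # Quantize the sensor vector, then hash it to a discrete state id.
            return hash(tuple(round(m / resolution) for m in measurement))

        # Count state-to-state transitions to approximate the Markov process.
        transitions = defaultdict(lambda: defaultdict(int))
        stream = [(0.1, 1.9), (0.2, 2.1), (1.4, 0.2), (0.1, 2.0)]
        prev = None
        for m in stream:
            s = state_of(m)
            if prev is not None:
                transitions[prev][s] += 1
            prev = s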

  11. Changes to Quantum Cryptography

    NASA Astrophysics Data System (ADS)

    Sakai, Yasuyuki; Tanaka, Hidema

    Quantum cryptography has become a subject of widespread interest. In particular, quantum key distribution, which provides a secure key agreement by using quantum systems, is believed to be the most important application of quantum cryptography. Quantum key distribution has the potential to achieve an “unconditionally” secure infrastructure. At present, we also have many cryptographic tools based on “modern cryptography”; they are used in an effort to guarantee secure communication over open networks such as the Internet, but their ultimate efficacy is in doubt. Quantum key distribution systems are believed to be close to practical and commercial use. In this paper, we discuss what we should do to apply quantum cryptography to our communications. We also discuss how quantum key distribution can be combined with or used to replace cryptographic tools based on modern cryptography.

  12. Formal Analysis of Key Integrity in PKCS#11

    NASA Astrophysics Data System (ADS)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smart cards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  13. A novel key management solution for reinforcing compliance with HIPAA privacy/security regulations.

    PubMed

    Lee, Chien-Ding; Ho, Kevin I-J; Lee, Wei-Bin

    2011-07-01

    Digitizing medical records facilitates the healthcare process. However, it can also cause serious security and privacy problems, which are the major concern in the Health Insurance Portability and Accountability Act (HIPAA). While various conventional encryption mechanisms can solve some aspects of these problems, they cannot address the illegal distribution of decrypted medical images, which violates the regulations defined in the HIPAA. To protect decrypted medical images from being illegally distributed by an authorized staff member, the model proposed in this paper provides a way to integrate several cryptographic mechanisms. In this model, the malicious staff member can be tracked by a watermarked clue. By combining several well-designed cryptographic mechanisms and developing a key management scheme to facilitate the interoperation among these mechanisms, the risk of illegal distribution can be reduced.

  14. On protection against a bright-pulse attack in the two-pass quantum cryptography system

    NASA Astrophysics Data System (ADS)

    Balygin, K. A.; Klimov, A. N.; Korol'kov, A. V.; Kulik, S. P.; Molotkov, S. N.

    2016-06-01

    The security of keys in quantum cryptography systems, in contrast to mathematical cryptographic algorithms, is guaranteed by fundamental quantum-mechanical laws. However, the cryptographic resistance of such systems, which are distributed physical devices, fundamentally depends on the method of their implementation and particularly on the calibration and control of critical parameters. The most important parameter is the number of photons in quasi-single-photon information states in a communication channel. The sensitivity to a bright-pulse attack has been demonstrated in an explicit form for a number of systems. A method guaranteeing the resistance to such attacks has been proposed and implemented. Furthermore, the relation of physical observables used and obtained at the control of quantum states to the length of final secret keys has been obtained for the first time.

  15. Practical quantum retrieval games

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan Miguel; Karasamanis, Markos; Lütkenhaus, Norbert

    2016-06-01

    Complex cryptographic protocols are often constructed from simpler building blocks. In order to advance quantum cryptography, it is important to study practical building blocks that can be used to develop new protocols. An example is quantum retrieval games (QRGs), which have broad applicability and have already been used to construct quantum money schemes. In this work, we introduce a general construction of quantum retrieval games based on the hidden matching problem and show how they can be implemented in practice using available technology. More precisely, we provide a general method to construct (1-out-of-k) QRGs, proving that their cheating probabilities decrease exponentially in k. In particular, we define QRGs based on coherent states of light, which can be implemented even in the presence of experimental imperfections. Our results constitute a tool in the arsenal of the practical quantum cryptographer.

  16. Efficient and anonymous two-factor user authentication in wireless sensor networks: achieving user anonymity with lightweight sensor computation.

    PubMed

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Han, Sangchul; Kim, Moonseong; Paik, Juryon; Won, Dongho

    2015-01-01

    A smart-card-based user authentication scheme for wireless sensor networks (hereafter referred to as a SCA-WSN scheme) is designed to ensure that only users who possess both a smart card and the corresponding password are allowed to gain access to sensor data and their transmissions. Despite many research efforts in recent years, it remains a challenging task to design an efficient SCA-WSN scheme that achieves user anonymity. The majority of published SCA-WSN schemes use only lightweight cryptographic techniques (rather than public-key cryptographic techniques) for the sake of efficiency, and have been demonstrated to suffer from the inability to provide user anonymity. Some schemes employ elliptic curve cryptography for better security but require sensors with strict resource constraints to perform computationally expensive scalar-point multiplications; despite the increased computational requirements, these schemes do not provide user anonymity. In this paper, we present a new SCA-WSN scheme that not only achieves user anonymity but also is efficient in terms of the computation loads for sensors. Our scheme employs elliptic curve cryptography but restricts its use only to anonymous user-to-gateway authentication, thereby allowing sensors to perform only lightweight cryptographic operations. Our scheme also enjoys provable security in a formal model extended from the widely accepted Bellare-Pointcheval-Rogaway (2000) model to capture the user anonymity property and various SCA-WSN specific attacks (e.g., stolen smart card attacks, node capture attacks, privileged insider attacks, and stolen verifier attacks).

  17. Efficient and Anonymous Two-Factor User Authentication in Wireless Sensor Networks: Achieving User Anonymity with Lightweight Sensor Computation

    PubMed Central

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Han, Sangchul; Kim, Moonseong; Paik, Juryon; Won, Dongho

    2015-01-01

    A smart-card-based user authentication scheme for wireless sensor networks (hereafter referred to as a SCA-WSN scheme) is designed to ensure that only users who possess both a smart card and the corresponding password are allowed to gain access to sensor data and their transmissions. Despite many research efforts in recent years, it remains a challenging task to design an efficient SCA-WSN scheme that achieves user anonymity. The majority of published SCA-WSN schemes use only lightweight cryptographic techniques (rather than public-key cryptographic techniques) for the sake of efficiency, and have been demonstrated to suffer from the inability to provide user anonymity. Some schemes employ elliptic curve cryptography for better security but require sensors with strict resource constraints to perform computationally expensive scalar-point multiplications; despite the increased computational requirements, these schemes do not provide user anonymity. In this paper, we present a new SCA-WSN scheme that not only achieves user anonymity but also is efficient in terms of the computation loads for sensors. Our scheme employs elliptic curve cryptography but restricts its use only to anonymous user-to-gateway authentication, thereby allowing sensors to perform only lightweight cryptographic operations. Our scheme also enjoys provable security in a formal model extended from the widely accepted Bellare-Pointcheval-Rogaway (2000) model to capture the user anonymity property and various SCA-WSN specific attacks (e.g., stolen smart card attacks, node capture attacks, privileged insider attacks, and stolen verifier attacks). PMID:25849359

  18. Secure management of biomedical data with cryptographic hardware.

    PubMed

    Canim, Mustafa; Kantarcioglu, Murat; Malin, Bradley

    2012-01-01

    The biomedical community is increasingly migrating toward research endeavors that are dependent on large quantities of genomic and clinical data. At the same time, various regulations require that such data be shared beyond the initial collecting organization (e.g., an academic medical center). It is of critical importance to ensure that when such data are shared, as well as managed, it is done so in a manner that upholds the privacy of the corresponding individuals and the overall security of the system. In general, organizations have attempted to achieve these goals through deidentification methods that remove explicitly, and potentially, identifying features (e.g., names, dates, and geocodes). However, a growing number of studies demonstrate that deidentified data can be reidentified to named individuals using simple automated methods. As an alternative, it was shown that biomedical data could be shared, managed, and analyzed through practical cryptographic protocols without revealing the contents of any particular record. Yet, such protocols required the inclusion of multiple third parties, which may not always be feasible in the context of trust or bandwidth constraints. Thus, in this paper, we introduce a framework that removes the need for multiple third parties by collocating services to store and to process sensitive biomedical data through the integration of cryptographic hardware. Within this framework, we define a secure protocol to process genomic data and perform a series of experiments to demonstrate that such an approach can be run in an efficient manner for typical biomedical investigations.

  19. Secure Management of Biomedical Data With Cryptographic Hardware

    PubMed Central

    Canim, Mustafa; Kantarcioglu, Murat; Malin, Bradley

    2014-01-01

    The biomedical community is increasingly migrating toward research endeavors that are dependent on large quantities of genomic and clinical data. At the same time, various regulations require that such data be shared beyond the initial collecting organization (e.g., an academic medical center). It is of critical importance to ensure that when such data are shared, as well as managed, it is done so in a manner that upholds the privacy of the corresponding individuals and the overall security of the system. In general, organizations have attempted to achieve these goals through deidentification methods that remove explicitly, and potentially, identifying features (e.g., names, dates, and geocodes). However, a growing number of studies demonstrate that deidentified data can be reidentified to named individuals using simple automated methods. As an alternative, it was shown that biomedical data could be shared, managed, and analyzed through practical cryptographic protocols without revealing the contents of any particular record. Yet, such protocols required the inclusion of multiple third parties, which may not always be feasible in the context of trust or bandwidth constraints. Thus, in this paper, we introduce a framework that removes the need for multiple third parties by collocating services to store and to process sensitive biomedical data through the integration of cryptographic hardware. Within this framework, we define a secure protocol to process genomic data and perform a series of experiments to demonstrate that such an approach can be run in an efficient manner for typical biomedical investigations. PMID:22010157

  20. 15 CFR Supplement No. 8 to Part 742 - Self-Classification Report for Encryption Items

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... forensics (v) Cryptographic accelerator (vi) Data backup and recovery (vii) Database (viii) Disk/drive... (MAN) (xxii) Modem (xxiii) Network convergence or infrastructure n.e.s. (xxiv) Network forensics (xxv...

  1. 15 CFR Supplement No. 8 to Part 742 - Self-Classification Report for Encryption Items

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... forensics (v) Cryptographic accelerator (vi) Data backup and recovery (vii) Database (viii) Disk/drive... (MAN) (xxii) Modem (xxiii) Network convergence or infrastructure n.e.s. (xxiv) Network forensics (xxv...

  2. 15 CFR Supplement No. 8 to Part 742 - Self-Classification Report for Encryption Items

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... forensics (v) Cryptographic accelerator (vi) Data backup and recovery (vii) Database (viii) Disk/drive... (MAN) (xxii) Modem (xxiii) Network convergence or infrastructure n.e.s. (xxiv) Network forensics (xxv...

  3. 15 CFR Supplement No. 8 to Part 742 - Self-Classification Report for Encryption Items

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... forensics (v) Cryptographic accelerator (vi) Data backup and recovery (vii) Database (viii) Disk/drive... (MAN) (xxii) Modem (xxiii) Network convergence or infrastructure n.e.s. (xxiv) Network forensics (xxv...

  4. 10 CFR 25.15 - Access permitted under “Q” or “L” access authorization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Confidential National Security Information including intelligence information, CRYPTO (i.e., cryptographic... official business when the employee has the appropriate level of NRC access authorization and need-to-know...

  5. Running Clubs--A Combinatorial Investigation.

    ERIC Educational Resources Information Center

    Nissen, Phillip; Taylor, John

    1991-01-01

    Presented is a combinatorial problem based on the Hash House Harriers rule which states that the route of the run should not have previously been traversed by the club. Discovered is how many weeks the club can meet before the rule has to be broken. (KR)

  6. InChIKey collision resistance: an experimental testing

    PubMed Central

    2012-01-01

    InChIKey is a 27-character compacted (hashed) version of InChI which is intended for Internet and database searching/indexing and is based on an SHA-256 hash of the InChI character string. The first block of InChIKey encodes molecular skeleton while the second block represents various kinds of isomerism (stereo, tautomeric, etc.). InChIKey is designed to be a nearly unique substitute for the parent InChI. However, a single InChIKey may occasionally map to two or more InChI strings (collision). The appearance of collision itself does not compromise the signature as collision-free hashing is impossible; the only viable approach is to set and keep a reasonable level of collision resistance which is sufficient for typical applications. We tested, in computational experiments, how well the real-life InChIKey collision resistance corresponds to the theoretical estimates expected by design. For this purpose, we analyzed the statistical characteristics of InChIKey for datasets of variable size in comparison to the theoretical statistical frequencies. For the relatively short second block, an exhaustive direct testing was performed. We computed and compared to theory the numbers of collisions for the stereoisomers of Spongistatin I (using the whole set of 67,108,864 isomers and its subsets). For the longer first block, we generated, using custom-made software, InChIKeys for more than 3 × 10^10 chemical structures. The statistical behavior of this block was tested by comparison of experimental and theoretical frequencies for the various four-letter sequences which may appear in the first block body. From the results of our computational experiments we conclude that the observed characteristics of InChIKey collision resistance are in good agreement with theoretical expectations. PMID:23256896
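
    The collision-resistance expectations being tested here follow the birthday bound; a small sketch of the standard approximation (the block length used below is an assumption for illustration, not the paper's exact statistics):

        # Expected collisions when hashing n items into N buckets: roughly n^2 / (2N).
        def expected_collisions(n_items: float, n_buckets: float) -> float:
            return n_items ** 2 / (2 * n_buckets)

        # Illustrative only: assume a hash block of 8 uppercase letters (26^8 values).
        block_space = 26 ** 8
        print(expected_collisions(67_108_864, block_space))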

  7. InChIKey collision resistance: an experimental testing.

    PubMed

    Pletnev, Igor; Erin, Andrey; McNaught, Alan; Blinov, Kirill; Tchekhovskoi, Dmitrii; Heller, Steve

    2012-12-20

    InChIKey is a 27-character compacted (hashed) version of InChI which is intended for Internet and database searching/indexing and is based on an SHA-256 hash of the InChI character string. The first block of InChIKey encodes molecular skeleton while the second block represents various kinds of isomerism (stereo, tautomeric, etc.). InChIKey is designed to be a nearly unique substitute for the parent InChI. However, a single InChIKey may occasionally map to two or more InChI strings (collision). The appearance of collision itself does not compromise the signature as collision-free hashing is impossible; the only viable approach is to set and keep a reasonable level of collision resistance which is sufficient for typical applications. We tested, in computational experiments, how well the real-life InChIKey collision resistance corresponds to the theoretical estimates expected by design. For this purpose, we analyzed the statistical characteristics of InChIKey for datasets of variable size in comparison to the theoretical statistical frequencies. For the relatively short second block, an exhaustive direct testing was performed. We computed and compared to theory the numbers of collisions for the stereoisomers of Spongistatin I (using the whole set of 67,108,864 isomers and its subsets). For the longer first block, we generated, using custom-made software, InChIKeys for more than 3 × 10^10 chemical structures. The statistical behavior of this block was tested by comparison of experimental and theoretical frequencies for the various four-letter sequences which may appear in the first block body. From the results of our computational experiments we conclude that the observed characteristics of InChIKey collision resistance are in good agreement with theoretical expectations.

  8. Privacy-Preserving Patient Similarity Learning in a Federated Environment: Development and Analysis.

    PubMed

    Lee, Junghye; Sun, Jimeng; Wang, Fei; Wang, Shuang; Jun, Chi-Hyuck; Jiang, Xiaoqian

    2018-04-13

    There is an urgent need for the development of global analytic frameworks that can perform analyses in a privacy-preserving federated environment across multiple institutions without privacy leakage. A few studies on the topic of federated medical analysis have been conducted recently with the focus on several algorithms. However, none of them have solved similar patient matching, which is useful for applications such as cohort construction for cross-institution observational studies, disease surveillance, and clinical trials recruitment. The aim of this study was to present a privacy-preserving platform in a federated setting for patient similarity learning across institutions. Without sharing patient-level information, our model can find similar patients from one hospital to another. We proposed a federated patient hashing framework and developed a novel algorithm to learn context-specific hash codes to represent patients across institutions. The similarities between patients can be efficiently computed using the resulting hash codes of corresponding patients. To avoid security attacks based on reverse engineering of the model, we applied homomorphic encryption to patient similarity search in a federated setting. We used sequential medical events extracted from the Multiparameter Intelligent Monitoring in Intensive Care-III database to evaluate the proposed algorithm in predicting the incidence of five diseases independently. Our algorithm achieved averaged areas under the curve of 0.9154 and 0.8012 with balanced and imbalanced data, respectively, in κ-nearest neighbor with κ=3. We also confirmed privacy preservation in similarity search by using homomorphic encryption. The proposed algorithm can help search similar patients across institutions effectively to support federated data analysis in a privacy-preserving manner. ©Junghye Lee, Jimeng Sun, Fei Wang, Shuang Wang, Chi-Hyuck Jun, Xiaoqian Jiang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.04.2018.
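
    A sketch of the similarity step (hash codes compared by Hamming distance for κ-nearest-neighbor search; illustrative, not the published framework):

        def hamming(a: int, b: int) -> int:
            # Number of differing bits between two hash codes.
            return bin(a ^ b).count("1")

        def k_nearest(query_code: int, patient_codes: dict, k: int = 3):
            # Rank patients by Hamming distance to the query's hash code.
            return sorted(patient_codes, key=lambda p: hamming(query_code, patient_codes[p]))[:k]

        codes = {"patient_a": 0b10110010, "patient_b": 0b10110011, "patient_c": 0b01001100}
        print(k_nearest(0b10110110, codes))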

  9. UQlust: combining profile hashing with linear-time ranking for efficient clustering and analysis of big macromolecular data.

    PubMed

    Adamczak, Rafal; Meller, Jarek

    2016-12-28

    Advances in computing have enabled current protein and RNA structure prediction and molecular simulation methods to dramatically increase their sampling of conformational spaces. The quickly growing number of experimentally resolved structures, and databases such as the Protein Data Bank, also calls for large-scale structural similarity analyses to retrieve and classify macromolecular data. Consequently, the computational cost of structure comparison and clustering for large sets of macromolecular structures has become a bottleneck that necessitates further algorithmic improvements and the development of efficient software solutions. uQlust is a versatile and easy-to-use tool for ultrafast ranking and clustering of macromolecular structures. uQlust makes use of structural profiles of proteins and nucleic acids, while combining a linear-time algorithm for implicit comparison of all pairs of models with profile hashing to enable efficient clustering of large data sets with a low memory footprint. In addition to ranking and clustering of large sets of models of the same protein or RNA molecule, uQlust can also be used in conjunction with fragment-based profiles in order to cluster structures of arbitrary length. For example, hierarchical clustering of the entire PDB using profile hashing can be performed on a typical laptop, thus opening an avenue for structural explorations previously limited to dedicated resources. The uQlust package is freely available under the GNU General Public License at https://github.com/uQlust . uQlust represents a drastic reduction in computational complexity and memory requirements with respect to existing clustering and model quality assessment methods for macromolecular structure analysis, while yielding results on par with traditional approaches for both proteins and RNAs.
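
    The profile-hashing idea can be sketched as grouping models by identical profile strings, so that pairwise work scales with the number of hash buckets rather than the number of models (a simplification of uQlust's approach, with made-up profile data):

        from collections import defaultdict

        # Each model is reduced to a coarse structural profile (e.g., a state string).
        models = {"m1": "HHEEC", "m2": "HHEEC", "m3": "HHCCC", "m4": "HHEEC"}

        buckets = defaultdict(list)
        for name, profile in models.items():
            buckets[hash(profile)].append(name)  # identical profiles collide by design

        # Only one representative per bucket needs explicit pairwise comparison.
        for bucket in buckets.values():
            print(bucket)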

  10. Privacy-Preserving Patient Similarity Learning in a Federated Environment: Development and Analysis

    PubMed Central

    Sun, Jimeng; Wang, Fei; Wang, Shuang; Jun, Chi-Hyuck; Jiang, Xiaoqian

    2018-01-01

    Background There is an urgent need for the development of global analytic frameworks that can perform analyses in a privacy-preserving federated environment across multiple institutions without privacy leakage. A few studies on the topic of federated medical analysis have been conducted recently with the focus on several algorithms. However, none of them have solved similar patient matching, which is useful for applications such as cohort construction for cross-institution observational studies, disease surveillance, and clinical trials recruitment. Objective The aim of this study was to present a privacy-preserving platform in a federated setting for patient similarity learning across institutions. Without sharing patient-level information, our model can find similar patients from one hospital to another. Methods We proposed a federated patient hashing framework and developed a novel algorithm to learn context-specific hash codes to represent patients across institutions. The similarities between patients can be efficiently computed using the resulting hash codes of corresponding patients. To avoid security attack from reverse engineering on the model, we applied homomorphic encryption to patient similarity search in a federated setting. Results We used sequential medical events extracted from the Multiparameter Intelligent Monitoring in Intensive Care-III database to evaluate the proposed algorithm in predicting the incidence of five diseases independently. Our algorithm achieved averaged area under the curves of 0.9154 and 0.8012 with balanced and imbalanced data, respectively, in κ-nearest neighbor with κ=3. We also confirmed privacy preservation in similarity search by using homomorphic encryption. Conclusions The proposed algorithm can help search similar patients across institutions effectively to support federated data analysis in a privacy-preserving manner. PMID:29653917

  11. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    PubMed Central

    Jung, Jaewook; Sohn, Gunho; Bang, Kiin; Wichmann, Andreas; Armenakis, Costas; Kada, Martin

    2016-01-01

    A city is a dynamic entity, whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response and autonomous navigation. A concept of continuous city modeling is to progressively reconstruct city models by accommodating their changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using a context-based geometric hashing (CGH) method to align a single image with existing 3D building models. This model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measurement and matching; and (3) estimation of the exterior orientation parameters (EOPs) of a single image. For feature extraction, we propose two types of matching cues: edged corner features representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then finally determined by maximizing the matching cost encoding contextual similarity between matching candidates. Final matched corners are used for adjusting the EOPs of the single airborne image by the least squares method based on collinearity equations. The results show that acceptable accuracy of the EOPs of a single image can be achieved using the proposed registration approach as an alternative to a labor-intensive manual registration process. PMID:27338410

  12. Decision and function problems based on boson sampling

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.; Brougham, Thomas

    2016-07-01

    Boson sampling is a mathematical problem that is strongly believed to be intractable for classical computers, whereas passive linear interferometers can produce samples efficiently. So far, the problem remains a computational curiosity, and the possible usefulness of boson-sampling devices is mainly limited to the proof of quantum supremacy. The purpose of this work is to investigate whether boson sampling can be used as a resource of decision and function problems that are computationally hard, and may thus have cryptographic applications. After the definition of a rather general theoretical framework for the design of such problems, we discuss their solution by means of a brute-force numerical approach, as well as by means of nonboson samplers. Moreover, we estimate the sample sizes required for their solution by passive linear interferometers, and it is shown that they are independent of the size of the Hilbert space.

  13. S-Boxes Based on Affine Mapping and Orbit of Power Function

    NASA Astrophysics Data System (ADS)

    Khan, Mubashar; Azam, Naveed Ahmed

    2015-06-01

    The demand for data security against computational attacks such as algebraic, differential, linear and interpolation attacks has increased as a result of rapid advancement in the field of computation. It is, therefore, necessary to develop cryptosystems which can resist current cryptanalysis and further computational attacks in the future. In this paper, we present a multiple S-box scheme based on affine mapping and the orbit of the power function used in the Advanced Encryption Standard (AES). The proposed technique results in 256 different S-boxes, named orbital S-boxes. Rigorous tests and comparisons are performed to analyse the cryptographic strength of each of the orbital S-boxes. Furthermore, grayscale images are encrypted by using multiple orbital S-boxes. Results and simulations show that the encryption strength of the orbital S-boxes against computational attacks is better than that of existing S-boxes.
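
    For reference, the standard AES S-box on which such orbital constructions build combines inversion in GF(2^8) with an affine mapping; a compact sketch:

        def gf_mul(a: int, b: int) -> int:
            # Multiplication in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1.
            r = 0
            while b:
                if b & 1:
                    r ^= a
                a <<= 1
                if a & 0x100:
                    a ^= 0x11B
                b >>= 1
            return r

        def gf_inv(a: int) -> int:
            return next(x for x in range(1, 256) if gf_mul(a, x) == 1)

        def rotl8(x: int, n: int) -> int:
            return ((x << n) | (x >> (8 - n))) & 0xFF

        def aes_sbox(x: int) -> int:
            inv = gf_inv(x) if x else 0
            # Affine mapping over GF(2), plus the constant 0x63.
            return inv ^ rotl8(inv, 1) ^ rotl8(inv, 2) ^ rotl8(inv, 3) ^ rotl8(inv, 4) ^ 0x63

        print(hex(aes_sbox(0x00)), hex(aes_sbox(0x53)))  # 0x63 0xed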

  14. Short Review on Quantum Key Distribution Protocols.

    PubMed

    Giampouris, Dimitris

    2017-01-01

    Cryptographic protocols and mechanisms are widely investigated under the notion of quantum computing. Quantum cryptography offers particular advantages over classical approaches, whereas in some cases established protocols have to be revisited in order to maintain their functionality. The purpose of this paper is to provide the basic definitions and review the most important theoretical advancements concerning the BB84 and E91 protocols. It also aims to offer a summary of some key developments in the field of quantum key distribution, closely related to the two aforementioned protocols. The main goal of this study is to provide the necessary background information along with a thorough review of the theoretical aspects of QKD, concentrating on specific protocols. The BB84 and E91 protocols have been chosen because most other protocols are similar to these, a fact that makes them important for a general understanding of how the QKD mechanism functions.
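
    A toy simulation of the BB84 sifting step (random bases; keep only rounds where bases match) makes the mechanism concrete; this sketch omits eavesdropping, noise and error correction:

        import random

        random.seed(7)
        n = 16
        alice_bits = [random.randint(0, 1) for _ in range(n)]
        alice_bases = [random.choice("+x") for _ in range(n)]
        bob_bases = [random.choice("+x") for _ in range(n)]

        # Measuring in the matching basis returns Alice's bit; otherwise a random bit.
        bob_bits = [bit if ab == bb else random.randint(0, 1)
                    for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

        # Sifting: publicly compare bases, keep positions where they agree.
        sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
                  if ab == bb]
        assert all(a == b for a, b in sifted)  # no eavesdropper, no noise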

  15. An efficient and secure partial image encryption for wireless multimedia sensor networks using discrete wavelet transform, chaotic maps and substitution box

    NASA Astrophysics Data System (ADS)

    Khan, Muazzam A.; Ahmad, Jawad; Javaid, Qaisar; Saqib, Nazar A.

    2017-03-01

    Wireless Sensor Networks (WSNs) are widely deployed for monitoring physical activity and/or environmental conditions. Data gathered from a WSN is transmitted via the network to a central location for further processing. Numerous applications of WSNs can be found in smart homes, intelligent buildings, health care, energy-efficient smart grids and industrial control systems. In recent years, computer scientists have focused on finding more applications of WSNs in multimedia technologies, i.e. audio, video and digital images. Due to the bulky nature of multimedia data, WSNs process large volumes of multimedia data, which significantly increases computational complexity and hence reduces battery time. With respect to battery-life constraints, image compression combined with secure transmission over a wide-ranged sensor network is an emerging and challenging task in Wireless Multimedia Sensor Networks. Due to the open nature of the Internet, transmission of data must be secured through a process known as encryption. As a result, there has been an intensive demand for decades for schemes that are energy efficient as well as highly secure. In this paper, a discrete wavelet-based partial image encryption scheme using a hashing algorithm, chaotic maps and Hussain's S-box is reported. The plaintext image is compressed via the discrete wavelet transform, and the image is then shuffled column-wise and row-wise via a Piece-wise Linear Chaotic Map (PWLCM) and a Nonlinear Chaotic Algorithm, respectively. To achieve higher security, the initial conditions for the PWLCM are made dependent on a hash function. The permuted image is bitwise XORed with a random matrix generated from an Intertwining Logistic map. To enhance the security further, the final ciphertext is obtained by substituting all elements with Hussain's substitution box. Experimental and statistical results confirm the strength of the anticipated scheme.
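
    A minimal illustration of deriving a chaotic keystream whose starting point depends on a hash of the key (a generic logistic-map sketch, not the paper's PWLCM/S-box pipeline):

        import hashlib

        def chaotic_keystream(key: bytes, length: int) -> bytes:
            # Seed the map's initial condition from a hash of the key.
            digest = hashlib.sha256(key).digest()
            x = int.from_bytes(digest[:8], "big") / 2 ** 64  # x in [0, 1)
            x = min(max(x, 1e-9), 1 - 1e-9)
            stream = []
            for _ in range(length):
                x = 3.99 * x * (1 - x)          # logistic map in its chaotic regime
                stream.append(int(x * 256) % 256)
            return bytes(stream)

        pixels = bytes([12, 200, 45, 90])
        ks = chaotic_keystream(b"secret", len(pixels))
        cipher = bytes(p ^ k for p, k in zip(pixels, ks))
        plain = bytes(c ^ k for c, k in zip(cipher, ks))
        assert plain == pixels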

  16. SPOT: Optimization Tool for Network Adaptable Security

    NASA Astrophysics Data System (ADS)

    Ksiezopolski, Bogdan; Szalachowski, Pawel; Kotulski, Zbigniew

    Recently we have observed the growth of intelligent, often mobile, applications, called e-anything. Implementing these applications requires guaranteeing the security requirements of the cryptographic protocols they use. Traditionally, the protocols have been configured with the strongest possible security mechanisms. Unfortunately, when an application is used on mobile devices, the strongest protection can lead to denial of service for them. The solution to this problem is to introduce quality-of-protection models which scale the protection level depending on the actual threat level. In this article we introduce an application which manages the protection level of processes in the mobile environment. The Security Protocol Optimizing Tool (SPOT) optimizes the cryptographic protocol and defines the protocol version appropriate to the actual threat level. In this article the architecture of SPOT is presented with a detailed description of the included modules.

  17. Network Security via Biometric Recognition of Patterns of Gene Expression

    NASA Technical Reports Server (NTRS)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT (Information Technology) organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time assays of gene expression products.

  18. Network Security via Biometric Recognition of Patterns of Gene Expression

    NASA Technical Reports Server (NTRS)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time expression and assay of gene expression products.

  19. The model of encryption algorithm based on non-positional polynomial notations and constructed on an SP-network

    NASA Astrophysics Data System (ADS)

    Kapalova, N.; Haumen, A.

    2018-05-01

    This paper addresses the structure and properties of a cryptographic information protection algorithm model based on non-positional polynomial notations (NPNs) and constructed on an SP-network. The main task of the research is to increase the cryptographic strength of the algorithm. In the paper, the transformation resulting in the improvement of the cryptographic strength of the algorithm is described in detail. The proposed model is based on an SP-network. The reason for using an SP-network in this model is the conversion properties of these networks. In the encryption process, transformations based on S-boxes and P-boxes are used. It is known that these transformations can withstand cryptanalysis. In addition, the proposed model uses transformations that satisfy the requirements of the "avalanche effect". As a result of this work, a computer program that implements the encryption algorithm model based on the SP-network has been developed.
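
    A minimal sketch of one SP-network round (byte substitution through an S-box, then a bit permutation), illustrating the structure rather than the authors' NPN-based algorithm; the toy S-box and permutation below are placeholders:

        import random

        random.seed(0)
        SBOX = list(range(256))
        random.shuffle(SBOX)                     # toy S-box; real designs are chosen carefully
        PERM = random.sample(range(32), 32)      # bit permutation for a 4-byte block

        def sp_round(block: bytes) -> bytes:
            # Substitution layer: pass each byte through the S-box.
            substituted = [SBOX[b] for b in block]
            # Permutation layer: scatter the 32 bits to new positions.
            bits = int.from_bytes(bytes(substituted), "big")
            out = 0
            for src, dst in enumerate(PERM):
                out |= ((bits >> src) & 1) << dst
            return out.to_bytes(4, "big")

        print(sp_round(b"DATA").hex())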

  20. System of end-to-end symmetric database encryption

    NASA Astrophysics Data System (ADS)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article is devoted to the problem of protecting databases from information leakage that bypasses access control mechanisms. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of the interaction between the information system components using one of the symmetric cryptographic algorithms. For this purpose, a key management method designed for use in a multi-user system has been developed and described, based on a distributed key representation model in which part of the key is stored in the database and the other part is obtained by transforming the user's password. In this case, the key is calculated immediately before the cryptographic transformations and is not stored in memory after the completion of these transformations. Algorithms for registering and authorizing a user, as well as changing his password, are described, and methods for calculating the parts of a key when performing these operations are provided.
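
    A sketch of such a split-key scheme (illustrative; the function names and parameters are assumptions, not the paper's specification), where the working key exists only transiently:

        import hashlib
        import os

        def password_part(password: str, salt: bytes) -> bytes:
            # Derive one key share from the user's password.
            return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

        # Registration: store the salt and a random share; the full key is never stored.
        salt, stored_part = os.urandom(16), os.urandom(32)

        def session_key(password: str) -> bytes:
            # Recombine the shares just before encryption/decryption; discard afterwards.
            pp = password_part(password, salt)
            return bytes(a ^ b for a, b in zip(pp, stored_part))

        key = session_key("user password")   # use immediately, then let it go out of scope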

  1. A survey of noninteractive zero knowledge proof system and its applications.

    PubMed

    Wu, Huixin; Wang, Feng

    2014-01-01

    The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. In particular, a noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. Such systems are widely used in the construction of various types of cryptographic protocols and algorithms because of their good privacy, authentication, and lower interactive complexity. This paper reviews and analyzes the basic principles of noninteractive zero knowledge proof systems, and summarizes the research progress achieved on the following aspects: the definition and related models of noninteractive zero knowledge proof systems, noninteractive zero knowledge proof systems for NP problems, noninteractive statistical and perfect zero knowledge, the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zap, and the specific applications of noninteractive zero knowledge proof systems. The paper also points out future research directions.
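
    A standard route to a noninteractive proof is the Fiat-Shamir transform, where a hash of the transcript replaces the verifier's challenge. A toy Schnorr-style sketch with deliberately tiny, insecure parameters (for illustration only):

        import hashlib

        p, q, g = 23, 11, 2          # toy group: g has order q in Z_p*

        def H(*parts) -> int:
            data = ",".join(str(x) for x in parts).encode()
            return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

        # Prover knows x with y = g^x mod p and proves it in one message (t, s).
        x = 7
        y = pow(g, x, p)
        r = 5                         # must be random and secret in a real implementation
        t = pow(g, r, p)
        c = H(g, y, t)                # Fiat-Shamir: hash replaces the verifier's challenge
        s = (r + c * x) % q

        # Verifier checks the single message against the public value y.
        assert pow(g, s, p) == (t * pow(y, c, p)) % p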

  2. Quantum-noise randomized data encryption for wavelength-division-multiplexed fiber-optic networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corndorf, Eric; Liang Chuang; Kanter, Gregory S.

    2005-06-15

    We demonstrate high-rate randomized data-encryption through optical fibers using the inherent quantum-measurement noise of coherent states of light. Specifically, we demonstrate 650 Mbit/s data encryption through a 10 Gbit/s data-bearing, in-line amplified 200-km-long line. In our protocol, legitimate users (who share a short secret key) communicate using an M-ry signal set while an attacker (who does not share the secret key) is forced to contend with the fundamental and irreducible quantum-measurement noise of coherent states. Implementations of our protocol using both polarization-encoded signal sets as well as polarization-insensitive phase-keyed signal sets are experimentally and theoretically evaluated. Different from the performance criteria for the cryptographic objective of key generation (quantum key-generation), one possible set of performance criteria for the cryptographic objective of data encryption is established and carefully considered.

  3. Quantum cryptographic system with reduced data loss

    DOEpatents

    Lo, H.K.; Chau, H.F.

    1998-03-24

    A secure method for distributing a random cryptographic key with reduced data loss is disclosed. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically. 23 figs.

  4. Min-entropy uncertainty relation for finite-size cryptography

    NASA Astrophysics Data System (ADS)

    Ng, Nelly Huei Ying; Berta, Mario; Wehner, Stephanie

    2012-10-01

    Apart from their foundational significance, entropic uncertainty relations play a central role in proving the security of quantum cryptographic protocols. Of particular interest are therefore relations in terms of the smooth min-entropy for Bennett-Brassard 1984 (BB84) and six-state encodings. The smooth min-entropy H_min^ε(X|B) quantifies the negative logarithm of the probability for an attacker B to guess X, except with a small failure probability ε. Previously, strong uncertainty relations were obtained which are valid in the limit of large block lengths. Here, we prove an alternative uncertainty relation in terms of the smooth min-entropy that is only marginally less strong but has the crucial property that it can be applied to rather small block lengths. This paves the way for a practical implementation of many cryptographic protocols. As part of our proof we show tight uncertainty relations for a family of Rényi entropies that may be of independent interest.

  5. Method for exponentiating in cryptographic systems

    DOEpatents

    Brickell, Ernest F.; Gordon, Daniel M.; McCurley, Kevin S.

    1994-01-01

    An improved cryptographic method utilizing exponentiation is provided which has the advantage of reducing the number of multiplications required to determine the legitimacy of a message or user. The basic method comprises the steps of selecting a key from a preapproved group of integer keys g; exponentiating the key by an integer value e, where e represents a digital signature, to generate a value g^e; transmitting the value g^e to a remote facility by a communications network; receiving the value g^e at the remote facility; and verifying the digital signature as originating from the legitimate user. The exponentiating step comprises the steps of initializing a plurality of memory locations with a plurality of values g^xi; computing … The United States Government has rights in this invention pursuant to Contract No. DE-AC04-76DP00789 between the Department of Energy and AT&T Company.
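
    The flavor of such precomputation-based exponentiation can be sketched as a generic fixed-base method (not the patented algorithm itself), in which stored powers g^(2^i) let g^e be assembled with far fewer multiplications:

        def precompute(g: int, p: int, bits: int):
            # Store g^(2^i) mod p for each bit position i.
            table, cur = [], g % p
            for _ in range(bits):
                table.append(cur)
                cur = cur * cur % p
            return table

        def fixed_base_pow(table, e: int, p: int) -> int:
            # Multiply together the precomputed powers selected by e's set bits.
            result = 1
            for i, t in enumerate(table):
                if (e >> i) & 1:
                    result = result * t % p
            return result

        p, g, e = 1_000_003, 5, 123_456
        table = precompute(g, p, e.bit_length())
        assert fixed_base_pow(table, e, p) == pow(g, e, p)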

  6. Security of subcarrier wave quantum key distribution against the collective beam-splitting attack.

    PubMed

    Miroshnichenko, G P; Kozubov, A V; Gaidash, A A; Gleim, A V; Horoshko, D B

    2018-04-30

    We consider a subcarrier wave quantum key distribution (QKD) system, where quantum encoding is carried out at weak sidebands generated around a coherent optical beam as a result of electro-optical phase modulation. We study the security of two protocols, B92 and BB84, against one of the most powerful attacks for this class of systems, the collective beam-splitting attack. Our analysis includes the case of high modulation index, where the sidebands are essentially multimode. We demonstrate numerically and experimentally that a subcarrier wave QKD system with realistic parameters is capable of distributing cryptographic keys over large distances in the presence of collective attacks. We also show that a BB84 protocol modification with discrimination of only one state in each basis performs no worse than the original BB84 protocol in this class of QKD systems, thus significantly simplifying the development of cryptographic networks using the considered QKD technique.

  7. Pseudonyms for cancer registries.

    PubMed

    Pommerening, K; Miller, M; Schmidtmann, I; Michaelis, J

    1996-06-01

    In order to conform to the rigid German legislation on data privacy and security we developed a new concept of data flow and data storage for population-based cancer registries. A special trusted office generates a pseudonym for each case by a cryptographic procedure. This office also handles the notification of cases and communicates with the reporting physicians. It passes pseudonymous records to the registration office for permanent storage. The registration office links the records according to the pseudonyms. Starting from a requirements analysis we show how to construct the pseudonyms; we then show that they meet the requirements. We discuss how the pseudonyms have to be protected by cryptographic and organizational means. A pilot study showed that the proposed procedure gives acceptable synonym and homonym error rates. The methods described are not restricted to cancer registration and may serve as a model for comparable applications in medical informatics.
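
    The core of such pseudonymization can be sketched as a keyed one-way transform of the identity data, with the key held only inside the trusted office (a generic illustration, not the registry's actual procedure):

        import hashlib
        import hmac

        TRUSTED_OFFICE_KEY = b"held only inside the trusted office"

        def pseudonym(surname: str, given: str, birth_date: str) -> str:
            # Normalize identity fields so trivial spelling variants map to the same input.
            record = "|".join(s.strip().upper() for s in (surname, given, birth_date))
            # Keyed hashing: without the key, pseudonyms cannot be linked back to names.
            return hmac.new(TRUSTED_OFFICE_KEY, record.encode(), hashlib.sha256).hexdigest()

        # The same person reported twice receives the same pseudonym (record linkage).
        assert pseudonym("Meier", "Anna", "1950-02-01") == pseudonym(" meier", "ANNA", "1950-02-01")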

  8. Continuous-variable protocol for oblivious transfer in the noisy-storage model.

    PubMed

    Furrer, Fabian; Gehring, Tobias; Schaffner, Christian; Pacher, Christoph; Schnabel, Roman; Wehner, Stephanie

    2018-04-13

    Cryptographic protocols are the backbone of our information society. This includes two-party protocols which offer protection against distrustful players. Such protocols can be built from a basic primitive called oblivious transfer. We present and experimentally demonstrate here a quantum protocol for oblivious transfer for optical continuous-variable systems, and prove its security in the noisy-storage model. This model allows us to establish security by sending more quantum signals than an attacker can reliably store during the protocol. The security proof is based on uncertainty relations which we derive for continuous-variable systems, that differ from the ones used in quantum key distribution. We experimentally demonstrate in a proof-of-principle experiment the proposed oblivious transfer protocol for various channel losses by using entangled two-mode squeezed states measured with balanced homodyne detection. Our work enables the implementation of arbitrary two-party quantum cryptographic protocols with continuous-variable communication systems.

  9. Security analysis of quadratic phase based cryptography

    NASA Astrophysics Data System (ADS)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Healy, John J.; Sheridan, John T.

    2016-09-01

    The linear canonical transform (LCT) is essential in modeling coherent light field propagation through first-order optical systems. Recently, a generic optical system, known as a Quadratic Phase Encoding System (QPES), for encrypting a two-dimensional (2D) image has been reported, in which the two phase keys together with the individual LCT parameters serve as keys of the cryptosystem. However, it is important that such encryption systems also satisfy some dynamic security properties. Therefore, in this work, we examine some cryptographic evaluation methods, such as the avalanche criterion and bit independence, which indicate the degree of security of the cryptographic algorithms, on QPES. We compare our simulation results with the conventional Fourier and Fresnel transform based DRPE systems. The results show that the LCT-based DRPE has better avalanche and bit-independence characteristics than the conventional Fourier- and Fresnel-based encryption systems.
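
    The avalanche criterion itself is easy to state in code: flip one input bit and measure the fraction of output bits that change (about 50% is ideal). A sketch using a hash function as the stand-in cipher:

        import hashlib

        def avalanche_fraction(data: bytes, bit: int) -> float:
            # Flip a single input bit and compare the two outputs bit by bit.
            flipped = bytearray(data)
            flipped[bit // 8] ^= 1 << (bit % 8)
            h1 = hashlib.sha256(data).digest()
            h2 = hashlib.sha256(bytes(flipped)).digest()
            diff = sum(bin(a ^ b).count("1") for a, b in zip(h1, h2))
            return diff / (8 * len(h1))

        print(avalanche_fraction(b"plaintext block!", bit=3))  # close to 0.5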

  10. Choice of optical system is critical for the security of double random phase encryption systems

    NASA Astrophysics Data System (ADS)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Cassidy, Derek; Zhao, Liang; Ryle, James P.; Healy, John J.; Sheridan, John T.

    2017-06-01

    The linear canonical transform (LCT) is used in modeling a coherent light-field propagation through first-order optical systems. Recently, a generic optical system, known as the quadratic phase encoding system (QPES), for encrypting a two-dimensional image has been reported. In such systems, two random phase keys and the individual LCT parameters (α,β,γ) serve as secret keys of the cryptosystem. It is important that such encryption systems also satisfy some dynamic security properties. We, therefore, examine such systems using two cryptographic evaluation methods, the avalanche effect and the bit independence criterion, which indicate the degree of security of the cryptographic algorithms using QPES. We compared our simulation results with the conventional Fourier and Fresnel transform-based double random phase encryption (DRPE) systems. The results show that the LCT-based DRPE has excellent avalanche and bit-independence characteristics compared to the conventional Fourier- and Fresnel-based encryption systems.

  11. Physical key-protected one-time pad

    PubMed Central

    Horstmeyer, Roarke; Judkewitz, Benjamin; Vellekoop, Ivo M.; Assawaworrarit, Sid; Yang, Changhuei

    2013-01-01

    We describe an encrypted communication principle that forms a secure link between two parties without electronically saving either of their keys. Instead, random cryptographic bits are kept safe within the unique mesoscopic randomness of two volumetric scattering materials. We demonstrate how a shared set of patterned optical probes can generate 10 gigabits of statistically verified randomness between a pair of unique 2 mm^3 scattering objects. This shared randomness is used to facilitate information-theoretically secure communication following a modified one-time pad protocol. Benefits of volumetric physical storage over electronic memory include the inability to probe, duplicate or selectively reset any bits without fundamentally altering the entire key space. Our ability to securely couple the randomness contained within two unique physical objects can extend to strengthen hardware required by a variety of cryptographic protocols, which is currently a critically weak link in the security pipeline of our increasingly mobile communication culture. PMID:24345925
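
    For reference, the one-time pad step that the shared physical randomness feeds is just a bitwise XOR; a minimal sketch:

        import os

        def otp_encrypt(message: bytes, pad: bytes) -> bytes:
            # Information-theoretically secure if the pad is random, secret and never reused.
            assert len(pad) >= len(message)
            return bytes(m ^ k for m, k in zip(message, pad))

        pad = os.urandom(32)                # stands in for bits read from the scatterer
        cipher = otp_encrypt(b"meet at dawn", pad)
        assert otp_encrypt(cipher, pad) == b"meet at dawn"  # XOR is its own inverse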

  12. Quantum cryptographic system with reduced data loss

    DOEpatents

    Lo, Hoi-Kwong; Chau, Hoi Fung

    1998-01-01

    A secure method for distributing a random cryptographic key with reduced data loss. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically.

  13. Semantic Segmentation of Building Elements Using Point Cloud Hashing

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.

    2018-05-01

    For the interpretation of point clouds, the semantic definition of extracted segments from point clouds or images is a common problem. Usually, the semantic of geometrical pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules the buildings could be quite well and simply classified by a human operator (e.g. architect) into different building types and structural elements (dome, nave, transept etc.), including particular building parts which are visually detected. The key part of the procedure is a novel method based on hashing where point cloud projections are transformed into binary pixel representations. A segmentation approach released on the example of classical Orthodox churches is suitable for other buildings and objects characterized through a particular typology in its construction (e.g. industrial objects in standardized enviroments with strict component design allowing clear semantic modelling).

  14. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    DOE PAGES

    Jia, C. J.; Wang, Y.; Mendl, C. B.; ...

    2017-12-02

    Here, we describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the “checkerboard” decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.
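
    One standard way to compute such a position without storing the list is the combinatorial number system: the index of an occupation bitstring among all states with the same particle number follows directly from its set-bit positions. A sketch under that assumption (an illustration of the idea, not necessarily the exact Paradeisos construction):

        from math import comb

        def state_index(occ: int, n_orbitals: int) -> int:
            """Order-preserving perfect hash: index of the occupation
            bitstring `occ` (bit i set = orbital i occupied) among all
            bitstrings of n_orbitals bits with the same popcount,
            listed in increasing integer order."""
            positions = [i for i in range(n_orbitals) if occ >> i & 1]
            return sum(comb(p, j + 1) for j, p in enumerate(positions))

        # The six 4-orbital, 2-particle states map to indices 0..5:
        for s in (0b0011, 0b0101, 0b0110, 0b1001, 0b1010, 0b1100):
            print(bin(s), state_index(s, 4))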

  15. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
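
    For Hamming similarity over binary patterns, the classical LSH family simply samples a fixed subset of bit positions; similar patterns then collide in the same bucket with high probability. A minimal sketch of that idea (the paper's RLE-compressed Hamming computation and parameter choices are not reproduced):

        import random

        rng = random.Random(0)
        DIM, SAMPLE = 64, 12
        idx = rng.sample(range(DIM), SAMPLE)    # fixed sampled positions

        def bucket_key(pattern):
            """LSH key for Hamming distance: the sampled sub-pattern."""
            return tuple(pattern[i] for i in idx)

        patterns = [tuple(rng.getrandbits(1) for _ in range(DIM))
                    for _ in range(1000)]
        buckets = {}
        for p in patterns:
            buckets.setdefault(bucket_key(p), []).append(p)

        target = patterns[0]
        candidates = buckets.get(bucket_key(target), [])  # search one bucket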

  16. Matching Real and Synthetic Panoramic Images Using a Variant of Geometric Hashing

    NASA Astrophysics Data System (ADS)

    Li-Chee-Ming, J.; Armenakis, C.

    2017-05-01

    This work demonstrates an approach to automatically initialize a visual model-based tracker, and recover from lost tracking, without prior camera pose information. These approaches are commonly referred to as tracking-by-detection. Previous tracking-by-detection techniques used either fiducials (i.e. landmarks or markers) or the object's texture. The main contribution of this work is the development of a tracking-by-detection algorithm that is based solely on natural geometric features. A variant of geometric hashing, a model-to-image registration algorithm, is proposed that searches for a matching panoramic image from a database of synthetic panoramic images captured in a 3D virtual environment. The approach identifies corresponding features between the matched panoramic images. The corresponding features are to be used in a photogrammetric space resection to estimate the camera pose. The experiments apply this algorithm to initialize a model-based tracker in an indoor environment using the 3D CAD model of the building.
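
    The core of geometric hashing is a preprocessing pass: every ordered pair of model features defines a basis, the remaining features are expressed in that basis frame, and the quantized coordinates index a hash table. A 2D illustrative sketch (the paper works on panoramic imagery; the function names and grid size here are assumptions):

        import itertools
        from collections import defaultdict

        def build_table(points, grid=0.25):
            """Geometric hashing preprocessing: record, for each basis pair,
            the basis-frame coordinates of all other points. The coordinates
            are invariant to rotation, translation, and scale."""
            table = defaultdict(list)
            for i, j in itertools.permutations(range(len(points)), 2):
                (xi, yi), (xj, yj) = points[i], points[j]
                ux, uy = xj - xi, yj - yi
                n2 = ux * ux + uy * uy
                if n2 == 0:
                    continue
                for k, (xk, yk) in enumerate(points):
                    if k in (i, j):
                        continue
                    dx, dy = xk - xi, yk - yi
                    a = (dx * ux + dy * uy) / n2      # basis-frame coords
                    b = (-dx * uy + dy * ux) / n2
                    table[round(a / grid), round(b / grid)].append((i, j))
            return table

        model = [(0.0, 0.0), (1.0, 0.0), (0.4, 0.7), (0.9, 0.5)]
        table = build_table(model)

    At query time, scene features are expressed in a trial basis and each hash hit casts a vote; a basis accumulating many votes yields the candidate correspondences used for pose estimation.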

  17. Tuple spaces in hardware for accelerated implicit routing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Zachary Kent; Tripp, Justin

    2010-12-01

    Organizing and optimizing data objects on networks with support for data migration and failing nodes is a complicated problem to handle as systems grow. The goal of this work is to demonstrate that high levels of speedup can be achieved by moving responsibility for finding, fetching, and staging data into an FPGA-based network card. We present a system for implicit routing of data via FPGA-based network cards. In this system, data structures are requested by name, and the network of FPGAs finds the data within the network and relays the structure to the requester. This is achieved through successive examination of hardware hash tables implemented in the FPGA. By avoiding software stacks between nodes, the data is quickly fetched entirely through FPGA-FPGA interaction. The performance of this system is orders of magnitude faster than software implementations due to the improved speed of the hash tables and lowered latency between the network nodes.

  18. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClanahan, Richard; De Leon, Phillip L.

    The majority of state-of-the-art speaker recognition systems (SR) utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of the identified systems, the posterior probabilities and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM–UBM which uses Runnalls’ Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we reduce this computation by a factor of 15× while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.

  19. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

    DOE PAGES

    McClanahan, Richard; De Leon, Phillip L.

    2014-08-20

    The majority of state-of-the-art speaker recognition systems (SR) utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of the identified systems, the posterior probabilities and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM–UBM which uses Runnalls’ Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we reduce this computation by a factor of 15× while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.

  20. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, C. J.; Wang, Y.; Mendl, C. B.

    Here, we describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the “checkerboard” decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.

  1. A novel image retrieval algorithm based on PHOG and LSH

    NASA Astrophysics Data System (ADS)

    Wu, Hongliang; Wu, Weimin; Peng, Jiajin; Zhang, Junyuan

    2017-08-01

    PHOG describes the local shape of an image and its spatial relationships. Using the PHOG algorithm to extract image features has achieved good results in image recognition, retrieval, and other applications. In recent years, the locality-sensitive hashing (LSH) algorithm has outperformed traditional algorithms on large-scale data for approximate nearest-neighbor problems. This paper presents a novel image retrieval algorithm based on PHOG and LSH. First, we use PHOG to extract the feature vector of the image; then we use L different LSH hash tables to reduce the PHOG feature to index values and map it to different buckets; finally, we retrieve the images stored in the matching buckets and re-rank them using the Manhattan distance. This algorithm scales to massive image retrieval, maintaining high retrieval accuracy while reducing the time complexity of the search.
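
    Because PHOG features are real-valued, a common choice for such hash tables is SimHash-style random projections: each of the L tables maps a vector to the sign pattern of a few projections, and candidates from the matching buckets are re-ranked exactly. A hedged sketch (the dimension and table sizes are assumptions, not the paper's settings):

        import numpy as np

        rng = np.random.default_rng(0)
        DIM, N_TABLES, N_PLANES = 680, 4, 16   # illustrative sizes

        tables = [rng.standard_normal((N_PLANES, DIM)) for _ in range(N_TABLES)]

        def key(planes, v):
            """Bucket index: sign pattern of the random projections."""
            return tuple((planes @ v > 0).astype(int))

        db = [rng.random(DIM) for _ in range(5000)]
        buckets = [dict() for _ in tables]
        for i, v in enumerate(db):
            for t, planes in enumerate(tables):
                buckets[t].setdefault(key(planes, v), []).append(i)

        query = db[42] + 0.01 * rng.standard_normal(DIM)
        cand = set()
        for t, planes in enumerate(tables):
            cand.update(buckets[t].get(key(planes, query), []))
        best = min(cand, key=lambda i: np.abs(db[i] - query).sum())  # L1 re-rank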

  2. Breaking down the barriers of using strong authentication and encryption in resource constrained embedded systems

    NASA Astrophysics Data System (ADS)

    Knobler, Ron; Scheffel, Peter; Jackson, Scott; Gaj, Kris; Kaps, Jens Peter

    2013-05-01

    Various embedded systems, such as unattended ground sensors (UGS), are deployed in dangerous areas, where they are subject to compromise. Since numerous systems contain a network of devices that communicate with each other (often times with commercial off the shelf [COTS] radios), an adversary is able to intercept messages between system devices, which jeopardizes sensitive information transmitted by the system (e.g. location of system devices). Secret key algorithms such as AES are a very common means to encrypt all system messages to a sufficient security level, for which lightweight implementations exist for even very resource constrained devices. However, all system devices must use the appropriate key to encrypt and decrypt messages from each other. While traditional public key algorithms (PKAs), such as RSA and Elliptic Curve Cryptography (ECC), provide a sufficiently secure means to provide authentication and a means to exchange keys, these traditional PKAs are not suitable for very resource constrained embedded systems or systems which contain low reliability communication links (e.g. mesh networks), especially as the size of the network increases. Therefore, most UGS and other embedded systems resort to pre-placed keys (PPKs) or other naïve schemes which greatly reduce the security and effectiveness of the overall cryptographic approach. McQ has teamed with the Cryptographic Engineering Research Group (CERG) at George Mason University (GMU) to develop an approach using revolutionary cryptographic techniques that provides both authentication and encryption, but on resource constrained embedded devices, without the burden of large amounts of key distribution or storage.

  3. Evaluating privacy-preserving record linkage using cryptographic long-term keys and multibit trees on large medical datasets.

    PubMed

    Brown, Adrian P; Borgs, Christian; Randall, Sean M; Schnell, Rainer

    2017-06-08

    Integrating medical data using databases from different sources by record linkage is a powerful technique increasingly used in medical research. Under many jurisdictions, unique personal identifiers needed for linking the records are unavailable. Since sensitive attributes, such as names, have to be used instead, privacy regulations usually demand encrypting these identifiers. The corresponding set of techniques for privacy-preserving record linkage (PPRL) has received widespread attention. One recent method is based on Bloom filters. Due to superior resilience against cryptographic attacks, composite Bloom filters (cryptographic long-term keys, CLKs) are considered best practice for privacy in PPRL. Real-world performance of these techniques using large-scale data is unknown up to now. Using a large subset of Australian hospital admission data, we tested the performance of an innovative PPRL technique (CLKs using multibit trees) against a gold-standard derived from clear-text probabilistic record linkage. Linkage time and linkage quality (recall, precision and F-measure) were evaluated. Clear text probabilistic linkage resulted in marginally higher precision and recall than CLKs. PPRL required more computing time but 5 million records could still be de-duplicated within one day. However, the PPRL approach required fine tuning of parameters. We argue that increased privacy of PPRL comes with the price of small losses in precision and recall and a large increase in computational burden and setup time. These costs seem to be acceptable in most applied settings, but they have to be considered in the decision to apply PPRL. Further research on the optimal automatic choice of parameters is needed.
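
    The CLK construction itself is compact: the character bigrams of several identifiers are hashed into a single Bloom filter with keyed hash functions, and record pairs are compared by set similarity on the filters. A minimal sketch, assuming HMAC-SHA-256 as the keyed hash and illustrative parameters:

        import hashlib
        import hmac

        def clk(fields, secret: bytes, m=1000, k=20):
            """Cryptographic long-term key: hash the bigrams of all fields
            into one m-bit Bloom filter using k HMAC-derived positions."""
            bits = [0] * m
            for field in fields:
                padded = f"_{field.lower()}_"
                for g in (padded[i:i + 2] for i in range(len(padded) - 1)):
                    for j in range(k):
                        d = hmac.new(secret, f"{j}|{g}".encode(),
                                     hashlib.sha256).digest()
                        bits[int.from_bytes(d[:4], "big") % m] = 1
            return bits

        def dice(a, b):
            """Dice similarity of two Bloom filters."""
            inter = sum(x & y for x, y in zip(a, b))
            return 2 * inter / (sum(a) + sum(b))

        r1 = clk(["john", "smith", "1970-01-01"], b"study-key")
        r2 = clk(["jon", "smith", "1970-01-01"], b"study-key")
        print(dice(r1, r2))   # high despite the typo in the first name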

  4. Secure anonymous mutual authentication for star two-tier wireless body area networks.

    PubMed

    Ibrahim, Maged Hamada; Kumari, Saru; Das, Ashok Kumar; Wazid, Mohammad; Odelu, Vanga

    2016-10-01

    Mutual authentication is a very important service that must be established between sensor nodes in a wireless body area network (WBAN) to ensure the originality and integrity of the patient's data sent by sensors distributed on different parts of the body. However, a mutual authentication service is not enough. An adversary can benefit from monitoring the traffic and knowing which sensor is transmitting the patient's data. Observing the traffic (even without disclosing its content) and knowing its origin can reveal to the adversary information about the patient's medical conditions. Therefore, anonymity of the communicating sensors is an important service as well. Few works have been conducted in the area of mutual authentication among sensor nodes in WBAN. However, none of them has considered anonymity among body sensor nodes. To the best of our knowledge, our protocol is the first attempt to consider this service in a two-tier WBAN. We propose a new secure protocol to realize anonymous mutual authentication and confidential transmission for a star two-tier WBAN topology. The proposed protocol uses simple cryptographic primitives. We prove the security of the proposed protocol using the widely-accepted Burrows-Abadi-Needham (BAN) logic, and also through rigorous informal security analysis. In addition, to demonstrate the practicality of our protocol, we evaluate it using the NS-2 simulator. BAN logic and informal security analysis prove that our proposed protocol achieves the necessary security requirements and goals of an authentication service. The simulation results show the impact on various network parameters, such as end-to-end delay and throughput. The nodes in the network need to store only a few hundred bits. Nodes need to perform very few hash invocations, which are computationally very efficient. The communication cost of the proposed protocol is a few hundred bits in one round of communication. Due to the low computation cost, the energy consumed by the nodes is also low. Our proposed protocol is a lightweight anonymous mutual authentication protocol that mutually authenticates the sensor nodes with the controller node (hub) in a star two-tier WBAN topology. Results show that our protocol is more efficient than previously proposed protocols and, at the same time, achieves the necessary security requirements for a secure anonymous mutual authentication scheme. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. A Novel Fast and Secure Approach for Voice Encryption Based on DNA Computing

    NASA Astrophysics Data System (ADS)

    Kakaei Kate, Hamidreza; Razmara, Jafar; Isazadeh, Ayaz

    2018-06-01

    Today, in the world of information communication, voice information has a particular importance. One way to protect voice data from attacks is voice encryption. Encryption algorithms use various techniques such as hashing, chaotic systems, mixing, and many others. In this paper, an algorithm is proposed for voice encryption based on three different schemes to increase the flexibility and strength of the algorithm. The proposed algorithm uses an innovative encoding scheme, the DNA encryption technique, and a permutation function to provide a secure and fast solution for voice encryption. The algorithm is evaluated based on various measures including signal-to-noise ratio, peak signal-to-noise ratio, correlation coefficient, signal similarity, and signal frequency content. The results demonstrate the applicability of the proposed method for secure and fast encryption of voice files.

  6. Number Theory and Public-Key Cryptography.

    ERIC Educational Resources Information Center

    Lefton, Phyllis

    1991-01-01

    Described are activities in the study of techniques used to conceal the meanings of messages and data. Some background information and two BASIC programs that illustrate the algorithms used in a new cryptographic system called "public-key cryptography" are included. (CW)

  7. Psst, Can You Keep a Secret?

    PubMed

    Vassilev, Apostol; Mouha, Nicky; Brandão, Luís

    2018-01-01

    The security of encrypted data depends not only on the theoretical properties of cryptographic primitives but also on the robustness of their implementations in software and hardware. Threshold cryptography introduces a computational paradigm that enables higher assurance for such implementations.

  8. Classification of Encrypted Web Traffic Using Machine Learning Algorithms

    DTIC Science & Technology

    2013-06-01

    DPI devices to block certain websites; Yu, Cong, Chen, and Lei [52] suggest hashing the domains of pornographic and illegal websites so ISPs can...Zhenming Lei. “Blocking pornographic , illegal websites by internet host domain using FPGA and Bloom Filter”. Network Infrastructure and Digital Content

  9. Psst, Can You Keep a Secret?

    PubMed Central

    Vassilev, Apostol; Mouha, Nicky; Brandão, Luís

    2018-01-01

    The security of encrypted data depends not only on the theoretical properties of cryptographic primitives but also on the robustness of their implementations in software and hardware. Threshold cryptography introduces a computational paradigm that enables higher assurance for such implementations. PMID:29576634

  10. 21 CFR 1311.08 - Incorporation by reference.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of Standards and Technology, Computer Security Division, Information Technology Laboratory, National... standards are available from the National Institute of Standards and Technology, Computer Security Division... 140-2, Security Requirements for Cryptographic Modules, May 25, 2001, as amended by Change Notices 2...

  11. Case Study: OpenSSL 2012 Validation

    DTIC Science & Technology

    2013-08-01

    there are probably millions of users who are impacted directly, and hundreds of millions who are indirectly affected. Cryptographic libraries are...

  12. 76 FR 7817 - Announcing Draft Federal Information Processing Standard 180-4, Secure Hash Standard, and Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... before May 12, 2011. ADDRESSES: Written comments may be sent to: Chief, Computer Security Division... FURTHER INFORMATION CONTACT: Elaine Barker, Computer Security Division, National Institute of Standards... Quynh Dang, Computer Security Division, National Institute of Standards and Technology, Gaithersburg, MD...

  13. Implementation and Use of the Reference Analytics Module of LibAnswers

    ERIC Educational Resources Information Center

    Flatley, Robert; Jensen, Robert Bruce

    2012-01-01

    Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…

  14. TESS: a geometric hashing algorithm for deriving 3D coordinate templates for searching structural databases. Application to enzyme active sites.

    PubMed Central

    Wallace, A. C.; Borkakoti, N.; Thornton, J. M.

    1997-01-01

    It is well established that sequence templates such as those in the PROSITE and PRINTS databases are powerful tools for predicting the biological function and tertiary structure for newly derived protein sequences. The number of X-ray and NMR protein structures is increasing rapidly and it is apparent that a 3D equivalent of the sequence templates is needed. Here, we describe an algorithm called TESS that automatically derives 3D templates from structures deposited in the Brookhaven Protein Data Bank. While a new sequence can be searched for sequence patterns, a new structure can be scanned against these 3D templates to identify functional sites. As examples, 3D templates are derived for enzymes with an O-His-O "catalytic triad" and for the ribonucleases and lysozymes. When these 3D templates are applied to a large data set of nonidentical proteins, several interesting hits are located. This suggests that the development of a 3D template database may help to identify the function of new protein structures, if unknown, as well as to design proteins with specific functions. PMID:9385633

  15. New public key cryptosystem based on quaternions

    NASA Astrophysics Data System (ADS)

    Durcheva, Mariana; Karailiev, Kristian

    2017-12-01

    Quaternions are not commonly used in cryptography. Nevertheless, the noncommutativity of their multiplication makes them suitable for cryptographic purposes. In this paper we suggest a Diffie-Hellman like cryptosystem based on the the quaternions. Additionally, a computer realization of the protocol is given.

  16. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    PubMed

    Kish, Laszlo B; Abbott, Derek; Granqvist, Claes G

    2013-01-01

    Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and under active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  17. Critical Analysis of the Bennett–Riedel Attack on Secure Cryptographic Key Distributions via the Kirchhoff-Law–Johnson-Noise Scheme

    PubMed Central

    Kish, Laszlo B.; Abbott, Derek; Granqvist, Claes G.

    2013-01-01

    Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law–Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and under active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged. PMID:24358129

  18. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient, without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, it can be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability, possessing the ability to generate the same cryptographically secure sequence on different machines and at different times, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
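
    The randomization step is a plain seeded Fisher-Yates shuffle, and the inverse replays the same swap sequence in reverse. A sketch using Python's random.Random as a stand-in for the cryptographically secure seeded generator the article calls for:

        import random

        def randomize(data: bytearray, seed: bytes) -> None:
            """In-place Fisher-Yates shuffle driven by a seeded generator;
            the same seed always regenerates the same permutation."""
            rng = random.Random(seed)
            for i in range(len(data) - 1, 0, -1):
                j = rng.randrange(i + 1)          # unbiased index in [0, i]
                data[i], data[j] = data[j], data[i]

        def derandomize(data: bytearray, seed: bytes) -> None:
            """Invert the shuffle by replaying the swaps in reverse order."""
            rng = random.Random(seed)
            swaps = [(i, rng.randrange(i + 1))
                     for i in range(len(data) - 1, 0, -1)]
            for i, j in reversed(swaps):
                data[i], data[j] = data[j], data[i]

        buf = bytearray(b"sensitive payload")
        randomize(buf, b"shared-seed")
        derandomize(buf, b"shared-seed")
        assert buf == bytearray(b"sensitive payload")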

  19. Resource-Efficient Data-Intensive System Designs for High Performance and Capacity

    DTIC Science & Technology

    2015-09-01

    [9] Anirudh Badam, KyoungSoo Park, Vivek S. Pai, and Larry L. Peterson. HashCache: cache storage for the next billion. In Proc...Jeffrey Dean, Sanjay Ghemawat, Wilson C. Hsieh, Deborah A. Wallach, Mike Burrows, Tushar Chandra, Andrew Fikes, and Robert E. Gruber. Bigtable: A

  20. Two Improved Access Methods on Compact Binary (CB) Trees.

    ERIC Educational Resources Information Center

    Shishibori, Masami; Koyama, Masafumi; Okada, Makoto; Aoe, Jun-ichi

    2000-01-01

    Discusses information retrieval and the use of binary trees as a fast access method for search strategies such as hashing. Proposes new methods based on compact binary trees that provide faster access and more compact storage, explains the theoretical basis, and confirms the validity of the methods through empirical observations. (LRW)

  1. Malware Memory Analysis for Non specialists: Investigating Publicly Available Memory Images for Prolaco and SpyEye

    DTIC Science & Technology

    2013-10-01

    1_doc_RCData_612|virus|Trojan|rootkit|worm|Prolaco|rundll|msiexec|google|wmimngr|jusche d|wfmngr|wupmgr| java |wpmgr|nvscpapisvr)’ ” results in the...hashdump Dumps passwords hashes (LM/NTLM) from memory hibinfo Dump hibernation file information hivedump Prints out a hive hivelist Print list of

  2. 29 CFR 570.61 - Occupations in the operation of power-driven meat-processing machines and occupations involving...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... machines. (3) All occupations involved in tankage or rendering of dead animals, animal offal, animal fats..., and hashing machines; and presses (except belly-rolling machines). Except, the provisions of this.... Rendering plants means establishments engaged in the conversion of dead animals, animal offal, animal fats...

  3. Compact modalities for forward-error correction

    NASA Astrophysics Data System (ADS)

    Fang, Dejian

    2013-10-01

    Hash tables [1] must work. In fact, few leading analysts would disagree with the refinement of thin clients. In our research, we disprove not only that the infamous read-write algorithm for the exploration of object-oriented languages by W. White et al. is NP-complete, but that the same is true for the lookaside buffer.

  4. 77 FR 13294 - Announcing Approval of Federal Information Processing Standard (FIPS) Publication 180-4, Secure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-06

    ... hash algorithms in many computer network applications. On February 11, 2011, NIST published a notice in... Information Security Management Act (FISMA) of 2002 (Pub. L. 107-347), the Secretary of Commerce is authorized to approve Federal Information Processing Standards (FIPS). NIST activities to develop computer...

  5. Scratch Nights and Hash-Tag Chats: Creative Tools to Enhance Choreography in the Higher Education Dance Curriculum

    ERIC Educational Resources Information Center

    Kelsey, Louise; Uytterhoeven, Lise

    2017-01-01

    This paper reports on a focused collaborative learning and teaching research project between the Dance Department at Middlesex University and partner institution London Studio Centre. Informed by Belinda Allen's research on creative curriculum design, dance students and lecturers shared innovative learning opportunities to enhance the development…

  6. Security in Wireless Sensor Networks Employing MACGSP6

    ERIC Educational Resources Information Center

    Nitipaichit, Yuttasart

    2010-01-01

    Wireless Sensor Networks (WSNs) have unique characteristics which constrain them; including small energy stores, limited computation, and short range communication capability. Most traditional security algorithms use cryptographic primitives such as Public-key cryptography and are not optimized for energy usage. Employing these algorithms for the…

  7. A Multi-Threaded Cryptographic Pseudorandom Number Generator Test Suite

    DTIC Science & Technology

    2016-09-01

    bitcoin thieves, Google releases patch. (2013, Aug. 16). SiliconANGLE. [Online]. Available: http://siliconangle.com/blog/2013/ 08/16/android-crypto-prng...flaw-aided- bitcoin -thieves-google-releases-patch/ [5] M. Gondree. (2014, Sep. 28). NPS POSIX thread pool library. [Online]. Available: https

  8. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... language. 352.239-71 Section 352.239-71 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES... Information Processing Standard (FIPS) 140-2-compliant encryption (Security Requirements for Cryptographic Module, as amended) to protect all instances of HHS sensitive information during storage and transmission...

  9. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  10. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  11. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  12. Data Recovery of Distributed Hash Table with Distributed-to-Distributed Data Copy

    NASA Astrophysics Data System (ADS)

    Doi, Yusuke; Wakayama, Shirou; Ozaki, Satoshi

    To realize huge-scale information services, many Distributed Hash Table (DHT) based systems have been proposed. For example, there are some proposals to manage item-level product traceability information with DHTs. In such an application, each entry of a huge number of item-level IDs needs to be available on a DHT. To ensure data availability, the soft-state approach has been employed in previous works. However, this does not scale well with the number of entries on a DHT. As we expect 10^10 products in the traceability case, the soft-state approach is unacceptable. In this paper, we propose Distributed-to-Distributed Data Copy (D3C). With D3C, users can reconstruct the data as they detect data loss, or even migrate to another DHT system. We show why it scales well with the number of entries on a DHT. We have confirmed our approach with a prototype. Evaluation shows our approach fits well on a DHT with a low rate of failure and a huge number of data entries.

  13. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n^2). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
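
    The hash-table refinement phase can be pictured with a Moore-style sketch: each state's signature (its current block plus the blocks reached on each symbol) is used as a dictionary key, so states with equal signatures land in the same new block without pairwise transition comparisons. The paper's backward-depth coarse partition (its first phase) is omitted here:

        def minimize(states, alphabet, delta, accepting):
            """Partition refinement with hashed signatures: `delta[s][a]`
            is the successor of state s on symbol a. Returns a mapping
            from each state to its minimized-block id."""
            block = {s: int(s in accepting) for s in states}
            while True:
                sig = {s: (block[s],
                           tuple(block[delta[s][a]] for a in alphabet))
                       for s in states}
                ids = {v: i for i, v in enumerate(sorted(set(sig.values())))}
                new_block = {s: ids[sig[s]] for s in states}
                if len(ids) == len(set(block.values())):
                    return new_block              # partition is stable
                block = new_block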

  14. A novel and lightweight system to secure wireless medical sensor networks.

    PubMed

    He, Daojing; Chan, Sammy; Tang, Shaohua

    2014-01-01

    Wireless medical sensor networks (MSNs) are a key enabling technology in e-healthcare that allows the data of a patient's vital body parameters to be collected by wearable or implantable biosensors. However, the security and privacy protection of the collected data is a major unsolved issue, with challenges coming from the stringent resource constraints of MSN devices and the high demand for both security/privacy and practicality. In this paper, we propose a lightweight and secure system for MSNs. The system employs a hash-chain-based key updating mechanism and a proxy-protected signature technique to achieve efficient secure transmission and fine-grained data access control. Furthermore, we extend the system to provide backward secrecy and privacy preservation. Our system only requires symmetric-key encryption/decryption and hash operations and is thus suitable for low-power sensor nodes. This paper also reports the experimental results of the proposed system in a network of resource-limited motes and laptop PCs, which show its efficiency in practice. To the best of our knowledge, this is the first secure data transmission and access control system for MSNs to date.
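
    A hash-chain key update is essentially a one-liner: each epoch's key is the hash of the previous one, so a node that erases old keys after updating cannot have them recovered from a later compromise, because the hash is one-way. A minimal sketch with SHA-256 (the paper does not name its hash function; that choice is an assumption):

        import hashlib

        def next_key(k: bytes) -> bytes:
            """Key update k_{i+1} = H(k_i); one-wayness of H protects
            keys from earlier epochs once they are deleted."""
            return hashlib.sha256(k).digest()

        k = b"initial shared secret"
        for epoch in range(3):
            k = next_key(k)     # sensor and server advance in lockstep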

  15. An incremental community detection method for social tagging systems using locality-sensitive hashing.

    PubMed

    Wu, Zhenyu; Zou, Ming

    2014-10-01

    An increasing number of users interact, collaborate, and share information through social networks. Unprecedented growth in social networks is generating a significant amount of unstructured social data. From such data, distilling communities where users have common interests and tracking variations of users' interests over time are important research tracks in fields such as opinion mining, trend prediction, and personalized services. However, these tasks are extremely difficult considering the highly dynamic characteristics of the data. Existing community detection methods are time consuming, making it difficult to process data in real time. In this paper, dynamic unstructured data is modeled as a stream. Tag assignments stream clustering (TASC), an incremental scalable community detection method, is proposed based on locality-sensitive hashing. Both tags and latent interactions among users are incorporated in the method. In our experiments, the social dynamic behaviors of users are first analyzed. The proposed TASC method is then compared with state-of-the-art clustering methods such as StreamKmeans and incremental k-clique; results indicate that TASC can detect communities more efficiently and effectively. Copyright © 2014 Elsevier Ltd. All rights reserved.
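
    For set-valued data such as tag assignments, a natural locality-sensitive sketch is MinHash: users with similar tag sets agree on a large fraction of signature entries, so bucketing signature bands finds candidate community members without all-pairs comparison. An illustrative sketch of that idea (not the paper's exact TASC hashing):

        import zlib

        def minhash(tags, n_hashes=64):
            """MinHash signature: for each salted hash, keep the minimum
            value over the tag set; P(entry match) ~= Jaccard similarity."""
            return [min(zlib.crc32(f"{i}:{t}".encode()) for t in tags)
                    for i in range(n_hashes)]

        a = minhash({"python", "hashing", "streams"})
        b = minhash({"python", "hashing", "clustering"})
        sim = sum(x == y for x, y in zip(a, b)) / len(a)   # estimated Jaccard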

  16. SnapDock—template-based docking by Geometric Hashing

    PubMed Central

    Estrin, Michael; Wolfson, Haim J.

    2017-01-01

    Abstract Motivation: A highly efficient template-based protein–protein docking algorithm, nicknamed SnapDock, is presented. It employs a Geometric Hashing-based structural alignment scheme to align the target proteins to the interfaces of non-redundant protein–protein interface libraries. Docking of a pair of proteins utilizing the 22 600 interface PIFACE library is performed in < 2 min on the average. A flexible version of the algorithm allowing hinge motion in one of the proteins is presented as well. Results: To evaluate the performance of the algorithm a blind re-modelling of 3547 PDB complexes, which have been uploaded after the PIFACE publication has been performed with success ratio of about 35%. Interestingly, a similar experiment with the template free PatchDock docking algorithm yielded a success rate of about 23% with roughly 1/3 of the solutions different from those of SnapDock. Consequently, the combination of the two methods gave a 42% success ratio. Availability and implementation: A web server of the application is under development. Contact: michaelestrin@gmail.com or wolfson@tau.ac.il PMID:28881968

  17. An effective biometric discretization approach to extract highly discriminative, informative, and privacy-protective binary representation

    NASA Astrophysics Data System (ADS)

    Lim, Meng-Hui; Teoh, Andrew Beng Jin

    2011-12-01

    Biometric discretization derives a binary string for each user based on an ordered set of biometric features. This representative string ought to be discriminative, informative, and privacy protective when it is employed as a cryptographic key in various security applications upon error correction. However, it is commonly believed that satisfying the first and the second criteria simultaneously is not feasible, and a tradeoff between them is always definite. In this article, we propose an effective fixed bit allocation-based discretization approach which involves discriminative feature extraction, discriminative feature selection, unsupervised quantization (quantization that does not utilize class information), and linearly separable subcode (LSSC)-based encoding to fulfill all the ideal properties of a binary representation extracted for cryptographic applications. In addition, we examine a number of discriminative feature-selection measures for discretization and identify the proper way of setting an important feature-selection parameter. Encouraging experimental results vindicate the feasibility of our approach.

  18. Too good to be true: when overwhelming evidence fails to convince.

    PubMed

    Gunn, Lachlan J; Chapeau-Blondeau, François; McDonnell, Mark D; Davis, Bruce R; Allison, Andrew; Abbott, Derek

    2016-03-01

    Is it possible for a large sequence of measurements or observations, which support a hypothesis, to counterintuitively decrease our confidence? Can unanimous support be too good to be true? The assumption of independence is often made in good faith; however, rarely is consideration given to whether a systemic failure has occurred. Taking this into account can cause certainty in a hypothesis to decrease as the evidence for it becomes apparently stronger. We perform a probabilistic Bayesian analysis of this effect with examples based on (i) archaeological evidence, (ii) weighing of legal evidence and (iii) cryptographic primality testing. In this paper, we investigate the effects of small error rates in a set of measurements or observations. We find that even with very low systemic failure rates, high confidence is surprisingly difficult to achieve; in particular, we find that certain analyses of cryptographically important numerical tests are highly optimistic, underestimating their false-negative rate by as much as a factor of 2^80.
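
    The effect is easy to reproduce with a toy Bayesian model: once a test is allowed a tiny probability of being systemically broken (passing everything), additional unanimous passes stop adding confidence, which saturates well below certainty. The numbers below are illustrative assumptions, not the paper's exact model:

        def posterior(n_pass, p_rand=0.5, p_sys=1e-4, prior=0.5):
            """P(hypothesis true | n unanimous passes), where with
            probability p_sys the test is broken and always passes, and
            an honest test passes a false hypothesis with prob p_rand."""
            like_true = 1.0                     # true hypotheses always pass
            like_false = (1 - p_sys) * p_rand ** n_pass + p_sys
            num = prior * like_true
            return num / (num + (1 - prior) * like_false)

        for n in (1, 10, 30, 100):
            print(n, posterior(n))   # saturates near 0.9999, never reaches 1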

  19. Low-power cryptographic coprocessor for autonomous wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Olszyna, Jakub; Winiecki, Wiesław

    2013-10-01

    The concept of autonomous wireless sensor networks involves energy harvesting, as well as effective management of system resources. Public-key cryptography (PKC) offers the advantage of elegant key agreement schemes with which a secret key can be securely established over unsecure channels. In addition to solving the key management problem, the other major application of PKC is digital signatures, with which non-repudiation of message exchanges can be achieved. The motivation for studying low-power and area-efficient modular arithmetic algorithms comes from enabling public-key security for low-power devices that operate in constrained environments such as autonomous wireless sensor networks. This paper presents a cryptographic coprocessor tailored to the constraints of autonomous wireless sensor networks. Such a hardware circuit is aimed at supporting the implementation of different public-key cryptosystems based on modular arithmetic in GF(p) and GF(2^m). Key components of the coprocessor are described as GEZEL models and can be easily transformed to VHDL and implemented in hardware.

  20. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    PubMed Central

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  1. Geometric data perturbation-based personal health record transactions in cloud computing.

    PubMed

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
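
    Geometric data perturbation composes a random rotation, a random translation, and additive noise, so pairwise geometric relations that many mining tasks rely on are approximately preserved while individual values are hidden. A minimal numpy sketch under assumed parameters (the papers' exact transform is not specified here):

        import numpy as np

        rng = np.random.default_rng(42)

        def perturb(records: np.ndarray, noise_scale=0.05) -> np.ndarray:
            """Rotate by a random orthogonal matrix, translate, add noise."""
            n, d = records.shape
            q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random rotation
            t = rng.standard_normal(d)                        # translation
            return records @ q + t + noise_scale * rng.standard_normal((n, d))

        outsourced = perturb(rng.random((100, 8)))   # what the cloud stores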

  2. Provably Secure Heterogeneous Access Control Scheme for Wireless Body Area Network.

    PubMed

    Omala, Anyembe Andrew; Mbandu, Angolo Shem; Mutiria, Kamenyi Domenic; Jin, Chunhua; Li, Fagen

    2018-04-28

    Wireless body area network (WBAN) provides a medium through which physiological information can be harvested and transmitted to an application provider (AP) in real time. Integrating WBAN in a heterogeneous Internet of Things (IoT) ecosystem would enable an AP to monitor patients from anywhere at any time. However, the IoT roadmap of interconnected 'Things' is still faced with many challenges. One of the challenges in healthcare is the security and privacy of streamed medical data from heterogeneously networked devices. In this paper, we first propose a heterogeneous signcryption scheme where the sender is in a certificateless cryptographic (CLC) environment while the receiver is in an identity-based cryptographic (IBC) environment. We then use this scheme to design a heterogeneous access control protocol. A formal security proof for indistinguishability against adaptive chosen ciphertext attack and unforgeability against adaptive chosen message attack in the random oracle model is presented. In comparison with some of the existing access control schemes, our scheme has lower computation and communication costs.

  3. A Cryptographic SoC for Robust Protection of Secret Keys in IPTV DRM Systems

    NASA Astrophysics Data System (ADS)

    Lee, Sanghan; Yang, Hae-Yong; Yeom, Yongjin; Park, Jongsik

    The security level of an internet protocol television (IPTV) digital right management (DRM) system ultimately relies on protection of secret keys. Well known devices for the key protection include smartcards and battery backup SRAMs (BB-SRAMs); however, these devices could be vulnerable to various physical attacks. In this paper, we propose a secure and cost-effective design of a cryptographic system on chip (SoC) that integrates the BB-SRAM with a cell-based design technique. The proposed SoC provides robust safeguard against the physical attacks, and satisfies high-speed and low-price requirements of IPTV set-top boxes. Our implementation results show that the maximum encryption rate of the SoC is 633Mb/s. In order to verify the data retention capabilities, we made a prototype chip using 0.18µm standard cell technology. The experimental results show that the integrated BB-SRAM can reliably retain data with a 1.4µA leakage current.

  4. Semi-quantum Secure Direct Communication Scheme Based on Bell States

    NASA Astrophysics Data System (ADS)

    Xie, Chen; Li, Lvzhou; Situ, Haozhen; He, Jianhao

    2018-06-01

    Recently, the idea of semi-quantumness has often been used in designing quantum cryptographic schemes, which allows some of the participants of a quantum cryptographic scheme to remain classical. One of the reasons why this idea is popular is that it allows a quantum information processing task to be accomplished using as few quantum resources as possible. In this paper, we extend the idea to quantum secure direct communication (QSDC) by proposing a semi-quantum secure direct communication scheme. In the scheme, the message sender, Alice, encodes each bit into a Bell state |φ⁺⟩ = (1/√2)(|00⟩ + |11⟩) or |ψ⁺⟩ = (1/√2)(|01⟩ + |10⟩), while the message receiver, Bob, is classical in the sense that he can either let the qubit he receives reflect undisturbed, or measure the qubit in the computational basis {|0⟩, |1⟩} and then resend it in the state he found. Moreover, the security analysis of our scheme is also given.

  5. A Survey of Noninteractive Zero Knowledge Proof System and Its Applications

    PubMed Central

    Wu, Huixin; Wang, Feng

    2014-01-01

    The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. Among its variants, the noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interactive complexity. This paper reviews and analyzes the basic principles of the noninteractive zero knowledge proof system, and summarizes the research progress achieved on the following aspects: the definition and related models of the noninteractive zero knowledge proof system, noninteractive zero knowledge proof systems for NP problems, noninteractive statistical and perfect zero knowledge, the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps, and the specific applications of the noninteractive zero knowledge proof system. This paper also points out future research directions. PMID:24883407

  6. Quantum Cryptography for Secure Communications to Low-Earth Orbit Satellites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.

    1999-06-03

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Quantum cryptography is an emerging technology in which two parties may simultaneously generate shared, secret cryptographic key material using the transmission of quantum states of light. The security of these transmissions is based on the inviolability of the laws of quantum mechanics. An adversary can neither successfully tap the quantum transmissions, nor evade detection. Key material is built up using the transmission of a single photon per bit. We have developed an experimental quantum cryptography system based on the transmission of non-orthogonal single-photon polarization states to generate shared key material over line-of-sight optical links. Our results provide strong evidence that cryptographic key material could be generated on demand between a ground station and a satellite (or between two satellites), allowing a satellite to be securely re-keyed while in orbit.

  7. QKD-Based Secured Burst Integrity Design for Optical Burst Switched Networks

    NASA Astrophysics Data System (ADS)

    Balamurugan, A. M.; Sivasubramanian, A.; Parvathavarthini, B.

    2016-03-01

    The field of optical transmission has undergone numerous advancements and is still being researched, mainly because optical data transmission can be done at enormous speeds. It is quite evident that people prefer optical communication when it comes to transmitting large amounts of data. The concept of switching in networks has matured enormously through research on architectures and implementation methods, progressing from optical circuit switching to optical burst switching. Optical burst switching is regarded as a viable solution for switching bursts over networks but has several security vulnerabilities. This work investigated the security issues associated with optical burst switching with respect to burst integrity. The proposed Quantum Key based Secure Hash Algorithm (QKBSHA-512), with an enhanced compression function design, provides a better avalanche effect than the conventional integrity algorithms.
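
    Operationally, burst integrity of this kind amounts to keying a hash with QKD-derived material and verifying the tag at the receiving node. Since the QKBSHA-512 compression-function changes are not specified here, HMAC-SHA-512 stands in for it in this sketch:

        import hashlib
        import hmac

        def tag_burst(qkd_key: bytes, burst: bytes) -> bytes:
            """Sender side: authenticate the burst with a keyed hash."""
            return hmac.new(qkd_key, burst, hashlib.sha512).digest()

        def verify_burst(qkd_key: bytes, burst: bytes, tag: bytes) -> bool:
            """Receiver side: constant-time comparison of the recomputed tag."""
            return hmac.compare_digest(tag_burst(qkd_key, burst), tag)

        key = b"\x01" * 64                 # placeholder for a QKD-derived key
        t = tag_burst(key, b"optical burst payload")
        assert verify_burst(key, b"optical burst payload", t)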

  8. Research on target tracking algorithm based on spatio-temporal context

    NASA Astrophysics Data System (ADS)

    Li, Baiping; Xu, Sanmei; Kang, Hongjuan

    2017-07-01

    In this paper, a novel target tracking algorithm based on spatio-temporal context is proposed. During tracking, camera shake or occlusion may cause tracking to fail; the proposed algorithm solves this problem effectively. The method uses the spatio-temporal context algorithm as its core. The target region in the first frame is selected with the mouse, and the spatio-temporal context algorithm then tracks the target through the sequence of frames. During this process, a similarity measure function based on a perceptual hash algorithm judges the tracking results. If tracking fails, the initial value of the Mean Shift algorithm is reset for subsequent target tracking. Experimental results show that the proposed algorithm achieves real-time, stable tracking under camera shake or target occlusion.
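    The abstract does not specify which perceptual hash is used; the sketch below illustrates the general idea with the common average-hash ("aHash") variant. It assumes the Pillow imaging library, the frame file names are hypothetical, and the Hamming distance between fingerprints serves as the similarity measure that flags tracking failure.

```python
from PIL import Image  # assumes Pillow is installed

def average_hash(img: Image.Image, size: int = 8) -> int:
    """aHash: downscale to size x size grayscale, threshold each pixel at
    the mean, and pack the resulting bits into an integer fingerprint."""
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

template  = average_hash(Image.open("target_frame0.png"))  # hypothetical files
candidate = average_hash(Image.open("target_frameN.png"))
# Large distance between template and tracked patch -> declare failure
print("tracking failed" if hamming(template, candidate) > 10 else "tracking ok")
```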

  9. Asymmetric cryptography based on wavefront sensing.

    PubMed

    Peng, Xiang; Wei, Hengzheng; Zhang, Peng

    2006-12-15

    A system of asymmetric cryptography based on wavefront sensing (ACWS) is proposed for the first time to our knowledge. One of the most significant features of asymmetric cryptography is that a trapdoor one-way function is required; here it is constructed by analogy to wavefront sensing, in which the public key may be derived from optical parameters, such as the wavelength or the focal length, while the private key may be obtained from a kind of regular point array. The ciphertext is generated by the encoded wavefront and represented with an irregular array. In such an ACWS system, the encryption key is not identical to the decryption key, which is another important feature of an asymmetric cryptographic system. The processes of asymmetric encryption and decryption are formulated mathematically and demonstrated with a set of numerical experiments.

  10. Implementing a High-Assurance Smart-Card OS

    NASA Astrophysics Data System (ADS)

    Karger, Paul A.; Toll, David C.; Palmer, Elaine R.; McIntosh, Suzanne K.; Weber, Samuel; Edwards, Jonathan W.

    Building a high-assurance, secure operating system for memory constrained systems, such as smart cards, introduces many challenges. The increasing power of smart cards has made their use feasible in applications such as electronic passports, military and public sector identification cards, and cell-phone based financial and entertainment applications. Such applications require a secure environment, which can only be provided with sufficient hardware and a secure operating system. We argue that smart cards pose additional security challenges when compared to traditional computer platforms. We discuss our design for a secure smart card operating system, named Caernarvon, and show that it addresses these challenges, which include secure application download, protection of cryptographic functions from malicious applications, resolution of covert channels, and assurance of both security and data integrity in the face of arbitrary power losses.

  11. User Authentication and Authorization Challenges in a Networked Library Environment.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1997-01-01

    Discusses computer user authentication and authorization issues when libraries need to let valid users access databases and information services without making the process too difficult for either party. Common solutions are explained, including filtering, passwords, and Kerberos (a cryptographic authentication scheme for secure use over public…

  12. Functional Requirements for Information Resource Provenance on the Web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCusker, James P.; Lebo, Timothy; Graves, Alvaro

    We provide a means to formally explain the relationship between HTTP URLs and the representations returned when they are requested. According to existing World Wide Web architecture, the URL serves as an identifier for a semiotic referent, while the document returned via HTTP serves as a representation of the same referent. This gives two sides of a semiotic triangle; the third side is the relationship between the URL and the representation received. We complete this description by extending the library science resource model Functional Requirements for Bibliographic Resources (FRBR) with cryptographic message and content digests to create Functional Requirements for Information Resources (FRIR). We show how applying the FRIR model to HTTP GET and POST transactions disambiguates the many relationships between a given URL and all representations received from its request, provides fine-grained explanations that are complementary to existing explanations of web resources, and integrates easily into the emerging W3C provenance standard.
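    In FRIR terms, a content digest names the representation itself rather than the URL, so a single URL may legitimately map to many digests over time. A minimal sketch of computing such a digest for an HTTP GET response, using only Python's standard library (the distinction FRIR draws between message-level and content-level digests is collapsed here):

```python
import hashlib
import urllib.request

def content_digest(url: str) -> str:
    """Fetch a URL and return the SHA-256 hex digest of the returned
    representation; repeated GETs of the same URL may yield different
    digests, which is exactly the ambiguity FRIR makes explicit."""
    with urllib.request.urlopen(url) as response:
        body = response.read()
    return hashlib.sha256(body).hexdigest()

print(content_digest("https://example.org/"))
```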

  13. Ionospheric Radio Propagation

    DTIC Science & Technology

    1948-06-25

    [OCR excerpt, largely unrecoverable; legible fragments concern a radio receiver's noise figure, sudden ionospheric disturbances (SID), and absorption effects.]

  14. Characterization of Extremely Lightweight Intrusion Detection (ELIDe) Power Utilization by Varying N-gram and Hash Length

    DTIC Science & Technology

    2015-09-01

    changing the weight file used without redeploying the application. [From Section 2.1, Mobile Device:] We used the same Sprint-brand Galaxy S3 smart phone. The… Galaxy S3 line of smart phones varied in its technical specifications depending on the carrier. For reference, the Sprint-brand Galaxy S3 has the

  15. A Proposal for Kelly CriterionBased Lossy Network Compression

    DTIC Science & Technology

    2016-03-01

    warehousing and data mining techniques for cyber security. New York (NY): Springer; 2007. p. 83–108. 34. Münz G, Li S, Carle G. Traffic anomaly...p. 188–196. 48. Kim NU, Park MW, Park SH, Jung SM, Eom JH, Chung TM. A study on effective hash-based load balancing scheme for parallel NIDS. In

  16. New Media Literacy Education (NMLE): A Developmental Approach

    ERIC Educational Resources Information Center

    Graber, Diana

    2012-01-01

    The digital world is full of both possibility and peril, with rules of engagement being hashed out as we go. While schools are still "hesitant to embrace new technologies as a backlash from the significant, and largely ineffectual, investment in classroom computers as an instructional panacea in the mid-1990's" (Collins and Halverson 2009),…

  17. Probability Distributions over Cryptographic Protocols

    DTIC Science & Technology

    2009-06-01

    [Table-of-contents excerpt with dot leaders; legible entries include: Artificial Immune Algorithm; Design Decisions; Common Ground; message creation algorithms for the unbounded and unbounded naive distributions; protocol creation algorithm for intended-run distributions; protocol and message creation algorithm for the realistic distribution.]

  18. 22 CFR 124.14 - Exports to warehouses or distribution points outside the United States.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... distribution, end-use and reporting. Licenses for exports pursuant to such agreements must be obtained prior to... country, either in their original form or after being incorporated through an intermediate process into...., sporting firearms for commercial resale, cryptographic devices and software for financial and business...

  19. Cryptographic Research and NSA: Report of the Public Cryptography Study Group.

    ERIC Educational Resources Information Center

    Davida, George I.

    1981-01-01

    The Public Cryptography Study Group accepted the claim made by the National Security Agency that some information in some publications concerning cryptology could be inimical to national security, and is allowing the establishment of a voluntary mechanism, on an experimental basis, for NSA to review cryptology manuscripts. (MLW)

  20. Harry Potter and the Cryptography with Matrices

    ERIC Educational Resources Information Center

    Chua, Boon Liang

    2006-01-01

    This article describes cryptography, defined as the science of encrypting and deciphering messages written in secret codes; it has played a vital role in securing information since ancient times. There are several cryptographic techniques, and many make extensive use of mathematics to secure information. The author discusses an activity built…

  1. Cryptographer

    ERIC Educational Resources Information Center

    Sullivan, Megan

    2005-01-01

    For the general public, the field of cryptography has recently become famous as the method used to uncover secrets in Dan Brown's fictional bestseller, The Da Vinci Code. But the science of cryptography has been popular for centuries--secret hieroglyphics discovered in Egypt suggest that code-making dates back almost 4,000 years. In today's…

  2. Security Protocol Verification and Optimization by Epistemic Model Checking

    DTIC Science & Technology

    2010-11-05

    Three cryptographers are sitting down to dinner at their favourite restaurant. Their waiter informs them that arrangements have been made with the...Unfortunately, the protocol cannot be expected to satisfy this: suppose that all agents manage to broadcast their message and all messages have the

  3. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  4. 49 CFR 236.1033 - Communications and security requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... shall: (1) Use an algorithm approved by the National Institute of Standards (NIST) or a similarly...; or (ii) When the key algorithm reaches its lifespan as defined by the standards body responsible for approval of the algorithm. (c) The cleartext form of the cryptographic keys shall be protected from...

  5. 49 CFR 236.1033 - Communications and security requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... shall: (1) Use an algorithm approved by the National Institute of Standards (NIST) or a similarly...; or (ii) When the key algorithm reaches its lifespan as defined by the standards body responsible for approval of the algorithm. (c) The cleartext form of the cryptographic keys shall be protected from...

  6. 49 CFR 236.1033 - Communications and security requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... shall: (1) Use an algorithm approved by the National Institute of Standards (NIST) or a similarly...; or (ii) When the key algorithm reaches its lifespan as defined by the standards body responsible for approval of the algorithm. (c) The cleartext form of the cryptographic keys shall be protected from...

  7. 49 CFR 236.1033 - Communications and security requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... shall: (1) Use an algorithm approved by the National Institute of Standards (NIST) or a similarly...; or (ii) When the key algorithm reaches its lifespan as defined by the standards body responsible for approval of the algorithm. (c) The cleartext form of the cryptographic keys shall be protected from...

  8. Exploitation of Unintentional Information Leakage from Integrated Circuits

    ERIC Educational Resources Information Center

    Cobb, William E.

    2011-01-01

    The information leakage of electronic devices, especially those used in cryptographic or other vital applications, represents a serious practical threat to secure systems. While physical implementation attacks have evolved rapidly over the last decade, relatively little work has been done to allow system designers to effectively counter the…

  9. 49 CFR 236.1033 - Communications and security requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... shall: (1) Use an algorithm approved by the National Institute of Standards (NIST) or a similarly...; or (ii) When the key algorithm reaches its lifespan as defined by the standards body responsible for approval of the algorithm. (c) The cleartext form of the cryptographic keys shall be protected from...

  10. A secure and efficient uniqueness-and-anonymity-preserving remote user authentication scheme for connected health care.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2013-06-01

    Connected health care has several applications including telecare medicine information system, personally controlled health records system, and patient monitoring. In such applications, user authentication can ensure the legality of patients. In user authentication for such applications, only the legal user/patient himself/herself is allowed to access the remote server, and no one can trace him/her according to transmitted data. Chang et al. proposed a uniqueness-and-anonymity-preserving remote user authentication scheme for connected health care (Chang et al., J Med Syst 37:9902, 2013). Their scheme uses the user's personal biometrics along with his/her password with the help of the smart card. The user's biometrics is verified using BioHashing. Their scheme is efficient due to its use of one-way hash functions and exclusive-or (XOR) operations. In this paper, we show that though their scheme is very efficient, it has several security weaknesses: (1) it has design flaws in the login and authentication phases, (2) it has design flaws in the password change phase, (3) it fails to protect against the privileged insider attack, (4) it fails to protect against the man-in-the-middle attack, and (5) it fails to provide proper authentication. In order to remedy these security weaknesses in Chang et al.'s scheme, we propose an improvement of their scheme while retaining its original merits. We show that our scheme is efficient as compared to Chang et al.'s scheme. Through the security analysis, we show that our scheme is secure against possible attacks. Further, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. In addition, after successful authentication between the user and the server, they establish a secret session key shared between them for future secure communication.
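    As a toy illustration of the hash-and-XOR style of computation on which such smart-card schemes rest (this is not Chang et al.'s protocol nor the improved scheme; all names and values are hypothetical):

```python
import hashlib, secrets

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Registration: the card stores v = h(password) XOR h(server_secret || id),
# so neither the password hash nor the server value appears in the clear.
server_secret = b"long-term server secret"   # hypothetical
user_id, password = b"alice", b"correct horse"
v = xor(h(password), h(server_secret + user_id))

# Login: re-entering the password unmasks the server value, and a fresh
# nonce binds the proof to this session.
nonce = secrets.token_bytes(16)
proof = h(xor(v, h(password)) + nonce)       # = h(h(secret || id) || nonce)

# Server side: recompute from its own secret and compare.
assert proof == h(h(server_secret + user_id) + nonce)
print("authenticated")
```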

  11. Multi-factor authentication using quantum communication

    DOEpatents

    Hughes, Richard John; Peterson, Charles Glen; Thrasher, James T.; Nordholt, Jane E.; Yard, Jon T.; Newell, Raymond Thorson; Somma, Rolando D.

    2018-02-06

    Multi-factor authentication using quantum communication ("QC") includes stages for enrollment and identification. For example, a user enrolls for multi-factor authentication that uses QC with a trusted authority. The trusted authority transmits device factor information associated with a user device (such as a hash function) and user factor information associated with the user (such as an encrypted version of a user password). The user device receives and stores the device factor information and user factor information. For multi-factor authentication that uses QC, the user device retrieves its stored device factor information and user factor information, then transmits the user factor information to the trusted authority, which also retrieves its stored device factor information. The user device and trusted authority use the device factor information and user factor information (more specifically, information such as a user password that is the basis of the user factor information) in multi-factor authentication that uses QC.

  12. Fast implementation of length-adaptive privacy amplification in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Chun-Mei; Li, Mo; Huang, Jing-Zheng; Patcharapong, Treeviriyanupab; Li, Hong-Wei; Li, Fang-Yi; Wang, Chuan; Yin, Zhen-Qiang; Chen, Wei; Keattisak, Sripimanwat; Han, Zhen-Fu

    2014-09-01

    Post-processing is indispensable in quantum key distribution (QKD), which is aimed at sharing secret keys between two distant parties. It mainly consists of key reconciliation and privacy amplification, which are used, respectively, to ensure that both parties share identical keys and to distill unconditionally secure secret keys. In this paper, we focus on speeding up the privacy amplification process by choosing a simple multiplicative universal class of hash functions. By constructing an optimal multiplication algorithm based on four basic multiplication algorithms, we give a fast software implementation of length-adaptive privacy amplification. “Length-adaptive” indicates that the implementation of privacy amplification automatically adapts to different lengths of input blocks. When the lengths of the input blocks are 1 Mbit and 10 Mbit, the speed of privacy amplification can be as fast as 14.86 Mbps and 10.88 Mbps, respectively. Thus, it is practical for GHz or even higher repetition frequency QKD systems.
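    The multiplicative universal class mentioned here can be illustrated with the well-known multiply-shift construction: multiply the input block by a random odd constant modulo a power of two and keep the top output bits. The sketch below (block sizes hypothetical, and far simpler than the paper's optimized four-algorithm multiplication) shows privacy amplification as one compressive hash over the reconciled key:

```python
import secrets

def multiply_shift(x: int, a: int, in_bits: int, out_bits: int) -> int:
    """Multiply-shift universal hashing: (a*x mod 2^in_bits) >> (in_bits - out_bits),
    with a a uniformly random odd in_bits-bit constant."""
    return ((a * x) % (1 << in_bits)) >> (in_bits - out_bits)

n, l = 1_000_000, 700_000              # hypothetical input/output block lengths
reconciled_key = secrets.randbits(n)   # stand-in for the sifted, error-corrected key
a = secrets.randbits(n) | 1            # the random odd multiplier (publicly announced)

final_key = multiply_shift(reconciled_key, a, n, l)
print(final_key.bit_length() <= l)     # True: the key is compressed to l bits
```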

  13. Integrated fingerprinting in secure digital cinema projection

    NASA Astrophysics Data System (ADS)

    Delannay, Damien; Delaigle, Jean-Francois; Macq, Benoit M. M.; Quisquater, Jean-Jacques; Mas Ribes, Joan M.; Boucqueau, Jean M.; Nivart, Jean-Francois

    2001-12-01

    This paper describes the functional model of a combined conditional access and fingerprinting copyright (or projection-right) protection system in a digital cinema framework. In the cinema industry, a large part of early movie piracy comes from copies made in the theater itself with a camera. The evolution towards digital cinema broadcast enables watermark-based fingerprinting protection systems. Besides an appropriate fingerprinting technology, a number of well-defined security/cryptographic tools are integrated in order to guarantee the integrity of the whole system. The requirements are two-fold: on one side, we must ensure that the media content is only accessible at exhibition time (under specific authorization obtained after an ad-hoc film rental agreement) and contains the related exhibition fingerprint; on the other, we must prove our ability to retrieve the fingerprint information from an illegal copy of the media.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hymel, Ross

    The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created on user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. The software includes all components necessary for signing, including a custom random number generator for key creation and SHA-256 for data hashing.
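    A software analogue of this sign-then-verify flow (not the FPGA code itself) can be written with the widely used Python cryptography package. That library exposes the report's 163-bit binary curves as ec.SECT163K1/ec.SECT163R2 only where the underlying OpenSSL build retains binary-curve support, so the portable P-256 curve is substituted in this sketch:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Key creation and signing; ECDSA here hashes the message with SHA-256
# internally, mirroring the report's SHA-256-then-sign pipeline.
private_key = ec.generate_private_key(ec.SECP256R1())
message = b"user-supplied data"
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Host-side check; verify() raises InvalidSignature on failure.
private_key.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))
print("signature verified")
```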

  15. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    DTIC Science & Technology

    2006-12-01

    [Acronym-glossary excerpt: HMAC, Keyed-Hash Message Authentication Code; ICMP, Internet Control Message Protocol; IEEE, Institute of Electrical and Electronics Engineers; IETF, Internet Engineering Task Force; IOS, Internetwork Operating System; IP, Internet Protocol; ITU, International Telecommunication Union; LAN, Local Area Network.] Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a

  16. An Intelligent Web-Based System for Diagnosing Student Learning Problems Using Concept Maps

    ERIC Educational Resources Information Center

    Acharya, Anal; Sinha, Devadatta

    2017-01-01

    The aim of this article is to propose a method for developing concept maps in a web-based environment to identify concepts a student is deficient in after learning using traditional methods. The Direct Hashing and Pruning algorithm was used to construct the concept map. Redundancies within the concept map were removed to generate a learning sequence.…

  17. Automated Handling of Garments for Pressing

    DTIC Science & Technology

    1991-09-30

    [Table-of-contents excerpt with dot leaders; legible entries include: "Parallel Algorithms for 2D Kalman Filtering" (D.J. Potter and M.P. Cline); "Hash Table and Sorted Array: A Case Study of…"; "Kalman Filtering on the Connection Machine" (M.A. Palis and D.K. Krecker); "Parallel Sorting of Large Arrays on the MasPar"; "Algorithms for Seam Sensing"; "Karel Algorithms"; "Image Filtering".]

  18. Certificate Revocation Using Fine Grained Certificate Space Partitioning

    NASA Astrophysics Data System (ADS)

    Goyal, Vipul

    A new certificate revocation system is presented. The basic idea is to divide the certificate space into several partitions, the number of partitions being dependent on the PKI environment. Each partition contains the status of a set of certificates. A partition may either expire or be renewed at the end of a time slot. This is done efficiently using hash chains.
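    The hash-chain mechanism works as follows: the issuer anchors the tip of a chain of repeated hashes in the partition's status data, then releases one preimage per elapsed time slot; anyone can re-hash the released value forward to the anchor. A minimal sketch (the seed and the daily slot granularity are hypothetical):

```python
import hashlib

def build_chain(seed: bytes, length: int) -> list[bytes]:
    """chain[0] = seed, chain[i] = H(chain[i-1]); chain[-1] is the public anchor."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify(token: bytes, anchor: bytes, steps: int) -> bool:
    """Hash 'token' forward 'steps' times and compare against the anchor."""
    for _ in range(steps):
        token = hashlib.sha256(token).digest()
    return token == anchor

chain = build_chain(b"issuer secret seed", 365)  # one value per daily slot
anchor = chain[-1]                               # published with the partition
# On day d the issuer releases chain[365 - d]; a verifier checks it cheaply:
print(verify(chain[365 - 30], anchor, 30))       # True on day 30
```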

  19. Potency trends of delta9-THC and other cannabinoids in confiscated marijuana from 1980-1997.

    PubMed

    ElSohly, M A; Ross, S A; Mehmedic, Z; Arafat, R; Yi, B; Banahan, B F

    2000-01-01

    The analysis of 35,312 cannabis preparations confiscated in the USA over a period of 18 years for delta-9-tetrahydrocannabinol (delta9-THC) and other major cannabinoids is reported. Samples were identified as cannabis, hashish, or hash oil. Cannabis samples were further subdivided into marijuana (loose material, kilobricks and buds), sinsemilla, Thai sticks and ditchweed. The data showed that more than 82% of all confiscated samples were in the marijuana category for every year except 1980 (61%) and 1981 (75%). The potency (concentration of delta9-THC) of marijuana samples rose from less than 1.5% in 1980 to approximately 3.3% in 1983 and 1984, then fluctuated around 3% till 1992. Since 1992, the potency of confiscated marijuana samples has continuously risen, going from 3.1% in 1992 to 4.2% in 1997. The average concentration of delta9-THC in all cannabis samples showed a gradual rise from 3% in 1991 to 4.47% in 1997. Hashish and hash oil, on the other hand, showed no specific potency trends. Other major cannabinoids [cannabidiol (CBD), cannabinol (CBN), and cannabichromene (CBC)] showed no significant change in their concentration over the years.

  20. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft’s algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102
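    The hash-table refinement of the second phase can be sketched as follows: within each block of the current partition, states are bucketed by their transition signature (the tuple of blocks their outgoing transitions lead to), and each bucket becomes a new block. This is an illustrative reconstruction, not the authors' implementation; a Python dict plays the role of the hash table.

```python
def refine_by_signature(blocks, transitions, alphabet):
    """One refinement pass: split states whose per-symbol target blocks differ.
    blocks: list of lists of states; transitions: state -> {symbol: state}."""
    block_of = {s: i for i, blk in enumerate(blocks) for s in blk}
    refined = []
    for blk in blocks:
        buckets = {}  # signature tuple -> states sharing it
        for state in blk:
            sig = tuple(block_of[transitions[state][c]] for c in alphabet)
            buckets.setdefault(sig, []).append(state)
        refined.extend(buckets.values())
    return refined

# Tiny DFA over {'a'}: states 0 and 1 swap, state 2 self-loops.
transitions = {0: {'a': 1}, 1: {'a': 0}, 2: {'a': 2}}
blocks = [[0, 2], [1]]  # a coarse initial split, e.g. by acceptance
print(refine_by_signature(blocks, transitions, ['a']))  # [[0], [2], [1]]
```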
