Quantum enigma cipher as a generalization of the quantum stream cipher
NASA Astrophysics Data System (ADS)
Kato, Kentaro
2016-09-01
Various types of randomization for the quantum stream cipher by the Y00 protocol have been developed so far. In particular, it must be noted that the analysis of immunity against correlation attacks with a new type of randomization by Hirota and Kurosawa prompted a new look at the quantum stream cipher by the Y00 protocol (Quant. Inform. Process. 6(2), 2007). From the preceding studies on the quantum stream cipher, we recognized that the quantum stream cipher by the Y00 protocol could be generalized to a new type of physical cipher that has the potential to exceed the Shannon limit by installing additional randomization mechanisms, in accordance with the laws of quantum mechanics. We call this new type of physical random cipher the quantum enigma cipher. In this article, we introduce recent developments in the quantum stream cipher by the Y00 protocol and future plans toward the quantum enigma cipher.
NASA Astrophysics Data System (ADS)
Hirota, Osamu; Ohhata, Kenichi; Honda, Makoto; Akutsu, Shigeto; Doi, Yoshifumi; Harasawa, Katsuyoshi; Yamashita, Kiichi
2009-08-01
The security of the next-generation optical network, which realizes cloud computing services with data centers, is an urgent problem. In such a network, encryption at the physical layer, providing strong security and small delay, should be employed. It must, however, provide very high-speed encryption, because the basic link is operated at 2.5 Gbit/s or 10 Gbit/s. The quantum stream cipher by the Yuen-2000 protocol (Y-00) is a completely new type of random cipher, the so-called Gauss-Yuen random cipher, which can break the Shannon limit for symmetric-key ciphers. We develop such a cipher with a good balance of security, speed, and cost performance. At the SPIE conference on Quantum Communications and Quantum Imaging V, we reported a demonstration of a 2.5 Gbit/s system for commercial links and proposed how to improve it to 10 Gbit/s. This paper reports a demonstration of the Y-00 cipher system operating at 10 Gbit/s. A laboratory transmission test is carried out to obtain basic data on which parameters are important for operation in real commercial networks. In addition, we give some theoretical results on the security. It is clarified that the necessary condition to break the Shannon limit indeed requires a quantum phenomenon, and that a fully information-theoretically secure system is available in satellite link applications.
Running key mapping in a quantum stream cipher by the Yuen 2000 protocol
NASA Astrophysics Data System (ADS)
Shimizu, Tetsuya; Hirota, Osamu; Nagasako, Yuki
2008-03-01
A quantum stream cipher by the Yuen 2000 protocol (the so-called Y00 protocol or αη scheme), whose running key is produced by a linear feedback shift register with a short key, is very attractive for implementing secure 40 Gbit/s optical data transmission, which is expected in next-generation networks. However, a basic model of the Y00 protocol with a very short key needs a careful design against fast correlation attacks, as pointed out by Donnet et al. This Brief Report clarifies the effectiveness of an irregular mapping between the running key and the physical signals in the driver that selects the M-ary basis in the transmitter, and gives a design method. Consequently, a quantum stream cipher by the Y00 protocol with our mapping has immunity against the proposed fast correlation attacks on a basic model of the Y00 protocol even if the key is very short.
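To make the mechanism concrete, here is a toy sketch (not the paper's actual driver design; the LFSR taps, M, and the permutation are all illustrative assumptions) of how a short-key LFSR running key can be passed through an irregular, permutation-based mapping before it selects the M-ary basis:

```python
import random

def lfsr_bits(seed: int, taps: tuple, nbits: int, length: int):
    """Fibonacci LFSR over GF(2); yields one running-key bit per step."""
    state = seed & ((1 << length) - 1)
    for _ in range(nbits):
        yield state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (length - 1))

M = 16                                    # number of signal bases (toy value)
K = M.bit_length() - 1                    # bits consumed per basis choice
perm = list(range(M))
random.Random(0xC0FFEE).shuffle(perm)     # secret irregular map (toy choice)

def basis_sequence(seed: int, nsymbols: int):
    bits = list(lfsr_bits(seed, taps=(0, 2, 3, 5), nbits=K * nsymbols, length=16))
    for i in range(0, len(bits), K):
        r = int("".join(map(str, bits[i:i + K])), 2)   # running-key value
        yield perm[r]                                   # irregular basis index

print(list(basis_sequence(seed=0xACE1, nsymbols=8)))
```

A linear (regular) running-key-to-basis map is what fast correlation attacks exploit; the permutation destroys that linear structure while remaining invertible for the legitimate receiver.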
A Study on the Stream Cipher Embedded Magic Square of Random Access Files
NASA Astrophysics Data System (ADS)
Liu, Chenglian; Zhao, Jian-Ming; Rafsanjani, Marjan Kuchaki; Shen, Yijuan
2011-09-01
Magic squares and stream ciphers are both interesting and well-studied topics. In this paper, we propose a new scheme that applies a stream cipher to random-access files based on the magic square method. Two thresholds are required to secure the data: decrypting with the stream cipher alone is not sufficient to recover the original source. In addition, we improve the stream cipher model to strengthen its defenses efficiently, while retaining the high speed of the key stream generator.
Color encryption scheme based on adapted quantum logistic map
NASA Astrophysics Data System (ADS)
Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.
2014-04-01
This paper presents a new color image encryption scheme based on a quantum chaotic system. In this scheme, an intermediate chaotic key stream is generated with the help of a quantum chaotic logistic map. Then, each pixel is encrypted by the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme provides adequate security for the confidentiality of color images.
NASA Astrophysics Data System (ADS)
Hirota, Osamu; Futami, Fumio
2014-10-01
Guaranteeing the security of cloud computing systems is an urgent problem. Although there are several threats in the security problem, the most serious is a cyber attack against the optical fiber transmission among data centers. In such a network, an encryption scheme on Layer 1 (the physical layer) with ultimately strong security, small delay, and very high speed should be employed, because a basic optical link operates at 10 Gbit/s per wavelength. We have developed a quantum-noise randomized stream cipher, the so-called Yuen-2000 encryption scheme (Y-00), over the past decade. This cipher is a completely new type of random cipher in which the ciphertexts for the legitimate receiver and the eavesdropper are different. This is a condition for breaking the Shannon limit in the theory of cryptography. In addition, this scheme has a good balance of security, speed, and cost performance. To realize such encryption, several modulation methods are candidates, such as phase modulation, intensity modulation, quadrature amplitude modulation, and so on. The Northwestern University group demonstrated a phase modulation system (αη) in 2003. In 2005, we reported a demonstration of a 1 Gbit/s system based on an intensity modulation scheme (ISK-Y00), and we gave design methods for quadrature amplitude modulation (QAM-Y00) in 2005 and 2010. The intensity modulation scheme promises a real application to secure fiber communication among current data centers. This paper presents progress in the quantum-noise randomized stream cipher based on ISK-Y00, integrating our theoretical and experimental achievements in the past and a recent 100 Gbit/s (10 Gbit/s × 10 wavelengths) experiment.
Single-channel 40 Gbit/s digital coherent QAM quantum noise stream cipher transmission over 480 km.
Yoshida, Masato; Hirooka, Toshihiko; Kasai, Keisuke; Nakazawa, Masataka
2016-01-11
We demonstrate the first 40 Gbit/s single-channel polarization-multiplexed, 5 Gsymbol/s, 16 QAM quantum noise stream cipher (QNSC) transmission over 480 km by incorporating ASE quantum noise from EDFAs as well as the quantum shot noise of the coherent state with multiple photons for the random masking of data. By using a multi-bit encoding scheme and digital coherent transmission techniques, secure optical communication with a record data capacity and transmission distance has been successfully realized. In this system, the signal level received by Eve is hidden by both amplitude and phase noise. The highest number of masked signals, 7.5 × 10^4, was achieved by using a QAM scheme with FEC, which makes it possible to reduce the output power from the transmitter while maintaining an error-free condition for Bob. We have newly measured the noise distribution around the I and Q encrypted data and shown experimentally, with a data size as large as 2^25, that the noise has a Gaussian distribution with no correlations. This distribution is suitable for the random masking of data.
NASA Astrophysics Data System (ADS)
Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur
2017-12-01
Symmetric cryptography algorithms are known to have many weaknesses in the encryption process compared with asymmetric algorithms. A symmetric stream cipher is an algorithm that works by XORing the plaintext with the key. To improve the security of the symmetric stream cipher, we improvise by using a Triple Transposition Key, developed from the transposition cipher, and also use the Base64 algorithm for the final step of encryption. Experiments show that the resulting ciphertext is of good quality and very random.
Cryptanalysis on classical cipher based on Indonesian language
NASA Astrophysics Data System (ADS)
Marwati, R.; Yulianti, K.
2018-05-01
Cryptanalysis is the process of breaking a cipher through illegal decryption. This paper discusses the encryption of some classical cryptosystems, the breaking of a substitution cipher and a stream cipher, and ways of increasing their security. Encryption and ciphering are based on Indonesian-language text. Microsoft Word and Microsoft Excel were chosen as the ciphering and breaking tools.
Audio Steganography with Embedded Text
NASA Astrophysics Data System (ADS)
Teck Jian, Chua; Chai Wen, Chuah; Rahman, Nurul Hidayah Binti Ab.; Hamid, Isredza Rahmi Binti A.
2017-08-01
Audio steganography is about hiding a secret message inside audio. It is a technique used to secure the transmission of secret information or to hide its existence. It may also provide confidentiality to the secret message if the message is encrypted. To date, most steganography software, such as Mp3Stego and DeepSound, uses a block cipher such as the Advanced Encryption Standard or the Data Encryption Standard to encrypt the secret message. That is good security practice. However, the encrypted message may become too long to embed in the audio and may cause distortion of the cover audio if the secret message is long. Hence, there is a need to encrypt the message with a stream cipher before embedding it into the audio: a stream cipher encrypts bit by bit, whereas a block cipher encrypts fixed-length blocks, producing a longer output than a stream cipher. Hence, an audio steganography scheme that embeds text encrypted with the Rivest Cipher 4 (RC4) stream cipher is designed, developed, and tested in this project.
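Since the project encrypts with Rivest Cipher 4, a minimal self-contained RC4 sketch follows for reference (RC4 is considered broken by modern standards and appears here only because the project uses it; the key and message below are placeholders):

```python
def rc4_keystream(key: bytes, n: int) -> bytes:
    """Generate n keystream bytes from RC4 (KSA followed by PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # key-scheduling algorithm (KSA)
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = bytearray()
    for _ in range(n):                         # pseudo-random generation (PRGA)
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def rc4_crypt(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the keystream is its own inverse."""
    return bytes(d ^ k for d, k in zip(data, rc4_keystream(key, len(data))))

ciphertext = rc4_crypt(b"toy key", b"message to embed in audio")
assert rc4_crypt(b"toy key", ciphertext) == b"message to embed in audio"
```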
A novel chaotic stream cipher and its application to palmprint template protection
NASA Astrophysics Data System (ADS)
Li, Heng-Jian; Zhang, Jia-Shu
2010-04-01
Based on a coupled nonlinear dynamic filter (NDF), a novel chaotic stream cipher is presented in this paper and employed to protect palmprint templates. The chaotic pseudorandom bit generator (PRBG) based on a coupled NDF, which is constructed in an inverse flow, can generate multiple bits per iteration and satisfies the security requirements of cipher design. The stream cipher is then employed to generate cancelable competitive-code palmprint biometrics for template protection. The proposed cancelable palmprint authentication system depends on two factors: the palmprint biometric and the password/token. Therefore, the system provides high confidence and also protects the user's privacy. Verification experiments on the Hong Kong PolyU Palmprint Database show that the proposed approach has a large template re-issuance capability and achieves an equal error rate of 0.02%. The performance of the palmprint template protection scheme demonstrates the practicability and security of the proposed stream cipher.
Quantum exhaustive key search with simplified-DES as a case study.
Almazrooie, Mishal; Samsudin, Azman; Abdullah, Rosni; Mutter, Kussay N
2016-01-01
To evaluate the security of a symmetric cryptosystem against any quantum attack, the symmetric algorithm must first be implemented on a quantum platform. In this study, a quantum implementation of a classical block cipher is presented. A quantum circuit for a classical block cipher with a polynomial number of quantum gates is proposed. The entire work has been tested on a quantum mechanics simulator called libquantum. First, the functionality of the proposed quantum cipher is verified and the experimental results are compared with those of the original classical version. Then, quantum attacks are conducted by using Grover's algorithm to recover the secret key. The proposed quantum cipher is used as a black box for the quantum search. The quantum oracle is then queried over the produced ciphertext to mark the quantum state, which consists of plaintext and key qubits. The experimental results show that for a key of n-bit size and a key space of N such that N = 2^n, the key can be recovered in O(√N) computational steps.
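For reference, the standard Grover query count behind that √N figure can be computed directly; a minimal sketch (the bit sizes below are illustrative):

```python
import math

def grover_iterations(n_bits: int) -> int:
    """Optimal Grover iteration count, floor(pi/4 * sqrt(N)) with N = 2**n_bits."""
    return math.floor(math.pi / 4 * math.sqrt(2 ** n_bits))

# e.g. a 56-bit key space needs ~2**28 iterations instead of ~2**55 classical trials
for n in (16, 32, 56):
    print(n, grover_iterations(n))
```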
Coherent pulse position modulation quantum cipher
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sohma, Masaki; Hirota, Osamu
2014-12-04
On the basis of a fundamental idea of Yuen, we present a new type of quantum random cipher in which pulse-position-modulated signals are encrypted in the picture of quantum Gaussian waveforms. We discuss the security of the proposed system with a phase-mask encryption.
A Novel Image Encryption Scheme Based on Intertwining Chaotic Maps and RC4 Stream Cipher
NASA Astrophysics Data System (ADS)
Kumari, Manju; Gupta, Shailender
2018-03-01
As modern systems enable us to transmit large chunks of data, both as text and as images, there is a need to explore algorithms that provide higher security without significantly increasing time complexity. This paper proposes an image encryption scheme that uses intertwining chaotic maps and the RC4 stream cipher to encrypt/decrypt images. The scheme employs a chaotic map for the confusion stage and for generating the key for the RC4 cipher. The RC4 cipher uses this key to generate random sequences that implement an efficient diffusion process. The algorithm is implemented in MATLAB-2016b, and various performance metrics are used to evaluate its efficacy. The proposed scheme produces highly scrambled encrypted images and can resist statistical, differential, and brute-force search attacks. The peak signal-to-noise ratio values are similar to those of other schemes, and the entropy values are close to ideal. In addition, the scheme is practical, since it has the lowest time complexity among its counterparts.
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Rachmawati, D.; Parlindungan, M. R.
2018-03-01
MDTM is a classical symmetric cryptographic algorithm. As with other classical algorithms, the MDTM cipher is easy to implement but less secure than modern symmetric algorithms. To make it more secure, the RC4A stream cipher is added, and the cryptosystem thus becomes a super-encryption. In this process, plaintexts derived from PDFs are first encrypted with the MDTM cipher and then encrypted once more with the RC4A algorithm. The test results show that the complexity is Θ(n²) and that the running time is linearly proportional to the length of the plaintext and the keys entered.
Chaos and Cryptography: A new dimension in secure communications
NASA Astrophysics Data System (ADS)
Banerjee, Santo; Kurths, J.
2014-06-01
This issue is a collection of contributions on recent developments and achievements in cryptography and communications using chaos. The contributions report important and promising results such as synchronization of networks and data transmission, image ciphers, optical and TDMA communications, quantum keys, etc. Various experiments and applications, such as FPGAs, smartphone ciphers, and semiconductor lasers, are also included.
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Amalia; Chayanie, N. I.
2018-03-01
Cryptography is the art and science of using mathematical methods to preserve message security. There are two types of cryptography, namely classical and modern cryptography. Nowadays, most people would rather use modern cryptography, because it is harder to break than the classical kind. One classical algorithm is the Zig-zag cipher, which uses the transposition technique: the original message is unreadable unless one has the key to decrypt it. To improve security, the Zig-zag cipher is combined with the RC4+ cipher, a symmetric-key stream cipher. The two algorithms are combined to form a super-encryption, so the message will be harder for a cryptanalyst to break. The results show that the complexity of the combined algorithm is Θ(n²), while the complexities of the Zig-zag cipher and the RC4+ cipher are Θ(n²) and Θ(n), respectively.
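For illustration, a minimal sketch of the zig-zag transposition stage is given below; this is a generic textbook rail-fence version, assuming the paper's Zig-zag cipher follows the usual zig-zag writing pattern:

```python
def rail_fence_encrypt(text: str, rails: int) -> str:
    """Write the text along a zig-zag over `rails` rows, then read row by row."""
    rows = [[] for _ in range(rails)]
    r, step = 0, 1
    for ch in text:
        rows[r].append(ch)
        step = 1 if r == 0 else -1 if r == rails - 1 else step
        r += step
    return "".join("".join(row) for row in rows)

def rail_fence_decrypt(cipher: str, rails: int) -> str:
    """Rebuild the zig-zag pattern, then refill the rows from the ciphertext."""
    pattern, r, step = [], 0, 1
    for _ in cipher:
        pattern.append(r)
        step = 1 if r == 0 else -1 if r == rails - 1 else step
        r += step
    rows, pos = [], 0
    for row in range(rails):
        count = pattern.count(row)
        rows.append(list(cipher[pos:pos + count]))
        pos += count
    idx = [0] * rails
    out = []
    for row in pattern:
        out.append(rows[row][idx[row]])
        idx[row] += 1
    return "".join(out)

ct = rail_fence_encrypt("SUPERENCRYPTION", 3)
assert rail_fence_decrypt(ct, 3) == "SUPERENCRYPTION"
```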
A Secure Key Distribution System of Quantum Cryptography Based on the Coherent State
NASA Technical Reports Server (NTRS)
Guo, Guang-Can; Zhang, Xiao-Yu
1996-01-01
Cryptographic communication has many important applications, particularly given the prospects of private communication. As one knows, the security of a cryptographic channel depends crucially on the secrecy of the key. The Vernam cipher is the only cipher system with guaranteed security; in that system the key must be as long as the message and must be used only once. Quantum cryptography is a method whereby key secrecy can be guaranteed by a physical law, so it is impossible, even in principle, to eavesdrop on such channels. Quantum cryptography has been developed in recent years, and many schemes have been proposed. One of the main problems in this field is now how to increase the transmission distance. In order to exploit the quantum nature of light, the schemes proposed so far all use very dim light pulses, with an average photon number of about 0.1. Because of the loss of the optical fiber, it is difficult for quantum cryptography based on single-photon-level or dim light to realize quantum key distribution over long distances. A quantum key distribution based on coherent states is introduced in this paper. We discuss the feasibility and security of this scheme.
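For reference, the Vernam cipher mentioned above is a plain XOR of message and key, with the key as long as the message and used only once; a minimal sketch:

```python
import secrets

def otp_encrypt(message: bytes):
    key = secrets.token_bytes(len(message))        # fresh key, as long as the message
    cipher = bytes(m ^ k for m, k in zip(message, key))
    return cipher, key

def otp_decrypt(cipher: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(cipher, key))

c, k = otp_encrypt(b"attack at dawn")
assert otp_decrypt(c, k) == b"attack at dawn"      # the key must never be reused
```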
A joint encryption/watermarking system for verifying the reliability of medical images.
Bouslimi, Dalel; Coatrieux, Gouenou; Cozic, Michel; Roux, Christian
2012-09-01
In this paper, we propose a joint encryption/watermarking system for the purpose of protecting medical images. The system is based on an approach that combines a substitutive watermarking algorithm, quantization index modulation, with an encryption algorithm: either a stream cipher (e.g., RC4) or a block cipher (e.g., AES in cipher block chaining (CBC) mode of operation). Our objective is to give access to the outcomes of the image integrity check and of its origin even though the image is stored encrypted. If watermarking and encryption are conducted jointly at the protection stage, watermark extraction and decryption can be applied independently. The security analysis of our scheme and experimental results achieved on 8-bit depth ultrasound images as well as on 16-bit encoded positron emission tomography images demonstrate the capability of our system to securely make security attributes available in both the spatial and the encrypted domains while minimizing image distortion. Furthermore, by making use of the AES block cipher in CBC mode, the proposed system is compliant with or transparent to the DICOM standard.
Dynamic video encryption algorithm for H.264/AVC based on a spatiotemporal chaos system.
Xu, Hui; Tong, Xiao-Jun; Zhang, Miao; Wang, Zhu; Li, Ling-Hao
2016-06-01
Video encryption schemes mostly employ selective encryption of important and sensitive video information, aiming to ensure real-time performance and encryption efficiency. Classic block ciphers are not applicable to video encryption due to their high computational overhead. In this paper, we propose an encryption selection control module that dynamically encrypts video syntax elements under the control of a chaotic pseudorandom sequence. A novel spatiotemporal chaos system and a binarization method are used to generate a key stream for encrypting the chosen syntax elements. The proposed scheme enhances resistance against attacks through the dynamic encryption process and a high-security stream cipher. Experimental results show that the proposed method exhibits high security and high efficiency with little effect on the compression ratio and time cost.
Dragon Stream Cipher for Secure Blackbox Cockpit Voice Recorder
NASA Astrophysics Data System (ADS)
Akmal, Fadira; Michrandi Nasution, Surya; Azmi, Fairuz
2017-11-01
An aircraft black box is a device used to record all aircraft information; it consists of the Flight Data Recorder (FDR) and the Cockpit Voice Recorder (CVR). The Cockpit Voice Recorder contains conversations in the aircraft during the flight. Investigations of aircraft crashes usually take a long time, because it is difficult to find the black box, so the black box should be able to send its information elsewhere. The black box must also have a data security system, since data security is a very important part of the information exchange process. The system in this research performs encryption and decryption of Cockpit Voice Recorder data by authorized parties using the Dragon stream cipher algorithm. The tests performed measure the encryption and decryption times and the avalanche effect. The results show encryption and decryption times of 0.85 and 1.84 seconds, respectively, for 30 seconds of Cockpit Voice Recorder data, with an avalanche effect of 48.67%.
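The avalanche-effect figure reported here (48.67%) measures how many output bits flip when a single input bit flips; a value near 50% is ideal. A minimal sketch of the measurement follows; since the Dragon cipher is not in any standard library, a keyed SHA-256 stand-in is used purely to demonstrate the metric:

```python
import hashlib

def avalanche_percent(encrypt, block: bytes) -> float:
    """Percentage of output bits that flip when one input bit flips."""
    base = encrypt(block)
    flipped = bytes([block[0] ^ 0x01]) + block[1:]   # flip the lowest bit of byte 0
    diff = sum(bin(a ^ b).count("1") for a, b in zip(base, encrypt(flipped)))
    return 100.0 * diff / (8 * len(base))

# keyed SHA-256 as a stand-in "cipher"; an ideal cipher scores close to 50 %
key = b"toy key"
toy_encrypt = lambda m: hashlib.sha256(key + m).digest()
print(f"{avalanche_percent(toy_encrypt, b'cockpit voice sample'):.2f} %")
```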
Jiao, Haisong; Pu, Tao; Zheng, Jilin; Xiang, Peng; Fang, Tao
2017-05-15
The physical-layer security of a quantum-noise randomized cipher (QNRC) system is, for the first time, quantitatively evaluated with secrecy capacity employed as the performance metric. Considering quantum noise as a channel advantage for the legitimate parties over eavesdroppers, specific wire-tap models for both the key and the data channels are built, with channel outputs yielded by quantum heterodyne measurement; general expressions of the secrecy capacities for both channels are derived, where the matching codes are proved to be uniformly distributed. The maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and the data is guaranteed. The influences of various system parameters on the secrecy capacities are assessed in detail. The results indicate that QNRC combined with proper channel codes is a promising framework for secure long-distance communication at high speed, which can be orders of magnitude higher than the perfect secrecy rates of other encryption systems. Even if the eavesdropper intercepts more signal power than the legitimate receiver, secure communication (up to Gb/s) is still achievable. Moreover, the secrecy of the running key is found to be the main constraint on the system's maximal secrecy rate.
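For context, the secrecy-capacity metric used here follows the classical wiretap-channel formulation (a standard definition, not a result specific to this paper): with channel input X, legitimate output Y, and eavesdropper output Z,

```latex
C_s \;=\; \max_{P_X}\,\bigl[\, I(X;Y) - I(X;Z) \,\bigr]^{+},
\qquad [x]^{+} = \max(x, 0).
```

A positive C_s requires the legitimate channel to hold an advantage over the eavesdropper's, which is precisely the role quantum noise plays for the legitimate parties in QNRC.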
Deductive Verification of Cryptographic Software
NASA Technical Reports Server (NTRS)
Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara
2009-01-01
We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
A fast image encryption algorithm based on only blocks in cipher text
NASA Astrophysics Data System (ADS)
Wang, Xing-Yuan; Wang, Qian
2014-03-01
In this paper, a fast image encryption algorithm is proposed in which shuffling and diffusion are performed simultaneously. The cipher-text image is divided into blocks of k × k pixels each, while the pixels of the plain-text are scanned one by one. Four logistic maps are used to generate the encryption key stream and the new position of each plain-image pixel in the cipher image, including the row and column of the block to which the pixel belongs and the position within the block where the pixel will be placed. After each pixel is encrypted, the initial conditions of the logistic maps are changed according to the encrypted pixel's value; after each row of the plain image is encrypted, the initial conditions are also changed by the skew tent map. Finally, it is illustrated that this algorithm has a fast speed, a big key space, and good properties in withstanding differential attacks, statistical analysis, known-plaintext attacks, and chosen-plaintext attacks.
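As a generic illustration of the logistic-map key stream idea used here (a single map only; the paper's actual scheme couples four logistic maps with a skew tent map, and the parameters below are illustrative key material):

```python
def logistic_keystream(x0: float, mu: float, n: int) -> bytes:
    """Byte keystream from the logistic map x_{k+1} = mu * x_k * (1 - x_k)."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = mu * x * (1.0 - x)
        out.append(int(x * 256) % 256)      # quantize the chaotic orbit to a byte
    return bytes(out)

# mu close to 4 keeps the map in its chaotic regime; x0 acts as key material
ks = logistic_keystream(x0=0.3579, mu=3.9999, n=16)
print(ks.hex())
```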
A new simple technique for improving the random properties of chaos-based cryptosystems
NASA Astrophysics Data System (ADS)
Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.
2018-03-01
A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into the short period cycles caused by digitization. In order to test this technique, a stream cipher based on a skew tent map algorithm has been implemented on a Virtex-7 FPGA. The randomness of the keystream generated by this system has been compared to that of the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. Only 41 extra slices were needed to incorporate the randomness-enhancement technique, proving that, besides being effective, this method is also efficient in terms of area and hardware resources.
Fundamental quantitative security in quantum key generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuen, Horace P.
2010-12-15
We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation, including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K, when the attacker just gets at K before it is used in a cryptographic context, and its composition security, when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
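For context, one standard, textbook link between the statistical-distance criterion and the attacker's guessing probability for an n-bit key K is

```latex
\delta(P_K, U) \;=\; \tfrac{1}{2}\sum_{k}\bigl|\,P_K(k) - 2^{-n}\,\bigr|,
\qquad
\max_{k} P_K(k) \;\le\; 2^{-n} + \delta(P_K, U),
```

since the total variation distance bounds the probability gap on every event, including the singleton event that the attacker's single best guess is correct. This is the kind of operational guarantee against which the criterion d is being assessed in this paper.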
An Unequal Secure Encryption Scheme for H.264/AVC Video Compression Standard
NASA Astrophysics Data System (ADS)
Fan, Yibo; Wang, Jidong; Ikenaga, Takeshi; Tsunoo, Yukiyasu; Goto, Satoshi
H.264/AVC is the newest video coding standard. It has many new features that can easily be used for video encryption. In this paper, we propose a new scheme for video encryption in the H.264/AVC video compression standard. We define Unequal Secure Encryption (USE) as an approach that applies different encryption schemes (with different security strengths) to different parts of the compressed video data. The USE scheme includes two parts: video data classification and unequal secure video data encryption. First, we classify the video data into two partitions: an important data partition and an unimportant data partition. The important data partition is small and receives strong protection, while the unimportant data partition is large and receives lighter protection. Second, we use AES as a block cipher to encrypt the important data partition and LEX as a stream cipher to encrypt the unimportant data partition. AES is the most widely used symmetric cipher and ensures high security. LEX is a new stream cipher based on AES whose computational cost is much lower than that of AES. In this way, our scheme achieves both high security and low computational cost. Besides the USE scheme, we propose a low-cost design of a hybrid AES/LEX encryption module. Our experimental results show that the computational cost of the USE scheme is low (about 25% of naive encryption at Level 0 with VEA used). The hardware cost of the hybrid AES/LEX module is 4678 gates, and the AES encryption throughput is about 50 Mbps.
NASA Astrophysics Data System (ADS)
Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong
2016-12-01
To improve the slow processing speed of classical image encryption algorithms and enhance the security of private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which the sequences generated by Chen's hyper-chaotic system are scrambled and diffused with the three components of the original color image. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses a large key space to resist illegal attacks, sensitive dependence on initial keys, uniform distribution of gray values in the encrypted image, and weak correlation between adjacent pixels in the cipher-image.
Public-key quantum digital signature scheme with one-time pad private-key
NASA Astrophysics Data System (ADS)
Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua
2018-01-01
A quantum digital signature scheme is proposed for the first time based on a public-key quantum cryptosystem. In the scheme, the verification public key is derived from the signer's identity information (such as an e-mail address) on the foundation of identity-based encryption, and the signature private key is generated by a one-time pad (OTP) protocol. The public/private key pair consists of classical bits, but the signature cipher consists of quantum qubits. After the signer announces the public key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid with the public key and the quantum digital digest. Analysis shows that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Being based on a public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.
Physical-layer security analysis of PSK quantum-noise randomized cipher in optically amplified links
NASA Astrophysics Data System (ADS)
Jiao, Haisong; Pu, Tao; Xiang, Peng; Zheng, Jilin; Fang, Tao; Zhu, Huatao
2017-08-01
The quantitative security of the quantum-noise randomized cipher (QNRC) in optically amplified links is analyzed from the perspective of physical-layer advantage. Establishing wire-tap channel models for both the key and the data, we derive general expressions of the secrecy capacity for the key against ciphertext-only and known-plaintext attacks, and for the data, which serve as the basic performance metrics. Further, the maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and the data is guaranteed. Within the same framework, the secrecy capacities of various cases can be assessed and compared. The results indicate that perfect secrecy is potentially achievable for data transmission, and an elementary principle for setting the proper number of photons and bases is given to ensure the maximal data secrecy capacity. The key security, however, is only asymptotically perfect, and tends to be the main constraint on the system's maximal secrecy rate. Moreover, by adopting cascaded optical amplification, QNRC can realize long-haul transmission at secure rates up to Gb/s, orders of magnitude higher than the perfect secrecy rates of other encryption systems.
Toward privacy-preserving JPEG image retrieval
NASA Astrophysics Data System (ADS)
Cheng, Hang; Wang, Jingyue; Wang, Meiqing; Zhong, Shangping
2017-07-01
This paper proposes a privacy-preserving retrieval scheme for JPEG images based on local variance. Three parties are involved in the scheme: the content owner, the server, and the authorized user. The content owner encrypts JPEG images for privacy protection by jointly using permutation cipher and stream cipher, and then, the encrypted versions are uploaded to the server. With an encrypted query image provided by an authorized user, the server may extract blockwise local variances in different directions without knowing the plaintext content. After that, it can calculate the similarity between the encrypted query image and each encrypted database image by a local variance-based feature comparison mechanism. The authorized user with the encryption key can decrypt the returned encrypted images with plaintext content similar to the query image. The experimental results show that the proposed scheme not only provides effective privacy-preserving retrieval service but also ensures both format compliance and file size preservation for encrypted JPEG images.
NASA Astrophysics Data System (ADS)
Gong, Li-Hua; He, Xiang-Tao; Tan, Ru-Chao; Zhou, Zhi-Hong
2018-01-01
In order to obtain high-quality color images, it is important to keep the hue component unchanged while emphasizing the intensity or saturation component. As a public color model, the Hue-Saturation-Intensity (HSI) model is commonly used in image processing. A new single-channel quantum color image encryption algorithm based on the HSI model and the quantum Fourier transform (QFT) is investigated, where the color components of the original color image are converted to HSI and the logistic map is employed to diffuse the relationship of pixels in the color components. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. The cipher-text is a combination of a gray image and a phase matrix. Simulations and theoretical analyses demonstrate that the proposed single-channel quantum color image encryption scheme based on the HSI model and quantum Fourier transform is secure and effective.
Quantum key based burst confidentiality in optical burst switched networks.
Balamurugan, A M; Sivasubramanian, A
2014-01-01
Optical burst switching (OBS) is an emerging technology that could achieve a feasible network in the future. OBS networks are endowed with the ability to meet the bandwidth requirements of applications that demand intensive bandwidth. More domains are opening up in OBS that evidently show its advantages and its capability to handle future network traffic. However, the concept of OBS is still far from perfect, facing issues of security threats. The transfer of the optical switching paradigm to optical burst switching faces serious problems in the fields of burst aggregation, routing, authentication, dispute resolution, and quality of service (QoS). This paper deals with employing RC4 (a stream cipher) to encrypt and decrypt bursts, thereby ensuring the confidentiality of the burst. Although the use of the AES algorithm has already been proposed for the same issue, a comparison of the two algorithms on burst encryption and decryption time and end-to-end delay showed that RC4 provided better results. This paper thus looks to provide a better solution for the confidentiality of bursts in OBS networks.
NASA Astrophysics Data System (ADS)
Gunn, Lachlan J.; Chappell, James M.; Allison, Andrew; Abbott, Derek
2014-09-01
While information-theoretic security is often associated with the one-time pad and quantum key distribution, noisy transport media leave room for classical techniques and even covert operation. Transit times across the public internet exhibit a degree of randomness, and cannot be determined noiselessly by an eavesdropper. We demonstrate the use of these measurements for information-theoretically secure communication over the public internet.
Block cipher based on modular arithmetic and methods of information compression
NASA Astrophysics Data System (ADS)
Krendelev, S.; Zbitnev, N.; Shishlyannikov, D.; Gridin, D.
2017-10-01
The article describes a new block cipher. Due to the heightened interest in Big Data, the described cipher is used to encrypt large volumes of data in cloud storage services. The main advantages of the cipher are its ease of implementation and the possibility of probabilistic encryption: the encryption of a text will differ even when the key and the data are the same, so the strength of the encryption is improved. Additionally, the size of the ciphered message can hardly be predicted.
NASA Astrophysics Data System (ADS)
Fraser, Gordon
2006-04-01
Introduction Gordon Fraser; Part I. Matter and the Universe: 1. Cosmology Wendy Freedman and Rocky Kolb; 2. Gravity Ronald Adler; 3. Astrophysics Arnon Dar; 4. Particles and the standard model Chris Quigg; 5. Superstrings Michael Green; Part II. Quantum Matter: 6. Atoms and photons Claude Cohen-Tannoudji and Jean Dalibard; 7. The quantum world of ultra-cold atoms Christopher Foot and William Phillips; 8. Superfluidity Henry Hall; 9. Quantum phase transitions Subir Sachdev; Part III. Quanta in Action: 10. Quantum entanglement Anton Zeilinger; 11. Quanta, ciphers and computers Artur Ekert; 12. Small-scale structure and nanoscience Yoseph Imry; Part IV. Calculation and Computation: 13. Nonlinearity Henry Abarbanel; 14. Complexity Antonio Politi; 15. Collaborative physics, e-science and the grid Tony Hey and Anne Trefethen; Part V. Science in Action: 16. Biophysics Cyrus Safinya; 17. Medical physics Nicolaj Pavel; 18. Physics and materials Robert Cahn; 19. Physics and society Ugo Amaldi.
Cryptographic Properties of Monotone Boolean Functions
2016-01-01
Keywords: spectrum, algebraic immunity. MSC 2010: 06E30, 94C10, 94A60, 11T71, 05E99. Communicated by: Carlo Blundo. The paper studies cryptographic properties of monotone Boolean functions; the introduction recalls that F_2 is the prime field of characteristic 2 and that any f ∈ B_n can be expressed in algebraic normal form (ANF). A cited reference is "Algebraic attacks on stream ciphers with linear feedback," in: Advances in Cryptology (Eurocrypt 2003), Lecture Notes in Computer Science 2656, Springer, Berlin.
Pseudo-Random Number Generator Based on Coupled Map Lattices
NASA Astrophysics Data System (ADS)
Lü, Huaping; Wang, Shihong; Hu, Gang
A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generations. This pseudo-random number generator system can be used as ideal synchronous and self-synchronizing stream cipher systems for secure communications.
Optical design of cipher block chaining (CBC) encryption mode by using digital holography
NASA Astrophysics Data System (ADS)
Gil, Sang Keun; Jeon, Seok Hee; Jung, Jong Rae; Kim, Nam
2016-03-01
We propose an optical design of cipher block chaining (CBC) encryption using a digital holographic technique, which has higher security than the conventional electronic method because of the analog-type randomized cipher text with a 2-D array. In this paper, an optical design of the CBC encryption mode is implemented by a 2-step quadrature phase-shifting digital holographic encryption technique using orthogonal polarization. A block of plain text is encrypted with the encryption key by applying 2-step phase-shifting digital holography, and it is changed into cipher-text blocks, which are digital holograms. These ciphered digital holograms carrying the encrypted information are Fourier transform holograms and are recorded on CCDs with intensities quantized to 256 gray levels. The decryption is computed from these encrypted digital holograms of the cipher texts, the same encryption key, and the previous cipher text. Results of computer simulations are presented to verify that the proposed method is feasible for a highly secure CBC encryption system.
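For reference, the CBC chaining rule that the holographic design implements optically is C_i = E_K(P_i ⊕ C_{i-1}). A minimal digital sketch follows, with a toy XOR "block cipher" standing in for the encryption step (illustration only; it is not secure and is not the paper's optical pipeline):

```python
import os

BS = 16                                   # block size in bytes

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # stand-in for a real block cipher (XOR is NOT secure; illustration only)
    return bytes(b ^ k for b, k in zip(block, key))

toy_block_decrypt = toy_block_encrypt     # XOR is its own inverse

def cbc_encrypt(plain: bytes, key: bytes, iv: bytes) -> bytes:
    prev, out = iv, bytearray()
    for i in range(0, len(plain), BS):
        block = plain[i:i + BS].ljust(BS, b"\0")         # zero-pad final block
        prev = toy_block_encrypt(bytes(p ^ v for p, v in zip(block, prev)), key)
        out += prev                                      # C_i = E_K(P_i xor C_{i-1})
    return bytes(out)

def cbc_decrypt(cipher: bytes, key: bytes, iv: bytes) -> bytes:
    prev, out = iv, bytearray()
    for i in range(0, len(cipher), BS):
        c = cipher[i:i + BS]
        out += bytes(p ^ v for p, v in zip(toy_block_decrypt(c, key), prev))
        prev = c                                         # P_i = D_K(C_i) xor C_{i-1}
    return bytes(out)

key, iv = os.urandom(BS), os.urandom(BS)
msg = b"optical CBC encryption demo!"
assert cbc_decrypt(cbc_encrypt(msg, key, iv), key, iv).rstrip(b"\0") == msg
```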
Ganzua: A Cryptanalysis Tool for Monoalphabetic and Polyalphabetic Ciphers
ERIC Educational Resources Information Center
Garcia-Pasquel, Jesus Adolfo; Galaviz, Jose
2006-01-01
Many introductory courses to cryptology and computer security start with or include a discussion of classical ciphers that usually contemplates some cryptanalysis techniques used to break them. Ganzua (picklock in Spanish) is an application designed to assist the cryptanalysis of ciphertext obtained with monoalphabetic or polyalphabetic ciphers.…
Viswanathan, P; Krishna, P Venkata
2014-05-01
Teleradiology allows the transmission of medical images for clinical data interpretation to provide improved e-health care access, delivery, and standards. Remote transmission raises various ethical and legal issues, such as image retention, fraud, privacy, and malpractice liability. A joint FED watermarking system, that is, a joint fingerprint/encryption/dual watermarking system, is proposed to address these issues. The system combines a region-based substitution dual watermarking algorithm using spatial fusion, a stream cipher algorithm using a symmetric key, and a fingerprint verification algorithm using invariants. This paper aims to give access to the outcomes of medical images with confidentiality, availability, integrity, and provenance. The watermarking, encryption, and fingerprint enrollment are conducted jointly at the protection stage, so that extraction, decryption, and verification can be applied independently. The dual watermarking system, introducing two different embedding schemes, one for patient data and the other for fingerprint features, reduces the difficulty of maintaining multiple documents such as authentication data, personnel and diagnosis data, and medical images. The spatial fusion algorithm, which determines the embedding region using a threshold from the image to embed the encrypted patient data, follows the exact rules of fusion, resulting in better quality than other fusion techniques. The four-step stream cipher algorithm using a symmetric key for encrypting the patient data, together with the fingerprint verification system using algebraic invariants, improves the robustness of the medical information. The proposed scheme was evaluated for security and quality on DICOM medical images and performed well in terms of attack resistance, quality index, and imperceptibility.
A secure transmission scheme of streaming media based on the encrypted control message
NASA Astrophysics Data System (ADS)
Li, Bing; Jin, Zhigang; Shu, Yantai; Yu, Li
2007-09-01
As the use of streaming media applications has increased dramatically in recent years, streaming media security has become an important concern for protecting privacy. This paper proposes a new encryption scheme that takes into account the characteristics of streaming media and the disadvantages of existing methods: encrypt the control messages in the streaming media at a high security level, and permute and confuse the non-control data according to the corresponding control messages. Here, the so-called control message refers to the key data of the streaming media, including the streaming media header, the headers of the video frames, and the seed key. We encrypt the control messages using a public-key encryption algorithm that provides a high security level, such as RSA. At the same time, we use the seed key to generate a key stream, from which the permutation list P corresponding to each GOP (group of pictures) is derived. The plaintext of the non-control messages is XORed with the key stream to produce an intermediate ciphertext, which is then permuted according to P. The decryption process is the inverse of the above. We have set up a testbed for this scheme and found it to be six to eight times faster than the conventional method. It can be applied not only between PCs but also between handheld devices.
Using the Hill Cipher to Teach Cryptographic Principles
ERIC Educational Resources Information Center
McAndrew, Alasdair
2008-01-01
The Hill cipher is the simplest example of a "block cipher," which takes a block of plaintext as input, and returns a block of ciphertext as output. Although it is insecure by modern standards, its simplicity means that it is well suited for the teaching of such concepts as encryption modes, and properties of cryptographic hash functions. Although…
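For reference, a minimal sketch of the Hill cipher itself (a textbook 2×2 version over Z_26; the key matrix is a common textbook example whose determinant, 9, is coprime with 26, so the cipher is invertible):

```python
A = ord("A")

def hill_encrypt(plain: str, key=((3, 3), (2, 5))) -> str:
    """2x2 Hill cipher over Z_26; det(key) = 9 is coprime with 26, so invertible."""
    nums = [ord(c) - A for c in plain.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord("X") - A)                     # pad the final block
    out = []
    for i in range(0, len(nums), 2):
        p0, p1 = nums[i], nums[i + 1]
        out.append((key[0][0] * p0 + key[0][1] * p1) % 26)
        out.append((key[1][0] * p0 + key[1][1] * p1) % 26)
    return "".join(chr(n + A) for n in out)

print(hill_encrypt("HELP"))   # -> "HIAT" with the textbook key above
```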
Information Security Scheme Based on Computational Temporal Ghost Imaging.
Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing
2017-08-09
An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream. The cipher text is obtained by summing the weighted encryption key. The decryption can be realized by a correlation measurement between the encrypted information and the encryption key. Due to the intrinsic high-level randomness of the key, the security of this method is strongly guaranteed. The feasibility of this method and its robustness against both occlusion and additional noise attacks are discussed with simulations.
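A minimal numerical sketch of the scheme as described, with illustrative sizes (64 time bins, 64×64 patterns): encryption sums the key patterns weighted by the data bits, and decryption recovers each bit by correlating the ciphertext with the corresponding key pattern:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 64, 64, 64                                   # time bins, pattern size
data = rng.integers(0, 2, T).astype(float)             # 1D binary data stream
keys = rng.integers(0, 2, (T, H, W)).astype(float)     # 2D random binary patterns

# encryption: ciphertext = sum over time of (data bit) x (key pattern)
cipher = np.tensordot(data, keys, axes=1)              # shape (H, W)

# decryption: correlate the ciphertext with each key pattern
est = np.array([(cipher * keys[t]).mean() - cipher.mean() * keys[t].mean()
                for t in range(T)])
recovered = (est > est.mean()).astype(float)           # threshold back to bits
print("bit errors:", int(np.abs(recovered - data).sum()))
```

With enough pixels per pattern, the cross-correlation noise between independent keys averages out, which is also why the scheme tolerates occlusion and added noise.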
Pseudo-random bit generator based on lag time series
NASA Astrophysics Data System (ADS)
García-Martínez, M.; Campos-Cantón, E.
2014-12-01
In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we use a delay in the generation of the time series. When these new series are mapped x_n against x_{n+1}, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
Defense in Depth Added to Malicious Activities Simulation Tools (MAST)
2015-09-01
The TLS handshake is a combination of three components: handshake, change cipher spec, and alert. The handshake, specifically the "Hello" portion, is designed to negotiate the session parameters (the cipher suite): with the Client Hello message, the client informs the server of the protocols and standards it supports, and the server then selects the highest common protocols and standards.
Affine Equivalence and Constructions of Cryptographically Strong Boolean Functions
2013-09-01
Processing information in a secure manner is crucial for today's global citizen; we want our financial transactions over the Internet to be processed without error. An asymmetric cipher uses different keys for the encryption and decryption processes. Depending on how a symmetric cipher processes a message before encryption or decryption, it can be further classified as a block or stream cipher.
NASA Astrophysics Data System (ADS)
Basri, M.; Mawengkang, H.; Zamzami, E. M.
2018-03-01
Limited storage is one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important, and one way to maintain them is to use cryptographic techniques. The Data Encryption Standard (DES) is a block cipher algorithm used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the 8 cipher blocks are converted into 8 random images using the Least Significant Bit (LSB) algorithm, which embeds the result of the DES cipher so that the images can later be merged into one.
Makkar, Steve R; Howe, Megan; Williamson, Anna; Gilham, Frances
2016-12-01
There is a need to develop innovations that can help bridge the gap between research and policy. Web CIPHER is an online tool designed to help policymakers better engage with research in order to increase its use in health policymaking. The aim of the present study was to test interventions intended to increase policymakers' usage of Web CIPHER, namely, posting articles and blogs on topics relevant to the missions and scope of selected policy agencies in the Web CIPHER community. Five policy agencies were targeted for the intervention. Web CIPHER usage data were gathered over a 30-month period using Google Analytics. Time series analysis was used to evaluate whether the publication of tailored articles and blogs led to significant changes in usage for all Web CIPHER members from policy agencies, including those from the five target agencies. We further evaluated whether these users showed greater increases in usage following the publication of articles and blogs directly targeted at their agency, and whether these effects were moderated by the blog author. Web CIPHER usage gradually increased over time and was significantly predicted by the number of articles, but not blogs, posted throughout the study period. Publication of articles on sexual and reproductive health was followed by sustained increases in usage among all users, including users from the policy agency that targets this area. This effect of topic relevance did not occur for the four remaining target agencies. Finally, page views were higher for articles targeted at one's own agency than for those targeted at other agencies. This effect also occurred for blogs, particularly when the author was internal to one's agency. The findings suggest that Web CIPHER usage in general was motivated by general interest, engagement and appeal, rather than by the agency specificity of content and work relevance. Blogs in and of themselves may not be effective at promoting usage. Thus, to increase policymakers' engagement with research through similar online platforms, a potentially effective approach would be to post abundant, frequently updated, engaging, interesting and widely appealing content, irrespective of form.
From Greeks to Today: Cipher Trees and Computer Cryptography.
ERIC Educational Resources Information Center
Grady, M. Tim; Brumbaugh, Doug
1988-01-01
Explores the use of computers for teaching mathematical models of transposition ciphers. Illustrates the ideas, includes activities and extensions, provides a mathematical model and includes computer programs to implement these topics. (MVL)
NASA Astrophysics Data System (ADS)
Aryanti, Aryanti; Mekongga, Ikhthison
2018-02-01
Data security and confidentiality are among the most important aspects of information systems today. One attempt to secure data is to use cryptography. In this study, a data security system was developed by implementing the Rivest-Shamir-Adleman (RSA) and Vigenère cipher cryptographic algorithms. The research combined the RSA and Vigenère cipher algorithms for document files in Word, Excel, and PDF formats. The application includes the encryption and decryption of data and was created using PHP and MySQL. Data encryption is done on the transmitting side through RSA cryptographic calculations using the public key, followed by the Vigenère cipher algorithm, which also uses a public key. On the receiving side, decryption uses the Vigenère cipher algorithm, still with the public key, and then the RSA cryptographic algorithm with a private key. Test results show that the system can encrypt, decrypt, and transmit files. Tests performed on encryption and decryption with different file sizes show that file size affects the process: the larger the file, the longer the encryption and decryption take.
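For reference, minimal sketches of the two stages being combined, at toy scale (the RSA primes are textbook-small and insecure, and the key material is illustrative):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Classical Vigenere cipher on A-Z; non-letters pass through unchanged."""
    sign, out, j = (-1 if decrypt else 1), [], 0
    for ch in text.upper():
        if not ch.isalpha():
            out.append(ch)
            continue
        shift = ord(key[j % len(key)].upper()) - ord("A")
        out.append(chr((ord(ch) - ord("A") + sign * shift) % 26 + ord("A")))
        j += 1
    return "".join(out)

# textbook RSA on a toy scale (insecure parameters; illustration only)
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                     # modular inverse (Python 3.8+)
m = 1234
assert pow(pow(m, e, n), d, n) == m     # encrypt with (e, n), decrypt with d

ct = vigenere("SECRET DOCUMENT", "KEY")
assert vigenere(ct, "KEY", decrypt=True) == "SECRET DOCUMENT"
```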
Exploring Hill Ciphers with Graphing Calculators.
ERIC Educational Resources Information Center
St. John, Dennis
1998-01-01
Explains how to code and decode messages using Hill ciphers which combine matrix multiplication and modular arithmetic. Discusses how a graphing calculator can facilitate the matrix and modular arithmetic used in the coding and decoding procedures. (ASK)
A Lightweight Protocol for Secure Video Streaming.
Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis
2018-05-14
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
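A minimal sketch of the protocol's authentication idea: a truncated HMAC tag bound to each datagram. The 4-byte sequence header, key size, and 8-byte tag length are assumptions for illustration, not the authors' exact packet format.

```python
import hmac, hashlib, os

# Sketch: HMAC-SHA256 tag embedded alongside each datagram's payload.
# Header layout and tag truncation are illustrative assumptions.
KEY = os.urandom(32)

def protect(seq: int, payload: bytes) -> bytes:
    header = seq.to_bytes(4, "big")
    tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()[:8]
    return header + tag + payload      # connectionless: every packet stands alone

def verify(packet: bytes):
    header, tag, payload = packet[:4], packet[4:12], packet[12:]
    expected = hmac.new(KEY, header + payload, hashlib.sha256).digest()[:8]
    return payload if hmac.compare_digest(tag, expected) else None

pkt = protect(1, b"frame-data")
assert verify(pkt) == b"frame-data"
```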
Implementation of digital image encryption algorithm using logistic function and DNA encoding
NASA Astrophysics Data System (ADS)
Suryadi, MT; Satria, Yudi; Fauzi, Muhammad
2018-03-01
Cryptography is a method to secure information, which may take the form of a digital image. Building on past research, an encryption algorithm using the logistic function and DNA encoding was proposed to raise the security level of chaos-based and DNA-based encryption algorithms. The algorithm uses DNA encoding to map pixel values to DNA bases and scrambles them with DNA addition, DNA complement, and XOR operations. The logistic function serves as the random number generator needed for the DNA complement and XOR operations. Test results show that the PSNR values of the cipher images are 7.98-7.99 dB, the entropy values are close to 8 bits, the histograms of the cipher images are uniformly distributed, and the correlation coefficients of the cipher images are near 0. Thus, the cipher image can be decrypted perfectly, and the encryption algorithm has good resistance to entropy and statistical attacks.
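A sketch of the logistic function acting as the random number generator for the XOR stage; the control parameter r and seed x0 stand in for the paper's secret key, and the DNA base mapping is omitted for brevity.

```python
# Logistic map as keystream generator for the XOR stage; r and x0 are
# illustrative stand-ins for the secret key parameters.
def logistic_bytes(n: int, x: float = 0.631, r: float = 3.99) -> bytes:
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)            # chaotic iteration in (0, 1)
        out.append(int(x * 256) % 256)   # quantize each state to one byte
    return bytes(out)

pixels = bytes([12, 200, 47, 255])       # toy "image" data
key = logistic_bytes(len(pixels))
cipher = bytes(p ^ k for p, k in zip(pixels, key))
plain = bytes(c ^ k for c, k in zip(cipher, key))   # XOR is self-inverse
assert plain == pixels
```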
Electrostatic streaming instability modes in complex viscoelastic quantum plasmas
NASA Astrophysics Data System (ADS)
Karmakar, P. K.; Goutam, H. P.
2016-11-01
A generalized quantum hydrodynamic model is procedurally developed to investigate the electrostatic streaming instability modes in a viscoelastic quantum electron-ion-dust plasma. Compositionally, the inertialess electrons are treated as degenerate quantum particles owing to their large de Broglie wavelengths. In contrast, the inertial ions and dust particulates are treated in the same classical framework of linear (non-Newtonian) viscoelastic fluids. The model considers a dimensionality-dependent Bohmian quantum correction prefactor, γ = [(D - 2)/3D], in the electron quantum dynamics, with D symbolizing the problem dimensionality. Applying a regular Fourier plane-wave analysis around the quasi-neutral hydrodynamic equilibrium, two distinct instabilities are shown to exist. They stem from ion streaming (relative to electrons and dust) and dust streaming (relative to electrons and ions). Their stability is numerically illustrated in judicious parametric windows in both the hydrodynamic and kinetic regimes. The non-trivial influential roles of the relative streams, viscoelasticities, and correction prefactor are analyzed. It is seen that γ acts as a stabilizer for the ion-stream case only. The findings, alongside new entailments as special cases of realistic interest, corroborate well with earlier predictions in plasma situations. Applicability of the analysis to cosmic and astronomical environments of compact dwarf stars is concisely indicated.
ERIC Educational Resources Information Center
Tech Directions, 2011
2011-01-01
Cryptology, or cryptography, is the study of writing and deciphering hidden messages in codes, ciphers, and writings. It is almost as old as writing itself. Ciphers are messages in which letters are rearranged or substituted for other letters or numbers. Codes are messages in which letters are replaced by letter groups, syllables, or sentences.…
Escaping Embarrassment: Face-Work in the Rap Cipher
ERIC Educational Resources Information Center
Lee, Jooyoung
2009-01-01
How do individuals escape embarrassing moments in interaction? Drawing from ethnographic fieldwork, in-depth interviews, and video recordings of weekly street corner ciphers (impromptu rap sessions), this paper expands Goffman's theory of defensive and protective face-work. The findings reveal formulaic and indirect dimensions of face-work. First,…
Design and Test of Pseudorandom Number Generator Using a Star Network of Lorenz Oscillators
NASA Astrophysics Data System (ADS)
Cho, Kenichiro; Miyano, Takaya
We have recently developed a chaos-based stream cipher based on augmented Lorenz equations as a star network of Lorenz subsystems. In our method, the augmented Lorenz equations are used as a pseudorandom number generator. In this study, we propose a new method based on the augmented Lorenz equations for generating binary pseudorandom numbers and evaluate its security using the statistical tests of SP800-22 published by the National Institute of Standards and Technology, in comparison with the performance of other chaotic dynamical models used as binary pseudorandom number generators. We further propose a faster version of the proposed method and evaluate its security using the statistical tests of TestU01 published by L'Ecuyer and Simard.
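A toy bit generator from the classic Lorenz system, to make the idea concrete; the authors' augmented star-network equations are not modeled, and the Euler step, parameters, and threshold quantizer are illustrative choices.

```python
# Toy binary PRNG from the classic Lorenz system (not the augmented
# star-network equations of the paper); all parameters are illustrative.
def lorenz_bits(n: int, x=1.0, y=1.0, z=1.0, dt=0.01,
                sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    bits = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        bits.append(1 if x > 0 else 0)    # crude quantizer; real designs
    return bits                           # post-process and test (SP800-22)

print("".join(map(str, lorenz_bits(64))))
```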
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Rachmawati, D.; Jessica
2018-03-01
This study combines the Trithemius algorithm and the Double Transposition Cipher for file security, implemented as an Android-based application. The parameters examined are the real running time and the complexity value, using files in PDF format. The overall result shows that the complexity of the two algorithms under the super-encryption method is Θ(n²). However, the encryption process using the Trithemius algorithm is much faster than using the Double Transposition Cipher, with processing time linearly proportional to the length of the plaintext and password.
Joint Schemes for Physical Layer Security and Error Correction
ERIC Educational Resources Information Center
Adamo, Oluwayomi
2011-01-01
The major challenges facing resource-constrained wireless devices are error resilience, security and speed. Three joint schemes are presented in this research which could be broadly divided into error correction based and cipher based. The error correction based ciphers take advantage of the properties of LDPC codes and Nordstrom Robinson code. A…
Manticore and CS mode : parallelizable encryption with joint cipher-state authentication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree
2004-10-01
We describe a new mode of encryption with inexpensive authentication, which uses information from the internal state of the cipher to provide the authentication. Our algorithms have a number of benefits: (1) the encryption has properties similar to CBC mode, yet the encipherment and authentication can be parallelized and/or pipelined, (2) the authentication overhead is minimal, and (3) the authentication process remains resistant against some IV reuse. We offer a Manticore class of authenticated encryption algorithms based on cryptographic hash functions, which support variable block sizes up to twice the hash output length and variable key lengths. A proof of security is presented for the MTC4 and Pepper algorithms. We then generalize the construction to create the Cipher-State (CS) mode of encryption that uses the internal state of any round-based block cipher as an authenticator. We provide hardware and software performance estimates for all of our constructions and give a concrete example of the CS mode of encryption that uses AES as the encryption primitive and adds a small speed overhead (10-15%) compared to AES alone.
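A toy illustration of the cipher-state idea only, not the actual Manticore or CS construction: while a toy Feistel cipher encrypts, one intermediate round value per block is folded into a running authentication tag at essentially no extra cost.

```python
# Toy cipher-state authentication (illustrative, not the published CS mode):
# tap an internal Feistel round value into a running tag during encryption.
MASK = (1 << 32) - 1

def round_f(half: int, key: int) -> int:
    return ((half * 2654435761) ^ key) & MASK       # ad-hoc mixing, demo only

def encrypt_block(block: int, keys, tag: int):
    left, right = (block >> 32) & MASK, block & MASK
    for i, k in enumerate(keys):
        left, right = right, left ^ round_f(right, k)   # Feistel round
        if i == len(keys) // 2:
            tag ^= right                   # tap the cipher's internal state
    return (left << 32) | right, tag

keys = [0xA5A5A5A5, 0x3C3C3C3C, 0x0F0F0F0F, 0x96969696]
tag = 0
for block in (0x0123456789ABCDEF, 0xFEDCBA9876543210):
    ct, tag = encrypt_block(block, keys, tag)
    print(hex(ct))
print("tag:", hex(tag))                    # authentication comes along for free
```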
Low frequency waves in streaming quantum dusty plasmas
NASA Astrophysics Data System (ADS)
Rozina, Ch.; Jamil, M.; Khan, Arroj A.; Zeba, I.; Saman, J.
2017-09-01
The influence of quantum effects on the excitation of two instabilities, namely quantum dust-acoustic and quantum dust-lower-hybrid waves, due to the free streaming of ion/dust particles in uniformly magnetized dusty plasmas has been investigated using a quantum hydrodynamic model. We have obtained dispersion relations under particular conditions applied to streaming ions and two counter-streaming dust particle beams at equilibrium and have analyzed the growth rates graphically. We have shown that increasing both the electron number density and the ion streaming speed enhances the instability: a denser plasma with more energetic, faster species yields a larger growth rate of the electrostatic mode. Applications of this work to laboratory as well as space dusty plasmas have been pointed out.
ERIC Educational Resources Information Center
Farag, Mark
2007-01-01
Hill ciphers are linear codes that use as input a "plaintext" vector p→ of size n, which is encrypted with an invertible n x n matrix E to produce a "ciphertext" vector c→ = E · p→. Informally, a near-field is a triple ⟨N; +, *⟩ that…
A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm
NASA Astrophysics Data System (ADS)
Thirer, Nonel
2013-05-01
With the evolution of digital data storage and exchange, it is essential to protect confidential information from unauthorized access. High-performance encryption algorithms have been developed and implemented in software and hardware, and many methods of attacking ciphertext have been developed as well. In recent years, the genetic algorithm has gained much interest in the cryptanalysis of ciphertexts and in encryption ciphers. This paper analyzes the possibility of using the genetic algorithm as a multiple-key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and of using a three-stage pipeline (with four main blocks: input data, AES core, key generator, output data) to provide fast encryption and storage/transmission of large amounts of data.
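A software toy of a genetic algorithm acting as a key-sequence generator; the fitness function, operators, and population sizes are illustrative, and the paper's FPGA pipeline is not modeled.

```python
import random

# Toy genetic algorithm as a key-sequence generator (the FPGA pipeline is
# not modeled); fitness, operators, and sizes are illustrative assumptions.
def fitness(key: bytes) -> int:
    return len(set(key))                         # favor diverse byte values

def evolve(pop_size=20, key_len=16, gens=50) -> bytes:
    pop = [bytes(random.randrange(256) for _ in range(key_len))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, key_len)   # one-point crossover
            child = bytearray(a[:cut] + b[cut:])
            child[random.randrange(key_len)] = random.randrange(256)  # mutate
            children.append(bytes(child))
        pop = parents + children
    return max(pop, key=fitness)

print(evolve().hex())                            # one evolved 128-bit key
```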
Effect of Fourier transform on the streaming in quantum lattice gas algorithms
NASA Astrophysics Data System (ADS)
Oganesov, Armen; Vahala, George; Vahala, Linda; Soe, Min
2018-04-01
All our previous quantum lattice gas algorithms for nonlinear physics have approximated the kinetic energy operator by streaming sequences to neighboring lattice sites. Here, the kinetic energy can be treated to all orders by Fourier transforming the kinetic energy operator with interlaced Dirac-based unitary collision operators. Benchmarking against exact solutions for the 1D nonlinear Schrodinger equation shows an extended range of parameters (soliton speeds and amplitudes) over the Dirac-based near-lattice-site streaming quantum algorithm.
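A sketch of the all-orders kinetic-energy treatment for the 1D nonlinear Schrodinger benchmark named above, via the standard split-step Fourier method; the grid size, time step, initial pulse, and sign conventions are illustrative assumptions rather than the authors' quantum lattice gas setup.

```python
import numpy as np

# Split-step sketch: the kinetic operator is applied exactly (to all orders)
# in Fourier space, contrasting with near-lattice-site streaming.
N, L, dt, g = 256, 40.0, 1e-3, -1.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
psi = 2.0 / np.cosh(2.0 * x) * np.exp(0.5j * x)     # soliton-like initial pulse

for _ in range(1000):
    psi *= np.exp(-1j * dt * g * np.abs(psi) ** 2)  # nonlinear (local) step
    psi = np.fft.ifft(np.exp(-1j * dt * k ** 2) * np.fft.fft(psi))  # exact kinetic step

print(np.sum(np.abs(psi) ** 2) * (L / N))           # norm stays conserved
```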
NASA Astrophysics Data System (ADS)
Rachmawati, D.; Budiman, M. A.; Atika, F.
2018-03-01
Data security is becoming one of the most significant challenges in the digital world. Retrieval of data by unauthorized parties harms the owner of the data, and PDF files are also susceptible to such security breaches. Solving this problem requires a method of protecting the data, such as cryptography. Among the many cryptographic algorithms, the Two Square Cipher is a symmetric algorithm; in this research it is extended to a 16×16 key so that it can accept a wider range of plaintexts. For further security it is combined with the VMPC algorithm, also a symmetric algorithm; the combination of the two algorithms is called super-encryption. The data can then be stored on a mobile phone, allowing users to secure data flexibly and access it anywhere. The PDF document security application in this research is built on the Android platform. The study also calculates algorithm complexity and processing time. Test results show that the complexity is Θ(n) for the Two Square Cipher and Θ(n) for the VMPC algorithm, so the complexity of the super-encryption is also Θ(n). The VMPC algorithm runs faster than the Two Square Cipher, and processing time is directly proportional to the length of the plaintext and password.
Computer network defense through radial wave functions
NASA Astrophysics Data System (ADS)
Malloy, Ian J.
The purpose of this research is to synthesize basic and fundamental findings in quantum computing as applied to the attack and defense of conventional computer networks. The concept focuses on the use of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and implementing non-trivial mitigation of it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions, in that they are severely difficult for a civilian to avoid triggering given their unknown positions. Thus, understanding the logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts, in certain respects, using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling of radio wave propagation against an event with unknown parameters. The result is a program relying on the artificial-intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.
Dynamics of streaming instability with quantum correction
NASA Astrophysics Data System (ADS)
Goutam, H. P.; Karmakar, P. K.
2017-05-01
A modified quantum hydrodynamic model (m-QHD) is herein proposed on the basis of the Thomas-Fermi (TF) theory of many-fermion quantum systems to investigate the dynamics of electrostatic streaming instability modes in a complex (dusty) quantum plasma system. The newly formulated m-QHD, as an amelioration of the existing usual QHD, employs a dimensionality-dependent Bohmian quantum correction prefactor, γ = [(D-2)/3D], in the electron quantum dynamics, where D symbolizes the problem dimensionality under consideration. The normal mode analysis of the coupled structure equations reveals the excitation of two distinct streaming modes associated with the flowing ions (against electrons and dust) and the flowing dust particulates (against the electrons and ions). It is mainly shown that the γ-factor introduces a new source of stability and dispersive effects to the ion-streaming instability solely, but not to its dust counterparts. A non-trivial application of our investigation in electrostatic beam-plasma (flow-driven) coupled dynamics, leading to the development of self-sustained intense electric current and hence of strong magnetic field in compact astrophysical objects (dwarf-family stars), is summarily indicated.
Connection between the two branches of the quantum two-stream instability across the k space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bret, A.; Haas, F.
2010-05-15
The stability of two quantum counterstreaming electron beams is investigated within the quantum plasma fluid equations for arbitrarily oriented wave vectors k. The analysis reveals that the two quantum two-stream unstable branches are indeed connected by a continuum of unstable modes with oblique wave vectors. Using the longitudinal approximation, the stability domain for any k is analytically explained, together with the growth rate.
Two stream instability in n-type gallium arsenide semiconductor quantum plasma
NASA Astrophysics Data System (ADS)
Ghosh, S.; Muley, Apurva
2018-01-01
Using the quantum hydrodynamic model, we derive a generalized dielectric response function for the two-stream instability (convective only) in n-type gallium arsenide semiconductor plasma. We investigate the phase and amplification profiles of the two-stream instability with externally applied electric fields ranging from 2600 to 4000 kV m-1 in the presence of the non-dimensional quantum parameter H. In this range, the number of electrons in the satellite valley becomes comparable to the number of electrons in the central valley. The presence of quantum corrections in the plasma medium induces two novel modes, one of which is amplifying in nature and propagates in the forward direction. It also modifies the spectral profile of the four pre-existing modes in classical plasma. The existence of the two-stream instability is also established analytically by deriving the real part of the longitudinal electrokinetic power flow density.
Nonquadratic Variation of the Blum Blum Shub Pseudorandom Number Generator
2016-09-01
Cryptography is essential for secure online communications. Many different types of ciphers are implemented in modern-day cryptography, but they all have one common factor: all ciphers require a source of randomness, which makes them unpredictable. One such source of this…
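For reference, a minimal Blum Blum Shub generator follows the rule x_{i+1} = x_i² mod M with M = pq and p ≡ q ≡ 3 (mod 4); the tiny primes and seed below are for demonstration only, since real deployments use primes hundreds of digits long.

```python
# Minimal Blum Blum Shub generator: square-and-reduce, emit the low bit.
P, Q = 499, 547            # Blum primes, both 3 (mod 4); toy sizes only
M = P * Q

def bbs_bits(seed: int, n: int):
    x = seed % M
    out = []
    for _ in range(n):
        x = (x * x) % M
        out.append(x & 1)  # emit the least significant bit of each state
    return out

print(bbs_bits(seed=7919, n=32))
```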
HEaDS-UP Phase IV Assessment: Headgear Effects on Auditory Perception
2015-02-01
Fig. 6: Average attenuation measured for the CIPHER and INTERCPT helmets as a function of noise level and mandible/eyewear configuration… impulsive noise consistent with the US Occupational Safety and Health Administration (OSHA 1981), the National Institute for Occupational Safety and… Results (with mandible, eyewear, or HPDs) (Fig. 5) show that the CIPHER and INTERCPT compared favorably with the currently fielded advanced combat helmet (ACH).
Entanglement-assisted quantum convolutional coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilde, Mark M.; Brun, Todd A.
2010-04-15
We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.
Field performance of timber bridges. 17, Ciphers stress-laminated deck bridge
James P. Wacker; James A. Kainz; Michael A. Ritter
In September 1989, the Ciphers bridge was constructed within the Beltrami Island State Forest in Roseau County, Minnesota. The bridge superstructure is a two-span continuous stress-laminated deck that is approximately 12.19 m long, 5.49 m wide, and 305 mm deep (40 ft long, 18 ft wide, and 12 in. deep). The bridge is one of the first to utilize red pine sawn lumber for...
DNA-based cryptographic methods for data hiding in DNA media.
Marwan, Samiha; Shawish, Ahmed; Nagaty, Khaled
2016-12-01
Information security can be achieved using cryptography, steganography, or a combination of the two, where data is first encrypted using any of the available cryptographic techniques and then hidden in a hiding medium. Recently, genomic DNA has been introduced as a hiding medium, known as DNA steganography, due to its notable ability to hide huge data sets with a high level of randomness and hence security. Despite the numerous cryptographic techniques, to our knowledge only the Vigenere cipher and the DNA-based Playfair cipher have been combined with DNA steganography, which leaves room for investigating other techniques and developing new improvements. This paper presents a comprehensive analysis of the DNA-based Playfair, Vigenere, RSA, and AES ciphers, each combined with a DNA hiding technique. The analysis reports the performance of each combined technique in terms of security, speed, and hiding capacity, in addition to both key size and data size. Moreover, this paper proposes a modification of the current combined DNA-based Playfair cipher technique, which is not only simple and fast but also provides significantly higher hiding capacity and security. Extensive experimental studies confirm this performance in comparison with all the discussed combined techniques.
Chaos based video encryption using maps and Ikeda time delay system
NASA Astrophysics Data System (ADS)
Valli, D.; Ganesan, K.
2017-12-01
Chaos-based cryptosystems are an efficient approach to fast, highly secure multimedia encryption because of their elegant features, such as randomness, mixing, ergodicity, and sensitivity to initial conditions and control parameters. In this paper, two chaos-based cryptosystems are proposed: one is a higher-dimensional 12D chaotic map and the other is based on the Ikeda delay differential equation (DDE), both suitable for designing a real-time secure symmetric video encryption scheme. These encryption schemes employ a substitution box (S-box) to diffuse the relationship between pixels of the plain video and the cipher video, along with diffusion of the current input pixel with the previous cipher pixel, called cipher block chaining (CBC). The proposed method enhances robustness against statistical, differential, and chosen/known plaintext attacks. Detailed analysis is carried out to demonstrate the security and uniqueness of the proposed scheme.
Studies and simulations of the DigiCipher system
NASA Technical Reports Server (NTRS)
Sayood, K.; Chen, Y. C.; Kipp, G.
1993-01-01
During this period we continued developing simulators for the various high definition television (HDTV) systems proposed to the FCC. The FCC has indicated that it wants the various proposers to collaborate on a single system; based on all available information, this system will look very much like the advanced digital television (ADTV) system, with major contributions only from the DigiCipher system. The results of our simulations of the DigiCipher system are described. The simulator was tested using test sequences from the MPEG committee, and the results are extrapolated to HDTV video sequences. Once again, some caveats are in order: the sequences used for testing the simulator and generating the results are those used for testing the MPEG algorithm. These sequences are of much lower resolution than HDTV sequences would be, so the extrapolations are not totally accurate; one would expect significantly higher compression, in terms of bits per pixel, with higher-resolution sequences. However, the simulator itself is valid, and should HDTV sequences become available, they could be used directly with it. A brief overview of the DigiCipher system is given, and some coding results obtained using the simulator are examined and compared to those obtained using the ADTV system. These results are evaluated in the context of the CCSDS specifications, and we make some suggestions as to how the DigiCipher system could be implemented in the NASA network. Simulations such as the ones reported can be biased depending on the particular source sequence used; to get more complete information about the system, one needs a reasonable set of models that mirror the various kinds of sources encountered during video coding. A set of models which can be used to effectively model the various possible scenarios is provided. As this is somewhat tangential to the other work reported, the results are included as an appendix.
NASA Astrophysics Data System (ADS)
Lee, Myoung-Jae; Jung, Gwanyong; Jung, Young-Dae
2018-05-01
The dispersion relation for waves propagating on the surface of a bounded quantum plasma, with consideration of electron spin-current and ion streaming, is derived and numerically investigated. We find that one of the real parts of the wave frequency shows branching behavior beyond the instability domains; in the region where the frequency branching occurs, the waves exhibit a purely propagating mode. The resonant instability has also been investigated. We find that when the phase velocity of the wave is close to the ion-stream velocity, the wave becomes unstable. However, the resonant growth rate is remarkably reduced by the effect of electron spin-current. The growth rate also decreases with either a reduction of the ion-stream velocity or an increase in the quantum wavelength. Thus, the quantum effect, in terms of the quantum wave number, is found to suppress the resonant instability. It is also found that an increase in Fermi energy can reduce the growth rate of the resonant wave in the quantum plasma.
Mobile Assisted Security in Wireless Sensor Networks
2015-08-03
…server from Google's DNS; the Chromecast and the content server perform the 3-way TCP handshake, which is followed by the Client Hello and Server Hello TLS messages… utilized TLS v1.2, except the NTP servers and Google's DNS server. In the TLS v1.2 handshake, the client and server exchange Client Hello and Server Hello messages in order. In the Client Hello message, the client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm…
Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2010-06-21
We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system.
Possibilities and testing of CPRNG in block cipher mode of operation PM-DC-LM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacek, Petr; Jasek, Roman; Malanik, David
2016-06-08
This paper discusses the chaotic pseudo-random number generator (CPRNG) used in the block cipher mode of operation called PM-DC-LM, one of the possible sub-versions of the general PM mode. The design of PM-DC-LM itself is not discussed here, as it is covered in other papers; the focus is on the CPRNG as a part of it. Possibilities for changing or improving the CPRNG are mentioned. The final part is devoted to preliminary testing of the CPRNG, and some test data are shown.
Multicollision attack on CBC-MAC, EMAC, and XCBC-MAC of AES-128 algorithm
NASA Astrophysics Data System (ADS)
Brolin Sihite, Alfonso; Hayat Susanti, Bety
2017-10-01
A Message Authentication Code (MAC) can be constructed from a block cipher algorithm; CBC-MAC, EMAC, and XCBC-MAC are some such MAC schemes. In this paper, we mount a multicollision attack on the CBC-MAC, EMAC, and XCBC-MAC constructions using the AES-128 block cipher as the basic primitive. The attack utilizes the concept of existential forgery on CBC-MAC. The results show that multicollisions can be obtained easily in the CBC-MAC, EMAC, and XCBC-MAC constructions.
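A sketch of the basic CBC-MAC construction over AES-128 that the attack targets, together with the textbook existential-forgery property that multicollision methods build on; the zero key and messages are toy values.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Plain CBC-MAC over AES-128: the tag is the last CBC block under a zero IV.
KEY = bytes(16)                            # toy key for the demo

def cbc_mac(msg: bytes) -> bytes:
    assert len(msg) % 16 == 0              # block-aligned messages only
    enc = Cipher(algorithms.AES(KEY), modes.ECB()).encryptor()
    tag = bytes(16)                        # zero IV
    for i in range(0, len(msg), 16):
        block = bytes(a ^ b for a, b in zip(tag, msg[i:i + 16]))
        tag = enc.update(block)            # chain through the block cipher
    return tag

# Existential forgery: from MAC(m1) one predicts the MAC of the two-block
# message m1 || (m1 XOR tag) without knowing the key.
m1 = b"A" * 16
t1 = cbc_mac(m1)
m2 = m1 + bytes(a ^ b for a, b in zip(t1, m1))
assert cbc_mac(m2) == t1
```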
Three-pass protocol scheme for bitmap image security by using vernam cipher algorithm
NASA Astrophysics Data System (ADS)
Rachmawati, D.; Budiman, M. A.; Aulya, L.
2018-02-01
Confidentiality, integrity, and efficiency are crucial aspects of data security. Among digital data, image data is especially prone to abuses such as unauthorized duplication and modification. One data security technique is cryptography. The security of the Vernam Cipher algorithm depends heavily on the key exchange process: if the key is leaked, the security of the algorithm collapses. Therefore, a method that minimizes key leakage during message exchange is required. The method used here is known as the Three-Pass Protocol, which enables message delivery without any key exchange, so messages can reach the receiver safely without fear of key leakage. The system is built using the Java programming language. The test materials are images of size 200×200, 300×300, 500×500, 800×800, and 1000×1000 pixels. The experiments showed that the Vernam Cipher algorithm in the Three-Pass Protocol scheme could restore the original image.
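A runnable sketch of the three-pass message flow with Vernam-style XOR masking, as in the paper's scheme; the payload and key names are illustrative.

```python
import os

# Three-pass flow: each party applies and later removes its own mask,
# so no key is ever transmitted.
def xor(x: bytes, y: bytes) -> bytes:
    return bytes(p ^ q for p, q in zip(x, y))

msg = b"bitmap-pixels"            # stands in for the image payload
a = os.urandom(len(msg))          # sender's private key, never transmitted
b = os.urandom(len(msg))          # receiver's private key, never transmitted

pass1 = xor(msg, a)               # sender -> receiver: m XOR a
pass2 = xor(pass1, b)             # receiver -> sender: m XOR a XOR b
pass3 = xor(pass2, a)             # sender -> receiver: m XOR b
assert xor(pass3, b) == msg       # receiver removes b and recovers m
# Caveat: XOR-ing all three intercepted passes also yields m, so a passive
# eavesdropper defeats this particular commutative cipher choice.
```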
Attack to AN Image Encryption Based on Chaotic Logistic Map
NASA Astrophysics Data System (ADS)
Wang, Xing-Yuan; Chen, Feng; Wang, Tian; Xu, Dahai; Ma, Yutian
2013-10-01
This paper offers two different attacks on a recently proposed image encryption scheme based on the chaotic logistic map. The cryptosystem under study uses an 80-bit secret key and two chaotic logistic maps, with the initial conditions of the logistic maps derived from the secret key by assigning different weights to its bits. In the encryption process, eight different procedures are used to encrypt the pixels of an image; which procedure is applied to a given pixel is determined by the output of the logistic map. The secret key is revised after encrypting each block of 16 pixels of the image. The encryption process has weaknesses, the worst of which is that every byte of plaintext is substituted independently, so the ciphertext of a byte does not change even when the other bytes change. As a result of this weakness, a chosen-plaintext attack and a chosen-ciphertext attack can be carried out without any knowledge of the key to recover the ciphered image.
Quantum cryptography using coherent states: Randomized encryption and key generation
NASA Astrophysics Data System (ADS)
Corndorf, Eric
With the advent of the global optical-telecommunications infrastructure, an increasing number of individuals, companies, and agencies communicate information with one another over public networks or physically-insecure private networks. While the majority of the traffic flowing through these networks requires little or no assurance of secrecy, the same cannot be said for certain communications between banks, between government agencies, within the military, and between corporations. In these arenas, the need to specify some level of secrecy in communications is a high priority. While the current approaches to securing sensitive information (namely the public-key-cryptography infrastructure and deterministic private-key ciphers like AES and 3DES) seem to be cryptographically strong based on empirical evidence, there exist no mathematical proofs of secrecy for any widely deployed cryptosystem. As an example, the ubiquitous public-key cryptosystems infer all of their secrecy from the assumption that factoring of the product of two large primes is necessarily time consuming---something which has not, and perhaps cannot, be proven. Since the 1980s, the possibility of using quantum-mechanical features of light as a physical mechanism for satisfying particular cryptographic objectives has been explored. This research has been fueled by the hopes that cryptosystems based on quantum systems may provide provable levels of secrecy which are at least as valid as quantum mechanics itself. Unfortunately, the most widely considered quantum-cryptographic protocols (BB84 and the Ekert protocol) have serious implementation problems. Specifically, they require quantum-mechanical states which are not readily available, and they rely on unproven relations between intrusion-level detection and the information available to an attacker. As a result, the secrecy level provided by these experimental implementations is entirely unspecified. In an effort to provably satisfy the cryptographic objectives of key generation and direct data-encryption, a new quantum cryptographic principle is demonstrated wherein keyed coherent-state signal sets are employed. Taking advantage of the fundamental and irreducible quantum-measurement noise of coherent states, these schemes do not require the users to measure the influence of an attacker. Experimental key-generation and data encryption schemes based on these techniques, which are compatible with today's WDM fiber-optic telecommunications infrastructure, are implemented and analyzed.
Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.
Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A
2017-04-01
The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry.
Secure chaotic map based block cryptosystem with application to camera sensor networks.
Guo, Xianfeng; Zhang, Jiashu; Khan, Muhammad Khurram; Alghathbar, Khaled
2011-01-01
Recently, Wang et al. presented an efficient logistic map based block encryption system. The encryption system employs feedback ciphertext to achieve plaintext dependence of sub-keys. Unfortunately, we discovered that their scheme is unable to withstand key stream attack. To improve its security, this paper proposes a novel chaotic map based block cryptosystem. At the same time, a secure architecture for camera sensor network is constructed. The network comprises a set of inexpensive camera sensors to capture the images, a sink node equipped with sufficient computation and storage capabilities and a data processing server. The transmission security between the sink node and the server is gained by utilizing the improved cipher. Both theoretical analysis and simulation results indicate that the improved algorithm can overcome the flaws and maintain all the merits of the original cryptosystem. In addition, computational costs and efficiency of the proposed scheme are encouraging for the practical implementation in the real environment as well as camera sensor network.
Architecture of security management unit for safe hosting of multiple agents
NASA Astrophysics Data System (ADS)
Gilmont, Tanguy; Legat, Jean-Didier; Quisquater, Jean-Jacques
1999-04-01
In such growing areas as remote applications in large public networks, electronic commerce, digital signatures, intellectual property and copyright protection, and even operating system extensibility, the hardware security level offered by existing processors is insufficient. They lack protection mechanisms that prevent the user from tampering with critical data owned by those applications. Some devices are exceptions, but they have neither enough processing power nor enough memory to stand up to such applications (e.g. smart cards). This paper proposes a secure processor architecture in which the classical memory management unit is extended into a new security management unit, allowing ciphered code execution and ciphered data processing. An internal permanent memory can store cipher keys and critical data for several client agents simultaneously. The ordinary supervisor privilege scheme is replaced by a privilege inheritance mechanism that is better suited to operating system extensibility. The result is a secure processor that has hardware support for extensible multitask operating systems and can be used for both general applications and critical applications needing strong protection. The security management unit and the internal permanent memory can be added to an existing CPU core without loss of performance, and do not require it to be modified.
William Friedman, Geneticist Turned Cryptographer
Goldman, Irwin L.
2017-01-01
William Friedman (1891–1969), trained as a plant geneticist at Cornell University, was employed at Riverbank Laboratories by the eccentric millionaire George Fabyan to work on wheat breeding. Friedman, however, soon became intrigued by and started working on a pet project of Fabyan’s involving the conjecture that Francis Bacon, a polymath known for the study of ciphers, was the real author of Shakespeare’s plays. Thus, beginning in ∼1916, Friedman turned his attention to the so called “Baconian cipher,” and developed decryption techniques that bore similarity to approaches for solving problems in population genetics. His most significant, indeed pathbreaking, work used ideas from genetics and statistics, focusing on analysis of the frequencies of letters in language use. Although he had transitioned from being a geneticist to a cryptographer, his earlier work had resonance in his later pursuits. He soon began working directly for the United States government and produced solutions used to solve complex military ciphers, in particular to break the Japanese Purple code during World War II. Another important legacy of his work was the establishment of the Signal Intelligence Service and eventually the National Security Agency.
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Cui, Minggen; Wang, Zhu
2009-07-01
The design of a new compound two-dimensional chaotic function is presented by exploiting two one-dimensional chaotic functions that switch randomly; the design is used as a chaotic sequence generator and is proven chaotic by Devaney's definition. The properties of the compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos, selecting one of the two one-dimensional chaotic functions randomly, and a new method of pixel permutation and substitution is designed in detail with random row and column control based on the compound chaos. Results from entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext show that the compound chaotic sequence cipher can resist cryptanalytic, statistical, and brute-force attacks; moreover, it accelerates encryption and achieves a higher level of security. Through dynamical compound chaos and perturbation technology, the paper addresses the low computational precision of one-dimensional chaotic functions.
Mechanisms of iron photoreduction in a metal-rich, acidic stream (St. Kevin Gulch, Colorado, U.S.A.)
Kimball, B.A.; McKnight, Diane M.; Wetherbee, G.A.; Harnish, R.A.
1992-01-01
Iron photoreduction in metal-rich, acidic streams affected by mine drainage accounts for some of the variability in metal chemistry of such streams, producing diel variations in Fe(II). Differentiation of the mechanisms of the Fe photoreduction reaction by a series of in-stream experiments at St. Kevin Gulch, Colorado, indicates that a homogeneous, solution-phase reaction can occur in the absence of suspended particulate Fe and bacteria, and the rate of reaction is increased by the presence of Fe colloids in the stream water. In-stream Fe photoreduction is limited during the diel cycle by the available Fe(III) in the water column and streambed. The quantum yield of Fe(II) was reproducible in diel measurements: the quantum yield, in mol E⁻¹ (from 300 to 400 nm), was 1.4 × 10⁻³ in 1986, 0.8 × 10⁻³ in 1988, and 1.2 × 10⁻³ in 1989, at the same location and under similar streamflow and stream-chemistry conditions. In a photolysis control experiment, there was no detectable production of Fe(II) above background concentrations in stream-water samples that were experimentally excluded from sunlight.
One-Time Pad as a nonlinear dynamical system
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin
2012-11-01
The One-Time Pad (OTP) is the only known unbreakable cipher, proved mathematically by Shannon in 1949. In spite of several practical drawbacks of using the OTP, it continues to be used in quantum cryptography, DNA cryptography and even in classical cryptography when the highest form of security is desired (other popular algorithms like RSA, ECC, AES are not even proven to be computationally secure). In this work, we prove that OTP encryption and decryption are equivalent to finding the initial condition on a pair of binary maps (Bernoulli shift). The binary map belongs to a family of 1D nonlinear chaotic and ergodic dynamical systems known as Generalized Luröth Series (GLS). Having established these interesting connections, we construct other perfect secrecy systems on the GLS that are equivalent to the One-Time Pad, generalizing for larger alphabets. We further show that OTP encryption is related to Randomized Arithmetic Coding - a scheme for joint compression and encryption.
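A compact demonstration of the two halves of the abstract's claim: ordinary XOR OTP encryption, and the doubling (Bernoulli shift) map reproducing pad bits from an initial condition whose binary expansion encodes them. The message and the 8-bit slice are illustrative.

```python
import os

# One-Time Pad demo: XOR with a truly random, single-use pad.
msg = b"ATTACK AT DAWN"
pad = os.urandom(len(msg))
ct = bytes(m ^ k for m, k in zip(msg, pad))
assert bytes(c ^ k for c, k in zip(ct, pad)) == msg

# Doubling map x -> 2x mod 1: iterating from x0 built from the first pad
# byte reproduces that byte's bits, one per iteration (exact for 8 bits).
bits = [(pad[0] >> (7 - i)) & 1 for i in range(8)]
x = sum(b / 2 ** (i + 1) for i, b in enumerate(bits))
recovered = []
for _ in range(8):
    recovered.append(1 if x >= 0.5 else 0)
    x = (2 * x) % 1.0
assert recovered == bits
```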
Quantum stream instability in coupled two-dimensional plasmas
NASA Astrophysics Data System (ADS)
Akbari-Moghanjoughi, M.
2014-08-01
In this paper the quantum counter-streaming instability problem is studied in planar two-dimensional (2D) quantum plasmas using the coupled quantum hydrodynamic (CQHD) model, which incorporates the most important quantum features: the statistical Fermi-Dirac electron pressure, the electron-exchange potential, and the quantum diffraction effect. The instability is investigated for different 2D quantum electron systems using the dynamics of Coulomb-coupled carriers on each plasma sheet, for cases where both plasmas are monolayer doped graphene or both are metal films (corresponding to 2D Dirac or Fermi electron fluids). It is revealed that there are fundamental differences between these two cases regarding the effects of Bohm's quantum potential and the electron exchange on the instability criteria. These differences mark yet another interesting feature of the energy band dispersion of Dirac electrons in graphene. Moreover, the effects of plasma number density and coupling parameter on the instability criteria are shown to be significant. This study is most relevant to low-dimensional graphene-based field-effect-transistor (FET) devices, and it helps in understanding the collective interactions of low-dimensional coupled ballistic conductors and the nanofabrication of future graphene-based integrated circuits.
Encryption and decryption using FPGA
NASA Astrophysics Data System (ADS)
Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.
2017-11-01
In this paper, we perform multiple cryptographic methods on a set of data and compare their outputs, using the AES and RSA algorithms. With the AES algorithm, an 8-bit input (plain text) is encrypted using a cipher key and the result is displayed on Tera Term (serially). For simulation, a 128-bit input is operated on with a 128-bit cipher key to generate the encrypted text; the reverse operations are then performed to obtain the decrypted text. In the RSA algorithm, file handling is used to input the plain text, which is then operated on to produce the encrypted and decrypted data, both stored in a file. Finally, the results of the two algorithms are compared.
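A software analogue of the AES encrypt-then-decrypt flow compared in the paper, using the `cryptography` package; AES-CTR and the 16-byte sample message are illustrative choices, not the authors' FPGA datapath or Tera Term output path.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# 128-bit key, encrypt then decrypt; CTR mode is an illustrative choice.
key = os.urandom(16)                      # 128-bit cipher key
nonce = os.urandom(16)
cipher = Cipher(algorithms.AES(key), modes.CTR(nonce))

enc = cipher.encryptor()
ct = enc.update(b"sixteen byte msg") + enc.finalize()
dec = cipher.decryptor()
assert dec.update(ct) + dec.finalize() == b"sixteen byte msg"
```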
NASA Astrophysics Data System (ADS)
Amalia; Budiman, M. A.; Sitepu, R.
2018-03-01
Cryptography is one of the best methods to keep information safe from security attacks by unauthorized people. Many studies have been done by previous researchers to develop more robust cryptographic algorithms that provide high security for data communication. One way to strengthen data security is the hybrid cryptosystem method, which combines symmetric and asymmetric algorithms. In this study, we observe a hybrid cryptosystem containing the Modified Playfair Cipher 16x16 algorithm as the symmetric algorithm and Knapsack Naccache-Stern as the asymmetric algorithm. We measure the running time of this hybrid algorithm in several experiments, testing messages of 10, 100, 1000, 10000, and 100000 characters and keys of length 10, 20, 30, and 40. The results show that the processing time for encryption and decryption of each algorithm is linearly proportional to input size: the longer the message, the more time is needed to encrypt and decrypt it. The encryption running time of the Knapsack Naccache-Stern algorithm is longer than its decryption, while the encryption running time of the Modified Playfair Cipher 16x16 algorithm is shorter than its decryption.
Alagarsamy, Sumithra; Rajagopalan, S P
2017-01-01
Certificateless-based signcryption overcomes inherent shortcomings of the traditional Public Key Infrastructure (PKI) and the key escrow problem. It provides efficient methods to design PKIs with public verifiability and cipher text authenticity with minimum dependency. As a classic primitive in public key cryptography, signcryption verifies the validity of cipher text without decryption by combining authentication, confidentiality, public verifiability and cipher text authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption called the Complex Conjugate Differential Integrated Factor (CC-DIF) scheme, introducing complex conjugates through the security parameter and improving the secured message distribution rate. However, both the partial private key and the secret value change with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed by setting the private key through a Differential (Diff) Equation using an Integration Factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is therefore proven secure (i.e. improving the secured message distribution rate) against certificateless access control and signcryption-based schemes. In addition, compared with three other existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile networks.
The general dispersion relation of induced streaming instabilities in quantum outflow systems
NASA Astrophysics Data System (ADS)
Mehdian, H.; Hajisharifi, K.; Hasanbeigi, A.
2015-11-01
In this manuscript, the dispersion relations of streaming instabilities have been generalized and studied by using the unique property (neutralization in charge and current by default) of colliding plasma shells. This interesting property of interpenetrating beams enables one to find the general dispersion relations without the restrictions used in previous works in this area. In our previous work [H. Mehdian et al., ApJ. 801, 89 (2015)], employing the plasma shell concept and boost frame method, the general dispersion relation for the filamentation instability was derived in the relativistic classical regime. In this paper, using the above-mentioned concepts, the general dispersion relations (for each of the streaming instabilities: filamentation, two-stream, and multi-stream) in the non-relativistic quantum regime have been derived by employing the quantum fluid equations together with Maxwell's equations. The derived dispersion relations make it possible to describe any arbitrary system of two or three interacting beams, with a justified neutralization condition, by choosing the inertial reference frame embedded on one of the beams. Furthermore, through numerical and analytical study of these dispersion relations, many new features of streaming instabilities (e.g., their cut-off wave numbers and growth rates) in terms of all the involved parameters have been illustrated. The results obtained in this paper can be used to describe many astrophysical systems and laboratory astrophysics settings, such as the collision of non-parallel plasma shells over a background plasma or the collision of three neutralized plasma slabs, and to explain many plasma phenomena such as particle acceleration and induced fields.
An Image Encryption Algorithm Based on Information Hiding
NASA Astrophysics Data System (ADS)
Ge, Xin; Lu, Bin; Liu, Fenlin; Gong, Daofu
To resolve the conflict between security and efficiency in the design of chaotic image encryption algorithms, an image encryption algorithm based on information hiding is proposed, built on the "one-time pad" idea. A random parameter is introduced to ensure a different keystream for each encryption, giving the scheme "one-time pad" characteristics and substantially improving its security without a significant increase in algorithmic complexity. The random parameter is embedded into the ciphered image with information-hiding technology, which avoids a negotiation step for its transport and makes the algorithm easier to apply. Algorithm analysis and experiments show that the algorithm is secure against chosen-plaintext, differential, and divide-and-conquer attacks, and that ciphered images have good statistical properties.
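As a rough illustration of the idea (not the authors' algorithm; the keystream derivation, parameter size, and LSB embedding region below are all assumptions, and the repeating 32-byte keystream is a toy simplification):

    import hashlib, os

    def encrypt_with_hidden_param(img: bytes, master_key: bytes) -> bytes:
        # fresh random parameter: a new keystream for every encryption run
        r = os.urandom(16)
        ks = hashlib.sha256(master_key + r).digest()
        body = bytearray(b ^ ks[i % len(ks)] for i, b in enumerate(img))
        # toy information hiding: write the 128 bits of r into the LSBs of the
        # first 128 cipher bytes (assumes len(img) >= 128; a real scheme
        # would embed reversibly rather than overwrite those LSBs)
        for i in range(128):
            bit = (r[i // 8] >> (7 - i % 8)) & 1
            body[i] = (body[i] & 0xFE) | bit
        return bytes(body)

The receiver re-extracts r from the LSBs and re-derives the keystream, so no separate negotiation for r is needed.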
Bouslimi, D; Coatrieux, G; Roux, Ch
2011-01-01
In this paper, we propose a new joint watermarking/encryption algorithm for verifying the reliability of medical images in both the encrypted and the spatial domain. It combines a substitutive watermarking algorithm, quantization index modulation (QIM), with a block cipher, the Advanced Encryption Standard (AES), in CBC mode of operation. The proposed solution gives access to verification of the image's integrity and origin even while the image is stored encrypted. Experimental results on 8-bit encoded ultrasound images illustrate the overall performance of the proposed scheme. By making use of the AES block cipher in CBC mode, the proposed solution is compliant with, or transparent to, the DICOM standard.
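The AES-CBC half of the construction can be sketched with the pycryptodome API (the QIM watermarking step is omitted; key handling and the image_bytes variable are illustrative only):

    import os
    from Crypto.Cipher import AES
    from Crypto.Util.Padding import pad

    key = os.urandom(16)                      # AES-128 key
    iv = os.urandom(16)                       # fresh IV per image
    cipher = AES.new(key, AES.MODE_CBC, iv)   # block cipher in CBC mode, as in the paper
    encrypted_image = iv + cipher.encrypt(pad(image_bytes, AES.block_size))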
Security Analysis of Some Diffusion Mechanisms Used in Chaotic Ciphers
NASA Astrophysics Data System (ADS)
Zhang, Leo Yu; Zhang, Yushu; Liu, Yuansheng; Yang, Anjia; Chen, Guanrong
As a variant of the substitution-permutation network, the permutation-diffusion structure has received extensive attention in the field of chaotic cryptography over the last three decades. Because of its high implementation speed and its nonlinearity over GF(2), the Galois field of two elements, mixing modulo addition/multiplication with exclusive OR has become very popular in various designs for achieving the desired diffusion effect. This paper reports that some diffusion mechanisms based on modulo addition/multiplication and exclusive OR are not as resistant to plaintext attacks as claimed. By cracking several recently proposed chaotic ciphers as examples, it is demonstrated that a good understanding of the strengths and weaknesses of these crypto-primitives is crucial for designing more practical chaotic encryption algorithms in the future.
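A typical diffusion primitive of the kind analyzed (a generic pattern, not necessarily any one of the cracked ciphers) chains modulo-256 addition with XOR against a keystream:

    def diffuse(plain: bytes, keystream: bytes) -> bytes:
        # c[i] = ((p[i] + c[i-1]) mod 256) XOR k[i] -- mixes two algebraic structures
        out, c_prev = bytearray(), 0
        for p, k in zip(plain, keystream):
            c_prev = ((p + c_prev) % 256) ^ k
            out.append(c_prev)
        return bytes(out)

The paper's point is that such mixes are weaker than claimed: with chosen plaintexts (e.g., an all-zero image), the keystream relation can be peeled off byte by byte.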
Context adaptive binary arithmetic coding-based data hiding in partially encrypted H.264/AVC videos
NASA Astrophysics Data System (ADS)
Xu, Dawen; Wang, Rangding
2015-05-01
A scheme of data hiding directly in a partially encrypted version of H.264/AVC videos is proposed which includes three parts, i.e., selective encryption, data embedding and data extraction. Selective encryption is performed on context adaptive binary arithmetic coding (CABAC) bin-strings via stream ciphers. By careful selection of CABAC entropy coder syntax elements for selective encryption, the encrypted bitstream is format-compliant and has exactly the same bit rate. Then a data-hider embeds the additional data into partially encrypted H.264/AVC videos using a CABAC bin-string substitution technique without accessing the plaintext of the video content. Since bin-string substitution is carried out on those residual coefficients with approximately the same magnitude, the quality of the decrypted video is satisfactory. Video file size is strictly preserved even after data embedding. In order to adapt to different application scenarios, data extraction can be done either in the encrypted domain or in the decrypted domain. Experimental results have demonstrated the feasibility and efficiency of the proposed scheme.
Investigating the structure preserving encryption of high efficiency video coding (HEVC)
NASA Astrophysics Data System (ADS)
Shahid, Zafar; Puech, William
2013-02-01
This paper presents a novel method for the real-time protection of the emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy-coding module of HEVC, which differs significantly from CABAC entropy coding in H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of bin-strings in a context-aware manner. The encrypted bitstream has exactly the same bit rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture, and objects.
ERIC Educational Resources Information Center
Tapson, Frank
1996-01-01
Describes public-key cryptography via the RSA system, which uses two keys: one to put a message into cipher and another to decipher the message. Presents examples using small prime numbers. (MKR)
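A worked toy example in the article's spirit, with deliberately small primes (insecure, for illustration only):

    # RSA with p = 5, q = 11: n = 55, phi = 40; pick e = 3, then d = 27 since 3*27 = 81 = 1 (mod 40)
    n, e, d = 55, 3, 27
    m = 7                       # message
    c = pow(m, e, n)            # encipher: 7^3 mod 55 = 13
    assert pow(c, d, n) == m    # decipher recovers 7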
Protecting privacy in a clinical data warehouse.
Kong, Guilan; Xiao, Zhichun
2015-06-01
Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
NASA Astrophysics Data System (ADS)
Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi
2018-01-01
Cybercrime is one of the most serious threats. One effort to reduce cybercrime is to find new techniques for securing data, such as a combination of cryptography, steganography, and watermarking. Cryptography and steganography together form a growing data-security science, and their combination is one way to improve data integrity. A new technique is obtained by combining several algorithms, in this case the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field; it consists of dots and dashes. This is a new concept combining modern and classical methods to maintain data integrity. The combination of these three methods is expected to yield new algorithms that improve data security, especially for images.
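A minimal sketch of the Hill-cipher component (the paper's actual key matrix and its Morse/steganographic stages are not specified here; this is the classic 2x2 textbook form):

    import numpy as np

    K = np.array([[3, 3], [2, 5]])          # key matrix, invertible mod 26 (det = 9, gcd(9, 26) = 1)

    def hill_encrypt(text: str) -> str:
        nums = [ord(ch) - ord('A') for ch in text]
        out = []
        for i in range(0, len(nums), 2):
            block = np.array(nums[i:i + 2])
            out.extend((K @ block) % 26)    # multiply each digraph by the key matrix mod 26
        return ''.join(chr(int(v) + ord('A')) for v in out)

    print(hill_encrypt("HELP"))             # -> HIAT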
NASA Astrophysics Data System (ADS)
Chai, Xiu-Li; Gan, Zhi-Hua; Lu, Yang; Zhang, Miao-Hui; Chen, Yi-Ran
2016-10-01
Recently, many image encryption algorithms based on chaos have been proposed. Most of the previous algorithms encrypt components R, G, and B of color images independently and neglect the high correlation between them. In the paper, a novel color image encryption algorithm is introduced. The 24 bit planes of components R, G, and B of the color plain image are obtained and recombined into 4 compound bit planes, and this can make the three components affect each other. A four-dimensional (4D) memristive hyperchaotic system generates the pseudorandom key streams and its initial values come from the SHA 256 hash value of the color plain image. The compound bit planes and key streams are confused according to the principles of genetic recombination, then confusion and diffusion as a union are applied to the bit planes, and the color cipher image is obtained. Experimental results and security analyses demonstrate that the proposed algorithm is secure and effective so that it may be adopted for secure communication. Project supported by the National Natural Science Foundation of China (Grant Nos. 61203094 and 61305042), the Natural Science Foundation of the United States (Grant Nos. CNS-1253424 and ECCS-1202225), the Science and Technology Foundation of Henan Province, China (Grant No. 152102210048), the Foundation and Frontier Project of Henan Province, China (Grant No. 162300410196), the Natural Science Foundation of Educational Committee of Henan Province, China (Grant No. 14A413015), and the Research Foundation of Henan University, China (Grant No. xxjc20140006).
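The plaintext-dependent seeding can be sketched as follows (the exact folding of the hash into initial conditions is an assumption; the 4D memristive system itself is not reproduced):

    import hashlib

    def initial_values(plain_image: bytes):
        h = hashlib.sha256(plain_image).digest()
        # fold the 32 digest bytes into four values in [0, 1) for the 4D chaotic system
        return [int.from_bytes(h[8 * i: 8 * (i + 1)], 'big') / 2.0 ** 64 for i in range(4)]

Any change to the plain image changes the hash and hence the key streams, which is what ties the cipher to the plaintext.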
Fast encryption of RGB color digital images using a tweakable cellular automaton based schema
NASA Astrophysics Data System (ADS)
Faraoun, Kamel Mohamed
2014-12-01
We propose a new tweakable construction of block ciphers using second-order reversible cellular automata, and we apply it to encipher RGB-color images. The proposed construction permits parallel encryption of the image content by extending the standard definition of a block cipher to take into account a supplementary parameter used as a tweak (nonce) that controls the behavior of the cipher from one region of the image to another, and hence avoids the need for slow sequential operating modes. The construction defines a flexible pseudorandom permutation that can be used effectively to solve the electronic-codebook problem without a specific sequential mode. Results from various experiments show that the proposed scheme achieves high security and execution performance, and enables an interesting mode of selective area decryption owing to the parallel character of the approach.
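The second-order reversible CA idea can be sketched as follows (an elementary CA is used for brevity; the paper's actual rule, neighborhood, and tweak schedule are not shown):

    def eca_step(cells, rule):
        # one synchronous step of an elementary cellular automaton, periodic boundary
        n = len(cells)
        return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
                for i in range(n)]

    def second_order_step(prev, cur, rule):
        nxt = [a ^ b for a, b in zip(eca_step(cur, rule), prev)]
        return cur, nxt   # reversible: prev = eca_step(cur) XOR nxt, whatever the rule

Because prev is recoverable from (cur, nxt) by the same XOR, any rule yields an invertible permutation, which is what makes the construction usable as a block cipher; the tweak can select the rule per image region.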
Searchable attribute-based encryption scheme with attribute revocation in cloud storage.
Wang, Shangping; Zhao, Duqiao; Zhang, Yaling
2017-01-01
Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control to data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of the two has important applications in cloud storage. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation for cloud storage. The keyword search in our scheme is attribute-based with access control: when a search succeeds, the cloud server returns the corresponding ciphertext to the user, who can then decrypt it. Moreover, our scheme supports multiple-keyword search, which makes it more practical. Under the decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) assumptions in the selective security model, we prove that our scheme is secure.
Single photon quantum cryptography.
Beveratos, Alexios; Brouri, Rosa; Gacoin, Thierry; Villing, André; Poizat, Jean-Philippe; Grangier, Philippe
2002-10-28
We report the full implementation of a quantum cryptography protocol using a stream of single-photon pulses generated by a stable and efficient source operating at room temperature. The single-photon pulses are emitted on demand by a single nitrogen-vacancy color center in a diamond nanocrystal. The quantum bit error rate is less than 4.6% and the secure bit rate is 7700 bits/s. The overall performance of our system reaches a domain where single photons have a measurable advantage over an equivalent system based on attenuated light pulses.
A transverse separate-spin-evolution streaming instability
NASA Astrophysics Data System (ADS)
Iqbal, Z.; Andreev, Pavel A.; Murtaza, G.
2018-05-01
Using the separate-spin-evolution quantum hydrodynamic model, the instability of the transverse mode due to electron streaming in a partially spin-polarized magnetized degenerate plasma is studied. The electron spin polarization gives birth to a new spin-dependent wave (i.e., a separate-spin-evolution streaming-driven ordinary wave) in the real wave spectrum. It is shown that the spin polarization and streaming speed significantly affect the frequency of this new mode. Analyzing the growth rate, it is found that electron spin effects reduce the growth rate and shift both the instability threshold and its termination point toward higher values. Additionally, the influence of other parameters, such as electron streaming and Fermi pressure, on the growth rate is investigated. The current study can help toward a better understanding of new waves and streaming instabilities in astrophysical plasmas.
Medical Image Encryption: An Application for Improved Padding Based GGH Encryption Algorithm
Sokouti, Massoud; Zakerolhosseini, Ali; Sokouti, Babak
2016-01-01
Medical images are regarded as important and sensitive data in medical informatics systems. For transferring medical images over an insecure network, developing a secure encryption algorithm is necessary. Among the three main properties of security services (i.e., confidentiality, integrity, and availability), confidentiality is the most essential feature for exchanging medical images among physicians. The Goldreich Goldwasser Halevi (GGH) algorithm can be a good choice for encrypting medical images, as both the algorithm and the sensitive data are represented by numeric matrices. Additionally, the GGH algorithm does not increase the size of the image, and hence its complexity remains as simple as O(n²). However, one of the disadvantages of the GGH algorithm is the chosen-ciphertext attack. In our strategy, this shortcoming of the GGH algorithm has been taken into consideration and improved by applying padding (i.e., snail-tour XORing) before the GGH encryption process. For evaluating their performance, three measurement criteria are considered: (i) Number of Pixels Change Rate (NPCR), (ii) Unified Average Changing Intensity (UACI), and (iii) avalanche effect. The results on three different sizes of images showed that the padded GGH approach improved UACI, NPCR, and avalanche by almost 100%, 35%, and 45%, respectively, in comparison to the standard GGH algorithm. The outcomes also make the padded GGH resistant to ciphertext-only, chosen-ciphertext, and statistical attacks. Furthermore, increasing the avalanche effect by more than 50% is a promising achievement in comparison to the increased complexity of the proposed method in terms of the encryption and decryption processes. PMID:27857824
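The two differential metrics used are standard; for two cipher images C¹ and C² of size W x H they are:

    \mathrm{NPCR} = \frac{1}{WH}\sum_{i,j} D(i,j) \times 100\%, \qquad
    D(i,j) = \begin{cases} 0, & C^1(i,j) = C^2(i,j) \\ 1, & \text{otherwise} \end{cases}

    \mathrm{UACI} = \frac{1}{WH}\sum_{i,j} \frac{\lvert C^1(i,j) - C^2(i,j)\rvert}{255} \times 100\%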
A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map
NASA Astrophysics Data System (ADS)
Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad
2016-06-01
In recent years, there has been increasing interest in the security of digital images. This study focuses on grayscale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, diffusion of the plain image is performed using DHS with entropy maximization as the fitness function. In the second step, a horizontal and vertical permutation is applied to the best cipher image obtained in the previous step, with DHS minimizing the correlation coefficient as the fitness function. The simulation results show that the proposed method attains maximum entropy and minimum correlation coefficient of approximately 7.9998 and 0.0001, respectively.
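The entropy fitness used in the first DHS step is the standard Shannon entropy of the cipher-image histogram; a short computation sketch (illustrative, not the authors' code):

    import math
    from collections import Counter

    def entropy(img: bytes) -> float:
        # Shannon entropy of the byte histogram; 8.0 is ideal for an 8-bit cipher image
        counts = Counter(img)
        n = len(img)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())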
Design of an image encryption scheme based on a multiple chaotic map
NASA Astrophysics Data System (ADS)
Tong, Xiao-Jun
2013-07-01
To address the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved by Devaney's definition. To produce a large key space, a Cat map named the block Cat map is also designed for the permutation process, based on multi-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by a different chaotic map. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES, and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and offers higher speed and higher security.
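For reference, the classical Cat map that the block variant generalizes acts on an N x N image as

    \begin{pmatrix} x_{n+1} \\ y_{n+1} \end{pmatrix} =
    \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}
    \begin{pmatrix} x_n \\ y_n \end{pmatrix} \bmod N,

and its small parameter space is what motivates the enlarged, chaos-controlled block version; a common parameterized form replaces the matrix with [[1, a], [b, ab+1]] mod N (the paper's exact block construction is not reproduced here).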
Secure biometric image sensor and authentication scheme based on compressed sensing.
Suzuki, Hiroyuki; Suzuki, Masamichi; Urabe, Takuya; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2013-11-20
It is important to ensure the security of biometric authentication information, because its leakage causes serious risks, such as replay attacks using the stolen biometric data, and also because it is almost impossible to replace raw biometric information. In this paper, we propose a secure biometric authentication scheme that protects such information by employing an optical data ciphering technique based on compressed sensing. The proposed scheme is based on two-factor authentication, the biometric information being supplemented by secret information that is used as a random seed for a cipher key. In this scheme, a biometric image is optically encrypted at the time of image capture, and a pair of restored biometric images for enrollment and verification are verified in the authentication server. If any of the biometric information is exposed to risk, it can be reenrolled by changing the secret information. Through numerical experiments, we confirm that finger vein images can be restored from the compressed sensing measurement data. We also present results that verify the accuracy of the scheme.
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specially requires the absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. DNA chromosomes storing capabilities can be used as one-time pad structures with pseudo-random number generation and indexing in order to encrypt the plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, by using only sender-receiver known restriction enzymes to combine the secure key represented by DNA sequence and the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio experiments and simulation results show that the security of the transmission of the key is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
Efficient reversible data hiding in encrypted H.264/AVC videos
NASA Astrophysics Data System (ADS)
Xu, Dawen; Wang, Rangding
2014-09-01
Due to the security and privacy-preserving requirements for cloud data management, it is sometimes desired that video content is accessible in an encrypted form. Reversible data hiding in the encrypted domain is an emerging technology, as it can perform data hiding in encrypted videos without decryption, which preserves the confidentiality of the content. Furthermore, the original cover can be losslessly restored after decryption and data extraction. An efficient reversible data hiding scheme for encrypted H.264/AVC videos is proposed. During H.264/AVC encoding, the intraprediction mode, motion vector difference, and the sign bits of the residue coefficients are encrypted using a standard stream cipher. Then, the data-hider who does not know the original video content, may reversibly embed secret data into the encrypted H.264/AVC video by using a modified version of the histogram shifting technique. A scale factor is utilized for selecting the embedding zone, which is scalable for different capacity requirements. With an encrypted video containing hidden data, data extraction can be carried out either in the encrypted or decrypted domain. In addition, real reversibility is realized so that data extraction and video recovery are free of any error. Experimental results demonstrate the feasibility and efficiency of the proposed scheme.
Ensemble of Chaotic and Naive Approaches for Performance Enhancement in Video Encryption.
Chandrasekaran, Jeyamala; Thiruvengadam, S J
2015-01-01
Owing to the growth of high performance network technologies, multimedia applications over the Internet are increasing exponentially. Applications like video conferencing, video-on-demand, and pay-per-view depend upon encryption algorithms for providing confidentiality. Video communication is characterized by distinct features such as large volume, high redundancy between adjacent frames, video codec compliance, syntax compliance, and application specific requirements. Naive approaches for video encryption encrypt the entire video stream with conventional text based cryptographic algorithms. Although naive approaches are the most secure for video encryption, the computational cost associated with them is very high. This research work aims at enhancing the speed of naive approaches through chaos based S-box design. Chaotic equations are popularly known for randomness, extreme sensitivity to initial conditions, and ergodicity. The proposed methodology employs two-dimensional discrete Henon map for (i) generation of dynamic and key-dependent S-box that could be integrated with symmetric algorithms like Blowfish and Data Encryption Standard (DES) and (ii) generation of one-time keys for simple substitution ciphers. The proposed design is tested for randomness, nonlinearity, avalanche effect, bit independence criterion, and key sensitivity. Experimental results confirm that chaos based S-box design and key generation significantly reduce the computational cost of video encryption with no compromise in security.
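The Hénon-based S-box generation can be sketched as follows (the ranking construction is one common choice and is an assumption here; a and b are the standard chaotic parameter values, and the secret initial conditions x, y act as the key):

    def henon_sbox(x=0.1, y=0.1, a=1.4, b=0.3):
        xs = []
        for _ in range(256):
            x, y = 1 - a * x * x + y, b * x   # two-dimensional discrete Henon map
            xs.append(x)
        # rank the 256 chaotic samples to obtain a key-dependent permutation of 0..255
        return [i for _, i in sorted(zip(xs, range(256)))]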
Quantum and Information Thermodynamics: A Unifying Framework Based on Repeated Interactions
NASA Astrophysics Data System (ADS)
Strasberg, Philipp; Schaller, Gernot; Brandes, Tobias; Esposito, Massimiliano
2017-04-01
We expand the standard thermodynamic framework of a system coupled to a thermal reservoir by considering a stream of independently prepared units repeatedly put into contact with the system. These units can be in any nonequilibrium state and interact with the system with an arbitrary strength and duration. We show that this stream constitutes an effective resource of nonequilibrium free energy, and we identify the conditions under which it behaves as a heat, work, or information reservoir. We also show that this setup provides a natural framework to analyze information erasure ("Landauer's principle") and feedback-controlled systems ("Maxwell's demon"). In the limit of a short system-unit interaction time, we further demonstrate that this setup can be used to provide a thermodynamically sound interpretation to many effective master equations. We discuss how nonautonomously driven systems, micromasers, lasing without inversion and the electronic Maxwell demon can be thermodynamically analyzed within our framework. While the present framework accounts for quantum features (e.g., squeezing, entanglement, coherence), we also show that quantum resources do not offer any advantage compared to classical ones in terms of the maximum extractable work.
The Vigenere Cipher with the TI-83
ERIC Educational Resources Information Center
Hamilton, Michael; Yankosky, Bill
2004-01-01
Cryptology, the science of secret writing, is a great way to introduce students to different areas of mathematics such as number theory, linear algebra, probability and statistics. Cryptology consists of two branches: cryptography and cryptanalysis. Cryptography is the science of designing techniques for encrypting and decrypting a message.…
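For readers without the TI-83, the Vigenère cipher is a few lines in Python (an illustrative implementation over uppercase A-Z):

    def vigenere(text: str, key: str, decrypt: bool = False) -> str:
        # shift each letter by the corresponding key letter, mod 26
        sign = -1 if decrypt else 1
        return ''.join(chr((ord(t) - 65 + sign * (ord(k) - 65)) % 26 + 65)
                       for t, k in zip(text, key * len(text)))

    print(vigenere("ATTACKATDAWN", "LEMON"))   # -> LXFOPVEFRNHR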
FBIS report. Science and technology: Japan, November 6, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-06
Some articles are: R&D on Microfactory Technologies; MHI Develops Low Cost, Low Noise Mid-size Helicopters; Kumamoto University to Apply for Approval to Conduct Clinical Experiment for Gene Therapy; MITI To Support Private Sector to Develop Cipher Technology; and Hitachi Electronics Develops Digital Broadcasting Camera System.
Modulation for terrestrial broadcasting of digital HDTV
NASA Technical Reports Server (NTRS)
Kohn, Elliott S.
1991-01-01
The digital modulation methods used by the DigiCipher, DSC-HDTV, ADTV, and ATVA-P digital high-definition television (HDTV) systems are discussed. Three of the systems use a quadrature amplitude modulation method, and the fourth uses a vestigial sideband modulation method. The channel equalization and spectrum sharing of the digital HDTV systems is discussed.
College and Industry: Partners in the Handicapped Role (Cipher III).
ERIC Educational Resources Information Center
Katz, David; And Others
A project was designed and instituted to furnish a structure that would bring together three groups--potential employers, college personnel, and disabled people--to increase employment opportunities for the handicapped. During the third and final project year, representatives of all three groups met in workshops to discuss issues and concerns.…
Signatures and Popular Literacy in Early Seventeenth-Century Japan
ERIC Educational Resources Information Center
Rubinger, Richard
2006-01-01
My paper looks at "signatures" in the form of "ciphers" (kao) and other personal marks made on population registers, town rules, and apostasy oaths in the early seventeenth century to provide some empirical evidence of very high literacy among village leaders. The essay also argues, using the same data, that literacy had…
NASA Astrophysics Data System (ADS)
Rojali, Salman, Afan Galih; George
2017-08-01
Along with the development of information technology to meet people's needs, various adverse and difficult-to-avoid actions are emerging. One such action is data theft. Therefore, this study discusses cryptography and steganography, which aim to overcome these problems. This study uses a modified Vigenere cipher, Least Significant Bit embedding, and dictionary-based compression methods. To determine the performance of the study, the Peak Signal to Noise Ratio (PSNR) method is used as an objective measure and the Mean Opinion Score (MOS) method as a subjective measure; the performance is also compared to other methods such as Spread Spectrum and Pixel Value Differencing. After this comparison, it can be concluded that this study provides better performance than the other methods (Spread Spectrum and Pixel Value Differencing), with a range of MSE values (0.0191622-0.05275) and PSNR (60.909 to 65.306) for a hidden file size of 18 kb, and a MOS value range (4.214 to 4.722), i.e., image quality approaching very good.
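The objective metric used is the standard peak signal-to-noise ratio over the mean squared error; for an 8-bit stego image I and cover image K of size M x N:

    \mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(I(i,j) - K(i,j)\bigr)^2, \qquad
    \mathrm{PSNR} = 10 \log_{10}\!\left(\frac{255^2}{\mathrm{MSE}}\right)\ \text{dB}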
Gao, Yali; Lam, Albert W Y; Chan, Warren C W
2013-04-24
The impact of detecting multiple infectious diseases simultaneously at point-of-care with good sensitivity, specificity, and reproducibility would be enormous for containing the spread of diseases in both resource-limited and rich countries. Many barcoding technologies have been introduced for addressing this need as barcodes can be applied to detecting thousands of genetic and protein biomarkers simultaneously. However, the assay process is not automated and is tedious and requires skilled technicians. Barcoding technology is currently limited to use in resource-rich settings. Here we used magnetism and microfluidics technology to automate the multiple steps in a quantum dot barcode assay. The quantum dot-barcoded microbeads are sequentially (a) introduced into the chip, (b) magnetically moved to a stream containing target molecules, (c) moved back to the original stream containing secondary probes, (d) washed, and (e) finally aligned for detection. The assay requires 20 min, has a limit of detection of 1.2 nM, and can detect genetic targets for HIV, hepatitis B, and syphilis. This study provides a simple strategy to automate the entire barcode assay process and moves barcoding technologies one step closer to point-of-care applications.
Oćwieja, Magdalena; Matras-Postołek, Katarzyna; Maciejewska-Prończuk, Julia; Morga, Maria; Adamczyk, Zbigniew; Sovinska, Svitlana; Żaba, Adam; Gajewska, Marta; Król, Tomasz; Cupiał, Klaudia; Bredol, Michael
2017-10-01
Manganese-doped ZnS quantum dots (QDs) stabilized by cysteamine hydrochloride were successfully synthesized. Their thorough physicochemical characteristics were acquired using UV-Vis absorption and photoluminescence spectroscopy, X-ray diffraction, dynamic light scattering (DLS), transmission electron microscopy (HR-TEM), energy dispersive spectroscopy (EDS) and Fourier transform infrared (FT-IR) spectroscopy. The average particle size, derived from HR-TEM, was 3.1 nm, which agrees with the hydrodynamic diameter acquired by DLS, equal to 3-4 nm depending on ionic strength. The quantum dots also exhibited a large positive zeta potential varying between 75 and 36 mV for ionic strengths of 10⁻⁴ and 10⁻² M, respectively (at pH 6.2), and an intense luminescent emission at 590 nm. The quantum yield was 31% and the optical band gap energy was 4.26 eV. The kinetics of QD monolayer formation on silica substrates (silica sensors and oxidized silicon wafers) under convection-controlled transport was quantitatively evaluated by quartz crystal microbalance (QCM) and streaming potential measurements. A high stability of the monolayer for ionic strengths of 10⁻⁴ and 10⁻² M was confirmed in these measurements. The experimental data were adequately reflected by the extended random sequential adsorption model (eRSA). Additionally, thorough electrokinetic characteristics of the QD monolayers and their stability for various ionic strengths and pH were acquired by streaming potential measurements carried out under in situ conditions. These results were quantitatively interpreted in terms of a three-dimensional (3D) electrokinetic model, which furnished the bulk zeta potential of the particles at high ionic strengths, a quantity impractical to obtain by other experimental techniques. It is concluded that these results can be used for the design of biosensors with a controlled monolayer structure capable of binding various ligands via covalent as well as electrostatic interactions. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bennett, A. J.; Lee, J. P.; Ellis, D. J. P.; Farrer, I.; Ritchie, D. A.; Shields, A. J.
2016-10-01
Obtaining substantial nonlinear effects at the single-photon level is a considerable challenge that holds great potential for quantum optical measurements and information processing. Of the progress that has been made in recent years one of the most promising methods is to scatter coherent light from quantum emitters, imprinting quantum correlations onto the photons. We report effective interactions between photons, controlled by a single semiconductor quantum dot that is weakly coupled to a monolithic cavity. We show that the nonlinearity of a transition modifies the counting statistics of a Poissonian beam, sorting the photons in number. This is used to create strong correlations between detection events and to create polarization-correlated photons from an uncorrelated stream using a single spin. These results pave the way for semiconductor optical switches operated by single quanta of light.
Ciphers and Currencies: Literacy Dilemmas and Shifting Knowledges.
ERIC Educational Resources Information Center
Kell, Catherine
2001-01-01
Provides an overview of issues in the literacy policy field from a social practices perspective. Outlines a central dilemma in both theory and practice in adult literacy work: that practice theory has not impacted on literacy policy in large parts of the world. Suggests there is an ever-widening gap between literacies of everyday life and the…
Secret Writing. Keys to the Mysteries of Reading and Writing.
ERIC Educational Resources Information Center
Sears, Peter
With a central theme of how people create a means to communicate reliably, and based on language-making exercises that touch students' imaginations, this book aims to interest students in language and how language is made. Since students like codes and ciphers, the book begins with secret writing, which is then used to reveal the foundation of…
Codes, Ciphers, and Cryptography--An Honors Colloquium
ERIC Educational Resources Information Center
Karls, Michael A.
2010-01-01
At the suggestion of a colleague, I read "The Code Book", [32], by Simon Singh to get a basic introduction to the RSA encryption scheme. Inspired by Singh's book, I designed a Ball State University Honors Colloquium in Mathematics for both majors and non-majors, with material coming from "The Code Book" and many other sources. This course became…
Generalized Smooth Transition Map Between Tent and Logistic Maps
NASA Astrophysics Data System (ADS)
Sayed, Wafaa S.; Fahmy, Hossam A. H.; Rezk, Ahmed A.; Radwan, Ahmed G.
There is a continuous demand on novel chaotic generators to be employed in various modeling and pseudo-random number generation applications. This paper proposes a new chaotic map which is a general form for one-dimensional discrete-time maps employing the power function with the tent and logistic maps as special cases. The proposed map uses extra parameters to provide responses that fit multiple applications for which conventional maps were not enough. The proposed generalization covers also maps whose iterative relations are not based on polynomials, i.e. with fractional powers. We introduce a framework for analyzing the proposed map mathematically and predicting its behavior for various combinations of its parameters. In addition, we present and explain the transition map which results in intermediate responses as the parameters vary from their values corresponding to tent map to those corresponding to logistic map case. We study the properties of the proposed map including graph of the map equation, general bifurcation diagram and its key-points, output sequences, and maximum Lyapunov exponent. We present further explorations such as effects of scaling, system response with respect to the new parameters, and operating ranges other than transition region. Finally, a stream cipher system based on the generalized transition map validates its utility for image encryption applications. The system allows the construction of more efficient encryption keys which enhances its sensitivity and other cryptographic properties.
Modeling the Gross-Pitaevskii Equation Using the Quantum Lattice Gas Method
NASA Astrophysics Data System (ADS)
Oganesov, Armen
We present an improved Quantum Lattice Gas (QLG) algorithm as a mesoscopic unitary perturbative representation of the mean field Gross Pitaevskii (GP) equation for Bose-Einstein Condensates (BECs). The method employs an interleaved sequence of unitary collide and stream operators. QLG is applicable to many different scalar potentials in the weak interaction regime and has been used to model the Korteweg-de Vries (KdV), Burgers and GP equations. It can be implemented on both quantum and classical computers and is extremely scalable. We present results for 1D soliton solutions with positive and negative internal interactions, as well as vector solitons with inelastic scattering. In higher dimensions we look at the behavior of vortex ring reconnection. A further improvement is considered with a proper operator splitting technique via a Fourier transformation. This is great for quantum computers since the quantum FFT is exponentially faster than its classical counterpart which involves non-local data on the entire lattice (Quantum FFT is the backbone of the Shor algorithm for quantum factorization). We also present an imaginary time method in which we transform the Schrodinger equation into a diffusion equation for recovering ground state initial conditions of a quantum system suitable for the QLG algorithm.
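The mean-field equation being modeled is the Gross-Pitaevskii equation,

    i\hbar \frac{\partial \psi}{\partial t} = \left(-\frac{\hbar^2}{2m}\nabla^2 + V(\mathbf{r}) + g\,\lvert\psi\rvert^2\right)\psi,

whose cubic nonlinearity g|ψ|² is what the interleaved unitary collide-stream sequence must represent perturbatively.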
ERIC Educational Resources Information Center
Kynard, Carmen
2010-01-01
In this article, Carmen Kynard provides a window into a present-day "hush harbor," a site where a group of black women build generative virtual spaces for counterstories that fight institutional racism. Hidden in plain view, these intentional communities have historically allowed African American participants to share and create knowledge and find…
Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm
NASA Astrophysics Data System (ADS)
Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad
2018-01-01
Security is a very important issue in data transmission, and there are many methods for making files more secure. One such method is cryptography. Cryptography secures a file by writing hidden code to cover the original file, so that anyone not party to the cryptographic scheme cannot decrypt the hidden code to read the original file. Many methods are used in cryptography; one of them is the hybrid cryptosystem. A hybrid cryptosystem uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric-algorithm key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that, using the TEA algorithm to encrypt the file, the ciphertext consists of characters from the ASCII (American Standard Code for Information Interchange) table in the form of hexadecimal numbers, and the ciphertext size increases by sixteen bytes for every eight characters added to the plaintext.
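The symmetric half is the well-known TEA block cipher; its 64-bit block encryption routine (the standard reference form, independent of this paper) is:

    def tea_encrypt(v0: int, v1: int, k: list) -> tuple:
        # TEA: 32 rounds over a 64-bit block (v0, v1) with a 128-bit key k[0..3]
        delta, s, mask = 0x9E3779B9, 0, 0xFFFFFFFF
        for _ in range(32):
            s = (s + delta) & mask
            v0 = (v0 + ((((v1 << 4) + k[0]) ^ (v1 + s) ^ ((v1 >> 5) + k[1])))) & mask
            v1 = (v1 + ((((v0 << 4) + k[2]) ^ (v0 + s) ^ ((v0 >> 5) + k[3])))) & mask
        return v0, v1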
NASA Astrophysics Data System (ADS)
Rachmawati, D.; Budiman, M. A.; Siburian, W. S. E.
2018-05-01
In the process of exchanging files, security is indispensable to avoid the theft of data. Cryptography is one of the sciences used to secure data by encoding it. The Fast Data Encipherment Algorithm (FEAL) is a symmetric block-cipher cryptographic algorithm; the file to be protected is therefore encrypted and decrypted using FEAL. To optimize the security of the data, the session key utilized in FEAL is encoded with the Goldwasser-Micali algorithm, an asymmetric cryptographic algorithm built on a probabilistic concept. In the encryption process, the key is converted into binary form; the random selection of the value x causes the cipher of the key to differ for each binary value. The merger of symmetric and asymmetric algorithms is called a hybrid cryptosystem. The combination of FEAL and Goldwasser-Micali restores the message to its original form, and the time FEAL requires for encryption and decryption is directly proportional to the length of the message. For the Goldwasser-Micali algorithm, however, the message length is not directly proportional to the encryption and decryption time.
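The Goldwasser-Micali half encrypts one bit at a time using quadratic residuosity; a toy sketch with tiny parameters (insecure; p, q, and y are chosen here purely for illustration):

    import math, random

    p, q = 7, 11
    N = p * q
    y = 6                       # quadratic non-residue mod both p and q

    def gm_encrypt_bit(b: int) -> int:
        while True:
            x = random.randrange(2, N)
            if math.gcd(x, N) == 1:
                break
        return (pow(y, b, N) * x * x) % N     # c = y^b * x^2 mod N, fresh x per bit

    def gm_decrypt_bit(c: int) -> int:
        # c is a quadratic residue mod p iff b = 0 (Euler's criterion)
        return 0 if pow(c, (p - 1) // 2, p) == 1 else 1

The probabilistic element is the fresh random x per bit, which is why the cipher of the key differs on every encryption.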
Guo, Ping; Wang, Jin; Ji, Sai; Geng, Xue Hua; Xiong, Neal N
2015-12-01
With the pervasiveness of smart phones and the advance of wireless body sensor networks (BSN), mobile healthcare (m-Healthcare), which extends the operation of healthcare providers into a pervasive environment for better health monitoring, has attracted considerable interest recently. However, the flourishing of m-Healthcare still faces many challenges, including information security and privacy preservation. In this paper, we propose a secure and privacy-preserving framework combined with multilevel trust management. In our scheme, smart phone resources, including computing power and energy, can be opportunistically gathered to process the computing-intensive PHI (personal health information) during an m-Healthcare emergency with minimal privacy disclosure. Specifically, to balance PHI privacy disclosure against the high reliability of PHI processing and transmission in an m-Healthcare emergency, we introduce an efficient lightweight encryption for users whose trust level is low, based on mixed cipher algorithms and pairs of plaintexts and ciphertexts, and we allow a medical user to decide who can participate in the opportunistic computing to assist in processing his overwhelming PHI data. Detailed security analysis and simulations show that the proposed framework can efficiently achieve user-centric privacy protection in an m-Healthcare system.
Brendel, Benjamin
2017-09-01
This article analyzes the modernization campaigns in Egypt in the 1960s and early 1970s. The regulation of the Nile by the Aswan High Dam and the resulting irrigation projects caused the rate of schistosomiasis infestation in the population to rise. The result was a discourse between experts from the global north and Egyptian elites about modernization, development aid, dam building and health care. The fight against schistosomiasis was like a cipher, which combined different power-laden concepts and arguments. This article will decode the cipher and allow a deeper look into the contemporary dimensions of power bound to this subject. The text is conceived around three thematic axes. The first deals with the discursive interplay of modernization, health and development aid in and for Egypt. The second focuses on far-reaching and long-standing arguments within an international expert discourse about these concepts. Finally, the third presents an exemplary case study of West German health and development aid for fighting schistosomiasis in the Egyptian Fayoum oasis.
Enhanced K-means clustering with encryption on cloud
NASA Astrophysics Data System (ADS)
Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.
2017-11-01
This paper addresses the problem of storing and managing big files on the cloud by implementing hashing with Hadoop for big data, while ensuring security when uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates sharing infrastructure and resources.[10] Hadoop is open-source software that lets us store and manage big files on the cloud according to our needs. The K-means clustering algorithm calculates the distance between the centroid of a cluster and the data points. Hashing is a technique for storing and retrieving data with hash keys; the hashing algorithm, called a hash function, is used to map the original data and later fetch the data stored at the specific key.[17] Encryption transforms electronic data into an unreadable form known as ciphertext; decryption is the opposite process, transforming the ciphertext back into plaintext that the end user can read and understand. For encryption and decryption we use a symmetric-key cryptographic algorithm, namely the DES algorithm, for secure storage of the files. [3
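A condensed sketch of the storage path described (the mode, key size, and the in-memory bucket standing in for cloud storage are all assumptions; DES is shown because the abstract names it, though modern deployments would prefer AES):

    import hashlib
    from Crypto.Cipher import DES
    from Crypto.Util.Padding import pad

    def store(bucket: dict, name: str, data: bytes, key8: bytes) -> str:
        hkey = hashlib.sha256(name.encode()).hexdigest()   # hash key for later retrieval
        cipher = DES.new(key8, DES.MODE_ECB)               # symmetric encryption before upload (8-byte key)
        bucket[hkey] = cipher.encrypt(pad(data, 8))
        return hkey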
Optimum performance of electron beam pumped GaAs and GaN
NASA Astrophysics Data System (ADS)
Afify, M. S.; Moslem, W. M.; Hassouba, M. A.; Abu-El Hassan, A.
2018-05-01
This paper introduces a physical solution to overcome damage to semiconductors caused by increasing temperature during the pumping process. For this purpose, we use the quantum hydrodynamic fluid equations, including different quantum effects. The study concludes that nonlinear acoustic waves, in the form of soliton and shock-like (double-layer) pulses, can propagate depending on the electron-beam temperature and the streaming speed. Therefore, one can precisely tune the beam parameters to avoid the unfavorable noise that may lead to defects in semiconductors.
2012-10-23
[Fragmentary report text; recoverable content: the principal investigator at Quantum Intelligence, Inc. (QI, 2001-2012) held six contracts awarded by the DoD Small Business Innovation Research (SBIR) Program, and the unique contribution of the described architecture is to leverage a peer-to-peer agent network.]
Fiber-Coupled Cavity-QED Source of Identical Single Photons
NASA Astrophysics Data System (ADS)
Snijders, H.; Frey, J. A.; Norman, J.; Post, V. P.; Gossard, A. C.; Bowers, J. E.; van Exter, M. P.; Löffler, W.; Bouwmeester, D.
2018-03-01
We present a fully fiber-coupled source of high-fidelity single photons. An (In,Ga)As semiconductor quantum dot is embedded in an optical Fabry-Perot microcavity with a robust design and rigidly attached single-mode fibers, which enables through-fiber cross-polarized resonant laser excitation and photon extraction. Even without spectral filtering, we observe that the incident coherent light pulses are transformed into a stream of single photons with high purity (97%) and indistinguishability (90%), which is measured at an in-fiber brightness of 5% with an excellent cavity-mode-to-fiber coupling efficiency of 85%. Our results pave the way for fully fiber-integrated photonic quantum networks. Furthermore, our method is equally applicable to fiber-coupled solid-state cavity-QED-based photonic quantum gates.
Cryptanalysis of the Sodark Family of Cipher Algorithms
2017-09-01
A software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements. … second- and third-generation automatic link establishment (ALE) systems for high-frequency radios. Radios utilizing ALE technology are in use by a…
Exploitation of Unintentional Information Leakage from Integrated Circuits
2011-12-01
A U.S. Defense Science Board Task Force examined the effects and risks of outsourcing high-performance microchip production to foreign countries [Off05… A mapping methodology is developed and demonstrated to comprehensively assess the information leakage of arbitrary block cipher implementations. … engineering poses a serious threat since it can enable competitors or adversaries to bypass years of research and development through counterfeiting or…
Social Noise: Generating Random Numbers from Twitter Streams
NASA Astrophysics Data System (ADS)
Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús
2015-12-01
Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.) it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (like noise in semiconductors, quantum phenomenon, etc.) play this role in state of the art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.
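One way such a social TRNG could be assembled (illustrative only; the paper's exact extraction pipeline is not reproduced) is to hash each public message down to a candidate bit and then debias the stream:

    import hashlib

    def raw_bits(messages):
        # one candidate bit per public message: parity of its SHA-256 digest
        for text in messages:
            yield hashlib.sha256(text.encode('utf-8')).digest()[-1] & 1

    def von_neumann(bits):
        # classic debiasing: 01 -> 0, 10 -> 1, 00/11 -> discard
        it = iter(bits)
        for a, b in zip(it, it):
            if a != b:
                yield a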
Quantum Optics in Astrophysics: The Potential of a New Window
NASA Astrophysics Data System (ADS)
Solomos, Nikolaos H.
2006-08-01
All of optical astronomy relies upon the detection of light. In this contribution, we emphasize that a new window on the universe could be opened by the obvious idea of applying quantum theory to describe the incoming light quanta (!). It is clearly the appropriate approach, but it has nevertheless never been deemed necessary in mainstream astrophysics: traditional astronomy not only prefers time-averaged quantities (although the fluctuations in time of a measurement can be a source of information that is entirely lost in any time-averaged value), but misses much more information content by continuing to use old semi-classical approaches to treat photon-detection processes. Thus we fail to describe, and to appreciate in full, very important properties of cosmic light, such as spatiotemporal coherence. Nevertheless, 45 years of knowledge accumulation in quantum optics and technology can now lead to the development of instruments capable of extracting the intimate quantum information scrambled into the incoming optical light fields from celestial sources, provided their ability to detect light emission alterations in the
Rapid and efficient detection of single chromophore molecules in aqueous solution
NASA Astrophysics Data System (ADS)
Li, Li-Qiang; Davis, Lloyd M.
1995-06-01
The first experiments on the detection of single fluorescent molecules in a flowing stream of an aqueous solution with high total efficiency are reported. A capillary injection system for sample delivery causes all the dye molecules to pass in a diffusion-broadened stream within a fast-moving sheath flow, through the center of the tightly focused laser excitation beam. Single-molecule detection with a transit time of approximately 1 ms is accomplished with a high-quantum-efficiency single-photon avalanche diode and a low dead-time time-gating circuit for discrimination of Raman-scattered light from the solvent.
Federation for a Secure Enterprise
2016-09-10
The Role of Counterintelligence in the European Theater of Operations During World War II
1993-06-04
…revolvers, Minox cameras, portable typewriters, 48 fingerprint cameras, latent fingerprint kits, handcuffs, and listening and recording devices. … Comments from the detachments indicated that the fingerprint equipment and the listening and recording devices were of little use. However, the revolvers… [Table residue — training course hours: Moulage (2), Fingerprinting (2), Latent Fingerprinting (3), System of Identification (1), Codes and Ciphers (1), Handwriting Comparison (2), Documentary…]
An Architecture for Enabling Migration of Tactical Networks to Future Flexible Ad Hoc WBWF
2010-09-01
[Slide residue — waveform requirements: several multiple-access schemes (TDMA, OFDMA, SC-OFDMA, FH-CDMA, DS-CDMA, hybrid access schemes, and transitions between them); dynamic parameters and algorithms depend on the multiple-access scheme; if DS-CDMA, handling of macro-diversity (linked to cooperative routing); for TDMA and/or OFDMA, transport format; ciphering at the MAC/RLC level (SCM); physical-layer (PHY) signal processing (modulation, FEC, etc.); CDMA macro-diversity.]
Proving Chaotic Behavior of CBC Mode of Operation
NASA Astrophysics Data System (ADS)
Abidi, Abdessalem; Wang, Qianxue; Bouallegue, Belgacem; Machhout, Mohsen; Guyeux, Christophe
2016-06-01
The cipher block chaining (CBC) mode of operation was invented by IBM (International Business Machines) in 1976. It is a very popular way of encrypting, used in various applications. In this paper, we mathematically prove that, under some conditions, the CBC mode of operation admits chaotic behavior in the sense of Devaney. Some cases are studied in detail in order to provide evidence for this idea.
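Recall the CBC recurrence whose iterated, feedback-coupled structure is what admits the Devaney-chaos analysis:

    C_0 = IV, \qquad C_i = E_K(P_i \oplus C_{i-1}), \qquad P_i = D_K(C_i) \oplus C_{i-1}.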
Developing of Library for Proofs of Data Possession in Charm
2013-06-01
A proposed block-cipher mode for Datatype-Preserving Encryption (DTP) uses the Knuth shuffle in one of its steps [19]. It may be advantageous to… [19] U. T. Mattsson, "Format-controlling encryption using datatype-preserving encryption."
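The Knuth (Fisher-Yates) shuffle mentioned for the DTP mode is, for reference:

    import random

    def knuth_shuffle(a: list) -> None:
        # uniformly random in-place permutation, O(n)
        for i in range(len(a) - 1, 0, -1):
            j = random.randint(0, i)   # in a cipher setting, a keyed PRF would supply j
            a[i], a[j] = a[j], a[i]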
Quantum calculus of classical vortex images, integrable models and quantum states
NASA Astrophysics Data System (ADS)
Pashaev, Oktay K.
2016-10-01
From the two-circle theorem described in terms of q-periodic functions, in the limit q→1 we derive the strip theorem and the stream function for the N-vortex problem. For a regular N-vortex polygon we find a compact expression for the velocity of uniform rotation and show that it represents a nonlinear oscillator. We describe q-dispersive extensions of the linear and nonlinear Schrödinger equations, as well as q-semiclassical expansions in terms of Bernoulli and Euler polynomials. Different kinds of q-analytic functions are introduced, including pq-analytic and golden analytic functions.
Frequency-resolved Monte Carlo.
López Carreño, Juan Camilo; Del Valle, Elena; Laussy, Fabrice P
2018-05-03
We adapt the Quantum Monte Carlo method to the cascaded formalism of quantum optics, allowing us to simulate the emission of photons of known energy. Statistical processing of the photon clicks thus collected agrees with the theory of frequency-resolved photon correlations, extending the range of applications based on correlations of photons of prescribed energy, in particular those of a photon-counting character. We apply the technique to autocorrelations of photon streams from a two-level system under coherent and incoherent pumping, including the Mollow triplet regime where we demonstrate the direct manifestation of leapfrog processes in producing an increased rate of two-photon emission events.
Pre-Mrna Introns as a Model for Cryptographic Algorithm:. Theory and Experiments
NASA Astrophysics Data System (ADS)
Regoli, Massimo
2010-01-01
The RNA-Crypto System (RCS for short) is a symmetric-key algorithm for enciphering data. The idea for this new algorithm comes from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences contain sections called introns. Introns, a term derived from "intragenic regions", are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are also known as introns. The nature and role of introns in the pre-mRNA are not yet clear and remain the subject of intensive research by biologists; in our case, however, we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and to obscure access to the secret key used to encode the messages. In the RNA-Crypto System algorithm, the introns are sections of the enciphered message carrying non-coding information, just as in the precursor mRNA.
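As a rough illustration of the intron idea (not the authors' actual RCS construction; the function names, the rate parameter, and the keyed-PRNG splicing rule below are assumptions made for the sketch), non-coding bytes can be inserted at key-determined positions and spliced out again by the receiver:

```python
# Sketch: a keyed PRNG decides where to splice non-coding "intron"
# bytes into the ciphertext; the receiver, seeded with the same key,
# removes them before decryption. Illustrative only.
import os
import random

def add_introns(ciphertext: bytes, key: int, rate: float = 0.3) -> bytes:
    rng, out = random.Random(key), bytearray()
    for byte in ciphertext:
        out.append(byte)
        if rng.random() < rate:            # keyed decision to insert
            out.append(os.urandom(1)[0])   # non-coding filler byte
    return bytes(out)

def splice_introns(data: bytes, key: int, rate: float = 0.3) -> bytes:
    rng, out, it = random.Random(key), bytearray(), iter(data)
    for byte in it:
        out.append(byte)
        if rng.random() < rate:            # same keyed decision stream
            next(it, None)                 # skip the intron byte
    return bytes(out)

msg = b"exon payload"
assert splice_introns(add_introns(msg, key=42), key=42) == msg
```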
a Simple Symmetric Algorithm Using a Likeness with Introns Behavior in RNA Sequences
NASA Astrophysics Data System (ADS)
Regoli, Massimo
2009-02-01
The RNA-Crypto System (RCS for short) is a symmetric-key algorithm for enciphering data. The idea for this new algorithm comes from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences have some sections called introns. Introns, a term derived from "intragenic regions", are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are also known as introns. The nature and role of introns in the pre-mRNA are not yet clear and remain the subject of intensive research by biologists; in our case, however, we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and to obscure access to the secret key used to encode the messages. In the RNA-Crypto System algorithm, the introns are sections of the enciphered message carrying non-coding information, just as in the precursor mRNA.
Bio—Cryptography: A Possible Coding Role for RNA Redundancy
NASA Astrophysics Data System (ADS)
Regoli, M.
2009-03-01
The RNA-Crypto System (RCS for short) is a symmetric-key algorithm for enciphering data. The idea for this new algorithm comes from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences have some sections called introns. Introns, a term derived from "intragenic regions," are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are also known as introns. The nature and role of introns in the pre-mRNA are not yet clear and remain the subject of intensive research by biologists; in our case, however, we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and to obscure access to the secret key used to encode the messages. In the RNA-Crypto System algorithm, the introns are sections of the enciphered message carrying non-coding information, just as in the precursor mRNA.
Infrared spectrometry studies: Spectral digital data acquisition system (1971 version)
NASA Technical Reports Server (NTRS)
Lu, L.; Lyon, R. J. P.
1971-01-01
The construction of the Stanford Spectral Digital Data Acquisition System is described. The objective of the system is to record both the spectral distribution of incoming radiation from rock samples, measured by the spectroradiometer (Exotech Model 10-34 Circular Variable Filter Infrared Spectroradiometer), and other weather information. The system is designed for both laboratory and field measurement programs. Its multichannel inputs (8 channels) are as follows: Ch 1, the spectroradiometer; Ch 2, the radiometer (PRT-5); and Ch 3 to Ch 8, the weather information. The system records data from channels 1 and 2 alternately 48 times before a fast sweep across the six weather channels, forming a single scan in the scan counter. The operation is illustrated in a block diagram, and the theory of operation is described. The outputs are written on a 7-track magnetic tape in IBM-compatible format. The tape format and the playback computer programs are included. Micro-pac digital modules and a CIPHER Model 70 tape recorder (Cipher Data Products) are used. One major characteristic of this system is that it is externally clocked by the spectroradiometer instead of sampling at various wavelength intervals under internal clocking.
Simple algorithm for improved security in the FDDI protocol
NASA Astrophysics Data System (ADS)
Lundy, G. M.; Jones, Benjamin
1993-02-01
We propose a modification to the Fiber Distributed Data Interface (FDDI) protocol based on a simple algorithm that will improve confidential communication capability. The proposed modification provides a simple and reliable system which exploits some of the inherent security properties of a fiber-optic ring network. It differs from conventional methods in that end-to-end encryption can be facilitated at the media access control sublayer of the data link layer in the OSI network model. Our method is based on a variation of the bit-stream cipher method: the transmitting station takes the intended confidential message and applies a simple modulo-two addition operation against an initialization vector. The encrypted message is virtually unbreakable without the initialization vector, and no station on the ring other than the transmitting and receiving stations has access to both the encrypted message and the initialization vector. A fresh initialization vector is generated for each confidential transmission, which provides a distinctive approach to the key distribution problem. The FDDI protocol is of particular interest to the military for LAN/MAN implementations; both the Army and the Navy are considering the standard as the basis for future network systems. A simple and reliable security mechanism with the potential to support real-time communications is a necessary consideration in the implementation of these systems. The proposed method offers several advantages over traditional methods in terms of speed, reliability, and standardization.
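A minimal sketch of the modulo-two (XOR) masking step described above, with the initialization vector simply cycled over the message for brevity; the actual proposal generates a fresh, per-transmission vector, which this sketch does not reproduce:

```python
# Modulo-two addition (XOR) of the message against an initialization
# vector. XOR is an involution, so applying the same vector again
# recovers the plaintext.
from itertools import cycle

def xor_mask(message: bytes, init_vector: bytes) -> bytes:
    return bytes(m ^ v for m, v in zip(message, cycle(init_vector)))

iv = bytes.fromhex("9f3ab2c4d5e6f708")        # illustrative vector
ct = xor_mask(b"confidential frame payload", iv)
assert xor_mask(ct, iv) == b"confidential frame payload"
```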
NASA Astrophysics Data System (ADS)
Yu, Nam Yul
2017-12-01
The principle of compressed sensing (CS) can be applied in a cryptosystem by providing the notion of security. In this paper, we study the computational security of a CS-based cryptosystem that encrypts a plaintext with a partial unitary sensing matrix embedding a secret keystream. The keystream is obtained by a keystream generator of stream ciphers, where the initial seed becomes the secret key of the CS-based cryptosystem. For security analysis, the total variation distance, bounded by the relative entropy and the Hellinger distance, is examined as a security measure for the indistinguishability. By developing upper bounds on the distance measures, we show that the CS-based cryptosystem can be computationally secure in terms of the indistinguishability, as long as the keystream length for each encryption is sufficiently large with low compression and sparsity ratios. In addition, we consider a potential chosen plaintext attack (CPA) from an adversary, which attempts to recover the key of the CS-based cryptosystem. Associated with the key recovery attack, we show that the computational security of our CS-based cryptosystem is brought by the mathematical intractability of a constrained integer least-squares (ILS) problem. For a sub-optimal, but feasible key recovery attack, we consider a successive approximate maximum-likelihood detection (SAMD) and investigate the performance by developing an upper bound on the success probability. Through theoretical and numerical analyses, we demonstrate that our CS-based cryptosystem can be secure against the key recovery attack through the SAMD.
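One plausible instantiation of such a keystream-driven partial unitary sensing matrix, sketched here with a randomized partial DFT (the paper's exact construction may differ; the seeded generator stands in for the secret keystream):

```python
# Sketch of CS-based encryption: a keystream selects rows of a unitary
# (here a DFT) and a diagonal sign pattern; the ciphertext is y = A x
# for a sparse plaintext x.
import numpy as np

n, m = 256, 64                                # plaintext length, measurements
rng = np.random.default_rng(seed=0xC0FFEE)    # stand-in keystream generator

F = np.fft.fft(np.eye(n)) / np.sqrt(n)        # unitary DFT matrix
rows = rng.choice(n, size=m, replace=False)   # keystream-selected rows
signs = rng.choice([-1.0, 1.0], size=n)       # keystream-driven diagonal
A = F[rows] @ np.diag(signs)                  # partial unitary sensing matrix

x = np.zeros(n)
x[rng.choice(n, 8, replace=False)] = rng.normal(size=8)  # sparse plaintext
y = A @ x                                     # ciphertext seen by the adversary
```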
Karpman-Washimi magnetization with electron-exchange effects in quantum plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Woo-Pyo; Jamil, M.; Rasheed, A.
2015-07-15
The influence of quantum electron-exchange on the Karpman-Washimi ponderomotive magnetization is investigated in quantum plasmas. The ponderomotive magnetization and the total radiation power due to the non-stationary Karpman-Washimi interaction related to the time-varying field intensity are obtained as functions of the de Broglie wavelength, Debye length, and electron-exchange parameter. The result shows that the electron-exchange effect enhances the cyclotron frequency due to the ponderomotive interactions in quantum plasmas. It is also shown that the electron-exchange effect on the Karpman-Washimi magnetization increases with increasing wave number. In addition, the Karpman-Washimi magnetization and the total radiation power increase with an increase in the ratio of the Debye length to the de Broglie wavelength. In streaming quantum plasmas, it is shown that the electron-exchange effect enhances the ponderomotive magnetization below the resonant wave number but suppresses it above the resonant wave number. The variation of the Karpman-Washimi magnetization and the radiation power with the electron-exchange effect and plasma parameters is also discussed.
Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.
2011-01-01
The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images that most closely mimic those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
An image encryption algorithm based on 3D cellular automata and chaotic maps
NASA Astrophysics Data System (ADS)
Del Rey, A. Martín; Sánchez, G. Rodríguez
2015-05-01
A novel encryption algorithm to cipher digital images is presented in this work. The digital image is rendered onto a three-dimensional (3D) lattice and the protocol consists of two phases: a confusion phase, where 24 chaotic cat maps are applied, and a diffusion phase, where a 3D cellular automaton is evolved. The encryption method is shown to be secure against the most important cryptanalytic attacks.
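For intuition, the confusion stage can be illustrated with the classical 2D Arnold cat map; the paper itself applies 24 chaotic cat maps on a 3D lattice, so this 2D analogue is only a sketch of the pixel-permutation principle:

```python
# The 2D Arnold cat map (x, y) -> (x + y, x + 2y) mod n is an
# area-preserving bijection, so iterating it scrambles pixel positions
# without changing pixel values.
import numpy as np

def cat_map(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    n = img.shape[0]                      # assumes a square image
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        permuted = np.empty_like(out)
        permuted[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = permuted
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
shuffled = cat_map(img, iterations=3)
assert sorted(shuffled.ravel()) == sorted(img.ravel())  # pure permutation
```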
Environmental Requirements for Authentication Protocols
2002-01-01
Engineering for Information Security, March 2001. 10. D. Chaum. Blind signatures for untraceable payments. In Advances in Cryptology -- Proceedings of... the connection, the idea relies on a concept similar to blinding in the sense of Chaum [10], who used it effectively in the design of anonymous payment... digital signature on the key and a nonce provided by the server, in which the client's challenge response was independent of the type of cipher
Extracting random numbers from quantum tunnelling through a single diode.
Bernardo-Gavito, Ramón; Bagci, Ibrahim Ethem; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J; Woodhead, Christopher S; Missous, Mohamed; Roedig, Utz; Young, Robert J
2017-12-19
Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
Git as an Encrypted Distributed Version Control System
2015-03-01
options. The algorithm uses AES-256 counter mode with an IV derived from a SHA-1-HMAC hash (this is nearly identical to the GCM mode discussed earlier)... built into the internal structure of Git. Every file in a Git repository is checksummed with a SHA-1 hash, a one-way function with arbitrarily long... implementation. Git-encrypt calls OpenSSL cryptography library command-line functions. The default cipher used is AES-256 Electronic Code Book (ECB), which is
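The content checksumming the excerpt mentions is straightforward to reproduce: every Git object is named by the SHA-1 of a typed header plus its content. The snippet below mirrors what `git hash-object` computes for a blob:

```python
# Git names a blob by SHA-1 over the header "blob <length>\0" followed
# by the file content; this is the checksumming the excerpt refers to.
import hashlib

def git_blob_sha1(content: bytes) -> str:
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Equivalent to `git hash-object` on a file containing "hello\n".
print(git_blob_sha1(b"hello\n"))
```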
Strong Password-Based Authentication in TLS Using the Three-PartyGroup Diffie-Hellman Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdalla, Michel; Bresson, Emmanuel; Chevassut, Olivier
2006-08-26
The Internet has evolved into a very hostile ecosystem where "phishing" attacks are common practice. This paper shows that the three-party group Diffie-Hellman key exchange can help protect against these attacks. We have developed a suite of password-based cipher suites for the Transport Layer Security (TLS) protocol that are not only provably secure but also assumed to be free from patent and licensing restrictions, based on an analysis of relevant patents in the area.
Cipher image damage and decisions in real time
NASA Astrophysics Data System (ADS)
Silva-García, Victor Manuel; Flores-Carapia, Rolando; Rentería-Márquez, Carlos; Luna-Benoso, Benjamín; Jiménez-Vázquez, Cesar Antonio; González-Ramírez, Marlon David
2015-01-01
This paper proposes a method for constructing permutations on m-position arrangements. Our objective is to encrypt color images using the advanced encryption standard (AES) with variable permutations, meaning a different permutation for each 128-bit block, applied in the first round after the x-or operation. Furthermore, this research offers the possibility of recovering the original image when the encrypted image has been damaged, whether by an attack or otherwise. This is achieved by permuting the original image pixel positions before encryption with AES variable permutations, which requires building a pseudorandom permutation of 250,000 positions or more. To this end, an algorithm that defines a bijective function between the nonnegative integers and the set of permutations is built. From this algorithm, the way to build permutations on the array 0,1,…,m-1, knowing m-1 constants, is presented. Transcendental numbers are used to select these m-1 constants in a pseudorandom way. The quality of the proposed encryption is evaluated according to the following criteria: the correlation coefficient, the entropy, and the discrete Fourier transform. A goodness-of-fit test for each basic color image is proposed to measure the degree of randomness of the bits of the encrypted image. Moreover, the cipher images are obtained in a lossless way, i.e., no JPEG file formats are used.
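One standard bijection between nonnegative integers and permutations of {0, 1, …, m-1} is factorial-base (Lehmer code) unranking, sketched below; its m-1 mixed-radix digits play the role of the m-1 constants mentioned in the abstract, though the paper's exact construction may differ:

```python
# Unrank a permutation of {0, ..., m-1} from an integer via the
# factorial number system; ranks are effectively taken modulo m!.
def unrank_permutation(rank: int, m: int) -> list:
    digits = []
    for radix in range(2, m + 1):        # m-1 factorial-base digits
        rank, digit = divmod(rank, radix)
        digits.append(digit)
    pool = list(range(m))
    # Pop by digit from the shrinking pool; the last element remains.
    return [pool.pop(d) for d in reversed(digits)] + pool

perm = unrank_permutation(rank=1_000_000, m=10)
assert sorted(perm) == list(range(10))
```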
Random numbers from vacuum fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543
2016-07-25
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read into a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
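A linear feedback shift register can act as a simple whitening extractor by linearly mixing raw sample bits into its state; the sketch below is illustrative only, with tap positions that are assumptions rather than those of the actual implementation:

```python
# Raw (possibly biased) sample bits are folded into the feedback of a
# 16-bit LFSR; the low bit of the mixed state is emitted as output.
def lfsr_extract(raw_bits, taps=(16, 14, 13, 11), width=16):
    state = 0xACE1                          # arbitrary nonzero seed
    for bit in raw_bits:
        feedback = bit
        for t in taps:                      # taps as 1-indexed bit positions
            feedback ^= (state >> (t - 1)) & 1
        state = ((state << 1) | feedback) & ((1 << width) - 1)
        yield state & 1

raw = [1, 1, 1, 0, 1, 1, 1, 1] * 8          # heavily biased input
whitened = list(lfsr_extract(raw))
```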
Lee, Jeffrey S; Cleaver, Gerald B
2017-10-01
In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator, and constitutes an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of a (n × n) random key matrix for a Vernam cipher is established.
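The Vernam construction referred to here is a one-time XOR with a key matrix the same size as the message; in the sketch below a seeded generator stands in for values that would be drawn from CMB measurements:

```python
# Vernam (one-time pad) encryption with an n x n key matrix: combine
# by XOR, never reuse the pad; the same pad decrypts.
import numpy as np

n = 8
rng = np.random.default_rng()             # placeholder for a CMB-derived source
key = rng.integers(0, 256, size=(n, n), dtype=np.uint8)
plain = np.frombuffer(b"attack at dawn!!".ljust(n * n, b" "),
                      dtype=np.uint8).reshape(n, n)

cipher = plain ^ key                      # encryption
assert np.array_equal(cipher ^ key, plain)  # decryption with the same pad
```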
Possible Quantum Absorber Effects in Cortical Synchronization
NASA Astrophysics Data System (ADS)
Kämpf, Uwe
The Wheeler-Feynman transactional "absorber" approach was proposed originally to account for anomalous resonance coupling between spatio-temporally distant measurement partners in entangled quantum states of so-called Einstein-Podolsky-Rosen paradoxes, e.g. of spatio-temporal non-locality, quantum teleportation, etc. Applied to quantum brain dynamics, however, this view provides an anticipative resonance coupling model for aspects of cortical synchronization and recurrent visual action control. It is proposed to consider the registered activation patterns of neuronal loops in so-called synfire chains not as a result of retarded brain communication processes, but rather as surface effects of a system of standing waves generated in the depth of visual processing. According to this view, they arise from a counterbalance between the actual input's delayed bottom-up data streams and top-down recurrent information-processing of advanced anticipative signals in a Wheeler-Feynman-type absorber mode. In the framework of a "time-loop" model, findings about mirror neurons in the brain cortex are suggested to be at least partially associated with temporal rather than spatial mirror functions of visual processing, similar to phase conjugate adaptive resonance-coupling in nonlinear optics.
NASA Astrophysics Data System (ADS)
Nikulin, Vladimir V.; Hughes, David H.; Malowicki, John; Bedi, Vijit
2015-05-01
Free-space optical communication channels offer secure links with a low probability of interception and detection. Despite their point-to-point topology, additional security features may be required in privacy-critical applications. Encryption can be achieved at the physical layer by using quantized values of photons, which makes exploitation of such quantum communication links extremely difficult. One example of such technology is keyed communication in quantum noise, a novel quantum modulation protocol that offers ultra-secure communication with competitive performance characteristics. Its utilization relies on specific coherent measurements to decrypt the signal. The measurement process is complicated by the inherent and irreducible quantum noise of coherent states. This problem is different from traditional laser communication with coherent detection; therefore, continuous efforts are being made to improve the measurement techniques. Quantum-based encryption systems that use the phase of the signal as the information carrier impose aggressive requirements on the accuracy of the measurements when an unauthorized party attempts to intercept the data stream. Therefore, analysis of the secrecy of the data becomes extremely important. In this paper, we present the results of a study whose goal was to assess the potential vulnerability of the running key. Basic results of laboratory measurements are combined with simulation studies and statistical analysis that can be used both for conceptual improvement of the encryption approach and for quantitative comparison of the secrecy of different quantum communication protocols.
Minimal-memory realization of pearl-necklace encoders of general quantum convolutional codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houshmand, Monireh; Hosseini-Khayat, Saied
2011-02-15
Quantum convolutional codes, like their classical counterparts, promise to offer higher error correction performance than block codes of equivalent encoding complexity, and are expected to find important applications in reliable quantum communication where a continuous stream of qubits is transmitted. Grassl and Roetteler devised an algorithm to encode a quantum convolutional code with a "pearl-necklace" encoder. Despite their algorithm's theoretical significance as a neat way of representing quantum convolutional codes, it is not well suited to practical realization. In fact, there is no straightforward way to implement any given pearl-necklace structure. This paper closes the gap between theoretical representation and practical implementation. In our previous work, we presented an efficient algorithm to find a minimal-memory realization of a pearl-necklace encoder for Calderbank-Shor-Steane (CSS) convolutional codes. This work is an extension of our previous work and presents an algorithm for turning a pearl-necklace encoder for a general (non-CSS) quantum convolutional code into a realizable quantum convolutional encoder. We show that a minimal-memory realization depends on the commutativity relations between the gate strings in the pearl-necklace encoder. We find a realization by means of a weighted graph which details the noncommutative paths through the pearl necklace. The weight of the longest path in this graph is equal to the minimal amount of memory needed to implement the encoder. The algorithm has polynomial-time complexity in the number of gate strings in the pearl-necklace encoder.
Surface acoustic wave regulated single photon emission from a coupled quantum dot–nanocavity system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiß, M.; Kapfinger, S.; Wixforth, A.
2016-07-18
A coupled quantum dot–nanocavity system in the weak coupling regime of cavity quantum electrodynamics is dynamically tuned in and out of resonance by the coherent elastic field of an f_SAW ≃ 800 MHz surface acoustic wave. When the system is brought to resonance by the sound wave, light-matter interaction is strongly increased by the Purcell effect. This leads to a precisely timed single-photon emission, as confirmed by the second-order photon correlation function, g^(2). All relevant frequencies of our experiment are faithfully identified in the Fourier transform of g^(2), demonstrating high-fidelity regulation of the stream of single photons emitted by the system.
Unified Field Mechanics: A Brief Introduction
NASA Astrophysics Data System (ADS)
Amoroso, Richard L.
Recently we hear more and more physicists saying, `spacetime is doomed', `spacetime is a mirage', the `end of spacetime', `spacetime is not fundamental but emergent', etc. "Henceforth space by itself and time by itself are doomed to fade into the mere shadows, and only a union of the two will preserve an independent reality." - 1908, Hermann Minkowski. We have come full circle from the time of Minkowski's 1908 statement to the brink of an imminent new age of discovery. The basis of our understanding of the natural world has evolved in modern times from Newtonian Mechanics to the 2nd regime of Quantum Mechanics, and now to the threshold of a 3rd regime - Unified Field Mechanics (UFM). The Planck-scale stochastic quantum realm can no longer be considered the `basement' or fundamental level of reality. As hard as quantum reality was to imagine, so is the fact that the quantum domain is a manifold of finite radius, and that the `sacrosanct, indelible' Quantum Uncertainty Principle can now be surmounted. For decades mainstream physicists have been stymied by efforts to reconcile General Relativity with Quantum Mechanics. The stumbling block lies with the two theories' conflicting views of space and time: for quantum theory, space and time offer a fixed backcloth against which particles move; in Einstein's relativities, space and time are not only inextricably linked, but the resultant spacetime is warped by the matter within it. In our nascent UFM paradigm, for arcane reasons, the quantum manifold is not the regime of integration with gravity; it is instead integrated with the domain of the unified field, where the forces of nature are deemed to unify. We give a simplistic survey of the fundamental premises of UFM and summarize experimental protocols to falsify the model at this stage of the paradigm's development.
A joint asymmetric watermarking and image encryption scheme
NASA Astrophysics Data System (ADS)
Boato, G.; Conotter, V.; De Natale, F. G. B.; Fontanari, C.
2008-02-01
Here we introduce a novel watermarking paradigm designed to be both asymmetric, i.e., involving a private key for embedding and a public key for detection, and commutative with a suitable encryption scheme, allowing both to cipher watermarked data and to mark encrypted data without interfering with the detection process. In order to demonstrate the effectiveness of the above principles, we present an explicit example where the watermarking part, based on elementary linear algebra, and the encryption part, exploiting a secret random permutation, are integrated in a commutative scheme.
Kimura, Shinya; Sato, Toshihiko; Ikeda, Shunya; Noda, Mitsuhiko; Nakayama, Takeo
2010-01-01
Health insurance claims (ie, receipts) record patient health care treatments and expenses and, although created for the health care payment system, are potentially useful for research. Combining different types of receipts generated for the same patient would dramatically increase the utility of these receipts. However, technical problems, including standardization of disease names and classifications, and anonymous linkage of individual receipts, must be addressed. In collaboration with health insurance societies, all information from receipts (inpatient, outpatient, and pharmacy) was collected. To standardize disease names and classifications, we developed a computer-aided post-entry standardization method using a disease name dictionary based on International Classification of Diseases (ICD)-10 classifications. We also developed an anonymous linkage system by using an encryption code generated from a combination of hash values and stream ciphers. Using different sets of the original data (data set 1: insurance certificate number, name, and sex; data set 2: insurance certificate number, date of birth, and relationship status), we compared the percentage of successful record matches obtained by using data set 1 to generate key codes with the percentage obtained when both data sets were used. The dictionary's automatic conversion of disease names successfully standardized 98.1% of approximately 2 million new receipts entered into the database. The percentage of anonymous matches was higher for the combined data sets (98.0%) than for data set 1 (88.5%). The use of standardized disease classifications and anonymous record linkage substantially contributed to the construction of a large, chronologically organized database of receipts. This database is expected to aid in epidemiologic and health services research using receipt information.
[Resilience and Spirituality Considered from Viewpoint of Existential Philosophy of Karl Jaspers].
Kato, Satoshi
2015-01-01
After publishing "General Psychopathology" in 1913, Jaspers turned his attention to serious philosophical contemplation. Using the term Grenzsituation (limit situation) as a key concept, he first presented a framework to shed light on the pathology of both individuals and groups, and this framework came to include the perspective of resilience. He then used three more key concepts, Transzendenz (transcendence), Chiffre (cipher), and das Unverständliche (the unintelligible), to offer a framework focused on the possibilities of human existence. In the field of medicine, this is useful in supporting the spiritual approach discussed in palliative treatment. The philosophy developed by Jaspers can be read as practical guidance for people to find self-support in a limit situation where they have lost their own support and, finally, to come to a degree of mutual acceptance. Mutual acceptance is made possible at the level of ciphers, in which specific meaning remains undefined, by directing both the self and the other toward a state of "transcendence". Nowadays there is a trend for chaplains involved in spiritual care from a specialist point of view to be trained to effectively transcend any difference in religious belief. As a basic premise, the author considers that there is a need to once again return to a state before the start of individual religions and stand on a cross-sectional ground level, an area which could be regarded as common to all religions. When conducting such a task, in the author's view, the restrained spirituality that Jaspers expounded is thought-provoking.
Jinneography: Post-Soviet passages of traumatic exemplarity.
Beigi, Khashayar
2016-04-01
While Russia has historically and geographically close ties with Islam, the second most-practiced religion in its vast territories, the collapse of the USSR changed the terms of this relationship in significant ways. One key shift is the emergence of new immigration patterns between Russia and former Soviet states. Traversing distant lands from the peripheries of the Caucasus and Central Asia to mainland Russia in search of work, migrants have come to recognize each other as fellow Muslims dispersed in a theological geography on the ruins of the universal comradeship dreamed by the Soviet utopia. I propose to study the Islamic pedagogical practice of ibra in the context of sociohistorical dynamics of education and migration between Russia and Central Asia to further locate and analyze this shift in relation to current debates on post-Soviet subjectivity. By discussing the case of a spirit possession of a Tajik national performed in Russia, I argue that the collective participation in the session pedagogically invokes, ciphers, and extends the post-Soviet terrains of history as ibra, or exemplary passage of worldly events. To do so, I first locate the Quranic concept of ibra as a pedagogical paradigm in Islamic traditions as well as an ethnographic lens in the context of educational campaigns for the Muslims of Eurasia and then apply the concept to my analysis of the possession session in order to show that in the ritualistic incarnations of ghosts, or jinns, the civil war of Tajikistan and its continuing cycle of terror is ciphered into a desire for learning, as well as a focus on approximation to the divine. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Burleigh, Scott
2008-01-01
This slide presentation reviews activity around the Asynchronous Message Service (AMS) prototype. An AMS reference implementation has been available since late 2005. It is aimed at supporting message exchange both in on-board environments and over space links. The implementation incorporates all mandatory elements of the draft recommendation from July 2007: (1) the MAMS, AMS, and RAMS protocols; (2) failover, heartbeats, and resync; (3) "hooks" for security, but no cipher suites included in the distribution. The performance is reviewed, and a benchmark latency test over VxWorks message queues is shown as histograms of count versus microseconds per 1000-byte message.
Deducing trapdoor primitives in public key encryption schemes
NASA Astrophysics Data System (ADS)
Pandey, Chandra
2005-03-01
Semantic security of public-key encryption schemes is often interchangeable with the art of building trapdoors. In the frame of reference of the Random Oracle methodology, "key privacy" and "anonymity" have often been discussed. However, to a certain degree the security of most public-key encryption schemes must be analyzed with formal proofs using one-way functions. This paper evaluates the design of El Gamal and RSA based schemes and attempts to parallelize the trapdoor primitives used in the computation of the cipher text, thereby magnifying the decryption error δp in the above schemes.
Another Look at the "Dismal Science" and Jenner's Experiment.
Elllis, John A
2018-03-01
The follow-up to Jenner's experiment, routine vaccination, has reduced more disease and saved more vertebrate lives than any other iatrogenic procedure, by orders of magnitude. The unassailability of that potentially provocative cliché has been ciphered in human medicine, even if it is more difficult in our profession. Most public relations headaches concerning vaccines are a failure to communicate, often resulting in overly great expectations. Even in the throes of a tight appointment schedule, remembering and synopsizing (for clients) some details of the dismal science can make practice great again. Copyright © 2017 Elsevier Inc. All rights reserved.
A Novel Bit-level Image Encryption Method Based on Chaotic Map and Dynamic Grouping
NASA Astrophysics Data System (ADS)
Zhang, Guo-Ji; Shen, Yan
2012-10-01
In this paper, a novel bit-level image encryption method based on dynamic grouping is proposed. In the proposed method, the plain-image is divided into several groups randomly, and then a permutation-diffusion process at the bit level is carried out. The keystream generated by the logistic map is related to the plain-image, which confuses the relationship between the plain-image and the cipher-image. The computer simulation results of statistical analysis, information entropy analysis and sensitivity analysis show that the proposed encryption method is secure and reliable enough to be used for communication applications.
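A hedged sketch of a plaintext-dependent logistic-map keystream of the kind described (the digest-based seeding and all parameters below are assumptions for illustration, not the paper's exact scheme):

```python
# The logistic map x -> r*x*(1-x) in its chaotic regime generates the
# keystream; perturbing the seed with a digest of the plain-image ties
# the keystream to the plaintext, as the scheme requires.
import hashlib

def logistic_keystream(key_x0: float, plain: bytes, length: int):
    digest = int.from_bytes(hashlib.sha256(plain).digest()[:4], "big")
    x = (key_x0 + digest / 2**32) % 1.0 or 0.37  # plaintext-dependent seed
    r = 3.99                                     # chaotic regime
    for _ in range(100):                         # discard transient iterations
        x = r * x * (1 - x)
    for _ in range(length):
        x = r * x * (1 - x)
        yield int(x * 256) % 256                 # quantize state to a byte

plain = bytes(range(64))                         # stand-in "plain-image"
stream = bytes(logistic_keystream(0.6180339887, plain, len(plain)))
cipher = bytes(p ^ s for p, s in zip(plain, stream))
```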
Nonintrusive Diagnostic Strategies for Arcjet Stream Characterization
2000-04-01
[OCR and figure residue; recoverable fragments:] Fig. 8. Experimental setup for electrode e... (low-pass filter, monochromator, deuterium lamp, and fiber-optic components). "...lish the validity of this hypothesis, but the spatially..." C. Park, "Stagnation Point Heat Transfer Rate in Nitrogen Plasma Flows: Theory and...", Seattle, WA, June (1990). R. Loudon, The Quantum Theory of Light, 2nd Ed., Clarendon Press, Oxford, pp. 344-347 (1986). 40. W. J. Marinelli, W. J
Controlled multistep synthesis in a three-phase droplet reactor
Nightingale, Adrian M.; Phillips, Thomas W.; Bannock, James H.; de Mello, John C.
2014-01-01
Channel-fouling is a pervasive problem in continuous flow chemistry, causing poor product control and reactor failure. Droplet chemistry, in which the reaction mixture flows as discrete droplets inside an immiscible carrier liquid, prevents fouling by isolating the reaction from the channel walls. Unfortunately, the difficulty of controllably adding new reagents to an existing droplet stream has largely restricted droplet chemistry to simple reactions in which all reagents are supplied at the time of droplet formation. Here we describe an effective method for repeatedly adding controlled quantities of reagents to droplets. The reagents are injected into a multiphase fluid stream, comprising the carrier liquid, droplets of the reaction mixture and an inert gas that maintains a uniform droplet spacing and suppresses new droplet formation. The method, which is suited to many multistep reactions, is applied to a five-stage quantum dot synthesis wherein particle growth is sustained by repeatedly adding fresh feedstock. PMID:24797034
Novel dynamic caching for hierarchically distributed video-on-demand systems
NASA Astrophysics Data System (ADS)
Ogo, Kenta; Matsuda, Chikashi; Nishimura, Kazutoshi
1998-02-01
It is difficult to simultaneously serve the millions of video streams that will be needed in the age of 'Mega-Media' networks by using only one high-performance server. To distribute the service load, caching servers should be located near users. However, in previously proposed caching mechanisms, the grade of service depends on whether the data is already cached at a caching server. To make the caching servers transparent to the users, the ability to randomly access the large volume of data stored in the central server should be supported, and the operational functions of the provided service should not be narrowly restricted. We propose a mechanism for constructing a video-stream-caching server that is transparent to the users and that always supports all special playback functions for all available programs, with a latency of only 1 or 2 seconds. This mechanism uses a variable-sized-quantum-segment-caching technique derived from an analysis of the historical usage-log data generated by a line-on-demand-type service experiment, and it is based on the basic techniques used by a time-slot-based multiple-stream video-on-demand server.
NASA Astrophysics Data System (ADS)
Zhang, Hang; Mao, Yu; Huang, Duan; Li, Jiawei; Zhang, Ling; Guo, Ying
2018-05-01
We introduce a reliable scheme for continuous-variable quantum key distribution (CV-QKD) using orthogonal frequency division multiplexing (OFDM). As a spectrally efficient multiplexing technique, OFDM allows a large number of closely spaced orthogonal subcarrier signals to carry data on several parallel data streams or channels. We place emphasis on modulator impairments, which inevitably arise in an OFDM system, and analyze how these impairments affect the OFDM-based CV-QKD system. Moreover, we evaluate the security in the asymptotic limit and against the Pirandola-Laurenza-Ottaviani-Banchi upper bound. Results indicate that although imperfect modulation brings about a slight decrease in the secret key bit rate of each subcarrier, the multiplexing technique combined with CV-QKD yields a desirable improvement in the total secret key bit rate, raising it by about an order of magnitude.
Zakerolhosseini, Ali; Sokouti, Massoud; Pezeshkian, Massoud
2013-01-01
Responding quickly to heart-attack patients before they arrive at the hospital is a very important factor. In this paper, a combined model of a Body Sensor Network and Personal Digital Access using the QTRU cipher algorithm in Wifi networks is presented to efficiently overcome these life-threatening attacks. An algorithm for optimizing the routing paths between sensor nodes and an algorithm for reducing power consumption are also applied to achieve the best performance of this model. The system consumes little power while performing its encryption and decryption processes, and it establishes efficient routing paths quickly.
An approach to the language discrimination in different scripts using adjacent local binary pattern
NASA Astrophysics Data System (ADS)
Brodić, D.; Amelio, A.; Milivojević, Z. N.
2017-09-01
The paper proposes a method for discriminating the language of documents. First, each letter is encoded with a certain script type according to its status in the baseline area. The resulting cipher text is subjected to a feature extraction process, in which the local binary pattern, as well as its expanded version called the adjacent local binary pattern, are extracted. Because of differences in language characteristics, this analysis shows significant diversity, and this diversity is the key element in differentiating among the languages. The proposed method is tested on a sample of documents, and the experiments give encouraging results.
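For reference, a minimal plain local binary pattern extractor is sketched below, the baseline from which the adjacent-LBP variant extends; the neighbor ordering and the histogram feature are the standard choices, not necessarily those of the paper:

```python
# Each interior pixel is replaced by an 8-bit code, one bit per
# neighbor, set when the neighbor is >= the centre value; the
# histogram of codes serves as the texture feature.
import numpy as np

def lbp_histogram(img: np.ndarray) -> np.ndarray:
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:-1, 1:-1]
    codes = np.zeros_like(centre, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy,
                        1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return np.bincount(codes.ravel(), minlength=256)

img = (np.arange(100).reshape(10, 10) * 37 % 256).astype(np.uint8)
hist = lbp_histogram(img)
assert hist.sum() == 8 * 8            # one code per interior pixel
```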
Trusted Storage: Putting Security and Data Together
NASA Astrophysics Data System (ADS)
Willett, Michael; Anderson, Dave
State and Federal breach notification legislation mandates that the affected parties be notified in case of a breach of sensitive personal data, unless the data was provably encrypted. Self-encrypting hard drives provide the superior solution for encrypting data-at-rest when compared to software-based solutions. Self-encrypting hard drives, from the laptop to the data center, have been standardized across the hard drive industry by the Trusted Computing Group. Advantages include: simplified management (including keys), no performance impact, quick data erasure and drive re-purposing, no interference with end-to-end data integrity metrics, always encrypting, no cipher-text exposure, and scalability in large data centers.
Design, Assembly, and Characterization of TALE-Based Transcriptional Activators and Repressors.
Thakore, Pratiksha I; Gersbach, Charles A
2016-01-01
Transcription activator-like effectors (TALEs) are modular DNA-binding proteins that can be fused to a variety of effector domains to regulate the epigenome. Nucleotide recognition by TALE monomers follows a simple cipher, making this a powerful and versatile method to activate or repress gene expression. Described here are methods to design, assemble, and test TALE transcription factors (TALE-TFs) for control of endogenous gene expression. In this protocol, TALE arrays are constructed by Golden Gate cloning and tested for activity by transfection and quantitative RT-PCR. These methods for engineering TALE-TFs are useful for studies in reverse genetics and genomics, synthetic biology, and gene therapy.
Proof of cipher text ownership based on convergence encryption
NASA Astrophysics Data System (ADS)
Zhong, Weiwei; Liu, Zhusong
2017-08-01
Cloud storage systems save disk space and bandwidth through deduplication technology, but this technology has been the target of security attacks: an attacker can obtain the original file simply by using its hash value to deceive the server into granting file ownership. To solve these security problems, and to meet the differing security requirements of files in cloud storage systems, an efficient, information-theoretically secure proof-of-ownership scheme is proposed. The scheme protects the data through convergent encryption and uses an improved block-level proof-of-ownership scheme, enabling block-level client-side deduplication for an efficient and secure cloud storage deduplication system.
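The convergent-encryption core of such schemes derives the key from the file's own hash, so identical plaintexts encrypt identically and the server can deduplicate without learning the content. The hash-counter keystream below is an illustrative stand-in for the paper's actual cipher:

```python
# Convergent encryption: key = H(plaintext), so equal files yield
# equal ciphertexts (deduplicable) while the server never sees the key.
import hashlib

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    key = hashlib.sha256(data).digest()          # key derived from content
    stream = b"".join(
        hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        for i in range(-(-len(data) // 32))      # ceil(len/32) blocks
    )
    ciphertext = bytes(d ^ s for d, s in zip(data, stream))
    return key, ciphertext

k1, c1 = convergent_encrypt(b"same file contents")
k2, c2 = convergent_encrypt(b"same file contents")
assert c1 == c2                                  # deduplicable ciphertexts
```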
Orthogonal-state-based cryptography in quantum mechanics and local post-quantum theories
NASA Astrophysics Data System (ADS)
Aravinda, S.; Banerjee, Anindita; Pathak, Anirban; Srikanth, R.
2014-02-01
We introduce the concept of cryptographic reduction, in analogy with a similar concept in computational complexity theory. In this framework, class A of crypto-protocols reduces to protocol class B in a scenario X, if for every instance a of A, there is an instance b of B and a secure transformation X that reproduces a given b, such that the security of b guarantees the security of a. Here we employ this reductive framework to study the relationship between security in quantum key distribution (QKD) and quantum secure direct communication (QSDC). We show that replacing the streaming of independent qubits in a QKD scheme by block encoding and transmission (permuting the order of particles block by block) of qubits, we can construct a QSDC scheme. This forms the basis for the block reduction from a QSDC class of protocols to a QKD class of protocols, whereby if the latter is secure, then so is the former. Conversely, given a secure QSDC protocol, we can of course construct a secure QKD scheme by transmitting a random key as the direct message. Then the QKD class of protocols is secure, assuming the security of the QSDC class which it is built from. We refer to this method of deduction of security for this class of QKD protocols, as key reduction. Finally, we propose an orthogonal-state-based deterministic key distribution (KD) protocol which is secure in some local post-quantum theories. Its security arises neither from geographic splitting of a code state nor from Heisenberg uncertainty, but from post-measurement disturbance.
The pursuit of locality in quantum mechanics
NASA Astrophysics Data System (ADS)
Hodkin, Malcolm
The rampant success of quantum theory is the result of applications of the 'new' quantum mechanics of Schrodinger and Heisenberg (1926-7), the Feynman-Schwinger-Tomonaga Quantum Electro-dynamics (1946-51), the electro-weak theory of Salam, Weinberg, and Glashow (1967-9), and Quantum Chromodynamics (1973-); in fact, this success of 'the' quantum theory has depended on a continuous stream of brilliant and quite disparate mathematical formulations. In this carefully concealed ferment there lie plenty of unresolved difficulties, simply because in churning out fabulously accurate calculational tools there has been no sensible explanation of all that is going on. It is even argued that such an understanding is nothing to do with physics. A long-standing and famous illustration of this is the paradoxical thought-experiment of Einstein, Podolsky and Rosen (1935). Fundamental to all quantum theories, and also their paradoxes, is the location of sub-microscopic objects; or, rather, that the specification of such a location is fraught with mathematical inconsistency. This project encompasses a detailed, critical survey of the tangled history of Position within quantum theories. The first step is to show that, contrary to appearances, canonical quantum mechanics has only a vague notion of locality. After analysing a number of previous attempts at a 'relativistic quantum mechanics', two lines of thought are considered in detail. The first is the work of Wan and students, which is shown to be no real improvement on the usual 'nonrelativistic' theory. The second is based on an idea of Dirac's - using backwards-in-time light-cones as the hypersurface in space-time. There remain considerable difficulties in the way of producing a consistent scheme here. To keep things nicely stirred up, the author then proposes his own approach - an adaptation of Feynman's QED propagators. This new approach is distinguished from Feynman's since the propagator or Green's function is not obtained by Feynman's rule. The type of equation solved is also different: instead of an initial-value problem, a solution that obeys a time-symmetric causality criterion is found for an inhomogeneous partial differential equation with homogeneous boundary conditions. To make the consideration of locality more precise, some results of Fourier transform theory are presented in a form that is directly applicable. Somewhat away from the main thrust of the thesis, there is also an attempt to explain the manner in which quantum effects disappear as the number of particles increases in such things as experimental realisations of the EPR and de Broglie thought experiments.
Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm
NASA Astrophysics Data System (ADS)
Hardi, S. M.; Tarigan, J. T.; Safrina, N.
2018-03-01
In this paper, we present an implementation of image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform the asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and runs as a stand-alone desktop application under the Windows operating system. Our test shows that the system can encrypt an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.
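A compact sketch of the hybrid pattern described, with ElGamal wrapping a session key and a stand-in XOR stream taking the place of Double Playfair for the bulk image data (the tiny 64-bit prime and all names below are illustrative assumptions; real deployments use much larger groups):

```python
# Hybrid encryption: the asymmetric stage (ElGamal) protects a session
# key; a symmetric stage keyed by that session key protects the image.
import hashlib
import secrets

p = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, for illustration only
g = 5

# key generation
x = secrets.randbelow(p - 2) + 1          # private key
y = pow(g, x, p)                          # public key

# ElGamal wraps a random session key
k = secrets.randbelow(p - 2) + 1
session = secrets.randbelow(p - 1) + 1
c1, c2 = pow(g, k, p), (session * pow(y, k, p)) % p

# symmetric stage (stand-in for Double Playfair)
stream = hashlib.sha256(session.to_bytes(8, "big")).digest()
image = b"...image bytes..."
cipher_img = bytes(b ^ stream[i % 32] for i, b in enumerate(image))

# decryption recovers the session key, then the image
recovered = (c2 * pow(c1, p - 1 - x, p)) % p
assert recovered == session
```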
Information hiding based on double random-phase encoding and public-key cryptography.
Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li
2009-03-02
A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
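The DRPE step itself has a standard form: multiply the image by a random phase mask in the space domain, Fourier transform, multiply by a second mask in the frequency domain, and inverse transform. The sketch below shows that step and its conjugate-mask decryption; the RSA transport of the masks is omitted:

```python
# Double random-phase encoding and its inverse: unit-modulus masks are
# undone by their complex conjugates, recovering the image exactly.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((64, 64))                            # stand-in image
phi1 = np.exp(2j * np.pi * rng.random(img.shape))     # input-plane mask
phi2 = np.exp(2j * np.pi * rng.random(img.shape))     # Fourier-plane mask

cipher = np.fft.ifft2(np.fft.fft2(img * phi1) * phi2)

# decryption: strip each mask with its complex conjugate
recovered = np.fft.ifft2(np.fft.fft2(cipher) * phi2.conj()) * phi1.conj()
assert np.allclose(recovered.real, img)
```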
A Scheme for Obtaining Secure S-Boxes Based on Chaotic Baker's Map
NASA Astrophysics Data System (ADS)
Gondal, Muhammad Asif; Abdul Raheem; Hussain, Iqtadar
2014-09-01
In this paper, a method for obtaining cryptographically strong 8 × 8 substitution boxes (S-boxes) is presented. The method is based on chaotic baker's map and a "mini version" of a new block cipher with block size 8 bits and can be easily and efficiently performed on a computer. The cryptographic strength of some 8 × 8 S-boxes randomly produced by the method is analyzed. The results show (1) all of them are bijective; (2) the nonlinearity of each output bit of them is usually about 100; (3) all of them approximately satisfy the strict avalanche criterion and output bits independence criterion; (4) they all have an almost equiprobable input/output XOR distribution.
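A sketch of the general recipe for deriving a bijective 8 × 8 S-box from a chaotic trajectory: iterate a map, quantize each state to a byte, and keep first occurrences until all 256 values appear. A logistic map is used below purely for brevity, in place of the paper's baker's map:

```python
# Collect the first occurrence of each byte value along a chaotic
# trajectory; the result is a permutation of 0..255, i.e. a bijective
# 8x8 S-box.
def chaotic_sbox(x0: float = 0.41, r: float = 3.9999) -> list:
    x, seen, sbox = x0, set(), []
    while len(sbox) < 256:
        x = r * x * (1 - x)              # chaotic iteration
        byte = int(x * 65536) % 256      # quantize the state to a byte
        if byte not in seen:
            seen.add(byte)
            sbox.append(byte)
    return sbox

sbox = chaotic_sbox()
assert sorted(sbox) == list(range(256))  # bijective on one byte
```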
Test and Verification of AES Used for Image Encryption
NASA Astrophysics Data System (ADS)
Zhang, Yong
2018-03-01
In this paper, an image encryption program based on AES in cipher block chaining mode was designed in the C language. The encryption/decryption speed and security performance of the AES-based image cryptosystem were tested and used to compare the proposed cryptosystem with some existing image cryptosystems based on chaos. Simulation results show that AES can be applied to image encryption, which refutes the widely accepted view that AES is not suitable for image encryption. This paper also suggests taking the speed of AES-based image encryption as the speed benchmark for image encryption algorithms; algorithms slower than this benchmark should be discarded in practical communications.
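A minimal timing harness in the spirit of the proposed benchmark, assuming the Python `cryptography` package rather than the authors' C program; throughput figures are machine-dependent:

```python
# Measure AES-256-CBC throughput on an image-sized buffer to obtain a
# speed benchmark for comparing image encryption algorithms.
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

data = os.urandom(512 * 512 * 3)         # a 512x512 RGB image worth of bytes
key, iv = os.urandom(32), os.urandom(16)
enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()

t0 = time.perf_counter()
ct = enc.update(data) + enc.finalize()
dt = time.perf_counter() - t0
print(f"AES-256-CBC: {len(data) / dt / 1e6:.1f} MB/s")
```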
Trajectory control of PbSe–γ-Fe2O3 nanoplatforms under viscous flow and an external magnetic field
Etgar, Lioz; Nakhmani, Arie; Tannenbaum, Allen; Lifshitz, Efrat; Tannenbaum, Rina
2010-01-01
The flow behavior of nanostructure clusters, consisting of chemically bonded PbSe quantum dots and magnetic γ-Fe2O3 nanoparticles, has been investigated. The clusters are regarded as model nanoplatforms with multiple functionalities, where the γ-Fe2O3 magnets serve as transport vehicles, manipulated by an external magnetic field gradient, and the quantum dots act as fluorescence tags within an optical window in the near-infrared regime. The clusters' flow was characterized by visualizing their trajectories within a viscous fluid (mimicking a blood stream), using an optical imaging method, while the trajectory pictures were analyzed by a specially developed processing package. The trajectories were examined under various flow rates, viscosities and applied magnetic field strengths. The results revealed control of the trajectories even at low magnetic fields (<1 T), validating the use of similar nanoplatforms as active targeting constituents in personalized medicine. PMID:20368678
Çolak, Senem; Durmuş, Mahmut; Yıldız, Salih Zeki
2016-06-21
In this study, 4-{4-[N-((3-dimethylamino)propyl)amide]phenoxy}phthalonitrile () and its zinc(ii) phthalocyanine derivative () were synthesized for the first time. 4-(N-((3-Dimethylamino)propyl)amide)phenoxy substituted zinc(ii) phthalocyanine () was converted to its water-soluble sulfobetaine (), betaine () and N-oxide () containing zwitterionic and quaternized cationic () derivatives. All newly synthesized compounds () were characterized by the combination of UV-vis, FT-IR, (1)H NMR, mass spectroscopy techniques and elemental analysis. The photophysical (fluorescence quantum yields and lifetimes) and photochemical (singlet oxygen quantum yields) properties were investigated in DMSO for all the synthesized zinc(ii) phthalocyanines () and in both DMSO and aqueous solutions for zwitterionic and cationic phthalocyanines () for the specification of their capability as photosensitizers in photodynamic therapy (PDT). The binding behavior of water soluble phthalocyanines () to the bovine serum albumin protein was also examined for the determination of their transportation ability in the blood stream.
NASA Astrophysics Data System (ADS)
Pons, Thomas
2017-02-01
Near infrared (NIR) emitting quantum dots based on copper indium chalcogenides present unique optical properties for in vivo fluorescence imaging. Here we present the synthesis of CuIn(S,Se)2/ZnS core/shell QDs with 30-50% quantum yield in the NIR range. These nanoprobes are solubilized in water using a block copolymer surface ligand composed of multiple binding groups for enhanced stability and zwitterionic groups for solubility and minimized nonspecific adsorption. They present limited toxicity compared to heavy metal-containing QDs. These versatile nanoprobes can be directly injected in the peritumoral region for sentinel lymph node imaging. We also demonstrate their vectorization with RGD peptides or their incorporation in folic acid-functionalized silica particles to target specific cancer cells. Their long fluorescence lifetime enables rejection of autofluorescence using time-gated detection. This considerably enhances the sensitivity of in vivo fluorescence imaging. These QDs have been used for long term labeling of cancer cells ex vivo. Following reinjection of these cells, time-gated detection enables in vivo imaging of these cancer cells in the blood stream at the single cell level. Finally, these QDs can be doped with paramagnetic manganese ions to provide multimodal contrast in both fluorescence and magnetic resonance imaging.
Bidirectional Classical Stochastic Processes with Measurements and Feedback
NASA Technical Reports Server (NTRS)
Hahne, G. E.
2005-01-01
A measurement on a quantum system is said to cause the "collapse" of the quantum state vector or density matrix. An analogous collapse occurs with measurements on a classical stochastic process. This paper addresses the question of describing the response of a classical stochastic process when there is feedback from the output of a measurement to the input, and is intended to give a model for quantum-mechanical processes that occur along a space-like reaction coordinate. The classical system can be thought of in physical terms as two counterflowing probability streams, which stochastically exchange probability currents in such a way that the net probability current, and hence the overall probability, suitably interpreted, is conserved. The proposed formalism extends the mathematics of those stochastic processes describable with linear, single-step, unidirectional transition probabilities, known as Markov chains and stochastic matrices. It is shown that a certain rearrangement and combination of the input and output of two stochastic matrices of the same order yields another matrix of the same type. Each measurement causes the partial collapse of the probability current distribution in the midst of such a process, giving rise to calculable, but non-Markov, values for the ensuing modification of the system's output probability distribution. The paper concludes with an analysis of a classical probabilistic version of the so-called grandfather paradox.
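As an illustration of the partial-collapse step described above, consider a two-state Markov chain whose distribution is propagated by a stochastic matrix and then conditioned on a measurement outcome by Bayesian renormalization. This is a minimal sketch in our own notation, not the paper's bidirectional formalism; the matrices are arbitrary examples.

```python
import numpy as np

# Transition matrix of a 2-state Markov chain: T[i, j] = P(next=j | current=i).
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Measurement model: L[k, i] = P(outcome=k | state=i).
L = np.array([[0.8, 0.3],
              [0.2, 0.7]])

def step(p, T):
    """Propagate the probability distribution one step along the chain."""
    return p @ T

def collapse(p, L, outcome):
    """Condition the distribution on an observed outcome (partial collapse),
    renormalizing so that total probability is conserved."""
    post = L[outcome] * p
    return post / post.sum()

p = np.array([0.5, 0.5])
p = step(p, T)          # ordinary Markov propagation
p = collapse(p, L, 0)   # measurement feedback modifies the ensuing evolution
p = step(p, T)          # the overall input-output map is no longer Markov
print(p)
```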
El control de las concentraciones empresariales en el sector electrico [The control of corporate mergers in the electricity sector]
NASA Astrophysics Data System (ADS)
Montoya Pardo, Milton Fernando
The rampant success of quantum theory is the result of applications of the 'new' quantum mechanics of Schrodinger and Heisenberg (1926-7), the Feynman-Schwinger-Tomonaga Quantum Electrodynamics (1946-51), the electro-weak theory of Salam, Weinberg, and Glashow (1967-9), and Quantum Chromodynamics (1973-); in fact, this success of 'the' quantum theory has depended on a continuous stream of brilliant and quite disparate mathematical formulations. In this carefully concealed ferment there lie plenty of unresolved difficulties, simply because in churning out fabulously accurate calculational tools there has been no sensible explanation of all that is going on. It is even argued that such an understanding has nothing to do with physics. A long-standing and famous illustration of this is the paradoxical thought-experiment of Einstein, Podolsky and Rosen (1935). Fundamental to all quantum theories, and also their paradoxes, is the location of sub-microscopic objects; or, rather, that the specification of such a location is fraught with mathematical inconsistency. This project encompasses a detailed, critical survey of the tangled history of position within quantum theories. The first step is to show that, contrary to appearances, canonical quantum mechanics has only a vague notion of locality. After analysing a number of previous attempts at a 'relativistic quantum mechanics', two lines of thought are considered in detail. The first is the work of Wan and students, which is shown to be no real improvement on the usual 'nonrelativistic' theory. The second is based on an idea of Dirac's - using backwards-in-time light-cones as the hypersurface in space-time. There remain considerable difficulties in the way of producing a consistent scheme here. To keep things nicely stirred up, the author then proposes his own approach - an adaptation of Feynman's QED propagators. This new approach is distinguished from Feynman's since the propagator or Green's function is not obtained by Feynman's rule. The type of equation solved is also different: instead of an initial-value problem, a solution that obeys a time-symmetric causality criterion is found for an inhomogeneous partial differential equation with homogeneous boundary conditions. To make the consideration of locality more precise, some results of Fourier transform theory are presented in a form that is directly applicable. Somewhat away from the main thrust of the thesis, there is also an attempt to explain the manner in which quantum effects disappear as the number of particles increases in such things as experimental realisations of the EPR and de Broglie thought experiments.
2013-02-15
red fuming nitric acid (RFNA), which is composed of nitric acid (HNO3, 85 wt%) and NO2 (8–15 wt%). Recently the impinging stream vortex engine (ISVE)... nitric acid [51]. As a result, growth of the particles is favored over H-abstraction reactions at the low temperatures of our experiments. As the... followed by the proton transfer from the N-H bond to NO3 to form nitric acid, as shown in Scheme 3. Although it is very easy to form nitric acid (enthalpic...
Recycling of mixed wastes using Quantum-CEP™
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sameski, B.
1997-02-01
The author describes the process that M4 Environmental Management, Inc., is commercializing for the treatment of mixed wastes. He summarizes the types of wastes to which the process can be applied, the products which come out of the process, and examples of various waste streams which have been processed. The process is presently licensed to treat mixed wastes, and the company has contracts in place for such services. The process uses a molten metal bath to catalyze reactions which break the incoming products down to an atomic level, and allows different process streams to be tapped at the output end.
NASA Technical Reports Server (NTRS)
Krzywoblocki, M. Z. V.
1974-01-01
The application of the elements of quantum (wave) mechanics to some special problems in the field of macroscopic fluid dynamics is discussed. Emphasis is placed on the flow of a viscous, incompressible fluid around a circular cylinder. The following subjects are considered: (1) the flow of a nonviscous fluid around a circular cylinder, (2) the restrictions imposed on the stream function by the number of dimensions of space, and (3) the flow past three-dimensional bodies in a viscous fluid, particularly past a circular cylinder in the symmetrical case.
NASA Astrophysics Data System (ADS)
Meng, X. F.; Peng, X.; Cai, L. Z.; Li, A. M.; Gao, Z.; Wang, Y. R.
2009-08-01
A hybrid cryptosystem is proposed, in which one image is encrypted into two interferograms with the aid of double random-phase encoding (DRPE) and two-step phase-shifting interferometry (2-PSI); three pairs of public-private keys are then utilized to encode and decode the session keys (the geometrical parameters and the second random-phase mask) and the interferograms. In the decryption stage, the ciphered image can be decrypted by wavefront reconstruction, inverse Fresnel diffraction, and real-amplitude normalization. This approach successfully solves the problem of key management and dispatch, resulting in increased security strength. The feasibility of the proposed cryptosystem and its robustness against several types of attack are verified and analyzed by computer simulations.
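For readers unfamiliar with DRPE, the core double random-phase operation (stripped of the phase-shifting interferometry and the public-key layer described above) can be sketched in a few lines of numpy; the function names and mask generation are our own illustrative choices:

```python
import numpy as np

def make_masks(shape, seed):
    """Two statistically independent unit-amplitude random phase masks."""
    rng = np.random.default_rng(seed)
    return (np.exp(2j * np.pi * rng.random(shape)),
            np.exp(2j * np.pi * rng.random(shape)))

def drpe_encrypt(img, m_input, m_fourier):
    """Ciphertext = IFT{ FT{ image * input-plane mask } * Fourier-plane mask }."""
    return np.fft.ifft2(np.fft.fft2(img * m_input) * m_fourier)

def drpe_decrypt(cipher, m_input, m_fourier):
    """Undo the Fourier-plane mask, then the input-plane mask."""
    field = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m_fourier))
    return np.abs(field * np.conj(m_input))

img = np.random.rand(64, 64)              # stand-in for a plain image
m1, m2 = make_masks(img.shape, seed=42)
cipher = drpe_encrypt(img, m1, m2)        # noise-like complex field
assert np.allclose(drpe_decrypt(cipher, m1, m2), img)
```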
Exponential H ∞ Synchronization of Chaotic Cryptosystems Using an Improved Genetic Algorithm
Hsiao, Feng-Hsiag
2015-01-01
This paper presents a systematic design methodology for neural-network- (NN-) based secure communications in multiple time-delay chaotic (MTDC) systems with optimal H ∞ performance and cryptography. On the basis of the Improved Genetic Algorithm (IGA), which is demonstrated to outperform a traditional GA, a model-based fuzzy controller is synthesized that stabilizes the MTDC systems, realizes exponential synchronization, and achieves optimal H ∞ performance by minimizing the disturbance attenuation level. Furthermore, the error of the recovered message is evaluated using the n-shift cipher and key. Finally, a numerical example with simulations is given to demonstrate the effectiveness of our approach. PMID:26366432
Moser, Harald; Pölz, Walter; Waclawek, Johannes Paul; Ofner, Johannes; Lendl, Bernhard
2017-01-01
The implementation of a sensitive, selective, and industrially suitable gas sensor prototype based on wavelength modulation spectroscopy with second harmonic detection (2f-WMS), employing an 8-μm continuous-wave distributed feedback quantum cascade laser (CW-DFB-QCL) for monitoring hydrogen sulfide (H2S) at sub-ppm levels, is reported. Regarding applicability for analytical and industrial process purposes in petrochemical environments, a synthetic methane (CH4) matrix of up to 1000 ppmv together with a varying H2S content was chosen as the model environment for the laboratory-based performance evaluation performed at TU Wien. A noise-equivalent absorption sensitivity (NEAS) of 8.419 × 10^-10 cm^-1 Hz^-1/2 was found for H2S, targeting the absorption line at 1247.2 cm^-1, and a limit of detection (LOD) of 150 ppbv H2S could be achieved. The sensor prototype was then deployed for on-site measurements at the petrochemical research hydrogenation platform of the industrial partner OMV AG. In order to meet the company's on-site safety regulations, the H2S sensor platform was installed in an industry rack and equipped with the required safety infrastructure for protected operation in hazardous and explosive environments. The work reports the suitability of the sensor prototype for simultaneous monitoring of the H2S and CH4 content in the process streams of a research hydrodesulfurization (HDS) unit. Concentration readings were obtained every 15 s and revealed process dynamics not observed previously.
Si-rich SiNx based Kerr switch enables optical data conversion up to 12 Gbit/s
Lin, Gong-Ru; Su, Sheng-Pin; Wu, Chung-Lun; Lin, Yung-Hsiang; Huang, Bo-Ji; Wang, Huai-Yung; Tsai, Cheng-Ting; Wu, Chih-I; Chi, Yu-Chieh
2015-01-01
Silicon photonic interconnection on chip is an emerging issue for next-generation integrated circuits. With a Si-rich SiNx micro-ring based optical Kerr switch, we demonstrate for the first time the wavelength and format conversion of optical on-off-keying data at a bit-rate of 12 Gbit/s. The field-resonant nonlinear Kerr effect enhances the transient refractive index change when the optical data-stream is coupled into the micro-ring through the bus waveguide. This effectively red-shifts the notched dip wavelength to cause format-preserved or format-inverted conversion of the data carried by the on-resonant or off-resonant probe, respectively. Doping with Si quantum dots strengthens the nonlinear Kerr coefficient of Si-rich SiNx by two orders of magnitude over that of bulk Si or Si3N4. The wavelength-converted and cross-amplitude-modulated probe data-stream at up to 12 Gbit/s through the Si-rich SiNx micro-ring, with a penalty of −7 dB on transmission, shows very promising applicability to all-optical communication networks. PMID:25923653
Free-Space Quantum Key Distribution using Polarization Entangled Photons
NASA Astrophysics Data System (ADS)
Kurtsiefer, Christian
2007-06-01
We report on a complete experimental implementation of a quantum key distribution protocol through a free-space link using polarization-entangled photon pairs from a compact parametric down-conversion source [1]. Based on a BB84-equivalent protocol, we generated a secret key without interruption over 10 hours across a free-space optical link distance of 1.5 km, at a rate of up to 950 bits per second after error correction and privacy amplification. Our system is based on two time stamp units and relies on no specific hardware channel for coincidence identification besides an IP link. For that, initial clock synchronization with an accuracy of better than 2 ns is achieved, based on a conventional NTP protocol and a tiered cross-correlation of time tags on both sides. Time tags are used to servo a local clock, allowing a streamed measurement on correctly identified photon pairs. Contrary to the majority of quantum key distribution systems, this approach does not require a trusted large-bandwidth random number generator, but integrates that function into the physical key generation process. We discuss our current progress in implementing key distribution via an atmospheric link during daylight conditions, and possible attack scenarios on a physical timing-information side channel to an entanglement-based key distribution system. [1] I. Marcikic, A. Lamas-Linares, C. Kurtsiefer, Appl. Phys. Lett. 89, 101122 (2006).
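The clock-synchronization step, locating the peak of a cross-correlation between the two time-tag streams, can be sketched as a single-tier toy version. Integer nanosecond tags, the bin width, and the search window are our assumptions, not the authors' parameters:

```python
import numpy as np

def clock_offset(tags_a, tags_b, bin_ns=2, window_ns=100_000):
    """Estimate the clock offset (ns) between two photon time-tag streams by
    histogramming both and locating the circular cross-correlation peak."""
    t0 = min(tags_a.min(), tags_b.min())
    n = int(window_ns // bin_ns)
    ha = np.bincount(((tags_a - t0) // bin_ns).astype(int) % n, minlength=n)
    hb = np.bincount(((tags_b - t0) // bin_ns).astype(int) % n, minlength=n)
    xc = np.fft.ifft(np.fft.fft(ha) * np.conj(np.fft.fft(hb))).real
    lag = int(np.argmax(xc))
    if lag > n // 2:            # interpret large lags as negative offsets
        lag -= n
    return lag * bin_ns

rng = np.random.default_rng(0)
b = np.sort(rng.integers(0, 100_000, 500))   # detector B tags
a = b + 1234                                 # detector A runs 1234 ns late
print(clock_offset(a, b))                    # ~1234
```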
NASA Astrophysics Data System (ADS)
Glattli, D. C.; Roulleau, P.
2016-08-01
We study the Hanbury Brown and Twiss correlation of electronic quasi-particles injected into a quantum conductor using current noise correlations, and we experimentally address the effect of finite temperature. By controlling the relative injection time of two streams of electrons it is possible to probe fermionic antibunching, performing the electron analog of the optical Hong-Ou-Mandel (HOM) experiment. The electrons are injected using voltage pulses with either a sine-wave or a Lorentzian shape. In the latter case, we propose a set of orthogonal wavefunctions, describing periodic trains of multiply charged electron pulses, which give a simple interpretation to the HOM shot noise. The effect of temperature is then discussed and experimentally investigated. We observe perfect electron antibunching over a large range of temperature, showing that, as recently predicted, thermal mixing of the states does not affect antibunching properties, a feature qualitatively different from dephasing. For single-charge Lorentzian pulses, we provide experimental evidence for the prediction that the HOM shot noise variation versus the emission time delay is remarkably independent of the temperature.
[Transcription activator-like effector (TALE)-based genome engineering].
Zhao, Mei-Wei; Duan, Cheng-Li; Liu, Jiang
2013-10-01
Systematic reverse-engineering of functional genome architecture requires precise modifications of gene sequences and transcription levels. The development and application of transcription activator-like effectors (TALEs) has created a wealth of genome engineering possibilities. TALEs are a class of naturally occurring DNA-binding proteins found in the plant pathogen Xanthomonas species. The DNA-binding domain of each TALE typically consists of tandem 34-amino-acid repeat modules rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Such "genome engineering" has now been established in human cells and a number of model organisms, thus opening the door to better understanding gene function in model organisms, improving traits in crop plants and treating human genetic disorders.
Privacy authentication using key attribute-based encryption in mobile cloud computing
NASA Astrophysics Data System (ADS)
Mohan Kumar, M.; Vijayan, R.
2017-11-01
Mobile cloud computing is becoming more popular nowadays as the number of smartphone users increases, so the security level of cloud computing has to be raised accordingly. Privacy authentication using key-attribute-based encryption helps users share business data with an organization through the cloud in a secure manner. In privacy authentication, the sender of the data grants access to a chosen set of receivers; for everyone else, access is denied. In the sender application, the user chooses the file to be sent to the receivers; the data are then encrypted with key-attribute-based encryption using the AES algorithm, and the resulting cipher is stored in the Amazon cloud along with the key value and the receiver list.
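As a rough sketch of the sender-side flow (not the paper's implementation), the snippet below encrypts a file with AES-GCM and binds the receiver list to the ciphertext as authenticated metadata; in the full scheme the session key would additionally be encrypted under receiver attributes before upload. The `cryptography` package is assumed, and all names are hypothetical:

```python
import json, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_receivers(plaintext: bytes, receivers: list):
    """Encrypt data with AES-GCM and bind the authorized receiver list to the
    ciphertext as authenticated (but not secret) associated data."""
    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    aad = json.dumps(sorted(receivers)).encode()   # tamper-evident receiver list
    ct = AESGCM(key).encrypt(nonce, plaintext, aad)
    record = {"nonce": nonce.hex(), "receivers": sorted(receivers), "ct": ct.hex()}
    return key, record   # the record is what would be stored in the cloud

key, record = encrypt_for_receivers(b"quarterly report", ["alice@org", "bob@org"])
aad = json.dumps(record["receivers"]).encode()
plain = AESGCM(key).decrypt(bytes.fromhex(record["nonce"]),
                            bytes.fromhex(record["ct"]), aad)
assert plain == b"quarterly report"   # tampering with the receiver list fails auth
```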
An algorithm for encryption of secret images into meaningful images
NASA Astrophysics Data System (ADS)
Kanso, A.; Ghebleh, M.
2017-03-01
Image encryption algorithms typically transform a plain image into a noise-like cipher image, whose appearance is an indication of encrypted content. Bao and Zhou [Image encryption: Generating visually meaningful encrypted images, Information Sciences 324, 2015] propose encrypting the plain image into a visually meaningful cover image. This improves security by masking existence of encrypted content. Following their approach, we propose a lossless visually meaningful image encryption scheme which improves Bao and Zhou's algorithm by making the encrypted content, i.e. distortions to the cover image, more difficult to detect. Empirical results are presented to show high quality of the resulting images and high security of the proposed algorithm. Competence of the proposed scheme is further demonstrated by means of comparison with Bao and Zhou's scheme.
Lakshmi, C; Thenmozhi, K; Rayappan, John Bosco Balaguru; Amirtharajan, Rengarajan
2018-06-01
Digital Imaging and Communications in Medicine (DICOM) is one of the most significant formats used worldwide for the representation of medical images. Undoubtedly, medical-image security plays a crucial role in telemedicine applications. Merging encryption and watermarking in medical-image protection paves the way for enhanced authentication and safer transmission over open channels. In this context, the present work on DICOM image encryption employs a fuzzy chaotic map for encryption and the Discrete Wavelet Transform (DWT) for watermarking. The proposed approach overcomes the limitation of the Arnold transform, one of the most utilised confusion mechanisms in image ciphering. Various metrics have substantiated the effectiveness of the proposed medical-image encryption algorithm.
Protect sensitive data with lightweight memory encryption
NASA Astrophysics Data System (ADS)
Zhou, Hongwei; Yuan, Jinhui; Xiao, Rui; Zhang, Kai; Sun, Jingyao
2018-04-01
Since current commercial processors are not able to operate on data in cipher text, sensitive data have to be exposed in memory, which leaves a window for an adversary. To protect the sensitive data, a direct idea is to encrypt the data whenever the processor does not access them. Based on this observation, we have developed a lightweight memory encryption scheme, called LeMe, to protect the sensitive data in an application. LeMe marks the sensitive data in memory using page table entries and encrypts the data in their free time. LeMe is built on Linux with a 3.17.6 kernel and provides four user interfaces as a dynamic link library. Our evaluations show that LeMe effectively protects sensitive data and incurs an acceptable performance overhead.
Multi-focus image fusion and robust encryption algorithm based on compressive sensing
NASA Astrophysics Data System (ADS)
Xiao, Di; Wang, Lan; Xiang, Tao; Wang, Yong
2017-06-01
Multi-focus image fusion schemes have been studied in recent years. However, little work has been done on the transmission security of multi-focus images. This paper proposes a scheme that reduces the data transmission volume and resists various attacks. First, multi-focus image fusion based on wavelet decomposition generates complete scene images better matched to human visual perception. The fused images are sparsely represented with the DCT and sampled with a structurally random matrix (SRM), which reduces the data volume and realizes the initial encryption. The obtained measurements are then further encrypted, through combined permutation and diffusion stages, to resist noise and cropping attacks. At the receiver, the cipher images can be jointly decrypted and reconstructed. Simulation results demonstrate the security and robustness of the proposed scheme.
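The permutation and diffusion stages applied to the quantized measurements admit a compact sketch; reducing the key to a single seeded PRNG and working on 8-bit values are our simplifications:

```python
import numpy as np

def encrypt_measurements(m, seed):
    """Permute, then diffuse (chain each output value into the next) a vector
    of 8-bit quantized compressive-sensing measurements."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(m.size)                 # permutation stage
    ks = rng.integers(0, 256, m.size)              # keystream for diffusion
    p = m[perm].astype(np.int64)
    c = np.empty_like(p)
    prev = 0
    for i in range(p.size):                        # diffusion stage
        c[i] = (p[i] + ks[i] + prev) % 256
        prev = c[i]
    return c, perm, ks

def decrypt_measurements(c, perm, ks):
    p = np.empty_like(c)
    prev = 0
    for i in range(c.size):
        p[i] = (c[i] - ks[i] - prev) % 256
        prev = c[i]
    m = np.empty_like(p)
    m[perm] = p                                    # invert the permutation
    return m

m = np.random.default_rng(1).integers(0, 256, 32)
c, perm, ks = encrypt_measurements(m, seed=7)
assert np.array_equal(decrypt_measurements(c, perm, ks), m)
```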
NASA Astrophysics Data System (ADS)
Moon, Dukjae; Hong, Deukjo; Kwon, Daesung; Hong, Seokhie
We assume that the domain extender is the Merkle-Damgård (MD) scheme and the message is padded with a '1' bit and the minimum number of '0' bits, followed by fixed-size length information, so that the length of the padded message is a multiple of the block length. Under this assumption, we analyze the security of the hash mode when the compression function follows the Davies-Meyer (DM) scheme and the underlying block cipher is the plain Feistel or Misty scheme, or a generalized Feistel or Misty scheme with a Substitution-Permutation (SP) round function. We base this work on Meet-in-the-Middle (MitM) preimage attack techniques and develop several useful initial structures.
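The padding rule assumed here is standard MD strengthening; a byte-oriented sketch with a 64-bit length field (the block size is chosen to match common 512-bit-block designs):

```python
def md_pad(msg: bytes, block_len: int = 64) -> bytes:
    """Merkle-Damgard strengthening: append 0x80 (a '1' bit), the minimum
    number of zero bytes, then the 64-bit message bit-length, so the padded
    length is a multiple of the block length."""
    padded = msg + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % block_len)
    return padded + (8 * len(msg)).to_bytes(8, "big")

assert len(md_pad(b"abc")) % 64 == 0
assert len(md_pad(b"x" * 56)) % 64 == 0   # length field forces an extra block
```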
NASA Astrophysics Data System (ADS)
Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro
2012-09-01
It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
NASA Astrophysics Data System (ADS)
Siswantyo, Sepha; Susanti, Bety Hayat
2016-02-01
Preneel-Govaerts-Vandewalle (PGV) schemes consist of 64 possible single-block-length schemes that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack on those 4 secure PGV schemes, instantiated with the RC5 algorithm, to test their collision resistance. The attack results show that collisions occur in all 4 schemes. Based on our analysis, the Feistel structure and the data-dependent rotation in RC5, the XOR operations in the schemes, and the choice of the additional message block value all contribute to the occurrence of collisions.
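For reference, the Davies-Meyer style compression function underlying such PGV constructions is a one-liner; the sketch below uses AES-128 in place of RC5 (the `cryptography` package is assumed; this illustrates the construction, not the authors' attack):

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def davies_meyer(h: bytes, m: bytes) -> bytes:
    """H_i = E_{m_i}(H_{i-1}) XOR H_{i-1}: the message block keys the cipher."""
    enc = Cipher(algorithms.AES(m), modes.ECB()).encryptor()
    e = enc.update(h) + enc.finalize()
    return bytes(a ^ b for a, b in zip(e, h))

h = b"\x00" * 16                          # fixed initial value
for block in [b"A" * 16, b"B" * 16]:      # 16-byte message blocks (padding omitted)
    h = davies_meyer(h, block)
print(h.hex())
```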
Consciousness, the brain, and spacetime geometry.
Hameroff, S
2001-04-01
What is consciousness? Conventional approaches see it as an emergent property of complex interactions among individual neurons; however these approaches fail to address enigmatic features of consciousness. Accordingly, some philosophers have contended that "qualia," or an experiential medium from which consciousness is derived, exists as a fundamental component of reality. Whitehead, for example, described the universe as being composed of "occasions of experience." To examine this possibility scientifically, the very nature of physical reality must be re-examined. We must come to terms with the physics of spacetime--as described by Einstein's general theory of relativity, and its relation to the fundamental theory of matter--as described by quantum theory. Roger Penrose has proposed a new physics of objective reduction: "OR," which appeals to a form of quantum gravity to provide a useful description of fundamental processes at the quantum/classical borderline. Within the OR scheme, we consider that consciousness occurs if an appropriately organized system is able to develop and maintain quantum coherent superposition until a specific "objective" criterion (a threshold related to quantum gravity) is reached; the coherent system then self-reduces (objective reduction: OR). We contend that this type of objective self-collapse introduces non-computability, an essential feature of consciousness which distinguishes our minds from classical computers. Each OR is taken as an instantaneous event--the climax of a self-organizing process in fundamental spacetime--and a candidate for a conscious Whitehead "occasion of experience." How could an OR process occur in the brain, be coupled to neural activities, and account for other features of consciousness? We nominate a quantum computational OR process with the requisite characteristics to be occurring in cytoskeletal microtubules within the brain's neurons. In this model, quantum-superposed states develop in microtubule subunit proteins ("tubulins") within certain brain neurons, remain coherent, and recruit more superposed tubulins until a mass-time-energy threshold (related to quantum gravity) is reached. At that point, self-collapse, or objective reduction (OR), abruptly occurs. We equate the pre-reduction, coherent superposition ("quantum computing") phase with pre-conscious processes, and each instantaneous (and non-computable) OR, or self-collapse, with a discrete conscious event. Sequences of OR events give rise to a "stream" of consciousness. Microtubule-associated proteins can "tune" the quantum oscillations of the coherent superposed states; the OR is thus self-organized, or "orchestrated" ("Orch OR"). Each Orch OR event selects (non-computably) microtubule subunit states which regulate synaptic/neural functions using classical signaling. The quantum gravity threshold for self-collapse is relevant to consciousness, according to our arguments, because macroscopic superposed quantum states each have their own spacetime geometries. These geometries are also superposed, and in some way "separated," but when sufficiently separated, the superposition of spacetime geometries becomes significantly unstable and reduces to a single universe state. Quantum gravity determines the limits of the instability; we contend that the actual choice of state made by Nature is non-computable. Thus each Orch OR event is a self-selection of spacetime geometry, coupled to the brain through microtubules and other biomolecules.
If conscious experience is intimately connected with the very physics underlying spacetime structure, then Orch OR in microtubules indeed provides us with a completely new and uniquely promising perspective on the difficult problems of consciousness.
Analysis Resistant Cipher Method and Apparatus
NASA Technical Reports Server (NTRS)
Oakley, Ernest C. (Inventor)
2009-01-01
A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.
An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves
Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing
2014-01-01
Image encryption is an important and effective technique to protect image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm utilizes the Julia sets' parameters to generate a random sequence as the initial keys, and obtains the final encryption keys by scrambling the initial keys through the Hilbert curve. The final cipher image is obtained by modulo arithmetic and a diffuse operation. This method needs only a few parameters for key generation, which greatly reduces the required storage space. Moreover, because of the Julia sets' properties, such as infiniteness and chaotic characteristics, the keys are highly sensitive even to a tiny perturbation. The experimental results indicate that the algorithm has a large key space, good statistical properties, high key sensitivity, and effective resistance to the chosen-plaintext attack. PMID:24404181
Differential Fault Analysis on CLEFIA
NASA Astrophysics Data System (ADS)
Chen, Hua; Wu, Wenling; Feng, Dengguo
CLEFIA is a new 128-bit block cipher proposed recently by the SONY Corporation. The fundamental structure of CLEFIA is a generalized Feistel structure consisting of 4 data lines. In this paper, the strength of CLEFIA against differential fault attack is explored. Our attack adopts a byte-oriented model of random faults. By inducing a random one-byte fault in one round, four bytes of faults can be obtained simultaneously in the next round, which efficiently reduces the total number of fault inductions needed in the attack. After attacking the encryptions of the last several rounds, the original secret key can be recovered through an analysis of the key schedule. The data complexity analysis and experiments show that only about 18 faulty ciphertexts are needed to recover the entire 128-bit secret key, and about 54 faulty ciphertexts for 192/256-bit keys.
Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm
NASA Astrophysics Data System (ADS)
Moumen, Abdelkader; Sissaoui, Hocine
2017-03-01
The vulnerability of digital image communication is an extremely important issue nowadays, particularly when the images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It is formulated based on symmetric encryption, asymmetric encryption and steganography. The image is encrypted using a symmetric algorithm; then the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bit steganographic scheme. The analysis results show that while enjoying faster computation, our method performs close to optimal in terms of accuracy.
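The three-stage pipeline described above, symmetric encryption of the image, asymmetric encryption of the secret key, and LSB embedding of the wrapped key into the ciphered image, might look like the sketch below. The `cryptography` package, a random grayscale test image, and the embedding details are our illustrative assumptions, not the paper's code:

```python
import os
import numpy as np
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def embed_lsb(img, payload: bytes):
    """Hide payload bits in the least significant bits of the carrier pixels."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = img.flatten()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(img.shape)

def extract_lsb(img, n_bytes: int) -> bytes:
    return np.packbits(img.flatten()[:8 * n_bytes] & 1).tobytes()

# 1. Symmetric stage: encrypt the image with a fresh AES session key.
img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
session_key, nonce = AESGCM.generate_key(bit_length=128), os.urandom(12)
cipher_bytes = AESGCM(session_key).encrypt(nonce, img.tobytes(), None)

# 2. Asymmetric stage: encrypt the session key under the receiver's RSA key.
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = receiver_key.public_key().encrypt(session_key, oaep)

# 3. Steganographic stage: hide the wrapped key in a cipher-image carrier
# (here the leading ciphertext bytes; the GCM tag travels alongside, and a
# full scheme must reserve the LSB positions before encryption).
carrier = np.frombuffer(cipher_bytes[:img.size], dtype=np.uint8).reshape(img.shape).copy()
stego = embed_lsb(carrier, wrapped_key)   # 2048-bit wrapped key -> 2048 pixels

# Receiver side: recover and unwrap the session key from the carrier.
assert receiver_key.decrypt(extract_lsb(stego, 256), oaep) == session_key
```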
Decomposition of gas-phase trichloroethene by the UV/TiO2 process in the presence of ozone.
Shen, Y S; Ku, Y
2002-01-01
The decomposition of gas-phase trichloroethene (TCE) in air streams by direct photolysis and by the UV/TiO2 and UV/O3 processes was studied. The experiments were carried out under various UV light intensities and wavelengths, ozone dosages, and initial concentrations of TCE to investigate and compare the removal efficiency of the pollutant. For the UV/TiO2 process, the individual contributions to the decomposition of TCE from direct photolysis and from hydroxyl-radical destruction were differentiated to discuss the quantum efficiency with 254 and 365 nm UV lamps. The removal of gaseous TCE by the UV/TiO2 process was found to decrease in the presence of ozone, possibly because ozone molecules scavenge the hydroxyl radicals produced by the UV excitation of TiO2, inhibiting the decomposition of TCE. A photoreactor design equation for the decomposition of gaseous TCE by the UV/TiO2 process in air streams was developed by combining the continuity equation of the pollutant and the surface catalysis reaction rate expression. With the proposed design scheme, the temporal distribution of TCE under various operating conditions in the UV/TiO2 process can be well modeled.
Multicore Programming Challenges
NASA Astrophysics Data System (ADS)
Perrone, Michael
The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.
Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L
2006-12-01
Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.
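A compact sketch of the random-projection-and-threshold idea behind BioHashing: token-derived randomness seeds an orthonormalized projection of the biometric feature vector, and the signs of the projections give the revocable bitstring. The dimensions and thresholding at zero are assumptions, not the paper's exact quantization.

```python
import numpy as np

def biohash(feature: np.ndarray, token_seed: int, m: int) -> np.ndarray:
    """Project a biometric feature vector onto m token-derived random
    orthonormal directions and binarize (a sketch of RMQ-style hashing)."""
    rng = np.random.default_rng(token_seed)      # external (token) randomness
    R = rng.standard_normal((feature.size, m))
    Q, _ = np.linalg.qr(R)                       # orthonormal columns
    return (feature @ Q > 0).astype(np.uint8)    # sign -> bit

bits = biohash(np.random.randn(128), token_seed=42, m=64)
# issuing a fresh token_seed "cancels" and reissues the hash for the same biometric
```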
Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains
NASA Astrophysics Data System (ADS)
Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao
2017-11-01
A new concept for a gyrator transform (GT) encryption scheme is proposed in this paper. We present a novel optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in GT domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The method of applying QR codes and fingerprints in GT domains possesses much potential for future information-security applications.
Coupling Functions Enable Secure Communications
NASA Astrophysics Data System (ADS)
Stankovski, Tomislav; McClintock, Peter V. E.; Stefanovska, Aneta
2014-01-01
Secure encryption is an essential feature of modern communications, but rapid progress in illicit decryption brings a continuing need for new schemes that are harder and harder to break. Inspired by the time-varying nature of the cardiorespiratory interaction, here we introduce a new class of secure communications that is highly resistant to conventional attacks. Unlike all earlier encryption procedures, this cipher makes use of the coupling functions between interacting dynamical systems. It results in an unbounded number of encryption key possibilities, allows the transmission or reception of more than one signal simultaneously, and is robust against external noise. Thus, the information signals are encrypted as the time variations of linearly independent coupling functions. Using predetermined forms of coupling function, we apply Bayesian inference on the receiver side to detect and separate the information signals while simultaneously eliminating the effect of external noise. The scheme is highly modular and is readily extendable to support different communications applications within the same general framework.
Simultaneous transmission for an encrypted image and a double random-phase encryption key
NASA Astrophysics Data System (ADS)
Yuan, Sheng; Zhou, Xin; Li, Da-Hai; Zhou, Ding-Fu
2007-06-01
We propose a method to simultaneously transmit a double random-phase encryption key and an encrypted image by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image are used in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, the double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image, which carries both the encrypted image and the encoded key, is delivered to the receiver. Based on such a method, the receiver can obtain an acceptable result and secure transmission can be guaranteed by the RSA cipher system.
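For reference, the classical double random-phase encoding step can be sketched in a few lines with numpy FFTs: one random phase mask in the input plane and one in the Fourier plane. This shows only the optical encryption layer; the RSA key-encoding and amplitude-modulation steps of the proposed method are not reproduced here.

```python
import numpy as np

def drpe_encrypt(img: np.ndarray, seed: int) -> np.ndarray:
    """Double random-phase encoding: phase mask -> FFT -> phase mask -> IFFT."""
    rng = np.random.default_rng(seed)
    p1 = np.exp(2j * np.pi * rng.random(img.shape))   # input-plane mask
    p2 = np.exp(2j * np.pi * rng.random(img.shape))   # Fourier-plane mask
    return np.fft.ifft2(np.fft.fft2(img * p1) * p2)   # complex cipher image

cipher = drpe_encrypt(np.ones((64, 64)), seed=7)
```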
Evaluation of Information Leakage from Cryptographic Hardware via Common-Mode Current
NASA Astrophysics Data System (ADS)
Hayashi, Yu-Ichi; Homma, Naofumi; Mizuki, Takaaki; Sugawara, Takeshi; Kayano, Yoshiki; Aoki, Takafumi; Minegishi, Shigeki; Satoh, Akashi; Sone, Hideaki; Inoue, Hiroshi
This paper presents the possibility of electromagnetic (EM) analysis against cryptographic modules outside their security boundaries. The mechanism behind the information leakage is explained from the viewpoint of electromagnetic compatibility: electric fluctuations released from cryptographic modules can conduct to peripheral circuits through ground bounce, resulting in radiation. We demonstrate the consequences of this mechanism through experiments in which the ISO/IEC standard block cipher AES (Advanced Encryption Standard) is implemented on an FPGA board and EM radiation from power and communication cables is measured. Correlation Electromagnetic Analysis (CEMA) is conducted in order to evaluate the information leakage. The experimental results show that secret keys are revealed even though various disturbing factors, such as voltage regulators and AC/DC converters, lie between the target module and the measurement points. We also discuss information-suppression techniques as electrical-level countermeasures against such CEMAs.
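The core of a correlation EM analysis is a Pearson correlation between a hypothetical leakage model and the measured traces, maximized over key guesses. The sketch below uses a simplified Hamming-weight model of plaintext XOR key guess (a real attack on AES would typically model the S-box output); the array shapes and the single-byte focus are assumptions.

```python
import numpy as np

def hamming_weight(x: int) -> int:
    return bin(x).count("1")

def cema_best_guess(traces: np.ndarray, plaintexts: np.ndarray) -> int:
    """traces: (N, S) EM samples; plaintexts: (N,) values of one state byte.
    Returns the key-byte guess whose leakage model correlates best."""
    t = traces - traces.mean(axis=0)
    t_norm = np.linalg.norm(t, axis=0)
    scores = []
    for guess in range(256):
        model = np.array([hamming_weight(p ^ guess) for p in plaintexts], float)
        m = model - model.mean()
        corr = (m @ t) / (np.linalg.norm(m) * t_norm + 1e-12)  # Pearson r per sample
        scores.append(np.max(np.abs(corr)))
    return int(np.argmax(scores))
```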
Xu, Hao; Han, Zhe; Zhang, Dongju; Zhan, Jinhua
2012-12-01
Although imidazolium-based ionic liquids (ILs) combined with oxygen-containing anions have been proposed as potential solvents for the selective separation of acetylene (C2H2) and ethylene (C2H4), the detailed mechanism at the molecular level is still not well understood. The present work focuses on one of the most effective ILs for removing C2H2 from a C2H4 stream, 1-butyl-3-methylimidazolium acetate ([BMIM][OAc]), aiming at understanding the first steps of the adsorption of the molecules at the IL surface. We present a combined quantum mechanical (QM) calculation and molecular dynamics (MD) simulation study of the structure and properties of the IL as well as its interaction with C2H2 and C2H4 molecules. The calculated results indicate that C2H2 interacts more strongly with the IL than C2H4 and that the anion of the IL is mainly responsible for the stronger interaction. QM calculations show a strong hydrogen-bonding linkage between an acidic proton of C2H2/C2H4 and the basic oxygen atom of the [OAc](-) anion, in contrast to the relatively weaker association via the C-H···π interaction between C2H2/C2H4 and the cation. From the MD simulations, it is observed that in the interfacial region the butyl chains of the cations and the methyl groups of the anions point into the vapor phase. Molecules arriving at the IL surface may initially be wrapped by the extended butyl chains and then transferred to the interface or drawn into the bulk by the anion of the IL. The introduction of guest molecules significantly influences the anion distribution and orientation at the interface, but the cations are not disturbed, because of their larger volume and relatively weaker interaction with the guest molecules. These theoretical results provide insight into the molecular mechanism of the observed selective separation of C2H2 from a C2H4 stream by ILs.
NASA Astrophysics Data System (ADS)
Grayson, M.; Zhou, Wang; Yoo, Heun-Mo; Prabhu-Gaunkar, S.; Tiemann, L.; Reichl, C.; Wegscheider, W.
A longitudinal magnetoresistance asymmetry (LMA) between positive and negative magnetic field is known to occur in both the extreme quantum limit and the classical Drude limit in samples with a nonuniform doping density. By analyzing the current stream function in the van der Pauw measurement geometry, it is shown that the electron density gradient can be quantitatively deduced from this LMA in the Drude regime. Results agree with gradients interpolated from local densities calibrated across an entire wafer, establishing a generalization of the van der Pauw method to quantify density gradients. Results will be shown for various semiconductor systems where this method is applied, from bulk doped semiconductors to exfoliated 2D materials. McCormick Catalyst Award from Northwestern University, EECS Bridge Funding, and AFOSR FA9550-15-1-0247.
INVITED PAPER: Low power cryptography
NASA Astrophysics Data System (ADS)
Kitsos, P.; Koufopavlou, O.; Selimis, G.; Sklavos, N.
2005-01-01
Today more and more sensitive data are stored digitally. Bank accounts, medical records, and personal e-mails are some categories of data that must be kept secure. The science of cryptography addresses this need for security. Data confidentiality, authentication, non-repudiation, and data integrity are some of the main facets of cryptography. The evolution of cryptography has led to very complex cryptographic models that could not have been implemented some years ago. The use of systems of increasing complexity, which are usually more secure, results in lower throughput rates and higher energy consumption. However, the evolution of a cipher has no practical impact if it has only a theoretical background. Every encryption algorithm should exploit the conditions of the specific system as much as possible without ignoring the physical, area, and timing limitations. This fact requires new approaches to designing architectures for secure and reliable cryptosystems. A main issue in the design of cryptosystems is the reduction of power consumption, especially for portable systems such as smart cards.
A novel chaotic image encryption scheme using DNA sequence operations
NASA Astrophysics Data System (ADS)
Wang, Xing-Yuan; Zhang, Ying-Qian; Bao, Xue-Mei
2015-10-01
In this paper, we propose a novel image encryption scheme based on DNA (Deoxyribonucleic acid) sequence operations and chaotic system. Firstly, we perform bitwise exclusive OR operation on the pixels of the plain image using the pseudorandom sequences produced by the spatiotemporal chaos system, i.e., CML (coupled map lattice). Secondly, a DNA matrix is obtained by encoding the confused image using a kind of DNA encoding rule. Then we generate the new initial conditions of the CML according to this DNA matrix and the previous initial conditions, which can make the encryption result closely depend on every pixel of the plain image. Thirdly, the rows and columns of the DNA matrix are permuted. Then, the permuted DNA matrix is confused once again. At last, after decoding the confused DNA matrix using a kind of DNA decoding rule, we obtain the ciphered image. Experimental results and theoretical analysis show that the scheme is able to resist various attacks, so it has extraordinarily high security.
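To make the two building blocks concrete, here is one of the eight valid DNA encoding rules and a logistic map standing in for the CML keystream; the paper uses a coupled map lattice and plaintext-dependent re-initialization, which this sketch omits, and the parameter values are assumptions.

```python
RULE1 = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}    # one valid encoding rule

def dna_encode(byte: int) -> str:
    """Encode one pixel byte as four DNA bases, two bits per base."""
    return "".join(RULE1[(byte >> s) & 0b11] for s in (6, 4, 2, 0))

def logistic_keystream(x0: float, mu: float, n: int) -> list:
    """Byte keystream from the logistic map x -> mu*x*(1-x) (CML stand-in)."""
    x, out = x0, []
    for _ in range(n):
        x = mu * x * (1 - x)
        out.append(int(x * 10**8) % 256)
    return out

pixels = [52, 200, 17]
stream = logistic_keystream(0.37, 3.99, len(pixels))
confused = [p ^ k for p, k in zip(pixels, stream)]      # bitwise XOR confusion
dna_matrix = [dna_encode(p) for p in confused]          # DNA-encoded image
```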
Deciphering the language of nature: cryptography, secrecy, and alterity in Francis Bacon.
Clody, Michael C
2011-01-01
The essay argues that Francis Bacon's considerations of parables and cryptography reflect larger interpretative concerns of his natural philosophic project. Bacon describes nature as having a language distinct from those of God and man, and, in so doing, establishes a central problem of his natural philosophy—namely, how can the language of nature be accessed through scientific representation? Ultimately, Bacon's solution relies on a theory of differential and duplicitous signs that conceal within them the hidden voice of nature, which is best recognized in the natural forms of efficient causality. The "alphabet of nature"—those tables of natural occurrences—consequently plays a central role in his program, as it renders nature's language susceptible to a process of decryption that mirrors the model of the bilateral cipher. It is argued that while the writing of Bacon's natural philosophy strives for literality, its investigative process preserves a space for alterity within scientific representation that is made accessible to those with the interpretative key.
Securing your Site in Development and Beyond
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akopov, Mikhail S.
Why wait until production deployment, or even staging and testing deployment, to identify security vulnerabilities? Using tools like Burp Suite, you can find security vulnerabilities before they creep up on you. Prevent cross-site scripting attacks, and establish firmer trust between your website and your client. Verify that Apache/Nginx have the correct SSL ciphers set. We explore using these tools and more to validate proper Apache/Nginx configurations and to be compliant with modern configuration standards as part of the development cycle. Your clients can use tools like https://securityheaders.io and https://ssllabs.com to get a graded report on your level of compliance with the OWASP Secure Headers Project and SSL Labs recommendations. Likewise, you should always use the same sites to validate your configurations. Burp Suite will find common misconfigurations and will also perform more thorough security testing of your applications. In this session you will see examples of vulnerabilities that were detected early on, as well as how to integrate these practices into your daily workflow.
A novel encryption scheme for high-contrast image data in the Fresnelet domain
Bibi, Nargis; Farwa, Shabieh; Jahngir, Adnan; Usman, Muhammad
2018-01-01
In this paper, a unique and more distinctive encryption algorithm is proposed. It is based on the complexity of a highly nonlinear S-box in the Fresnelet domain. The nonlinear pattern is transformed further to enhance the confusion in the dummy data using the Fresnelet technique. The security level of the encrypted image is boosted using the algebra of the Galois field in the Fresnelet domain. At the first level, the Fresnelet transform is used to propagate the given information with the desired wavelength at a specified distance. It decomposes the given secret data into four complex subbands. These complex subbands are separated into two components: real subband data and imaginary subband data. At the second level, the net subband data produced at the first level are transformed into a nonlinear diffused pattern using a unique S-box defined on the Galois field F(2^8). In the diffusion process, the permuted image is substituted via dynamic algebraic S-box substitution. We show through various analysis techniques that the proposed scheme extensively enhances the cipher security level. PMID:29608609
The Combination of RSA and Block Cipher Algorithms to Maintain Message Authentication
NASA Astrophysics Data System (ADS)
Yanti Tarigan, Sepri; Sartika Ginting, Dewi; Lumban Gaol, Melva; Lorensi Sitompul, Kristin
2017-12-01
RSA is a public-key algorithm based on prime numbers and is still used today. The strength of this algorithm lies in the exponentiation process and in the factoring of a number into two prime numbers, which to date remains difficult. The RSA scheme itself adopts a block cipher scheme: prior to encryption, the plaintext is divided into several blocks of the same length, where the plaintext and ciphertext are integers between 1 and n, n is typically 1024 bits, and the block length itself is smaller than or equal to log(n)+1 in base 2. With the combination of the RSA algorithm and a block cipher, it is expected that the authentication of the plaintext is secure. The message is first encrypted with the RSA algorithm and then encrypted again using the block cipher. Conversely, the ciphertext is first decrypted with the block cipher and then decrypted again with the RSA algorithm. This paper proposes a combination of RSA and block cipher algorithms to secure data.
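A toy numeric illustration of the blocking idea, with deliberately small textbook primes (real deployments use moduli of roughly 1024-2048 bits): plaintext bytes are treated as one-byte blocks, each an integer smaller than n. The block size of one byte is chosen only so that every block value stays below the toy modulus.

```python
from math import gcd

p, q = 61, 53                       # toy primes; insecure, for illustration only
n, phi = p * q, (p - 1) * (q - 1)
e = 17
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                 # private exponent (Python 3.8+ modular inverse)

def rsa_encrypt_blocks(data: bytes) -> list:
    # one-byte blocks guarantee each integer is < n = 3233
    return [pow(b, e, n) for b in data]

def rsa_decrypt_blocks(blocks: list) -> bytes:
    return bytes(pow(c, d, n) for c in blocks)

assert rsa_decrypt_blocks(rsa_encrypt_blocks(b"secret")) == b"secret"
```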
Performance analysis of AES-Blowfish hybrid algorithm for security of patient medical record data
NASA Astrophysics Data System (ADS)
Mahmud H, Amir; Angga W, Bayu; Tommy; Marwan E, Andi; Siregar, Rosyidah
2018-04-01
File security is one method to protect data confidentiality, integrity, and information security. Cryptography is one technique used to secure and guarantee data confidentiality by converting plaintext (the original message) into ciphertext (the hidden message) through two important processes: encryption and decryption. Some researchers have proposed hybrid methods to improve data security. In this research we propose a hybrid AES-Blowfish (BF) method to secure patients' medical record data as PDF files sourced from a database. Private and public keys are generated using two approaches: RSA and ECC. We analyze the impact of these two approaches on the AES-Blowfish hybrid method in terms of time and throughput. Based on the test results, BF is faster than AES and the AES-BF hybrid; however, the AES-BF hybrid achieves better throughput than AES and BF.
NASA Astrophysics Data System (ADS)
Al-Hayani, Nazar; Al-Jawad, Naseer; Jassim, Sabah A.
2014-05-01
Video compression and encryption have become essential for secure real-time video transmission. Applying both techniques simultaneously is a challenge when both size and quality matter in multimedia transmission. In this paper we propose a new technique for video compression and encryption. Both encryption and compression are based on edges extracted from the high-frequency subbands of a wavelet decomposition. The compression algorithm is based on a hybrid of discrete wavelet transforms, the discrete cosine transform, vector quantization, wavelet-based edge detection, and phase sensing. The compression encoding algorithm treats reference and non-reference video frames in two different ways. The encryption algorithm utilizes the A5 cipher combined with a chaotic logistic map to encrypt the significant parameters and wavelet coefficients. Both algorithms can be applied simultaneously after applying the discrete wavelet transform on each individual frame. Experimental results show that the proposed algorithms achieve high compression and acceptable quality, and resist statistical and brute-force attacks with low computational processing.
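As a hint of the keystream side, here is a single 19-bit LFSR with the feedback taps of A5/1's first register. The full A5/1 generator combines three such registers with majority clocking, and the paper additionally mixes in a chaotic logistic map; neither is reproduced in this sketch, and the seed value is arbitrary.

```python
def lfsr19(state: int, n: int) -> list:
    """19-bit LFSR with taps at bits 18, 17, 16, 13 (A5/1's R1 feedback)."""
    out = []
    for _ in range(n):
        out.append((state >> 18) & 1)                    # output the MSB
        fb = ((state >> 18) ^ (state >> 17) ^ (state >> 16) ^ (state >> 13)) & 1
        state = ((state << 1) | fb) & 0x7FFFF            # shift left, keep 19 bits
    return out

keystream = lfsr19(0b1010011010110010101, 32)
cipher_bits = [b ^ k for b, k in zip([1, 0, 1, 1], keystream)]   # XOR with data bits
```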
Vulnerabilities in GSM technology and feasibility of selected attacks
NASA Astrophysics Data System (ADS)
Voznak, M.; Prokes, M.; Sevcik, L.; Frnda, J.; Toral-Cruz, Homer; Jakovlev, Sergej; Fazio, Peppino; Mehic, M.; Mikulec, M.
2015-05-01
Global System for Mobile communications (GSM) is the most widespread mobile communications technology in the world, serving over 7 billion users. Since the first publication of the system documentation, potential security problems have been noted. Selected types of attacks, chosen based on an analysis of the technical feasibility and the degree of risk of these weaknesses, were implemented and demonstrated in the laboratory of the VSB-Technical University of Ostrava, Czech Republic. These vulnerabilities were analyzed and the possible attacks were then described. The attacks were implemented using open-source tools, the software-programmable radio USRP (Universal Software Radio Peripheral), and a DVB-T (Digital Video Broadcasting - Terrestrial) receiver. GSM security architecture has been scrutinized since the first public releases of its specification, mainly pointing out weaknesses in the authentication and ciphering mechanisms. This contribution also summarizes practically proven scenarios performed using open-source software tools and a variety of scripts, mostly written in Python. The main goal of this paper is to analyze security issues in the GSM network and to demonstrate selected attacks in practice.
A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering
Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng
2013-01-01
Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791
SSL/TLS Vulnerability Detection Using Black Box Approach
NASA Astrophysics Data System (ADS)
Gunawan, D.; Sitorus, E. H.; Rahmat, R. F.; Hizriadi, A.
2018-03-01
Secure Sockets Layer (SSL) and Transport Layer Security (TLS) are cryptographic protocols that provide data encryption to secure communication over a network. However, in some cases vulnerabilities are found in implementations of SSL/TLS because of weak cipher keys, certificate validation errors, or session handling errors. One of the most serious SSL/TLS bugs is Heartbleed. As security is essential in data communication, this research aims to build a scanner that detects SSL/TLS vulnerabilities using a black-box approach, focusing on the Heartbleed case. In addition, this research gathers information about the existing SSL configuration on the server. The black-box approach tests the output of a system without knowing the processes inside the system itself. For testing purposes, this research scanned websites and found that some of them still have SSL/TLS vulnerabilities. Thus, the black-box approach can be used to detect the vulnerabilities without considering the source code or the processes inside the application.
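Gathering the negotiated protocol and cipher suite from a server can be done with Python's standard library, as sketched below. An actual Heartbleed test requires crafting a malformed heartbeat record at the TLS record layer, which is beyond this snippet; this only shows the information-gathering step.

```python
import socket
import ssl

def probe_tls(host: str, port: int = 443):
    """Report the TLS version and cipher suite negotiated with a server."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version(), tls.cipher()

# e.g. probe_tls("example.com") might return
# ('TLSv1.3', ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256))
```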
Ferrenberg Swendsen Analysis of LLNL and NYBlue BG/L p4rhms Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soltz, R
2007-12-05
These results are from the continuing lattice quantum chromodynamics runs on BG/L. They come from the Ferrenberg-Swendsen analysis of the combined data from the LLNL and NYBlue BG/L 32^3 x 8 runs with p4rhmc v2.0 QMP-MPI.X (semi-optimized p4 code using QMP over MPI). The jobs include beta values ranging from 3.525 to 3.535, with an alternate analysis extending to 3.540. The NYBlue data sets comprise 9k trajectories from Oct 2007, and the LLNL data are from two independent streams of ~5k each, taken from the July 2007 runs. The following outputs are produced by the fs-2+1-chiub.c program. All outputs have had checksums produced by addCks.pl and checked by the checkCks.pl Perl script after scanning.
Note: optical receiver system for 152-channel magnetoencephalography.
Kim, Jin-Mok; Kwon, Hyukchan; Yu, Kwon-kyu; Lee, Yong-Ho; Kim, Kiwoong
2014-11-01
An optical receiver system comprising 13 serial-data restore/synchronizer modules and a single module combiner converted optical 32-bit serial data into 32-bit synchronous parallel data for a computer to acquire 152-channel magnetoencephalography (MEG) signals. Each serial-data restore/synchronizer module identified the 32 channel-voltage bits within the 48-bit streaming serial data and then consecutively reproduced the 32-bit serial data 13 times, acting on a synchronous clock. After selecting a single stream among the 13 reproduced data streams in each module, the module combiner converted it into 32-bit parallel data, which were carried to a 32-port digital input board in a computer. When the receiver system, together with the optical transmitters, was applied to 152-channel superconducting quantum interference device sensors, the MEG system maintained a field noise level of 3 fT/√Hz @ 100 Hz at a sample rate of 1 kSample/s per channel.
Mental ability and psychological work performance in Chinese workers.
Zhong, Fei; Yano, Eiji; Lan, Yajia; Wang, Mianzhen; Wang, Zhiming; Wang, Xiaorong
2006-10-01
This study explored the relationship among mental ability, occupational stress, and psychological work performance in Chinese workers, and identified relevant modifiers of mental ability and psychological work performance. Psychological Stress Intensity (PSI), psychological work performance, and mental ability (Mental Function Index, MFI) were determined among 485 Chinese workers (aged 33 to 62 yr, 65% men) with varied occupations. The Occupational Stress Questionnaire (OSQ) and three mental ability tests (immediate memory, digit span, and cipher decoding) were used. The relationship between mental ability and psychological work performance was analyzed with a multiple linear regression approach. PSI, MFI, and psychological work performance differed significantly among work-type and educational-level groups (p<0.01). Multiple linear regression analysis showed that MFI was significantly related to gender, age, educational level, and work type. Higher MFI and lower PSI predicted better psychological work performance, even after adjusting for gender, age, educational level, and work type. The study suggests that occupational stress and low mental ability are important predictors of poor psychological work performance, and that this relationship is modified by both gender and educational level.
Super-Encryption Implementation Using Monoalphabetic Algorithm and XOR Algorithm for Data Security
NASA Astrophysics Data System (ADS)
Rachmawati, Dian; Andri Budiman, Mohammad; Aulia, Indra
2018-03-01
The exchange of data that occurs offline and online is very vulnerable to the threat of data theft. In general, cryptography is the science and art of maintaining data secrecy. Encryption is a cryptographic process in which data are transformed into ciphertext, which is unreadable and meaningless so that it cannot be read or understood by other parties. In super-encryption, two or more encryption algorithms are combined to make the result more secure. In this work, a monoalphabetic algorithm and an XOR algorithm are combined to form a super-encryption. The monoalphabetic algorithm works by changing a particular letter into a new letter based on existing keywords, while the XOR algorithm works by using the XOR logic operation. Since the monoalphabetic algorithm is a classical cryptographic algorithm and the XOR algorithm is a modern one, this scheme is expected to be both easy to implement and more secure. The combination of the two algorithms is capable of securing the data and restoring it to its original form (plaintext), so data integrity is still ensured.
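A minimal sketch of the super-encryption pipeline: a keyword-derived monoalphabetic substitution followed by a repeating-key XOR. The keyword-to-alphabet derivation here is one common convention, assumed rather than taken from the paper.

```python
import string

def keyed_alphabet(keyword: str) -> str:
    """Build a 26-letter substitution alphabet from a keyword."""
    seen, out = set(), []
    for ch in keyword.lower() + string.ascii_lowercase:
        if ch in string.ascii_lowercase and ch not in seen:
            seen.add(ch)
            out.append(ch)
    return "".join(out)

def mono_substitute(text: str, keyword: str) -> str:
    table = str.maketrans(string.ascii_lowercase, keyed_alphabet(keyword))
    return text.lower().translate(table)

def xor_layer(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# layer 1: substitution; layer 2: XOR -> super-encryption
stage1 = mono_substitute("attack at dawn", "zebra")
cipher = xor_layer(stage1.encode(), b"k3y")
recovered = xor_layer(cipher, b"k3y").decode()    # XOR is its own inverse
```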
Enhanced rearrangement technique for secure data transmission: case study credit card process
NASA Astrophysics Data System (ADS)
Vyavahare, Tushar; Tekade, Darshana; Nayak, Saurabh; kumar, N. Suresh; Blessy Trencia Lincy, S. S.
2017-11-01
Encryption of data is very important for keeping data secure and for making transactions and transmissions of data secure, as in online shopping: whenever we give our card details, there is a possibility of the data being hacked or intercepted. To prevent that, we need to encrypt the data, and the decryption strategy should be known only to the particular bank. The RSA algorithm can be used to achieve this objective, since only the intended sender and receiver know how the data are encrypted and decrypted. To make the RSA technique more secure, this paper proposes a technique we call Modified RSA, for which a transposition module is designed that uses the row transposition method to encrypt the data. Before the card details are given to RSA, the input is passed to this transposition module, which scrambles and rearranges the data. The output of the transposition is then provided to the Modified RSA, which produces the ciphertext to send over the network. The use of RSA together with the transposition module provides dual security for the whole system.
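The transposition module can be sketched as a classical columnar rearrangement applied within each row; the padding character and the key representation are assumptions, not the paper's exact design.

```python
def row_transpose(text: str, key: tuple) -> str:
    """Rearrange each row of len(key) characters according to key order."""
    cols = len(key)
    text += "X" * (-len(text) % cols)              # pad the last row (assumption)
    rows = [text[i:i + cols] for i in range(0, len(text), cols)]
    return "".join("".join(row[k] for k in key) for row in rows)

# scramble card digits before handing them to the RSA stage
scrambled = row_transpose("4111111111111111", (2, 0, 3, 1))
```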
Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA
Lee, Donggeon; Kim, Dong-Chan; Kwon, Daesung; Kim, Howon
2014-01-01
Recently, with the advent of resource-constrained devices such as smartphones and smart devices, the computing environment is changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA was originally targeted at efficient implementation on microprocessors, as it is fast when implemented in software and has a small memory footprint. To reflect recent technology, all required calculations use 32-bit-wide operations. In addition, the algorithm is comprised not of complex S-box-like structures but of simple addition, rotation, and XOR operations. To the best of our knowledge, this paper is the first report on a comprehensive hardware implementation of LEA. We present various hardware structures and their implementation results according to key size. Even though LEA was originally targeted at software efficiency, it also shows high efficiency when implemented in hardware. PMID:24406859
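The flavor of LEA's round function, 32-bit modular additions, rotations, and XORs with round-key words, can be sketched as below. The rotation amounts and key indexing follow our reading of the published design and should be checked against the specification before any real use; this is an illustrative ARX round, not a verified LEA implementation.

```python
MASK = 0xFFFFFFFF

def rol(x: int, r: int) -> int:
    return ((x << r) | (x >> (32 - r))) & MASK

def ror(x: int, r: int) -> int:
    return ((x >> r) | (x << (32 - r))) & MASK

def lea_like_round(x: list, rk: list) -> list:
    """One ARX round in the style of LEA (six 32-bit round-key words)."""
    return [
        rol(((x[0] ^ rk[0]) + (x[1] ^ rk[1])) & MASK, 9),
        ror(((x[1] ^ rk[2]) + (x[2] ^ rk[3])) & MASK, 5),
        ror(((x[2] ^ rk[4]) + (x[3] ^ rk[5])) & MASK, 3),
        x[0],
    ]

state = lea_like_round([0x13579BDF, 0x2468ACE0, 0xDEADBEEF, 0x01234567],
                       [0x0F1E2D3C] * 6)
```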
Fast, Parallel and Secure Cryptography Algorithm Using Lorenz's Attractor
NASA Astrophysics Data System (ADS)
Marco, Anderson Gonçalves; Martinez, Alexandre Souto; Bruno, Odemir Martinez
A novel cryptography method based on the Lorenz's attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, message and a chaotic system. It ensures that the algorithm yields a secure codification, even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow and the other, parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not assured by traditional algorithms) and consequently its authenticity. Numerical experiments are presented, discussed and show the behavior of the method in terms of security and performance. The fast version of the algorithm has a performance comparable to AES, a popular cryptography program used commercially nowadays, but it is more secure, which makes it immediately suitable for general purpose cryptography applications. An internet page has been set up, which enables the readers to test the algorithm and also to try to break into the cipher.
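The Lorenz system underlying the method can be integrated with a simple explicit Euler step and its trajectory quantized into a byte stream, as sketched below; the step size, quantization, and initial conditions are illustrative choices, not the authors' chaotic operation mode.

```python
def lorenz_bytes(x: float, y: float, z: float, n: int, dt: float = 0.005,
                 sigma: float = 10.0, rho: float = 28.0, beta: float = 8.0 / 3.0):
    """Euler-integrate the Lorenz attractor and quantize x into bytes."""
    out = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(int(abs(x) * 10**6) % 256)
    return out

keystream = lorenz_bytes(1.0, 1.0, 1.0, 64)
```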
Pseudo-random generator based on Chinese Remainder Theorem
NASA Astrophysics Data System (ADS)
Bajard, Jean Claude; Hördegen, Heinrich
2009-08-01
Pseudo-Random Generators (PRGs) are fundamental in cryptography. They are used at different levels in cipher protocols and must satisfy certain properties to qualify as robust. The NIST proposes criteria and a test suite that give information on the behavior of a PRG. In this work, we present a PRG constructed from the conversion between different residue systems of representation of the elements of GF(2)[X]. In this approach, we use pairs of co-prime polynomials of degree k and a state vector of 2k bits. The algebraic properties are broken by using different independent pairs during the process. Since this method is reversible, we can also use it as a symmetric cryptosystem. We evaluate the cost of such a system, taking into account that some operations are commonly implemented on crypto-processors. We give the results of the different NIST tests and explain this choice compared to others found in the literature. We describe the behavior of this PRG and explain how the different rounds are chained to ensure secure randomness.
Compression of Encrypted Images Using Set Partitioning In Hierarchical Trees Algorithm
NASA Astrophysics Data System (ADS)
Sarika, G.; Unnithan, Harikuttan; Peter, Smitha
2011-10-01
When it is desired to transmit redundant data over an insecure channel, it is customary to encrypt the data. For encrypted real-world sources such as images, the use of Markov properties in the Slepian-Wolf decoder does not work well for grayscale images. In this paper we propose a method for compressing an encrypted image. In the encoder section, the image is first encrypted and then undergoes compression in resolution. The cipher function scrambles only the pixel values but does not shuffle the pixel locations. After downsampling, each sub-image is encoded independently and the resulting syndrome bits are transmitted. The received image undergoes joint decryption and decompression in the decoder section and is recovered using local statistics based on the image. Here the decoder receives only a lower-resolution version of the image. In addition, this method provides only partial access to the current source at the decoder side, which improves the decoder's learning of the source statistics. The source dependency is exploited to improve the compression efficiency. This scheme provides better coding efficiency and less computational complexity.
Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling
2016-01-01
Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, and keyword-based searchable encryption schemes focus on the problem of quickly finding the files that a user is interested in within cloud storage. Designing a searchable attribute-based encryption scheme is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, keyword search functionality is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen-plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen-keyword attack (IND-CKA) in the random oracle model. PMID:27898703
Information security: where computer science, economics and psychology meet.
Anderson, Ross; Moore, Tyler
2009-07-13
Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems-one that is both principled and effective.
EDITORIAL The 17th Central European Workshop on Quantum Optics
NASA Astrophysics Data System (ADS)
Man'ko, Margarita A.
2011-02-01
Although the origin of quantum optics can be traced back to the beginning of the 20th century, when the fundamental ideas about the quantum nature of the interaction between light and matter were put forward, the splendid blossoming of this part of physics began half a century later, after the invention of masers and lasers. It is remarkable that after another half a century the tree of quantum optics is not only very strong and spreading, but all its branches continue to grow, showing new beautiful blossoms and giving very useful fruits. A reflection of this progress has been the origin and development of the series of annual events called the Central European Workshops on Quantum Optics (CEWQO). They started at the beginning of the 1990s as rather small meetings of physicists from a few countries in central-eastern Europe, but in less than two decades they have transformed into important events, gathering 100 to 200 participants from practically all European countries. Moreover, many specialists from other continents like to attend these meetings, since they provide an excellent chance to hear about the latest results and new directions of research. Regarding this, it seems worth mentioning at least some of the most interesting and important areas of quantum optics that have attracted the attention of researchers for the past two decades. One of these areas is quantum information, which over the course of time has become an almost independent area of quantum physics. But it still maintains very close ties with quantum optics. The specific parts of this area are, in particular, quantum computing, quantum communication and quantum cryptography, and the problem of quantitative description of such genuine quantum phenomena as entanglement is one of the central items in the current stream of publications. Theory and experiment related to quantum tomography have also become important to contemporary quantum optics. They are closely related to the subject of so-called quantum-state engineering. Different schemes proposed within the framework of this new area enabled the creation in laboratories of various superpositions of quantum states which had previously existed only as beautiful mathematical constructions by theoreticians. Connected to this, recent experiments related to such old problems as decoherence and quantum-classical transition are quite impressive. The same can be said about the interrelations between quantum optics and physics of ultracold atoms and Bose-Einstein condensates. Great progress has been made in cavity quantum electrodynamics, and the past decade gave rise to the new area of circuit quantum electrodynamics. Nowadays, we are very close to the observation of the quantum behavior of macroscopic bodies (mirrors), and the methods used in quantum optics help to achieve this goal. Quantum optics over the past two decades has resulted in such impressive discoveries as the slowing down of light to extremely low velocities and the creation of photonic crystals. The new methods of achieving very strong coupling coefficients between quantized field modes and atomic degrees of freedom open new possibilities for storing and retrieving quantum information transmitted by light. New areas of terahertz, femto- and atto-second optics were born or were significantly developed during the past two decades. In addition, the tomographic-probability representation of photon-quantum states has created new possibilities both in theoretical and experimental aspects of quantum optics. 
Traditionally, measured optical tomograms of photon states were considered as a technical tool for reconstructing the Wigner functions of quantum states. It became clear that these measured tomograms are primary objects; one does not need to reconstruct the Wigner function to extract information on physical properties of the state, for example, on the state purity. Purity is experimentally obtained directly from measured optical tomograms of photon states. The uncertainty relations for photon quadratures were also checked for the thermal photon state using experimental values of optical tomograms and avoiding the reconstruction procedure of the Wigner function and its associated precision constraints. In the tomographic-probability representation of quantum mechanics and quantum optics, tomograms are used for the description of quantum states as an alternative to the wave function and density matrix. The purity, fidelity, entropy and photon temperature associated with quantum states are expressed in terms of tomograms. This provides the possibility of measuring these characteristics directly by taking optical tomograms and checking basic inequalities like entropic uncertainty relations, temperature-dependent quadrature uncertainty relations, etc. The better understanding that quantum states can be identified with measurable probability distributions like optical tomograms opens new prospects in quantum optics, for example, to check experimentally the uncertainty relations for higher quadrature momenta and to control the precision with which the fundamental inequalities of quantum mechanics are experimentally confirmed. This Topical Issue is a collection of papers presented at the 17th Central European Workshop on Quantum Optics (CEWQO10) held at the University of St Andrews, Scotland, UK, 6-11 June 2010. The other collaborators from different scientific centers who could not, for various reasons, come to St Andrews but participated in the previous CEWQOs and plan to participate in future CEWQOs also contributed to this issue. The paper by Ulf Leonhardt and Natalia Korolkova, the CEWQO10 Organizers, opens this issue. The order of the following papers corresponds to the alphabetical order of the first author of the paper. The history of CEWQOs can be found in the Preface to the Proceedings of the 15th CEWQO (2009 Phys. Scr. T135 011005). The Proceedings of the 16th Central European Workshop on Quantum Optics (CEWQO09), held at the University of Turku, are also available (2010 Phys. Scr. T140). The 18th Central European Workshop on Quantum Optics (CEWQO11) will be held in Madrid, Spain on 30 May--3 June 2011. It will be chaired by Professor Luis Lorenzo Sanchez Soto from the Complutense University of Madrid.
List of Papers
The 17th Central European Workshop on Quantum Optics in St Andrews, Scotland (Ulf Leonhardt and Natalia Korolkova)
Double self-Kerr scheme for optical Schrödinger-cat state preparation (P Adam, Z Darázs, T Kiss and M Mechler)
Relations between scaling transformed Husimi functions, Wigner functions and symplectic tomograms describing corresponding physical states (V A Andreev, D M Davidović, L D Davidović and M D Davidović)
Entanglement dynamics of two independent cavity-embedded quantum dots (B Bellomo, G Compagno, R Lo Franco, A Ridolfo and S Savasta)
Dynamical stabilization of spin systems in time-dependent magnetic fields (Yu V Bezvershenko, P I Holod and A Messina)
Entanglement dynamics of a bipartite system in squeezed vacuum reservoirs (Smail Bougouffa and Awatif Hindi)
On Wheeler's delayed-choice Gedankenexperiment and its laboratory realization (M Božić, L Vušković, M Davidović and Á S Sanz)
A smooth, holographically generated ring trap for the investigation of superfluidity in ultracold atoms (Graham D Bruce, James Mayoh, Giuseppe Smirne, Lara Torralbo-Campo and Donatella Cassettari)
Parametric amplification of the classical field in cavities with photoexcited semiconductors (V V Dodonov)
Mutually unbiased bases: tomography of spin states and the star-product scheme (S N Filippov and V I Man'ko)
Quantum trajectory model for photon detectors and optoelectronic devices (Teppo Häyrynen, Jani Oksanen and Jukka Tulkki)
Entanglement in two-mode continuous variable open quantum systems (Aurelian Isar)
A classical field comeback? The classical field viewpoint on triparticle entanglement (Andrei Khrennikov)
Experimental investigation of the enhancement factor and the cross-correlation function for graphs with and without time-reversal symmetry: the open system case (Michał Ławniczak, Szymon Bauch, Oleh Hul and Leszek Sirko)
Independent nonclassical tests for states and measurements in the same experiment (Alfredo Luis and Ángel Rivas)
On the classical capacity of quantum Gaussian channels (Cosmo Lupo, Stefano Pirandola, Paolo Aniello and Stefano Mancini)
Entropic inequalities for center-of-mass tomograms (Margarita A Man'ko)
Semiclassical dynamics for an ion confined within a nonlinear electromagnetic trap (Bogdan M Mihalcea)
Zeno-like phenomena in STIRAP processes (B Militello, M Scala, A Messina and N V Vitanov)
A beam splitter with second-order nonlinearity modeled as a nonlinear coupler (V Peřinová, A Lukš and J Křepelka)
Energy-level shifts of a uniformly accelerated atom between two reflecting plates (L Rizzuto and S Spagnolo)
Cross-Kerr nonlinearities in an optically dressed periodic medium (K Słowik, A Raczyński, J Zaremba, S Zielińska-Kaniasty, M Artoni and G C La Rocca)
An approximate effective beamsplitter interaction between light and atomic ensembles (Richard Tatham, David Menzies and Natalia Korolkova)
Stochastic simulation of long-time nonadiabatic dynamics (Daniel A Uken, Alessandro Sergi and Francesco Petruccione)
BOOK REVIEW: Path Integrals in Field Theory: An Introduction
NASA Astrophysics Data System (ADS)
Ryder, Lewis
2004-06-01
In the 1960s Feynman was known to particle physicists as one of the people who solved the major problems of quantum electrodynamics, his contribution famously introducing what are now called Feynman diagrams. To other physicists he gained a reputation as the author of the Feynman Lectures on Physics; in addition some people were aware of his work on the path integral formulation of quantum theory, and a very few knew about his work on gravitation and Yang--Mills theories, which made use of path integral methods. Forty years later the scene is rather different. Many of the problems of high energy physics are solved; and the standard model incorporates Feynman's path integral method as a way of proving the renormalisability of the gauge (Yang--Mills) theories involved. Gravitation is proving a much harder nut to crack, but here also questions of renormalisability are couched in path-integral language. What is more, theoretical studies of condensed matter physics now also appeal to this technique for quantisation, so the path integral method is becoming part of the standard apparatus of theoretical physics. Chapters on it appear in a number of recent books, and a few books have appeared devoted to this topic alone; the book under review is a very recent one. Path integral techniques have the advantage of enormous conceptual appeal and the great disadvantage of mathematical complexity, this being partly the result of messy integrals but more fundamentally due to the notions of functional differentiation and integration which are involved in the method. All in all this subject is not such an easy ride. Mosel's book, described as an introduction, is aimed at graduate students and research workers in particle physics. It assumes a background knowledge of quantum mechanics, both non-relativistic and relativistic. After three chapters on the path integral formulation of non-relativistic quantum mechanics there are eight chapters on scalar and spinor field theory, followed by three on gauge field theories---quantum electrodynamics and Yang--Mills theories, Faddeev--Popov ghosts and so on. There is no treatment of the quantisation of gravity. Thus in about 200 pages the reader has the chance to learn in some detail about a most important area of modern physics. The subject is tough but the style is clear and pedagogic, results for the most part being derived explicitly. The choice of topics included is main-stream and sensible and one has a clear sense that the author knows where he is going and is a reliable guide. Path Integrals in Field Theory is clearly the work of a man with considerable teaching experience and is recommended as a readable and helpful account of a rather non-trivial subject.
Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong
2015-11-10
A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, combined with additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output of Phase 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. We performed experiments using a large synthesized dataset comparing NHash with random strings and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash.
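A sketch of the two phases under stated assumptions: alphanumeric input fields, wrap-around when an n-gram runs past the end of a field, a fixed shift for the cipher, and a single shared random number r (the abstract leaves open how the per-function random numbers relate to the appended r). The field names and sizes are hypothetical.

```python
import random
import string

ALPHABET = string.ascii_uppercase + string.digits

def shift_cipher(text: str, k: int) -> str:
    """Shift each character k places within the identifier alphabet."""
    return "".join(ALPHABET[(ALPHABET.index(c) + k) % len(ALPHABET)] for c in text)

def nhash(fields: list, sizes: list, seed: int, shift: int = 7) -> str:
    """Phase 1: concatenate randomized n-grams of the input fields.
    Phase 2: encrypt with a shift cipher and append the random number."""
    rng = random.Random(seed)
    r = rng.randrange(1, 10**6)
    pieces = []
    for m, n in zip(fields, sizes):
        s = r % len(m)                       # starting position s = r mod |m|
        pieces.append((m + m)[s:s + n])      # wrap-around n-gram (assumption)
    intermediate = "".join(pieces).upper()
    return shift_cipher(intermediate, shift) + str(r)

study_id = nhash(["SITE14", "PROTO7", "SUBJ0032"], [3, 3, 4], seed=2015)
```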
Reforming of fuel inside fuel cell generator
Grimble, Ralph E.
1988-01-01
Disclosed is an improved method of reforming a gaseous reformable fuel within a solid oxide fuel cell generator, wherein the solid oxide fuel cell generator has a plurality of individual fuel cells in a refractory container, the fuel cells generating a partially spent fuel stream and a partially spent oxidant stream. The partially spent fuel stream is divided into two streams, spent fuel stream I and spent fuel stream II. Spent fuel stream I is burned with the partially spent oxidant stream inside the refractory container to produce an exhaust stream. The exhaust stream is divided into two streams, exhaust stream I and exhaust stream II, and exhaust stream I is vented. Exhaust stream II is mixed with spent fuel stream II to form a recycle stream. The recycle stream is mixed with the gaseous reformable fuel within the refractory container to form a fuel stream which is supplied to the fuel cells. Also disclosed is an improved apparatus which permits the reforming of a reformable gaseous fuel within such a solid oxide fuel cell generator. The apparatus comprises a mixing chamber within the refractory container, means for diverting a portion of the partially spent fuel stream to the mixing chamber, means for diverting a portion of exhaust gas to the mixing chamber where it is mixed with the portion of the partially spent fuel stream to form a recycle stream, means for injecting the reformable gaseous fuel into the recycle stream, and means for circulating the recycle stream back to the fuel cells.
Reforming of fuel inside fuel cell generator
Grimble, R.E.
1988-03-08
Disclosed is an improved method of reforming a gaseous reformable fuel within a solid oxide fuel cell generator, wherein the solid oxide fuel cell generator has a plurality of individual fuel cells in a refractory container, the fuel cells generating a partially spent fuel stream and a partially spent oxidant stream. The partially spent fuel stream is divided into two streams, spent fuel stream 1 and spent fuel stream 2. Spent fuel stream 1 is burned with the partially spent oxidant stream inside the refractory container to produce an exhaust stream. The exhaust stream is divided into two streams, exhaust stream 1 and exhaust stream 2, and exhaust stream 1 is vented. Exhaust stream 2 is mixed with spent fuel stream 2 to form a recycle stream. The recycle stream is mixed with the gaseous reformable fuel within the refractory container to form a fuel stream which is supplied to the fuel cells. Also disclosed is an improved apparatus which permits the reforming of a reformable gaseous fuel within such a solid oxide fuel cell generator. The apparatus comprises a mixing chamber within the refractory container, means for diverting a portion of the partially spent fuel stream to the mixing chamber, means for diverting a portion of exhaust gas to the mixing chamber where it is mixed with the portion of the partially spent fuel stream to form a recycle stream, means for injecting the reformable gaseous fuel into the recycle stream, and means for circulating the recycle stream back to the fuel cells. 1 fig.
Investigation of the Possibility of Using Nuclear Magnetic Spin Alignment
NASA Technical Reports Server (NTRS)
Dent, William V., Jr.
1998-01-01
The goal of the program to investigate a "Gasdynamic fusion propulsion system for space exploration" is to develop a fusion propulsion system for a manned mission to the planet Mars. A study using deuterium and tritium atoms is currently in progress. When these atoms undergo fusion, the resulting neutrons and alpha particles are emitted in random directions (isotropically). The probability of emission is equal for all directions, resulting in wasted energy, massive shielding and cooling requirements, and serious problems with the physics of achieving fusion. If the nuclear magnetic spin moments of the deuterium and tritium nuclei could be precisely aligned at the moment of fusion, the stream of emitted neutrons could be directed out the rear of the spacecraft for thrust and the alpha particles directed forward into an electromagnet to produce electricity to continue operating the fusion engine. The following supporting topics are discussed: nuclear magnetic moments and spin precession in magnetic fields, nuclear spin quantum mechanics, kinematics of nuclear reactions, and angular distribution of particles.
NASA Technical Reports Server (NTRS)
Jackson, Dan E., Jr.
2015-01-01
The planetary exploration programs demand a totally new examination of data multiplexing, digital communications protocols and data transmission principles for both ground and spacecraft operations. Highly adaptive communications devices on-board and on the ground must provide the greatest possible transmitted data density between deployed crew personnel, spacecraft and ground control teams. Regarding these requirements, this proposal borrows from research into quantum mechanical computing by applying the concept of a qubit, a single bit that represents 16 states, to radio frequency (RF) communications link design for exploration programs. This concept of placing multiple character values into a single data bit can easily make the evolutionary steps needed to meet exploration mission demands. To move the qubit from the quantum mechanical research laboratory into long distance RF data transmission, this proposal utilizes polarization modulation of the RF carrier signal to represent numbers from zero to fifteen. It introduces the concept of a binary-to-hexadecimal converter that quickly chops any data stream into 16-bit words and connects variously polarized feedhorns to a single-frequency radio transmitter. Further, the concept relies on development of a receiver that uses low-noise amplifiers and an antenna array to quickly assess carrier polarization and perform hexadecimal-to-binary conversion. Early testbed experiments using the International Space Station (ISS) as an operations laboratory can be implemented to provide the most cost-effective return for research investment. The improvement in signal-to-noise ratio while supporting greater baseband data rates that could be achieved through this concept justifies its consideration for long-distance exploration programs.
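The binary-to-hexadecimal conversion at the heart of the proposal is straightforward to sketch. The Python below takes the simple reading in which each 4-bit value (0 through 15) selects one of the 16 polarization states; the function names and nibble ordering are hypothetical, not taken from the proposal.

```python
def to_polarization_states(data: bytes):
    """Chop a data stream into 4-bit symbols (0-15), one per carrier
    polarization state, most significant nibble first."""
    for byte in data:
        yield byte >> 4
        yield byte & 0x0F

def from_polarization_states(symbols) -> bytes:
    """Receiver side: reassemble received 4-bit symbols into bytes."""
    symbols = list(symbols)
    return bytes((hi << 4) | lo for hi, lo in zip(symbols[::2], symbols[1::2]))

message = b"ISS"
assert from_polarization_states(to_polarization_states(message)) == message
```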
Effect of sequential isoproturon pulse exposure on Scenedesmus vacuolatus.
Vallotton, Nathalie; Eggen, Rik Ilda Lambertus; Chèvre, Nathalie
2009-04-01
Aquatic organisms are typically exposed to fluctuating concentrations of herbicides in streams. To assess the effects on algae of repeated peak exposure to the herbicide isoproturon, we subjected the alga Scenedesmus vacuolatus to two sequential pulse exposure scenarios. Effects on growth and on the inhibition of the effective quantum yield of photosystem II (PSII) were measured. In the first scenario, algae were exposed to short, 5-h pulses at high isoproturon concentrations (400 and 1000 microg/l), each followed by a recovery period of 18 h, while the second scenario consisted of 22.5-h pulses at lower concentrations (60 and 120 microg/l), alternating with short recovery periods (1.5 h). In addition, any changes in the sensitivity of the algae to isoproturon following sequential pulses were examined by determining the growth rate EC50 prior to and following exposure. In both exposure scenarios, we found that algal growth and its effective quantum yield were systematically inhibited during the exposures and that these effects were reversible. Sequential pulses to isoproturon could be considered a sequence of independent events. Nevertheless, a consequence of inhibited growth during the repeated exposures is the cumulative decrease in biomass production. Furthermore, in the second scenario, when the sequence of long pulses began to approach a scenario of continuous exposure, a slight increase in the tolerance of the algae to isoproturon was observed. These findings indicated that sequential pulses do affect algae during each pulse exposure, even if algae recover between the exposures. These observations could support an improved risk assessment of fluctuating exposures to reversibly acting herbicides.
Lahar flow simulation using Laharz_py program: Application for the Mt. Halla volcano, Jeju, Korea
NASA Astrophysics Data System (ADS)
Chang, C.; Yun, S. H.; Yi, W.
2017-12-01
Lahars, catastrophic volcanic events, have the potential to cause loss of life and damage to infrastructure in inhabited areas. This study used the Laharz_py program to make a schematic prediction of the area affected by lahar hazards at the Mt. Halla volcano, Jeju Island. To comprehensively address the impact of lahars at Mt. Halla, two parameters, the H/L ratio and the lahar volume, were selected as input variables for the Laharz_py simulation. Numerical simulations were carried out for possible lahar volumes of 30,000, 50,000, 70,000, 100,000, 300,000, and 500,000 m3 under H/L ratios of 0.20, 0.22, and 0.25. Based on the numerical simulations, the area within the proximal hazard zone boundary gradually decreases with increasing H/L ratio. The number of streams affected by lahars also tended to decrease with increasing H/L ratio. In the case of H/L ratio 0.20, three streams (Gwangryeong stream, Dogeun stream, Han stream) in the Jeju-si area and six streams (Gungsan stream, Hogeun stream, Seohong stream, Donghong stream, Bomok stream, Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. In the case of H/L ratio 0.22, two streams (Gwangryeong stream and Han stream) in the Jeju-si area and five streams (Gungsan stream, Seohong stream, Donghong stream, Bomok stream, Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. And in the case of H/L ratio 0.25, two streams (Gwangryeong stream and Han stream) in the Jeju-si area and one stream (Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. The results of this study will be used as basic data to create a risk map for the direct damage that can be caused by volcanic hazards arising from Mt. Halla. This research was supported by a grant [MPSS-NH-2015-81] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
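For context, Laharz-family tools delineate inundation using the semi-empirical scaling relations of Iverson et al. (1998), in which a lahar of volume V fills a channel cross-section A = 0.05 V^(2/3) and a planimetric area B = 200 V^(2/3). The Python below applies these standard lahar coefficients to the volumes simulated above; whether the study used these exact coefficients is an assumption.

```python
# Laharz scaling relations (Iverson et al., 1998), standard lahar coefficients.
volumes = [30_000, 50_000, 70_000, 100_000, 300_000, 500_000]  # m^3

for v in volumes:
    a_cross = 0.05 * v ** (2 / 3)    # inundated channel cross-section, m^2
    b_planar = 200.0 * v ** (2 / 3)  # planimetric inundation area, m^2
    print(f"V={v:>7,} m^3  A={a_cross:7.0f} m^2  B={b_planar:11,.0f} m^2")
```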
Partial oxidation power plant with reheating and method thereof
Newby, Richard A.; Yang, Wen-Ching; Bannister, Ronald L.
1999-01-01
A system and method for generating power having an air compression/partial oxidation system, a turbine, and a primary combustion system. The air compression/partial oxidation system receives a first air stream and a fuel stream and produces a first partially oxidized fuel stream and a first compressed air stream therefrom. The turbine expands the first partially oxidized fuel stream while being cooled by the first compressed air stream to produce a heated air stream. The heated air stream is injected into the expanding first partially oxidized fuel stream, thereby reheating it in the turbine. A second partially oxidized fuel stream is emitted from the turbine. The primary combustion system receives said second partially oxidized fuel stream and a second air stream, combusts said second partially oxidized fuel stream, and produces rotating shaft power and an emission stream therefrom.
Partial oxidation power plant with reheating and method thereof
Newby, R.A.; Yang, W.C.; Bannister, R.L.
1999-08-10
A system and method are disclosed for generating power having an air compression/partial oxidation system, a turbine, and a primary combustion system. The air compression/partial oxidation system receives a first air stream and a fuel stream and produces a first partially oxidized fuel stream and a first compressed air stream therefrom. The turbine expands the first partially oxidized fuel stream while being cooled by the first compressed air stream to produce a heated air stream. The heated air stream is injected into the expanding first partially oxidized fuel stream, thereby reheating it in the turbine. A second partially oxidized fuel stream is emitted from the turbine. The primary combustion system receives said second partially oxidized fuel stream and a second air stream, combusts said second partially oxidized fuel stream, and produces rotating shaft power and an emission stream therefrom. 2 figs.
Predicting alpine headwater stream intermittency: a case study in the northern Rocky Mountains
Sando, Thomas R.; Blasch, Kyle W.
2015-01-01
This investigation used climatic, geological, and environmental data coupled with observational stream intermittency data to predict alpine headwater stream intermittency. Prediction was made using a random forest classification model. Results showed that the most important variables in the prediction model were snowpack persistence, represented by average snow extent from March through July, mean annual mean monthly minimum temperature, and surface geology types. For stream catchments with intermittent headwater streams, snowpack, on average, persisted until early June, whereas for stream catchments with perennial headwater streams, snowpack, on average, persisted until early July. Additionally, on average, stream catchments with intermittent headwater streams were about 0.7 °C warmer than stream catchments with perennial headwater streams. Finally, headwater stream catchments primarily underlain by coarse, permeable sediment are significantly more likely to have intermittent headwater streams than those primarily underlain by impermeable bedrock. Comparison of the predicted streamflow classification with observed stream status indicated a four percent classification error for first-order streams and a 21 percent classification error for all stream orders in the study area.
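A hedged sketch of such a prediction model, using scikit-learn's RandomForestClassifier with the three predictors the study found most important; the feature values, encoding, and labels below are hypothetical placeholders rather than the study's data.

```python
# pip install scikit-learn
from sklearn.ensemble import RandomForestClassifier

# Features per catchment (hypothetical): snowpack persistence as the day of
# year snow cover ends, mean monthly minimum temperature (degrees C), and
# surface geology coded 1 for coarse permeable sediment, 0 for bedrock.
X = [
    [155, -2.1, 1],
    [185, -2.8, 0],
    [160, -1.9, 1],
    [190, -3.0, 0],
]
y = ["intermittent", "perennial", "intermittent", "perennial"]

model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
print(model.predict([[170, -2.5, 0]]))
print(dict(zip(["snow_persistence", "min_temp", "geology"],
               model.feature_importances_)))
```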
Code of Federal Regulations, 2011 CFR
2011-07-01
... wastewater streams and liquid streams in open systems within an MCPU? 63.2485 Section 63.2485 Protection of... Compliance Requirements § 63.2485 What requirements must I meet for wastewater streams and liquid streams in... to your wastewater streams and liquid streams in open systems within an MCPU, except as specified in...
Interaction between stream temperature, streamflow, and groundwater exchanges in alpine streams
Constantz, James E.
1998-01-01
Four alpine streams were monitored to continuously collect stream temperature and streamflow for periods ranging from a week to a year. In a small stream in the Colorado Rockies, diurnal variations in both stream temperature and streamflow were significantly greater in losing reaches than in gaining reaches, with minimum streamflow losses occurring early in the day and maximum losses occurring early in the evening. Using measured stream temperature changes, diurnal streambed infiltration rates were predicted to increase as much as 35% during the day (based on a heat and water transport groundwater model), while the measured increase in streamflow loss was 40%. For two large streams in the Sierra Nevada Mountains, annual stream temperature variations ranged from 0° to 25°C. In summer months, diurnal stream temperature variations were 30–40% of annual stream temperature variations, owing to reduced streamflows and increased atmospheric heating. Previous reports document that one Sierra stream site generally gains groundwater during low flows, while the second Sierra stream site may lose water during low flows. For August the diurnal streamflow variation was 11% at the gaining stream site and 30% at the losing stream site. On the basis of measured diurnal stream temperature variations, streambed infiltration rates were predicted to vary diurnally as much as 20% at the losing stream site. Analysis of results suggests that evapotranspiration losses determined diurnal streamflow variations in the gaining reaches, while in the losing reaches, evapotranspiration losses were compounded by diurnal variations in streambed infiltration. Diurnal variations in stream temperature were reduced in the gaining reaches as a result of discharging groundwater of relatively constant temperature. For the Sierra sites, comparison of results with those from a small tributary demonstrated that stream temperature patterns were useful in delineating discharges of bank storage following dam releases. Direct coupling may have occurred between streamflow and stream temperature for losing stream reaches, such that reduced streamflows facilitated increased afternoon stream temperatures and increased afternoon stream temperatures induced increased streambed losses, leading to even greater increases in both stream temperature and streamflow losses.
The Stream Depletion Model Paradox and a First Solution
NASA Astrophysics Data System (ADS)
Malama, B.
2017-12-01
Hitherto, stream depletion models available in the hydrogeology literature use the fixed head Dirichlet boundary condition at the stream, and as such do not account for groundwater pumping induced stream drawdown. They simply treat stream depletion as the decrease in stream discharge due to capture by pumping of the groundwater that would discharge to the stream without pumping. We refer to this model-predicted stream depletion without stream drawdown as the depletion paradox. It is intuitively clear, however, that adverse impacts of long-term groundwater abstraction in the neighborhood of a stream include stream drawdown, which has led to many a dry streambed in the American west and other arid regions. Stream drawdown is especially acute for low stream flows. A mathematical model that allows for transient stream drawdown is proposed by introducing the concept of stream storage. The model simply extends the constant head model at the stream by including a mass-balance condition. The model is developed for a fully penetrating stream and groundwater abstraction in a confined aquifer. The dependence of model predicted stream depletion and drawdown on stream storage, streambed conductance, aquifer anisotropy, and radial distance to the pumping well is evaluated. The model is shown to reduce to that of Hantush in the limit as stream storage becomes infinitely large, and to the Theis solution with a no-flow boundary at the stream location when stream storage gets vanishingly small. The results suggest that using fixed stream stage models leads to an underestimation of the late-time aquifer drawdown response to pumping in the neighborhood of a stream because it corresponds to infinite stream storage. This is especially critical for management of surface water and groundwater resources in systems subjected to prolonged groundwater abstraction and measurable stream drawdown. The model also shows a maximum stream depletion rate, beyond which stream flow to the well diminishes and eventually vanishes. This suggests that models with fixed stream stage overestimate the available groundwater supply from streams to pumping wells because of the inherent assumption of infinite stream storage. This has implications for sustainable management of groundwater resources near streams.
Benthic invertebrate fauna, small streams
J. Bruce Wallace; S.L. Eggert
2009-01-01
Small streams (first- through third-order streams) make up >98% of the total number of stream segments and >86% of stream length in many drainage networks. Small streams occur over a wide array of climates, geology, and biomes, which influence temperature, hydrologic regimes, water chemistry, light, substrate, stream permanence, a basin's terrestrial plant...
Lung evolution as a cipher for physiology
Torday, J. S.; Rehan, V. K.
2009-01-01
In the postgenomic era, we need an algorithm to readily translate genes into physiologic principles. The failure to advance biomedicine is due to the false hope raised in the wake of the Human Genome Project (HGP) by the promise of systems biology as a ready means of reconstructing physiology from genes. Like the atom in physics, the cell, not the gene, is the smallest completely functional unit of biology. Trying to reassemble gene regulatory networks without accounting for this fundamental feature of evolution will result in a genomic atlas, but not an algorithm for functional genomics. For example, the evolution of the lung can be "deconvoluted" by applying cell-cell communication mechanisms to all aspects of lung biology: development, homeostasis, and regeneration/repair. Gene regulatory networks common to these processes predict ontogeny, phylogeny, and the disease-related consequences of failed signaling. This algorithm elucidates characteristics of vertebrate physiology as a cascade of emergent and contingent cellular adaptational responses. By reducing complex physiological traits to gene regulatory networks and arranging them hierarchically in a self-organizing map, like the periodic table of elements in physics, the first principles of physiology will emerge. PMID:19366785
Power Consumption and Calculation Requirement Analysis of AES for WSN IoT.
Hung, Chung-Wen; Hsu, Wen-Ting
2018-05-23
Because of the ubiquity of Internet of Things (IoT) devices, the power consumption and security of IoT systems have become very important issues. The Advanced Encryption Standard (AES) is a block cipher algorithm commonly used in IoT devices. In this paper, the power consumption and cryptographic calculation requirements for different payload lengths and AES encryption types are analyzed. These types include software-based AES-ECB, hardware-based AES-ECB (Electronic Codebook mode), and hardware-based AES-CCM (Counter with CBC-MAC mode). The calculation requirements and power consumption for these AES encryption types are measured on the Texas Instruments LAUNCHXL-CC1310 platform. The experimental results show that the hardware-based AES performs better than the software-based AES in terms of power consumption and calculation cycle requirements. In addition, in terms of AES mode selection, the AES-CCM-MIC64 mode may be a better choice if the IoT device must balance security, encryption calculation requirements, and low power consumption at the same time. However, if the IoT device prioritizes lower power and the payload length is generally less than 16 bytes, then AES-ECB could be considered.
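To make the mode trade-off concrete, the snippet below contrasts the two modes in software using the pycryptodome library; this is a minimal sketch for illustration only (the paper measured the CC1310's on-chip and software AES, not this library), and the key, nonce size, and payload are hypothetical. mac_len=8 corresponds to the 64-bit MIC of the AES-CCM-MIC64 mode discussed above.

```python
# pip install pycryptodome
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

key = get_random_bytes(16)      # AES-128 key (hypothetical)
payload = b"sensor reading 42"  # hypothetical IoT payload (17 bytes)

# AES-ECB: no nonce and no MAC; the payload must be padded to 16-byte blocks.
ecb = AES.new(key, AES.MODE_ECB)
ct_ecb = ecb.encrypt(pad(payload, AES.block_size))

# AES-CCM with a 64-bit MIC (mac_len=8): authenticated encryption,
# at the cost of extra calculation and transmission overhead.
nonce = get_random_bytes(11)    # CCM allows 7- to 13-byte nonces
ccm = AES.new(key, AES.MODE_CCM, nonce=nonce, mac_len=8)
ct_ccm, mic = ccm.encrypt_and_digest(payload)

# Compare the bytes that would go over the air in each mode.
print("ECB ciphertext bytes:", len(ct_ecb))
print("CCM ciphertext + MIC bytes:", len(ct_ccm) + len(mic))
```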
The Development of a Portable Hard Disk Encryption/Decryption System with a MEMS Coded Lock.
Zhang, Weiping; Chen, Wenyuan; Tang, Jian; Xu, Peng; Li, Yibin; Li, Shengyong
2009-01-01
In this paper, a novel portable hard-disk encryption/decryption system with a MEMS coded lock is presented, which can authenticate the user and provide the key for the AES encryption/decryption module. The portable hard-disk encryption/decryption system is composed of the authentication module, the USB portable hard-disk interface card, the ATA protocol command decoder module, the data encryption/decryption module, the cipher key management module, the MEMS coded lock controlling circuit module, the MEMS coded lock and the hard disk. The ATA protocol circuit, the MEMS control circuit and AES encryption/decryption circuit are designed and realized by FPGA (Field Programmable Gate Array). The MEMS coded lock with two couplers and two groups of counter-meshing-gears (CMGs) are fabricated by a LIGA-like process and precision engineering method. The whole prototype was fabricated and tested. The test results show that the user's password could be correctly discriminated by the MEMS coded lock, and the AES encryption module could get the key from the MEMS coded lock. Moreover, the data in the hard-disk could be encrypted or decrypted, and the read-write speed of the dataflow could reach 17 MB/s in Ultra DMA mode.
NASA Astrophysics Data System (ADS)
Chuang, Cheng-Hung; Chen, Yen-Lin
2013-02-01
This study presents a steganographic optical image encryption system based on reversible data hiding and double random phase encoding (DRPE) techniques. Conventional optical image encryption systems can securely transmit valuable images using an encryption method for possible application in optical transmission systems. Steganographic optical image encryption systems based on the DRPE technique have been investigated as a way to hide secret data in encrypted images. However, the DRPE technique is vulnerable to attacks, and many of the data hiding methods used in DRPE systems can distort the decrypted images. The proposed system, based on reversible data hiding, uses a JBIG2 compression scheme to achieve lossless decrypted image quality and performs a prior encryption process. Thus, the DRPE technique enables a more secure optical encryption process. The proposed method extracts and compresses the bit planes of the original image using the lossless JBIG2 technique. The secret data are embedded in the remaining storage space. The RSA algorithm can cipher the compressed binary bits and secret data for advanced security. Experimental results show that the proposed system achieves a high data embedding capacity and lossless reconstruction of the original images.
Image Encryption Algorithm Based on Hyperchaotic Maps and Nucleotide Sequences Database
2017-01-01
Image encryption technology is one of the main means of ensuring the safety of image information. Using the characteristics of chaos, such as randomness, regularity, ergodicity, and sensitivity to initial values, combined with the unique spatial conformation of DNA molecules and their unique information storage and processing ability, an efficient method for image encryption based on chaos theory and a DNA sequence database is proposed. In this paper, digital image encryption transforms the gray values of image pixels by using chaotic sequences to scramble pixel locations and by establishing a hyperchaotic mapping between quaternary sequences and DNA sequences, combined with the logic of transformations between DNA sequences. The bases are replaced under displacement rules using DNA coding over a number of iterations based on the enhanced quaternary hyperchaotic sequence, which is generated by the Chen chaotic system. The cipher feedback mode and chaos iteration are employed in the encryption process to enhance the confusion and diffusion properties of the algorithm. Theoretical analysis and experimental results show that the proposed scheme not only demonstrates excellent encryption but also effectively resists chosen-plaintext attack, statistical attack, and differential attack. PMID:28392799
A novel algorithm for thermal image encryption.
Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen
2018-04-16
Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and petroleum extraction sites, so their security is very important. Image data have some unique features, such as intensity, contrast, homogeneity, entropy, and correlation among pixels, which make image encryption somewhat trickier than other kinds of encryption. Conventional image encryption schemes normally find these features hard to handle. Cryptographers have therefore paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes built from the S8 symmetric group of permutations. First, parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the primary image. Then, the plaintext image is encrypted by the method generated from the substitution boxes and the Chebyshev map. By this process, we obtain a cipher-text image that is thoroughly scrambled and dispersed. The outcomes of well-known experiments, key sensitivity tests, and statistical analyses confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
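As a rough illustration of how a Chebyshev map can drive an image cipher, here is a toy Python sketch: the map is iterated to produce a keystream that XOR-diffuses the pixels. The parameter values, the byte quantization, and the XOR diffusion step are assumptions for illustration, and the paper's S8 substitution-box stage is omitted.

```python
import math

def chebyshev_stream(x0: float, k: int, length: int):
    """Iterate the Chebyshev map x_{n+1} = cos(k*arccos(x_n)) on [-1, 1]
    and quantize each state to one keystream byte."""
    x, out = x0, []
    for _ in range(length):
        x = math.cos(k * math.acos(max(-1.0, min(1.0, x))))
        out.append(int((x + 1.0) * 127.5) & 0xFF)  # map [-1, 1] -> [0, 255]
    return out

def encrypt_pixels(pixels, x0=0.3141, k=4):
    """XOR-diffuse a flat list of 8-bit pixel values with the keystream.
    (x0, k) play the role of the secret key, as in the scheme above."""
    keystream = chebyshev_stream(x0, k, len(pixels))
    return [p ^ s for p, s in zip(pixels, keystream)]

# XOR with the same keystream is its own inverse: encrypting twice decrypts.
pixels = [10, 200, 57, 99]
assert encrypt_pixels(encrypt_pixels(pixels)) == pixels
```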
Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis
2013-05-01
Healthcare institutions worldwide have adopted picture archiving and communication system (PACS) for enterprise access to images, relying on Digital Imaging and Communications in Medicine (DICOM) standards for data exchange. However, communication over a wider domain of independent medical institutions is not well standardized. A DICOM-compliant bridge was developed for extending and sharing DICOM services across healthcare institutions without requiring complex network setups or dedicated communication channels. A set of DICOM routers interconnected through a public cloud infrastructure was implemented to support medical image exchange among institutions. Despite the advantages of cloud computing, new challenges were encountered regarding data privacy, particularly when medical data are transmitted over different domains. To address this issue, a solution was introduced by creating a ciphered data channel between the entities sharing DICOM services. Two main DICOM services were implemented in the bridge: Storage and Query/Retrieve. The performance measurements demonstrated that it is quite simple to exchange information and processes between several institutions. The solution can be integrated with any currently installed PACS-DICOM infrastructure. This method works transparently with well-known cloud service providers. Cloud computing was introduced to augment enterprise PACS by providing standard medical imaging services across different institutions, offering communication privacy and enabling creation of wider PACS scenarios with suitable technical solutions.
A novel image encryption algorithm based on the chaotic system and DNA computing
NASA Astrophysics Data System (ADS)
Chai, Xiuli; Gan, Zhihua; Lu, Yang; Chen, Yiran; Han, Daojun
A novel image encryption algorithm using the chaotic system and deoxyribonucleic acid (DNA) computing is presented. Different from traditional encryption methods, the permutation and diffusion of our method are performed on a 3D DNA matrix. Firstly, a 3D DNA matrix is obtained through bit plane splitting, bit plane recombination, and DNA encoding of the plain image. Secondly, 3D DNA level permutation based on position sequence group (3DDNALPBPSG) is introduced, and chaotic sequences generated by the chaotic system are employed to permutate the positions of the elements of the 3D DNA matrix. Thirdly, 3D DNA level diffusion (3DDNALD) is given: the confused 3D DNA matrix is split into sub-blocks, and a blockwise XOR operation is applied to the sub-DNA matrices and the key DNA matrix derived from the chaotic system. Finally, by decoding the diffused DNA matrix, we get the cipher image. The SHA 256 hash of the plain image is employed to calculate the initial values of the chaotic system to avoid chosen-plaintext attack. Experimental results and security analyses show that our scheme is secure against several known attacks, and it can effectively protect the security of the images.
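Both this entry and the next rely on DNA coding, in which each 2-bit pair of a pixel maps to one nucleotide base. Below is a toy Python sketch of a single coding rule; the papers select among the eight complementary rules under chaotic key control, which is omitted here.

```python
# DNA coding rule assumed for illustration: 00->A, 01->C, 10->G, 11->T.
ENCODE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DECODE = {base: bits for bits, base in ENCODE.items()}

def byte_to_dna(b: int) -> str:
    """Encode one 8-bit pixel as four bases, most significant pair first."""
    return "".join(ENCODE[(b >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_to_byte(s: str) -> int:
    """Decode four bases back into one byte."""
    b = 0
    for base in s:
        b = (b << 2) | DECODE[base]
    return b

assert byte_to_dna(0xB4) == "GTCA"   # 0xB4 = 10 11 01 00
assert dna_to_byte("GTCA") == 0xB4
```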
A novel chaos-based image encryption algorithm using DNA sequence operations
NASA Astrophysics Data System (ADS)
Chai, Xiuli; Chen, Yiran; Broyde, Lucie
2017-01-01
An image encryption algorithm based on chaotic system and deoxyribonucleic acid (DNA) sequence operations is proposed in this paper. First, the plain image is encoded into a DNA matrix, and then a new wave-based permutation scheme is performed on it. The chaotic sequences produced by 2D Logistic chaotic map are employed for row circular permutation (RCP) and column circular permutation (CCP). Initial values and parameters of the chaotic system are calculated by the SHA 256 hash of the plain image and the given values. Then, a row-by-row image diffusion method at DNA level is applied. A key matrix generated from the chaotic map is used to fuse the confused DNA matrix; also the initial values and system parameters of the chaotic system are renewed by the hamming distance of the plain image. Finally, after decoding the diffused DNA matrix, we obtain the cipher image. The DNA encoding/decoding rules of the plain image and the key matrix are determined by the plain image. Experimental results and security analyses both confirm that the proposed algorithm has not only an excellent encryption result but also resists various typical attacks.
Bravo, Jose Luis [Houston, TX]; Harvey, III, Albert Destrehan; Vinegar, Harold J [Bellaire, TX]
2012-04-03
Systems and methods of treating a gas stream are described. A method of treating a gas stream includes cryogenically separating a first gas stream to form a second gas stream and a third stream. The third stream is cryogenically contacted with a carbon dioxide stream to form a fourth and fifth stream. A majority of the second gas stream includes methane and/or molecular hydrogen. A majority of the third stream includes one or more carbon oxides, hydrocarbons having a carbon number of at least 2, one or more sulfur compounds, or mixtures thereof. A majority of the fourth stream includes one or more of the carbon oxides and hydrocarbons having a carbon number of at least 2. A majority of the fifth stream includes hydrocarbons having a carbon number of at least 3 and one or more of the sulfur compounds.
Atwood, Trisha; Richardson, John S.
2012-01-01
Two native, stream-associated amphibians are found in coastal streams of the west coast of North America, the tailed frog and the coastal giant salamander, and each interacts with stream insects in contrasting ways. For tailed frogs, their tadpoles are the primary life stage found in steep streams and they consume biofilm from rock surfaces, which can have trophic and non-trophic effects on stream insects. By virtue of their size the tadpoles are relatively insensitive to stream insect larvae, and tadpoles are capable of depleting biofilm levels directly (exploitative competition), and may also "bulldoze" insect larvae from the surfaces of stones (interference competition). Coastal giant salamander larvae, and sometimes adults, are found in small streams where they prey primarily on stream insects, as well as other small prey. This predator-prey interaction with stream insects does not appear to result in differences in the stream invertebrate community between streams with and without salamander larvae. These two examples illustrate the potential for trophic and non-trophic interactions between stream-associated amphibians and stream insects, and also highlight the need for further research in these systems. PMID:26466536
Venarsky, Michael P; Walters, David M; Hall, Robert O; Livers, Bridget; Wohl, Ellen
2018-05-01
In the Colorado Front Range (USA), disturbance history dictates stream planform. Undisturbed, old-growth streams have multiple channels and large amounts of wood and depositional habitat. Disturbed streams (wildfires and logging < 200 years ago) are single-channeled with mostly erosional habitat. We tested how these opposing stream states influenced organic matter, benthic macroinvertebrate secondary production, emerging aquatic insect flux, and riparian spider biomass. Organic matter and macroinvertebrate production did not differ among sites per unit area (m−2), but values were 2×–21× higher in undisturbed reaches per unit of stream valley (m−1 valley) because total stream area was higher in undisturbed reaches. Insect emergence was similar among streams at the per unit area and per unit of stream valley. However, rescaling insect emergence to per meter of stream bank showed that the emerging insect biomass reaching the stream bank was lower in undisturbed sites because multi-channel reaches had 3× more stream bank than single-channel reaches. Riparian spider biomass followed the same pattern as emerging aquatic insects, and we attribute this to bottom-up limitation caused by the multi-channeled undisturbed sites diluting prey quantity (emerging insects) reaching the stream bank (riparian spider habitat). These results show that historic landscape disturbances continue to influence stream and riparian communities in the Colorado Front Range. However, these legacy effects are only weakly influencing habitat-specific function and instead are primarily influencing stream-riparian community productivity by dictating both stream planform (total stream area, total stream bank length) and the proportional distribution of specific habitat types (pools vs riffles).
Feature integration and object representations along the dorsal stream visual hierarchy
Perry, Carolyn Jeane; Fallah, Mazyar
2014-01-01
The visual system is split into two processing streams: a ventral stream that receives color and form information and a dorsal stream that receives motion information. Each stream processes that information hierarchically, with each stage building upon the previous. In the ventral stream this leads to the formation of object representations that ultimately allow for object recognition regardless of changes in the surrounding environment. In the dorsal stream, this hierarchical processing has classically been thought to lead to the computation of complex motion in three dimensions. However, there is evidence to suggest that there is integration of both dorsal and ventral stream information into motion computation processes, giving rise to intermediate object representations, which facilitate object selection and decision making mechanisms in the dorsal stream. First we review the hierarchical processing of motion along the dorsal stream and the building up of object representations along the ventral stream. Then we discuss recent work on the integration of ventral and dorsal stream features that lead to intermediate object representations in the dorsal stream. Finally we propose a framework describing how and at what stage different features are integrated into dorsal visual stream object representations. Determining the integration of features along the dorsal stream is necessary to understand not only how the dorsal stream builds up an object representation but also which computations are performed on object representations instead of local features. PMID:25140147
High temperature methods for forming oxidizer fuel
Bravo, Jose Luis [Houston, TX]
2011-01-11
A method of treating a formation fluid includes providing formation fluid from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes carbon dioxide, hydrogen sulfide, hydrocarbons, hydrogen or mixtures thereof. Molecular oxygen is separated from air to form a molecular oxygen stream comprising molecular oxygen. The first gas stream is combined with the molecular oxygen stream to form a combined stream comprising molecular oxygen and the first gas stream. The combined stream is provided to one or more downhole burners.
Pedersen, Morten Lauge; Kristensen, Klaus Kevin; Friberg, Nikolai
2014-01-01
We evaluated the restoration of physical habitats and its influence on macroinvertebrate community structure in 18 Danish lowland streams comprising six restored streams, six streams with little physical alteration and six channelized streams. We hypothesized that physical habitats and macroinvertebrate communities of restored streams would resemble those of natural streams, while those of the channelized streams would differ from both restored and near-natural streams. Physical habitats were surveyed for substrate composition, depth, width and current velocity. Macroinvertebrates were sampled along 100 m reaches in each stream, in edge habitats and in riffle/run habitats located in the center of the stream. Restoration significantly altered the physical conditions and affected the interactions between stream habitat heterogeneity and macroinvertebrate diversity. The substrate in the restored streams was dominated by pebble, whereas the substrate in the channelized and natural streams was dominated by sand. In the natural streams a relationship was identified between slope and pebble/gravel coverage, indicating a coupling of energy and substrate characteristics. Such a relationship did not occur in the channelized or in the restored streams where placement of large amounts of pebble/gravel distorted the natural relationship. The analyses revealed a direct link between substrate heterogeneity and macroinvertebrate diversity in the natural streams. A similar relationship was not found in either the channelized or the restored streams, which we attribute to a de-coupling of the natural relationship between benthic community diversity and physical habitat diversity. Our study results suggest that restoration schemes should aim at restoring the natural physical structural complexity in the streams and at the same time enhance the possibility of regenerating the natural geomorphological processes sustaining the habitats in streams and rivers. Documentation of restoration efforts should be intensified with continuous monitoring of geomorphological and ecological changes including surveys of reference river systems. PMID:25264627
Code of Federal Regulations, 2012 CFR
2012-07-01
... wastewater streams and liquid streams in open systems within an MCPU? 63.2485 Section 63.2485 Protection of... Standards, and Compliance Requirements § 63.2485 What requirements must I meet for wastewater streams and... subpart that applies to your wastewater streams and liquid streams in open systems within an MCPU, except...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 13 2012-07-01 2012-07-01 false Requirements for Wastewater Streams... to Subpart FFFF of Part 63—Requirements for Wastewater Streams and Liquid Streams in Open Systems... applies to your wastewater streams and liquid streams in open systems within an MCPU: For each . . . You...
Code of Federal Regulations, 2013 CFR
2013-07-01
... wastewater streams and liquid streams in open systems within an MCPU? 63.2485 Section 63.2485 Protection of... Standards, and Compliance Requirements § 63.2485 What requirements must I meet for wastewater streams and... subpart that applies to your wastewater streams and liquid streams in open systems within an MCPU, except...
Code of Federal Regulations, 2014 CFR
2014-07-01
... wastewater streams and liquid streams in open systems within an MCPU? 63.2485 Section 63.2485 Protection of... Standards, and Compliance Requirements § 63.2485 What requirements must I meet for wastewater streams and... subpart that applies to your wastewater streams and liquid streams in open systems within an MCPU, except...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 13 2014-07-01 2014-07-01 false Requirements for Wastewater Streams... to Subpart FFFF of Part 63—Requirements for Wastewater Streams and Liquid Streams in Open Systems... applies to your wastewater streams and liquid streams in open systems within an MCPU: For each . . . You...
Reyt, Ida; Bailliet, Hélène; Valière, Jean-Christophe
2014-01-01
Measurements of streaming velocity are performed by means of Laser Doppler Velocimetry and Particle Image Velocimetry in an experimental apparatus consisting of a cylindrical waveguide with one loudspeaker at each end, driven at high-intensity sound levels. The case of high nonlinear Reynolds number ReNL is particularly investigated. The variations of axial streaming velocity with respect to the axial and transverse coordinates are compared to available Rayleigh streaming theory. As expected, the measured streaming velocity agrees well with the Rayleigh streaming theory for small ReNL but deviates significantly from such predictions for high ReNL. When the nonlinear Reynolds number is increased, the outer centerline axial streaming velocity gets distorted towards the acoustic velocity nodes until counter-rotating additional vortices are generated near the acoustic velocity antinodes. This behavior is exhibited by the outer streaming cells only, and measurements in the near-wall region show that inner streaming vortices are less affected by this substantial evolution of the fast streaming pattern. Measurements of the transient evolution of streaming velocity provide additional insight into the evolution of fast streaming.
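For reference, the classical slow-streaming benchmark against which such measurements are compared is Rayleigh's slip velocity just outside the viscous boundary layer of a standing wave; the form below is the commonly quoted textbook expression, stated as background rather than taken from this paper.

```latex
% Rayleigh streaming slip velocity outside the viscous boundary layer,
% for a standing wave with acoustic velocity amplitude U_0 \sin(kx):
\[
  u_{\mathrm{slip}} = \frac{3}{8}\,\frac{U_0^{2}}{c}\,\sin(2kx)
\]
% c: sound speed; k: acoustic wavenumber. Departures from this pattern
% appear as the nonlinear Reynolds number Re_NL grows.
```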
Venarsky, Michael P.; Walters, David M.; Hall, Robert O.; Livers, Bridget; Wohl, Ellen
2018-01-01
In the Colorado Front Range (USA), disturbance history dictates stream planform. Undisturbed, old-growth streams have multiple channels and large amounts of wood and depositional habitat. Disturbed streams (wildfires and logging < 200 years ago) are single-channeled with mostly erosional habitat. We tested how these opposing stream states influenced organic matter, benthic macroinvertebrate secondary production, emerging aquatic insect flux, and riparian spider biomass. Organic matter and macroinvertebrate production did not differ among sites per unit area (m−2), but values were 2 ×–21 × higher in undisturbed reaches per unit of stream valley (m−1 valley) because total stream area was higher in undisturbed reaches. Insect emergence was similar among streams at the per unit area and per unit of stream valley. However, rescaling insect emergence to per meter of stream bank showed that the emerging insect biomass reaching the stream bank was lower in undisturbed sites because multi-channel reaches had 3 × more stream bank than single-channel reaches. Riparian spider biomass followed the same pattern as emerging aquatic insects, and we attribute this to bottom-up limitation caused by the multi-channeled undisturbed sites diluting prey quantity (emerging insects) reaching the stream bank (riparian spider habitat). These results show that historic landscape disturbances continue to influence stream and riparian communities in the Colorado Front Range. However, these legacy effects are only weakly influencing habitat-specific function and instead are primarily influencing stream–riparian community productivity by dictating both stream planform (total stream area, total stream bank length) and the proportional distribution of specific habitat types (pools vs riffles).
Optimized heat exchange in a CO2 de-sublimation process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baxter, Larry; Terrien, Paul; Tessier, Pascal
The present invention is a process for removing carbon dioxide from a compressed gas stream, including cooling the compressed gas in a first heat exchanger, introducing the cooled gas into a de-sublimating heat exchanger, thereby producing a first solid carbon dioxide stream and a first carbon dioxide poor gas stream, expanding the carbon dioxide poor gas stream, thereby producing a second solid carbon dioxide stream and a second carbon dioxide poor gas stream, combining the first solid carbon dioxide stream and the second solid carbon dioxide stream, thereby producing a combined solid carbon dioxide stream, and indirectly exchanging heat between the combined solid carbon dioxide stream and the compressed gas in the first heat exchanger.
Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D
2015-01-01
Background A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective This paper introduces a novel method called "Randomized N-gram Hashing (NHash)," for generating unique study identifiers in a distributed and validatable fashion, in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk for identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output of Phase 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. Results We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute for Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). Conclusions The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash. PMID:26554419
Review of analytical models to stream depletion induced by pumping: Guide to model selection
NASA Astrophysics Data System (ADS)
Huang, Ching-Sheng; Yang, Tao; Yeh, Hund-Der
2018-06-01
Stream depletion due to groundwater extraction by wells may impact aquatic ecosystems in streams, cause conflict over water rights, and lead to contamination of water from irrigation wells near polluted streams. A variety of studies have been devoted to addressing the issue of stream depletion, but a fundamental framework for analytical modeling developed from the aquifer viewpoint has not yet been established. This review shows key differences among existing models of the stream depletion problem and provides some guidelines for choosing a proper analytical model for the problem of concern. We introduce commonly used models composed of flow equations, boundary conditions, well representations, and stream treatments for confined, unconfined, and leaky aquifers. They are briefly evaluated and classified according to six categories: aquifer type, flow dimension, aquifer domain, stream representation, stream channel geometry, and well type. Finally, we recommend promising analytical approaches that can solve real-world stream depletion problems involving aquifer heterogeneity and irregular stream channel geometry. Several unsolved stream depletion problems are also identified.
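As a concrete instance of the class of models surveyed here, the classical Glover and Balmer (1954) solution for a fully penetrating stream in a homogeneous confined aquifer expresses the depletion fraction in closed form (quoted as background; the review covers many generalizations of this result).

```latex
% Glover-Balmer (1954) stream depletion fraction:
\[
  \frac{Q_s}{Q_w} \;=\; \operatorname{erfc}\!\left(\sqrt{\frac{d^{2}\,S}{4\,T\,t}}\right)
\]
% Q_s: stream depletion rate; Q_w: well pumping rate; d: well-to-stream
% distance; S: aquifer storativity; T: transmissivity; t: time since
% pumping began.
```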
The Maybe Stream: A Possible Cold Stellar Stream in the Ultra-diffuse Galaxy NGC1052-DF2
NASA Astrophysics Data System (ADS)
Abraham, Roberto; Danieli, Shany; van Dokkum, Pieter; Conroy, Charlie; Kruijssen, J. M. Diederik; Cohen, Yotam; Merritt, Allison; Zhang, Jielai; Lokhorst, Deborah; Mowla, Lamiya; Brodie, Jean; Romanowsky, Aaron J.; Janssens, Steven
2018-05-01
We report tentative evidence for a cold stellar stream in the ultra-diffuse galaxy NGC1052-DF2. If confirmed, this stream (which we refer to as "The Maybe Stream") would be the first cold stellar stream detected outside of the Local Group. The candidate stream is very narrow and has an unusual and highly curved shape.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 12 2011-07-01 2009-07-01 true Requirements for Wastewater Streams and... of Part 63—Requirements for Wastewater Streams and Liquid Streams in Open Systems Within an MCPU As... wastewater streams and liquid streams in open systems within an MCPU: For each . . . You must . . . 1...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 13 2013-07-01 2012-07-01 true Requirements for Wastewater Streams and... to Subpart FFFF of Part 63—Requirements for Wastewater Streams and Liquid Streams in Open Systems... applies to your wastewater streams and liquid streams in open systems within an MCPU: For each . . . You...
Liu, W C; Wang, C K J; Parkins, E J
2005-12-01
Although several studies support the existence of a negative stream effect on lower-ability stream students' academic self-concept, there is not enough longitudinal research evidence to preclude the possibility that the stream effect may only be temporary. In addition, not much is known about the effect of streaming on changes in students' academic self-concept over time. The main aims of the study were to examine the effect of streaming on (a) the students' academic self-concept immediately after the streaming process, and at yearly intervals for 3 consecutive years, and (b) the changes in students' academic self-concept over a 3 year period. The sample comprised 495 Secondary 1 students (approximate age 13) from three government coeducational schools in Singapore. A longitudinal survey using a self-reported questionnaire. Results showed that the lower-ability stream students had a more negative academic self-concept than the higher-ability stream students immediately after streaming, but they had a more positive academic self-concept 3 years after being streamed. In addition, it was established that the students' academic self-concept declined from Secondary 1 to Secondary 3. Nonetheless, the decline was more pronounced for the higher-ability stream students than the lower-ability stream students. Streaming may have a short-term negative impact on lower-ability stream students' academic self-concept. However, in the long run, being in the lower-ability stream may not be detrimental to their academic self-concept.
NASA Astrophysics Data System (ADS)
Doyle, Martin W.; Singh, Jai; Lave, Rebecca; Robertson, Morgan M.
2015-07-01
We use geomorphic surveys to quantify the differences between restored and nonrestored streams, and the difference between streams restored for market purposes (compensatory mitigation) and those restored under nonmarket programs. We also analyze the social and political-economic drivers of the stream restoration and mitigation industry through analysis of policy documents and interviews with key personnel, including regulators, mitigation bankers, stream designers, and scientists. Restored streams are typically wider and geomorphically more homogenous than nonrestored streams. Streams restored for the mitigation market are typically headwater streams and part of a large complex of long restored main channels and many restored tributaries; streams restored for nonmarket purposes are typically shorter and consist of the main channel only. Interviews reveal that designers integrate many influences, including economic and regulatory constraints, but traditions of practice have a large influence as well. Thus, social forces shape the morphology of restored streams.
Consciousness of Unification: The Mind-Matter Phallacy Bites the Dust
NASA Astrophysics Data System (ADS)
Beichler, James E.
A complete theoretical model of how consciousness arises in neural nets can be developed based on a mixed quantum/classical basis. Both mind and consciousness are multi-leveled scalar and vector electromagnetic complexity patterns, respectively, which emerge within all living organisms through the process of evolution. Like life, the mind and consciousness patterns extend throughout living organisms (bodies), but the neural nets and higher level groupings that distinguish higher levels of consciousness only exist in the brain so mind and consciousness have been traditionally associated with the brain alone. A close study of neurons and neural nets in the brain shows that the microtubules within axons are classical bio-magnetic inductors that emit and absorb electromagnetic pulses from each other. These pulses establish interference patterns that influence the quantized vector potential patterns of interstitial water molecules within the neurons as well as create the coherence within neurons and neural nets that scientists normally associate with more complex memories, thought processes and streams of thought. Memory storage and recall are guided by the microtubules and the actual memory patterns are stored as magnetic vector potential complexity patterns in the points of space at the quantum level occupied by the water molecules. This model also accounts for the plasticity of the brain and implies that mind and consciousness, like life itself, are the result of evolutionary processes. However, consciousness can evolve independent of an organism's birth genetics once it has evolved by normal bottom-up genetic processes and thus force a new type of top-down evolution on living organisms and species as a whole that can be explained by expanding the laws of thermodynamics to include orderly systems.
Direct spectroscopic evidence for isolated silanols in SiOx/Al2O3 and their formation mechanism
Mouat, Aidan R.; Kobayashi, Takeshi; Pruski, Marek; ...
2017-02-27
Here, the preparation and unambiguous characterization of isolated Brønsted-acidic silanol species on silica–alumina catalysts presents a key challenge in the rational design of solid acid catalysts. In this report, atomic layer deposition (ALD) and liquid-phase preparation (chemical liquid deposition, CLD) are used to install the SiOx sites on Al2O3 catalysts using the same Si source (tetraethylorthosilicate, TEOS). The ALD-derived and CLD-derived SiOx sites are probed with dynamic nuclear polarization (DNP)-enhanced 29Si–29Si double-quantum/single-quantum (DQ/SQ) correlation NMR spectroscopy. The investigation reveals conclusively that the SiOx/Al2O3 material prepared by ALD and CLD, followed by calcination under an O2 stream, contains fully spatially isolated Si species, in contrast with those resulting from calcination under static air, which is widely accepted as a postgrafting treatment for CLD. Insight into the formation mechanism of these sites is obtained via in situ monitoring of the TEOS + γ-Al2O3 reaction in an environmental diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) cell. Upon calcination, the DRIFTS spectra of SiOx/Al2O3 reveal a signature unambiguously assignable to isolated Brønsted-acidic silanol species. Surprisingly, the results of this study indicate that the method of preparing SiOx/Al2O3 catalysts is less important to the final structure of the silanol sites than the post-treatment conditions. This finding should greatly simplify the methods for synthesizing site-isolated, Brønsted-acidic SiOx/Al2O3 catalysts.
Salamander occupancy in headwater stream networks
Grant, E.H.C.; Green, L.E.; Lowe, W.H.
2009-01-01
1. Stream ecosystems exhibit a highly consistent dendritic geometry in which linear habitat units intersect to create a hierarchical network of connected branches. 2. Ecological and life history traits of species living in streams, such as the potential for overland movement, may interact with this architecture to shape patterns of occupancy and response to disturbance. Specifically, large-scale habitat alteration that fragments stream networks and reduces connectivity may reduce the probability a stream is occupied by sensitive species, such as stream salamanders. 3. We collected habitat occupancy data on four species of stream salamanders in first-order (i.e. headwater) streams in undeveloped and urbanised regions of the eastern U.S.A. We then used an information-theoretic approach to test alternative models of salamander occupancy based on a priori predictions of the effects of network configuration, region and salamander life history. 4. Across all four species, we found that streams connected to other first-order streams had higher occupancy than those flowing directly into larger streams and rivers. For three of the four species, occupancy was lower in the urbanised region than in the undeveloped region. 5. These results demonstrate that the spatial configuration of stream networks within protected areas affects the occurrences of stream salamander species. We strongly encourage preservation of network connections between first-order streams in conservation planning and management decisions that may affect stream species.
A River Runs Under It: Modeling the Distribution of Streams and Stream Burial in Large River Basins
NASA Astrophysics Data System (ADS)
Elmore, A. J.; Julian, J.; Guinn, S.; Weitzell, R.; Fitzpatrick, M.
2011-12-01
Stream network density exerts a strong control on hydrologic processes in watersheds. Over land and through soil and bedrock substrate, water moves slowly and is subject to chemical transformations unique to conditions of continuous contact with geologic materials. In contrast, once water enters stream channels it is efficiently transported out of watersheds, reducing the amount of time for biological uptake and stream nutrient processing. Therefore, stream network density dictates both the relative importance of terrestrial and aquatic influences to stream chemistry and the residence time of water in watersheds, and is critical to modeling and empirical studies aimed at understanding the impact of land use on stream water quantity and quality. Stream network density is largely a function of the number and length of the smallest streams. Methods for mapping and measuring these headwater streams range from simple measurement of stream length from existing maps, to detailed field mapping efforts, which are difficult to implement over large areas. Confounding the simplest approaches, many headwater stream reaches are not included in hydrographical maps, such as the U.S. National Hydrography Dataset (NHD), either because they were buried during the course of urban development or because they were seen as smaller than the minimum mapping size at the time of map generation. These "missing streams" severely limit the effective analyses of stream network density based on the NHD, constituting a major problem for many efforts to understand land-use impacts on streams. Here we report on research that predicts stream presence and absence by coupling field observations of headwater stream channels with maximum entropy models (MaxEnt) commonly implemented in biogeographical studies to model species distributions. The model utilizes terrain variables that are continuously accumulated along hydrologic flowpaths derived from a 10-m digital elevation model. In validation, the model correctly predicts the presence of 91% of all 10-m stream segments, and rarely miscalculates tributary numbers. We apply this model to the entire Potomac River Basin (37,800 km2) and several adjacent basins to map stream channel density and compare our results with NHD flowline data. We find that NHD underestimates stream channel density by a factor of two in most sub watersheds and this effect is strongest in the densely urbanized cities of Washington, DC and Baltimore, MD. We then apply a second predictive model based on impervious surface area data to map the extent of stream burial. Results demonstrate that the extent of stream burial increases with decreasing stream catchment area. When applied at four time steps (1975, 1990, 2001, and 2006), we find that although stream burial rates have slowed in the recent decade, streams that are not mapped in NHD flowline data continue to be buried during development. This work is the most ambitious attempt yet to map stream network density over a large region and will have lasting implications for modeling and conservation efforts.
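A hedged sketch of the modeling step described above: since MaxEnt presence-background models are closely related to logistic regression, the sketch below fits a logistic classifier to terrain features accumulated along flowpaths. The feature set, file name, and train/test split are hypothetical placeholders, not the study's configuration.

# Hedged sketch: approximating a MaxEnt-style channel-presence model with
# logistic regression (closely related for presence/background data).
# The feature names and CSV file are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# columns: flow accumulation, local slope, topographic wetness index, label
data = np.loadtxt("terrain_samples.csv", delimiter=",", skiprows=1)
X, y = data[:, :3], data[:, 3]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("fraction of 10-m segments correctly classified:", model.score(X_test, y_test))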
Where Did All the Streams Go? Effects of Urbanization on Hydrologic Permanence of Headwater Streams
Headwater streams represent a majority (up to 70%) of the stream length in the United States; however, these small streams are often piped or filled to accommodate residential, commercial, and industrial development. Legal protection of headwater streams under the Clean Water Ac...
Discretized Streams: A Fault-Tolerant Model for Scalable Stream Processing
2012-12-14
Matei Zaharia; Tathagata Das; Haoyuan Li; Timothy Hunter; Scott Shenker; Ion...
Current programming models for distributed stream processing are relatively low-level, often leaving the user to worry about consistency of...
Comparison of drinking water treatment process streams for optimal bacteriological water quality.
Ho, Lionel; Braun, Kalan; Fabris, Rolando; Hoefel, Daniel; Morran, Jim; Monis, Paul; Drikas, Mary
2012-08-01
Four pilot-scale treatment process streams (Stream 1 - Conventional treatment (coagulation/flocculation/dual media filtration); Stream 2 - Magnetic ion exchange (MIEX)/Conventional treatment; Stream 3 - MIEX/Conventional treatment/granular activated carbon (GAC) filtration; Stream 4 - Microfiltration/nanofiltration) were commissioned to compare their effectiveness in producing high quality potable water prior to disinfection. Despite receiving highly variable source water quality throughout the investigation, each stream consistently reduced colour and turbidity to below Australian Drinking Water Guideline levels, with the exception of Stream 1 which was difficult to manage due to the reactive nature of coagulation control. Of particular interest was the bacteriological quality of the treated waters where flow cytometry was shown to be the superior monitoring tool in comparison to the traditional heterotrophic plate count method. Based on removal of total and active bacteria, the treatment process streams were ranked in the order: Stream 4 (average log removal of 2.7) > Stream 2 (average log removal of 2.3) > Stream 3 (average log removal of 1.5) > Stream 1 (average log removal of 1.0). The lower removals in Stream 3 were attributed to bacteria detaching from the GAC filter. Bacterial community analysis revealed that the treatments affected the bacteria present, with the communities in streams incorporating conventional treatment clustering with each other, while the community composition of Stream 4 was very different to those of Streams 1, 2 and 3. MIEX treatment was shown to enhance removal of bacteria due to more efficient flocculation which was validated through the novel application of the photometric dispersion analyser.
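The ranking above rests on a simple quantity, the log removal value: the base-10 logarithm of the ratio of influent to effluent cell counts. A minimal sketch of the arithmetic (the counts below are invented for illustration, not data from the study):

# Log removal value (LRV) = log10(influent count / effluent count).
# The cell counts here are made-up illustrations, not measurements.
import math

def log_removal(influent_per_ml: float, effluent_per_ml: float) -> float:
    return math.log10(influent_per_ml / effluent_per_ml)

print(log_removal(1e6, 2e3))  # ~2.7, matching the average reported for Stream 4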
Episodic acidification and changes in fish diversity in Pennsylvania headwater streams
Heard, R.M.; Sharpe, W.E.; Carline, R.F.; Kimmel, William G.
1997-01-01
Current water chemistry and fish communities in 70 Pennsylvania streams were compared with historical records to determine whether fish species richness had declined and, if so, the possible role of acidification. First-, second-, and third-order streams were selected, and stream sites sampled during the 1961-1971 survey were resampled during May and June 1994 in the Appalachian Plateaus province and during June 1995 in the Valley and Ridge province. Streamflow was measured and a habitat assessment was completed at each site. Dominant bedrock types influencing the stream sampling site were determined for the Appalachian Plateaus streams. Episodic water chemistry was collected for 39 of the 50 Appalachian Plateaus streams and 14 of the 20 Valley and Ridge streams during the winter and spring of 1996. Thirty-eight (76%) streams of the Appalachian Plateaus province and 13 (65%) streams in the Valley and Ridge province had a loss of fish species since the 1961-1971 sampling period. Habitat scores were not related to losses of fish species. Of the 53 streams sampled during runoff episodes, 22 (42%) increased in total dissolved aluminum by more than 50 µg/L, and 31 (58%) streams decreased in pH by 0.5 units or more. Minnows (Cyprinidae) and darters (Percidae) are sensitive to acidity and were the species most often lost. Streams draining watersheds of the Appalachian Plateaus province dominated by Pottsville bedrock had more acidic water quality during base flow and storm flow sampling periods than streams dominated by Pocono bedrock. The results of this study indicate that many Pennsylvania streams have undergone an alarming reduction in fish diversity during the past 25-34 years. In many of these streams the loss in fish diversity may be attributed to episodic acidification.
Jujasz, Albert J.; Burkhart, James A.; Greenberg, Ralph
1988-01-01
A method for the separation of gaseous mixtures such as air and for producing medium purity oxygen, comprising compressing the gaseous mixture in a first compressor to about 3.9-4.1 atmospheres pressure, passing said compressed gaseous mixture in heat exchange relationship with sub-ambient temperature gaseous nitrogen, dividing the cooled, pressurized gaseous mixture into first and second streams, introducing the first stream into the high pressure chamber of a double rectification column, separating the gaseous mixture in the rectification column into a liquid oxygen-enriched stream and a gaseous nitrogen stream and supplying the gaseous nitrogen stream for cooling the compressed gaseous mixture, removing the liquid oxygen-enriched stream from the low pressure chamber of the rectification column and pumping the liquid, oxygen-enriched stream to a predetermined pressure, cooling the second stream, condensing the cooled second stream and evaporating the oxygen-enriched stream in an evaporator-condenser, delivering the condensed second stream to the high pressure chamber of the rectification column, and heating the oxygen-enriched stream and blending the oxygen-enriched stream with a compressed blend-air stream to the desired oxygen concentration.
Methods and apparatuses for deoxygenating biomass-derived pyrolysis oil
Baird, Lance Awender; Brandvold, Timothy A.
2015-10-20
Embodiments of methods and apparatuses for deoxygenating a biomass-derived pyrolysis oil are provided. In one example, a method comprises the steps of separating a low-oxygen biomass-derived pyrolysis oil effluent into a low-oxygen-pyoil organic phase stream and an aqueous phase stream. Phenolic compounds are removed from the aqueous phase stream to form a phenolic-rich diluent recycle stream. A biomass-derived pyrolysis oil stream is diluted and heated with the phenolic-rich diluent recycle stream to form a heated diluted pyoil feed stream. The heated diluted pyoil feed stream is contacted with a deoxygenating catalyst in the presence of hydrogen to deoxygenate the heated diluted pyoil feed stream.
Land, Larry F.; Shipp, Allison A.
1996-01-01
Water samples collected from streams draining an agricultural area in the west-central part of the Trinity River Basin upstream from the Richland-Chambers Reservoir and from streams draining an urban area in the Dallas-Fort Worth metropolitan area during March 1993 - September 1995 were analyzed for nutrients (nitrogen and phosphorus compounds). A comparison of the data for agricultural and urban streams shows the maximum concentration of total nitrogen is from an urban stream and the maximum concentration of total phosphorus is from an agricultural stream. One-half of the samples have total nitrogen concentrations equal to or less than 1.1 and 1.0 milligrams per liter in the agricultural and urban streams, respectively; and one-half of the samples have total phosphorus concentrations equal to or less than 0.04 and 0.05 milligram per liter in the agricultural and urban streams, respectively. The highest concentrations of total nitrogen in both types of streams are in the spring. The minimum concentrations of total nitrogen are during the summer in the agricultural streams and during the winter in the urban streams. Concentrations of total phosphorus in agricultural streams show negligible seasonal variability. The highest concentrations of total phosphorus are in spring and possibly late summer in the urban streams. In the midrange of streamflow in the urban streams and throughout the range of streamflow in the agricultural streams, concentrations of total nitrogen increase. Concentrations of total phosphorus increase with streamflow in the middle and upper ranges of streamflow in both agricultural and urban streams.
Modeling the Impact of Stream Discharge Events on Riparian Solute Dynamics.
Mahmood, Muhammad Nasir; Schmidt, Christian; Fleckenstein, Jan H; Trauth, Nico
2018-03-22
The biogeochemical composition of stream water and the surrounding riparian water is mainly defined by the exchange of water and solutes between the stream and the riparian zone. Short-term fluctuations in near-stream hydraulic head gradients (e.g., during stream flow events) can significantly influence the extent and rate of exchange processes. In this study, we simulate exchanges between streams and their riparian zone driven by stream stage fluctuations during single stream discharge events of varying peak height and duration. Simulated results show that strong stream flow events can trigger solute mobilization in riparian soils and subsequent export to the stream. The timing and amount of solute export is linked to the shape of the discharge event. Higher peaks and increased durations significantly enhance solute export; however, peak height is found to be the dominant control for overall mass export. Mobilized solutes are transported to the stream in two stages: (1) by return flow of stream water that was stored in the riparian zone during the event and (2) by vertical movement to the groundwater under gravity drainage from the unsaturated parts of the riparian zone, which lasts for a significantly longer time (> 400 days), resulting in long tailing of bank outflows and solute mass outfluxes. We conclude that strong stream discharge events can mobilize and transport solutes from near-stream riparian soils into the stream. The impact of short-term stream discharge variations on solute exchange may last long after the flow event.
Leaf breakdown in streams differing in catchment land use
Paul, M.J.; Meyer, J.L.; Couch, C.A.
2006-01-01
1. The impact of changes in land use on stream ecosystem function is poorly understood. We studied leaf breakdown, a fundamental process of stream ecosystems, in streams that represent a range of catchment land use in the Piedmont physiographic province of the south-eastern United States. 2. We placed bags of chalk maple (Acer barbatum) leaves in similar-sized streams in 12 catchments of differing dominant land use: four forested, three agricultural, two suburban and three urban catchments. We measured leaf mass, invertebrate abundance and fungal biomass in leaf bags over time. 3. Leaves decayed significantly faster in agricultural (0.0465 day-1) and urban (0.0474 day-1) streams than in suburban (0.0173 day-1) and forested (0.0100 day-1) streams. Additionally, breakdown rates in the agricultural and urban streams were among the fastest reported for deciduous leaves in any stream. Nutrient concentrations in agricultural streams were significantly higher than in any other land-use type. Fungal biomass associated with leaves was significantly lower in urban streams; while shredder abundance in leaf bags was significantly higher in forested and agricultural streams than in suburban and urban streams. Storm runoff was significantly higher in urban and suburban catchments that had higher impervious surface cover than forested or agricultural catchments. 4. We propose that processes accelerating leaf breakdown in agricultural and urban streams were not the same: faster breakdown in agricultural streams was due to increased biological activity as a result of nutrient enrichment, whereas faster breakdown in urban streams was a result of physical fragmentation resulting from higher storm runoff. ?? 2006 The Authors.
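The breakdown coefficients reported above are first-order decay rates, so leaf mass remaining follows M(t) = M0·exp(-kt). A short sketch comparing the half-lives implied by the reported rates (this is just the arithmetic, not the study's analysis):

# Leaf mass remaining under first-order breakdown, M(t) = M0 * exp(-k*t),
# using the decay coefficients reported in the abstract (per day).
import math

rates = {"urban": 0.0474, "agricultural": 0.0465,
         "suburban": 0.0173, "forested": 0.0100}
for land_use, k in rates.items():
    half_life = math.log(2) / k
    print(f"{land_use}: ~{half_life:.0f} days to 50% mass loss")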
Expected number of quantum channels in quantum networks.
Chen, Xi; Wang, He-Ming; Ji, Dan-Tong; Mu, Liang-Zhu; Fan, Heng
2015-07-15
Quantum communication between nodes in quantum networks plays an important role in quantum information processing. Here, we propose the use of the expected number of quantum channels (ENQC) as a measure of the efficiency of quantum communication for quantum networks. This measure quantifies the amount of quantum information that can be teleported between nodes in a quantum network, which differs from the classical case in that the quantum channels are consumed when teleportation is performed. We further demonstrate that the expected number of quantum channels represents local correlations depicted by effective circles. Significantly, the capacity of quantum communication in quantum networks, as quantified by the ENQC, is independent of the distance between the communicating nodes if their effective circles do not overlap. The expected number of quantum channels can be enhanced through transformations of the lattice configurations of quantum networks via entanglement swapping. Our results can shed light on the study of quantum communication in quantum networks.
Stream network and stream segment temperature models software
Bartholow, John
2010-01-01
This set of programs simulates steady-state stream temperatures throughout a dendritic stream network handling multiple time periods per year. The software requires a math co-processor and 384K RAM. Also included is a program (SSTEMP) designed to predict the steady state stream temperature within a single stream segment for a single time period.
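For readers wanting the flavor of a single-segment steady-state calculation of the kind SSTEMP performs, the sketch below relaxes upstream temperature exponentially toward an equilibrium temperature with distance downstream. This is a generic textbook form under assumed parameter values, not SSTEMP's actual implementation; all names and numbers are illustrative.

# Generic steady-state relaxation toward equilibrium temperature,
# T(x) = Te + (T0 - Te) * exp(-K * x / (rho * c * q)).
import math

def downstream_temp(T0, Te, K, x, rho=1000.0, c=4182.0, q=0.5):
    """T0: upstream temp (C); Te: equilibrium temp (C); K: bulk surface
    exchange coefficient (W m-2 K-1); x: distance (m); q: discharge per
    unit width (m2 s-1). All values here are illustrative assumptions."""
    return Te + (T0 - Te) * math.exp(-K * x / (rho * c * q))

print(downstream_temp(T0=12.0, Te=18.0, K=30.0, x=5000.0))  # warms toward 18 C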
While the effects of urbanization on stream ecosystems have been well-documented, little is known regarding the impact of burying streams within culverts. Our project aims to explore the ecological impacts of stream burial at a fine spatial scale. Two culverted urban streams in C...
Stream-groundwater exchange and hydrologic turnover at the network scale
NASA Astrophysics Data System (ADS)
Covino, Tim; McGlynn, Brian; Mallard, John
2011-12-01
The exchange of water between streams and groundwater can influence stream water quality, hydrologic mass balances, and attenuate solute export from watersheds. We used conservative tracer injections (chloride, Cl-) across 10 stream reaches to investigate stream water gains and losses from and to groundwater at larger spatial and temporal scales than typically associated with hyporheic exchanges. We found strong relationships between reach discharge, median tracer velocity, and gross hydrologic loss across a range of stream morphologies and sizes in the 11.4 km2 Bull Trout Watershed of central ID. We implemented these empirical relationships in a numerical network model and simulated stream water gains and losses and subsequent fractional hydrologic turnover across the stream network. We found that stream gains and losses from and to groundwater can influence source water contributions and stream water compositions across stream networks. Quantifying proportional influences of source water contributions from runoff generation locations across the network on stream water composition can provide insight into the internal mechanisms that partially control the hydrologic and biogeochemical signatures observed along networks and at watershed outlets.
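One way to picture fractional hydrologic turnover is as a running product of per-reach retention: each gross loss to groundwater removes a fraction of the water in transit, and gains re-dilute what remains. A minimal sketch of the loss side (the per-reach loss fractions are hypothetical, not the Bull Trout Watershed values):

# Fractional turnover along a chain of reaches: the share of water from the
# headwater source remaining after each reach is the running product of
# (1 - gross loss / reach inflow). Reach values below are hypothetical.
losses = [0.02, 0.05, 0.03, 0.08]   # gross loss as a fraction of reach inflow
fraction_remaining = 1.0
for i, loss in enumerate(losses, start=1):
    fraction_remaining *= (1.0 - loss)
    print(f"after reach {i}: {fraction_remaining:.2%} headwater-source water")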
Impact of riparian land use on stream insects of Kudremukh National Park, Karnataka state, India.
Subramanian, K A; Sivaramakrishnan, K G; Gadgil, Madhav
2005-12-31
The impact of riparian land use on the stream insect communities was studied at Kudremukh National Park located within Western Ghats, a tropical biodiversity hotspot in India. The diversity and community composition of stream insects varied across streams with different riparian land use types. The rarefied family and generic richness was highest in streams with natural semi evergreen forests as riparian vegetation. However, when the streams had human habitations and areca nut plantations as riparian land use type, the rarefied richness was higher than that of streams with natural evergreen forests and grasslands. The streams with scrub lands and iron ore mining as the riparian land use had the lowest rarefied richness. Within a landscape, the streams with the natural riparian vegetation had similar community composition. However, streams with natural grasslands as the riparian vegetation, had low diversity and the community composition was similar to those of paddy fields. We discuss how stream insect assemblages differ due to varied riparian land use patterns, reflecting fundamental alterations in the functioning of stream ecosystems. This understanding is vital to conserve, manage and restore tropical riverine ecosystems.
Slip stream apparatus and method for treating water in a circulating water system
Cleveland, J.R.
1997-03-18
An apparatus is described for treating water in a circulating water system that has a cooling water basin which includes a slip stream conduit in flow communication with the circulating water system, a source of acid solution in flow communication with the slip stream conduit, and a decarbonator in flow communication with the slip stream conduit and the cooling water basin. In use, a slip stream of circulating water is drawn from the circulating water system into the slip stream conduit of the apparatus. The slip stream pH is lowered by contact with an acid solution provided from the source thereof. The slip stream is then passed through a decarbonator to form a treated slip stream, and the treated slip stream is returned to the cooling water basin. 4 figs.
NASA Astrophysics Data System (ADS)
Singh, J.; Doyle, M.; Lave, R.; Robertson, M.
2015-12-01
Stream restoration is increasingly driven by compensatory mitigation; impacts to streams associated with typical land development activities must be offset via restoration of streams elsewhere. This policy creates an environment where restored stream 'credits' are traded under market-like conditions, comparable to wetland mitigation, carbon offsets, or endangered species habitat banking. The effect of mitigation on restoration design and construction is unknown. We use geomorphic surveys to quantify the differences between restored and nonrestored streams, and to distinguish streams restored for market purposes (compensatory mitigation) from those restored for nonmarket programs. Physical study sites are located in the state of North Carolina, USA. We also analyze the social and political-economic drivers of the stream restoration and mitigation industry using analysis of policy documents and interviews with key personnel including regulators, mitigation bankers, stream designers, and scientists. Restored streams are typically wider, shallower and geomorphically more homogeneous than nonrestored streams. For example, nonrestored streams are typically characterized by more than an order of magnitude variability in radius of curvature and meander wavelength within a single study reach. By contrast, the radius of curvature in many restored streams does not vary for nearly the entire project reach. Streams restored for the mitigation market are typically headwater streams that form part of a large complex of long restored main channels and many restored tributaries; streams restored for nonmarket purposes are typically shorter and consist of the main channel only. Interviews reveal that social forces shape the morphology of restored streams. Designers integrate many influences including economic and regulatory constraints, but traditions of practice have a large influence as well. Home to a fairly mature stream mitigation banking market, North Carolina can provide lessons for other states or countries with younger mitigation banking programs (e.g., Oregon and Montana) as well as places considering their introduction.
Kirby, C S; McInerney, B; Turner, M D
2008-04-15
Atmospheric acid deposition is of environmental concern worldwide, and the determination of impacts in remote areas can be problematic. Rainwater in central Pennsylvania, USA, has a mean pH of approximately 4.4. Bedrock varies dramatically in its ability to neutralize acidity. A GIS database simplified reconnaissance of non-carbonate bedrock streams in the Valley and Ridge Province and identified potentially chronically impacted headwater streams, which were sampled for chemistry and brook trout. Stream sites (n=26) that originate in and flow through the Tuscarora had a median pH of 5.0 that was significantly different from other formations. Shawangunk streams (n=6) and non-Tuscarora streams (n=20) had a median pH of 6.0 and 6.3, respectively. Mean alkalinity for non-Tuscarora streams (2.6 mg/L CaCO(3)) was higher than the mean for Tuscarora streams (0.5 mg/L). Lower pH and alkalinity suggest that the buffering capability of the Tuscarora is inferior to that of adjacent sandstones. Dissolved aluminum concentrations were much higher for Tuscarora streams (0.2 mg/L; approximately the lethal limit for brook trout) than for non-Tuscarora streams (0.03 mg/L) or Shawangunk streams (0.02 mg/L). Hook-and-line methods determined the presence/absence of brook trout in 47 stream reaches with suitable habitat. Brook trout were observed in 21 of 22 non-Tuscarora streams, all 6 Shawangunk streams, and only 9 of 28 Tuscarora stream sites. Carefully-designed hook-and-line sampling can determine the presence or absence of brook trout and help confirm biological impacts of acid deposition. 15% of 334 km of Tuscarora stream lengths are listed as "impaired" due to atmospheric deposition by the Pennsylvania Department of Environmental Protection. 65% of the 101 km of Tuscarora stream lengths examined in this study were impaired.
Żelazna-Wieczorek, Joanna; Nowicka-Krawczyk, Paulina
2015-12-15
A series of cascade artificial ponds were constructed to improve the ecological status of the stream. To evaluate the effects of restoration practices, a bioassessment, based on phytobenthic algae - the diatoms, was made. Hierarchical Cluster Analysis (HCA) and Principal Component Analysis (PCA) of diatom assemblages allowed for evaluating the influence of a series of cascade artificial ponds on stream integrity. To reveal which environmental factors had the greatest influence on shaping diatom assemblages, the BIO-ENV procedure was used, and in order to examine whether these factors had equal influence on diatoms along the stream, Redundancy Analysis (RDA) was used. The analysis of diatom assemblages allowed for the calculation of the diatom indices in order to assess the water quality and the ecological status of the stream. Artificial ponds constructed on the stream had significant effects on the integrity of the stream ecosystem. Diatom assemblages characteristic of stream habitats were disrupted by the species from ponds. HCA and PCA revealed that the stream was clearly divided into three sections: ponds, stream parts under the influence of ponds, and stream parts isolated from ponds. The ponds thus altered stream environmental conditions. Benthic diatom assemblages were affected by a combination of four environmental factors: the concentration of ammonium ions, dissolved oxygen, conductivity, and the amount of total suspended material in the water. These factors, together with water pH, had a diverse influence on diatom assemblages alongside the stream, which was caused by a series of cascade ponds. In theory, this restoration practice should restore the stream close to its natural state, but bioassessment of the stream ecosystem based on diatoms revealed that there was no improvement of the ecological status alongside the stream. The construction of artificial ponds disrupted stream continuity and altered the character of the stream ecosystem.
Arismendi, Ivan; Dunham, Jason B.; Heck, Michael; Schultz, Luke; Hockman-Wert, David
2017-01-01
Intermittent and ephemeral streams represent more than half of the length of the global river network. Dryland freshwater ecosystems are especially vulnerable to changes in human-related water uses as well as shifts in terrestrial climates. Yet, the description and quantification of patterns of flow permanence in these systems is challenging mostly due to difficulties in instrumentation. Here, we took advantage of existing stream temperature datasets in dryland streams in the northwest Great Basin desert, USA, to extract critical information on climate-sensitive patterns of flow permanence. We used a signal detection technique, Hidden Markov Models (HMMs), to extract information from daily time series of stream temperature to diagnose patterns of stream drying. Specifically, we applied HMMs to time series of daily standard deviation (SD) of stream temperature (i.e., dry stream channels typically display highly variable daily temperature records compared to wet stream channels) between April and August (2015–2016). We used information from paired stream and air temperature data loggers as well as co-located stream temperature data loggers with electrical resistors as confirmatory sources of the timing of stream drying. We expanded our approach to an entire stream network to illustrate the utility of the method to detect patterns of flow permanence over a broader spatial extent. We successfully identified and separated signals characteristic of wet and dry stream conditions and their shifts over time. Most of our study sites within the entire stream network exhibited a single state over the entire season (80%), but a portion of them showed one or more shifts among states (17%). We provide recommendations to use this approach based on a series of simple steps. Our findings illustrate a successful method that can be used to rigorously quantify flow permanence regimes in streams using existing records of stream temperature.
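A hedged sketch of the signal-detection step described above: fit a two-state Gaussian HMM to the daily standard deviation of stream temperature and read the high-variance state as a dry channel. The hmmlearn package is used here as a convenient stand-in for the authors' implementation, and the input file name is hypothetical.

import numpy as np
from hmmlearn.hmm import GaussianHMM   # pip install hmmlearn

# One daily SD value per day, e.g. from a logger record (hypothetical path).
daily_sd = np.loadtxt("site42_daily_temp_sd.txt").reshape(-1, 1)

# Two hidden states: wet (low daily SD) vs dry (high daily SD) channel.
hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200,
                  random_state=0).fit(daily_sd)
states = hmm.predict(daily_sd)

# Label the state with the larger mean SD as "dry".
dry_state = int(np.argmax(hmm.means_.ravel()))
print("days classified dry:", int(np.sum(states == dry_state)))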
NASA Astrophysics Data System (ADS)
Kelleher, C.; Archfield, S. A.
2016-12-01
Stream temperatures drive biogeochemical processes and influence ecosystem health and extent, with patterns of stream temperature arising from complex interactions between climate, land cover, and in-stream diversions and dams. While each of these individual drivers may have well-understood implications for changing stream temperatures, considering the concomitant impacts of these drivers along the stream network is much more difficult. This is true especially for the eastern United States, where downstream temperature integrates many different upstream impacts. To begin to decipher the influence of these different drivers on changing stream temperatures and how these impacts may manifest through time, we examined trends for 66 sites with continuous stream temperature measurements across the eastern United States. Stream temperature records were summarized as daily mean, maximum, and minimum values, and sites consisting of 15 or more years of data were selected for analysis. While annual stream temperatures at 53 locations were warming, a few sites on larger rivers (n = 13) have been cooling. To explore the timing of these changes as well as their implications for aquatic species, we calculated trends for seasonal extremes (average of the five warmest and coolest daily stream temperatures) during spring, summer, and fall. Interestingly, while some streams displayed strong warming trends in peak summer temperatures (n = 43), many streams also displayed cooling trends (n = 23). We also found that peak stream temperatures were warming faster in fall than in summer for many locations (n = 36). Results of this analysis show that warming (and cooling) happens at different times in different places, as a function of climate and anthropogenic impacts. Finally, we explore potential drivers of these different patterns, to determine the relative impacts of climate, land cover, and in-stream water diversions on stream temperature change. Given that the number of regulated stream miles is only increasing, improving our understanding of linkages between landscape drivers and stream temperature variation may have important outcomes for river management in a changing world.
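The seasonal-extreme metric above (mean of the five warmest or coolest daily values) is straightforward to compute, and a least-squares slope gives one simple trend estimate. The authors' exact trend test is not specified here, and the data below are synthetic placeholders:

import numpy as np

def seasonal_extreme(daily_temps, warmest=True, n=5):
    """Mean of the n warmest (or coolest) daily values in one season."""
    vals = np.sort(np.asarray(daily_temps))
    return vals[-n:].mean() if warmest else vals[:n].mean()

# One metric value per summer over 16 years (synthetic stand-in data),
# then a least-squares slope in degrees C per year as a simple trend.
years = np.arange(2000, 2016)
summer_peaks = np.array([seasonal_extreme(np.random.normal(22, 3, 92))
                         for _ in years])
slope = np.polyfit(years, summer_peaks, 1)[0]
print(f"trend: {slope:+.3f} C/yr")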
Programmable Quantum Photonic Processor Using Silicon Photonics
2017-04-01
quantum information processing and quantum sensing, ranging from linear optics quantum computing and quantum simulation to quantum ... transformers have driven experimental and theoretical advances in quantum simulation, cluster-state quantum computing, all-optical quantum repeaters ... neuromorphic computing, and other applications. In addition, we developed new schemes for ballistic quantum computation, new methods for ...
Methods of producing alkylated hydrocarbons from an in situ heat treatment process liquid
Roes, Augustinus Wilhelmus Maria [Houston, TX; Mo, Weijian [Sugar Land, TX; Muylle, Michel Serge Marie [Houston, TX; Mandema, Remco Hugo [Houston, TX; Nair, Vijay [Katy, TX
2009-09-01
A method for producing alkylated hydrocarbons is disclosed. Formation fluid is produced from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. The liquid stream is fractionated to produce at least a second gas stream including hydrocarbons having a carbon number of at least 3. The first gas stream and the second gas stream are introduced into an alkylation unit to produce alkylated hydrocarbons. At least a portion of the olefins in the first gas stream enhance alkylation.
Process and apparatus for producing ultrafine explosive particles
McGowan, Michael J.
1992-10-20
A method and an improved eductor apparatus for producing ultrafine explosive particles is disclosed. The explosive particles, which when incorporated into a binder system, have the ability to propagate in thin sheets, and have very low impact sensitivity and very high propagation sensitivity. A stream of a solution of the explosive dissolved in a solvent is thoroughly mixed with a stream of an inert nonsolvent by obtaining nonlaminar flow of the streams by applying pressure against the flow of the nonsolvent stream, to thereby diverge the stream as it contacts the explosive solution, and violently agitating the combined stream to rapidly precipitate the explosive particles from the solution in the form of generally spheroidal, ultrafine particles. The two streams are injected coaxially through continuous, concentric orifices of a nozzle into a mixing chamber. Preferably, the nonsolvent stream is injected centrally of the explosive solution stream. The explosive solution stream is injected downstream of and surrounds the nonsolvent solution stream for a substantial distance prior to being ejected into the mixing chamber.
Fuel-cell engine stream conditioning system
DuBose, Ronald Arthur
2002-01-01
A stream conditioning system for a fuel cell gas management system or fuel cell engine. The stream conditioning system manages species potential in at least one fuel cell reactant stream. A species transfer device is located in the path of at least one reactant stream of a fuel cell's inlet or outlet, which transfer device conditions that stream to improve the efficiency of the fuel cell. The species transfer device incorporates an exchange media and a sorbent. The fuel cell gas management system can include a cathode loop with the stream conditioning system transferring latent and sensible heat from an exhaust stream to the cathode inlet stream of the fuel cell; an anode humidity retention system for maintaining the total enthalpy of the anode stream exiting the fuel cell related to the total enthalpy of the anode inlet stream; and a cooling water management system having segregated deionized water and cooling water loops interconnected by means of a brazed plate heat exchanger.
Fast algorithm for automatically computing Strahler stream order
Lanfear, Kenneth J.
1990-01-01
An efficient algorithm was developed to determine Strahler stream order for segments of stream networks represented in a Geographic Information System (GIS). The algorithm correctly assigns Strahler stream order in topologically complex situations such as braided streams and multiple drainage outlets. Execution time varies nearly linearly with the number of stream segments in the network. This technique is expected to be particularly useful for studying the topology of dense stream networks derived from digital elevation model data.
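The ordering rule such algorithms implement is standard: a headwater segment has order 1, and a segment's order is the maximum of its tributaries' orders, incremented by 1 when that maximum is attained by two or more tributaries. A minimal recursive sketch follows (not the GIS algorithm itself, which handles braided channels and multiple outlets; the example network is hypothetical):

# Strahler order: leaves are order 1; otherwise take the maximum order among
# upstream children, adding 1 when two or more children attain that maximum.
def strahler(segment, upstream):
    children = upstream.get(segment, [])
    if not children:
        return 1
    orders = [strahler(c, upstream) for c in children]
    top = max(orders)
    return top + 1 if orders.count(top) >= 2 else top

# outlet "A" fed by "B" and "C"; "B" fed by two headwater segments
upstream = {"A": ["B", "C"], "B": ["D", "E"]}
print(strahler("A", upstream))  # 2: B is order 2, C is order 1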
NASA Astrophysics Data System (ADS)
Heffernan, James B.
2018-06-01
Higher stream temperatures as the climate warms could lead to lower ecosystem productivity and higher CO2 emissions in streams. An analysis of stream ecosystems finds that such changes will be greatest in the warmest and most productive streams.
Academic Self-Concepts in Ability Streams: Considering Domain Specificity and Same-Stream Peers
ERIC Educational Resources Information Center
Liem, Gregory Arief D.; McInerney, Dennis M.; Yeung, Alexander S.
2015-01-01
The study examined the relations between academic achievement and self-concepts in a sample of 1,067 seventh-grade students from 3 core ability streams in Singapore secondary education. Although between-stream differences in achievement were large, between-stream differences in academic self-concepts were negligible. Within each stream, levels of…
Summer temperature patterns in the headwater streams of the Oregon coast range
Liz Dent; Danielle Vick; Kyle Abraham; Stephen Schoenholtz; Sherri Johnson
2008-01-01
Cool summertime stream temperature is an important component of high-quality aquatic habitat in Oregon coastal streams. Within the Oregon Coast Range, small headwater streams make up a majority of the stream network, yet little information is available on temperature patterns and the longitudinal variability for these streams. In this paper we describe preharvest...
Hydrogeologic controls on summer stream temperatures in the McKenzie River basin, Oregon
Christina Tague; Michael Farrell; Gordon Grant; Sarah Lewis; Serge Rey
2007-01-01
Stream temperature is a complex function of energy inputs including solar radiation and latent and sensible heat transfer. In streams where groundwater inputs are significant, energy input through advection can also be an important control on stream temperature. For an individual stream reach, models of stream temperature can take advantage of direct measurement or...
Bisinger, J J; Russell, J R; Morrical, D G; Isenhart, T M
2014-08-01
For 2 grazing seasons, effects of pasture size, stream access, and off-stream water on cow distribution relative to a stream were evaluated in six 12.1-ha cool-season grass pastures. Two pasture sizes (small [4.0 ha] and large [12.1 ha]) with 3 management treatments (unrestricted stream access without off-stream water [U], unrestricted stream access with off-stream water [UW], and stream access restricted to a stabilized stream crossing [R]) were alternated between pasture sizes every 2 wk for 5 consecutive 4-wk intervals in each grazing season. Small and large pastures were stocked with 5 and 15 August-calving cows from mid May through mid October. At 10-min intervals, cow location was determined with Global Positioning System collars fitted on 2 to 3 cows in each pasture and identified when observed in the stream (0-10 m from the stream) or riparian (0-33 m from the stream) zones and ambient temperature was recorded with on-site weather stations. Over all intervals, cows were observed more (P ≤ 0.01) frequently in the stream and riparian zones of small than large pastures regardless of management treatment. Cows in R pastures had 24 and 8% less (P < 0.01) observations in the stream and riparian zones than U or UW pastures regardless of pasture size. Off-stream water had little effect on the presence of cows in or near pasture streams regardless of pasture size. In 2011, the probability of cow presence in the stream and riparian zones increased at greater (P < 0.04) rates as ambient temperature increased in U and UW pastures than in 2010. As ambient temperature increased, the probability of cow presence in the stream and riparian zones increased at greater (P < 0.01) rates in small than large pastures. Across pasture sizes, the probability of cow presence in the stream and riparian zone increased less (P < 0.01) with increasing ambient temperatures in R than U and UW pastures. Rates of increase in the probability of cow presence in shade (within 10 m of tree drip lines) in the total pasture with increasing temperatures did not differ between treatments. However, probability of cow presence in riparian shade increased at greater (P < 0.01) rates in small than large pastures. Pasture size was a major factor affecting congregation of cows in or near pasture streams with unrestricted access.
StreamExplorer: A Multi-Stage System for Visually Exploring Events in Social Streams.
Wu, Yingcai; Chen, Zhutian; Sun, Guodao; Xie, Xiao; Cao, Nan; Liu, Shixia; Cui, Weiwei
2017-10-18
Analyzing social streams is important for many applications, such as crisis management. However, the considerable diversity, increasing volume, and high dynamics of social streams of large events continue to be significant challenges that must be overcome to ensure effective exploration. We propose a novel framework by which to handle complex social streams on a budget PC. This framework features two components: 1) an online method to detect important time periods (i.e., subevents), and 2) a tailored GPU-assisted Self-Organizing Map (SOM) method, which clusters the tweets of subevents stably and efficiently. Based on the framework, we present StreamExplorer to facilitate the visual analysis, tracking, and comparison of a social stream at three levels. At a macroscopic level, StreamExplorer uses a new glyph-based timeline visualization, which presents a quick multi-faceted overview of the ebb and flow of a social stream. At a mesoscopic level, a map visualization is employed to visually summarize the social stream from either a topical or geographical aspect. At a microscopic level, users can employ interactive lenses to visually examine and explore the social stream from different perspectives. Two case studies and a task-based evaluation are used to demonstrate the effectiveness and usefulness of StreamExplorer.
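A hedged sketch of the clustering stage: map tweet feature vectors for one subevent onto a small SOM grid, so that nearby cells hold similar tweets. MiniSom is used here as a CPU stand-in for the paper's GPU-assisted SOM, and the tweet vectors are random placeholders.

import numpy as np
from minisom import MiniSom   # pip install minisom; stand-in for the GPU SOM

# Rows are tweet feature vectors for one subevent (hypothetical: 500 tweets
# with 50-dimensional embeddings), mapped onto an 8x8 grid of SOM cells.
tweets = np.random.rand(500, 50)
som = MiniSom(8, 8, input_len=50, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(tweets, num_iteration=2000)

# Each tweet's cluster is the grid coordinate of its best-matching unit.
clusters = [som.winner(t) for t in tweets]
print("tweet 0 assigned to SOM cell", clusters[0])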
NASA Astrophysics Data System (ADS)
Seok, Song Young; Ho, Song Yang; Ho, Lee Jung; Moo Jong, Park
2015-04-01
Due to increased rainfall and urbanization after the late 1990s, which expanded impervious surfaces under a changing climate, flood damage in urban watersheds is rising. Recent flood damage occurs in medium and small streams rather than in large streams. In particular, in medium streams that pass through cities, sudden floods arise from short, concentrated rainfall, and urban areas suffer heavy losses even when the flood itself is small, because residential areas and social infrastructure are concentrated there. Despite the importance of medium and small streams passing through cities, there is no accepted standard for classifying natural versus urban streams, and existing studies focus mostly on the impervious area among the land use characteristics of watersheds. Most river studies are based on the watershed scale, but in urban watersheds the urban areas are concentrated near the channel, often occupying less than 10% of the whole watershed, so classifying urban areas on the watershed scale carries high uncertainty. This study aims to suggest a classification standard for the medium and small streams, between local streams and small streams, that suffer flood damage. For the classified medium and small streams, the stream area was analyzed as a function of stream width and distance using the ArcGIS Buffer tool, based on the stream centerline rather than the watershed scale. Urban watersheds were then selected by analyzing the river area at fixed intervals from the center of the chosen medium and small streams. Among urban land use characteristics, impervious area was applied as the selection criterion, and the characteristics of urban watersheds were described by the ratio of stream area to impervious area computed with the Buffer tool. Acknowledgement: "This research was supported by a grant [NEMA-NH-2011-45] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea." Keywords: land use, urban watershed, medium and small stream, impervious area
Louisiana waterthrush and benthic macroinvertebrate response to shale gas development
Wood, Petra; Frantz, Mack W.; Becker, Douglas A.
2016-01-01
Because shale gas development is occurring over large landscapes and consequently is affecting many headwater streams, an understanding of its effects on headwater-stream faunal communities is needed. We examined effects of shale gas development (well pads and associated infrastructure) on Louisiana waterthrush Parkesia motacilla and benthic macroinvertebrate communities in 12 West Virginia headwater streams in 2011. Streams were classed as impacted (n = 6) or unimpacted (n = 6) by shale gas development. We quantified waterthrush demography (nest success, clutch size, number of fledglings, territory density), a waterthrush Habitat Suitability Index, a Rapid Bioassessment Protocol habitat index, and benthic macroinvertebrate metrics including a genus-level stream-quality index for each stream. We compared each benthic metric between impacted and unimpacted streams with a Student's t-test that incorporated adjustments for normalizing data. Impacted streams had lower genus-level stream-quality index scores; lower overall and Ephemeroptera, Plecoptera, and Trichoptera richness; fewer intolerant taxa, more tolerant taxa, and greater density of 0–3-mm individuals (P ≤ 0.10). We then used Pearson correlation to relate waterthrush metrics to benthic metrics across the 12 streams. Territory density (no. of territories/km of stream) was greater on streams with higher genus-level stream-quality index scores; greater density of all taxa and Ephemeroptera, Plecoptera, and Trichoptera taxa; and greater biomass. Clutch size was greater on streams with higher genus-level stream-quality index scores. Nest survival analyses (n = 43 nests) completed with Program MARK suggested minimal influence of benthic metrics compared with nest stage and Habitat Suitability Index score. Although our study spanned only one season, our results suggest that shale gas development affected waterthrush and benthic communities in the headwater streams we studied. Thus, these ecological effects of shale gas development warrant closer examination.
StreamMap: Smooth Dynamic Visualization of High-Density Streaming Points.
Li, Chenhui; Baciu, George; Han, Yu
2018-03-01
Interactive visualization of streaming points for real-time scatterplots and linear blending of correlation patterns is increasingly becoming the dominant mode of visual analytics for both big data and streaming data from active sensors and broadcasting media. To better visualize and interact with inter-stream patterns, it is generally necessary to smooth out gaps or distortions in the streaming data. Previous approaches either animate the points directly or present a sampled static heat-map. We propose a new approach, called StreamMap, to smoothly blend high-density streaming points and create a visual flow that emphasizes the density pattern distributions. In essence, we present three new contributions for the visualization of high-density streaming points. The first contribution is a density-based method called super kernel density estimation that aggregates streaming points using an adaptive kernel to solve the overlapping problem. The second contribution is a robust density morphing algorithm that generates several smooth intermediate frames for a given pair of frames. The third contribution is a trend representation design that can help convey the flow directions of the streaming points. The experimental results on three datasets demonstrate the effectiveness of StreamMap when dynamic visualization and visual analysis of trend patterns on streaming points are required.
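A hedged sketch of the density side of this pipeline: estimate a density field per frame and blend between frames. scipy's gaussian_kde stands in for the paper's adaptive "super kernel density estimation," and plain linear blending stands in for their morphing algorithm; both substitutions are simplifications.

import numpy as np
from scipy.stats import gaussian_kde

# Two consecutive frames of streaming points (hypothetical 2-D positions).
frame_a = np.random.randn(2, 1000)
frame_b = np.random.randn(2, 1000) + 0.5

grid = np.mgrid[-3:3:64j, -3:3:64j].reshape(2, -1)
dens_a = gaussian_kde(frame_a)(grid)
dens_b = gaussian_kde(frame_b)(grid)

# Naive stand-in for density morphing: linearly blend the two density fields.
for alpha in (0.25, 0.5, 0.75):
    inter = (1 - alpha) * dens_a + alpha * dens_b
    print(f"alpha={alpha}: peak density {inter.max():.3f}")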
Flotemersch, Joseph E; North, Sheila; Blocksom, Karen A
2014-02-01
Benthic macroinvertebrates are sampled in streams and rivers as one of the assessment elements of the US Environmental Protection Agency's National Rivers and Streams Assessment. In a 2006 report, the recommendation was made that different yet comparable methods be evaluated for different types of streams (e.g., low gradient vs. high gradient). Consequently, a research element was added to the 2008-2009 National Rivers and Streams Assessment to conduct a side-by-side comparison of the standard macroinvertebrate sampling method with an alternate method specifically designed for low-gradient wadeable streams and rivers that focused more on stream edge habitat. Samples were collected using each method at 525 sites in five of nine aggregate ecoregions located in the conterminous USA. Methods were compared using the benthic macroinvertebrate multimetric index developed for the 2006 Wadeable Streams Assessment. Statistical analysis did not reveal any trends that would suggest the overall assessment of low-gradient streams on a regional or national scale would change if the alternate method was used rather than the standard sampling method, regardless of the gradient cutoff used to define low-gradient streams. Based on these results, the National Rivers and Streams Survey should continue to use the standard field method for sampling all streams.
Bingham, Dennis N.; Wilding, Bruce M.; McKellar, Michael G.
2002-01-01
A process for the separation and liquefaction of component gases from a pressurized mixed gas stream is disclosed. The process involves cooling the pressurized mixed gas stream in a heat exchanger so as to condense one or more of the gas components having the highest condensation point; separating the condensed components from the remaining mixed gas stream in a gas-liquid separator; cooling the separated condensed component stream by passing it through an expander; and passing the cooled component stream back through the heat exchanger such that the cooled component stream functions as the refrigerant for the heat exchanger. The cycle is then repeated for the remaining mixed gas stream so as to draw off the next component gas and further cool the remaining mixed gas stream. The process continues until all of the component gases are separated from the desired gas stream. The final gas stream is then passed through a final heat exchanger and expander. The expander decreases the pressure on the gas stream, thereby cooling the stream and causing a portion of the gas stream to liquefy within a tank. The portion of the gas which is not liquefied is passed back through each of the heat exchangers where it functions as a refrigerant.
Bingham, Dennis N.; Wilding, Bruce M.; McKellar, Michael G.
2000-01-01
A process for the separation and liquefaction of component gases from a pressurized mixed gas stream is disclosed. The process involves cooling the pressurized mixed gas stream in a heat exchanger so as to condense one or more of the gas components having the highest condensation point; separating the condensed components from the remaining mixed gas stream in a gas-liquid separator; cooling the separated condensed component stream by passing it through an expander; and passing the cooled component stream back through the heat exchanger such that the cooled component stream functions as the refrigerant for the heat exchanger. The cycle is then repeated for the remaining mixed gas stream so as to draw off the next component gas and further cool the remaining mixed gas stream. The process continues until all of the component gases are separated from the desired gas stream. The final gas stream is then passed through a final heat exchanger and expander. The expander decreases the pressure on the gas stream, thereby cooling the stream and causing a portion of the gas stream to liquefy within a tank. The portion of the gas which is not liquefied is passed back through each of the heat exchangers where it functions as a refrigerant.
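The staged logic common to the two records above can be summarized in a few lines: repeatedly condense out the component with the highest condensation point, let the expanded condensate refrigerate the next pass, and continue with the remainder. A toy sketch (the condensation temperatures are illustrative, not engineering data):

# Toy sketch of the staged separation logic: at each pass the component with
# the highest condensation point is condensed out and the remainder moves on.
mixture = {"water": 373.0, "CO2": 195.0, "methane": 112.0, "nitrogen": 77.0}  # K

stage = 1
while mixture:
    target = max(mixture, key=mixture.get)   # highest condensation point
    temp = mixture.pop(target)
    print(f"stage {stage}: cool below {temp:.0f} K, draw off liquid {target}; "
          f"expanded condensate chills the next pass")
    stage += 1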
Quantum technology past, present, future: quantum energetics (Conference Presentation)
NASA Astrophysics Data System (ADS)
Choi, Sang H.
2017-04-01
Since the development of quantum physics in the early part of the 1900s, this field of study has made remarkable contributions to our civilization. Some of these advances include lasers, light-emitting diodes (LED), sensors, spectroscopy, quantum dots, quantum gravity and quantum entanglement. In 1998, the NASA Langley Research Center established a quantum technology committee to monitor the progress in this area and initiated research to determine the potential of quantum technology for future NASA missions. The areas of interest in quantum technology at NASA included fundamental quantum-optics materials associated with quantum dots and quantum wells, device-oriented photonic crystals, smart optics, quantum conductors, quantum information and computing, teleportation theorem, and quantum energetics. A brief review of the work performed, the progress made in advancing these technologies, and the potential NASA applications of quantum technology will be presented.
Slip stream apparatus and method for treating water in a circulating water system
Cleveland, Joe R.
1997-01-01
An apparatus (10) for treating water in a circulating water system (12) that has a cooling water basin (14) includes a slip stream conduit (16) in flow communication with the circulating water system (12), a source (36) of acid solution in flow communication with the slip stream conduit (16), and a decarbonator (58) in flow communication with the slip stream conduit (16) and the cooling water basin (14). In use, a slip stream of circulating water is drawn from the circulating water system (12) into the slip stream conduit (16) of the apparatus (10). The slip stream pH is lowered by contact with an acid solution provided from the source (36) thereof. The slip stream is then passed through a decarbonator (58) to form a treated slip stream, and the treated slip stream is returned to the cooling water basin (14).
Relating quantum coherence and correlations with entropy-based measures.
Wang, Xiao-Li; Yue, Qiu-Ling; Yu, Chao-Hua; Gao, Fei; Qin, Su-Juan
2017-09-21
Quantum coherence and quantum correlations are important quantum resources for quantum computation and quantum information. In this paper, using entropy-based measures, we investigate the relationships between quantum correlated coherence, which is the coherence between subsystems, and the two main kinds of quantum correlations, quantum discord and quantum entanglement. In particular, we show that quantum discord and quantum entanglement can be well characterized by quantum correlated coherence. Moreover, we prove that the entanglement measure formulated by quantum correlated coherence is lower and upper bounded by the relative entropy of entanglement and the entanglement of formation, respectively, and equals the relative entropy of entanglement for all maximally correlated states.
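Written out, the chain of bounds stated in the abstract takes the following form, where E_cc denotes the entanglement measure built from quantum correlated coherence, E_r the relative entropy of entanglement, and E_f the entanglement of formation (the notation here is assumed for illustration):

E_r(\rho) \;\le\; E_{cc}(\rho) \;\le\; E_f(\rho), \qquad E_{cc}(\rho) = E_r(\rho) \ \text{for maximally correlated}\ \rho.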
Numerical approaches to combustion modeling. Progress in Astronautics and Aeronautics. Vol. 135
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oran, E.S.; Boris, J.P.
1991-01-01
Various papers on numerical approaches to combustion modeling are presented. The topics addressed include: ab initio quantum chemistry for combustion; rate coefficient calculations for combustion modeling; numerical modeling of combustion of complex hydrocarbons; combustion kinetics and sensitivity analysis computations; reduction of chemical reaction models; length scales in laminar and turbulent flames; numerical modeling of laminar diffusion flames; laminar flames in premixed gases; spectral simulations of turbulent reacting flows; vortex simulation of reacting shear flow; combustion modeling using PDF methods. Also considered are: supersonic reacting internal flow fields; studies of detonation initiation, propagation, and quenching; numerical modeling of heterogeneous detonations; deflagration-to-detonation transition in reactive granular materials; toward a microscopic theory of detonations in energetic crystals; overview of spray modeling; liquid drop behavior in dense and dilute clusters; spray combustion in idealized configurations: parallel drop streams; comparisons of deterministic and stochastic computations of drop collisions in dense sprays; ignition and flame spread across solid fuels; numerical study of pulse combustor dynamics; mathematical modeling of enclosure fires; nuclear systems.
Langmuir instability in partially spin polarized bounded degenerate plasma
NASA Astrophysics Data System (ADS)
Iqbal, Z.; Jamil, M.; Murtaza, G.
2018-04-01
Some new features of waves inside a cylindrical waveguide, obtained by employing the separated spin evolution quantum hydrodynamic model, are presented. Primarily, the instability of the Langmuir wave due to an electron beam in a partially spin-polarized degenerate plasma in a nano-cylindrical geometry is discussed. In addition, the appearance of a new spin-dependent wave (the spin electron acoustic wave) in the real wave spectrum, due to electron spin polarization effects, is elaborated. Analyzing the growth rate, it is found that in the absence of the Bohm potential, the electron spin effects or exchange interaction reduce both the growth rate and the k-domain, but the inclusion of the Bohm potential increases both the growth rate and the k-domain. Further, we investigate the geometry effects expressed by R and pon and find that they have opposite effects on the growth rate and k-domain of the instability. Additionally, how other parameters such as electron beam density or the streaming speed of beam electrons influence the growth rate is also investigated. This study may find applications in signal analysis in solid-state devices at nanoscales.
The potential of protein-nanomaterial interaction for advanced drug delivery.
Peng, Qiang; Mu, Huiling
2016-03-10
Nanomaterials, like nanoparticles, micelles, nano-sheets, nanotubes and quantum dots, have great potential in biomedical fields. However, their delivery is highly limited by the formation of a protein corona upon interaction with endogenous proteins. This new identity, instead of the nanomaterial itself, would be the real substance that organs and cells first encounter. Consequently, the behavior of nanomaterials in vivo is uncontrollable and some undesired effects may occur, like rapid clearance from the blood stream; risk of capillary blockage; loss of targeting capacity; and potential toxicity. Therefore, protein-nanomaterial interaction is a great challenge for nanomaterial systems and should be inhibited. However, this interaction can also be used to functionalize nanomaterials by forming a selected protein corona. Unlike other decorations using exogenous molecules, nanomaterials functionalized by a selected protein corona of endogenous proteins would hold greater promise for clinical use. In this review, we aim to provide a comprehensive understanding of protein-nanomaterial interaction. Importantly, we discuss how to use such interaction and present some possible applications of it for advanced drug delivery. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-01-01
Dwarf galaxies or globular clusters orbiting the Milky Way can be pulled apart by tidal forces, leaving behind a trail of stars known as a stellar stream. One such trail, the Ophiuchus stream, has posed a serious dynamical puzzle since its discovery. But a recent study has identified four stars that might help resolve this stream's mystery.

Conflicting Timescales: The stellar stream Ophiuchus was discovered around our galaxy in 2014. Based on its length, which appears to be 1.6 kpc, we can calculate the time that has passed since its progenitor was disrupted and the stream was created: ~250 Myr. But the stars within it are ~12 Gyr old, and the stream orbits the galaxy with a period of ~350 Myr. Given these numbers, we can assume that Ophiuchus's progenitor completed many orbits of the Milky Way in its lifetime. So why would it only have been disrupted 250 million years ago?

Fanning Stream: Led by Branimir Sesar (Max Planck Institute for Astronomy), a team of scientists has proposed an idea that might help solve this puzzle. If the Ophiuchus stellar stream is on a chaotic orbit (common in triaxial potentials, which the Milky Way's may be), then the stream ends can fan out, with stars spreading in position and velocity. The fanned part of the stream, however, would be difficult to detect because of its low surface brightness. As a result, the Ophiuchus stellar stream could actually be longer than originally measured, implying that it was disrupted longer ago than was believed.

Search for Fan Stars: To test this idea, Sesar and collaborators performed a search around the ends of the stream, looking for stars that are of the right type to match the stream, are at the predicted distance of the stream, are located near the stream ends, and have velocities that match the stream and don't match the background halo stars. [Figure caption: Histogram of the heliocentric velocities of the 43 target stars. Six stars have velocities matching the stream velocity. Two of these are located in the main stream; the other four may be part of a fan at the end of the stream. Sesar et al. 2016] Of the 43 targets for which the authors obtained spectra, four stars met these criteria and are located beyond the main extent of the stream, possibly comprising a fan at the stream's end. Including these stars as part of the Ophiuchus stream, its length becomes 3 kpc, implying that its time of disruption was closer to 400 million years ago. This relieves the timescale tension but does not resolve it. That said, the mere evidence of a fan in the Ophiuchus stream suggests that its progenitor may have been on a chaotic orbit. If this is the case, it's entirely possible that the progenitor could have survived for ~11 Gyr, only to have been disrupted within the last 0.5 Gyr. Detailed modeling and further identification of potential fan stars in the Ophiuchus stream will help to test this idea and resolve the puzzle of this stream.

Citation: Branimir Sesar et al. 2016 ApJ 816 L4. doi:10.3847/2041-8205/816/1/L4
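As a rough consistency check on the quoted numbers (a hedged back-of-envelope under the assumption that stream length grows roughly linearly with time since disruption; the published estimate comes from full dynamical modeling, not this scaling), the revised length implies

t_2 \approx t_1 \,\frac{L_2}{L_1} = 250\,\mathrm{Myr} \times \frac{3\,\mathrm{kpc}}{1.6\,\mathrm{kpc}} \approx 470\,\mathrm{Myr},

of the same order as the ~400 Myr quoted above.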
Pyne, Matthew I.; Carlisle, Daren M.; Konrad, Christopher P.; Stein, Eric D.
2017-01-01
Regional classification of streams is an early step in the Ecological Limits of Hydrologic Alteration framework. Many stream classifications are based on an inductive approach using hydrologic data from minimally disturbed basins, but this approach may underrepresent streams from heavily disturbed basins or sparsely gaged arid regions. An alternative is a deductive approach, using watershed climate, land use, and geomorphology to classify streams, but this approach may miss important hydrological characteristics of streams. We classified all stream reaches in California using both approaches. First, we used Bayesian and hierarchical clustering to classify reaches according to watershed characteristics. Streams were clustered into seven classes according to elevation, sedimentary rock, and winter precipitation. Permutation-based analysis of variance and random forest analyses were used to determine which hydrologic variables best separate streams into their respective classes. Stream typology (i.e., the class that a stream reach is assigned to) is shaped mainly by patterns of high and mean flow behavior within the stream's landscape context. Additionally, random forest was used to determine which hydrologic variables best separate minimally disturbed reference streams from non-reference streams in each of the seven classes. In contrast to stream typology, deviation from reference conditions is more difficult to detect and is largely defined by changes in low-flow variables, average daily flow, and duration of flow. Our combined deductive/inductive approach allows us to estimate flow under minimally disturbed conditions based on the deductive analysis and compare to measured flow based on the inductive analysis in order to estimate hydrologic change.
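A minimal Python sketch of this combined workflow is given below, assuming hypothetical column names, a hypothetical input file, and scikit-learn defaults; the authors' actual variables, cluster settings, and permutation tests are not reproduced. The deductive step clusters reaches on watershed characteristics; the inductive check then asks which hydrologic variables best separate the resulting classes.

import pandas as pd
from sklearn.cluster import AgglomerativeClustering
from sklearn.ensemble import RandomForestClassifier

watersheds = pd.read_csv("watershed_attributes.csv")   # hypothetical file
X = watersheds[["elevation", "sedimentary_rock", "winter_precip"]]

# Deductive step: cluster reaches into classes from watershed characteristics.
classes = AgglomerativeClustering(n_clusters=7).fit_predict(X)

# Inductive check: which hydrologic variables best separate the classes?
hydro = watersheds[["high_flow", "mean_flow", "low_flow", "flow_duration"]]
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(hydro, classes)
for name, imp in sorted(zip(hydro.columns, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")

High feature importances for high- and mean-flow variables would echo the study's finding that stream typology is shaped mainly by high and mean flow behavior.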
Low-head sea lamprey barrier effects on stream habitat and fish communities in the Great Lakes basin
Dodd, H.R.; Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.; Jones, M.L.
2003-01-01
Low-head barriers are used to block adult sea lamprey (Petromyzon marinus) from upstream spawning habitat. However, these barriers may impact stream fish communities through restriction of fish movement and habitat alteration. During the summer of 1996, the fish community and habitat conditions in twenty-four stream pairs were sampled across the Great Lakes basin. Seven of these stream pairs were re-sampled in 1997. Each pair consisted of a barrier stream with a low-head barrier and a reference stream without a low-head barrier. On average, barrier streams were significantly deeper (df = 179, P = 0.0018) and wider (df = 179, P = 0.0236) than reference streams, but temperature and substrate were similar (df = 183, P = 0.9027; df = 179, P = 0.999). Barrier streams contained approximately four more fish species on average than reference streams. However, streams with low-head barriers showed a greater upstream decline in species richness compared to reference streams, with a net loss of 2.4 species. Barrier streams also showed a peak in richness directly downstream of the barriers, indicating that these barriers block fish movement upstream. Using Sørensen's similarity index (based on presence/absence), a comparison of fish community assemblages above and below low-head barriers was not significantly different from that between upstream and downstream sites on reference streams (n = 96, P > 0.05), implying the barriers have relatively little effect on overall fish assemblage composition. Differences in the frequency of occurrence and abundance between barrier and reference streams were apparent for some species, suggesting their sensitivity to barriers.
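For reference, Sørensen's presence/absence index as standardly defined (with A and B the species counts at the two sites and C the number of species shared by both; notation assumed here) is

QS = \frac{2C}{A + B},

which ranges from 0 (no shared species) to 1 (identical assemblages).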
Brett B. Roper; John M. Buffington; Eric Archer; Chris Moyer; Mike Ward
2008-01-01
Consistency in determining Rosgen stream types was evaluated in 12 streams within the John Day Basin, northeastern Oregon. The Rosgen classification system is commonly used in the western United States and is based on the measurement of five stream attributes: entrenchment ratio, width-to-depth ratio, sinuosity, slope, and substrate size. Streams were classified from...
A Scalable Multimedia Streaming Scheme with CBR-Transmission of VBR-Encoded Videos over the Internet
ERIC Educational Resources Information Center
Kabir, Md. H.; Shoja, Gholamali C.; Manning, Eric G.
2006-01-01
Streaming audio/video contents over the Internet requires large network bandwidth and timely delivery of media data. A streaming session is generally long and also needs a large I/O bandwidth at the streaming server. A streaming server, however, has limited network and I/O bandwidth. For this reason, a streaming server alone cannot scale a…
Jiayu Wu; Timothy W. Stewart; Janette R. Thompson; Randy Kolka; Kristie J. Franz
2015-01-01
Urban stream condition is often degraded by human activities in the surrounding watershed. Given the complexity of urban areas, relationships among variables that cause stream degradation can be difficult to isolate. We examined factors affecting stream condition by evaluating social, terrestrial, stream hydrology and water quality variables from 20 urban stream...
The long term response of stream flow to climatic warming in headwater streams of interior Alaska
Jeremy B. Jones; Amanda J. Rinehart
2010-01-01
Warming in the boreal forest of interior Alaska will have fundamental impacts on stream ecosystems through changes in stream hydrology resulting from upslope loss of permafrost, alteration of availability of soil moisture, and the distribution of vegetation. We examined stream flow in three headwater streams of the Caribou-Poker Creeks Research Watershed (CPCRW) in...
Methods of making transportation fuel
Roes, Augustinus Wilhelmus Maria [Houston, TX; Mo, Weijian [Sugar Land, TX; Muylle, Michel Serge Marie [Houston, TX; Mandema, Remco Hugo [Houston, TX; Nair, Vijay [Katy, TX
2012-04-10
A method for producing alkylated hydrocarbons is disclosed. Formation fluid is produced from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. The liquid stream is fractionated to produce at least a second gas stream including hydrocarbons having a carbon number of at least 3. The first gas stream and the second gas stream are introduced into an alkylation unit to produce alkylated hydrocarbons. At least a portion of the olefins in the first gas stream enhance alkylation. The alkylated hydrocarbons may be blended with one or more components to produce transportation fuel.
Treatment of gas from an in situ conversion process
Diaz, Zaida [Katy, TX; Del Paggio, Alan Anthony [Spring, TX; Nair, Vijay [Katy, TX; Roes, Augustinus Wilhelmus Maria [Houston, TX
2011-12-06
A method of producing methane is described. The method includes providing formation fluid from a subsurface in situ conversion process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. At least the olefins in the first gas stream are contacted with a hydrogen source in the presence of one or more catalysts and steam to produce a second gas stream. The second gas stream is contacted with a hydrogen source in the presence of one or more additional catalysts to produce a third gas stream. The third gas stream includes methane.
Cash streams: five powerful income streams to increase your net income.
Means, G B
1998-01-01
You can dramatically increase your profits by: Cash stream #1--extending credit and earning interest on the unpaid balance; Cash stream #2--doing all of the undone treatment in your practice; Cash stream #3--providing financing for everyone who deserves it; Cash stream #4--treating bigger cases; Cash stream #5--avoiding treatment of deadbeats. There isn't anything I know of that will jump-start your practice as much as these five cash streams--more new patients, better case acceptance, as well as increased cash flow. But you must get good at financing. You must have in place an organized, proven financing system--just like the finance companies do.
Two different streams form the dorsal visual system: anatomy and functions.
Rizzolatti, Giacomo; Matelli, Massimo
2003-11-01
There are two radically different views on the functional role of the dorsal visual stream. One considers it a system involved in space perception; the other, a system that codes visual information for action organization. On the basis of new anatomical data and a reconsideration of previous functional and clinical data, we propose that the dorsal stream and its recipient parietal areas form two distinct functional systems: the dorso-dorsal stream (d-d stream) and the ventro-dorsal stream (v-d stream). The d-d stream is formed by area V6 (the main d-d extrastriate visual node) and areas V6A and MIP of the superior parietal lobule. Its major functional role is the control of actions "on line". Its damage leads to optic ataxia. The v-d stream is formed by area MT (the main v-d extrastriate visual node) and by the visual areas of the inferior parietal lobule. Like the d-d stream, the v-d stream is responsible for action organization. It, however, also plays a crucial role in space perception and action understanding. The putative mechanisms linking action and perception in the v-d stream are discussed.
Impact of riparian land use on stream insects of Kudremukh National Park, Karnataka state, India
Subramanian, K.A.; Sivaramakrishnan, K.G.; Gadgil, Madhav
2005-01-01
The impact of riparian land use on stream insect communities was studied at Kudremukh National Park, located within the Western Ghats, a tropical biodiversity hotspot in India. The diversity and community composition of stream insects varied across streams with different riparian land use types. The rarefied family and generic richness was highest in streams with natural semi-evergreen forests as riparian vegetation. However, when the streams had human habitations and areca nut plantations as the riparian land use type, the rarefied richness was higher than that of streams with natural evergreen forests and grasslands. The streams with scrub lands and iron ore mining as the riparian land use had the lowest rarefied richness. Within a landscape, the streams with natural riparian vegetation had similar community composition. However, streams with natural grasslands as the riparian vegetation had low diversity, and their community composition was similar to that of paddy fields. We discuss how stream insect assemblages differ due to varied riparian land use patterns, reflecting fundamental alterations in the functioning of stream ecosystems. This understanding is vital to conserve, manage and restore tropical riverine ecosystems. PMID:17119631
Water Resources Data, Florida, Water Year 2003, Volume 3A: Southwest Florida Surface Water
Kane, R.L.; Fletcher, W.L.
2004-01-01
Water resources data for the 2003 water year in Florida consist of continuous or daily discharges for 385 streams, periodic discharge for 13 streams, continuous daily stage for 255 streams, periodic stage for 13 streams, peak stage for 36 streams and peak discharge for 36 streams, continuous or daily elevations for 13 lakes, periodic elevations for 46 lakes; continuous ground-water levels for 441 wells, periodic ground-water levels for 1,227 wells, and quality-of-water data for 133 surface-water sites and 308 wells. The data for Southwest Florida include records of stage, discharge, and water quality of streams; stage, contents, water quality of lakes and reservoirs, and water levels and water quality of ground-water wells. Volume 3A contains continuous or daily discharge for 103 streams, periodic discharge for 7 streams, continuous or daily stage for 67 streams, periodic stage for 13 streams, peak stage and discharge for 8 streams, continuous or daily elevations for 2 lakes, periodic elevations for 26 lakes, and quality-of-water data for 62 surface-water sites. These data represent the national Water Data System records collected by the U.S. Geological Survey and cooperating local, state, and federal agencies in Florida.
Basal melt beneath Whillans Ice Stream and Ice Streams A and C
NASA Technical Reports Server (NTRS)
Joughin, I.; Teluezyk, S.; Engelhardt, H.
2002-01-01
We have used a recently derived map of the velocity of Whillans Ice Stream and Ice Streams A and C to help estimate basal melt. Temperature was modeled with a simple vertical advection-diffusion equation, 'tuned' to match temperature profiles. We find that most of the melt occurs beneath the tributaries, where larger basal shear stresses and thicker ice favor greater melt (e.g., 10-20 mm/yr). Basal freezing is predicted beneath much of the ice plains of Ice Stream C and Whillans Ice Stream. Modelled melt rates for when Ice Stream C was active suggest there was just enough melt water generated in its tributaries to balance basal freezing on its ice plain. Net basal melt for Whillans Ice Stream is positive due to smaller basal temperature gradients. Modelled temperatures on Whillans Ice Stream, however, were constrained by a single temperature profile at UpB. Basal temperature gradients for Whillans B1 and Ice Stream A may have conditions more similar to those beneath Ice Streams C and D, in which case there may not be sufficient melt to sustain motion. This would be consistent with the steady deceleration of Whillans Ice Stream over the last few decades.
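A hedged sketch of the standard one-dimensional formulation behind such models (the paper's exact equations and boundary conditions are not reproduced here): temperature in the ice column obeys a vertical advection-diffusion balance, and the basal melt rate follows from the heat budget at the bed,

\frac{\partial T}{\partial t} = \kappa \frac{\partial^2 T}{\partial z^2} - w \frac{\partial T}{\partial z}, \qquad \dot{m} = \frac{G + \tau_b u_b - k\,\partial T/\partial z\,\big|_{b}}{\rho_i L},

where G is the geothermal flux, \tau_b u_b the frictional heating from basal shear stress and sliding speed, k the conductivity of ice, and \rho_i L the volumetric latent heat of fusion. Larger \tau_b and thicker ice (weaker conductive loss to the surface) both raise \dot{m}, matching the tributary melt pattern described above.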
Effects of Debris Flows on Stream Ecosystems of the Klamath Mountains, Northern California
NASA Astrophysics Data System (ADS)
Cover, M. R.; Delafuente, J. A.; Resh, V. H.
2006-12-01
We examined the long-term effects of debris flows on channel characteristics and aquatic food webs in steep (0.04-0.06 slope), small (4-6 m wide) streams. A large rain-on-snow storm event in January 1997 resulted in numerous landslides and debris flows throughout many basins in the Klamath Mountains of northern California. Debris floods resulted in extensive impacts throughout entire drainage networks, including mobilization of valley floor deposits and removal of vegetation. Comparing 5 streams scoured by debris flows in 1997 and 5 streams that had not been scoured as recently, we determined that debris flows decreased channel complexity by reducing alluvial step frequency and large woody debris volumes. Unscoured streams had more diverse riparian vegetation, whereas scoured streams were dominated by dense, even-aged stands of white alder (Alnus rhombifolia). Benthic invertebrate shredders, especially nemourid and peltoperlid stoneflies, were more abundant and diverse in unscoured streams, reflecting the more diverse allochthonous resources. Debris flows resulted in increased variability in canopy cover, depending on degree of alder recolonization. Periphyton biomass was higher in unscoured streams, but primary production was greater in the recently scoured streams, suggesting that invertebrate grazers kept algal assemblages in an early successional state. Glossosomatid caddisflies were predominant scrapers in scoured streams; heptageniid mayflies were abundant in unscoured streams. Rainbow trout (Oncorhynchus mykiss) were of similar abundance in scoured and unscoured streams, but scoured streams were dominated by young-of-the-year fish while older juveniles were more abundant in unscoured streams. Differences in the presence of cold-water (Doroneuria) versus warm-water (Calineuria) perlid stoneflies suggest that debris flows have altered stream temperatures. Debris flows have long-lasting impacts on stream communities, primarily through the cascading effects of removal of riparian vegetation. Because debris flow frequency increases following road construction and timber harvest, the long-term biological effects of debris flows on stream ecosystems, including anadromous fish populations, need to be considered in forest management decisions.
Satellite imagery of the onset of streaming flow of ice streams C and D, West Antarctica
Hodge, S.M.; Doppelhammer, S.K.
1996-01-01
Five overlapping Landsat multispectral scanner satellite images of the interior of the West Antarctic ice sheet were enhanced with principal component analysis, high-pass filtering, and linear contrast stretching and merged into a mosaic by aligning surface features in the overlap areas. The mosaic was registered to geodetic coordinates, to an accuracy of about 1 km, using the five scene centers as control points. The onset of streaming flow of two tributaries of ice stream C and one tributary of ice stream D is visible in the mosaic. The onset appears to occur within a relatively short distance, less than the width of the ice stream, typically at a subglacial topographic feature such as a step or ridge. The ice streams extend farther up into the interior than previously mapped. Ice stream D starts about 150 km from the ice divide, at an altitude of about 1500 m, approximately halfway up the convex-upward dome shape of the interior ice sheet. Ice stream D is relatively much longer than ice stream C, possibly because ice stream D is currently active whereas ice stream C is currently inactive. The grounded portion of the West Antarctic ice sheet is perhaps best conceptualized as an ice sheet in which ice streams are embedded over most of its area, with slow moving ice converging into fast moving ice streams in a widely distributed pattern, much like that of streams and rivers in a hydrologic basin. A relic margin appears to parallel most of the south margin of the tributary of ice stream D, separated from the active shear margin by about 10 km or less for a distance of over 200 km. This means there is now evidence for recent changes having occurred in three of the five major ice streams which drain most of West Antarctica (B, C, and D), two of which (B and D) are currently active.
Stream Phosphorus Dynamics Along a Suburbanizing Gradient in Southern Ontario, Canada
NASA Astrophysics Data System (ADS)
Duval, T. P.
2017-12-01
While it is well known that urban streams are subject to impaired water quality relative to natural analogues, far less research has been directed at stream water quality during the process of (sub-)urbanization. This study determines the role of housing construction activities in Brampton, Canada on the concentration and flux of phosphorus (P) in a headwater stream. Prior to development the stream was engineered with a riffle-pool sequence, riparian plantings, and a floodplain corridor that was lined with sediment fencing. Stream sites were sampled daily over a period of six months at locations representing varying stages of subdivision completion (upper site: active construction; middle site: finished construction and natural vegetation; lower site: finished construction and active construction). A nearby urban stream site developed ten years prior to this study was selected as a reference site. There were no differences in total phosphorus (TP) levels or flux between the suburbanizing and urban streams; however, the forms of P differed between sites. The urban stream TP load was dominated by particulate phosphorus (PP) while suburbanizing stream P was mainly in the dissolved organic phosphorus (DOP) form. The importance of DOP to TP flux increased with the onset of the growing season. TP levels in all stream segments frequently exceeded provincial water quality guidelines during storm events but were generally low during baseflow conditions. During storm events PP and total suspended solid levels in the suburbanizing stream reached levels of the urban stream due to sediment fence failure at several locations along the construction-hillslope interface. Along the suburbanizing gradient, the hydrological connection to a mid-reach zone of no construction activity (fallow field and native forest) resulted in significantly lower P levels than at the upper suburbanizing stream site. This suggests that stream channel design features, the timing of construction activities, and the hydrological connection between the stream and construction projects all contribute to downstream export of nutrients and ultimately stream water quality.
Streams in the urban heat island: spatial and temporal variability in temperature
Somers, Kayleigh A.; Bernhardt, Emily S.; Grace, James B.; Hassett, Brooke A.; Sudduth, Elizabeth B.; Wang, Siyi; Urban, Dean L.
2013-01-01
Streams draining urban heat islands tend to be hotter than rural and forested streams at baseflow because of warmer urban air and ground temperatures, paved surfaces, and decreased riparian canopy. Urban infrastructure efficiently routes runoff over hot impervious surfaces and through storm drains directly into streams and can lead to rapid, dramatic increases in temperature. Thermal regimes affect habitat quality and biogeochemical processes, and changes can be lethal if temperatures exceed the upper tolerance limits of aquatic fauna. In summer 2009, we collected continuous (10-min interval) temperature data in 60 streams spanning a range of development intensity in the Piedmont of North Carolina, USA. The 5 most urbanized streams averaged 21.1°C at baseflow, compared to 19.5°C in the 5 most forested streams. Temperatures in urban streams rose as much as 4°C during a small regional storm, whereas the same storm led to extremely small to no changes in temperature in forested streams. Over a kilometer of stream length, baseflow temperature varied by as much as 10°C in an urban stream and as little as 2°C in a forested stream. We used structural equation modeling to explore how reach- and catchment-scale attributes interact to explain maximum temperatures and the magnitudes of storm-flow temperature surges. The best predictive model of baseflow temperatures (R2 = 0.461) included moderately strong pathways from catchment-scale factors, both directly (extent of development and road density) and indirectly, as mediated by reach-scale factors (canopy closure and stream width). The strongest influence on storm-flow temperature surges appeared to be % development in the catchment. Reach-scale factors, such as the extent of riparian forest and stream width, had little mitigating influence (R2 = 0.448). Stream temperature is an essential, but overlooked, aspect of the urban stream syndrome and is affected by reach-scale habitat variables, catchment-scale urbanization, and stream thermal regimes.
Interfacing External Quantum Devices to a Universal Quantum Computer
Lagana, Antonio A.; Lohe, Max A.; von Smekal, Lorenz
2011-01-01
We present a scheme for using external quantum devices with the previously constructed universal quantum computer. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well-known oracle-based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and Grover algorithms, using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer. PMID:22216276
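As a concrete illustration of the oracle-based pattern described above, the following self-contained Python/NumPy sketch implements the simplest case, the Deutsch algorithm, with the oracle supplied as a black-box unitary. This is the generic textbook construction, not the universal-quantum-computer program of the paper.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def oracle(f):
    """Two-qubit black-box unitary U_f|x, y> = |x, y XOR f(x)>."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Decide 'constant' vs 'balanced' with a single oracle query."""
    state = np.kron([1, 0], [0, 1])              # prepare |0>|1>
    state = np.kron(H, H) @ state                # Hadamard both qubits
    state = oracle(f) @ state                    # one black-box query
    state = np.kron(H, np.eye(2)) @ state        # Hadamard the query qubit
    p0 = abs(state[0])**2 + abs(state[1])**2     # P(first qubit measures 0)
    return "constant" if np.isclose(p0, 1.0) else "balanced"

print(deutsch(lambda x: 0))   # constant function -> 'constant'
print(deutsch(lambda x: x))   # balanced function -> 'balanced'

The oracle is queried exactly once, yet the measurement distinguishes constant from balanced functions deterministically, which is the feature the paper's external-oracle programs exploit.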
Device for staged carbon monoxide oxidation
Vanderborgh, Nicholas E.; Nguyen, Trung V.; Guante, Jr., Joseph
1993-01-01
A method and apparatus for selectively oxidizing carbon monoxide in a hydrogen-rich feed stream. The method comprises mixing a feed stream consisting essentially of hydrogen, carbon dioxide, water and carbon monoxide with a first predetermined quantity of oxygen (air). The temperature of the mixed feed/oxygen stream is adjusted in a first heat exchanger assembly (20) to a first temperature. The mixed feed/oxygen stream is sent to reaction chambers (30,32) having an oxidation catalyst contained therein. The carbon monoxide of the feed stream preferentially adsorbs onto the catalyst at the first temperature to react with the oxygen in the chambers (30,32) with minimal simultaneous reaction of the hydrogen, forming an intermediate hydrogen-rich process stream having a lower carbon monoxide content than the feed stream. The elevated outlet temperature of the process stream is carefully controlled in a second heat exchanger assembly (42) to a second temperature above the first temperature. The process stream is then mixed with a second predetermined quantity of oxygen (air). The carbon monoxide of the process stream preferentially reacts with the second quantity of oxygen in a second-stage reaction chamber (56) with minimal simultaneous reaction of the hydrogen in the process stream. The reaction produces a hydrogen-rich product stream having a lower carbon monoxide content than the process stream. The product stream is then cooled in a third heat exchanger assembly (72) to a third predetermined temperature. Three or more stages may be desirable, each with metered oxygen injection.
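The staging arithmetic is simple enough to illustrate. The Python sketch below tracks CO through two stages with metered oxygen injection, using assumed selectivities and an idealized CO + 1/2 O2 -> CO2 stoichiometry; none of the numbers are the patent's operating values.

def prox_stage(co, o2, selectivity):
    """Return the CO mole fraction remaining after one stage.

    selectivity = fraction of injected O2 consumed by CO rather than H2;
    each mole of O2 can oxidize 2 moles of CO at selectivity 1.
    """
    co_burned = min(co, 2 * o2 * selectivity)
    return co - co_burned

co = 0.010                      # assumed 1% CO in the hydrogen-rich feed
for stage, (o2, s) in enumerate([(0.006, 0.5), (0.002, 0.4)], start=1):
    co = prox_stage(co, o2, s)
    print(f"stage {stage}: CO = {co * 100:.3f}%")

Splitting the oxygen across stages, with temperature reset between them, is what lets each stage run where CO adsorption dominates, so hydrogen losses stay small while CO is driven down stepwise.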
Responses to riparian restoration in the Spring Creek watershed, Central Pennsylvania
Carline, R.F.; Walsh, M.C.
2007-01-01
Riparian treatments, consisting of 3- to 4-m buffer strips, stream bank stabilization, and rock-lined stream crossings, were installed in two streams with livestock grazing to reduce sediment loading and stream bank erosion. Cedar Run and Slab Cabin Run, the treatment streams, and Spring Creek, an adjacent reference stream without riparian grazing, were monitored prior to (1991-1992) and 3-5 years after (2001-2003) riparian buffer installation to assess channel morphology, stream substrate composition, suspended sediments, and macroinvertebrate communities. Few changes were found in channel widths and depths, but channel-structuring flow events were rare in the drought period after restoration. Stream bank vegetation increased from 50% or less to 100% in nearly all formerly grazed riparian buffers. The proportion of fine sediments in stream substrates decreased in Cedar Run but not in Slab Cabin Run. After riparian treatments, suspended sediments during base flow and storm flow decreased 47-87% in both streams. Macroinvertebrate diversity did not improve after restoration in either treated stream. Relative to Spring Creek, macroinvertebrate densities increased in both treated streams by the end of the posttreatment sampling period. Despite drought conditions that may have altered physical and biological effects of riparian treatments, goals of the riparian restoration to minimize erosion and sedimentation were met. A relatively narrow grass buffer along 2.4 km of each stream was effective in improving water quality, stream substrates, and some biological metrics. © 2007 Society for Ecological Restoration International.
Stone, Wesley W.; Gilliom, Robert J.; Martin, Jeffrey D.
2014-01-01
This report provides an overview of the U.S. Geological Survey National Water-Quality Assessment program and National Stream Quality Accounting Network findings for pesticide occurrence in U.S. streams and rivers during 2002–11 and compares them to findings for the previous decade (1992–2001). In addition, pesticide stream concentrations were compared to Human Health Benchmarks (HHBs) and chronic Aquatic Life Benchmarks (ALBs). The comparisons between the decades were intended to be simple and descriptive. Trends over time are being evaluated separately in a series of studies involving rigorous trend analysis. During both decades, one or more pesticides or pesticide degradates were detected more than 90 percent of the time in streams across all types of land uses. For individual pesticides during 2002–11, atrazine (and degradate, deethylatrazine), carbaryl, fipronil (and degradates), metolachlor, prometon, and simazine were detected in streams more than 50 percent of the time. In contrast, alachlor, chlorpyrifos, cyanazine, diazinon, EPTC, Dacthal, and tebuthiuron were detected less frequently in streams during the second decade than during the first decade. During 2002–11, only one stream had an annual mean pesticide concentration that exceeded an HHB. In contrast, 17 percent of agriculture land-use streams and one mixed land-use stream had annual mean pesticide concentrations that exceeded HHBs during 1992–2001. The difference between the first and second decades in terms of percent of streams exceeding HHBs was attributed to regulatory changes. During 2002–11, nearly two-thirds of agriculture land-use streams and nearly one-half of mixed land-use streams exceeded chronic ALBs. For urban land use, 90 percent of the streams exceeded a chronic ALB. Fipronil, metolachlor, malathion, cis-permethrin, and dichlorvos exceeded chronic ALBs for more than 10 percent of the streams. For agriculture and mixed land-use streams, the overall percent of streams that exceeded a chronic ALB was very similar between the decades. For urban land-use streams, the percent of streams exceeding a chronic ALB during 2002–11 was nearly double that seen during 1992–2001. The reason for this difference was the inclusion of fipronil monitoring during the second decade. Across all land-use streams, the percent of streams exceeding a chronic ALB for fipronil during 2002–11 was greater than all other insecticides during both decades. The percent of streams exceeding a chronic ALB for metolachlor, chlorpyrifos, diazinon, malathion, and carbaryl decreased from the first decade to the second decade. The results of the 2002–11 summary and comparison to 1992–2001 are consistent with the results from more rigorous trend analysis of pesticide stream concentrations for individual streams in various regions of the U.S.
IMPERVIOUS COVER AS A REGIONAL INDICATOR
Increases in impervious surface area in a watershed give rise to changes in stream hydrology, stream channel morphology, increased pollutant runoff, and an increase in stream water temperature. These physical changes in the stream systems in turn give rise to impacts on stream ...
A deeper look at the GD1 stream: density variations and wiggles
NASA Astrophysics Data System (ADS)
de Boer, T. J. L.; Belokurov, V.; Koposov, S. E.; Ferrarese, L.; Erkal, D.; Côté, P.; Navarro, J. F.
2018-06-01
Using deep photometric data from Canada-France-Hawaii Telescope/Megacam, we study the morphology and density of the GD-1 stream, one of the longest and coldest stellar streams in the Milky Way. Our deep data recovers the lower main sequence of the stream with unprecedented quality, clearly separating it from Milky Way foreground and background stars. An analysis of the distance to different parts of the stream shows that GD-1 lies at a heliocentric distance between 8 and 10 kpc, with only a shallow gradient across 45° on the sky. Matched filter maps of the stream density show clear density variations, such as deviations from a single orbital track and tentative evidence for stream fanning. We also detect a clear underdensity in the middle of the stream track at φ1 = -45° surrounded by overdense stream segments on either side. This location is a promising candidate for the elusive missing progenitor of the GD-1 stream. We conclude that the GD-1 stream has clearly been disturbed by interactions with the Milky Way disc or other subhaloes.
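A minimal version of the matched-filter step is sketched below in Python/NumPy, under assumed array names: stars are weighted by the ratio of a stream color-magnitude (Hess) density to a field Hess density, then binned on the sky. The authors' actual pipeline, photometric filters, and binning are not reproduced.

import numpy as np

def hess(color, mag, bins):
    """Normalized color-magnitude density, padded to avoid division by zero."""
    h, _, _ = np.histogram2d(color, mag, bins=bins, density=True)
    return h + 1e-12

def matched_filter_map(stars, stream_cmd, field_cmd, cmd_bins, sky_bins):
    """stars: dict of arrays 'color', 'mag', 'phi1', 'phi2' (assumed names)."""
    w_grid = hess(*stream_cmd, cmd_bins) / hess(*field_cmd, cmd_bins)
    ci = np.clip(np.digitize(stars["color"], cmd_bins[0]) - 1, 0, w_grid.shape[0] - 1)
    mi = np.clip(np.digitize(stars["mag"], cmd_bins[1]) - 1, 0, w_grid.shape[1] - 1)
    weights = w_grid[ci, mi]                 # stream-like stars count more
    density, _, _ = np.histogram2d(stars["phi1"], stars["phi2"],
                                   bins=sky_bins, weights=weights)
    return density

Because foreground stars land in low-weight cells of the color-magnitude grid, the weighted sky map suppresses them and brings out the density variations, gaps, and possible fanning described above.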
Photonic Programmable Tele-Cloning Network.
Li, Wei; Chen, Ming-Cheng
2016-06-29
The concept of quantum teleportation allows unknown quantum states to be broadcast and processed in a distributed quantum network. The quantum information injected into the network can be diluted into distant multiple copies by quantum cloning and processed by arbitrary quantum logic gates that were programmed in advance into the network quantum state. A quantum network that simultaneously combines these fundamental quantum functions could lead to intriguing new applications. Here we propose a photonic programmable telecloning network based on a four-photon interferometer. The photonic network performs quantum gates, quantum cloning and quantum teleportation, and features the experimental advantage of high brightness through photon recycling.
Brian J. Palik; Stephen W. Golladay; P. Charles Goebel; Brad W. Taylor
1998-01-01
Large floods are an important process controlling the structure and function of stream ecosystems. One of the ways floods affect streams is through the recruitment of coarse woody debris from stream-side forests. Stream valley geomorphology may mediate this interaction by altering flood velocity, depth, and duration. Little research has examined how floods and...
Jeremy D. Groom
2013-01-01
Studies over the past 40 years have established that riparian buffer retention along streams protects against stream temperature increase. This protection is neither universal nor complete; some buffered streams still warm, while other streams' temperatures remain stable. Oregon Department of Forestry developed riparian rules in the Forest Practices Act (FPA) to...
Determining long time-scale hyporheic zone flow paths in Antarctic streams
Gooseff, M.N.; McKnight, Diane M.; Runkel, R.L.; Vaughn, B.H.
2003-01-01
In the McMurdo Dry Valleys of Antarctica, glaciers are the source of meltwater during the austral summer, and the streams and adjacent hyporheic zones constitute the entire physical watershed; there are no hillslope processes in these systems. Hyporheic zones can extend several metres from each side of the stream, and are up to 70 cm deep, corresponding to a lateral cross-section as large as 12 m2, and water resides in the subsurface year-round. In this study, we differentiate between the near-stream hyporheic zone, which can be characterized with stream tracer experiments, and the extended hyporheic zone, which has a longer time-scale of exchange. We sampled stream water from Green Creek and from the adjacent saturated alluvium for stable isotopes of D and 18O to assess the significance and extent of stream-water exchange between the streams and extended hyporheic zones over long time-scales (days to weeks). Our results show that water residing in the extended hyporheic zone is much more isotopically enriched (up to 11‰ in D and 2.2‰ in 18O) than stream water. This result suggests a long residence time within the extended hyporheic zone, during which fractionation has occurred owing to summer evaporation and winter sublimation of hyporheic water. We found less enriched water in the extended hyporheic zone later in the flow season, suggesting that stream water may be exchanged into and out of this zone on the time-scale of weeks to months. The transient storage model OTIS was used to characterize the exchange of stream water with the extended hyporheic zone. Model results yield exchange rates (α) generally an order of magnitude lower (10⁻⁵ s⁻¹) than those determined using stream-tracer techniques on the same stream. In light of previous studies in these streams, these results suggest that the hyporheic zones in Antarctic streams have near-stream zones of rapid stream-water exchange, where 'fast' biogeochemical reactions may influence water chemistry, and extended hyporheic zones, in which slower biogeochemical reaction rates may affect stream-water chemistry at longer time-scales. Copyright © 2003 John Wiley & Sons, Ltd.
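For context, the OTIS transient-storage equations in their standard form (omitting lateral inflow terms), with C the stream concentration, C_s the storage-zone concentration, Q the discharge, A and A_s the stream and storage cross-sectional areas, D the dispersion coefficient, and α the exchange coefficient fitted in the study, are

\frac{\partial C}{\partial t} = -\frac{Q}{A}\frac{\partial C}{\partial x} + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right) + \alpha\,(C_s - C), \qquad \frac{dC_s}{dt} = \alpha\,\frac{A}{A_s}\,(C - C_s).

The order-of-magnitude smaller fitted α (~10⁻⁵ s⁻¹) is what operationally distinguishes the extended hyporheic zone from the near-stream zone probed by tracer experiments.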
Effects of urban stream burial on nitrogen uptake and ...
Urbanization has resulted in extensive burial and channelization of headwater streams, yet little is known about impacts on stream ecosystem functions critical for reducing downstream nitrogen pollution. To characterize the biogeochemical impact of stream burial, we measured NO3- uptake, using 15N-NO3- isotope tracer releases, and whole-stream metabolism, during four seasons in three paired buried and open stream reaches within the Baltimore Ecosystem Study Long-term Ecological Research Network. Stream burial increased NO3- uptake length by a factor of 7.5 (p < 0.01) and decreased nitrate uptake velocity and areal nitrate uptake rate by factors of 8.2 (p = 0.01) and 9.6 (p < 0.001), respectively. Stream burial decreased gross primary productivity by a factor of 9.2 (p < 0.05) and decreased ecosystem respiration by a factor of 4.2 (p = 0.06). From statistical analysis of Excitation Emissions Matrices (EEMs), buried streams were also found to have significantly less labile dissolved organic matter. Furthermore, buried streams had significantly lower transient storage and water temperatures. Overall, differences in NO3- uptake and metabolism were primarily explained by decreased transient storage and light availability in buried streams. We estimate that stream burial increases daily watershed nitrate export by as much as 500% due to decreased in-stream retention and may considerably decrease carbon export via decreased primary production. These results
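The three uptake metrics reported above are linked by the standard nutrient-spiraling identities (notation assumed: Q is discharge, w wetted width, C the ambient NO3- concentration, S_w the uptake length):

v_f = \frac{Q}{w\,S_w}, \qquad U = v_f\,C,

so, with broadly similar Q, w, and C, the 7.5-fold lengthening of S_w in buried reaches translates directly into the roughly order-of-magnitude drops in uptake velocity v_f and areal uptake rate U that were measured.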
The Midwest Stream Quality Assessment—Influences of human activities on streams
Van Metre, Peter C.; Mahler, Barbara J.; Carlisle, Daren M.; Coles, James F.
2018-04-16
Healthy streams and the fish and other organisms that live in them contribute to our quality of life. Extensive modification of the landscape in the Midwestern United States, however, has profoundly affected the condition of streams. Row crops and pavement have replaced grasslands and woodlands, streams have been straightened, and wetlands and fields have been drained. Runoff from agricultural and urban land brings sediment and chemicals to streams. What is the chemical, physical, and biological condition of Midwestern streams? Which physical and chemical stressors are adversely affecting biological communities, what are their origins, and how might we lessen or avoid their adverse effects?In 2013, the U.S. Geological Survey (USGS) conducted the Midwest Stream Quality Assessment to evaluate how human activities affect the biological condition of Midwestern streams. In collaboration with the U.S. Environmental Protection Agency National Rivers and Streams Assessment, the USGS sampled 100 streams, chosen to be representative of the different types of watersheds in the region. Biological condition was evaluated based on the number and diversity of fish, algae, and invertebrates in the streams. Changes to the physical habitat and chemical characteristics of the streams—“stressors”—were assessed, and their relation to landscape factors and biological condition was explored by using mathematical models. The data and models help us to better understand how the human activities on the landscape are affecting streams in the region.
Sabater, Sergi; Elosegi, Arturo; Acuña, Vicenç; Basaguren, Ana; Muñoz, Isabel; Pozo, Jesús
2008-02-15
Climate affects many aspects of stream ecosystems, although the presence of riparian forests can buffer differences between streams in different climatic settings. In an attempt to measure the importance of climate, we compared the seasonal patterns of hydrology, input and storage of allochthonous organic matter, and the trophic structure (abundance of algae and macroinvertebrates) in two temperate forested streams, one Mediterranean, the other Atlantic. Hydrology played a leading role in shaping the trophic structure of both streams. Frequency and timing of floods and droughts determined benthic detritus storage. Inputs and retention of allochthonous organic matter were higher in the Atlantic stream, whereas chlorophyll concentration was lower because of stronger light limitation. Instead, light availability and scour of particulate organic matter during late winter favoured higher chlorophyll concentration in the Mediterranean stream. As a result, in the Mediterranean stream grazers were more prevalent and consumers showed a higher dependence on autotrophic materials. On the other hand, the Atlantic stream depended on allochthonous materials throughout the whole study period. The overall trophic structure showed much stronger seasonality in the Mediterranean than in the Atlantic stream, this being the most distinctive difference between these two types of temperate streams. The different patterns observed in the two streams are an indication that climatic differences should be incorporated in proper measurements of ecosystem health.
ASSESSING HEADWATER STREAMS: LINKING LANDSCAPES TO STREAM NETWORKS
Headwater streams represent a significant land-water boundary and drain 70-80% of the landscape. Headwater streams are vital components to drainage systems and are directly linked to our downstream rivers and lakes. However, alteration and loss of headwater streams have occurre...
Methods of hydrotreating a liquid stream to remove clogging compounds
Minderhoud, Johannes Kornelis [Amsterdam, NL; Nelson, Richard Gene [Katy, TX; Roes, Augustinus Wilhelmus Maria [Houston, TX; Ryan, Robert Charles [Houston, TX; Nair, Vijay [Katy, TX
2009-09-22
A method includes producing formation fluid from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a gas stream. At least a portion of the liquid stream is provided to a hydrotreating unit. At least a portion of selected in situ heat treatment clogging compositions in the liquid stream are removed to produce a hydrotreated liquid stream by hydrotreating at least a portion of the liquid stream at conditions sufficient to remove the selected in situ heat treatment clogging compositions.
NASA Astrophysics Data System (ADS)
Pennino, M. J.; Kaushal, S. S.; Mayer, P. M.; Utz, R. M.; Cooper, C. A.
2015-12-01
An improved understanding of sources and timing of water and nutrient fluxes associated with urban stream restoration is critical for guiding effective watershed management. We investigated how sources, fluxes, and flowpaths of water, carbon (C), nitrogen (N), and phosphorus (P) shift in response to differences in stream restoration and sanitary infrastructure. We compared a restored stream with 3 unrestored streams draining urban development and stormwater management over a 3 year period. We found that there was significantly decreased peak discharge in response to precipitation events following stream restoration. Similarly, we found that the restored stream showed significantly lower monthly peak runoff (9.4 ± 1.0 mm d-1) compared with two urban unrestored streams (ranging from 44.9 ± 4.5 to 55.4 ± 5.8 mm d-1) draining higher impervious surface cover. Peak runoff in the restored stream was more similar to a less developed stream draining extensive stormwater management (13.2 ± 1.9 mm d-1). Interestingly, the restored stream exported most carbon, nitrogen, and phosphorus loads at relatively lower streamflow than the 2 more urban streams, which exported most of their loads at higher and less frequent streamflow. Annual exports of total carbon (6.6 ± 0.5 kg ha-1 yr-1), total nitrogen (4.5 ± 0.3 kg ha-1 yr-1), and total phosphorus (161 ± 15 g ha-1 yr-1) were significantly lower in the restored stream compared to both urban unrestored streams (p < 0.05) and similar to the stream draining stormwater management. Although stream restoration appeared to influence hydrology to some degree, nitrate isotope data suggested that 55 ± 1% of the nitrate in the restored stream was derived from leaky sanitary sewers (during baseflow), similar to the unrestored streams. Longitudinal synoptic surveys of water and nitrate isotopes along all 4 watersheds suggested the importance of urban groundwater contamination from leaky piped infrastructure. Urban groundwater contamination was also suggested by additional tracer measurements including fluoride (added to drinking water) and iodide (contained in dietary salt). Our results suggest that integrating stream restoration with restoration of aging sanitary infrastructure can be critical to more effectively minimize watershed nutrient export. Given that stream restoration and sanitary pipe repairs both involve extensive channel manipulation, they can be considered simultaneously in management strategies. In addition, groundwater can be a major source of nutrient fluxes in urban watersheds, which has been less considered compared with upland sources and storm drains. Groundwater sources, fluxes, and flowpaths should also be targeted in efforts to improve stream restoration strategies and prioritize hydrologic "hot spots" along watersheds where stream restoration is most likely to succeed.
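Source fractions such as the 55 ± 1% figure are typically obtained from a two-endmember isotope mixing model; a hedged sketch of that calculation (the study's exact endmember values and formulation are not reproduced here) gives the sewage-derived fraction f as

f_{\text{sewage}} = \frac{\delta^{15}\mathrm{N}_{\text{stream}} - \delta^{15}\mathrm{N}_{\text{other}}}{\delta^{15}\mathrm{N}_{\text{sewage}} - \delta^{15}\mathrm{N}_{\text{other}}},

where "other" denotes the non-sewage endmember (for example, atmospheric or fertilizer nitrate).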
Speechley, William J; Ngan, Elton T C
2008-01-01
Delusions, a cardinal feature of schizophrenia, are characterized by the development and preservation of false beliefs despite reason and evidence to the contrary. A number of cognitive models have made important contributions to our understanding of delusions, though it remains unclear which core cognitive processes are malfunctioning to enable individuals with delusions to form and maintain erroneous beliefs. We propose a modified dual-stream processing model that provides a viable and testable mechanism that can account for this debilitating symptom. Dual-stream models divide decision-making into two streams: a fast, intuitive and automatic form of processing (Stream 1); and a slower, conscious and deliberative process (Stream 2). Our novel model proposes two key influences on the way these streams interact in everyday decision-making: conflict and emotion. Conflict: in most decision-making scenarios one obvious answer presents itself and the two streams converge onto the same conclusion. However, in instances where there are competing alternative possibilities, an individual often experiences dissonance, or a sense of conflict. The detection of this conflict biases processing towards the more deliberative Stream 2. Emotion: highly emotional states can result in behavior that is reflexive and action-oriented. This may be due to the power of emotionally valenced stimuli to bias reasoning towards Stream 1. We propose that in schizophrenia, an abnormal response to these two influences results in a pathological schism between Stream 1 and Stream 2, enabling erroneous intuitive explanations to coexist with contrary logical explanations of the same event. Specifically, we suggest that delusions are the result of a failure to reconcile the two streams due to both a failure of conflict to bias decision-making towards Stream 2 and an accentuated emotional bias towards Stream 1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A.; Troia, Matthew J.; DeRolph, Christopher R.
Stream classifications are an inventory of different types of streams. Classifications help us explore similarities and differences among different types of streams, make inferences regarding stream ecosystem behavior, and communicate the complexities of ecosystems. We developed a nested, layered, and spatially contiguous stream classification to characterize the biophysical settings of stream reaches within the Eastern United States (~900,000 reaches). The classification is composed of five natural characteristics (hydrology, temperature, size, confinement, and substrate) along with several disturbance regime layers; each was selected for its relevance to hydropower mitigation. We developed the classification at the stream reach level using the National Hydrography Dataset Plus Version 1 (1:100k scale). The stream classification is useful to environmental mitigation for hydropower dams in multiple ways. First, it creates efficiency in the regulatory process by providing an objective and data-rich means to address meaningful mitigation actions. Second, the SCT addresses data gaps, as it quickly provides an inventory of hydrology, temperature, morphology, and ecological communities not only for the immediate project area but also for surrounding streams. This includes identifying potential reference streams as those that are proximate to the hydropower facility and fall within the same class. These streams can potentially be used to identify ideal environmental conditions or desired ecological communities. In doing so, the classification provides some context for how streams may function and respond to dam regulation, along with an overview of specific mitigation needs. Herein, we describe the methodology for developing each stream classification layer and provide a tutorial to guide applications of the classification (and associated data) in regulatory settings, such as hydropower (re)licensing.
Evidence for fish dispersal from spatial analysis of stream network topology
Hitt, N.P.; Angermeier, P.L.
2008-01-01
Developing spatially explicit conservation strategies for stream fishes requires an understanding of the spatial structure of dispersal within stream networks. We explored spatial patterns of stream fish dispersal by evaluating how the size and proximity of connected streams (i.e., stream network topology) explained variation in fish assemblage structure and how this relationship varied with local stream size. We used data from the US Environmental Protection Agency's Environmental Monitoring and Assessment Program in wadeable streams of the Mid-Atlantic Highlands region (n = 308 sites). We quantified stream network topology with a continuous analysis based on the rate of downstream flow accumulation from sites and with a discrete analysis based on the presence of mainstem river confluences (i.e., basin area >250 km²) within 20 fluvial km (fkm) from sites. Continuous variation in stream network topology was related to local species richness within a distance of ~10 fkm, suggesting an influence of fish dispersal within this spatial grain. This effect was explained largely by catostomid species, cyprinid species, and riverine species, but was not explained by zoogeographic regions, ecoregions, sampling period, or spatial autocorrelation. Sites near mainstem river confluences supported greater species richness and abundance of catostomid, cyprinid, and ictalurid fishes than did sites >20 fkm from such confluences. Assemblages at sites on the smallest streams were not related to stream network topology, consistent with the hypothesis that local stream size regulates the influence of regional dispersal. These results demonstrate that the size and proximity of connected streams influence the spatial distribution of fish and suggest that these influences can be incorporated into the designs of stream bioassessments and reserves to enhance management efficacy. © 2008 by The North American Benthological Society.
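A minimal sketch of the discrete topology metric described above (a mainstem confluence within 20 fluvial km of a site), assuming the stream network is represented as a graph with edge lengths in fkm; the node names and distances are hypothetical.

    # Is a mainstem confluence (basin area >250 km^2) within 20 fkm of a site?
    import networkx as nx

    G = nx.Graph()  # undirected: fluvial distance is measured along the network
    G.add_weighted_edges_from(
        [("site_A", "j1", 8.0), ("j1", "confluence_X", 9.5),
         ("site_B", "j2", 15.0), ("j2", "confluence_X", 12.0)], weight="fkm")
    confluences = {"confluence_X"}

    def near_mainstem(G, site, confluences, radius_fkm=20.0):
        dists = nx.single_source_dijkstra_path_length(
            G, site, cutoff=radius_fkm, weight="fkm")
        return any(c in dists for c in confluences)

    print(near_mainstem(G, "site_A", confluences))  # True  (17.5 fkm)
    print(near_mainstem(G, "site_B", confluences))  # False (27.0 fkm)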
How and Why Does Stream Water Temperature Vary at Small Spatial Scales in a Headwater Stream?
NASA Astrophysics Data System (ADS)
Morgan, J. C.; Gannon, J. P.; Kelleher, C.
2017-12-01
The temperature of stream water is controlled by climatic variables, runoff/baseflow generation, and hyporheic exchange. Hydrologic conditions such as gaining/losing reaches and sources of inflow can vary dramatically along a stream on a small spatial scale. In this work, we attempt to discern the extent that the factors of air temperature, groundwater inflow, and precipitation influence stream temperature at small spatial scales along the length of a stream. To address this question, we measured stream temperature along the perennial stream network in a 43 ha catchment with a complex land use history in Cullowhee, NC. Two water temperature sensors were placed along the stream network on opposite sides of the stream at 100-meter intervals and at several locations of interest (i.e. stream junctions). The forty total sensors recorded the temperature every 10 minutes for one month in the spring and one month in the summer. A subset of sampling locations where stream temperature was consistent or varied from one side of the stream to the other were explored with a thermal imaging camera to obtain a more detailed representation of the spatial variation in temperature at those sites. These thermal surveys were compared with descriptions of the contributing area at the sample sites in an effort to discern specific causes of differing flow paths. Preliminary results suggest that on some branches of the stream stormflow has less influence than regular hyporheic exchange, while other tributaries can change dramatically with stormflow conditions. We anticipate this work will lead to a better understanding of temperature patterns in stream water networks. A better understanding of the importance of small-scale differences in flow paths to water temperature may be able to inform watershed management decisions in the future.
Endogenous Delta/Theta Sound-Brain Phase Entrainment Accelerates the Buildup of Auditory Streaming.
Riecke, Lars; Sack, Alexander T; Schroeder, Charles E
2015-12-21
In many natural listening situations, meaningful sounds (e.g., speech) fluctuate in slow rhythms among other sounds. When a slow rhythmic auditory stream is selectively attended, endogenous delta (1‒4 Hz) oscillations in auditory cortex may shift their timing so that higher-excitability neuronal phases become aligned with salient events in that stream [1, 2]. As a consequence of this stream-brain phase entrainment [3], these events are processed and perceived more readily than temporally non-overlapping events [4-11], essentially enhancing the neural segregation between the attended stream and temporally noncoherent streams [12]. Stream-brain phase entrainment is robust to acoustic interference [13-20] provided that target stream-evoked rhythmic activity can be segregated from noncoherent activity evoked by other sounds [21], a process that usually builds up over time [22-27]. However, it has remained unclear whether stream-brain phase entrainment functionally contributes to this buildup of rhythmic streams or whether it is merely an epiphenomenon of it. Here, we addressed this issue directly by experimentally manipulating endogenous stream-brain phase entrainment in human auditory cortex with non-invasive transcranial alternating current stimulation (TACS) [28-30]. We assessed the consequences of these manipulations on the perceptual buildup of the target stream (the time required to recognize its presence in a noisy background), using behavioral measures in 20 healthy listeners performing a naturalistic listening task. Experimentally induced cyclic 4-Hz variations in stream-brain phase entrainment reliably caused a cyclic 4-Hz pattern in perceptual buildup time. Our findings demonstrate that strong endogenous delta/theta stream-brain phase entrainment accelerates the perceptual emergence of task-relevant rhythmic streams in noisy environments. Copyright © 2015 Elsevier Ltd. All rights reserved.
Recent progress of quantum communication in China (Conference Presentation)
NASA Astrophysics Data System (ADS)
Zhang, Qiang
2016-04-01
Quantum communication, based on quantum physics, can provide information-theoretic security. Building a global quantum network is one ultimate goal of quantum information research. This talk reviews the progress of quantum communication in China, including quantum key distribution over a metropolitan area with untrusted relays, a field test of quantum entanglement swapping over a metropolitan network, the 2000 km quantum key distribution main trunk line, and satellite-based quantum communication.
Hybrid quantum computing with ancillas
NASA Astrophysics Data System (ADS)
Proctor, Timothy J.; Kendon, Viv
2016-10-01
In the quest to build a practical quantum computer, it is important to use efficient schemes for enacting the elementary quantum operations from which quantum computer programs are constructed. The opposing requirements of well-protected quantum data and fast quantum operations must be balanced to maintain the integrity of the quantum information throughout the computation. One important approach to quantum operations is to use an extra quantum system - an ancilla - to interact with the quantum data register. Ancillas can mediate interactions between separated quantum registers, and by using fresh ancillas for each quantum operation, data integrity can be preserved for longer. This review provides an overview of the basic concepts of the gate model quantum computer architecture, including the different possible forms of information encodings - from base two up to continuous variables - and a more detailed description of how the main types of ancilla-mediated quantum operations provide efficient quantum gates.
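To make the ancilla idea concrete, here is a toy numpy check (not a scheme from the review itself) that three register-ancilla CNOTs enact a CNOT between two register qubits that never interact directly, returning the ancilla to |0>.

    import numpy as np
    from functools import reduce

    I = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    P0 = np.diag([1., 0.])  # |0><0|
    P1 = np.diag([0., 1.])  # |1><1|
    kron_all = lambda ops: reduce(np.kron, ops)

    def cnot(control, target, n):
        """CNOT on an n-qubit register (qubit 0 is the leftmost kron factor)."""
        ops0 = [P0 if q == control else I for q in range(n)]
        ops1 = [P1 if q == control else (X if q == target else I) for q in range(n)]
        return kron_all(ops0) + kron_all(ops1)

    # Qubits: 0 = control, 1 = target, 2 = ancilla prepared in |0>.
    # Apply CNOT(0->2), then CNOT(2->1), then CNOT(0->2) (rightmost acts first).
    mediated = cnot(0, 2, 3) @ cnot(2, 1, 3) @ cnot(0, 2, 3)
    direct = np.kron(cnot(0, 1, 2), I)  # ordinary CNOT(0->1), identity on ancilla

    ket = np.eye(2)  # ket[b] is the basis state |b>
    for c in (0, 1):
        for t in (0, 1):
            psi = kron_all([ket[c], ket[t], ket[0]])
            assert np.allclose(mediated @ psi, direct @ psi)
    print("ancilla-mediated CNOT verified; ancilla ends in |0>")

Because every step touches only a register-ancilla pair, the ancilla mediates the interaction between the two (possibly separated) register qubits, and a fresh ancilla can be used for the next gate.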
Research progress on quantum informatics and quantum computation
NASA Astrophysics Data System (ADS)
Zhao, Yusheng
2018-03-01
Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, applying quantum information technology is a major direction of effort. The preparation, storage, purification, regulation, transmission, and coding and decoding of quantum states have become focal topics for scientists and engineers, with a profound impact on the national economy, people's livelihood, and defense technology. This paper first summarizes the background of quantum information science and quantum computing and the state of research at home and abroad, and then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's algorithm, and quantum phase estimation.
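As a concrete companion to the algorithms listed above, a minimal numpy sketch of the quantum Fourier transform as an explicit unitary; circuit constructions with Hadamard and controlled-phase gates realize the same matrix on n qubits with N = 2**n.

    import numpy as np

    def qft_matrix(N):
        # F[j, k] = exp(2*pi*i*j*k/N) / sqrt(N)
        j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
        return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

    F = qft_matrix(8)  # QFT on 3 qubits
    assert np.allclose(F.conj().T @ F, np.eye(8))  # unitarity check
    # |1> maps to a uniform superposition with linearly increasing phase:
    print(np.round(F @ np.eye(8)[1], 3))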
System for adding sulfur to a fuel cell stack system for improved fuel cell stability
Mukerjee, Subhasish [Pittsford, NY]; Haltiner, Jr., Karl J; Weissman, Jeffrey G [West Henrietta, NY]
2012-03-06
A system for adding sulfur to a fuel cell stack, having a reformer adapted to reform a hydrocarbon fuel stream containing sulfur contaminants, thereby providing a reformate stream having sulfur; a sulfur trap fluidly coupled downstream of the reformer for removing sulfur from the reformate stream, thereby providing a desulfurized reformate stream; and a metering device in fluid communication with the reformate stream upstream of the sulfur trap and with the desulfurized reformate stream downstream of the sulfur trap. The metering device is adapted to bypass a portion of the reformate stream to mix with the desulfurized reformate stream, thereby producing a conditioned reformate stream having a predetermined sulfur concentration that gives an acceptable balance of minimal drop in initial power with the desired maximum stability of operation over prolonged periods for the fuel cell stack.
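The blending arithmetic implied by the metering device is simple mass balance: a bypass fraction f of raw reformate (sulfur concentration c_raw) mixed with desulfurized reformate (concentration c_trap) yields c_mix = f*c_raw + (1 - f)*c_trap. A hedged sketch with hypothetical concentrations:

    def bypass_fraction(c_raw_ppm, c_target_ppm, c_trap_ppm=0.0):
        """Fraction of reformate to route around the sulfur trap."""
        return (c_target_ppm - c_trap_ppm) / (c_raw_ppm - c_trap_ppm)

    # Hypothetical values: 20 ppm sulfur in raw reformate, 1 ppm target.
    frac = bypass_fraction(c_raw_ppm=20.0, c_target_ppm=1.0)
    print(f"bypass {frac:.1%} of the reformate stream")  # bypass 5.0% ...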
Prioritized Contact Transport Stream
NASA Technical Reports Server (NTRS)
Hunt, Walter Lee, Jr. (Inventor)
2015-01-01
A detection process, contact recognition process, classification process, and identification process are applied to raw sensor data to produce an identified contact record set containing one or more identified contact records. A prioritization process is applied to the identified contact record set to assign a contact priority to each contact record in the identified contact record set. Data are removed from the contact records in the identified contact record set based on the contact priorities assigned to those contact records. A first contact stream is produced from the resulting contact records. The first contact stream is streamed in a contact transport stream. The contact transport stream may include and stream additional contact streams. The contact transport stream may be varied dynamically over time based on parameters such as available bandwidth, contact priority, presence/absence of contacts, system state, and configuration parameters.
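A minimal sketch of the prioritize-then-trim step described above; the record fields, payload sizes, and bandwidth budget are hypothetical, and a real system would also react to contact presence/absence and system state.

    def build_contact_stream(records, budget_bytes):
        """records carry a priority (lower = more important) and
        'essential'/'detail' payload sizes in bytes."""
        stream, used = [], 0
        for rec in sorted(records, key=lambda r: r["priority"]):
            size = rec["essential"] + rec["detail"]
            if used + size > budget_bytes:   # shed detail, keep essentials
                rec = {**rec, "detail": 0}
                size = rec["essential"]
            if used + size > budget_bytes:   # no room even for essentials
                continue
            stream.append(rec)
            used += size
        return stream

    records = [{"id": 1, "priority": 0, "essential": 40, "detail": 200},
               {"id": 2, "priority": 2, "essential": 40, "detail": 200},
               {"id": 3, "priority": 1, "essential": 40, "detail": 200}]
    print([r["id"] for r in build_contact_stream(records, budget_bytes=700)])
    # [1, 3, 2] -- the lowest-priority record survives with its detail dropped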
Origins of low energy-transfer efficiency between patterned GaN quantum well and CdSe quantum dots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Xingsheng, E-mail: xsxu@semi.ac.cn
For hybrid light-emitting devices (LEDs) consisting of GaN quantum wells and colloidal quantum dots, it is necessary to explore the physical mechanisms causing decreases in the quantum efficiencies and the energy transfer efficiency between a GaN quantum well and CdSe quantum dots. This study investigated the electro-luminescence for a hybrid LED consisting of colloidal quantum dots and a GaN quantum well patterned with photonic crystals. It was found that both the quantum efficiency of colloidal quantum dots on a GaN quantum well and the energy transfer efficiency between the patterned GaN quantum well and the colloidal quantum dots decreased with increases in the driving voltage or the driving time. Under high driving voltages, the decreases in the quantum efficiency of the colloidal quantum dots and the energy transfer efficiency can be attributed to Auger recombination, while those decreases under long driving time are due to photo-bleaching and Auger recombination.
Fermionic entanglement via quantum walks in quantum dots
NASA Astrophysics Data System (ADS)
Melnikov, Alexey A.; Fedichkin, Leonid E.
2018-02-01
Quantum walks are fundamentally different from random walks due to the quantum superposition property of quantum objects. The quantum walk process has been found to be very useful for quantum information and quantum computation applications. In this paper we demonstrate how to use quantum walks as a tool to generate high-dimensional two-particle fermionic entanglement. The generated entanglement can survive longer in the presence of depolarizing noise due to the periodicity of quantum walk dynamics. The possibility of creating two distinguishable qudits in a system of tunnel-coupled semiconductor quantum dots is discussed.
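A generic continuous-time quantum walk sketch (a single particle on a ring of tunnel-coupled dots, with the hopping Hamiltonian taken as the graph adjacency matrix), intended only to illustrate walk dynamics; it is not the two-particle fermionic protocol of the paper.

    import numpy as np
    from scipy.linalg import expm

    n = 6  # six tunnel-coupled dots in a ring
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0  # nearest-neighbor hopping

    psi0 = np.zeros(n, dtype=complex)
    psi0[0] = 1.0  # particle starts on dot 0
    for t in (0.0, 1.0, 2.0):
        psi_t = expm(-1j * A * t) @ psi0  # U(t) = exp(-iHt), hbar = 1
        print(t, np.round(np.abs(psi_t) ** 2, 3))
    # Unlike a classical random walk, the probability spreads ballistically and
    # shows interference; on small graphs the dynamics are quasi-periodic.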
Research to inform policy on headwater streams: ongoing and future directions
Headwater streams are the exterior links of stream networks and represent a substantial proportion of U.S. stream miles. Alteration and loss of headwater streams have occurred without an understanding of the potential consequences to larger downstream waterbodies. Recent court ca...
Effects of outdoor education stream classes on substrate movement and macroinvertebrate colonization
USDA-ARS?s Scientific Manuscript database
Environmental education and stream quality monitoring overlap in stream classes conducted at resident outdoor education (ROE) programs. ROE programs frequently use the same stream locations for their stream classes. The repeated use of the same location can potentially degrade aquatic macroinverte...
PREDICTION OF FUNDAMENTAL ASSEMBLAGES OF MID-ATLANTIC HIGHLAND STREAM FISHES
A statistical software tool, the Stream Fish Assemblage Predictor (SFAP), based on stream sampling data collected by the EPA in the mid-Atlantic Highlands, was developed to predict potential stream fish communities using characteristics of the stream and its watershed.
Step o...
RIPARIAN FOREST INDICATORS OF POTENTIAL FUTURE STREAM CONDITION
Large wood in streams can play an extraordinarily important role in influencing the physical structure of streams and in providing habitat for aquatic organisms. Since wood is continually lost from streams, predicting the future input of wood to streams from riparian forests is c...
Wilding, Bruce M; Turner, Terry D
2014-12-02
A method of natural gas liquefaction may include cooling a gaseous NG process stream to form a liquid NG process stream. The method may further include directing the first tail gas stream out of a plant at a first pressure and directing a second tail gas stream out of the plant at a second pressure. An additional method of natural gas liquefaction may include separating CO2 from a liquid NG process stream and processing the CO2 to provide a CO2 product stream. Another method of natural gas liquefaction may include combining a marginal gaseous NG process stream with a secondary substantially pure NG stream to provide an improved gaseous NG process stream. Additionally, a NG liquefaction plant may include a first tail gas outlet, and at least a second tail gas outlet, the at least a second tail gas outlet separate from the first tail gas outlet.
Photonic Programmable Tele-Cloning Network
Li, Wei; Chen, Ming-Cheng
2016-01-01
The concept of quantum teleportation allows an unknown quantum state to be broadcast and processed in a distributed quantum network. The quantum information injected into the network can be diluted into distant multi-copies by quantum cloning and processed by arbitrary quantum logic gates that were programmed in advance into the network quantum state. A quantum network that simultaneously combines these fundamental quantum functions could lead to intriguing new applications. Here we propose a photonic programmable telecloning network based on a four-photon interferometer. The photonic network serves for quantum gates, quantum cloning, and quantum teleportation, and features the experimental advantage of high brightness by photon recycling. PMID:27353838
Experimental entanglement of 25 individually accessible atomic quantum interfaces.
Pu, Yunfei; Wu, Yukai; Jiang, Nan; Chang, Wei; Li, Chang; Zhang, Sheng; Duan, Luming
2018-04-01
A quantum interface links the stationary qubits in a quantum memory with flying photonic qubits in optical transmission channels and constitutes a critical element for the future quantum internet. Entanglement of quantum interfaces is an important step for the realization of quantum networks. Through heralded detection of photon interference, we generate multipartite entanglement between 25 (or 9) individually addressable quantum interfaces in a multiplexed atomic quantum memory array and confirm genuine 22-partite (or 9-partite) entanglement. This experimental entanglement of a record-high number of individually addressable quantum interfaces makes an important step toward the realization of quantum networks, long-distance quantum communication, and multipartite quantum information processing.
Ivan Arismendi; Sherri L. Johnson; Jason B. Dunham; Roy Haggerty
2012-01-01
Temperature is a fundamentally important driver of ecosystem processes in streams. Recent warming of terrestrial climates around the globe has motivated concern about consequent increases in stream temperature. More specifically, observed trends of increasing air temperature and declining stream flow are widely believed to result in corresponding increases in stream...
What's a stream without water? Disproportionality in headwater regions impacting water quality.
Armstrong, Andrea; Stedman, Richard C; Bishop, Joseph A; Sullivan, Patrick J
2012-11-01
Headwater streams are critical components of the stream network, yet landowner perceptions, attitudes, and property management behaviors surrounding these intermittent and ephemeral streams are not well understood. Our research uses the concept of watershed disproportionality, where coupled social-biophysical conditions bear a disproportionate responsibility for harmful water quality outcomes, to analyze the potential influence of riparian landowner perceptions and attitudes on water quality in headwater regions. We combine social science survey data, aerial imagery, and an analysis of spatial point processes to assess the relationship between riparian landowner perceptions and attitudes in relation to stream flow regularity. Stream flow regularity directly and positively shapes landowners' water quality concerns, and also positively influences landowners' attitudes of stream importance, a key determinant of water quality concern as identified in a path analysis. Similarly, riparian landowners who do not notice or perceive a stream on their property are likely located in headwater regions. Our findings indicate that landowners of headwater streams, which are critical areas for watershed-scale water quality, are less likely to manage for water quality than landowners with perennial streams in an obvious, natural channel. We discuss the relationships between streamflow and how landowners develop understandings of their stream, and relate this to the broader water quality implications of headwater stream mismanagement.
Getzmann, Stephan; Näätänen, Risto
2015-11-01
With age, the ability to understand speech in multitalker environments usually deteriorates. The central auditory system has to perceptually segregate and group the acoustic input into sequences of distinct auditory objects. The present study used electrophysiological measures to study effects of age on auditory stream segregation in a multitalker scenario. Younger and older adults were presented with streams of short speech stimuli. When a single target stream was presented, the occurrence of a rare (deviant) syllable among a frequent (standard) syllable elicited the mismatch negativity (MMN), an electrophysiological correlate of automatic deviance detection. The presence of a second, concurrent stream consisting of the deviant syllable of the target stream reduced the MMN amplitude, especially when located near the target stream. The decrease in MMN amplitude indicates that the rare syllable of the target stream was perceived as less deviant, suggesting reduced stream segregation with decreasing stream distance. Moreover, the presence of a concurrent stream increased the MMN peak latency of the older group but not that of the younger group. The results provide neurophysiological evidence for the effects of concurrent speech on auditory processing in older adults, suggesting that older adults need more time for stream segregation in the presence of concurrent speech. Copyright © 2015 Elsevier Inc. All rights reserved.
Urban Stream Burial Increases Watershed-Scale Nitrate Export.
Beaulieu, Jake J; Golden, Heather E; Knightes, Christopher D; Mayer, Paul M; Kaushal, Sujay S; Pennino, Michael J; Arango, Clay P; Balz, David A; Elonen, Colleen M; Fritz, Ken M; Hill, Brian H
2015-01-01
Nitrogen (N) uptake in streams is an important ecosystem service that reduces nutrient loading to downstream ecosystems. Here we synthesize studies that investigated the effects of urban stream burial on N-uptake in two metropolitan areas and use simulation modeling to scale our measurements to the broader watershed scale. We report that nitrate travels on average 18 times farther downstream in buried than in open streams before being removed from the water column, indicating that burial substantially reduces N uptake in streams. Simulation modeling suggests that as burial expands throughout a river network, N uptake rates increase in the remaining open reaches which somewhat offsets reduced N uptake in buried reaches. This is particularly true at low levels of stream burial. At higher levels of stream burial, however, open reaches become rare and cumulative N uptake across all open reaches in the watershed rapidly declines. As a result, watershed-scale N export increases slowly at low levels of stream burial, after which increases in export become more pronounced. Stream burial in the lower, more urbanized portions of the watershed had a greater effect on N export than an equivalent amount of stream burial in the upper watershed. We suggest that stream daylighting (i.e., uncovering buried streams) can increase watershed-scale N retention.
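A minimal sketch of the intuition, assuming first-order nitrate uptake: the fraction surviving a reach of length L is exp(-L/Sw), with the uptake length Sw roughly 18 times longer in buried reaches (per the synthesis above). Reach lengths and the open-reach Sw are hypothetical, and the compensating increase in open-reach uptake is not modeled.

    import math

    def fraction_exported(reaches, sw_open_km=1.0, burial_factor=18.0):
        """reaches: list of (length_km, is_buried) in downstream order."""
        surviving = 1.0
        for length_km, is_buried in reaches:
            sw = sw_open_km * (burial_factor if is_buried else 1.0)
            surviving *= math.exp(-length_km / sw)
        return surviving

    open_net = [(0.5, False)] * 10                        # 5 km, all open
    half_buried = [(0.5, i % 2 == 0) for i in range(10)]  # alternate buried
    print(round(fraction_exported(open_net), 3))     # 0.007
    print(round(fraction_exported(half_buried), 3))  # 0.071 -> ~10x more export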
Speechley, W J; Murray, C B; McKay, R M; Munz, M T; Ngan, E T C
2010-03-01
Dual-stream information processing proposes that reasoning is composed of two interacting processes: a fast, intuitive system (Stream 1) and a slower, more logical process (Stream 2). In non-patient controls, divergence of these streams may result in the experience of conflict, modulating decision-making towards Stream 2, and initiating a more thorough examination of the available evidence. In delusional schizophrenia patients, a failure of conflict to modulate decision-making towards Stream 2 may reduce the influence of contradictory evidence, resulting in a failure to correct erroneous beliefs. Delusional schizophrenia patients and non-patient controls completed a deductive reasoning task requiring logical validity judgments of two-part conditional statements. Half of the statements were characterized by a conflict between logical validity (Stream 2) and content believability (Stream 1). Patients were significantly worse than controls in determining the logical validity of both conflict and non-conflict conditional statements. This between groups difference was significantly greater for the conflict condition. The results are consistent with the hypothesis that delusional schizophrenia patients fail to use conflict to modulate towards Stream 2 when the two streams of reasoning arrive at incompatible judgments. This finding provides encouraging preliminary support for the Dual-Stream Modulation Failure model of delusion formation and maintenance. 2009 Elsevier Masson SAS. All rights reserved.
Comparison of pesticides in eight U.S. urban streams
Hoffman, R.S.; Capel, P.D.; Larson, S.J.
2000-01-01
Little is known of the occurrence of pesticides in urban streams compared to streams draining agricultural areas. Water samples from eight urban streams from across the United States were analyzed for 75 pesticides and seven transformation products. For six of the eight urban streams, paired agricultural streams were used for comparisons. The herbicides detected most frequently in the urban streams were prometon, simazine, atrazine, tebuthiuron, and metolachlor, and the insecticides detected most frequently were diazinon, carbaryl, chlorpyrifos, and malathion. In contrast to similar-sized agricultural streams, total insecticide concentrations commonly exceeded total herbicide concentrations in these urban streams. In general, the temporal concentration patterns in the urban streams were consistent with the characteristics of the local growing season. The insecticides carbaryl and diazinon exceeded criteria for the protection of aquatic life in many of the urban streams in the spring and summer. When the country as a whole is considered, the estimated mass of herbicides contributed by urban areas to streams is dwarfed by the estimated contribution from agricultural areas, but for insecticides, contributions from urban and agricultural areas may be similar. The results of this study suggest that urban areas should not be overlooked when assessing sources and monitoring the occurrence of pesticides in surface waters.
Triple-server blind quantum computation using entanglement swapping
NASA Astrophysics Data System (ADS)
Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua
2014-04-01
Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client remains almost classical: the client requires no quantum computational power, no quantum memory, and no ability to prepare quantum states, and only needs access to quantum channels.
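The entanglement-swapping primitive the protocol builds on can be checked directly in numpy: Bell pairs A-B and C-D, a Bell measurement on B and C, and the never-interacting qubits A and D end up maximally entangled. This toy illustrates the primitive only, not the blind-computation protocol.

    import numpy as np
    from functools import reduce

    kron = lambda *ops: reduce(np.kron, ops)
    phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)

    # Qubit order A, B, C, D: Bell pair A-B times Bell pair C-D.
    state = kron(phi_plus, phi_plus)

    # Project middle qubits B, C onto |phi+><phi+|, one of the four
    # Bell-measurement outcomes (each occurs with probability 1/4).
    I2 = np.eye(2)
    proj = kron(I2, np.outer(phi_plus, phi_plus), I2)
    projected = proj @ state
    prob = np.vdot(projected, projected).real
    print(round(prob, 3))  # 0.25

    # Conditional state of A and D after this outcome:
    psi = projected.reshape(2, 2, 2, 2) / np.sqrt(prob)  # indices a, b, c, d
    psi_AD = np.einsum("bc,abcd->ad", phi_plus.reshape(2, 2), psi)
    print(np.round(psi_AD.reshape(4), 3))  # [0.707 0. 0. 0.707] = |phi+>_AD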
Multi-strategy based quantum cost reduction of linear nearest-neighbor quantum circuit
NASA Astrophysics Data System (ADS)
Tan, Ying-ying; Cheng, Xue-yun; Guan, Zhi-jin; Liu, Yang; Ma, Haiying
2018-03-01
With the development of reversible and quantum computing, the study of reversible and quantum circuits has also developed rapidly. Due to physical constraints, most quantum circuits require quantum gates to act only on adjacent quantum bits. However, many existing nearest-neighbor quantum circuits have large quantum cost. Therefore, how to effectively reduce quantum cost has become a popular research topic. In this paper, we propose multiple optimization strategies to reduce the quantum cost of linear nearest-neighbor circuits; that is, we reduce quantum cost through MCT gate decomposition, nearest-neighbor arrangement, and circuit simplification, respectively. The experimental results show that the proposed strategies can effectively reduce the quantum cost, with a maximum optimization rate of 30.61% compared to the corresponding previous results.
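A minimal sketch of the cost model being minimized, assuming a linear layout, unit cost per CNOT, and three CNOTs per SWAP; the routing here (swap in, apply, swap back) is deliberately naive, and the paper's strategies reduce exactly this kind of overhead.

    SWAP_COST = 3  # one SWAP = 3 CNOTs

    def lnn_cost(circuit):
        """circuit: list of (control, target) CNOTs on a line of qubits."""
        total = 0
        for c, t in circuit:
            d = abs(c - t)
            total += 1 + 2 * (d - 1) * SWAP_COST  # swap in, apply, swap back
        return total

    print(lnn_cost([(0, 3), (1, 2), (3, 1)]))  # 13 + 1 + 7 = 21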
Deforestation and stream warming affect body size of Amazonian fishes.
Ilha, Paulo; Schiesari, Luis; Yanagawa, Fernando I; Jankowski, KathiJo; Navas, Carlos A
2018-01-01
Declining body size has been suggested to be a universal response of organisms to rising temperatures, manifesting at all levels of organization and in a broad range of taxa. However, no study to date evaluated whether deforestation-driven warming could trigger a similar response. We studied changes in fish body size, from individuals to assemblages, in streams in Southeastern Amazonia. We first conducted sampling surveys to validate the assumption that deforestation promoted stream warming, and to test the hypothesis that warmer deforested streams had reduced fish body sizes relative to cooler forest streams. As predicted, deforested streams were up to 6 °C warmer and had fish 36% smaller than forest streams on average. This body size reduction could be largely explained by the responses of the four most common species, which were 43-55% smaller in deforested streams. We then conducted a laboratory experiment to test the hypothesis that stream warming as measured in the field was sufficient to cause a growth reduction in the dominant fish species in the region. Fish reared at forest stream temperatures gained mass, whereas those reared at deforested stream temperatures lost mass. Our results suggest that deforestation-driven stream warming is likely to be a relevant factor promoting observed body size reductions, although other changes in stream conditions, like reductions in organic matter inputs, can also be important. A broad scale reduction in fish body size due to warming may be occurring in streams throughout the Amazonian Arc of Deforestation, with potential implications for the conservation of Amazonian fish biodiversity and food supply for people around the Basin.
Groundwater declines are linked to changes in Great Plains stream fish assemblages
Perkin, Joshuah S.; Gido, Keith B.; Falke, Jeffrey A.; Fausch, Kurt D.; Crockett, Harry; Johnson, Eric R.; Sanderson, John
2017-01-01
Groundwater pumping for agriculture is a major driver causing declines of global freshwater ecosystems, yet the ecological consequences for stream fish assemblages are rarely quantified. We combined retrospective (1950–2010) and prospective (2011–2060) modeling approaches within a multiscale framework to predict change in Great Plains stream fish assemblages associated with groundwater pumping from the United States High Plains Aquifer. We modeled the relationship between the length of stream receiving water from the High Plains Aquifer and the occurrence of fishes characteristic of small and large streams in the western Great Plains at a regional scale and for six subwatersheds nested within the region. Water development at the regional scale was associated with construction of 154 barriers that fragment stream habitats, increased depth to groundwater and loss of 558 km of stream, and transformation of fish assemblage structure from dominance by large-stream to small-stream fishes. Scaling down to subwatersheds revealed consistent transformations in fish assemblage structure among western subwatersheds with increasing depths to groundwater. Although transformations occurred in the absence of barriers, barriers along mainstem rivers isolate depauperate western fish assemblages from relatively intact eastern fish assemblages. Projections to 2060 indicate loss of an additional 286 km of stream across the region, as well as continued replacement of large-stream fishes by small-stream fishes where groundwater pumping has increased depth to groundwater. Our work illustrates the shrinking of streams and homogenization of Great Plains stream fish assemblages related to groundwater pumping, and we predict similar transformations worldwide where local and regional aquifer depletions occur.
McClurg, S.E.; Petty, J.T.; Mazik, P.M.; Clayton, J.L.
2007-01-01
Restoration programs are expanding worldwide, but assessments of restoration effectiveness are rare. The objectives of our study were to assess current acid-precipitation remediation programs in streams of the Allegheny Plateau ecoregion of West Virginia (USA), identify specific attributes that could and could not be fully restored, and quantify temporal trends in ecosystem recovery. We sampled water chemistry, physical habitat, periphyton biomass, and benthic macroinvertebrate and fish community structure in three stream types: acidic (four streams), naturally circumneutral (eight streams), and acidic streams treated with limestone sand (eight streams). We observed no temporal trends in ecosystem recovery in treated streams despite sampling streams that ranged from 2 to 20 years since initial treatment. Our results indicated that the application of limestone sand to acidic streams was effective in fully recovering some characteristics, such as pH, alkalinity, Ca2+, Ca:H ratios, trout biomass and density, and trout reproductive success. However, recovery of many other characteristics was strongly dependent upon spatial proximity to treatment, and still others were never fully recovered. For example, limestone treatment did not restore dissolved aluminum concentrations, macroinvertebrate taxon richness, and total fish biomass to circumneutral reference conditions. Full recovery may not be occurring because treated streams continue to drain acidic watersheds and remain isolated in a network of acidic streams. We propose a revised stream restoration plan for the Allegheny Plateau that includes restoring stream ecosystems as connected networks rather than isolated reaches and recognizes that full recovery of acidified watersheds may not be possible. © 2007 by the Ecological Society of America.
Real-Time Earthquake Monitoring with Spatio-Temporal Fields
NASA Astrophysics Data System (ADS)
Whittier, J. C.; Nittel, S.; Subasinghe, I.
2017-10-01
With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open-source data stream engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
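A minimal pyspark Structured Streaming sketch of the windowed query described above (per-station maximum displacement over a sliding window). The socket source, comma-separated schema, and event threshold are hypothetical stand-ins for the SCIGN feed, not the study's implementation.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("quake-fields").getOrCreate()

    # Expect lines like: "P123,2017-10-01 12:00:00,0.004"
    raw = (spark.readStream.format("socket")
           .option("host", "localhost").option("port", 9999).load())
    obs = raw.select(F.split("value", ",").alias("p")).select(
        F.col("p")[0].alias("station"),
        F.col("p")[1].cast("timestamp").alias("ts"),
        F.col("p")[2].cast("double").alias("disp_m"))

    peaks = (obs.withWatermark("ts", "30 seconds")
             .groupBy(F.window("ts", "10 seconds", "5 seconds"), "station")
             .agg(F.max("disp_m").alias("max_disp_m"))
             .where(F.col("max_disp_m") > 0.05))  # hypothetical threshold

    query = peaks.writeStream.outputMode("append").format("console").start()
    query.awaitTermination()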
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Rui; Praggastis, Brenda L.; Smith, William P.
While streaming data have become increasingly popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings, and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems, including C-SPARQL, use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance," which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
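A minimal sketch of an eviction policy in the spirit described: expired triples are dropped first, then the least semantically important and oldest entries. The record layout and importance scores are hypothetical stand-ins for the paper's framework.

    import heapq

    class StreamCache:
        def __init__(self, capacity):
            self.capacity, self.items = capacity, []

        def insert(self, triple, arrival, expiration, importance):
            self._evict_expired(now=arrival)
            if len(self.items) >= self.capacity:
                heapq.heappop(self.items)  # least important, then oldest
            # heap key: importance ascending, then arrival ascending
            heapq.heappush(self.items, (importance, arrival, expiration, triple))

        def _evict_expired(self, now):
            self.items = [it for it in self.items if it[2] > now]
            heapq.heapify(self.items)

    cache = StreamCache(capacity=2)
    cache.insert(("s1", "p", "o"), 0.0, 10.0, importance=0.9)
    cache.insert(("s2", "p", "o"), 1.0, 2.0, importance=0.5)
    cache.insert(("s3", "p", "o"), 5.0, 30.0, importance=0.7)  # s2 has expired
    print(sorted(it[3][0] for it in cache.items))  # ['s1', 's3']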
Nontrivial Quantum Effects in Biology: A Skeptical Physicists' View
NASA Astrophysics Data System (ADS)
Wiseman, Howard; Eisert, Jens
The following sections are included:
* Introduction
* A Quantum Life Principle
* A quantum chemistry principle?
* The anthropic principle
* Quantum Computing in the Brain
* Nature did everything first?
* Decoherence as the make or break issue
* Quantum error correction
* Uselessness of quantum algorithms for organisms
* Quantum Computing in Genetics
* Quantum search
* Teleological aspects and the fast-track to life
* Quantum Consciousness
* Computability and free will
* Time scales
* Quantum Free Will
* Predictability and free will
* Determinism and free will
* Acknowledgements
* References
Similarity of Stream Width Distributions Across Headwater Systems
NASA Astrophysics Data System (ADS)
Allen, G. H.; Pavelsky, T.; Barefoot, E. A.; Tashie, A.; Butman, D. E.
2016-12-01
The morphology and abundance of streams control the rates of hydraulic and biogeochemical exchange between streams, groundwater, and the atmosphere. In large river systems, studies have used remote sensing to quantify river morphology, and have found that the relationship between river width and abundance is fractal, such that narrow rivers are proportionally more common than wider rivers. However, in headwater systems (stream order 1-3), where many biogeochemical reactions are most rapid, the relationship between stream width and abundance is unknown, reducing the certainty of biogeochemical flux estimates. To constrain this uncertainty, we surveyed two components of stream morphology (wetted stream width and length) in seven physiographically contrasting stream networks in Kings Creek in Konza Prairie, KS; Sagehen Creek in the N. Sierra Nevada Mtns., CA; Elder Creek in Angelo Coast Range Preserve, CA; Caribou Creek in the Caribou Poker Creek Research Watershed, AK; V40 Stream, NZ; Blue Duck Creek, NZ; Stony Creek in Duke Forest, NC. To assess temporal variations, we also surveyed stream geometry in a subcatchment of Stony Creek six times over a range of moderate streamflow conditions (discharge less than the 90th percentile of the gauge record). Here we show a strikingly consistent gamma statistical distribution of stream width in all surveys and a characteristic most abundant stream width of 32±7 cm independent of flow conditions or basin size. This consistency is remarkable given the substantial physical diversity among the studied catchments. We propose a model that invokes network topology theory and downstream hydraulic geometry to show that, as active drainage networks expand and contract in response to changes in streamflow, the most abundant stream width remains approximately static. This framework can be used to better extrapolate stream size and abundance from large rivers to small headwater streams, with significant impact on understanding of the hydraulic, ecological, and biogeochemical functions of stream networks.
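A minimal sketch of the distributional analysis, fitting a gamma distribution to wetted widths and reading off the modal (most abundant) width; the sample below is synthetic stand-in data, not the survey measurements.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    widths_cm = rng.gamma(shape=2.0, scale=30.0, size=500)  # synthetic widths

    shape, loc, scale = stats.gamma.fit(widths_cm, floc=0)  # location fixed at 0
    mode_cm = (shape - 1) * scale if shape > 1 else 0.0     # gamma mode
    print(f"shape={shape:.2f}, scale={scale:.1f}, modal width={mode_cm:.0f} cm")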
Controls of streamwater dissolved inorganic carbon dynamics in a forested watershed
Finlay, J.C.
2003-01-01
I investigated controls of stream dissolved inorganic carbon (DIC) sources and cycling along a stream size and productivity gradient in a temperate forested watershed in northern California. Dissolved CO2 (CO2 (aq)) dynamics in heavily shaded streams contrasted strongly with those of larger, open canopied sites. In streams with canopy cover > 97%, CO2 (aq) was highest during baseflow periods (up to 540 μM) and was negatively related to discharge. Effects of algal photosynthesis on CO2 (aq) were minimal and stream CO2 (aq) was primarily controlled by groundwater CO2 (aq) inputs and degassing losses to the atmosphere. In contrast to the small streams, CO2 (aq) in larger, open-canopied streams was often below atmospheric levels at midday during baseflow and was positively related to discharge. Here, stream CO2 (aq) was strongly influenced by the balance between autotrophic and heterotrophic processes. Dynamics of HCO3- were less complex. HCO3- and Ca2+ were positively correlated, negatively related to discharge, and showed no pattern with stream size. Stable carbon isotope ratios of DIC (i.e., δ13C-DIC) increased with stream size and discharge, indicating contrasting sources of DIC to streams and rivers. During summer baseflows, δ13C-DIC were 13C-depleted in the smallest streams (minimum of -17.7‰) due to the influence of CO2 (aq) derived from microbial respiration and HCO3- derived from carbonate weathering. δ13C-DIC were higher (up to -6.6‰) in the larger streams and rivers due to invasion of atmospheric CO2 enhanced by algal CO2 (aq) uptake. While small streams were influenced by groundwater inputs, patterns in CO2 (aq) and evidence from stable isotopes demonstrate the strong influence of stream metabolism and CO2 exchange with the atmosphere on stream and river carbon cycles.
Harvey, Judson W.; Wagner, Brian J.; Bencala, Kenneth E.
1996-01-01
Stream water was locally recharged into shallow groundwater flow paths that returned to the stream (hyporheic exchange) in St. Kevin Gulch, a Rocky Mountain stream in Colorado contaminated by acid mine drainage. Two approaches were used to characterize hyporheic exchange: sub-reach-scale measurement of hydraulic heads and hydraulic conductivity to compute streambed fluxes (hydrometric approach) and reach-scale modeling of in-stream solute tracer injections to determine characteristic length and timescales of exchange with storage zones (stream tracer approach). Subsurface data were the standard of comparison used to evaluate the reliability of the stream tracer approach to characterize hyporheic exchange. The reach-averaged hyporheic exchange flux (1.5 mL s−1 m−1), determined by hydrometric methods, was largest when stream base flow was low (10 L s−1); hyporheic exchange persisted when base flow was 10-fold higher, decreasing by approximately 30%. Reliability of the stream tracer approach to detect hyporheic exchange was assessed using first-order uncertainty analysis that considered model parameter sensitivity. The stream tracer approach did not reliably characterize hyporheic exchange at high base flow: the model was apparently more sensitive to exchange with surface water storage zones than with the hyporheic zone. At low base flow the stream tracer approach reliably characterized exchange between the stream and gravel streambed (timescale of hours) but was relatively insensitive to slower exchange with deeper alluvium (timescale of tens of hours) that was detected by subsurface measurements. The stream tracer approach was therefore not equally sensitive to all timescales of hyporheic exchange. We conclude that while the stream tracer approach is an efficient means to characterize surface-subsurface exchange, future studies will need to more routinely consider decreasing sensitivities of tracer methods at higher base flow and a potential bias toward characterizing only a fast component of hyporheic exchange. Stream tracer models with multiple rate constants to consider both fast exchange with streambed gravel and slower exchange with deeper alluvium appear to be warranted.
NASA Astrophysics Data System (ADS)
Livers, B.; Wohl, E.
2015-12-01
Human alteration to forests has had lasting effects on stream channels worldwide. Such land use changes affect how wood enters and is stored in streams as individual pieces and as logjams. Changes in wood recruitment affect the complexity and benefits wood can provide to the stream environment, such as zones of flow separation that store fine sediment and organic matter, increased nutrient processing, and greater habitat potential, which can enhance biota and cascade through stream-riparian ecosystems. Previous research in our study area shows that modern headwater streams flowing through old-growth, unmanaged forests have more wood than streams in young, managed forests, but does not explicitly evaluate how wood affects channel complexity or local ecology. 'Managed' refers to forests previously or currently exposed to human alteration. Alteration has long since ceased in some areas, but reduced wood loads in managed streams persist. Our primary objective was to quantify stream complexity metrics, with instream wood as a mediator, on streams across a gradient of management and disturbance histories in order to examine legacy effects of human alteration to forests. Data collected in the Southern Rocky Mountains include 24 2nd to 3rd order subalpine streams categorized into: old-growth unmanaged; younger, naturally disturbed unmanaged; and younger managed. We assessed instream wood loads and logjams and evaluated how they relate to channel complexity using a number of metrics, such as standard deviation of bed and banks, volume of pools, ratios of stream to valley lengths and stream to valley area, and diversity of substrate, gradient, and morphology. Preliminary results show that channel complexity is directly related to instream wood loads and is greatest in streams in old-growth. Related research in the field area indicates that streams with greater wood loads also have increased nutrient processing and greater abundance and diversity of aquatic insect predators.
Nowell, Lisa H.; Moran, Patrick W.; Schmidt, Travis S.; Norman, Julia E.; Nakagaki, Naomi; Shoda, Megan E.; Mahler, Barbara J.; Van Metre, Peter C.; Stone, Wesley W.; Sandstrom, Mark W.; Hladik, Michelle L.
2018-01-01
Aquatic organisms in streams are exposed to pesticide mixtures that vary in composition over time in response to changes in flow conditions, pesticide inputs to the stream, and pesticide fate and degradation within the stream. To characterize mixtures of dissolved-phase pesticides and degradates in Midwestern streams, a synoptic study was conducted at 100 streams during May–August 2013. In weekly water samples, 94 pesticides and 89 degradates were detected, with a median of 25 compounds detected per sample and 54 detected per site. In a screening-level assessment using aquatic-life benchmarks and the Pesticide Toxicity Index (PTI), potential effects on fish were unlikely in most streams. For invertebrates, potential chronic toxicity was predicted in 53% of streams, punctuated in 12% of streams by acutely toxic exposures. For aquatic plants, acute but likely reversible effects on biomass were predicted in 75% of streams, with potential longer-term effects on plant communities in 9% of streams. Relatively few pesticides in water—atrazine, acetochlor, metolachlor, imidacloprid, fipronil, organophosphate insecticides, and carbendazim—were predicted to be major contributors to potential toxicity. Agricultural streams had the highest potential for effects on plants, especially in May–June, corresponding to high spring-flush herbicide concentrations. Urban streams had higher detection frequencies and concentrations of insecticides and most fungicides than in agricultural streams, and higher potential for invertebrate toxicity, which peaked during July–August. Toxicity-screening predictions for invertebrates were supported by quantile regressions showing significant associations for the Benthic Invertebrate-PTI and imidacloprid concentrations with invertebrate community metrics for MSQA streams, and by mesocosm toxicity testing with imidacloprid showing effects on invertebrate communities at environmentally relevant concentrations. This study documents the most complex pesticide mixtures yet reported in discrete water samples in the U.S. and, using multiple lines of evidence, predicts that pesticides were potentially toxic to nontarget aquatic life in about half of the sampled streams.
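A minimal sketch of a Pesticide Toxicity Index-style screen, computed as a toxic-unit sum (each concentration divided by a taxon-specific benchmark); the compounds, concentrations, and benchmark values below are hypothetical.

    def pti(sample_ug_l, benchmarks_ug_l):
        return sum(conc / benchmarks_ug_l[name]
                   for name, conc in sample_ug_l.items()
                   if name in benchmarks_ug_l)

    # Hypothetical invertebrate benchmarks and one sample (micrograms/L).
    benchmarks = {"imidacloprid": 0.4, "fipronil": 0.1, "atrazine": 360.0}
    sample = {"imidacloprid": 0.20, "fipronil": 0.03, "atrazine": 5.0}
    score = pti(sample, benchmarks)
    print(round(score, 2), "screening flag:", score > 1.0)  # 0.81 flag: False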
Application of the Hydroecological Integrity Assessment Process for Missouri Streams
Kennen, Jonathan G.; Henriksen, James A.; Heasley, John; Cade, Brian S.; Terrell, James W.
2009-01-01
Natural flow regime concepts and theories have established the justification for maintaining or restoring the range of natural hydrologic variability so that physiochemical processes, native biodiversity, and the evolutionary potential of aquatic and riparian assemblages can be sustained. A synthesis of recent research advances in hydroecology, coupled with stream classification using hydroecologically relevant indices, has produced the Hydroecological Integrity Assessment Process (HIP). HIP consists of (1) a regional classification of streams into hydrologic stream types based on flow data from long-term gaging-station records for relatively unmodified streams, (2) an identification of stream-type specific indices that address 11 subcomponents of the flow regime, (3) an ability to establish environmental flow standards, (4) an evaluation of hydrologic alteration, and (5) a capacity to conduct alternative analyses. The process starts with the identification of a hydrologic baseline (reference condition) for selected locations, uses flow data from a stream-gage network, and proceeds to classify streams into hydrologic stream types. Concurrently, the analysis identifies a set of non-redundant and ecologically relevant hydrologic indices for 11 subcomponents of flow for each stream type. Furthermore, regional hydrologic models for synthesizing flow conditions across a region and the development of flow-ecology response relations for each stream type can be added to further enhance the process. The application of HIP to Missouri streams identified five stream types: (1) intermittent, (2) perennial runoff-flashy, (3) perennial runoff-moderate baseflow, (4) perennial groundwater-stable, and (5) perennial groundwater-super stable. Two Missouri-specific computer software programs were developed: (1) a Missouri Hydrologic Assessment Tool (MOHAT) which is used to establish a hydrologic baseline, provide options for setting environmental flow standards, and compare past and proposed hydrologic alterations; and (2) a Missouri Stream Classification Tool (MOSCT) designed for placing previously unclassified streams into one of the five pre-defined stream types.
Lyons, John; Zorn, Troy; Stewart, Jana S.; Seelbach, Paul W.; Wehrly, Kevin; Wang, Lizhu
2009-01-01
Coolwater streams, which are intermediate in character between coldwater “trout” streams and more diverse warmwater streams, occur widely in temperate regions but are poorly understood. We used modeled water temperature data and fish assemblage samples from 371 stream sites in Michigan and Wisconsin to define, describe, and map coolwater streams and their fish assemblages. We defined coolwater streams as ones having summer water temperatures suitable for both coldwater and warmwater species and used the observed distributions of the 99 fish species at our sites to identify coolwater thermal boundaries. Coolwater streams had June-through-August mean water temperatures of 17.0–20.5°C, July mean temperatures of 17.5–21.0°C, and maximum daily mean temperatures of 20.7–24.6°C. We delineated two subclasses of coolwater streams: “cold transition” (having July mean water temperatures of 17.5–19.5°C) and “warm transition” (having July mean temperatures of 19.5–21.0°C). Fish assemblages in coolwater streams were variable and lacked diagnostic species but were generally intermediate in species richness and overlapped in composition with coldwater and warmwater streams. In cold-transition streams, coldwater (e.g., salmonids and cottids) and transitional species (e.g., creek chub Semotilus atromaculatus, eastern blacknose dace Rhinichthys atratulus, white sucker Catostomus commersonii, and johnny darter Etheostoma nigrum) were common and warmwater species (e.g., ictalurids and centrarchids) were uncommon; in warm-transition streams warmwater and transitional species were common and coldwater species were uncommon. Coolwater was the most widespread and abundant thermal class in Michigan and Wisconsin, comprising 65% of the combined total stream length in the two states (cold-transition streams being more common than warm-transition ones). Our approach can be used to identify and characterize coolwater streams elsewhere in the temperate region, benefiting many aspects of fisheries management and environmental protection.
Quantum correlations in multipartite quantum systems
NASA Astrophysics Data System (ADS)
Jafarizadeh, M. A.; Heshmati, A.; Karimi, N.; Yahyavi, M.
2018-03-01
Quantum entanglement is the most famous type of quantum correlation between elements of a quantum system and has a basic role in quantum communication protocols such as quantum cryptography, teleportation and Bell-inequality detection. However, it has already been shown that various applications in quantum information theory do not require entanglement. Quantum discord, a new kind of quantum correlation beyond entanglement, is the most popular candidate for general quantum correlations. In this paper, we first find the entanglement witness in a particular multipartite quantum system consisting of an N-partite system in a 2^n-dimensional space. Then we give an exact analytical formula for the quantum discord of this system. At the end of the paper, we investigate the additivity relation of the quantum correlation and show that this relation is satisfied for an N-partite system with a 2^n-dimensional space.
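For context, the standard bipartite definition underlying such discord calculations is the following (a textbook formula, quoted as background rather than the paper's N-partite expression):

\[
\mathcal{D}(B|A) = \mathcal{I}(A{:}B) - \max_{\{\Pi_k^A\}} \mathcal{J}(B|A)_{\{\Pi_k^A\}},
\qquad
\mathcal{I}(A{:}B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}),
\]

where S is the von Neumann entropy and the maximization runs over projective measurements {Π_k^A} on subsystem A; discord is the part of the mutual information that no local measurement can extract.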
Universal blind quantum computation for hybrid system
NASA Astrophysics Data System (ADS)
Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang
2017-08-01
As the development of quantum computers continues to advance, first-generation practical quantum computers will become available to ordinary users in the cloud, much like IBM's Quantum Experience today. Clients will be able to access the quantum servers remotely using simple devices. In such a situation, protecting the security of the client's information is of prime importance. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible route to scalable blind quantum computation.
Predictive Mapping of the Biotic Condition of Conterminous U.S. Rivers and Streams
Understanding and mapping the spatial variations in the biological condition of streams could provide an important tool for assessment and restoration of stream ecosystems. The US EPA’s National Rivers and Streams Assessment (NRSA) summarizes the percent of stream lengths within ...
The Stream-Catchment (StreamCat) Dataset: A database of watershed metrics for the conterminous USA
We developed an extensive database of landscape metrics for ~2.65 million streams, and their associated catchments, within the conterminous USA: The Stream-Catchment (StreamCat) Dataset. These data are publically available and greatly reduce the specialized geospatial expertise n...
URBAN STREAM BURIAL INCREASES WATERSHED-SCALE NITRATE EXPORT
Nitrogen (N) uptake in streams is an important ecosystem service that may be affected by the widespread burial of streams in stormwater pipes in urban watersheds. We predicted that stream burial reduces the capacity of streams to remove nitrate (NO3-) from the water column by in...
EFFECTS OF HYDROLOGY ON NITROGEN PROCESSING IN A RESTORED URBAN STREAM
In 2001, EPA undertook an intensive research effort to evaluate the impact of stream restoration on water quality at a degraded stream in an urban watershed. An essential piece of this comprehensive study was to characterize, measure and quantify stream ground water/ stream wate...
Robert M. Northington; Jackson R. Webster
2017-01-01
Summary: Forested headwater streams are connected to their surrounding catchments by a reliance on terrestrial subsidies. Changes in precipitation patterns and stream flow represent a potential disruption in stream ecosystem function, as the delivery of terrestrial detritus to aquatic consumers and...
Buried Streams and the Loss of Ecosystem Services in Urban Watersheds
Nitrogen (N) retention in streams is an important ecosystem service that may be affected by the widespread burial of streams in stormwater pipes in urban watersheds. We predicted that stream burial suppresses the capacity of streams to retain nitrate (NO3-) by eliminating primary...
Headwater Stream Management Dichotomies: Local Amphibian Habitat vs. Downstream Fish Habitat
NASA Astrophysics Data System (ADS)
Jackson, C. R.
2002-12-01
Small headwater streams in mountainous areas of the Pacific Northwest often do not harbor fish populations because of low water depth and high gradients. Rather, these streams provide habitat for dense assemblages of stream-dwelling amphibians. A variety of management goals have been suggested for such streams, such as encouraging large woody debris recruitment to assist in sediment trapping and valley floor formation, encouraging large woody debris recruitment to provide downstream wood when debris flows occur, and providing continuous linear stream buffers within forest harvest areas for shade and bank stability. A basic problem with analyzing the geomorphic or biotic benefits of any of these strategies is the lack of explicit management goals for such streams. Should managers strive to optimize downstream fish habitat, local amphibian habitat, or both? Through observational data and theoretical considerations, it will be shown that these biotic goals lead to very different geomorphic management recommendations. For instance, woody debris greater than 60 cm in diameter may assist in valley floor development, but it is likely to create subsurface channel flow of unknown value to amphibians. Trapping and retention of fine sediments within headwater streams may improve downstream spawning gravels, but degrades stream-dwelling amphibian habitat. In response to the need for descriptive information on habitat and channel morphology specific to small, non-fish-bearing streams in the Pacific Northwest, morphologies and wood frequencies in forty-two first- and second-order forested streams less than four meters wide were surveyed. Frequencies and size distributions of woody debris were compared between small streams and larger fish-bearing streams as well as between second-growth and virgin timber streams. Statistical models were developed to explore dominant factors affecting channel morphology and habitat. Findings suggest geomorphological relationships, specifically the role of woody debris in habitat formation, documented for larger streams do not apply to headwater streams. Relatively small wood (diameters between 10 and 40 cm), inorganic material, and organic debris (diameters less than 10 cm) were major step-forming agents, while big woody debris pieces (> 40 cm diameter) created less than 10% of steps. Streams in virgin and managed stands did not differ in the relative importance of very large woody debris. Due to low fluvial power, pool habitat was rare. These streams featured mostly step-riffle morphology, not step-pool, indicating insufficient flow for pool scour. Stream power and unit stream power were the dominant channel-shaping factors.
NASA Astrophysics Data System (ADS)
Lidar, Daniel A.; Brun, Todd A.
2013-09-01
Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and Harold Baranger; 26. Critique of fault-tolerant quantum information processing Robert Alicki; References; Index.
NASA Astrophysics Data System (ADS)
Løgstrup Bjerg, Poul; Sonne, Anne T.; Rønde, Vinni; McKnight, Ursula S.
2016-04-01
Streams are impacted by significant contamination at the catchment scale, as they are often locations of multiple chemical stressor inputs. The European Water Framework Directive requires EU member states to ensure good chemical and ecological status of surface water bodies by 2027. This requires monitoring of stream water quality, comparison with environmental quality standards (EQS) and assessment of ecological status. However, the achievement of good status of stream water also requires a strong focus on contaminant sources, pathways and links to stream water impacts, so source management and remedial measures can be implemented. Fate and impacts of different contaminant groups are governed by different processes and are dependent on the origin (geogenic, anthropogenic), source type (point or diffuse) and pathway of the contaminant. To address this issue, we identified contaminant sources and chemical stressors on a groundwater-fed stream to quantify the contaminant discharges, link the chemical impact and stream water quality and assess the main chemical risk drivers in the stream system potentially driving ecological impact. The study was conducted in the 8 m wide Grindsted stream (Denmark) along a 16 km stream stretch that is potentially impacted by two contaminated sites (Grindsted Factory site, Grindsted Landfill), fish farms, waste water discharges, and diffuse sources from agriculture and urban areas. Water samples from the stream and the hyporheic zone as well as bed sediment samples were collected during three campaigns in 2012 and 2014. Data for xenobiotic organic groundwater contaminants, pesticides, heavy metals, general water chemistry, physical conditions and stream flow were collected. The measured chemical concentrations were converted to toxic units (TU) based on the 48h acute toxicity tests with D. magna. The results show a substantial impact of the Grindsted Factory site at a specific stretch of the stream. The groundwater plume caused elevated concentrations of chlorinated ethenes, benzene and site specific pharmaceuticals in both the hyporheic zone and the stream water. Observed stream water vinyl chloride concentrations (up to 6 μg/L) are far above the Danish EQS (0.05 μg/L) for several km downstream of the discharge area. For heavy metals, comparison with EQS in stream water, the hyporheic zone and streambed showed concentrations around or above the threshold values for barium, copper, lead, nickel and zinc. The calculated TU was generally similar along the stream, but for arsenic and nickel higher values were observed where the groundwater plume discharges into the stream. Also, log TU sum values for organic contaminants were elevated in both the hyporheic zone and stream. Thus, the overall chemical stress in the main discharge area is much higher than upstream, while it gradually decreases downstream. In conclusion, this work clearly shows that groundwater contaminant plumes can impact stream water quality significantly in discharge areas, and extend far downstream. A surprisingly high impact of heavy metals with diffuse and/or biogenic origin on stream quality was identified. This work highlights the importance of a holistic assessment of stream water quality to identify and quantify the main contaminant sources and resulting chemical stream stressors leading to potential ecological impacts.
Quantum thermodynamic cycles and quantum heat engines. II.
Quan, H T
2009-04-01
We study the quantum-mechanical generalization of force or pressure, and then we extend the classical thermodynamic isobaric process to quantum-mechanical systems. Based on these efforts, we are able to study the quantum version of thermodynamic cycles that consist of quantum isobaric processes, such as the quantum Brayton cycle and the quantum Diesel cycle. We also consider the implementation of the quantum Brayton cycle and quantum Diesel cycle with some model systems, such as a single particle in a one-dimensional box and a single-mode radiation field in a cavity. These studies lay the microscopic (quantum-mechanical) foundation for the Szilard-Zurek single-molecule engine.
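As a concrete illustration of the quantum generalization of force, consider the textbook particle in a one-dimensional box of width L (one of the model systems mentioned above; a standard result, not the paper's derivation):

\[
E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad
F_n = -\frac{\mathrm{d}E_n}{\mathrm{d}L} = \frac{n^2 \pi^2 \hbar^2}{m L^3}, \qquad
F = \sum_n p_n F_n,
\]

so a quantum isobaric process holds the ensemble-averaged force F, the one-dimensional analogue of pressure, constant while the box width L changes.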
Single-server blind quantum computation with quantum circuit model
NASA Astrophysics Data System (ADS)
Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting
2018-06-01
Blind quantum computation (BQC) enables a client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational abilities and learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol in the quantum circuit model by replacing each quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, so that the server learns nothing about the quantum algorithms. The client only needs to perform the Pauli operations X and Z, while the server honestly performs the rotation operators.
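The gate-replacement step rests on the standard Euler decomposition of a single-qubit unitary into rotation operators (a textbook identity; the paper's specific trap-circuit construction is not reproduced here):

\[
U = e^{i\alpha}\, R_z(\beta)\, R_y(\gamma)\, R_z(\delta), \qquad
R_y(\theta) = e^{-i\theta Y/2}, \quad R_z(\theta) = e^{-i\theta Z/2},
\]

so any gate the client wishes to hide can be expressed as a sequence of rotations whose individual angles, suitably blinded, reveal nothing about the overall algorithm.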
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasari, Venkat; Sadlier, Ronald J; Geerhart, Mr. Billy
Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.
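As a rough illustration of the kind of abstraction involved (a hypothetical sketch with invented names, not the authors' OpenFlow data structures), a programmable node must pair each quantum channel with the classical channel that carries its control traffic:

```python
from dataclasses import dataclass, field

@dataclass
class ChannelPair:
    """A quantum channel and the classical channel coordinating it."""
    quantum_port: int
    classical_port: int

@dataclass
class QuantumFlowTable:
    """Hypothetical flow table pairing quantum and classical channels.

    The no-cloning theorem forbids copying quantum signals, so every
    quantum flow entry must name the classical channel used for its
    control/heralding traffic; unmatched flows are rejected.
    """
    entries: dict[str, ChannelPair] = field(default_factory=dict)

    def install(self, flow_id: str, pair: ChannelPair) -> None:
        self.entries[flow_id] = pair

    def route(self, flow_id: str) -> ChannelPair:
        if flow_id not in self.entries:
            raise KeyError(f"no coordinated channel pair for {flow_id}")
        return self.entries[flow_id]

table = QuantumFlowTable()
table.install("qkd-session-1", ChannelPair(quantum_port=1, classical_port=101))
print(table.route("qkd-session-1"))
```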
Nitrogen Removal by Streams and Rivers of the Upper Mississippi River Basin
Our study, based on chemistry and channel dimensions data collected at 893 randomly-selected stream and river sites in the Mississippi River basin, demonstrated the interaction of stream chemistry, stream size, and NO3-N uptake metrics across a range of stream sizes and across re...
Toward a Neurophysiological Theory of Auditory Stream Segregation
ERIC Educational Resources Information Center
Snyder, Joel S.; Alain, Claude
2007-01-01
Auditory stream segregation (or streaming) is a phenomenon in which 2 or more repeating sounds differing in at least 1 acoustic attribute are perceived as 2 or more separate sound sources (i.e., streams). This article selectively reviews psychophysical and computational studies of streaming and comprehensively reviews more recent…
MINEBANK RUN PROJECT AS AN APPROACH FOR RESTORING DEGRADED URBAN WATERSHEDS AND RIPARIAN ECOSYSTEMS
Elevated nitrate levels in streams and groundwater pose human and ecological threats. Minebank Run, an urban stream in Baltimore MD, will be restored in 2004/2005 using various techniques including reshaping stream banks to reconnect stream channel to flood plain, stream bank r...
Changes in the amount and types of land use in a watershed can destabilize stream channel structure, increase sediment loading and degrade in-stream habitat. Stream classification systems (e.g., Rosgen) may be useful for determining the susceptibility of stream channel segments t...
A Comprehensive Model for the Monoceros Tidal Stream
2005-06-10
stream that can be found in the literature. 5.1. The Triangulum/Andromeda Stream. In Figure 8 we show the location of the recently detected Tri/And tidal... recently discovered stream in Triangulum/Andromeda as a natural part of the Monoceros stream, both fitting accurately to the modeled kinematics and spatial...
Unique Challenges to (Federal) Enterprise Streaming
NASA Technical Reports Server (NTRS)
Walls, Bryan
2006-01-01
Enterprise streaming has different parameters than consumer streaming, and the government enterprise has some differences on top of that. I'd like to highlight some issues shared by the Federal government as a whole, with a closer look at streaming within NASA. Then we'll look at NASA's strategy for streaming.
Effects of Urban Stream Burial on Organic Matter Dynamics and Reach Scale Nitrate Retention
Nitrogen (N) retention in streams is an important ecosystem service that may be affected by the widespread burial of streams in stormwater pipes in urban watersheds. We predicted that stream burial suppresses the capacity of streams to retain nitrate (NO3-) by eliminating primar...
40 CFR 434.61 - Commingling of waste streams.
Code of Federal Regulations, 2010 CFR
2010-07-01
40 CFR Protection of Environment (2010-07-01), ... STANDARDS, Miscellaneous Provisions, § 434.61 Commingling of waste streams: Where waste streams from any facility covered by this part are combined for treatment or discharge with waste streams from another...
Hageman, Philip L.; Todd, Andrew S.; Smith, Kathleen S.; DeWitt, Ed; Zeigler, Mathew P.
2013-01-01
Scientists from the U.S. Geological Survey are studying the relationship between watershed lithology and stream-water chemistry. As part of this effort, 60 stream-water samples and 43 corresponding stream-sediment samples were collected in 2010 and 2011 from locations in Colorado and New Mexico. Sample sites were selected from small to midsize watersheds composed of a high percentage of one rock type or geologic unit. Stream-water and stream-sediment samples were collected, processed, preserved, and analyzed in a consistent manner. This report releases geochemical data for this phase of the study.
Pattern Discovery and Change Detection of Online Music Query Streams
NASA Astrophysics Data System (ADS)
Li, Hua-Fu
In this paper, an efficient stream mining algorithm, called FTP-stream (Frequent Temporal Pattern mining of streams), is proposed to find frequent temporal patterns over melody sequence streams. In the framework of the proposed algorithm, an effective bit-sequence representation is used to reduce the time and memory needed to slide the windows. The FTP-stream algorithm can calculate the support threshold in only a single pass based on the bit-sequence representation, taking advantage of the bitwise left-shift and AND operations on that representation. Experiments show that the proposed algorithm scans the music query stream only once, runs significantly faster, and consumes less memory than existing algorithms such as SWFI-stream and Moment.
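A minimal sketch of the bit-sequence idea as described above (an assumed simplification, not the authors' code): each item keeps one bit per window slot, so sliding the window is a shift and support counting is a popcount.

```python
class BitSequenceWindow:
    """Sliding-window support counting with one bit per window slot.

    Bit i of an item's mask is 1 if the item occurred in slot i;
    joint support of two items a, b would be the popcount of
    (bits[a] & bits[b]), using the AND operation mentioned above.
    """
    def __init__(self, window_size: int):
        self.mask = (1 << window_size) - 1
        self.bits: dict[str, int] = {}

    def slide(self, items: set[str]) -> None:
        # Shift every item's history by one slot (oldest bit drops out)
        for key in list(self.bits):
            self.bits[key] = (self.bits[key] << 1) & self.mask
            if self.bits[key] == 0:
                del self.bits[key]   # prune items absent from the window
        for item in items:           # set bit 0 for the newest slot
            self.bits[item] = self.bits.get(item, 0) | 1

    def support(self, item: str) -> int:
        return bin(self.bits.get(item, 0)).count("1")

win = BitSequenceWindow(window_size=4)
for batch in [{"C", "E"}, {"C"}, {"G"}, {"C", "E"}]:
    win.slide(batch)
print(win.support("C"))  # -> 3
```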
Rice, Karen C.; Bricker, Owen P.
1991-01-01
The report describes the results of a study to assess the sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland using a geology-based method. Water samples were collected from streams in July and August 1988 when streams were at base-flow conditions. Eighteen water samples collected from streams in Charles County, and 17 water samples from streams in Anne Arundel County were analyzed in the field for pH, specific conductance, and acid-neutralizing capacity (ANC); 8 water samples from streams in Charles County were analyzed in the laboratory for chloride and sulfate concentrations. The assessment revealed that streams in these counties are sensitive to acidification by acidic deposition.
NASA Technical Reports Server (NTRS)
Fishbach, L. H.; Koenig, R. W.
1972-01-01
A computer program which calculates steady-state design and off-design jet engine performance for two- or three-spool turbofans with one, two, or three nozzles is described. Included in the report are complete FORTRAN IV listings of the program with sample results for nine basic turbofan engines that can be calculated: (1) three-spool, three-stream engine; (2) two-spool, three-stream, boosted-fan engine; (3) two-spool, three-stream, supercharged-compressor engine; (4) three-spool, two-stream engine; (5) two-spool, two-stream engine; (6) three-spool, three-stream, aft-fan engine; (7) two-spool, three-stream, aft-fan engine; (8) two-spool, two-stream, aft-fan engine; and (9) three-spool, two-stream, aft-fan engine. The simulation of other engines by using logical variables built into the program is also described.
Apparatus for the liquefaction of natural gas and methods relating to same
Wilding, Bruce M [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID; Turner, Terry D [Ammon, ID; Carney, Francis H [Idaho Falls, ID
2009-09-29
An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through an expander creating work output. A compressor may be driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream.
Magnetic field advection in two interpenetrating plasma streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryutov, D. D.; Kugland, N. L.; Levy, M. C.
2013-03-15
Laser-generated colliding plasma streams can serve as a test-bed for the study of various astrophysical phenomena and the general physics of self-organization. For streams of a sufficiently high kinetic energy, collisions between the ions of one stream with the ions of the other stream are negligible, and the streams can penetrate through each other. On the other hand, the intra-stream collisions for high-Mach-number flows can still be very frequent, so that each stream can be described hydrodynamically. This paper presents an analytical study of the effects that these interpenetrating streams have on large-scale magnetic fields either introduced by external coils or generated in the plasma near the laser targets. Specifically, a problem of the frozen-in constraint is assessed and paradoxical features of the field advection in this system are revealed. A possibility of using this system for studies of magnetic reconnection is mentioned.
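For context, the frozen-in constraint at issue is the one implied by the ideal-MHD induction equation (standard form, quoted as background):

\[
\frac{\partial \mathbf{B}}{\partial t} = \nabla \times \left( \mathbf{v} \times \mathbf{B} \right),
\]

whose usual reading, that field lines are advected with the fluid velocity v, becomes ambiguous when two interpenetrating streams each carry their own velocity field; this is the paradox the paper examines.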
NASA Astrophysics Data System (ADS)
Leach, J.; Moore, D.
2015-12-01
Winter stream temperature of coastal mountain catchments influences fish growth and development. Transient snow cover and advection associated with lateral throughflow inputs are dominant controls on stream thermal regimes in these regions. Existing stream temperature models lack the ability to properly simulate these processes. Therefore, we developed and evaluated a conceptual-parametric catchment-scale stream temperature model that includes the role of transient snow cover and lateral advection associated with throughflow. The model provided reasonable estimates of observed stream temperature at three test catchments. We used the model to simulate winter stream temperature for virtual catchments located at different elevations within the rain-on-snow zone. The modelling exercise examined stream temperature response associated with interactions between elevation, snow regime, and changes in air temperature. Modelling results highlight that the sensitivity of winter stream temperature response to changes in climate may be dependent on catchment elevation and landscape position.
Factoring stream turbulence into global assessments of nitrogen pollution.
Grant, Stanley B; Azizian, Morvarid; Cook, Perran; Boano, Fulvio; Rippy, Megan A
2018-03-16
The discharge of excess nitrogen to streams and rivers poses an existential threat to both humans and ecosystems. A seminal study of headwater streams across the United States concluded that in-stream removal of nitrate is controlled primarily by stream chemistry and biology. Reanalysis of these data reveals that stream turbulence (in particular, turbulent mass transfer across the concentration boundary layer) imposes a previously unrecognized upper limit on the rate at which nitrate is removed from streams. The upper limit closely approximates measured nitrate removal rates in streams with low concentrations of this pollutant, a discovery that should inform stream restoration designs and efforts to assess the effects of nitrogen pollution on receiving water quality and the global nitrogen cycle.
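The turbulence-imposed ceiling can be written in generic boundary-layer form (a standard mass-transfer relation, not the authors' fitted parameterization):

\[
J = k_m \left( C - C_b \right) \le k_m C, \qquad v_f = \frac{J}{C} \le k_m,
\]

where J is the nitrate flux toward the streambed, C the water-column concentration, C_b the concentration at the bed, k_m the turbulent mass-transfer coefficient across the concentration boundary layer, and v_f the apparent uptake velocity; biology and chemistry can set v_f only up to the ceiling k_m.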
Experimental entanglement of 25 individually accessible atomic quantum interfaces
Jiang, Nan; Chang, Wei; Li, Chang; Zhang, Sheng
2018-01-01
A quantum interface links the stationary qubits in a quantum memory with flying photonic qubits in optical transmission channels and constitutes a critical element for the future quantum internet. Entanglement of quantum interfaces is an important step for the realization of quantum networks. Through heralded detection of photon interference, we generate multipartite entanglement between 25 (or 9) individually addressable quantum interfaces in a multiplexed atomic quantum memory array and confirm genuine 22-partite (or 9-partite) entanglement. This experimental entanglement of a record-high number of individually addressable quantum interfaces makes an important step toward the realization of quantum networks, long-distance quantum communication, and multipartite quantum information processing. PMID:29725621
Adoption of Stream Fencing Among Dairy Farmers in Four New Zealand Catchments
NASA Astrophysics Data System (ADS)
Bewsell, Denise; Monaghan, Ross M.; Kaine, Geoff
2007-08-01
The effect of dairy farming on water quality in New Zealand streams has been identified as an important environmental issue. Stream fencing, to keep cattle out of streams, is seen as a way to improve water quality. Fencing ensures that cattle cannot defecate in the stream, prevents bank erosion, and protects the aquatic habitat. Stream fencing targets have been set by the dairy industry. In this paper the results of a study to identify the factors influencing dairy farmers’ decisions to adopt stream fencing are outlined. Qualitative methods were used to gather data from 30 dairy farmers in four New Zealand catchments. Results suggest that farm contextual factors influenced farmers’ decision making when considering stream fencing. Farmers were classified into four segments based on their reasons for investing in stream fencing. These reasons were fencing boundaries, fencing for stock control, fencing to protect animal health, and fencing because of pressure to conform to local government guidelines or industry codes of practice. This suggests that adoption may be slow in the absence of on-farm benefits, that promotion of stream fencing needs to be strongly linked to on-farm benefits, and that regulation could play a role in ensuring greater adoption of stream fencing.
Terrestrial–aquatic linkages in spring-fed and snowmelt-dominated streams
Sepulveda, Adam
2017-01-01
The importance of trophic linkages between aquatic and terrestrial ecosystems is predicted to vary as a function of subsidy quantity and quality relative to in situ resources. To test this prediction, I used multi-year diet data from Bonneville cutthroat trout Oncorhynchus clarkii utah in spring-fed and snowmelt-driven streams in the high desert of western North America. I documented that trout in spring-fed streams consumed more (number and weight) aquatic than terrestrial invertebrates, while trout in snowmelt-driven streams consumed a similar number of both prey types but consumed more terrestrial than aquatic invertebrates by weight. Trout in spring-fed streams consumed more aquatic invertebrates than trout in snowmelt streams, and trout consumed more terrestrial invertebrates in snowmelt than in spring-fed streams. Up to 93% of trout production in spring-fed streams and 60% in snowmelt streams was fueled by aquatic invertebrates, while the remainder of trout production in each stream type was from terrestrial production. I found that the biomass and occurrence of consumed terrestrial invertebrates were not related to my measures of in situ resource quality or quantity in either stream type. These empirical data highlight the importance of autotrophic-derived production to trout in xeric regions.
Interactions between dorsal and ventral streams for controlling skilled grasp
van Polanen, Vonne; Davare, Marco
2015-01-01
The two visual systems hypothesis suggests processing of visual information into two distinct routes in the brain: a dorsal stream for the control of actions and a ventral stream for the identification of objects. Recently, increasing evidence has shown that the dorsal and ventral streams are not strictly independent, but do interact with each other. In this paper, we argue that the interactions between dorsal and ventral streams are important for controlling complex object-oriented hand movements, especially skilled grasp. Anatomical studies have reported the existence of direct connections between dorsal and ventral stream areas. These physiological interconnections appear to be gradually more active as the precision demands of the grasp become higher. It is hypothesised that the dorsal stream needs to retrieve detailed information about object identity, stored in ventral stream areas, when the object properties require complex fine-tuning of the grasp. In turn, the ventral stream might receive up to date grasp-related information from dorsal stream areas to refine the object internal representation. Future research will provide direct evidence for which specific areas of the two streams interact, the timing of their interactions and in which behavioural context they occur. PMID:26169317
NASA Astrophysics Data System (ADS)
Todd, R. E.
2016-02-01
The Gulf Stream plays a major role in the climate system and is a significant forcing agent for the coastal circulation along the US East Coast, yet routine subsurface measurements of Gulf Stream structure are only collected in the Florida Straits and between New Jersey and Bermuda. A recent pilot program demonstrated the feasibility of using underwater gliders to repeatedly survey across the Gulf Stream and to provide subsurface Gulf Stream observations to the community in realtime. Spray gliders were deployed on three-month missions from Miami, Florida to the New England shelf south of Cape Cod, during which they zigzagged back and forth across the Gulf Stream. Three such deployments have been completed so far with a total of more than 20 cross-Gulf Stream transects occupied. These new observations detail the subsurface structure and variability of the Gulf Stream upstream and downstream of its separation from the continental margin, reveal large-amplitude internal waves within the boundary current, and capture numerous eddies along the flanks of the Gulf Stream. Future routine glider deployments in the Gulf Stream promise to provide critical observations for examining inherent Gulf Stream variability, investigating western boundary current influences on coastal circulation, and constraining numerical simulations.
Stream-subsurface nutrient dynamics in a groundwater-fed stream
NASA Astrophysics Data System (ADS)
Rezanezhad, F.; Niederkorn, A.; Parsons, C. T.; Van Cappellen, P.
2015-12-01
The stream-riparian-aquifer interface plays a major role in the regional flow of nutrients and contaminants due to a strong physical-chemical gradient that promotes the transformation, retention, elimination or release of biogenic elements. To better understand the effect of near-stream zones on stream biogeochemistry, we conducted a field study on a groundwater-fed stream located in the rare Charitable Research Reserve, Cambridge, Ontario, Canada. This study focused on monitoring the spatial and temporal distributions of nutrient elements within the riparian and hyporheic zones of the stream. Several piezometer nests and a series of passive (diffusion) water samplers, known as peepers, were installed along longitudinal and lateral transects centered on the stream to obtain data on the groundwater chemistry. Groundwater upwelling along the stream resulted in distinctly different groundwater types and associated nitrate concentrations over small distances in the riparian zone (<4 m). Concentrations of nutrients (NO3-, NH4+, SO42- and carbon) did not change significantly between the upstream source of the stream surface water and the downstream outlet. Although reduction of nitrate and sulphate was found in the riparian zone of the stream, this did not significantly influence the chemistry of the adjacent stream water. Also, minimal retention in the hyporheic zone limited reduction of reactive compounds (NO3- and SO42-) within the stream channel. The results showed that dissolved organic carbon (DOC) availability and the residence time of water in the hyporheic zone and in surface water limited denitrification.
Potential Stream Density in Mid-Atlantic U.S. Watersheds
Elmore, Andrew J.; Julian, Jason P.; Guinn, Steven M.; Fitzpatrick, Matthew C.
2013-01-01
Stream network density exerts a strong influence on ecohydrologic processes in watersheds, yet existing stream maps fail to capture most headwater streams and therefore underestimate stream density. Furthermore, discrepancies between mapped and actual stream length vary between watersheds, confounding efforts to understand the impacts of land use on stream ecosystems. Here we report on research that predicts stream presence from coupled field observations of headwater stream channels and terrain variables that were calculated both locally and as an average across the watershed upstream of any location on the landscape. Our approach used maximum entropy modeling (MaxEnt), a robust method commonly implemented to model species distributions that requires information only on the presence of the entity of interest. In validation, the method correctly predicts the presence of 86% of all 10-m stream segments and errors are low (<1%) for catchments larger than 10 ha. We apply this model to the entire Potomac River watershed (37,800 km2) and several adjacent watersheds to map stream density and compare our results with the National Hydrography Dataset (NHD). We find that NHD underestimates stream density by up to 250%, with errors being greatest in the densely urbanized cities of Washington, DC and Baltimore, MD and in regions where the NHD has never been updated from its original, coarse-grain mapping. This work is the most ambitious attempt yet to map stream networks over a large region and will have lasting implications for modeling and conservation efforts. PMID:24023704
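A minimal stand-in for the presence-only workflow described above (a hypothetical sketch; MaxEnt itself is a dedicated package, approximated here by a logistic model contrasting presence points with background points):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical terrain predictors: local slope and upstream-averaged wetness
presence = rng.normal(loc=[8.0, 0.6], scale=0.3, size=(200, 2))
background = rng.normal(loc=[4.0, 0.3], scale=1.0, size=(2000, 2))

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

# Logistic regression on presence vs. background points is a common
# approximation to MaxEnt for presence-only species/feature modeling
model = LogisticRegression().fit(X, y)
prob_stream = model.predict_proba([[7.5, 0.55]])[0, 1]
print(f"predicted probability of a stream channel: {prob_stream:.2f}")
```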
Stream salamanders as indicators of stream quality in Maryland, USA
Southerland, M.T.; Jung, R.E.; Baxter, D.P.; Chellman, I.C.; Mercurio, G.; Volstad, J.H.
2004-01-01
Biological indicators are critical to the protection of small, headwater streams and the ecological values they provide. Maryland and other state monitoring programs have determined that fish indicators are ineffective in small streams, where stream salamanders may replace fish as top predators. Because of their life history, physiology, abundance, and ubiquity, stream salamanders are likely representative of biological integrity in these streams. The goal of this study was to determine whether stream salamanders are effective indicators of ecological conditions across biogeographic regions and gradients of human disturbance. During the summers of 2001 and 2002, we intensively surveyed for stream salamanders at 76 stream sites located west of the Maryland Coastal Plain, sites also monitored by the Maryland Biological Stream Survey (MBSS) and City of Gaithersburg. We found 1,584 stream salamanders, including all eight species known in Maryland, using two 15 × 2 m transects and two 4 m² quadrats that spanned both stream bank and channel. We performed removal sampling on transects to estimate salamander species detection probabilities, which ranged from 0.67-0.85. Stepwise regressions identified 15 of 52 non-salamander variables, representing water quality, physical habitat, land use, and biological conditions, which best predicted salamander metrics. Indicator development involved (1) identifying reference (non-degraded) and degraded sites (using percent forest, shading, riparian buffer width, aesthetic rating, and benthic macroinvertebrate and fish indices of biotic integrity); (2) testing 12 candidate salamander metrics (representing species richness and composition, abundance, species tolerance, and reproductive function) for their ability to distinguish reference from degraded sites; and (3) combining metrics into an index that effectively discriminated sites according to known stream conditions. Final indices for Highlands, Piedmont, and Non-Coastal Plain regions comprised four metrics: number of species, number of salamanders, number of intolerant salamanders, and number of adult salamanders, producing classification efficiencies between 87% and 90%. Partial validation of these indices was obtained when a test of the number of salamanders metric produced an 82% correct classification of 618 MBSS sites surveyed in 1995-97. This study supports the use of stream salamander monitoring and a composite stream salamander index of biotic integrity (SS-IBI) to determine stream quality in Maryland.
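As an illustration of how the four final metrics might be combined into a site index (a hypothetical scoring sketch; the paper's actual scoring thresholds are not reproduced):

```python
def score_metric(value: float, ref_median: float) -> int:
    """Score one metric against the reference-site median (hypothetical
    thresholds: >=100% of reference scores 5, >=50% scores 3, else 1)."""
    ratio = value / ref_median
    if ratio >= 1.0:
        return 5
    return 3 if ratio >= 0.5 else 1

def salamander_ibi(site: dict, reference_medians: dict) -> int:
    """Sum scores over the four metrics named in the abstract."""
    metrics = ["n_species", "n_salamanders", "n_intolerant", "n_adults"]
    return sum(score_metric(site[m], reference_medians[m]) for m in metrics)

# Hypothetical reference-site medians and one test site
ref = {"n_species": 4, "n_salamanders": 30, "n_intolerant": 10, "n_adults": 15}
site = {"n_species": 3, "n_salamanders": 22, "n_intolerant": 4, "n_adults": 12}
print(salamander_ibi(site, ref))  # -> 3 + 3 + 1 + 3 = 10
```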
NASA Astrophysics Data System (ADS)
Kristensen, P. B.; Kristensen, E. A.; Riis, T.; Baisner, A. J.; Larsen, S. E.; Verdonschot, P. F. M.; Baattrup-Pedersen, A.
2013-05-01
Predictions of the future climate infer that stream water temperatures may increase in temperate lowland areas and that streams without riparian forest will be particularly prone to elevated stream water temperature. Planting of riparian forest is a potential mitigation measure to reduce water temperatures for the benefit of stream organisms. However, no studies have yet determined the length of a forested reach required to obtain a significant temperature decrease. To investigate this we measured the temperature in five small Danish lowland streams from June 2010 to July 2011, all showing a sharp transition between an upstream open reach and a downstream forested reach. In all stream reaches we also measured canopy cover and a range of physical variables characterizing the stream reaches. This allowed us to analyse differences in mean daily temperature and amplitude per month among forested and open sections as well as to study annual temperature regimes and the influence of physical conditions on temperature changes. Stream water temperature in the open reaches was affected by heating, and in July we observed an increase in temperature over the entire length of the investigated reaches, reaching temperatures higher than the incipient lethal limit for brown trout. Along the forested reaches a significant decrease in July temperatures was recorded almost immediately (within 100 m) after the stream entered the forested area. In three of our study streams the temperature continued to decrease the farther the stream ran into the forested reach, and the temperature decline did not reach a plateau. The temperature increases along the open reaches were accompanied by stronger daily temperature variation; however, when the streams entered the forest, the range in daily variation decreased. Multiple regression analysis of the combined effects on stream water temperature of canopy cover, width/depth ratio, discharge, current velocity and water temperature revealed that canopy cover and width/depth ratio were the two variables responsible for the reduced temperature observed when the streams enter the forest. In consequence, we conclude that even relatively short stretches (100-500 m) of forest alongside streams may combat the negative effects of heating of stream water and that forest planting can be a useful mitigation measure.
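The multiple-regression step can be sketched as follows (illustrative only, with synthetic data and invented column names; the study's measurements are not reproduced):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "canopy_cover": rng.uniform(0, 1, n),   # fraction of channel shaded
    "width_depth": rng.uniform(2, 20, n),   # W/D ratio
    "discharge": rng.uniform(5, 50, n),     # L/s
})
# Synthetic response: cooling with canopy cover, warming with W/D, plus noise
df["temp_change"] = (-3.0 * df["canopy_cover"] + 0.08 * df["width_depth"]
                     + rng.normal(0, 0.3, n))

X = sm.add_constant(df[["canopy_cover", "width_depth", "discharge"]])
fit = sm.OLS(df["temp_change"], X).fit()
print(fit.summary())  # canopy_cover and width_depth should dominate
```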
NASA Astrophysics Data System (ADS)
Donahoe, R. J.; Hawkins, P. D.
2017-12-01
The Lake Harris watershed was the site of legacy surface mining of coal conducted from approximately 1969 to 1976. The mine site was abandoned and finally reclaimed in 1986. Water quality in the stream draining the mined area is still severely impacted by acid mine drainage (AMD), despite the reclamation effort. Lake Harris is used as a source of industrial water, but shows no negative water quality effects from the legacy mining activities despite receiving drainage from the AMD-impacted stream. Water samples were collected monthly between October 2016 and September 2017 from a first-order stream impacted by acid mine drainage (AMD), a nearby first-order control stream, and Lake Harris. Stream water chemistry was observed to vary both spatially and seasonally, as monitored at five sample stations in each stream over the study period. Comparison of the two streams shows the expected elevated concentrations of AMD-indicator solutes (sulfate and iron), as well as significant increases in conductivity and acidity for the stream draining the reclaimed mine site. In addition, dramatic (1-2 orders of magnitude) increases in major element (Al, Ca, Mg, K), minor element (Mn, Sr) and trace element (Co, Ni) concentrations are also observed for the AMD-impacted stream compared to the control stream. The AMD-impacted stream also shows elevated (2-4 times) levels of other stream water solutes (Cl, Na, Si, Zn), compared to the control stream. As the result of continuing AMD input, the stream draining the reclaimed mine site is essentially sterile, in contrast to the lake and control stream, which support robust aquatic ecosystems. A quantitative model, constrained by isotopic data (δD and δ18O), will be presented that seeks to explain the observed temporal differences in water quality for the AMD-impacted stream as a function of variable meteoric water, groundwater, and AMD inputs. Similar models may be developed for other AMD-impacted streams to better understand and predict temporal variations in water quality parameters and their effect on aquatic ecosystems.
Lerch, R.N.; Blanchard, P.E.; Thurman, E.M.
1998-01-01
The contribution of hydroxylated atrazine degradation products (HADPs) to the total atrazine load (i.e., atrazine plus stable metabolites) in streams needs to be determined in order to fully assess the impact of atrazine contamination on stream ecosystems and human health. The objectives of this study were (1) to determine the contribution of HADPs to the total atrazine load in streams of nine midwestern states and (2) to discuss the mechanisms controlling the concentrations of HADPs in streams. Stream samples were collected from 95 streams in northern Missouri at preplant and postplant of 1994 and 1995, and an additional 46 streams were sampled in eight midwestern states at postplant of 1995. Samples were analyzed for atrazine, deethylatrazine (DEA), deisopropylatrazine (DIA), and three HADPs. Overall, HADP prevalence (i.e., frequency of detection) ranged from 87 to 100% for hydroxyatrazine (HA), 0 to 58% for deethylhydroxyatrazine (DEHA), and 0% for deisopropylhydroxyatrazine (DIHA), with method detection limits of 0.04-0.10 μg L-1. Atrazine metabolites accounted for nearly 60% of the atrazine load in northern Missouri streams at preplant, with HA the predominant metabolite present. Data presented in this study and a continuous monitoring study are used to support the hypothesis that a combination of desorption from stream sediments and dissolved-phase transport controls HADP concentrations in streams.
Some foundational aspects of quantum computers and quantum robots.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benioff, P.; Physics
1998-01-01
This paper addresses foundational issues related to quantum computing. The need for a universally valid theory such as quantum mechanics to describe to some extent its own validation is noted. This includes quantum mechanical descriptions of systems that do theoretical calculations (i.e. quantum computers) and systems that perform experiments. Quantum robots interacting with an environment are a small first step in this direction. Quantum robots are described here as mobile quantum systems with on-board quantum computers that interact with environments. Included are discussions on the carrying out of tasks and the division of tasks into computation and action phases. Specific models based on quantum Turing machines are described. Differences and similarities between quantum robots plus environments and quantum computers are discussed.
Fundamental rate-loss trade-off for the quantum internet
NASA Astrophysics Data System (ADS)
Azuma, Koji; Mizutani, Akihiro; Lo, Hoi-Kwong
2016-11-01
The quantum internet holds promise for achieving quantum communication--such as quantum teleportation and quantum key distribution (QKD)--freely between any clients all over the globe, as well as for the simulation of the evolution of quantum many-body systems. The most primitive function of the quantum internet is to provide quantum entanglement or a secret key to two points efficiently, by using intermediate nodes connected by optical channels with each other. Here we derive a fundamental rate-loss trade-off for a quantum internet protocol, by generalizing the Takeoka-Guha-Wilde bound to be applicable to any network topology. This trade-off has essentially no scaling gap with the quantum communication efficiencies of protocols known to be indispensable to long-distance quantum communication, such as intercity QKD and quantum repeaters. Our result--putting a practical but general limitation on the quantum internet--enables us to grasp the potential of the future quantum internet.
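For reference, the point-to-point bound being generalized can be stated as follows (the Takeoka-Guha-Wilde form for a pure-loss bosonic channel of transmittance η, quoted as background):

\[
R \le \log_2\!\left(\frac{1+\eta}{1-\eta}\right) \approx \frac{2\eta}{\ln 2} \quad (\eta \ll 1),
\]

so the achievable entanglement or secret-key rate falls off linearly in η, and hence exponentially with fiber length, unless intermediate repeater nodes are used.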
Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network.
Goto, Hayato
2016-02-22
The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence.
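A commonly quoted single-oscillator Hamiltonian for this scheme, in the frame rotating at half the pump frequency, is the following (sign and normalization conventions vary between papers, so this is indicative rather than the paper's exact expression):

\[
H/\hbar = \Delta\, a^{\dagger} a + \frac{K}{2}\, a^{\dagger 2} a^{2} - \frac{p(t)}{2}\left(a^{\dagger 2} + a^{2}\right),
\]

where K is the Kerr nonlinearity, Δ the detuning, and p(t) the parametric pump amplitude; ramping p(t) slowly through the bifurcation point carries the vacuum into a superposition of the two oscillation states with amplitudes ±α, with α ≈ √(p/K) well above threshold.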
Quantum random oracle model for quantum digital signature
NASA Astrophysics Data System (ADS)
Shang, Tao; Lei, Qi; Liu, Jianwei
2016-10-01
The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.